pax_global_header00006660000000000000000000000064147445663340014532gustar00rootroot0000000000000052 comment=df05e69a8a3fb37628a0e3a33518ca0425334bc9 pydantic-2.10.6/000077500000000000000000000000001474456633400134335ustar00rootroot00000000000000pydantic-2.10.6/.git-blame-ignore-revs000066400000000000000000000004111474456633400175270ustar00rootroot00000000000000# Linting/formatting: # isort/pyupgrade -> Ruff: 4f3e794a69e84e4294c605c669e4d1876a18dd50 # Black -> Ruff format: 419398d1dd9f3c0babdcfde1d52c249266f59ef0 # Ruff 0.2.1: 918402f01d82694214ff93cd77ff62d5d5beb1ab # Ruff 0.4.8: 332e77ba3b658c2a57fc72f832587b72311d87c7 pydantic-2.10.6/.github/000077500000000000000000000000001474456633400147735ustar00rootroot00000000000000pydantic-2.10.6/.github/FUNDING.yml000066400000000000000000000000251474456633400166050ustar00rootroot00000000000000github: samuelcolvin pydantic-2.10.6/.github/ISSUE_TEMPLATE/000077500000000000000000000000001474456633400171565ustar00rootroot00000000000000pydantic-2.10.6/.github/ISSUE_TEMPLATE/bug-v1.yml000066400000000000000000000073511474456633400210100ustar00rootroot00000000000000name: 🐛 Pydantic V1.X Bug description: Report a bug or unexpected behavior in Pydantic V1.X, e.g. all releases prior to V2 labels: [bug V1, pending] body: - type: markdown attributes: value: Thank you for contributing to pydantic! ✊ - type: checkboxes id: checks attributes: label: Initial Checks description: | Just a few checks to make sure you need to create a bug report. 
_Sorry to sound so draconian 👿; but every second spent replying to issues is time not spent improving pydantic 🙇._ options: - label: I have searched GitHub for a duplicate issue and I'm sure this is something new required: true - label: I have searched Google & StackOverflow for a solution and couldn't find anything required: true - label: I have read and followed [the docs](https://docs.pydantic.dev) and still think this is a bug required: true - label: > I am confident that the issue is with pydantic (not my code, or another library in the ecosystem like [FastAPI](https://fastapi.tiangolo.com) or [mypy](https://mypy.readthedocs.io/en/stable)) required: true - type: textarea id: description attributes: label: Description description: | Please explain what you're seeing and what you would expect to see. Please provide as much detail as possible to make understanding and solving your problem as quick as possible. 🙏 validations: required: true - type: textarea id: example attributes: label: Example Code description: > If applicable, please add a self-contained, [minimal, reproducible, example](https://stackoverflow.com/help/minimal-reproducible-example) demonstrating the bug. placeholder: | import pydantic ... render: Python - type: textarea id: version attributes: label: Python, Pydantic & OS Version description: | Which version of Python & Pydantic are you using, and which Operating System? Please run the following command and copy the output below: ```bash python -c "import pydantic.utils; print(pydantic.utils.version_info())" ``` render: Text validations: required: true - type: checkboxes id: affected-components attributes: label: Affected Components description: Which of the following parts of pydantic does this bug affect? 
# keep this list in sync with feature_request.yml options: - label: '[Compatibility between releases](https://docs.pydantic.dev/changelog/)' - label: '[Data validation/parsing](https://docs.pydantic.dev/concepts/models/#basic-model-usage)' - label: '[Data serialization](https://docs.pydantic.dev/concepts/serialization/) - `.model_dump()` and `.model_dump_json()`' - label: '[JSON Schema](https://docs.pydantic.dev/concepts/json_schema/)' - label: '[Dataclasses](https://docs.pydantic.dev/concepts/dataclasses/)' - label: '[Model Config](https://docs.pydantic.dev/concepts/config/)' - label: '[Field Types](https://docs.pydantic.dev/api/types/) - adding or changing a particular data type' - label: '[Function validation decorator](https://docs.pydantic.dev/concepts/validation_decorator/)' - label: '[Generic Models](https://docs.pydantic.dev/concepts/models/#generic-models)' - label: '[Other Model behaviour](https://docs.pydantic.dev/concepts/models/) - `model_construct()`, pickling, private attributes, ORM mode' - label: '[Plugins](https://docs.pydantic.dev/) and integration with other tools - mypy, FastAPI, python-devtools, Hypothesis, VS Code, PyCharm, etc.' pydantic-2.10.6/.github/ISSUE_TEMPLATE/bug-v2.yml000066400000000000000000000031071474456633400210040ustar00rootroot00000000000000name: 🐛 Pydantic V2 Bug description: Report a bug or unexpected behavior in Pydantic V2 labels: [bug V2, pending] body: - type: markdown attributes: value: Thank you for contributing to pydantic! ✊ - type: checkboxes id: checks attributes: label: Initial Checks description: Just making sure you're really using Pydantic V2 options: - label: I confirm that I'm using Pydantic V2 required: true - type: textarea id: description attributes: label: Description description: | Please explain what you're seeing and what you would expect to see. Please provide as much detail as possible to make understanding and solving your problem as quick as possible. 
🙏 validations: required: true - type: textarea id: example attributes: label: Example Code description: > If applicable, please add a self-contained, [minimal, reproducible, example](https://stackoverflow.com/help/minimal-reproducible-example) demonstrating the bug. placeholder: | import pydantic ... render: Python - type: textarea id: version attributes: label: Python, Pydantic & OS Version description: | Which version of Python & Pydantic are you using, and which Operating System? Please run the following command and copy the output below: ```bash python -c "import pydantic.version; print(pydantic.version.version_info())" ``` render: Text validations: required: true pydantic-2.10.6/.github/ISSUE_TEMPLATE/config.yml000066400000000000000000000003471474456633400211520ustar00rootroot00000000000000blank_issues_enabled: true contact_links: - name: 🤔 Ask a Question url: 'https://github.com/pydantic/pydantic/discussions/new?category=question' about: Ask a question about how to use pydantic using GitHub discussions pydantic-2.10.6/.github/ISSUE_TEMPLATE/feature_request.yml000066400000000000000000000053701474456633400231110ustar00rootroot00000000000000name: 🚀 Pydantic V2 Feature request description: 'Suggest a new feature for Pydantic V2 (NOTE: we are only making critical bug fixes to Pydantic V1)' labels: [feature request] body: - type: markdown attributes: value: Thank you for contributing to pydantic! ✊ - type: checkboxes id: searched attributes: label: Initial Checks description: | Just a few checks to make sure you need to create a feature request. 
_Sorry to sound so draconian 👿; but every second spent replying to issues is time not spent improving pydantic 🙇._ options: - label: I have searched Google & GitHub for similar requests and couldn't find anything required: true - label: I have read and followed [the docs](https://docs.pydantic.dev) and still think this feature is missing required: true - type: textarea id: description attributes: label: Description description: | Please give as much detail as possible about the feature you would like to suggest. 🙏 You might like to add: * A demo of how code might look when using the feature * Your use case(s) for the feature * Why the feature should be added to pydantic (as opposed to another library or just implemented in your code) validations: required: true - type: checkboxes id: affected-components attributes: label: Affected Components description: Which of the following parts of pydantic does this feature affect? # keep this list in sync with bug.yml options: - label: '[Compatibility between releases](https://docs.pydantic.dev/changelog/)' - label: '[Data validation/parsing](https://docs.pydantic.dev/concepts/models/#basic-model-usage)' - label: '[Data serialization](https://docs.pydantic.dev/concepts/serialization/) - `.model_dump()` and `.model_dump_json()`' - label: '[JSON Schema](https://docs.pydantic.dev/concepts/json_schema/)' - label: '[Dataclasses](https://docs.pydantic.dev/concepts/dataclasses/)' - label: '[Model Config](https://docs.pydantic.dev/concepts/config/)' - label: '[Field Types](https://docs.pydantic.dev/api/types/) - adding or changing a particular data type' - label: '[Function validation decorator](https://docs.pydantic.dev/concepts/validation_decorator/)' - label: '[Generic Models](https://docs.pydantic.dev/concepts/models/#generic-models)' - label: '[Other Model behaviour](https://docs.pydantic.dev/concepts/models/) - `model_construct()`, pickling, private attributes, ORM mode' - label: '[Plugins](https://docs.pydantic.dev/) and 
integration with other tools - mypy, FastAPI, python-devtools, Hypothesis, VS Code, PyCharm, etc.' pydantic-2.10.6/.github/PULL_REQUEST_TEMPLATE.md000066400000000000000000000012711474456633400205750ustar00rootroot00000000000000 ## Change Summary ## Related issue number ## Checklist * [ ] The pull request title is a good summary of the changes - it will be used in the changelog * [ ] Unit tests for the changes exist * [ ] Tests pass on CI * [ ] Documentation reflects the changes where applicable * [ ] My PR is ready to review, **please add a comment including the phrase "please review" to assign reviewers** pydantic-2.10.6/.github/actions/000077500000000000000000000000001474456633400164335ustar00rootroot00000000000000pydantic-2.10.6/.github/actions/people/000077500000000000000000000000001474456633400177175ustar00rootroot00000000000000pydantic-2.10.6/.github/actions/people/action.yml000066400000000000000000000011171474456633400217170ustar00rootroot00000000000000inputs: token: description: 'User token for accessing the GitHub API. Can be passed in using {{ secrets.GITHUB_TOKEN }}' required: true runs: using: 'composite' steps: - uses: actions/checkout@v4 - name: set up python uses: actions/setup-python@v4 with: python-version: '3.11' - name: install deps run: pip install -U PyGithub pyyaml pydantic pydantic-settings shell: bash - name: update pydantic people run: python .github/actions/people/people.py shell: bash env: INPUT_TOKEN: ${{ inputs.token }} pydantic-2.10.6/.github/actions/people/people.py000066400000000000000000000413641474456633400215650ustar00rootroot00000000000000"""Use the github API to get lists of people who have contributed in various ways to Pydantic. This logic is inspired by that of @tiangolo's [FastAPI people script](https://github.com/tiangolo/fastapi/blob/master/.github/actions/people/app/main.py). 
""" import logging import subprocess import sys from collections import Counter from datetime import datetime, timedelta, timezone from pathlib import Path from typing import Any, Container, Dict, List, Set, Union import requests import yaml from github import Github from pydantic_settings import BaseSettings from pydantic import BaseModel, SecretStr github_graphql_url = 'https://api.github.com/graphql' discussions_query = """ query Q($after: String) { repository(name: "pydantic", owner: "samuelcolvin") { discussions(first: 100, after: $after) { edges { cursor node { number author { login avatarUrl url } title createdAt comments(first: 100) { nodes { createdAt author { login avatarUrl url } isAnswer replies(first: 10) { nodes { createdAt author { login avatarUrl url } } } } } } } } } } """ issues_query = """ query Q($after: String) { repository(name: "pydantic", owner: "samuelcolvin") { issues(first: 100, after: $after) { edges { cursor node { number author { login avatarUrl url } title createdAt state comments(first: 100) { nodes { createdAt author { login avatarUrl url } } } } } } } } """ prs_query = """ query Q($after: String) { repository(name: "pydantic", owner: "samuelcolvin") { pullRequests(first: 100, after: $after) { edges { cursor node { number labels(first: 100) { nodes { name } } author { login avatarUrl url } title createdAt state comments(first: 100) { nodes { createdAt author { login avatarUrl url } } } reviews(first:100) { nodes { author { login avatarUrl url } state } } } } } } } """ class Author(BaseModel): login: str avatarUrl: str url: str # Issues and Discussions class CommentsNode(BaseModel): createdAt: datetime author: Union[Author, None] = None class Replies(BaseModel): nodes: List[CommentsNode] class DiscussionsCommentsNode(CommentsNode): replies: Replies class Comments(BaseModel): nodes: List[CommentsNode] class DiscussionsComments(BaseModel): nodes: List[DiscussionsCommentsNode] class IssuesNode(BaseModel): number: int author: 
Union[Author, None] = None title: str createdAt: datetime state: str comments: Comments class DiscussionsNode(BaseModel): number: int author: Union[Author, None] = None title: str createdAt: datetime comments: DiscussionsComments class IssuesEdge(BaseModel): cursor: str node: IssuesNode class DiscussionsEdge(BaseModel): cursor: str node: DiscussionsNode class Issues(BaseModel): edges: List[IssuesEdge] class Discussions(BaseModel): edges: List[DiscussionsEdge] class IssuesRepository(BaseModel): issues: Issues class DiscussionsRepository(BaseModel): discussions: Discussions class IssuesResponseData(BaseModel): repository: IssuesRepository class DiscussionsResponseData(BaseModel): repository: DiscussionsRepository class IssuesResponse(BaseModel): data: IssuesResponseData class DiscussionsResponse(BaseModel): data: DiscussionsResponseData # PRs class LabelNode(BaseModel): name: str class Labels(BaseModel): nodes: List[LabelNode] class ReviewNode(BaseModel): author: Union[Author, None] = None state: str class Reviews(BaseModel): nodes: List[ReviewNode] class PullRequestNode(BaseModel): number: int labels: Labels author: Union[Author, None] = None title: str createdAt: datetime state: str comments: Comments reviews: Reviews class PullRequestEdge(BaseModel): cursor: str node: PullRequestNode class PullRequests(BaseModel): edges: List[PullRequestEdge] class PRsRepository(BaseModel): pullRequests: PullRequests class PRsResponseData(BaseModel): repository: PRsRepository class PRsResponse(BaseModel): data: PRsResponseData class Settings(BaseSettings): input_token: SecretStr github_repository: str = 'pydantic/pydantic' request_timeout: int = 30 def get_graphql_response( *, settings: Settings, query: str, after: Union[str, None] = None, ) -> Dict[str, Any]: headers = {'Authorization': f'token {settings.input_token.get_secret_value()}'} variables = {'after': after} response = requests.post( github_graphql_url, headers=headers, timeout=settings.request_timeout, json={'query': 
query, 'variables': variables, 'operationName': 'Q'}, ) if response.status_code != 200: logging.error(f'Response was not 200, after: {after}') logging.error(response.text) raise RuntimeError(response.text) data = response.json() if 'errors' in data: logging.error(f'Errors in response, after: {after}') logging.error(data['errors']) logging.error(response.text) raise RuntimeError(response.text) return data def get_graphql_issue_edges(*, settings: Settings, after: Union[str, None] = None): data = get_graphql_response(settings=settings, query=issues_query, after=after) graphql_response = IssuesResponse.model_validate(data) return graphql_response.data.repository.issues.edges def get_graphql_question_discussion_edges( *, settings: Settings, after: Union[str, None] = None, ): data = get_graphql_response( settings=settings, query=discussions_query, after=after, ) graphql_response = DiscussionsResponse.model_validate(data) return graphql_response.data.repository.discussions.edges def get_graphql_pr_edges(*, settings: Settings, after: Union[str, None] = None): data = get_graphql_response(settings=settings, query=prs_query, after=after) graphql_response = PRsResponse.model_validate(data) return graphql_response.data.repository.pullRequests.edges def get_issues_experts(settings: Settings): issue_nodes: List[IssuesNode] = [] issue_edges = get_graphql_issue_edges(settings=settings) while issue_edges: for edge in issue_edges: issue_nodes.append(edge.node) last_edge = issue_edges[-1] issue_edges = get_graphql_issue_edges(settings=settings, after=last_edge.cursor) commentors = Counter() last_month_commentors = Counter() authors: Dict[str, Author] = {} now = datetime.now(tz=timezone.utc) one_month_ago = now - timedelta(days=30) for issue in issue_nodes: issue_author_name = None if issue.author: authors[issue.author.login] = issue.author issue_author_name = issue.author.login issue_commentors = set() for comment in issue.comments.nodes: if comment.author: 
authors[comment.author.login] = comment.author if comment.author.login != issue_author_name: issue_commentors.add(comment.author.login) for author_name in issue_commentors: commentors[author_name] += 1 if issue.createdAt > one_month_ago: last_month_commentors[author_name] += 1 return commentors, last_month_commentors, authors def get_discussions_experts(settings: Settings): discussion_nodes: List[DiscussionsNode] = [] discussion_edges = get_graphql_question_discussion_edges(settings=settings) while discussion_edges: for discussion_edge in discussion_edges: discussion_nodes.append(discussion_edge.node) last_edge = discussion_edges[-1] discussion_edges = get_graphql_question_discussion_edges(settings=settings, after=last_edge.cursor) commentors = Counter() last_month_commentors = Counter() authors: Dict[str, Author] = {} now = datetime.now(tz=timezone.utc) one_month_ago = now - timedelta(days=30) for discussion in discussion_nodes: discussion_author_name = None if discussion.author: authors[discussion.author.login] = discussion.author discussion_author_name = discussion.author.login discussion_commentors = set() for comment in discussion.comments.nodes: if comment.author: authors[comment.author.login] = comment.author if comment.author.login != discussion_author_name: discussion_commentors.add(comment.author.login) for reply in comment.replies.nodes: if reply.author: authors[reply.author.login] = reply.author if reply.author.login != discussion_author_name: discussion_commentors.add(reply.author.login) for author_name in discussion_commentors: commentors[author_name] += 1 if discussion.createdAt > one_month_ago: last_month_commentors[author_name] += 1 return commentors, last_month_commentors, authors def get_experts(settings: Settings): # Migrated to only use GitHub Discussions # ( # issues_commentors, # issues_last_month_commentors, # issues_authors, # ) = get_issues_experts(settings=settings) ( discussions_commentors, discussions_last_month_commentors, 
discussions_authors, ) = get_discussions_experts(settings=settings) # commentors = issues_commentors + discussions_commentors commentors = discussions_commentors # last_month_commentors = ( # issues_last_month_commentors + discussions_last_month_commentors # ) last_month_commentors = discussions_last_month_commentors # authors = {**issues_authors, **discussions_authors} authors = {**discussions_authors} return commentors, last_month_commentors, authors def get_contributors(settings: Settings): pr_nodes: List[PullRequestNode] = [] pr_edges = get_graphql_pr_edges(settings=settings) while pr_edges: for edge in pr_edges: pr_nodes.append(edge.node) last_edge = pr_edges[-1] pr_edges = get_graphql_pr_edges(settings=settings, after=last_edge.cursor) contributors = Counter() commentors = Counter() reviewers = Counter() authors: Dict[str, Author] = {} for pr in pr_nodes: author_name = None if pr.author: authors[pr.author.login] = pr.author author_name = pr.author.login pr_commentors: Set[str] = set() pr_reviewers: Set[str] = set() for comment in pr.comments.nodes: if comment.author: authors[comment.author.login] = comment.author if comment.author.login == author_name: continue pr_commentors.add(comment.author.login) for author_name in pr_commentors: commentors[author_name] += 1 for review in pr.reviews.nodes: if review.author: authors[review.author.login] = review.author pr_reviewers.add(review.author.login) for reviewer in pr_reviewers: reviewers[reviewer] += 1 if pr.state == 'MERGED' and pr.author: contributors[pr.author.login] += 1 return contributors, commentors, reviewers, authors def get_top_users( *, counter: Counter, min_count: int, authors: Dict[str, Author], skip_users: Container[str], ): users = [] for commentor, count in counter.most_common(50): if commentor in skip_users: continue if count >= min_count: author = authors[commentor] users.append( { 'login': commentor, 'count': count, 'avatarUrl': author.avatarUrl, 'url': author.url, } ) return users if __name__ == 
'__main__': logging.basicConfig(level=logging.INFO) settings = Settings() logging.info(f'Using config: {settings.model_dump_json()}') g = Github(settings.input_token.get_secret_value()) repo = g.get_repo(settings.github_repository) question_commentors, question_last_month_commentors, question_authors = get_experts(settings=settings) contributors, pr_commentors, reviewers, pr_authors = get_contributors(settings=settings) authors = {**question_authors, **pr_authors} maintainers_logins = { 'samuelcolvin', 'adriangb', 'dmontagu', 'hramezani', 'Kludex', 'davidhewitt', 'sydney-runkle', 'alexmojaki', } bot_names = {'codecov', 'github-actions', 'pre-commit-ci', 'dependabot'} maintainers = [] for login in maintainers_logins: user = authors[login] maintainers.append( { 'login': login, 'answers': question_commentors[login], 'prs': contributors[login], 'avatarUrl': user.avatarUrl, 'url': user.url, } ) min_count_expert = 10 min_count_last_month = 3 min_count_contributor = 4 min_count_reviewer = 4 skip_users = maintainers_logins | bot_names experts = get_top_users( counter=question_commentors, min_count=min_count_expert, authors=authors, skip_users=skip_users, ) last_month_active = get_top_users( counter=question_last_month_commentors, min_count=min_count_last_month, authors=authors, skip_users=skip_users, ) top_contributors = get_top_users( counter=contributors, min_count=min_count_contributor, authors=authors, skip_users=skip_users, ) top_reviewers = get_top_users( counter=reviewers, min_count=min_count_reviewer, authors=authors, skip_users=skip_users, ) extra_experts = [ { 'login': 'ybressler', 'count': None, 'avatarUrl': 'https://avatars.githubusercontent.com/u/40807730?v=4', 'url': 'https://github.com/ybressler', }, ] expert_logins = {e['login'] for e in experts} experts.extend([expert for expert in extra_experts if expert['login'] not in expert_logins]) people = { 'maintainers': maintainers, 'experts': experts, 'last_month_active': last_month_active, 'top_contributors': 
top_contributors, 'top_reviewers': top_reviewers, } people_path = Path('./docs/plugins/people.yml') people_old_content = people_path.read_text(encoding='utf-8') new_people_content = yaml.dump(people, sort_keys=False, width=200, allow_unicode=True) if people_old_content == new_people_content: logging.info("The Pydantic People data hasn't changed, finishing.") sys.exit(0) people_path.write_text(new_people_content, encoding='utf-8') logging.info('Setting up GitHub Actions git user') subprocess.run(['git', 'config', 'user.name', 'github-actions'], check=True) subprocess.run(['git', 'config', 'user.email', 'github-actions@github.com'], check=True) branch_name = 'pydantic-people-update' logging.info(f'Creating a new branch {branch_name}') subprocess.run(['git', 'checkout', '-b', branch_name], check=True) logging.info('Adding updated file') subprocess.run(['git', 'add', str(people_path)], check=True) logging.info('Committing updated file') message = '👥 Update Pydantic People' result = subprocess.run(['git', 'commit', '-m', message], check=True) logging.info('Pushing branch') subprocess.run(['git', 'push', 'origin', branch_name], check=True) logging.info('Creating PR') pr = repo.create_pull(title=message, body=message, base='main', head=branch_name) logging.info(f'Created PR: {pr.number}') logging.info('Finished') pydantic-2.10.6/.github/dependabot.yml000066400000000000000000000001521474456633400176210ustar00rootroot00000000000000version: 2 updates: - package-ecosystem: github-actions directory: / schedule: interval: monthly pydantic-2.10.6/.github/labels/000077500000000000000000000000001474456633400162355ustar00rootroot00000000000000pydantic-2.10.6/.github/labels/default_pass.yml000066400000000000000000000001341474456633400214300ustar00rootroot00000000000000# add relnotes-fix by default relnotes-fix: - changed-files: - any-glob-to-any-file: '**' 
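The `get_top_users` helper defined in `people.py` above filters a `Counter` of commentors down to a ranked list, skipping maintainers and bots (`skip_users`) and dropping anyone below a minimum count. A minimal, self-contained sketch of that filtering logic — the `Author` dataclass is a simplified stand-in for the Pydantic model in the script, and the sample data is hypothetical:

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class Author:
    # Simplified stand-in for the `Author` BaseModel in people.py
    login: str
    avatarUrl: str
    url: str


def get_top_users(*, counter, min_count, authors, skip_users):
    """Return the top users from `counter`, excluding skipped logins and low counts."""
    users = []
    for commentor, count in counter.most_common(50):
        if commentor in skip_users:
            continue
        if count >= min_count:
            author = authors[commentor]
            users.append(
                {
                    'login': commentor,
                    'count': count,
                    'avatarUrl': author.avatarUrl,
                    'url': author.url,
                }
            )
    return users


# Hypothetical sample data: one bot, one frequent commentor, one occasional one
counter = Counter({'alice': 12, 'some-bot': 30, 'bob': 2})
authors = {
    name: Author(name, f'https://example.com/{name}.png', f'https://github.com/{name}')
    for name in counter
}
top = get_top_users(counter=counter, min_count=10, authors=authors, skip_users={'some-bot'})
# 'some-bot' is skipped and 'bob' falls below min_count, leaving only 'alice'
```

The same pattern is applied four times in the script's `__main__` block, with different counters and thresholds, to build the experts, last-month-active, top-contributors, and top-reviewers lists.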
pydantic-2.10.6/.github/labels/first_pass.yml000066400000000000000000000006521474456633400211400ustar00rootroot00000000000000relnotes-fix: - head-branch: ['^fix', 'fix'] relnotes-feature: - head-branch: ['^feature', 'feature'] documentation: - head-branch: ['^documentation', 'documentation', '^docs', 'docs'] relnotes-change: - head-branch: ['^change', 'change'] relnotes-performance: - head-branch: ['^performance', 'performance'] relnotes-packaging: - head-branch: ['^bump', 'bump', '^version', 'version', '^packaging', 'packaging'] pydantic-2.10.6/.github/release.yml000066400000000000000000000006361474456633400171430ustar00rootroot00000000000000changelog: exclude: labels: - relnotes-ignore - documentation categories: - title: Packaging labels: - relnotes-packaging - title: New Features labels: - relnotes-feature - title: Changes labels: - relnotes-change - title: Performance labels: - relnotes-performance - title: Fixes labels: - relnotes-fix pydantic-2.10.6/.github/workflows/000077500000000000000000000000001474456633400170305ustar00rootroot00000000000000pydantic-2.10.6/.github/workflows/ci.yml000066400000000000000000000321171474456633400201520ustar00rootroot00000000000000name: CI on: push: branches: - main tags: - '**' pull_request: {} env: COLUMNS: 150 UV_FROZEN: true jobs: lint: runs-on: ubuntu-latest name: Lint ${{ matrix.python-version }} strategy: fail-fast: false matrix: python-version: ['3.8', '3.9', '3.10', '3.11', '3.12', '3.13'] steps: - uses: actions/checkout@v4 - uses: astral-sh/setup-uv@v3 with: enable-cache: true - name: Install dependencies run: uv sync --python ${{ matrix.python-version }} --group linting --all-extras - uses: pre-commit/action@v3.0.1 with: extra_args: --all-files --verbose env: SKIP: no-commit-to-branch docs-build: runs-on: ubuntu-latest steps: - uses: actions/checkout@v4 - uses: astral-sh/setup-uv@v3 with: enable-cache: true - name: Install dependencies # Unlike the docs build, we don't use mkdocs_material-insiders # Because the 
secret for accessing the library is not accessible from forks, but we still want to run # this job on public CI runs. run: uv sync --python 3.12 --group docs - run: uv run python -c 'import docs.plugins.main' # Adding local symlinks gets nice source locations like # pydantic_core/core_schema.py # instead of # .venv/lib/python3.10/site-packages/pydantic_core/core_schema.py - name: prepare shortcuts for extra modules run: | ln -s .venv/lib/python*/site-packages/pydantic_core pydantic_core ln -s .venv/lib/python*/site-packages/pydantic_settings pydantic_settings ln -s .venv/lib/python*/site-packages/pydantic_extra_types pydantic_extra_types - run: uv run mkdocs build test-memray: name: Test memray runs-on: ubuntu-latest steps: - uses: actions/checkout@v4 - uses: astral-sh/setup-uv@v3 with: enable-cache: true - name: install deps run: uv sync --python 3.12 --group testing-extra - name: Run tests run: uv run pytest --ignore=tests/mypy/ --ignore=tests/test_docs.py --memray test: name: Test ${{ matrix.os }} / ${{ matrix.python-version }} strategy: fail-fast: false matrix: os: [ubuntu-latest, macos-13, macos-latest, windows-latest] python-version: ['3.8', '3.9', '3.10', '3.11', '3.12', '3.13'] include: # no pydantic-core binaries for pypy on windows, so tests take absolute ages # macos tests with pypy take ages (>10mins) since pypy is very slow # so we only test pypy on ubuntu - os: ubuntu-latest python-version: 'pypy3.9' - os: ubuntu-latest python-version: 'pypy3.10' exclude: # Python 3.8 and 3.9 are not available on macOS 14 - os: macos-13 python-version: '3.10' - os: macos-13 python-version: '3.11' - os: macos-13 python-version: '3.12' - os: macos-latest python-version: '3.13' - os: macos-latest python-version: '3.8' - os: macos-latest python-version: '3.9' env: OS: ${{ matrix.os }} DEPS: yes UV_PYTHON: ${{ matrix.python-version }} UV_PYTHON_PREFERENCE: only-managed runs-on: ${{ matrix.os }} steps: - uses: actions/checkout@v4 - uses: astral-sh/setup-uv@v3 with: 
enable-cache: true - name: Install dependencies run: uv sync --extra timezone - run: 'uv run python -c "import pydantic.version; print(pydantic.version.version_info())"' - run: mkdir coverage - name: Test without email-validator # speed up by skipping this step on pypy if: "!startsWith(matrix.python-version, 'pypy')" run: make test env: COVERAGE_FILE: coverage/.coverage.${{ runner.os }}-py${{ matrix.python-version }}-without-deps CONTEXT: ${{ runner.os }}-py${{ matrix.python-version }}-without-deps - name: Install extra dependencies run: uv sync --group testing-extra --all-extras - name: Test with all extra dependencies run: make test env: COVERAGE_FILE: coverage/.coverage.${{ runner.os }}-py${{ matrix.python-version }}-with-deps CONTEXT: ${{ runner.os }}-py${{ matrix.python-version }}-with-deps - name: Store coverage files uses: actions/upload-artifact@v4 with: name: coverage-${{ matrix.os }}-${{ matrix.python-version }} path: coverage include-hidden-files: true test-fastapi: # If some tests start failing due to out-of-date schemas/validation errors/etc., # update the `tests/test_fastapi.sh` script to exclude tests that have known-acceptable failures. 
    name: Test FastAPI
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: set up python
        uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - name: Run tests
        run: make test-fastapi

  test-plugin:
    name: Test Pydantic plugin
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v3
      - name: Install dependencies
        run: uv sync --python 3.12
      - name: Install example plugin
        run: uv pip install ./tests/plugin
      - run: uv run pytest tests/plugin
        env:
          TEST_PLUGIN: 1

  test-mypy:
    name: mypy ${{ matrix.mypy-version }} / ${{ matrix.python-version }}
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        # test the latest version on all supported Python versions and the rest on 3.12
        mypy-version: ['1.10.1', '1.11.2']
        python-version: ['3.12']
        include:
          - mypy-version: '1.12.0'
            python-version: '3.8'
          - mypy-version: '1.12.0'
            python-version: '3.9'
          - mypy-version: '1.12.0'
            python-version: '3.10'
          - mypy-version: '1.12.0'
            python-version: '3.11'
          - mypy-version: '1.12.0'
            python-version: '3.12'
          - mypy-version: '1.12.0'
            python-version: '3.13'
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v3
        with:
          enable-cache: true
      - name: Install dependencies
        run: uv sync --python ${{ matrix.python-version }} --group typechecking --all-extras
      - name: Install mypy
        if: steps.cache.outputs.cache-hit != 'true'
        run: uv pip install 'mypy==${{ matrix.mypy-version }}'
      - run: mkdir coverage
      - name: Run mypy tests
        run: uv run coverage run -m pytest tests/mypy --test-mypy
        env:
          COVERAGE_FILE: coverage/.coverage.linux-py${{ matrix.python-version }}-mypy${{ matrix.mypy-version }}
          CONTEXT: linux-py${{ matrix.python-version }}-mypy${{ matrix.mypy-version }}
      - name: Store coverage files
        uses: actions/upload-artifact@v4
        with:
          name: coverage-${{ matrix.python-version }}-mypy${{ matrix.mypy-version }}
          path: coverage
          include-hidden-files: true

  test-typechecking-integration:
    name: Typechecking integration tests
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v3
        with:
          enable-cache: true
      - name: Install dependencies
        run: uv sync --python 3.12 --group typechecking
      - name: Run typechecking integration tests (Pyright)
        run: make test-typechecking-pyright
      - name: Run typechecking integration tests (Mypy)
        run: make test-typechecking-mypy

  coverage-combine:
    needs: [test, test-mypy]
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - name: Get coverage files
        uses: actions/download-artifact@v4
        with:
          merge-multiple: true
          pattern: coverage-*
          path: coverage
      - run: pip install coverage[toml]
      - run: ls -la coverage
      - run: coverage combine coverage
      - run: coverage report
      - run: coverage html --show-contexts --title "pydantic coverage for ${{ github.sha }}"
      - name: Store coverage data
        uses: actions/upload-artifact@v4
        with:
          name: coverage-data
          path: .coverage
          include-hidden-files: true
      - name: Store coverage HTML
        uses: actions/upload-artifact@v4
        with:
          name: coverage-html
          path: htmlcov

  coverage-pr-comment:
    needs: coverage-combine
    runs-on: ubuntu-latest
    if: github.event_name == 'pull_request'
    permissions:
      pull-requests: write
      contents: write
    steps:
      - uses: actions/checkout@v4
      - name: Download coverage data
        uses: actions/download-artifact@v4
        with:
          name: coverage-data
      - name: Generate coverage comment
        id: coverage-comment
        uses: py-cov-action/python-coverage-comment-action@v3
        with:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
      - name: Store coverage comment
        uses: actions/upload-artifact@v4
        if: steps.coverage-comment.outputs.COMMENT_FILE_WRITTEN == 'true'
        with:
          name: python-coverage-comment-action
          path: python-coverage-comment-action.txt

  test-typing-extensions:
    name: Test typing-extensions (`main` branch) on Python ${{ matrix.python-version }}
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        python-version: ['3.8', '3.9', '3.10', '3.11', '3.12', '3.13']
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v3
        with:
          enable-cache: true
      - name: Install dependencies
        run: uv sync --python ${{ matrix.python-version }}
      - name: Install typing-extensions
        run: uv pip install 'typing-extensions @ git+https://github.com/python/typing_extensions.git'
      - name: Run tests
        run: make test

  # https://github.com/marketplace/actions/alls-green
  check:  # This job does nothing and is only used for the branch protection
    if: always()
    outputs:
      result: ${{ steps.all-green.outputs.result }}
    needs:
      - lint
      - docs-build
      - test
      - test-memray
      - test-mypy
      - test-fastapi
      - test-plugin
    runs-on: ubuntu-latest
    steps:
      - name: Decide whether the needed jobs succeeded or failed
        uses: re-actors/alls-green@release/v1
        id: all-green
        with:
          jobs: ${{ toJSON(needs) }}

  release:
    needs: [check]
    if: needs.check.outputs.result == 'success' && startsWith(github.ref, 'refs/tags/')
    runs-on: ubuntu-latest
    environment: release
    permissions:
      id-token: write
    outputs:
      pydantic-version: ${{ steps.check-tag.outputs.VERSION }}
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - name: Install 'build' library
        run: pip install -U build
      - name: Check version
        id: check-tag
        uses: samuelcolvin/check-python-version@v4.1
        with:
          version_file_path: pydantic/version.py
      - name: Build library
        run: python -m build
      - name: Upload package to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1

  send-tweet:
    name: Send tweet
    needs: [release]
    if: needs.release.result == 'success'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - name: Install dependencies
        run: pip install tweepy==4.14.0
      - name: Send tweet
        shell: python
        run: |
          import os
          import tweepy

          client = tweepy.Client(
              access_token=os.getenv("TWITTER_ACCESS_TOKEN"),
              access_token_secret=os.getenv("TWITTER_ACCESS_TOKEN_SECRET"),
              consumer_key=os.getenv("TWITTER_CONSUMER_KEY"),
              consumer_secret=os.getenv("TWITTER_CONSUMER_SECRET"),
          )
          version = os.getenv("VERSION").strip('"')
          if "b" in version:
              official_version = version[:version.index("b")]
              tweet = os.getenv("BETA_TWEET").format(version=version, official_version=official_version)
          else:
              tweet = os.getenv("TWEET").format(version=version)
          client.create_tweet(text=tweet)
        env:
          VERSION: ${{ needs.release.outputs.pydantic-version }}
          TWEET: |
            Pydantic version {version} is out! 🎉
            https://github.com/pydantic/pydantic/releases/tag/v{version}
          BETA_TWEET: |
            Pydantic beta version {version} is out! 🚀
            Please try v{version} in the next week before we release v{official_version},
            and let us know if you encounter any issues!
            https://github.com/pydantic/pydantic/releases/tag/v{version}
          TWITTER_CONSUMER_KEY: ${{ secrets.TWITTER_CONSUMER_KEY }}
          TWITTER_CONSUMER_SECRET: ${{ secrets.TWITTER_CONSUMER_SECRET }}
          TWITTER_ACCESS_TOKEN: ${{ secrets.TWITTER_ACCESS_TOKEN }}
          TWITTER_ACCESS_TOKEN_SECRET: ${{ secrets.TWITTER_ACCESS_TOKEN_SECRET }}

pydantic-2.10.6/.github/workflows/codspeed.yml

name: codspeed

on:
  push:
    branches:
      - main
  pull_request:
  # `workflow_dispatch` allows CodSpeed to trigger backtest
  # performance analysis in order to generate initial data.
  workflow_dispatch:

env:
  UV_FROZEN: true

jobs:
  codspeed-profiling:
    name: CodSpeed profiling
    runs-on: ubuntu-22.04
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v3
        with:
          enable-cache: true
      # Using this action is still necessary for CodSpeed to work:
      - uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - name: install deps
        run: uv sync --python 3.12 --group testing-extra --extra email --frozen
      - name: Run CodSpeed benchmarks
        uses: CodSpeedHQ/action@v3
        with:
          run: uv run pytest ./tests/benchmarks --codspeed

pydantic-2.10.6/.github/workflows/coverage.yml

name: Post coverage comment

on:
  workflow_run:
    workflows: ["CI"]
    types:
      - completed

jobs:
  post-coverage-comment:
    name: Push coverage comment
    runs-on: ubuntu-latest
    if: github.event.workflow_run.event == 'pull_request' && github.event.workflow_run.conclusion == 'success'
    permissions:
      pull-requests: write
      contents: write
      actions: read
    steps:
      # DO NOT run actions/checkout here, for security reasons
      # For details, refer to https://securitylab.github.com/research/github-actions-preventing-pwn-requests/
      - name: Post comment
        uses: py-cov-action/python-coverage-comment-action@v3
        with:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          GITHUB_PR_RUN_ID: ${{ github.event.workflow_run.id }}

pydantic-2.10.6/.github/workflows/dependencies-check.yml

name: Dependencies Check

on:
  schedule:
    - cron: '43 3 * * 6,3'
  workflow_dispatch: {}

env:
  UV_FROZEN: true

jobs:
  find_dependency_cases:
    runs-on: ubuntu-latest
    outputs:
      PYTHON_DEPENDENCY_CASES: ${{ steps.list-python-dependencies.outputs.PYTHON_DEPENDENCY_CASES }}
    steps:
      - uses: actions/checkout@v4
      - uses: samuelcolvin/list-python-dependencies@main
        id: list-python-dependencies
        with:
          mode: first-last

  test:
    name: Test py${{ matrix.python-version }} on ${{ matrix.PYTHON_DEPENDENCY_CASE }}
    needs:
      - find_dependency_cases
    strategy:
      fail-fast: true
      matrix:
        python-version: ['3.8', '3.11']
        PYTHON_DEPENDENCY_CASE: ${{ fromJSON(needs.find_dependency_cases.outputs.PYTHON_DEPENDENCY_CASES) }}
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      - run: pip install uv
      - run: uv sync --all-extras
      - run: uv pip install ${{ matrix.PYTHON_DEPENDENCY_CASE }}
      - run: uv pip freeze
      - run: make test

pydantic-2.10.6/.github/workflows/docs-update.yml

name: Publish Documentation

on:
  push:
    branches:
      - main
      - docs-update
    tags:
      - '**'

env:
  COLUMNS: 150
  UV_FROZEN: true

jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v3
        with:
          enable-cache: true
      - name: Install dependencies
        run: uv sync --python 3.12 --group linting --all-extras
      - uses: pre-commit/action@v3.0.1
        with:
          extra_args: --all-files --verbose
        env:
          SKIP: no-commit-to-branch

  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v3
        with:
          enable-cache: true
      - name: Install dependencies
        run: uv sync --python 3.12 --group testing-extra --all-extras
      - run: 'uv run python -c "import pydantic.version; print(pydantic.version.version_info())"'
      - run: make test

  publish:
    # Compare with the docs-build job in .github/workflows/ci.yml
    needs: [lint, test]
    runs-on: ubuntu-latest
    timeout-minutes: 30
    steps:
      - name: Checkout docs-site
        uses: actions/checkout@v4
        with:
          ref: docs-site
      - name: Checkout current branch
        uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v3
        with:
          enable-cache: true
      - run: uv sync --python 3.12 --group docs
      - run: uv pip install --default-index https://pydantic:${PPPR_TOKEN}@pppr.pydantic.dev/simple/ mkdocs-material
        env:
          PPPR_TOKEN: ${{ secrets.PPPR_TOKEN }}
      - run: uv run python -c 'import docs.plugins.main'
      # Adding local symlinks gets nice source locations like
      #   pydantic_core/core_schema.py
      # instead of
      #   .venv/lib/python3.10/site-packages/pydantic_core/core_schema.py
      - name: Prepare shortcuts for extra modules
        run: |
          ln -s .venv/lib/python*/site-packages/pydantic_core pydantic_core
          ln -s .venv/lib/python*/site-packages/pydantic_settings pydantic_settings
          ln -s .venv/lib/python*/site-packages/pydantic_extra_types pydantic_extra_types
      - name: Set git credentials
        run: |
          git config --global user.name "${{ github.actor }}"
          git config --global user.email "${{ github.actor }}@users.noreply.github.com"
      - run: uv run mike deploy -b docs-site dev --push
        if: "github.ref == 'refs/heads/main'"
      - if: "github.ref == 'refs/heads/docs-update' || startsWith(github.ref, 'refs/tags/')"
        id: check-version
        uses: samuelcolvin/check-python-version@v4.1
        with:
          version_file_path: 'pydantic/version.py'
          skip_env_check: true
      - run: uv run mike deploy -b docs-site ${{ steps.check-version.outputs.VERSION_MAJOR_MINOR }} latest --update-aliases --push
        if: "(github.ref == 'refs/heads/docs-update' || startsWith(github.ref, 'refs/tags/')) && !fromJSON(steps.check-version.outputs.IS_PRERELEASE)"
        env:
          PYDANTIC_VERSION: v${{ steps.check-version.outputs.VERSION }}

pydantic-2.10.6/.github/workflows/integration.yml

name: Pydantic Family Integration Tests

on:
  schedule:
    - cron: '21 3 * * 1,2,3,4,5'
  workflow_dispatch: {}

jobs:
  test-pydantic-settings:
    name: Test pydantic settings
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run tests
        run: make test-pydantic-settings

  test-pydantic-extra-types:
    name: Test pydantic extra types
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run tests
        run: make test-pydantic-extra-types

pydantic-2.10.6/.github/workflows/labeler.yml

name: Release notes

on:
  pull_request_target:
    types: [opened]

jobs:
  auto-labeler:
    name: auto-labeler
    permissions:
      contents: read
      pull-requests: write
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - id: label-PR-by-branch-name
        uses: actions/labeler@v5
        with:
          configuration-path: '.github/labels/first_pass.yml'
      - id: add-default-if-no-labels
        if: ${{ !contains(steps.label-PR-by-branch-name.outputs.all-labels, 'relnotes') && !contains(steps.label-PR-by-branch-name.outputs.all-labels, 'documentation') }}
        uses: actions/labeler@v5
        with:
          configuration-path: '.github/labels/default_pass.yml'

pydantic-2.10.6/.github/workflows/update-pydantic-people.yml

name: Pydantic people update

on:
  schedule:
    - cron: "0 12 1 * *"
  workflow_dispatch: {}

jobs:
  pydantic-people:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: ./.github/actions/people
        with:
          token: ${{ secrets.GITHUB_TOKEN }}

pydantic-2.10.6/.github/workflows/upload-previews.yml

name: Upload previews

on:
  workflow_run:
    workflows: [CI]
    types: [completed]

permissions:
  statuses: write

jobs:
  upload-previews:
    if: ${{ github.event.workflow_run.conclusion == 'success' }}
    runs-on: ubuntu-latest
    steps:
      - uses: actions/setup-python@v5
        with:
          python-version: '3.10'
      - run: pip install smokeshow
      - uses: dawidd6/action-download-artifact@v6
        with:
          workflow: ci.yml
          commit: ${{ github.event.workflow_run.head_sha }}
      - run: smokeshow upload coverage-html
        env:
          SMOKESHOW_GITHUB_STATUS_DESCRIPTION: Coverage {coverage-percentage}
          # 5 is set here while V2 is in development and coverage is far from complete
          SMOKESHOW_GITHUB_COVERAGE_THRESHOLD: 91
          SMOKESHOW_GITHUB_CONTEXT: coverage
          SMOKESHOW_GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          SMOKESHOW_GITHUB_PR_HEAD_SHA: ${{ github.event.workflow_run.head_sha }}
          SMOKESHOW_AUTH_KEY: ${{ secrets.SMOKESHOW_AUTH_KEY }}

pydantic-2.10.6/.gitignore

# Virtual environments
env/
env3*/
venv/
.venv/
__pypackages__/

# IDEs and editors
.idea/
.vscode/

# Package distribution and build files
*.egg-info/
dist/
/build/
_build/

# Python bytecode and cache files
*.py[cod]
.cache/
/.ghtopdep_cache/
.hypothesis
.mypy_cache/
.pytest_cache/
/.ruff_cache/

# Benchmark and test files
/benchmarks/*.json
/tests/benchmarks/*.json
/htmlcov/
/codecov.sh
/coverage.lcov
.coverage
/test*.py
run_mypy_on_file.py

# Documentation files
/docs/changelog.md
/docs/theme/mkdocs_run_deps.html
/site/
/site.zip

# Project-specific files
pydantic/*.c
pydantic/*.so
/fastapi/
.mypy-configs/

# Other files and folders
.python-version
.DS_Store
.auto-format
/sandbox/
/worktrees/

pydantic-2.10.6/.hyperlint/.vale.ini

StylesPath = styles
MinAlertLevel = suggestion
Vocab = hyperlint
SkippedScopes = script, style, pre, figure, code, code-block

[*]
BasedOnStyles = Vale, hyperlint

pydantic-2.10.6/.hyperlint/style_guide_test.md

# This ia a test file

it will flag errors like on pydantic. It won't flag on validators.

but it won't flag errors on SDK or SDKs or APIs anymore.

This is is an issue.
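The fixture above deliberately ends with a doubled word (`is is`) to exercise the Vale `repetition` rule defined in `repeatedWords.yml` in this same directory tree. As a rough illustration only — this is a hypothetical Python approximation, not Vale's actual implementation — the check amounts to:

```python
import re

# Flag any whitespace-separated token that is immediately repeated,
# roughly what the Vale 'repetition' rule with tokens ['[^\s]+'] does.
REPEATED = re.compile(r"\b(\S+)\s+\1\b", re.IGNORECASE)

def find_repeats(text: str) -> list:
    """Return the tokens that appear twice in a row."""
    return [m.group(1) for m in REPEATED.finditer(text)]

print(find_repeats("This is is an issue."))  # ['is']
```

The case-insensitive flag mirrors how such rules typically also catch "The the"; Vale's real matching is configured by the rule's `tokens` list rather than this exact regex.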
pydantic-2.10.6/.hyperlint/styles/000077500000000000000000000000001474456633400170525ustar00rootroot00000000000000pydantic-2.10.6/.hyperlint/styles/config/000077500000000000000000000000001474456633400203175ustar00rootroot00000000000000pydantic-2.10.6/.hyperlint/styles/config/vocabularies/000077500000000000000000000000001474456633400227765ustar00rootroot00000000000000pydantic-2.10.6/.hyperlint/styles/config/vocabularies/hyperlint/000077500000000000000000000000001474456633400250145ustar00rootroot00000000000000pydantic-2.10.6/.hyperlint/styles/config/vocabularies/hyperlint/accept.txt000066400000000000000000000001551474456633400270150ustar00rootroot00000000000000validator Pydantic validators namespace Hyperlint preprocess tokenization tokenizer tzdata API APIs SDKs SDK pydantic-2.10.6/.hyperlint/styles/hyperlint/000077500000000000000000000000001474456633400210705ustar00rootroot00000000000000pydantic-2.10.6/.hyperlint/styles/hyperlint/repeatedWords.yml000066400000000000000000000002011474456633400244140ustar00rootroot00000000000000extends: repetition message: "'%s' is repeated, did you mean to repeat this word?" 
level: error alpha: true tokens: - '[^\s]+' pydantic-2.10.6/.pre-commit-config.yaml000066400000000000000000000015471474456633400177230ustar00rootroot00000000000000repos: - repo: https://github.com/pre-commit/pre-commit-hooks rev: v5.0.0 hooks: - id: no-commit-to-branch # prevent direct commits to main branch - id: check-yaml args: ['--unsafe'] - id: check-toml - id: end-of-file-fixer - id: trailing-whitespace - repo: https://github.com/codespell-project/codespell rev: v2.3.0 hooks: - id: codespell additional_dependencies: - tomli exclude: '^uv\.lock$' - repo: local hooks: - id: lint name: Lint entry: make lint types: [python] language: system pass_filenames: false - id: usage_docs name: Usage docs links entry: uv run ./tests/check_usage_docs.py files: '^pydantic/' types: [python] language: system - id: typecheck name: Typecheck entry: uv run pyright pydantic types: [python] language: system pass_filenames: false pydantic-2.10.6/CITATION.cff000066400000000000000000000022771474456633400153350ustar00rootroot00000000000000cff-version: 1.2.0 title: Pydantic message: 'If you use this software, please cite it as below.' type: software authors: - family-names: Colvin given-names: Samuel - family-names: Jolibois given-names: Eric - family-names: Ramezani given-names: Hasan - family-names: Garcia Badaracco given-names: Adrian - family-names: Dorsey given-names: Terrence - family-names: Montague given-names: David - family-names: Matveenko given-names: Serge - family-names: Trylesinski given-names: Marcelo - family-names: Runkle given-names: Sydney - family-names: Hewitt given-names: David - family-names: Hall given-names: Alex - family-names: Plot given-names: Victorien repository-code: 'https://github.com/pydantic/pydantic' url: 'https://docs.pydantic.dev/latest/' abstract: >- Pydantic is the most widely used data validation library for Python. Fast and extensible, Pydantic plays nicely with your linters/IDE/brain. 
Define how data should be in pure, canonical Python 3.8+; validate it with Pydantic. keywords: - python - validation - parsing - json-schema - hints - typing license: MIT version: v2.10.6 date-released: 2025-01-23 pydantic-2.10.6/HISTORY.md000066400000000000000000007631131474456633400151310ustar00rootroot00000000000000## v2.10.6 (2025-01-23) [GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.10.6) ### What's Changed #### Fixes * Fix JSON Schema reference collection with `'examples'` keys by @Viicos in [#11325](https://github.com/pydantic/pydantic/pull/11325) * Fix url python serialization by @sydney-runkle in [#11331](https://github.com/pydantic/pydantic/pull/11331) ## v2.10.5 (2025-01-08) [GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.10.5) ### What's Changed #### Fixes * Remove custom MRO implementation of Pydantic models by @Viicos in [#11184](https://github.com/pydantic/pydantic/pull/11184) * Fix URL serialization for unions by @sydney-runkle in [#11233](https://github.com/pydantic/pydantic/pull/11233) ## v2.10.4 (2024-12-18) [GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.10.4) ### What's Changed #### Packaging * Bump `pydantic-core` to v2.27.2 by @davidhewitt in [#11138](https://github.com/pydantic/pydantic/pull/11138) #### Fixes * Fix for comparison of `AnyUrl` objects by @alexprabhat99 in [#11082](https://github.com/pydantic/pydantic/pull/11082) * Properly fetch PEP 695 type params for functions, do not fetch annotations from signature by @Viicos in [#11093](https://github.com/pydantic/pydantic/pull/11093) * Include JSON Schema input core schema in function schemas by @Viicos in [#11085](https://github.com/pydantic/pydantic/pull/11085) * Add `len` to `_BaseUrl` to avoid TypeError by @Kharianne in [#11111](https://github.com/pydantic/pydantic/pull/11111) * Make sure the type reference is removed from the seen references by @Viicos in [#11143](https://github.com/pydantic/pydantic/pull/11143) ### 
New Contributors * @alexprabhat99 made their first contribution in [#11082](https://github.com/pydantic/pydantic/pull/11082) * @Kharianne made their first contribution in [#11111](https://github.com/pydantic/pydantic/pull/11111) ## v2.10.3 (2024-12-03) [GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.10.3) ### What's Changed #### Fixes * Set fields when `defer_build` is set on Pydantic dataclasses by @Viicos in [#10984](https://github.com/pydantic/pydantic/pull/10984) * Do not resolve the JSON Schema reference for `dict` core schema keys by @Viicos in [#10989](https://github.com/pydantic/pydantic/pull/10989) * Use the globals of the function when evaluating the return type for `PlainSerializer` and `WrapSerializer` functions by @Viicos in [#11008](https://github.com/pydantic/pydantic/pull/11008) * Fix host required enforcement for urls to be compatible with v2.9 behavior by @sydney-runkle in [#11027](https://github.com/pydantic/pydantic/pull/11027) * Add a `default_factory_takes_validated_data` property to `FieldInfo` by @Viicos in [#11034](https://github.com/pydantic/pydantic/pull/11034) * Fix url json schema in `serialization` mode by @sydney-runkle in [#11035](https://github.com/pydantic/pydantic/pull/11035) ## v2.10.2 (2024-11-25) [GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.10.2) ### What's Changed #### Fixes * Only evaluate FieldInfo annotations if required during schema building by @Viicos in [#10769](https://github.com/pydantic/pydantic/pull/10769) * Do not evaluate annotations for private fields by @Viicos in [#10962](https://github.com/pydantic/pydantic/pull/10962) * Support serialization as any for `Secret` types and `Url` types by @sydney-runkle in [#10947](https://github.com/pydantic/pydantic/pull/10947) * Fix type hint of `Field.default` to be compatible with Python 3.8 and 3.9 by @Viicos in [#10972](https://github.com/pydantic/pydantic/pull/10972) * Add hashing support for URL types by @sydney-runkle in 
[#10975](https://github.com/pydantic/pydantic/pull/10975) * Hide `BaseModel.__replace__` definition from type checkers by @Viicos in [10979](https://github.com/pydantic/pydantic/pull/10979) ## v2.10.1 (2024-11-21) [GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.10.1) ### What's Changed #### Packaging * Bump `pydantic-core` version to `v2.27.1` by @sydney-runkle in [#10938](https://github.com/pydantic/pydantic/pull/10938) #### Fixes * Use the correct frame when instantiating a parametrized `TypeAdapter` by @Viicos in [#10893](https://github.com/pydantic/pydantic/pull/10893) * Relax check for validated data in `default_factory` utils by @sydney-runkle in [#10909](https://github.com/pydantic/pydantic/pull/10909) * Fix type checking issue with `model_fields` and `model_computed_fields` by @sydney-runkle in [#10911](https://github.com/pydantic/pydantic/pull/10911) * Use the parent configuration during schema generation for stdlib `dataclass`es by @sydney-runkle in [#10928](https://github.com/pydantic/pydantic/pull/10928) * Use the `globals` of the function when evaluating the return type of serializers and `computed_field`s by @Viicos in [#10929](https://github.com/pydantic/pydantic/pull/10929) * Fix URL constraint application by @sydney-runkle in [#10922](https://github.com/pydantic/pydantic/pull/10922) * Fix URL equality with different validation methods by @sydney-runkle in [#10934](https://github.com/pydantic/pydantic/pull/10934) * Fix JSON schema title when specified as `''` by @sydney-runkle in [#10936](https://github.com/pydantic/pydantic/pull/10936) * Fix `python` mode serialization for `complex` inference by @sydney-runkle in [pydantic-core#1549](https://github.com/pydantic/pydantic-core/pull/1549) ## v2.10.0 (2024-11-20) The code released in v2.10.0 is practically identical to that of v2.10.0b2. 
[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.10.0) See the [v2.10 release blog post](https://pydantic.dev/articles/pydantic-v2-10-release) for the highlights! ### What's Changed #### Packaging * Bump `pydantic-core` to `v2.27.0` by @sydney-runkle in [#10825](https://github.com/pydantic/pydantic/pull/10825) * Replaced pdm with uv by @frfahim in [#10727](https://github.com/pydantic/pydantic/pull/10727) #### New Features * Support `fractions.Fraction` by @sydney-runkle in [#10318](https://github.com/pydantic/pydantic/pull/10318) * Support `Hashable` for json validation by @sydney-runkle in [#10324](https://github.com/pydantic/pydantic/pull/10324) * Add a `SocketPath` type for `linux` systems by @theunkn0wn1 in [#10378](https://github.com/pydantic/pydantic/pull/10378) * Allow arbitrary refs in JSON schema `examples` by @sydney-runkle in [#10417](https://github.com/pydantic/pydantic/pull/10417) * Support `defer_build` for Pydantic dataclasses by @Viicos in [#10313](https://github.com/pydantic/pydantic/pull/10313) * Adding v1 / v2 incompatibility warning for nested v1 model by @sydney-runkle in [#10431](https://github.com/pydantic/pydantic/pull/10431) * Add support for unpacked `TypedDict` to type hint variadic keyword arguments with `@validate_call` by @Viicos in [#10416](https://github.com/pydantic/pydantic/pull/10416) * Support compiled patterns in `protected_namespaces` by @sydney-runkle in [#10522](https://github.com/pydantic/pydantic/pull/10522) * Add support for `propertyNames` in JSON schema by @FlorianSW in [#10478](https://github.com/pydantic/pydantic/pull/10478) * Adding `__replace__` protocol for Python 3.13+ support by @sydney-runkle in [#10596](https://github.com/pydantic/pydantic/pull/10596) * Expose public `sort` method for JSON schema generation by @sydney-runkle in [#10595](https://github.com/pydantic/pydantic/pull/10595) * Add runtime validation of `@validate_call` callable argument by @kc0506 in 
[#10627](https://github.com/pydantic/pydantic/pull/10627) * Add `experimental_allow_partial` support by @samuelcolvin in [#10748](https://github.com/pydantic/pydantic/pull/10748) * Support default factories taking validated data as an argument by @Viicos in [#10678](https://github.com/pydantic/pydantic/pull/10678) * Allow subclassing `ValidationError` and `PydanticCustomError` by @Youssefares in [pydantic/pydantic-core#1413](https://github.com/pydantic/pydantic-core/pull/1413) * Add `trailing-strings` support to `experimental_allow_partial` by @sydney-runkle in [#10825](https://github.com/pydantic/pydantic/pull/10825) * Add `rebuild()` method for `TypeAdapter` and simplify `defer_build` patterns by @sydney-runkle in [#10537](https://github.com/pydantic/pydantic/pull/10537) * Improve `TypeAdapter` instance repr by @sydney-runkle in [#10872](https://github.com/pydantic/pydantic/pull/10872) #### Changes * Don't allow customization of `SchemaGenerator` until interface is more stable by @sydney-runkle in [#10303](https://github.com/pydantic/pydantic/pull/10303) * Cleanly `defer_build` on `TypeAdapters`, removing experimental flag by @sydney-runkle in [#10329](https://github.com/pydantic/pydantic/pull/10329) * Fix `mro` of generic subclass by @kc0506 in [#10100](https://github.com/pydantic/pydantic/pull/10100) * Strip whitespaces on JSON Schema title generation by @sydney-runkle in [#10404](https://github.com/pydantic/pydantic/pull/10404) * Use `b64decode` and `b64encode` for `Base64Bytes` type by @sydney-runkle in [#10486](https://github.com/pydantic/pydantic/pull/10486) * Relax protected namespace config default by @sydney-runkle in [#10441](https://github.com/pydantic/pydantic/pull/10441) * Revalidate parametrized generics if instance's origin is subclass of OG class by @sydney-runkle in [#10666](https://github.com/pydantic/pydantic/pull/10666) * Warn if configuration is specified on the `@dataclass` decorator and with the `__pydantic_config__` attribute by 
@sydney-runkle in [#10406](https://github.com/pydantic/pydantic/pull/10406) * Recommend against using `Ellipsis` (...) with `Field` by @Viicos in [#10661](https://github.com/pydantic/pydantic/pull/10661) * Migrate to subclassing instead of annotated approach for pydantic url types by @sydney-runkle in [#10662](https://github.com/pydantic/pydantic/pull/10662) * Change JSON schema generation of `Literal`s and `Enums` by @Viicos in [#10692](https://github.com/pydantic/pydantic/pull/10692) * Simplify unions involving `Any` or `Never` when replacing type variables by @Viicos in [#10338](https://github.com/pydantic/pydantic/pull/10338) * Do not require padding when decoding `base64` bytes by @bschoenmaeckers in [pydantic/pydantic-core#1448](https://github.com/pydantic/pydantic-core/pull/1448) * Support dates all the way to 1BC by @changhc in [pydantic/speedate#77](https://github.com/pydantic/speedate/pull/77) #### Performance * Schema cleaning: skip unnecessary copies during schema walking by @Viicos in [#10286](https://github.com/pydantic/pydantic/pull/10286) * Refactor namespace logic for annotations evaluation by @Viicos in [#10530](https://github.com/pydantic/pydantic/pull/10530) * Improve email regexp on edge cases by @AlekseyLobanov in [#10601](https://github.com/pydantic/pydantic/pull/10601) * `CoreMetadata` refactor with an emphasis on documentation, schema build time performance, and reducing complexity by @sydney-runkle in [#10675](https://github.com/pydantic/pydantic/pull/10675) #### Fixes * Remove guarding check on `computed_field` with `field_serializer` by @nix010 in [#10390](https://github.com/pydantic/pydantic/pull/10390) * Fix `Predicate` issue in `v2.9.0` by @sydney-runkle in [#10321](https://github.com/pydantic/pydantic/pull/10321) * Fixing `annotated-types` bound by @sydney-runkle in [#10327](https://github.com/pydantic/pydantic/pull/10327) * Turn `tzdata` install requirement into optional `timezone` dependency by @jakob-keller in 
[#10331](https://github.com/pydantic/pydantic/pull/10331) * Use correct types namespace when building `namedtuple` core schemas by @Viicos in [#10337](https://github.com/pydantic/pydantic/pull/10337) * Fix evaluation of stringified annotations during namespace inspection by @Viicos in [#10347](https://github.com/pydantic/pydantic/pull/10347) * Fix `IncEx` type alias definition by @Viicos in [#10339](https://github.com/pydantic/pydantic/pull/10339) * Do not error when trying to evaluate annotations of private attributes by @Viicos in [#10358](https://github.com/pydantic/pydantic/pull/10358) * Fix nested type statement by @kc0506 in [#10369](https://github.com/pydantic/pydantic/pull/10369) * Improve typing of `ModelMetaclass.mro` by @Viicos in [#10372](https://github.com/pydantic/pydantic/pull/10372) * Fix class access of deprecated `computed_field`s by @Viicos in [#10391](https://github.com/pydantic/pydantic/pull/10391) * Make sure `inspect.iscoroutinefunction` works on coroutines decorated with `@validate_call` by @MovisLi in [#10374](https://github.com/pydantic/pydantic/pull/10374) * Fix `NameError` when using `validate_call` with PEP 695 on a class by @kc0506 in [#10380](https://github.com/pydantic/pydantic/pull/10380) * Fix `ZoneInfo` with various invalid types by @sydney-runkle in [#10408](https://github.com/pydantic/pydantic/pull/10408) * Fix `PydanticUserError` on empty `model_config` with annotations by @cdwilson in [#10412](https://github.com/pydantic/pydantic/pull/10412) * Fix variance issue in `_IncEx` type alias, only allow `True` by @Viicos in [#10414](https://github.com/pydantic/pydantic/pull/10414) * Fix serialization schema generation when using `PlainValidator` by @Viicos in [#10427](https://github.com/pydantic/pydantic/pull/10427) * Fix schema generation error when serialization schema holds references by @Viicos in [#10444](https://github.com/pydantic/pydantic/pull/10444) * Inline references if possible when generating schema for 
`json_schema_input_type` by @Viicos in [#10439](https://github.com/pydantic/pydantic/pull/10439)
* Fix recursive arguments in `Representation` by @Viicos in [#10480](https://github.com/pydantic/pydantic/pull/10480)
* Fix representation for builtin function types by @kschwab in [#10479](https://github.com/pydantic/pydantic/pull/10479)
* Add python validators for decimal constraints (`max_digits` and `decimal_places`) by @sydney-runkle in [#10506](https://github.com/pydantic/pydantic/pull/10506)
* Only fetch `__pydantic_core_schema__` from the current class during schema generation by @Viicos in [#10518](https://github.com/pydantic/pydantic/pull/10518)
* Fix `stacklevel` on deprecation warnings for `BaseModel` by @sydney-runkle in [#10520](https://github.com/pydantic/pydantic/pull/10520)
* Fix warning `stacklevel` in `BaseModel.__init__` by @Viicos in [#10526](https://github.com/pydantic/pydantic/pull/10526)
* Improve error handling for in-evaluable refs for discriminator application by @sydney-runkle in [#10440](https://github.com/pydantic/pydantic/pull/10440)
* Change the signature of `ConfigWrapper.core_config` to take the title directly by @Viicos in [#10562](https://github.com/pydantic/pydantic/pull/10562)
* Do not use the previous config from the stack for dataclasses without config by @Viicos in [#10576](https://github.com/pydantic/pydantic/pull/10576)
* Fix serialization for IP types with `mode='python'` by @sydney-runkle in [#10594](https://github.com/pydantic/pydantic/pull/10594)
* Support constraint application for `Base64Etc` types by @sydney-runkle in [#10584](https://github.com/pydantic/pydantic/pull/10584)
* Fix `validate_call` ignoring `Field` in `Annotated` by @kc0506 in [#10610](https://github.com/pydantic/pydantic/pull/10610)
* Raise an error when `Self` is invalid by @kc0506 in [#10609](https://github.com/pydantic/pydantic/pull/10609)
* Using `core_schema.InvalidSchema` instead of metadata injection + checks by @sydney-runkle in [#10523](https://github.com/pydantic/pydantic/pull/10523)
* Tweak type alias logic by @kc0506 in [#10643](https://github.com/pydantic/pydantic/pull/10643)
* Support usage of `type` with `typing.Self` and type aliases by @kc0506 in [#10621](https://github.com/pydantic/pydantic/pull/10621)
* Use overloads for `Field` and `PrivateAttr` functions by @Viicos in [#10651](https://github.com/pydantic/pydantic/pull/10651)
* Clean up the `mypy` plugin implementation by @Viicos in [#10669](https://github.com/pydantic/pydantic/pull/10669)
* Properly check for `typing_extensions` variant of `TypeAliasType` by @Daraan in [#10713](https://github.com/pydantic/pydantic/pull/10713)
* Allow any mapping in `BaseModel.model_copy()` by @Viicos in [#10751](https://github.com/pydantic/pydantic/pull/10751)
* Fix `isinstance` behavior for urls by @sydney-runkle in [#10766](https://github.com/pydantic/pydantic/pull/10766)
* Ensure `cached_property` can be set on Pydantic models by @Viicos in [#10774](https://github.com/pydantic/pydantic/pull/10774)
* Fix equality checks for primitives in literals by @sydney-runkle in [pydantic/pydantic-core#1459](https://github.com/pydantic/pydantic-core/pull/1459)
* Properly enforce `host_required` for URLs by @Viicos in [pydantic/pydantic-core#1488](https://github.com/pydantic/pydantic-core/pull/1488)
* Fix when `coerce_numbers_to_str` enabled and string has invalid Unicode character by @andrey-berenda in [pydantic/pydantic-core#1515](https://github.com/pydantic/pydantic-core/pull/1515)
* Fix serializing `complex` values in `Enum`s by @changhc in [pydantic/pydantic-core#1524](https://github.com/pydantic/pydantic-core/pull/1524)
* Refactor `_typing_extra` module by @Viicos in [#10725](https://github.com/pydantic/pydantic/pull/10725)
* Support intuitive equality for urls by @sydney-runkle in [#10798](https://github.com/pydantic/pydantic/pull/10798)
* Add `bytearray` to `TypeAdapter.validate_json` signature by @samuelcolvin in [#10802](https://github.com/pydantic/pydantic/pull/10802)
* Ensure class access of method descriptors is performed when used as a default with `Field` by @Viicos in [#10816](https://github.com/pydantic/pydantic/pull/10816)
* Fix circular import with `validate_call` by @sydney-runkle in [#10807](https://github.com/pydantic/pydantic/pull/10807)
* Fix error when using type aliases referencing other type aliases by @Viicos in [#10809](https://github.com/pydantic/pydantic/pull/10809)
* Fix `IncEx` type alias to be compatible with mypy by @Viicos in [#10813](https://github.com/pydantic/pydantic/pull/10813)
* Make `__signature__` a lazy property, do not deepcopy defaults by @Viicos in [#10818](https://github.com/pydantic/pydantic/pull/10818)
* Make `__signature__` lazy for dataclasses, too by @sydney-runkle in [#10832](https://github.com/pydantic/pydantic/pull/10832)
* Subclass all single host url classes from `AnyUrl` to preserve behavior from v2.9 by @sydney-runkle in [#10856](https://github.com/pydantic/pydantic/pull/10856)

### New Contributors

* @jakob-keller made their first contribution in [#10331](https://github.com/pydantic/pydantic/pull/10331)
* @MovisLi made their first contribution in [#10374](https://github.com/pydantic/pydantic/pull/10374)
* @joaopalmeiro made their first contribution in [#10405](https://github.com/pydantic/pydantic/pull/10405)
* @theunkn0wn1 made their first contribution in [#10378](https://github.com/pydantic/pydantic/pull/10378)
* @cdwilson made their first contribution in [#10412](https://github.com/pydantic/pydantic/pull/10412)
* @dlax made their first contribution in [#10421](https://github.com/pydantic/pydantic/pull/10421)
* @kschwab made their first contribution in [#10479](https://github.com/pydantic/pydantic/pull/10479)
* @santibreo made their first contribution in [#10453](https://github.com/pydantic/pydantic/pull/10453)
* @FlorianSW made their first contribution in [#10478](https://github.com/pydantic/pydantic/pull/10478)
* @tkasuz made their first contribution in [#10555](https://github.com/pydantic/pydantic/pull/10555)
* @AlekseyLobanov made their first contribution in [#10601](https://github.com/pydantic/pydantic/pull/10601)
* @NiclasvanEyk made their first contribution in [#10667](https://github.com/pydantic/pydantic/pull/10667)
* @mschoettle made their first contribution in [#10677](https://github.com/pydantic/pydantic/pull/10677)
* @Daraan made their first contribution in [#10713](https://github.com/pydantic/pydantic/pull/10713)
* @k4nar made their first contribution in [#10736](https://github.com/pydantic/pydantic/pull/10736)
* @UriyaHarpeness made their first contribution in [#10740](https://github.com/pydantic/pydantic/pull/10740)
* @frfahim made their first contribution in [#10727](https://github.com/pydantic/pydantic/pull/10727)

## v2.10.0b2 (2024-11-13)

Pre-release, see [the GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.10.0b2) for details.

## v2.10.0b1 (2024-11-06)

Pre-release, see [the GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.10.0b1) for details.
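One of the v2.10 fixes above adds `bytearray` to the `TypeAdapter.validate_json` signature ([#10802](https://github.com/pydantic/pydantic/pull/10802)). A minimal sketch of what that enables, assuming pydantic v2.10+ is installed:

```python
from pydantic import TypeAdapter

# TypeAdapter.validate_json accepts str, bytes, and (as of v2.10) bytearray
ta = TypeAdapter(list[int])
result = ta.validate_json(bytearray(b'[1, 2, 3]'))
assert result == [1, 2, 3]
```

Before this change, passing a `bytearray` (e.g. a reused network buffer) required copying it into `bytes` first.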
## v2.9.2 (2024-09-17)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.9.2)

### What's Changed

#### Fixes

* Do not error when trying to evaluate annotations of private attributes by @Viicos in [#10358](https://github.com/pydantic/pydantic/pull/10358)
* Adding notes on designing sound `Callable` discriminators by @sydney-runkle in [#10400](https://github.com/pydantic/pydantic/pull/10400)
* Fix serialization schema generation when using `PlainValidator` by @Viicos in [#10427](https://github.com/pydantic/pydantic/pull/10427)
* Fix `Union` serialization warnings by @sydney-runkle in [pydantic/pydantic-core#1449](https://github.com/pydantic/pydantic-core/pull/1449)
* Fix variance issue in `_IncEx` type alias, only allow `True` by @Viicos in [#10414](https://github.com/pydantic/pydantic/pull/10414)
* Fix `ZoneInfo` validation with various invalid types by @sydney-runkle in [#10408](https://github.com/pydantic/pydantic/pull/10408)

## v2.9.1 (2024-09-09)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.9.1)

### What's Changed

#### Fixes

* Fix Predicate issue in v2.9.0 by @sydney-runkle in [#10321](https://github.com/pydantic/pydantic/pull/10321)
* Fixing `annotated-types` bound to `>=0.6.0` by @sydney-runkle in [#10327](https://github.com/pydantic/pydantic/pull/10327)
* Turn `tzdata` install requirement into optional `timezone` dependency by @jakob-keller in [#10331](https://github.com/pydantic/pydantic/pull/10331)
* Fix `IncEx` type alias definition by @Viicos in [#10339](https://github.com/pydantic/pydantic/pull/10339)
* Use correct types namespace when building namedtuple core schemas by @Viicos in [#10337](https://github.com/pydantic/pydantic/pull/10337)
* Fix evaluation of stringified annotations during namespace inspection by @Viicos in [#10347](https://github.com/pydantic/pydantic/pull/10347)
* Fix tagged union serialization with alias generators by @sydney-runkle in [pydantic/pydantic-core#1442](https://github.com/pydantic/pydantic-core/pull/1442)

## v2.9.0 (2024-09-05)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.9.0)

The code released in v2.9.0 is practically identical to that of v2.9.0b2.

### What's Changed

#### Packaging

* Bump `ruff` to `v0.5.0` and `pyright` to `v1.1.369` by @sydney-runkle in [#9801](https://github.com/pydantic/pydantic/pull/9801)
* Bump `pydantic-extra-types` to `v2.9.0` by @sydney-runkle in [#9832](https://github.com/pydantic/pydantic/pull/9832)
* Support compatibility with `pdm v2.18.1` by @Viicos in [#10138](https://github.com/pydantic/pydantic/pull/10138)
* Bump `v1` version stub to `v1.10.18` by @sydney-runkle in [#10214](https://github.com/pydantic/pydantic/pull/10214)
* Bump `pydantic-core` to `v2.23.2` by @sydney-runkle in [#10311](https://github.com/pydantic/pydantic/pull/10311)

#### New Features

* Add support for `ZoneInfo` by @Youssefares in [#9896](https://github.com/pydantic/pydantic/pull/9896)
* Add `Config.val_json_bytes` by @josh-newman in [#9770](https://github.com/pydantic/pydantic/pull/9770)
* Add DSN for Snowflake by @aditkumar72 in [#10128](https://github.com/pydantic/pydantic/pull/10128)
* Support `complex` number by @changhc in [#9654](https://github.com/pydantic/pydantic/pull/9654)
* Add support for `annotated_types.Not` by @aditkumar72 in [#10210](https://github.com/pydantic/pydantic/pull/10210)
* Allow `WithJsonSchema` to inject `$ref`s w/ `http` or `https` links by @dAIsySHEng1 in [#9863](https://github.com/pydantic/pydantic/pull/9863)
* Allow validators to customize validation JSON schema by @Viicos in [#10094](https://github.com/pydantic/pydantic/pull/10094)
* Support parametrized `PathLike` types by @nix010 in [#9764](https://github.com/pydantic/pydantic/pull/9764)
* Add tagged union serializer that attempts to use `str` or `callable` discriminators to select the correct serializer by @sydney-runkle in [pydantic/pydantic-core#1397](https://github.com/pydantic/pydantic-core/pull/1397)

#### Changes

* Breaking Change: Merge `dict` type `json_schema_extra` by @sydney-runkle in [#9792](https://github.com/pydantic/pydantic/pull/9792)
  * For more info (how to replicate old behavior) on this change, see [here](https://docs.pydantic.dev/dev/concepts/json_schema/#merging-json_schema_extra)
* Refactor annotation injection for known (often generic) types by @sydney-runkle in [#9979](https://github.com/pydantic/pydantic/pull/9979)
* Move annotation compatibility errors to validation phase by @sydney-runkle in [#9999](https://github.com/pydantic/pydantic/pull/9999)
* Improve runtime errors for string constraints like `pattern` for incompatible types by @sydney-runkle in [#10158](https://github.com/pydantic/pydantic/pull/10158)
* Remove `'allOf'` JSON schema workarounds by @dpeachey in [#10029](https://github.com/pydantic/pydantic/pull/10029)
* Remove `typed_dict_cls` data from `CoreMetadata` by @sydney-runkle in [#10180](https://github.com/pydantic/pydantic/pull/10180)
* Deprecate passing a dict to the `Examples` class by @Viicos in [#10181](https://github.com/pydantic/pydantic/pull/10181)
* Remove `initial_metadata` from internal metadata construct by @sydney-runkle in [#10194](https://github.com/pydantic/pydantic/pull/10194)
* Use `re.Pattern.search` instead of `re.Pattern.match` for consistency with `rust` behavior by @tinez in [pydantic/pydantic-core#1368](https://github.com/pydantic/pydantic-core/pull/1368)
* Show value of wrongly typed data in `pydantic-core` serialization warning by @BoxyUwU in [pydantic/pydantic-core#1377](https://github.com/pydantic/pydantic-core/pull/1377)
* Breaking Change: in `pydantic-core`, change `metadata` type hint in core schemas from `Any` -> `Dict[str, Any] | None` by @sydney-runkle in [pydantic/pydantic-core#1411](https://github.com/pydantic/pydantic-core/pull/1411)
* Raise helpful warning when `self` isn't returned from model validator by
@sydney-runkle in [#10255](https://github.com/pydantic/pydantic/pull/10255)

#### Performance

* Initial start at improving import times for modules, using caching primarily by @sydney-runkle in [#10009](https://github.com/pydantic/pydantic/pull/10009)
* Using cached internal import for `BaseModel` by @sydney-runkle in [#10013](https://github.com/pydantic/pydantic/pull/10013)
* Simplify internal generics logic - remove generator overhead by @sydney-runkle in [#10059](https://github.com/pydantic/pydantic/pull/10059)
* Remove default module globals from types namespace by @sydney-runkle in [#10123](https://github.com/pydantic/pydantic/pull/10123)
* Performance boost: skip caching parent namespaces in most cases by @sydney-runkle in [#10113](https://github.com/pydantic/pydantic/pull/10113)
* Update ns stack with already copied ns by @sydney-runkle in [#10267](https://github.com/pydantic/pydantic/pull/10267)

##### Minor Internal Improvements

* ⚡️ Speed up `multiple_of_validator()` by 31% in `pydantic/_internal/_validators.py` by @misrasaurabh1 in [#9839](https://github.com/pydantic/pydantic/pull/9839)
* ⚡️ Speed up `ModelPrivateAttr.__set_name__()` by 18% in `pydantic/fields.py` by @misrasaurabh1 in [#9841](https://github.com/pydantic/pydantic/pull/9841)
* ⚡️ Speed up `dataclass()` by 7% in `pydantic/dataclasses.py` by @misrasaurabh1 in [#9843](https://github.com/pydantic/pydantic/pull/9843)
* ⚡️ Speed up function `_field_name_for_signature` by 37% in `pydantic/_internal/_signature.py` by @misrasaurabh1 in [#9951](https://github.com/pydantic/pydantic/pull/9951)
* ⚡️ Speed up method `GenerateSchema._unpack_refs_defs` by 26% in `pydantic/_internal/_generate_schema.py` by @misrasaurabh1 in [#9949](https://github.com/pydantic/pydantic/pull/9949)
* ⚡️ Speed up function `apply_each_item_validators` by 100% in `pydantic/_internal/_generate_schema.py` by @misrasaurabh1 in [#9950](https://github.com/pydantic/pydantic/pull/9950)
* ⚡️ Speed up method `ConfigWrapper.core_config` by 28% in `pydantic/_internal/_config.py` by @misrasaurabh1 in [#9953](https://github.com/pydantic/pydantic/pull/9953)

#### Fixes

* Respect `use_enum_values` on `Literal` types by @kwint in [#9787](https://github.com/pydantic/pydantic/pull/9787)
* Prevent type error for exotic `BaseModel/RootModel` inheritance by @dmontagu in [#9913](https://github.com/pydantic/pydantic/pull/9913)
* Fix typing issue with `field_validator`-decorated methods by @dmontagu in [#9914](https://github.com/pydantic/pydantic/pull/9914)
* Replace `str` type annotation with `Any` in validator factories in documentation on validators by @maximilianfellhuber in [#9885](https://github.com/pydantic/pydantic/pull/9885)
* Fix `ComputedFieldInfo.wrapped_property` pointer when a property setter is assigned by @tlambert03 in [#9892](https://github.com/pydantic/pydantic/pull/9892)
* Fix recursive typing of `main.IncEx` by @tlambert03 in [#9924](https://github.com/pydantic/pydantic/pull/9924)
* Allow usage of `type[Annotated[...]]` by @Viicos in [#9932](https://github.com/pydantic/pydantic/pull/9932)
* `mypy` plugin: handle frozen fields on a per-field basis by @dmontagu in [#9935](https://github.com/pydantic/pydantic/pull/9935)
* Fix typo in `invalid-annotated-type` error code by @sydney-runkle in [#9948](https://github.com/pydantic/pydantic/pull/9948)
* Simplify schema generation for `uuid`, `url`, and `ip` types by @sydney-runkle in [#9975](https://github.com/pydantic/pydantic/pull/9975)
* Move `date` schemas to `_generate_schema.py` by @sydney-runkle in [#9976](https://github.com/pydantic/pydantic/pull/9976)
* Move `decimal.Decimal` validation to `_generate_schema.py` by @sydney-runkle in [#9977](https://github.com/pydantic/pydantic/pull/9977)
* Simplify IP address schema in `_std_types_schema.py` by @sydney-runkle in [#9959](https://github.com/pydantic/pydantic/pull/9959)
* Fix type annotations for some potentially generic `GenerateSchema.match_type` options by @sydney-runkle in [#9961](https://github.com/pydantic/pydantic/pull/9961)
* Add class name to "has conflict" warnings by @msabramo in [#9964](https://github.com/pydantic/pydantic/pull/9964)
* Fix `dataclass` ignoring `default_factory` passed in `Annotated` by @kc0506 in [#9971](https://github.com/pydantic/pydantic/pull/9971)
* Fix `Sequence` ignoring `discriminator` by @kc0506 in [#9980](https://github.com/pydantic/pydantic/pull/9980)
* Fix typing for `IPvAnyAddress` and `IPvAnyInterface` by @haoyun in [#9990](https://github.com/pydantic/pydantic/pull/9990)
* Fix false positives on v1 models in `mypy` plugin for `from_orm` check requiring `from_attributes=True` config by @radekwlsk in [#9938](https://github.com/pydantic/pydantic/pull/9938)
* Apply `strict=True` to `__init__` in `mypy` plugin by @kc0506 in [#9998](https://github.com/pydantic/pydantic/pull/9998)
* Refactor application of `deque` annotations by @sydney-runkle in [#10018](https://github.com/pydantic/pydantic/pull/10018)
* Raise a better user error when failing to evaluate a forward reference by @Viicos in [#10030](https://github.com/pydantic/pydantic/pull/10030)
* Fix evaluation of `__pydantic_extra__` annotation in specific circumstances by @Viicos in [#10070](https://github.com/pydantic/pydantic/pull/10070)
* Fix `frozen` enforcement for `dataclasses` by @sydney-runkle in [#10066](https://github.com/pydantic/pydantic/pull/10066)
* Remove logic to handle unused `__get_pydantic_core_schema__` signature by @Viicos in [#10075](https://github.com/pydantic/pydantic/pull/10075)
* Use `is_annotated` consistently by @Viicos in [#10095](https://github.com/pydantic/pydantic/pull/10095)
* Fix `PydanticDeprecatedSince26` typo by @kc0506 in [#10101](https://github.com/pydantic/pydantic/pull/10101)
* Improve `pyright` tests, refactor model decorators signatures by @Viicos in [#10092](https://github.com/pydantic/pydantic/pull/10092)
* Fix `ip` serialization logic by @sydney-runkle in
[#10112](https://github.com/pydantic/pydantic/pull/10112)
* Warn when frozen defined twice for `dataclasses` by @mochi22 in [#10082](https://github.com/pydantic/pydantic/pull/10082)
* Do not compute JSON Schema default when plain serializers are used with `when_used` set to `'json-unless-none'` and the default value is `None` by @Viicos in [#10121](https://github.com/pydantic/pydantic/pull/10121)
* Fix `ImportString` special cases by @sydney-runkle in [#10137](https://github.com/pydantic/pydantic/pull/10137)
* Blacklist default globals to support exotic user code with `__` prefixed annotations by @sydney-runkle in [#10136](https://github.com/pydantic/pydantic/pull/10136)
* Handle `nullable` schemas with `serialization` schema available during JSON Schema generation by @Viicos in [#10132](https://github.com/pydantic/pydantic/pull/10132)
* Reorganize `BaseModel` annotations by @kc0506 in [#10110](https://github.com/pydantic/pydantic/pull/10110)
* Fix core schema simplification when serialization schemas are involved in specific scenarios by @Viicos in [#10155](https://github.com/pydantic/pydantic/pull/10155)
* Add support for stringified annotations when using `PrivateAttr` with `Annotated` by @Viicos in [#10157](https://github.com/pydantic/pydantic/pull/10157)
* Fix JSON Schema `number` type for literal and enum schemas by @Viicos in [#10172](https://github.com/pydantic/pydantic/pull/10172)
* Fix JSON Schema generation of fields with plain validators in serialization mode by @Viicos in [#10167](https://github.com/pydantic/pydantic/pull/10167)
* Fix invalid JSON Schemas being generated for functions in certain scenarios by @Viicos in [#10188](https://github.com/pydantic/pydantic/pull/10188)
* Make sure generated JSON Schemas are valid in tests by @Viicos in [#10182](https://github.com/pydantic/pydantic/pull/10182)
* Fix key error with custom serializer by @sydney-runkle in [#10200](https://github.com/pydantic/pydantic/pull/10200)
* Add 'wss' for allowed schemes in `NatsDsn` by @swelborn in [#10224](https://github.com/pydantic/pydantic/pull/10224)
* Fix `Mapping` and `MutableMapping` annotations to use mapping schema instead of dict schema by @sydney-runkle in [#10020](https://github.com/pydantic/pydantic/pull/10020)
* Fix JSON Schema generation for constrained dates by @Viicos in [#10185](https://github.com/pydantic/pydantic/pull/10185)
* Fix discriminated union bug regression when using enums by @kfreezen in [pydantic/pydantic-core#1286](https://github.com/pydantic/pydantic-core/pull/1286)
* Fix `field_serializer` with computed field when using `*` by @nix010 in [pydantic/pydantic-core#1349](https://github.com/pydantic/pydantic-core/pull/1349)
* Try each option in `Union` serializer before inference by @sydney-runkle in [pydantic/pydantic-core#1398](https://github.com/pydantic/pydantic-core/pull/1398)
* Fix `float` serialization behavior in `strict` mode by @sydney-runkle in [pydantic/pydantic-core#1400](https://github.com/pydantic/pydantic-core/pull/1400)
* Introduce `exactness` into Decimal validation logic to improve union validation behavior by @sydney-runkle in [pydantic/pydantic-core#1405](https://github.com/pydantic/pydantic-core/pull/1405)
* Fix new warnings assertions to use `pytest.warns()` by @mgorny in [#10241](https://github.com/pydantic/pydantic/pull/10241)
* Fix a crash when cleaning the namespace in `ModelMetaclass` by @Viicos in [#10242](https://github.com/pydantic/pydantic/pull/10242)
* Fix parent namespace issue with model rebuilds by @sydney-runkle in [#10257](https://github.com/pydantic/pydantic/pull/10257)
* Remove defaults filter for namespace by @sydney-runkle in [#10261](https://github.com/pydantic/pydantic/pull/10261)
* Use identity instead of equality after validating model in `__init__` by @Viicos in [#10264](https://github.com/pydantic/pydantic/pull/10264)
* Support `BigInt` serialization for `int` subclasses by @kxx317 in [pydantic/pydantic-core#1417](https://github.com/pydantic/pydantic-core/pull/1417)
* Support signature for wrap validators without `info` by @sydney-runkle in [#10277](https://github.com/pydantic/pydantic/pull/10277)
* Ensure `__pydantic_complete__` is set when rebuilding `dataclasses` by @Viicos in [#10291](https://github.com/pydantic/pydantic/pull/10291)
* Respect `schema_generator` config value in `TypeAdapter` by @sydney-runkle in [#10300](https://github.com/pydantic/pydantic/pull/10300)

### New Contributors

#### `pydantic`

* @kwint made their first contribution in [#9787](https://github.com/pydantic/pydantic/pull/9787)
* @seekinginfiniteloop made their first contribution in [#9822](https://github.com/pydantic/pydantic/pull/9822)
* @a-alexander made their first contribution in [#9848](https://github.com/pydantic/pydantic/pull/9848)
* @maximilianfellhuber made their first contribution in [#9885](https://github.com/pydantic/pydantic/pull/9885)
* @karmaBonfire made their first contribution in [#9945](https://github.com/pydantic/pydantic/pull/9945)
* @s-rigaud made their first contribution in [#9958](https://github.com/pydantic/pydantic/pull/9958)
* @msabramo made their first contribution in [#9964](https://github.com/pydantic/pydantic/pull/9964)
* @DimaCybr made their first contribution in [#9972](https://github.com/pydantic/pydantic/pull/9972)
* @kc0506 made their first contribution in [#9971](https://github.com/pydantic/pydantic/pull/9971)
* @haoyun made their first contribution in [#9990](https://github.com/pydantic/pydantic/pull/9990)
* @radekwlsk made their first contribution in [#9938](https://github.com/pydantic/pydantic/pull/9938)
* @dpeachey made their first contribution in [#10029](https://github.com/pydantic/pydantic/pull/10029)
* @BoxyUwU made their first contribution in [#10085](https://github.com/pydantic/pydantic/pull/10085)
* @mochi22 made their first contribution in [#10082](https://github.com/pydantic/pydantic/pull/10082)
* @aditkumar72 made their first contribution in [#10128](https://github.com/pydantic/pydantic/pull/10128)
* @changhc made their first contribution in [#9654](https://github.com/pydantic/pydantic/pull/9654)
* @insumanth made their first contribution in [#10229](https://github.com/pydantic/pydantic/pull/10229)
* @AdolfoVillalobos made their first contribution in [#10240](https://github.com/pydantic/pydantic/pull/10240)
* @bllchmbrs made their first contribution in [#10270](https://github.com/pydantic/pydantic/pull/10270)

#### `pydantic-core`

* @kfreezen made their first contribution in [pydantic/pydantic-core#1286](https://github.com/pydantic/pydantic-core/pull/1286)
* @tinez made their first contribution in [pydantic/pydantic-core#1368](https://github.com/pydantic/pydantic-core/pull/1368)
* @fft001 made their first contribution in [pydantic/pydantic-core#1362](https://github.com/pydantic/pydantic-core/pull/1362)
* @nix010 made their first contribution in [pydantic/pydantic-core#1349](https://github.com/pydantic/pydantic-core/pull/1349)
* @BoxyUwU made their first contribution in [pydantic/pydantic-core#1379](https://github.com/pydantic/pydantic-core/pull/1379)
* @candleindark made their first contribution in [pydantic/pydantic-core#1404](https://github.com/pydantic/pydantic-core/pull/1404)
* @changhc made their first contribution in [pydantic/pydantic-core#1331](https://github.com/pydantic/pydantic-core/pull/1331)

## v2.9.0b2 (2024-08-30)

Pre-release, see [the GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.9.0b2) for details.

## v2.9.0b1 (2024-08-26)

Pre-release, see [the GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.9.0b1) for details.
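The `ZoneInfo` support added in v2.9.0 ([#9896](https://github.com/pydantic/pydantic/pull/9896)) lets fields annotated with `zoneinfo.ZoneInfo` validate directly from IANA key strings. A minimal sketch, assuming pydantic v2.9+ and available timezone data:

```python
from zoneinfo import ZoneInfo

from pydantic import BaseModel


class Meeting(BaseModel):
    # accepts a ZoneInfo instance or an IANA timezone key string
    tz: ZoneInfo


m = Meeting(tz='Europe/Paris')
assert m.tz == ZoneInfo('Europe/Paris')
```

Note the related v2.9.1 change above: `tzdata` (needed on platforms without a system timezone database) became an optional `timezone` dependency.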
## v2.8.2 (2024-07-03)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.8.2)

### What's Changed

#### Fixes

* Fix issue with assertion caused by pluggable schema validator by @dmontagu in [#9838](https://github.com/pydantic/pydantic/pull/9838)

## v2.8.1 (2024-07-03)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.8.1)

### What's Changed

#### Packaging

* Bump `ruff` to `v0.5.0` and `pyright` to `v1.1.369` by @sydney-runkle in [#9801](https://github.com/pydantic/pydantic/pull/9801)
* Bump `pydantic-core` to `v2.20.1`, `pydantic-extra-types` to `v2.9.0` by @sydney-runkle in [#9832](https://github.com/pydantic/pydantic/pull/9832)

#### Fixes

* Fix breaking change in `to_snake` from v2.7 -> v2.8 by @sydney-runkle in [#9812](https://github.com/pydantic/pydantic/pull/9812)
* Fix list constraint json schema application by @sydney-runkle in [#9818](https://github.com/pydantic/pydantic/pull/9818)
* Support time durations of more than 23 hours by @nix010 in [pydantic/speedate#64](https://github.com/pydantic/speedate/pull/64)
* Fix millisecond fraction being handled with the wrong scale by @davidhewitt in [pydantic/speedate#65](https://github.com/pydantic/speedate/pull/65)
* Handle negative fractional durations correctly by @sydney-runkle in [pydantic/speedate#71](https://github.com/pydantic/speedate/pull/71)

## v2.8.0 (2024-07-01)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.8.0)

The code released in v2.8.0 is functionally identical to that of v2.8.0b1.

### What's Changed

#### Packaging

* Update citation version automatically with new releases by @sydney-runkle in [#9673](https://github.com/pydantic/pydantic/pull/9673)
* Bump pyright to `v1.1.367` and add type checking tests for pipeline API by @adriangb in [#9674](https://github.com/pydantic/pydantic/pull/9674)
* Update `pydantic.v1` stub to `v1.10.17` by @sydney-runkle in [#9707](https://github.com/pydantic/pydantic/pull/9707)
* General package updates to prep for `v2.8.0b1` by @sydney-runkle in [#9741](https://github.com/pydantic/pydantic/pull/9741)
* Bump `pydantic-core` to `v2.20.0` by @sydney-runkle in [#9745](https://github.com/pydantic/pydantic/pull/9745)
* Add support for Python 3.13 by @sydney-runkle in [#9743](https://github.com/pydantic/pydantic/pull/9743)
* Update `pdm` version used for `pdm.lock` to v2.16.1 by @sydney-runkle in [#9761](https://github.com/pydantic/pydantic/pull/9761)
* Update to `ruff` `v0.4.8` by @Viicos in [#9585](https://github.com/pydantic/pydantic/pull/9585)

#### New Features

* Experimental: support `defer_build` for `TypeAdapter` by @MarkusSintonen in [#8939](https://github.com/pydantic/pydantic/pull/8939)
* Implement `deprecated` field in json schema by @NeevCohen in [#9298](https://github.com/pydantic/pydantic/pull/9298)
* Experimental: Add pipeline API by @adriangb in [#9459](https://github.com/pydantic/pydantic/pull/9459)
* Add support for programmatic title generation by @NeevCohen in [#9183](https://github.com/pydantic/pydantic/pull/9183)
* Implement `fail_fast` feature by @uriyyo in [#9708](https://github.com/pydantic/pydantic/pull/9708)
* Add `ser_json_inf_nan='strings'` mode to produce valid JSON by @josh-newman in [pydantic/pydantic-core#1307](https://github.com/pydantic/pydantic-core/pull/1307)

#### Changes

* Add warning when "alias" is set in ignored `Annotated` field by @nix010 in [#9170](https://github.com/pydantic/pydantic/pull/9170)
* Support serialization of some serializable defaults in JSON schema by @sydney-runkle in [#9624](https://github.com/pydantic/pydantic/pull/9624)
* Relax type specification for `__validators__` values in `create_model` by @sydney-runkle in [#9697](https://github.com/pydantic/pydantic/pull/9697)
* **Breaking Change:** Improve `smart` union matching logic by @sydney-runkle in [pydantic/pydantic-core#1322](https://github.com/pydantic/pydantic-core/pull/1322)
  You can read more about our `smart` union matching logic [here](https://docs.pydantic.dev/dev/concepts/unions/#smart-mode). In some cases, if the old behavior is desired, you can switch to `left-to-right` mode and change the order of your `Union` members.

#### Performance

##### Internal Improvements

* ⚡️ Speed up `_display_error_loc()` by 25% in `pydantic/v1/error_wrappers.py` by @misrasaurabh1 in [#9653](https://github.com/pydantic/pydantic/pull/9653)
* ⚡️ Speed up `_get_all_json_refs()` by 34% in `pydantic/json_schema.py` by @misrasaurabh1 in [#9650](https://github.com/pydantic/pydantic/pull/9650)
* ⚡️ Speed up `is_pydantic_dataclass()` by 41% in `pydantic/dataclasses.py` by @misrasaurabh1 in [#9652](https://github.com/pydantic/pydantic/pull/9652)
* ⚡️ Speed up `to_snake()` by 27% in `pydantic/alias_generators.py` by @misrasaurabh1 in [#9747](https://github.com/pydantic/pydantic/pull/9747)
* ⚡️ Speed up `unwrap_wrapped_function()` by 93% in `pydantic/_internal/_decorators.py` by @misrasaurabh1 in [#9727](https://github.com/pydantic/pydantic/pull/9727)

#### Fixes

* Replace `__spec__.parent` with `__package__` by @hramezani in [#9331](https://github.com/pydantic/pydantic/pull/9331)
* Fix Outputted Model JSON Schema for `Sequence` type by @anesmemisevic in [#9303](https://github.com/pydantic/pydantic/pull/9303)
* Fix typing of `_frame_depth` by @Viicos in [#9353](https://github.com/pydantic/pydantic/pull/9353)
* Make `ImportString` json schema compatible by @amitschang in [#9344](https://github.com/pydantic/pydantic/pull/9344)
* Hide private attributes (`PrivateAttr`) from `__init__` signature in type checkers by @idan22moral in [#9293](https://github.com/pydantic/pydantic/pull/9293)
* Make detection of `TypeVar` defaults robust to the CPython `PEP-696` implementation by @AlexWaygood in [#9426](https://github.com/pydantic/pydantic/pull/9426)
* Fix usage of `PlainSerializer` with builtin types by @Viicos in [#9450](https://github.com/pydantic/pydantic/pull/9450)
* Add more robust custom validation examples by @ChrisPappalardo in [#9468](https://github.com/pydantic/pydantic/pull/9468)
* Fix ignored `strict` specification for `StringConstraint(strict=False)` by @vbmendes in [#9476](https://github.com/pydantic/pydantic/pull/9476)
* **Breaking Change:** Use PEP 570 syntax by @Viicos in [#9479](https://github.com/pydantic/pydantic/pull/9479)
* Use `Self` where possible by @Viicos in [#9479](https://github.com/pydantic/pydantic/pull/9479)
* Do not alter `RootModel.model_construct` signature in the `mypy` plugin by @Viicos in [#9480](https://github.com/pydantic/pydantic/pull/9480)
* Fix type hint of `validation_context` by @OhioDschungel6 in [#9508](https://github.com/pydantic/pydantic/pull/9508)
* Support context being passed to `TypeAdapter`'s `dump_json`/`dump_python` by @alexcouper in [#9495](https://github.com/pydantic/pydantic/pull/9495)
* Update type signature for `Field()` constructor by @bjmc in [#9484](https://github.com/pydantic/pydantic/pull/9484)
* Improve builtin alias generators by @sydney-runkle in [#9561](https://github.com/pydantic/pydantic/pull/9561)
* Fix typing of `TypeAdapter` by @Viicos in [#9570](https://github.com/pydantic/pydantic/pull/9570)
* Add fallback default value for private fields in `__setstate__` of `BaseModel` by @anhpham1509 in [#9584](https://github.com/pydantic/pydantic/pull/9584)
* Support `PEP 746` by @adriangb in [#9587](https://github.com/pydantic/pydantic/pull/9587)
* Allow validator and serializer functions to have default values by @Viicos in [#9478](https://github.com/pydantic/pydantic/pull/9478)
* Fix bug
with mypy plugin's handling of covariant `TypeVar` fields by @dmontagu in [#9606](https://github.com/pydantic/pydantic/pull/9606) * Fix multiple annotation / constraint application logic by @sydney-runkle in [#9623](https://github.com/pydantic/pydantic/pull/9623) * Respect `regex` flags in validation and json schema by @sydney-runkle in [#9591](https://github.com/pydantic/pydantic/pull/9591) * Fix type hint on `IpvAnyAddress` by @sydney-runkle in [#9640](https://github.com/pydantic/pydantic/pull/9640) * Allow a field specifier on `__pydantic_extra__` by @dmontagu in [#9659](https://github.com/pydantic/pydantic/pull/9659) * Use normalized case for file path comparison by @sydney-runkle in [#9737](https://github.com/pydantic/pydantic/pull/9737) * Modify constraint application logic to allow field constraints on `Optional[Decimal]` by @lazyhope in [#9754](https://github.com/pydantic/pydantic/pull/9754) * `validate_call` type params fix by @sydney-runkle in [#9760](https://github.com/pydantic/pydantic/pull/9760) * Check all warnings returned by pytest.warns() by @s-t-e-v-e-n-k in [#9702](https://github.com/pydantic/pydantic/pull/9702) * Reuse `re.Pattern` object in regex patterns to allow for regex flags by @sydney-runkle in [pydantic/pydantic-core#1318](https://github.com/pydantic/pydantic-core/pull/1318) ### New Contributors * @idan22moral made their first contribution in [#9294](https://github.com/pydantic/pydantic/pull/9294) * @anesmemisevic made their first contribution in [#9303](https://github.com/pydantic/pydantic/pull/9303) * @max-muoto made their first contribution in [#9338](https://github.com/pydantic/pydantic/pull/9338) * @amitschang made their first contribution in [#9344](https://github.com/pydantic/pydantic/pull/9344) * @paulmartin91 made their first contribution in [#9410](https://github.com/pydantic/pydantic/pull/9410) * @OhioDschungel6 made their first contribution in [#9405](https://github.com/pydantic/pydantic/pull/9405) * @AlexWaygood made their 
first contribution in [#9426](https://github.com/pydantic/pydantic/pull/9426) * @kinuax made their first contribution in [#9433](https://github.com/pydantic/pydantic/pull/9433) * @antoni-jamiolkowski made their first contribution in [#9431](https://github.com/pydantic/pydantic/pull/9431) * @candleindark made their first contribution in [#9448](https://github.com/pydantic/pydantic/pull/9448) * @nix010 made their first contribution in [#9170](https://github.com/pydantic/pydantic/pull/9170) * @tomy0000000 made their first contribution in [#9457](https://github.com/pydantic/pydantic/pull/9457) * @vbmendes made their first contribution in [#9470](https://github.com/pydantic/pydantic/pull/9470) * @micheleAlberto made their first contribution in [#9471](https://github.com/pydantic/pydantic/pull/9471) * @ChrisPappalardo made their first contribution in [#9468](https://github.com/pydantic/pydantic/pull/9468) * @blueTurtz made their first contribution in [#9475](https://github.com/pydantic/pydantic/pull/9475) * @WinterBlue16 made their first contribution in [#9477](https://github.com/pydantic/pydantic/pull/9477) * @bittner made their first contribution in [#9500](https://github.com/pydantic/pydantic/pull/9500) * @alexcouper made their first contribution in [#9495](https://github.com/pydantic/pydantic/pull/9495) * @bjmc made their first contribution in [#9484](https://github.com/pydantic/pydantic/pull/9484) * @pjvv made their first contribution in [#9529](https://github.com/pydantic/pydantic/pull/9529) * @nedbat made their first contribution in [#9530](https://github.com/pydantic/pydantic/pull/9530) * @gunnellEvan made their first contribution in [#9469](https://github.com/pydantic/pydantic/pull/9469) * @jaymbans made their first contribution in [#9531](https://github.com/pydantic/pydantic/pull/9531) * @MarcBresson made their first contribution in [#9534](https://github.com/pydantic/pydantic/pull/9534) * @anhpham1509 made their first contribution in 
[#9584](https://github.com/pydantic/pydantic/pull/9584) * @K-dash made their first contribution in [#9595](https://github.com/pydantic/pydantic/pull/9595) * @s-t-e-v-e-n-k made their first contribution in [#9527](https://github.com/pydantic/pydantic/pull/9527) * @airwoodix made their first contribution in [#9506](https://github.com/pydantic/pydantic/pull/9506) * @misrasaurabh1 made their first contribution in [#9653](https://github.com/pydantic/pydantic/pull/9653) * @AlessandroMiola made their first contribution in [#9740](https://github.com/pydantic/pydantic/pull/9740) * @mylapallilavanyaa made their first contribution in [#9746](https://github.com/pydantic/pydantic/pull/9746) * @lazyhope made their first contribution in [#9754](https://github.com/pydantic/pydantic/pull/9754) * @YassinNouh21 made their first contribution in [#9759](https://github.com/pydantic/pydantic/pull/9759) ## v2.8.0b1 (2024-06-27) Pre-release, see [the GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.8.0b1) for details. 
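As an illustration of the serialization `context` support for `TypeAdapter`'s `dump_python`/`dump_json` listed above, here is a minimal sketch; the `Price` type and `serialize_price` helper are invented for the example:

```python
from typing import Annotated

from pydantic import PlainSerializer, SerializationInfo, TypeAdapter


def serialize_price(value: float, info: SerializationInfo) -> str:
    # The dict passed as `context=` to dump_python()/dump_json() arrives on info.context
    currency = (info.context or {}).get("currency", "USD")
    return f"{value:.2f} {currency}"


Price = Annotated[float, PlainSerializer(serialize_price)]
adapter = TypeAdapter(Price)

print(adapter.dump_python(9.99, context={"currency": "EUR"}))  # 9.99 EUR
print(adapter.dump_python(9.99))                               # 9.99 USD
```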
## v2.7.4 (2024-06-12)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.7.4)

### What's Changed

#### Packaging

* Bump `pydantic.v1` to `v1.10.16` reference by @sydney-runkle in [#9639](https://github.com/pydantic/pydantic/pull/9639)

#### Fixes

* Specify `recursive_guard` as kwarg in `ForwardRef._evaluate` by @vfazio in [#9612](https://github.com/pydantic/pydantic/pull/9612)

## v2.7.3 (2024-06-03)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.7.3)

### What's Changed

#### Packaging

* Bump `pydantic-core` to `v2.18.4` by @sydney-runkle in [#9550](https://github.com/pydantic/pydantic/pull/9550)

#### Fixes

* Fix `u`-style unicode strings in Python by @samuelcolvin in [pydantic/jiter#110](https://github.com/pydantic/jiter/pull/110)

## v2.7.2 (2024-05-28)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.7.2)

### What's Changed

#### Packaging

* Bump `pydantic-core` to `v2.18.3` by @sydney-runkle in [#9515](https://github.com/pydantic/pydantic/pull/9515)

#### Fixes

* Replace `__spec__.parent` with `__package__` by @hramezani in [#9331](https://github.com/pydantic/pydantic/pull/9331)
* Fix validation of `int`s with leading unary minus by @RajatRajdeep in [pydantic/pydantic-core#1291](https://github.com/pydantic/pydantic-core/pull/1291)
* Fix `str` subclass validation for enums by @sydney-runkle in [pydantic/pydantic-core#1273](https://github.com/pydantic/pydantic-core/pull/1273)
* Support `BigInt`s in `Literal`s and `Enum`s by @samuelcolvin in [pydantic/pydantic-core#1297](https://github.com/pydantic/pydantic-core/pull/1297)
* Fix: uuid - allow `str` subclass as input by @davidhewitt in [pydantic/pydantic-core#1296](https://github.com/pydantic/pydantic-core/pull/1296)

## v2.7.1 (2024-04-23)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.7.1)

### What's Changed

#### Packaging

* Bump `pydantic-core` to `v2.18.2` by @sydney-runkle in [#9307](https://github.com/pydantic/pydantic/pull/9307)

#### New Features

* FTP and WebSocket connection strings support by @CherrySuryp in [#9205](https://github.com/pydantic/pydantic/pull/9205)

#### Changes

* Use field description for RootModel schema description when there is `…` by @LouisGobert in [#9214](https://github.com/pydantic/pydantic/pull/9214)

#### Fixes

* Fix `validation_alias` behavior with `model_construct` for `AliasChoices` and `AliasPath` by @sydney-runkle in [#9223](https://github.com/pydantic/pydantic/pull/9223)
* Revert `typing.Literal` and import it outside the `TYPE_CHECKING` block by @frost-nzcr4 in [#9232](https://github.com/pydantic/pydantic/pull/9232)
* Fix `Secret` serialization schema, applicable for unions by @sydney-runkle in [#9240](https://github.com/pydantic/pydantic/pull/9240)
* Fix `strict` application to `function-after` with `use_enum_values` by @sydney-runkle in [#9279](https://github.com/pydantic/pydantic/pull/9279)
* Address case where `model_construct` on a class which defines `model_post_init` fails with `AttributeError` by @babygrimes in [#9168](https://github.com/pydantic/pydantic/pull/9168)
* Fix `model_json_schema` with config types by @NeevCohen in [#9287](https://github.com/pydantic/pydantic/pull/9287)
* Support multiple zeros as an `int` by @samuelcolvin in [pydantic/pydantic-core#1269](https://github.com/pydantic/pydantic-core/pull/1269)
* Fix validation of `int`s with leading unary plus by @cknv in [pydantic/pydantic-core#1272](https://github.com/pydantic/pydantic-core/pull/1272)
* Fix interaction between `extra != 'ignore'` and `from_attributes=True` by @davidhewitt in [pydantic/pydantic-core#1276](https://github.com/pydantic/pydantic-core/pull/1276)
* Handle error from `Enum`'s `missing` function as `ValidationError` by @sydney-runkle in [pydantic/pydantic-core#1274](https://github.com/pydantic/pydantic-core/pull/1274)
* Fix memory leak with `Iterable` validation by @davidhewitt in [pydantic/pydantic-core#1271](https://github.com/pydantic/pydantic-core/pull/1271)

### New Contributors

* @zzstoatzz made their first contribution in [#9219](https://github.com/pydantic/pydantic/pull/9219)
* @frost-nzcr4 made their first contribution in [#9232](https://github.com/pydantic/pydantic/pull/9232)
* @CherrySuryp made their first contribution in [#9205](https://github.com/pydantic/pydantic/pull/9205)
* @vagenas made their first contribution in [#9268](https://github.com/pydantic/pydantic/pull/9268)
* @ollz272 made their first contribution in [#9262](https://github.com/pydantic/pydantic/pull/9262)
* @babygrimes made their first contribution in [#9168](https://github.com/pydantic/pydantic/pull/9168)
* @swelborn made their first contribution in [#9296](https://github.com/pydantic/pydantic/pull/9296)
* @kf-novi made their first contribution in [#9236](https://github.com/pydantic/pydantic/pull/9236)
* @lgeiger made their first contribution in [#9288](https://github.com/pydantic/pydantic/pull/9288)

## v2.7.0 (2024-04-11)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.7.0)

The code released in v2.7.0 is practically identical to that of v2.7.0b1.
### What's Changed

#### Packaging

* Reorganize `pyproject.toml` sections by @Viicos in [#8899](https://github.com/pydantic/pydantic/pull/8899)
* Bump `pydantic-core` to `v2.18.1` by @sydney-runkle in [#9211](https://github.com/pydantic/pydantic/pull/9211)
* Adopt `jiter` `v0.2.0` by @samuelcolvin in [pydantic/pydantic-core#1250](https://github.com/pydantic/pydantic-core/pull/1250)

#### New Features

* Extract attribute docstrings from `FieldInfo.description` by @Viicos in [#6563](https://github.com/pydantic/pydantic/pull/6563)
* Add a `with_config` decorator to comply with typing spec by @Viicos in [#8611](https://github.com/pydantic/pydantic/pull/8611)
* Allow an optional separator splitting the value and unit of the result of `ByteSize.human_readable` by @jks15satoshi in [#8706](https://github.com/pydantic/pydantic/pull/8706)
* Add generic `Secret` base type by @conradogarciaberrotaran in [#8519](https://github.com/pydantic/pydantic/pull/8519)
* Make use of `Sphinx` inventories for cross references in docs by @Viicos in [#8682](https://github.com/pydantic/pydantic/pull/8682)
* Add environment variable to disable plugins by @geospackle in [#8767](https://github.com/pydantic/pydantic/pull/8767)
* Add support for `deprecated` fields by @Viicos in [#8237](https://github.com/pydantic/pydantic/pull/8237)
* Allow `field_serializer('*')` by @ornariece in [#9001](https://github.com/pydantic/pydantic/pull/9001)
* Handle a case when `model_config` is defined as a model property by @alexeyt101 in [#9004](https://github.com/pydantic/pydantic/pull/9004)
* Update `create_model()` to support `typing.Annotated` as input by @wannieman98 in [#8947](https://github.com/pydantic/pydantic/pull/8947)
* Add `ClickhouseDsn` support by @solidguy7 in [#9062](https://github.com/pydantic/pydantic/pull/9062)
* Add support for `re.Pattern[str]` to `pattern` field by @jag-k in [#9053](https://github.com/pydantic/pydantic/pull/9053)
* Support for `serialize_as_any` runtime setting by @sydney-runkle in [#8830](https://github.com/pydantic/pydantic/pull/8830)
* Add support for `typing.Self` by @Youssefares in [#9023](https://github.com/pydantic/pydantic/pull/9023)
* Ability to pass `context` to serialization by @ornariece in [#8965](https://github.com/pydantic/pydantic/pull/8965)
* Add feedback widget to docs with flarelytics integration by @sydney-runkle in [#9129](https://github.com/pydantic/pydantic/pull/9129)
* Support for parsing partial JSON strings in Python by @samuelcolvin in [pydantic/jiter#66](https://github.com/pydantic/jiter/pull/66)

**Finalized in v2.7.0, rather than v2.7.0b1:**

* Add support for field level number to str coercion option by @NeevCohen in [#9137](https://github.com/pydantic/pydantic/pull/9137)
* Update `warnings` parameter for serialization utilities to allow raising a warning by @Lance-Drane in [#9166](https://github.com/pydantic/pydantic/pull/9166)

#### Changes

* Correct docs, logic for `model_construct` behavior with `extra` by @sydney-runkle in [#8807](https://github.com/pydantic/pydantic/pull/8807)
* Improve error message for improper `RootModel` subclasses by @sydney-runkle in [#8857](https://github.com/pydantic/pydantic/pull/8857)
* **Breaking Change:** Use `PEP570` syntax by @Viicos in [#8940](https://github.com/pydantic/pydantic/pull/8940)
* Add `enum` and `type` to the JSON schema for single item literals by @dmontagu in [#8944](https://github.com/pydantic/pydantic/pull/8944)
* Deprecate `update_json_schema` internal function by @sydney-runkle in [#9125](https://github.com/pydantic/pydantic/pull/9125)
* Serialize duration to hour minute second, instead of just seconds by @kakilangit in [pydantic/speedate#50](https://github.com/pydantic/speedate/pull/50)
* Trim str before parsing to int and float by @hungtsetse in [pydantic/pydantic-core#1203](https://github.com/pydantic/pydantic-core/pull/1203)

#### Performance

* `enum` validator improvements by @samuelcolvin in [#9045](https://github.com/pydantic/pydantic/pull/9045)
* Move `enum` validation and serialization to Rust by @samuelcolvin in [#9064](https://github.com/pydantic/pydantic/pull/9064)
* Improve schema generation for nested dataclasses by @sydney-runkle in [#9114](https://github.com/pydantic/pydantic/pull/9114)
* Fast path for ASCII python string creation in JSON by @samuelcolvin in [pydantic/jiter#72](https://github.com/pydantic/jiter/pull/72)
* SIMD integer and string JSON parsing on `aarch64` (**Note:** SIMD on x86 will be implemented in a future release) by @samuelcolvin in [pydantic/jiter#65](https://github.com/pydantic/jiter/pull/65)
* Support JSON `Cow` from `jiter` by @davidhewitt in [pydantic/pydantic-core#1231](https://github.com/pydantic/pydantic-core/pull/1231)
* MAJOR performance improvement: update to PyO3 0.21 final by @davidhewitt in [pydantic/pydantic-core#1248](https://github.com/pydantic/pydantic-core/pull/1248)
* Cache Python strings by @samuelcolvin in [pydantic/pydantic-core#1240](https://github.com/pydantic/pydantic-core/pull/1240)

#### Fixes

* Fix strict parsing for some `Sequence`s by @sydney-runkle in [#8614](https://github.com/pydantic/pydantic/pull/8614)
* Add a check on the existence of `__qualname__` by @anci3ntr0ck in [#8642](https://github.com/pydantic/pydantic/pull/8642)
* Handle `__pydantic_extra__` annotation being a string or inherited by @alexmojaki in [#8659](https://github.com/pydantic/pydantic/pull/8659)
* Fix JSON validation for `NameEmail` by @Holi0317 in [#8650](https://github.com/pydantic/pydantic/pull/8650)
* Fix type-safety of attribute access in `BaseModel` by @bluenote10 in [#8651](https://github.com/pydantic/pydantic/pull/8651)
* Fix bug with `mypy` plugin and `no_strict_optional = True` by @dmontagu in [#8666](https://github.com/pydantic/pydantic/pull/8666)
* Fix `ByteSize` error `type` change by @sydney-runkle in [#8681](https://github.com/pydantic/pydantic/pull/8681)
* Fix inheriting annotations in dataclasses by @sydney-runkle in [#8679](https://github.com/pydantic/pydantic/pull/8679)
* Fix regression in core schema generation for indirect definition references by @dmontagu in [#8702](https://github.com/pydantic/pydantic/pull/8702)
* Fix unsupported types bug with plain validator by @sydney-runkle in [#8710](https://github.com/pydantic/pydantic/pull/8710)
* Revert problematic fix from 2.6 release, fixing schema building bug by @sydney-runkle in [#8718](https://github.com/pydantic/pydantic/pull/8718)
* Fix `__pydantic_config__` being ignored for `TypedDict` by @13sin in [#8734](https://github.com/pydantic/pydantic/pull/8734)
* Fix test failures with `pytest v8.0.0` due to `pytest.warns()` starting to work inside `pytest.raises()` by @mgorny in [#8678](https://github.com/pydantic/pydantic/pull/8678)
* Use `is_valid_field` from 1.x for `mypy` plugin by @DanielNoord in [#8738](https://github.com/pydantic/pydantic/pull/8738)
* Better-support `mypy` strict equality flag by @dmontagu in [#8799](https://github.com/pydantic/pydantic/pull/8799)
* Fix `model_json_schema` export with `Annotated` types missing 'required' parameters by @LouisGobert in [#8793](https://github.com/pydantic/pydantic/pull/8793)
* Fix default inclusion in `FieldInfo.__repr_args__` by @sydney-runkle in [#8801](https://github.com/pydantic/pydantic/pull/8801)
* Fix resolution of forward refs in dataclass base classes that are not present in the subclass module namespace by @matsjoyce-refeyn in [#8751](https://github.com/pydantic/pydantic/pull/8751)
* Fix `BaseModel` type annotations to be resolvable by `typing.get_type_hints` by @devmonkey22 in [#7680](https://github.com/pydantic/pydantic/pull/7680)
* Fix: allow empty string aliases with `AliasGenerator` by @sydney-runkle in [#8810](https://github.com/pydantic/pydantic/pull/8810)
* Fix test along with `date` -> `datetime` timezone assumption fix by @sydney-runkle in [#8823](https://github.com/pydantic/pydantic/pull/8823)
* Fix deprecation warning with usage of `ast.Str` by @Viicos in [#8837](https://github.com/pydantic/pydantic/pull/8837)
* Add missing `deprecated` decorators by @Viicos in [#8877](https://github.com/pydantic/pydantic/pull/8877)
* Fix serialization of `NameEmail` if name includes an email address by @NeevCohen in [#8860](https://github.com/pydantic/pydantic/pull/8860)
* Add information about class in error message of schema generation by @Czaki in [#8917](https://github.com/pydantic/pydantic/pull/8917)
* Make `TypeAdapter`'s typing compatible with special forms by @adriangb in [#8923](https://github.com/pydantic/pydantic/pull/8923)
* Fix issue with config behavior being baked into the ref schema for `enum`s by @dmontagu in [#8920](https://github.com/pydantic/pydantic/pull/8920)
* More helpful error re wrong `model_json_schema` usage by @sydney-runkle in [#8928](https://github.com/pydantic/pydantic/pull/8928)
* Fix nested discriminated union schema gen, pt 2 by @sydney-runkle in [#8932](https://github.com/pydantic/pydantic/pull/8932)
* Fix schema build for nested dataclasses / TypedDicts with discriminators by @sydney-runkle in [#8950](https://github.com/pydantic/pydantic/pull/8950)
* Remove unnecessary logic for definitions schema gen with discriminated unions by @sydney-runkle in [#8951](https://github.com/pydantic/pydantic/pull/8951)
* Fix handling of optionals in `mypy` plugin by @dmontagu in [#9008](https://github.com/pydantic/pydantic/pull/9008)
* Fix `PlainSerializer` usage with std type constructor by @sydney-runkle in [#9031](https://github.com/pydantic/pydantic/pull/9031)
* Remove unnecessary warning for config in plugin by @dmontagu in [#9039](https://github.com/pydantic/pydantic/pull/9039)
* Fix default value serializing by @NeevCohen in [#9066](https://github.com/pydantic/pydantic/pull/9066)
* Fix extra fields check in `Model.__getattr__()` by @NeevCohen in [#9082](https://github.com/pydantic/pydantic/pull/9082)
* Fix `ClassVar` forward ref inherited from parent class by @alexmojaki in [#9097](https://github.com/pydantic/pydantic/pull/9097)
* Fix sequence-like validator with strict `True` by @andresliszt in [#8977](https://github.com/pydantic/pydantic/pull/8977)
* Improve warning message when a field name shadows a field in a parent model by @chan-vince in [#9105](https://github.com/pydantic/pydantic/pull/9105)
* Do not warn about shadowed fields if they are not redefined in a child class by @chan-vince in [#9111](https://github.com/pydantic/pydantic/pull/9111)
* Fix discriminated union bug with unsubstituted type var by @sydney-runkle in [#9124](https://github.com/pydantic/pydantic/pull/9124)
* Support serialization of `deque` when passed to `Sequence[blah blah blah]` by @sydney-runkle in [#9128](https://github.com/pydantic/pydantic/pull/9128)
* Init private attributes from super-types in `model_post_init` by @Viicos in [#9134](https://github.com/pydantic/pydantic/pull/9134)
* Fix `model_construct` with `validation_alias` by @ornariece in [#9144](https://github.com/pydantic/pydantic/pull/9144)
* Ensure json-schema generator handles `Literal` `null` types by @bruno-f-cruz in [#9135](https://github.com/pydantic/pydantic/pull/9135)
* **Fixed in v2.7.0**: Fix allow extra generic by @dmontagu in [#9193](https://github.com/pydantic/pydantic/pull/9193)

### New Contributors

* @hungtsetse made their first contribution in [#8546](https://github.com/pydantic/pydantic/pull/8546)
* @StrawHatDrag0n made their first contribution in [#8583](https://github.com/pydantic/pydantic/pull/8583)
* @anci3ntr0ck made their first contribution in [#8642](https://github.com/pydantic/pydantic/pull/8642)
* @Holi0317 made their first contribution in [#8650](https://github.com/pydantic/pydantic/pull/8650)
* @bluenote10 made their first contribution in [#8651](https://github.com/pydantic/pydantic/pull/8651)
* @ADSteele916 made their first contribution in [#8703](https://github.com/pydantic/pydantic/pull/8703)
* @musicinmybrain made their first contribution in [#8731](https://github.com/pydantic/pydantic/pull/8731)
* @jks15satoshi made their first contribution in [#8706](https://github.com/pydantic/pydantic/pull/8706)
* @13sin made their first contribution in [#8734](https://github.com/pydantic/pydantic/pull/8734)
* @DanielNoord made their first contribution in [#8738](https://github.com/pydantic/pydantic/pull/8738)
* @conradogarciaberrotaran made their first contribution in [#8519](https://github.com/pydantic/pydantic/pull/8519)
* @chris-griffin made their first contribution in [#8775](https://github.com/pydantic/pydantic/pull/8775)
* @LouisGobert made their first contribution in [#8793](https://github.com/pydantic/pydantic/pull/8793)
* @matsjoyce-refeyn made their first contribution in [#8751](https://github.com/pydantic/pydantic/pull/8751)
* @devmonkey22 made their first contribution in [#7680](https://github.com/pydantic/pydantic/pull/7680)
* @adamency made their first contribution in [#8847](https://github.com/pydantic/pydantic/pull/8847)
* @MamfTheKramf made their first contribution in [#8851](https://github.com/pydantic/pydantic/pull/8851)
* @ornariece made their first contribution in [#9001](https://github.com/pydantic/pydantic/pull/9001)
* @alexeyt101 made their first contribution in [#9004](https://github.com/pydantic/pydantic/pull/9004)
* @wannieman98 made their first contribution in [#8947](https://github.com/pydantic/pydantic/pull/8947)
* @solidguy7 made their first contribution in [#9062](https://github.com/pydantic/pydantic/pull/9062)
* @kloczek made their first contribution in [#9047](https://github.com/pydantic/pydantic/pull/9047)
* @jag-k made their first contribution in [#9053](https://github.com/pydantic/pydantic/pull/9053)
* @priya-gitTest made their first contribution in [#9088](https://github.com/pydantic/pydantic/pull/9088)
* @Youssefares made their first contribution in [#9023](https://github.com/pydantic/pydantic/pull/9023)
* @chan-vince made their first contribution in [#9105](https://github.com/pydantic/pydantic/pull/9105)
* @bruno-f-cruz made their first contribution in [#9135](https://github.com/pydantic/pydantic/pull/9135)
* @Lance-Drane made their first contribution in [#9166](https://github.com/pydantic/pydantic/pull/9166)

## v2.7.0b1 (2024-04-03)

Pre-release, see [the GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.7.0b1) for details.

## v2.6.4 (2024-03-12)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.6.4)

### What's Changed

#### Fixes

* Fix usage of `AliasGenerator` with `computed_field` decorator by @sydney-runkle in [#8806](https://github.com/pydantic/pydantic/pull/8806)
* Fix nested discriminated union schema gen, pt 2 by @sydney-runkle in [#8932](https://github.com/pydantic/pydantic/pull/8932)
* Fix bug with `no_strict_optional=True` caused by API deferral by @dmontagu in [#8826](https://github.com/pydantic/pydantic/pull/8826)

## v2.6.3 (2024-02-27)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.6.3)

### What's Changed

#### Packaging

* Update `pydantic-settings` version in the docs by @hramezani in [#8906](https://github.com/pydantic/pydantic/pull/8906)

#### Fixes

* Fix discriminated union schema gen bug by @sydney-runkle in [#8904](https://github.com/pydantic/pydantic/pull/8904)

## v2.6.2 (2024-02-23)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.6.2)

### What's Changed

#### Packaging

* Upgrade to `pydantic-core` 2.16.3 by @sydney-runkle in [#8879](https://github.com/pydantic/pydantic/pull/8879)

#### Fixes

* 'YYYY-MM-DD' date string coerced to datetime shouldn't infer timezone by @sydney-runkle in [pydantic/pydantic-core#1193](https://github.com/pydantic/pydantic-core/pull/1193)

## v2.6.1 (2024-02-05)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.6.1)

### What's Changed

#### Packaging

* Upgrade to `pydantic-core` 2.16.2 by @sydney-runkle in [#8717](https://github.com/pydantic/pydantic/pull/8717)

#### Fixes

* Fix bug with `mypy` plugin and `no_strict_optional = True` by @dmontagu in [#8666](https://github.com/pydantic/pydantic/pull/8666)
* Fix `ByteSize` error `type` change by @sydney-runkle in [#8681](https://github.com/pydantic/pydantic/pull/8681)
* Fix inheriting `Field` annotations in dataclasses by @sydney-runkle in [#8679](https://github.com/pydantic/pydantic/pull/8679)
* Fix regression in core schema generation for indirect definition references by @dmontagu in [#8702](https://github.com/pydantic/pydantic/pull/8702)
* Fix unsupported types bug with `PlainValidator` by @sydney-runkle in [#8710](https://github.com/pydantic/pydantic/pull/8710)
* Revert problematic fix from 2.6 release, fixing schema building bug by @sydney-runkle in [#8718](https://github.com/pydantic/pydantic/pull/8718)
* Fix warning for tuple of wrong size in `Union` by @davidhewitt in [pydantic/pydantic-core#1174](https://github.com/pydantic/pydantic-core/pull/1174)
* Fix `computed_field` JSON serializer `exclude_none` behavior by @sydney-runkle in [pydantic/pydantic-core#1187](https://github.com/pydantic/pydantic-core/pull/1187)

## v2.6.0 (2024-01-23)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.6.0)

The code released in v2.6.0 is practically identical to that of v2.6.0b1.
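The v2.6.0 feature list below includes `AliasGenerator` support. A minimal sketch of how it can be used (the `Order` model is invented for the example):

```python
from pydantic import AliasGenerator, BaseModel, ConfigDict
from pydantic.alias_generators import to_camel


class Order(BaseModel):
    # Generate camelCase validation aliases for every field
    model_config = ConfigDict(
        alias_generator=AliasGenerator(validation_alias=to_camel)
    )

    item_name: str


order = Order.model_validate({"itemName": "spam"})
print(order.item_name)  # spam
```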
### What's Changed #### Packaging * Check for `email-validator` version >= 2.0 by @commonism in [#6033](https://github.com/pydantic/pydantic/pull/6033) * Upgrade `ruff`` target version to Python 3.8 by @Elkiwa in [#8341](https://github.com/pydantic/pydantic/pull/8341) * Update to `pydantic-extra-types==2.4.1` by @yezz123 in [#8478](https://github.com/pydantic/pydantic/pull/8478) * Update to `pyright==1.1.345` by @Viicos in [#8453](https://github.com/pydantic/pydantic/pull/8453) * Update pydantic-core from 2.14.6 to 2.16.1, significant changes from these updates are described below, full changelog [here](https://github.com/pydantic/pydantic-core/compare/v2.14.6...v2.16.1) #### New Features * Add `NatsDsn` by @ekeew in [#6874](https://github.com/pydantic/pydantic/pull/6874) * Add `ConfigDict.ser_json_inf_nan` by @davidhewitt in [#8159](https://github.com/pydantic/pydantic/pull/8159) * Add `types.OnErrorOmit` by @adriangb in [#8222](https://github.com/pydantic/pydantic/pull/8222) * Support `AliasGenerator` usage by @sydney-runkle in [#8282](https://github.com/pydantic/pydantic/pull/8282) * Add Pydantic People Page to docs by @sydney-runkle in [#8345](https://github.com/pydantic/pydantic/pull/8345) * Support `yyyy-MM-DD` datetime parsing by @sydney-runkle in [#8404](https://github.com/pydantic/pydantic/pull/8404) * Added bits conversions to the `ByteSize` class #8415 by @luca-matei in [#8507](https://github.com/pydantic/pydantic/pull/8507) * Enable json schema creation with type `ByteSize` by @geospackle in [#8537](https://github.com/pydantic/pydantic/pull/8537) * Add `eval_type_backport` to handle union operator and builtin generic subscripting in older Pythons by @alexmojaki in [#8209](https://github.com/pydantic/pydantic/pull/8209) * Add support for `dataclass` fields `init` by @dmontagu in [#8552](https://github.com/pydantic/pydantic/pull/8552) * Implement pickling for `ValidationError` by @davidhewitt in 
[pydantic/pydantic-core#1119](https://github.com/pydantic/pydantic-core/pull/1119) * Add unified tuple validator that can handle "variadic" tuples via PEP-646 by @dmontagu in [pydantic/pydantic-core#865](https://github.com/pydantic/pydantic-core/pull/865) #### Changes * Drop Python3.7 support by @hramezani in [#7188](https://github.com/pydantic/pydantic/pull/7188) * Drop Python 3.7, and PyPy 3.7 and 3.8 by @davidhewitt in [pydantic/pydantic-core#1129](https://github.com/pydantic/pydantic-core/pull/1129) * Use positional-only `self` in `BaseModel` constructor, so no field name can ever conflict with it by @ariebovenberg in [#8072](https://github.com/pydantic/pydantic/pull/8072) * Make `@validate_call` return a function instead of a custom descriptor - fixes binding issue with inheritance and adds `self/cls` argument to validation errors by @alexmojaki in [#8268](https://github.com/pydantic/pydantic/pull/8268) * Exclude `BaseModel` docstring from JSON schema description by @sydney-runkle in [#8352](https://github.com/pydantic/pydantic/pull/8352) * Introducing `classproperty` decorator for `model_computed_fields` by @Jocelyn-Gas in [#8437](https://github.com/pydantic/pydantic/pull/8437) * Explicitly raise an error if field names clashes with types by @Viicos in [#8243](https://github.com/pydantic/pydantic/pull/8243) * Use stricter serializer for unions of simple types by @alexdrydew [pydantic/pydantic-core#1132](https://github.com/pydantic/pydantic-core/pull/1132) #### Performance * Add Codspeed profiling Actions workflow by @lambertsbennett in [#8054](https://github.com/pydantic/pydantic/pull/8054) * Improve `int` extraction by @samuelcolvin in [pydantic/pydantic-core#1155](https://github.com/pydantic/pydantic-core/pull/1155) * Improve performance of recursion guard by @samuelcolvin in [pydantic/pydantic-core#1156](https://github.com/pydantic/pydantic-core/pull/1156) * `dataclass` serialization speedups by @samuelcolvin in 
[pydantic/pydantic-core#1162](https://github.com/pydantic/pydantic-core/pull/1162) * Avoid `HashMap` creation when looking up small JSON objects in `LazyIndexMaps` by @samuelcolvin in [pydantic/jiter#55](https://github.com/pydantic/jiter/pull/55) * use hashbrown to speedup python string caching by @davidhewitt in [pydantic/jiter#51](https://github.com/pydantic/jiter/pull/51) * Replace `Peak` with more efficient `Peek` by @davidhewitt in [pydantic/jiter#48](https://github.com/pydantic/jiter/pull/48) #### Fixes * Move `getattr` warning in deprecated `BaseConfig` by @tlambert03 in [#7183](https://github.com/pydantic/pydantic/pull/7183) * Only hash `model_fields`, not whole `__dict__` by @alexmojaki in [#7786](https://github.com/pydantic/pydantic/pull/7786) * Fix mishandling of unions while freezing types in the `mypy` plugin by @dmontagu in [#7411](https://github.com/pydantic/pydantic/pull/7411) * Fix `mypy` error on untyped `ClassVar` by @vincent-hachin-wmx in [#8138](https://github.com/pydantic/pydantic/pull/8138) * Only compare pydantic fields in `BaseModel.__eq__` instead of whole `__dict__` by @QuentinSoubeyranAqemia in [#7825](https://github.com/pydantic/pydantic/pull/7825) * Update `strict` docstring in `model_validate` method. 
by @LukeTonin in [#8223](https://github.com/pydantic/pydantic/pull/8223) * Fix overload position of `computed_field` by @Viicos in [#8227](https://github.com/pydantic/pydantic/pull/8227) * Fix custom type type casting used in multiple attributes by @ianhfc in [#8066](https://github.com/pydantic/pydantic/pull/8066) * Fix issue not allowing `validate_call` decorator to be dynamically assigned to a class method by @jusexton in [#8249](https://github.com/pydantic/pydantic/pull/8249) * Fix issue `unittest.mock` deprecation warnings by @ibleedicare in [#8262](https://github.com/pydantic/pydantic/pull/8262) * Added tests for the case `JsonValue` contains subclassed primitive values by @jusexton in [#8286](https://github.com/pydantic/pydantic/pull/8286) * Fix `mypy` error on free before validator (classmethod) by @sydney-runkle in [#8285](https://github.com/pydantic/pydantic/pull/8285) * Fix `to_snake` conversion by @jevins09 in [#8316](https://github.com/pydantic/pydantic/pull/8316) * Fix type annotation of `ModelMetaclass.__prepare__` by @slanzmich in [#8305](https://github.com/pydantic/pydantic/pull/8305) * Disallow `config` specification when initializing a `TypeAdapter` when the annotated type has config already by @sydney-runkle in [#8365](https://github.com/pydantic/pydantic/pull/8365) * Fix a naming issue with JSON schema for generics parametrized by recursive type aliases by @dmontagu in [#8389](https://github.com/pydantic/pydantic/pull/8389) * Fix type annotation in pydantic people script by @shenxiangzhuang in [#8402](https://github.com/pydantic/pydantic/pull/8402) * Add support for field `alias` in `dataclass` signature by @NeevCohen in [#8387](https://github.com/pydantic/pydantic/pull/8387) * Fix bug with schema generation with `Field(...)` in a forward ref by @dmontagu in [#8494](https://github.com/pydantic/pydantic/pull/8494) * Fix ordering of keys in `__dict__` with `model_construct` call by @sydney-runkle in 
[#8500](https://github.com/pydantic/pydantic/pull/8500)
* Fix module `path_type` creation when globals does not contain `__name__` by @hramezani in [#8470](https://github.com/pydantic/pydantic/pull/8470)
* Fix for namespace issue with dataclasses with `from __future__ import annotations` by @sydney-runkle in [#8513](https://github.com/pydantic/pydantic/pull/8513)
* Fix: make function validator types positional-only by @pmmmwh in [#8479](https://github.com/pydantic/pydantic/pull/8479)
* Fix usage of `@deprecated` by @Viicos in [#8294](https://github.com/pydantic/pydantic/pull/8294)
* Add more support for private attributes in `model_construct` call by @sydney-runkle in [#8525](https://github.com/pydantic/pydantic/pull/8525)
* Use a stack for the types namespace by @dmontagu in [#8378](https://github.com/pydantic/pydantic/pull/8378)
* Fix schema-building bug with `TypeAliasType` for types with refs by @dmontagu in [#8526](https://github.com/pydantic/pydantic/pull/8526)
* Support `pydantic.Field(repr=False)` in dataclasses by @tigeryy2 in [#8511](https://github.com/pydantic/pydantic/pull/8511)
* Override `dataclass_transform` behavior for `RootModel` by @Viicos in [#8163](https://github.com/pydantic/pydantic/pull/8163)
* Refactor signature generation for simplicity by @sydney-runkle in [#8572](https://github.com/pydantic/pydantic/pull/8572)
* Fix ordering bug of `PlainValidator` annotation by @Anvil in [#8567](https://github.com/pydantic/pydantic/pull/8567)
* Fix `exclude_none` for json serialization of `computed_field`s by @sydney-runkle in [pydantic/pydantic-core#1098](https://github.com/pydantic/pydantic-core/pull/1098)
* Support yyyy-MM-DD string for datetimes by @sydney-runkle in [pydantic/pydantic-core#1124](https://github.com/pydantic/pydantic-core/pull/1124)
* Tweak ordering of definitions in generated schemas by @StrawHatDrag0n in [#8583](https://github.com/pydantic/pydantic/pull/8583)

### New Contributors

#### `pydantic`

* @ekeew made their first contribution
in [#6874](https://github.com/pydantic/pydantic/pull/6874)
* @lambertsbennett made their first contribution in [#8054](https://github.com/pydantic/pydantic/pull/8054)
* @vincent-hachin-wmx made their first contribution in [#8138](https://github.com/pydantic/pydantic/pull/8138)
* @QuentinSoubeyranAqemia made their first contribution in [#7825](https://github.com/pydantic/pydantic/pull/7825)
* @ariebovenberg made their first contribution in [#8072](https://github.com/pydantic/pydantic/pull/8072)
* @LukeTonin made their first contribution in [#8223](https://github.com/pydantic/pydantic/pull/8223)
* @denisart made their first contribution in [#8231](https://github.com/pydantic/pydantic/pull/8231)
* @ianhfc made their first contribution in [#8066](https://github.com/pydantic/pydantic/pull/8066)
* @eonu made their first contribution in [#8255](https://github.com/pydantic/pydantic/pull/8255)
* @amandahla made their first contribution in [#8263](https://github.com/pydantic/pydantic/pull/8263)
* @ibleedicare made their first contribution in [#8262](https://github.com/pydantic/pydantic/pull/8262)
* @jevins09 made their first contribution in [#8316](https://github.com/pydantic/pydantic/pull/8316)
* @cuu508 made their first contribution in [#8322](https://github.com/pydantic/pydantic/pull/8322)
* @slanzmich made their first contribution in [#8305](https://github.com/pydantic/pydantic/pull/8305)
* @jensenbox made their first contribution in [#8331](https://github.com/pydantic/pydantic/pull/8331)
* @szepeviktor made their first contribution in [#8356](https://github.com/pydantic/pydantic/pull/8356)
* @Elkiwa made their first contribution in [#8341](https://github.com/pydantic/pydantic/pull/8341)
* @parhamfh made their first contribution in [#8395](https://github.com/pydantic/pydantic/pull/8395)
* @shenxiangzhuang made their first contribution in [#8402](https://github.com/pydantic/pydantic/pull/8402)
* @NeevCohen made their first contribution in
[#8387](https://github.com/pydantic/pydantic/pull/8387)
* @zby made their first contribution in [#8497](https://github.com/pydantic/pydantic/pull/8497)
* @patelnets made their first contribution in [#8491](https://github.com/pydantic/pydantic/pull/8491)
* @edwardwli made their first contribution in [#8503](https://github.com/pydantic/pydantic/pull/8503)
* @luca-matei made their first contribution in [#8507](https://github.com/pydantic/pydantic/pull/8507)
* @Jocelyn-Gas made their first contribution in [#8437](https://github.com/pydantic/pydantic/pull/8437)
* @bL34cHig0 made their first contribution in [#8501](https://github.com/pydantic/pydantic/pull/8501)
* @tigeryy2 made their first contribution in [#8511](https://github.com/pydantic/pydantic/pull/8511)
* @geospackle made their first contribution in [#8537](https://github.com/pydantic/pydantic/pull/8537)
* @Anvil made their first contribution in [#8567](https://github.com/pydantic/pydantic/pull/8567)
* @hungtsetse made their first contribution in [#8546](https://github.com/pydantic/pydantic/pull/8546)
* @StrawHatDrag0n made their first contribution in [#8583](https://github.com/pydantic/pydantic/pull/8583)

#### `pydantic-core`

* @mariuswinger made their first contribution in [pydantic/pydantic-core#1087](https://github.com/pydantic/pydantic-core/pull/1087)
* @adamchainz made their first contribution in [pydantic/pydantic-core#1090](https://github.com/pydantic/pydantic-core/pull/1090)
* @akx made their first contribution in [pydantic/pydantic-core#1123](https://github.com/pydantic/pydantic-core/pull/1123)

## v2.6.0b1 (2024-01-19)

Pre-release, see [the GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.6.0b1) for details.
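Several of the fixes above concern the `JsonValue` type (for example the tests for subclassed primitive values in [#8286](https://github.com/pydantic/pydantic/pull/8286)). As a rough illustration of what the type does, assuming pydantic v2.5 or later is installed, it validates arbitrary JSON-shaped data through a `TypeAdapter`:

```python
from pydantic import JsonValue, TypeAdapter

# JsonValue matches anything representable as JSON:
# None, bool, int, float, str, and lists/dicts thereof.
adapter = TypeAdapter(JsonValue)

data = adapter.validate_python({'answer': 42, 'tags': ['a', 'b'], 'ok': True})
assert data == {'answer': 42, 'tags': ['a', 'b'], 'ok': True}

# Non-JSON-able objects (e.g. sets or arbitrary classes) are
# rejected with a ValidationError.
```

This is only a sketch of typical usage, not a description of the fixes themselves.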
## v2.5.3 (2023-12-22)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.5.3)

### What's Changed

#### Packaging

* uprev `pydantic-core` to 2.14.6

#### Fixes

* Fix memory leak with recursive definitions creating reference cycles by @davidhewitt in [pydantic/pydantic-core#1125](https://github.com/pydantic/pydantic-core/pull/1125)

## v2.5.2 (2023-11-22)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.5.2)

### What's Changed

#### Packaging

* uprev `pydantic-core` to 2.14.5

#### New Features

* Add `ConfigDict.ser_json_inf_nan` by @davidhewitt in [#8159](https://github.com/pydantic/pydantic/pull/8159)

#### Fixes

* Fix validation of `Literal` from JSON keys when used as `dict` key by @sydney-runkle in [pydantic/pydantic-core#1075](https://github.com/pydantic/pydantic-core/pull/1075)
* Fix bug re `custom_init` on members of `Union` by @sydney-runkle in [pydantic/pydantic-core#1076](https://github.com/pydantic/pydantic-core/pull/1076)
* Fix `JsonValue` `bool` serialization by @sydney-runkle in [#8190](https://github.com/pydantic/pydantic/pull/8190)
* Fix handling of unhashable inputs with `Literal` in `Union`s by @sydney-runkle in [pydantic/pydantic-core#1089](https://github.com/pydantic/pydantic-core/pull/1089)

## v2.5.1 (2023-11-15)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.5.1)

### What's Changed

#### Packaging

* uprev `pydantic-core` to 2.14.3 by @samuelcolvin in [#8120](https://github.com/pydantic/pydantic/pull/8120)

#### Fixes

* Fix package description limit by @dmontagu in [#8097](https://github.com/pydantic/pydantic/pull/8097)
* Fix `ValidateCallWrapper` error when creating a model which has a `@validate_call`-wrapped field annotation by @sydney-runkle in [#8110](https://github.com/pydantic/pydantic/pull/8110)

## v2.5.0 (2023-11-13)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.5.0)

The code released in v2.5.0 is functionally identical to that of v2.5.0b1.
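A headline feature of this release, listed under "What's Changed" below, is the callable-discriminator API (`Discriminator` and `Tag`). A minimal sketch of how it is typically used, assuming pydantic v2.5 or later is installed:

```python
from typing import Annotated, Literal, Union

from pydantic import BaseModel, Discriminator, Tag, TypeAdapter

class Cat(BaseModel):
    pet_type: Literal['cat']

class Dog(BaseModel):
    pet_type: Literal['dog']

def pet_discriminator(v):
    # The discriminator callable receives either raw input (a dict)
    # or an already-constructed model instance, and returns a tag.
    if isinstance(v, dict):
        return v.get('pet_type')
    return getattr(v, 'pet_type', None)

# Each union member is labelled with a Tag matching the values the
# discriminator callable can return.
Pet = Annotated[
    Union[Annotated[Cat, Tag('cat')], Annotated[Dog, Tag('dog')]],
    Discriminator(pet_discriminator),
]

adapter = TypeAdapter(Pet)
assert isinstance(adapter.validate_python({'pet_type': 'cat'}), Cat)
```

The model names and discriminator function here are illustrative; the API surface (`Discriminator`, `Tag`) is the one introduced in [#7983](https://github.com/pydantic/pydantic/pull/7983) and renamed in [#8047](https://github.com/pydantic/pydantic/pull/8047).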
### What's Changed

#### Packaging

* Update `pydantic-core` from 2.10.1 to 2.14.1; significant changes from these updates are described below, full changelog [here](https://github.com/pydantic/pydantic-core/compare/v2.10.1...v2.14.1)
* Update to `pyright==1.1.335` by @Viicos in [#8075](https://github.com/pydantic/pydantic/pull/8075)

#### New Features

* Allow plugins to catch non-`ValidationError` errors by @adriangb in [#7806](https://github.com/pydantic/pydantic/pull/7806)
* Support `__doc__` argument in `create_model()` by @chris-spann in [#7863](https://github.com/pydantic/pydantic/pull/7863)
* Expose `regex_engine` flag, allowing a choice between the Rust and Python regex libraries in constraints, by @utkini in [#7768](https://github.com/pydantic/pydantic/pull/7768)
* Save return type generated from type annotation in `ComputedFieldInfo` by @alexmojaki in [#7889](https://github.com/pydantic/pydantic/pull/7889)
* Adopting `ruff` formatter by @Luca-Blight in [#7930](https://github.com/pydantic/pydantic/pull/7930)
* Added `validation_error_cause` to config by @zakstucke in [#7626](https://github.com/pydantic/pydantic/pull/7626)
* Make path of the item to validate available in plugin by @hramezani in [#7861](https://github.com/pydantic/pydantic/pull/7861)
* Add `CallableDiscriminator` and `Tag` by @dmontagu in [#7983](https://github.com/pydantic/pydantic/pull/7983)
* `CallableDiscriminator` renamed to `Discriminator` by @dmontagu in [#8047](https://github.com/pydantic/pydantic/pull/8047)
* Make union case tags affect union error messages by @dmontagu in [#8001](https://github.com/pydantic/pydantic/pull/8001)
* Add `examples` and `json_schema_extra` to `@computed_field` by @alexmojaki in [#8013](https://github.com/pydantic/pydantic/pull/8013)
* Add `JsonValue` type by @dmontagu in [#7998](https://github.com/pydantic/pydantic/pull/7998)
* Allow `str` as argument to `Discriminator` by @dmontagu in [#8047](https://github.com/pydantic/pydantic/pull/8047)
* Add
`SchemaSerializer.__reduce__` method to enable pickle serialization by @edoakes in [pydantic/pydantic-core#1006](https://github.com/pydantic/pydantic-core/pull/1006)

#### Changes

* **Significant change:** replace `ultra_strict` with a new smart union implementation by @davidhewitt in [pydantic/pydantic-core#867](https://github.com/pydantic/pydantic-core/pull/867). The way unions are validated has changed significantly to improve performance and correctness; we have worked hard to absolutely minimise the number of cases where behaviour has changed, see the PR for details.
* Add support for instance method reassignment when `extra='allow'` by @sydney-runkle in [#7683](https://github.com/pydantic/pydantic/pull/7683)
* Support JSON schema generation for `Enum` types with no cases by @sydney-runkle in [#7927](https://github.com/pydantic/pydantic/pull/7927)
* Warn if a class inherits from `Generic` before `BaseModel` by @alexmojaki in [#7891](https://github.com/pydantic/pydantic/pull/7891)

#### Performance

* New custom JSON parser, `jiter`, by @samuelcolvin in [pydantic/pydantic-core#974](https://github.com/pydantic/pydantic-core/pull/974)
* PGO build for macOS M1 by @samuelcolvin in [pydantic/pydantic-core#1063](https://github.com/pydantic/pydantic-core/pull/1063)
* Use `__getattr__` for all package imports, improve import time by @samuelcolvin in [#7947](https://github.com/pydantic/pydantic/pull/7947)

#### Fixes

* Fix `mypy` issue with subclasses of `RootModel` by @sydney-runkle in [#7677](https://github.com/pydantic/pydantic/pull/7677)
* Properly rebuild the `FieldInfo` when a forward ref gets evaluated by @dmontagu in [#7698](https://github.com/pydantic/pydantic/pull/7698)
* Fix failure to load `SecretStr` from JSON (regression in v2.4) by @sydney-runkle in [#7729](https://github.com/pydantic/pydantic/pull/7729)
* Fix `defer_build` behavior with `TypeAdapter` by @sydney-runkle in [#7736](https://github.com/pydantic/pydantic/pull/7736)
* Improve compatibility with
legacy `mypy` versions by @dmontagu in [#7742](https://github.com/pydantic/pydantic/pull/7742)
* Fix: update `TypeVar` handling when default is not set by @pmmmwh in [#7719](https://github.com/pydantic/pydantic/pull/7719)
* Support specification of `strict` on `Enum` type fields by @sydney-runkle in [#7761](https://github.com/pydantic/pydantic/pull/7761)
* Wrap `weakref.ref` instead of subclassing to fix `cloudpickle` serialization by @edoakes in [#7780](https://github.com/pydantic/pydantic/pull/7780)
* Keep values of private attributes set within `model_post_init` in subclasses by @alexmojaki in [#7775](https://github.com/pydantic/pydantic/pull/7775)
* Add more specific type for non-callable `json_schema_extra` by @alexmojaki in [#7803](https://github.com/pydantic/pydantic/pull/7803)
* Raise an error when deleting frozen (model) fields by @alexmojaki in [#7800](https://github.com/pydantic/pydantic/pull/7800)
* Fix schema sorting bug with default values by @sydney-runkle in [#7817](https://github.com/pydantic/pydantic/pull/7817)
* Use generated alias for aliases that are not specified otherwise by @alexmojaki in [#7802](https://github.com/pydantic/pydantic/pull/7802)
* Support `strict` specification for `UUID` types by @sydney-runkle in [#7865](https://github.com/pydantic/pydantic/pull/7865)
* JSON schema: fix extra parameter handling by @me-and in [#7810](https://github.com/pydantic/pydantic/pull/7810)
* Fix: support `pydantic.Field(kw_only=True)` with inherited dataclasses by @PrettyWood in [#7827](https://github.com/pydantic/pydantic/pull/7827)
* Support `validate_call` decorator for methods in classes with `__slots__` by @sydney-runkle in [#7883](https://github.com/pydantic/pydantic/pull/7883)
* Fix pydantic dataclass problem with `dataclasses.field` default by @hramezani in [#7898](https://github.com/pydantic/pydantic/pull/7898)
* Fix schema generation for generics with union type bounds by @sydney-runkle in
[#7899](https://github.com/pydantic/pydantic/pull/7899)
* Fix version for `importlib_metadata` on python 3.7 by @sydney-runkle in [#7904](https://github.com/pydantic/pydantic/pull/7904)
* Support `|` operator (`Union`) in `PydanticRecursiveRef` by @alexmojaki in [#7892](https://github.com/pydantic/pydantic/pull/7892)
* Fix `display_as_type` for `TypeAliasType` in python 3.12 by @dmontagu in [#7929](https://github.com/pydantic/pydantic/pull/7929)
* Add support for `NotRequired` generics in `TypedDict` by @sydney-runkle in [#7932](https://github.com/pydantic/pydantic/pull/7932)
* Make generic `TypeAliasType` specifications produce different schema definitions by @alexdrydew in [#7893](https://github.com/pydantic/pydantic/pull/7893)
* Fix signature of inherited dataclass by @howsunjow in [#7925](https://github.com/pydantic/pydantic/pull/7925)
* Make the model name generation more robust in JSON schema by @joakimnordling in [#7881](https://github.com/pydantic/pydantic/pull/7881)
* Fix plurals in validation error messages (in tests) by @Iipin in [#7972](https://github.com/pydantic/pydantic/pull/7972)
* `PrivateAttr` is passed from `Annotated` default position by @tabassco in [#8004](https://github.com/pydantic/pydantic/pull/8004)
* Don't decode bytes (which may not be UTF8) when displaying `SecretBytes` by @alexmojaki in [#8012](https://github.com/pydantic/pydantic/pull/8012)
* Use `classmethod` instead of `classmethod[Any, Any, Any]` by @Mr-Pepe in [#7979](https://github.com/pydantic/pydantic/pull/7979)
* Clearer error on invalid Plugin by @samuelcolvin in [#8023](https://github.com/pydantic/pydantic/pull/8023)
* Correct pydantic dataclasses import by @samuelcolvin in [#8027](https://github.com/pydantic/pydantic/pull/8027)
* Fix misbehavior for models referencing redefined type aliases by @dmontagu in [#8050](https://github.com/pydantic/pydantic/pull/8050)
* Fix `Optional` field with `validate_default` only performing one field validation by @sydney-runkle in
[pydantic/pydantic-core#1002](https://github.com/pydantic/pydantic-core/pull/1002)
* Fix `definition-ref` bug with `Dict` keys by @sydney-runkle in [pydantic/pydantic-core#1014](https://github.com/pydantic/pydantic-core/pull/1014)
* Fix bug allowing validation of `bool` types with `coerce_numbers_to_str=True` by @sydney-runkle in [pydantic/pydantic-core#1017](https://github.com/pydantic/pydantic-core/pull/1017)
* Don't accept `NaN` in float and decimal constraints by @davidhewitt in [pydantic/pydantic-core#1037](https://github.com/pydantic/pydantic-core/pull/1037)
* Add `lax_str` and `lax_int` support for enum values not inherited from str/int by @michaelhly in [pydantic/pydantic-core#1015](https://github.com/pydantic/pydantic-core/pull/1015)
* Support subclasses in lists in `Union` of `List` types by @sydney-runkle in [pydantic/pydantic-core#1039](https://github.com/pydantic/pydantic-core/pull/1039)
* Allow validation against `max_digits` and `decimals` to pass if normalized or non-normalized input is valid by @sydney-runkle in [pydantic/pydantic-core#1049](https://github.com/pydantic/pydantic-core/pull/1049)
* Fix: proper pluralization in `ValidationError` messages by @Iipin in [pydantic/pydantic-core#1050](https://github.com/pydantic/pydantic-core/pull/1050)
* Disallow the string `'-'` as `datetime` input by @davidhewitt in [pydantic/speedate#52](https://github.com/pydantic/speedate/pull/52) & [pydantic/pydantic-core#1060](https://github.com/pydantic/pydantic-core/pull/1060)
* Fix: NaN and Inf float serialization by @davidhewitt in [pydantic/pydantic-core#1062](https://github.com/pydantic/pydantic-core/pull/1062)
* Restore manylinux-compatible PGO builds by @davidhewitt in [pydantic/pydantic-core#1068](https://github.com/pydantic/pydantic-core/pull/1068)

### New Contributors

#### `pydantic`

* @schneebuzz made their first contribution in [#7699](https://github.com/pydantic/pydantic/pull/7699)
* @edoakes made their first contribution in
[#7780](https://github.com/pydantic/pydantic/pull/7780)
* @alexmojaki made their first contribution in [#7775](https://github.com/pydantic/pydantic/pull/7775)
* @NickG123 made their first contribution in [#7751](https://github.com/pydantic/pydantic/pull/7751)
* @gowthamgts made their first contribution in [#7830](https://github.com/pydantic/pydantic/pull/7830)
* @jamesbraza made their first contribution in [#7848](https://github.com/pydantic/pydantic/pull/7848)
* @laundmo made their first contribution in [#7850](https://github.com/pydantic/pydantic/pull/7850)
* @rahmatnazali made their first contribution in [#7870](https://github.com/pydantic/pydantic/pull/7870)
* @waterfountain1996 made their first contribution in [#7878](https://github.com/pydantic/pydantic/pull/7878)
* @chris-spann made their first contribution in [#7863](https://github.com/pydantic/pydantic/pull/7863)
* @me-and made their first contribution in [#7810](https://github.com/pydantic/pydantic/pull/7810)
* @utkini made their first contribution in [#7768](https://github.com/pydantic/pydantic/pull/7768)
* @bn-l made their first contribution in [#7744](https://github.com/pydantic/pydantic/pull/7744)
* @alexdrydew made their first contribution in [#7893](https://github.com/pydantic/pydantic/pull/7893)
* @Luca-Blight made their first contribution in [#7930](https://github.com/pydantic/pydantic/pull/7930)
* @howsunjow made their first contribution in [#7925](https://github.com/pydantic/pydantic/pull/7925)
* @joakimnordling made their first contribution in [#7881](https://github.com/pydantic/pydantic/pull/7881)
* @icfly2 made their first contribution in [#7976](https://github.com/pydantic/pydantic/pull/7976)
* @Yummy-Yums made their first contribution in [#8003](https://github.com/pydantic/pydantic/pull/8003)
* @Iipin made their first contribution in [#7972](https://github.com/pydantic/pydantic/pull/7972)
* @tabassco made their first contribution in [#8004](https://github.com/pydantic/pydantic/pull/8004)
*
@Mr-Pepe made their first contribution in [#7979](https://github.com/pydantic/pydantic/pull/7979)
* @0x00cl made their first contribution in [#8010](https://github.com/pydantic/pydantic/pull/8010)
* @barraponto made their first contribution in [#8032](https://github.com/pydantic/pydantic/pull/8032)

#### `pydantic-core`

* @sisp made their first contribution in [pydantic/pydantic-core#995](https://github.com/pydantic/pydantic-core/pull/995)
* @michaelhly made their first contribution in [pydantic/pydantic-core#1015](https://github.com/pydantic/pydantic-core/pull/1015)

## v2.5.0b1 (2023-11-09)

Pre-release, see [the GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.5.0b1) for details.

## v2.4.2 (2023-09-27)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.4.2)

### What's Changed

#### Fixes

* Fix bug with JSON schema for sequence of discriminated union by @dmontagu in [#7647](https://github.com/pydantic/pydantic/pull/7647)
* Fix schema references in discriminated unions by @adriangb in [#7646](https://github.com/pydantic/pydantic/pull/7646)
* Fix json schema generation for recursive models by @adriangb in [#7653](https://github.com/pydantic/pydantic/pull/7653)
* Fix `models_json_schema` for generic models by @adriangb in [#7654](https://github.com/pydantic/pydantic/pull/7654)
* Fix xfailed test for generic model signatures by @adriangb in [#7658](https://github.com/pydantic/pydantic/pull/7658)

### New Contributors

* @austinorr made their first contribution in [#7657](https://github.com/pydantic/pydantic/pull/7657)
* @peterHoburg made their first contribution in [#7670](https://github.com/pydantic/pydantic/pull/7670)

## v2.4.1 (2023-09-26)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.4.1)

### What's Changed

#### Packaging

* Update `pydantic-core` to 2.10.1 by @davidhewitt in [#7633](https://github.com/pydantic/pydantic/pull/7633)

#### Fixes

* Serialize unsubstituted type vars as `Any` by @adriangb in
[#7606](https://github.com/pydantic/pydantic/pull/7606)
* Remove schema building caches by @adriangb in [#7624](https://github.com/pydantic/pydantic/pull/7624)
* Fix an issue where JSON schema extras weren't JSON encoded by @dmontagu in [#7625](https://github.com/pydantic/pydantic/pull/7625)

## v2.4.0 (2023-09-22)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.4.0)

### What's Changed

#### Packaging

* Update `pydantic-core` to 2.10.0 by @samuelcolvin in [#7542](https://github.com/pydantic/pydantic/pull/7542)

#### New Features

* Add `Base64Url` types by @dmontagu in [#7286](https://github.com/pydantic/pydantic/pull/7286)
* Implement optional `number` to `str` coercion by @lig in [#7508](https://github.com/pydantic/pydantic/pull/7508)
* Allow access to `field_name` and `data` in all validators if there is data and a field name by @samuelcolvin in [#7542](https://github.com/pydantic/pydantic/pull/7542)
* Add `BaseModel.model_validate_strings` and `TypeAdapter.validate_strings` by @hramezani in [#7552](https://github.com/pydantic/pydantic/pull/7552)
* Add Pydantic `plugins` experimental implementation by @lig, @samuelcolvin and @Kludex in [#6820](https://github.com/pydantic/pydantic/pull/6820)

#### Changes

* Do not override `model_post_init` in subclass with private attrs by @Viicos in [#7302](https://github.com/pydantic/pydantic/pull/7302)
* Make fields with defaults not required in the serialization schema by default by @dmontagu in [#7275](https://github.com/pydantic/pydantic/pull/7275)
* Mark `Extra` as deprecated by @disrupted in [#7299](https://github.com/pydantic/pydantic/pull/7299)
* Make `EncodedStr` a dataclass by @Kludex in [#7396](https://github.com/pydantic/pydantic/pull/7396)
* Move `annotated_handlers` to be public by @samuelcolvin in [#7569](https://github.com/pydantic/pydantic/pull/7569)

#### Performance

* Simplify flattening and inlining of `CoreSchema` by @adriangb in [#7523](https://github.com/pydantic/pydantic/pull/7523)
* Remove
unused copies in `CoreSchema` walking by @adriangb in [#7528](https://github.com/pydantic/pydantic/pull/7528)
* Add caches for collecting definitions and invalid schemas from a `CoreSchema` by @adriangb in [#7527](https://github.com/pydantic/pydantic/pull/7527)
* Eagerly resolve discriminated unions and cache cases where we can't by @adriangb in [#7529](https://github.com/pydantic/pydantic/pull/7529)
* Replace `dict.get` and `dict.setdefault` with more verbose versions in `CoreSchema` building hot paths by @adriangb in [#7536](https://github.com/pydantic/pydantic/pull/7536)
* Cache invalid `CoreSchema` discovery by @adriangb in [#7535](https://github.com/pydantic/pydantic/pull/7535)
* Allow disabling `CoreSchema` validation for faster startup times by @adriangb in [#7565](https://github.com/pydantic/pydantic/pull/7565)

#### Fixes

* Fix config detection for `TypedDict` from grandparent classes by @dmontagu in [#7272](https://github.com/pydantic/pydantic/pull/7272)
* Fix hash function generation for frozen models with unusual MRO by @dmontagu in [#7274](https://github.com/pydantic/pydantic/pull/7274)
* Make `strict` config overridable in field for `Path` by @hramezani in [#7281](https://github.com/pydantic/pydantic/pull/7281)
* Use `ser_json_` on default in `GenerateJsonSchema` by @Kludex in [#7269](https://github.com/pydantic/pydantic/pull/7269)
* Adding a check that alias is validated as an identifier for Python by @andree0 in [#7319](https://github.com/pydantic/pydantic/pull/7319)
* Raise an error when computed field overrides field by @sydney-runkle in [#7346](https://github.com/pydantic/pydantic/pull/7346)
* Fix applying `SkipValidation` to referenced schemas by @adriangb in [#7381](https://github.com/pydantic/pydantic/pull/7381)
* Enforce behavior of private attributes having double leading underscore by @lig in [#7265](https://github.com/pydantic/pydantic/pull/7265)
* Standardize `__get_pydantic_core_schema__` signature by @hramezani in
[#7415](https://github.com/pydantic/pydantic/pull/7415)
* Fix generic dataclass fields mutation bug (when using `TypeAdapter`) by @sydney-runkle in [#7435](https://github.com/pydantic/pydantic/pull/7435)
* Fix `TypeError` on `model_validator` in `wrap` mode by @pmmmwh in [#7496](https://github.com/pydantic/pydantic/pull/7496)
* Improve enum error message by @hramezani in [#7506](https://github.com/pydantic/pydantic/pull/7506)
* Make `repr` work for instances that failed initialization when handling `ValidationError`s by @dmontagu in [#7439](https://github.com/pydantic/pydantic/pull/7439)
* Fixed a regular expression denial of service issue by limiting whitespaces by @prodigysml in [#7360](https://github.com/pydantic/pydantic/pull/7360)
* Fix handling of `UUID` values having `UUID.version=None` by @lig in [#7566](https://github.com/pydantic/pydantic/pull/7566)
* Fix `__iter__` returning private `cached_property` info by @sydney-runkle in [#7570](https://github.com/pydantic/pydantic/pull/7570)
* Improvements to version info message by @samuelcolvin in [#7594](https://github.com/pydantic/pydantic/pull/7594)

### New Contributors

* @15498th made their first contribution in [#7238](https://github.com/pydantic/pydantic/pull/7238)
* @GabrielCappelli made their first contribution in [#7213](https://github.com/pydantic/pydantic/pull/7213)
* @tobni made their first contribution in [#7184](https://github.com/pydantic/pydantic/pull/7184)
* @redruin1 made their first contribution in [#7282](https://github.com/pydantic/pydantic/pull/7282)
* @FacerAin made their first contribution in [#7288](https://github.com/pydantic/pydantic/pull/7288)
* @acdha made their first contribution in [#7297](https://github.com/pydantic/pydantic/pull/7297)
* @andree0 made their first contribution in [#7319](https://github.com/pydantic/pydantic/pull/7319)
* @gordonhart made their first contribution in [#7375](https://github.com/pydantic/pydantic/pull/7375)
* @pmmmwh made their first contribution in
[#7496](https://github.com/pydantic/pydantic/pull/7496)
* @disrupted made their first contribution in [#7299](https://github.com/pydantic/pydantic/pull/7299)
* @prodigysml made their first contribution in [#7360](https://github.com/pydantic/pydantic/pull/7360)

## v2.3.0 (2023-08-23)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.3.0)

* 🔥 Remove orphaned changes file from repo by @lig in [#7168](https://github.com/pydantic/pydantic/pull/7168)
* Add copy button on documentation by @Kludex in [#7190](https://github.com/pydantic/pydantic/pull/7190)
* Fix docs on JSON type by @Kludex in [#7189](https://github.com/pydantic/pydantic/pull/7189)
* Update mypy 1.5.0 to 1.5.1 in CI by @hramezani in [#7191](https://github.com/pydantic/pydantic/pull/7191)
* Fix download links badge by @samuelcolvin in [#7200](https://github.com/pydantic/pydantic/pull/7200)
* Add 2.2.1 to changelog by @samuelcolvin in [#7212](https://github.com/pydantic/pydantic/pull/7212)
* Make `ModelWrapValidator` protocols generic by @dmontagu in [#7154](https://github.com/pydantic/pydantic/pull/7154)
* Correct `Field(..., exclude: bool)` docs by @samuelcolvin in [#7214](https://github.com/pydantic/pydantic/pull/7214)
* Make shadowing attributes a warning instead of an error by @adriangb in [#7193](https://github.com/pydantic/pydantic/pull/7193)
* Document `Base64Str` and `Base64Bytes` by @Kludex in [#7192](https://github.com/pydantic/pydantic/pull/7192)
* Fix `config.defer_build` for serialization-first cases by @samuelcolvin in [#7024](https://github.com/pydantic/pydantic/pull/7024)
* Clean Model docstrings in JSON Schema by @samuelcolvin in [#7210](https://github.com/pydantic/pydantic/pull/7210)
* Fix typo ([#7228](https://github.com/pydantic/pydantic/pull/7228)): correct `validate_default` kwarg in the `validators.md` docs by @lmmx in [#7229](https://github.com/pydantic/pydantic/pull/7229)
* ✅ Implement `tzinfo.fromutc` method for `TzInfo` in `pydantic-core` by @lig in
[#7019](https://github.com/pydantic/pydantic/pull/7019)
* Support `__get_validators__` by @hramezani in [#7197](https://github.com/pydantic/pydantic/pull/7197)

## v2.2.1 (2023-08-18)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.2.1)

* Make `xfail`ing test for root model extra stop `xfail`ing by @dmontagu in [#6937](https://github.com/pydantic/pydantic/pull/6937)
* Optimize recursion detection by stopping on the second visit for the same object by @mciucu in [#7160](https://github.com/pydantic/pydantic/pull/7160)
* Fix link in docs by @tlambert03 in [#7166](https://github.com/pydantic/pydantic/pull/7166)
* Replace MiMalloc with the default allocator by @adriangb in [pydantic/pydantic-core#900](https://github.com/pydantic/pydantic-core/pull/900)
* Bump pydantic-core to 2.6.1 and prepare 2.2.1 release by @adriangb in [#7176](https://github.com/pydantic/pydantic/pull/7176)

## v2.2.0 (2023-08-17)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.2.0)

* Split "pipx install" setup command into two commands on the documentation site by @nomadmtb in [#6869](https://github.com/pydantic/pydantic/pull/6869)
* Deprecate `Field.include` by @hramezani in [#6852](https://github.com/pydantic/pydantic/pull/6852)
* Fix typo in default factory error msg by @hramezani in [#6880](https://github.com/pydantic/pydantic/pull/6880)
* Simplify handling of `typing.Annotated` in `GenerateSchema` by @dmontagu in [#6887](https://github.com/pydantic/pydantic/pull/6887)
* Re-enable fastapi tests in CI by @dmontagu in [#6883](https://github.com/pydantic/pydantic/pull/6883)
* Make it harder to hit collisions with json schema defrefs by @dmontagu in [#6566](https://github.com/pydantic/pydantic/pull/6566)
* Cleaner error for invalid input to `Path` fields by @samuelcolvin in [#6903](https://github.com/pydantic/pydantic/pull/6903)
* 📝 Support Coordinate Type by @yezz123 in [#6906](https://github.com/pydantic/pydantic/pull/6906)
* Fix `ForwardRef` wrapper for py
3.10.0 (shim until bpo-45166) by @randomir in [#6919](https://github.com/pydantic/pydantic/pull/6919)
* Fix misbehavior related to copying of `RootModel` by @dmontagu in [#6918](https://github.com/pydantic/pydantic/pull/6918)
* Fix issue with recursion error caused by `ParamSpec` by @dmontagu in [#6923](https://github.com/pydantic/pydantic/pull/6923)
* Add section about Constrained classes to the Migration Guide by @Kludex in [#6924](https://github.com/pydantic/pydantic/pull/6924)
* Use `main` branch for badge links by @Viicos in [#6925](https://github.com/pydantic/pydantic/pull/6925)
* Add test for v1/v2 `Annotated` discrepancy by @carlbordum in [#6926](https://github.com/pydantic/pydantic/pull/6926)
* Make the v1 mypy plugin work with both v1 and v2 by @dmontagu in [#6921](https://github.com/pydantic/pydantic/pull/6921)
* Fix issue where generic models couldn't be parametrized with `BaseModel` by @dmontagu in [#6933](https://github.com/pydantic/pydantic/pull/6933)
* Remove xfail for discriminated union with alias by @dmontagu in [#6938](https://github.com/pydantic/pydantic/pull/6938)
* Add `field_serializer` to `computed_field` by @andresliszt in [#6965](https://github.com/pydantic/pydantic/pull/6965)
* Use `union_schema` with `Type[Union[...]]` by @JeanArhancet in [#6952](https://github.com/pydantic/pydantic/pull/6952)
* Fix inherited typeddict attributes / config by @adriangb in [#6981](https://github.com/pydantic/pydantic/pull/6981)
* Fix dataclass annotated before validator called twice by @davidhewitt in [#6998](https://github.com/pydantic/pydantic/pull/6998)
* Update test-fastapi deselected tests by @hramezani in [#7014](https://github.com/pydantic/pydantic/pull/7014)
* Fix validator doc format by @hramezani in [#7015](https://github.com/pydantic/pydantic/pull/7015)
* Fix typo in docstring of `model_json_schema` by @AdamVinch-Federated in [#7032](https://github.com/pydantic/pydantic/pull/7032)
* Remove unused "type ignores" with pyright by @samuelcolvin in
[#7026](https://github.com/pydantic/pydantic/pull/7026) * Add benchmark representing FastAPI startup time by @adriangb in [#7030](https://github.com/pydantic/pydantic/pull/7030) * Fix json_encoders for Enum subclasses by @adriangb in [#7029](https://github.com/pydantic/pydantic/pull/7029) * Update docstring of `ser_json_bytes` regarding base64 encoding by @Viicos in [#7052](https://github.com/pydantic/pydantic/pull/7052) * Allow `@validate_call` to work on async methods by @adriangb in [#7046](https://github.com/pydantic/pydantic/pull/7046) * Fix: mypy error with `Settings` and `SettingsConfigDict` by @JeanArhancet in [#7002](https://github.com/pydantic/pydantic/pull/7002) * Fix some typos (repeated words and it's/its) by @eumiro in [#7063](https://github.com/pydantic/pydantic/pull/7063) * Fix the typo in docstring by @harunyasar in [#7062](https://github.com/pydantic/pydantic/pull/7062) * Docs: Fix broken URL in the pydantic-settings package recommendation by @swetjen in [#6995](https://github.com/pydantic/pydantic/pull/6995) * Handle constraints being applied to schemas that don't accept it by @adriangb in [#6951](https://github.com/pydantic/pydantic/pull/6951) * Replace almost_equal_floats with math.isclose by @eumiro in [#7082](https://github.com/pydantic/pydantic/pull/7082) * bump pydantic-core to 2.5.0 by @davidhewitt in [#7077](https://github.com/pydantic/pydantic/pull/7077) * Add `short_version` and use it in links by @hramezani in [#7115](https://github.com/pydantic/pydantic/pull/7115) * 📝 Add usage link to `RootModel` by @Kludex in [#7113](https://github.com/pydantic/pydantic/pull/7113) * Revert "Fix default port for mongosrv DSNs (#6827)" by @Kludex in [#7116](https://github.com/pydantic/pydantic/pull/7116) * Clarify validate_default and _Unset handling in usage docs and migration guide by @benbenbang in [#6950](https://github.com/pydantic/pydantic/pull/6950) * Tweak documentation of `Field.exclude` by @Viicos in 
[#7086](https://github.com/pydantic/pydantic/pull/7086) * Do not require `validate_assignment` to use `Field.frozen` by @Viicos in [#7103](https://github.com/pydantic/pydantic/pull/7103) * tweaks to `_core_utils` by @samuelcolvin in [#7040](https://github.com/pydantic/pydantic/pull/7040) * Make DefaultDict working with set by @hramezani in [#7126](https://github.com/pydantic/pydantic/pull/7126) * Don't always require typing.Generic as a base for partially parametrized models by @dmontagu in [#7119](https://github.com/pydantic/pydantic/pull/7119) * Fix issue with JSON schema incorrectly using parent class core schema by @dmontagu in [#7020](https://github.com/pydantic/pydantic/pull/7020) * Fix xfailed test related to TypedDict and alias_generator by @dmontagu in [#6940](https://github.com/pydantic/pydantic/pull/6940) * Improve error message for NameEmail by @dmontagu in [#6939](https://github.com/pydantic/pydantic/pull/6939) * Fix generic computed fields by @dmontagu in [#6988](https://github.com/pydantic/pydantic/pull/6988) * Reflect namedtuple default values during validation by @dmontagu in [#7144](https://github.com/pydantic/pydantic/pull/7144) * Update dependencies, fix pydantic-core usage, fix CI issues by @dmontagu in [#7150](https://github.com/pydantic/pydantic/pull/7150) * Add mypy 1.5.0 by @hramezani in [#7118](https://github.com/pydantic/pydantic/pull/7118) * Handle non-json native enum values by @adriangb in [#7056](https://github.com/pydantic/pydantic/pull/7056) * document `round_trip` in Json type documentation by @jc-louis in [#7137](https://github.com/pydantic/pydantic/pull/7137) * Relax signature checks to better support builtins and C extension functions as validators by @adriangb in [#7101](https://github.com/pydantic/pydantic/pull/7101) * add union_mode='left_to_right' by @davidhewitt in [#7151](https://github.com/pydantic/pydantic/pull/7151) * Include an error message hint for inherited ordering by @yvalencia91 in 
[#7124](https://github.com/pydantic/pydantic/pull/7124)
* Fix one docs link and resolve some warnings for two others by @dmontagu in [#7153](https://github.com/pydantic/pydantic/pull/7153)
* Include Field extra keys name in warning by @hramezani in [#7136](https://github.com/pydantic/pydantic/pull/7136)

## v2.1.1 (2023-07-25)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.1.1)

* Skip FieldInfo merging when unnecessary by @dmontagu in [#6862](https://github.com/pydantic/pydantic/pull/6862)

## v2.1.0 (2023-07-25)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.1.0)

* Add `StringConstraints` for use as Annotated metadata by @adriangb in [#6605](https://github.com/pydantic/pydantic/pull/6605)
* Try to fix intermittently failing CI by @adriangb in [#6683](https://github.com/pydantic/pydantic/pull/6683)
* Remove redundant example of optional vs default. by @ehiggs-deliverect in [#6676](https://github.com/pydantic/pydantic/pull/6676)
* Docs update by @samuelcolvin in [#6692](https://github.com/pydantic/pydantic/pull/6692)
* Remove the Validate always section in validator docs by @adriangb in [#6679](https://github.com/pydantic/pydantic/pull/6679)
* Fix recursion error in json schema generation by @adriangb in [#6720](https://github.com/pydantic/pydantic/pull/6720)
* Fix incorrect subclass check for secretstr by @AlexVndnblcke in [#6730](https://github.com/pydantic/pydantic/pull/6730)
* update pdm / pdm lockfile to 2.8.0 by @davidhewitt in [#6714](https://github.com/pydantic/pydantic/pull/6714)
* unpin pdm on more CI jobs by @davidhewitt in [#6755](https://github.com/pydantic/pydantic/pull/6755)
* improve source locations for auxiliary packages in docs by @davidhewitt in [#6749](https://github.com/pydantic/pydantic/pull/6749)
* Assume builtins don't accept an info argument by @adriangb in [#6754](https://github.com/pydantic/pydantic/pull/6754)
* Fix bug where calling `help(BaseModelSubclass)` raises errors by @hramezani in [#6758](https://github.com/pydantic/pydantic/pull/6758)
* Fix mypy plugin handling of `@model_validator(mode="after")` by @ljodal in [#6753](https://github.com/pydantic/pydantic/pull/6753)
* update pydantic-core to 2.3.1 by @davidhewitt in [#6756](https://github.com/pydantic/pydantic/pull/6756)
* Mypy plugin for settings by @hramezani in [#6760](https://github.com/pydantic/pydantic/pull/6760)
* Use `contentSchema` keyword for JSON schema by @dmontagu in [#6715](https://github.com/pydantic/pydantic/pull/6715)
* fast-path checking finite decimals by @davidhewitt in [#6769](https://github.com/pydantic/pydantic/pull/6769)
* Docs update by @samuelcolvin in [#6771](https://github.com/pydantic/pydantic/pull/6771)
* Improve json schema doc by @hramezani in [#6772](https://github.com/pydantic/pydantic/pull/6772)
* Update validator docs by @adriangb in [#6695](https://github.com/pydantic/pydantic/pull/6695)
* Fix typehint for wrap validator by @dmontagu in [#6788](https://github.com/pydantic/pydantic/pull/6788)
* 🐛 Fix validation warning for unions of Literal and other type by @lig in [#6628](https://github.com/pydantic/pydantic/pull/6628)
* Update documentation for generics support in V2 by @tpdorsey in [#6685](https://github.com/pydantic/pydantic/pull/6685)
* add pydantic-core build info to `version_info()` by @samuelcolvin in [#6785](https://github.com/pydantic/pydantic/pull/6785)
* Fix pydantic dataclasses that use slots with default values by @dmontagu in [#6796](https://github.com/pydantic/pydantic/pull/6796)
* Fix inheritance of hash function for frozen models by @dmontagu in [#6789](https://github.com/pydantic/pydantic/pull/6789)
* ✨ Add `SkipJsonSchema` annotation by @Kludex in [#6653](https://github.com/pydantic/pydantic/pull/6653)
* Error if an invalid field name is used with Field by @dmontagu in [#6797](https://github.com/pydantic/pydantic/pull/6797)
* Add `GenericModel` to `MOVED_IN_V2` by @adriangb in [#6776](https://github.com/pydantic/pydantic/pull/6776)
* Remove unused code from `docs/usage/types/custom.md` by @hramezani in [#6803](https://github.com/pydantic/pydantic/pull/6803)
* Fix `float` -> `Decimal` coercion precision loss by @adriangb in [#6810](https://github.com/pydantic/pydantic/pull/6810)
* remove email validation from the north star benchmark by @davidhewitt in [#6816](https://github.com/pydantic/pydantic/pull/6816)
* Fix link to mypy by @progsmile in [#6824](https://github.com/pydantic/pydantic/pull/6824)
* Improve initialization hooks example by @hramezani in [#6822](https://github.com/pydantic/pydantic/pull/6822)
* Fix default port for mongosrv DSNs by @dmontagu in [#6827](https://github.com/pydantic/pydantic/pull/6827)
* Improve API documentation, in particular more links between usage and API docs by @samuelcolvin in [#6780](https://github.com/pydantic/pydantic/pull/6780)
* update pydantic-core to 2.4.0 by @davidhewitt in [#6831](https://github.com/pydantic/pydantic/pull/6831)
* Fix `annotated_types.MaxLen` validator for custom sequence types by @ImogenBits in [#6809](https://github.com/pydantic/pydantic/pull/6809)
* Update V1 by @hramezani in [#6833](https://github.com/pydantic/pydantic/pull/6833)
* Make it so callable JSON schema extra works by @dmontagu in [#6798](https://github.com/pydantic/pydantic/pull/6798)
* Fix serialization issue with `InstanceOf` by @dmontagu in [#6829](https://github.com/pydantic/pydantic/pull/6829)
* Add back support for `json_encoders` by @adriangb in [#6811](https://github.com/pydantic/pydantic/pull/6811)
* Update field annotations when building the schema by @dmontagu in [#6838](https://github.com/pydantic/pydantic/pull/6838)
* Use `WeakValueDictionary` to fix generic memory leak by @dmontagu in [#6681](https://github.com/pydantic/pydantic/pull/6681)
* Add `config.defer_build` to optionally make model building lazy by @samuelcolvin in [#6823](https://github.com/pydantic/pydantic/pull/6823)
* delegate `UUID` serialization to pydantic-core by @davidhewitt in
[#6850](https://github.com/pydantic/pydantic/pull/6850)
* Update `json_encoders` docs by @adriangb in [#6848](https://github.com/pydantic/pydantic/pull/6848)
* Fix error message for `staticmethod`/`classmethod` order with validate_call by @dmontagu in [#6686](https://github.com/pydantic/pydantic/pull/6686)
* Improve documentation for `Config` by @samuelcolvin in [#6847](https://github.com/pydantic/pydantic/pull/6847)
* Update serialization doc to mention `Field.exclude` takes priority over call-time `include/exclude` by @hramezani in [#6851](https://github.com/pydantic/pydantic/pull/6851)
* Allow customizing core schema generation by making `GenerateSchema` public by @adriangb in [#6737](https://github.com/pydantic/pydantic/pull/6737)

## v2.0.3 (2023-07-05)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.0.3)

* Mention PyObject (v1) moving to ImportString (v2) in migration doc by @slafs in [#6456](https://github.com/pydantic/pydantic/pull/6456)
* Fix release-tweet CI by @Kludex in [#6461](https://github.com/pydantic/pydantic/pull/6461)
* Revise the section on required / optional / nullable fields by @ybressler in [#6468](https://github.com/pydantic/pydantic/pull/6468)
* Warn if a type hint is not in fact a type by @adriangb in [#6479](https://github.com/pydantic/pydantic/pull/6479)
* Replace TransformSchema with GetPydanticSchema by @dmontagu in [#6484](https://github.com/pydantic/pydantic/pull/6484)
* Fix the un-hashability of various annotation types, for use in caching generic containers by @dmontagu in [#6480](https://github.com/pydantic/pydantic/pull/6480)
* PYD-164: Rework custom types docs by @adriangb in [#6490](https://github.com/pydantic/pydantic/pull/6490)
* Fix ci by @adriangb in [#6507](https://github.com/pydantic/pydantic/pull/6507)
* Fix forward ref in generic by @adriangb in [#6511](https://github.com/pydantic/pydantic/pull/6511)
* Fix generation of serialization JSON schemas for core_schema.ChainSchema by @dmontagu in [#6515](https://github.com/pydantic/pydantic/pull/6515)
* Document the change in `Field.alias` behavior in Pydantic V2 by @hramezani in [#6508](https://github.com/pydantic/pydantic/pull/6508)
* Give better error message attempting to compute the json schema of a model with undefined fields by @dmontagu in [#6519](https://github.com/pydantic/pydantic/pull/6519)
* Document `alias_priority` by @tpdorsey in [#6520](https://github.com/pydantic/pydantic/pull/6520)
* Add redirect for types documentation by @tpdorsey in [#6513](https://github.com/pydantic/pydantic/pull/6513)
* Allow updating docs without release by @samuelcolvin in [#6551](https://github.com/pydantic/pydantic/pull/6551)
* Ensure docs tests always run in the right folder by @dmontagu in [#6487](https://github.com/pydantic/pydantic/pull/6487)
* Defer evaluation of return type hints for serializer functions by @dmontagu in [#6516](https://github.com/pydantic/pydantic/pull/6516)
* Disable E501 from Ruff and rely on just Black by @adriangb in [#6552](https://github.com/pydantic/pydantic/pull/6552)
* Update JSON Schema documentation for V2 by @tpdorsey in [#6492](https://github.com/pydantic/pydantic/pull/6492)
* Add documentation of cyclic reference handling by @dmontagu in [#6493](https://github.com/pydantic/pydantic/pull/6493)
* Remove the need for change files by @samuelcolvin in [#6556](https://github.com/pydantic/pydantic/pull/6556)
* add "north star" benchmark by @davidhewitt in [#6547](https://github.com/pydantic/pydantic/pull/6547)
* Update Dataclasses docs by @tpdorsey in [#6470](https://github.com/pydantic/pydantic/pull/6470)
* ♻️ Use different error message on v1 redirects by @Kludex in [#6595](https://github.com/pydantic/pydantic/pull/6595)
* ⬆ Upgrade `pydantic-core` to v2.2.0 by @lig in [#6589](https://github.com/pydantic/pydantic/pull/6589)
* Fix serialization for IPvAny by @dmontagu in [#6572](https://github.com/pydantic/pydantic/pull/6572)
* Improve CI by using PDM instead of pip to install typing-extensions by @adriangb in [#6602](https://github.com/pydantic/pydantic/pull/6602)
* Add `enum` error type docs by @lig in [#6603](https://github.com/pydantic/pydantic/pull/6603)
* 🐛 Fix `max_length` for unicode strings by @lig in [#6559](https://github.com/pydantic/pydantic/pull/6559)
* Add documentation for accessing features via `pydantic.v1` by @tpdorsey in [#6604](https://github.com/pydantic/pydantic/pull/6604)
* Include extra when iterating over a model by @adriangb in [#6562](https://github.com/pydantic/pydantic/pull/6562)
* Fix typing of model_validator by @adriangb in [#6514](https://github.com/pydantic/pydantic/pull/6514)
* Touch up Decimal validator by @adriangb in [#6327](https://github.com/pydantic/pydantic/pull/6327)
* Fix various docstrings using fixed pytest-examples by @dmontagu in [#6607](https://github.com/pydantic/pydantic/pull/6607)
* Handle function validators in a discriminated union by @dmontagu in [#6570](https://github.com/pydantic/pydantic/pull/6570)
* Review json_schema.md by @tpdorsey in [#6608](https://github.com/pydantic/pydantic/pull/6608)
* Make validate_call work on
basemodel methods by @dmontagu in [#6569](https://github.com/pydantic/pydantic/pull/6569)
* add test for big int json serde by @davidhewitt in [#6614](https://github.com/pydantic/pydantic/pull/6614)
* Fix pydantic dataclass problem with dataclasses.field default_factory by @hramezani in [#6616](https://github.com/pydantic/pydantic/pull/6616)
* Fixed mypy type inference for TypeAdapter by @zakstucke in [#6617](https://github.com/pydantic/pydantic/pull/6617)
* Make it work to use None as a generic parameter by @dmontagu in [#6609](https://github.com/pydantic/pydantic/pull/6609)
* Make it work to use `$ref` as an alias by @dmontagu in [#6568](https://github.com/pydantic/pydantic/pull/6568)
* add note to migration guide about changes to `AnyUrl` etc by @davidhewitt in [#6618](https://github.com/pydantic/pydantic/pull/6618)
* 🐛 Support defining `json_schema_extra` on `RootModel` using `Field` by @lig in [#6622](https://github.com/pydantic/pydantic/pull/6622)
* Update pre-commit to prevent commits to main branch on accident by @dmontagu in [#6636](https://github.com/pydantic/pydantic/pull/6636)
* Fix PDM CI for python 3.7 on MacOS/windows by @dmontagu in [#6627](https://github.com/pydantic/pydantic/pull/6627)
* Produce more accurate signatures for pydantic dataclasses by @dmontagu in [#6633](https://github.com/pydantic/pydantic/pull/6633)
* Updates to Url types for Pydantic V2 by @tpdorsey in [#6638](https://github.com/pydantic/pydantic/pull/6638)
* Fix list markdown in `transform` docstring by @StefanBRas in [#6649](https://github.com/pydantic/pydantic/pull/6649)
* simplify slots_dataclass construction to appease mypy by @davidhewitt in [#6639](https://github.com/pydantic/pydantic/pull/6639)
* Update TypedDict schema generation docstring by @adriangb in [#6651](https://github.com/pydantic/pydantic/pull/6651)
* Detect and lint-error for prints by @dmontagu in [#6655](https://github.com/pydantic/pydantic/pull/6655)
* Add xfailing test for pydantic-core PR 766 by @dmontagu in [#6641](https://github.com/pydantic/pydantic/pull/6641)
* Ignore unrecognized fields from dataclasses metadata by @dmontagu in [#6634](https://github.com/pydantic/pydantic/pull/6634)
* Make non-existent class getattr a mypy error by @dmontagu in [#6658](https://github.com/pydantic/pydantic/pull/6658)
* Update pydantic-core to 2.3.0 by @hramezani in [#6648](https://github.com/pydantic/pydantic/pull/6648)
* Use OrderedDict from typing_extensions by @dmontagu in [#6664](https://github.com/pydantic/pydantic/pull/6664)
* Fix typehint for JSON schema extra callable by @dmontagu in [#6659](https://github.com/pydantic/pydantic/pull/6659)

## v2.0.2 (2023-07-05)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.0.2)

* Fix bug where round-trip pickling/unpickling a `RootModel` would change the value of `__dict__`, [#6457](https://github.com/pydantic/pydantic/pull/6457) by @dmontagu
* Allow single-item discriminated unions, [#6405](https://github.com/pydantic/pydantic/pull/6405) by @dmontagu
* Fix issue with union parsing of enums, [#6440](https://github.com/pydantic/pydantic/pull/6440) by @dmontagu
* Docs: Fixed `constr` documentation, renamed old `regex` to new `pattern`, [#6452](https://github.com/pydantic/pydantic/pull/6452) by @miili
* Change `GenerateJsonSchema.generate_definitions` signature, [#6436](https://github.com/pydantic/pydantic/pull/6436) by @dmontagu

See the full changelog [here](https://github.com/pydantic/pydantic/releases/tag/v2.0.2)

## v2.0.1 (2023-07-04)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.0.1)

First patch release of Pydantic V2

* Extra fields added via `setattr` (i.e. `m.some_extra_field = 'extra_value'`) are added to `.model_extra` if `model_config` `extra='allow'`. Fixed [#6333](https://github.com/pydantic/pydantic/pull/6333), [#6365](https://github.com/pydantic/pydantic/pull/6365) by @aaraney
* Automatically unpack JSON schema '$ref' for custom types, [#6343](https://github.com/pydantic/pydantic/pull/6343) by @adriangb
* Fix tagged unions multiple processing in submodels, [#6340](https://github.com/pydantic/pydantic/pull/6340) by @suharnikov

See the full changelog [here](https://github.com/pydantic/pydantic/releases/tag/v2.0.1)

## v2.0 (2023-06-30)

[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v2.0)

Pydantic V2 is here! :tada:

See [this post](https://docs.pydantic.dev/2.0/blog/pydantic-v2-final/) for more details.

## v2.0b3 (2023-06-16)

Third beta pre-release of Pydantic V2

See the full changelog [here](https://github.com/pydantic/pydantic/releases/tag/v2.0b3)

## v2.0b2 (2023-06-03)

Add `from_attributes` runtime flag to `TypeAdapter.validate_python` and `BaseModel.model_validate`.

See the full changelog [here](https://github.com/pydantic/pydantic/releases/tag/v2.0b2)

## v2.0b1 (2023-06-01)

First beta pre-release of Pydantic V2

See the full changelog [here](https://github.com/pydantic/pydantic/releases/tag/v2.0b1)

## v2.0a4 (2023-05-05)

Fourth pre-release of Pydantic V2

See the full changelog [here](https://github.com/pydantic/pydantic/releases/tag/v2.0a4)

## v2.0a3 (2023-04-20)

Third pre-release of Pydantic V2

See the full changelog [here](https://github.com/pydantic/pydantic/releases/tag/v2.0a3)

## v2.0a2 (2023-04-12)

Second pre-release of Pydantic V2

See the full changelog [here](https://github.com/pydantic/pydantic/releases/tag/v2.0a2)

## v2.0a1 (2023-04-03)

First pre-release of Pydantic V2!

See [this post](https://docs.pydantic.dev/blog/pydantic-v2-alpha/) for more details.
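The `from_attributes` runtime flag added in v2.0b2 above reads field values from an arbitrary object's attributes instead of requiring a dict. A minimal sketch (the `UserRecord` and `User` classes here are hypothetical examples, not part of pydantic):

```python
from pydantic import BaseModel


class UserRecord:
    """Plain object standing in for an ORM row (hypothetical example class)."""

    def __init__(self) -> None:
        self.id = 1
        self.name = "Ada"


class User(BaseModel):
    id: int
    name: str


# With from_attributes=True, validation reads matching attributes off the object.
user = User.model_validate(UserRecord(), from_attributes=True)
print(user.model_dump())  # {'id': 1, 'name': 'Ada'}
```

The same flag is accepted by `TypeAdapter(User).validate_python(...)`, which is useful when the target type is not itself a `BaseModel` subclass.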
## v1.10.19 (2024-11-06)

* Add warning when v2 model is nested in v1 model by @sydney-runkle in https://github.com/pydantic/pydantic/pull/10432
* Fix deprecation warning in V1 `isinstance` check by @alicederyn in https://github.com/pydantic/pydantic/pull/10645

## v1.10.18 (2024-08-22)

* Eval type fix in V1 by @sydney-runkle in https://github.com/pydantic/pydantic/pull/9751
* Add `to_lower_camel` to `__all__` in `utils.py` by @sydney-runkle (direct commit)
* Fix `mypy` v1 plugin for mypy 1.11 release by @flaeppe in https://github.com/pydantic/pydantic/pull/10139
* Fix discriminator key used when discriminator has alias and `.schema(by_alias=False)` by @exs-dwoodward in https://github.com/pydantic/pydantic/pull/10146

## v1.10.17 (2024-06-20)

* Advertise Python 3.12 for 1.10.x! Part Deux by @vfazio in https://github.com/pydantic/pydantic/pull/9644
* Mirrored modules in `v1` namespace to fix typing and object resolution in python>3.11 by @exs-dwoodward in https://github.com/pydantic/pydantic/pull/9660
* setup: remove upper bound from python_requires by @vfazio in https://github.com/pydantic/pydantic/pull/9685

## v1.10.16 (2024-06-11)

* Specify recursive_guard as kwarg in ForwardRef._evaluate by @vfazio in https://github.com/pydantic/pydantic/pull/9612
* Fix mypy v1 plugin for upcoming mypy release by @cdce8p in https://github.com/pydantic/pydantic/pull/9586
* Import modules/objects directly from v1 namespace by @exs-dwoodward in https://github.com/pydantic/pydantic/pull/9162

## v1.10.15 (2024-04-03)

* Add pydantic.v1 namespace to Pydantic v1 by @exs-dmiketa in https://github.com/pydantic/pydantic/pull/9042
* Relax version of typing-extensions for V1 by @SonOfLilit in https://github.com/pydantic/pydantic/pull/8819
* patch fix for mypy by @sydney-runkle in https://github.com/pydantic/pydantic/pull/8765

## v1.10.14 (2024-01-19)

* Update install.md by @dmontagu in #7690
* Fix ci to only deploy docs on release by @sydney-runkle in #7740
* Ubuntu fixes for V1 by @sydney-runkle in #8540 and #8587
* Fix cached_property handling in dataclasses when copied by @rdbisme in #8407

## v1.10.13 (2023-09-27)

* Fix: Add max length check to `pydantic.validate_email`, #7673 by @hramezani
* Docs: Fix pip commands to install v1, #6930 by @chbndrhnns

## v1.10.12 (2023-07-24)

* Fixes the `maxlen` property being dropped on `deque` validation. Happened only if the deque item has been typed. Changes the `_validate_sequence_like` func, [#6581](https://github.com/pydantic/pydantic/pull/6581) by @maciekglowka

## v1.10.11 (2023-07-04)

* Importing create_model in tools.py through relative path instead of absolute path - so that it doesn't import V2 code when copied over to V2 branch, [#6361](https://github.com/pydantic/pydantic/pull/6361) by @SharathHuddar

## v1.10.10 (2023-06-30)

* Add Pydantic `Json` field support to settings management, [#6250](https://github.com/pydantic/pydantic/pull/6250) by @hramezani
* Fixed literal validator errors for unhashable values, [#6188](https://github.com/pydantic/pydantic/pull/6188) by @markus1978
* Fixed bug with generics receiving forward refs, [#6130](https://github.com/pydantic/pydantic/pull/6130) by @mark-todd
* Update install method of FastAPI for internal tests in CI, [#6117](https://github.com/pydantic/pydantic/pull/6117) by @Kludex

## v1.10.9 (2023-06-07)

* Fix trailing zeros not ignored in Decimal validation, [#5968](https://github.com/pydantic/pydantic/pull/5968) by @hramezani
* Fix mypy plugin for v1.4.0, [#5928](https://github.com/pydantic/pydantic/pull/5928) by @cdce8p
* Add future and past date hypothesis strategies, [#5850](https://github.com/pydantic/pydantic/pull/5850) by @bschoenmaeckers
* Discourage usage of Cython 3 with Pydantic 1.x, [#5845](https://github.com/pydantic/pydantic/pull/5845) by @lig

## v1.10.8 (2023-05-23)

* Fix a bug in `Literal` usage with `typing-extensions==4.6.0`, [#5826](https://github.com/pydantic/pydantic/pull/5826) by @hramezani
* This solves the (closed) issue [#3849](https://github.com/pydantic/pydantic/pull/3849) where aliased fields that use discriminated union fail to validate when the data contains the non-aliased field name, [#5736](https://github.com/pydantic/pydantic/pull/5736) by @benwah
* Update email-validator dependency to >=2.0.0post2, [#5627](https://github.com/pydantic/pydantic/pull/5627) by @adriangb
* update `AnyClassMethod` for changes in [python/typeshed#9771](https://github.com/python/typeshed/issues/9771), [#5505](https://github.com/pydantic/pydantic/pull/5505) by @ITProKyle

## v1.10.7 (2023-03-22)

* Fix creating schema from model using `ConstrainedStr` with `regex` as dict key, [#5223](https://github.com/pydantic/pydantic/pull/5223) by @matejetz
* Address bug in mypy plugin caused by explicit_package_bases=True, [#5191](https://github.com/pydantic/pydantic/pull/5191) by @dmontagu
* Add implicit defaults in the mypy plugin for Field with no default argument, [#5190](https://github.com/pydantic/pydantic/pull/5190) by @dmontagu
* Fix schema generated for Enum values used as Literals in discriminated unions, [#5188](https://github.com/pydantic/pydantic/pull/5188) by @javibookline
* Fix mypy failures caused by the pydantic mypy plugin when users define `from_orm` in their own classes, [#5187](https://github.com/pydantic/pydantic/pull/5187) by @dmontagu
* Fix `InitVar` usage with pydantic dataclasses, mypy version `1.1.1` and the custom mypy plugin, [#5162](https://github.com/pydantic/pydantic/pull/5162) by @cdce8p

## v1.10.6 (2023-03-08)

* Implement logic to support creating validators from non standard callables by using defaults to identify them and unwrapping `functools.partial` and `functools.partialmethod` when checking the signature, [#5126](https://github.com/pydantic/pydantic/pull/5126) by @JensHeinrich
* Fix mypy plugin for v1.1.1, and fix `dataclass_transform` decorator for pydantic dataclasses, [#5111](https://github.com/pydantic/pydantic/pull/5111) by @cdce8p
* Raise `ValidationError`, not
`ConfigError`, when a discriminator value is unhashable, [#4773](https://github.com/pydantic/pydantic/pull/4773) by @kurtmckee

## v1.10.5 (2023-02-15)

* Fix broken parametrized bases handling with `GenericModel`s with complex sets of models, [#5052](https://github.com/pydantic/pydantic/pull/5052) by @MarkusSintonen
* Invalidate mypy cache if plugin config changes, [#5007](https://github.com/pydantic/pydantic/pull/5007) by @cdce8p
* Fix `RecursionError` when deep-copying dataclass types wrapped by pydantic, [#4949](https://github.com/pydantic/pydantic/pull/4949) by @mbillingr
* Fix `X | Y` union syntax breaking `GenericModel`, [#4146](https://github.com/pydantic/pydantic/pull/4146) by @thenx
* Switch coverage badge to show coverage for this branch/release, [#5060](https://github.com/pydantic/pydantic/pull/5060) by @samuelcolvin

## v1.10.4 (2022-12-30)

* Change dependency to `typing-extensions>=4.2.0`, [#4885](https://github.com/pydantic/pydantic/pull/4885) by @samuelcolvin

## v1.10.3 (2022-12-29)

**NOTE: v1.10.3 was ["yanked"](https://pypi.org/help/#yanked) from PyPI due to [#4885](https://github.com/pydantic/pydantic/pull/4885) which is fixed in v1.10.4**

* fix parsing of custom root models, [#4883](https://github.com/pydantic/pydantic/pull/4883) by @gou177
* fix: use dataclass proxy for frozen or empty dataclasses, [#4878](https://github.com/pydantic/pydantic/pull/4878) by @PrettyWood
* Fix `schema` and `schema_json` on models where a model instance is one of the default values, [#4781](https://github.com/pydantic/pydantic/pull/4781) by @Bobronium
* Add Jina AI to sponsors on docs index page, [#4767](https://github.com/pydantic/pydantic/pull/4767) by @samuelcolvin
* fix: support assignment on `DataclassProxy`, [#4695](https://github.com/pydantic/pydantic/pull/4695) by @PrettyWood
* Add `postgresql+psycopg` as allowed scheme for `PostgreDsn` to make it usable with SQLAlchemy 2, [#4689](https://github.com/pydantic/pydantic/pull/4689) by @morian
* Allow dict schemas to have both `patternProperties` and `additionalProperties`, [#4641](https://github.com/pydantic/pydantic/pull/4641) by @jparise
* Fixes error passing None for optional lists with `unique_items`, [#4568](https://github.com/pydantic/pydantic/pull/4568) by @mfulgo
* Fix `GenericModel` with `Callable` param raising a `TypeError`, [#4551](https://github.com/pydantic/pydantic/pull/4551) by @mfulgo
* Fix field regex with `StrictStr` type annotation, [#4538](https://github.com/pydantic/pydantic/pull/4538) by @sisp
* Correct `dataclass_transform` keyword argument name from `field_descriptors` to `field_specifiers`, [#4500](https://github.com/pydantic/pydantic/pull/4500) by @samuelcolvin
* fix: avoid multiple calls of `__post_init__` when dataclasses are inherited, [#4487](https://github.com/pydantic/pydantic/pull/4487) by @PrettyWood
* Reduce the size of binary wheels, [#2276](https://github.com/pydantic/pydantic/pull/2276) by @samuelcolvin

## v1.10.2 (2022-09-05)

* **Revert Change:** Revert percent encoding of URL parts which was originally added in [#4224](https://github.com/pydantic/pydantic/pull/4224), [#4470](https://github.com/pydantic/pydantic/pull/4470) by @samuelcolvin
* Prevent long (length > `4_300`) strings/bytes as input to int fields, see [python/cpython#95778](https://github.com/python/cpython/issues/95778) and [CVE-2020-10735](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10735), [#1477](https://github.com/pydantic/pydantic/pull/1477) by @samuelcolvin
* fix: dataclass wrapper was not always called, [#4477](https://github.com/pydantic/pydantic/pull/4477) by @PrettyWood
* Use `tomllib` on Python 3.11 when parsing `mypy` configuration, [#4476](https://github.com/pydantic/pydantic/pull/4476) by @hauntsaninja
* Basic fix of `GenericModel` cache to detect order of arguments in `Union` models, [#4474](https://github.com/pydantic/pydantic/pull/4474) by @sveinugu
* Fix mypy plugin when using bare types like `list` and `dict` as `default_factory`, [#4457](https://github.com/pydantic/pydantic/pull/4457) by @samuelcolvin

## v1.10.1 (2022-08-31)

* Add `__hash__` method to `pydantic.color.Color` class, [#4454](https://github.com/pydantic/pydantic/pull/4454) by @czaki

## v1.10.0 (2022-08-30)

* Refactor the whole _pydantic_ `dataclass` decorator to really act like its standard lib equivalent. It hence keeps `__eq__`, `__hash__`, ... and makes comparison with its non-validated version possible. It also fixes usage of `frozen` dataclasses in fields and usage of `default_factory` in nested dataclasses. The support of `Config.extra` has been added. Finally, config customization directly via a `dict` is now possible, [#2557](https://github.com/pydantic/pydantic/pull/2557) by @PrettyWood

**BREAKING CHANGES:**

- The `compiled` boolean (whether _pydantic_ is compiled with cython) has been moved from `main.py` to `version.py`
- Now that `Config.extra` is supported, `dataclass` ignores extra arguments by default (like `BaseModel`)

* Fix PEP487 `__set_name__` protocol in `BaseModel` for PrivateAttrs, [#4407](https://github.com/pydantic/pydantic/pull/4407) by @tlambert03
* Allow for custom parsing of environment variables via `parse_env_var` in `Config`, [#4406](https://github.com/pydantic/pydantic/pull/4406) by @acmiyaguchi
* Rename `master` to `main`, [#4405](https://github.com/pydantic/pydantic/pull/4405) by @hramezani
* Fix `StrictStr` does not raise `ValidationError` when `max_length` is present in `Field`, [#4388](https://github.com/pydantic/pydantic/pull/4388) by @hramezani
* Make `SecretStr` and `SecretBytes` hashable, [#4387](https://github.com/pydantic/pydantic/pull/4387) by @chbndrhnns
* Fix `StrictBytes` does not raise `ValidationError` when `max_length` is present in `Field`, [#4380](https://github.com/pydantic/pydantic/pull/4380) by @JeanArhancet
* Add support for bare `type`, [#4375](https://github.com/pydantic/pydantic/pull/4375) by @hramezani
* Support Python 3.11, including binaries for 3.11 in PyPI, [#4374](https://github.com/pydantic/pydantic/pull/4374) by @samuelcolvin
* Add support for `re.Pattern`, [#4366](https://github.com/pydantic/pydantic/pull/4366) by @hramezani
* Fix `__post_init_post_parse__` is incorrectly passed keyword arguments when no `__post_init__` is defined, [#4361](https://github.com/pydantic/pydantic/pull/4361) by @hramezani
* Fix implicitly importing `ForwardRef` and `Callable` from `pydantic.typing` instead of `typing` and also expose `MappingIntStrAny`, [#4358](https://github.com/pydantic/pydantic/pull/4358) by @aminalaee
* remove `Any` types from the `dataclass` decorator so it can be used with the `disallow_any_expr` mypy option, [#4356](https://github.com/pydantic/pydantic/pull/4356) by @DetachHead
* moved repo to `pydantic/pydantic`, [#4348](https://github.com/pydantic/pydantic/pull/4348) by @yezz123
* fix "extra fields not permitted" error when dataclass with `Extra.forbid` is validated multiple times, [#4343](https://github.com/pydantic/pydantic/pull/4343) by @detachhead
* Add Python 3.9 and 3.10 examples to docs, [#4339](https://github.com/pydantic/pydantic/pull/4339) by @Bobronium
* Discriminated union models now use `oneOf` instead of `anyOf` when generating OpenAPI schema definitions, [#4335](https://github.com/pydantic/pydantic/pull/4335) by @MaxwellPayne
* Allow type checkers to infer inner type of `Json` type. `Json[list[str]]` will now be inferred as `list[str]`, `Json[Any]` should be used instead of plain `Json`. Runtime behaviour is not changed, [#4332](https://github.com/pydantic/pydantic/pull/4332) by @Bobronium
* Allow empty string aliases by using an `alias is not None` check, rather than `bool(alias)`, [#4253](https://github.com/pydantic/pydantic/pull/4253) by @sergeytsaplin
* Update `ForwardRef`s in `Field.outer_type_`, [#4249](https://github.com/pydantic/pydantic/pull/4249) by @JacobHayes
* The use of `__dataclass_transform__` has been replaced by `typing_extensions.dataclass_transform`, which is the preferred way to mark pydantic models as a dataclass under [PEP 681](https://peps.python.org/pep-0681/), [#4241](https://github.com/pydantic/pydantic/pull/4241) by @multimeric
* Use parent model's `Config` when validating nested `NamedTuple` fields, [#4219](https://github.com/pydantic/pydantic/pull/4219) by @synek
* Update `BaseModel.construct` to work with aliased Fields, [#4192](https://github.com/pydantic/pydantic/pull/4192) by @kylebamos
* Catch certain raised errors in `smart_deepcopy` and revert to `deepcopy` if so, [#4184](https://github.com/pydantic/pydantic/pull/4184) by @coneybeare
* Add `Config.anystr_upper` and `to_upper` kwarg to constr and conbytes, [#4165](https://github.com/pydantic/pydantic/pull/4165) by @satheler
* Fix JSON schema for `set` and `frozenset` when they include default values, [#4155](https://github.com/pydantic/pydantic/pull/4155) by @aminalaee
* Teach the mypy plugin that methods decorated by `@validator` are classmethods, [#4102](https://github.com/pydantic/pydantic/pull/4102) by @DMRobertson
* Improve mypy plugin's ability to detect required fields, [#4086](https://github.com/pydantic/pydantic/pull/4086) by @richardxia
* Support fields of type `Type[]` in schema, [#4051](https://github.com/pydantic/pydantic/pull/4051) by @aminalaee
* Add `default` value in JSON Schema when `const=True`, [#4031](https://github.com/pydantic/pydantic/pull/4031) by @aminalaee
* Adds reserved word check to signature generation logic, [#4011](https://github.com/pydantic/pydantic/pull/4011) by @strue36
* Fix Json strategy failure for the complex nested field, [#4005](https://github.com/pydantic/pydantic/pull/4005) by @sergiosim
* Add JSON-compatible float constraint `allow_inf_nan`, [#3994](https://github.com/pydantic/pydantic/pull/3994) by @tiangolo
* Remove undefined behaviour when `env_prefix` had characters in common with `env_nested_delimiter`, [#3975](https://github.com/pydantic/pydantic/pull/3975) by @arsenron
* Support generic models with `create_model`, [#3945](https://github.com/pydantic/pydantic/pull/3945) by @hot123s
* allow submodels to overwrite extra field info, [#3934](https://github.com/pydantic/pydantic/pull/3934) by @PrettyWood
* Document and test structural pattern matching ([PEP 636](https://peps.python.org/pep-0636/)) on `BaseModel`, [#3920](https://github.com/pydantic/pydantic/pull/3920) by @irgolic
* Fix incorrect deserialization of python timedelta object to ISO 8601 for negative time deltas. Minus was serialized in incorrect place ("P-1DT23H59M59.888735S" instead of correct "-P1DT23H59M59.888735S"), [#3899](https://github.com/pydantic/pydantic/pull/3899) by @07pepa
* Fix validation of discriminated union fields with an alias when passing a model instance, [#3846](https://github.com/pydantic/pydantic/pull/3846) by @chornsby
* Add a CockroachDsn type to validate CockroachDB connection strings. The type supports the following schemes: `cockroachdb`, `cockroachdb+psycopg2` and `cockroachdb+asyncpg`, [#3839](https://github.com/pydantic/pydantic/pull/3839) by @blubber
* Fix MyPy plugin to not override pre-existing `__init__` method in models, [#3824](https://github.com/pydantic/pydantic/pull/3824) by @patrick91
* Fix mypy version checking, [#3783](https://github.com/pydantic/pydantic/pull/3783) by @KotlinIsland
* support overwriting dunder attributes of `BaseModel` instances, [#3777](https://github.com/pydantic/pydantic/pull/3777) by @PrettyWood
* Added `ConstrainedDate` and `condate`, [#3740](https://github.com/pydantic/pydantic/pull/3740) by @hottwaj
* Support `kw_only` in dataclasses, [#3670](https://github.com/pydantic/pydantic/pull/3670) by @detachhead
* Add comparison method for `Color` class, [#3646](https://github.com/pydantic/pydantic/pull/3646) by @aminalaee
* Drop support for python3.6, associated cleanup, [#3605](https://github.com/pydantic/pydantic/pull/3605) by @samuelcolvin
* created new function `to_lower_camel()` for "non pascal case" camel case, [#3463](https://github.com/pydantic/pydantic/pull/3463) by @schlerp
* Add checks to `default` and `default_factory` arguments in Mypy plugin, [#3430](https://github.com/pydantic/pydantic/pull/3430) by @klaa97
* fix mangling of `inspect.signature` for `BaseModel`, [#3413](https://github.com/pydantic/pydantic/pull/3413) by @fix-inspect-signature
* Adds the `SecretField` abstract class so that all the current and future secret fields like `SecretStr` and `SecretBytes` will derive from it, [#3409](https://github.com/pydantic/pydantic/pull/3409) by @expobrain
* Support multi hosts validation in `PostgresDsn`, [#3337](https://github.com/pydantic/pydantic/pull/3337) by @rglsk
* Fix parsing of very small numeric timedelta values, [#3315](https://github.com/pydantic/pydantic/pull/3315) by @samuelcolvin
* Update `SecretsSettingsSource` to respect `config.case_sensitive`, [#3273](https://github.com/pydantic/pydantic/pull/3273) by @JeanArhancet
* Add MongoDB network data source name (DSN) schema, [#3229](https://github.com/pydantic/pydantic/pull/3229) by @snosratiershad
* Add support for multiple dotenv files, [#3222](https://github.com/pydantic/pydantic/pull/3222) by @rekyungmin
* Raise an explicit `ConfigError` when multiple fields are incorrectly set for a single validator, [#3215](https://github.com/pydantic/pydantic/pull/3215) by @SunsetOrange
* Allow ellipsis on `Field`s inside `Annotated` for `TypedDicts` required, [#3133](https://github.com/pydantic/pydantic/pull/3133) by @ezegomez
* Catch overflow errors in `int_validator`, [#3112](https://github.com/pydantic/pydantic/pull/3112) by @ojii
* Adds a `__rich_repr__` method to `Representation` class which enables pretty printing with [Rich](https://github.com/willmcgugan/rich), [#3099](https://github.com/pydantic/pydantic/pull/3099) by @willmcgugan
* Add percent encoding in `AnyUrl` and descendant types, [#3061](https://github.com/pydantic/pydantic/pull/3061) by @FaresAhmedb
* `validate_arguments` decorator now supports `alias`, [#3019](https://github.com/pydantic/pydantic/pull/3019) by @MAD-py
* Avoid `__dict__` and `__weakref__` attributes in `AnyUrl` and IP address fields, [#2890](https://github.com/pydantic/pydantic/pull/2890) by @nuno-andre
* Add ability to use `Final` in a field type annotation, [#2766](https://github.com/pydantic/pydantic/pull/2766) by @uriyyo
* Update requirement to `typing_extensions>=4.1.0` to guarantee `dataclass_transform` is available, [#4424](https://github.com/pydantic/pydantic/pull/4424) by @commonism
* Add Explosion and AWS to main sponsors, [#4413](https://github.com/pydantic/pydantic/pull/4413) by @samuelcolvin
* Update documentation for `copy_on_model_validation` to reflect recent changes, [#4369](https://github.com/pydantic/pydantic/pull/4369) by @samuelcolvin
* Runtime warning if `__slots__` is passed to `create_model`, `__slots__` is then ignored, [#4432](https://github.com/pydantic/pydantic/pull/4432) by @samuelcolvin
* Add type hints to `BaseSettings.Config` to avoid mypy errors, also correct mypy version compatibility notice in docs, [#4450](https://github.com/pydantic/pydantic/pull/4450) by @samuelcolvin

## v1.10.0b1 (2022-08-24)

Pre-release, see [the GitHub release](https://github.com/pydantic/pydantic/releases/tag/v1.10.0b1) for details.

## v1.10.0a2 (2022-08-24)

Pre-release, see [the GitHub release](https://github.com/pydantic/pydantic/releases/tag/v1.10.0a2) for details.

## v1.10.0a1 (2022-08-22)

Pre-release, see [the GitHub release](https://github.com/pydantic/pydantic/releases/tag/v1.10.0a1) for details.

## v1.9.2 (2022-08-11)

**Revert Breaking Change**: _v1.9.1_ introduced a breaking change where model fields were deep copied by default; this release reverts the default behaviour to match _v1.9.0_ and before, while also allowing deep-copy behaviour via `copy_on_model_validation = 'deep'`. See [#4092](https://github.com/pydantic/pydantic/pull/4092) for more information.
* Allow for shallow copies of model fields, `Config.copy_on_model_validation` is now a str which must be `'none'`, `'deep'`, or `'shallow'` corresponding to not copying, deep copy & shallow copy; default `'shallow'`, [#4093](https://github.com/pydantic/pydantic/pull/4093) by @timkpaine

## v1.9.1 (2022-05-19)

Thank you to pydantic's sponsors: @tiangolo, @stellargraph, @JonasKs, @grillazz, @Mazyod, @kevinalh, @chdsbd, @povilasb, @povilasb, @jina-ai, @mainframeindustries, @robusta-dev, @SendCloud, @rszamszur, @jodal, @hardbyte, @corleyma, @daddycocoaman, @Rehket, @jokull, @reillysiemens, @westonsteimel, @primer-io, @koxudaxi, @browniebroke, @stradivari96, @adriangb, @kamalgill, @jqueguiner, @dev-zero, @datarootsio, @RedCarpetUp for their kind support.

* Limit the size of `generics._generic_types_cache` and `generics._assigned_parameters` to avoid unlimited increase in memory usage, [#4083](https://github.com/pydantic/pydantic/pull/4083) by @samuelcolvin
* Add Jupyverse and FPS as Jupyter projects using pydantic, [#4082](https://github.com/pydantic/pydantic/pull/4082) by @davidbrochart
* Speedup `__isinstancecheck__` on pydantic models when the type is not a model, may also avoid memory "leaks", [#4081](https://github.com/pydantic/pydantic/pull/4081) by @samuelcolvin
* Fix in-place modification of `FieldInfo` that caused problems with PEP 593 type aliases, [#4067](https://github.com/pydantic/pydantic/pull/4067) by @adriangb
* Add support for autocomplete in VS Code via `__dataclass_transform__` when using `pydantic.dataclasses.dataclass`, [#4006](https://github.com/pydantic/pydantic/pull/4006) by @giuliano-oliveira
* Remove benchmarks from codebase and docs, [#3973](https://github.com/pydantic/pydantic/pull/3973) by @samuelcolvin
* Typing checking with pyright in CI, improve docs on vscode/pylance/pyright, [#3972](https://github.com/pydantic/pydantic/pull/3972) by @samuelcolvin
* Fix nested Python dataclass schema regression, [#3819](https://github.com/pydantic/pydantic/pull/3819) by @himbeles
* Update documentation about lazy evaluation of sources for Settings, [#3806](https://github.com/pydantic/pydantic/pull/3806) by @garyd203
* Prevent subclasses of bytes being converted to bytes, [#3706](https://github.com/pydantic/pydantic/pull/3706) by @samuelcolvin
* Fixed "error checking inheritance of" when using PEP585 and PEP604 type hints, [#3681](https://github.com/pydantic/pydantic/pull/3681) by @aleksul
* Allow self referencing `ClassVar`s in models, [#3679](https://github.com/pydantic/pydantic/pull/3679) by @samuelcolvin
* **Breaking Change, see [#4106](https://github.com/pydantic/pydantic/pull/4106)**: Fix issue with self-referencing dataclass, [#3675](https://github.com/pydantic/pydantic/pull/3675) by @uriyyo
* Include non-standard port numbers in rendered URLs, [#3652](https://github.com/pydantic/pydantic/pull/3652) by @dolfinus
* `Config.copy_on_model_validation` does a deep copy and not a shallow one, [#3641](https://github.com/pydantic/pydantic/pull/3641) by @PrettyWood
* fix: clarify that discriminated unions do not support singletons, [#3636](https://github.com/pydantic/pydantic/pull/3636) by @tommilligan
* Add `read_text(encoding='utf-8')` for `setup.py`, [#3625](https://github.com/pydantic/pydantic/pull/3625) by @hswong3i
* Fix JSON Schema generation for Discriminated Unions within lists, [#3608](https://github.com/pydantic/pydantic/pull/3608) by @samuelcolvin

## v1.9.0 (2021-12-31)

Thank you to pydantic's sponsors: @sthagen, @timdrijvers, @toinbis, @koxudaxi, @ginomempin, @primer-io, @and-semakin, @westonsteimel, @reillysiemens, @es3n1n, @jokull, @JonasKs, @Rehket, @corleyma, @daddycocoaman, @hardbyte, @datarootsio, @jodal, @aminalaee, @rafsaf, @jqueguiner, @chdsbd, @kevinalh, @Mazyod, @grillazz, @JonasKs, @simw, @leynier, @xfenix for their kind support.
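Discriminated unions, one of the v1.9 highlights listed below, can be sketched as follows. This is a minimal illustration, not code from the changelog; the `Cat`/`Dog`/`Owner` names are invented for the example, and it assumes pydantic ≥ 1.9 is installed:

```python
from typing import Literal, Union

from pydantic import BaseModel, Field


class Cat(BaseModel):
    pet_type: Literal['cat']
    meows: int


class Dog(BaseModel):
    pet_type: Literal['dog']
    barks: float


class Owner(BaseModel):
    # the `pet_type` field selects which union member is validated
    pet: Union[Cat, Dog] = Field(..., discriminator='pet_type')


owner = Owner(pet={'pet_type': 'dog', 'barks': 3.0})
assert isinstance(owner.pet, Dog)
```

With a discriminator, a failed validation reports errors only for the member selected by `pet_type`, instead of one error per union member.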
### Highlights

* add Python 3.10 support, [#2885](https://github.com/pydantic/pydantic/pull/2885) by @PrettyWood
* [Discriminated unions](https://docs.pydantic.dev/usage/types/#discriminated-unions-aka-tagged-unions), [#619](https://github.com/pydantic/pydantic/pull/619) by @PrettyWood
* [`Config.smart_union` for better union logic](https://docs.pydantic.dev/usage/model_config/#smart-union), [#2092](https://github.com/pydantic/pydantic/pull/2092) by @PrettyWood
* Binaries for Macos M1 CPUs, [#3498](https://github.com/pydantic/pydantic/pull/3498) by @samuelcolvin
* Complex types can be set via [nested environment variables](https://docs.pydantic.dev/usage/settings/#parsing-environment-variable-values), e.g. `foo___bar`, [#3159](https://github.com/pydantic/pydantic/pull/3159) by @Air-Mark
* add a dark mode to _pydantic_ documentation, [#2913](https://github.com/pydantic/pydantic/pull/2913) by @gbdlin
* Add support for autocomplete in VS Code via `__dataclass_transform__`, [#2721](https://github.com/pydantic/pydantic/pull/2721) by @tiangolo
* Add "exclude" as a field parameter so that it can be configured using model config, [#660](https://github.com/pydantic/pydantic/pull/660) by @daviskirk

### v1.9.0 (2021-12-31) Changes

* Apply `update_forward_refs` to `Config.json_encodes` to prevent name clashes in types defined via strings, [#3583](https://github.com/pydantic/pydantic/pull/3583) by @samuelcolvin
* Extend pydantic's mypy plugin to support mypy versions `0.910`, `0.920`, `0.921` & `0.930`, [#3573](https://github.com/pydantic/pydantic/pull/3573) & [#3594](https://github.com/pydantic/pydantic/pull/3594) by @PrettyWood, @christianbundy, @samuelcolvin

### v1.9.0a2 (2021-12-24) Changes

* support generic models with discriminated union, [#3551](https://github.com/pydantic/pydantic/pull/3551) by @PrettyWood
* keep old behaviour of `json()` by default, [#3542](https://github.com/pydantic/pydantic/pull/3542) by @PrettyWood
* Removed typing-only `__root__` attribute from `BaseModel`, [#3540](https://github.com/pydantic/pydantic/pull/3540) by @layday
* Build Python 3.10 wheels, [#3539](https://github.com/pydantic/pydantic/pull/3539) by @mbachry
* Fix display of `extra` fields with model `__repr__`, [#3234](https://github.com/pydantic/pydantic/pull/3234) by @cocolman
* models copied via `Config.copy_on_model_validation` always have all fields, [#3201](https://github.com/pydantic/pydantic/pull/3201) by @PrettyWood
* nested ORM from nested dictionaries, [#3182](https://github.com/pydantic/pydantic/pull/3182) by @PrettyWood
* fix link to discriminated union section by @PrettyWood

### v1.9.0a1 (2021-12-18) Changes

* Add support for `Decimal`-specific validation configurations in `Field()`, additionally to using `condecimal()`, to allow better support from editors and tooling, [#3507](https://github.com/pydantic/pydantic/pull/3507) by @tiangolo
* Add `arm64` binaries suitable for MacOS with an M1 CPU to PyPI, [#3498](https://github.com/pydantic/pydantic/pull/3498) by @samuelcolvin
* Fix issue where `None` was considered invalid when using a `Union` type containing `Any` or `object`, [#3444](https://github.com/pydantic/pydantic/pull/3444) by @tharradine
* When generating field schema, pass optional `field` argument (of type `pydantic.fields.ModelField`) to `__modify_schema__()` if present, [#3434](https://github.com/pydantic/pydantic/pull/3434) by @jasujm
* Fix issue where pydantic fails to parse `typing.ClassVar` string type annotation, [#3401](https://github.com/pydantic/pydantic/pull/3401) by @uriyyo
* Mention Python >= 3.9.2 as an alternative to `typing_extensions.TypedDict`, [#3374](https://github.com/pydantic/pydantic/pull/3374) by @BvB93
* Changed the validator method name in the [Custom Errors example](https://docs.pydantic.dev/usage/models/#custom-errors) to more accurately describe what the validator is doing; changed from `name_must_contain_space` to `value_must_equal_bar`, [#3327](https://github.com/pydantic/pydantic/pull/3327) by @michaelrios28
* Add `AmqpDsn` class, [#3254](https://github.com/pydantic/pydantic/pull/3254) by @kludex
* Always use `Enum` value as default in generated JSON schema, [#3190](https://github.com/pydantic/pydantic/pull/3190) by @joaommartins
* Add support for Mypy 0.920, [#3175](https://github.com/pydantic/pydantic/pull/3175) by @christianbundy
* `validate_arguments` now supports `extra` customization (used to always be `Extra.forbid`), [#3161](https://github.com/pydantic/pydantic/pull/3161) by @PrettyWood
* Complex types can be set by nested environment variables, [#3159](https://github.com/pydantic/pydantic/pull/3159) by @Air-Mark
* Fix mypy plugin to collect fields based on `pydantic.utils.is_valid_field` so that it ignores untyped private variables, [#3146](https://github.com/pydantic/pydantic/pull/3146) by @hi-ogawa
* fix `validate_arguments` issue with `Config.validate_all`, [#3135](https://github.com/pydantic/pydantic/pull/3135) by @PrettyWood
* avoid dict coercion when using dict subclasses as field type, [#3122](https://github.com/pydantic/pydantic/pull/3122) by @PrettyWood
* add support for `object` type, [#3062](https://github.com/pydantic/pydantic/pull/3062) by @PrettyWood
* Updates pydantic dataclasses to keep `_special` properties on parent classes, [#3043](https://github.com/pydantic/pydantic/pull/3043) by @zulrang
* Add a `TypedDict` class for error objects, [#3038](https://github.com/pydantic/pydantic/pull/3038) by @matthewhughes934
* Fix support for using a subclass of an annotation as a default, [#3018](https://github.com/pydantic/pydantic/pull/3018) by @JacobHayes
* make `create_model_from_typeddict` mypy compliant, [#3008](https://github.com/pydantic/pydantic/pull/3008) by @PrettyWood
* Make multiple inheritance work when using `PrivateAttr`, [#2989](https://github.com/pydantic/pydantic/pull/2989) by @hmvp
* Parse environment variables as JSON, if they have a `Union` type with a complex subfield, [#2936](https://github.com/pydantic/pydantic/pull/2936) by @cbartz
* Prevent `StrictStr` permitting `Enum` values where the enum inherits from `str`, [#2929](https://github.com/pydantic/pydantic/pull/2929) by @samuelcolvin
* Make `SecretsSettingsSource` parse values being assigned to fields of complex types when sourced from a secrets file, just as when sourced from environment variables, [#2917](https://github.com/pydantic/pydantic/pull/2917) by @davidmreed
* add a dark mode to _pydantic_ documentation, [#2913](https://github.com/pydantic/pydantic/pull/2913) by @gbdlin
* Make `pydantic-mypy` plugin compatible with `pyproject.toml` configuration, consistent with `mypy` changes. See the [doc](https://docs.pydantic.dev/mypy_plugin/#configuring-the-plugin) for more information, [#2908](https://github.com/pydantic/pydantic/pull/2908) by @jrwalk
* add Python 3.10 support, [#2885](https://github.com/pydantic/pydantic/pull/2885) by @PrettyWood
* Correctly parse generic models with `Json[T]`, [#2860](https://github.com/pydantic/pydantic/pull/2860) by @geekingfrog
* Update contrib docs re: Python version to use for building docs, [#2856](https://github.com/pydantic/pydantic/pull/2856) by @paxcodes
* Clarify documentation about _pydantic_'s support for custom validation and strict type checking, despite _pydantic_ being primarily a parsing library, [#2855](https://github.com/pydantic/pydantic/pull/2855) by @paxcodes
* Fix schema generation for `Deque` fields, [#2810](https://github.com/pydantic/pydantic/pull/2810) by @sergejkozin
* fix an edge case when mixing constraints and `Literal`, [#2794](https://github.com/pydantic/pydantic/pull/2794) by @PrettyWood
* Fix postponed annotation resolution for `NamedTuple` and `TypedDict` when they're used directly as the type of fields within Pydantic models, [#2760](https://github.com/pydantic/pydantic/pull/2760) by @jameysharp
* Fix bug when `mypy` plugin fails on `construct` method call for `BaseSettings` derived classes, [#2753](https://github.com/pydantic/pydantic/pull/2753) by @uriyyo
* Add function overloading for a `pydantic.create_model` function, [#2748](https://github.com/pydantic/pydantic/pull/2748) by @uriyyo
* Fix mypy plugin issue with self field declaration, [#2743](https://github.com/pydantic/pydantic/pull/2743) by @uriyyo
* The colon at the end of the line "The fields which were supplied when user was initialised:" suggests that the code following it is related. Changed it to a period, [#2733](https://github.com/pydantic/pydantic/pull/2733) by @krisaoe
* Renamed variable `schema` to `schema_` to avoid shadowing of global variable name, [#2724](https://github.com/pydantic/pydantic/pull/2724) by @shahriyarr
* Add support for autocomplete in VS Code via `__dataclass_transform__`, [#2721](https://github.com/pydantic/pydantic/pull/2721) by @tiangolo
* add missing type annotations in `BaseConfig` and handle `max_length = 0`, [#2719](https://github.com/pydantic/pydantic/pull/2719) by @PrettyWood
* Change `orm_mode` checking to allow recursive ORM mode parsing with dicts, [#2718](https://github.com/pydantic/pydantic/pull/2718) by @nuno-andre
* Add episode 313 of the *Talk Python To Me* podcast, where Michael Kennedy and Samuel Colvin discuss Pydantic, to the docs, [#2712](https://github.com/pydantic/pydantic/pull/2712) by @RatulMaharaj
* fix JSON schema generation when a field is of type `NamedTuple` and has a default value, [#2707](https://github.com/pydantic/pydantic/pull/2707) by @PrettyWood
* `Enum` fields now properly support extra kwargs in schema generation, [#2697](https://github.com/pydantic/pydantic/pull/2697) by @sammchardy
* **Breaking Change, see [#3780](https://github.com/pydantic/pydantic/pull/3780)**: Make serialization of referenced pydantic models possible, [#2650](https://github.com/pydantic/pydantic/pull/2650) by @PrettyWood
* Add `uniqueItems` option to `ConstrainedList`, [#2618](https://github.com/pydantic/pydantic/pull/2618) by @nuno-andre
* Try to evaluate forward refs automatically at model creation, [#2588](https://github.com/pydantic/pydantic/pull/2588) by @uriyyo
* Switch docs preview and coverage display to use [smokeshow](https://smokeshow.helpmanual.io/), [#2580](https://github.com/pydantic/pydantic/pull/2580) by @samuelcolvin
* Add `__version__` attribute to pydantic module, [#2572](https://github.com/pydantic/pydantic/pull/2572) by @paxcodes
* Add `postgresql+asyncpg`, `postgresql+pg8000`, `postgresql+psycopg2`, `postgresql+psycopg2cffi`, `postgresql+py-postgresql` and `postgresql+pygresql` schemes for `PostgresDsn`, [#2567](https://github.com/pydantic/pydantic/pull/2567) by @postgres-asyncpg
* Enable the Hypothesis plugin to generate a constrained decimal when the `decimal_places` argument is specified, [#2524](https://github.com/pydantic/pydantic/pull/2524) by @cwe5590
* Allow `collections.abc.Callable` to be used as type in Python 3.9, [#2519](https://github.com/pydantic/pydantic/pull/2519) by @daviskirk
* Documentation update on how to custom compile pydantic when using pip install, small change in `setup.py` to allow for custom CFLAGS when compiling, [#2517](https://github.com/pydantic/pydantic/pull/2517) by @peterroelants
* remove side effect of `default_factory` to run it only once even if `Config.validate_all` is set, [#2515](https://github.com/pydantic/pydantic/pull/2515) by @PrettyWood
* Add lookahead to ip regexes for `AnyUrl` hosts. This allows urls with DNS labels looking like IPs to validate as they are perfectly valid host names, [#2512](https://github.com/pydantic/pydantic/pull/2512) by @sbv-csis
* Set `minItems` and `maxItems` in generated JSON schema for fixed-length tuples, [#2497](https://github.com/pydantic/pydantic/pull/2497) by @PrettyWood
* Add `strict` argument to `conbytes`, [#2489](https://github.com/pydantic/pydantic/pull/2489) by @koxudaxi
* Support user defined generic field types in generic models, [#2465](https://github.com/pydantic/pydantic/pull/2465) by @daviskirk
* Add an example and a short explanation of subclassing `GetterDict` to docs, [#2463](https://github.com/pydantic/pydantic/pull/2463) by @nuno-andre
* add `KafkaDsn` type, `HttpUrl` now has default port 80 for http and 443 for https, [#2447](https://github.com/pydantic/pydantic/pull/2447) by @MihanixA
* Add `PastDate` and `FutureDate` types, [#2425](https://github.com/pydantic/pydantic/pull/2425) by @Kludex
* Support generating schema for `Generic` fields with subtypes, [#2375](https://github.com/pydantic/pydantic/pull/2375) by @maximberg
* fix(encoder): serialize `NameEmail` to str, [#2341](https://github.com/pydantic/pydantic/pull/2341) by @alecgerona
* add `Config.smart_union` to prevent coercion in `Union` if possible, see [the doc](https://docs.pydantic.dev/usage/model_config/#smart-union) for more information, [#2092](https://github.com/pydantic/pydantic/pull/2092) by @PrettyWood
* Add ability to use `typing.Counter` as a model field type, [#2060](https://github.com/pydantic/pydantic/pull/2060) by @uriyyo
* Add parameterised subclasses to `__bases__` when constructing new parameterised classes, so that `A <: B => A[int] <: B[int]`, [#2007](https://github.com/pydantic/pydantic/pull/2007) by @diabolo-dan
* Create `FileUrl` type that allows URLs that conform to [RFC 8089](https://tools.ietf.org/html/rfc8089#section-2). Add `host_required` parameter, which is `True` by default (`AnyUrl` and subclasses), `False` in `RedisDsn`, `FileUrl`, [#1983](https://github.com/pydantic/pydantic/pull/1983) by @vgerak
* add `confrozenset()`, analogous to `conset()` and `conlist()`, [#1897](https://github.com/pydantic/pydantic/pull/1897) by @PrettyWood
* stop calling parent class `root_validator` if overridden, [#1895](https://github.com/pydantic/pydantic/pull/1895) by @PrettyWood
* Add `repr` (defaults to `True`) parameter to `Field`, to hide it from the default representation of the `BaseModel`, [#1831](https://github.com/pydantic/pydantic/pull/1831) by @fnep
* Accept empty query/fragment URL parts, [#1807](https://github.com/pydantic/pydantic/pull/1807) by @xavier

## v1.8.2 (2021-05-11)

!!! warning
    A security vulnerability, level "moderate" is fixed in v1.8.2. Please upgrade **ASAP**.
    See security advisory [CVE-2021-29510](https://github.com/pydantic/pydantic/security/advisories/GHSA-5jqp-qgf6-3pvh)

* **Security fix:** Fix `date` and `datetime` parsing so passing either `'infinity'` or `float('inf')` (or their negative values) does not cause an infinite loop, see security advisory [CVE-2021-29510](https://github.com/pydantic/pydantic/security/advisories/GHSA-5jqp-qgf6-3pvh)
* fix schema generation with Enum by generating a valid name, [#2575](https://github.com/pydantic/pydantic/pull/2575) by @PrettyWood
* fix JSON schema generation with a `Literal` of an enum member, [#2536](https://github.com/pydantic/pydantic/pull/2536) by @PrettyWood
* Fix bug with configurations declarations that are passed as keyword arguments during class creation, [#2532](https://github.com/pydantic/pydantic/pull/2532) by @uriyyo
* Allow passing `json_encoders` in class kwargs, [#2521](https://github.com/pydantic/pydantic/pull/2521) by @layday
* support arbitrary types with custom `__eq__`, [#2483](https://github.com/pydantic/pydantic/pull/2483) by @PrettyWood
* support `Annotated` in `validate_arguments` and in generic models with Python 3.9, [#2483](https://github.com/pydantic/pydantic/pull/2483) by @PrettyWood

## v1.8.1 (2021-03-03)

Bug fixes for regressions and new features from `v1.8`

* allow elements of `Config.field` to update elements of a `Field`, [#2461](https://github.com/pydantic/pydantic/pull/2461) by @samuelcolvin
* fix validation with a `BaseModel` field and a custom root type, [#2449](https://github.com/pydantic/pydantic/pull/2449) by @PrettyWood
* expose `Pattern` encoder to `fastapi`, [#2444](https://github.com/pydantic/pydantic/pull/2444) by @PrettyWood
* enable the Hypothesis plugin to generate a constrained float when the `multiple_of` argument is specified, [#2442](https://github.com/pydantic/pydantic/pull/2442) by @tobi-lipede-oodle
* Avoid `RecursionError` when using some types like `Enum` or `Literal` with generic models, [#2436](https://github.com/pydantic/pydantic/pull/2436) by @PrettyWood
* do not overwrite declared `__hash__` in subclasses of a model, [#2422](https://github.com/pydantic/pydantic/pull/2422) by @PrettyWood
* fix `mypy` complaints on `Path` and `UUID` related custom types, [#2418](https://github.com/pydantic/pydantic/pull/2418) by @PrettyWood
* Support properly variable length tuples of compound types, [#2416](https://github.com/pydantic/pydantic/pull/2416) by @PrettyWood

## v1.8 (2021-02-26)

Thank you to pydantic's sponsors: @jorgecarleitao, @BCarley, @chdsbd, @tiangolo, @matin, @linusg, @kevinalh, @koxudaxi, @timdrijvers, @mkeen, @meadsteve, @ginomempin, @primer-io, @and-semakin, @tomthorogood, @AjitZK, @westonsteimel, @Mazyod, @christippett, @CarlosDomingues, @Kludex, @r-m-n for their kind support.
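The `NamedTuple` and `TypedDict` support highlighted in the v1.8 section below can be sketched like this. The `Point`/`Model` names are illustrative, and the sketch assumes pydantic ≥ 1.8 is installed:

```python
from typing import NamedTuple

from pydantic import BaseModel


class Point(NamedTuple):
    x: int
    y: int


class Model(BaseModel):
    # the NamedTuple is validated field by field, with the usual coercion
    p: Point


m = Model(p=('1', '2'))
assert m.p == Point(1, 2)
```

Because the tuple is validated rather than stored as-is, the string inputs above are coerced to `int` according to each `NamedTuple` field annotation.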
### Highlights

* [Hypothesis plugin](https://docs.pydantic.dev/hypothesis_plugin/) for testing, [#2097](https://github.com/pydantic/pydantic/pull/2097) by @Zac-HD
* support for [`NamedTuple` and `TypedDict`](https://docs.pydantic.dev/usage/types/#annotated-types), [#2216](https://github.com/pydantic/pydantic/pull/2216) by @PrettyWood
* Support [`Annotated` hints on model fields](https://docs.pydantic.dev/usage/schema/#typingannotated-fields), [#2147](https://github.com/pydantic/pydantic/pull/2147) by @JacobHayes
* [`frozen` parameter on `Config`](https://docs.pydantic.dev/usage/model_config/) to allow models to be hashed, [#1880](https://github.com/pydantic/pydantic/pull/1880) by @rhuille

### Changes

* **Breaking Change**, remove old deprecation aliases from v1, [#2415](https://github.com/pydantic/pydantic/pull/2415) by @samuelcolvin:
    * remove notes on migrating to v1 in docs
    * remove `Schema` which was replaced by `Field`
    * remove `Config.case_insensitive` which was replaced by `Config.case_sensitive` (default `False`)
    * remove `Config.allow_population_by_alias` which was replaced by `Config.allow_population_by_field_name`
    * remove `model.fields` which was replaced by `model.__fields__`
    * remove `model.to_string()` which was replaced by `str(model)`
    * remove `model.__values__` which was replaced by `model.__dict__`
* **Breaking Change:** always validate only first sublevel items with `each_item`. There were indeed some edge cases with some compound types where the validated items were the last sublevel ones, [#1933](https://github.com/pydantic/pydantic/pull/1933) by @PrettyWood
* Update docs extensions to fix local syntax highlighting, [#2400](https://github.com/pydantic/pydantic/pull/2400) by @daviskirk
* fix: allow `utils.lenient_issubclass` to handle `typing.GenericAlias` objects like `list[str]` in Python >= 3.9, [#2399](https://github.com/pydantic/pydantic/pull/2399) by @daviskirk
* Improve field declaration for _pydantic_ `dataclass` by allowing the usage of _pydantic_ `Field` or `'metadata'` kwarg of `dataclasses.field`, [#2384](https://github.com/pydantic/pydantic/pull/2384) by @PrettyWood
* Making `typing-extensions` a required dependency, [#2368](https://github.com/pydantic/pydantic/pull/2368) by @samuelcolvin
* Make `resolve_annotations` more lenient, allowing for missing modules, [#2363](https://github.com/pydantic/pydantic/pull/2363) by @samuelcolvin
* Allow configuring models through class kwargs, [#2356](https://github.com/pydantic/pydantic/pull/2356) by @Bobronium
* Prevent `Mapping` subclasses from always being coerced to `dict`, [#2325](https://github.com/pydantic/pydantic/pull/2325) by @ofek
* fix: allow `None` for type `Optional[conset / conlist]`, [#2320](https://github.com/pydantic/pydantic/pull/2320) by @PrettyWood
* Support empty tuple type, [#2318](https://github.com/pydantic/pydantic/pull/2318) by @PrettyWood
* fix: `python_requires` metadata to require >=3.6.1, [#2306](https://github.com/pydantic/pydantic/pull/2306) by @hukkinj1
* Properly encode `Decimal` with, or without any decimal places, [#2293](https://github.com/pydantic/pydantic/pull/2293) by @hultner
* fix: update `__fields_set__` in `BaseModel.copy(update=…)`, [#2290](https://github.com/pydantic/pydantic/pull/2290) by @PrettyWood
* fix: keep order of fields with `BaseModel.construct()`, [#2281](https://github.com/pydantic/pydantic/pull/2281) by @PrettyWood
* Support generating schema for Generic fields, [#2262](https://github.com/pydantic/pydantic/pull/2262) by @maximberg
* Fix `validate_decorator` so `**kwargs` doesn't exclude values when the keyword has the same name as the `*args` or `**kwargs` names, [#2251](https://github.com/pydantic/pydantic/pull/2251) by @cybojenix
* Prevent overriding positional arguments with keyword arguments in `validate_arguments`, as per behaviour with native functions, [#2249](https://github.com/pydantic/pydantic/pull/2249) by @cybojenix
* add documentation for `con*` type functions, [#2242](https://github.com/pydantic/pydantic/pull/2242) by @tayoogunbiyi
* Support custom root type (aka `__root__`) when using `parse_obj()` with nested models, [#2238](https://github.com/pydantic/pydantic/pull/2238) by @PrettyWood
* Support custom root type (aka `__root__`) with `from_orm()`, [#2237](https://github.com/pydantic/pydantic/pull/2237) by @PrettyWood
* ensure cythonized functions are left untouched when creating models, based on [#1944](https://github.com/pydantic/pydantic/pull/1944) by @kollmats, [#2228](https://github.com/pydantic/pydantic/pull/2228) by @samuelcolvin
* Resolve forward refs for stdlib dataclasses converted into _pydantic_ ones, [#2220](https://github.com/pydantic/pydantic/pull/2220) by @PrettyWood
* Add support for `NamedTuple` and `TypedDict` types. Those two types are now handled and validated when used inside `BaseModel` or _pydantic_ `dataclass`. Two utils are also added `create_model_from_namedtuple` and `create_model_from_typeddict`, [#2216](https://github.com/pydantic/pydantic/pull/2216) by @PrettyWood
* Do not ignore annotated fields when type is `Union[Type[...], ...]`, [#2213](https://github.com/pydantic/pydantic/pull/2213) by @PrettyWood
* Raise a user-friendly `TypeError` when a `root_validator` does not return a `dict` (e.g. `None`), [#2209](https://github.com/pydantic/pydantic/pull/2209) by @masalim2
* Add a `FrozenSet[str]` type annotation to the `allowed_schemes` argument on the `strict_url` field type, [#2198](https://github.com/pydantic/pydantic/pull/2198) by @Midnighter
* add `allow_mutation` constraint to `Field`, [#2195](https://github.com/pydantic/pydantic/pull/2195) by @sblack-usu
* Allow `Field` with a `default_factory` to be used as an argument to a function decorated with `validate_arguments`, [#2176](https://github.com/pydantic/pydantic/pull/2176) by @thomascobb
* Allow non-existent secrets directory by only issuing a warning, [#2175](https://github.com/pydantic/pydantic/pull/2175) by @davidolrik
* fix URL regex to parse fragment without query string, [#2168](https://github.com/pydantic/pydantic/pull/2168) by @andrewmwhite
* fix: ensure to always return one of the values in `Literal` field type, [#2166](https://github.com/pydantic/pydantic/pull/2166) by @PrettyWood
* Support `typing.Annotated` hints on model fields.
A `Field` may now be set in the type hint with `Annotated[..., Field(...)`; all other annotations are ignored but still visible with `get_type_hints(..., include_extras=True)`, [#2147](https://github.com/pydantic/pydantic/pull/2147) by @JacobHayes * Added `StrictBytes` type as well as `strict=False` option to `ConstrainedBytes`, [#2136](https://github.com/pydantic/pydantic/pull/2136) by @rlizzo * added `Config.anystr_lower` and `to_lower` kwarg to `constr` and `conbytes`, [#2134](https://github.com/pydantic/pydantic/pull/2134) by @tayoogunbiyi * Support plain `typing.Tuple` type, [#2132](https://github.com/pydantic/pydantic/pull/2132) by @PrettyWood * Add a bound method `validate` to functions decorated with `validate_arguments` to validate parameters without actually calling the function, [#2127](https://github.com/pydantic/pydantic/pull/2127) by @PrettyWood * Add the ability to customize settings sources (add / disable / change priority order), [#2107](https://github.com/pydantic/pydantic/pull/2107) by @kozlek * Fix mypy complaints about most custom _pydantic_ types, [#2098](https://github.com/pydantic/pydantic/pull/2098) by @PrettyWood * Add a [Hypothesis](https://hypothesis.readthedocs.io/) plugin for easier [property-based testing](https://increment.com/testing/in-praise-of-property-based-testing/) with Pydantic's custom types - [usage details here](https://docs.pydantic.dev/hypothesis_plugin/), [#2097](https://github.com/pydantic/pydantic/pull/2097) by @Zac-HD * add validator for `None`, `NoneType` or `Literal[None]`, [#2095](https://github.com/pydantic/pydantic/pull/2095) by @PrettyWood * Handle properly fields of type `Callable` with a default value, [#2094](https://github.com/pydantic/pydantic/pull/2094) by @PrettyWood * Updated `create_model` return type annotation to return type which inherits from `__base__` argument, [#2071](https://github.com/pydantic/pydantic/pull/2071) by @uriyyo * Add merged `json_encoders` inheritance, 
[#2064](https://github.com/pydantic/pydantic/pull/2064) by @art049 * allow overwriting `ClassVar`s in sub-models without having to re-annotate them, [#2061](https://github.com/pydantic/pydantic/pull/2061) by @layday * add default encoder for `Pattern` type, [#2045](https://github.com/pydantic/pydantic/pull/2045) by @PrettyWood * Add `NonNegativeInt`, `NonPositiveInt`, `NonNegativeFloat`, `NonPositiveFloat`, [#1975](https://github.com/pydantic/pydantic/pull/1975) by @mdavis-xyz * Use % for percentage in string format of colors, [#1960](https://github.com/pydantic/pydantic/pull/1960) by @EdwardBetts * Fixed issue causing `KeyError` to be raised when building schema from multiple `BaseModel` with the same names declared in separate classes, [#1912](https://github.com/pydantic/pydantic/pull/1912) by @JSextonn * Add `rediss` (Redis over SSL) protocol to `RedisDsn` Allow URLs without `user` part (e.g., `rediss://:pass@localhost`), [#1911](https://github.com/pydantic/pydantic/pull/1911) by @TrDex * Add a new `frozen` boolean parameter to `Config` (default: `False`). Setting `frozen=True` does everything that `allow_mutation=False` does, and also generates a `__hash__()` method for the model. 
This makes instances of the model potentially hashable if all the attributes are hashable, [#1880](https://github.com/pydantic/pydantic/pull/1880) by @rhuille * fix schema generation with multiple Enums having the same name, [#1857](https://github.com/pydantic/pydantic/pull/1857) by @PrettyWood * Added support for 13/19 digits VISA credit cards in `PaymentCardNumber` type, [#1416](https://github.com/pydantic/pydantic/pull/1416) by @AlexanderSov * fix: prevent `RecursionError` while using recursive `GenericModel`s, [#1370](https://github.com/pydantic/pydantic/pull/1370) by @xppt * use `enum` for `typing.Literal` in JSON schema, [#1350](https://github.com/pydantic/pydantic/pull/1350) by @PrettyWood * Fix: some recursive models did not require `update_forward_refs` and silently behaved incorrectly, [#1201](https://github.com/pydantic/pydantic/pull/1201) by @PrettyWood * Fix bug where generic models with fields where the typevar is nested in another type `a: List[T]` are considered to be concrete. This allows these models to be subclassed and composed as expected, [#947](https://github.com/pydantic/pydantic/pull/947) by @daviskirk * Add `Config.copy_on_model_validation` flag. When set to `False`, _pydantic_ will keep models used as fields untouched on validation instead of reconstructing (copying) them, [#265](https://github.com/pydantic/pydantic/pull/265) by @PrettyWood ## v1.7.4 (2021-05-11) * **Security fix:** Fix `date` and `datetime` parsing so passing either `'infinity'` or `float('inf')` (or their negative values) does not cause an infinite loop, See security advisory [CVE-2021-29510](https://github.com/pydantic/pydantic/security/advisories/GHSA-5jqp-qgf6-3pvh) ## v1.7.3 (2020-11-30) Thank you to pydantic's sponsors: @timdrijvers, @BCarley, @chdsbd, @tiangolo, @matin, @linusg, @kevinalh, @jorgecarleitao, @koxudaxi, @primer-api, @mkeen, @meadsteve for their kind support. 
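As a minimal sketch of the v1.8 `frozen` config option described above (the `Location` model and its field values are illustrative, not from the release notes):

```python
from pydantic import BaseModel


class Location(BaseModel):
    lat: float
    lng: float

    class Config:
        frozen = True  # implies allow_mutation=False and generates __hash__

# because the model is frozen and all fields are hashable,
# instances can be used as dict keys or set members
a = Location(lat=51.5, lng=-0.1)
b = Location(lat=51.5, lng=-0.1)
print(a == b, len({a, b}))
```

Equal field values give equal hashes, so `{a, b}` collapses to a single element.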
* fix: set right default value for required (optional) fields, [#2142](https://github.com/pydantic/pydantic/pull/2142) by @PrettyWood
* fix: support `underscore_attrs_are_private` with generic models, [#2138](https://github.com/pydantic/pydantic/pull/2138) by @PrettyWood
* fix: update all modified field values in `root_validator` when `validate_assignment` is on, [#2116](https://github.com/pydantic/pydantic/pull/2116) by @PrettyWood
* Allow pickling of `pydantic.dataclasses.dataclass` dynamically created from a built-in `dataclasses.dataclass`, [#2111](https://github.com/pydantic/pydantic/pull/2111) by @aimestereo
* Fix a regression where Enum fields would not propagate keyword arguments to the schema, [#2109](https://github.com/pydantic/pydantic/pull/2109) by @bm424
* Ignore `__doc__` as private attribute when `Config.underscore_attrs_are_private` is set, [#2090](https://github.com/pydantic/pydantic/pull/2090) by @PrettyWood

## v1.7.2 (2020-11-01)

* fix slow `GenericModel` concrete model creation, allow `GenericModel` concrete name reusing in module, [#2078](https://github.com/pydantic/pydantic/pull/2078) by @Bobronium
* keep the order of the fields when `validate_assignment` is set, [#2073](https://github.com/pydantic/pydantic/pull/2073) by @PrettyWood
* forward all the params of the stdlib `dataclass` when converted into _pydantic_ `dataclass`, [#2065](https://github.com/pydantic/pydantic/pull/2065) by @PrettyWood

## v1.7.1 (2020-10-28)

Thank you to pydantic's sponsors: @timdrijvers, @BCarley, @chdsbd, @tiangolo, @matin, @linusg, @kevinalh, @jorgecarleitao, @koxudaxi, @primer-api, @mkeen for their kind support.
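Several of the v1.7.x fixes above concern `Config.validate_assignment`; a minimal sketch of that option (the `Item` model and values are illustrative):

```python
from pydantic import BaseModel


class Item(BaseModel):
    qty: int

    class Config:
        validate_assignment = True  # re-run validation on attribute assignment


item = Item(qty=1)
item.qty = "3"  # the string is validated and coerced to int, as at init time
print(repr(item.qty))
```

Without `validate_assignment`, the assignment would store the raw string unchanged.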
* fix annotation of `validate_arguments` when passing configuration as argument, [#2055](https://github.com/pydantic/pydantic/pull/2055) by @layday
* Fix mypy assignment error when using `PrivateAttr`, [#2048](https://github.com/pydantic/pydantic/pull/2048) by @aphedges
* fix `underscore_attrs_are_private` causing `TypeError` when overriding `__init__`, [#2047](https://github.com/pydantic/pydantic/pull/2047) by @samuelcolvin
* Fixed regression introduced in v1.7 involving exception handling in field validators when `validate_assignment=True`, [#2044](https://github.com/pydantic/pydantic/pull/2044) by @johnsabath
* fix: _pydantic_ `dataclass` can inherit from stdlib `dataclass` and `Config.arbitrary_types_allowed` is supported, [#2042](https://github.com/pydantic/pydantic/pull/2042) by @PrettyWood

## v1.7 (2020-10-26)

Thank you to pydantic's sponsors: @timdrijvers, @BCarley, @chdsbd, @tiangolo, @matin, @linusg, @kevinalh, @jorgecarleitao, @koxudaxi, @primer-api for their kind support.

### Highlights

* Python 3.9 support, thanks @PrettyWood
* [Private model attributes](https://docs.pydantic.dev/usage/models/#private-model-attributes), thanks @Bobronium
* ["secrets files" support in `BaseSettings`](https://docs.pydantic.dev/usage/settings/#secret-support), thanks @mdgilene
* [convert stdlib dataclasses to pydantic dataclasses and use stdlib dataclasses in models](https://docs.pydantic.dev/usage/dataclasses/#stdlib-dataclasses-and-pydantic-dataclasses), thanks @PrettyWood

### Changes

* **Breaking Change:** remove `__field_defaults__`, add `default_factory` support with `BaseModel.construct`.
  Use `.get_default()` method on fields in `__fields__` attribute instead, [#1732](https://github.com/pydantic/pydantic/pull/1732) by @PrettyWood
* Rearrange CI to run linting as a separate job, split install recipes for different tasks, [#2020](https://github.com/pydantic/pydantic/pull/2020) by @samuelcolvin
* Allows subclasses of generic models to make some, or all, of the superclass's type parameters concrete, while also defining new type parameters in the subclass, [#2005](https://github.com/pydantic/pydantic/pull/2005) by @choogeboom
* Call validator with the correct `values` parameter type in `BaseModel.__setattr__`, when `validate_assignment = True` in model config, [#1999](https://github.com/pydantic/pydantic/pull/1999) by @me-ransh
* Force `fields.Undefined` to be a singleton object, fixing inherited generic model schemas, [#1981](https://github.com/pydantic/pydantic/pull/1981) by @daviskirk
* Include tests in source distributions, [#1976](https://github.com/pydantic/pydantic/pull/1976) by @sbraz
* Add ability to use `min_length/max_length` constraints with secret types, [#1974](https://github.com/pydantic/pydantic/pull/1974) by @uriyyo
* Also check `root_validators` when `validate_assignment` is on, [#1971](https://github.com/pydantic/pydantic/pull/1971) by @PrettyWood
* Fix const validators not running when custom validators are present, [#1957](https://github.com/pydantic/pydantic/pull/1957) by @hmvp
* add `deque` to field types, [#1935](https://github.com/pydantic/pydantic/pull/1935) by @wozniakty
* add basic support for Python 3.9, [#1832](https://github.com/pydantic/pydantic/pull/1832) by @PrettyWood
* Fix typo in the anchor of exporting_models.md#modelcopy and incorrect description, [#1821](https://github.com/pydantic/pydantic/pull/1821) by @KimMachineGun
* Added ability for `BaseSettings` to read "secret files", [#1820](https://github.com/pydantic/pydantic/pull/1820) by @mdgilene
* add `parse_raw_as` utility function, [#1812](https://github.com/pydantic/pydantic/pull/1812) by @PrettyWood
* Support home directory relative paths for `dotenv` files (e.g. `~/.env`), [#1803](https://github.com/pydantic/pydantic/pull/1803) by @PrettyWood
* Clarify documentation for `parse_file` to show that the argument should be a file *path* not a file-like object, [#1794](https://github.com/pydantic/pydantic/pull/1794) by @mdavis-xyz
* Fix false positive from mypy plugin when a class nested within a `BaseModel` is named `Model`, [#1770](https://github.com/pydantic/pydantic/pull/1770) by @selimb
* add basic support of Pattern type in schema generation, [#1767](https://github.com/pydantic/pydantic/pull/1767) by @PrettyWood
* Support custom title, description and default in schema of enums, [#1748](https://github.com/pydantic/pydantic/pull/1748) by @PrettyWood
* Properly represent `Literal` Enums when `use_enum_values` is True, [#1747](https://github.com/pydantic/pydantic/pull/1747) by @noelevans
* Allows timezone information to be added to strings to be formatted as time objects.
  Permitted formats are `Z` for UTC or an offset for absolute positive or negative time shifts.
  Or the timezone data can be omitted, [#1744](https://github.com/pydantic/pydantic/pull/1744) by @noelevans
* Add stub `__init__` with Python 3.6 signature for `ForwardRef`, [#1738](https://github.com/pydantic/pydantic/pull/1738) by @sirtelemak
* Fix behaviour with forward refs and optional fields in nested models, [#1736](https://github.com/pydantic/pydantic/pull/1736) by @PrettyWood
* add `Enum` and `IntEnum` as valid types for fields, [#1735](https://github.com/pydantic/pydantic/pull/1735) by @PrettyWood
* Change default value of `__module__` argument of `create_model` from `None` to `'pydantic.main'`.
  Set reference of created concrete model to its module to allow pickling (not applied to models created in functions), [#1686](https://github.com/pydantic/pydantic/pull/1686) by @Bobronium
* Add private attributes support, [#1679](https://github.com/pydantic/pydantic/pull/1679) by @Bobronium
* add `config` to `@validate_arguments`, [#1663](https://github.com/pydantic/pydantic/pull/1663) by @samuelcolvin
* Allow descendant Settings models to override env variable names for the fields defined in parent Settings models with `env` in their `Config`.
  Previously only `env_prefix` configuration option was applicable, [#1561](https://github.com/pydantic/pydantic/pull/1561) by @ojomio
* Support `ref_template` when creating schema `$ref`s, [#1479](https://github.com/pydantic/pydantic/pull/1479) by @kilo59
* Add a `__call__` stub to `PyObject` so that mypy will know that it is callable, [#1352](https://github.com/pydantic/pydantic/pull/1352) by @brianmaissy
* `pydantic.dataclasses.dataclass` decorator now supports built-in `dataclasses.dataclass`.
  It is hence possible to convert an existing `dataclass` easily to add Pydantic validation.
  Moreover nested dataclasses are also supported, [#744](https://github.com/pydantic/pydantic/pull/744) by @PrettyWood

## v1.6.2 (2021-05-11)

* **Security fix:** Fix `date` and `datetime` parsing so passing either `'infinity'` or `float('inf')` (or their negative values) does not cause an infinite loop, see security advisory [CVE-2021-29510](https://github.com/pydantic/pydantic/security/advisories/GHSA-5jqp-qgf6-3pvh)

## v1.6.1 (2020-07-15)

* fix validation and parsing of nested models with `default_factory`, [#1710](https://github.com/pydantic/pydantic/pull/1710) by @PrettyWood

## v1.6 (2020-07-11)

Thank you to pydantic's sponsors: @matin, @tiangolo, @chdsbd, @jorgecarleitao, and 1 anonymous sponsor for their kind support.
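The v1.7 change letting `pydantic.dataclasses.dataclass` wrap an existing stdlib dataclass ([#744](https://github.com/pydantic/pydantic/pull/744) above) can be sketched as follows (the `Point` class and values are illustrative):

```python
import dataclasses

from pydantic.dataclasses import dataclass as pydantic_dataclass


@dataclasses.dataclass
class Point:
    x: int
    y: int


# wrap the existing stdlib dataclass to add pydantic validation/coercion;
# the original Point stays a plain dataclass
ValidatedPoint = pydantic_dataclass(Point)

p = ValidatedPoint(x="1", y=2)  # "1" is validated and coerced to int
```

The wrapped class keeps the original fields and ordering; only construction gains validation.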
* Modify validators for `conlist` and `conset` to not have `always=True`, [#1682](https://github.com/pydantic/pydantic/pull/1682) by @samuelcolvin
* add port check to `AnyUrl` (can't exceed 65536); ports are 16 unsigned bits: `0 <= port <= 2**16-1`, src: [rfc793 header format](https://tools.ietf.org/html/rfc793#section-3.1), [#1654](https://github.com/pydantic/pydantic/pull/1654) by @flapili
* Document default `regex` anchoring semantics, [#1648](https://github.com/pydantic/pydantic/pull/1648) by @yurikhan
* Use `chain.from_iterable` in class_validators.py.
  This is a faster and more idiomatic way of using `itertools.chain`.
  Instead of computing all the items in the iterable and storing them in memory, they are computed one-by-one and never stored as a huge list.
  This can save on both runtime and memory space, [#1642](https://github.com/pydantic/pydantic/pull/1642) by @cool-RR
* Add `conset()`, analogous to `conlist()`, [#1623](https://github.com/pydantic/pydantic/pull/1623) by @patrickkwang
* make Pydantic errors (un)picklable, [#1616](https://github.com/pydantic/pydantic/pull/1616) by @PrettyWood
* Allow custom encoding for `dotenv` files, [#1615](https://github.com/pydantic/pydantic/pull/1615) by @PrettyWood
* Ensure `SchemaExtraCallable` is always defined to get type hints on BaseConfig, [#1614](https://github.com/pydantic/pydantic/pull/1614) by @PrettyWood
* Update datetime parser to support negative timestamps, [#1600](https://github.com/pydantic/pydantic/pull/1600) by @mlbiche
* Update mypy, remove `AnyType` alias for `Type[Any]`, [#1598](https://github.com/pydantic/pydantic/pull/1598) by @samuelcolvin
* Adjust handling of root validators so that errors are aggregated from _all_ failing root validators, instead of reporting on only the first root validator to fail, [#1586](https://github.com/pydantic/pydantic/pull/1586) by @beezee
* Make `__modify_schema__` on Enums apply to the enum schema rather than fields that use the enum, [#1581](https://github.com/pydantic/pydantic/pull/1581) by @therefromhere
* Fix behavior of `__all__` key when used in conjunction with index keys in advanced include/exclude of fields that are sequences, [#1579](https://github.com/pydantic/pydantic/pull/1579) by @xspirus
* Subclass validators do not run when referencing a `List` field defined in a parent class when `each_item=True`.
  Added an example to the docs illustrating this, [#1566](https://github.com/pydantic/pydantic/pull/1566) by @samueldeklund
* change `schema.field_class_to_schema` to support `frozenset` in schema, [#1557](https://github.com/pydantic/pydantic/pull/1557) by @wangpeibao
* Call `__modify_schema__` only for the field schema, [#1552](https://github.com/pydantic/pydantic/pull/1552) by @PrettyWood
* Move the assignment of `field.validate_always` in `fields.py` so the `always` parameter of validators work on inheritance, [#1545](https://github.com/pydantic/pydantic/pull/1545) by @dcHHH
* Added support for UUID instantiation through 16 byte strings such as `b'\x12\x34\x56\x78' * 4`.
  This was done to support `BINARY(16)` columns in sqlalchemy, [#1541](https://github.com/pydantic/pydantic/pull/1541) by @shawnwall
* Add a test assertion that `default_factory` can return a singleton, [#1523](https://github.com/pydantic/pydantic/pull/1523) by @therefromhere
* Add `NameEmail.__eq__` so duplicate `NameEmail` instances are evaluated as equal, [#1514](https://github.com/pydantic/pydantic/pull/1514) by @stephen-bunn
* Add datamodel-code-generator link in pydantic document site, [#1500](https://github.com/pydantic/pydantic/pull/1500) by @koxudaxi
* Added a "Discussion of Pydantic" section to the documentation, with a link to "Pydantic Introduction" video by Alexander Hultnér, [#1499](https://github.com/pydantic/pydantic/pull/1499) by @hultner
* Avoid some side effects of `default_factory` by calling it only once if possible and by not setting a default value in the schema, [#1491](https://github.com/pydantic/pydantic/pull/1491) by @PrettyWood
* Added docs about dumping dataclasses to JSON, [#1487](https://github.com/pydantic/pydantic/pull/1487) by @mikegrima
* Make `BaseModel.__signature__` class-only, so getting `__signature__` from model instance will raise `AttributeError`, [#1466](https://github.com/pydantic/pydantic/pull/1466) by @Bobronium
* include `'format': 'password'` in the schema for secret types, [#1424](https://github.com/pydantic/pydantic/pull/1424) by @atheuz
* Modify schema constraints on `ConstrainedFloat` so that `exclusiveMinimum` and minimum are not included in the schema if they are equal to `-math.inf` and `exclusiveMaximum` and `maximum` are not included if they are equal to `math.inf`, [#1417](https://github.com/pydantic/pydantic/pull/1417) by @vdwees
* Squash internal `__root__` dicts in `.dict()` (and, by extension, in `.json()`), [#1414](https://github.com/pydantic/pydantic/pull/1414) by @patrickkwang
* Move `const` validator to post-validators so it validates the parsed value, [#1410](https://github.com/pydantic/pydantic/pull/1410) by @selimb
* Fix model validation to handle nested literals, e.g. `Literal['foo', Literal['bar']]`, [#1364](https://github.com/pydantic/pydantic/pull/1364) by @DBCerigo
* Remove `user_required = True` from `RedisDsn`, neither user nor password are required, [#1275](https://github.com/pydantic/pydantic/pull/1275) by @samuelcolvin
* Remove extra `allOf` from schema for fields with `Union` and custom `Field`, [#1209](https://github.com/pydantic/pydantic/pull/1209) by @mostaphaRoudsari
* Updates OpenAPI schema generation to output all enums as separate models.
  Instead of inlining the enum values in the model schema, models now use a `$ref` property to point to the enum definition, [#1173](https://github.com/pydantic/pydantic/pull/1173) by @calvinwyoung

## v1.5.1 (2020-04-23)

* Signature generation with `extra: allow` never uses a field name, [#1418](https://github.com/pydantic/pydantic/pull/1418) by @prettywood
* Avoid mutating `Field` default value, [#1412](https://github.com/pydantic/pydantic/pull/1412) by @prettywood

## v1.5 (2020-04-18)

* Make includes/excludes arguments for `.dict()`, `._iter()`, ..., immutable, [#1404](https://github.com/pydantic/pydantic/pull/1404) by @AlexECX
* Always use a field's real name with includes/excludes in `model._iter()`, regardless of `by_alias`, [#1397](https://github.com/pydantic/pydantic/pull/1397) by @AlexECX
* Update constr regex example to include start and end lines, [#1396](https://github.com/pydantic/pydantic/pull/1396) by @lmcnearney
* Confirm that shallow `model.copy()` does make a shallow copy of attributes, [#1383](https://github.com/pydantic/pydantic/pull/1383) by @samuelcolvin
* Renaming `model_name` argument of `main.create_model()` to `__model_name` to allow using `model_name` as a field name, [#1367](https://github.com/pydantic/pydantic/pull/1367) by @kittipatv
* Replace raising of exception to silent passing for non-Var attributes in mypy plugin, [#1345](https://github.com/pydantic/pydantic/pull/1345) by @b0g3r
* Remove `typing_extensions` dependency for Python 3.8, [#1342](https://github.com/pydantic/pydantic/pull/1342) by @prettywood
* Make `SecretStr` and `SecretBytes` initialization idempotent, [#1330](https://github.com/pydantic/pydantic/pull/1330) by @atheuz
* document making secret types dumpable using the json method, [#1328](https://github.com/pydantic/pydantic/pull/1328) by @atheuz
* Move all testing and build to github actions, add windows and macos binaries, thank you @StephenBrown2 for much help, [#1326](https://github.com/pydantic/pydantic/pull/1326) by @samuelcolvin
* fix card number length check in `PaymentCardNumber`, `PaymentCardBrand` now inherits from `str`, [#1317](https://github.com/pydantic/pydantic/pull/1317) by @samuelcolvin
* Have `BaseModel` inherit from `Representation` to make mypy happy when overriding `__str__`, [#1310](https://github.com/pydantic/pydantic/pull/1310) by @FuegoFro
* Allow `None` as input to all optional list fields, [#1307](https://github.com/pydantic/pydantic/pull/1307) by @prettywood
* Add `datetime` field to `default_factory` example, [#1301](https://github.com/pydantic/pydantic/pull/1301) by @StephenBrown2
* Allow subclasses of known types to be encoded with superclass encoder, [#1291](https://github.com/pydantic/pydantic/pull/1291) by @StephenBrown2
* Exclude exported fields from all elements of a list/tuple of submodels/dicts with `'__all__'`, [#1286](https://github.com/pydantic/pydantic/pull/1286) by @masalim2
* Add pydantic.color.Color objects as available input for Color fields, [#1258](https://github.com/pydantic/pydantic/pull/1258) by @leosussan
* In examples, type nullable fields as `Optional`, so that these are valid mypy annotations, [#1248](https://github.com/pydantic/pydantic/pull/1248) by @kokes
* Make `pattern_validator()` accept pre-compiled `Pattern` objects.
  Fix `str_validator()` return type to `str`, [#1237](https://github.com/pydantic/pydantic/pull/1237) by @adamgreg
* Document how to manage Generics and inheritance, [#1229](https://github.com/pydantic/pydantic/pull/1229) by @esadruhn
* `update_forward_refs()` method of BaseModel now copies `__dict__` of class module instead of modifying it, [#1228](https://github.com/pydantic/pydantic/pull/1228) by @paul-ilyin
* Support instance methods and class methods with `@validate_arguments`, [#1222](https://github.com/pydantic/pydantic/pull/1222) by @samuelcolvin
* Add `default_factory` argument to `Field` to create a dynamic default value by passing a zero-argument callable, [#1210](https://github.com/pydantic/pydantic/pull/1210) by @prettywood
* add support for `NewType` of `List`, `Optional`, etc, [#1207](https://github.com/pydantic/pydantic/pull/1207) by @Kazy
* fix mypy signature for `root_validator`, [#1192](https://github.com/pydantic/pydantic/pull/1192) by @samuelcolvin
* Fixed parsing of nested 'custom root type' models, [#1190](https://github.com/pydantic/pydantic/pull/1190) by @Shados
* Add `validate_arguments` function decorator which checks the arguments to a function matches type annotations, [#1179](https://github.com/pydantic/pydantic/pull/1179) by @samuelcolvin
* Add `__signature__` to models, [#1034](https://github.com/pydantic/pydantic/pull/1034) by @Bobronium
* Refactor `._iter()` method, 10x speed boost for `dict(model)`, [#1017](https://github.com/pydantic/pydantic/pull/1017) by @Bobronium

## v1.4 (2020-01-24)

* **Breaking Change:** alias precedence logic changed so aliases on a field always take priority over an alias from `alias_generator` to avoid buggy/unexpected behaviour, see [here](https://docs.pydantic.dev/usage/model_config/#alias-precedence) for details, [#1178](https://github.com/pydantic/pydantic/pull/1178) by @samuelcolvin
* Add support for unicode and punycode in TLDs, [#1182](https://github.com/pydantic/pydantic/pull/1182) by @jamescurtin
* Fix `cls` argument in validators during assignment, [#1172](https://github.com/pydantic/pydantic/pull/1172) by @samuelcolvin
* completing Luhn algorithm for `PaymentCardNumber`, [#1166](https://github.com/pydantic/pydantic/pull/1166) by @cuencandres
* add support for generics that implement `__get_validators__` like a custom data type, [#1159](https://github.com/pydantic/pydantic/pull/1159) by @tiangolo
* add support for infinite generators with `Iterable`, [#1152](https://github.com/pydantic/pydantic/pull/1152) by @tiangolo
* fix `url_regex` to accept schemas with `+`, `-` and `.` after the first character, [#1142](https://github.com/pydantic/pydantic/pull/1142) by @samuelcolvin
* move `version_info()` to `version.py`, suggest its use in issues, [#1138](https://github.com/pydantic/pydantic/pull/1138) by @samuelcolvin
* Improve pydantic import time by roughly 50% by deferring some module loading and regex compilation, [#1127](https://github.com/pydantic/pydantic/pull/1127) by @samuelcolvin
* Fix `EmailStr` and `NameEmail` to accept instances of themselves in cython, [#1126](https://github.com/pydantic/pydantic/pull/1126) by @koxudaxi
* Pass model class to the `Config.schema_extra` callable, [#1125](https://github.com/pydantic/pydantic/pull/1125) by @therefromhere
* Fix regex for username and password in URLs, [#1115](https://github.com/pydantic/pydantic/pull/1115) by @samuelcolvin
* Add support for nested generic models, [#1104](https://github.com/pydantic/pydantic/pull/1104) by @dmontagu
* add `__all__` to `__init__.py` to prevent "implicit reexport" errors from mypy, [#1072](https://github.com/pydantic/pydantic/pull/1072) by @samuelcolvin
* Add support for using "dotenv" files with `BaseSettings`, [#1011](https://github.com/pydantic/pydantic/pull/1011) by @acnebs

## v1.3 (2019-12-21)

* Change `schema` and `schema_model` to handle dataclasses by using their `__pydantic_model__` feature, [#792](https://github.com/pydantic/pydantic/pull/792) by @aviramha
* Added option for `root_validator` to be skipped if values validation fails using keyword `skip_on_failure=True`, [#1049](https://github.com/pydantic/pydantic/pull/1049) by @aviramha
* Allow `Config.schema_extra` to be a callable so that the generated schema can be post-processed, [#1054](https://github.com/pydantic/pydantic/pull/1054) by @selimb
* Update mypy to version 0.750, [#1057](https://github.com/pydantic/pydantic/pull/1057) by @dmontagu
* Trick Cython into allowing str subclassing, [#1061](https://github.com/pydantic/pydantic/pull/1061) by @skewty
* Prevent type attributes being added to schema unless the attribute `__schema_attributes__` is `True`, [#1064](https://github.com/pydantic/pydantic/pull/1064) by @samuelcolvin
* Change `BaseModel.parse_file` to use `Config.json_loads`, [#1067](https://github.com/pydantic/pydantic/pull/1067) by @kierandarcy
* Fix for optional `Json` fields, [#1073](https://github.com/pydantic/pydantic/pull/1073) by @volker48
* Change the default number of threads used when compiling with cython to one, allow override via the `CYTHON_NTHREADS` environment variable, [#1074](https://github.com/pydantic/pydantic/pull/1074) by @samuelcolvin
* Run FastAPI tests during Pydantic's CI tests, [#1075](https://github.com/pydantic/pydantic/pull/1075) by @tiangolo
* My mypy strictness constraints, and associated tweaks to type annotations, [#1077](https://github.com/pydantic/pydantic/pull/1077) by @samuelcolvin
* Add `__eq__` to SecretStr and SecretBytes to allow "value equals", [#1079](https://github.com/pydantic/pydantic/pull/1079) by @sbv-trueenergy
* Fix schema generation for nested None case, [#1088](https://github.com/pydantic/pydantic/pull/1088) by @lutostag
* Consistent checks for sequence like objects, [#1090](https://github.com/pydantic/pydantic/pull/1090) by @samuelcolvin
* Fix `Config` inheritance on `BaseSettings` when used with `env_prefix`, [#1091](https://github.com/pydantic/pydantic/pull/1091) by @samuelcolvin
* Fix for `__modify_schema__` when it conflicted with `field_class_to_schema*`, [#1102](https://github.com/pydantic/pydantic/pull/1102) by @samuelcolvin
* docs: Fix explanation of case sensitive environment variable names when populating `BaseSettings` subclass attributes, [#1105](https://github.com/pydantic/pydantic/pull/1105) by @tribals
* Rename django-rest-framework benchmark in documentation, [#1119](https://github.com/pydantic/pydantic/pull/1119) by @frankie567

## v1.2 (2019-11-28)

* **Possible Breaking Change:** Add support for required `Optional` with `name: Optional[AnyType] = Field(...)` and refactor `ModelField` creation to preserve `required` parameter value, [#1031](https://github.com/pydantic/pydantic/pull/1031) by @tiangolo; see [here](https://docs.pydantic.dev/usage/models/#required-optional-fields) for details
* Add benchmarks for `cattrs`, [#513](https://github.com/pydantic/pydantic/pull/513) by @sebastianmika
* Add `exclude_none` option to `dict()` and friends, [#587](https://github.com/pydantic/pydantic/pull/587) by @niknetniko
* Add benchmarks for `valideer`, [#670](https://github.com/pydantic/pydantic/pull/670) by @gsakkis
* Add `parse_obj_as` and `parse_file_as` functions for ad-hoc parsing of data into arbitrary pydantic-compatible types, [#934](https://github.com/pydantic/pydantic/pull/934) by @dmontagu
* Add `allow_reuse` argument to validators, thus allowing validator reuse, [#940](https://github.com/pydantic/pydantic/pull/940) by @dmontagu
* Add support for mapping types for custom root models, [#958](https://github.com/pydantic/pydantic/pull/958) by @dmontagu
* Mypy plugin support for dataclasses, [#966](https://github.com/pydantic/pydantic/pull/966) by @koxudaxi
* Add support for dataclasses default factory, [#968](https://github.com/pydantic/pydantic/pull/968) by @ahirner
* Add a `ByteSize` type for converting byte string (`1GB`) to plain bytes, [#977](https://github.com/pydantic/pydantic/pull/977) by @dgasmith
* Fix mypy complaint about `@root_validator(pre=True)`, [#984](https://github.com/pydantic/pydantic/pull/984) by @samuelcolvin
* Add manylinux binaries for Python 3.8 to pypi, also support manylinux2010, [#994](https://github.com/pydantic/pydantic/pull/994) by @samuelcolvin
* Adds ByteSize conversion to another unit, [#995](https://github.com/pydantic/pydantic/pull/995) by @dgasmith
* Fix `__str__` and `__repr__` inheritance for models, [#1022](https://github.com/pydantic/pydantic/pull/1022) by @samuelcolvin
* add testimonials section to docs, [#1025](https://github.com/pydantic/pydantic/pull/1025) by @sullivancolin
* Add support for `typing.Literal` for Python 3.8, [#1026](https://github.com/pydantic/pydantic/pull/1026) by @dmontagu

## v1.1.1 (2019-11-20)

* Fix bug where use of complex fields on sub-models could cause fields to be incorrectly configured, [#1015](https://github.com/pydantic/pydantic/pull/1015) by @samuelcolvin

## v1.1 (2019-11-07)

* Add a mypy plugin for type checking `BaseModel.__init__` and more, [#722](https://github.com/pydantic/pydantic/pull/722) by @dmontagu
* Change return type typehint for `GenericModel.__class_getitem__` to prevent PyCharm warnings, [#936](https://github.com/pydantic/pydantic/pull/936) by @dmontagu
* Fix usage of `Any` to allow `None`, also support `TypeVar` thus allowing use of un-parameterised collection types e.g. `Dict` and `List`, [#962](https://github.com/pydantic/pydantic/pull/962) by @samuelcolvin
* Set `FieldInfo` on subfields to fix schema generation for complex nested types, [#965](https://github.com/pydantic/pydantic/pull/965) by @samuelcolvin

## v1.0 (2019-10-23)

* **Breaking Change:** deprecate the `Model.fields` property, use `Model.__fields__` instead, [#883](https://github.com/pydantic/pydantic/pull/883) by @samuelcolvin
* **Breaking Change:** Change the precedence of aliases so child model aliases override parent aliases, including using `alias_generator`, [#904](https://github.com/pydantic/pydantic/pull/904) by @samuelcolvin
* **Breaking change:** Rename `skip_defaults` to `exclude_unset`, and add ability to exclude actual defaults, [#915](https://github.com/pydantic/pydantic/pull/915) by @dmontagu
* Add `**kwargs` to `pydantic.main.ModelMetaclass.__new__` so `__init_subclass__` can take custom parameters on extended `BaseModel` classes, [#867](https://github.com/pydantic/pydantic/pull/867) by @retnikt
* Fix field of a type that has a default value, [#880](https://github.com/pydantic/pydantic/pull/880) by @koxudaxi
* Use `FutureWarning` instead of `DeprecationWarning` when `alias` instead of `env` is used for settings models, [#881](https://github.com/pydantic/pydantic/pull/881) by @samuelcolvin
* Fix issue with `BaseSettings` inheritance and `alias` getting set to `None`, [#882](https://github.com/pydantic/pydantic/pull/882) by @samuelcolvin
* Modify `__repr__` and `__str__` methods to be consistent across all public classes, add `__pretty__` to support python-devtools, [#884](https://github.com/pydantic/pydantic/pull/884) by @samuelcolvin
* deprecation warning for `case_insensitive` on `BaseSettings` config, [#885](https://github.com/pydantic/pydantic/pull/885) by @samuelcolvin
* For `BaseSettings` merge environment variables and in-code values recursively, as long as they create a valid object when merged together, to allow splitting init arguments,
[#888](https://github.com/pydantic/pydantic/pull/888) by @idmitrievsky * change secret types example, [#890](https://github.com/pydantic/pydantic/pull/890) by @ashears * Change the signature of `Model.construct()` to be more user-friendly, document `construct()` usage, [#898](https://github.com/pydantic/pydantic/pull/898) by @samuelcolvin * Add example for the `construct()` method, [#907](https://github.com/pydantic/pydantic/pull/907) by @ashears * Improve use of `Field` constraints on complex types, raise an error if constraints are not enforceable, also support tuples with an ellipsis `Tuple[X, ...]`, `Sequence` and `FrozenSet` in schema, [#909](https://github.com/pydantic/pydantic/pull/909) by @samuelcolvin * update docs for bool missing valid value, [#911](https://github.com/pydantic/pydantic/pull/911) by @trim21 * Better `str`/`repr` logic for `ModelField`, [#912](https://github.com/pydantic/pydantic/pull/912) by @samuelcolvin * Fix `ConstrainedList`, update schema generation to reflect `min_items` and `max_items` `Field()` arguments, [#917](https://github.com/pydantic/pydantic/pull/917) by @samuelcolvin * Allow abstracts sets (eg. 
dict keys) in the `include` and `exclude` arguments of `dict()`, [#921](https://github.com/pydantic/pydantic/pull/921) by @samuelcolvin * Fix JSON serialization errors on `ValidationError.json()` by using `pydantic_encoder`, [#922](https://github.com/pydantic/pydantic/pull/922) by @samuelcolvin * Clarify usage of `remove_untouched`, improve error message for types with no validators, [#926](https://github.com/pydantic/pydantic/pull/926) by @retnikt ## v1.0b2 (2019-10-07) * Mark `StrictBool` typecheck as `bool` to allow for default values without mypy errors, [#690](https://github.com/pydantic/pydantic/pull/690) by @dmontagu * Transfer the documentation build from sphinx to mkdocs, re-write much of the documentation, [#856](https://github.com/pydantic/pydantic/pull/856) by @samuelcolvin * Add support for custom naming schemes for `GenericModel` subclasses, [#859](https://github.com/pydantic/pydantic/pull/859) by @dmontagu * Add `if TYPE_CHECKING:` to the excluded lines for test coverage, [#874](https://github.com/pydantic/pydantic/pull/874) by @dmontagu * Rename `allow_population_by_alias` to `allow_population_by_field_name`, remove unnecessary warning about it, [#875](https://github.com/pydantic/pydantic/pull/875) by @samuelcolvin ## v1.0b1 (2019-10-01) * **Breaking Change:** rename `Schema` to `Field`, make it a function to placate mypy, [#577](https://github.com/pydantic/pydantic/pull/577) by @samuelcolvin * **Breaking Change:** modify parsing behavior for `bool`, [#617](https://github.com/pydantic/pydantic/pull/617) by @dmontagu * **Breaking Change:** `get_validators` is no longer recognised, use `__get_validators__`. 
`Config.ignore_extra` and `Config.allow_extra` are no longer recognised, use `Config.extra`, [#720](https://github.com/pydantic/pydantic/pull/720) by @samuelcolvin * **Breaking Change:** modify default config settings for `BaseSettings`; `case_insensitive` renamed to `case_sensitive`, default changed to `case_sensitive = False`, `env_prefix` default changed to `''` - e.g. no prefix, [#721](https://github.com/pydantic/pydantic/pull/721) by @dmontagu * **Breaking change:** Implement `root_validator` and rename root errors from `__obj__` to `__root__`, [#729](https://github.com/pydantic/pydantic/pull/729) by @samuelcolvin * **Breaking Change:** alter the behaviour of `dict(model)` so that sub-models are nolonger converted to dictionaries, [#733](https://github.com/pydantic/pydantic/pull/733) by @samuelcolvin * **Breaking change:** Added `initvars` support to `post_init_post_parse`, [#748](https://github.com/pydantic/pydantic/pull/748) by @Raphael-C-Almeida * **Breaking Change:** Make `BaseModel.json()` only serialize the `__root__` key for models with custom root, [#752](https://github.com/pydantic/pydantic/pull/752) by @dmontagu * **Breaking Change:** complete rewrite of `URL` parsing logic, [#755](https://github.com/pydantic/pydantic/pull/755) by @samuelcolvin * **Breaking Change:** preserve superclass annotations for field-determination when not provided in subclass, [#757](https://github.com/pydantic/pydantic/pull/757) by @dmontagu * **Breaking Change:** `BaseSettings` now uses the special `env` settings to define which environment variables to read, not aliases, [#847](https://github.com/pydantic/pydantic/pull/847) by @samuelcolvin * add support for `assert` statements inside validators, [#653](https://github.com/pydantic/pydantic/pull/653) by @abdusco * Update documentation to specify the use of `pydantic.dataclasses.dataclass` and subclassing `pydantic.BaseModel`, [#710](https://github.com/pydantic/pydantic/pull/710) by @maddosaurus * Allow custom JSON decoding 
and encoding via `json_loads` and `json_dumps` `Config` properties, [#714](https://github.com/pydantic/pydantic/pull/714) by @samuelcolvin * make all annotated fields occur in the order declared, [#715](https://github.com/pydantic/pydantic/pull/715) by @dmontagu * use pytest to test `mypy` integration, [#735](https://github.com/pydantic/pydantic/pull/735) by @dmontagu * add `__repr__` method to `ErrorWrapper`, [#738](https://github.com/pydantic/pydantic/pull/738) by @samuelcolvin * Added support for `FrozenSet` members in dataclasses, and a better error when attempting to use types from the `typing` module that are not supported by Pydantic, [#745](https://github.com/pydantic/pydantic/pull/745) by @djpetti * add documentation for Pycharm Plugin, [#750](https://github.com/pydantic/pydantic/pull/750) by @koxudaxi * fix broken examples in the docs, [#753](https://github.com/pydantic/pydantic/pull/753) by @dmontagu * moving typing related objects into `pydantic.typing`, [#761](https://github.com/pydantic/pydantic/pull/761) by @samuelcolvin * Minor performance improvements to `ErrorWrapper`, `ValidationError` and datetime parsing, [#763](https://github.com/pydantic/pydantic/pull/763) by @samuelcolvin * Improvements to `datetime`/`date`/`time`/`timedelta` types: more descriptive errors, change errors to `value_error` not `type_error`, support bytes, [#766](https://github.com/pydantic/pydantic/pull/766) by @samuelcolvin * fix error messages for `Literal` types with multiple allowed values, [#770](https://github.com/pydantic/pydantic/pull/770) by @dmontagu * Improved auto-generated `title` field in JSON schema by converting underscore to space, [#772](https://github.com/pydantic/pydantic/pull/772) by @skewty * support `mypy --no-implicit-reexport` for dataclasses, also respect `--no-implicit-reexport` in pydantic itself, [#783](https://github.com/pydantic/pydantic/pull/783) by @samuelcolvin * add the `PaymentCardNumber` type, 
[#790](https://github.com/pydantic/pydantic/pull/790) by @matin * Fix const validations for lists, [#794](https://github.com/pydantic/pydantic/pull/794) by @hmvp * Set `additionalProperties` to false in schema for models with extra fields disallowed, [#796](https://github.com/pydantic/pydantic/pull/796) by @Code0x58 * `EmailStr` validation method now returns local part case-sensitive per RFC 5321, [#798](https://github.com/pydantic/pydantic/pull/798) by @henriklindgren * Added ability to validate strictness to `ConstrainedFloat`, `ConstrainedInt` and `ConstrainedStr` and added `StrictFloat` and `StrictInt` classes, [#799](https://github.com/pydantic/pydantic/pull/799) by @DerRidda * Improve handling of `None` and `Optional`, replace `whole` with `each_item` (inverse meaning, default `False`) on validators, [#803](https://github.com/pydantic/pydantic/pull/803) by @samuelcolvin * add support for `Type[T]` type hints, [#807](https://github.com/pydantic/pydantic/pull/807) by @timonbimon * Performance improvements from removing `change_exceptions`, change how pydantic error are constructed, [#819](https://github.com/pydantic/pydantic/pull/819) by @samuelcolvin * Fix the error message arising when a `BaseModel`-type model field causes a `ValidationError` during parsing, [#820](https://github.com/pydantic/pydantic/pull/820) by @dmontagu * allow `getter_dict` on `Config`, modify `GetterDict` to be more like a `Mapping` object and thus easier to work with, [#821](https://github.com/pydantic/pydantic/pull/821) by @samuelcolvin * Only check `TypeVar` param on base `GenericModel` class, [#842](https://github.com/pydantic/pydantic/pull/842) by @zpencerq * rename `Model._schema_cache` -> `Model.__schema_cache__`, `Model._json_encoder` -> `Model.__json_encoder__`, `Model._custom_root_type` -> `Model.__custom_root_type__`, [#851](https://github.com/pydantic/pydantic/pull/851) by @samuelcolvin ## v0.32.2 (2019-08-17) (Docs are available 
[here](https://5d584fcca7c9b70007d1c997--pydantic-docs.netlify.com)) * fix `__post_init__` usage with dataclass inheritance, fix [#739](https://github.com/pydantic/pydantic/pull/739) by @samuelcolvin * fix required fields validation on GenericModels classes, [#742](https://github.com/pydantic/pydantic/pull/742) by @amitbl * fix defining custom `Schema` on `GenericModel` fields, [#754](https://github.com/pydantic/pydantic/pull/754) by @amitbl ## v0.32.1 (2019-08-08) * do not validate extra fields when `validate_assignment` is on, [#724](https://github.com/pydantic/pydantic/pull/724) by @YaraslauZhylko ## v0.32 (2019-08-06) * add model name to `ValidationError` error message, [#676](https://github.com/pydantic/pydantic/pull/676) by @dmontagu * **breaking change**: remove `__getattr__` and rename `__values__` to `__dict__` on `BaseModel`, deprecation warning on use `__values__` attr, attributes access speed increased up to 14 times, [#712](https://github.com/pydantic/pydantic/pull/712) by @Bobronium * support `ForwardRef` (without self-referencing annotations) in Python 3.6, [#706](https://github.com/pydantic/pydantic/pull/706) by @koxudaxi * implement `schema_extra` in `Config` sub-class, [#663](https://github.com/pydantic/pydantic/pull/663) by @tiangolo ## v0.31.1 (2019-07-31) * fix json generation for `EnumError`, [#697](https://github.com/pydantic/pydantic/pull/697) by @dmontagu * update numerous dependencies ## v0.31 (2019-07-24) * better support for floating point `multiple_of` values, [#652](https://github.com/pydantic/pydantic/pull/652) by @justindujardin * fix schema generation for `NewType` and `Literal`, [#649](https://github.com/pydantic/pydantic/pull/649) by @dmontagu * fix `alias_generator` and field config conflict, [#645](https://github.com/pydantic/pydantic/pull/645) by @gmetzker and [#658](https://github.com/pydantic/pydantic/pull/658) by @Bobronium * more detailed message for `EnumError`, [#673](https://github.com/pydantic/pydantic/pull/673) by 
@dmontagu * add advanced exclude support for `dict`, `json` and `copy`, [#648](https://github.com/pydantic/pydantic/pull/648) by @Bobronium * fix bug in `GenericModel` for models with concrete parameterized fields, [#672](https://github.com/pydantic/pydantic/pull/672) by @dmontagu * add documentation for `Literal` type, [#651](https://github.com/pydantic/pydantic/pull/651) by @dmontagu * add `Config.keep_untouched` for custom descriptors support, [#679](https://github.com/pydantic/pydantic/pull/679) by @Bobronium * use `inspect.cleandoc` internally to get model description, [#657](https://github.com/pydantic/pydantic/pull/657) by @tiangolo * add `Color` to schema generation, by @euri10 * add documentation for Literal type, [#651](https://github.com/pydantic/pydantic/pull/651) by @dmontagu ## v0.30.1 (2019-07-15) * fix so nested classes which inherit and change `__init__` are correctly processed while still allowing `self` as a parameter, [#644](https://github.com/pydantic/pydantic/pull/644) by @lnaden and @dgasmith ## v0.30 (2019-07-07) * enforce single quotes in code, [#612](https://github.com/pydantic/pydantic/pull/612) by @samuelcolvin * fix infinite recursion with dataclass inheritance and `__post_init__`, [#606](https://github.com/pydantic/pydantic/pull/606) by @Hanaasagi * fix default values for `GenericModel`, [#610](https://github.com/pydantic/pydantic/pull/610) by @dmontagu * clarify that self-referencing models require Python 3.7+, [#616](https://github.com/pydantic/pydantic/pull/616) by @vlcinsky * fix truncate for types, [#611](https://github.com/pydantic/pydantic/pull/611) by @dmontagu * add `alias_generator` support, [#622](https://github.com/pydantic/pydantic/pull/622) by @Bobronium * fix unparameterized generic type schema generation, [#625](https://github.com/pydantic/pydantic/pull/625) by @dmontagu * fix schema generation with multiple/circular references to the same model, [#621](https://github.com/pydantic/pydantic/pull/621) by @tiangolo and 
@wongpat * support custom root types, [#628](https://github.com/pydantic/pydantic/pull/628) by @koxudaxi * support `self` as a field name in `parse_obj`, [#632](https://github.com/pydantic/pydantic/pull/632) by @samuelcolvin ## v0.29 (2019-06-19) * support dataclasses.InitVar, [#592](https://github.com/pydantic/pydantic/pull/592) by @pfrederiks * Updated documentation to elucidate the usage of `Union` when defining multiple types under an attribute's annotation and showcase how the type-order can affect marshalling of provided values, [#594](https://github.com/pydantic/pydantic/pull/594) by @somada141 * add `conlist` type, [#583](https://github.com/pydantic/pydantic/pull/583) by @hmvp * add support for generics, [#595](https://github.com/pydantic/pydantic/pull/595) by @dmontagu ## v0.28 (2019-06-06) * fix support for JSON Schema generation when using models with circular references in Python 3.7, [#572](https://github.com/pydantic/pydantic/pull/572) by @tiangolo * support `__post_init_post_parse__` on dataclasses, [#567](https://github.com/pydantic/pydantic/pull/567) by @sevaho * allow dumping dataclasses to JSON, [#575](https://github.com/pydantic/pydantic/pull/575) by @samuelcolvin and @DanielOberg * ORM mode, [#562](https://github.com/pydantic/pydantic/pull/562) by @samuelcolvin * fix `pydantic.compiled` on ipython, [#573](https://github.com/pydantic/pydantic/pull/573) by @dmontagu and @samuelcolvin * add `StrictBool` type, [#579](https://github.com/pydantic/pydantic/pull/579) by @cazgp ## v0.27 (2019-05-30) * **breaking change** `_pydantic_post_init` to execute dataclass' original `__post_init__` before validation, [#560](https://github.com/pydantic/pydantic/pull/560) by @HeavenVolkoff * fix handling of generic types without specified parameters, [#550](https://github.com/pydantic/pydantic/pull/550) by @dmontagu * **breaking change** (maybe): this is the first release compiled with **cython**, see the docs and please submit an issue if you run into problems ## 
v0.27.0a1 (2019-05-26) * fix JSON Schema for `list`, `tuple`, and `set`, [#540](https://github.com/pydantic/pydantic/pull/540) by @tiangolo * compiling with cython, `manylinux` binaries, some other performance improvements, [#548](https://github.com/pydantic/pydantic/pull/548) by @samuelcolvin ## v0.26 (2019-05-22) * fix to schema generation for `IPvAnyAddress`, `IPvAnyInterface`, `IPvAnyNetwork` [#498](https://github.com/pydantic/pydantic/pull/498) by @pilosus * fix variable length tuples support, [#495](https://github.com/pydantic/pydantic/pull/495) by @pilosus * fix return type hint for `create_model`, [#526](https://github.com/pydantic/pydantic/pull/526) by @dmontagu * **Breaking Change:** fix `.dict(skip_keys=True)` skipping values set via alias (this involves changing `validate_model()` to always returns `Tuple[Dict[str, Any], Set[str], Optional[ValidationError]]`), [#517](https://github.com/pydantic/pydantic/pull/517) by @sommd * fix to schema generation for `IPv4Address`, `IPv6Address`, `IPv4Interface`, `IPv6Interface`, `IPv4Network`, `IPv6Network` [#532](https://github.com/pydantic/pydantic/pull/532) by @euri10 * add `Color` type, [#504](https://github.com/pydantic/pydantic/pull/504) by @pilosus and @samuelcolvin ## v0.25 (2019-05-05) * Improve documentation on self-referencing models and annotations, [#487](https://github.com/pydantic/pydantic/pull/487) by @theenglishway * fix `.dict()` with extra keys, [#490](https://github.com/pydantic/pydantic/pull/490) by @JaewonKim * support `const` keyword in `Schema`, [#434](https://github.com/pydantic/pydantic/pull/434) by @Sean1708 ## v0.24 (2019-04-23) * fix handling `ForwardRef` in sub-types, like `Union`, [#464](https://github.com/pydantic/pydantic/pull/464) by @tiangolo * fix secret serialization, [#465](https://github.com/pydantic/pydantic/pull/465) by @atheuz * Support custom validators for dataclasses, [#454](https://github.com/pydantic/pydantic/pull/454) by @primal100 * fix `parse_obj` to cope with 
dict-like objects, [#472](https://github.com/pydantic/pydantic/pull/472) by @samuelcolvin * fix to schema generation in nested dataclass-based models, [#474](https://github.com/pydantic/pydantic/pull/474) by @NoAnyLove * fix `json` for `Path`, `FilePath`, and `DirectoryPath` objects, [#473](https://github.com/pydantic/pydantic/pull/473) by @mikegoodspeed ## v0.23 (2019-04-04) * improve documentation for contributing section, [#441](https://github.com/pydantic/pydantic/pull/441) by @pilosus * improve README.rst to include essential information about the package, [#446](https://github.com/pydantic/pydantic/pull/446) by @pilosus * `IntEnum` support, [#444](https://github.com/pydantic/pydantic/pull/444) by @potykion * fix PyObject callable value, [#409](https://github.com/pydantic/pydantic/pull/409) by @pilosus * fix `black` deprecation warnings after update, [#451](https://github.com/pydantic/pydantic/pull/451) by @pilosus * fix `ForwardRef` collection bug, [#450](https://github.com/pydantic/pydantic/pull/450) by @tigerwings * Support specialized `ClassVars`, [#455](https://github.com/pydantic/pydantic/pull/455) by @tyrylu * fix JSON serialization for `ipaddress` types, [#333](https://github.com/pydantic/pydantic/pull/333) by @pilosus * add `SecretStr` and `SecretBytes` types, [#452](https://github.com/pydantic/pydantic/pull/452) by @atheuz ## v0.22 (2019-03-29) * add `IPv{4,6,Any}Network` and `IPv{4,6,Any}Interface` types from `ipaddress` stdlib, [#333](https://github.com/pydantic/pydantic/pull/333) by @pilosus * add docs for `datetime` types, [#386](https://github.com/pydantic/pydantic/pull/386) by @pilosus * fix to schema generation in dataclass-based models, [#408](https://github.com/pydantic/pydantic/pull/408) by @pilosus * fix path in nested models, [#437](https://github.com/pydantic/pydantic/pull/437) by @kataev * add `Sequence` support, [#304](https://github.com/pydantic/pydantic/pull/304) by @pilosus ## v0.21.0 (2019-03-15) * fix typo in 
`NoneIsNotAllowedError` message, [#414](https://github.com/pydantic/pydantic/pull/414) by @YaraslauZhylko * add `IPvAnyAddress`, `IPv4Address` and `IPv6Address` types, [#333](https://github.com/pydantic/pydantic/pull/333) by @pilosus ## v0.20.1 (2019-02-26) * fix type hints of `parse_obj` and similar methods, [#405](https://github.com/pydantic/pydantic/pull/405) by @erosennin * fix submodel validation, [#403](https://github.com/pydantic/pydantic/pull/403) by @samuelcolvin * correct type hints for `ValidationError.json`, [#406](https://github.com/pydantic/pydantic/pull/406) by @layday ## v0.20.0 (2019-02-18) * fix tests for Python 3.8, [#396](https://github.com/pydantic/pydantic/pull/396) by @samuelcolvin * Adds fields to the `dir` method for autocompletion in interactive sessions, [#398](https://github.com/pydantic/pydantic/pull/398) by @dgasmith * support `ForwardRef` (and therefore `from __future__ import annotations`) with dataclasses, [#397](https://github.com/pydantic/pydantic/pull/397) by @samuelcolvin ## v0.20.0a1 (2019-02-13) * **breaking change** (maybe): more sophisticated argument parsing for validators, any subset of `values`, `config` and `field` is now permitted, eg. 
`(cls, value, field)`, however the variadic key word argument ("`**kwargs`") **must** be called `kwargs`, [#388](https://github.com/pydantic/pydantic/pull/388) by @samuelcolvin * **breaking change**: Adds `skip_defaults` argument to `BaseModel.dict()` to allow skipping of fields that were not explicitly set, signature of `Model.construct()` changed, [#389](https://github.com/pydantic/pydantic/pull/389) by @dgasmith * add `py.typed` marker file for PEP-561 support, [#391](https://github.com/pydantic/pydantic/pull/391) by @je-l * Fix `extra` behaviour for multiple inheritance/mix-ins, [#394](https://github.com/pydantic/pydantic/pull/394) by @YaraslauZhylko ## v0.19.0 (2019-02-04) * Support `Callable` type hint, fix [#279](https://github.com/pydantic/pydantic/pull/279) by @proofit404 * Fix schema for fields with `validator` decorator, fix [#375](https://github.com/pydantic/pydantic/pull/375) by @tiangolo * Add `multiple_of` constraint to `ConstrainedDecimal`, `ConstrainedFloat`, `ConstrainedInt` and their related types `condecimal`, `confloat`, and `conint` [#371](https://github.com/pydantic/pydantic/pull/371), thanks @StephenBrown2 * Deprecated `ignore_extra` and `allow_extra` Config fields in favor of `extra`, [#352](https://github.com/pydantic/pydantic/pull/352) by @liiight * Add type annotations to all functions, test fully with mypy, [#373](https://github.com/pydantic/pydantic/pull/373) by @samuelcolvin * fix for 'missing' error with `validate_all` or `validate_always`, [#381](https://github.com/pydantic/pydantic/pull/381) by @samuelcolvin * Change the second/millisecond watershed for date/datetime parsing to `2e10`, [#385](https://github.com/pydantic/pydantic/pull/385) by @samuelcolvin ## v0.18.2 (2019-01-22) * Fix to schema generation with `Optional` fields, fix [#361](https://github.com/pydantic/pydantic/pull/361) by @samuelcolvin ## v0.18.1 (2019-01-17) * add `ConstrainedBytes` and `conbytes` types, [#315](https://github.com/pydantic/pydantic/pull/315) @Gr1N 
* adding `MANIFEST.in` to include license in package `.tar.gz`, [#358](https://github.com/pydantic/pydantic/pull/358) by @samuelcolvin ## v0.18.0 (2019-01-13) * **breaking change**: don't call validators on keys of dictionaries, [#254](https://github.com/pydantic/pydantic/pull/254) by @samuelcolvin * Fix validators with `always=True` when the default is `None` or the type is optional, also prevent `whole` validators being called for sub-fields, fix [#132](https://github.com/pydantic/pydantic/pull/132) by @samuelcolvin * improve documentation for settings priority and allow it to be easily changed, [#343](https://github.com/pydantic/pydantic/pull/343) by @samuelcolvin * fix `ignore_extra=False` and `allow_population_by_alias=True`, fix [#257](https://github.com/pydantic/pydantic/pull/257) by @samuelcolvin * **breaking change**: Set `BaseConfig` attributes `min_anystr_length` and `max_anystr_length` to `None` by default, fix [#349](https://github.com/pydantic/pydantic/pull/349) in [#350](https://github.com/pydantic/pydantic/pull/350) by @tiangolo * add support for postponed annotations, [#348](https://github.com/pydantic/pydantic/pull/348) by @samuelcolvin ## v0.17.0 (2018-12-27) * fix schema for `timedelta` as number, [#325](https://github.com/pydantic/pydantic/pull/325) by @tiangolo * prevent validators being called repeatedly after inheritance, [#327](https://github.com/pydantic/pydantic/pull/327) by @samuelcolvin * prevent duplicate validator check in ipython, fix [#312](https://github.com/pydantic/pydantic/pull/312) by @samuelcolvin * add "Using Pydantic" section to docs, [#323](https://github.com/pydantic/pydantic/pull/323) by @tiangolo & [#326](https://github.com/pydantic/pydantic/pull/326) by @samuelcolvin * fix schema generation for fields annotated as `: dict`, `: list`, `: tuple` and `: set`, [#330](https://github.com/pydantic/pydantic/pull/330) & [#335](https://github.com/pydantic/pydantic/pull/335) by @nkonin * add support for constrained strings as dict 
keys in schema, [#332](https://github.com/pydantic/pydantic/pull/332) by @tiangolo * support for passing Config class in dataclasses decorator, [#276](https://github.com/pydantic/pydantic/pull/276) by @jarekkar (**breaking change**: this supersedes the `validate_assignment` argument with `config`) * support for nested dataclasses, [#334](https://github.com/pydantic/pydantic/pull/334) by @samuelcolvin * better errors when getting an `ImportError` with `PyObject`, [#309](https://github.com/pydantic/pydantic/pull/309) by @samuelcolvin * rename `get_validators` to `__get_validators__`, deprecation warning on use of old name, [#338](https://github.com/pydantic/pydantic/pull/338) by @samuelcolvin * support `ClassVar` by excluding such attributes from fields, [#184](https://github.com/pydantic/pydantic/pull/184) by @samuelcolvin ## v0.16.1 (2018-12-10) * fix `create_model` to correctly use the passed `__config__`, [#320](https://github.com/pydantic/pydantic/pull/320) by @hugoduncan ## v0.16.0 (2018-12-03) * **breaking change**: refactor schema generation to be compatible with JSON Schema and OpenAPI specs, [#308](https://github.com/pydantic/pydantic/pull/308) by @tiangolo * add `schema` to `schema` module to generate top-level schemas from base models, [#308](https://github.com/pydantic/pydantic/pull/308) by @tiangolo * add additional fields to `Schema` class to declare validation for `str` and numeric values, [#311](https://github.com/pydantic/pydantic/pull/311) by @tiangolo * rename `_schema` to `schema` on fields, [#318](https://github.com/pydantic/pydantic/pull/318) by @samuelcolvin * add `case_insensitive` option to `BaseSettings` `Config`, [#277](https://github.com/pydantic/pydantic/pull/277) by @jasonkuhrt ## v0.15.0 (2018-11-18) * move codebase to use black, [#287](https://github.com/pydantic/pydantic/pull/287) by @samuelcolvin * fix alias use in settings, [#286](https://github.com/pydantic/pydantic/pull/286) by @jasonkuhrt and @samuelcolvin * fix datetime parsing 
in `parse_date`, [#298](https://github.com/pydantic/pydantic/pull/298) by @samuelcolvin
* allow dataclass inheritance, fix [#293](https://github.com/pydantic/pydantic/pull/293) by @samuelcolvin
* fix `PyObject = None`, fix [#305](https://github.com/pydantic/pydantic/pull/305) by @samuelcolvin
* allow `Pattern` type, fix [#303](https://github.com/pydantic/pydantic/pull/303) by @samuelcolvin

## v0.14.0 (2018-10-02)

* dataclasses decorator, [#269](https://github.com/pydantic/pydantic/pull/269) by @Gaunt and @samuelcolvin

## v0.13.1 (2018-09-21)

* fix issue where int_validator doesn't cast a `bool` to an `int` [#264](https://github.com/pydantic/pydantic/pull/264) by @nphyatt
* add deep copy support for `BaseModel.copy()` [#249](https://github.com/pydantic/pydantic/pull/249), @gangefors

## v0.13.0 (2018-08-25)

* raise an exception if a field's name shadows an existing `BaseModel` attribute [#242](https://github.com/pydantic/pydantic/pull/242)
* add `UrlStr` and `urlstr` types [#236](https://github.com/pydantic/pydantic/pull/236)
* timedelta json encoding ISO8601 and total seconds, custom json encoders [#247](https://github.com/pydantic/pydantic/pull/247), by @cfkanesan and @samuelcolvin
* allow `timedelta` objects as values for properties of type `timedelta` (matches `datetime` etc. behavior) [#247](https://github.com/pydantic/pydantic/pull/247)

## v0.12.1 (2018-07-31)

* fix schema generation for fields defined using `typing.Any` [#237](https://github.com/pydantic/pydantic/pull/237)

## v0.12.0 (2018-07-31)

* add `by_alias` argument in `.dict()` and `.json()` model methods [#205](https://github.com/pydantic/pydantic/pull/205)
* add Json type support [#214](https://github.com/pydantic/pydantic/pull/214)
* support tuples [#227](https://github.com/pydantic/pydantic/pull/227)
* major improvements and changes to schema [#213](https://github.com/pydantic/pydantic/pull/213)

## v0.11.2 (2018-07-05)

* add `NewType` support [#115](https://github.com/pydantic/pydantic/pull/115)
* fix `list`, `set` & `tuple` validation [#225](https://github.com/pydantic/pydantic/pull/225)
* separate out `validate_model` method, allow errors to be returned along with valid values [#221](https://github.com/pydantic/pydantic/pull/221)

## v0.11.1 (2018-07-02)

* support Python 3.7 [#216](https://github.com/pydantic/pydantic/pull/216), thanks @layday
* Allow arbitrary types in model [#209](https://github.com/pydantic/pydantic/pull/209), thanks @oldPadavan

## v0.11.0 (2018-06-28)

* make `list`, `tuple` and `set` types stricter [#86](https://github.com/pydantic/pydantic/pull/86)
* **breaking change**: remove msgpack parsing [#201](https://github.com/pydantic/pydantic/pull/201)
* add `FilePath` and `DirectoryPath` types [#10](https://github.com/pydantic/pydantic/pull/10)
* model schema generation [#190](https://github.com/pydantic/pydantic/pull/190)
* JSON serialization of models and schemas [#133](https://github.com/pydantic/pydantic/pull/133)

## v0.10.0 (2018-06-11)

* add `Config.allow_population_by_alias` [#160](https://github.com/pydantic/pydantic/pull/160), thanks @bendemaree
* **breaking change**: new errors format [#179](https://github.com/pydantic/pydantic/pull/179), thanks @Gr1N
* **breaking change**: removed `Config.min_number_size` and `Config.max_number_size` [#183](https://github.com/pydantic/pydantic/pull/183), thanks @Gr1N
* **breaking change**: correct behaviour of `lt` and `gt` arguments to `conint` etc. [#188](https://github.com/pydantic/pydantic/pull/188), for the old behaviour use `le` and `ge` [#194](https://github.com/pydantic/pydantic/pull/194), thanks @jaheba
* added error context and ability to redefine error message templates using `Config.error_msg_templates` [#183](https://github.com/pydantic/pydantic/pull/183), thanks @Gr1N
* fix typo in validator exception [#150](https://github.com/pydantic/pydantic/pull/150)
* copy defaults to model values, so different models don't share objects [#154](https://github.com/pydantic/pydantic/pull/154)

## v0.9.1 (2018-05-10)

* allow custom `get_field_config` on config classes [#159](https://github.com/pydantic/pydantic/pull/159)
* add `UUID1`, `UUID3`, `UUID4` and `UUID5` types [#167](https://github.com/pydantic/pydantic/pull/167), thanks @Gr1N
* modify some inconsistent docstrings and annotations [#173](https://github.com/pydantic/pydantic/pull/173), thanks @YannLuo
* fix type annotations for exotic types [#171](https://github.com/pydantic/pydantic/pull/171), thanks @Gr1N
* Reuse type validators in exotic types [#171](https://github.com/pydantic/pydantic/pull/171)
* scheduled monthly requirements updates [#168](https://github.com/pydantic/pydantic/pull/168)
* add `Decimal`, `ConstrainedDecimal` and `condecimal` types [#170](https://github.com/pydantic/pydantic/pull/170), thanks @Gr1N

## v0.9.0 (2018-04-28)

* tweak email-validator import error message [#145](https://github.com/pydantic/pydantic/pull/145)
* fix parse error of `parse_date()` and `parse_datetime()` when input is 0 [#144](https://github.com/pydantic/pydantic/pull/144), thanks @YannLuo
* add `Config.anystr_strip_whitespace` and `strip_whitespace` kwarg to `constr`, by default value is `False` [#163](https://github.com/pydantic/pydantic/pull/163), thanks @Gr1N
* add `ConstrainedFloat`, `confloat`, `PositiveFloat` and `NegativeFloat` types [#166](https://github.com/pydantic/pydantic/pull/166), thanks @Gr1N

## v0.8.0 (2018-03-25)

* fix type annotation for `inherit_config` [#139](https://github.com/pydantic/pydantic/pull/139)
* **breaking change**: check for invalid field names in validators [#140](https://github.com/pydantic/pydantic/pull/140)
* validate attributes of parent models [#141](https://github.com/pydantic/pydantic/pull/141)
* **breaking change**: email validation now uses [email-validator](https://github.com/JoshData/python-email-validator) [#142](https://github.com/pydantic/pydantic/pull/142)

## v0.7.1 (2018-02-07)

* fix bug with `create_model` modifying the base class

## v0.7.0 (2018-02-06)

* added compatibility with abstract base classes (ABCs) [#123](https://github.com/pydantic/pydantic/pull/123)
* add `create_model` method [#113](https://github.com/pydantic/pydantic/pull/113) [#125](https://github.com/pydantic/pydantic/pull/125)
* **breaking change**: rename `.config` to `.__config__` on a model
* **breaking change**: remove deprecated `.values()` on a model, use `.dict()` instead
* remove use of `OrderedDict` and use simple dict [#126](https://github.com/pydantic/pydantic/pull/126)
* add `Config.use_enum_values` [#127](https://github.com/pydantic/pydantic/pull/127)
* add wildcard validators of the form `@validate('*')` [#128](https://github.com/pydantic/pydantic/pull/128)

## v0.6.4 (2018-02-01)

* allow Python date and times objects [#122](https://github.com/pydantic/pydantic/pull/122)

## v0.6.3 (2017-11-26)

* fix direct install without `README.rst` present

## v0.6.2 (2017-11-13)

* errors for invalid validator use
* safer check for complex models in `Settings`

## v0.6.1 (2017-11-08)

* prevent duplicate validators, [#101](https://github.com/pydantic/pydantic/pull/101)
* add `always` kwarg to validators, [#102](https://github.com/pydantic/pydantic/pull/102)

## v0.6.0 (2017-11-07)

* assignment validation [#94](https://github.com/pydantic/pydantic/pull/94), thanks petroswork!
* JSON in environment variables for complex types, [#96](https://github.com/pydantic/pydantic/pull/96)
* add `validator` decorators for complex validation, [#97](https://github.com/pydantic/pydantic/pull/97)
* deprecate `values(...)` and replace with `.dict(...)`, [#99](https://github.com/pydantic/pydantic/pull/99)

## v0.5.0 (2017-10-23)

* add `UUID` validation [#89](https://github.com/pydantic/pydantic/pull/89)
* remove `index` and `track` from error object (json) if they're null [#90](https://github.com/pydantic/pydantic/pull/90)
* improve the error text when a list is provided rather than a dict [#90](https://github.com/pydantic/pydantic/pull/90)
* add benchmarks table to docs [#91](https://github.com/pydantic/pydantic/pull/91)

## v0.4.0 (2017-07-08)

* show length in string validation error
* fix aliases in config during inheritance [#55](https://github.com/pydantic/pydantic/pull/55)
* simplify error display
* use unicode ellipsis in `truncate`
* add `parse_obj`, `parse_raw` and `parse_file` helper functions [#58](https://github.com/pydantic/pydantic/pull/58)
* switch annotation only fields to come first in fields list not last

## v0.3.0 (2017-06-21)

* immutable models via `config.allow_mutation = False`, associated cleanup and performance improvement [#44](https://github.com/pydantic/pydantic/pull/44)
* immutable helper methods `construct()` and `copy()` [#53](https://github.com/pydantic/pydantic/pull/53)
* allow pickling of models [#53](https://github.com/pydantic/pydantic/pull/53)
* `setattr` is removed as `__setattr__` is now intelligent [#44](https://github.com/pydantic/pydantic/pull/44)
* `raise_exception` removed, Models now always raise exceptions [#44](https://github.com/pydantic/pydantic/pull/44)
* instance method validators removed
* django-restful-framework benchmarks added [#47](https://github.com/pydantic/pydantic/pull/47)
* fix inheritance bug [#49](https://github.com/pydantic/pydantic/pull/49)
* make `str` type stricter so `list`, `dict` etc. are not coerced to strings [#52](https://github.com/pydantic/pydantic/pull/52)
* add `StrictStr` which only allows strings as input [#52](https://github.com/pydantic/pydantic/pull/52)

## v0.2.1 (2017-06-07)

* pypi and travis together messed up the deploy of `v0.2`, this should fix it

## v0.2.0 (2017-06-07)

* **breaking change**: `values()` on a model is now a method not a property, takes `include` and `exclude` arguments
* allow annotation only fields to support mypy
* add pretty `to_string(pretty=True)` method for models

## v0.1.0 (2017-06-03)

* add docs
* add history

pydantic-2.10.6/LICENSE

The MIT License (MIT)

Copyright (c) 2017 to present Pydantic Services Inc. and individual contributors.

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
pydantic-2.10.6/Makefile

# .DEFAULT_GOAL := all
sources = pydantic tests docs/plugins

.PHONY: .uv  ## Check that uv is installed
.uv:
    @uv -V || echo 'Please install uv: https://docs.astral.sh/uv/getting-started/installation/'

.PHONY: .pre-commit  ## Check that pre-commit is installed
.pre-commit: .uv
    @uv run pre-commit -V || uv pip install pre-commit

.PHONY: install  ## Install the package, dependencies, and pre-commit for local development
install: .uv
    uv sync --frozen --group all --all-extras
    uv pip install pre-commit
    pre-commit install --install-hooks

.PHONY: rebuild-lockfiles  ## Rebuild lockfiles from scratch, updating all dependencies
rebuild-lockfiles: .uv
    uv lock --upgrade

.PHONY: format  ## Auto-format python source files
format: .uv
    uv run ruff check --fix $(sources)
    uv run ruff format $(sources)

.PHONY: lint  ## Lint python source files
lint: .uv
    uv run ruff check $(sources)
    uv run ruff format --check $(sources)

.PHONY: codespell  ## Use Codespell to do spellchecking
codespell: .pre-commit
    pre-commit run codespell --all-files

.PHONY: typecheck  ## Perform type-checking
typecheck: .pre-commit
    pre-commit run typecheck --all-files

.PHONY: test-mypy  ## Run the mypy integration tests
test-mypy: .uv
    uv run coverage run -m pytest tests/mypy --test-mypy

.PHONY: test-mypy-update  ## Update the mypy integration tests for the current mypy version
test-mypy-update: .uv
    uv run coverage run -m pytest tests/mypy --test-mypy --update-mypy

.PHONY: test-mypy-update-all  ## Update the mypy integration tests for all mypy versions
test-mypy-update-all: .uv
    rm -rf tests/mypy/outputs
    uv pip install mypy==1.10.1 && make test-mypy-update
    uv pip install mypy==1.11.2 && make test-mypy-update
    uv pip install mypy==1.12.0 && make test-mypy-update

.PHONY: test-typechecking-pyright  ## Typechecking integration tests (Pyright)
test-typechecking-pyright: .uv
    uv run bash -c 'cd tests/typechecking && pyright --version && pyright -p pyproject.toml'

.PHONY: test-typechecking-mypy  ## Typechecking integration tests (Mypy). Not to be confused with `test-mypy`.
test-typechecking-mypy: .uv
    uv run bash -c 'cd tests/typechecking && mypy --version && mypy --cache-dir=/dev/null --config-file pyproject.toml .'

.PHONY: test  ## Run all tests, skipping the type-checker integration tests
test: .uv
    uv run coverage run -m pytest --durations=10

.PHONY: benchmark  ## Run all benchmarks
benchmark: .uv
    uv run coverage run -m pytest --durations=10 --benchmark-enable tests/benchmarks

.PHONY: testcov  ## Run tests and generate a coverage report, skipping the type-checker integration tests
testcov: test
    @echo "building coverage html"
    @uv run coverage html
    @echo "building coverage lcov"
    @uv run coverage lcov

.PHONY: test-examples  ## Run only the tests from the documentation
test-examples: .uv
    @echo "running examples"
    @find docs/examples -type f -name '*.py' | xargs -I'{}' sh -c 'uv run python {} >/dev/null 2>&1 || (echo "{} failed")'

.PHONY: test-fastapi  ## Run the FastAPI tests with this version of pydantic
test-fastapi:
    git clone https://github.com/tiangolo/fastapi.git --single-branch
    ./tests/test_fastapi.sh

.PHONY: test-pydantic-settings  ## Run the pydantic-settings tests with this version of pydantic
test-pydantic-settings: .uv
    git clone https://github.com/pydantic/pydantic-settings.git --single-branch
    bash ./tests/test_pydantic_settings.sh

.PHONY: test-pydantic-extra-types  ## Run the pydantic-extra-types tests with this version of pydantic
test-pydantic-extra-types: .uv
    git clone https://github.com/pydantic/pydantic-extra-types.git --single-branch
    bash ./tests/test_pydantic_extra_types.sh

.PHONY: test-no-docs  # Run all tests except the docs tests
test-no-docs: .uv
    uv run pytest tests --ignore=tests/test_docs.py

.PHONY: all  ## Run the standard set of checks performed in CI
all: lint typecheck codespell testcov

.PHONY: clean  ## Clear local caches and build artifacts
clean:
    rm -rf `find . -name __pycache__`
    rm -f `find . -type f -name '*.py[co]'`
    rm -f `find . -type f -name '*~'`
    rm -f `find . -type f -name '.*~'`
    rm -rf .cache
    rm -rf .pytest_cache
    rm -rf .ruff_cache
    rm -rf htmlcov
    rm -rf *.egg-info
    rm -f .coverage
    rm -f .coverage.*
    rm -rf build
    rm -rf dist
    rm -rf site
    rm -rf docs/_build
    rm -rf docs/.changelog.md docs/.version.md docs/.tmp_schema_mappings.html
    rm -rf fastapi/test.db
    rm -rf coverage.xml

.PHONY: docs  ## Generate the docs
docs:
    uv run mkdocs build --strict

.PHONY: help  ## Display this message
help:
    @grep -E \
        '^.PHONY: .*?## .*$$' $(MAKEFILE_LIST) | \
        sort | \
        awk 'BEGIN {FS = ".PHONY: |## "}; {printf "\033[36m%-19s\033[0m %s\n", $$2, $$3}'

.PHONY: update-v1  ## Update V1 namespace
update-v1:
    uv run ./update_v1.sh

pydantic-2.10.6/README.md

# Pydantic

[![CI](https://img.shields.io/github/actions/workflow/status/pydantic/pydantic/ci.yml?branch=main&logo=github&label=CI)](https://github.com/pydantic/pydantic/actions?query=event%3Apush+branch%3Amain+workflow%3ACI)
[![Coverage](https://coverage-badge.samuelcolvin.workers.dev/pydantic/pydantic.svg)](https://coverage-badge.samuelcolvin.workers.dev/redirect/pydantic/pydantic)
[![pypi](https://img.shields.io/pypi/v/pydantic.svg)](https://pypi.python.org/pypi/pydantic)
[![CondaForge](https://img.shields.io/conda/v/conda-forge/pydantic.svg)](https://anaconda.org/conda-forge/pydantic)
[![downloads](https://static.pepy.tech/badge/pydantic/month)](https://pepy.tech/project/pydantic)
[![versions](https://img.shields.io/pypi/pyversions/pydantic.svg)](https://github.com/pydantic/pydantic)
[![license](https://img.shields.io/github/license/pydantic/pydantic.svg)](https://github.com/pydantic/pydantic/blob/main/LICENSE)
[![Pydantic v2](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/pydantic/pydantic/main/docs/badge/v2.json)](https://docs.pydantic.dev/latest/contributing/#badges)

Data validation using Python
type hints. Fast and extensible, Pydantic plays nicely with your linters/IDE/brain. Define how data should be in pure, canonical Python 3.8+; validate it with Pydantic.

## Pydantic Logfire :fire:

We've recently launched Pydantic Logfire to help you monitor your applications. [Learn more](https://pydantic.dev/articles/logfire-announcement)

## Pydantic V1.10 vs. V2

Pydantic V2 is a ground-up rewrite that offers many new features, performance improvements, and some breaking changes compared to Pydantic V1.

If you're using Pydantic V1 you may want to look at the [pydantic V1.10 Documentation](https://docs.pydantic.dev/) or, [`1.10.X-fixes` git branch](https://github.com/pydantic/pydantic/tree/1.10.X-fixes). Pydantic V2 also ships with the latest version of Pydantic V1 built in so that you can incrementally upgrade your code base and projects: `from pydantic import v1 as pydantic_v1`.

## Help

See [documentation](https://docs.pydantic.dev/) for more details.

## Installation

Install using `pip install -U pydantic` or `conda install pydantic -c conda-forge`. For more installation options to make Pydantic even faster, see the [Install](https://docs.pydantic.dev/install/) section in the documentation.

## A Simple Example

```python
from datetime import datetime
from typing import List, Optional

from pydantic import BaseModel


class User(BaseModel):
    id: int
    name: str = 'John Doe'
    signup_ts: Optional[datetime] = None
    friends: List[int] = []


external_data = {'id': '123', 'signup_ts': '2017-06-01 12:22', 'friends': [1, '2', b'3']}
user = User(**external_data)
print(user)
#> User id=123 name='John Doe' signup_ts=datetime.datetime(2017, 6, 1, 12, 22) friends=[1, 2, 3]
print(user.id)
#> 123
```

## Contributing

For guidance on setting up a development environment and how to make a contribution to Pydantic, see [Contributing to Pydantic](https://docs.pydantic.dev/contributing/).

## Reporting a Security Vulnerability

See our [security policy](https://github.com/pydantic/pydantic/security/policy).

pydantic-2.10.6/build-docs.sh

#!/usr/bin/env bash

# This script is used to build the documentation on CloudFlare Pages, this is just used for build previews
# A different script with the same name exists on the `docs-site` branch (where pre-built docs live).

set -e
set -x

python3 -V
python3 -m pip install --user uv

python3 -m uv sync --python 3.12 --group docs --frozen
python3 -m uv run python -c 'import docs.plugins.main'

# Adding local symlinks gets nice source locations like
#   pydantic_core/core_schema.py
# instead of
#   .venv/lib/python3.10/site-packages/pydantic_core/core_schema.py
ln -s .venv/lib/python*/site-packages/pydantic_core pydantic_core
ln -s .venv/lib/python*/site-packages/pydantic_settings pydantic_settings
ln -s .venv/lib/python*/site-packages/pydantic_extra_types pydantic_extra_types

python3 -m uv run --no-sync mkdocs build

pydantic-2.10.6/docs/api/aliases.md

::: pydantic.aliases

pydantic-2.10.6/docs/api/annotated_handlers.md

::: pydantic.annotated_handlers

pydantic-2.10.6/docs/api/base_model.md

Pydantic models are simply classes which inherit from `BaseModel` and define fields as annotated attributes.
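As a minimal sketch of that idea (assuming Pydantic V2 is installed; the `Reading` model and its field names are hypothetical, invented purely for illustration):

```python
from pydantic import BaseModel, ValidationError


class Reading(BaseModel):  # hypothetical model: fields are just annotated attributes
    sensor_id: int
    celsius: float = 0.0  # a default makes the field optional


# lax mode coerces numeric strings to the annotated types
r = Reading(sensor_id='7', celsius='21.5')
print(r.sensor_id, r.celsius)
#> 7 21.5

try:
    Reading(sensor_id='not-a-number')  # uncoercible input raises ValidationError
except ValidationError as exc:
    print(exc.error_count())
#> 1
```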
::: pydantic.BaseModel
    options:
        show_root_heading: true
        merge_init_into_class: false
        group_by_category: false
        # explicit members list so we can set order and include `__init__` easily
        members:
            - __init__
            - model_config
            - model_computed_fields
            - model_extra
            - model_fields
            - model_fields_set
            - __pydantic_core_schema__
            - model_construct
            - model_copy
            - model_dump
            - model_dump_json
            - model_json_schema
            - model_parametrized_name
            - model_post_init
            - model_rebuild
            - model_validate
            - model_validate_json
            - model_validate_strings
            - copy

::: pydantic.create_model
    options:
        show_root_heading: true

pydantic-2.10.6/docs/api/config.md

::: pydantic.config
    options:
        group_by_category: false
        members:
            - ConfigDict
            - with_config
            - ExtraValues
            - BaseConfig

::: pydantic.alias_generators
    options:
        show_root_heading: true

pydantic-2.10.6/docs/api/dataclasses.md

::: pydantic.dataclasses

pydantic-2.10.6/docs/api/errors.md

::: pydantic.errors

pydantic-2.10.6/docs/api/experimental.md

::: pydantic.experimental.pipeline
    options:
        members:
            - _Pipeline

pydantic-2.10.6/docs/api/fields.md

::: pydantic.fields
    options:
        group_by_category: false
        members:
            - Field
            - FieldInfo
            - PrivateAttr
            - ModelPrivateAttr
            - computed_field
            - ComputedFieldInfo

pydantic-2.10.6/docs/api/functional_serializers.md

::: pydantic.functional_serializers

pydantic-2.10.6/docs/api/functional_validators.md

::: pydantic.functional_validators

pydantic-2.10.6/docs/api/json_schema.md

::: pydantic.json_schema

pydantic-2.10.6/docs/api/networks.md

::: pydantic.networks

pydantic-2.10.6/docs/api/pydantic_core.md

::: pydantic_core
    options:
        allow_inspection: false
        show_source: false
        members:
            - SchemaValidator
            - SchemaSerializer
            - ValidationError
            - ErrorDetails
            - InitErrorDetails
            - SchemaError
            - PydanticCustomError
            - PydanticKnownError
            - PydanticOmit
            - PydanticUseDefault
            - PydanticSerializationError
            - PydanticSerializationUnexpectedValue
            - Url
            - MultiHostUrl
            - MultiHostHost
            - ArgsKwargs
            - Some
            - TzInfo
            - to_json
            - from_json
            - to_jsonable_python
            - list_all_errors
            - ErrorTypeInfo
            - __version__

pydantic-2.10.6/docs/api/pydantic_core_schema.md

::: pydantic_core.core_schema

pydantic-2.10.6/docs/api/pydantic_extra_types_color.md

::: pydantic_extra_types.color

pydantic-2.10.6/docs/api/pydantic_extra_types_coordinate.md

::: pydantic_extra_types.coordinate

pydantic-2.10.6/docs/api/pydantic_extra_types_country.md

::: pydantic_extra_types.country

pydantic-2.10.6/docs/api/pydantic_extra_types_currency_code.md

::: pydantic_extra_types.currency_code

pydantic-2.10.6/docs/api/pydantic_extra_types_isbn.md

::: pydantic_extra_types.isbn

pydantic-2.10.6/docs/api/pydantic_extra_types_language_code.md

::: pydantic_extra_types.language_code

pydantic-2.10.6/docs/api/pydantic_extra_types_mac_address.md

::: pydantic_extra_types.mac_address

pydantic-2.10.6/docs/api/pydantic_extra_types_payment.md

::: pydantic_extra_types.payment

pydantic-2.10.6/docs/api/pydantic_extra_types_pendulum_dt.md

::: pydantic_extra_types.pendulum_dt

pydantic-2.10.6/docs/api/pydantic_extra_types_phone_numbers.md

::: pydantic_extra_types.phone_numbers

pydantic-2.10.6/docs/api/pydantic_extra_types_routing_numbers.md

::: pydantic_extra_types.routing_number

pydantic-2.10.6/docs/api/pydantic_extra_types_script_code.md

::: pydantic_extra_types.script_code

pydantic-2.10.6/docs/api/pydantic_extra_types_semantic_version.md

::: pydantic_extra_types.semantic_version

pydantic-2.10.6/docs/api/pydantic_extra_types_timezone_name.md

::: pydantic_extra_types.timezone_name

pydantic-2.10.6/docs/api/pydantic_extra_types_ulid.md

::: pydantic_extra_types.ulid

pydantic-2.10.6/docs/api/pydantic_settings.md

::: pydantic_settings
pydantic-2.10.6/docs/api/root_model.md000066400000000000000000000000301474456633400176120ustar00rootroot00000000000000::: pydantic.root_model pydantic-2.10.6/docs/api/standard_library_types.md000066400000000000000000000705601474456633400222360ustar00rootroot00000000000000--- description: Support for common types from the Python standard library. --- Pydantic supports many common types from the Python standard library. If you need stricter processing see [Strict Types](../concepts/types.md#strict-types), including if you need to constrain the values allowed (e.g. to require a positive `int`). ## Booleans A standard `bool` field will raise a `ValidationError` if the value is not one of the following: * A valid boolean (i.e. `True` or `False`), * The integers `0` or `1`, * a `str` which when converted to lower case is one of `'0', 'off', 'f', 'false', 'n', 'no', '1', 'on', 't', 'true', 'y', 'yes'` * a `bytes` which is valid per the previous rule when decoded to `str` !!! note If you want stricter boolean logic (e.g. a field which only permits `True` and `False`) you can use [`StrictBool`](../api/types.md#pydantic.types.StrictBool). Here is a script demonstrating some of these behaviors: ```python from pydantic import BaseModel, ValidationError class BooleanModel(BaseModel): bool_value: bool print(BooleanModel(bool_value=False)) #> bool_value=False print(BooleanModel(bool_value='False')) #> bool_value=False print(BooleanModel(bool_value=1)) #> bool_value=True try: BooleanModel(bool_value=[]) except ValidationError as e: print(str(e)) """ 1 validation error for BooleanModel bool_value Input should be a valid boolean [type=bool_type, input_value=[], input_type=list] """ ``` ## Datetime Types Pydantic supports the following [datetime](https://docs.python.org/library/datetime.html#available-types) types: ### [`datetime.datetime`][] * `datetime` fields will accept values of type: * `datetime`; an existing `datetime` object * `int` or `float`; assumed as Unix time, i.e. 
seconds (if >= `-2e10` and <= `2e10`) or milliseconds (if < `-2e10`or > `2e10`) since 1 January 1970 * `str`; the following formats are accepted: * `YYYY-MM-DD[T]HH:MM[:SS[.ffffff]][Z or [±]HH[:]MM]` * `YYYY-MM-DD` is accepted in lax mode, but not in strict mode * `int` or `float` as a string (assumed as Unix time) * [`datetime.date`][] instances are accepted in lax mode, but not in strict mode ```python from datetime import datetime from pydantic import BaseModel class Event(BaseModel): dt: datetime = None event = Event(dt='2032-04-23T10:20:30.400+02:30') print(event.model_dump()) """ {'dt': datetime.datetime(2032, 4, 23, 10, 20, 30, 400000, tzinfo=TzInfo(+02:30))} """ ``` ### [`datetime.date`][] * `date` fields will accept values of type: * `date`; an existing `date` object * `int` or `float`; handled the same as described for `datetime` above * `str`; the following formats are accepted: * `YYYY-MM-DD` * `int` or `float` as a string (assumed as Unix time) ```python from datetime import date from pydantic import BaseModel class Birthday(BaseModel): d: date = None my_birthday = Birthday(d=1679616000.0) print(my_birthday.model_dump()) #> {'d': datetime.date(2023, 3, 24)} ``` ### [`datetime.time`][] * `time` fields will accept values of type: * `time`; an existing `time` object * `str`; the following formats are accepted: * `HH:MM[:SS[.ffffff]][Z or [±]HH[:]MM]` ```python from datetime import time from pydantic import BaseModel class Meeting(BaseModel): t: time = None m = Meeting(t=time(4, 8, 16)) print(m.model_dump()) #> {'t': datetime.time(4, 8, 16)} ``` ### [`datetime.timedelta`][] * `timedelta` fields will accept values of type: * `timedelta`; an existing `timedelta` object * `int` or `float`; assumed to be seconds * `str`; the following formats are accepted: * `[-][[DD]D,]HH:MM:SS[.ffffff]` * Ex: `'1d,01:02:03.000004'` or `'1D01:02:03.000004'` or `'01:02:03'` * `[±]P[DD]DT[HH]H[MM]M[SS]S` ([ISO 8601](https://en.wikipedia.org/wiki/ISO_8601) format for timedelta) 
```python from datetime import timedelta from pydantic import BaseModel class Model(BaseModel): td: timedelta = None m = Model(td='P3DT12H30M5S') print(m.model_dump()) #> {'td': datetime.timedelta(days=3, seconds=45005)} ``` ## Number Types Pydantic supports the following numeric types from the Python standard library: ### [`int`][] * Pydantic uses `int(v)` to coerce types to an `int`; see [Data conversion](../concepts/models.md#data-conversion) for details on loss of information during data conversion. ### [`float`][] * Pydantic uses `float(v)` to coerce values to floats. ### [`enum.IntEnum`][] * Validation: Pydantic checks that the value is a valid `IntEnum` instance. * Validation for subclass of `enum.IntEnum`: checks that the value is a valid member of the integer enum; see [Enums and Choices](#enum) for more details. ### [`decimal.Decimal`][] * Validation: Pydantic attempts to convert the value to a string, then passes the string to `Decimal(v)`. * Serialization: Pydantic serializes [`Decimal`][decimal.Decimal] types as strings. You can use a custom serializer to override this behavior if desired. For example: ```python from decimal import Decimal from typing_extensions import Annotated from pydantic import BaseModel, PlainSerializer class Model(BaseModel): x: Decimal y: Annotated[ Decimal, PlainSerializer( lambda x: float(x), return_type=float, when_used='json' ), ] my_model = Model(x=Decimal('1.1'), y=Decimal('2.1')) print(my_model.model_dump()) # (1)! #> {'x': Decimal('1.1'), 'y': Decimal('2.1')} print(my_model.model_dump(mode='json')) # (2)! #> {'x': '1.1', 'y': 2.1} print(my_model.model_dump_json()) # (3)! #> {"x":"1.1","y":2.1} ``` 1. Using [`model_dump`][pydantic.main.BaseModel.model_dump], both `x` and `y` remain instances of the `Decimal` type 2. Using [`model_dump`][pydantic.main.BaseModel.model_dump] with `mode='json'`, `x` is serialized as a `string`, and `y` is serialized as a `float` because of the custom serializer applied. 3. 
Using [`model_dump_json`][pydantic.main.BaseModel.model_dump_json], `x` is serialized as a `string`, and `y` is serialized as a `float` because of the custom serializer applied. ### [`complex`][] * Validation: Pydantic supports `complex` types or `str` values that can be converted to a `complex` type. * Serialization: Pydantic serializes [`complex`][] types as strings. ### [`fractions.Fraction`][fractions.Fraction] * Validation: Pydantic attempts to convert the value to a `Fraction` using `Fraction(v)`. * Serialization: Pydantic serializes [`Fraction`][fractions.Fraction] types as strings. ## [`Enum`][enum.Enum] Pydantic uses Python's standard [`enum`][] classes to define choices. `enum.Enum` checks that the value is a valid `Enum` instance. Subclass of `enum.Enum` checks that the value is a valid member of the enum. ```python from enum import Enum, IntEnum from pydantic import BaseModel, ValidationError class FruitEnum(str, Enum): pear = 'pear' banana = 'banana' class ToolEnum(IntEnum): spanner = 1 wrench = 2 class CookingModel(BaseModel): fruit: FruitEnum = FruitEnum.pear tool: ToolEnum = ToolEnum.spanner print(CookingModel()) #> fruit= tool= print(CookingModel(tool=2, fruit='banana')) #> fruit= tool= try: CookingModel(fruit='other') except ValidationError as e: print(e) """ 1 validation error for CookingModel fruit Input should be 'pear' or 'banana' [type=enum, input_value='other', input_type=str] """ ``` ## Lists and Tuples ### [`list`][] Allows [`list`][], [`tuple`][], [`set`][], [`frozenset`][], [`deque`][collections.deque], or generators and casts to a [`list`][]. When a generic parameter is provided, the appropriate validation is applied to all items of the list. ### [`typing.List`][] Handled the same as `list` above. 
```python from typing import List, Optional from pydantic import BaseModel class Model(BaseModel): simple_list: Optional[list] = None list_of_ints: Optional[List[int]] = None print(Model(simple_list=['1', '2', '3']).simple_list) #> ['1', '2', '3'] print(Model(list_of_ints=['1', '2', '3']).list_of_ints) #> [1, 2, 3] ``` ### [`tuple`][] Allows [`list`][], [`tuple`][], [`set`][], [`frozenset`][], [`deque`][collections.deque], or generators and casts to a [`tuple`][]. When generic parameters are provided, the appropriate validation is applied to the respective items of the tuple ### [`typing.Tuple`][] Handled the same as `tuple` above. ```python from typing import Optional, Tuple from pydantic import BaseModel class Model(BaseModel): simple_tuple: Optional[tuple] = None tuple_of_different_types: Optional[Tuple[int, float, bool]] = None print(Model(simple_tuple=[1, 2, 3, 4]).simple_tuple) #> (1, 2, 3, 4) print(Model(tuple_of_different_types=[3, 2, 1]).tuple_of_different_types) #> (3, 2.0, True) ``` ### [`typing.NamedTuple`][] Subclasses of [`typing.NamedTuple`][] are similar to `tuple`, but create instances of the given `namedtuple` class. Subclasses of [`collections.namedtuple`][] are similar to subclass of [`typing.NamedTuple`][], but since field types are not specified, all fields are treated as having type [`Any`][typing.Any]. ```python from typing import NamedTuple from pydantic import BaseModel, ValidationError class Point(NamedTuple): x: int y: int class Model(BaseModel): p: Point try: Model(p=('1.3', '2')) except ValidationError as e: print(e) """ 1 validation error for Model p.0 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='1.3', input_type=str] """ ``` ## Deque ### [`deque`][collections.deque] Allows [`list`][], [`tuple`][], [`set`][], [`frozenset`][], [`deque`][collections.deque], or generators and casts to a [`deque`][collections.deque]. 
When generic parameters are provided, the appropriate validation is applied to the respective items of the `deque`. ### [`typing.Deque`][] Handled the same as `deque` above. ```python from typing import Deque, Optional from pydantic import BaseModel class Model(BaseModel): deque: Optional[Deque[int]] = None print(Model(deque=[1, 2, 3]).deque) #> deque([1, 2, 3]) ``` ## Sets ### [`set`][] Allows [`list`][], [`tuple`][], [`set`][], [`frozenset`][], [`deque`][collections.deque], or generators and casts to a [`set`][]. When a generic parameter is provided, the appropriate validation is applied to all items of the set. ### [`typing.Set`][] Handled the same as `set` above. ```python from typing import Optional, Set from pydantic import BaseModel class Model(BaseModel): simple_set: Optional[set] = None set_of_ints: Optional[Set[int]] = None print(Model(simple_set={'1', '2', '3'}).simple_set) #> {'1', '2', '3'} print(Model(simple_set=['1', '2', '3']).simple_set) #> {'1', '2', '3'} print(Model(set_of_ints=['1', '2', '3']).set_of_ints) #> {1, 2, 3} ``` ### [`frozenset`][] Allows [`list`][], [`tuple`][], [`set`][], [`frozenset`][], [`deque`][collections.deque], or generators and casts to a [`frozenset`][]. When a generic parameter is provided, the appropriate validation is applied to all items of the frozen set. ### [`typing.FrozenSet`][] Handled the same as `frozenset` above. 
```python
from typing import FrozenSet, Optional

from pydantic import BaseModel


class Model(BaseModel):
    simple_frozenset: Optional[frozenset] = None
    frozenset_of_ints: Optional[FrozenSet[int]] = None


m1 = Model(simple_frozenset=['1', '2', '3'])
print(type(m1.simple_frozenset))
#> <class 'frozenset'>
print(sorted(m1.simple_frozenset))
#> ['1', '2', '3']

m2 = Model(frozenset_of_ints=['1', '2', '3'])
print(type(m2.frozenset_of_ints))
#> <class 'frozenset'>
print(sorted(m2.frozenset_of_ints))
#> [1, 2, 3]
```

## Other Iterables

### [`typing.Sequence`][]

This is intended for use when the provided value should meet the requirements of the `Sequence` ABC, and it is desirable to do eager validation of the values in the container.
Note that when validation must be performed on the values of the container, the type of the container may not be preserved since validation may end up replacing values.
We guarantee that the validated value will be a valid [`typing.Sequence`][], but it may have a different type than was provided (generally, it will become a `list`).

### [`typing.Iterable`][]

This is intended for use when the provided value may be an iterable that shouldn't be consumed.
See [Infinite Generators](#infinite-generators) below for more detail on parsing and validation.
Similar to [`typing.Sequence`][], we guarantee that the validated result will be a valid [`typing.Iterable`][], but it may have a different type than was provided.
In particular, even if a non-generator type such as a `list` is provided, the post-validation value of a field of type [`typing.Iterable`][] will be a generator.
Here is a simple example using [`typing.Sequence`][]: ```python from typing import Sequence from pydantic import BaseModel class Model(BaseModel): sequence_of_ints: Sequence[int] = None print(Model(sequence_of_ints=[1, 2, 3, 4]).sequence_of_ints) #> [1, 2, 3, 4] print(Model(sequence_of_ints=(1, 2, 3, 4)).sequence_of_ints) #> (1, 2, 3, 4) ``` ### Infinite Generators If you have a generator you want to validate, you can still use `Sequence` as described above. In that case, the generator will be consumed and stored on the model as a list and its values will be validated against the type parameter of the `Sequence` (e.g. `int` in `Sequence[int]`). However, if you have a generator that you _don't_ want to be eagerly consumed (e.g. an infinite generator or a remote data loader), you can use a field of type [`Iterable`][typing.Iterable]: ```python from typing import Iterable from pydantic import BaseModel class Model(BaseModel): infinite: Iterable[int] def infinite_ints(): i = 0 while True: yield i i += 1 m = Model(infinite=infinite_ints()) print(m) """ infinite=ValidatorIterator(index=0, schema=Some(Int(IntValidator { strict: false }))) """ for i in m.infinite: print(i) #> 0 #> 1 #> 2 #> 3 #> 4 #> 5 #> 6 #> 7 #> 8 #> 9 #> 10 if i == 10: break ``` !!! warning During initial validation, `Iterable` fields only perform a simple check that the provided argument is iterable. To prevent it from being consumed, no validation of the yielded values is performed eagerly. 
Though the yielded values are not validated eagerly, they are still validated when yielded, and will raise a `ValidationError` at yield time when appropriate:

```python
from typing import Iterable

from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    int_iterator: Iterable[int]


def my_iterator():
    yield 13
    yield '27'
    yield 'a'


m = Model(int_iterator=my_iterator())
print(next(m.int_iterator))
#> 13
print(next(m.int_iterator))
#> 27
try:
    next(m.int_iterator)
except ValidationError as e:
    print(e)
    """
    1 validation error for ValidatorIterator
    2
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str]
    """
```

## Mapping Types

### [`dict`][]

`dict(v)` is used to attempt to convert a dictionary. See [`typing.Dict`][] below for sub-type constraints.

```python
from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: dict


m = Model(x={'foo': 1})
print(m.model_dump())
#> {'x': {'foo': 1}}

try:
    Model(x='test')
except ValidationError as e:
    print(e)
    """
    1 validation error for Model
    x
      Input should be a valid dictionary [type=dict_type, input_value='test', input_type=str]
    """
```

### [`typing.Dict`][]

```python
from typing import Dict

from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: Dict[str, int]


m = Model(x={'foo': 1})
print(m.model_dump())
#> {'x': {'foo': 1}}

try:
    Model(x='test')
except ValidationError as e:
    print(e)
    """
    1 validation error for Model
    x
      Input should be a valid dictionary [type=dict_type, input_value='test', input_type=str]
    """
```

### TypedDict

!!! note
    This is a new feature of the Python standard library as of Python 3.8.
    Because of limitations in [typing.TypedDict][] before 3.12, the [typing-extensions](https://pypi.org/project/typing-extensions/) package is required for Python <3.12. You'll need to import `TypedDict` from `typing_extensions` instead of `typing` and will get a build time error if you don't.
[`TypedDict`][typing.TypedDict] declares a dictionary type that expects all of its instances to have a certain set of keys, where each key is associated with a value of a consistent type.

It is the same as [`dict`][], but Pydantic will validate the dictionary since keys are annotated.

```python
from typing_extensions import TypedDict

from pydantic import TypeAdapter, ValidationError


class User(TypedDict):
    name: str
    id: int


ta = TypeAdapter(User)

print(ta.validate_python({'name': 'foo', 'id': 1}))
#> {'name': 'foo', 'id': 1}

try:
    ta.validate_python({'name': 'foo'})
except ValidationError as e:
    print(e)
    """
    1 validation error for typed-dict
    id
      Field required [type=missing, input_value={'name': 'foo'}, input_type=dict]
    """
```

You can define `__pydantic_config__` to change the model inherited from [`TypedDict`][typing.TypedDict].
See the [`ConfigDict` API reference][pydantic.config.ConfigDict] for more details.

```python
from typing import Optional

from typing_extensions import TypedDict

from pydantic import ConfigDict, TypeAdapter, ValidationError


# `total=False` means keys are non-required
class UserIdentity(TypedDict, total=False):
    name: Optional[str]
    surname: str


class User(TypedDict):
    __pydantic_config__ = ConfigDict(extra='forbid')

    identity: UserIdentity
    age: int


ta = TypeAdapter(User)

print(
    ta.validate_python(
        {'identity': {'name': 'Smith', 'surname': 'John'}, 'age': 37}
    )
)
#> {'identity': {'name': 'Smith', 'surname': 'John'}, 'age': 37}

print(
    ta.validate_python(
        {'identity': {'name': None, 'surname': 'John'}, 'age': 37}
    )
)
#> {'identity': {'name': None, 'surname': 'John'}, 'age': 37}

print(ta.validate_python({'identity': {}, 'age': 37}))
#> {'identity': {}, 'age': 37}

try:
    ta.validate_python(
        {'identity': {'name': ['Smith'], 'surname': 'John'}, 'age': 24}
    )
except ValidationError as e:
    print(e)
    """
    1 validation error for typed-dict
    identity.name
      Input should be a valid string [type=string_type, input_value=['Smith'], input_type=list]
    """

try:
    ta.validate_python(
        {
            'identity': {'name': 'Smith', 'surname': 'John'},
            'age': '37',
            'email': 'john.smith@me.com',
        }
    )
except ValidationError as e:
    print(e)
    """
    1 validation error for typed-dict
    email
      Extra inputs are not permitted [type=extra_forbidden, input_value='john.smith@me.com', input_type=str]
    """
```

## Callable

See below for more detail on parsing and validation.

Fields can also be of type [`Callable`][typing.Callable]:

```python
from typing import Callable

from pydantic import BaseModel


class Foo(BaseModel):
    callback: Callable[[int], int]


m = Foo(callback=lambda x: x)
print(m)
#> callback=<function <lambda> at 0x0123456789ab>
```

!!! warning
    Callable fields only perform a simple check that the argument is callable; no validation of arguments, their types, or the return type is performed.

## IP Address Types

* [`ipaddress.IPv4Address`][]: Uses the type itself for validation by passing the value to `IPv4Address(v)`.
* [`ipaddress.IPv4Interface`][]: Uses the type itself for validation by passing the value to `IPv4Interface(v)`.
* [`ipaddress.IPv4Network`][]: Uses the type itself for validation by passing the value to `IPv4Network(v)`.
* [`ipaddress.IPv6Address`][]: Uses the type itself for validation by passing the value to `IPv6Address(v)`.
* [`ipaddress.IPv6Interface`][]: Uses the type itself for validation by passing the value to `IPv6Interface(v)`.
* [`ipaddress.IPv6Network`][]: Uses the type itself for validation by passing the value to `IPv6Network(v)`.

See [Network Types](../api/networks.md) for other custom IP address types.

## UUID

For UUID, Pydantic tries to use the type itself for validation by passing the value to `UUID(v)`.
There's a fallback to `UUID(bytes=v)` for `bytes` and `bytearray`.

In case you want to constrain the UUID version, you can check the following types:

* [`UUID1`][pydantic.types.UUID1]: requires UUID version 1.
* [`UUID3`][pydantic.types.UUID3]: requires UUID version 3.
* [`UUID4`][pydantic.types.UUID4]: requires UUID version 4.
* [`UUID5`][pydantic.types.UUID5]: requires UUID version 5.

## Union

Pydantic has extensive support for union validation; both [`typing.Union`][] and Python 3.10's pipe syntax (`A | B`) are supported.
Read more in the [`Unions`](../concepts/unions.md) section of the concepts docs.

## [`Type`][typing.Type] and [`TypeVar`][typing.TypeVar]

### [`type`][]

Pydantic supports the use of `type[T]` to specify that a field may only accept classes (not instances) that are subclasses of `T`.

### [`typing.Type`][]

Handled the same as `type` above.

```python
from typing import Type

from pydantic import BaseModel, ValidationError


class Foo:
    pass


class Bar(Foo):
    pass


class Other:
    pass


class SimpleModel(BaseModel):
    just_subclasses: Type[Foo]


SimpleModel(just_subclasses=Foo)
SimpleModel(just_subclasses=Bar)
try:
    SimpleModel(just_subclasses=Other)
except ValidationError as e:
    print(e)
    """
    1 validation error for SimpleModel
    just_subclasses
      Input should be a subclass of Foo [type=is_subclass_of, input_value=<class '__main__.Other'>, input_type=type]
    """
```

You may also use `Type` to specify that any class is allowed.

```python {upgrade="skip"}
from typing import Type

from pydantic import BaseModel, ValidationError


class Foo:
    pass


class LenientSimpleModel(BaseModel):
    any_class_goes: Type


LenientSimpleModel(any_class_goes=int)
LenientSimpleModel(any_class_goes=Foo)
try:
    LenientSimpleModel(any_class_goes=Foo())
except ValidationError as e:
    print(e)
    """
    1 validation error for LenientSimpleModel
    any_class_goes
      Input should be a type [type=is_type, input_value=<__main__.Foo object at 0x0123456789ab>, input_type=Foo]
    """
```

### [`typing.TypeVar`][]

[`TypeVar`][typing.TypeVar] is supported either unconstrained, constrained, or with a bound.
```python
from typing import TypeVar

from pydantic import BaseModel

Foobar = TypeVar('Foobar')
BoundFloat = TypeVar('BoundFloat', bound=float)
IntStr = TypeVar('IntStr', int, str)


class Model(BaseModel):
    a: Foobar  # equivalent of ": Any"
    b: BoundFloat  # equivalent of ": float"
    c: IntStr  # equivalent of ": Union[int, str]"


print(Model(a=[1], b=4.2, c='x'))
#> a=[1] b=4.2 c='x'

# a may be None
print(Model(a=None, b=1, c=1))
#> a=None b=1.0 c=1
```

## None Types

[`None`][], `type(None)`, or `Literal[None]` are all equivalent according to [the typing specification](https://typing.readthedocs.io/en/latest/spec/special-types.html#none).
Allows only the `None` value.

## Strings

- [`str`][]: Strings are accepted as-is.
- [`bytes`][] and [`bytearray`][] are converted using the [`decode()`][bytes.decode] method.
- Enums inheriting from [`str`][] are converted using the [`value`][enum.Enum.value] attribute.

All other types cause an error.

!!! warning "Strings aren't Sequences"

    While instances of `str` are technically valid instances of the `Sequence[str]` protocol from a type-checker's point of view, this is frequently not intended and is a common source of bugs.
As a result, Pydantic raises a `ValidationError` if you attempt to pass a `str` or `bytes` instance into a field of type `Sequence[str]` or `Sequence[bytes]`: ```python from typing import Optional, Sequence from pydantic import BaseModel, ValidationError class Model(BaseModel): sequence_of_strs: Optional[Sequence[str]] = None sequence_of_bytes: Optional[Sequence[bytes]] = None print(Model(sequence_of_strs=['a', 'bc']).sequence_of_strs) #> ['a', 'bc'] print(Model(sequence_of_strs=('a', 'bc')).sequence_of_strs) #> ('a', 'bc') print(Model(sequence_of_bytes=[b'a', b'bc']).sequence_of_bytes) #> [b'a', b'bc'] print(Model(sequence_of_bytes=(b'a', b'bc')).sequence_of_bytes) #> (b'a', b'bc') try: Model(sequence_of_strs='abc') except ValidationError as e: print(e) """ 1 validation error for Model sequence_of_strs 'str' instances are not allowed as a Sequence value [type=sequence_str, input_value='abc', input_type=str] """ try: Model(sequence_of_bytes=b'abc') except ValidationError as e: print(e) """ 1 validation error for Model sequence_of_bytes 'bytes' instances are not allowed as a Sequence value [type=sequence_str, input_value=b'abc', input_type=bytes] """ ``` ## Bytes [`bytes`][] are accepted as-is. [`bytearray`][] is converted using `bytes(v)`. `str` are converted using `v.encode()`. `int`, `float`, and `Decimal` are coerced using `str(v).encode()`. See [ByteSize](types.md#pydantic.types.ByteSize) for more details. 
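As a quick sketch of the coercion rules above (the field name is illustrative, and this assumes the default lax mode):

```python
from pydantic import BaseModel


class Model(BaseModel):
    data: bytes


# str values are encoded to bytes via v.encode()
print(Model(data='hello').data)
#> b'hello'
# bytearray is converted with bytes(v)
print(Model(data=bytearray(b'world')).data)
#> b'world'
```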
## [`typing.Literal`][] Pydantic supports the use of [`typing.Literal`][] as a lightweight way to specify that a field may accept only specific literal values: ```python from typing import Literal from pydantic import BaseModel, ValidationError class Pie(BaseModel): flavor: Literal['apple', 'pumpkin'] Pie(flavor='apple') Pie(flavor='pumpkin') try: Pie(flavor='cherry') except ValidationError as e: print(str(e)) """ 1 validation error for Pie flavor Input should be 'apple' or 'pumpkin' [type=literal_error, input_value='cherry', input_type=str] """ ``` One benefit of this field type is that it can be used to check for equality with one or more specific values without needing to declare custom validators: ```python from typing import ClassVar, List, Literal, Union from pydantic import BaseModel, ValidationError class Cake(BaseModel): kind: Literal['cake'] required_utensils: ClassVar[List[str]] = ['fork', 'knife'] class IceCream(BaseModel): kind: Literal['icecream'] required_utensils: ClassVar[List[str]] = ['spoon'] class Meal(BaseModel): dessert: Union[Cake, IceCream] print(type(Meal(dessert={'kind': 'cake'}).dessert).__name__) #> Cake print(type(Meal(dessert={'kind': 'icecream'}).dessert).__name__) #> IceCream try: Meal(dessert={'kind': 'pie'}) except ValidationError as e: print(str(e)) """ 2 validation errors for Meal dessert.Cake.kind Input should be 'cake' [type=literal_error, input_value='pie', input_type=str] dessert.IceCream.kind Input should be 'icecream' [type=literal_error, input_value='pie', input_type=str] """ ``` With proper ordering in an annotated `Union`, you can use this to parse types of decreasing specificity: ```python from typing import Literal, Optional, Union from pydantic import BaseModel class Dessert(BaseModel): kind: str class Pie(Dessert): kind: Literal['pie'] flavor: Optional[str] class ApplePie(Pie): flavor: Literal['apple'] class PumpkinPie(Pie): flavor: Literal['pumpkin'] class Meal(BaseModel): dessert: Union[ApplePie, PumpkinPie, Pie, 
Dessert] print(type(Meal(dessert={'kind': 'pie', 'flavor': 'apple'}).dessert).__name__) #> ApplePie print(type(Meal(dessert={'kind': 'pie', 'flavor': 'pumpkin'}).dessert).__name__) #> PumpkinPie print(type(Meal(dessert={'kind': 'pie'}).dessert).__name__) #> Dessert print(type(Meal(dessert={'kind': 'cake'}).dessert).__name__) #> Dessert ``` ## [`typing.Any`][] Allows any value, including `None`. ## [`typing.Hashable`][] * From Python, supports any data that passes an `isinstance(v, Hashable)` check. * From JSON, first loads the data via an `Any` validator, then checks if the data is hashable with `isinstance(v, Hashable)`. ## [`typing.Annotated`][] Allows wrapping another type with arbitrary metadata, as per [PEP-593](https://www.python.org/dev/peps/pep-0593/). The `Annotated` hint may contain a single call to the [`Field` function](../concepts/types.md#composing-types-via-annotated), but otherwise the additional metadata is ignored and the root type is used. ## [`typing.Pattern`][] Will cause the input value to be passed to `re.compile(v)` to create a regular expression pattern. ## [`pathlib.Path`][] Simply uses the type itself for validation by passing the value to `Path(v)`. 
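To illustrate the `Path` behavior described above, here is a minimal sketch (the field and file names are illustrative):

```python
from pathlib import Path

from pydantic import BaseModel


class Model(BaseModel):
    p: Path


# The string input is passed to Path(v) during validation
m = Model(p='data.txt')
print(m.p)
#> data.txt
print(isinstance(m.p, Path))
#> True
```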
An alias is an alternative name for a field, used when serializing and deserializing data.
You can specify an alias in the following ways: * `alias` on the [`Field`][pydantic.fields.Field] * must be a `str` * `validation_alias` on the [`Field`][pydantic.fields.Field] * can be an instance of `str`, [`AliasPath`][pydantic.aliases.AliasPath], or [`AliasChoices`][pydantic.aliases.AliasChoices] * `serialization_alias` on the [`Field`][pydantic.fields.Field] * must be a `str` * `alias_generator` on the [`Config`][pydantic.config.ConfigDict.alias_generator] * can be a callable or an instance of [`AliasGenerator`][pydantic.aliases.AliasGenerator] For examples of how to use `alias`, `validation_alias`, and `serialization_alias`, see [Field aliases](../concepts/fields.md#field-aliases). ## `AliasPath` and `AliasChoices` ??? api "API Documentation" [`pydantic.aliases.AliasPath`][pydantic.aliases.AliasPath]
[`pydantic.aliases.AliasChoices`][pydantic.aliases.AliasChoices]
Pydantic provides two special types for convenience when using `validation_alias`: `AliasPath` and `AliasChoices`. The `AliasPath` is used to specify a path to a field using aliases. For example: ```python {lint="skip"} from pydantic import BaseModel, Field, AliasPath class User(BaseModel): first_name: str = Field(validation_alias=AliasPath('names', 0)) last_name: str = Field(validation_alias=AliasPath('names', 1)) user = User.model_validate({'names': ['John', 'Doe']}) # (1)! print(user) #> first_name='John' last_name='Doe' ``` 1. We are using `model_validate` to validate a dictionary using the field aliases. You can see more details about [`model_validate`][pydantic.main.BaseModel.model_validate] in the API reference. In the `'first_name'` field, we are using the alias `'names'` and the index `0` to specify the path to the first name. In the `'last_name'` field, we are using the alias `'names'` and the index `1` to specify the path to the last name. `AliasChoices` is used to specify a choice of aliases. For example: ```python {lint="skip"} from pydantic import BaseModel, Field, AliasChoices class User(BaseModel): first_name: str = Field(validation_alias=AliasChoices('first_name', 'fname')) last_name: str = Field(validation_alias=AliasChoices('last_name', 'lname')) user = User.model_validate({'fname': 'John', 'lname': 'Doe'}) # (1)! print(user) #> first_name='John' last_name='Doe' user = User.model_validate({'first_name': 'John', 'lname': 'Doe'}) # (2)! print(user) #> first_name='John' last_name='Doe' ``` 1. We are using the second alias choice for both fields. 2. We are using the first alias choice for the field `'first_name'` and the second alias choice for the field `'last_name'`. 
You can also use `AliasChoices` with `AliasPath`: ```python {lint="skip"} from pydantic import BaseModel, Field, AliasPath, AliasChoices class User(BaseModel): first_name: str = Field(validation_alias=AliasChoices('first_name', AliasPath('names', 0))) last_name: str = Field(validation_alias=AliasChoices('last_name', AliasPath('names', 1))) user = User.model_validate({'first_name': 'John', 'last_name': 'Doe'}) print(user) #> first_name='John' last_name='Doe' user = User.model_validate({'names': ['John', 'Doe']}) print(user) #> first_name='John' last_name='Doe' user = User.model_validate({'names': ['John'], 'last_name': 'Doe'}) print(user) #> first_name='John' last_name='Doe' ``` ## Using alias generators You can use the `alias_generator` parameter of [`Config`][pydantic.config.ConfigDict.alias_generator] to specify a callable (or group of callables, via `AliasGenerator`) that will generate aliases for all fields in a model. This is useful if you want to use a consistent naming convention for all fields in a model, but do not want to specify the alias for each field individually. !!! note Pydantic offers three built-in alias generators that you can use out of the box: [`to_pascal`][pydantic.alias_generators.to_pascal]
[`to_camel`][pydantic.alias_generators.to_camel]
[`to_snake`][pydantic.alias_generators.to_snake]
### Using a callable Here's a basic example using a callable: ```python from pydantic import BaseModel, ConfigDict class Tree(BaseModel): model_config = ConfigDict( alias_generator=lambda field_name: field_name.upper() ) age: int height: float kind: str t = Tree.model_validate({'AGE': 12, 'HEIGHT': 1.2, 'KIND': 'oak'}) print(t.model_dump(by_alias=True)) #> {'AGE': 12, 'HEIGHT': 1.2, 'KIND': 'oak'} ``` ### Using an `AliasGenerator` ??? api "API Documentation" [`pydantic.aliases.AliasGenerator`][pydantic.aliases.AliasGenerator]
`AliasGenerator` is a class that allows you to specify multiple alias generators for a model. You can use an `AliasGenerator` to specify different alias generators for validation and serialization. This is particularly useful if you need to use different naming conventions for loading and saving data, but you don't want to specify the validation and serialization aliases for each field individually. For example: ```python from pydantic import AliasGenerator, BaseModel, ConfigDict class Tree(BaseModel): model_config = ConfigDict( alias_generator=AliasGenerator( validation_alias=lambda field_name: field_name.upper(), serialization_alias=lambda field_name: field_name.title(), ) ) age: int height: float kind: str t = Tree.model_validate({'AGE': 12, 'HEIGHT': 1.2, 'KIND': 'oak'}) print(t.model_dump(by_alias=True)) #> {'Age': 12, 'Height': 1.2, 'Kind': 'oak'} ``` ## Alias Precedence If you specify an `alias` on the [`Field`][pydantic.fields.Field], it will take precedence over the generated alias by default: ```python from pydantic import BaseModel, ConfigDict, Field def to_camel(string: str) -> str: return ''.join(word.capitalize() for word in string.split('_')) class Voice(BaseModel): model_config = ConfigDict(alias_generator=to_camel) name: str language_code: str = Field(alias='lang') voice = Voice(Name='Filiz', lang='tr-TR') print(voice.language_code) #> tr-TR print(voice.model_dump(by_alias=True)) #> {'Name': 'Filiz', 'lang': 'tr-TR'} ``` ### Alias Priority You may set `alias_priority` on a field to change this behavior: * `alias_priority=2` the alias will *not* be overridden by the alias generator. * `alias_priority=1` the alias *will* be overridden by the alias generator. * `alias_priority` not set: * alias is set: the alias will *not* be overridden by the alias generator. * alias is not set: the alias *will* be overridden by the alias generator. The same precedence applies to `validation_alias` and `serialization_alias`. 
See more about the different field aliases under [field aliases](../concepts/fields.md#field-aliases).

The behaviour of Pydantic can be controlled via the [`BaseModel.model_config`][pydantic.BaseModel.model_config] attribute, and as an argument to [`TypeAdapter`][pydantic.TypeAdapter].

!!! note
    Before **v2.0**, the `Config` class was used. This is still supported, but **deprecated**.

```python
from pydantic import BaseModel, ConfigDict, ValidationError


class Model(BaseModel):
    model_config = ConfigDict(str_max_length=10)

    v: str


try:
    m = Model(v='x' * 20)
except ValidationError as e:
    print(e)
    """
    1 validation error for Model
    v
      String should have at most 10 characters [type=string_too_long, input_value='xxxxxxxxxxxxxxxxxxxx', input_type=str]
    """
```

Also, you can specify config options as model class kwargs:

```python
from pydantic import BaseModel, ValidationError


class Model(BaseModel, extra='forbid'):  # (1)!
    a: str


try:
    Model(a='spam', b='oh no')
except ValidationError as e:
    print(e)
    """
    1 validation error for Model
    b
      Extra inputs are not permitted [type=extra_forbidden, input_value='oh no', input_type=str]
    """
```

1. See the [Extra Attributes](models.md#extra-fields) section for more details.
Similarly, if using the [`@dataclass`][pydantic.dataclasses] decorator from Pydantic: ```python from datetime import datetime from pydantic import ConfigDict, ValidationError from pydantic.dataclasses import dataclass config = ConfigDict(str_max_length=10, validate_assignment=True) @dataclass(config=config) class User: id: int name: str = 'John Doe' signup_ts: datetime = None user = User(id='42', signup_ts='2032-06-21T12:00') try: user.name = 'x' * 20 except ValidationError as e: print(e) """ 1 validation error for User name String should have at most 10 characters [type=string_too_long, input_value='xxxxxxxxxxxxxxxxxxxx', input_type=str] """ ``` ## Configuration with `dataclass` from the standard library or `TypedDict` If using the `dataclass` from the standard library or `TypedDict`, you should use `__pydantic_config__` instead. ```python from dataclasses import dataclass from datetime import datetime from pydantic import ConfigDict @dataclass class User: __pydantic_config__ = ConfigDict(strict=True) id: int name: str = 'John Doe' signup_ts: datetime = None ``` Alternatively, the [`with_config`][pydantic.config.with_config] decorator can be used to comply with type checkers. 
```python
from typing_extensions import TypedDict

from pydantic import ConfigDict, with_config


@with_config(ConfigDict(str_to_lower=True))
class Model(TypedDict):
    x: str
```

## Change behaviour globally

If you wish to change the behaviour of Pydantic globally, you can create your own custom `BaseModel` with custom `model_config` since the config is inherited:

```python
from pydantic import BaseModel, ConfigDict


class Parent(BaseModel):
    model_config = ConfigDict(extra='allow')


class Model(Parent):
    x: str


m = Model(x='foo', y='bar')
print(m.model_dump())
#> {'x': 'foo', 'y': 'bar'}
```

If you add a `model_config` to the `Model` class, it will _merge_ with the `model_config` from `Parent`:

```python
from pydantic import BaseModel, ConfigDict


class Parent(BaseModel):
    model_config = ConfigDict(extra='allow')


class Model(Parent):
    model_config = ConfigDict(str_to_lower=True)

    x: str


m = Model(x='FOO', y='bar')
print(m.model_dump())
#> {'x': 'foo', 'y': 'bar'}
print(m.model_config)
#> {'extra': 'allow', 'str_to_lower': True}
```

The following table provides details on how Pydantic converts data during validation in both strict and lax modes.

The "Strict" column contains checkmarks for type conversions that are allowed when validating in [Strict Mode](strict_mode.md).

=== "All"

    {{ conversion_table_all }}

=== "JSON"

    {{ conversion_table_json }}

=== "JSON - Strict"

    {{ conversion_table_json_strict }}

=== "Python"

    {{ conversion_table_python }}

=== "Python - Strict"

    {{ conversion_table_python_strict }}

??? api "API Documentation"

    [`pydantic.dataclasses.dataclass`][pydantic.dataclasses.dataclass]
If you don't want to use Pydantic's [`BaseModel`][pydantic.BaseModel] you can instead get the same data validation on standard [dataclasses][dataclasses]. ```python from datetime import datetime from typing import Optional from pydantic.dataclasses import dataclass @dataclass class User: id: int name: str = 'John Doe' signup_ts: Optional[datetime] = None user = User(id='42', signup_ts='2032-06-21T12:00') print(user) """ User(id=42, name='John Doe', signup_ts=datetime.datetime(2032, 6, 21, 12, 0)) """ ``` !!! note Keep in mind that Pydantic dataclasses are **not** a replacement for [Pydantic models](../concepts/models.md). They provide a similar functionality to stdlib dataclasses with the addition of Pydantic validation. There are cases where subclassing using Pydantic models is the better choice. For more information and discussion see [pydantic/pydantic#710](https://github.com/pydantic/pydantic/issues/710). Similarities between Pydantic dataclasses and models include support for: * [Configuration](#dataclass-config) support * [Nested](./models.md#nested-models) classes * [Generics](./models.md#generic-models) Some differences between Pydantic dataclasses and models include: * [validators](#validators-and-initialization-hooks) * The behavior with the [`extra`][pydantic.ConfigDict.extra] configuration value Similarly to Pydantic models, arguments used to instantiate the dataclass are [copied](./models.md#attribute-copies). To make use of the [various methods](./models.md#model-methods-and-properties) to validate, dump and generate a JSON Schema, you can wrap the dataclass with a [`TypeAdapter`][pydantic.type_adapter.TypeAdapter] and make use of its methods. 
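For instance, a minimal sketch of wrapping a Pydantic dataclass in a `TypeAdapter` (the class and field names are illustrative):

```python
from pydantic import TypeAdapter
from pydantic.dataclasses import dataclass


@dataclass
class User:
    id: int
    name: str = 'John Doe'


ta = TypeAdapter(User)

# Validate a dict into a dataclass instance, coercing '42' to 42
user = ta.validate_python({'id': '42'})
print(user)
#> User(id=42, name='John Doe')

# Serialize the instance back to JSON bytes
print(ta.dump_json(user))
#> b'{"id":42,"name":"John Doe"}'
```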
You can use both Pydantic's [`Field()`][pydantic.Field] and the stdlib's [`field()`][dataclasses.field] functions:

```python
import dataclasses
from typing import List, Optional

from pydantic import Field, TypeAdapter
from pydantic.dataclasses import dataclass


@dataclass
class User:
    id: int
    name: str = 'John Doe'
    friends: List[int] = dataclasses.field(default_factory=lambda: [0])
    age: Optional[int] = dataclasses.field(
        default=None,
        metadata={'title': 'The age of the user', 'description': 'do not lie!'},
    )
    height: Optional[int] = Field(None, title='The height in cm', ge=50, le=300)


user = User(id='42')
print(TypeAdapter(User).json_schema())
"""
{
    'properties': {
        'id': {'title': 'Id', 'type': 'integer'},
        'name': {'default': 'John Doe', 'title': 'Name', 'type': 'string'},
        'friends': {
            'items': {'type': 'integer'},
            'title': 'Friends',
            'type': 'array',
        },
        'age': {
            'anyOf': [{'type': 'integer'}, {'type': 'null'}],
            'default': None,
            'description': 'do not lie!',
            'title': 'The age of the user',
        },
        'height': {
            'anyOf': [
                {'maximum': 300, 'minimum': 50, 'type': 'integer'},
                {'type': 'null'},
            ],
            'default': None,
            'title': 'The height in cm',
        },
    },
    'required': ['id'],
    'title': 'User',
    'type': 'object',
}
"""
```

The Pydantic `@dataclass` decorator accepts the same arguments as the standard decorator, with the addition of a `config` parameter.

## Dataclass config

If you want to modify the configuration like you would with a [`BaseModel`][pydantic.BaseModel], you have two options:

* Use the `config` argument of the decorator.
* Define the configuration with the `__pydantic_config__` attribute.

```python
from pydantic import ConfigDict
from pydantic.dataclasses import dataclass


# Option 1 -- using the decorator argument:
@dataclass(config=ConfigDict(validate_assignment=True))  # (1)!
class MyDataclass1:
    a: int


# Option 2 -- using an attribute:
@dataclass
class MyDataclass2:
    a: int

    __pydantic_config__ = ConfigDict(validate_assignment=True)
```

1. You can read more about `validate_assignment` in the [API reference][pydantic.config.ConfigDict.validate_assignment].

!!! note
    While Pydantic dataclasses support the [`extra`][pydantic.config.ConfigDict.extra] configuration value, some default
    behavior of stdlib dataclasses may prevail. For example, any extra fields present on a Pydantic dataclass with
    [`extra`][pydantic.config.ConfigDict.extra] set to `'allow'` are omitted in the dataclass' string representation.

## Rebuilding dataclass schema

The [`rebuild_dataclass()`][pydantic.dataclasses.rebuild_dataclass] function can be used to rebuild the core schema of the
dataclass. See the [rebuilding model schema](./models.md#rebuilding-model-schema) section for more details.

## Stdlib dataclasses and Pydantic dataclasses

### Inherit from stdlib dataclasses

Stdlib dataclasses (nested or not) can also be inherited and Pydantic will automatically validate all the inherited fields.

```python
import dataclasses

import pydantic


@dataclasses.dataclass
class Z:
    z: int


@dataclasses.dataclass
class Y(Z):
    y: int = 0


@pydantic.dataclasses.dataclass
class X(Y):
    x: int = 0


foo = X(x=b'1', y='2', z='3')
print(foo)
#> X(z=3, y=2, x=1)

try:
    X(z='pika')
except pydantic.ValidationError as e:
    print(e)
    """
    1 validation error for X
    z
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='pika', input_type=str]
    """
```

### Usage of stdlib dataclasses with `BaseModel`

When a standard library dataclass is used within a Pydantic model, a Pydantic dataclass or a [`TypeAdapter`][pydantic.TypeAdapter],
validation will be applied (and the [configuration](#dataclass-config) stays the same). This means that using a stdlib or a
Pydantic dataclass as a field annotation is functionally equivalent.
```python
import dataclasses
from typing import Optional

from pydantic import BaseModel, ConfigDict, ValidationError


@dataclasses.dataclass(frozen=True)
class User:
    name: str


class Foo(BaseModel):
    # Required so that pydantic revalidates the model attributes:
    model_config = ConfigDict(revalidate_instances='always')

    user: Optional[User] = None


# nothing is validated as expected:
user = User(name=['not', 'a', 'string'])
print(user)
#> User(name=['not', 'a', 'string'])

try:
    Foo(user=user)
except ValidationError as e:
    print(e)
    """
    1 validation error for Foo
    user.name
      Input should be a valid string [type=string_type, input_value=['not', 'a', 'string'], input_type=list]
    """

foo = Foo(user=User(name='pika'))
try:
    foo.user.name = 'bulbi'
except dataclasses.FrozenInstanceError as e:
    print(e)
    #> cannot assign to field 'name'
```

### Using custom types

As said above, validation is applied on standard library dataclasses. If you make use of custom types, you will get an error
when trying to refer to the dataclass. To circumvent the issue, you can set the
[`arbitrary_types_allowed`][pydantic.ConfigDict.arbitrary_types_allowed] configuration value on the dataclass:

```python
import dataclasses

from pydantic import BaseModel, ConfigDict
from pydantic.errors import PydanticSchemaGenerationError


class ArbitraryType:
    def __init__(self, value):
        self.value = value

    def __repr__(self):
        return f'ArbitraryType(value={self.value!r})'


@dataclasses.dataclass
class DC:
    a: ArbitraryType
    b: str


# valid as it is a stdlib dataclass without validation:
my_dc = DC(a=ArbitraryType(value=3), b='qwe')

try:

    class Model(BaseModel):
        dc: DC
        other: str

    # invalid as dc is now validated with pydantic, and ArbitraryType is not a known type
    Model(dc=my_dc, other='other')
except PydanticSchemaGenerationError as e:
    print(e.message)
    """
    Unable to generate pydantic-core schema for <class '__main__.ArbitraryType'>.
    Set `arbitrary_types_allowed=True` in the model_config to ignore this error or implement `__get_pydantic_core_schema__` on your type to fully support it.

    If you got this error by calling handler(<some type>) within `__get_pydantic_core_schema__` then you likely need to call `handler.generate_schema(<some type>)` since we do not call `__get_pydantic_core_schema__` on `<some type>` otherwise to avoid infinite recursion.
    """


# valid as we set arbitrary_types_allowed=True, and that config pushes down to the nested vanilla dataclass
class Model(BaseModel):
    model_config = ConfigDict(arbitrary_types_allowed=True)

    dc: DC
    other: str


m = Model(dc=my_dc, other='other')
print(repr(m))
#> Model(dc=DC(a=ArbitraryType(value=3), b='qwe'), other='other')
```

### Checking if a dataclass is a Pydantic dataclass

Pydantic dataclasses are still considered dataclasses, so using [`dataclasses.is_dataclass`][] will return `True`.
To check if a type is specifically a pydantic dataclass you can use the
[`is_pydantic_dataclass`][pydantic.dataclasses.is_pydantic_dataclass] function.
```python
import dataclasses

import pydantic


@dataclasses.dataclass
class StdLibDataclass:
    id: int


PydanticDataclass = pydantic.dataclasses.dataclass(StdLibDataclass)

print(dataclasses.is_dataclass(StdLibDataclass))
#> True
print(pydantic.dataclasses.is_pydantic_dataclass(StdLibDataclass))
#> False

print(dataclasses.is_dataclass(PydanticDataclass))
#> True
print(pydantic.dataclasses.is_pydantic_dataclass(PydanticDataclass))
#> True
```

## Validators and initialization hooks

Validators also work with Pydantic dataclasses:

```python
from pydantic import field_validator
from pydantic.dataclasses import dataclass


@dataclass
class DemoDataclass:
    product_id: str  # should be a five-digit string, may have leading zeros

    @field_validator('product_id', mode='before')
    @classmethod
    def convert_int_serial(cls, v):
        if isinstance(v, int):
            v = str(v).zfill(5)
        return v


print(DemoDataclass(product_id='01234'))
#> DemoDataclass(product_id='01234')
print(DemoDataclass(product_id=2468))
#> DemoDataclass(product_id='02468')
```

The dataclass [`__post_init__()`][dataclasses.__post_init__] method is also supported, and will be called between the calls to
*before* and *after* model validators.

??? example

    ```python
    from pydantic_core import ArgsKwargs
    from typing_extensions import Self

    from pydantic import model_validator
    from pydantic.dataclasses import dataclass


    @dataclass
    class Birth:
        year: int
        month: int
        day: int


    @dataclass
    class User:
        birth: Birth

        @model_validator(mode='before')
        @classmethod
        def before(cls, values: ArgsKwargs) -> ArgsKwargs:
            print(f'First: {values}')  # (1)!
            """
            First: ArgsKwargs((), {'birth': {'year': 1995, 'month': 3, 'day': 2}})
            """
            return values

        @model_validator(mode='after')
        def after(self) -> Self:
            print(f'Third: {self}')
            #> Third: User(birth=Birth(year=1995, month=3, day=2))
            return self

        def __post_init__(self):
            print(f'Second: {self.birth}')
            #> Second: Birth(year=1995, month=3, day=2)


    user = User(**{'birth': {'year': 1995, 'month': 3, 'day': 2}})
    ```

    1. Unlike Pydantic models, the `values` parameter is of type [`ArgsKwargs`][pydantic_core.ArgsKwargs].

# Experimental Features

In this section you will find documentation for new, experimental features in Pydantic. These features are subject to change or
removal, and we are looking for feedback and suggestions before making them a permanent part of Pydantic.

See our [Version Policy](../version-policy.md#experimental-features) for more information on experimental features.

## Feedback

We welcome feedback on experimental features! Please open an issue on the
[Pydantic GitHub repository](https://github.com/pydantic/pydantic/issues/new/choose) to share your thoughts, requests, or
suggestions.

We also encourage you to read through existing feedback and add your thoughts to existing issues.

## Warnings on Import

When you import an experimental feature from the `experimental` module, you'll see a warning message that the feature is
experimental. You can disable this warning with the following:

```python
import warnings

from pydantic import PydanticExperimentalWarning

warnings.filterwarnings('ignore', category=PydanticExperimentalWarning)
```

## Pipeline API

Pydantic v2.8.0 introduced an experimental "pipeline" API that allows composing parsing (validation), constraints and
transformations in a more type-safe manner than existing APIs. This API is subject to change or removal, we are looking for
feedback and suggestions before making it a permanent part of Pydantic.

??? api "API Documentation"
    [`pydantic.experimental.pipeline`][pydantic.experimental.pipeline]
Generally, the pipeline API is used to define a sequence of steps to apply to incoming data during validation.
The pipeline API is designed to be more type-safe and composable than the existing Pydantic API.

Each step in the pipeline can be:

* A validation step that runs pydantic validation on the provided type
* A transformation step that modifies the data
* A constraint step that checks the data against a condition
* A predicate step that checks the data against a condition and raises an error if it returns `False`

Note that the following example attempts to be exhaustive at the cost of complexity: if you find yourself writing this many
transformations in type annotations you may want to consider having a `UserIn` and `UserOut` model (example below) or similar
where you make the transformations via idiomatic plain Python code.
These APIs are meant for situations where the code savings are significant and the added complexity is relatively small.

```python
from __future__ import annotations

from datetime import datetime

from typing_extensions import Annotated

from pydantic import BaseModel
from pydantic.experimental.pipeline import validate_as


class User(BaseModel):
    name: Annotated[str, validate_as(str).str_lower()]  # (1)!
    age: Annotated[int, validate_as(int).gt(0)]  # (2)!
    username: Annotated[str, validate_as(str).str_pattern(r'[a-z]+')]  # (3)!
    password: Annotated[
        str,
        validate_as(str)
        .transform(str.lower)
        .predicate(lambda x: x != 'password'),  # (4)!
    ]
    favorite_number: Annotated[  # (5)!
        int,
        (validate_as(int) | validate_as(str).str_strip().validate_as(int)).gt(
            0
        ),
    ]
    friends: Annotated[list[User], validate_as(...).len(0, 100)]  # (6)!
    bio: Annotated[
        datetime,
        validate_as(int)
        .transform(lambda x: x / 1_000_000)
        .validate_as(...),  # (7)!
    ]
```

1. Lowercase a string.
2. Constrain an integer to be greater than zero.
3. Constrain a string to match a regex pattern.
4. You can also use the lower level transform, constrain and predicate methods.
5. Use the `|` or `&` operators to combine steps (like a logical OR or AND).
6. Calling `validate_as(...)` with `Ellipsis`, `...` as the first positional argument implies `validate_as(<field type>)`.
   Use `validate_as(Any)` to accept any type.
7. You can call `validate_as()` before or after other steps to do pre or post processing.

### Mapping from `BeforeValidator`, `AfterValidator` and `WrapValidator`

The `validate_as` method is a more type-safe way to define `BeforeValidator`, `AfterValidator` and `WrapValidator`:

```python
from typing_extensions import Annotated

from pydantic.experimental.pipeline import transform, validate_as

# BeforeValidator
Annotated[int, validate_as(str).str_strip().validate_as(...)]  # (1)!
# AfterValidator
Annotated[int, transform(lambda x: x * 2)]  # (2)!
# WrapValidator
Annotated[
    int,
    validate_as(str)
    .str_strip()
    .validate_as(...)
    .transform(lambda x: x * 2),  # (3)!
]
```

1. Strip whitespace from a string before parsing it as an integer.
2. Multiply an integer by 2 after parsing it.
3. Strip whitespace from a string, validate it as an integer, then multiply it by 2.

### Alternative patterns

There are many alternative patterns to use depending on the scenario. Just as an example, consider the `UserIn` and `UserOut`
pattern mentioned above:

```python
from __future__ import annotations

from pydantic import BaseModel


class UserIn(BaseModel):
    favorite_number: int | str


class UserOut(BaseModel):
    favorite_number: int


def my_api(user: UserIn) -> UserOut:
    favorite_number = user.favorite_number
    if isinstance(favorite_number, str):
        favorite_number = int(user.favorite_number.strip())

    return UserOut(favorite_number=favorite_number)


assert my_api(UserIn(favorite_number=' 1 ')).favorite_number == 1
```

This example uses plain idiomatic Python code that may be easier to understand, type-check, etc. than the examples above.
The approach you choose should really depend on your use case.
You will have to compare verbosity, performance, ease of returning meaningful errors to your users, etc. to choose the right
pattern. Just be mindful of abusing advanced patterns like the pipeline API just because you can.

## Partial Validation

Pydantic v2.10.0 introduces experimental support for "partial validation".

This allows you to validate an incomplete JSON string, or a Python object representing incomplete input data.

Partial validation is particularly helpful when processing the output of an LLM, where the model streams structured responses,
and you may wish to begin validating the stream while you're still receiving data (e.g. to show partial data to users).

!!! warning
    Partial validation is an experimental feature and may change in future versions of Pydantic. The current implementation
    should be considered a proof of concept at this time and has a number of
    [limitations](#limitations-of-partial-validation).

Partial validation can be enabled when using the three validation methods on `TypeAdapter`:
[`TypeAdapter.validate_json()`][pydantic.TypeAdapter.validate_json],
[`TypeAdapter.validate_python()`][pydantic.TypeAdapter.validate_python], and
[`TypeAdapter.validate_strings()`][pydantic.TypeAdapter.validate_strings]. This allows you to parse and validate incomplete
JSON, but also to validate Python objects created by parsing incomplete data of any format.

The `experimental_allow_partial` flag can be passed to these methods to enable partial validation.
It can take the following values (and is `False`, by default):

* `False` or `'off'` - disable partial validation
* `True` or `'on'` - enable partial validation, but don't support trailing strings
* `'trailing-strings'` - enable partial validation and support trailing strings

!!! info "`'trailing-strings'` mode"
    `'trailing-strings'` mode allows for trailing incomplete strings at the end of partial JSON to be included in the output.
    For example, if you're validating against the following model:

    ```python
    from typing import TypedDict


    class Model(TypedDict):
        a: str
        b: str
    ```

    Then the following JSON input would be considered valid, despite the incomplete string at the end:

    ```json
    '{"a": "hello", "b": "wor'
    ```

    And would be validated as:

    ```python {test="skip" lint="skip"}
    {'a': 'hello', 'b': 'wor'}
    ```

`experimental_allow_partial` in action:

```python
from typing import List

from annotated_types import MinLen
from typing_extensions import Annotated, NotRequired, TypedDict

from pydantic import TypeAdapter


class Foobar(TypedDict):  # (1)!
    a: int
    b: NotRequired[float]
    c: NotRequired[Annotated[str, MinLen(5)]]


ta = TypeAdapter(List[Foobar])

v = ta.validate_json('[{"a": 1, "b"', experimental_allow_partial=True)  # (2)!
print(v)
#> [{'a': 1}]

v = ta.validate_json(
    '[{"a": 1, "b": 1.0, "c": "abcd', experimental_allow_partial=True  # (3)!
)
print(v)
#> [{'a': 1, 'b': 1.0}]

v = ta.validate_json(
    '[{"b": 1.0, "c": "abcde"', experimental_allow_partial=True  # (4)!
)
print(v)
#> []

v = ta.validate_json(
    '[{"a": 1, "b": 1.0, "c": "abcde"},{"a": ', experimental_allow_partial=True
)
print(v)
#> [{'a': 1, 'b': 1.0, 'c': 'abcde'}]

v = ta.validate_python([{'a': 1}], experimental_allow_partial=True)  # (5)!
print(v)
#> [{'a': 1}]

v = ta.validate_python(
    [{'a': 1, 'b': 1.0, 'c': 'abcd'}], experimental_allow_partial=True  # (6)!
)
print(v)
#> [{'a': 1, 'b': 1.0}]

v = ta.validate_json(
    '[{"a": 1, "b": 1.0, "c": "abcdefg',
    experimental_allow_partial='trailing-strings',  # (7)!
)
print(v)
#> [{'a': 1, 'b': 1.0, 'c': 'abcdefg'}]
```

1. The TypedDict `Foobar` has three fields, but only `a` is required, that means that a valid instance of `Foobar` can be
   created even if the `b` and `c` fields are missing.
2. Parsing JSON, the input is valid JSON up to the point where the string is truncated.
3. In this case truncation of the input means the value of `c` (`abcd`) is invalid as input to `c` field, hence it's omitted.
4. The `a` field is required, so validation on the only item in the list fails and is dropped.
5. Partial validation also works with Python objects, it should have the same semantics as with JSON except of course you can't
   have a genuinely "incomplete" Python object.
6. The same as above but with a Python object, `c` is dropped as it's not required and failed validation.
7. The `trailing-strings` mode allows for incomplete strings at the end of partial JSON to be included in the output, in this
   case the input is valid JSON up to the point where the string is truncated, so the last string is included.

### How Partial Validation Works

Partial validation follows the zen of Pydantic — it makes no guarantees about what the input data might have been, but it does
guarantee to return a valid instance of the type you required, or raise a validation error.

To do this, the `experimental_allow_partial` flag enables two pieces of behavior:

#### 1. Partial JSON parsing

The [jiter](https://github.com/pydantic/jiter) JSON parser used by Pydantic already supports parsing partial JSON,
`experimental_allow_partial` is simply passed to jiter via the `allow_partial` argument.

!!! note
    If you just want pure JSON parsing with support for partial JSON, you can use the
    [`jiter`](https://pypi.org/project/jiter/) Python library directly, or pass the `allow_partial` argument when calling
    [`pydantic_core.from_json`][pydantic_core.from_json].

#### 2. Ignore errors in the last element of the input {#2-ignore-errors-in-last}

Only having access to part of the input data means errors can commonly occur in the last element of the input data.

For example:

* if a string has a constraint `MinLen(5)`, when you only see part of the input, validation might fail because part of the
  string is missing (e.g. `{"name": "Sam` instead of `{"name": "Samuel"}`)
* if an `int` field has a constraint `Ge(10)`, when you only see part of the input, validation might fail because the number is
  too small (e.g. `1` instead of `10`)
* if a `TypedDict` field has 3 required fields, but the partial input only has two of the fields, validation would fail because
  some fields are missing
* etc. etc. — there are lots more cases like this

The point is that if you only see part of some valid input data, validation errors can often occur in the last element of a
sequence or last value of a mapping.

To avoid these errors breaking partial validation, Pydantic will ignore ALL errors in the last element of the input data.

```python {title="Errors in last element ignored"}
from typing import List

from annotated_types import MinLen
from typing_extensions import Annotated

from pydantic import BaseModel, TypeAdapter


class MyModel(BaseModel):
    a: int
    b: Annotated[str, MinLen(5)]


ta = TypeAdapter(List[MyModel])
v = ta.validate_json(
    '[{"a": 1, "b": "12345"}, {"a": 1,',
    experimental_allow_partial=True,
)
print(v)
#> [MyModel(a=1, b='12345')]
```

### Limitations of Partial Validation

#### TypeAdapter only

You can only pass `experimental_allow_partial` to [`TypeAdapter`][pydantic.TypeAdapter] methods, it's not yet supported via
other Pydantic entry points like [`BaseModel`][pydantic.BaseModel].

#### Types supported

Right now only a subset of collection validators know how to handle partial validation:

- `list`
- `set`
- `frozenset`
- `dict` (as in `dict[X, Y]`)
- `TypedDict` — only non-required fields may be missing, e.g. via [`NotRequired`][typing.NotRequired] or
  [`total=False`][typing.TypedDict.__total__]

While you can use `experimental_allow_partial` while validating against types that include other collection validators, those
types will be validated "all or nothing", and partial validation will not work on more nested types.

E.g. in the [above](#2-ignore-errors-in-last) example partial validation works although the second item in the list is dropped
completely since `BaseModel` doesn't (yet) support partial validation.
But partial validation won't work at all in the following example because `BaseModel` doesn't support partial validation so it
doesn't forward the `allow_partial` instruction down to the list validator in `b`:

```python
from typing import List

from annotated_types import MinLen
from typing_extensions import Annotated

from pydantic import BaseModel, TypeAdapter, ValidationError


class MyModel(BaseModel):
    a: int = 1
    b: List[Annotated[str, MinLen(5)]] = []  # (1)!


ta = TypeAdapter(MyModel)
try:
    v = ta.validate_json(
        '{"a": 1, "b": ["12345", "12', experimental_allow_partial=True
    )
except ValidationError as e:
    print(e)
    """
    1 validation error for MyModel
    b.1
      String should have at least 5 characters [type=string_too_short, input_value='12', input_type=str]
    """
```

1. The list validator for `b` doesn't get the `allow_partial` instruction passed down to it by the model validator so it
   doesn't know to ignore errors in the last element of the input.

#### Some invalid but complete JSON will be accepted

The way [jiter](https://github.com/pydantic/jiter) (the JSON parser used by Pydantic) works means it's currently not possible
to differentiate between complete JSON like `{"a": 1, "b": "12"}` and incomplete JSON like `{"a": 1, "b": "12`.

This means that some invalid JSON will be accepted by Pydantic when using `experimental_allow_partial`, e.g.:

```python
from annotated_types import MinLen
from typing_extensions import Annotated, TypedDict

from pydantic import TypeAdapter


class Foobar(TypedDict, total=False):
    a: int
    b: Annotated[str, MinLen(5)]


ta = TypeAdapter(Foobar)

v = ta.validate_json(
    '{"a": 1, "b": "12', experimental_allow_partial=True  # (1)!
)
print(v)
#> {'a': 1}

v = ta.validate_json(
    '{"a": 1, "b": "12"}', experimental_allow_partial=True  # (2)!
)
print(v)
#> {'a': 1}
```

1. This will pass validation as expected although the last field will be omitted as it failed validation.
2. This will also pass validation since the binary representation of the JSON data passed to pydantic-core is
   indistinguishable from the previous case.

#### Any error in the last field of the input will be ignored

As described [above](#2-ignore-errors-in-last), many errors can result from truncating the input. Rather than trying to
specifically ignore errors that could result from truncation, Pydantic ignores all errors in the last element of the input in
partial validation mode.

This means clearly invalid data will pass validation if the error is in the last field of the input:

```python
from typing import List

from annotated_types import Ge
from typing_extensions import Annotated

from pydantic import TypeAdapter

ta = TypeAdapter(List[Annotated[int, Ge(10)]])
v = ta.validate_python([20, 30, 4], experimental_allow_partial=True)  # (1)!
print(v)
#> [20, 30]

ta = TypeAdapter(List[int])
v = ta.validate_python([1, 2, 'wrong'], experimental_allow_partial=True)  # (2)!
print(v)
#> [1, 2]
```

1. As you would expect, this will pass validation since Pydantic correctly ignores the error in the (truncated) last item.
2. This will also pass validation since the error in the last item is ignored.

??? api "API Documentation"
    [`pydantic.fields.Field`][pydantic.fields.Field]
In this section, we will go through the available mechanisms to customize Pydantic model fields: default values, JSON Schema
metadata, constraints, etc.

To do so, the [`Field()`][pydantic.fields.Field] function is used a lot, and behaves the same way as the standard library
[`field()`][dataclasses.field] function for dataclasses:

```python
from pydantic import BaseModel, Field


class Model(BaseModel):
    name: str = Field(frozen=True)
```

!!! note
    Even though `name` is assigned a value, it is still required and has no default value. If you want to emphasize on the
    fact that a value must be provided, you can use the [ellipsis][Ellipsis]:

    ```python {lint="skip" test="skip"}
    class Model(BaseModel):
        name: str = Field(..., frozen=True)
    ```

    However, its usage is discouraged as it doesn't play well with static type checkers.

## The annotated pattern

To apply constraints or attach [`Field()`][pydantic.fields.Field] functions to a model field, Pydantic supports the
[`Annotated`][typing.Annotated] typing construct to attach metadata to an annotation:

```python
from typing_extensions import Annotated

from pydantic import BaseModel, Field, WithJsonSchema


class Model(BaseModel):
    name: Annotated[str, Field(strict=True), WithJsonSchema({'extra': 'data'})]
```

As far as static type checkers are concerned, `name` is still typed as `str`, but Pydantic leverages the available metadata
to add validation logic, type constraints, etc.

Using this pattern has some advantages:

- Using the `f: <type> = Field(...)` form can be confusing and might trick users into thinking `f` has a default value, while
  in reality it is still required.
- You can provide an arbitrary amount of metadata elements for a field. As shown in the example above, the
  [`Field()`][pydantic.fields.Field] function only supports a limited set of constraints/metadata, and you may have to use
  different Pydantic utilities such as [`WithJsonSchema`][pydantic.WithJsonSchema] in some cases.
- Types can be made reusable (see the documentation on [custom types](./types.md#using-the-annotated-pattern) using this
  pattern).

However, note that certain arguments to the [`Field()`][pydantic.fields.Field] function (namely, `default`, `default_factory`,
and `alias`) are taken into account by static type checkers to synthesize a correct `__init__` method. The annotated pattern
is *not* understood by them, so you should use the normal assignment form instead.

!!! tip
    The annotated pattern can also be used to add metadata to specific parts of the type. For instance,
    [validation constraints](#field-constraints) can be added this way:

    ```python
    from typing import List

    from typing_extensions import Annotated

    from pydantic import BaseModel, Field


    class Model(BaseModel):
        int_list: List[Annotated[int, Field(gt=0)]]
        # Valid: [1, 3]
        # Invalid: [-1, 2]
    ```

## Default values

The `default` parameter is used to define a default value for a field.

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    name: str = Field(default='John Doe')


user = User()
print(user)
#> name='John Doe'
```

!!! note
    If you use the [`Optional`][typing.Optional] annotation, it doesn't mean that the field has a default value of `None`!

You can also use `default_factory` (but not both at the same time) to define a callable that will be called to generate a
default value.

```python
from uuid import uuid4

from pydantic import BaseModel, Field


class User(BaseModel):
    id: str = Field(default_factory=lambda: uuid4().hex)
```

The default factory can also take a single required argument, in which case the already validated data will be passed as a
dictionary.
```python
from pydantic import BaseModel, EmailStr, Field


class User(BaseModel):
    email: EmailStr
    username: str = Field(default_factory=lambda data: data['email'])


user = User(email='user@example.com')
print(user.username)
#> user@example.com
```

The `data` argument will *only* contain the already validated data, based on the
[order of model fields](./models.md#field-ordering) (the above example would fail if `username` were to be defined before
`email`).

## Validate default values

By default, Pydantic will *not* validate default values. The `validate_default` field parameter (or the
[`validate_default`][pydantic.ConfigDict.validate_default] configuration value) can be used to enable this behavior:

```python
from pydantic import BaseModel, Field, ValidationError


class User(BaseModel):
    age: int = Field(default='twelve', validate_default=True)


try:
    user = User()
except ValidationError as e:
    print(e)
    """
    1 validation error for User
    age
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='twelve', input_type=str]
    """
```

### Mutable default values

A common source of bugs in Python is to use a mutable object as a default value for a function or method argument, as the same
instance ends up being reused in each call.

The [`dataclasses`][dataclasses] module actually raises an error in this case, indicating that you should use a
[default factory](https://docs.python.org/3/library/dataclasses.html#default-factory-functions) instead.

While the same thing can be done in Pydantic, it is not required. In the event that the default value is not hashable,
Pydantic will create a deep copy of the default value when creating each instance of the model:

```python
from typing import Dict, List

from pydantic import BaseModel


class Model(BaseModel):
    item_counts: List[Dict[str, int]] = [{}]


m1 = Model()
m1.item_counts[0]['a'] = 1
print(m1.item_counts)
#> [{'a': 1}]

m2 = Model()
print(m2.item_counts)
#> [{}]
```

## Field aliases

!!! tip
    Read more about aliases in the [dedicated section](./alias.md).

For validation and serialization, you can define an alias for a field.

There are three ways to define an alias:

* `Field(alias='foo')`
* `Field(validation_alias='foo')`
* `Field(serialization_alias='foo')`

The `alias` parameter is used for both validation _and_ serialization. If you want to use _different_ aliases for validation
and serialization respectively, you can use the `validation_alias` and `serialization_alias` parameters, which will apply only
in their respective use cases.

Here is an example of using the `alias` parameter:

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    name: str = Field(alias='username')


user = User(username='johndoe')  # (1)!
print(user)
#> name='johndoe'
print(user.model_dump(by_alias=True))  # (2)!
#> {'username': 'johndoe'}
```

1. The alias `'username'` is used for instance creation and validation.
2. We are using `model_dump` to convert the model into a serializable format.

   You can see more details about [`model_dump`][pydantic.main.BaseModel.model_dump] in the API reference.

   Note that the `by_alias` keyword argument defaults to `False`, and must be specified explicitly to dump models using the
   field (serialization) aliases.

When `by_alias=True`, the alias `'username'` is also used during serialization.

If you want to use an alias _only_ for validation, you can use the `validation_alias` parameter:

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    name: str = Field(validation_alias='username')


user = User(username='johndoe')  # (1)!
print(user)
#> name='johndoe'
print(user.model_dump(by_alias=True))  # (2)!
#> {'name': 'johndoe'}
```

1. The validation alias `'username'` is used during validation.
2. The field name `'name'` is used during serialization.
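The two alias kinds can also be combined on a single field. A small sketch (the field and alias names here are illustrative):

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    name: str = Field(
        validation_alias='username', serialization_alias='user_name'
    )


user = User(username='johndoe')  # validated via the validation alias
print(user.model_dump(by_alias=True))  # serialized via the serialization alias
#> {'user_name': 'johndoe'}
```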
If you only want to define an alias for _serialization_, you can use the `serialization_alias` parameter:

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    name: str = Field(serialization_alias='username')


user = User(name='johndoe')  # (1)!
print(user)
#> name='johndoe'
print(user.model_dump(by_alias=True))  # (2)!
#> {'username': 'johndoe'}
```

1. The field name `'name'` is used for validation.
2. The serialization alias `'username'` is used for serialization.

!!! note "Alias precedence and priority"
    In case you use `alias` together with `validation_alias` or `serialization_alias` at the same time, the `validation_alias` will have priority over `alias` for validation, and `serialization_alias` will have priority over `alias` for serialization.

    If you use an `alias_generator` in the [Model Config][pydantic.config.ConfigDict.alias_generator], you can control the order of precedence for specified field vs generated aliases via the `alias_priority` setting. You can read more about alias precedence [here](../concepts/alias.md#alias-precedence).

??? tip "VSCode and Pyright users"
    In VSCode, if you use the [Pylance](https://marketplace.visualstudio.com/items?itemName=ms-python.vscode-pylance) extension, you won't see a warning when instantiating a model using a field's alias:

    ```python
    from pydantic import BaseModel, Field


    class User(BaseModel):
        name: str = Field(alias='username')


    user = User(username='johndoe')  # (1)!
    ```

    1. VSCode will NOT show a warning here.

    When the `'alias'` keyword argument is specified, even if you set `populate_by_name` to `True` in the [Model Config][pydantic.config.ConfigDict.populate_by_name], VSCode will show a warning when instantiating a model using the field name (though it will work at runtime) — in this case, `'name'`:

    ```python
    from pydantic import BaseModel, ConfigDict, Field


    class User(BaseModel):
        model_config = ConfigDict(populate_by_name=True)

        name: str = Field(alias='username')


    user = User(name='johndoe')  # (1)!
    ```

    1. VSCode will show a warning here.

    To "trick" VSCode into preferring the field name, you can use the `str` function to wrap the alias value. With this approach, though, a warning is shown when instantiating a model using the alias for the field:

    ```python
    from pydantic import BaseModel, ConfigDict, Field


    class User(BaseModel):
        model_config = ConfigDict(populate_by_name=True)

        name: str = Field(alias=str('username'))  # noqa: UP018


    user = User(name='johndoe')  # (1)!
    user = User(username='johndoe')  # (2)!
    ```

    1. Now VSCode will NOT show a warning
    2. VSCode will show a warning here, though

    This is discussed in more detail in [this issue](https://github.com/pydantic/pydantic/issues/5893).

    ### Validation Alias

    Even though Pydantic treats `alias` and `validation_alias` the same when creating model instances, VSCode will not use the `validation_alias` in the class initializer signature. If you want VSCode to use the `validation_alias` in the class initializer, you can instead specify both an `alias` and `serialization_alias`, as the `serialization_alias` will override the `alias` during serialization:

    ```python
    from pydantic import BaseModel, Field


    class MyModel(BaseModel):
        my_field: int = Field(validation_alias='myValidationAlias')
    ```

    with:

    ```python
    from pydantic import BaseModel, Field


    class MyModel(BaseModel):
        my_field: int = Field(
            ...,
            alias='myValidationAlias',
            serialization_alias='my_serialization_alias',
        )


    m = MyModel(myValidationAlias=1)
    print(m.model_dump(by_alias=True))
    #> {'my_serialization_alias': 1}
    ```

[](){#field-constraints}

## Numeric Constraints

There are some keyword arguments that can be used to constrain numeric values:

* `gt` - greater than
* `lt` - less than
* `ge` - greater than or equal to
* `le` - less than or equal to
* `multiple_of` - a multiple of the given number
* `allow_inf_nan` - allow `'inf'`, `'-inf'`, `'nan'` values

Here's an example:

```python
from pydantic import BaseModel, Field


class Foo(BaseModel):
    positive: int = Field(gt=0)
    non_negative: int = Field(ge=0)
    negative: int = Field(lt=0)
    non_positive: int = Field(le=0)
    even: int = Field(multiple_of=2)
    love_for_pydantic: float = Field(allow_inf_nan=True)


foo = Foo(
    positive=1,
    non_negative=0,
    negative=-1,
    non_positive=0,
    even=2,
    love_for_pydantic=float('inf'),
)
print(foo)
"""
positive=1 non_negative=0 negative=-1 non_positive=0 even=2 love_for_pydantic=inf
"""
```

??? info "JSON Schema"
    In the generated JSON schema:

    - `gt` and `lt` constraints will be translated to `exclusiveMinimum` and `exclusiveMaximum`.
    - `ge` and `le` constraints will be translated to `minimum` and `maximum`.
    - `multiple_of` constraint will be translated to `multipleOf`.

    The above snippet will generate the following JSON Schema:

    ```json
    {
      "title": "Foo",
      "type": "object",
      "properties": {
        "positive": {
          "title": "Positive",
          "type": "integer",
          "exclusiveMinimum": 0
        },
        "non_negative": {
          "title": "Non Negative",
          "type": "integer",
          "minimum": 0
        },
        "negative": {
          "title": "Negative",
          "type": "integer",
          "exclusiveMaximum": 0
        },
        "non_positive": {
          "title": "Non Positive",
          "type": "integer",
          "maximum": 0
        },
        "even": {
          "title": "Even",
          "type": "integer",
          "multipleOf": 2
        },
        "love_for_pydantic": {
          "title": "Love For Pydantic",
          "type": "number"
        }
      },
      "required": [
        "positive",
        "non_negative",
        "negative",
        "non_positive",
        "even",
        "love_for_pydantic"
      ]
    }
    ```

    See the [JSON Schema Draft 2020-12] for more details.

!!! warning "Constraints on compound types"
    In case you use field constraints with compound types, an error can happen in some cases. To avoid potential issues, you can use `Annotated`:

    ```python
    from typing import Optional

    from typing_extensions import Annotated

    from pydantic import BaseModel, Field


    class Foo(BaseModel):
        positive: Optional[Annotated[int, Field(gt=0)]]
        # Can error in some cases, not recommended:
        non_negative: Optional[int] = Field(ge=0)
    ```

## String Constraints

??? api "API Documentation"
    [`pydantic.types.StringConstraints`][pydantic.types.StringConstraints]
There are fields that can be used to constrain strings:

* `min_length`: Minimum length of the string.
* `max_length`: Maximum length of the string.
* `pattern`: A regular expression that the string must match.

Here's an example:

```python
from pydantic import BaseModel, Field


class Foo(BaseModel):
    short: str = Field(min_length=3)
    long: str = Field(max_length=10)
    regex: str = Field(pattern=r'^\d*$')  # (1)!


foo = Foo(short='foo', long='foobarbaz', regex='123')
print(foo)
#> short='foo' long='foobarbaz' regex='123'
```

1. Only digits are allowed.

??? info "JSON Schema"
    In the generated JSON schema:

    - `min_length` constraint will be translated to `minLength`.
    - `max_length` constraint will be translated to `maxLength`.
    - `pattern` constraint will be translated to `pattern`.

    The above snippet will generate the following JSON Schema:

    ```json
    {
      "title": "Foo",
      "type": "object",
      "properties": {
        "short": {
          "title": "Short",
          "type": "string",
          "minLength": 3
        },
        "long": {
          "title": "Long",
          "type": "string",
          "maxLength": 10
        },
        "regex": {
          "title": "Regex",
          "type": "string",
          "pattern": "^\\d*$"
        }
      },
      "required": [
        "short",
        "long",
        "regex"
      ]
    }
    ```

## Decimal Constraints

There are fields that can be used to constrain decimals:

* `max_digits`: Maximum number of digits within the `Decimal`. It does not include a zero before the decimal point or trailing decimal zeroes.
* `decimal_places`: Maximum number of decimal places allowed. It does not include trailing decimal zeroes.

Here's an example:

```python
from decimal import Decimal

from pydantic import BaseModel, Field


class Foo(BaseModel):
    precise: Decimal = Field(max_digits=5, decimal_places=2)


foo = Foo(precise=Decimal('123.45'))
print(foo)
#> precise=Decimal('123.45')
```

## Dataclass Constraints

There are fields that can be used to constrain dataclasses:

* `init`: Whether the field should be included in the `__init__` of the dataclass.
* `init_var`: Whether the field should be seen as an [init-only field] in the dataclass.
* `kw_only`: Whether the field should be a keyword-only argument in the constructor of the dataclass.

Here's an example:

```python
from pydantic import BaseModel, Field
from pydantic.dataclasses import dataclass


@dataclass
class Foo:
    bar: str
    baz: str = Field(init_var=True)
    qux: str = Field(kw_only=True)


class Model(BaseModel):
    foo: Foo


model = Model(foo=Foo('bar', baz='baz', qux='qux'))
print(model.model_dump())  # (1)!
#> {'foo': {'bar': 'bar', 'qux': 'qux'}}
```

1. The `baz` field is not included in the `model_dump()` output, since it is an init-only field.

## Field Representation

The parameter `repr` can be used to control whether the field should be included in the string representation of the model.

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    name: str = Field(repr=True)  # (1)!
    age: int = Field(repr=False)


user = User(name='John', age=42)
print(user)
#> name='John'
```

1. This is the default value.

## Discriminator

The parameter `discriminator` can be used to control the field that will be used to discriminate between different models in a union. It takes either the name of a field or a `Discriminator` instance. The `Discriminator` approach can be useful when the discriminator fields aren't the same for all the models in the `Union`.

The following example shows how to use `discriminator` with a field name:

```python
from typing import Literal, Union

from pydantic import BaseModel, Field


class Cat(BaseModel):
    pet_type: Literal['cat']
    age: int


class Dog(BaseModel):
    pet_type: Literal['dog']
    age: int


class Model(BaseModel):
    pet: Union[Cat, Dog] = Field(discriminator='pet_type')


print(Model.model_validate({'pet': {'pet_type': 'cat', 'age': 12}}))  # (1)!
#> pet=Cat(pet_type='cat', age=12)
```

1. See more about [Validating data] in the [Models] page.
The following example shows how to use the `discriminator` keyword argument with a `Discriminator` instance:

```python
from typing import Literal, Union

from typing_extensions import Annotated

from pydantic import BaseModel, Discriminator, Field, Tag


class Cat(BaseModel):
    pet_type: Literal['cat']
    age: int


class Dog(BaseModel):
    pet_kind: Literal['dog']
    age: int


def pet_discriminator(v):
    if isinstance(v, dict):
        return v.get('pet_type', v.get('pet_kind'))
    return getattr(v, 'pet_type', getattr(v, 'pet_kind', None))


class Model(BaseModel):
    pet: Union[Annotated[Cat, Tag('cat')], Annotated[Dog, Tag('dog')]] = Field(
        discriminator=Discriminator(pet_discriminator)
    )


print(repr(Model.model_validate({'pet': {'pet_type': 'cat', 'age': 12}})))
#> Model(pet=Cat(pet_type='cat', age=12))

print(repr(Model.model_validate({'pet': {'pet_kind': 'dog', 'age': 12}})))
#> Model(pet=Dog(pet_kind='dog', age=12))
```

You can also take advantage of `Annotated` to define your discriminated unions. See the [Discriminated Unions] docs for more details.

## Strict Mode

The `strict` parameter on a [`Field`][pydantic.fields.Field] specifies whether the field should be validated in "strict mode". In strict mode, Pydantic throws an error during validation instead of coercing data on the field where `strict=True`.

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    name: str = Field(strict=True)  # (1)!
    age: int = Field(strict=False)


user = User(name='John', age='42')  # (2)!
print(user)
#> name='John' age=42
```

1. This is the default value.
2. The `age` field is not validated in strict mode. Therefore, it can be assigned a string.

See [Strict Mode](strict_mode.md) for more details.

See [Conversion Table](conversion_table.md) for more details on how Pydantic converts data in both strict and lax modes.

## Immutability

The parameter `frozen` is used to emulate the frozen dataclass behaviour. It is used to prevent the field from being assigned a new value after the model is created (immutability).

See the [frozen dataclass documentation] for more details.

```python
from pydantic import BaseModel, Field, ValidationError


class User(BaseModel):
    name: str = Field(frozen=True)
    age: int


user = User(name='John', age=42)

try:
    user.name = 'Jane'  # (1)!
except ValidationError as e:
    print(e)
    """
    1 validation error for User
    name
      Field is frozen [type=frozen_field, input_value='Jane', input_type=str]
    """
```

1. Since the `name` field is frozen, the assignment is not allowed.

## Exclude

The `exclude` parameter can be used to control which fields should be excluded from the model when exporting the model.

See the following example:

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    name: str
    age: int = Field(exclude=True)


user = User(name='John', age=42)
print(user.model_dump())  # (1)!
#> {'name': 'John'}
```

1. The `age` field is not included in the `model_dump()` output, since it is excluded.

See the [Serialization] section for more details.

## Deprecated fields

The `deprecated` parameter can be used to mark a field as being deprecated. Doing so will result in:

* a runtime deprecation warning emitted when accessing the field.
* `"deprecated": true` being set in the generated JSON schema.

You can set the `deprecated` parameter as one of:

* A string, which will be used as the deprecation message.
* An instance of the `warnings.deprecated` decorator (or the `typing_extensions` backport).
* A boolean, which will be used to mark the field as deprecated with a default `'deprecated'` deprecation message.
### `deprecated` as a string

```python
from typing_extensions import Annotated

from pydantic import BaseModel, Field


class Model(BaseModel):
    deprecated_field: Annotated[int, Field(deprecated='This is deprecated')]


print(Model.model_json_schema()['properties']['deprecated_field'])
#> {'deprecated': True, 'title': 'Deprecated Field', 'type': 'integer'}
```

### `deprecated` via the `warnings.deprecated` decorator

!!! note
    You can only use the `deprecated` decorator in this way if you have `typing_extensions` >= 4.9.0 installed.

```python {test="skip"}
import importlib.metadata

from packaging.version import Version
from typing_extensions import Annotated, deprecated

from pydantic import BaseModel, Field

if Version(importlib.metadata.version('typing_extensions')) >= Version('4.9'):

    class Model(BaseModel):
        deprecated_field: Annotated[int, deprecated('This is deprecated')]

        # Or explicitly using `Field`:
        alt_form: Annotated[
            int, Field(deprecated=deprecated('This is deprecated'))
        ]
```

### `deprecated` as a boolean

```python
from typing_extensions import Annotated

from pydantic import BaseModel, Field


class Model(BaseModel):
    deprecated_field: Annotated[int, Field(deprecated=True)]


print(Model.model_json_schema()['properties']['deprecated_field'])
#> {'deprecated': True, 'title': 'Deprecated Field', 'type': 'integer'}
```

!!! note "Support for `category` and `stacklevel`"
    The current implementation of this feature does not take into account the `category` and `stacklevel` arguments to the `deprecated` decorator. This might land in a future version of Pydantic.

!!! warning "Accessing a deprecated field in validators"
    When accessing a deprecated field inside a validator, the deprecation warning will be emitted. You can use [`catch_warnings`][warnings.catch_warnings] to explicitly ignore it:

    ```python
    import warnings

    from typing_extensions import Self

    from pydantic import BaseModel, Field, model_validator


    class Model(BaseModel):
        deprecated_field: int = Field(deprecated='This is deprecated')

        @model_validator(mode='after')
        def validate_model(self) -> Self:
            with warnings.catch_warnings():
                warnings.simplefilter('ignore', DeprecationWarning)
                self.deprecated_field = self.deprecated_field * 2
    ```

## Customizing JSON Schema

Some field parameters are used exclusively to customize the generated JSON schema. The parameters in question are:

* `title`
* `description`
* `examples`
* `json_schema_extra`

Read more about JSON schema customization / modification with fields in the [Customizing JSON Schema] section of the JSON schema docs.

## The `computed_field` decorator

??? api "API Documentation"
    [`computed_field`][pydantic.fields.computed_field]
The [`computed_field`][pydantic.fields.computed_field] decorator can be used to include [`property`][] or [`cached_property`][functools.cached_property] attributes when serializing a model or dataclass. The property will also be taken into account in the JSON Schema (in serialization mode).

!!! note
    Properties can be useful for fields that are computed from other fields, or for fields that are expensive to be computed (and thus, are cached if using [`cached_property`][functools.cached_property]).

    However, note that Pydantic will *not* perform any additional logic on the wrapped property (validation, cache invalidation, etc.).

Here's an example of the JSON schema (in serialization mode) generated for a model with a computed field:

```python
from pydantic import BaseModel, computed_field


class Box(BaseModel):
    width: float
    height: float
    depth: float

    @computed_field
    @property  # (1)!
    def volume(self) -> float:
        return self.width * self.height * self.depth


print(Box.model_json_schema(mode='serialization'))
"""
{
    'properties': {
        'width': {'title': 'Width', 'type': 'number'},
        'height': {'title': 'Height', 'type': 'number'},
        'depth': {'title': 'Depth', 'type': 'number'},
        'volume': {'readOnly': True, 'title': 'Volume', 'type': 'number'},
    },
    'required': ['width', 'height', 'depth', 'volume'],
    'title': 'Box',
    'type': 'object',
}
"""
```

Here's an example using the `model_dump` method with a computed field:

```python
from pydantic import BaseModel, computed_field


class Box(BaseModel):
    width: float
    height: float
    depth: float

    @computed_field
    @property  # (1)!
    def volume(self) -> float:
        return self.width * self.height * self.depth


b = Box(width=1, height=2, depth=3)
print(b.model_dump())
#> {'width': 1.0, 'height': 2.0, 'depth': 3.0, 'volume': 6.0}
```

1. If not specified, [`computed_field`][pydantic.fields.computed_field] will implicitly convert the method to a [`property`][]. However, it is preferable to explicitly use the [`@property`][property] decorator for type checking purposes.

As with regular fields, computed fields can be marked as being deprecated:

```python
from typing_extensions import deprecated

from pydantic import BaseModel, computed_field


class Box(BaseModel):
    width: float
    height: float
    depth: float

    @computed_field
    @property
    @deprecated("'volume' is deprecated")
    def volume(self) -> float:
        return self.width * self.height * self.depth
```

[JSON Schema Draft 2020-12]: https://json-schema.org/understanding-json-schema/reference/numeric.html#numeric-types
[Discriminated Unions]: ../concepts/unions.md#discriminated-unions
[Validating data]: models.md#validating-data
[Models]: models.md
[init-only field]: https://docs.python.org/3/library/dataclasses.html#init-only-variables
[frozen dataclass documentation]: https://docs.python.org/3/library/dataclasses.html#frozen-instances
[Validate Assignment]: models.md#validate-assignment
[Serialization]: serialization.md#model-and-field-level-include-and-exclude
[Customizing JSON Schema]: json_schema.md#field-level-customization
[annotated]: https://docs.python.org/3/library/typing.html#typing.Annotated
[Alias]: ../concepts/alias.md

Forward annotations (wrapped in quotes) or using the `from __future__ import annotations` [future statement] (as introduced in [PEP563](https://www.python.org/dev/peps/pep-0563/)) are supported:

```python
from __future__ import annotations

from pydantic import BaseModel

MyInt = int


class Model(BaseModel):
    a: MyInt
    # Without the future import, equivalent to:
    # a: 'MyInt'


print(Model(a='1'))
#> a=1
```

As shown in the following sections, forward annotations are useful when you want to reference a type that is not yet defined in your code.
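For instance, a field can be annotated with the name of a model defined later in the module. A minimal sketch (the model names here are illustrative): if the annotation cannot be resolved when the class is created, calling `model_rebuild()` once the name exists completes the model:

```python
from pydantic import BaseModel


class Foo(BaseModel):
    x: 'Bar'  # 'Bar' is not defined yet


class Bar(BaseModel):
    y: int


Foo.model_rebuild()  # resolves the 'Bar' forward reference
print(Foo(x={'y': 1}))
#> x=Bar(y=1)
```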
The internal logic to resolve forward annotations is described in detail in [this section](../internals/resolving_annotations.md).

## Self-referencing (or "Recursive") Models

Models with self-referencing fields are also supported. These annotations will be resolved during model creation.

Within the model, you can either add the `from __future__ import annotations` import or wrap the annotation in a string:

```python
from typing import Optional

from pydantic import BaseModel


class Foo(BaseModel):
    a: int = 123
    sibling: 'Optional[Foo]' = None


print(Foo())
#> a=123 sibling=None
print(Foo(sibling={'a': '321'}))
#> a=123 sibling=Foo(a=321, sibling=None)
```

### Cyclic references

When working with self-referencing recursive models, it is possible that you might encounter cyclic references in validation inputs. For example, this can happen when validating ORM instances with back-references from attributes.

Rather than raising a [`RecursionError`][] while attempting to validate data with cyclic references, Pydantic is able to detect the cyclic reference and raise an appropriate [`ValidationError`][pydantic_core.ValidationError]:

```python
from typing import Optional

from pydantic import BaseModel, ValidationError


class ModelA(BaseModel):
    b: 'Optional[ModelB]' = None


class ModelB(BaseModel):
    a: Optional[ModelA] = None


cyclic_data = {}
cyclic_data['a'] = {'b': cyclic_data}
print(cyclic_data)
#> {'a': {'b': {...}}}

try:
    ModelB.model_validate(cyclic_data)
except ValidationError as exc:
    print(exc)
    """
    1 validation error for ModelB
    a.b
      Recursion error - cyclic reference detected [type=recursion_loop, input_value={'a': {'b': {...}}}, input_type=dict]
    """
```

Because this error is raised without actually exceeding the maximum recursion depth, you can catch and handle the raised [`ValidationError`][pydantic_core.ValidationError] without needing to worry about the limited remaining recursion depth:

```python
from contextlib import contextmanager
from dataclasses import field
from typing import Iterator, List

from pydantic import BaseModel, ValidationError, field_validator


def is_recursion_validation_error(exc: ValidationError) -> bool:
    errors = exc.errors()
    return len(errors) == 1 and errors[0]['type'] == 'recursion_loop'


@contextmanager
def suppress_recursion_validation_error() -> Iterator[None]:
    try:
        yield
    except ValidationError as exc:
        if not is_recursion_validation_error(exc):
            raise exc


class Node(BaseModel):
    id: int
    children: List['Node'] = field(default_factory=list)

    @field_validator('children', mode='wrap')
    @classmethod
    def drop_cyclic_references(cls, children, h):
        try:
            return h(children)
        except ValidationError as exc:
            if not (
                is_recursion_validation_error(exc)
                and isinstance(children, list)
            ):
                raise exc

            value_without_cyclic_refs = []
            for child in children:
                with suppress_recursion_validation_error():
                    value_without_cyclic_refs.extend(h([child]))
            return h(value_without_cyclic_refs)


# Create data with cyclic references representing the graph 1 -> 2 -> 3 -> 1
node_data = {'id': 1, 'children': [{'id': 2, 'children': [{'id': 3}]}]}
node_data['children'][0]['children'][0]['children'] = [node_data]

print(Node.model_validate(node_data))
#> id=1 children=[Node(id=2, children=[Node(id=3, children=[])])]
```

Similarly, if Pydantic encounters a recursive reference during _serialization_, rather than waiting for the maximum recursion depth to be exceeded, a [`ValueError`][] is raised immediately:

```python
from pydantic import TypeAdapter

# Create data with cyclic references representing the graph 1 -> 2 -> 3 -> 1
node_data = {'id': 1, 'children': [{'id': 2, 'children': [{'id': 3}]}]}
node_data['children'][0]['children'][0]['children'] = [node_data]

try:
    # Try serializing the circular reference as JSON
    TypeAdapter(dict).dump_json(node_data)
except ValueError as exc:
    print(exc)
    """
    Error serializing to JSON: ValueError: Circular reference detected (id repeated)
    """
```

This can also be handled if desired:

```python
from dataclasses import field
from typing import Any, List

from pydantic import (
    SerializerFunctionWrapHandler,
    TypeAdapter,
    field_serializer,
)
from pydantic.dataclasses import dataclass


@dataclass
class NodeReference:
    id: int


@dataclass
class Node(NodeReference):
    children: List['Node'] = field(default_factory=list)

    @field_serializer('children', mode='wrap')
    def serialize(
        self, children: List['Node'], handler: SerializerFunctionWrapHandler
    ) -> Any:
        """
        Serialize a list of nodes, handling circular references
        by excluding the children.
        """
        try:
            return handler(children)
        except ValueError as exc:
            if not str(exc).startswith('Circular reference'):
                raise exc

            result = []
            for node in children:
                try:
                    serialized = handler([node])
                except ValueError as exc:
                    if not str(exc).startswith('Circular reference'):
                        raise exc
                    result.append({'id': node.id})
                else:
                    result.append(serialized)
            return result


# Create a cyclic graph:
nodes = [Node(id=1), Node(id=2), Node(id=3)]
nodes[0].children.append(nodes[1])
nodes[1].children.append(nodes[2])
nodes[2].children.append(nodes[0])

print(nodes[0])
#> Node(id=1, children=[Node(id=2, children=[Node(id=3, children=[...])])])

# Serialize the cyclic graph:
print(TypeAdapter(Node).dump_python(nodes[0]))
"""
{
    'id': 1,
    'children': [{'id': 2, 'children': [{'id': 3, 'children': [{'id': 1}]}]}],
}
"""
```

[future statement]: https://docs.python.org/3/reference/simple_stmts.html#future

# JSON

## Json Parsing

??? api "API Documentation"
    [`pydantic.main.BaseModel.model_validate_json`][pydantic.main.BaseModel.model_validate_json]
    [`pydantic.type_adapter.TypeAdapter.validate_json`][pydantic.type_adapter.TypeAdapter.validate_json]
    [`pydantic_core.from_json`][pydantic_core.from_json]

Pydantic provides builtin JSON parsing, which helps achieve:

* Significant performance improvements without the cost of using a 3rd party library
* Support for custom errors
* Support for `strict` specifications

Here's an example of Pydantic's builtin JSON parsing via the [`model_validate_json`][pydantic.main.BaseModel.model_validate_json] method, showcasing the support for `strict` specifications while parsing JSON data that doesn't match the model's type annotations:

```python
from datetime import date
from typing import Tuple

from pydantic import BaseModel, ConfigDict, ValidationError


class Event(BaseModel):
    model_config = ConfigDict(strict=True)

    when: date
    where: Tuple[int, int]


json_data = '{"when": "1987-01-28", "where": [51, -1]}'
print(Event.model_validate_json(json_data))  # (1)!
#> when=datetime.date(1987, 1, 28) where=(51, -1)

try:
    Event.model_validate({'when': '1987-01-28', 'where': [51, -1]})  # (2)!
except ValidationError as e:
    print(e)
    """
    2 validation errors for Event
    when
      Input should be a valid date [type=date_type, input_value='1987-01-28', input_type=str]
    where
      Input should be a valid tuple [type=tuple_type, input_value=[51, -1], input_type=list]
    """
```

1. JSON has no `date` or tuple types, but Pydantic knows that, so it allows strings and arrays as inputs respectively when parsing JSON directly.
2. If you pass the same values to the [`model_validate`][pydantic.main.BaseModel.model_validate] method, Pydantic will raise a validation error because the `strict` configuration is enabled.

In v2.5.0 and above, Pydantic uses [`jiter`](https://docs.rs/jiter/latest/jiter/), a fast and iterable JSON parser, to parse JSON data.
Using `jiter` compared to `serde` results in modest performance improvements that will get even better in the future.

The `jiter` JSON parser is almost entirely compatible with the `serde` JSON parser, with one noticeable enhancement being that `jiter` supports deserialization of `inf` and `NaN` values. In the future, `jiter` is intended to enable validation errors to include the location in the original JSON input which contained the invalid value.

### Partial JSON Parsing

**Starting in v2.7.0**, Pydantic's [JSON parser](https://docs.rs/jiter/latest/jiter/) offers support for partial JSON parsing, which is exposed via [`pydantic_core.from_json`][pydantic_core.from_json]. Here's an example of this feature in action:

```python
from pydantic_core import from_json

partial_json_data = '["aa", "bb", "c'  # (1)!

try:
    result = from_json(partial_json_data, allow_partial=False)
except ValueError as e:
    print(e)  # (2)!
    #> EOF while parsing a string at line 1 column 15

result = from_json(partial_json_data, allow_partial=True)
print(result)  # (3)!
#> ['aa', 'bb']
```

1. The JSON list is incomplete - it's missing a closing `"]`
2. When `allow_partial` is set to `False` (the default), a parsing error occurs.
3. When `allow_partial` is set to `True`, part of the input is deserialized successfully.

This also works for deserializing partial dictionaries. For example:

```python
from pydantic_core import from_json

partial_dog_json = '{"breed": "lab", "name": "fluffy", "friends": ["buddy", "spot", "rufus"], "age'

dog_dict = from_json(partial_dog_json, allow_partial=True)
print(dog_dict)
#> {'breed': 'lab', 'name': 'fluffy', 'friends': ['buddy', 'spot', 'rufus']}
```

!!! tip "Validating LLM Output"
    This feature is particularly beneficial for validating LLM outputs. We've written some blog posts about this topic, which you can find [here](https://pydantic.dev/articles).
In future versions of Pydantic, we expect to expand support for this feature through either Pydantic's other JSON validation functions ([`pydantic.main.BaseModel.model_validate_json`][pydantic.main.BaseModel.model_validate_json] and [`pydantic.type_adapter.TypeAdapter.validate_json`][pydantic.type_adapter.TypeAdapter.validate_json]) or model configuration. Stay tuned 🚀! For now, you can use [`pydantic_core.from_json`][pydantic_core.from_json] in combination with [`pydantic.main.BaseModel.model_validate`][pydantic.main.BaseModel.model_validate] to achieve the same result. Here's an example: ```python from pydantic_core import from_json from pydantic import BaseModel class Dog(BaseModel): breed: str name: str friends: list partial_dog_json = '{"breed": "lab", "name": "fluffy", "friends": ["buddy", "spot", "rufus"], "age' dog = Dog.model_validate(from_json(partial_dog_json, allow_partial=True)) print(repr(dog)) #> Dog(breed='lab', name='fluffy', friends=['buddy', 'spot', 'rufus']) ``` !!! tip For partial JSON parsing to work reliably, all fields on the model should have default values. Check out the following example for a more in-depth look at how to use default values with partial JSON parsing: !!! example "Using default values with partial JSON parsing" ```python from typing import Any, Optional, Tuple import pydantic_core from typing_extensions import Annotated from pydantic import BaseModel, ValidationError, WrapValidator def default_on_error(v, handler) -> Any: """ Raise a PydanticUseDefault exception if the value is missing. This is useful for avoiding errors from partial JSON preventing successful validation. 
""" try: return handler(v) except ValidationError as exc: # there might be other types of errors resulting from partial JSON parsing # that you allow here, feel free to customize as needed if all(e['type'] == 'missing' for e in exc.errors()): raise pydantic_core.PydanticUseDefault() else: raise class NestedModel(BaseModel): x: int y: str class MyModel(BaseModel): foo: Optional[str] = None bar: Annotated[ Optional[Tuple[str, int]], WrapValidator(default_on_error) ] = None nested: Annotated[ Optional[NestedModel], WrapValidator(default_on_error) ] = None m = MyModel.model_validate( pydantic_core.from_json('{"foo": "x", "bar": ["world",', allow_partial=True) ) print(repr(m)) #> MyModel(foo='x', bar=None, nested=None) m = MyModel.model_validate( pydantic_core.from_json( '{"foo": "x", "bar": ["world", 1], "nested": {"x":', allow_partial=True ) ) print(repr(m)) #> MyModel(foo='x', bar=('world', 1), nested=None) ``` ### Caching Strings **Starting in v2.7.0**, Pydantic's [JSON parser](https://docs.rs/jiter/latest/jiter/) offers support for configuring how Python strings are cached during JSON parsing and validation (when Python strings are constructed from Rust strings during Python validation, e.g. after `strip_whitespace=True`). The `cache_strings` setting is exposed via both [model config][pydantic.config.ConfigDict] and [`pydantic_core.from_json`][pydantic_core.from_json]. The `cache_strings` setting can take any of the following values: * `True` or `'all'` (the default): cache all strings * `'keys'`: cache only dictionary keys, this **only** applies when used with [`pydantic_core.from_json`][pydantic_core.from_json] or when parsing JSON using [`Json`][pydantic.types.Json] * `False` or `'none'`: no caching Using the string caching feature results in performance improvements, but increases memory usage slightly. !!! note "String Caching Details" 1. 
Strings are cached using a fully associative cache with a size of [16,384](https://github.com/pydantic/jiter/blob/5bbdcfd22882b7b286416b22f74abd549c7b2fd7/src/py_string_cache.rs#L113). 2. Only strings where `len(string) < 64` are cached. 3. There is some overhead to looking up the cache, which is normally worth it to avoid constructing strings. However, if you know there will be very few repeated strings in your data, you might get a performance boost by disabling this setting with `cache_strings=False`. ## JSON Serialization ??? api "API Documentation" [`pydantic.main.BaseModel.model_dump_json`][pydantic.main.BaseModel.model_dump_json]
[`pydantic.type_adapter.TypeAdapter.dump_json`][pydantic.type_adapter.TypeAdapter.dump_json]
[`pydantic_core.to_json`][pydantic_core.to_json]
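As a quick illustration (the model and field names here are our own, not from elsewhere in these docs), [`model_dump_json`][pydantic.main.BaseModel.model_dump_json] serializes a model instance directly to a JSON string:

```python
from pydantic import BaseModel


class Meal(BaseModel):
    name: str
    calories: int


meal = Meal(name='soup', calories=120)
print(meal.model_dump_json())
#> {"name":"soup","calories":120}
```

The output is compact by default; pass `indent=2` to `model_dump_json` for pretty-printed JSON.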
For more information on JSON serialization, see the [Serialization Concepts](./serialization.md#modelmodel_dump_json) page.

??? api "API Documentation" [`pydantic.json_schema`][pydantic.json_schema]
Pydantic allows automatic creation and customization of JSON schemas from models. The generated JSON schemas are compliant with the following specifications: * [JSON Schema Draft 2020-12](https://json-schema.org/draft/2020-12/release-notes.html) * [OpenAPI Specification v3.1.0](https://github.com/OAI/OpenAPI-Specification). ## Generating JSON Schema Use the following functions to generate JSON schema: * [`BaseModel.model_json_schema`][pydantic.main.BaseModel.model_json_schema] returns a jsonable dict of a model's schema. * [`TypeAdapter.json_schema`][pydantic.type_adapter.TypeAdapter.json_schema] returns a jsonable dict of an adapted type's schema. !!! note These methods are not to be confused with [`BaseModel.model_dump_json`][pydantic.main.BaseModel.model_dump_json] and [`TypeAdapter.dump_json`][pydantic.type_adapter.TypeAdapter.dump_json], which serialize instances of the model or adapted type, respectively. These methods return JSON strings. In comparison, [`BaseModel.model_json_schema`][pydantic.main.BaseModel.model_json_schema] and [`TypeAdapter.json_schema`][pydantic.type_adapter.TypeAdapter.json_schema] return a jsonable dict representing the JSON schema of the model or adapted type, respectively. !!! note "On the 'jsonable' nature of JSON schema" Regarding the "jsonable" nature of the [`model_json_schema`][pydantic.main.BaseModel.model_json_schema] results, calling `json.dumps(m.model_json_schema())` on some `BaseModel` `m` returns a valid JSON string. Similarly, for [`TypeAdapter.json_schema`][pydantic.type_adapter.TypeAdapter.json_schema], calling `json.dumps(TypeAdapter(<some_type>).json_schema())` returns a valid JSON string. !!! tip Pydantic offers support for both of: 1. [Customizing JSON Schema](#customizing-json-schema) 2. [Customizing the JSON Schema Generation Process](#customizing-the-json-schema-generation-process) The first approach generally has a more narrow scope, allowing for customization of the JSON schema for more specific cases and types.
The second approach generally has a more broad scope, allowing for customization of the JSON schema generation process overall. The same effects can be achieved with either approach, but depending on your use case, one approach might offer a more simple solution than the other. Here's an example of generating JSON schema from a `BaseModel`: ```python {output="json"} import json from enum import Enum from typing import Union from typing_extensions import Annotated from pydantic import BaseModel, Field from pydantic.config import ConfigDict class FooBar(BaseModel): count: int size: Union[float, None] = None class Gender(str, Enum): male = 'male' female = 'female' other = 'other' not_given = 'not_given' class MainModel(BaseModel): """ This is the description of the main model """ model_config = ConfigDict(title='Main') foo_bar: FooBar gender: Annotated[Union[Gender, None], Field(alias='Gender')] = None snap: int = Field( default=42, title='The Snap', description='this is the value of snap', gt=30, lt=50, ) main_model_schema = MainModel.model_json_schema() # (1)! print(json.dumps(main_model_schema, indent=2)) # (2)! """ { "$defs": { "FooBar": { "properties": { "count": { "title": "Count", "type": "integer" }, "size": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Size" } }, "required": [ "count" ], "title": "FooBar", "type": "object" }, "Gender": { "enum": [ "male", "female", "other", "not_given" ], "title": "Gender", "type": "string" } }, "description": "This is the description of the main model", "properties": { "foo_bar": { "$ref": "#/$defs/FooBar" }, "Gender": { "anyOf": [ { "$ref": "#/$defs/Gender" }, { "type": "null" } ], "default": null }, "snap": { "default": 42, "description": "this is the value of snap", "exclusiveMaximum": 50, "exclusiveMinimum": 30, "title": "The Snap", "type": "integer" } }, "required": [ "foo_bar" ], "title": "Main", "type": "object" } """ ``` 1. This produces a "jsonable" dict of `MainModel`'s schema. 
2. Calling `json.dumps` on the schema dict produces a JSON string. The [`TypeAdapter`][pydantic.type_adapter.TypeAdapter] class lets you create an object with methods for validating, serializing, and producing JSON schemas for arbitrary types. This serves as a complete replacement for `schema_of` in Pydantic V1 (which is now deprecated). Here's an example of generating JSON schema from a [`TypeAdapter`][pydantic.type_adapter.TypeAdapter]: ```python from typing import List from pydantic import TypeAdapter adapter = TypeAdapter(List[int]) print(adapter.json_schema()) #> {'items': {'type': 'integer'}, 'type': 'array'} ``` You can also generate JSON schemas for combinations of [`BaseModel`s][pydantic.main.BaseModel] and [`TypeAdapter`s][pydantic.type_adapter.TypeAdapter], as shown in this example: ```python {output="json"} import json from typing import Union from pydantic import BaseModel, TypeAdapter class Cat(BaseModel): name: str color: str class Dog(BaseModel): name: str breed: str ta = TypeAdapter(Union[Cat, Dog]) ta_schema = ta.json_schema() print(json.dumps(ta_schema, indent=2)) """ { "$defs": { "Cat": { "properties": { "name": { "title": "Name", "type": "string" }, "color": { "title": "Color", "type": "string" } }, "required": [ "name", "color" ], "title": "Cat", "type": "object" }, "Dog": { "properties": { "name": { "title": "Name", "type": "string" }, "breed": { "title": "Breed", "type": "string" } }, "required": [ "name", "breed" ], "title": "Dog", "type": "object" } }, "anyOf": [ { "$ref": "#/$defs/Cat" }, { "$ref": "#/$defs/Dog" } ] } """ ``` ### Configuring the `JsonSchemaMode` Specify the mode of JSON schema generation via the `mode` parameter in the [`model_json_schema`][pydantic.main.BaseModel.model_json_schema] and [`TypeAdapter.json_schema`][pydantic.type_adapter.TypeAdapter.json_schema] methods. By default, the mode is set to `'validation'`, which produces a JSON schema corresponding to the model's validation schema. 
The [`JsonSchemaMode`][pydantic.json_schema.JsonSchemaMode] is a type alias that represents the available options for the `mode` parameter: * `'validation'` * `'serialization'` Here's an example of how to specify the `mode` parameter, and how it affects the generated JSON schema: ```python from decimal import Decimal from pydantic import BaseModel class Model(BaseModel): a: Decimal = Decimal('12.34') print(Model.model_json_schema(mode='validation')) """ { 'properties': { 'a': { 'anyOf': [{'type': 'number'}, {'type': 'string'}], 'default': '12.34', 'title': 'A', } }, 'title': 'Model', 'type': 'object', } """ print(Model.model_json_schema(mode='serialization')) """ { 'properties': {'a': {'default': '12.34', 'title': 'A', 'type': 'string'}}, 'title': 'Model', 'type': 'object', } """ ``` ## Customizing JSON Schema The generated JSON schema can be customized at both the field level and model level via: 1. [Field-level customization](#field-level-customization) with the [`Field`][pydantic.fields.Field] constructor 2. [Model-level customization](#model-level-customization) with [`model_config`][pydantic.config.ConfigDict] At both the field and model levels, you can use the `json_schema_extra` option to add extra information to the JSON schema. The [Using `json_schema_extra`](#using-json_schema_extra) section below provides more details on this option. For custom types, Pydantic offers other tools for customizing JSON schema generation: 1. [`WithJsonSchema` annotation](#withjsonschema-annotation) 2. [`SkipJsonSchema` annotation](#skipjsonschema-annotation) 3. [Implementing `__get_pydantic_core_schema__`](#implementing_get_pydantic_core_schema) 4. [Implementing `__get_pydantic_json_schema__`](#implementing_get_pydantic_json_schema) ### Field-Level Customization Optionally, the [`Field`][pydantic.fields.Field] function can be used to provide extra information about the field and validations. 
Some field parameters are used exclusively to customize the generated JSON Schema: * `title`: The title of the field. * `description`: The description of the field. * `examples`: The examples of the field. * `json_schema_extra`: Extra JSON Schema properties to be added to the field. * `field_title_generator`: A function that programmatically sets the field's title, based on its name and info. Here's an example: ```python {output="json"} import json from pydantic import BaseModel, EmailStr, Field, SecretStr class User(BaseModel): age: int = Field(description='Age of the user') email: EmailStr = Field(examples=['marcelo@mail.com']) name: str = Field(title='Username') password: SecretStr = Field( json_schema_extra={ 'title': 'Password', 'description': 'Password of the user', 'examples': ['123456'], } ) print(json.dumps(User.model_json_schema(), indent=2)) """ { "properties": { "age": { "description": "Age of the user", "title": "Age", "type": "integer" }, "email": { "examples": [ "marcelo@mail.com" ], "format": "email", "title": "Email", "type": "string" }, "name": { "title": "Username", "type": "string" }, "password": { "description": "Password of the user", "examples": [ "123456" ], "format": "password", "title": "Password", "type": "string", "writeOnly": true } }, "required": [ "age", "email", "name", "password" ], "title": "User", "type": "object" } """ ``` #### Unenforced `Field` constraints If Pydantic finds constraints which are not being enforced, an error will be raised. 
If you want to force the constraint to appear in the schema, even though it's not being checked upon parsing, you can use variadic arguments to [`Field`][pydantic.fields.Field] with the raw schema attribute name: ```python from pydantic import BaseModel, Field, PositiveInt try: # this won't work since `PositiveInt` takes precedence over the # constraints defined in `Field`, meaning they're ignored class Model(BaseModel): foo: PositiveInt = Field(lt=10) except ValueError as e: print(e) # if you find yourself needing this, an alternative is to declare # the constraints in `Field` (or you could use `conint()`) # here both constraints will be enforced: class ModelB(BaseModel): # Here both constraints will be applied and the schema # will be generated correctly foo: int = Field(gt=0, lt=10) print(ModelB.model_json_schema()) """ { 'properties': { 'foo': { 'exclusiveMaximum': 10, 'exclusiveMinimum': 0, 'title': 'Foo', 'type': 'integer', } }, 'required': ['foo'], 'title': 'ModelB', 'type': 'object', } """ ``` You can specify JSON schema modifications via the [`Field`][pydantic.fields.Field] constructor via [`typing.Annotated`][] as well: ```python {output="json"} import json from uuid import uuid4 from typing_extensions import Annotated from pydantic import BaseModel, Field class Foo(BaseModel): id: Annotated[str, Field(default_factory=lambda: uuid4().hex)] name: Annotated[str, Field(max_length=256)] = Field( 'Bar', title='CustomName' ) print(json.dumps(Foo.model_json_schema(), indent=2)) """ { "properties": { "id": { "title": "Id", "type": "string" }, "name": { "default": "Bar", "maxLength": 256, "title": "CustomName", "type": "string" } }, "title": "Foo", "type": "object" } """ ``` ### Programmatic field title generation The `field_title_generator` parameter can be used to programmatically generate the title for a field based on its name and info. 
See the following example: ```python import json from pydantic import BaseModel, Field from pydantic.fields import FieldInfo def make_title(field_name: str, field_info: FieldInfo) -> str: return field_name.upper() class Person(BaseModel): name: str = Field(field_title_generator=make_title) age: int = Field(field_title_generator=make_title) print(json.dumps(Person.model_json_schema(), indent=2)) """ { "properties": { "name": { "title": "NAME", "type": "string" }, "age": { "title": "AGE", "type": "integer" } }, "required": [ "name", "age" ], "title": "Person", "type": "object" } """ ``` ### Model-Level Customization You can also use [model config][pydantic.config.ConfigDict] to customize JSON schema generation on a model. Specifically, the following config options are relevant: * [`title`][pydantic.config.ConfigDict.title] * [`json_schema_extra`][pydantic.config.ConfigDict.json_schema_extra] * [`json_schema_mode_override`][pydantic.config.ConfigDict.json_schema_mode_override] * [`field_title_generator`][pydantic.config.ConfigDict.field_title_generator] * [`model_title_generator`][pydantic.config.ConfigDict.model_title_generator] ### Using `json_schema_extra` The `json_schema_extra` option can be used to add extra information to the JSON schema, either at the [Field level](#field-level-customization) or at the [Model level](#model-level-customization). You can pass a `dict` or a `Callable` to `json_schema_extra`. 
#### Using `json_schema_extra` with a `dict` You can pass a `dict` to `json_schema_extra` to add extra information to the JSON schema: ```python {output="json"} import json from pydantic import BaseModel, ConfigDict class Model(BaseModel): a: str model_config = ConfigDict(json_schema_extra={'examples': [{'a': 'Foo'}]}) print(json.dumps(Model.model_json_schema(), indent=2)) """ { "examples": [ { "a": "Foo" } ], "properties": { "a": { "title": "A", "type": "string" } }, "required": [ "a" ], "title": "Model", "type": "object" } """ ``` #### Using `json_schema_extra` with a `Callable` You can pass a `Callable` to `json_schema_extra` to modify the JSON schema with a function: ```python {output="json"} import json from pydantic import BaseModel, Field def pop_default(s): s.pop('default') class Model(BaseModel): a: int = Field(default=1, json_schema_extra=pop_default) print(json.dumps(Model.model_json_schema(), indent=2)) """ { "properties": { "a": { "title": "A", "type": "integer" } }, "title": "Model", "type": "object" } """ ``` #### Merging `json_schema_extra` Starting in v2.9, Pydantic merges `json_schema_extra` dictionaries from annotated types. This pattern offers a more additive approach to merging rather than the previous override behavior. This can be quite helpful for cases of reusing json schema extra information across multiple types. We viewed this change largely as a bug fix, as it resolves unintentional differences in the `json_schema_extra` merging behavior between `BaseModel` and `TypeAdapter` instances - see [this issue](https://github.com/pydantic/pydantic/issues/9210) for more details. 
```python import json from typing_extensions import Annotated, TypeAlias from pydantic import Field, TypeAdapter ExternalType: TypeAlias = Annotated[ int, Field(json_schema_extra={'key1': 'value1'}) ] ta = TypeAdapter( Annotated[ExternalType, Field(json_schema_extra={'key2': 'value2'})] ) print(json.dumps(ta.json_schema(), indent=2)) """ { "key1": "value1", "key2": "value2", "type": "integer" } """ ``` !!! note We no longer (and never fully did) support composing a mix of `dict` and `callable` type `json_schema_extra` specifications. If this is a requirement for your use case, please [open a pydantic issue](https://github.com/pydantic/pydantic/issues/new/choose) and explain your situation - we'd be happy to reconsider this decision when presented with a compelling case. ### `WithJsonSchema` annotation ??? api "API Documentation" [`pydantic.json_schema.WithJsonSchema`][pydantic.json_schema.WithJsonSchema]
!!! tip Using [`WithJsonSchema`][pydantic.json_schema.WithJsonSchema] is preferred over [implementing `__get_pydantic_json_schema__`](#implementing_get_pydantic_json_schema) for custom types, as it's more simple and less error-prone. The [`WithJsonSchema`][pydantic.json_schema.WithJsonSchema] annotation can be used to override the generated (base) JSON schema for a given type without the need to implement `__get_pydantic_core_schema__` or `__get_pydantic_json_schema__` on the type itself. Note that this overrides the whole JSON Schema generation process for the field (in the following example, the `'type'` also needs to be provided). ```python {output="json"} import json from typing_extensions import Annotated from pydantic import BaseModel, WithJsonSchema MyInt = Annotated[ int, WithJsonSchema({'type': 'integer', 'examples': [1, 0, -1]}), ] class Model(BaseModel): a: MyInt print(json.dumps(Model.model_json_schema(), indent=2)) """ { "properties": { "a": { "examples": [ 1, 0, -1 ], "title": "A", "type": "integer" } }, "required": [ "a" ], "title": "Model", "type": "object" } """ ``` !!! note You might be tempted to use the [`WithJsonSchema`][pydantic.json_schema.WithJsonSchema] annotation to fine-tune the JSON Schema of fields having [validators](./validators.md) attached. Instead, it is recommended to use [the `json_schema_input_type` argument](./validators.md#json-schema-and-field-validators). ### `SkipJsonSchema` annotation ??? api "API Documentation" [`pydantic.json_schema.SkipJsonSchema`][pydantic.json_schema.SkipJsonSchema]
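As a minimal sketch (the field names here are illustrative), annotating a field with [`SkipJsonSchema`][pydantic.json_schema.SkipJsonSchema] drops it from the generated schema entirely:

```python
from pydantic import BaseModel
from pydantic.json_schema import SkipJsonSchema


class Model(BaseModel):
    a: int
    b: SkipJsonSchema[int] = 0  # this field is omitted from the JSON schema


print(Model.model_json_schema()['properties'])
```

Note that `b` does not appear in the printed `properties`, even though it still participates in validation and serialization.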
The [`SkipJsonSchema`][pydantic.json_schema.SkipJsonSchema] annotation can be used to exclude a field (or part of a field's specifications) from the generated JSON schema. See the API docs for more details. ### Implementing `__get_pydantic_core_schema__` Custom types (used as `field_name: TheType` or `field_name: Annotated[TheType, ...]`) as well as `Annotated` metadata (used as `field_name: Annotated[int, SomeMetadata]`) can modify or override the generated schema by implementing `__get_pydantic_core_schema__`. This method receives two positional arguments: 1. The type annotation that corresponds to this type (so in the case of `TheType[T][int]` it would be `TheType[int]`). 2. A handler/callback to call the next implementer of `__get_pydantic_core_schema__`. The handler system works just like [*wrap* field validators](validators.md#field-wrap-validator). In this case the input is the type and the output is a `core_schema`. Here is an example of a custom type that *overrides* the generated `core_schema`: ```python from dataclasses import dataclass from typing import Any, Dict, List, Type from pydantic_core import core_schema from pydantic import BaseModel, GetCoreSchemaHandler @dataclass class CompressedString: dictionary: Dict[int, str] text: List[int] def build(self) -> str: return ' '.join([self.dictionary[key] for key in self.text]) @classmethod def __get_pydantic_core_schema__( cls, source: Type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: assert source is CompressedString return core_schema.no_info_after_validator_function( cls._validate, core_schema.str_schema(), serialization=core_schema.plain_serializer_function_ser_schema( cls._serialize, info_arg=False, return_schema=core_schema.str_schema(), ), ) @staticmethod def _validate(value: str) -> 'CompressedString': inverse_dictionary: Dict[str, int] = {} text: List[int] = [] for word in value.split(' '): if word not in inverse_dictionary: inverse_dictionary[word] =
len(inverse_dictionary) text.append(inverse_dictionary[word]) return CompressedString( {v: k for k, v in inverse_dictionary.items()}, text ) @staticmethod def _serialize(value: 'CompressedString') -> str: return value.build() class MyModel(BaseModel): value: CompressedString print(MyModel.model_json_schema()) """ { 'properties': {'value': {'title': 'Value', 'type': 'string'}}, 'required': ['value'], 'title': 'MyModel', 'type': 'object', } """ print(MyModel(value='fox fox fox dog fox')) """ value = CompressedString(dictionary={0: 'fox', 1: 'dog'}, text=[0, 0, 0, 1, 0]) """ print(MyModel(value='fox fox fox dog fox').model_dump(mode='json')) #> {'value': 'fox fox fox dog fox'} ``` Since Pydantic would not know how to generate a schema for `CompressedString`, if you call `handler(source)` in its `__get_pydantic_core_schema__` method you would get a `pydantic.errors.PydanticSchemaGenerationError` error. This will be the case for most custom types, so you almost never want to call into `handler` for custom types. The process for `Annotated` metadata is much the same except that you can generally call into `handler` to have Pydantic handle generating the schema. 
```python from dataclasses import dataclass from typing import Any, Sequence, Type from pydantic_core import core_schema from typing_extensions import Annotated from pydantic import BaseModel, GetCoreSchemaHandler, ValidationError @dataclass class RestrictCharacters: alphabet: Sequence[str] def __get_pydantic_core_schema__( self, source: Type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: if not self.alphabet: raise ValueError('Alphabet may not be empty') schema = handler( source ) # get the CoreSchema from the type / inner constraints if schema['type'] != 'str': raise TypeError('RestrictCharacters can only be applied to strings') return core_schema.no_info_after_validator_function( self.validate, schema, ) def validate(self, value: str) -> str: if any(c not in self.alphabet for c in value): raise ValueError( f'{value!r} is not restricted to {self.alphabet!r}' ) return value class MyModel(BaseModel): value: Annotated[str, RestrictCharacters('ABC')] print(MyModel.model_json_schema()) """ { 'properties': {'value': {'title': 'Value', 'type': 'string'}}, 'required': ['value'], 'title': 'MyModel', 'type': 'object', } """ print(MyModel(value='CBA')) #> value='CBA' try: MyModel(value='XYZ') except ValidationError as e: print(e) """ 1 validation error for MyModel value Value error, 'XYZ' is not restricted to 'ABC' [type=value_error, input_value='XYZ', input_type=str] """ ``` So far we have been wrapping the schema, but if you just want to *modify* it or *ignore* it you can as well. 
To modify the schema, first call the handler, then mutate the result: ```python from typing import Any, Type from pydantic_core import ValidationError, core_schema from typing_extensions import Annotated from pydantic import BaseModel, GetCoreSchemaHandler class SmallString: def __get_pydantic_core_schema__( self, source: Type[Any], handler: GetCoreSchemaHandler, ) -> core_schema.CoreSchema: schema = handler(source) assert schema['type'] == 'str' schema['max_length'] = 10 # modify in place return schema class MyModel(BaseModel): value: Annotated[str, SmallString()] try: MyModel(value='too long!!!!!') except ValidationError as e: print(e) """ 1 validation error for MyModel value String should have at most 10 characters [type=string_too_long, input_value='too long!!!!!', input_type=str] """ ``` !!! tip Note that you *must* return a schema, even if you are just mutating it in place. To override the schema completely, do not call the handler and return your own `CoreSchema`: ```python from typing import Any, Type from pydantic_core import ValidationError, core_schema from typing_extensions import Annotated from pydantic import BaseModel, GetCoreSchemaHandler class AllowAnySubclass: def __get_pydantic_core_schema__( self, source: Type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: # we can't call handler since it will fail for arbitrary types def validate(value: Any) -> Any: if not isinstance(value, source): raise ValueError( f'Expected an instance of {source}, got an instance of {type(value)}' ) return value return core_schema.no_info_plain_validator_function(validate) class Foo: pass class Model(BaseModel): f: Annotated[Foo, AllowAnySubclass()] print(Model(f=Foo())) #> f=<__main__.Foo object at 0x0123456789ab> class NotFoo: pass try: Model(f=NotFoo()) except ValidationError as e: print(e) """ 1 validation error for Model f Value error, Expected an instance of <class '__main__.Foo'>, got an instance of <class '__main__.NotFoo'> [type=value_error, input_value=<__main__.NotFoo object at 0x0123456789ab>, input_type=NotFoo] """ ``` As seen above,
annotating a field with a `BaseModel` type can be used to modify or override the generated json schema. However, if you want to take advantage of storing metadata via `Annotated`, but you don't want to override the generated JSON schema, you can use the following approach with a no-op version of `__get_pydantic_core_schema__` implemented on the metadata class: ```python from typing import Type from pydantic_core import CoreSchema from typing_extensions import Annotated from pydantic import BaseModel, GetCoreSchemaHandler class Metadata(BaseModel): foo: str = 'metadata!' bar: int = 100 @classmethod def __get_pydantic_core_schema__( cls, source_type: Type[BaseModel], handler: GetCoreSchemaHandler ) -> CoreSchema: if cls is not source_type: return handler(source_type) return super().__get_pydantic_core_schema__(source_type, handler) class Model(BaseModel): state: Annotated[int, Metadata()] m = Model.model_validate({'state': 2}) print(repr(m)) #> Model(state=2) print(m.model_fields) """ { 'state': FieldInfo( annotation=int, required=True, metadata=[Metadata(foo='metadata!', bar=100)], ) } """ ``` ### Implementing `__get_pydantic_json_schema__` You can also implement `__get_pydantic_json_schema__` to modify or override the generated json schema. Modifying this method only affects the JSON schema - it doesn't affect the core schema, which is used for validation and serialization. 
Here's an example of modifying the generated JSON schema: ```python {output="json"} import json from typing import Any from pydantic_core import core_schema as cs from pydantic import GetCoreSchemaHandler, GetJsonSchemaHandler, TypeAdapter from pydantic.json_schema import JsonSchemaValue class Person: name: str age: int def __init__(self, name: str, age: int): self.name = name self.age = age @classmethod def __get_pydantic_core_schema__( cls, source_type: Any, handler: GetCoreSchemaHandler ) -> cs.CoreSchema: return cs.typed_dict_schema( { 'name': cs.typed_dict_field(cs.str_schema()), 'age': cs.typed_dict_field(cs.int_schema()), }, ) @classmethod def __get_pydantic_json_schema__( cls, core_schema: cs.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: json_schema = handler(core_schema) json_schema = handler.resolve_ref_schema(json_schema) json_schema['examples'] = [ { 'name': 'John Doe', 'age': 25, } ] json_schema['title'] = 'Person' return json_schema print(json.dumps(TypeAdapter(Person).json_schema(), indent=2)) """ { "examples": [ { "age": 25, "name": "John Doe" } ], "properties": { "name": { "title": "Name", "type": "string" }, "age": { "title": "Age", "type": "integer" } }, "required": [ "name", "age" ], "title": "Person", "type": "object" } """ ``` ### Using `field_title_generator` The `field_title_generator` parameter can be used to programmatically generate the title for a field based on its name and info. This is similar to the field level `field_title_generator`, but the `ConfigDict` option will be applied to all fields of the class. 
See the following example: ```python import json from pydantic import BaseModel, ConfigDict class Person(BaseModel): model_config = ConfigDict( field_title_generator=lambda field_name, field_info: field_name.upper() ) name: str age: int print(json.dumps(Person.model_json_schema(), indent=2)) """ { "properties": { "name": { "title": "NAME", "type": "string" }, "age": { "title": "AGE", "type": "integer" } }, "required": [ "name", "age" ], "title": "Person", "type": "object" } """ ``` ### Using `model_title_generator` The `model_title_generator` config option is similar to the `field_title_generator` option, but it applies to the title of the model itself, and accepts the model class as input. See the following example: ```python import json from typing import Type from pydantic import BaseModel, ConfigDict def make_title(model: Type) -> str: return f'Title-{model.__name__}' class Person(BaseModel): model_config = ConfigDict(model_title_generator=make_title) name: str age: int print(json.dumps(Person.model_json_schema(), indent=2)) """ { "properties": { "name": { "title": "Name", "type": "string" }, "age": { "title": "Age", "type": "integer" } }, "required": [ "name", "age" ], "title": "Title-Person", "type": "object" } """ ``` ## JSON schema types Types, custom field types, and constraints (like `max_length`) are mapped to the corresponding spec formats in the following priority order (when there is an equivalent available): 1. [JSON Schema Core](https://json-schema.org/draft/2020-12/json-schema-core) 2. [JSON Schema Validation](https://json-schema.org/draft/2020-12/json-schema-validation) 3. [OpenAPI Data Types](https://github.com/OAI/OpenAPI-Specification/blob/master/versions/3.0.2.md#data-types) 4. The standard `format` JSON field is used to define Pydantic extensions for more complex `string` sub-types. 
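For instance (a small illustration using our own field names), a `datetime` field is mapped to a JSON string using the standard `format` keyword from the JSON Schema Validation spec:

```python
from datetime import datetime

from pydantic import BaseModel


class Event(BaseModel):
    when: datetime


print(Event.model_json_schema()['properties']['when'])
#> {'format': 'date-time', 'title': 'When', 'type': 'string'}
```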
The field schema mapping from Python or Pydantic to JSON schema is done as follows: {{ schema_mappings_table }} ## Top-level schema generation You can also generate a top-level JSON schema that only includes a list of models and related sub-models in its `$defs`: ```python {output="json"} import json from pydantic import BaseModel from pydantic.json_schema import models_json_schema class Foo(BaseModel): a: str = None class Model(BaseModel): b: Foo class Bar(BaseModel): c: int _, top_level_schema = models_json_schema( [(Model, 'validation'), (Bar, 'validation')], title='My Schema' ) print(json.dumps(top_level_schema, indent=2)) """ { "$defs": { "Bar": { "properties": { "c": { "title": "C", "type": "integer" } }, "required": [ "c" ], "title": "Bar", "type": "object" }, "Foo": { "properties": { "a": { "default": null, "title": "A", "type": "string" } }, "title": "Foo", "type": "object" }, "Model": { "properties": { "b": { "$ref": "#/$defs/Foo" } }, "required": [ "b" ], "title": "Model", "type": "object" } }, "title": "My Schema" } """ ``` ## Customizing the JSON Schema Generation Process ??? api "API Documentation" [`pydantic.json_schema`][pydantic.json_schema.GenerateJsonSchema]
If you need custom schema generation, you can use a `schema_generator`, modifying the [`GenerateJsonSchema`][pydantic.json_schema.GenerateJsonSchema] class as necessary for your application. The various methods that can be used to produce JSON schema accept a keyword argument `schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema`, and you can pass your custom subclass to these methods in order to use your own approach to generating JSON schema. `GenerateJsonSchema` implements the translation of a type's `pydantic-core` schema into a JSON schema. By design, this class breaks the JSON schema generation process into smaller methods that can be easily overridden in subclasses to modify the "global" approach to generating JSON schema. ```python from pydantic import BaseModel from pydantic.json_schema import GenerateJsonSchema class MyGenerateJsonSchema(GenerateJsonSchema): def generate(self, schema, mode='validation'): json_schema = super().generate(schema, mode=mode) json_schema['title'] = 'Customize title' json_schema['$schema'] = self.schema_dialect return json_schema class MyModel(BaseModel): x: int print(MyModel.model_json_schema(schema_generator=MyGenerateJsonSchema)) """ { 'properties': {'x': {'title': 'X', 'type': 'integer'}}, 'required': ['x'], 'title': 'Customize title', 'type': 'object', '$schema': 'https://json-schema.org/draft/2020-12/schema', } """ ``` Below is an approach you can use to exclude any fields from the schema that don't have valid json schemas: ```python from typing import Callable from pydantic_core import PydanticOmit, core_schema from pydantic import BaseModel from pydantic.json_schema import GenerateJsonSchema, JsonSchemaValue class MyGenerateJsonSchema(GenerateJsonSchema): def handle_invalid_for_json_schema( self, schema: core_schema.CoreSchema, error_info: str ) -> JsonSchemaValue: raise PydanticOmit def example_callable(): return 1 class Example(BaseModel): name: str = 'example' function: Callable = example_callable 
instance_example = Example() validation_schema = instance_example.model_json_schema( schema_generator=MyGenerateJsonSchema, mode='validation' ) print(validation_schema) """ { 'properties': { 'name': {'default': 'example', 'title': 'Name', 'type': 'string'} }, 'title': 'Example', 'type': 'object', } """ ``` ### JSON schema sorting By default, Pydantic recursively sorts JSON schemas by alphabetically sorting keys. Notably, Pydantic skips sorting the values of the `properties` key, to preserve the order of the fields as they were defined in the model. If you would like to customize this behavior, you can override the `sort` method in your custom `GenerateJsonSchema` subclass. The below example uses a no-op `sort` method to disable sorting entirely, which is reflected in the preserved order of the model fields and `json_schema_extra` keys: ```python import json from typing import Optional from pydantic import BaseModel, Field from pydantic.json_schema import GenerateJsonSchema, JsonSchemaValue class MyGenerateJsonSchema(GenerateJsonSchema): def sort( self, value: JsonSchemaValue, parent_key: Optional[str] = None ) -> JsonSchemaValue: """No-op, we don't want to sort schema values at all.""" return value class Bar(BaseModel): c: str b: str a: str = Field(json_schema_extra={'c': 'hi', 'b': 'hello', 'a': 'world'}) json_schema = Bar.model_json_schema(schema_generator=MyGenerateJsonSchema) print(json.dumps(json_schema, indent=2)) """ { "type": "object", "properties": { "c": { "type": "string", "title": "C" }, "b": { "type": "string", "title": "B" }, "a": { "type": "string", "c": "hi", "b": "hello", "a": "world", "title": "A" } }, "required": [ "c", "b", "a" ], "title": "Bar" } """ ``` ## Customizing the `$ref`s in JSON Schema The format of `$ref`s can be altered by calling [`model_json_schema()`][pydantic.main.BaseModel.model_json_schema] or [`model_dump_json()`][pydantic.main.BaseModel.model_dump_json] with the `ref_template` keyword argument. 
The definitions are always stored under the key `$defs`, but a specified prefix can be used for the references.

This is useful if you need to extend or modify the JSON schema default definitions location. For example, with OpenAPI:

```python {output="json"}
import json

from pydantic import BaseModel
from pydantic.type_adapter import TypeAdapter


class Foo(BaseModel):
    a: int


class Model(BaseModel):
    a: Foo


adapter = TypeAdapter(Model)

print(
    json.dumps(
        adapter.json_schema(ref_template='#/components/schemas/{model}'),
        indent=2,
    )
)
"""
{
  "$defs": {
    "Foo": {
      "properties": {
        "a": {
          "title": "A",
          "type": "integer"
        }
      },
      "required": [
        "a"
      ],
      "title": "Foo",
      "type": "object"
    }
  },
  "properties": {
    "a": {
      "$ref": "#/components/schemas/Foo"
    }
  },
  "required": [
    "a"
  ],
  "title": "Model",
  "type": "object"
}
"""
```

## Miscellaneous Notes on JSON Schema Generation

* The JSON schema for `Optional` fields indicates that the value `null` is allowed.
* The `Decimal` type is exposed in JSON schema (and serialized) as a string.
* Since the `namedtuple` type doesn't exist in JSON, a model's JSON schema does not preserve `namedtuple`s as `namedtuple`s.
* Sub-models used are added to the `$defs` JSON attribute and referenced, as per the spec.
* Sub-models with modifications (via the `Field` class) like a custom title, description, or default value, are recursively included instead of referenced.
* The `description` for models is taken from either the docstring of the class or the argument `description` to the `Field` class.
* The schema is generated by default using aliases as keys, but it can be generated using model property names instead by calling [`model_json_schema()`][pydantic.main.BaseModel.model_json_schema] or [`model_dump_json()`][pydantic.main.BaseModel.model_dump_json] with the `by_alias=False` keyword argument.

??? api "API Documentation"
    [`pydantic.main.BaseModel`][pydantic.main.BaseModel]
One of the primary ways of defining schema in Pydantic is via models. Models are simply classes which inherit from [`BaseModel`][pydantic.main.BaseModel] and define fields as annotated attributes. You can think of models as similar to structs in languages like C, or as the requirements of a single endpoint in an API. Models share many similarities with Python's [dataclasses][dataclasses], but have been designed with some subtle-yet-important differences that streamline certain workflows related to validation, serialization, and JSON schema generation. You can find more discussion of this in the [Dataclasses](dataclasses.md) section of the docs. Untrusted data can be passed to a model and, after parsing and validation, Pydantic guarantees that the fields of the resultant model instance will conform to the field types defined on the model. !!! note "Validation — a _deliberate_ misnomer" ### TL;DR We use the term "validation" to refer to the process of instantiating a model (or other type) that adheres to specified types and constraints. This task, which Pydantic is well known for, is most widely recognized as "validation" in colloquial terms, even though in other contexts the term "validation" may be more restrictive. --- ### The long version The potential confusion around the term "validation" arises from the fact that, strictly speaking, Pydantic's primary focus doesn't align precisely with the dictionary definition of "validation": > ### validation > _noun_ > the action of checking or proving the validity or accuracy of something. In Pydantic, the term "validation" refers to the process of instantiating a model (or other type) that adheres to specified types and constraints. Pydantic guarantees the types and constraints of the output, not the input data. This distinction becomes apparent when considering that Pydantic's `ValidationError` is raised when data cannot be successfully parsed into a model instance. 
While this distinction may initially seem subtle, it holds practical significance. In some cases, "validation" goes beyond just model creation, and can include the copying and coercion of data. This can involve copying arguments passed to the constructor in order to perform coercion to a new type without mutating the original input data. For a more in-depth understanding of the implications for your usage, refer to the [Data Conversion](#data-conversion) and [Attribute Copies](#attribute-copies) sections below. In essence, Pydantic's primary goal is to assure that the resulting structure post-processing (termed "validation") precisely conforms to the applied type hints. Given the widespread adoption of "validation" as the colloquial term for this process, we will consistently use it in our documentation. While the terms "parse" and "validation" were previously used interchangeably, moving forward, we aim to exclusively employ "validate", with "parse" reserved specifically for discussions related to [JSON parsing](../concepts/json.md). ## Basic model usage !!! note Pydantic relies heavily on the existing Python typing constructs to define models. If you are not familiar with those, the following resources can be useful: - The [Type System Guides](https://typing.readthedocs.io/en/latest/guides/index.html) - The [mypy documentation](https://mypy.readthedocs.io/en/latest/) ```python {group="basic-model"} from pydantic import BaseModel class User(BaseModel): id: int name: str = 'Jane Doe' ``` In this example, `User` is a model with two fields: * `id`, which is an integer and is required * `name`, which is a string and is not required (it has a default value). The model can then be instantiated: ```python {group="basic-model"} user = User(id='123') ``` `user` is an instance of `User`. Initialization of the object will perform all parsing and validation. 
If no [`ValidationError`][pydantic_core.ValidationError] exception is raised, you know the resulting model instance is valid. Fields of a model can be accessed as normal attributes of the `user` object: ```python {group="basic-model"} assert user.name == 'Jane Doe' # (1)! assert user.id == 123 # (2)! assert isinstance(user.id, int) ``` 1. `name` wasn't set when `user` was initialized, so the default value was used. The [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute can be inspected to check the field names explicitly set during instantiation. 2. Note that the string `'123'` was coerced to an integer and its value is `123`. More details on Pydantic's coercion logic can be found in the [Data Conversion](#data-conversion) section. The model instance can be serialized using the [`model_dump`][pydantic.BaseModel.model_dump] method: ```python {group="basic-model"} assert user.model_dump() == {'id': 123, 'name': 'Jane Doe'} ``` Calling [dict][] on the instance will also provide a dictionary, but nested fields will not be recursively converted into dictionaries. [`model_dump`][pydantic.BaseModel.model_dump] also provides numerous arguments to customize the serialization result. By default, models are mutable and field values can be changed through attribute assignment: ```python {group="basic-model"} user.id = 321 assert user.id == 321 ``` !!! warning When defining your models, watch out for naming collisions between your field name and its type annotation. For example, the following will not behave as expected and would yield a validation error: ```python {test="skip"} from typing import Optional from pydantic import BaseModel class Boo(BaseModel): int: Optional[int] = None m = Boo(int=123) # Will fail to validate. ``` Because of how Python evaluates [annotated assignment statements][annassign], the statement is equivalent to `int: None = None`, thus leading to a validation error. 
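To make the difference concrete, the following sketch (using a hypothetical `Address` sub-model) contrasts `dict()` with [`model_dump`][pydantic.BaseModel.model_dump] on a model with a nested field:

```python
from pydantic import BaseModel


class Address(BaseModel):
    city: str


class User(BaseModel):
    id: int
    address: Address


user = User(id=1, address={'city': 'Paris'})

# dict() yields the raw field values, leaving the nested model as-is:
print(dict(user))
#> {'id': 1, 'address': Address(city='Paris')}

# model_dump() recursively converts nested models into dictionaries:
print(user.model_dump())
#> {'id': 1, 'address': {'city': 'Paris'}}
```

Note that `dict()` leaves the nested `Address` instance untouched, while `model_dump()` recurses into it.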
### Model methods and properties The example above only shows the tip of the iceberg of what models can do. Models possess the following methods and attributes: * [`model_validate()`][pydantic.main.BaseModel.model_validate]: Validates the given object against the Pydantic model. See [Validating data](#validating-data). * [`model_validate_json()`][pydantic.main.BaseModel.model_validate_json]: Validates the given JSON data against the Pydantic model. See [Validating data](#validating-data). * [`model_construct()`][pydantic.main.BaseModel.model_construct]: Creates models without running validation. See [Creating models without validation](#creating-models-without-validation). * [`model_dump()`][pydantic.main.BaseModel.model_dump]: Returns a dictionary of the model's fields and values. See [Serialization](serialization.md#model_dump). * [`model_dump_json()`][pydantic.main.BaseModel.model_dump_json]: Returns a JSON string representation of [`model_dump()`][pydantic.main.BaseModel.model_dump]. See [Serialization](serialization.md#model_dump_json). * [`model_copy()`][pydantic.main.BaseModel.model_copy]: Returns a copy (by default, shallow copy) of the model. See [Serialization](serialization.md#model_copy). * [`model_json_schema()`][pydantic.main.BaseModel.model_json_schema]: Returns a jsonable dictionary representing the model's JSON Schema. See [JSON Schema](json_schema.md). * [`model_fields`][pydantic.main.BaseModel.model_fields]: A mapping between field names and their definitions ([`FieldInfo`][pydantic.fields.FieldInfo] instances). * [`model_computed_fields`][pydantic.main.BaseModel.model_computed_fields]: A mapping between computed field names and their definitions ([`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] instances). * [`model_extra`][pydantic.main.BaseModel.model_extra]: The extra fields set during validation. 
* [`model_fields_set`][pydantic.main.BaseModel.model_fields_set]: The set of fields which were explicitly provided when the model was initialized. * [`model_parametrized_name()`][pydantic.main.BaseModel.model_parametrized_name]: Computes the class name for parametrizations of generic classes. * [`model_post_init()`][pydantic.main.BaseModel.model_post_init]: Performs additional actions after the model is instantiated and all field validators are applied. * [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild]: Rebuilds the model schema, which also supports building recursive generic models. See [Rebuilding model schema](#rebuilding-model-schema). !!! note See the API documentation of [`BaseModel`][pydantic.main.BaseModel] for the class definition including a full list of methods and attributes. !!! tip See [Changes to `pydantic.BaseModel`](../migration.md#changes-to-pydanticbasemodel) in the [Migration Guide](../migration.md) for details on changes from Pydantic V1. ## Nested models More complex hierarchical data structures can be defined using models themselves as types in annotations. ```python from typing import List, Optional from pydantic import BaseModel class Foo(BaseModel): count: int size: Optional[float] = None class Bar(BaseModel): apple: str = 'x' banana: str = 'y' class Spam(BaseModel): foo: Foo bars: List[Bar] m = Spam(foo={'count': 4}, bars=[{'apple': 'x1'}, {'apple': 'x2'}]) print(m) """ foo=Foo(count=4, size=None) bars=[Bar(apple='x1', banana='y'), Bar(apple='x2', banana='y')] """ print(m.model_dump()) """ { 'foo': {'count': 4, 'size': None}, 'bars': [{'apple': 'x1', 'banana': 'y'}, {'apple': 'x2', 'banana': 'y'}], } """ ``` Self-referencing models are supported. For more details, see the documentation related to [forward annotations](forward_annotations.md#self-referencing-or-recursive-models). 
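As a quick illustration of such a self-referencing model, here is a minimal sketch (the `TreeNode` name and shape are made up for this example):

```python
from typing import List

from pydantic import BaseModel


class TreeNode(BaseModel):
    value: int
    # 'TreeNode' is a forward reference to the class currently being defined:
    children: List['TreeNode'] = []


tree = TreeNode.model_validate(
    {'value': 1, 'children': [{'value': 2}, {'value': 3, 'children': [{'value': 4}]}]}
)
print(tree.children[1].children[0].value)
#> 4
```

Pydantic resolves the string annotation once the class body has been evaluated, so nested dictionaries validate into `TreeNode` instances at every depth.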
## Rebuilding model schema

When you define a model class in your code, Pydantic will analyze the body of the class to collect a variety of information required to perform validation and serialization, gathered in a core schema. Notably, the model's type annotations are evaluated to understand the valid types for each field (more information can be found in the [Architecture](../internals/architecture.md) documentation). However, it might be the case that annotations refer to symbols not defined when the model class is being created. To circumvent this issue, the [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild] method can be used:

```python
from pydantic import BaseModel, PydanticUserError


class Foo(BaseModel):
    x: 'Bar'  # (1)!


try:
    Foo.model_json_schema()
except PydanticUserError as e:
    print(e)
    """
    `Foo` is not fully defined; you should define `Bar`, then call `Foo.model_rebuild()`.

    For further information visit https://errors.pydantic.dev/2/u/class-not-fully-defined
    """


class Bar(BaseModel):
    pass


Foo.model_rebuild()
print(Foo.model_json_schema())
"""
{
    '$defs': {'Bar': {'properties': {}, 'title': 'Bar', 'type': 'object'}},
    'properties': {'x': {'$ref': '#/$defs/Bar'}},
    'required': ['x'],
    'title': 'Foo',
    'type': 'object',
}
"""
```

1. `Bar` is not yet defined when the `Foo` class is being created. For this reason, a [forward annotation](forward_annotations.md) is being used.

Pydantic tries to automatically determine when this is necessary and raises an error if it wasn't done, but you may want to call [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild] proactively when dealing with recursive models or generics.

In V2, [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild] replaced `update_forward_refs()` from V1. There are some slight differences with the new behavior.
The biggest change is that when calling [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild] on the outermost model, it builds a core schema used for validation of the whole model (nested models and all), so all types at all levels need to be ready before [`model_rebuild()`][pydantic.main.BaseModel.model_rebuild] is called. ## Arbitrary class instances (Formerly known as "ORM Mode"/`from_orm`). Pydantic models can also be created from arbitrary class instances by reading the instance attributes corresponding to the model field names. One common application of this functionality is integration with object-relational mappings (ORMs). To do this, set the [`from_attributes`][pydantic.config.ConfigDict.from_attributes] config value to `True` (see the documentation on [Configuration](./config.md) for more details). The example here uses [SQLAlchemy](https://www.sqlalchemy.org/), but the same approach should work for any ORM. ```python from typing import List from sqlalchemy import ARRAY, String from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column from typing_extensions import Annotated from pydantic import BaseModel, ConfigDict, StringConstraints class Base(DeclarativeBase): pass class CompanyOrm(Base): __tablename__ = 'companies' id: Mapped[int] = mapped_column(primary_key=True, nullable=False) public_key: Mapped[str] = mapped_column( String(20), index=True, nullable=False, unique=True ) domains: Mapped[List[str]] = mapped_column(ARRAY(String(255))) class CompanyModel(BaseModel): model_config = ConfigDict(from_attributes=True) id: int public_key: Annotated[str, StringConstraints(max_length=20)] domains: List[Annotated[str, StringConstraints(max_length=255)]] co_orm = CompanyOrm( id=123, public_key='foobar', domains=['example.com', 'foobar.com'], ) print(co_orm) #> <__main__.CompanyOrm object at 0x0123456789ab> co_model = CompanyModel.model_validate(co_orm) print(co_model) #> id=123 public_key='foobar' domains=['example.com', 'foobar.com'] ``` ### 
Nested attributes When using attributes to parse models, model instances will be created from both top-level attributes and deeper-nested attributes as appropriate. Here is an example demonstrating the principle: ```python from typing import List from pydantic import BaseModel, ConfigDict class PetCls: def __init__(self, *, name: str, species: str): self.name = name self.species = species class PersonCls: def __init__(self, *, name: str, age: float = None, pets: List[PetCls]): self.name = name self.age = age self.pets = pets class Pet(BaseModel): model_config = ConfigDict(from_attributes=True) name: str species: str class Person(BaseModel): model_config = ConfigDict(from_attributes=True) name: str age: float = None pets: List[Pet] bones = PetCls(name='Bones', species='dog') orion = PetCls(name='Orion', species='cat') anna = PersonCls(name='Anna', age=20, pets=[bones, orion]) anna_model = Person.model_validate(anna) print(anna_model) """ name='Anna' age=20.0 pets=[Pet(name='Bones', species='dog'), Pet(name='Orion', species='cat')] """ ``` ## Error handling Pydantic will raise a [`ValidationError`][pydantic_core.ValidationError] exception whenever it finds an error in the data it's validating. A single exception will be raised regardless of the number of errors found, and that validation error will contain information about all of the errors and how they happened. See [Error Handling](../errors/errors.md) for details on standard and custom errors. 
As a demonstration:

```python
from typing import List

from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    list_of_ints: List[int]
    a_float: float


data = dict(
    list_of_ints=['1', 2, 'bad'],
    a_float='not a float',
)

try:
    Model(**data)
except ValidationError as e:
    print(e)
    """
    2 validation errors for Model
    list_of_ints.2
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='bad', input_type=str]
    a_float
      Input should be a valid number, unable to parse string as a number [type=float_parsing, input_value='not a float', input_type=str]
    """
```

## Validating data

Pydantic provides three methods on model classes for parsing data:

* [`model_validate()`][pydantic.main.BaseModel.model_validate]: this is very similar to the `__init__` method of the model, except it takes a dictionary or an object rather than keyword arguments. If the object passed cannot be validated, or if it's not a dictionary or instance of the model in question, a [`ValidationError`][pydantic_core.ValidationError] will be raised.
* [`model_validate_json()`][pydantic.main.BaseModel.model_validate_json]: this validates the provided data as a JSON string or `bytes` object. If your incoming data is a JSON payload, this is generally considered faster (instead of manually parsing the data as a dictionary). Learn more about JSON parsing in the [JSON](../concepts/json.md) section of the docs.
* [`model_validate_strings()`][pydantic.main.BaseModel.model_validate_strings]: this takes a dictionary (can be nested) with string keys and values and validates the data in JSON mode so that said strings can be coerced into the correct types.
```python from datetime import datetime from typing import Optional from pydantic import BaseModel, ValidationError class User(BaseModel): id: int name: str = 'John Doe' signup_ts: Optional[datetime] = None m = User.model_validate({'id': 123, 'name': 'James'}) print(m) #> id=123 name='James' signup_ts=None try: User.model_validate(['not', 'a', 'dict']) except ValidationError as e: print(e) """ 1 validation error for User Input should be a valid dictionary or instance of User [type=model_type, input_value=['not', 'a', 'dict'], input_type=list] """ m = User.model_validate_json('{"id": 123, "name": "James"}') print(m) #> id=123 name='James' signup_ts=None try: m = User.model_validate_json('{"id": 123, "name": 123}') except ValidationError as e: print(e) """ 1 validation error for User name Input should be a valid string [type=string_type, input_value=123, input_type=int] """ try: m = User.model_validate_json('invalid JSON') except ValidationError as e: print(e) """ 1 validation error for User Invalid JSON: expected value at line 1 column 1 [type=json_invalid, input_value='invalid JSON', input_type=str] """ m = User.model_validate_strings({'id': '123', 'name': 'James'}) print(m) #> id=123 name='James' signup_ts=None m = User.model_validate_strings( {'id': '123', 'name': 'James', 'signup_ts': '2024-04-01T12:00:00'} ) print(m) #> id=123 name='James' signup_ts=datetime.datetime(2024, 4, 1, 12, 0) try: m = User.model_validate_strings( {'id': '123', 'name': 'James', 'signup_ts': '2024-04-01'}, strict=True ) except ValidationError as e: print(e) """ 1 validation error for User signup_ts Input should be a valid datetime, invalid datetime separator, expected `T`, `t`, `_` or space [type=datetime_parsing, input_value='2024-04-01', input_type=str] """ ``` If you want to validate serialized data in a format other than JSON, you should load the data into a dictionary yourself and then pass it to [`model_validate`][pydantic.main.BaseModel.model_validate]. !!! 
note
    Depending on the types and model configs involved, [`model_validate`][pydantic.main.BaseModel.model_validate] and [`model_validate_json`][pydantic.main.BaseModel.model_validate_json] may have different validation behavior. If you have data coming from a non-JSON source, but want the same validation behavior and errors you'd get from [`model_validate_json`][pydantic.main.BaseModel.model_validate_json], our recommendation for now is to either use `model_validate_json(json.dumps(data))`, or use [`model_validate_strings`][pydantic.main.BaseModel.model_validate_strings] if the data takes the form of a (potentially nested) dictionary with string keys and values.

!!! note
    If you're passing in an instance of a model to [`model_validate`][pydantic.main.BaseModel.model_validate], you will want to consider setting [`revalidate_instances`][pydantic.ConfigDict.revalidate_instances] in the model's config. If you don't set this value, then validation will be skipped on model instances. See the below example:

    === ":x: `revalidate_instances='never'`"
        ```python
        from pydantic import BaseModel


        class Model(BaseModel):
            a: int


        m = Model(a=0)
        # note: setting `validate_assignment` to `True` in the config can prevent this kind of misbehavior.
        m.a = 'not an int'

        # doesn't raise a validation error even though m is invalid
        m2 = Model.model_validate(m)
        ```

    === ":white_check_mark: `revalidate_instances='always'`"
        ```python
        from pydantic import BaseModel, ConfigDict, ValidationError


        class Model(BaseModel):
            a: int
            model_config = ConfigDict(revalidate_instances='always')


        m = Model(a=0)
        # note: setting `validate_assignment` to `True` in the config can prevent this kind of misbehavior.
m.a = 'not an int' try: m2 = Model.model_validate(m) except ValidationError as e: print(e) """ 1 validation error for Model a Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='not an int', input_type=str] """ ``` ### Creating models without validation Pydantic also provides the [`model_construct()`][pydantic.main.BaseModel.model_construct] method, which allows models to be created **without validation**. This can be useful in at least a few cases: * when working with complex data that is already known to be valid (for performance reasons) * when one or more of the validator functions are non-idempotent * when one or more of the validator functions have side effects that you don't want to be triggered. !!! warning [`model_construct()`][pydantic.main.BaseModel.model_construct] does not do any validation, meaning it can create models which are invalid. **You should only ever use the [`model_construct()`][pydantic.main.BaseModel.model_construct] method with data which has already been validated, or that you definitely trust.** !!! note In Pydantic V2, the performance gap between validation (either with direct instantiation or the `model_validate*` methods) and [`model_construct()`][pydantic.main.BaseModel.model_construct] has been narrowed considerably. For simple models, going with validation may even be faster. If you are using [`model_construct()`][pydantic.main.BaseModel.model_construct] for performance reasons, you may want to profile your use case before assuming it is actually faster. Note that for [root models](#rootmodel-and-custom-root-types), the root value can be passed to [`model_construct()`][pydantic.main.BaseModel.model_construct] positionally, instead of using a keyword argument. Here are some additional notes on the behavior of [`model_construct()`][pydantic.main.BaseModel.model_construct]: * When we say "no validation is performed" — this includes converting dictionaries to model instances. 
So if you have a field referring to a model type, you will need to convert the inner dictionary to a model yourself.
* If you do not pass keyword arguments for fields with defaults, the default values will still be used.
* For models with private attributes, the `__pydantic_private__` dictionary will be populated the same as it would be when creating the model with validation.
* No `__init__` method from the model or any of its parent classes will be called, even when a custom `__init__` method is defined.

!!! note "On [extra fields](#extra-fields) behavior with [`model_construct()`][pydantic.main.BaseModel.model_construct]"
    * For models with [`extra`][pydantic.ConfigDict.extra] set to `'allow'`, data not corresponding to fields will be correctly stored in the `__pydantic_extra__` dictionary and saved to the model's `__dict__` attribute.
    * For models with [`extra`][pydantic.ConfigDict.extra] set to `'ignore'`, data not corresponding to fields will be ignored — that is, not stored in `__pydantic_extra__` or `__dict__` on the instance.
    * Unlike when instantiating the model with validation, a call to [`model_construct()`][pydantic.main.BaseModel.model_construct] with [`extra`][pydantic.ConfigDict.extra] set to `'forbid'` doesn't raise an error in the presence of data not corresponding to fields. Rather, said input data is simply ignored.

## Generic models

Pydantic supports the creation of generic models to make it easier to reuse a common model structure.

In order to declare a generic model, you should follow these steps:

1. Declare one or more [type variables][typing.TypeVar] to use to parameterize your model.
2. Declare a pydantic model that inherits from [`BaseModel`][pydantic.BaseModel] and [`typing.Generic`][] (in this specific order), and add the list of type variables you declared previously as parameters to the [`Generic`][typing.Generic] parent.
3. Use the type variables as annotations where you will want to replace them with other types.

!!!
warning "PEP 695 support" Pydantic does not support the new syntax for generic classes (introduced by [PEP 695](https://peps.python.org/pep-0695/)), available since Python 3.12. Progress can be tracked in [this issue](https://github.com/pydantic/pydantic/issues/9782). Here is an example using a generic Pydantic model to create an easily-reused HTTP response payload wrapper: ```python from typing import Generic, List, Optional, TypeVar from pydantic import BaseModel, ValidationError DataT = TypeVar('DataT') # (1)! class DataModel(BaseModel): numbers: List[int] people: List[str] class Response(BaseModel, Generic[DataT]): # (2)! data: Optional[DataT] = None # (3)! print(Response[int](data=1)) #> data=1 print(Response[str](data='value')) #> data='value' print(Response[str](data='value').model_dump()) #> {'data': 'value'} data = DataModel(numbers=[1, 2, 3], people=[]) print(Response[DataModel](data=data).model_dump()) #> {'data': {'numbers': [1, 2, 3], 'people': []}} try: Response[int](data='value') except ValidationError as e: print(e) """ 1 validation error for Response[int] data Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='value', input_type=str] """ ``` 1. Refers to step 1 described above. 2. Refers to step 2 described above. 3. Refers to step 3 described above. Any [configuration](./config.md), [validation](./validators.md) or [serialization](./serialization.md) logic set on the generic model will also be applied to the parametrized classes, in the same way as when inheriting from a model class. Any custom methods or attributes will also be inherited. Generic models also integrate properly with type checkers, so you get all the type checking you would expect if you were to declare a distinct type for each parametrization. !!! note Internally, Pydantic creates subclasses of the generic model at runtime when the generic model class is parametrized. 
These classes are cached, so there should be minimal overhead introduced by the use of generic models.

To inherit from a generic model and preserve the fact that it is generic, the subclass must also inherit from [`Generic`][typing.Generic]:

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

TypeX = TypeVar('TypeX')


class BaseClass(BaseModel, Generic[TypeX]):
    X: TypeX


class ChildClass(BaseClass[TypeX], Generic[TypeX]):
    pass


# Parametrize `TypeX` with `int`:
print(ChildClass[int](X=1))
#> X=1
```

You can also create a generic subclass of a model that partially or fully replaces the type variables in the superclass:

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

TypeX = TypeVar('TypeX')
TypeY = TypeVar('TypeY')
TypeZ = TypeVar('TypeZ')


class BaseClass(BaseModel, Generic[TypeX, TypeY]):
    x: TypeX
    y: TypeY


class ChildClass(BaseClass[int, TypeY], Generic[TypeY, TypeZ]):
    z: TypeZ


# Parametrize `TypeY` with `str`:
print(ChildClass[str, int](x='1', y='y', z='3'))
#> x=1 y='y' z=3
```

If the name of the concrete subclasses is important, you can also override the default name generation by overriding the [`model_parametrized_name()`][pydantic.main.BaseModel.model_parametrized_name] method:

```python
from typing import Any, Generic, Tuple, Type, TypeVar

from pydantic import BaseModel

DataT = TypeVar('DataT')


class Response(BaseModel, Generic[DataT]):
    data: DataT

    @classmethod
    def model_parametrized_name(cls, params: Tuple[Type[Any], ...]) -> str:
        return f'{params[0].__name__.title()}Response'


print(repr(Response[int](data=1)))
#> IntResponse(data=1)
print(repr(Response[str](data='a')))
#> StrResponse(data='a')
```

You can use parametrized generic models as types in other models:

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

T = TypeVar('T')


class ResponseModel(BaseModel, Generic[T]):
    content: T


class Product(BaseModel):
    name: str
    price: float


class Order(BaseModel):
    id: int
    product: ResponseModel[Product]


product = Product(name='Apple', price=0.5)
response = ResponseModel[Product](content=product)
order = Order(id=1, product=response)

print(repr(order))
"""
Order(id=1, product=ResponseModel[Product](content=Product(name='Apple', price=0.5)))
"""
```

Using the same type variable in nested models allows you to enforce typing relationships at different points in your model:

```python
from typing import Generic, TypeVar

from pydantic import BaseModel, ValidationError

T = TypeVar('T')


class InnerT(BaseModel, Generic[T]):
    inner: T


class OuterT(BaseModel, Generic[T]):
    outer: T
    nested: InnerT[T]


nested = InnerT[int](inner=1)
print(OuterT[int](outer=1, nested=nested))
#> outer=1 nested=InnerT[int](inner=1)

try:
    print(OuterT[int](outer='a', nested=InnerT(inner='a')))  # (1)!
except ValidationError as e:
    print(e)
    """
    2 validation errors for OuterT[int]
    outer
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str]
    nested.inner
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str]
    """
```

1. The `OuterT` model is parametrized with `int`, but the data associated with the `T` annotations during validation is of type `str`, leading to validation errors.

!!! warning
    While it may not raise an error, we strongly advise against using parametrized generics in [`isinstance()`](https://docs.python.org/3/library/functions.html#isinstance) checks.

    For example, you should not do `isinstance(my_model, MyGenericModel[int])`. However, it is fine to do `isinstance(my_model, MyGenericModel)` (note that, for standard generics, it would raise an error to do a subclass check with a parameterized generic class).
If you need to perform [`isinstance()`](https://docs.python.org/3/library/functions.html#isinstance) checks against parametrized generics, you can do this by subclassing the parametrized generic class: ```python {test="skip" lint="skip"} class MyIntModel(MyGenericModel[int]): ... isinstance(my_model, MyIntModel) ``` !!! note "Implementation Details" When using nested generic models, Pydantic sometimes performs revalidation in an attempt to produce the most intuitive validation result. Specifically, if you have a field of type `GenericModel[SomeType]` and you validate data like `GenericModel[SomeCompatibleType]` against this field, we will inspect the data, recognize that the input data is sort of a "loose" subclass of `GenericModel`, and revalidate the contained `SomeCompatibleType` data. This adds some validation overhead, but makes things more intuitive for cases like that shown below. ```python from typing import Any, Generic, TypeVar from pydantic import BaseModel T = TypeVar('T') class GenericModel(BaseModel, Generic[T]): a: T class Model(BaseModel): inner: GenericModel[Any] print(repr(Model.model_validate(Model(inner=GenericModel[int](a=1))))) #> Model(inner=GenericModel[Any](a=1)) ``` Note, validation will still fail if you, for example, are validating against `GenericModel[int]` and pass in an instance `GenericModel[str](a='not an int')`. It's also worth noting that this pattern will re-trigger any custom validation as well, like additional model validators and the like. Validators will be called once on the first pass, validating directly against `GenericModel[Any]`. That validation fails, as `GenericModel[int]` is not a subclass of `GenericModel[Any]`. This relates to the warning above about the complications of using parametrized generics in `isinstance()` and `issubclass()` checks. Then, the validators will be called again on the second pass, during the more lax force-revalidation phase, which succeeds. 
To better understand this consequence, see below: ```python {test="skip"} from typing import Any, Generic, Self, TypeVar from pydantic import BaseModel, model_validator T = TypeVar('T') class GenericModel(BaseModel, Generic[T]): a: T @model_validator(mode='after') def validate_after(self: Self) -> Self: print('after validator running custom validation...') return self class Model(BaseModel): inner: GenericModel[Any] m = Model.model_validate(Model(inner=GenericModel[int](a=1))) #> after validator running custom validation... #> after validator running custom validation... print(repr(m)) #> Model(inner=GenericModel[Any](a=1)) ``` ### Validation of unparametrized type variables When leaving type variables unparametrized, Pydantic treats generic models similarly to how it treats built-in generic types like [`list`][] and [`dict`][]: * If the type variable is [bound](https://typing.readthedocs.io/en/latest/reference/generics.html#type-variables-with-upper-bounds) or [constrained](https://typing.readthedocs.io/en/latest/reference/generics.html#type-variables-with-constraints) to a specific type, it will be used. * If the type variable has a default type (as specified by [PEP 696](https://peps.python.org/pep-0696/)), it will be used. * For unbound or unconstrained type variables, Pydantic will fallback to [`Any`][typing.Any]. ```python from typing import Generic from typing_extensions import TypeVar from pydantic import BaseModel, ValidationError T = TypeVar('T') U = TypeVar('U', bound=int) V = TypeVar('V', default=str) class Model(BaseModel, Generic[T, U, V]): t: T u: U v: V print(Model(t='t', u=1, v='v')) #> t='t' u=1 v='v' try: Model(t='t', u='u', v=1) except ValidationError as exc: print(exc) """ 2 validation errors for Model u Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='u', input_type=str] v Input should be a valid string [type=string_type, input_value=1, input_type=int] """ ``` !!! 
warning In some cases, validation against an unparametrized generic model can lead to data loss. Specifically, if a subtype of the type variable upper bound, constraints, or default is being used and the model isn't explicitly parametrized, the resulting type **will not be** the one being provided: ```python from typing import Generic, TypeVar from pydantic import BaseModel ItemT = TypeVar('ItemT', bound='ItemBase') class ItemBase(BaseModel): ... class IntItem(ItemBase): value: int class ItemHolder(BaseModel, Generic[ItemT]): item: ItemT loaded_data = {'item': {'value': 1}} print(ItemHolder(**loaded_data)) # (1)! #> item=ItemBase() print(ItemHolder[IntItem](**loaded_data)) # (2)! #> item=IntItem(value=1) ``` 1. When the generic isn't parametrized, the input data is validated against the `ItemT` upper bound. Given that `ItemBase` has no fields, the `item` field information is lost. 2. In this case, the type variable is explicitly parametrized, so the input data is validated against the `IntItem` class. 
### Serialization of unparametrized type variables The behavior of serialization differs when using type variables with [upper bounds](https://typing.readthedocs.io/en/latest/reference/generics.html#type-variables-with-upper-bounds), [constraints](https://typing.readthedocs.io/en/latest/reference/generics.html#type-variables-with-constraints), or a default value: If a Pydantic model is used in a type variable upper bound and the type variable is never parametrized, then Pydantic will use the upper bound for validation but treat the value as [`Any`][typing.Any] in terms of serialization: ```python from typing import Generic, TypeVar from pydantic import BaseModel class ErrorDetails(BaseModel): foo: str ErrorDataT = TypeVar('ErrorDataT', bound=ErrorDetails) class Error(BaseModel, Generic[ErrorDataT]): message: str details: ErrorDataT class MyErrorDetails(ErrorDetails): bar: str # serialized as Any error = Error( message='We just had an error', details=MyErrorDetails(foo='var', bar='var2'), ) assert error.model_dump() == { 'message': 'We just had an error', 'details': { 'foo': 'var', 'bar': 'var2', }, } # serialized using the concrete parametrization # note that `'bar': 'var2'` is missing error = Error[ErrorDetails]( message='We just had an error', details=ErrorDetails(foo='var'), ) assert error.model_dump() == { 'message': 'We just had an error', 'details': { 'foo': 'var', }, } ``` Here's another example of the above behavior, enumerating all permutations regarding bound specification and generic type parametrization: ```python from typing import Generic, TypeVar from pydantic import BaseModel TBound = TypeVar('TBound', bound=BaseModel) TNoBound = TypeVar('TNoBound') class IntValue(BaseModel): value: int class ItemBound(BaseModel, Generic[TBound]): item: TBound class ItemNoBound(BaseModel, Generic[TNoBound]): item: TNoBound item_bound_inferred = ItemBound(item=IntValue(value=3)) item_bound_explicit = ItemBound[IntValue](item=IntValue(value=3)) item_no_bound_inferred 
= ItemNoBound(item=IntValue(value=3)) item_no_bound_explicit = ItemNoBound[IntValue](item=IntValue(value=3)) # calling `print(x.model_dump())` on any of the above instances results in the following: #> {'item': {'value': 3}} ``` However, if [constraints](https://typing.readthedocs.io/en/latest/reference/generics.html#type-variables-with-constraints) or a default value (as per [PEP 696](https://peps.python.org/pep-0696/)) is being used, then the default type or constraints will be used for both validation and serialization if the type variable is not parametrized. You can override this behavior using [`SerializeAsAny`](./serialization.md#serializeasany-annotation): ```python from typing import Generic from typing_extensions import TypeVar from pydantic import BaseModel, SerializeAsAny class ErrorDetails(BaseModel): foo: str ErrorDataT = TypeVar('ErrorDataT', default=ErrorDetails) class Error(BaseModel, Generic[ErrorDataT]): message: str details: ErrorDataT class MyErrorDetails(ErrorDetails): bar: str # serialized using the default's serializer error = Error( message='We just had an error', details=MyErrorDetails(foo='var', bar='var2'), ) assert error.model_dump() == { 'message': 'We just had an error', 'details': { 'foo': 'var', }, } # If `ErrorDataT` was using an upper bound, `bar` would be present in `details`. class SerializeAsAnyError(BaseModel, Generic[ErrorDataT]): message: str details: SerializeAsAny[ErrorDataT] # serialized as Any error = SerializeAsAnyError( message='We just had an error', details=MyErrorDetails(foo='var', bar='baz'), ) assert error.model_dump() == { 'message': 'We just had an error', 'details': { 'foo': 'var', 'bar': 'baz', }, } ``` ## Dynamic model creation ??? api "API Documentation" [`pydantic.main.create_model`][pydantic.main.create_model]
There are some occasions where it is desirable to create a model using runtime information to specify the fields. For this Pydantic provides the `create_model` function to allow models to be created on the fly: ```python from pydantic import BaseModel, create_model DynamicFoobarModel = create_model( 'DynamicFoobarModel', foo=(str, ...), bar=(int, 123) ) class StaticFoobarModel(BaseModel): foo: str bar: int = 123 ``` Here `StaticFoobarModel` and `DynamicFoobarModel` are identical. Fields are defined by one of the following tuple forms: * `(<type>, <default value>)` * `(<type>, Field(...))` * `typing.Annotated[<type>, Field(...)]` Using a `Field(...)` call as the second argument in the tuple (the default value) allows for more advanced field configuration. Thus, the following are analogous: ```python from pydantic import BaseModel, Field, create_model DynamicModel = create_model( 'DynamicModel', foo=(str, Field(description='foo description', alias='FOO')), ) class StaticModel(BaseModel): foo: str = Field(description='foo description', alias='FOO') ``` The special keyword arguments `__config__` and `__base__` can be used to customize the new model. This includes extending a base model with extra fields. ```python from pydantic import BaseModel, create_model class FooModel(BaseModel): foo: str bar: int = 123 BarModel = create_model( 'BarModel', apple=(str, 'russet'), banana=(str, 'yellow'), __base__=FooModel, ) print(BarModel) #> <class 'pydantic.main.BarModel'> print(BarModel.model_fields.keys()) #> dict_keys(['foo', 'bar', 'apple', 'banana']) ``` You can also add validators by passing a dictionary to the `__validators__` argument. ```python {rewrite_assert="false"} from pydantic import ValidationError, create_model, field_validator def alphanum(cls, v): assert v.isalnum(), 'must be alphanumeric' return v validators = { 'username_validator': field_validator('username')(alphanum) # (1)! 
} UserModel = create_model( 'UserModel', username=(str, ...), __validators__=validators ) user = UserModel(username='scolvin') print(user) #> username='scolvin' try: UserModel(username='scolvi%n') except ValidationError as e: print(e) """ 1 validation error for UserModel username Assertion failed, must be alphanumeric [type=assertion_error, input_value='scolvi%n', input_type=str] """ ``` 1. Make sure that the validator names do not clash with any of the field names, as internally Pydantic gathers all members into a namespace and mimics the normal creation of a class using the [`types` module utilities](https://docs.python.org/3/library/types.html#dynamic-type-creation). !!! note To pickle a dynamically created model: - the model must be defined globally - the `__module__` argument must be provided ## `RootModel` and custom root types ??? api "API Documentation" [`pydantic.root_model.RootModel`][pydantic.root_model.RootModel]
Pydantic models can be defined with a "custom root type" by subclassing [`pydantic.RootModel`][pydantic.RootModel]. The root type can be any type supported by Pydantic, and is specified by the generic parameter to `RootModel`. The root value can be passed to the model `__init__` or [`model_validate`][pydantic.main.BaseModel.model_validate] via the first and only argument. Here's an example of how this works: ```python from typing import Dict, List from pydantic import RootModel Pets = RootModel[List[str]] PetsByName = RootModel[Dict[str, str]] print(Pets(['dog', 'cat'])) #> root=['dog', 'cat'] print(Pets(['dog', 'cat']).model_dump_json()) #> ["dog","cat"] print(Pets.model_validate(['dog', 'cat'])) #> root=['dog', 'cat'] print(Pets.model_json_schema()) """ {'items': {'type': 'string'}, 'title': 'RootModel[List[str]]', 'type': 'array'} """ print(PetsByName({'Otis': 'dog', 'Milo': 'cat'})) #> root={'Otis': 'dog', 'Milo': 'cat'} print(PetsByName({'Otis': 'dog', 'Milo': 'cat'}).model_dump_json()) #> {"Otis":"dog","Milo":"cat"} print(PetsByName.model_validate({'Otis': 'dog', 'Milo': 'cat'})) #> root={'Otis': 'dog', 'Milo': 'cat'} ``` If you want to access items in the `root` field directly or to iterate over the items, you can implement custom `__iter__` and `__getitem__` functions, as shown in the following example. 
```python from typing import List from pydantic import RootModel class Pets(RootModel): root: List[str] def __iter__(self): return iter(self.root) def __getitem__(self, item): return self.root[item] pets = Pets.model_validate(['dog', 'cat']) print(pets[0]) #> dog print([pet for pet in pets]) #> ['dog', 'cat'] ``` You can also create subclasses of the parametrized root model directly: ```python from typing import List from pydantic import RootModel class Pets(RootModel[List[str]]): def describe(self) -> str: return f'Pets: {", ".join(self.root)}' my_pets = Pets.model_validate(['dog', 'cat']) print(my_pets.describe()) #> Pets: dog, cat ``` ## Faux immutability Models can be configured to be immutable via `model_config['frozen'] = True`. When this is set, attempting to change the values of instance attributes will raise errors. See the [API reference][pydantic.config.ConfigDict.frozen] for more details. !!! note This behavior was achieved in Pydantic V1 via the config setting `allow_mutation = False`. This config flag is deprecated in Pydantic V2, and has been replaced with `frozen`. !!! warning In Python, immutability is not enforced. Developers have the ability to modify objects that are conventionally considered "immutable" if they choose to do so. ```python from pydantic import BaseModel, ConfigDict, ValidationError class FooBarModel(BaseModel): model_config = ConfigDict(frozen=True) a: str b: dict foobar = FooBarModel(a='hello', b={'apple': 'pear'}) try: foobar.a = 'different' except ValidationError as e: print(e) """ 1 validation error for FooBarModel a Instance is frozen [type=frozen_instance, input_value='different', input_type=str] """ print(foobar.a) #> hello print(foobar.b) #> {'apple': 'pear'} foobar.b['apple'] = 'grape' print(foobar.b) #> {'apple': 'grape'} ``` Trying to change `a` caused an error, and `a` remains unchanged. However, the dict `b` is mutable, and the immutability of `foobar` doesn't stop `b` from being changed. 
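A useful side effect of `frozen=True`, sketched below with an illustrative `Point` model (not taken from the example above), is that frozen models are given a `__hash__` implementation, so instances can be stored in sets or used as dict keys, provided all field values are themselves hashable:

```python
from pydantic import BaseModel, ConfigDict


class Point(BaseModel):
    model_config = ConfigDict(frozen=True)

    x: int
    y: int


# frozen models are hashable, and two instances with equal
# field values compare (and hash) equal
seen = {Point(x=1, y=2), Point(x=1, y=2)}
print(len(seen))
#> 1
```

This would not work with the `FooBarModel` above, since its `dict`-typed field `b` is unhashable.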
## Abstract base classes Pydantic models can be used alongside Python's [Abstract Base Classes](https://docs.python.org/3/library/abc.html) (ABCs). ```python import abc from pydantic import BaseModel class FooBarModel(BaseModel, abc.ABC): a: str b: int @abc.abstractmethod def my_abstract_method(self): pass ``` ## Field ordering Field order affects models in the following ways: * field order is preserved in the model [schema](json_schema.md) * field order is preserved in [validation errors](#error-handling) * field order is preserved by [`.model_dump()` and `.model_dump_json()` etc.](serialization.md#model_dump) ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): a: int b: int = 2 c: int = 1 d: int = 0 e: float print(Model.model_fields.keys()) #> dict_keys(['a', 'b', 'c', 'd', 'e']) m = Model(e=2, a=1) print(m.model_dump()) #> {'a': 1, 'b': 2, 'c': 1, 'd': 0, 'e': 2.0} try: Model(a='x', b='x', c='x', d='x', e='x') except ValidationError as err: error_locations = [e['loc'] for e in err.errors()] print(error_locations) #> [('a',), ('b',), ('c',), ('d',), ('e',)] ``` ## Required fields To declare a field as required, you may declare it using an annotation, or an annotation in combination with a [`Field`][pydantic.Field] function (without specifying any `default` or `default_factory` argument). ```python from pydantic import BaseModel, Field class Model(BaseModel): a: int b: int = Field(alias='B') c: int = Field(..., alias='C') ``` Here `a`, `b` and `c` are all required. The field `c` uses the [ellipsis][Ellipsis] as a default argument, emphasizing the fact that it is required. However, the usage of the [ellipsis][Ellipsis] is discouraged as it doesn't play well with type checkers. !!! note In Pydantic V1, fields annotated with `Optional` or `Any` would be given an implicit default of `None` even if no default was explicitly specified. 
This behavior has changed in Pydantic V2, and there are no longer any type annotations that will result in a field having an implicit default value. See [the migration guide](../migration.md#required-optional-and-nullable-fields) for more details on changes to required and nullable fields. ## Fields with non-hashable default values A common source of bugs in Python is to use a mutable object as a default value for a function or method argument, as the same instance ends up being reused in each call. The `dataclasses` module actually raises an error in this case, indicating that you should use the `default_factory` argument to `dataclasses.field`. Pydantic also supports the use of a [`default_factory`](#fields-with-dynamic-default-values) for non-hashable default values, but it is not required. In the event that the default value is not hashable, Pydantic will deepcopy the default value when creating each instance of the model: ```python from typing import Dict, List from pydantic import BaseModel class Model(BaseModel): item_counts: List[Dict[str, int]] = [{}] m1 = Model() m1.item_counts[0]['a'] = 1 print(m1.item_counts) #> [{'a': 1}] m2 = Model() print(m2.item_counts) #> [{}] ``` ## Fields with dynamic default values When declaring a field with a default value, you may want it to be dynamic (i.e. different for each model instance). To do this, you may want to use a `default_factory`. Here is an example: ```python from datetime import datetime, timezone from uuid import UUID, uuid4 from pydantic import BaseModel, Field def datetime_now() -> datetime: return datetime.now(timezone.utc) class Model(BaseModel): uid: UUID = Field(default_factory=uuid4) updated: datetime = Field(default_factory=datetime_now) m1 = Model() m2 = Model() assert m1.uid != m2.uid ``` You can find more information in the documentation of the [`Field` function](fields.md). 
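To make the per-instance behavior concrete, here is a small sketch (the `Task` model and the module-level counter are illustrative assumptions, not part of the documentation above): the factory is invoked once for every instance that doesn't supply the field explicitly.

```python
from itertools import count

from pydantic import BaseModel, Field

# module-level counter shared by the factory
_next_id = count(1)


class Task(BaseModel):
    # the factory is called once per model instance,
    # so each Task gets a fresh id
    id: int = Field(default_factory=lambda: next(_next_id))
    name: str = 'untitled'


t1 = Task()
t2 = Task(name='write docs')
print(t1.id, t2.id)
#> 1 2
```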
## Automatically excluded attributes ### Class vars Attributes annotated with `typing.ClassVar` are properly treated by Pydantic as class variables, and will not become fields on model instances: ```python from typing import ClassVar from pydantic import BaseModel class Model(BaseModel): x: int = 2 y: ClassVar[int] = 1 m = Model() print(m) #> x=2 print(Model.y) #> 1 ``` ### Private model attributes ??? api "API Documentation" [`pydantic.fields.PrivateAttr`][pydantic.fields.PrivateAttr]
Attributes whose name has a leading underscore are not treated as fields by Pydantic, and are not included in the model schema. Instead, these are converted into a "private attribute" which is not validated or even set during calls to `__init__`, `model_validate`, etc. !!! note As of Pydantic v2.1.0, you will receive a `NameError` if trying to use the [`Field` function](fields.md) with a private attribute. Because private attributes are not treated as fields, the `Field()` function cannot be applied. Here is an example of usage: ```python from datetime import datetime from random import randint from pydantic import BaseModel, PrivateAttr class TimeAwareModel(BaseModel): _processed_at: datetime = PrivateAttr(default_factory=datetime.now) _secret_value: int def __init__(self, **data): super().__init__(**data) # this could also be done with default_factory self._secret_value = randint(1, 5) m = TimeAwareModel() print(m._processed_at) #> 2032-01-02 03:04:05.000006 print(m._secret_value) #> 3 ``` Private attribute names must start with an underscore to prevent conflicts with model fields. However, dunder names (such as `__attr__`) are not supported. ## Data conversion Pydantic may cast input data to force it to conform to model field types, and in some cases this may result in a loss of information. For example: ```python from pydantic import BaseModel class Model(BaseModel): a: int b: float c: str print(Model(a=3.000, b='2.72', c=b'binary data').model_dump()) #> {'a': 3, 'b': 2.72, 'c': 'binary data'} ``` This is a deliberate decision of Pydantic, and is frequently the most useful approach. See [here](https://github.com/pydantic/pydantic/issues/578) for a longer discussion on the subject. Nevertheless, [strict type checking](strict_mode.md) is also supported. 
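As a brief sketch of the strict alternative (the `Model` here is illustrative; see the strict mode docs for the full picture), passing `strict=True` to `model_validate` disables this lax coercion:

```python
from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    a: int


# lax (default) mode coerces the numeric string to an int
print(Model.model_validate({'a': '3'}))
#> a=3

# strict mode rejects the same input instead of coercing it
try:
    Model.model_validate({'a': '3'}, strict=True)
except ValidationError as exc:
    print(exc.errors()[0]['type'])
    #> int_type
```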
## Model signature All Pydantic models will have their signature generated based on their fields: ```python import inspect from pydantic import BaseModel, Field class FooModel(BaseModel): id: int name: str = None description: str = 'Foo' apple: int = Field(alias='pear') print(inspect.signature(FooModel)) #> (*, id: int, name: str = None, description: str = 'Foo', pear: int) -> None ``` An accurate signature is useful for introspection purposes and libraries like `FastAPI` or `hypothesis`. The generated signature will also respect custom `__init__` functions: ```python import inspect from pydantic import BaseModel class MyModel(BaseModel): id: int info: str = 'Foo' def __init__(self, id: int = 1, *, bar: str, **data) -> None: """My custom init!""" super().__init__(id=id, bar=bar, **data) print(inspect.signature(MyModel)) #> (id: int = 1, *, bar: str, info: str = 'Foo') -> None ``` To be included in the signature, a field's alias or name must be a valid Python identifier. Pydantic will prioritize a field's alias over its name when generating the signature, but may use the field name if the alias is not a valid Python identifier. If a field's alias and name are _both_ not valid identifiers (which may be possible through exotic use of `create_model`), a `**data` argument will be added. In addition, the `**data` argument will always be present in the signature if `model_config['extra'] == 'allow'`. ## Structural pattern matching Pydantic supports structural pattern matching for models, as introduced by [PEP 636](https://peps.python.org/pep-0636/) in Python 3.10. ```python {requires="3.10" lint="skip"} from pydantic import BaseModel class Pet(BaseModel): name: str species: str a = Pet(name='Bones', species='dog') match a: # match `species` to 'dog', declare and initialize `dog_name` case Pet(species='dog', name=dog_name): print(f'{dog_name} is a dog') #> Bones is a dog # default case case _: print('No dog matched') ``` !!! 
note A match-case statement may seem as if it creates a new model, but don't be fooled; it is just syntactic sugar for getting an attribute and either comparing it or declaring and initializing it. ## Attribute copies In many cases, arguments passed to the constructor will be copied in order to perform validation and, where necessary, coercion. In this example, note that the ID of the list changes after the class is constructed because it has been copied during validation: ```python from typing import List from pydantic import BaseModel class C1: arr = [] def __init__(self, in_arr): self.arr = in_arr class C2(BaseModel): arr: List[int] arr_orig = [1, 9, 10, 3] c1 = C1(arr_orig) c2 = C2(arr=arr_orig) print('id(c1.arr) == id(c2.arr):', id(c1.arr) == id(c2.arr)) #> id(c1.arr) == id(c2.arr): False ``` !!! note There are some situations where Pydantic does not copy attributes, such as when passing models — we use the model as is. You can override this behaviour by setting [`model_config['revalidate_instances'] = 'always'`](../api/config.md#pydantic.config.ConfigDict). ## Extra fields By default, Pydantic models won't error when you provide data for unrecognized fields, they will just be ignored: ```python from pydantic import BaseModel class Model(BaseModel): x: int m = Model(x=1, y='a') assert m.model_dump() == {'x': 1} ``` If you want this to raise an error, you can set the [`extra`][pydantic.ConfigDict.extra] configuration value to `'forbid'`: ```python from pydantic import BaseModel, ConfigDict, ValidationError class Model(BaseModel): x: int model_config = ConfigDict(extra='forbid') try: Model(x=1, y='a') except ValidationError as exc: print(exc) """ 1 validation error for Model y Extra inputs are not permitted [type=extra_forbidden, input_value='a', input_type=str] """ ``` To instead preserve any extra data provided, you can set [`extra`][pydantic.ConfigDict.extra] to `'allow'`. 
The extra fields will then be stored in `BaseModel.__pydantic_extra__`: ```python from pydantic import BaseModel, ConfigDict class Model(BaseModel): x: int model_config = ConfigDict(extra='allow') m = Model(x=1, y='a') assert m.__pydantic_extra__ == {'y': 'a'} ``` By default, no validation will be applied to these extra items, but you can set a type for the values by overriding the type annotation for `__pydantic_extra__`: ```python from typing import Dict from pydantic import BaseModel, ConfigDict, Field, ValidationError class Model(BaseModel): __pydantic_extra__: Dict[str, int] = Field(init=False) # (1)! x: int model_config = ConfigDict(extra='allow') try: Model(x=1, y='a') except ValidationError as exc: print(exc) """ 1 validation error for Model y Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str] """ m = Model(x=1, y='2') assert m.x == 1 assert m.y == 2 assert m.model_dump() == {'x': 1, 'y': 2} assert m.__pydantic_extra__ == {'y': 2} ``` 1. The `= Field(init=False)` does not have any effect at runtime, but prevents the `__pydantic_extra__` field from being included as a parameter to the model's `__init__` method by type checkers. The same configurations apply to `TypedDict` and `dataclass`, except the config is controlled by setting the `__pydantic_config__` attribute of the class to a valid `ConfigDict`.

# Performance tips In most cases Pydantic won't be your bottleneck; only follow this advice if you're sure it's necessary. ## In general, use `model_validate_json()` not `model_validate(json.loads(...))` On `model_validate(json.loads(...))`, the JSON is parsed in Python, then converted to a dict, then it's validated internally. On the other hand, `model_validate_json()` already performs the validation internally. 
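A sketch of the difference (the `User` model is illustrative):

```python
import json

from pydantic import BaseModel


class User(BaseModel):
    name: str
    age: int


raw = '{"name": "Ann", "age": 12}'

# two steps: json.loads builds Python objects, which are then validated
via_loads = User.model_validate(json.loads(raw))

# one step: pydantic-core parses and validates the JSON together
via_json = User.model_validate_json(raw)

assert via_loads == via_json
```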
There are a few cases where `model_validate(json.loads(...))` may be faster. Specifically, when using a `'before'` or `'wrap'` validator on a model, validation may be faster with the two step method. You can read more about these special cases in [this discussion](https://github.com/pydantic/pydantic/discussions/6388#discussioncomment-8193105). Many performance improvements are currently in the works for `pydantic-core`, as discussed [here](https://github.com/pydantic/pydantic/discussions/6388#discussioncomment-8194048). Once these changes are merged, we should be at the point where `model_validate_json()` is always faster than `model_validate(json.loads(...))`. ## `TypeAdapter` instantiated once The idea here is to avoid constructing validators and serializers more than necessary. Each time a `TypeAdapter` is instantiated, it will construct a new validator and serializer. If you're using a `TypeAdapter` in a function, it will be instantiated each time the function is called. Instead, instantiate it once, and reuse it. === ":x: Bad" ```python {lint="skip"} from typing import List from pydantic import TypeAdapter def my_func(): adapter = TypeAdapter(List[int]) # do something with adapter ``` === ":white_check_mark: Good" ```python {lint="skip"} from typing import List from pydantic import TypeAdapter adapter = TypeAdapter(List[int]) def my_func(): ... # do something with adapter ``` ## `Sequence` vs `list` or `tuple` - `Mapping` vs `dict` When using `Sequence`, Pydantic calls `isinstance(value, Sequence)` to check if the value is a sequence. Also, Pydantic will try to validate against different types of sequences, like `list` and `tuple`. If you know the value is a `list` or `tuple`, use `list` or `tuple` instead of `Sequence`. The same applies to `Mapping` and `dict`. If you know the value is a `dict`, use `dict` instead of `Mapping`. 
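A sketch of the concrete-type advice (the two adapters below are illustrative):

```python
from typing import List, Sequence

from pydantic import TypeAdapter

# `Sequence` has to accept many container types,
# so its validator does extra work
seq_ints = TypeAdapter(Sequence[int])

# `list` is one concrete type, so its validator is
# simpler (and faster)
list_ints = TypeAdapter(List[int])

assert seq_ints.validate_python([1, 2, 3]) == [1, 2, 3]
assert list_ints.validate_python([1, 2, 3]) == [1, 2, 3]
```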
## Don't do validation when you don't have to - use `Any` to keep the value unchanged If you don't need to validate a value, use `Any` to keep the value unchanged. ```python from typing import Any from pydantic import BaseModel class Model(BaseModel): a: Any model = Model(a=1) ``` ## Avoid extra information via subclasses of primitives === "Don't do this" ```python class CompletedStr(str): def __init__(self, s: str): self.s = s self.done = False ``` === "Do this" ```python from pydantic import BaseModel class CompletedModel(BaseModel): s: str done: bool = False ``` ## Use tagged union, not union Tagged union (or discriminated union) is a union with a field that indicates which type it is. ```python {test="skip"} from typing import Any, Literal from pydantic import BaseModel, Field class DivModel(BaseModel): el_type: Literal['div'] = 'div' class_name: str | None = None children: list[Any] | None = None class SpanModel(BaseModel): el_type: Literal['span'] = 'span' class_name: str | None = None contents: str | None = None class ButtonModel(BaseModel): el_type: Literal['button'] = 'button' class_name: str | None = None contents: str | None = None class InputModel(BaseModel): el_type: Literal['input'] = 'input' class_name: str | None = None value: str | None = None class Html(BaseModel): contents: DivModel | SpanModel | ButtonModel | InputModel = Field( discriminator='el_type' ) ``` See [Discriminated Unions] for more details. ## Use `TypedDict` over nested models Instead of using nested models, use `TypedDict` to define the structure of the data. ??? 
info "Performance comparison" With a simple benchmark, `TypedDict` is about ~2.5x faster than nested models: ```python {test="skip"} from timeit import timeit from typing_extensions import TypedDict from pydantic import BaseModel, TypeAdapter class A(TypedDict): a: str b: int class TypedModel(TypedDict): a: A class B(BaseModel): a: str b: int class Model(BaseModel): b: B ta = TypeAdapter(TypedModel) result1 = timeit( lambda: ta.validate_python({'a': {'a': 'a', 'b': 2}}), number=10000 ) result2 = timeit( lambda: Model.model_validate({'b': {'a': 'a', 'b': 2}}), number=10000 ) print(result2 / result1) ``` ## Avoid wrap validators if you really care about performance Wrap validators are generally slower than other validators. This is because they require that data is materialized in Python during validation. Wrap validators can be incredibly useful for complex validation logic, but if you're looking for the best performance, you should avoid them. ## Failing early with `FailFast` Starting in v2.8+, you can apply the `FailFast` annotation to sequence types to fail early if any item in the sequence fails validation. If you use this annotation, you won't get validation errors for the rest of the items in the sequence if one fails, so you're effectively trading off visibility for performance. ```python from typing import List from typing_extensions import Annotated from pydantic import FailFast, TypeAdapter, ValidationError ta = TypeAdapter(Annotated[List[bool], FailFast()]) try: ta.validate_python([True, 'invalid', False, 'also invalid']) except ValidationError as exc: print(exc) """ 1 validation error for list[bool] 1 Input should be a valid boolean, unable to interpret input [type=bool_parsing, input_value='invalid', input_type=str] """ ``` Read more about `FailFast` [here][pydantic.types.FailFast]. 
[Discriminated Unions]: ../concepts/unions.md#discriminated-unions

--- description: Support for loading a settings or config class from environment variables or secrets files. --- # Settings Management [Pydantic Settings](https://github.com/pydantic/pydantic-settings) provides optional Pydantic features for loading a settings or config class from environment variables or secrets files. {{ pydantic_settings }}

Beyond accessing model attributes directly via their field names (e.g. `model.foobar`), models can be converted, dumped, serialized, and exported in a number of ways. !!! tip "Serialize versus dump" Pydantic uses the terms "serialize" and "dump" interchangeably. Both refer to the process of converting a model to a dictionary or JSON-encoded string. Outside of Pydantic, the word "serialize" usually refers to converting in-memory data into a string or bytes. However, in the context of Pydantic, there is a very close relationship between converting an object from a more structured form — such as a Pydantic model, a dataclass, etc. — into a less structured form comprised of Python built-ins such as dict. While we could (and on occasion, do) distinguish between these scenarios by using the word "dump" when converting to primitives and "serialize" when converting to string, for practical purposes, we frequently use the word "serialize" to refer to both of these situations, even though it does not always imply conversion to a string or bytes. ## `model.model_dump(...)` ??? api "API Documentation" [`pydantic.main.BaseModel.model_dump`][pydantic.main.BaseModel.model_dump]
This is the primary way of converting a model to a dictionary. Sub-models will be recursively converted to dictionaries. !!! note The one exception to sub-models being converted to dictionaries is that [`RootModel`](models.md#rootmodel-and-custom-root-types) and its subclasses will have the `root` field value dumped directly, without a wrapping dictionary. This is also done recursively. !!! note You can use [computed fields](../api/fields.md#pydantic.fields.computed_field) to include `property` and `cached_property` data in the `model.model_dump(...)` output. Example: ```python from typing import Any, List, Optional from pydantic import BaseModel, Field, Json class BarModel(BaseModel): whatever: int class FooBarModel(BaseModel): banana: Optional[float] = 1.1 foo: str = Field(serialization_alias='foo_alias') bar: BarModel m = FooBarModel(banana=3.14, foo='hello', bar={'whatever': 123}) # returns a dictionary: print(m.model_dump()) #> {'banana': 3.14, 'foo': 'hello', 'bar': {'whatever': 123}} print(m.model_dump(include={'foo', 'bar'})) #> {'foo': 'hello', 'bar': {'whatever': 123}} print(m.model_dump(exclude={'foo', 'bar'})) #> {'banana': 3.14} print(m.model_dump(by_alias=True)) #> {'banana': 3.14, 'foo_alias': 'hello', 'bar': {'whatever': 123}} print( FooBarModel(foo='hello', bar={'whatever': 123}).model_dump( exclude_unset=True ) ) #> {'foo': 'hello', 'bar': {'whatever': 123}} print( FooBarModel(banana=1.1, foo='hello', bar={'whatever': 123}).model_dump( exclude_defaults=True ) ) #> {'foo': 'hello', 'bar': {'whatever': 123}} print( FooBarModel(foo='hello', bar={'whatever': 123}).model_dump( exclude_defaults=True ) ) #> {'foo': 'hello', 'bar': {'whatever': 123}} print( FooBarModel(banana=None, foo='hello', bar={'whatever': 123}).model_dump( exclude_none=True ) ) #> {'foo': 'hello', 'bar': {'whatever': 123}} class Model(BaseModel): x: List[Json[Any]] print(Model(x=['{"a": 1}', '[1, 2]']).model_dump()) #> {'x': [{'a': 1}, [1, 2]]} print(Model(x=['{"a": 1}', '[1, 
2]']).model_dump(round_trip=True)) #> {'x': ['{"a":1}', '[1,2]']} ``` ## `model.model_dump_json(...)` ??? api "API Documentation" [`pydantic.main.BaseModel.model_dump_json`][pydantic.main.BaseModel.model_dump_json]
The `.model_dump_json()` method serializes a model directly to a JSON-encoded string
that is equivalent to the result produced by [`.model_dump()`](#modelmodel_dump).

See [arguments][pydantic.main.BaseModel.model_dump_json] for more information.

!!! note
    Pydantic can serialize many commonly used types to JSON that would otherwise be incompatible with a simple `json.dumps(foobar)` (e.g. `datetime`, `date` or `UUID`).

```python
from datetime import datetime

from pydantic import BaseModel


class BarModel(BaseModel):
    whatever: int


class FooBarModel(BaseModel):
    foo: datetime
    bar: BarModel


m = FooBarModel(foo=datetime(2032, 6, 1, 12, 13, 14), bar={'whatever': 123})
print(m.model_dump_json())
#> {"foo":"2032-06-01T12:13:14","bar":{"whatever":123}}
print(m.model_dump_json(indent=2))
"""
{
  "foo": "2032-06-01T12:13:14",
  "bar": {
    "whatever": 123
  }
}
"""
```

## `dict(model)` and iteration

Pydantic models can also be converted to dictionaries using `dict(model)`, and you can also iterate over a model's fields using `for field_name, field_value in model:`. With this approach the raw field values are returned, so sub-models will not be converted to dictionaries.

Example:

```python
from pydantic import BaseModel


class BarModel(BaseModel):
    whatever: int


class FooBarModel(BaseModel):
    banana: float
    foo: str
    bar: BarModel


m = FooBarModel(banana=3.14, foo='hello', bar={'whatever': 123})

print(dict(m))
#> {'banana': 3.14, 'foo': 'hello', 'bar': BarModel(whatever=123)}
for name, value in m:
    print(f'{name}: {value}')
    #> banana: 3.14
    #> foo: hello
    #> bar: whatever=123
```

Note also that [`RootModel`](models.md#rootmodel-and-custom-root-types) _does_ get converted to a dictionary with the key `'root'`.

## Custom serializers

Pydantic provides several [functional serializers][pydantic.functional_serializers] to customise how a model is serialized to a dictionary or JSON.
- [`@field_serializer`][pydantic.functional_serializers.field_serializer] - [`@model_serializer`][pydantic.functional_serializers.model_serializer] - [`PlainSerializer`][pydantic.functional_serializers.PlainSerializer] - [`WrapSerializer`][pydantic.functional_serializers.WrapSerializer] Serialization can be customised on a field using the [`@field_serializer`][pydantic.functional_serializers.field_serializer] decorator, and on a model using the [`@model_serializer`][pydantic.functional_serializers.model_serializer] decorator. ```python from datetime import datetime, timedelta, timezone from typing import Any, Dict from pydantic import BaseModel, ConfigDict, field_serializer, model_serializer class WithCustomEncoders(BaseModel): model_config = ConfigDict(ser_json_timedelta='iso8601') dt: datetime diff: timedelta @field_serializer('dt') def serialize_dt(self, dt: datetime, _info): return dt.timestamp() m = WithCustomEncoders( dt=datetime(2032, 6, 1, tzinfo=timezone.utc), diff=timedelta(hours=100) ) print(m.model_dump_json()) #> {"dt":1969660800.0,"diff":"P4DT4H"} class Model(BaseModel): x: str @model_serializer def ser_model(self) -> Dict[str, Any]: return {'x': f'serialized {self.x}'} print(Model(x='test value').model_dump_json()) #> {"x":"serialized test value"} ``` !!! note A single serializer can also be called on all fields by passing the special value '*' to the [`@field_serializer`][pydantic.functional_serializers.field_serializer] decorator. In addition, [`PlainSerializer`][pydantic.functional_serializers.PlainSerializer] and [`WrapSerializer`][pydantic.functional_serializers.WrapSerializer] enable you to use a function to modify the output of serialization. Both serializers accept optional arguments including: - `return_type` specifies the return type for the function. If omitted it will be inferred from the type annotation. - `when_used` specifies when this serializer should be used. 
Accepts a string with values 'always', 'unless-none', 'json', and 'json-unless-none'. Defaults to 'always'. `PlainSerializer` uses a simple function to modify the output of serialization. ```python from typing_extensions import Annotated from pydantic import BaseModel from pydantic.functional_serializers import PlainSerializer FancyInt = Annotated[ int, PlainSerializer(lambda x: f'{x:,}', return_type=str, when_used='json') ] class MyModel(BaseModel): x: FancyInt print(MyModel(x=1234).model_dump()) #> {'x': 1234} print(MyModel(x=1234).model_dump(mode='json')) #> {'x': '1,234'} ``` `WrapSerializer` receives the raw inputs along with a handler function that applies the standard serialization logic, and can modify the resulting value before returning it as the final output of serialization. ```python from typing import Any from typing_extensions import Annotated from pydantic import BaseModel, SerializerFunctionWrapHandler from pydantic.functional_serializers import WrapSerializer def ser_wrap(v: Any, nxt: SerializerFunctionWrapHandler) -> str: return f'{nxt(v + 1):,}' FancyInt = Annotated[int, WrapSerializer(ser_wrap, when_used='json')] class MyModel(BaseModel): x: FancyInt print(MyModel(x=1234).model_dump()) #> {'x': 1234} print(MyModel(x=1234).model_dump(mode='json')) #> {'x': '1,235'} ``` ### Overriding the return type when dumping a model While the return value of `.model_dump()` can usually be described as `dict[str, Any]`, through the use of `@model_serializer` you can actually cause it to return a value that doesn't match this signature: ```python from pydantic import BaseModel, model_serializer class Model(BaseModel): x: str @model_serializer def ser_model(self) -> str: return self.x print(Model(x='not a dict').model_dump()) #> not a dict ``` If you want to do this and still get proper type-checking for this method, you can override `.model_dump()` in an `if TYPE_CHECKING:` block: ```python from typing import TYPE_CHECKING, Any, Literal from pydantic import 
BaseModel, model_serializer class Model(BaseModel): x: str @model_serializer def ser_model(self) -> str: return self.x if TYPE_CHECKING: # Ensure type checkers see the correct return type def model_dump( self, *, mode: Literal['json', 'python'] | str = 'python', include: Any = None, exclude: Any = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool = True, ) -> str: ... ``` This trick is actually used in [`RootModel`](models.md#rootmodel-and-custom-root-types) for precisely this purpose. ## Serializing subclasses ### Subclasses of standard types Subclasses of standard types are automatically dumped like their super-classes: ```python from datetime import date, timedelta from typing import Any, Type from pydantic_core import core_schema from pydantic import BaseModel, GetCoreSchemaHandler class DayThisYear(date): """ Contrived example of a special type of date that takes an int and interprets it as a day in the current year """ @classmethod def __get_pydantic_core_schema__( cls, source: Type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: return core_schema.no_info_after_validator_function( cls.validate, core_schema.int_schema(), serialization=core_schema.format_ser_schema('%Y-%m-%d'), ) @classmethod def validate(cls, v: int): return date(2023, 1, 1) + timedelta(days=v) class FooModel(BaseModel): date: DayThisYear m = FooModel(date=300) print(m.model_dump_json()) #> {"date":"2023-10-28"} ``` ### Subclass instances for fields of `BaseModel`, dataclasses, `TypedDict` When using fields whose annotations are themselves struct-like types (e.g., `BaseModel` subclasses, dataclasses, etc.), the default behavior is to serialize the attribute value as though it was an instance of the annotated type, even if it is a subclass. 
More specifically, only the fields from the _annotated_ type will be included in the dumped object: ```python from pydantic import BaseModel class User(BaseModel): name: str class UserLogin(User): password: str class OuterModel(BaseModel): user: User user = UserLogin(name='pydantic', password='hunter2') m = OuterModel(user=user) print(m) #> user=UserLogin(name='pydantic', password='hunter2') print(m.model_dump()) # note: the password field is not included #> {'user': {'name': 'pydantic'}} ``` !!! warning "Migration Warning" This behavior is different from how things worked in Pydantic V1, where we would always include all (subclass) fields when recursively dumping models to dicts. The motivation behind this change in behavior is that it helps ensure that you know precisely which fields could be included when serializing, even if subclasses get passed when instantiating the object. In particular, this can help prevent surprises when adding sensitive information like secrets as fields of subclasses. ### Serializing with duck-typing 🦆 !!! question "What is serialization with duck typing?" Duck-typing serialization is the behavior of serializing an object based on the fields present in the object itself, rather than the fields present in the schema of the object. This means that when an object is serialized, fields present in a subclass, but not in the original schema, will be included in the serialized output. This behavior was the default in Pydantic V1, but was changed in V2 to help ensure that you know precisely which fields would be included when serializing, even if subclasses get passed when instantiating the object. This helps prevent security risks when serializing subclasses with sensitive information, for example. If you want v1-style duck-typing serialization behavior, you can use a runtime setting, or annotate individual types. 
* Field / type level: use the `SerializeAsAny` annotation
* Runtime level: use the `serialize_as_any` flag when calling `model_dump()` or `model_dump_json()`

We discuss these options below in more detail:

#### `SerializeAsAny` annotation:

If you want duck-typing serialization behavior, this can be done using the `SerializeAsAny` annotation on a type:

```python
from pydantic import BaseModel, SerializeAsAny


class User(BaseModel):
    name: str


class UserLogin(User):
    password: str


class OuterModel(BaseModel):
    as_any: SerializeAsAny[User]
    as_user: User


user = UserLogin(name='pydantic', password='password')

print(OuterModel(as_any=user, as_user=user).model_dump())
"""
{
    'as_any': {'name': 'pydantic', 'password': 'password'},
    'as_user': {'name': 'pydantic'},
}
"""
```

When a field is annotated as `SerializeAsAny[<SomeType>]`, the validation behavior will be the same as if it was annotated as `<SomeType>`, and type-checkers like mypy will treat the attribute as having the appropriate type as well. But when serializing, the field will be serialized as though the type hint for the field was `Any`, which is where the name comes from.

#### `serialize_as_any` runtime setting

The `serialize_as_any` runtime setting can be used to serialize model data with or without duck typed serialization behavior. `serialize_as_any` can be passed as a keyword argument to the `model_dump()` and `model_dump_json()` methods of `BaseModel`s and `RootModel`s. It can also be passed as a keyword argument to the `dump_python()` and `dump_json()` methods of `TypeAdapter`s.

If `serialize_as_any` is set to `True`, the model will be serialized using duck typed serialization behavior, which means that the model will ignore the schema and instead ask the object itself how it should be serialized. In particular, this means that when model subclasses are serialized, fields present in the subclass but not in the original schema will be included.
If `serialize_as_any` is set to `False` (which is the default), the model will be serialized using the schema, which means that fields present in a subclass but not in the original schema will be ignored. !!! question "Why is this flag useful?" Sometimes, you want to make sure that no matter what fields might have been added in subclasses, the serialized object will only have the fields listed in the original type definition. This can be useful if you add something like a `password: str` field in a subclass that you don't want to accidentally include in the serialized output. For example: ```python from pydantic import BaseModel class User(BaseModel): name: str class UserLogin(User): password: str class OuterModel(BaseModel): user1: User user2: User user = UserLogin(name='pydantic', password='password') outer_model = OuterModel(user1=user, user2=user) print(outer_model.model_dump(serialize_as_any=True)) # (1)! """ { 'user1': {'name': 'pydantic', 'password': 'password'}, 'user2': {'name': 'pydantic', 'password': 'password'}, } """ print(outer_model.model_dump(serialize_as_any=False)) # (2)! #> {'user1': {'name': 'pydantic'}, 'user2': {'name': 'pydantic'}} ``` 1. With `serialize_as_any` set to `True`, the result matches that of V1. 2. With `serialize_as_any` set to `False` (the V2 default), fields present on the subclass, but not the base class, are not included in serialization. This setting even takes effect with nested and recursive patterns as well. For example: ```python from typing import List from pydantic import BaseModel class User(BaseModel): name: str friends: List['User'] class UserLogin(User): password: str class OuterModel(BaseModel): user: User user = UserLogin( name='samuel', password='pydantic-pw', friends=[UserLogin(name='sebastian', password='fastapi-pw', friends=[])], ) print(OuterModel(user=user).model_dump(serialize_as_any=True)) # (1)! 
""" { 'user': { 'name': 'samuel', 'friends': [ {'name': 'sebastian', 'friends': [], 'password': 'fastapi-pw'} ], 'password': 'pydantic-pw', } } """ print(OuterModel(user=user).model_dump(serialize_as_any=False)) # (2)! """ {'user': {'name': 'samuel', 'friends': [{'name': 'sebastian', 'friends': []}]}} """ ``` 1. Even nested `User` model instances are dumped with fields unique to `User` subclasses. 2. Even nested `User` model instances are dumped without fields unique to `User` subclasses. !!! note The behavior of the `serialize_as_any` runtime flag is almost the same as the behavior of the `SerializeAsAny` annotation. There are a few nuanced differences that we're working to resolve, but for the most part, you can expect the same behavior from both. See more about the differences in this [active issue](https://github.com/pydantic/pydantic/issues/9049) #### Overriding the `serialize_as_any` default (False) You can override the default setting for `serialize_as_any` by configuring a subclass of `BaseModel` that overrides the default for the `serialize_as_any` argument to `model_dump()` and `model_dump_json()`, and then use that as the base class (instead of `pydantic.BaseModel`) for any model you want to have this default behavior. For example, you could do the following if you want to use duck-typing serialization by default: ```python from typing import Any, Dict from pydantic import BaseModel, SecretStr class MyBaseModel(BaseModel): def model_dump(self, **kwargs) -> Dict[str, Any]: return super().model_dump(serialize_as_any=True, **kwargs) def model_dump_json(self, **kwargs) -> str: return super().model_dump_json(serialize_as_any=True, **kwargs) class User(MyBaseModel): name: str class UserInfo(User): password: SecretStr class OuterModel(MyBaseModel): user: User u = OuterModel(user=UserInfo(name='John', password='secret_pw')) print(u.model_dump_json()) # (1)! #> {"user":{"name":"John","password":"**********"}} ``` 1. 
By default, `model_dump_json` will use duck-typing serialization behavior, which means that the `password` field is included in the output. ## `pickle.dumps(model)` Pydantic models support efficient pickling and unpickling. ```python {test="skip"} import pickle from pydantic import BaseModel class FooBarModel(BaseModel): a: str b: int m = FooBarModel(a='hello', b=123) print(m) #> a='hello' b=123 data = pickle.dumps(m) print(data[:20]) #> b'\x80\x04\x95\x95\x00\x00\x00\x00\x00\x00\x00\x8c\x08__main_' m2 = pickle.loads(data) print(m2) #> a='hello' b=123 ``` ## Advanced include and exclude The `model_dump` and `model_dump_json` methods support `include` and `exclude` arguments which can either be sets or dictionaries. This allows nested selection of which fields to export: ```python from pydantic import BaseModel, SecretStr class User(BaseModel): id: int username: str password: SecretStr class Transaction(BaseModel): id: str user: User value: int t = Transaction( id='1234567890', user=User(id=42, username='JohnDoe', password='hashedpassword'), value=9876543210, ) # using a set: print(t.model_dump(exclude={'user', 'value'})) #> {'id': '1234567890'} # using a dict: print(t.model_dump(exclude={'user': {'username', 'password'}, 'value': True})) #> {'id': '1234567890', 'user': {'id': 42}} print(t.model_dump(include={'id': True, 'user': {'id'}})) #> {'id': '1234567890', 'user': {'id': 42}} ``` Using `True` indicates that we want to exclude or include an entire key, just as if we included it in a set (note that using `False` isn't supported). This can be done at any depth level. Special care must be taken when including or excluding fields from a list or tuple of submodels or dictionaries. In this scenario, `model_dump` and related methods expect integer keys for element-wise inclusion or exclusion. 
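As a minimal sketch of this integer-key behavior before the fuller example below (the `Order`/`Item` models here are illustrative, not part of the docs above):

```python
from typing import List

from pydantic import BaseModel


class Item(BaseModel):
    name: str
    price: float


class Order(BaseModel):
    items: List[Item]


order = Order(
    items=[
        {'name': 'apple', 'price': 1.0},
        {'name': 'banana', 'price': 2.0},
    ]
)

# Exclude `price` from the first list element only (index 0):
print(order.model_dump(exclude={'items': {0: {'price'}}}))
#> {'items': [{'name': 'apple'}, {'name': 'banana', 'price': 2.0}]}

# Negative indices count from the end, as in regular Python indexing:
print(order.model_dump(exclude={'items': {-1: {'price'}}}))
#> {'items': [{'name': 'apple', 'price': 1.0}, {'name': 'banana'}]}
```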
To exclude a field from **every** member of a list or tuple, the dictionary key `'__all__'` can be used, as shown here: ```python import datetime from typing import List from pydantic import BaseModel, SecretStr class Country(BaseModel): name: str phone_code: int class Address(BaseModel): post_code: int country: Country class CardDetails(BaseModel): number: SecretStr expires: datetime.date class Hobby(BaseModel): name: str info: str class User(BaseModel): first_name: str second_name: str address: Address card_details: CardDetails hobbies: List[Hobby] user = User( first_name='John', second_name='Doe', address=Address( post_code=123456, country=Country(name='USA', phone_code=1) ), card_details=CardDetails( number='4212934504460000', expires=datetime.date(2020, 5, 1) ), hobbies=[ Hobby(name='Programming', info='Writing code and stuff'), Hobby(name='Gaming', info='Hell Yeah!!!'), ], ) exclude_keys = { 'second_name': True, 'address': {'post_code': True, 'country': {'phone_code'}}, 'card_details': True, # You can exclude fields from specific members of a tuple/list by index: 'hobbies': {-1: {'info'}}, } include_keys = { 'first_name': True, 'address': {'country': {'name'}}, 'hobbies': {0: True, -1: {'name'}}, } # would be the same as user.model_dump(exclude=exclude_keys) in this case: print(user.model_dump(include=include_keys)) """ { 'first_name': 'John', 'address': {'country': {'name': 'USA'}}, 'hobbies': [ {'name': 'Programming', 'info': 'Writing code and stuff'}, {'name': 'Gaming'}, ], } """ # To exclude a field from all members of a nested list or tuple, use "__all__": print(user.model_dump(exclude={'hobbies': {'__all__': {'info'}}})) """ { 'first_name': 'John', 'second_name': 'Doe', 'address': { 'post_code': 123456, 'country': {'name': 'USA', 'phone_code': 1}, }, 'card_details': { 'number': SecretStr('**********'), 'expires': datetime.date(2020, 5, 1), }, 'hobbies': [{'name': 'Programming'}, {'name': 'Gaming'}], } """ ``` The same holds for the `model_dump_json` 
method.

### Model- and field-level include and exclude

In addition to the explicit arguments `exclude` and `include` passed to `model_dump` and `model_dump_json` methods, we can also pass the `exclude: bool` arguments directly to the `Field` constructor:

Setting `exclude` on the field constructor (`Field(exclude=True)`) takes priority over the `exclude`/`include` on `model_dump` and `model_dump_json`:

```python
from pydantic import BaseModel, Field, SecretStr


class User(BaseModel):
    id: int
    username: str
    password: SecretStr = Field(exclude=True)


class Transaction(BaseModel):
    id: str
    value: int = Field(exclude=True)


t = Transaction(
    id='1234567890',
    value=9876543210,
)

print(t.model_dump())
#> {'id': '1234567890'}
print(t.model_dump(include={'id': True, 'value': True}))  # (1)!
#> {'id': '1234567890'}
```

1. `value` is excluded from the output because it is excluded in `Field`.

That being said, setting `exclude` on the field constructor (`Field(exclude=True)`) does not take priority over the `exclude_unset`, `exclude_none`, and `exclude_defaults` parameters on `model_dump` and `model_dump_json`:

```python
from typing import Optional

from pydantic import BaseModel, Field


class Person(BaseModel):
    name: str
    age: Optional[int] = Field(None, exclude=False)


person = Person(name='Jeremy')

print(person.model_dump())
#> {'name': 'Jeremy', 'age': None}
print(person.model_dump(exclude_none=True))  # (1)!
#> {'name': 'Jeremy'}
print(person.model_dump(exclude_unset=True))  # (2)!
#> {'name': 'Jeremy'}
print(person.model_dump(exclude_defaults=True))  # (3)!
#> {'name': 'Jeremy'}
```

1. `age` excluded from the output because `exclude_none` was set to `True`, and `age` is `None`.
2. `age` excluded from the output because `exclude_unset` was set to `True`, and `age` was not set in the Person constructor.
3. `age` excluded from the output because `exclude_defaults` was set to `True`, and `age` takes the default value of `None`.
## Serialization Context You can pass a context object to the serialization methods which can be accessed from the `info` argument to decorated serializer functions. This is useful when you need to dynamically update the serialization behavior during runtime. For example, if you wanted a field to be dumped depending on a dynamically controllable set of allowed values, this could be done by passing the allowed values by context: ```python from pydantic import BaseModel, SerializationInfo, field_serializer class Model(BaseModel): text: str @field_serializer('text') def remove_stopwords(self, v: str, info: SerializationInfo): context = info.context if context: stopwords = context.get('stopwords', set()) v = ' '.join(w for w in v.split() if w.lower() not in stopwords) return v model = Model.model_construct(**{'text': 'This is an example document'}) print(model.model_dump()) # no context #> {'text': 'This is an example document'} print(model.model_dump(context={'stopwords': ['this', 'is', 'an']})) #> {'text': 'example document'} print(model.model_dump(context={'stopwords': ['document']})) #> {'text': 'This is an example'} ``` Similarly, you can [use a context for validation](../concepts/validators.md#validation-context). ## `model_copy(...)` ??? api "API Documentation" [`pydantic.main.BaseModel.model_copy`][pydantic.main.BaseModel.model_copy]
`model_copy()` allows models to be duplicated (with optional updates), which is particularly useful when working with frozen models.

Example:

```python
from pydantic import BaseModel


class BarModel(BaseModel):
    whatever: int


class FooBarModel(BaseModel):
    banana: float
    foo: str
    bar: BarModel


m = FooBarModel(banana=3.14, foo='hello', bar={'whatever': 123})

print(m.model_copy(update={'banana': 0}))
#> banana=0 foo='hello' bar=BarModel(whatever=123)
print(id(m.bar) == id(m.model_copy().bar))
#> True
# normal copy gives the same object reference for `bar`
print(id(m.bar) == id(m.model_copy(deep=True).bar))
#> False
# deep copy gives a new object reference for `bar`
```

??? api "API Documentation"
    [`pydantic.types.Strict`][pydantic.types.Strict]
By default, Pydantic will attempt to coerce values to the desired type when possible. For example, you can pass the string `"123"` as the input to an `int` field, and it will be converted to `123`. This coercion behavior is useful in many scenarios — think: UUIDs, URL parameters, HTTP headers, environment variables, user input, etc. However, there are also situations where this is not desirable, and you want Pydantic to error instead of coercing data. To better support this use case, Pydantic provides a "strict mode" that can be enabled on a per-model, per-field, or even per-validation-call basis. When strict mode is enabled, Pydantic will be much less lenient when coercing data, and will instead error if the data is not of the correct type. Here is a brief example showing the difference between validation behavior in strict and the default/"lax" mode: ```python from pydantic import BaseModel, ValidationError class MyModel(BaseModel): x: int print(MyModel.model_validate({'x': '123'})) # lax mode #> x=123 try: MyModel.model_validate({'x': '123'}, strict=True) # strict mode except ValidationError as exc: print(exc) """ 1 validation error for MyModel x Input should be a valid integer [type=int_type, input_value='123', input_type=str] """ ``` There are various ways to get strict-mode validation while using Pydantic, which will be discussed in more detail below: * [Passing `strict=True` to the validation methods](#strict-mode-in-method-calls), such as `BaseModel.model_validate`, `TypeAdapter.validate_python`, and similar for JSON * [Using `Field(strict=True)`](#strict-mode-with-field) with fields of a `BaseModel`, `dataclass`, or `TypedDict` * [Using `pydantic.types.Strict` as a type annotation](#strict-mode-with-annotated-strict) on a field * Pydantic provides some type aliases that are already annotated with `Strict`, such as `pydantic.types.StrictInt` * [Using `ConfigDict(strict=True)`](#strict-mode-with-configdict) ## Type coercions in strict mode For most types, 
when validating data from python in strict mode, only the instances of the exact types are accepted. For example, when validating an `int` field, only instances of `int` are accepted; passing instances of `float` or `str` will result in raising a `ValidationError`. Note that we are looser when validating data from JSON in strict mode. For example, when validating a `UUID` field, instances of `str` will be accepted when validating from JSON, but not from python: ```python import json from uuid import UUID from pydantic import BaseModel, ValidationError class MyModel(BaseModel): guid: UUID data = {'guid': '12345678-1234-1234-1234-123456789012'} print(MyModel.model_validate(data)) # OK: lax #> guid=UUID('12345678-1234-1234-1234-123456789012') print( MyModel.model_validate_json(json.dumps(data), strict=True) ) # OK: strict, but from json #> guid=UUID('12345678-1234-1234-1234-123456789012') try: MyModel.model_validate(data, strict=True) # Not OK: strict, from python except ValidationError as exc: print(exc.errors(include_url=False)) """ [ { 'type': 'is_instance_of', 'loc': ('guid',), 'msg': 'Input should be an instance of UUID', 'input': '12345678-1234-1234-1234-123456789012', 'ctx': {'class': 'UUID'}, } ] """ ``` For more details about what types are allowed as inputs in strict mode, you can review the [Conversion Table](conversion_table.md). ## Strict mode in method calls All the examples included so far get strict-mode validation through the use of `strict=True` as a keyword argument to the validation methods. 
While we have shown this for `BaseModel.model_validate`, this also works with arbitrary types through the use of `TypeAdapter`: ```python from pydantic import TypeAdapter, ValidationError print(TypeAdapter(bool).validate_python('yes')) # OK: lax #> True try: TypeAdapter(bool).validate_python('yes', strict=True) # Not OK: strict except ValidationError as exc: print(exc) """ 1 validation error for bool Input should be a valid boolean [type=bool_type, input_value='yes', input_type=str] """ ``` Note this also works even when using more "complex" types in `TypeAdapter`: ```python from dataclasses import dataclass from pydantic import TypeAdapter, ValidationError @dataclass class MyDataclass: x: int try: TypeAdapter(MyDataclass).validate_python({'x': '123'}, strict=True) except ValidationError as exc: print(exc) """ 1 validation error for MyDataclass Input should be an instance of MyDataclass [type=dataclass_exact_type, input_value={'x': '123'}, input_type=dict] """ ``` This also works with the `TypeAdapter.validate_json` and `BaseModel.model_validate_json` methods: ```python import json from typing import List from uuid import UUID from pydantic import BaseModel, TypeAdapter, ValidationError try: TypeAdapter(List[int]).validate_json('["1", 2, "3"]', strict=True) except ValidationError as exc: print(exc) """ 2 validation errors for list[int] 0 Input should be a valid integer [type=int_type, input_value='1', input_type=str] 2 Input should be a valid integer [type=int_type, input_value='3', input_type=str] """ class Model(BaseModel): x: int y: UUID data = {'x': '1', 'y': '12345678-1234-1234-1234-123456789012'} try: Model.model_validate(data, strict=True) except ValidationError as exc: # Neither x nor y are valid in strict mode from python: print(exc) """ 2 validation errors for Model x Input should be a valid integer [type=int_type, input_value='1', input_type=str] y Input should be an instance of UUID [type=is_instance_of, 
input_value='12345678-1234-1234-1234-123456789012', input_type=str] """ json_data = json.dumps(data) try: Model.model_validate_json(json_data, strict=True) except ValidationError as exc: # From JSON, x is still not valid in strict mode, but y is: print(exc) """ 1 validation error for Model x Input should be a valid integer [type=int_type, input_value='1', input_type=str] """ ``` ## Strict mode with `Field` For individual fields on a model, you can [set `strict=True` on the field](../api/fields.md#pydantic.fields.Field). This will cause strict-mode validation to be used for that field, even when the validation methods are called without `strict=True`. Only the fields for which `strict=True` is set will be affected: ```python from pydantic import BaseModel, Field, ValidationError class User(BaseModel): name: str age: int n_pets: int user = User(name='John', age='42', n_pets='1') print(user) #> name='John' age=42 n_pets=1 class AnotherUser(BaseModel): name: str age: int = Field(strict=True) n_pets: int try: anotheruser = AnotherUser(name='John', age='42', n_pets='1') except ValidationError as e: print(e) """ 1 validation error for AnotherUser age Input should be a valid integer [type=int_type, input_value='42', input_type=str] """ ``` Note that making fields strict will also affect the validation performed when instantiating the model class: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: int = Field(strict=True) y: int = Field(strict=False) try: Model(x='1', y='2') except ValidationError as exc: print(exc) """ 1 validation error for Model x Input should be a valid integer [type=int_type, input_value='1', input_type=str] """ ``` ### Using `Field` as an annotation Note that `Field(strict=True)` (or with any other keyword arguments) can be used as an annotation if necessary, e.g., when working with `TypedDict`: ```python from typing_extensions import Annotated, TypedDict from pydantic import Field, TypeAdapter, ValidationError 
class MyDict(TypedDict): x: Annotated[int, Field(strict=True)] try: TypeAdapter(MyDict).validate_python({'x': '1'}) except ValidationError as exc: print(exc) """ 1 validation error for typed-dict x Input should be a valid integer [type=int_type, input_value='1', input_type=str] """ ``` ## Strict mode with `Annotated[..., Strict()]` ??? api "API Documentation" [`pydantic.types.Strict`][pydantic.types.Strict]
Pydantic also provides the [`Strict`](../api/types.md#pydantic.types.Strict) class, which is intended for use as metadata with the [`typing.Annotated`][] class; this annotation indicates that the annotated field should be validated in strict mode: ```python from typing_extensions import Annotated from pydantic import BaseModel, Strict, ValidationError class User(BaseModel): name: str age: int is_active: Annotated[bool, Strict()] User(name='David', age=33, is_active=True) try: User(name='David', age=33, is_active='True') except ValidationError as exc: print(exc) """ 1 validation error for User is_active Input should be a valid boolean [type=bool_type, input_value='True', input_type=str] """ ``` This is, in fact, the method used to implement some of the strict-out-of-the-box types provided by Pydantic, such as [`StrictInt`](../api/types.md#pydantic.types.StrictInt). ## Strict mode with `ConfigDict` ### `BaseModel` If you want to enable strict mode for all fields on a complex input type, you can use [`ConfigDict(strict=True)`](../api/config.md#pydantic.config.ConfigDict) in the `model_config`: ```python from pydantic import BaseModel, ConfigDict, ValidationError class User(BaseModel): model_config = ConfigDict(strict=True) name: str age: int is_active: bool try: User(name='David', age='33', is_active='yes') except ValidationError as exc: print(exc) """ 2 validation errors for User age Input should be a valid integer [type=int_type, input_value='33', input_type=str] is_active Input should be a valid boolean [type=bool_type, input_value='yes', input_type=str] """ ``` !!!
note When using `strict=True` through a model's `model_config`, you can still override the strictness of individual fields by setting `strict=False` on individual fields: ```python from pydantic import BaseModel, ConfigDict, Field class User(BaseModel): model_config = ConfigDict(strict=True) name: str age: int = Field(strict=False) ``` Note that strict mode is not recursively applied to nested model fields: ```python from pydantic import BaseModel, ConfigDict, ValidationError class Inner(BaseModel): y: int class Outer(BaseModel): model_config = ConfigDict(strict=True) x: int inner: Inner print(Outer(x=1, inner=Inner(y='2'))) #> x=1 inner=Inner(y=2) try: Outer(x='1', inner=Inner(y='2')) except ValidationError as exc: print(exc) """ 1 validation error for Outer x Input should be a valid integer [type=int_type, input_value='1', input_type=str] """ ``` (This is also the case for dataclasses and `TypedDict`.) If this is undesirable, you should make sure that strict mode is enabled for all the types involved. For example, this can be done for model classes by using a shared base class with `model_config = ConfigDict(strict=True)`: ```python from pydantic import BaseModel, ConfigDict, ValidationError class MyBaseModel(BaseModel): model_config = ConfigDict(strict=True) class Inner(MyBaseModel): y: int class Outer(MyBaseModel): x: int inner: Inner try: Outer.model_validate({'x': 1, 'inner': {'y': '2'}}) except ValidationError as exc: print(exc) """ 1 validation error for Outer inner.y Input should be a valid integer [type=int_type, input_value='2', input_type=str] """ ``` ### Dataclasses and `TypedDict` Pydantic dataclasses behave similarly to the examples shown above with `BaseModel`, just that instead of `model_config` you should use the `config` keyword argument to the `@pydantic.dataclasses.dataclass` decorator. 
When possible, you can achieve nested strict mode for vanilla dataclasses or `TypedDict` subclasses by annotating fields with the [`pydantic.types.Strict` annotation](#strict-mode-with-annotated-strict). However, if this is _not_ possible (e.g., when working with third-party types), you can set the config that Pydantic should use for the type by setting the `__pydantic_config__` attribute on the type: ```python from typing_extensions import TypedDict from pydantic import ConfigDict, TypeAdapter, ValidationError class Inner(TypedDict): y: int Inner.__pydantic_config__ = ConfigDict(strict=True) class Outer(TypedDict): x: int inner: Inner adapter = TypeAdapter(Outer) print(adapter.validate_python({'x': '1', 'inner': {'y': 2}})) #> {'x': 1, 'inner': {'y': 2}} try: adapter.validate_python({'x': '1', 'inner': {'y': '2'}}) except ValidationError as exc: print(exc) """ 1 validation error for typed-dict inner.y Input should be a valid integer [type=int_type, input_value='2', input_type=str] """ ``` ### `TypeAdapter` You can also get strict mode through the use of the config keyword argument to the [`TypeAdapter`](../api/type_adapter.md) class: ```python from pydantic import ConfigDict, TypeAdapter, ValidationError adapter = TypeAdapter(bool, config=ConfigDict(strict=True)) try: adapter.validate_python('yes') except ValidationError as exc: print(exc) """ 1 validation error for bool Input should be a valid boolean [type=bool_type, input_value='yes', input_type=str] """ ``` ### `@validate_call` Strict mode is also usable with the [`@validate_call`](../api/validate_call.md#pydantic.validate_call_decorator.validate_call) decorator by passing the `config` keyword argument: ```python from pydantic import ConfigDict, ValidationError, validate_call @validate_call(config=ConfigDict(strict=True)) def foo(x: int) -> int: return x try: foo('1') except ValidationError as exc: print(exc) """ 1 validation error for foo 0 Input should be a valid integer [type=int_type, input_value='1', 
input_type=str] """ ``` <!-- docs/concepts/type_adapter.md --> You may have types that are not `BaseModel`s that you want to validate data against. Or you may want to validate a `List[SomeModel]`, or dump it to JSON. ??? api "API Documentation" [`pydantic.type_adapter.TypeAdapter`][pydantic.type_adapter.TypeAdapter]
For use cases like this, Pydantic provides [`TypeAdapter`][pydantic.type_adapter.TypeAdapter], which can be used for type validation, serialization, and JSON schema generation without needing to create a [`BaseModel`][pydantic.main.BaseModel]. A [`TypeAdapter`][pydantic.type_adapter.TypeAdapter] instance exposes some of the functionality from [`BaseModel`][pydantic.main.BaseModel] instance methods for types that do not have such methods (such as dataclasses, primitive types, and more): ```python from typing import List from typing_extensions import TypedDict from pydantic import TypeAdapter, ValidationError class User(TypedDict): name: str id: int user_list_adapter = TypeAdapter(List[User]) user_list = user_list_adapter.validate_python([{'name': 'Fred', 'id': '3'}]) print(repr(user_list)) #> [{'name': 'Fred', 'id': 3}] try: user_list_adapter.validate_python( [{'name': 'Fred', 'id': 'wrong', 'other': 'no'}] ) except ValidationError as e: print(e) """ 1 validation error for list[typed-dict] 0.id Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='wrong', input_type=str] """ print(repr(user_list_adapter.dump_json(user_list))) #> b'[{"name":"Fred","id":3}]' ``` !!! info "`dump_json` returns `bytes`" `TypeAdapter`'s `dump_json` methods returns a `bytes` object, unlike the corresponding method for `BaseModel`, `model_dump_json`, which returns a `str`. The reason for this discrepancy is that in V1, model dumping returned a str type, so this behavior is retained in V2 for backwards compatibility. For the `BaseModel` case, `bytes` are coerced to `str` types, but `bytes` are often the desired end type. Hence, for the new `TypeAdapter` class in V2, the return type is simply `bytes`, which can easily be coerced to a `str` type if desired. !!! 
note Despite some overlap in use cases with [`RootModel`][pydantic.root_model.RootModel], [`TypeAdapter`][pydantic.type_adapter.TypeAdapter] should not be used as a type annotation for specifying fields of a `BaseModel`, etc. ## Parsing data into a specified type [`TypeAdapter`][pydantic.type_adapter.TypeAdapter] can be used to apply the parsing logic to populate Pydantic models in a more ad-hoc way. It behaves similarly to [`BaseModel.model_validate`][pydantic.main.BaseModel.model_validate], but works with arbitrary Pydantic-compatible types. This is especially useful when you want to parse results into a type that is not a direct subclass of [`BaseModel`][pydantic.main.BaseModel]. For example: ```python from typing import List from pydantic import BaseModel, TypeAdapter class Item(BaseModel): id: int name: str # `item_data` could come from an API call, e.g., via something like: # item_data = requests.get('https://my-api.com/items').json() item_data = [{'id': 1, 'name': 'My Item'}] items = TypeAdapter(List[Item]).validate_python(item_data) print(items) #> [Item(id=1, name='My Item')] ``` [`TypeAdapter`][pydantic.type_adapter.TypeAdapter] is capable of parsing data into any of the types Pydantic can handle as fields of a [`BaseModel`][pydantic.main.BaseModel]. !!! info "Performance considerations" When creating an instance of [`TypeAdapter`][pydantic.type_adapter.TypeAdapter], the provided type must be analyzed and converted into a pydantic-core schema. This comes with some non-trivial overhead, so it is recommended to create a `TypeAdapter` for a given type just once and reuse it in loops or other performance-critical code. ## Rebuilding a `TypeAdapter`'s schema In v2.10+, [`TypeAdapter`][pydantic.type_adapter.TypeAdapter] instances support deferred schema building and manual rebuilds.
This is helpful for the case of: * Types with forward references * Types for which core schema builds are expensive When you initialize a [`TypeAdapter`][pydantic.type_adapter.TypeAdapter] with a type, Pydantic analyzes the type and creates a core schema for it. This core schema contains the information needed to validate and serialize data for that type. See the [architecture documentation](../internals/architecture.md) for more information on core schemas. If you set [`defer_build`][pydantic.config.ConfigDict.defer_build] to `True` when initializing a `TypeAdapter`, Pydantic will defer building the core schema until the first time it is needed (for validation or serialization). In order to manually trigger the building of the core schema, you can call the [`rebuild`][pydantic.type_adapter.TypeAdapter.rebuild] method on the [`TypeAdapter`][pydantic.type_adapter.TypeAdapter] instance: ```python from pydantic import ConfigDict, TypeAdapter ta = TypeAdapter('MyInt', config=ConfigDict(defer_build=True)) # some time later, the forward reference is defined MyInt = int ta.rebuild() assert ta.validate_python(1) == 1 ``` <!-- docs/concepts/types.md --> Where possible Pydantic uses [standard library types](../api/standard_library_types.md) to define fields, thus smoothing the learning curve. For many useful applications, however, no standard library type exists, so Pydantic implements many commonly used types. There are also more complex types that can be found in the [Pydantic Extra Types](https://github.com/pydantic/pydantic-extra-types) package. If no existing type suits your purpose you can also implement your [own Pydantic-compatible types](#custom-types) with custom properties and validation. The following sections describe the types supported by Pydantic. * [Standard Library Types](../api/standard_library_types.md) — types from the Python standard library.
* [Strict Types](#strict-types) — types that enable you to prevent coercion from compatible types. * [Custom Data Types](#custom-types) — create your own custom data types. * [Field Type Conversions](../concepts/conversion_table.md) — strict and lax conversion between different field types. ## Type conversion During validation, Pydantic can coerce data into expected types. There are two modes of coercion: strict and lax. See [Conversion Table](../concepts/conversion_table.md) for more details on how Pydantic converts data in both strict and lax modes. See [Strict mode](../concepts/strict_mode.md) and [Strict Types](#strict-types) for details on enabling strict coercion. ## Strict Types Pydantic provides the following strict types: - [`StrictBool`][pydantic.types.StrictBool] - [`StrictBytes`][pydantic.types.StrictBytes] - [`StrictFloat`][pydantic.types.StrictFloat] - [`StrictInt`][pydantic.types.StrictInt] - [`StrictStr`][pydantic.types.StrictStr] These types will only pass validation when the validated value is of the respective type or is a subtype of that type. ### Constrained types This behavior is also exposed via the `strict` field of the constrained types and can be combined with a multitude of complex validation rules. See the individual type signatures for supported arguments. - [`conbytes()`][pydantic.types.conbytes] - [`condate()`][pydantic.types.condate] - [`condecimal()`][pydantic.types.condecimal] - [`confloat()`][pydantic.types.confloat] - [`confrozenset()`][pydantic.types.confrozenset] - [`conint()`][pydantic.types.conint] - [`conlist()`][pydantic.types.conlist] - [`conset()`][pydantic.types.conset] - [`constr()`][pydantic.types.constr] The following caveats apply: - `StrictBytes` (and the `strict` option of `conbytes()`) will accept both `bytes` and `bytearray` types. - `StrictInt` (and the `strict` option of `conint()`) will not accept `bool` types, even though `bool` is a subclass of `int` in Python. Other subclasses will work.
- `StrictFloat` (and the `strict` option of `confloat()`) will not accept `int`. Besides the above, you can also have a [`FiniteFloat`][pydantic.types.FiniteFloat] type that will only accept finite values (i.e. not `inf`, `-inf` or `nan`). ## Custom Types You can also define your own custom data types. There are several ways to achieve it. ### Composing types via `Annotated` [PEP 593] introduced `Annotated` as a way to attach runtime metadata to types without changing how type checkers interpret them. Pydantic takes advantage of this to allow you to create types that are identical to the original type as far as type checkers are concerned, but add validation, serialize differently, etc. For example, to create a type representing a positive int: ```python # or `from typing import Annotated` for Python 3.9+ from typing_extensions import Annotated from pydantic import Field, TypeAdapter, ValidationError PositiveInt = Annotated[int, Field(gt=0)] ta = TypeAdapter(PositiveInt) print(ta.validate_python(1)) #> 1 try: ta.validate_python(-1) except ValidationError as exc: print(exc) """ 1 validation error for constrained-int Input should be greater than 0 [type=greater_than, input_value=-1, input_type=int] """ ``` Note that you can also use constraints from [annotated-types](https://github.com/annotated-types/annotated-types) to make this Pydantic-agnostic: ```python from annotated_types import Gt from typing_extensions import Annotated from pydantic import TypeAdapter, ValidationError PositiveInt = Annotated[int, Gt(0)] ta = TypeAdapter(PositiveInt) print(ta.validate_python(1)) #> 1 try: ta.validate_python(-1) except ValidationError as exc: print(exc) """ 1 validation error for constrained-int Input should be greater than 0 [type=greater_than, input_value=-1, input_type=int] """ ``` #### Adding validation and serialization You can add or override validation, serialization, and JSON schemas to an arbitrary type using the markers that Pydantic exports: ```python from 
typing_extensions import Annotated from pydantic import ( AfterValidator, PlainSerializer, TypeAdapter, WithJsonSchema, ) TruncatedFloat = Annotated[ float, AfterValidator(lambda x: round(x, 1)), PlainSerializer(lambda x: f'{x:.1e}', return_type=str), WithJsonSchema({'type': 'string'}, mode='serialization'), ] ta = TypeAdapter(TruncatedFloat) input = 1.02345 assert input != 1.0 assert ta.validate_python(input) == 1.0 assert ta.dump_json(input) == b'"1.0e+00"' assert ta.json_schema(mode='validation') == {'type': 'number'} assert ta.json_schema(mode='serialization') == {'type': 'string'} ``` #### Generics You can use type variables within `Annotated` to make reusable modifications to types: ```python from typing import Any, List, Sequence, TypeVar from annotated_types import Gt, Len from typing_extensions import Annotated from pydantic import ValidationError from pydantic.type_adapter import TypeAdapter SequenceType = TypeVar('SequenceType', bound=Sequence[Any]) ShortSequence = Annotated[SequenceType, Len(max_length=10)] ta = TypeAdapter(ShortSequence[List[int]]) v = ta.validate_python([1, 2, 3, 4, 5]) assert v == [1, 2, 3, 4, 5] try: ta.validate_python([1] * 100) except ValidationError as exc: print(exc) """ 1 validation error for list[int] List should have at most 10 items after validation, not 100 [type=too_long, input_value=[1, 1, 1, 1, 1, 1, 1, 1, ... 1, 1, 1, 1, 1, 1, 1, 1], input_type=list] """ T = TypeVar('T') # or a bound=SupportGt PositiveList = List[Annotated[T, Gt(0)]] ta = TypeAdapter(PositiveList[float]) v = ta.validate_python([1]) assert type(v[0]) is float try: ta.validate_python([-1]) except ValidationError as exc: print(exc) """ 1 validation error for list[constrained-float] 0 Input should be greater than 0 [type=greater_than, input_value=-1, input_type=int] """ ``` ### Named type aliases The above examples make use of implicit type aliases. 
This means that they will not be able to have a `title` in JSON schemas and their schema will be copied between fields. You can use [PEP 695]'s `TypeAliasType` via its [typing-extensions] backport to make named aliases, allowing you to define a new type without creating subclasses. This new type can be as simple as a name or have complex validation logic attached to it: ```python from typing import List from annotated_types import Gt from typing_extensions import Annotated, TypeAliasType from pydantic import BaseModel ImplicitAliasPositiveIntList = List[Annotated[int, Gt(0)]] class Model1(BaseModel): x: ImplicitAliasPositiveIntList y: ImplicitAliasPositiveIntList print(Model1.model_json_schema()) """ { 'properties': { 'x': { 'items': {'exclusiveMinimum': 0, 'type': 'integer'}, 'title': 'X', 'type': 'array', }, 'y': { 'items': {'exclusiveMinimum': 0, 'type': 'integer'}, 'title': 'Y', 'type': 'array', }, }, 'required': ['x', 'y'], 'title': 'Model1', 'type': 'object', } """ PositiveIntList = TypeAliasType('PositiveIntList', List[Annotated[int, Gt(0)]]) class Model2(BaseModel): x: PositiveIntList y: PositiveIntList print(Model2.model_json_schema()) """ { '$defs': { 'PositiveIntList': { 'items': {'exclusiveMinimum': 0, 'type': 'integer'}, 'type': 'array', } }, 'properties': { 'x': {'$ref': '#/$defs/PositiveIntList'}, 'y': {'$ref': '#/$defs/PositiveIntList'}, }, 'required': ['x', 'y'], 'title': 'Model2', 'type': 'object', } """ ``` These named type aliases can also be generic: ```python from typing import Generic, List, TypeVar from annotated_types import Gt from typing_extensions import Annotated, TypeAliasType from pydantic import BaseModel, ValidationError T = TypeVar('T') # or a `bound=SupportGt` PositiveList = TypeAliasType( 'PositiveList', List[Annotated[T, Gt(0)]], type_params=(T,) ) class Model(BaseModel, Generic[T]): x: PositiveList[T] assert Model[int].model_validate_json('{"x": ["1"]}').x == [1] try: Model[int](x=[-1]) except ValidationError as exc: print(exc) 
""" 1 validation error for Model[int] x.0 Input should be greater than 0 [type=greater_than, input_value=-1, input_type=int] """ ``` #### Named recursive types You can also use `TypeAliasType` to create recursive types: ```python from typing import Any, Dict, List, Union from pydantic_core import PydanticCustomError from typing_extensions import Annotated, TypeAliasType from pydantic import ( TypeAdapter, ValidationError, ValidationInfo, ValidatorFunctionWrapHandler, WrapValidator, ) def json_custom_error_validator( value: Any, handler: ValidatorFunctionWrapHandler, _info: ValidationInfo ) -> Any: """Simplify the error message to avoid a gross error stemming from exhaustive checking of all union options. """ try: return handler(value) except ValidationError: raise PydanticCustomError( 'invalid_json', 'Input is not valid json', ) Json = TypeAliasType( 'Json', Annotated[ Union[Dict[str, 'Json'], List['Json'], str, int, float, bool, None], WrapValidator(json_custom_error_validator), ], ) ta = TypeAdapter(Json) v = ta.validate_python({'x': [1], 'y': {'z': True}}) assert v == {'x': [1], 'y': {'z': True}} try: ta.validate_python({'x': object()}) except ValidationError as exc: print(exc) """ 1 validation error for function-wrap[json_custom_error_validator()] Input is not valid json [type=invalid_json, input_value={'x': }, input_type=dict] """ ``` ### Customizing validation with `__get_pydantic_core_schema__` To do more extensive customization of how Pydantic handles custom classes, and in particular when you have access to the class or can subclass it, you can implement a special `__get_pydantic_core_schema__` to tell Pydantic how to generate the `pydantic-core` schema. 
While `pydantic` uses `pydantic-core` internally to handle validation and serialization, it is a new API for Pydantic V2, thus it is one of the areas most likely to be tweaked in the future and you should try to stick to the built-in constructs like those provided by `annotated-types`, `pydantic.Field`, or `BeforeValidator` and so on. You can implement `__get_pydantic_core_schema__` both on a custom type and on metadata intended to be put in `Annotated`. In both cases the API is middleware-like and similar to that of "wrap" validators: you get a `source_type` (which isn't necessarily the same as the class, in particular for generics) and a `handler` that you can call with a type to either call the next metadata in `Annotated` or call into Pydantic's internal schema generation. The simplest no-op implementation calls the handler with the type you are given, then returns that as the result. You can also choose to modify the type before calling the handler, modify the core schema returned by the handler, or not call the handler at all. #### As a method on a custom type The following is an example of a type that uses `__get_pydantic_core_schema__` to customize how it gets validated. This is equivalent to implementing `__get_validators__` in Pydantic V1. ```python from typing import Any from pydantic_core import CoreSchema, core_schema from pydantic import GetCoreSchemaHandler, TypeAdapter class Username(str): @classmethod def __get_pydantic_core_schema__( cls, source_type: Any, handler: GetCoreSchemaHandler ) -> CoreSchema: return core_schema.no_info_after_validator_function(cls, handler(str)) ta = TypeAdapter(Username) res = ta.validate_python('abc') assert isinstance(res, Username) assert res == 'abc' ``` See [JSON Schema](../concepts/json_schema.md) for more details on how to customize JSON schemas for custom types. 
#### As an annotation Often you'll want to parametrize your custom type by more than just generic type parameters (which you can do via the type system and will be discussed later). Or you may not actually care to (or want to) make an instance of your subclass; you actually want the original type, just with some extra validation done. For example, if you were to implement `pydantic.AfterValidator` (see [Adding validation and serialization](#adding-validation-and-serialization)) yourself, you'd do something similar to the following: ```python from dataclasses import dataclass from typing import Any, Callable from pydantic_core import CoreSchema, core_schema from typing_extensions import Annotated from pydantic import BaseModel, GetCoreSchemaHandler @dataclass(frozen=True) # (1)! class MyAfterValidator: func: Callable[[Any], Any] def __get_pydantic_core_schema__( self, source_type: Any, handler: GetCoreSchemaHandler ) -> CoreSchema: return core_schema.no_info_after_validator_function( self.func, handler(source_type) ) Username = Annotated[str, MyAfterValidator(str.lower)] class Model(BaseModel): name: Username assert Model(name='ABC').name == 'abc' # (2)! ``` 1. The `frozen=True` specification makes `MyAfterValidator` hashable. Without this, a union such as `Username | None` will raise an error. 2. Notice that type checkers will not complain about assigning `'ABC'` to `Username` like they did in the previous example because they do not consider `Username` to be a distinct type from `str`. #### Handling third-party types Another use case for the pattern in the previous section is to handle third party types.
```python from typing import Any from pydantic_core import core_schema from typing_extensions import Annotated from pydantic import ( BaseModel, GetCoreSchemaHandler, GetJsonSchemaHandler, ValidationError, ) from pydantic.json_schema import JsonSchemaValue class ThirdPartyType: """ This is meant to represent a type from a third-party library that wasn't designed with Pydantic integration in mind, and so doesn't have a `pydantic_core.CoreSchema` or anything. """ x: int def __init__(self): self.x = 0 class _ThirdPartyTypePydanticAnnotation: @classmethod def __get_pydantic_core_schema__( cls, _source_type: Any, _handler: GetCoreSchemaHandler, ) -> core_schema.CoreSchema: """ We return a pydantic_core.CoreSchema that behaves in the following ways: * ints will be parsed as `ThirdPartyType` instances with the int as the x attribute * `ThirdPartyType` instances will be parsed as `ThirdPartyType` instances without any changes * Nothing else will pass validation * Serialization will always return just an int """ def validate_from_int(value: int) -> ThirdPartyType: result = ThirdPartyType() result.x = value return result from_int_schema = core_schema.chain_schema( [ core_schema.int_schema(), core_schema.no_info_plain_validator_function(validate_from_int), ] ) return core_schema.json_or_python_schema( json_schema=from_int_schema, python_schema=core_schema.union_schema( [ # check if it's an instance first before doing any further work core_schema.is_instance_schema(ThirdPartyType), from_int_schema, ] ), serialization=core_schema.plain_serializer_function_ser_schema( lambda instance: instance.x ), ) @classmethod def __get_pydantic_json_schema__( cls, _core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: # Use the same schema that would be used for `int` return handler(core_schema.int_schema()) # We now create an `Annotated` wrapper that we'll use as the annotation for fields on `BaseModel`s, etc. 
PydanticThirdPartyType = Annotated[ ThirdPartyType, _ThirdPartyTypePydanticAnnotation ] # Create a model class that uses this annotation as a field class Model(BaseModel): third_party_type: PydanticThirdPartyType # Demonstrate that this field is handled correctly, that ints are parsed into `ThirdPartyType`, and that # these instances are also "dumped" directly into ints as expected. m_int = Model(third_party_type=1) assert isinstance(m_int.third_party_type, ThirdPartyType) assert m_int.third_party_type.x == 1 assert m_int.model_dump() == {'third_party_type': 1} # Do the same thing where an instance of ThirdPartyType is passed in instance = ThirdPartyType() assert instance.x == 0 instance.x = 10 m_instance = Model(third_party_type=instance) assert isinstance(m_instance.third_party_type, ThirdPartyType) assert m_instance.third_party_type.x == 10 assert m_instance.model_dump() == {'third_party_type': 10} # Demonstrate that validation errors are raised as expected for invalid inputs try: Model(third_party_type='a') except ValidationError as e: print(e) """ 2 validation errors for Model third_party_type.is-instance[ThirdPartyType] Input should be an instance of ThirdPartyType [type=is_instance_of, input_value='a', input_type=str] third_party_type.chain[int,function-plain[validate_from_int()]] Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str] """ assert Model.model_json_schema() == { 'properties': { 'third_party_type': {'title': 'Third Party Type', 'type': 'integer'} }, 'required': ['third_party_type'], 'title': 'Model', 'type': 'object', } ``` You can use this approach to e.g. define behavior for Pandas or Numpy types. #### Using `GetPydanticSchema` to reduce boilerplate ??? api "API Documentation" [`pydantic.types.GetPydanticSchema`][pydantic.types.GetPydanticSchema]
You may notice that the above examples where we create a marker class require a good amount of boilerplate. For many simple cases you can greatly minimize this by using `pydantic.GetPydanticSchema`: ```python from pydantic_core import core_schema from typing_extensions import Annotated from pydantic import BaseModel, GetPydanticSchema class Model(BaseModel): y: Annotated[ str, GetPydanticSchema( lambda tp, handler: core_schema.no_info_after_validator_function( lambda x: x * 2, handler(tp) ) ), ] assert Model(y='ab').y == 'abab' ``` #### Summary Let's recap: 1. Pydantic provides high level hooks to customize types via `Annotated` like `AfterValidator` and `Field`. Use these when possible. 2. Under the hood these use `pydantic-core` to customize validation, and you can hook into that directly using `GetPydanticSchema` or a marker class with `__get_pydantic_core_schema__`. 3. If you really want a custom type you can implement `__get_pydantic_core_schema__` on the type itself. ### Handling custom generic classes !!! warning This is an advanced technique that you might not need in the beginning. In most of the cases you will probably be fine with standard Pydantic models. You can use [Generic Classes](https://docs.python.org/3/library/typing.html#typing.Generic) as field types and perform custom validation based on the "type parameters" (or sub-types) with `__get_pydantic_core_schema__`. If the Generic class that you are using as a sub-type has a classmethod `__get_pydantic_core_schema__`, you don't need to use [`arbitrary_types_allowed`][pydantic.config.ConfigDict.arbitrary_types_allowed] for it to work. Because the `source_type` parameter is not the same as the `cls` parameter, you can use `typing.get_args` (or `typing_extensions.get_args`) to extract the generic parameters. Then you can use the `handler` to generate a schema for them by calling `handler.generate_schema`. 
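That extraction step can be illustrated in isolation — the `Box` class below is a hypothetical generic, not from Pydantic:

```python
from typing import Generic, TypeVar

from typing_extensions import get_args, get_origin

T = TypeVar('T')


class Box(Generic[T]):  # hypothetical generic class
    pass


# for a parametrized alias, the origin is the class and the args are the parameters
assert get_origin(Box[int]) is Box
assert get_args(Box[int]) == (int,)
# an unparametrized use has no origin and no args, which is why the
# example below falls back to the source type itself and `Any`
assert get_origin(Box) is None
assert get_args(Box) == ()
```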
Note that we do not do something like `handler(get_args(source_type)[0])` because we want to generate an unrelated schema for that generic parameter, not one that is influenced by the current context of `Annotated` metadata and such. This is less important for custom types, but crucial for annotated metadata that modifies schema building. ```python from dataclasses import dataclass from typing import Any, Generic, TypeVar from pydantic_core import CoreSchema, core_schema from typing_extensions import get_args, get_origin from pydantic import ( BaseModel, GetCoreSchemaHandler, ValidationError, ValidatorFunctionWrapHandler, ) ItemType = TypeVar('ItemType') # This is not a pydantic model, it's an arbitrary generic class @dataclass class Owner(Generic[ItemType]): name: str item: ItemType @classmethod def __get_pydantic_core_schema__( cls, source_type: Any, handler: GetCoreSchemaHandler ) -> CoreSchema: origin = get_origin(source_type) if origin is None: # used as `x: Owner` without params origin = source_type item_tp = Any else: item_tp = get_args(source_type)[0] # both calling handler(...) and handler.generate_schema(...) 
# would work, but prefer the latter for conceptual and consistency reasons item_schema = handler.generate_schema(item_tp) def val_item( v: Owner[Any], handler: ValidatorFunctionWrapHandler ) -> Owner[Any]: v.item = handler(v.item) return v python_schema = core_schema.chain_schema( # `chain_schema` means do the following steps in order: [ # Ensure the value is an instance of Owner core_schema.is_instance_schema(cls), # Use the item_schema to validate `items` core_schema.no_info_wrap_validator_function( val_item, item_schema ), ] ) return core_schema.json_or_python_schema( # for JSON accept an object with name and item keys json_schema=core_schema.chain_schema( [ core_schema.typed_dict_schema( { 'name': core_schema.typed_dict_field( core_schema.str_schema() ), 'item': core_schema.typed_dict_field(item_schema), } ), # after validating the json data convert it to python core_schema.no_info_before_validator_function( lambda data: Owner( name=data['name'], item=data['item'] ), # note that we reuse the same schema here as below python_schema, ), ] ), python_schema=python_schema, ) class Car(BaseModel): color: str class House(BaseModel): rooms: int class Model(BaseModel): car_owner: Owner[Car] home_owner: Owner[House] model = Model( car_owner=Owner(name='John', item=Car(color='black')), home_owner=Owner(name='James', item=House(rooms=3)), ) print(model) """ car_owner=Owner(name='John', item=Car(color='black')) home_owner=Owner(name='James', item=House(rooms=3)) """ try: # If the values of the sub-types are invalid, we get an error Model( car_owner=Owner(name='John', item=House(rooms=3)), home_owner=Owner(name='James', item=Car(color='black')), ) except ValidationError as e: print(e) """ 2 validation errors for Model car_owner Input should be a valid dictionary or instance of Car [type=model_type, input_value=House(rooms=3), input_type=House] home_owner Input should be a valid dictionary or instance of House [type=model_type, input_value=Car(color='black'), input_type=Car] """
# Similarly with JSON model = Model.model_validate_json( '{"car_owner":{"name":"John","item":{"color":"black"}},"home_owner":{"name":"James","item":{"rooms":3}}}' ) print(model) """ car_owner=Owner(name='John', item=Car(color='black')) home_owner=Owner(name='James', item=House(rooms=3)) """ try: Model.model_validate_json( '{"car_owner":{"name":"John","item":{"rooms":3}},"home_owner":{"name":"James","item":{"color":"black"}}}' ) except ValidationError as e: print(e) """ 2 validation errors for Model car_owner.item.color Field required [type=missing, input_value={'rooms': 3}, input_type=dict] home_owner.item.rooms Field required [type=missing, input_value={'color': 'black'}, input_type=dict] """ ``` #### Generic containers The same idea can be applied to create generic container types, like a custom `Sequence` type: ```python from typing import Any, Sequence, TypeVar from pydantic_core import ValidationError, core_schema from typing_extensions import get_args from pydantic import BaseModel, GetCoreSchemaHandler T = TypeVar('T') class MySequence(Sequence[T]): def __init__(self, v: Sequence[T]): self.v = v def __getitem__(self, i): return self.v[i] def __len__(self): return len(self.v) @classmethod def __get_pydantic_core_schema__( cls, source: Any, handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: instance_schema = core_schema.is_instance_schema(cls) args = get_args(source) if args: # replace the type and rely on Pydantic to generate the right schema # for `Sequence` sequence_t_schema = handler.generate_schema(Sequence[args[0]]) else: sequence_t_schema = handler.generate_schema(Sequence) non_instance_schema = core_schema.no_info_after_validator_function( MySequence, sequence_t_schema ) return core_schema.union_schema([instance_schema, non_instance_schema]) class M(BaseModel): model_config = dict(validate_default=True) s1: MySequence = [3] m = M() print(m) #> s1=<__main__.MySequence object at 0x0123456789ab> print(m.s1.v) #> [3] class M(BaseModel): s1: 
MySequence[int] M(s1=[1]) try: M(s1=['a']) except ValidationError as exc: print(exc) """ 2 validation errors for M s1.is-instance[MySequence] Input should be an instance of MySequence [type=is_instance_of, input_value=['a'], input_type=list] s1.function-after[MySequence(), json-or-python[json=list[int],python=chain[is-instance[Sequence],function-wrap[sequence_validator()]]]].0 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str] """ ``` ### Access to field name !!!note This was not possible with Pydantic V2 to V2.3, it was [re-added](https://github.com/pydantic/pydantic/pull/7542) in Pydantic V2.4. As of Pydantic V2.4, you can access the field name via the `handler.field_name` within `__get_pydantic_core_schema__` and thereby set the field name which will be available from `info.field_name`. ```python from typing import Any from pydantic_core import core_schema from pydantic import BaseModel, GetCoreSchemaHandler, ValidationInfo class CustomType: """Custom type that stores the field it was used in.""" def __init__(self, value: int, field_name: str): self.value = value self.field_name = field_name def __repr__(self): return f'CustomType<{self.value} {self.field_name!r}>' @classmethod def validate(cls, value: int, info: ValidationInfo): return cls(value, info.field_name) @classmethod def __get_pydantic_core_schema__( cls, source_type: Any, handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: return core_schema.with_info_after_validator_function( cls.validate, handler(int), field_name=handler.field_name ) class MyModel(BaseModel): my_field: CustomType m = MyModel(my_field=1) print(m.my_field) #> CustomType<1 'my_field'> ``` You can also access `field_name` from the markers used with `Annotated`, like [`AfterValidator`][pydantic.functional_validators.AfterValidator]. 
```python
from typing_extensions import Annotated

from pydantic import AfterValidator, BaseModel, ValidationInfo


def my_validators(value: int, info: ValidationInfo):
    return f'<{value} {info.field_name!r}>'


class MyModel(BaseModel):
    my_field: Annotated[int, AfterValidator(my_validators)]


m = MyModel(my_field=1)
print(m.my_field)
#> <1 'my_field'>
```

[PEP 593]: https://peps.python.org/pep-0593/
[PEP 695]: https://peps.python.org/pep-0695/
[typing-extensions]: https://github.com/python/typing_extensions

pydantic-2.10.6/docs/concepts/unions.md

Unions are fundamentally different to all other types Pydantic validates - instead of requiring all fields/items/values to be valid, unions require only one member to be valid.

This leads to some nuance around how to validate unions:

* which member(s) of the union should you validate data against, and in which order?
* which errors to raise when validation fails?

Validating unions feels like adding another orthogonal dimension to the validation process.

To solve these problems, Pydantic supports three fundamental approaches to validating unions:

1. [left to right mode](#left-to-right-mode) - the simplest approach, each member of the union is tried in order and the first match is returned
2. [smart mode](#smart-mode) - similar to "left to right mode" members are tried in order; however, validation will proceed past the first match to attempt to find a better match, this is the default mode for most union validation
3. [discriminated unions](#discriminated-unions) - only one member of the union is tried, based on a discriminator

!!! tip

    In general, we recommend using [discriminated unions](#discriminated-unions). They are both more performant and more predictable than untagged unions, as they allow you to control which member of the union to validate against.
    For complex cases, if you're using untagged unions, it's recommended to use `union_mode='left_to_right'` if you need guarantees about the order of validation attempts against the union members.

    If you're looking for incredibly specialized behavior, you can use a [custom validator](../concepts/validators.md#field-validators).

## Union Modes

### Left to Right Mode

!!! note

    Because this mode often leads to unexpected validation results, it is not the default in Pydantic >=2, instead `union_mode='smart'` is the default.

With this approach, validation is attempted against each member of the union in the order they're defined, and the first successful validation is accepted as input.

If validation fails on all members, the validation error includes the errors from all members of the union.

`union_mode='left_to_right'` must be set as a [`Field`](../concepts/fields.md) parameter on union fields where you want to use it.

```python {title="Union with left to right mode"}
from typing import Union

from pydantic import BaseModel, Field, ValidationError


class User(BaseModel):
    id: Union[str, int] = Field(union_mode='left_to_right')


print(User(id=123))
#> id=123
print(User(id='hello'))
#> id='hello'

try:
    User(id=[])
except ValidationError as e:
    print(e)
    """
    2 validation errors for User
    id.str
      Input should be a valid string [type=string_type, input_value=[], input_type=list]
    id.int
      Input should be a valid integer [type=int_type, input_value=[], input_type=list]
    """
```

The order of members is very important in this case, as demonstrated by tweaking the above example:

```python {title="Union with left to right - unexpected results"}
from typing import Union

from pydantic import BaseModel, Field


class User(BaseModel):
    id: Union[int, str] = Field(union_mode='left_to_right')


print(User(id=123))  # (1)
#> id=123
print(User(id='456'))  # (2)
#> id=456
```

1. As expected the input is validated against the `int` member and the result is as expected.
2.
We're in lax mode and the numeric string `'456'` is valid as input to the first member of the union, `int`. Since that is tried first, we get the surprising result of `id` being an `int` instead of a `str`.

### Smart Mode

Because of the potentially surprising results of `union_mode='left_to_right'`, in Pydantic >=2 the default mode for `Union` validation is `union_mode='smart'`.

In this mode, pydantic attempts to select the best match for the input from the union members. The exact algorithm may change between Pydantic minor releases to allow for improvements in both performance and accuracy.

!!! note

    We reserve the right to tweak the internal `smart` matching algorithm in future versions of Pydantic. If you rely on very specific matching behavior, it's recommended to use `union_mode='left_to_right'` or [discriminated unions](#discriminated-unions).

??? info "Smart Mode Algorithm"

    The smart mode algorithm uses two metrics to determine the best match for the input:

    1. The number of valid fields set (relevant for models, dataclasses, and typed dicts)
    2. The exactness of the match (relevant for all types)

    #### Number of valid fields set

    !!! note

        This metric was introduced in Pydantic v2.8.0. Prior to this version, only exactness was used to determine the best match.

    This metric is currently only relevant for models, dataclasses, and typed dicts.

    The greater the number of valid fields set, the better the match. The number of fields set on nested models is also taken into account. These counts bubble up to the top-level union, where the union member with the highest count is considered the best match.

    For data types where this metric is relevant, we prioritize this count over exactness. For all other types, we use solely exactness.
#### Exactness For `exactness`, Pydantic scores a match of a union member into one of the following three groups (from highest score to lowest score): - An exact type match, for example an `int` input to a `float | int` union validation is an exact type match for the `int` member - Validation would have succeeded in [`strict` mode](../concepts/strict_mode.md) - Validation would have succeeded in lax mode The union match which produced the highest exactness score will be considered the best match. In smart mode, the following steps are taken to try to select the best match for the input: === "`BaseModel`, `dataclass`, and `TypedDict`" 1. Union members are attempted left to right, with any successful matches scored into one of the three exactness categories described above, with the valid fields set count also tallied. 2. After all members have been evaluated, the member with the highest "valid fields set" count is returned. 3. If there's a tie for the highest "valid fields set" count, the exactness score is used as a tiebreaker, and the member with the highest exactness score is returned. 4. If validation failed on all the members, return all the errors. === "All other data types" 1. Union members are attempted left to right, with any successful matches scored into one of the three exactness categories described above. - If validation succeeds with an exact type match, that member is returned immediately and following members will not be attempted. 2. If validation succeeded on at least one member as a "strict" match, the leftmost of those "strict" matches is returned. 3. If validation succeeded on at least one member in "lax" mode, the leftmost match is returned. 4. Validation failed on all the members, return all the errors. 
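Both metrics described above can be seen in a short sketch (the `Summary`/`Detail` models and `Holder` wrapper here are invented for illustration):

```python
from typing import Union

from pydantic import BaseModel, TypeAdapter


class Summary(BaseModel):
    name: str


class Detail(BaseModel):
    name: str
    age: int


class Holder(BaseModel):
    item: Union[Summary, Detail]


# Both members validate this input (extra keys are ignored by default),
# but `Detail` sets two fields vs. one for `Summary`, so `Detail` wins
h = Holder(item={'name': 'ann', 'age': 3})
print(type(h.item).__name__)
#> Detail

# For non-model types only exactness counts: '1' is an exact `str` match,
# so smart mode keeps it a string even though lax `int` validation
# (tried first) would also succeed
ta = TypeAdapter(Union[int, str])
print(repr(ta.validate_python('1')))
#> '1'
print(repr(ta.validate_python(1)))
#> 1
```

Compare this with the `union_mode='left_to_right'` example earlier, where `'456'` was coerced to an `int` simply because `int` came first.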
```python from typing import Union from uuid import UUID from pydantic import BaseModel class User(BaseModel): id: Union[int, str, UUID] name: str user_01 = User(id=123, name='John Doe') print(user_01) #> id=123 name='John Doe' print(user_01.id) #> 123 user_02 = User(id='1234', name='John Doe') print(user_02) #> id='1234' name='John Doe' print(user_02.id) #> 1234 user_03_uuid = UUID('cf57432e-809e-4353-adbd-9d5c0d733868') user_03 = User(id=user_03_uuid, name='John Doe') print(user_03) #> id=UUID('cf57432e-809e-4353-adbd-9d5c0d733868') name='John Doe' print(user_03.id) #> cf57432e-809e-4353-adbd-9d5c0d733868 print(user_03_uuid.int) #> 275603287559914445491632874575877060712 ``` !!! tip The type `Optional[x]` is a shorthand for `Union[x, None]`. See more details in [Required fields](../concepts/models.md#required-fields). ## Discriminated Unions **Discriminated unions are sometimes referred to as "Tagged Unions".** We can use discriminated unions to more efficiently validate `Union` types, by choosing which member of the union to validate against. This makes validation more efficient and also avoids a proliferation of errors when validation fails. Adding discriminator to unions also means the generated JSON schema implements the [associated OpenAPI specification](https://github.com/OAI/OpenAPI-Specification/blob/main/versions/3.1.0.md#discriminator-object). ### Discriminated Unions with `str` discriminators Frequently, in the case of a `Union` with multiple models, there is a common field to all members of the union that can be used to distinguish which union case the data should be validated against; this is referred to as the "discriminator" in [OpenAPI](https://swagger.io/docs/specification/data-models/inheritance-and-polymorphism/). To validate models based on that information you can set the same field - let's call it `my_discriminator` - in each of the models with a discriminated value, which is one (or many) `Literal` value(s). 
For your `Union`, you can set the discriminator in its value: `Field(discriminator='my_discriminator')`. ```python from typing import Literal, Union from pydantic import BaseModel, Field, ValidationError class Cat(BaseModel): pet_type: Literal['cat'] meows: int class Dog(BaseModel): pet_type: Literal['dog'] barks: float class Lizard(BaseModel): pet_type: Literal['reptile', 'lizard'] scales: bool class Model(BaseModel): pet: Union[Cat, Dog, Lizard] = Field(discriminator='pet_type') n: int print(Model(pet={'pet_type': 'dog', 'barks': 3.14}, n=1)) #> pet=Dog(pet_type='dog', barks=3.14) n=1 try: Model(pet={'pet_type': 'dog'}, n=1) except ValidationError as e: print(e) """ 1 validation error for Model pet.dog.barks Field required [type=missing, input_value={'pet_type': 'dog'}, input_type=dict] """ ``` ### Discriminated Unions with callable `Discriminator` ??? api "API Documentation" [`pydantic.types.Discriminator`][pydantic.types.Discriminator]
In the case of a `Union` with multiple models, sometimes there isn't a single uniform field across all models that you can use as a discriminator. This is the perfect use case for a callable `Discriminator`. !!! tip When you're designing callable discriminators, remember that you might have to account for both `dict` and model type inputs. This pattern is similar to that of `mode='before'` validators, where you have to anticipate various forms of input. But wait! You ask, I only anticipate passing in `dict` types, why do I need to account for models? Pydantic uses callable discriminators for serialization as well, at which point the input to your callable is very likely to be a model instance. In the following examples, you'll see that the callable discriminators are designed to handle both `dict` and model inputs. If you don't follow this practice, it's likely that you'll, in the best case, get warnings during serialization, and in the worst case, get runtime errors during validation. ```python from typing import Any, Literal, Union from typing_extensions import Annotated from pydantic import BaseModel, Discriminator, Tag class Pie(BaseModel): time_to_cook: int num_ingredients: int class ApplePie(Pie): fruit: Literal['apple'] = 'apple' class PumpkinPie(Pie): filling: Literal['pumpkin'] = 'pumpkin' def get_discriminator_value(v: Any) -> str: if isinstance(v, dict): return v.get('fruit', v.get('filling')) return getattr(v, 'fruit', getattr(v, 'filling', None)) class ThanksgivingDinner(BaseModel): dessert: Annotated[ Union[ Annotated[ApplePie, Tag('apple')], Annotated[PumpkinPie, Tag('pumpkin')], ], Discriminator(get_discriminator_value), ] apple_variation = ThanksgivingDinner.model_validate( {'dessert': {'fruit': 'apple', 'time_to_cook': 60, 'num_ingredients': 8}} ) print(repr(apple_variation)) """ ThanksgivingDinner(dessert=ApplePie(time_to_cook=60, num_ingredients=8, fruit='apple')) """ pumpkin_variation = ThanksgivingDinner.model_validate( { 'dessert': { 
'filling': 'pumpkin', 'time_to_cook': 40, 'num_ingredients': 6, } } ) print(repr(pumpkin_variation)) """ ThanksgivingDinner(dessert=PumpkinPie(time_to_cook=40, num_ingredients=6, filling='pumpkin')) """ ``` `Discriminator`s can also be used to validate `Union` types with combinations of models and primitive types. For example: ```python from typing import Any, Union from typing_extensions import Annotated from pydantic import BaseModel, Discriminator, Tag, ValidationError def model_x_discriminator(v: Any) -> str: if isinstance(v, int): return 'int' if isinstance(v, (dict, BaseModel)): return 'model' else: # return None if the discriminator value isn't found return None class SpecialValue(BaseModel): value: int class DiscriminatedModel(BaseModel): value: Annotated[ Union[ Annotated[int, Tag('int')], Annotated['SpecialValue', Tag('model')], ], Discriminator(model_x_discriminator), ] model_data = {'value': {'value': 1}} m = DiscriminatedModel.model_validate(model_data) print(m) #> value=SpecialValue(value=1) int_data = {'value': 123} m = DiscriminatedModel.model_validate(int_data) print(m) #> value=123 try: DiscriminatedModel.model_validate({'value': 'not an int or a model'}) except ValidationError as e: print(e) # (1)! """ 1 validation error for DiscriminatedModel value Unable to extract tag using discriminator model_x_discriminator() [type=union_tag_not_found, input_value='not an int or a model', input_type=str] """ ``` 1. Notice the callable discriminator function returns `None` if a discriminator value is not found. When `None` is returned, this `union_tag_not_found` error is raised. !!! note Using the [[`typing.Annotated`][] fields syntax](../concepts/types.md#composing-types-via-annotated) can be handy to regroup the `Union` and `discriminator` information. See the next example for more details. There are a few ways to set a discriminator for a field, all varying slightly in syntax. For `str` discriminators: ``` some_field: Union[...] 
= Field(discriminator='my_discriminator')
some_field: Annotated[Union[...], Field(discriminator='my_discriminator')]
```

For callable `Discriminator`s:

```
some_field: Union[...] = Field(discriminator=Discriminator(...))
some_field: Annotated[Union[...], Discriminator(...)]
some_field: Annotated[Union[...], Field(discriminator=Discriminator(...))]
```

!!! warning

    Discriminated unions cannot be used with only a single variant, such as `Union[Cat]`.

    Python changes `Union[T]` into `T` at interpretation time, so it is not possible for `pydantic` to distinguish fields of `Union[T]` from `T`.

### Nested Discriminated Unions

Only one discriminator can be set for a field but sometimes you want to combine multiple discriminators. You can do it by creating nested `Annotated` types, e.g.:

```python
from typing import Literal, Union

from typing_extensions import Annotated

from pydantic import BaseModel, Field, ValidationError


class BlackCat(BaseModel):
    pet_type: Literal['cat']
    color: Literal['black']
    black_name: str


class WhiteCat(BaseModel):
    pet_type: Literal['cat']
    color: Literal['white']
    white_name: str


Cat = Annotated[Union[BlackCat, WhiteCat], Field(discriminator='color')]


class Dog(BaseModel):
    pet_type: Literal['dog']
    name: str


Pet = Annotated[Union[Cat, Dog], Field(discriminator='pet_type')]


class Model(BaseModel):
    pet: Pet
    n: int


m = Model(pet={'pet_type': 'cat', 'color': 'black', 'black_name': 'felix'}, n=1)
print(m)
#> pet=BlackCat(pet_type='cat', color='black', black_name='felix') n=1
try:
    Model(pet={'pet_type': 'cat', 'color': 'red'}, n='1')
except ValidationError as e:
    print(e)
    """
    1 validation error for Model
    pet.cat
      Input tag 'red' found using 'color' does not match any of the expected tags: 'black', 'white' [type=union_tag_invalid, input_value={'pet_type': 'cat', 'color': 'red'}, input_type=dict]
    """
try:
    Model(pet={'pet_type': 'cat', 'color': 'black'}, n='1')
except ValidationError as e:
    print(e)
    """
    1 validation error for Model
    pet.cat.black.black_name
      Field
required [type=missing, input_value={'pet_type': 'cat', 'color': 'black'}, input_type=dict] """ ``` !!! tip If you want to validate data against a union, and solely a union, you can use pydantic's [`TypeAdapter`](../concepts/type_adapter.md) construct instead of inheriting from the standard `BaseModel`. In the context of the previous example, we have the following: ```python {lint="skip" test="skip"} type_adapter = TypeAdapter(Pet) pet = type_adapter.validate_python( {'pet_type': 'cat', 'color': 'black', 'black_name': 'felix'} ) print(repr(pet)) #> BlackCat(pet_type='cat', color='black', black_name='felix') ``` ## Union Validation Errors When `Union` validation fails, error messages can be quite verbose, as they will produce validation errors for each case in the union. This is especially noticeable when dealing with recursive models, where reasons may be generated at each level of recursion. Discriminated unions help to simplify error messages in this case, as validation errors are only produced for the case with a matching discriminator value. You can also customize the error type, message, and context for a `Discriminator` by passing these specifications as parameters to the `Discriminator` constructor, as seen in the example below. 
```python from typing import Union from typing_extensions import Annotated from pydantic import BaseModel, Discriminator, Tag, ValidationError # Errors are quite verbose with a normal Union: class Model(BaseModel): x: Union[str, 'Model'] try: Model.model_validate({'x': {'x': {'x': 1}}}) except ValidationError as e: print(e) """ 4 validation errors for Model x.str Input should be a valid string [type=string_type, input_value={'x': {'x': 1}}, input_type=dict] x.Model.x.str Input should be a valid string [type=string_type, input_value={'x': 1}, input_type=dict] x.Model.x.Model.x.str Input should be a valid string [type=string_type, input_value=1, input_type=int] x.Model.x.Model.x.Model Input should be a valid dictionary or instance of Model [type=model_type, input_value=1, input_type=int] """ try: Model.model_validate({'x': {'x': {'x': {}}}}) except ValidationError as e: print(e) """ 4 validation errors for Model x.str Input should be a valid string [type=string_type, input_value={'x': {'x': {}}}, input_type=dict] x.Model.x.str Input should be a valid string [type=string_type, input_value={'x': {}}, input_type=dict] x.Model.x.Model.x.str Input should be a valid string [type=string_type, input_value={}, input_type=dict] x.Model.x.Model.x.Model.x Field required [type=missing, input_value={}, input_type=dict] """ # Errors are much simpler with a discriminated union: def model_x_discriminator(v): if isinstance(v, str): return 'str' if isinstance(v, (dict, BaseModel)): return 'model' class DiscriminatedModel(BaseModel): x: Annotated[ Union[ Annotated[str, Tag('str')], Annotated['DiscriminatedModel', Tag('model')], ], Discriminator( model_x_discriminator, custom_error_type='invalid_union_member', # (1)! custom_error_message='Invalid union member', # (2)! custom_error_context={'discriminator': 'str_or_model'}, # (3)! 
), ] try: DiscriminatedModel.model_validate({'x': {'x': {'x': 1}}}) except ValidationError as e: print(e) """ 1 validation error for DiscriminatedModel x.model.x.model.x Invalid union member [type=invalid_union_member, input_value=1, input_type=int] """ try: DiscriminatedModel.model_validate({'x': {'x': {'x': {}}}}) except ValidationError as e: print(e) """ 1 validation error for DiscriminatedModel x.model.x.model.x.model.x Field required [type=missing, input_value={}, input_type=dict] """ # The data is still handled properly when valid: data = {'x': {'x': {'x': 'a'}}} m = DiscriminatedModel.model_validate(data) print(m.model_dump()) #> {'x': {'x': {'x': 'a'}}} ``` 1. `custom_error_type` is the `type` attribute of the `ValidationError` raised when validation fails. 2. `custom_error_message` is the `msg` attribute of the `ValidationError` raised when validation fails. 3. `custom_error_context` is the `ctx` attribute of the `ValidationError` raised when validation fails. You can also simplify error messages by labeling each case with a [`Tag`][pydantic.types.Tag]. 
This is especially useful when you have complex types like those in this example:

```python
from typing import Dict, List, Union

from typing_extensions import Annotated

from pydantic import AfterValidator, Tag, TypeAdapter, ValidationError

DoubledList = Annotated[List[int], AfterValidator(lambda x: x * 2)]
StringsMap = Dict[str, str]


# Not using any `Tag`s for each union case, the errors are not so nice to look at
adapter = TypeAdapter(Union[DoubledList, StringsMap])

try:
    adapter.validate_python(['a'])
except ValidationError as exc_info:
    print(exc_info)
    """
    2 validation errors for union[function-after[<lambda>(), list[int]],dict[str,str]]
    function-after[<lambda>(), list[int]].0
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str]
    dict[str,str]
      Input should be a valid dictionary [type=dict_type, input_value=['a'], input_type=list]
    """

tag_adapter = TypeAdapter(
    Union[
        Annotated[DoubledList, Tag('DoubledList')],
        Annotated[StringsMap, Tag('StringsMap')],
    ]
)

try:
    tag_adapter.validate_python(['a'])
except ValidationError as exc_info:
    print(exc_info)
    """
    2 validation errors for union[DoubledList,StringsMap]
    DoubledList.0
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str]
    StringsMap
      Input should be a valid dictionary [type=dict_type, input_value=['a'], input_type=list]
    """
```

pydantic-2.10.6/docs/concepts/validation_decorator.md

??? api "API Documentation"

    [`pydantic.validate_call_decorator.validate_call`][pydantic.validate_call_decorator.validate_call]
The [`validate_call()`][pydantic.validate_call] decorator allows the arguments passed to a function to be parsed and validated using the function's annotations before the function is called. While under the hood this uses the same approach of model creation and initialisation (see [Validators](validators.md) for more details), it provides an extremely easy way to apply validation to your code with minimal boilerplate. Example of usage: ```python from pydantic import ValidationError, validate_call @validate_call def repeat(s: str, count: int, *, separator: bytes = b'') -> bytes: b = s.encode() return separator.join(b for _ in range(count)) a = repeat('hello', 3) print(a) #> b'hellohellohello' b = repeat('x', '4', separator=b' ') print(b) #> b'x x x x' try: c = repeat('hello', 'wrong') except ValidationError as exc: print(exc) """ 1 validation error for repeat 1 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='wrong', input_type=str] """ ``` ## Parameter types Parameter types are inferred from type annotations on the function, or as [`Any`][typing.Any] if not annotated. All types listed in [types](types.md) can be validated, including Pydantic models and [custom types](types.md#custom-types). As with the rest of Pydantic, types are by default coerced by the decorator before they're passed to the actual function: ```python from datetime import date from pydantic import validate_call @validate_call def greater_than(d1: date, d2: date, *, include_equal=False) -> date: # (1)! if include_equal: return d1 >= d2 else: return d1 > d2 d1 = '2000-01-01' # (2)! d2 = date(2001, 1, 1) greater_than(d1, d2, include_equal=True) ``` 1. Because `include_equal` has no type annotation, it will be inferred as [`Any`][typing.Any]. 2. Although `d1` is a string, it will be converted to a [`date`][datetime.date] object. 
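If the string-to-`date` conversion shown above is not desired, it can be turned off by passing a strict config to the decorator. A minimal sketch, assuming the same `greater_than`-style signature (`strict_greater_than` is a made-up name):

```python
from datetime import date

from pydantic import ConfigDict, ValidationError, validate_call


@validate_call(config=ConfigDict(strict=True))
def strict_greater_than(d1: date, d2: date) -> bool:
    return d1 > d2


# Actual `date` instances still validate in strict mode
print(strict_greater_than(date(2002, 1, 1), date(2001, 1, 1)))
#> True

# ...but the string form is no longer coerced to a `date`
try:
    strict_greater_than('2000-01-01', date(2001, 1, 1))
except ValidationError as exc:
    print('coercion rejected:', exc.error_count(), 'error')
    #> coercion rejected: 1 error
```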
Type coercion like this can be extremely helpful, but also confusing or not desired (see [model data conversion](models.md#data-conversion)). [Strict mode](strict_mode.md) can be enabled by using a [custom configuration](#custom-configuration). !!! note "Validating the return value" By default, the return value of the function is **not** validated. To do so, the `validate_return` argument of the decorator can be set to `True`. ## Function signatures The [`validate_call()`][pydantic.validate_call] decorator is designed to work with functions using all possible [parameter configurations][parameter] and all possible combinations of these: * Positional or keyword parameters with or without defaults. * Keyword-only parameters: parameters after `*,`. * Positional-only parameters: parameters before `, /`. * Variable positional parameters defined via `*` (often `*args`). * Variable keyword parameters defined via `**` (often `**kwargs`). ??? example ```python from pydantic import validate_call @validate_call def pos_or_kw(a: int, b: int = 2) -> str: return f'a={a} b={b}' print(pos_or_kw(1, b=3)) #> a=1 b=3 @validate_call def kw_only(*, a: int, b: int = 2) -> str: return f'a={a} b={b}' print(kw_only(a=1)) #> a=1 b=2 print(kw_only(a=1, b=3)) #> a=1 b=3 @validate_call def pos_only(a: int, b: int = 2, /) -> str: return f'a={a} b={b}' print(pos_only(1)) #> a=1 b=2 @validate_call def var_args(*args: int) -> str: return str(args) print(var_args(1)) #> (1,) print(var_args(1, 2, 3)) #> (1, 2, 3) @validate_call def var_kwargs(**kwargs: int) -> str: return str(kwargs) print(var_kwargs(a=1)) #> {'a': 1} print(var_kwargs(a=1, b=2)) #> {'a': 1, 'b': 2} @validate_call def armageddon( a: int, /, b: int, *c: int, d: int, e: int = None, **f: int, ) -> str: return f'a={a} b={b} c={c} d={d} e={e} f={f}' print(armageddon(1, 2, d=3)) #> a=1 b=2 c=() d=3 e=None f={} print(armageddon(1, 2, 3, 4, 5, 6, d=8, e=9, f=10, spam=11)) #> a=1 b=2 c=(3, 4, 5, 6) d=8 e=9 f={'f': 10, 'spam': 11} ``` !!! 
note "[`Unpack`][typing.Unpack] for keyword parameters"

    [`Unpack`][typing.Unpack] and typed dictionaries can be used to annotate the variable keyword parameters of a function:

    ```python
    from typing_extensions import TypedDict, Unpack

    from pydantic import validate_call


    class Point(TypedDict):
        x: int
        y: int


    @validate_call
    def add_coords(**kwargs: Unpack[Point]) -> int:
        return kwargs['x'] + kwargs['y']


    add_coords(x=1, y=2)
    ```

    For reference, see the [related specification section] and [PEP 692].

    [related specification section]: https://typing.readthedocs.io/en/latest/spec/callables.html#unpack-for-keyword-arguments
    [PEP 692]: https://peps.python.org/pep-0692/

## Using the [`Field()`][pydantic.Field] function to describe function parameters

The [`Field()` function](fields.md) can also be used with the decorator to provide extra information about the field and validations.

In general it should be used in a type hint with [Annotated](types.md#composing-types-via-annotated), unless `default_factory` is specified, in which case it should be used as the default value of the field:

```python
from datetime import datetime

from typing_extensions import Annotated

from pydantic import Field, ValidationError, validate_call


@validate_call
def how_many(num: Annotated[int, Field(gt=10)]):
    return num


try:
    how_many(1)
except ValidationError as e:
    print(e)
    """
    1 validation error for how_many
    0
      Input should be greater than 10 [type=greater_than, input_value=1, input_type=int]
    """


@validate_call
def when(dt: datetime = Field(default_factory=datetime.now)):
    return dt


print(type(when()))
#> <class 'datetime.datetime'>
```

[Aliases](fields.md#field-aliases) can be used with the decorator as normal:

```python
from typing_extensions import Annotated

from pydantic import Field, validate_call


@validate_call
def how_many(num: Annotated[int, Field(gt=10, alias='number')]):
    return num


how_many(number=42)
```

## Accessing the original function

The original function which was decorated can still be accessed by using the
`raw_function` attribute. This is useful if in some scenarios you trust your input arguments and want to call the function in the most efficient way (see [notes on performance](#performance) below): ```python from pydantic import validate_call @validate_call def repeat(s: str, count: int, *, separator: bytes = b'') -> bytes: b = s.encode() return separator.join(b for _ in range(count)) a = repeat('hello', 3) print(a) #> b'hellohellohello' b = repeat.raw_function('good bye', 2, separator=b', ') print(b) #> b'good bye, good bye' ``` ## Async functions [`validate_call()`][pydantic.validate_call] can also be used on async functions: ```python class Connection: async def execute(self, sql, *args): return 'testing@example.com' conn = Connection() # ignore-above import asyncio from pydantic import PositiveInt, ValidationError, validate_call @validate_call async def get_user_email(user_id: PositiveInt): # `conn` is some fictional connection to a database email = await conn.execute('select email from users where id=$1', user_id) if email is None: raise RuntimeError('user not found') else: return email async def main(): email = await get_user_email(123) print(email) #> testing@example.com try: await get_user_email(-4) except ValidationError as exc: print(exc.errors()) """ [ { 'type': 'greater_than', 'loc': (0,), 'msg': 'Input should be greater than 0', 'input': -4, 'ctx': {'gt': 0}, 'url': 'https://errors.pydantic.dev/2/v/greater_than', } ] """ asyncio.run(main()) # requires: `conn.execute()` that will return `'testing@example.com'` ``` ## Compatibility with type checkers As the [`validate_call()`][pydantic.validate_call] decorator preserves the decorated function's signature, it should be compatible with type checkers (such as mypy and pyright). 
However, due to current limitations in the Python type system, the
[`raw_function`](#accessing-the-original-function) or other attributes won't be recognized and you will need
to suppress the error (usually with a `# type: ignore` comment).

## Custom configuration

Similarly to Pydantic models, the `config` parameter of the decorator can be used to specify a custom configuration:

```python
from pydantic import ConfigDict, ValidationError, validate_call


class Foobar:
    def __init__(self, v: str):
        self.v = v

    def __add__(self, other: 'Foobar') -> str:
        return f'{self} + {other}'

    def __str__(self) -> str:
        return f'Foobar({self.v})'


@validate_call(config=ConfigDict(arbitrary_types_allowed=True))
def add_foobars(a: Foobar, b: Foobar):
    return a + b


c = add_foobars(Foobar('a'), Foobar('b'))
print(c)
#> Foobar(a) + Foobar(b)

try:
    add_foobars(1, 2)
except ValidationError as e:
    print(e)
    """
    2 validation errors for add_foobars
    0
      Input should be an instance of Foobar [type=is_instance_of, input_value=1, input_type=int]
    1
      Input should be an instance of Foobar [type=is_instance_of, input_value=2, input_type=int]
    """
```

## Extension — validating arguments before calling a function

In some cases, it may be helpful to separate validation of a function's arguments from the function call itself.
This might be useful when a particular function is costly/time consuming.

Here's an example of a workaround you can use for that pattern:

```python
from pydantic import validate_call


@validate_call
def validate_foo(a: int, b: int):
    def foo():
        return a + b

    return foo


foo = validate_foo(a=1, b=2)
print(foo())
#> 3
```

## Limitations

### Validation exception

Currently upon validation failure, a standard Pydantic [`ValidationError`][pydantic_core.ValidationError] is raised
(see [model error handling](models.md#error-handling) for details).

This is also true for missing required arguments, where Python normally raises a [`TypeError`][].
### Performance

We've made a big effort to make Pydantic as performant as possible. While the inspection of the decorated
function is only performed once, there will still be a performance impact when making calls to the function
compared to using the original function. In many situations, this will have little or no noticeable effect.

However, be aware that [`validate_call()`][pydantic.validate_call] is not an equivalent or alternative to
function definitions in strongly typed languages, and it never will be.

pydantic-2.10.6/docs/concepts/validators.md

In addition to Pydantic's [built-in validation capabilities](./fields.md#field-constraints), you can leverage
custom validators at the field and model levels to enforce more complex constraints and ensure the integrity
of your data.

!!! tip
    Want to quickly jump to the relevant validator section?
- Field validators --- - [field *after* validators](#field-after-validator) - [field *before* validators](#field-before-validator) - [field *plain* validators](#field-plain-validator) - [field *wrap* validators](#field-wrap-validator) - Model validators --- - [model *before* validators](#model-before-validator) - [model *after* validators](#model-after-validator) - [model *wrap* validators](#model-wrap-validator)
## Field validators ??? api "API Documentation" [`pydantic.functional_validators.WrapValidator`][pydantic.functional_validators.WrapValidator]
[`pydantic.functional_validators.PlainValidator`][pydantic.functional_validators.PlainValidator]
[`pydantic.functional_validators.BeforeValidator`][pydantic.functional_validators.BeforeValidator]
[`pydantic.functional_validators.AfterValidator`][pydantic.functional_validators.AfterValidator]
[`pydantic.functional_validators.field_validator`][pydantic.functional_validators.field_validator]
In its simplest form, a field validator is a callable taking the value to be validated as an argument and **returning the validated value**. The callable can perform checks for specific conditions (see [raising validation errors](#raising-validation-errors)) and make changes to the validated value (coercion or mutation). **Four** different types of validators can be used. They can all be defined using the [annotated pattern](./fields.md#the-annotated-pattern) or using the [`field_validator()`][pydantic.field_validator] decorator, applied on a [class method][classmethod]: [](){#field-after-validator} - __*After* validators__: run after Pydantic's internal validation. They are generally more type safe and thus easier to implement. === "Annotated pattern" Here is an example of a validator performing a validation check, and returning the value unchanged. ```python from typing_extensions import Annotated from pydantic import AfterValidator, BaseModel, ValidationError def is_even(value: int) -> int: if value % 2 == 1: raise ValueError(f'{value} is not an even number') return value # (1)! class Model(BaseModel): number: Annotated[int, AfterValidator(is_even)] try: Model(number=1) except ValidationError as err: print(err) """ 1 validation error for Model number Value error, 1 is not an even number [type=value_error, input_value=1, input_type=int] """ ``` 1. Note that it is important to return the validated value. === "Decorator" Here is an example of a validator performing a validation check, and returning the value unchanged, this time using the [`field_validator()`][pydantic.field_validator] decorator. ```python from pydantic import BaseModel, ValidationError, field_validator class Model(BaseModel): number: int @field_validator('number', mode='after') # (1)! @classmethod def is_even(cls, value: int) -> int: if value % 2 == 1: raise ValueError(f'{value} is not an even number') return value # (2)! 
try: Model(number=1) except ValidationError as err: print(err) """ 1 validation error for Model number Value error, 1 is not an even number [type=value_error, input_value=1, input_type=int] """ ``` 1. `'after'` is the default mode for the decorator, and can be omitted. 2. Note that it is important to return the validated value. ??? example "Example mutating the value" Here is an example of a validator making changes to the validated value (no exception is raised). === "Annotated pattern" ```python from typing_extensions import Annotated from pydantic import AfterValidator, BaseModel def double_number(value: int) -> int: return value * 2 class Model(BaseModel): number: Annotated[int, AfterValidator(double_number)] print(Model(number=2)) #> number=4 ``` === "Decorator" ```python from pydantic import BaseModel, field_validator class Model(BaseModel): number: int @field_validator('number', mode='after') # (1)! @classmethod def double_number(cls, value: int) -> int: return value * 2 print(Model(number=2)) #> number=4 ``` 1. `'after'` is the default mode for the decorator, and can be omitted. [](){#field-before-validator} - __*Before* validators__: run before Pydantic's internal parsing and validation (e.g. coercion of a `str` to an `int`). These are more flexible than [*after* validators](#field-after-validator), but they also have to deal with the raw input, which in theory could be any arbitrary object. The value returned from this callable is then validated against the provided type annotation by Pydantic. === "Annotated pattern" ```python from typing import Any, List from typing_extensions import Annotated from pydantic import BaseModel, BeforeValidator, ValidationError def ensure_list(value: Any) -> Any: # (1)! if not isinstance(value, list): # (2)! 
return [value] else: return value class Model(BaseModel): numbers: Annotated[List[int], BeforeValidator(ensure_list)] print(Model(numbers=2)) #> numbers=[2] try: Model(numbers='str') except ValidationError as err: print(err) # (3)! """ 1 validation error for Model numbers.0 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='str', input_type=str] """ ``` 1. Notice the use of [`Any`][typing.Any] as a type hint for `value`. *Before* validators take the raw input, which can be anything. 2. Note that you might want to check for other sequence types (such as tuples) that would normally successfully validate against the `list` type. *Before* validators give you more flexibility, but you have to account for every possible case. 3. Pydantic still performs validation against the `int` type, no matter if our `ensure_list` validator did operations on the original input type. === "Decorator" ```python from typing import Any, List from pydantic import BaseModel, ValidationError, field_validator class Model(BaseModel): numbers: List[int] @field_validator('numbers', mode='before') @classmethod def ensure_list(cls, value: Any) -> Any: # (1)! if not isinstance(value, list): # (2)! return [value] else: return value print(Model(numbers=2)) #> numbers=[2] try: Model(numbers='str') except ValidationError as err: print(err) # (3)! """ 1 validation error for Model numbers.0 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='str', input_type=str] """ ``` 1. Notice the use of [`Any`][typing.Any] as a type hint for `value`. *Before* validators take the raw input, which can be anything. 2. Note that you might want to check for other sequence types (such as tuples) that would normally successfully validate against the `list` type. *Before* validators give you more flexibility, but you have to account for every possible case. 3. 
Pydantic still performs validation against the `int` type, no matter if our `ensure_list` validator did operations on the original input type. [](){#field-plain-validator} - __*Plain* validators__: act similarly to *before* validators but they **terminate validation immediately** after returning, so no further validators are called and Pydantic does not do any of its internal validation against the field type. === "Annotated pattern" ```python from typing import Any from typing_extensions import Annotated from pydantic import BaseModel, PlainValidator def val_number(value: Any) -> Any: if isinstance(value, int): return value * 2 else: return value class Model(BaseModel): number: Annotated[int, PlainValidator(val_number)] print(Model(number=4)) #> number=8 print(Model(number='invalid')) # (1)! #> number='invalid' ``` 1. Although `'invalid'` shouldn't validate against the `int` type, Pydantic accepts the input. === "Decorator" ```python from typing import Any from pydantic import BaseModel, field_validator class Model(BaseModel): number: int @field_validator('number', mode='plain') @classmethod def val_number(cls, value: Any) -> Any: if isinstance(value, int): return value * 2 else: return value print(Model(number=4)) #> number=8 print(Model(number='invalid')) # (1)! #> number='invalid' ``` 1. Although `'invalid'` shouldn't validate against the `int` type, Pydantic accepts the input. [](){#field-wrap-validator} - __*Wrap* validators__: are the most flexible of all. You can run code before or after Pydantic and other validators process the input, or you can terminate validation immediately, either by returning the value early or by raising an error. Such validators must be defined with a **mandatory** extra `handler` parameter: a callable taking the value to be validated as an argument. Internally, this handler will delegate validation of the value to Pydantic. 
You are free to wrap the call to the handler in a [`try..except`][handling exceptions] block, or not call it at all. [handling exceptions]: https://docs.python.org/3/tutorial/errors.html#handling-exceptions === "Annotated pattern" ```python {lint="skip"} from typing import Any from typing_extensions import Annotated from pydantic import BaseModel, Field, ValidationError, ValidatorFunctionWrapHandler, WrapValidator def truncate(value: Any, handler: ValidatorFunctionWrapHandler) -> str: try: return handler(value) except ValidationError as err: if err.errors()[0]['type'] == 'string_too_long': return handler(value[:5]) else: raise class Model(BaseModel): my_string: Annotated[str, Field(max_length=5), WrapValidator(truncate)] print(Model(my_string='abcde')) #> my_string='abcde' print(Model(my_string='abcdef')) #> my_string='abcde' ``` === "Decorator" ```python {lint="skip"} from typing import Any from typing_extensions import Annotated from pydantic import BaseModel, Field, ValidationError, ValidatorFunctionWrapHandler, field_validator class Model(BaseModel): my_string: Annotated[str, Field(max_length=5)] @field_validator('my_string', mode='wrap') @classmethod def truncate(cls, value: Any, handler: ValidatorFunctionWrapHandler) -> str: try: return handler(value) except ValidationError as err: if err.errors()[0]['type'] == 'string_too_long': return handler(value[:5]) else: raise print(Model(my_string='abcde')) #> my_string='abcde' print(Model(my_string='abcdef')) #> my_string='abcde' ``` !!! note "Validation of default values" As mentioned in the [fields documentation](./fields.md#validate-default-values), default values of fields are *not* validated unless configured to do so, and thus custom validators will not be applied as well. ### Which validator pattern to use While both approaches can achieve the same thing, each pattern provides different benefits. 
#### Using the annotated pattern

One of the key benefits of using the [annotated pattern](./fields.md#the-annotated-pattern) is to make
validators reusable:

```python
from typing import List

from typing_extensions import Annotated

from pydantic import AfterValidator, BaseModel


def is_even(value: int) -> int:
    if value % 2 == 1:
        raise ValueError(f'{value} is not an even number')
    return value


EvenNumber = Annotated[int, AfterValidator(is_even)]


class Model1(BaseModel):
    my_number: EvenNumber


class Model2(BaseModel):
    other_number: Annotated[EvenNumber, AfterValidator(lambda v: v + 2)]


class Model3(BaseModel):
    list_of_even_numbers: List[EvenNumber]  # (1)!
```

1. As mentioned in the [annotated pattern](./fields.md#the-annotated-pattern) documentation,
   we can also make use of validators for specific parts of the annotation (in this case,
   validation is applied for list items, but not the whole list).

It is also easier to understand which validators are applied to a type, by just looking at the field annotation.

#### Using the decorator pattern

One of the key benefits of using the [`field_validator()`][pydantic.field_validator] decorator is to apply
the function to multiple fields:

```python
from pydantic import BaseModel, field_validator


class Model(BaseModel):
    f1: str
    f2: str

    @field_validator('f1', 'f2', mode='before')
    @classmethod
    def capitalize(cls, value: str) -> str:
        return value.capitalize()
```

Here are a couple additional notes about the decorator usage:

- If you want the validator to apply to all fields (including the ones defined in subclasses),
  you can pass `'*'` as the field name argument.
- By default, the decorator will ensure the provided field name(s) are defined on the model.
  If you want to disable this check during class creation, you can do so by passing `False` to
  the `check_fields` argument. This is useful when the field validator is defined on a base class,
  and the field is expected to be set on subclasses.

## Model validators

???
api "API Documentation" [`pydantic.functional_validators.model_validator`][pydantic.functional_validators.model_validator]
Validation can also be performed on the entire model's data using the [`model_validator()`][pydantic.model_validator] decorator. **Three** different types of model validators can be used: [](){#model-after-validator} - __*After* validators__: run after the whole model has been validated. As such, they are defined as *instance* methods and can be seen as post-initialization hooks. Important note: the validated instance should be returned. ```python from typing_extensions import Self from pydantic import BaseModel, model_validator class UserModel(BaseModel): username: str password: str password_repeat: str @model_validator(mode='after') def check_passwords_match(self) -> Self: if self.password != self.password_repeat: raise ValueError('Passwords do not match') return self ``` [](){#model-before-validator} - __*Before* validators__: are run before the model is instantiated. These are more flexible than *after* validators, but they also have to deal with the raw input, which in theory could be any arbitrary object. ```python from typing import Any from pydantic import BaseModel, model_validator class UserModel(BaseModel): username: str @model_validator(mode='before') @classmethod def check_card_number_not_present(cls, data: Any) -> Any: # (1)! if isinstance(data, dict): # (2)! if 'card_number' in data: raise ValueError("'card_number' should not be included") return data ``` 1. Notice the use of [`Any`][typing.Any] as a type hint for `data`. *Before* validators take the raw input, which can be anything. 2. Most of the time, the input data will be a dictionary (e.g. when calling `UserModel(username='...')`). However, this is not always the case. For instance, if the [`from_attributes`][pydantic.ConfigDict.from_attributes] configuration value is set, you might receive an arbitrary class instance for the `data` argument. [](){#model-wrap-validator} - __*Wrap* validators__: are the most flexible of all. 
You can run code before or after Pydantic and other validators process the input data, or you can terminate validation immediately, either by returning the data early or by raising an error. ```python {lint="skip"} import logging from typing import Any from typing_extensions import Self from pydantic import BaseModel, ModelWrapValidatorHandler, ValidationError, model_validator class UserModel(BaseModel): username: str @model_validator(mode='wrap') @classmethod def log_failed_validation(cls, data: Any, handler: ModelWrapValidatorHandler[Self]) -> Self: try: return handler(data) except ValidationError: logging.error('Model %s failed to validate with data %s', cls, data) raise ``` !!! note "On inheritance" A model validator defined in a base class will be called during the validation of a subclass instance. Overriding a model validator in a subclass will override the base class' validator, and thus only the subclass' version of said validator will be called. ## Raising validation errors To raise a validation error, three types of exceptions can be used: - [`ValueError`][]: this is the most common exception raised inside validators. - [`AssertionError`][]: using the [assert][] statement also works, but be aware that these statements are skipped when Python is run with the [-O][] optimization flag. - [`PydanticCustomError`][pydantic_core.PydanticCustomError]: a bit more verbose, but provides extra flexibility: ```python from pydantic_core import PydanticCustomError from pydantic import BaseModel, ValidationError, field_validator class Model(BaseModel): x: int @field_validator('x', mode='after') @classmethod def validate_x(cls, v: int) -> int: if v % 42 == 0: raise PydanticCustomError( 'the_answer_error', '{number} is the answer!', {'number': v}, ) return v try: Model(x=42 * 2) except ValidationError as e: print(e) """ 1 validation error for Model x 84 is the answer! 
[type=the_answer_error, input_value=84, input_type=int]
    """
```

## Validation info

Both field and model validator callables (in all modes) can optionally take an extra
[`ValidationInfo`][pydantic.ValidationInfo] argument, providing useful extra information, such as:

- [already validated data](#validation-data)
- [user defined context](#validation-context)
- the current validation mode: either `'python'` or `'json'` (see the [`mode`][pydantic.ValidationInfo.mode] property)
- the current field name (see the [`field_name`][pydantic.ValidationInfo.field_name] property).

### Validation data

For field validators, the already validated data can be accessed using the
[`data`][pydantic.ValidationInfo.data] property. Here is an example that can be used as an
alternative to the [*after* model validator](#model-after-validator) example:

```python
from pydantic import BaseModel, ValidationInfo, field_validator


class UserModel(BaseModel):
    password: str
    password_repeat: str
    username: str

    @field_validator('password_repeat', mode='after')
    @classmethod
    def check_passwords_match(cls, value: str, info: ValidationInfo) -> str:
        if value != info.data['password']:
            raise ValueError('Passwords do not match')
        return value
```

!!! warning
    As validation is performed in the [order fields are defined](./models.md#field-ordering), you have to
    make sure you are not accessing a field that hasn't been validated yet. In the code above, for example,
    the `username` validated value is not available yet, as it is defined *after* `password_repeat`.

The [`data`][pydantic.ValidationInfo.data] property is `None` for [model validators](#model-validators).
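The [`field_name`][pydantic.ValidationInfo.field_name] and [`mode`][pydantic.ValidationInfo.mode] properties mentioned above can be illustrated with a small sketch (the model and field names here are arbitrary):

```python
from pydantic import BaseModel, ValidationInfo, field_validator


class Model(BaseModel):
    a: str
    b: str

    @field_validator('a', 'b', mode='after')
    @classmethod
    def show_info(cls, value: str, info: ValidationInfo) -> str:
        print(f'validating {info.field_name!r} in {info.mode!r} mode')
        return value


Model(a='x', b='y')
#> validating 'a' in 'python' mode
#> validating 'b' in 'python' mode

Model.model_validate_json('{"a": "x", "b": "y"}')
#> validating 'a' in 'json' mode
#> validating 'b' in 'json' mode
```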
### Validation context You can pass a context object to the [validation methods](./models.md#validating-data), which can be accessed inside the validator functions using the [`context`][pydantic.ValidationInfo.context] property: ```python from pydantic import BaseModel, ValidationInfo, field_validator class Model(BaseModel): text: str @field_validator('text', mode='after') @classmethod def remove_stopwords(cls, v: str, info: ValidationInfo) -> str: if isinstance(info.context, dict): stopwords = info.context.get('stopwords', set()) v = ' '.join(w for w in v.split() if w.lower() not in stopwords) return v data = {'text': 'This is an example document'} print(Model.model_validate(data)) # no context #> text='This is an example document' print(Model.model_validate(data, context={'stopwords': ['this', 'is', 'an']})) #> text='example document' ``` Similarly, you can [use a context for serialization](../concepts/serialization.md#serialization-context). ??? note "Providing context when directly instantiating a model" It is currently not possible to provide a context when directly instantiating a model (i.e. when calling `Model(...)`). 
You can work around this through the use of a [`ContextVar`][contextvars.ContextVar] and a custom `__init__` method: ```python from __future__ import annotations from contextlib import contextmanager from contextvars import ContextVar from typing import Any, Generator from pydantic import BaseModel, ValidationInfo, field_validator _init_context_var = ContextVar('_init_context_var', default=None) @contextmanager def init_context(value: dict[str, Any]) -> Generator[None]: token = _init_context_var.set(value) try: yield finally: _init_context_var.reset(token) class Model(BaseModel): my_number: int def __init__(self, /, **data: Any) -> None: self.__pydantic_validator__.validate_python( data, self_instance=self, context=_init_context_var.get(), ) @field_validator('my_number') @classmethod def multiply_with_context(cls, value: int, info: ValidationInfo) -> int: if isinstance(info.context, dict): multiplier = info.context.get('multiplier', 1) value = value * multiplier return value print(Model(my_number=2)) #> my_number=2 with init_context({'multiplier': 3}): print(Model(my_number=2)) #> my_number=6 print(Model(my_number=2)) #> my_number=2 ``` ## Ordering of validators When using the [annotated pattern](#using-the-annotated-pattern), the order in which validators are applied is defined as follows: [*before*](#field-before-validator) and [*wrap*](#field-wrap-validator) validators are run from right to left, and [*after*](#field-after-validator) validators are then run from left to right: ```python {lint="skip" test="skip"} from pydantic import AfterValidator, BaseModel, BeforeValidator, WrapValidator class Model(BaseModel): name: Annotated[ str, AfterValidator(runs_3rd), AfterValidator(runs_4th), BeforeValidator(runs_2nd), WrapValidator(runs_1st), ] ``` Internally, validators defined using [the decorator](#using-the-decorator-pattern) are converted to their annotated form counterpart and added last after the existing metadata for the field. 
This means that the same ordering logic applies. ## Special types Pydantic provides a few special utilities that can be used to customize validation. - [`InstanceOf`][pydantic.functional_validators.InstanceOf] can be used to validate that a value is an instance of a given class. ```python from typing import List from pydantic import BaseModel, InstanceOf, ValidationError class Fruit: def __repr__(self): return self.__class__.__name__ class Banana(Fruit): ... class Apple(Fruit): ... class Basket(BaseModel): fruits: List[InstanceOf[Fruit]] print(Basket(fruits=[Banana(), Apple()])) #> fruits=[Banana, Apple] try: Basket(fruits=[Banana(), 'Apple']) except ValidationError as e: print(e) """ 1 validation error for Basket fruits.1 Input should be an instance of Fruit [type=is_instance_of, input_value='Apple', input_type=str] """ ``` - [`SkipValidation`][pydantic.functional_validators.SkipValidation] can be used to skip validation on a field. ```python from typing import List from pydantic import BaseModel, SkipValidation class Model(BaseModel): names: List[SkipValidation[str]] m = Model(names=['foo', 'bar']) print(m) #> names=['foo', 'bar'] m = Model(names=['foo', 123]) # (1)! print(m) #> names=['foo', 123] ``` 1. Note that the validation of the second item is skipped. If it has the wrong type it will emit a warning during serialization. - [`PydanticUseDefault`][pydantic_core.PydanticUseDefault] can be used to notify Pydantic that the default value should be used. 
```python from typing import Any from pydantic_core import PydanticUseDefault from typing_extensions import Annotated from pydantic import BaseModel, BeforeValidator def default_if_none(value: Any) -> Any: if value is None: raise PydanticUseDefault() return value class Model(BaseModel): name: Annotated[str, BeforeValidator(default_if_none)] = 'default_name' print(Model(name=None)) #> name='default_name' ``` ## JSON Schema and field validators When using [*before*](#field-before-validator), [*plain*](#field-plain-validator) or [*wrap*](#field-wrap-validator) field validators, the accepted input type may be different from the field annotation. Consider the following example: ```python from typing import Any from pydantic import BaseModel, field_validator class Model(BaseModel): value: str @field_validator('value', mode='before') @classmethod def cast_ints(cls, value: Any) -> Any: if isinstance(value, int): return str(value) else: return value print(Model(value='a')) #> value='a' print(Model(value=1)) #> value='1' ``` While the type hint for `value` is `str`, the `cast_ints` validator also allows integers. To specify the correct input type, the `json_schema_input_type` argument can be provided: ```python from typing import Any, Union from pydantic import BaseModel, field_validator class Model(BaseModel): value: str @field_validator( 'value', mode='before', json_schema_input_type=Union[int, str] ) @classmethod def cast_ints(cls, value: Any) -> Any: if isinstance(value, int): return str(value) else: return value print(Model.model_json_schema()['properties']['value']) #> {'anyOf': [{'type': 'integer'}, {'type': 'string'}], 'title': 'Value'} ``` As a convenience, Pydantic will use the field type if the argument is not provided (unless you are using a [*plain*](#field-plain-validator) validator, in which case `json_schema_input_type` defaults to [`Any`][typing.Any] as the field type is completely discarded). 
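Since a *plain* validator discards the field type entirely, `json_schema_input_type` is especially useful there. The following sketch adapts the example above to a *plain* validator (assuming the behavior described above, where the argument replaces the otherwise-default [`Any`][typing.Any] input schema):

```python
from typing import Any, Union

from pydantic import BaseModel, field_validator


class Model(BaseModel):
    value: str

    @field_validator(
        'value', mode='plain', json_schema_input_type=Union[int, str]
    )
    @classmethod
    def cast_ints(cls, value: Any) -> Any:
        if isinstance(value, int):
            return str(value)
        return value


print(Model.model_json_schema()['properties']['value'])
```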
pydantic-2.10.6/docs/contributing.md000066400000000000000000000254321474456633400174220ustar00rootroot00000000000000We'd love you to contribute to Pydantic! ## Issues Questions, feature requests and bug reports are all welcome as [discussions or issues](https://github.com/pydantic/pydantic/issues/new/choose). **However, to report a security vulnerability, please see our [security policy](https://github.com/pydantic/pydantic/security/policy).** To make it as simple as possible for us to help you, please include the output of the following call in your issue: ```bash python -c "import pydantic.version; print(pydantic.version.version_info())" ``` If you're using Pydantic prior to **v2.0** please use: ```bash python -c "import pydantic.utils; print(pydantic.utils.version_info())" ``` Please try to always include the above unless you're unable to install Pydantic or **know** it's not relevant to your question or feature request. ## Pull Requests It should be extremely simple to get started and create a Pull Request. Pydantic is released regularly so you should see your improvements release in a matter of days or weeks 🚀. Unless your change is trivial (typo, docs tweak etc.), please create an issue to discuss the change before creating a pull request. !!! note "Pydantic V1 is in maintenance mode" Pydantic v1 is in maintenance mode, meaning that only bug fixes and security fixes will be accepted. New features should be targeted at Pydantic v2. To submit a fix to Pydantic v1, use the `1.10.X-fixes` as a target branch. If you're looking for something to get your teeth into, check out the ["help wanted"](https://github.com/pydantic/pydantic/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22) label on github. To make contributing as easy and fast as possible, you'll want to run tests and linting locally. Luckily, Pydantic has few dependencies, doesn't require compiling and tests don't need access to databases, etc. 
Because of this, setting up and running the tests should be very simple. !!! tip **tl;dr**: use `make format` to fix formatting, `make` to run tests and linting and `make docs` to build the docs. ### Prerequisites You'll need the following prerequisites: - Any Python version between **Python 3.9 and 3.12** - [**uv**](https://docs.astral.sh/uv/getting-started/installation/) or other virtual environment tool - **git** - **make** ### Installation and setup Fork the repository on GitHub and clone your fork locally. ```bash # Clone your fork and cd into the repo directory git clone git@github.com:/pydantic.git cd pydantic # Install UV and pre-commit # We use pipx here, for other options see: # https://docs.astral.sh/uv/getting-started/installation/ # https://pre-commit.com/#install # To get pipx itself: # https://pypa.github.io/pipx/ pipx install uv pipx install pre-commit # Install pydantic, dependencies, test dependencies and doc dependencies make install ``` ### Check out a new branch and make your changes Create a new branch for your changes. ```bash # Checkout a new branch and make your changes git checkout -b my-new-feature-branch # Make your changes... ``` ### Run tests and linting Run tests and linting locally to make sure everything is working as expected. ```bash # Run automated code formatting and linting make format # Pydantic uses ruff, an awesome Python linter written in rust # https://github.com/astral-sh/ruff # Run tests and linting make # There are a few sub-commands in Makefile like `test`, `testcov` and `lint` # which you might want to use, but generally just `make` should be all you need. # You can run `make help` to see more options. ``` ### Build documentation If you've made any changes to the documentation (including changes to function signatures, class definitions, or docstrings that will appear in the API documentation), make sure it builds successfully. We use `mkdocs-material[imaging]` to support social previews. 
You can find directions on how to install the required dependencies [here](https://squidfunk.github.io/mkdocs-material/plugins/requirements/image-processing/). ```bash # Build documentation make docs # If you have changed the documentation, make sure it builds successfully. # You can also use `uv run mkdocs serve` to serve the documentation at localhost:8000 ``` If this isn't working due to issues with the imaging plugin, try commenting out the `social` plugin line in `mkdocs.yml` and running `make docs` again. #### Updating the documentation We push a new version of the documentation with each minor release, and we push to a `dev` path with each commit to `main`. If you're updating the documentation out of cycle with a minor release and want your changes to be reflected on `latest`, do the following: 1. Open a PR against `main` with your docs changes 2. Once the PR is merged, checkout the `docs-update` branch. This branch should be up to date with the latest patch release. For example, if the latest release is `v2.9.2`, you should make sure `docs-update` is up to date with the `v2.9.2` tag. 3. Checkout a new branch from `docs-update` and cherry-pick your changes onto this branch. 4. Push your changes and open a PR against `docs-update`. 5. Once the PR is merged, the new docs will be built and deployed. !!! note Maintainer shortcut - as a maintainer, you can skip the second PR and just cherry pick directly onto the `docs-update` branch. ### Commit and push your changes Commit your changes, push your branch to GitHub, and create a pull request. Please follow the pull request template and fill in as much information as possible. Link to any relevant issues and include a description of your changes. When your pull request is ready for review, add a comment with the message "please review" and we'll take a look as soon as we can. 
## Documentation style

Documentation is written in Markdown and built using [Material for MkDocs](https://squidfunk.github.io/mkdocs-material/). API documentation is built from docstrings using [mkdocstrings](https://mkdocstrings.github.io/).

### Code documentation

When contributing to Pydantic, please make sure that all code is well documented. The following should be documented using properly formatted docstrings:

- Modules
- Class definitions
- Function definitions
- Module-level variables

Pydantic uses [Google-style docstrings](https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings) formatted according to [PEP 257](https://www.python.org/dev/peps/pep-0257/) guidelines. (See [Example Google Style Python Docstrings](https://sphinxcontrib-napoleon.readthedocs.io/en/latest/example_google.html) for further examples.)

[pydocstyle](https://www.pydocstyle.org/en/stable/index.html) is used for linting docstrings. You can run `make format` to check your docstrings.

Where there is a conflict between Google-style docstrings and pydocstyle linting, follow the pydocstyle linting hints.

Class attributes and function arguments should be documented in the format "name: description." When applicable, a return type should be documented with just a description. Types are inferred from the signature.

```python
class Foo:
    """A class docstring.

    Attributes:
        bar: A description of bar. Defaults to "bar".
    """

    bar: str = 'bar'
```

```python
def bar(self, baz: int) -> str:
    """A function docstring.

    Args:
        baz: A description of `baz`.

    Returns:
        A description of the return value.
    """

    return 'bar'
```

You may include example code in docstrings. This code should be complete, self-contained, and runnable. Docstring examples are tested, so make sure they are correct and complete. See [`FieldInfo.from_annotated_attribute`][pydantic.fields.FieldInfo.from_annotated_attribute] for an example.

!!! note "Class and instance attributes"
    Class attributes should be documented in the class docstring.

    Instance attributes should be documented as "Args" in the `__init__` docstring.

### Documentation Style

In general, documentation should be written in a friendly, approachable style. It should be easy to read and understand, and should be as concise as possible while still being complete.

Code examples are encouraged, but should be kept short and simple. However, every code example should be complete, self-contained, and runnable. (If you're not sure how to do this, ask for help!)

We prefer print output to naked asserts, but if you're testing something that doesn't have a useful print output, asserts are fine.

Pydantic's unit tests will test all code examples in the documentation, so it's important that they are correct and complete. When adding a new code example, use the following to test examples and update their formatting and output:

```bash
# Run tests and update code examples
pytest tests/test_docs.py --update-examples
```

## Debugging Python and Rust

If you're working with `pydantic` and `pydantic-core`, you might find it helpful to debug Python and Rust code together. Here's a quick guide on how to do that. This tutorial is done in VSCode, but you can use similar steps in other IDEs.
## Badges

[![Pydantic v1](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/pydantic/pydantic/main/docs/badge/v1.json)](https://pydantic.dev)
[![Pydantic v2](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/pydantic/pydantic/main/docs/badge/v2.json)](https://pydantic.dev)

Pydantic has a badge that you can use to show that your project uses Pydantic. You can use this badge in your `README.md`:

### With Markdown

```md
[![Pydantic v1](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/pydantic/pydantic/main/docs/badge/v1.json)](https://pydantic.dev)
[![Pydantic v2](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/pydantic/pydantic/main/docs/badge/v2.json)](https://pydantic.dev)
```

### With reStructuredText

```rst
.. image:: https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/pydantic/pydantic/main/docs/badge/v1.json
  :target: https://pydantic.dev
  :alt: Pydantic

.. image:: https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/pydantic/pydantic/main/docs/badge/v2.json
  :target: https://pydantic.dev
  :alt: Pydantic
```

### With HTML

```html
<a href="https://pydantic.dev"><img src="https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/pydantic/pydantic/main/docs/badge/v1.json" alt="Pydantic Version 1"></a>
<a href="https://pydantic.dev"><img src="https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/pydantic/pydantic/main/docs/badge/v2.json" alt="Pydantic Version 2"></a>
```

Pydantic will raise a [`ValidationError`][pydantic_core.ValidationError] whenever it finds an error in the data it's validating.

!!! note
    Validation code should not raise `ValidationError` itself, but rather raise a `ValueError` or `AssertionError` (or subclass thereof) which will be caught and used to populate `ValidationError`.

One exception will be raised regardless of the number of errors found, and that `ValidationError` will contain information about all the errors and how they happened.
You can access these errors in several ways: | Method | Description | |-------------------|--------------------------------------------------------| | `e.errors()` | Returns a list of errors found in the input data. | | `e.error_count()` | Returns the number of errors found in `errors`. | | `e.json()` | Returns a JSON representation of `errors`. | | `str(e)` | Returns a human-readable representation of the errors. | Each error object contains: | Property | Description | |----------|--------------------------------------------------------------------------------| | `ctx` | An optional object which contains values required to render the error message. | | `input` | The input provided for validation. | | `loc` | The error's location as a list. | | `msg` | A human-readable explanation of the error. | | `type` | A computer-readable identifier of the error type. | | `url` | The URL to further information about the error. | The first item in the `loc` list will be the field where the error occurred, and if the field is a [sub-model](../concepts/models.md#nested-models), subsequent items will be present to indicate the nested location of the error. 
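For instance, the accessor methods listed above can be exercised like this (a minimal sketch; the two-field model and inputs are purely illustrative):

```python
import json

from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: int
    y: int


try:
    Model(x='not an int', y='also not an int')
except ValidationError as exc:
    err = exc

print(err.error_count())  # one count covering both bad fields
#> 2
print(json.loads(err.json())[0]['type'])  # `json()` returns a JSON string
#> int_parsing
print(str(err).splitlines()[0])  # human-readable summary
#> 2 validation errors for Model
```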
As a demonstration: ```python from typing import List from pydantic import BaseModel, ValidationError, conint class Location(BaseModel): lat: float = 0.1 lng: float = 10.1 class Model(BaseModel): is_required: float gt_int: conint(gt=42) list_of_ints: List[int] = None a_float: float = None recursive_model: Location = None data = dict( list_of_ints=['1', 2, 'bad'], a_float='not a float', recursive_model={'lat': 4.2, 'lng': 'New York'}, gt_int=21, ) try: Model(**data) except ValidationError as e: print(e) """ 5 validation errors for Model is_required Field required [type=missing, input_value={'list_of_ints': ['1', 2,...ew York'}, 'gt_int': 21}, input_type=dict] gt_int Input should be greater than 42 [type=greater_than, input_value=21, input_type=int] list_of_ints.2 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='bad', input_type=str] a_float Input should be a valid number, unable to parse string as a number [type=float_parsing, input_value='not a float', input_type=str] recursive_model.lng Input should be a valid number, unable to parse string as a number [type=float_parsing, input_value='New York', input_type=str] """ try: Model(**data) except ValidationError as e: print(e.errors()) """ [ { 'type': 'missing', 'loc': ('is_required',), 'msg': 'Field required', 'input': { 'list_of_ints': ['1', 2, 'bad'], 'a_float': 'not a float', 'recursive_model': {'lat': 4.2, 'lng': 'New York'}, 'gt_int': 21, }, 'url': 'https://errors.pydantic.dev/2/v/missing', }, { 'type': 'greater_than', 'loc': ('gt_int',), 'msg': 'Input should be greater than 42', 'input': 21, 'ctx': {'gt': 42}, 'url': 'https://errors.pydantic.dev/2/v/greater_than', }, { 'type': 'int_parsing', 'loc': ('list_of_ints', 2), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'bad', 'url': 'https://errors.pydantic.dev/2/v/int_parsing', }, { 'type': 'float_parsing', 'loc': ('a_float',), 'msg': 'Input should be a valid number, 
unable to parse string as a number',
            'input': 'not a float',
            'url': 'https://errors.pydantic.dev/2/v/float_parsing',
        },
        {
            'type': 'float_parsing',
            'loc': ('recursive_model', 'lng'),
            'msg': 'Input should be a valid number, unable to parse string as a number',
            'input': 'New York',
            'url': 'https://errors.pydantic.dev/2/v/float_parsing',
        },
    ]
    """
```

### Custom Errors

In your custom data types or validators you should use `ValueError` or `AssertionError` to raise errors.

See [validators](../concepts/validators.md) for more details on use of the `@field_validator` decorator.

```python
from pydantic import BaseModel, ValidationError, field_validator


class Model(BaseModel):
    foo: str

    @field_validator('foo')
    def value_must_equal_bar(cls, v):
        if v != 'bar':
            raise ValueError('value must be "bar"')
        return v


try:
    Model(foo='ber')
except ValidationError as e:
    print(e)
    """
    1 validation error for Model
    foo
      Value error, value must be "bar" [type=value_error, input_value='ber', input_type=str]
    """

    print(e.errors())
    """
    [
        {
            'type': 'value_error',
            'loc': ('foo',),
            'msg': 'Value error, value must be "bar"',
            'input': 'ber',
            'ctx': {'error': ValueError('value must be "bar"')},
            'url': 'https://errors.pydantic.dev/2/v/value_error',
        }
    ]
    """
```

You can also use [`PydanticCustomError`][pydantic_core.PydanticCustomError] to fully control the error structure:

```python
from pydantic_core import PydanticCustomError

from pydantic import BaseModel, ValidationError, field_validator


class Model(BaseModel):
    foo: str

    @field_validator('foo')
    def value_must_equal_bar(cls, v):
        if v != 'bar':
            raise PydanticCustomError(
                'not_a_bar',
                'value is not "bar", got "{wrong_value}"',
                dict(wrong_value=v),
            )
        return v


try:
    Model(foo='ber')
except ValidationError as e:
    print(e)
    """
    1 validation error for Model
    foo
      value is not "bar", got "ber" [type=not_a_bar, input_value='ber', input_type=str]
    """
```

## Error messages

Pydantic attempts to provide useful default error messages for validation and usage errors.
We've provided documentation for default error codes in the following sections: - [Validation Errors](validation_errors.md) - [Usage Errors](usage_errors.md) ### Customize error messages You can customize error messages by creating a custom error handler. ```python from typing import Dict, List from pydantic_core import ErrorDetails from pydantic import BaseModel, HttpUrl, ValidationError CUSTOM_MESSAGES = { 'int_parsing': 'This is not an integer! 🤦', 'url_scheme': 'Hey, use the right URL scheme! I wanted {expected_schemes}.', } def convert_errors( e: ValidationError, custom_messages: Dict[str, str] ) -> List[ErrorDetails]: new_errors: List[ErrorDetails] = [] for error in e.errors(): custom_message = custom_messages.get(error['type']) if custom_message: ctx = error.get('ctx') error['msg'] = ( custom_message.format(**ctx) if ctx else custom_message ) new_errors.append(error) return new_errors class Model(BaseModel): a: int b: HttpUrl try: Model(a='wrong', b='ftp://example.com') except ValidationError as e: errors = convert_errors(e, CUSTOM_MESSAGES) print(errors) """ [ { 'type': 'int_parsing', 'loc': ('a',), 'msg': 'This is not an integer! 🤦', 'input': 'wrong', 'url': 'https://errors.pydantic.dev/2/v/int_parsing', }, { 'type': 'url_scheme', 'loc': ('b',), 'msg': "Hey, use the right URL scheme! I wanted 'http' or 'https'.", 'input': 'ftp://example.com', 'ctx': {'expected_schemes': "'http' or 'https'"}, 'url': 'https://errors.pydantic.dev/2/v/url_scheme', }, ] """ ``` A common use case would be to translate error messages. For example, in the above example, we could translate the error messages replacing the `CUSTOM_MESSAGES` dictionary with a dictionary of translations. Another example is customizing the way that the `'loc'` value of an error is represented. 
```python
from typing import Any, Dict, List, Tuple, Union

from pydantic import BaseModel, ValidationError


def loc_to_dot_sep(loc: Tuple[Union[str, int], ...]) -> str:
    path = ''
    for i, x in enumerate(loc):
        if isinstance(x, str):
            if i > 0:
                path += '.'
            path += x
        elif isinstance(x, int):
            path += f'[{x}]'
        else:
            raise TypeError('Unexpected type')
    return path


def convert_errors(e: ValidationError) -> List[Dict[str, Any]]:
    new_errors: List[Dict[str, Any]] = e.errors()
    for error in new_errors:
        error['loc'] = loc_to_dot_sep(error['loc'])
    return new_errors


class TestNestedModel(BaseModel):
    key: str
    value: str


class TestModel(BaseModel):
    items: List[TestNestedModel]


data = {'items': [{'key': 'foo', 'value': 'bar'}, {'key': 'baz'}]}

try:
    TestModel.model_validate(data)
except ValidationError as e:
    print(e.errors())  # (1)!
    """
    [
        {
            'type': 'missing',
            'loc': ('items', 1, 'value'),
            'msg': 'Field required',
            'input': {'key': 'baz'},
            'url': 'https://errors.pydantic.dev/2/v/missing',
        }
    ]
    """
    pretty_errors = convert_errors(e)
    print(pretty_errors)  # (2)!
    """
    [
        {
            'type': 'missing',
            'loc': 'items[1].value',
            'msg': 'Field required',
            'input': {'key': 'baz'},
            'url': 'https://errors.pydantic.dev/2/v/missing',
        }
    ]
    """
```

1. By default, `e.errors()` produces a List of errors with `loc` values that take the form of tuples.
2. With our custom `loc_to_dot_sep` function, we've modified the form of the `loc` representation.

Pydantic attempts to provide useful errors. The following sections provide details on common errors developers may encounter when working with Pydantic, along with suggestions for addressing the error condition.
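Each usage error below is raised as a `PydanticUserError` whose `code` attribute matches the anchor of its section, so errors can be identified programmatically. A minimal sketch, using direct `BaseModel` instantiation (always a usage error):

```python
from pydantic import BaseModel, PydanticUserError

try:
    BaseModel()  # instantiating BaseModel directly is a usage error
except PydanticUserError as exc_info:
    code = exc_info.code

# the code identifies which documented error occurred
print(code)
#> base-model-instantiated
```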
## Class not fully defined {#class-not-fully-defined} This error is raised when a type referenced in an annotation of a pydantic-validated type (such as a subclass of `BaseModel`, or a pydantic `dataclass`) is not defined: ```python from typing import ForwardRef from pydantic import BaseModel, PydanticUserError UndefinedType = ForwardRef('UndefinedType') class Foobar(BaseModel): a: UndefinedType try: Foobar(a=1) except PydanticUserError as exc_info: assert exc_info.code == 'class-not-fully-defined' ``` Or when the type has been defined after usage: ```python from typing import Optional from pydantic import BaseModel, PydanticUserError class Foo(BaseModel): a: Optional['Bar'] = None try: # this doesn't work, see raised error foo = Foo(a={'b': {'a': None}}) except PydanticUserError as exc_info: assert exc_info.code == 'class-not-fully-defined' class Bar(BaseModel): b: 'Foo' # this works, though foo = Foo(a={'b': {'a': None}}) ``` For BaseModel subclasses, it can be fixed by defining the type and then calling `.model_rebuild()`: ```python from typing import Optional from pydantic import BaseModel class Foo(BaseModel): a: Optional['Bar'] = None class Bar(BaseModel): b: 'Foo' Foo.model_rebuild() foo = Foo(a={'b': {'a': None}}) ``` In other cases, the error message should indicate how to rebuild the class with the appropriate type defined. ## Custom JSON Schema {#custom-json-schema} The `__modify_schema__` method is no longer supported in V2. You should use the `__get_pydantic_json_schema__` method instead. The `__modify_schema__` used to receive a single argument representing the JSON schema. 
See the example below: ```python {title="Old way"} from pydantic import BaseModel, PydanticUserError try: class Model(BaseModel): @classmethod def __modify_schema__(cls, field_schema): field_schema.update(examples=['example']) except PydanticUserError as exc_info: assert exc_info.code == 'custom-json-schema' ``` The new method `__get_pydantic_json_schema__` receives two arguments: the first is a dictionary denoted as `CoreSchema`, and the second a callable `handler` that receives a `CoreSchema` as parameter, and returns a JSON schema. See the example below: ```python {title="New way"} from typing import Any, Dict from pydantic_core import CoreSchema from pydantic import BaseModel, GetJsonSchemaHandler class Model(BaseModel): @classmethod def __get_pydantic_json_schema__( cls, core_schema: CoreSchema, handler: GetJsonSchemaHandler ) -> Dict[str, Any]: json_schema = super().__get_pydantic_json_schema__(core_schema, handler) json_schema = handler.resolve_ref_schema(json_schema) json_schema.update(examples=['example']) return json_schema print(Model.model_json_schema()) """ {'examples': ['example'], 'properties': {}, 'title': 'Model', 'type': 'object'} """ ``` ## Decorator on missing field {#decorator-missing-field} This error is raised when you define a decorator with a field that is not valid. ```python from typing import Any from pydantic import BaseModel, PydanticUserError, field_validator try: class Model(BaseModel): a: str @field_validator('b') def check_b(cls, v: Any): return v except PydanticUserError as exc_info: assert exc_info.code == 'decorator-missing-field' ``` You can use `check_fields=False` if you're inheriting from the model and intended this. 
```python from typing import Any from pydantic import BaseModel, create_model, field_validator class Model(BaseModel): @field_validator('a', check_fields=False) def check_a(cls, v: Any): return v model = create_model('FooModel', a=(str, 'cake'), __base__=Model) ``` ## Discriminator no field {#discriminator-no-field} This error is raised when a model in discriminated unions doesn't define a discriminator field. ```python from typing import Literal, Union from pydantic import BaseModel, Field, PydanticUserError class Cat(BaseModel): c: str class Dog(BaseModel): pet_type: Literal['dog'] d: str try: class Model(BaseModel): pet: Union[Cat, Dog] = Field(discriminator='pet_type') number: int except PydanticUserError as exc_info: assert exc_info.code == 'discriminator-no-field' ``` ## Discriminator alias type {#discriminator-alias-type} This error is raised when you define a non-string alias on a discriminator field. ```python from typing import Literal, Union from pydantic import AliasChoices, BaseModel, Field, PydanticUserError class Cat(BaseModel): pet_type: Literal['cat'] = Field( validation_alias=AliasChoices('Pet', 'PET') ) c: str class Dog(BaseModel): pet_type: Literal['dog'] d: str try: class Model(BaseModel): pet: Union[Cat, Dog] = Field(discriminator='pet_type') number: int except PydanticUserError as exc_info: assert exc_info.code == 'discriminator-alias-type' ``` ## Discriminator needs literal {#discriminator-needs-literal} This error is raised when you define a non-`Literal` type on a discriminator field. 
```python from typing import Literal, Union from pydantic import BaseModel, Field, PydanticUserError class Cat(BaseModel): pet_type: int c: str class Dog(BaseModel): pet_type: Literal['dog'] d: str try: class Model(BaseModel): pet: Union[Cat, Dog] = Field(discriminator='pet_type') number: int except PydanticUserError as exc_info: assert exc_info.code == 'discriminator-needs-literal' ``` ## Discriminator alias {#discriminator-alias} This error is raised when you define different aliases on discriminator fields. ```python from typing import Literal, Union from pydantic import BaseModel, Field, PydanticUserError class Cat(BaseModel): pet_type: Literal['cat'] = Field(validation_alias='PET') c: str class Dog(BaseModel): pet_type: Literal['dog'] = Field(validation_alias='Pet') d: str try: class Model(BaseModel): pet: Union[Cat, Dog] = Field(discriminator='pet_type') number: int except PydanticUserError as exc_info: assert exc_info.code == 'discriminator-alias' ``` ## Invalid discriminator validator {#discriminator-validator} This error is raised when you use a before, wrap, or plain validator on a discriminator field. This is disallowed because the discriminator field is used to determine the type of the model to use for validation, so you can't use a validator that might change its value. 
```python from typing import Literal, Union from pydantic import BaseModel, Field, PydanticUserError, field_validator class Cat(BaseModel): pet_type: Literal['cat'] @field_validator('pet_type', mode='before') @classmethod def validate_pet_type(cls, v): if v == 'kitten': return 'cat' return v class Dog(BaseModel): pet_type: Literal['dog'] try: class Model(BaseModel): pet: Union[Cat, Dog] = Field(discriminator='pet_type') number: int except PydanticUserError as exc_info: assert exc_info.code == 'discriminator-validator' ``` This can be worked around by using a standard `Union`, dropping the discriminator: ```python from typing import Literal, Union from pydantic import BaseModel, field_validator class Cat(BaseModel): pet_type: Literal['cat'] @field_validator('pet_type', mode='before') @classmethod def validate_pet_type(cls, v): if v == 'kitten': return 'cat' return v class Dog(BaseModel): pet_type: Literal['dog'] class Model(BaseModel): pet: Union[Cat, Dog] assert Model(pet={'pet_type': 'kitten'}).pet.pet_type == 'cat' ``` ## Callable discriminator case with no tag {#callable-discriminator-no-tag} This error is raised when a `Union` that uses a callable `Discriminator` doesn't have `Tag` annotations for all cases. 
```python from typing import Union from typing_extensions import Annotated from pydantic import BaseModel, Discriminator, PydanticUserError, Tag def model_x_discriminator(v): if isinstance(v, str): return 'str' if isinstance(v, (dict, BaseModel)): return 'model' # tag missing for both union choices try: class DiscriminatedModel(BaseModel): x: Annotated[ Union[str, 'DiscriminatedModel'], Discriminator(model_x_discriminator), ] except PydanticUserError as exc_info: assert exc_info.code == 'callable-discriminator-no-tag' # tag missing for `'DiscriminatedModel'` union choice try: class DiscriminatedModel(BaseModel): x: Annotated[ Union[Annotated[str, Tag('str')], 'DiscriminatedModel'], Discriminator(model_x_discriminator), ] except PydanticUserError as exc_info: assert exc_info.code == 'callable-discriminator-no-tag' # tag missing for `str` union choice try: class DiscriminatedModel(BaseModel): x: Annotated[ Union[str, Annotated['DiscriminatedModel', Tag('model')]], Discriminator(model_x_discriminator), ] except PydanticUserError as exc_info: assert exc_info.code == 'callable-discriminator-no-tag' ``` ## `TypedDict` version {#typed-dict-version} This error is raised when you use [typing.TypedDict][] instead of `typing_extensions.TypedDict` on Python < 3.12. ## Model parent field overridden {#model-field-overridden} This error is raised when a field defined on a base class was overridden by a non-annotated attribute. ```python from pydantic import BaseModel, PydanticUserError class Foo(BaseModel): a: float try: class Bar(Foo): x: float = 12.3 a = 123.0 except PydanticUserError as exc_info: assert exc_info.code == 'model-field-overridden' ``` ## Model field missing annotation {#model-field-missing-annotation} This error is raised when a field doesn't have an annotation. 
```python from pydantic import BaseModel, Field, PydanticUserError try: class Model(BaseModel): a = Field('foobar') b = None except PydanticUserError as exc_info: assert exc_info.code == 'model-field-missing-annotation' ``` If the field is not meant to be a field, you may be able to resolve the error by annotating it as a `ClassVar`: ```python from typing import ClassVar from pydantic import BaseModel class Model(BaseModel): a: ClassVar[str] ``` Or updating `model_config['ignored_types']`: ```python from pydantic import BaseModel, ConfigDict class IgnoredType: pass class MyModel(BaseModel): model_config = ConfigDict(ignored_types=(IgnoredType,)) _a = IgnoredType() _b: int = IgnoredType() _c: IgnoredType _d: IgnoredType = IgnoredType() ``` ## `Config` and `model_config` both defined {#config-both} This error is raised when `class Config` and `model_config` are used together. ```python from pydantic import BaseModel, ConfigDict, PydanticUserError try: class Model(BaseModel): model_config = ConfigDict(from_attributes=True) a: str class Config: from_attributes = True except PydanticUserError as exc_info: assert exc_info.code == 'config-both' ``` ## Keyword arguments removed {#removed-kwargs} This error is raised when the keyword arguments are not available in Pydantic V2. For example, `regex` is removed from Pydantic V2: ```python from pydantic import BaseModel, Field, PydanticUserError try: class Model(BaseModel): x: str = Field(regex='test') except PydanticUserError as exc_info: assert exc_info.code == 'removed-kwargs' ``` ## Circular reference schema {#circular-reference-schema} This error is raised when a circular reference is found that would otherwise result in an infinite recursion. 
For example, this is a valid type alias: ```python {test="skip" lint="skip" upgrade="skip"} type A = list[A] | None ``` while these are not: ```python {test="skip" lint="skip" upgrade="skip"} type A = A type B = C type C = B ``` ## JSON schema invalid type {#invalid-for-json-schema} This error is raised when Pydantic fails to generate a JSON schema for some `CoreSchema`. ```python from pydantic import BaseModel, ImportString, PydanticUserError class Model(BaseModel): a: ImportString try: Model.model_json_schema() except PydanticUserError as exc_info: assert exc_info.code == 'invalid-for-json-schema' ``` ## JSON schema already used {#json-schema-already-used} This error is raised when the JSON schema generator has already been used to generate a JSON schema. You must create a new instance to generate a new JSON schema. ## BaseModel instantiated {#base-model-instantiated} This error is raised when you instantiate `BaseModel` directly. Pydantic models should inherit from `BaseModel`. ```python from pydantic import BaseModel, PydanticUserError try: BaseModel() except PydanticUserError as exc_info: assert exc_info.code == 'base-model-instantiated' ``` ## Undefined annotation {#undefined-annotation} This error is raised when handling undefined annotations during `CoreSchema` generation. ```python from pydantic import BaseModel, PydanticUndefinedAnnotation class Model(BaseModel): a: 'B' # noqa F821 try: Model.model_rebuild() except PydanticUndefinedAnnotation as exc_info: assert exc_info.code == 'undefined-annotation' ``` ## Schema for unknown type {#schema-for-unknown-type} This error is raised when Pydantic fails to generate a `CoreSchema` for some type. 
```python from pydantic import BaseModel, PydanticUserError try: class Model(BaseModel): x: 43 = 123 except PydanticUserError as exc_info: assert exc_info.code == 'schema-for-unknown-type' ``` ## Import error {#import-error} This error is raised when you try to import an object that was available in Pydantic V1, but has been removed in Pydantic V2. See the [Migration Guide](../migration.md) for more information. ## `create_model` field definitions {#create-model-field-definitions} This error is raised when you provide field definitions input in `create_model` that is not valid. ```python from pydantic import PydanticUserError, create_model try: create_model('FooModel', foo=(str, 'default value', 'more')) except PydanticUserError as exc_info: assert exc_info.code == 'create-model-field-definitions' ``` Or when you use [`typing.Annotated`][] with invalid input ```python from typing_extensions import Annotated from pydantic import PydanticUserError, create_model try: create_model('FooModel', foo=Annotated[str, 'NotFieldInfoValue']) except PydanticUserError as exc_info: assert exc_info.code == 'create-model-field-definitions' ``` ## `create_model` config base {#create-model-config-base} This error is raised when you use both `__config__` and `__base__` together in `create_model`. ```python from pydantic import BaseModel, ConfigDict, PydanticUserError, create_model try: config = ConfigDict(frozen=True) model = create_model( 'FooModel', foo=(int, ...), __config__=config, __base__=BaseModel ) except PydanticUserError as exc_info: assert exc_info.code == 'create-model-config-base' ``` ## Validator with no fields {#validator-no-fields} This error is raised when you use validator bare (with no fields). 
```python from pydantic import BaseModel, PydanticUserError, field_validator try: class Model(BaseModel): a: str @field_validator def checker(cls, v): return v except PydanticUserError as exc_info: assert exc_info.code == 'validator-no-fields' ``` Validators should be used with fields and keyword arguments. ```python from pydantic import BaseModel, field_validator class Model(BaseModel): a: str @field_validator('a') def checker(cls, v): return v ``` ## Invalid validator fields {#validator-invalid-fields} This error is raised when you use a validator with non-string fields. ```python from pydantic import BaseModel, PydanticUserError, field_validator try: class Model(BaseModel): a: str b: str @field_validator(['a', 'b']) def check_fields(cls, v): return v except PydanticUserError as exc_info: assert exc_info.code == 'validator-invalid-fields' ``` Fields should be passed as separate string arguments: ```python from pydantic import BaseModel, field_validator class Model(BaseModel): a: str b: str @field_validator('a', 'b') def check_fields(cls, v): return v ``` ## Validator on instance method {#validator-instance-method} This error is raised when you apply a validator on an instance method. ```python from pydantic import BaseModel, PydanticUserError, field_validator try: class Model(BaseModel): a: int = 1 @field_validator('a') def check_a(self, value): return value except PydanticUserError as exc_info: assert exc_info.code == 'validator-instance-method' ``` ## `json_schema_input_type` used with the wrong mode {#validator-input-type} This error is raised when you explicitly specify a value for the `json_schema_input_type` argument and `mode` isn't set to either `'before'`, `'plain'` or `'wrap'`. 
```python from pydantic import BaseModel, PydanticUserError, field_validator try: class Model(BaseModel): a: int = 1 @field_validator('a', mode='after', json_schema_input_type=int) @classmethod def check_a(self, value): return value except PydanticUserError as exc_info: assert exc_info.code == 'validator-input-type' ``` Documenting the JSON Schema input type is only possible for validators where the given value can be anything. That is why it isn't available for `after` validators, where the value is first validated against the type annotation. ## Root validator, `pre`, `skip_on_failure` {#root-validator-pre-skip} If you use `@root_validator` with `pre=False` (the default) you MUST specify `skip_on_failure=True`. The `skip_on_failure=False` option is no longer available. If you were not trying to set `skip_on_failure=False`, you can safely set `skip_on_failure=True`. If you do, this root validator will no longer be called if validation fails for any of the fields. Please see the [Migration Guide](../migration.md) for more details. ## `model_serializer` instance methods {#model-serializer-instance-method} `@model_serializer` must be applied to instance methods. 
This error is raised when you apply `model_serializer` on an instance method without `self`: ```python from pydantic import BaseModel, PydanticUserError, model_serializer try: class MyModel(BaseModel): a: int @model_serializer def _serialize(slf, x, y, z): return slf except PydanticUserError as exc_info: assert exc_info.code == 'model-serializer-instance-method' ``` Or on a class method: ```python from pydantic import BaseModel, PydanticUserError, model_serializer try: class MyModel(BaseModel): a: int @model_serializer @classmethod def _serialize(self, x, y, z): return self except PydanticUserError as exc_info: assert exc_info.code == 'model-serializer-instance-method' ``` ## `validator`, `field`, `config`, and `info` {#validator-field-config-info} The `field` and `config` parameters are not available in Pydantic V2. Please use the `info` parameter instead. You can access the configuration via `info.config`, but it is a dictionary instead of an object like it was in Pydantic V1. The `field` argument is no longer available. ## Pydantic V1 validator signature {#validator-v1-signature} This error is raised when you use an unsupported signature for Pydantic V1-style validator. ```python import warnings from pydantic import BaseModel, PydanticUserError, validator warnings.filterwarnings('ignore', category=DeprecationWarning) try: class Model(BaseModel): a: int @validator('a') def check_a(cls, value, foo): return value except PydanticUserError as exc_info: assert exc_info.code == 'validator-v1-signature' ``` ## Unrecognized `field_validator` signature {#validator-signature} This error is raised when a `field_validator` or `model_validator` function has the wrong signature. 
```python
from pydantic import BaseModel, PydanticUserError, field_validator

try:

    class Model(BaseModel):
        a: str

        @field_validator('a')
        @classmethod
        def check_a(cls):
            return 'a'

except PydanticUserError as exc_info:
    assert exc_info.code == 'validator-signature'
```

## Unrecognized `field_serializer` signature {#field-serializer-signature}

This error is raised when the `field_serializer` function has the wrong signature.

```python
from pydantic import BaseModel, PydanticUserError, field_serializer

try:

    class Model(BaseModel):
        x: int

        @field_serializer('x')
        def no_args():
            return 'x'

except PydanticUserError as exc_info:
    assert exc_info.code == 'field-serializer-signature'
```

Valid field serializer signatures are:

```python {test="skip" lint="skip" upgrade="skip"}
from typing import Any

from pydantic import FieldSerializationInfo, SerializerFunctionWrapHandler, field_serializer

# an instance method with the default mode or `mode='plain'`
@field_serializer('x')  # or @field_serializer('x', mode='plain')
def ser_x(self, value: Any, info: FieldSerializationInfo): ...

# a static method or function with the default mode or `mode='plain'`
@field_serializer('x')  # or @field_serializer('x', mode='plain')
@staticmethod
def ser_x(value: Any, info: FieldSerializationInfo): ...

# equivalent to
def ser_x(value: Any, info: FieldSerializationInfo): ...

field_serializer('x')(ser_x)

# an instance method with `mode='wrap'`
@field_serializer('x', mode='wrap')
def ser_x(self, value: Any, nxt: SerializerFunctionWrapHandler, info: FieldSerializationInfo): ...

# a static method or function with `mode='wrap'`
@field_serializer('x', mode='wrap')
@staticmethod
def ser_x(value: Any, nxt: SerializerFunctionWrapHandler, info: FieldSerializationInfo): ...

# equivalent to
def ser_x(value: Any, nxt: SerializerFunctionWrapHandler, info: FieldSerializationInfo): ...

field_serializer('x')(ser_x)

# For all of these, you can also choose to omit the `info` argument, for example:
@field_serializer('x')
def ser_x(self, value: Any): ...
@field_serializer('x', mode='wrap')
def ser_x(self, value: Any, handler: SerializerFunctionWrapHandler): ...
```

## Unrecognized `model_serializer` signature {#model-serializer-signature}

This error is raised when the `model_serializer` function has the wrong signature.

```python
from pydantic import BaseModel, PydanticUserError, model_serializer

try:

    class MyModel(BaseModel):
        a: int

        @model_serializer
        def _serialize(self, x, y, z):
            return self

except PydanticUserError as exc_info:
    assert exc_info.code == 'model-serializer-signature'
```

Valid model serializer signatures are:

```python {test="skip" lint="skip" upgrade="skip"}
from pydantic import SerializerFunctionWrapHandler, SerializationInfo, model_serializer

# an instance method with the default mode or `mode='plain'`
@model_serializer  # or model_serializer(mode='plain')
def mod_ser(self, info: SerializationInfo): ...

# an instance method with `mode='wrap'`
@model_serializer(mode='wrap')
def mod_ser(self, handler: SerializerFunctionWrapHandler, info: SerializationInfo): ...

# For all of these, you can also choose to omit the `info` argument, for example:
@model_serializer(mode='plain')
def mod_ser(self): ...

@model_serializer(mode='wrap')
def mod_ser(self, handler: SerializerFunctionWrapHandler): ...
```

## Multiple field serializers {#multiple-field-serializers}

This error is raised when multiple `field_serializer` functions are defined for a field.

```python
from pydantic import BaseModel, PydanticUserError, field_serializer

try:

    class MyModel(BaseModel):
        x: int
        y: int

        @field_serializer('x', 'y')
        def serializer1(v):
            return f'{v:,}'

        @field_serializer('x')
        def serializer2(v):
            return v

except PydanticUserError as exc_info:
    assert exc_info.code == 'multiple-field-serializers'
```

## Invalid annotated type {#invalid-annotated-type}

This error is raised when an annotation cannot annotate a type.
```python from typing_extensions import Annotated from pydantic import BaseModel, FutureDate, PydanticUserError try: class Model(BaseModel): foo: Annotated[str, FutureDate()] except PydanticUserError as exc_info: assert exc_info.code == 'invalid-annotated-type' ``` ## `config` is unused with `TypeAdapter` {#type-adapter-config-unused} You will get this error if you try to pass `config` to `TypeAdapter` when the type is a type that has its own config that cannot be overridden (currently this is only `BaseModel`, `TypedDict` and `dataclass`): ```python from typing_extensions import TypedDict from pydantic import ConfigDict, PydanticUserError, TypeAdapter class MyTypedDict(TypedDict): x: int try: TypeAdapter(MyTypedDict, config=ConfigDict(strict=True)) except PydanticUserError as exc_info: assert exc_info.code == 'type-adapter-config-unused' ``` Instead you'll need to subclass the type and override or set the config on it: ```python from typing_extensions import TypedDict from pydantic import ConfigDict, TypeAdapter class MyTypedDict(TypedDict): x: int # or `model_config = ...` for BaseModel __pydantic_config__ = ConfigDict(strict=True) TypeAdapter(MyTypedDict) # ok ``` ## Cannot specify `model_config['extra']` with `RootModel` {#root-model-extra} Because `RootModel` is not capable of storing or even accepting extra fields during initialization, we raise an error if you try to specify a value for the config setting `'extra'` when creating a subclass of `RootModel`: ```python from pydantic import PydanticUserError, RootModel try: class MyRootModel(RootModel): model_config = {'extra': 'allow'} root: int except PydanticUserError as exc_info: assert exc_info.code == 'root-model-extra' ``` ## Cannot evaluate type annotation {#unevaluable-type-annotation} Because type annotations are evaluated *after* assignments, you might get unexpected results when using a type annotation name that clashes with one of your fields. 
We raise an error in the following case: ```python {test="skip"} from datetime import date from pydantic import BaseModel, Field class Model(BaseModel): date: date = Field(description='A date') ``` As a workaround, you can either use an alias or change your import: ```python {lint="skip"} import datetime # Or `from datetime import date as _date` from pydantic import BaseModel, Field class Model(BaseModel): date: datetime.date = Field(description='A date') ``` ## Incompatible `dataclass` `init` and `extra` settings {#dataclass-init-false-extra-allow} Pydantic does not allow the specification of the `extra='allow'` setting on a dataclass while any of the fields have `init=False` set. Thus, you may not do something like the following: ```python {test="skip"} from pydantic import ConfigDict, Field from pydantic.dataclasses import dataclass @dataclass(config=ConfigDict(extra='allow')) class A: a: int = Field(init=False, default=1) ``` The above snippet results in the following error during schema building for the `A` dataclass: ``` pydantic.errors.PydanticUserError: Field a has `init=False` and dataclass has config setting `extra="allow"`. This combination is not allowed. ``` ## Incompatible `init` and `init_var` settings on `dataclass` field {#clashing-init-and-init-var} The `init=False` and `init_var=True` settings are mutually exclusive. Doing so results in the `PydanticUserError` shown in the example below. ```python {test="skip"} from pydantic import Field from pydantic.dataclasses import dataclass @dataclass class Foo: bar: str = Field(init=False, init_var=True) """ pydantic.errors.PydanticUserError: Dataclass field bar has init=False and init_var=True, but these are mutually exclusive. """ ``` ## `model_config` is used as a model field {#model-config-invalid-field-name} This error is raised when `model_config` is used as the name of a field. 
```python from pydantic import BaseModel, PydanticUserError try: class Model(BaseModel): model_config: str except PydanticUserError as exc_info: assert exc_info.code == 'model-config-invalid-field-name' ``` ## [`with_config`][pydantic.config.with_config] is used on a `BaseModel` subclass {#with-config-on-model} This error is raised when the [`with_config`][pydantic.config.with_config] decorator is used on a class which is already a Pydantic model (use the `model_config` attribute instead). ```python from pydantic import BaseModel, PydanticUserError, with_config try: @with_config({'allow_inf_nan': True}) class Model(BaseModel): bar: str except PydanticUserError as exc_info: assert exc_info.code == 'with-config-on-model' ``` ## `dataclass` is used on a `BaseModel` subclass {#dataclass-on-model} This error is raised when the Pydantic `dataclass` decorator is used on a class which is already a Pydantic model. ```python from pydantic import BaseModel, PydanticUserError from pydantic.dataclasses import dataclass try: @dataclass class Model(BaseModel): bar: str except PydanticUserError as exc_info: assert exc_info.code == 'dataclass-on-model' ``` ## Unsupported type for `validate_call` {#validate-call-type} `validate_call` has some limitations on the callables it can validate. This error is raised when you try to use it with an unsupported callable. Currently the supported callables are functions (including lambdas, but not built-ins) and methods and instances of [`partial`][functools.partial]. In the case of [`partial`][functools.partial], the function being partially applied must be one of the supported callables. ### `@classmethod`, `@staticmethod`, and `@property` These decorators must be put before `validate_call`. ```python from pydantic import PydanticUserError, validate_call # error try: class A: @validate_call @classmethod def f1(cls): ... 
except PydanticUserError as exc_info: assert exc_info.code == 'validate-call-type' # correct @classmethod @validate_call def f2(cls): ... ``` ### Classes While classes are callables themselves, `validate_call` can't be applied on them, as it needs to know about which method to use (`__init__` or `__new__`) to fetch type annotations. If you want to validate the constructor of a class, you should put `validate_call` on top of the appropriate method instead. ```python from pydantic import PydanticUserError, validate_call # error try: @validate_call class A1: ... except PydanticUserError as exc_info: assert exc_info.code == 'validate-call-type' # correct class A2: @validate_call def __init__(self): ... @validate_call def __new__(cls): ... ``` ### Callable instances Although instances can be callable by implementing a `__call__` method, currently the instances of these types cannot be validated with `validate_call`. This may change in the future, but for now, you should use `validate_call` explicitly on `__call__` instead. ```python from pydantic import PydanticUserError, validate_call # error try: class A1: def __call__(self): ... validate_call(A1()) except PydanticUserError as exc_info: assert exc_info.code == 'validate-call-type' # correct class A2: @validate_call def __call__(self): ... ``` ### Invalid signature This is generally less common, but a possible reason is that you are trying to validate a method that doesn't have at least one argument (usually `self`). ```python from pydantic import PydanticUserError, validate_call try: class A: def f(): ... validate_call(A().f) except PydanticUserError as exc_info: assert exc_info.code == 'validate-call-type' ``` ## [`Unpack`][typing.Unpack] used without a [`TypedDict`][typing.TypedDict] {#unpack-typed-dict} This error is raised when [`Unpack`][typing.Unpack] is used with something other than a [`TypedDict`][typing.TypedDict] class object to type hint variadic keyword parameters. 
For reference, see the [related specification section] and [PEP 692]. ```python from typing_extensions import Unpack from pydantic import PydanticUserError, validate_call try: @validate_call def func(**kwargs: Unpack[int]): pass except PydanticUserError as exc_info: assert exc_info.code == 'unpack-typed-dict' ``` ## Overlapping unpacked [`TypedDict`][typing.TypedDict] fields and arguments {#overlapping-unpack-typed-dict} This error is raised when the typed dictionary used to type hint variadic keywords parameters has field names overlapping with other parameters (unless [positional only][positional-only_parameter]). For reference, see the [related specification section] and [PEP 692]. ```python from typing_extensions import TypedDict, Unpack from pydantic import PydanticUserError, validate_call class TD(TypedDict): a: int try: @validate_call def func(a: int, **kwargs: Unpack[TD]): pass except PydanticUserError as exc_info: assert exc_info.code == 'overlapping-unpack-typed-dict' ``` [related specification section]: https://typing.readthedocs.io/en/latest/spec/callables.html#unpack-for-keyword-arguments [PEP 692]: https://peps.python.org/pep-0692/ ## Invalid `Self` type {#invalid-self-type} Currently, [`Self`][typing.Self] can only be used to annotate a field of a class (specifically, subclasses of [`BaseModel`][pydantic.BaseModel], [`NamedTuple`][typing.NamedTuple], [`TypedDict`][typing.TypedDict], or dataclasses). Attempting to use [`Self`][typing.Self] in any other ways will raise this error. ```python from typing_extensions import Self from pydantic import PydanticUserError, validate_call try: @validate_call def func(self: Self): pass except PydanticUserError as exc_info: assert exc_info.code == 'invalid-self-type' ``` The following example of [`validate_call()`][pydantic.validate_call] will also raise this error, even though it is correct from a type-checking perspective. This may be supported in the future. 
```python
from typing_extensions import Self

from pydantic import BaseModel, PydanticUserError, validate_call

try:

    class A(BaseModel):
        @validate_call
        def func(self, arg: Self):
            pass

except PydanticUserError as exc_info:
    assert exc_info.code == 'invalid-self-type'
```

Pydantic attempts to provide useful validation errors. Below are details on common validation errors users may encounter when working with pydantic, together with some suggestions on how to fix them.

## `arguments_type`

This error is raised when an object that would be passed as arguments to a function during validation is not a `tuple`, `list`, or `dict`. Because `NamedTuple` uses function calls in its implementation, that is one way to produce this error:

```python
from typing import NamedTuple

from pydantic import BaseModel, ValidationError


class MyNamedTuple(NamedTuple):
    x: int


class MyModel(BaseModel):
    field: MyNamedTuple


try:
    MyModel.model_validate({'field': 'invalid'})
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'arguments_type'
```

## `assertion_error`

This error is raised when a failing `assert` statement is encountered during validation:

```python
from pydantic import BaseModel, ValidationError, field_validator


class Model(BaseModel):
    x: int

    @field_validator('x')
    @classmethod
    def force_x_positive(cls, v):
        assert v > 0
        return v


try:
    Model(x=-1)
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'assertion_error'
```

## `bool_parsing`

This error is raised when the input value is a string that is not valid for coercion to a boolean:

```python
from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: bool


Model(x='true')  # OK

try:
    Model(x='test')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'bool_parsing'
```

## `bool_type`

This error is raised when the input value's type is not
valid for a `bool` field:

```python
from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: bool


try:
    Model(x=None)
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'bool_type'
```

This error is also raised for strict fields when the input value is not an instance of `bool`.

## `bytes_invalid_encoding`

This error is raised when a `bytes` value is invalid under the configured encoding. In the following example, `b'a'` is invalid hex (odd number of digits).

```python
from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: bytes
    model_config = {'val_json_bytes': 'hex'}


try:
    Model(x=b'a')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'bytes_invalid_encoding'
```

## `bytes_too_long`

This error is raised when the length of a `bytes` value is greater than the field's `max_length` constraint:

```python
from pydantic import BaseModel, Field, ValidationError


class Model(BaseModel):
    x: bytes = Field(max_length=3)


try:
    Model(x=b'test')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'bytes_too_long'
```

## `bytes_too_short`

This error is raised when the length of a `bytes` value is less than the field's `min_length` constraint:

```python
from pydantic import BaseModel, Field, ValidationError


class Model(BaseModel):
    x: bytes = Field(min_length=3)


try:
    Model(x=b't')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'bytes_too_short'
```

## `bytes_type`

This error is raised when the input value's type is not valid for a `bytes` field:

```python
from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: bytes


try:
    Model(x=123)
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'bytes_type'
```

This error is also raised for strict fields when the input value is not an instance of `bytes`.
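As a minimal illustration of that strict behavior (using `Field(strict=True)`, in the same style as the examples above), a `str` input that lax mode would coerce to `bytes` is rejected:

```python
from pydantic import BaseModel, Field, ValidationError


class Model(BaseModel):
    x: bytes = Field(strict=True)


Model(x=b'ok')  # OK: already a bytes instance

try:
    Model(x='test')  # str is not coerced to bytes in strict mode
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'bytes_type'
```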
## `callable_type` This error is raised when the input value is not valid as a `Callable`: ```python from typing import Any, Callable from pydantic import BaseModel, ImportString, ValidationError class Model(BaseModel): x: ImportString[Callable[[Any], Any]] Model(x='math:cos') # OK try: Model(x='os.path') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'callable_type' ``` ## `complex_str_parsing` This error is raised when the input value is a string but cannot be parsed as a complex number because it does not follow the [rule](https://docs.python.org/3/library/functions.html#complex) in Python: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): num: complex try: # Complex numbers in json are expected to be valid complex strings. # This value `abc` is not a valid complex string. Model.model_validate_json('{"num": "abc"}') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'complex_str_parsing' ``` ## `complex_type` This error is raised when the input value cannot be interpreted as a complex number: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): num: complex try: Model(num=False) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'complex_type' ``` ## `dataclass_exact_type` This error is raised when validating a dataclass with `strict=True` and the input is not an instance of the dataclass: ```python import pydantic.dataclasses from pydantic import TypeAdapter, ValidationError @pydantic.dataclasses.dataclass class MyDataclass: x: str adapter = TypeAdapter(MyDataclass) print(adapter.validate_python(MyDataclass(x='test'), strict=True)) #> MyDataclass(x='test') print(adapter.validate_python({'x': 'test'})) #> MyDataclass(x='test') try: adapter.validate_python({'x': 'test'}, strict=True) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'dataclass_exact_type' ``` ## `dataclass_type` This error is raised when the input value 
is not valid for a `dataclass` field:

```python
from pydantic import ValidationError, dataclasses


@dataclasses.dataclass
class Inner:
    x: int


@dataclasses.dataclass
class Outer:
    y: Inner


Outer(y=Inner(x=1))  # OK

try:
    Outer(y=1)
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'dataclass_type'
```

## `date_from_datetime_inexact`

This error is raised when the input `datetime` value provided for a `date` field has a nonzero time component. For a timestamp to parse into a field of type `date`, the time components must all be zero:

```python
from datetime import date, datetime

from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: date


Model(x='2023-01-01')  # OK
Model(x=datetime(2023, 1, 1))  # OK

try:
    Model(x=datetime(2023, 1, 1, 12))
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'date_from_datetime_inexact'
```

## `date_from_datetime_parsing`

This error is raised when the input value is a string that cannot be parsed for a `date` field:

```python
from datetime import date

from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: date


try:
    Model(x='XX1494012000')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'date_from_datetime_parsing'
```

## `date_future`

This error is raised when the input value provided for a `FutureDate` field is not in the future:

```python
from datetime import date

from pydantic import BaseModel, FutureDate, ValidationError


class Model(BaseModel):
    x: FutureDate


try:
    Model(x=date(2000, 1, 1))
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'date_future'
```

## `date_parsing`

This error is raised when validating JSON where the input value is a string that cannot be parsed for a `date` field:

```python
import json
from datetime import date

from pydantic import BaseModel, Field, ValidationError


class Model(BaseModel):
    x: date = Field(strict=True)


try:
    Model.model_validate_json(json.dumps({'x': '1'}))
except ValidationError
as exc: print(repr(exc.errors()[0]['type'])) #> 'date_parsing' ``` ## `date_past` This error is raised when the value provided for a `PastDate` field is not in the past: ```python from datetime import date, timedelta from pydantic import BaseModel, PastDate, ValidationError class Model(BaseModel): x: PastDate try: Model(x=date.today() + timedelta(1)) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'date_past' ``` ## `date_type` This error is raised when the input value's type is not valid for a `date` field: ```python from datetime import date from pydantic import BaseModel, ValidationError class Model(BaseModel): x: date try: Model(x=None) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'date_type' ``` This error is also raised for strict fields when the input value is not an instance of `date`. ## `datetime_from_date_parsing` !!! note Support for this error, along with support for parsing datetimes from `yyyy-MM-DD` dates will be added in `v2.6.0` This error is raised when the input value is a string that cannot be parsed for a `datetime` field: ```python from datetime import datetime from pydantic import BaseModel, ValidationError class Model(BaseModel): x: datetime try: # there is no 13th month Model(x='2023-13-01') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'datetime_from_date_parsing' ``` ## `datetime_future` This error is raised when the value provided for a `FutureDatetime` field is not in the future: ```python from datetime import datetime from pydantic import BaseModel, FutureDatetime, ValidationError class Model(BaseModel): x: FutureDatetime try: Model(x=datetime(2000, 1, 1)) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'datetime_future' ``` ## `datetime_object_invalid` This error is raised when something about the `datetime` object is not valid: ```python from datetime import datetime, tzinfo from pydantic import AwareDatetime, BaseModel, ValidationError 
class CustomTz(tzinfo): # utcoffset is not implemented! def tzname(self, _dt): return 'CustomTZ' class Model(BaseModel): x: AwareDatetime try: Model(x=datetime(2023, 1, 1, tzinfo=CustomTz())) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'datetime_object_invalid' ``` ## `datetime_parsing` This error is raised when the value is a string that cannot be parsed for a `datetime` field: ```python import json from datetime import datetime from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: datetime = Field(strict=True) try: Model.model_validate_json(json.dumps({'x': 'not a datetime'})) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'datetime_parsing' ``` ## `datetime_past` This error is raised when the value provided for a `PastDatetime` field is not in the past: ```python from datetime import datetime, timedelta from pydantic import BaseModel, PastDatetime, ValidationError class Model(BaseModel): x: PastDatetime try: Model(x=datetime.now() + timedelta(100)) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'datetime_past' ``` ## `datetime_type` This error is raised when the input value's type is not valid for a `datetime` field: ```python from datetime import datetime from pydantic import BaseModel, ValidationError class Model(BaseModel): x: datetime try: Model(x=None) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'datetime_type' ``` This error is also raised for strict fields when the input value is not an instance of `datetime`. 
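To illustrate the strict case noted above (a minimal sketch using `Field(strict=True)`, in the style of the other examples), an otherwise-parseable ISO 8601 string is rejected during Python validation because it is not a `datetime` instance:

```python
from datetime import datetime

from pydantic import BaseModel, Field, ValidationError


class Model(BaseModel):
    x: datetime = Field(strict=True)


Model(x=datetime(2023, 1, 1))  # OK: already a datetime instance

try:
    Model(x='2023-01-01T12:00:00')  # str is not coerced in strict mode
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'datetime_type'
```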
## `decimal_max_digits` This error is raised when the value provided for a `Decimal` has too many digits: ```python from decimal import Decimal from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: Decimal = Field(max_digits=3) try: Model(x='42.1234') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'decimal_max_digits' ``` ## `decimal_max_places` This error is raised when the value provided for a `Decimal` has too many digits after the decimal point: ```python from decimal import Decimal from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: Decimal = Field(decimal_places=3) try: Model(x='42.1234') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'decimal_max_places' ``` ## `decimal_parsing` This error is raised when the value provided for a `Decimal` could not be parsed as a decimal number: ```python from decimal import Decimal from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: Decimal = Field(decimal_places=3) try: Model(x='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'decimal_parsing' ``` ## `decimal_type` This error is raised when the value provided for a `Decimal` is of the wrong type: ```python from decimal import Decimal from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: Decimal = Field(decimal_places=3) try: Model(x=[1, 2, 3]) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'decimal_type' ``` This error is also raised for strict fields when the input value is not an instance of `Decimal`. 
## `decimal_whole_digits`

This error is raised when the value provided for a `Decimal` has more digits before the decimal point than `max_digits` - `decimal_places` (as long as both are specified):

```python
from decimal import Decimal

from pydantic import BaseModel, Field, ValidationError


class Model(BaseModel):
    x: Decimal = Field(max_digits=6, decimal_places=3)


try:
    Model(x='12345.6')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'decimal_whole_digits'
```

## `dict_type`

This error is raised when the input value's type is not `dict` for a `dict` field:

```python
from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: dict


try:
    Model(x=['1', '2'])
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'dict_type'
```

## `enum`

This error is raised when the input value does not exist in an `enum` field's members:

```python
from enum import Enum

from pydantic import BaseModel, ValidationError


class MyEnum(str, Enum):
    option = 'option'


class Model(BaseModel):
    x: MyEnum


try:
    Model(x='other_option')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'enum'
```

## `extra_forbidden`

This error is raised when the input value contains extra fields, but `model_config['extra'] == 'forbid'`:

```python
from pydantic import BaseModel, ConfigDict, ValidationError


class Model(BaseModel):
    x: str
    model_config = ConfigDict(extra='forbid')


try:
    Model(x='test', y='test')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'extra_forbidden'
```

You can read more about the `extra` configuration in the [Extra Attributes][pydantic.config.ConfigDict.extra] section.
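If extra fields should be kept rather than rejected, the `extra='allow'` setting stores them on the model, where they can be inspected via the `model_extra` property (a minimal sketch):

```python
from pydantic import BaseModel, ConfigDict


class Model(BaseModel):
    x: str
    model_config = ConfigDict(extra='allow')


m = Model(x='test', y='extra')
print(m.model_extra)
#> {'y': 'extra'}
```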
## `finite_number`

This error is raised when the value is infinite, or too large to be represented as a 64-bit floating point number during validation:

```python
from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: int


try:
    Model(x=2.2250738585072011e308)
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'finite_number'
```

## `float_parsing`

This error is raised when the value is a string that can't be parsed as a `float`:

```python
from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: float


try:
    Model(x='test')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'float_parsing'
```

## `float_type`

This error is raised when the input value's type is not valid for a `float` field:

```python
from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: float


try:
    Model(x=None)
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'float_type'
```

## `frozen_field`

This error is raised when you attempt to assign a value to a field with `frozen=True`, or to delete such a field:

```python
from pydantic import BaseModel, Field, ValidationError


class Model(BaseModel):
    x: str = Field('test', frozen=True)


model = Model()

try:
    model.x = 'test1'
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'frozen_field'

try:
    del model.x
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'frozen_field'
```

## `frozen_instance`

This error is raised when `model_config['frozen'] == True` and you attempt to delete or assign a new value to any of the fields:

```python
from pydantic import BaseModel, ConfigDict, ValidationError


class Model(BaseModel):
    x: int
    model_config = ConfigDict(frozen=True)


m = Model(x=1)

try:
    m.x = 2
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'frozen_instance'

try:
    del m.x
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'frozen_instance'
```

## `frozen_set_type`

This
error is raised when the input value's type is not valid for a `frozenset` field: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: frozenset try: model = Model(x='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'frozen_set_type' ``` ## `get_attribute_error` This error is raised when `model_config['from_attributes'] == True` and an error is raised while reading the attributes: ```python from pydantic import BaseModel, ConfigDict, ValidationError class Foobar: def __init__(self): self.x = 1 @property def y(self): raise RuntimeError('intentional error') class Model(BaseModel): x: int y: str model_config = ConfigDict(from_attributes=True) try: Model.model_validate(Foobar()) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'get_attribute_error' ``` ## `greater_than` This error is raised when the value is not greater than the field's `gt` constraint: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: int = Field(gt=10) try: Model(x=10) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'greater_than' ``` ## `greater_than_equal` This error is raised when the value is not greater than or equal to the field's `ge` constraint: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: int = Field(ge=10) try: Model(x=9) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'greater_than_equal' ``` ## `int_from_float` This error is raised when you provide a `float` value for an `int` field: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: int try: Model(x=0.5) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'int_from_float' ``` ## `int_parsing` This error is raised when the value can't be parsed as `int`: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: int try: Model(x='test') except 
ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'int_parsing' ``` ## `int_parsing_size` This error is raised when attempting to parse a python or JSON value from a string outside the maximum range that Python `str` to `int` parsing permits: ```python import json from pydantic import BaseModel, ValidationError class Model(BaseModel): x: int # from Python assert Model(x='1' * 4_300).x == int('1' * 4_300) # OK too_long = '1' * 4_301 try: Model(x=too_long) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'int_parsing_size' # from JSON try: Model.model_validate_json(json.dumps({'x': too_long})) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'int_parsing_size' ``` ## `int_type` This error is raised when the input value's type is not valid for an `int` field: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: int try: Model(x=None) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'int_type' ``` ## `invalid_key` This error is raised when attempting to validate a `dict` that has a key that is not an instance of `str`: ```python from pydantic import BaseModel, ConfigDict, ValidationError class Model(BaseModel): x: int model_config = ConfigDict(extra='allow') try: Model.model_validate({'x': 1, b'y': 2}) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'invalid_key' ``` ## `is_instance_of` This error is raised when the input value is not an instance of the expected type: ```python from pydantic import BaseModel, ConfigDict, ValidationError class Nested: x: str class Model(BaseModel): y: Nested model_config = ConfigDict(arbitrary_types_allowed=True) try: Model(y='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'is_instance_of' ``` ## `is_subclass_of` This error is raised when the input value is not a subclass of the expected type: ```python from typing import Type from pydantic import BaseModel, ValidationError 
class Nested: x: str class Model(BaseModel): y: Type[Nested] try: Model(y='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'is_subclass_of' ``` ## `iterable_type` This error is raised when the input value is not valid as an `Iterable`: ```python from typing import Iterable from pydantic import BaseModel, ValidationError class Model(BaseModel): y: Iterable try: Model(y=123) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'iterable_type' ``` ## `iteration_error` This error is raised when an error occurs during iteration: ```python from typing import List from pydantic import BaseModel, ValidationError def gen(): yield 1 raise RuntimeError('error') class Model(BaseModel): x: List[int] try: Model(x=gen()) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'iteration_error' ``` ## `json_invalid` This error is raised when the input value is not a valid JSON string: ```python from pydantic import BaseModel, Json, ValidationError class Model(BaseModel): x: Json try: Model(x='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'json_invalid' ``` ## `json_type` This error is raised when the input value is of a type that cannot be parsed as JSON: ```python from pydantic import BaseModel, Json, ValidationError class Model(BaseModel): x: Json try: Model(x=None) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'json_type' ``` ## `less_than` This error is raised when the input value is not less than the field's `lt` constraint: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: int = Field(lt=10) try: Model(x=10) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'less_than' ``` ## `less_than_equal` This error is raised when the input value is not less than or equal to the field's `le` constraint: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: int = 
Field(le=10) try: Model(x=11) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'less_than_equal' ``` ## `list_type` This error is raised when the input value's type is not valid for a `list` field: ```python from typing import List from pydantic import BaseModel, ValidationError class Model(BaseModel): x: List[int] try: Model(x=1) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'list_type' ``` ## `literal_error` This error is raised when the input value is not one of the expected literal values: ```python from typing import Literal from pydantic import BaseModel, ValidationError class Model(BaseModel): x: Literal['a', 'b'] Model(x='a') # OK try: Model(x='c') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'literal_error' ``` ## `mapping_type` This error is raised when a problem occurs during validation due to a failure in a call to the methods from the `Mapping` protocol, such as `.items()`: ```python from collections.abc import Mapping from typing import Dict from pydantic import BaseModel, ValidationError class BadMapping(Mapping): def items(self): raise ValueError() def __iter__(self): raise ValueError() def __getitem__(self, key): raise ValueError() def __len__(self): return 1 class Model(BaseModel): x: Dict[str, str] try: Model(x=BadMapping()) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'mapping_type' ``` ## `missing` This error is raised when there are required fields missing from the input value: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: str try: Model() except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'missing' ``` ## `missing_argument` This error is raised when a required positional-or-keyword argument is not passed to a function decorated with `validate_call`: ```python from pydantic import ValidationError, validate_call @validate_call def foo(a: int): return a try: foo() except ValidationError as 
exc: print(repr(exc.errors()[0]['type'])) #> 'missing_argument' ``` ## `missing_keyword_only_argument` This error is raised when a required keyword-only argument is not passed to a function decorated with `validate_call`: ```python from pydantic import ValidationError, validate_call @validate_call def foo(*, a: int): return a try: foo() except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'missing_keyword_only_argument' ``` ## `missing_positional_only_argument` This error is raised when a required positional-only argument is not passed to a function decorated with `validate_call`: ```python from pydantic import ValidationError, validate_call @validate_call def foo(a: int, /): return a try: foo() except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'missing_positional_only_argument' ``` ## `model_attributes_type` This error is raised when the input value is not a valid dictionary, model instance, or instance that fields can be extracted from: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): a: int b: int # simply validating a dict print(Model.model_validate({'a': 1, 'b': 2})) #> a=1 b=2 class CustomObj: def __init__(self, a, b): self.a = a self.b = b # using `from_attributes` to extract fields from an object print(Model.model_validate(CustomObj(3, 4), from_attributes=True)) #> a=3 b=4 try: Model.model_validate('not an object', from_attributes=True) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'model_attributes_type' ``` ## `model_type` This error is raised when the input to a model is not an instance of the model or dict: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): a: int b: int # simply validating a dict m = Model.model_validate({'a': 1, 'b': 2}) print(m) #> a=1 b=2 # validating an existing model instance print(Model.model_validate(m)) #> a=1 b=2 try: Model.model_validate('not an object') except ValidationError as exc: 
print(repr(exc.errors()[0]['type'])) #> 'model_type' ``` ## `multiple_argument_values` This error is raised when you provide multiple values for a single argument while calling a function decorated with `validate_call`: ```python from pydantic import ValidationError, validate_call @validate_call def foo(a: int): return a try: foo(1, a=2) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'multiple_argument_values' ``` ## `multiple_of` This error is raised when the input is not a multiple of a field's `multiple_of` constraint: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: int = Field(multiple_of=5) try: Model(x=1) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'multiple_of' ``` ## `needs_python_object` This type of error is raised when validation is attempted from a format that cannot be converted to a Python object. For example, we cannot check `isinstance` or `issubclass` from JSON: ```python import json from typing import Type from pydantic import BaseModel, ValidationError class Model(BaseModel): bm: Type[BaseModel] try: Model.model_validate_json(json.dumps({'bm': 'not a basemodel class'})) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'needs_python_object' ``` ## `no_such_attribute` This error is raised when `validate_assignment=True` in the config, and you attempt to assign a value to an attribute that is not an existing field: ```python from pydantic import ConfigDict, ValidationError, dataclasses @dataclasses.dataclass(config=ConfigDict(validate_assignment=True)) class MyDataclass: x: int m = MyDataclass(x=1) try: m.y = 10 except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'no_such_attribute' ``` ## `none_required` This error is raised when the input value is not `None` for a field that requires `None`: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: None try: Model(x=1) except 
ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'none_required' ``` !!! note You may encounter this error when there is a naming collision in your model between a field name and its type. More specifically, this error is likely to be thrown when the default value of that field is `None`. For example, the following would yield the `none_required` validation error since the field `int` is set to a default value of `None` and has the exact same name as its type, which causes problems with validation. ```python {test="skip"} from typing import Optional from pydantic import BaseModel class M1(BaseModel): int: Optional[int] = None m = M1(int=123) # errors ``` ## `recursion_loop` This error is raised when a cyclic reference is detected: ```python from typing import List from pydantic import BaseModel, ValidationError class Model(BaseModel): x: List['Model'] d = {'x': []} d['x'].append(d) try: Model(**d) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'recursion_loop' ``` ## `set_type` This error is raised when the value type is not valid for a `set` field: ```python from typing import Set from pydantic import BaseModel, ValidationError class Model(BaseModel): x: Set[int] try: Model(x='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'set_type' ``` ## `string_pattern_mismatch` This error is raised when the input value doesn't match the field's `pattern` constraint: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: str = Field(pattern='test') try: Model(x='1') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'string_pattern_mismatch' ``` ## `string_sub_type` This error is raised when the value is an instance of a strict subtype of `str` when the field is strict: ```python from enum import Enum from pydantic import BaseModel, Field, ValidationError class MyEnum(str, Enum): foo = 'foo' class Model(BaseModel): x: str = Field(strict=True) try: 
Model(x=MyEnum.foo) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'string_sub_type' ``` ## `string_too_long` This error is raised when the input value is a string whose length is greater than the field's `max_length` constraint: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: str = Field(max_length=3) try: Model(x='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'string_too_long' ``` ## `string_too_short` This error is raised when the input value is a string whose length is less than the field's `min_length` constraint: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: str = Field(min_length=3) try: Model(x='t') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'string_too_short' ``` ## `string_type` This error is raised when the input value's type is not valid for a `str` field: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: str try: Model(x=1) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'string_type' ``` This error is also raised for strict fields when the input value is not an instance of `str`. 
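
For example (a short sketch added for illustration, not part of the error reference above), a strict `str` field rejects `bytes` input that lax mode would decode:

```python
from pydantic import BaseModel, Field, ValidationError


class Model(BaseModel):
    x: str = Field(strict=True)


try:
    # bytes are decoded to str in lax mode, but rejected in strict mode
    Model(x=b'test')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'string_type'
```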
## `string_unicode` This error is raised when the value cannot be parsed as a Unicode string: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: str try: Model(x=b'\x81') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'string_unicode' ``` ## `time_delta_parsing` This error is raised when the input value is a string that cannot be parsed for a `timedelta` field: ```python from datetime import timedelta from pydantic import BaseModel, ValidationError class Model(BaseModel): x: timedelta try: Model(x='t') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'time_delta_parsing' ``` ## `time_delta_type` This error is raised when the input value's type is not valid for a `timedelta` field: ```python from datetime import timedelta from pydantic import BaseModel, ValidationError class Model(BaseModel): x: timedelta try: Model(x=None) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'time_delta_type' ``` This error is also raised for strict fields when the input value is not an instance of `timedelta`. ## `time_parsing` This error is raised when the input value is a string that cannot be parsed for a `time` field: ```python from datetime import time from pydantic import BaseModel, ValidationError class Model(BaseModel): x: time try: Model(x='25:20:30.400') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'time_parsing' ``` ## `time_type` This error is raised when the value type is not valid for a `time` field: ```python from datetime import time from pydantic import BaseModel, ValidationError class Model(BaseModel): x: time try: Model(x=None) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'time_type' ``` This error is also raised for strict fields when the input value is not an instance of `time`. 
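
As a sketch of that strict case (an illustrative example, not from the reference above): with `strict=True`, a string that lax mode would parse into a `time` is rejected, since only `time` instances are accepted:

```python
from datetime import time

from pydantic import BaseModel, Field, ValidationError


class Model(BaseModel):
    x: time = Field(strict=True)


try:
    # parsed successfully in lax mode, but not an instance of `time`
    Model(x='12:30:00')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'time_type'
```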
## `timezone_aware` This error is raised when the `datetime` value provided for a timezone-aware `datetime` field doesn't have timezone information: ```python from datetime import datetime from pydantic import AwareDatetime, BaseModel, ValidationError class Model(BaseModel): x: AwareDatetime try: Model(x=datetime.now()) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'timezone_aware' ``` ## `timezone_naive` This error is raised when the `datetime` value provided for a timezone-naive `datetime` field has timezone info: ```python from datetime import datetime, timezone from pydantic import BaseModel, NaiveDatetime, ValidationError class Model(BaseModel): x: NaiveDatetime try: Model(x=datetime.now(tz=timezone.utc)) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'timezone_naive' ``` ## `too_long` This error is raised when the input value's length is greater than the field's `max_length` constraint: ```python from typing import List from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: List[int] = Field(max_length=3) try: Model(x=[1, 2, 3, 4]) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'too_long' ``` ## `too_short` This error is raised when the value length is less than the field's `min_length` constraint: ```python from typing import List from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: List[int] = Field(min_length=3) try: Model(x=[1, 2]) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'too_short' ``` ## `tuple_type` This error is raised when the input value's type is not valid for a `tuple` field: ```python from typing import Tuple from pydantic import BaseModel, ValidationError class Model(BaseModel): x: Tuple[int] try: Model(x=None) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'tuple_type' ``` This error is also raised for strict fields when the input value is not an instance of 
`tuple`. ## `unexpected_keyword_argument` This error is raised when you provide a value by keyword for a positional-only argument while calling a function decorated with `validate_call`: ```python from pydantic import ValidationError, validate_call @validate_call def foo(a: int, /): return a try: foo(a=2) except ValidationError as exc: print(repr(exc.errors()[1]['type'])) #> 'unexpected_keyword_argument' ``` It is also raised when using pydantic.dataclasses and `extra=forbid`: ```python from pydantic import TypeAdapter, ValidationError from pydantic.dataclasses import dataclass @dataclass(config={'extra': 'forbid'}) class Foo: bar: int try: TypeAdapter(Foo).validate_python({'bar': 1, 'foobar': 2}) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'unexpected_keyword_argument' ``` ## `unexpected_positional_argument` This error is raised when you provide a positional value for a keyword-only argument while calling a function decorated with `validate_call`: ```python from pydantic import ValidationError, validate_call @validate_call def foo(*, a: int): return a try: foo(2) except ValidationError as exc: print(repr(exc.errors()[1]['type'])) #> 'unexpected_positional_argument' ``` ## `union_tag_invalid` This error is raised when the input's discriminator is not one of the expected values: ```python from typing import Literal, Union from pydantic import BaseModel, Field, ValidationError class BlackCat(BaseModel): pet_type: Literal['blackcat'] class WhiteCat(BaseModel): pet_type: Literal['whitecat'] class Model(BaseModel): cat: Union[BlackCat, WhiteCat] = Field(discriminator='pet_type') try: Model(cat={'pet_type': 'dog'}) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'union_tag_invalid' ``` ## `union_tag_not_found` This error is raised when it is not possible to extract a discriminator value from the input: ```python from typing import Literal, Union from pydantic import BaseModel, Field, ValidationError class 
BlackCat(BaseModel): pet_type: Literal['blackcat'] class WhiteCat(BaseModel): pet_type: Literal['whitecat'] class Model(BaseModel): cat: Union[BlackCat, WhiteCat] = Field(discriminator='pet_type') try: Model(cat={'name': 'blackcat'}) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'union_tag_not_found' ``` ## `url_parsing` This error is raised when the input value cannot be parsed as a URL: ```python from pydantic import AnyUrl, BaseModel, ValidationError class Model(BaseModel): x: AnyUrl try: Model(x='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'url_parsing' ``` ## `url_scheme` This error is raised when the URL scheme is not valid for the URL type of the field: ```python from pydantic import BaseModel, HttpUrl, ValidationError class Model(BaseModel): x: HttpUrl try: Model(x='ftp://example.com') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'url_scheme' ``` ## `url_syntax_violation` This error is raised when the URL syntax is not valid: ```python from pydantic import BaseModel, Field, HttpUrl, ValidationError class Model(BaseModel): x: HttpUrl = Field(strict=True) try: Model(x='http:////example.com') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'url_syntax_violation' ``` ## `url_too_long` This error is raised when the URL length is greater than 2083: ```python from pydantic import BaseModel, HttpUrl, ValidationError class Model(BaseModel): x: HttpUrl try: Model(x='x' * 2084) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'url_too_long' ``` ## `url_type` This error is raised when the input value's type is not valid for a URL field: ```python from pydantic import BaseModel, HttpUrl, ValidationError class Model(BaseModel): x: HttpUrl try: Model(x=None) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'url_type' ``` ## `uuid_parsing` This error is raised when the input value cannot be parsed as a UUID: 
```python from uuid import UUID from pydantic import BaseModel, ValidationError class Model(BaseModel): u: UUID try: Model(u='12345678-124-1234-1234-567812345678') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'uuid_parsing' ``` ## `uuid_type` This error is raised when the input value's type is not a valid instance for a UUID field (`str`, `bytes` or `UUID`): ```python from uuid import UUID from pydantic import BaseModel, ValidationError class Model(BaseModel): u: UUID try: Model(u=1234567812412341234567812345678) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'uuid_type' ``` ## `uuid_version` This error is raised when the input value's UUID version does not match the expected UUID version: ```python from pydantic import UUID5, BaseModel, ValidationError class Model(BaseModel): u: UUID5 try: Model(u='a6cc5730-2261-11ee-9c43-2eb5a363657c') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'uuid_version' ``` ## `value_error` This error is raised when a `ValueError` is raised during validation: ```python from pydantic import BaseModel, ValidationError, field_validator class Model(BaseModel): x: str @field_validator('x') @classmethod def repeat_b(cls, v): raise ValueError() try: Model(x='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'value_error' ``` This page provides example snippets for creating more complex, custom validators in Pydantic. Many of these examples are adapted from Pydantic issues and discussions, and are intended to showcase the flexibility and power of Pydantic's validation system. 
## Custom `datetime` Validator via [`Annotated`][typing.Annotated] Metadata In this example, we'll construct a custom validator, attached to an [`Annotated`][typing.Annotated] type, that ensures a [`datetime`][datetime.datetime] object adheres to a given timezone constraint. The custom validator supports string specification of the timezone, and will raise an error if the [`datetime`][datetime.datetime] object does not have the correct timezone. We use `__get_pydantic_core_schema__` in the validator to customize the schema of the annotated type (in this case, [`datetime`][datetime.datetime]), which allows us to add custom validation logic. Notably, we use a `wrap` validator function so that we can perform operations both before and after the default `pydantic` validation of a [`datetime`][datetime.datetime]. ```python import datetime as dt from dataclasses import dataclass from pprint import pprint from typing import Any, Callable, Optional import pytz from pydantic_core import CoreSchema, core_schema from typing_extensions import Annotated from pydantic import ( GetCoreSchemaHandler, PydanticUserError, TypeAdapter, ValidationError, ) @dataclass(frozen=True) class MyDatetimeValidator: tz_constraint: Optional[str] = None def tz_constraint_validator( self, value: dt.datetime, handler: Callable, # (1)! ): """Validate tz_constraint and tz_info.""" # handle naive datetimes if self.tz_constraint is None: assert ( value.tzinfo is None ), 'tz_constraint is None, but provided value is tz-aware.' return handler(value) # validate tz_constraint and tz-aware tzinfo if self.tz_constraint not in pytz.all_timezones: raise PydanticUserError( f'Invalid tz_constraint: {self.tz_constraint}', code='unevaluable-type-annotation', ) result = handler(value) # (2)! 
assert self.tz_constraint == str( result.tzinfo ), f'Invalid tzinfo: {str(result.tzinfo)}, expected: {self.tz_constraint}' return result def __get_pydantic_core_schema__( self, source_type: Any, handler: GetCoreSchemaHandler, ) -> CoreSchema: return core_schema.no_info_wrap_validator_function( self.tz_constraint_validator, handler(source_type), ) LA = 'America/Los_Angeles' ta = TypeAdapter(Annotated[dt.datetime, MyDatetimeValidator(LA)]) print( ta.validate_python(dt.datetime(2023, 1, 1, 0, 0, tzinfo=pytz.timezone(LA))) ) #> 2023-01-01 00:00:00-07:53 LONDON = 'Europe/London' try: ta.validate_python( dt.datetime(2023, 1, 1, 0, 0, tzinfo=pytz.timezone(LONDON)) ) except ValidationError as ve: pprint(ve.errors(), width=100) """ [{'ctx': {'error': AssertionError('Invalid tzinfo: Europe/London, expected: America/Los_Angeles')}, 'input': datetime.datetime(2023, 1, 1, 0, 0, tzinfo=), 'loc': (), 'msg': 'Assertion failed, Invalid tzinfo: Europe/London, expected: America/Los_Angeles', 'type': 'assertion_error', 'url': 'https://errors.pydantic.dev/2.8/v/assertion_error'}] """ ``` 1. The `handler` function is what we call to validate the input with standard `pydantic` validation 2. We call the `handler` function to validate the input with standard `pydantic` validation in this wrap validator We can also enforce UTC offset constraints in a similar way. 
Assuming we have a `lower_bound` and an `upper_bound`, we can create a custom validator to ensure our `datetime` has a UTC offset that is inclusive within the boundary we define: ```python import datetime as dt from dataclasses import dataclass from pprint import pprint from typing import Any, Callable import pytz from pydantic_core import CoreSchema, core_schema from typing_extensions import Annotated from pydantic import GetCoreSchemaHandler, TypeAdapter, ValidationError @dataclass(frozen=True) class MyDatetimeValidator: lower_bound: int upper_bound: int def validate_tz_bounds(self, value: dt.datetime, handler: Callable): """Validate and test bounds""" assert value.utcoffset() is not None, 'UTC offset must exist' assert self.lower_bound <= self.upper_bound, 'Invalid bounds' result = handler(value) hours_offset = value.utcoffset().total_seconds() / 3600 assert ( self.lower_bound <= hours_offset <= self.upper_bound ), 'Value out of bounds' return result def __get_pydantic_core_schema__( self, source_type: Any, handler: GetCoreSchemaHandler, ) -> CoreSchema: return core_schema.no_info_wrap_validator_function( self.validate_tz_bounds, handler(source_type), ) LA = 'America/Los_Angeles' # UTC-7 or UTC-8 ta = TypeAdapter(Annotated[dt.datetime, MyDatetimeValidator(-10, -5)]) print( ta.validate_python(dt.datetime(2023, 1, 1, 0, 0, tzinfo=pytz.timezone(LA))) ) #> 2023-01-01 00:00:00-07:53 LONDON = 'Europe/London' try: print( ta.validate_python( dt.datetime(2023, 1, 1, 0, 0, tzinfo=pytz.timezone(LONDON)) ) ) except ValidationError as e: pprint(e.errors(), width=100) """ [{'ctx': {'error': AssertionError('Value out of bounds')}, 'input': datetime.datetime(2023, 1, 1, 0, 0, tzinfo=), 'loc': (), 'msg': 'Assertion failed, Value out of bounds', 'type': 'assertion_error', 'url': 'https://errors.pydantic.dev/2.8/v/assertion_error'}] """ ``` ## Validating Nested Model Fields Here, we demonstrate two ways to validate a field of a nested model, where the validator utilizes data from 
the parent model. In this example, we construct a validator that checks that each user's password is not in a list of forbidden passwords specified by the parent model. One way to do this is to place a custom validator on the outer model: ```python from typing import List from typing_extensions import Self from pydantic import BaseModel, ValidationError, model_validator class User(BaseModel): username: str password: str class Organization(BaseModel): forbidden_passwords: List[str] users: List[User] @model_validator(mode='after') def validate_user_passwords(self) -> Self: """Check that user password is not in forbidden list. Raise a validation error if a forbidden password is encountered.""" for user in self.users: current_pw = user.password if current_pw in self.forbidden_passwords: raise ValueError( f'Password {current_pw} is forbidden. Please choose another password for user {user.username}.' ) return self data = { 'forbidden_passwords': ['123'], 'users': [ {'username': 'Spartacat', 'password': '123'}, {'username': 'Iceburgh', 'password': '87'}, ], } try: org = Organization(**data) except ValidationError as e: print(e) """ 1 validation error for Organization Value error, Password 123 is forbidden. Please choose another password for user Spartacat. [type=value_error, input_value={'forbidden_passwords': [...gh', 'password': '87'}]}, input_type=dict] """ ``` Alternatively, a custom validator can be used in the nested model class (`User`), with the forbidden passwords data from the parent model being passed in via validation context. !!! warning The ability to mutate the context within a validator adds a lot of power to nested validation, but can also lead to confusing or hard-to-debug code. Use this approach at your own risk! 
```python from typing import List from pydantic import BaseModel, ValidationError, ValidationInfo, field_validator class User(BaseModel): username: str password: str @field_validator('password', mode='after') @classmethod def validate_user_passwords( cls, password: str, info: ValidationInfo ) -> str: """Check that user password is not in forbidden list.""" forbidden_passwords = ( info.context.get('forbidden_passwords', []) if info.context else [] ) if password in forbidden_passwords: raise ValueError(f'Password {password} is forbidden.') return password class Organization(BaseModel): forbidden_passwords: List[str] users: List[User] @field_validator('forbidden_passwords', mode='after') @classmethod def add_context(cls, v: List[str], info: ValidationInfo) -> List[str]: if info.context is not None: info.context.update({'forbidden_passwords': v}) return v data = { 'forbidden_passwords': ['123'], 'users': [ {'username': 'Spartacat', 'password': '123'}, {'username': 'Iceburgh', 'password': '87'}, ], } try: org = Organization.model_validate(data, context={}) except ValidationError as e: print(e) """ 1 validation error for Organization users.0.password Value error, Password 123 is forbidden. [type=value_error, input_value='123', input_type=str] """ ``` Note that if the context property is not included in `model_validate`, then `info.context` will be `None` and the forbidden passwords list will not get added to the context in the above implementation. As such, `validate_user_passwords` would not carry out the desired password validation. More details about validation context can be found [here](../concepts/validators.md#validation-context). `pydantic` is a great tool for validating data coming from various sources. In this section, we will look at how to validate data from different types of files. !!! 
Note: If you're using any of the below file formats to parse configuration / settings, you might want to consider using the [`pydantic-settings`][pydantic_settings] library, which offers builtin support for parsing this type of data. ## JSON data `.json` files are a common way to store key / value data in a human-readable format. Here is an example of a `.json` file: ```json { "name": "John Doe", "age": 30, "email": "john@example.com" } ``` To validate this data, we can use a `pydantic` model: ```python {test="skip"} import pathlib from pydantic import BaseModel, EmailStr, PositiveInt class Person(BaseModel): name: str age: PositiveInt email: EmailStr json_string = pathlib.Path('person.json').read_text() person = Person.model_validate_json(json_string) print(repr(person)) #> Person(name='John Doe', age=30, email='john@example.com') ``` If the data in the file is not valid, `pydantic` will raise a [`ValidationError`][pydantic_core.ValidationError]. Let's say we have the following `.json` file: ```json { "age": -30, "email": "not-an-email-address" } ``` This data is flawed for three reasons: 1. It's missing the `name` field. 2. The `age` field is negative. 3. The `email` field is not a valid email address. 
When we try to validate this data, `pydantic` raises a [`ValidationError`][pydantic_core.ValidationError] with all of the above issues:

```python {test="skip"}
import pathlib

from pydantic import BaseModel, EmailStr, PositiveInt, ValidationError


class Person(BaseModel):
    name: str
    age: PositiveInt
    email: EmailStr


json_string = pathlib.Path('person.json').read_text()
try:
    person = Person.model_validate_json(json_string)
except ValidationError as err:
    print(err)
    """
    3 validation errors for Person
    name
      Field required [type=missing, input_value={'age': -30, 'email': 'not-an-email-address'}, input_type=dict]
        For further information visit https://errors.pydantic.dev/2.10/v/missing
    age
      Input should be greater than 0 [type=greater_than, input_value=-30, input_type=int]
        For further information visit https://errors.pydantic.dev/2.10/v/greater_than
    email
      value is not a valid email address: An email address must have an @-sign. [type=value_error, input_value='not-an-email-address', input_type=str]
    """
```

Often, it's the case that you have an abundance of a certain type of data within a `.json` file. For example, you might have a list of people:

```json
[
    {
        "name": "John Doe",
        "age": 30,
        "email": "john@example.com"
    },
    {
        "name": "Jane Doe",
        "age": 25,
        "email": "jane@example.com"
    }
]
```

In this case, you can validate the data against a `List[Person]` model:

```python {test="skip"}
import pathlib
from typing import List

from pydantic import BaseModel, EmailStr, PositiveInt, TypeAdapter


class Person(BaseModel):
    name: str
    age: PositiveInt
    email: EmailStr


person_list_adapter = TypeAdapter(List[Person])  # (1)!

json_string = pathlib.Path('people.json').read_text()
people = person_list_adapter.validate_json(json_string)
print(people)
#> [Person(name='John Doe', age=30, email='john@example.com'), Person(name='Jane Doe', age=25, email='jane@example.com')]
```

1. We use [`TypeAdapter`][pydantic.type_adapter.TypeAdapter] to validate a list of `Person` objects.
[`TypeAdapter`][pydantic.type_adapter.TypeAdapter] is a Pydantic construct used to validate data against a single type.

## JSON lines files

Similar to validating a list of objects from a `.json` file, you can validate a list of objects from a `.jsonl` file. `.jsonl` files are a sequence of JSON objects separated by newlines.

Consider the following `.jsonl` file:

```json
{"name": "John Doe", "age": 30, "email": "john@example.com"}
{"name": "Jane Doe", "age": 25, "email": "jane@example.com"}
```

We can validate this data with a similar approach to the one we used for `.json` files:

```python {test="skip"}
import pathlib

from pydantic import BaseModel, EmailStr, PositiveInt


class Person(BaseModel):
    name: str
    age: PositiveInt
    email: EmailStr


json_lines = pathlib.Path('people.jsonl').read_text().splitlines()
people = [Person.model_validate_json(line) for line in json_lines]
print(people)
#> [Person(name='John Doe', age=30, email='john@example.com'), Person(name='Jane Doe', age=25, email='jane@example.com')]
```

## CSV files

CSV is one of the most common file formats for storing tabular data. To validate data from a CSV file, you can use the `csv` module from the Python standard library to load the data and validate it against a Pydantic model.

Consider the following CSV file:

```csv
name,age,email
John Doe,30,john@example.com
Jane Doe,25,jane@example.com
```

Here's how we validate that data:

```python {test="skip"}
import csv

from pydantic import BaseModel, EmailStr, PositiveInt


class Person(BaseModel):
    name: str
    age: PositiveInt
    email: EmailStr


with open('people.csv') as f:
    reader = csv.DictReader(f)
    people = [Person.model_validate(row) for row in reader]

print(people)
#> [Person(name='John Doe', age=30, email='john@example.com'), Person(name='Jane Doe', age=25, email='jane@example.com')]
```

## TOML files

TOML files are often used for configuration due to their simplicity and readability.
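TOML support in the standard library is version-dependent: the `tomllib` module used in the TOML example ships with Python 3.11 and later, while older interpreters need the third-party `tomli` backport, which exposes the same API. A minimal compatibility sketch (the fallback branch assumes the `tomli` package is installed):

```python
import sys

if sys.version_info >= (3, 11):
    import tomllib  # standard library from Python 3.11 onwards
else:
    import tomli as tomllib  # third-party backport with the same API

# Parse a TOML string; `tomllib.load` works the same way on binary file objects.
data = tomllib.loads('name = "John Doe"\nage = 30')
print(data)
#> {'name': 'John Doe', 'age': 30}
```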
Consider the following TOML file:

```toml
name = "John Doe"
age = 30
email = "john@example.com"
```

Here's how we validate that data:

```python {test="skip"}
import tomllib

from pydantic import BaseModel, EmailStr, PositiveInt


class Person(BaseModel):
    name: str
    age: PositiveInt
    email: EmailStr


with open('person.toml', 'rb') as f:
    data = tomllib.load(f)

person = Person.model_validate(data)
print(repr(person))
#> Person(name='John Doe', age=30, email='john@example.com')
```

pydantic-2.10.6/docs/examples/orms.md

Pydantic serves as a great tool for defining models for ORM (object relational mapping) libraries. ORMs are used to map objects to database tables, and vice versa.

## SQLAlchemy

Pydantic can pair with SQLAlchemy, as it can be used to define the schema of the database models.

!!! warning "Code Duplication"
    If you use Pydantic with SQLAlchemy, you might experience some frustration with code duplication. If you find yourself experiencing this difficulty, you might also consider [`SQLModel`](https://sqlmodel.tiangolo.com/) which integrates Pydantic with SQLAlchemy such that much of the code duplication is eliminated.

If you'd prefer to use pure Pydantic with SQLAlchemy, we recommend using Pydantic models alongside SQLAlchemy models as shown in the example below. In this case, we take advantage of Pydantic's aliases feature to name a `Column` after a reserved SQLAlchemy field, thus avoiding conflicts.
```python
import typing

import sqlalchemy as sa
from sqlalchemy.orm import declarative_base

from pydantic import BaseModel, ConfigDict, Field


class MyModel(BaseModel):
    model_config = ConfigDict(from_attributes=True)

    metadata: typing.Dict[str, str] = Field(alias='metadata_')


Base = declarative_base()


class MyTableModel(Base):
    __tablename__ = 'my_table'

    id = sa.Column('id', sa.Integer, primary_key=True)
    # 'metadata' is reserved by SQLAlchemy, hence the '_'
    metadata_ = sa.Column('metadata', sa.JSON)


sql_model = MyTableModel(metadata_={'key': 'val'}, id=1)
pydantic_model = MyModel.model_validate(sql_model)

print(pydantic_model.model_dump())
#> {'metadata': {'key': 'val'}}
print(pydantic_model.model_dump(by_alias=True))
#> {'metadata_': {'key': 'val'}}
```

!!! note
    The example above works because aliases have priority over field names for field population. Accessing `SQLModel`'s `metadata` attribute would lead to a `ValidationError`.

pydantic-2.10.6/docs/examples/queues.md

Pydantic is quite helpful for validating data that goes into and comes out of queues. Below, we'll explore how to validate / serialize data with various queue systems.

## Redis queue

Redis is a popular in-memory data structure store.

In order to run this example locally, you'll first need to [install Redis](https://redis.io/docs/latest/operate/oss_and_stack/install/install-redis/) and start your server up locally.

Here's a simple example of how you can use Pydantic to:

1. Serialize data to push to the queue
2. Deserialize and validate data when it's popped from the queue

```python {test="skip"}
import redis

from pydantic import BaseModel, EmailStr


class User(BaseModel):
    id: int
    name: str
    email: EmailStr


r = redis.Redis(host='localhost', port=6379, db=0)
QUEUE_NAME = 'user_queue'


def push_to_queue(user_data: User) -> None:
    serialized_data = user_data.model_dump_json()
    r.rpush(QUEUE_NAME, serialized_data)
    print(f'Added to queue: {serialized_data}')


user1 = User(id=1, name='John Doe', email='john@example.com')
user2 = User(id=2, name='Jane Doe', email='jane@example.com')

push_to_queue(user1)
#> Added to queue: {"id":1,"name":"John Doe","email":"john@example.com"}

push_to_queue(user2)
#> Added to queue: {"id":2,"name":"Jane Doe","email":"jane@example.com"}


def pop_from_queue() -> None:
    data = r.lpop(QUEUE_NAME)

    if data:
        user = User.model_validate_json(data)
        print(f'Validated user: {repr(user)}')
    else:
        print('Queue is empty')


pop_from_queue()
#> Validated user: User(id=1, name='John Doe', email='john@example.com')

pop_from_queue()
#> Validated user: User(id=2, name='Jane Doe', email='jane@example.com')

pop_from_queue()
#> Queue is empty
```

pydantic-2.10.6/docs/examples/requests.md

Pydantic models are a great way to validate and serialize data for requests and responses. Pydantic is instrumental in many web frameworks and libraries, such as FastAPI, Django, Flask, and HTTPX.

## `httpx` requests

[`httpx`](https://www.python-httpx.org/) is an HTTP client for Python 3 with synchronous and asynchronous APIs. In the below example, we query the [JSONPlaceholder API](https://jsonplaceholder.typicode.com/) to get a user's data and validate it with a Pydantic model.
```python {test="skip"}
import httpx

from pydantic import BaseModel, EmailStr


class User(BaseModel):
    id: int
    name: str
    email: EmailStr


url = 'https://jsonplaceholder.typicode.com/users/1'

response = httpx.get(url)
response.raise_for_status()

user = User.model_validate(response.json())
print(repr(user))
#> User(id=1, name='Leanne Graham', email='Sincere@april.biz')
```

The [`TypeAdapter`][pydantic.type_adapter.TypeAdapter] tool from Pydantic often comes in quite handy when working with HTTP requests. Consider a similar example where we are validating a list of users:

```python {test="skip"}
from pprint import pprint
from typing import List

import httpx

from pydantic import BaseModel, EmailStr, TypeAdapter


class User(BaseModel):
    id: int
    name: str
    email: EmailStr


url = 'https://jsonplaceholder.typicode.com/users/'  # (1)!

response = httpx.get(url)
response.raise_for_status()

users_list_adapter = TypeAdapter(List[User])

users = users_list_adapter.validate_python(response.json())
pprint([u.name for u in users])
"""
['Leanne Graham',
 'Ervin Howell',
 'Clementine Bauch',
 'Patricia Lebsack',
 'Chelsey Dietrich',
 'Mrs. Dennis Schulist',
 'Kurtis Weissnat',
 'Nicholas Runolfsdottir V',
 'Glenna Reichert',
 'Clementina DuBuque']
"""
```

1. Note, we're querying the `/users/` endpoint here to get a list of users.
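One more trick worth knowing here: `TypeAdapter.validate_json` accepts raw JSON as `str` or `bytes`, so you can hand it `response.content` directly instead of calling `response.json()` first, letting pydantic-core parse and validate in one pass. A small sketch using a canned payload in place of a live HTTP response (plain `str` is used for the email field so the snippet carries no extra dependencies):

```python
from typing import List

from pydantic import BaseModel, TypeAdapter


class User(BaseModel):
    id: int
    name: str
    email: str  # plain `str` keeps this sketch dependency-free


users_list_adapter = TypeAdapter(List[User])

# Stands in for `response.content` from an httpx call.
raw_body = b'[{"id": 1, "name": "Leanne Graham", "email": "Sincere@april.biz"}]'

users = users_list_adapter.validate_json(raw_body)
print([u.name for u in users])
#> ['Leanne Graham']
```

Skipping the intermediate Python-object round trip this way avoids parsing the JSON twice, which can matter for large response bodies.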
pydantic-2.10.6/docs/extra/feedback.js

var feedback = document.forms.feedback
feedback.hidden = false

feedback.addEventListener("submit", function(ev) {
  ev.preventDefault()
  var data = ev.submitter.getAttribute("data-md-value")

  feedback.firstElementChild.disabled = true

  var note = feedback.querySelector(
    `.md-feedback__note [data-md-value='${data}']`
  )
  if (note) note.hidden = false

  if (data == 1) {
    window.flarelytics_event('thumbsUp');
  } else if (data == 0) {
    window.flarelytics_event('thumbsDown');
  }
})

pydantic-2.10.6/docs/extra/fluff.js

// set the download count in the "why pydantic" page
(async function() {
  const downloadCount = document.getElementById('download-count');
  if (downloadCount) {
    const r = await fetch('https://errors.pydantic.dev/download-count.txt');
    if (r.status === 200) {
      downloadCount.innerText = await r.text();
    }
  }
})();

// update the announcement banner to change the app type
(function() {
  const el = document.getElementById('logfire-app-type');
  const appTypes = [
    ['/integrations/pydantic/', 'Pydantic validations.'],
    ['/integrations/fastapi/', 'FastAPI app.'],
    ['/integrations/openai/', 'OpenAI integration.'],
    ['/integrations/asyncpg/', 'Postgres queries.'],
    ['/integrations/redis/', 'task queue.'],
    ['/integrations/system-metrics/', 'system metrics.'],
    ['/integrations/httpx/', 'API calls.'],
    ['/integrations/logging/', 'std lib logging.'],
    ['/integrations/django/', 'Django app.'],
    ['/integrations/anthropic/', 'Anthropic API calls.'],
    ['/integrations/fastapi/', 'Flask app.'],
    ['/integrations/mysql/', 'MySQL queries.'],
    ['/integrations/sqlalchemy/', 'SQLAlchemy queries.'],
    ['/integrations/structlog/', 'Structlog logs.'],
    ['/integrations/stripe/', 'Stripe API calls.'],
  ];
  const docsUrl = 'https://logfire.pydantic.dev/docs';
  let counter = 0;
  const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
  // avoid multiple replaceText running at the same time (e.g. when the user has left the page)
  let running = false;
  const replaceText = async () => {
    if (running) {
      return;
    }
    running = true;
    try {
      const text = el.textContent;
      for (let i = text.length; i >= 0; i--) {
        el.textContent = text.slice(0, i);
        await sleep(30);
      }
      await sleep(30);
      counter++;
      // change the link halfway through the animation
      const [link, newText] = appTypes[counter % appTypes.length];
      el.href = docsUrl + link;
      await sleep(30);
      for (let i = 0; i <= newText.length; i++) {
        el.textContent = newText.slice(0, i);
        await sleep(30);
      }
    } finally {
      running = false;
    }
  };
  setInterval(replaceText, 4000);
})();

pydantic-2.10.6/docs/extra/terminal.css

.terminal {
  background: #300a24;
  border-radius: 4px;
  padding: 5px 10px;
}
pre.terminal-content {
  display: inline-block;
  line-height: 1.3 !important;
  white-space: pre-wrap;
  word-wrap: break-word;
  background: #300a24 !important;
  color: #d0d0d0 !important;
}
.ansi2 { font-weight: lighter; }
.ansi3 { font-style: italic; }
.ansi32 { color: #00aa00; }
.ansi34 { color: #5656fe; }
.ansi35 { color: #E850A8; }
.ansi38-1 { color: #cf0000; }
.ansi38-5 { color: #E850A8; }
.ansi38-68 { color: #2a54a8; }

pydantic-2.10.6/docs/extra/tweaks.css

.sponsors {
  display: flex;
  justify-content: center;
  flex-wrap: wrap;
  align-items: center;
  margin: 1rem 0;
}
.sponsors > div {
  text-align: center;
  width: 33%;
  padding-bottom: 20px;
}
.sponsors span { display: block; }
@media screen and (max-width: 599px) {
  .sponsors span { display: none; }
}
.sponsors img {
  width: 65%;
  border-radius: 5px;
}

/*blog post*/
aside.blog {
  display: flex;
  align-items: center;
}
aside.blog img {
  width: 50px;
  height: 50px;
  border-radius: 25px;
  margin-right: 20px;
}

/* Define the company grid layout */
#grid-container {
  width: 100%;
  text-align: center;
}
#company-grid {
  display: inline-block;
  margin: 0 auto;
  gap: 10px;
  align-content: center;
  justify-content: center;
  grid-auto-flow: column;
}
[data-md-color-scheme="slate"] #company-grid {
  background-color: #ffffff;
  border-radius: .5rem;
  color: black;
}
.tile {
  display: flex;
  text-align: center;
  width: 120px;
  height: 120px;
  display: inline-block;
  margin: 10px;
  padding: 5px;
  border-radius: .5rem;
}
.tile img { width: 100px; }

.md-typeset__table > table { max-height: 60vh; }
.md-typeset__table > table thead {
  position: sticky;
  top: 0;
  background-color: var(--md-default-bg-color);
}
.md-typeset__table > table th {
  border-bottom: .05rem solid var(--md-typeset-table-color);
}
.md-typeset__table > table tr:first-child td { border-top: none; }

/* API documentation link admonition */
:root {
  --md-admonition-icon--api: url('data:image/svg+xml;charset=utf-8,')
}
.md-typeset .admonition.api, .md-typeset details.api {
  border-color: #448aff;
}
.md-typeset .api > .admonition-title, .md-typeset .api > summary {
  background-color: #448aff1a;
}
.md-typeset .api > .admonition-title::before, .md-typeset .api > summary::before {
  background-color: #448aff;
  -webkit-mask-image: var(--md-admonition-icon--api);
  mask-image: var(--md-admonition-icon--api);
}

/* Logfire link admonition */
:root {
  --md-admonition-icon--logfire: url('data:image/svg+xml;charset=utf-8, ')
}
.md-typeset .admonition.logfire, .md-typeset details.logfire {
  border-color: #e620e9;
}
.md-typeset .logfire > .admonition-title, .md-typeset .logfire > summary {
  background-color: #e620e91a;
}
.md-typeset .logfire > .admonition-title::before, .md-typeset .logfire > summary::before {
  background-color: #e620e9;
  -webkit-mask-image: var(--md-admonition-icon--logfire);
  mask-image: var(--md-admonition-icon--logfire);
}

/* Hide the run button in logfire admonitions */
.admonition.logfire .run-code-btn { display: none; }
/* add border to screenshots in the logfire admonitions `img[src*="logfire"]` to differentiate from emojis */
.admonition.logfire img[src*="logfire"] {
  border: 1px solid #448aff;
  border-radius: 0.2rem;
  padding: 0.2rem;
}

/* banner slightly larger */
.md-banner__inner {
  font-size: 0.8rem;
  margin: 0.3rem auto;
}

/* Revert hue value to that of pre mkdocs-material v9.4.0 */
[data-md-color-scheme="slate"] {
  --md-hue: 230;
  --md-default-bg-color: hsla(230, 15%, 21%, 1);
}

/* Add customization for pydantic people page */
.user-list {
  display: flex;
  flex-wrap: wrap;
  margin-bottom: 2rem;
}
.user-list-center { justify-content: space-evenly; }
.user {
  margin: 1em;
  min-width: 7em;
}
.user .avatar-wrapper {
  width: 80px;
  height: 80px;
  margin: 10px auto;
  overflow: hidden;
  border-radius: 50%;
  position: relative;
}
.user .avatar-wrapper img {
  position: absolute;
  top: 50%;
  left: 50%;
  transform: translate(-50%, -50%);
}
.user .title { text-align: center; }
.user .count {
  font-size: 80%;
  text-align: center;
}

pydantic-2.10.6/docs/favicon.png (binary PNG image omitted)

pydantic-2.10.6/docs/help_with_pydantic.md

# Getting help with Pydantic

If you need help getting started with Pydantic or with advanced usage, the following sources may be useful.

## :material-help: Usage Documentation

The [usage documentation](concepts/models.md) is the most complete guide on how to use Pydantic.

## :material-api: API Documentation

The [API documentation](api/base_model.md) gives reference docs for all public Pydantic APIs.
## :simple-github: GitHub Discussions

[GitHub discussions](https://github.com/pydantic/pydantic/discussions) are useful for asking questions; your question and the answer will help everyone.

## :simple-stackoverflow: Stack Overflow

Use the [`pydantic`](https://stackoverflow.com/questions/tagged/pydantic) tag on Stack Overflow to ask questions; note this is not always monitored by the core Pydantic team.

## :simple-youtube: YouTube

YouTube has lots of useful [videos on Pydantic](https://www.youtube.com/results?search_query=pydantic). In particular Marcelo Trylesinski's video ["Pydantic V1 to V2 - The Migration"](https://youtu.be/sD_xpYl4fPU) has helped people a lot when migrating from Pydantic V1 to V2.

pydantic-2.10.6/docs/img/basic_logfire.png (binary PNG image omitted)
/rRTG!-ƾ8tÉ̖!-K2p@8; dBu@ vBvD`GdvD`GdvD`GdvD`GdvDbi3-J]n4cd# F12r `d# F12r `d#223ٝvFq[݈]wcEA&ő1ܙ$RtH" Ŭ{ʔ9 POJ """Bfd{QV5tFMeYդtMY]@:I@D """2Bf#$)A0Ԥ驾Y@8U!CĐ!"""!s0dd,N"UQN\_M-zʜ }peE2D """92e^4"Uv}2Dj5詒Et>hl9f1dc y~wϛy}~C5L; FPQS }`1dc1>dF]CwKTr JX5)z}c1d1̹9^ql|k2q|=_C;۠e,I)n&GH=)z}:dz " slƘ\17g&}j|ld_l5z lOCFlB2DDDv!sc^=B\]<{1[sYD*bXV;d2D ?`o-p\AW[%B#qPq(nptga ' Ϊp, -#QqG<$e4顰Xj홗̣Tf42Ȇ"=2d2DDD LKN(k5kxt3)-T~ 4TfT 3:2B-#b2s!N(Md|r;=8P7@Mu_Q!c2d2DDDt^uCf%/a-0iZAY]c8yTm7 ]40#ÌΆLq!C!CDDDT9dą |/Bf'謃B.{O2:NoW6#ÎC,201db![<̫tk+ϧYցm;'mw }baF!c^ 1db!]?t!3e:ȅLw<ࢎjvb$@ǛXpi aFہIp<| 2D """*rxnJi}޻*dRS}p27f44!CDDDU < 7jޏ;4/i/<32̨]xYWDĐ!""~@ `GZ|ltD${ 2$9 2D "@d8" Gd 2دc( *DPX@V (ƶƿr:" PGd 2@D#2# 2@d 2@"'"TE&| 2@Ud U{*2@d,s U 2@SdrWD: طc SI+aRX(h6^A,+!̡r Š66>O9'b(2B(2yB(2ϹB(2< !T YR*z7"2@!sѻ|` xlMfy` xK2B z_D$- v< !2CG !T r' P)d d dr dr ֓޺|lGB_| B>ػ{0 +ڤ(G2)d+/"   CZ56Kb6{ 1+B2@'$@5!1kƆ\!)d}{i 0  DHHT@рz6d|{M(! 2‹f(exk<N2122 1xƆx yB62@ 1dC!  o۲g!L2@L~'TDz3i%d ؽ p      OaPp5-'2Id'z"D p'2Id'z"D p'2Id'z"D p'2Id'z"D p'2Id'z"D~0 fD8 $2@Od=N"D8 $2@Od=N"D8 $2@Od=N"D8 $2@Od=N"D8 $2@Od`kQvsnPWlBZR-D\ b6P\wM wn .0d&&!R33:w8=B%BD`#dX"d@=B%BD`#dX"d@.M-C"dri㊄L#3 =  9d3Byqw'jȟ 65fHB2㥪6؆L/kJF. ; 3dNrB7QYK]jab!dkhLoh-%]|VcEx]BB2n;TRCF -C!! z3b8ټ2IlN 2̰ŷZ 9<#M2m!3wc췝ddM=󵫮Ɛ⻹]8tƓ]c-dn>d  ai(Q!TQ_9d̨z gBfY_BF{e|~Ƽlwg՘i ucm4#zFkqS5zu ]ԉrBf2A:_2j-_:W @~wMq?TAkjkZMDɉCQ1jhbI Pp(:ti:8BG(].&3 }&z dFa߇* UG0[jtA<"CA!*/)2b۵(,Đ!""z d/NTjbn(7dSswsB%QcR>7wh>wn!dJ$6[ """ Ch7,d]TaWCfFT55eH!2Z*OS-!3/js*^,"CB끐9)j3ij lAc0f –52?E|ZC?z !""=dފǢPnrۢuCO|=(d"cAȔD "S1Q8 """ B抨<,Q'42}f 1&EBfKTw?dsg6I@TV{rr6" !CDDD@ȜcmRȉs~k~.7dxh:vQq tT)gfC+O-\{( +!ܘ̿ܛE.]#C!`,1CCt:CB`ѡ2 AD2DٯCA[? 
'z"D p'2Id'z"D p'2Id'z"D p'2Id'z"D p'2Id'z"D i" 8w A)Gr51,ZVVB j0i@Q 088  be^Ahk3~ͽI2"""R BFDDDO!#"URȈH)dDJ ?TI!#"""*)dDDDN2alME"8H "SȈHmC=>:M2:ы5D?< D2"""RӐ ~aYB!309\9_1K3dѸ)s?aȼk3 `c2d!|1ts"zL< kcNB&2"C!#"""5 q2eW2 2#̤g(X lc5v2#d%F#CP 3B'J: K42xے'вMLC`a|](dL]l({r{=pJ: јþz^d{!OQ!"L[ZA[-d|֗K󴦯3*K|C}vE4`ck|=0,挵_vVyj!(2"""RyEEn$ͯ1.Nz!ÝCбC}ɋYN5{!3IRc}msgN Sa23hΆNuہ4nRʅߘWy˽a̪Lb[& 1=( lki ].*4%8ӁxtiC&aEW2tȟ` ~By&ݙs rtEw8yU~̈oGm䅹݉͞=21 ;!9cbo"ni ]Ȝ%PaAX t€u S(E9{(d2ʐHp XHeȜ.]T lVXfvsEluzBh5+e 4 .d^80D_V23cP{X2=dֲ= [!bR2`M0U2׍ +k20gN萉sż7[Yڅu.*<)[$1Σ,cOP/ d2b!6dZ!Q2<>[ j,W}L3?j=6fH}4M_!c{k6M4r 4q;Qn1?6mRυ`h6LC3P )<ɛE(}?_7[ rF{?.vړW5/d"%Ա4W@e7Q1=md2CC/HEI!s ƪu˲ /ϡTАQJ)TB&DN0|^^.WlFc!LToZ$񫱱@Cȥq;0" 2W`$DDw$ߞLC֡!RJ 3 *qƆ#C'dr@U!3FqDVPf̙] !ǝLp,x\5d2fǴl̚E YG^6T ѐQJ)TC&g29`ǒ9Te 2CA 9 CŢ9˨&~7d"4!M2~g",BŽU 8,D21}ӄؓU(B4dRJ)Đ#/G@D`5dSn{$,>RҜZQ*OZl4dBOṙ{8!3avqyDo{&dҐmׇ}aR˓9 `Cd$ vfMv_xls*@̄z#=YÑPh(R!$ErY/\cdrV]2!9OZh4d#aSQ27\e YA>GYv9 4*HY<_k" |p_}HY EL\HeWDT+ѐQJ)TSC]#3,NewIZ\$C8 Ih0dl_ „̘g{w2; л92qLtg~AH.݋TO˗ݴ|+5!&꓈X8tYvPh(Ry!c|8*ێnf2^[q|8==du`v^l!~ġTKѐQJ)_ ;2 'd$D"P'ul oo,,v9d&` va(ZXx 2D?o@-"L*<֦w@-"̥>q^[@1"̡rD DHGd@dtD ]w:8&>!L: -?PO QPB2"Lܚ \nũѡ,J0>x'2@v d@2#d!dGBȎ ! ;B2@v d@2#dRZ\ҍG Щ{Bofq34XL{8r5Z|{_5@%dJ'eBm:) F^FW?f3GKBRJ3wL̿ MEV*Eq2ߋ;52S,+2. Xm)^6d2SX-BfQTTiX @M]2S2u-!s83@B8%!SMkv^WD𯐹Tc륈[{ۋڮlߏSH<>{u0-s'/+j5~܋/Գh,Vz~sǮMqjĂKҚ&1iXk* % A* dq:覃t Q&%Ui)g}eW2"""Es?2x MtRx!sB&Nrꌳ\4YQHD2tk5^d}(dDDD勐yFcToO qt@rU y?ITK{˩+'d/ά5W̶\AϘG˺];wv "/BgG$Y )i 54!3\ 3Ap-O ͣ)Rd  hM>֔,_\$gC&zE"!r[4qQ B1L?k h B&DKEN8ɚ20ƙs#H:@tL4泛Ct!dCb;)Ri` 9!SS8n#B.ոoSӂBuo/-M"/B9-~X \nneɣrm'/Ø!Yr6$ &d&CT/CfxkG8YSlsv>Nd(iGx!3zp(iY~ $r6ɩ0$!xFdqi7J%H\e/CigIV2? l2 y^Ax!>˲>h^ 28ȦPW@=@z^Z;̂] mKWbi _ek1#iJ'd0YzqDD!#""" 0|H4nO"tGdeA_*f1%0DD!#"""~ BFD/ QȈ?G!#"": u;}!" 2" 2" 2" 2" 2" 2" 2" 2" 2" ~vD`GdvD`GdvD`GdvD`GdvD`GdvD`GdvD`GdvD`GdvD`GdدA_##2;" #2;" #2;" #2;" #2;" #2;" #2;" #2;" #2;" #2;dp$X\k3i2T|ؤ J, _ ԃNCf5:G>{bgx~=|yol dB2Z!- CB@!d@;LRi3/! 
|vThLǍ &PȨT2ޔfzcM2 u-jq=j2@!d@R[45Xoq7Tu{-dbh^ H(dۭd&lu_oq^~0$n!;@"d@"!^2)L!B$2%3$d2 q)]/i(?YFȃOLod;gN:72b+ +3i9g4}r:+3(2a2%h>ު1 ֤V,Vm"vx(/Ը$F2MOڰˬ쒮:fJѾ\"+-;X8 d@Ú5dLF/9dZ;d>5rN{fֶV:B4YC^jʨ-3אY?2gCWˉ-U8ВpA5k)GqLxKv9`j!)xy!;1˃n^pqPvu "d@Ú8dK-sj̛^ײ(d CfgTdhE!S}b_xD`GoTt;-!a]? >b뢶jm-Q?QQ(*F+EdQ CpN!fs|޻Ki= drVܐJ"k@M2!CfR`YMc+dY3ֵ BVQ顺,fDʹ_|!S?)UufB橚%\Ԭ>"r d@n>̄Yj6!3'2I'#\S\|%|2 †Lj^0.ZW&dJ쑚}^!HӦDF]FrcD܊2:7Mِu5Bi,tSk\x!d@n ~#5Bު 5D>3Qd]wicY"TIu^ ȭh!1rCMl9*F$5+GS/Qc :}kYzn56UG!#d@n ]j%qBM]]$&ϐtxI*$dRo%S3>>`chX_eڮɟƻ&&&$i| LMMJJJ788vvvlll剉BBBYZ["""oopGUiiii444WWXQQQzzzEEEddesss:::GHH2ubbbZ/ -***i\\],,,{|~噙SSTŹ蔕n}___.//O};<Er("8W{4&Ukzn(2>cAq~ / @nE )[ @nXP֜uɷǧtXj\G2RDr/#E6&e%yms%%*zѤ,_@ `U7!HFz/Few[+)Uuna;.GZpW#,W'E `%dZ0gO7zcm:tHA,ԙĵ3%'t UR!pܭCPsTb#6ɳ\ 3Α!1>82I PIelf)X -ZJoQN`,JFCN3Z FG6Kт Pq9NjqZ c@F: r)=H SU,nF)Y oA.!؃0Fkɴ6J#z'5=}>\Wt0i,#)i  S 7NO#`8^s1wT>G ` E f 6~%YJ*L0\ 繮U >g<"&)a Fg-Fz􊸲H\蹗] )?i0 x0Z^KdmN[5tv[ZXufsS`U,x;l!z/x^ G/ o6]vˇ%%˓tk槂pP C)ɷ=4tY a,=dY jV,`OPbS)hVpx?db>D,xl?,aogKw -A%(H(f/lFJAbXڅvRv >o" R]띂%%H}7gi TUUlI 80X,ŲL;6G!A8!bz_&3| 4BS_< Am [D}8{Tę,[%i T%<(,IIx{ LLwi!E-w WG lL nQAO-?4.'lւiཀྵY] NJ8%-qT72Ņ%!U0d_%F:hb,UKA7"HEĕ>+3fqA^D&(R0XFJCz5}s_62ߛUݠ)OP gm P07\u^"u2JmuPNQ< S`isU;҈0p w@}z&'O@rRsOކ2&_I&'77%#vȱG,7XᇅԆ}5=“'33O"{%_!̕`-X f hjcHAؓMI]?RB:^݄9*傂 IBk &r쑆I UI(k!( >DJG^ <!nF fxy  "׆M3Y%sZ<Q}^UM_C5c}n*RpEDŽՐ )%MG#Sd%!_h{զvD`4Jnor ڊCcs!:-caX4HWcP7),)A[B j-xC'4_<^GR T)A-(T\PW>2[OAJA ;>hҚ#9h 1ߟ!N/LOh)xgh ʥYj7 r&JL ndzxf`m[!$ O'u`g(8+^⇜w|a$`v|!c)`o(KRPwɭ1-;p[3CB c߭q^&t]/gN`ʹIC tj/O".ݡn3BHr`06d ڡRTfqUOAˊQm* rQYUT ) qG+`I ,OM]%ȈAdSBr)-80+g- *ગ܏eCCiM5J.!D'*dRP8LЕE2ꖒrPFo~#vԂvs4>~?"f79}ێtSQ/!gCČ2D`HA] ]5.BH!Xt)h50)s;e+!`2 A)(T{XZTt/U,.THA\s>$&w❐zc m˶Y$EǓBi J)%)g|ܽa0“퓂 HYKl,8LV-H+g۔)}ү_PWM`?ʃ'{)rȹ-eUB#)WeSЁ{{046Gˉ=g}]goxp]1i^Qq(I=P<> v\۔L]!*uo<>d f[Z D?0mt?bzt‹ޫ%'Lqۨ *2 8@G+]}k|6+_& 1AD4ͥEGOO+RǕJF?wPA6[`?b}XTnmnb!7#vQA7bR&t\GѺB1i{b. 
G-HQvSY)\]ꪠT0r<@J_,ETE1".tyڴ7D B(X01 ?rmɋ8ģ DByeJl;ωzY;9 R@385 6Bl&&hhNṓ ƨ '酤?_ggrK*88믧ڈ6E6Af^q0b*`+UC3sj!v*``TH5nZ=POu*@K/w "L^ACCq , "$Bk;ëA›Ͽbdžγ $HfP܉DZMWnUzNP߈Q5ƙ&C%1<4#ʪ S~ZA]i".*JV$m(K>?mZPONWA0(kKj6*;$TYT{< EmYzFTP"f *x[LPA柋Ӣ?\^W^.T rW_-STpESUqDdULm`|q]2"z^؍,b6`d)庘wmZkT7ʷ] Ǭs)?/,&kfiA`)*R.زH2t]6l؍*3~qLrWl0kL{x`fG\RL<^TpKNNϷn)Wf7`*x(X&8`PG:x\F(F"DK }~sr)q|y~CRzȧ_]U]0A.xadeVd^K@YUA8HSAY b]_aMJ^Z?6-5*x(+Mp* NQFT!`0+7Qr@.UӬ>PRQ_co$X z"3X" **&lNU3C' UCUUp Z&EQqRX?^>r Xo*2ۊ#,TP7{TpJ[$ Dpd2D{e]Y0;Q0\ ܤ7ApV=\ۨtA'(k϶ש 0MY@bR}Wݞ}":`JI}a!7Q)<[54 *#WyVOT9ktl'4d?vX"zф1*(Td;}( =v.{s hVCrGi*_%dO>N8X"2x&n[XP *XQQl No'THAhFDarT# +Ҏғ8>*)*xϱ lO\Q9*GӬ][i])|tAռlFtq{*x{dH M; +iݨۋ[kUpqHI]ԍ, ڒq3t*SXr)9Û<;f`3PHuARiPA|c9F W3FH߂H&A;ld^gt!Ys)x~)>cЛf"mi,Y`\{e[11UP ҈v[~1(D/URXƲ*X]ڱZ~ lp+6A8hF2*R*(je>As֩0sd {M;>3͕PAhb:q >BMtOm;T?I 1'HآDCp_1yo" 1U)5J[5;A%õb Fi7IUù/ͨ Ni FU@%R>C?tVܺ r;m a+fxDQ5GdF9xVJߙNЮ CЦA[,U9#[A+Ҳٟ:딼Tĝ;q6dבt,'XEr >’~%Za:yc٨  5*8d zdsBx矽dߕJgvӋ3 iiyAw}?ov,I'II *("ͬe+1F|z =]5U7lk*h j|Цո^;K,n8, SCCC)"oa5 K.;D*i@(&:Qjkf)K.󄽶F : Dٻ8>- 3y9i9"̡r(n]Sr ̝HB((7SV)wn\>` {Rsi7*(EStfb(9ѳur(L ƒgSSM#,p,&_MAG :L" <ԭw_S;0\RTJbq\4Kc3)ZQG>q G6\[ԉ3V;7(r&T_%0*@mF \L?c#|IbP&40GTQW"u&i#Y l)T]yGIA6N1IEs[l\߉DOv:}" gV͚L8T&r!5)ZCx)d#㩺o|d3IA6f5}!n\b*vZÂi)A?}zi}Eo!Ԑ$.(Xg_֣b^!cZ ;z_8 M1L~ r`mlޖK *d̼riL4$w^vf(t HgDv*Cd`{#&7mJ퍩d]~yǑr& kL kn,[9 qN#HAO MܮDj8AӒ6g噤 qX)Sr R_ ,Y;,EH?HO{+z bNJ4P)%*`ߡnQ?Lo{oPMlvLmۖIjiAs5 { Ɠ+e/,X)rev+ʨE!NASMK |u@;TK'c Yc'ϛ!'u3Q]l%ک<4s)JShM |~Jah!"@Qi~[*D…YXBd$ZhѕJLS)_ra$Ps⩎\'j߾㚂YH4J>~~/1&Tك1MƘ쪮`gagi(ME(wS;/bV ږHJE? 
]Hz [T\m[Ɉ1ѫf}NAKA,4.2)X}PJu^򦥳^SFkv/a^v0  !5&<ɤrl"F0Dbx 8 ]pa$fB^+a.Kk)"([ CDDDS5(]rgOSЗ؎RâNޞEqDvPi e SvapgS,UYh?K )X)8f ѯԳ!X ݇NLڰP[jR>\>0bʺP>y ER4* Fբ˙1:!}3ddLV`jrif%rW'BDDDjL^ȚJB h؁ Z=͐T}Yw0%sLQH NHBnARk2RPGR#m1Bs(#!"""ZQוue"]1>: L]Y<z-rU?cbe]Ơ% 8ޠqUA\K 0W`7 Wʻ`5mT]pJZ"""U58 (O QZ\K+؀eo2U‰lnWj9f` {7N{'i>k}LA^-R`j >.xg$[Ã_3єsK%TR0s6`先(R%i`\F5LdY(u˃bL)8*g7 3GmWuDDDupH*C)?i9NK"Fą,5O1/<m٠WJ=+[`8cn)9j+DDDDKJk܏𾄜}+9H8 *S6T s8;S1,$2wl!ynKS0X wm„|Ršjʢ(.V )Lkp]%_VpַJ:RP Oh$?40ꖔNJ)(? "Q0,(A JA)ydB |1)Xnc<1~ҴROvm'( a i1$fPhǦ"BFDքؤm/J x'u6{Zb]lG'q?*0cXЅvN-7Q0Hql8kc16SwXjhtEzT}\MNqd1^3W˘*\(xИ*qd1؟~Kah`p ]{_[(c?-$Y 6e`,,\CP׾wQ1cz$ GqF3, fpf c16뇭Qr#d%HVإ:Q[/%+T4K+qHz)p}ک2yH+%Y)qm{qmi3k *d;ള 2cn@2cK̄d8;#}jpĈޝ}nh^E_h4:5M8%[ d N4GߴgPQ#5[čHjnEKkWx LE _ĆU#NH-,wbSg7X?j^qǠQkOݫTeWmCwhbz$MT|Cxѿ䩱ǴcDc[qPgl vD{>WlSVYpwGM*f*jES-7dA]5dcOm<)D=WߠAU~TS՞ZȻ+R&V Uz~QЊW`]O(펂&Q:# CJN(@ lM~O˂* >Ha0N 36+.ț`Qbbl44^hU?C?m38 %n;Œ?8@ &j~-JmK@s @jƭahSCh|EB:smw6e/D4!>|ܞ(ׯ Uv'=ౌ.vd!)1G ˹heі(4Qn847a,Z+K3i`"2hʮEC.Ws HjlP]鳜[J(1UP%gY&*Xg{7YC"^ K`#/gkX{MN)dl[c/`y 9ܗ&y2hN}<~¤8~_Ew8+{scQm cAM{܊ѐ&Ob}-P<ڶ+ְуf}KäLpT)Br~ _;DGFUUc T^w$砆vx7MP}/f }YRcЂ0ɰ ׬ ]|m[%huPA0bwZ8' 6_֕oCY+P Sx- Yrv2/s/o4hK8(;{\|h >} 5l&y֊äLpT)Brʪ :tC`o^491?[aq',9A|o홱 1Uk߳# n*Y`[I-<Rr<5lBUTr ՑJH3ծV}jIO^CR0%$^?*x?kbϲ-Q{G?gagOceLP߳[4k}ܟeE{8|UCֳ=<<@ d3%vNRݬMR=]@y8jd[O8ѼM8J<@SgC/GJ- ȋ$ YN˞Jl3rN}wF(,ACGjrw1vYƢ ^)\*.ְуP5.A%IQd!9S>hb«C1It̫^%\4̻ ( 7ݫ*5c0)<_,a*1֮X\30fc񽪠l ^a:;L)(OW)~ğxt,i +@iJ)oopR*9JUL .le7Ed!͋y:a[~AF[BPA6Nvk*H~Eӑvf~5 Wxx*DK9Wac jحZ<0)?#32,$g]YY {«t(&eH( .GdUp8b!R1m‹R -wж8o[Zvlp mr鉑 ~I+8̬h):z [b cPw!*9LLL$ ə*+ۧ͹/$#vG8?NCn13 ?)>J=bI* 3z|`f^x]|@cٞÔAtk~+htT9L_Oױ`$ J] z<*8v|}.|nʔ)SL&DFÚhjCأ iڴ Up/)SL2e:S,VQ_1@DQ4`b.Pƾmem]ڔf?‹ JYLoMA<p%S.[yiLc 2ps 8T=[yd >s%pSwZ^}[zlAc :n*\7Up~ҧ5R^7p.gZZSpԆ)ps 8T])؊ĸ37p.o5 ^#-LAMKn)8GZ܇ 8M5&~S [La7p.jLI SlڸՂ귱mO@q~㱙M(TLKkڮyTJ TK TK TK TK TK TK T_L1N0aVBjc3p,zA ߊ( Cs觿bd#<<І=)࿈wW)8m\S~HTb裻FSa %P!"_I.bw sqR}"SЄ1+J(P!"Ј爩$#JXL&-/K@USp9 c YNQ2F 4d9pv^ 4c C" @C %),)HAfIA@ 4K yeRM |8q: n9+sւRE ]7-P/ǫ^/ Na   ͯ`·*_)xYZVa V1 
~%~f  ٻ8]PX]ju*R`sź2aACCai {dRk\ZƤi;?C_ J-~A&l z7(RN O ǘ )xn(yC4GS۸)E T-zQ=Sn%idM )p&Һpb).I $ R'ULKIF{ )`OdY#J)p/`q)`'e,vaAd%r:;|䄁`]SP‮ %@], ]eO KGDھmm} 1d޹q@Z"!{90mƀ0$`ɂ!nsaTR)U@ ;f[ mC"agfwOsJ&Mʀ`p? 6{)8 0O";ABTp\=VYTV<4=x`F Y? A8`\[!*PӎFrqvm࿥:U &;OI 6i90K ߤSS? Lc*ܠǫo4_mSĿݧ_!QI!"f[N@b3Y4n; 2T b%j_W Kqma-͵Fw|TJ./yӇOU}1+tL׏٭`Dt}?g\K>;+x*g>ΔB?+?FT۠Fఔr WVxoT&Mءp|T4cͶO܀TA_!*D6)V=w@㹜 \'p2Sw)PKŸVAH ^[hsI=Mޜ,SȵJ΍A.BSAX5a b/冔;BmphcUP-$?ߚ"ku:թX[4ё*(Y¿1Ѧ ~gU0TPwA_+q$~ 1~/n*XǮJES<}Ly $;m.oJ &ǎar*_>p\-~;xgZu0,n#~^6k 7ޣ1Q?W^j*cXc}g8dIk"qty kr%/i&gRG׬zΟ)wd *n.romR{ vyN~{ΆM{rjbqruY@xQz*(O hk]P^|;`z+ vD %/ço>4Vk>xÕ(%:;d„'/Ft7  jMdT^dcD]jhؤilBU al`&$tPnUDjLٷT47Mڨ/iES'!*KdXo0ܦ K4cF鬱 L?%Lsզfr'1:!d3;\`86*o7ߣ1!z rsHbac 9Y^7=*؂Ë 8n"( u`u_Qo202Do6]%-ʰ4pR= Øt5n/iX pJ 008VTqN ]`y쩡3D qo4+͍c='kfBMٲ'!쟙@8O =#βǻq<^ѣ.+=$8Zh}n稂ST' vpAgq;xw'~յW`pC#SAiz\4PAW"Ax!\e bKR8`}*+%`h4{䳩 ,T[x< Fۗ5!#2TZi 9pg01>a̛U07pAhwqL~&[1 r^  f;E<~gE=FD0NF FdU*όF;@Pvݪ`٧GBS4qCRAWA_$UpA4u=LCDmQZbXmmLQC:Çj0x/T|q2D~?|f9+>P9~\42住{T׸{;\pM!r-hMTc-Uh4RY 8u vr ',װ`m|is ܻp|+U5j * q+;dpv4 U0 ZrĴME=8 M^ɠ"/jmj*E}Myc^-ZVTP,-OTM9Kӵ!~b}&- 5X/3^·Ǩ0Qw&:Wrey֣)2̱Ayhwv70N*ؤ0`ѐp!GtoTIUG4ɬlc(tG[%(+q50TРRA}m Q=< Cn;Lm}6L3lp;A lp8|MNN2J r-0~.? T0foM.ܧ>^uɧx)?[a7?n׼.dǥ@+;X+Er7*h,엒 ZYcd=<TpZƇM3V?)+*}j MmQi-ZkVT Hh#g\|-TG3jRW01*'>H0*f ,dY-:!@iSdSf0&MP XgTiQaFVSW'.NUm Y,**8SER܀bťa*q> @׿Ny)5YXL"QeY'mF=5#I G (o%gݤ#=R@ฃXf0nGpԮ~^so@ {䭪(@+Z]IcNUnLd[PWpM-+a{KϊvTtޣzOx-Ҷh}[3SA}=O?Z*(c0P%עOǨ`*M:`/Qa̱] Tp_&M.UBe}V꿯s*C0Y!NV7|M2MH[ͳTM4"[eNaa*áAO:A>ActFȉ˯p$'] **PAS/{ʥb$Ѻ >@B p^*<|))]` Dx|a kn hh*Xnl%'~13MU۷ c VAOt!TҲ= z,Wl6l86jZe-Ҷh}[RA}N8 MYMrVP>9x*5Vh<Y EGd}Iyj,RTpa41T&MyAVE%"*k瀱HApVPzA?@ Ue#=iޱA (tkҠa*áAOmDWA5j lp?8bx([JgLX:@D&V .T\;? 
`k B}*6(D+@S2YĻ*GcP^cTBXTU0 aPSAaZZSA-Ri[[>ٜ͢+6ڀ5QGidjj?zNF>Y3WG|J3'A'Tp(;9_pLb oT+P%cI˗KQA}q\ X&MQ5  ZLj*؇?1*nb 18 *ji ~>߇">w>A^F*h F`Յ*ye\WAȂ;MHڢo-JSt/S{guNGZ<,lȩkRF&N `B(ARE@U Ћ>KD "zٝݱxk`nw> :b[_#fs{כ?[O>sZWk.|<|*+2]ԿMz Y;W y *N \O2_$:j3A][0'=TPpgmh凉 }"DAD, t.i pN%SgnYT*wT\ bqh" ]}NCAOH)8+$<(F `}O n {\&?TAL>, 3{k* p~IJ rcˡ9^EьHco1JI#F8iֺ-pcxz䨂\+%/J~ @|g{@ |^ X' ! ;?Ms  ~*"TPpge^7MGldԪ:!WY'̫5> *l+EalLaAle+ 2`<-g B'D?B:+sw p3o<'s0"e+ b۫`zMtD륶W_IQx3)l+~U+xN 2o66o2Ul(ZF&BvdGs4êX5 *Hoo U3N!8 /,ϫZY< m< B`leDEi9yh7H0Uw=Y I&LABĕ  CI HjmU`' crƣichiAag!*[@㱳 >ePlTV0DyJ|Ux&[~S[C9`,*iTf2:sT=).PLP뒎*ỳЯ &A&0-c!+ &ұHKco!Je# &(6@Ms$c9fgG9Ё8 3bKxX+B$) ZpA]ne(3Z}%Bsd|>ޑeU%%:W7. w0(>7(J-,TPBP|esČ'^~]w4zpY @F]X<.Vﻒc Bqn"1\q ./sНطca ,C0p:0FB)ԆXI;]8C&)\ NW!x:S344I lI9^YכxeN\,8e)-x+eSmPKnZ))!J\g]pꌂ,x(p M` )`}e%L Q *k 6yU /%T 8oͣwCS0%&%)ҢFJJ@ TJ#`ߤ A 0gb @ dIA@ dIA@ dIA@ dIA@ dIA@ dIA@ dIA@ dIA@ dIA@ dIA/#)% J R K R K R K R K R K R K R K R K R K R K R;F(&RC-ےm@*VKH'C 0}L:_bߖ͍E?연^..]̽{ξr , ð 2 , ð 2 , ð 2 , ð 2}P_lJYF|fDcQ*ԅ?X/J񈮅yNadUyN֮ݶJ`^l.J]t]L4+/~.0F>N: dkq'Ǐ0A 2'E /\5'dgŶ+ܤ D% i z*):3 2/OG!Ue U75{*BC>*8 ^Sb `7b1VA 2TPۥ@X>SìK*hŃL4NԴ*f)'N)7/(Ձz)*-lMoYY )#f4ٺY M;Q'QLeL5 `QP:^We+XRZe#+e(]ZO&q䏪&stxIynIJ22w9#SƽГ j9 2 2A 2} *h@2GOV c*Vi'-Υͫ $7vshڃ !llZs6X$ͅRVR稌&WlNUpZx,uR.Z8Tpq'NrD*$*cα>GVf?ZG3zVA&SA U#ZU,U0Zu&S|ڶ &X%).G"1+j%]5bĚ9SH@bqQ Zfϊ0M;kAM']d E><ZӰ&Y /<*̹XKVJhe/i4ݝ`ɥTL$RhE.'\UĊYoΔQz%(30n >:="*j!}V1*$N#UeRy%'V :*#h?i뷃cGeϺ}sP&R0C*hB7=۩ U#ZinDvI۩TH]NW`ʴiꠂEc^%l9b>,UpA)9J.$b &%Z;8h,6֩~}.\x sP%& Hf{%NMe ;N u{笨uP fpUܨM[C U+ZbLg6%!S,Ri`+T&2Ei>T!P@&DHK'Mh̥Lh>'sږNb >KdrZKw)&S:ZE|xBe;(LD``n*xχo ~'ČVAhܲ=SA mx$tj\6i9!$'6PqT*y nU0t !os2z"{S$αVS%נUϺsfP"mЫ ߨۨ߆a.*4 gE`g6VS H [P+]T=7bUR5l 5/"*-Cy8c${SJz'ɮƯ=ىpim!6eoo<7~i~rT ?OZizO@⍇'S@XKkv-n^TgkۗI^Yܾ6*.EqI344ۿO]wv޿:/sEih]T%(-߽{-=,v۵$ VAi"88-@gL ZQvQ&ZmiZc\h)࢘v1!#} st@mU7糰s,@R`ۜql rwxwM[n."0f 6UsNYqzevgv!܎AQw[ r<{7DtR`w)~=C4shoT Dϐ)QI e ƿNW`XN()>X[ek 4RJ f촺%G(kI$:cɟ{ uÂBn+춻Aڇuj|9M#YALDs`Ѫ9r@aB9jՇ&O4*D)2 L)(%8I=C'J(~`A*FJ#eܤEci:<&G##iLJ z~k!L$˄H^ 
^p!|'9 JIZTh *96]% 6BFb]%\\KJ&d#*`.´<_'"pER>;=^ qDHETSP)Rj)(rA휛`N˰d7kDQ DY%ºwe+W>rR jLȻLj#I2'#hX<6S%d$z"*]!zg䤉Spg,QuR-)@p`K/X&qu*BΟ=)TJ)5]Ng*6&Ǖ&(ʖn]<;YhyߩIfƑ1(a,xUq*-/MIb#(H ~ C<&U!<# PJ+)X C}#=:)N<"6_4y){րk2 5U?J)4bn}ӱӲGj|[ :\vaQHMʺa)\ %34S#)Hcٗ楠Ha(q?esGSP)ԠRPIG&֌ֱ!e|Aönll4䴷Zvw4;1דRxk$blk=bS3pт=B ׭s)8F}oڄGSP)R$jWZ'=QAn7"=-S@'F?;,#kzk .gO m3U T{&]22WD3H\ t ;v1VTJ)5U(Yn,9u&C8MN,[+e/zQea+ؖ}Ф;k%1D_Sp)ؙ g^\c/iT `أgRPn5DR!k75U?J)58e.^5]1cđ4 {- WwB75  ףQyan5a7<v/)"~)HC M8=˿,_@z6i!YTh *c#8+ #36;?¶9d:6M Xob%Eg 8Y 4UThΜAp E;?^5aѤ[{Dt6)2 g<,T}i ?1j@aR-Y2lri {TYLH'!wIP+6qf$<$T3ouY=n'6 o|iC]rוdȻ#ȗTW>C ]|ՅlJNA[~O& bD #?ZrE)"'۶[9F[ ("SOT'5;m9 GRiRLkRSpM Yk @ GƤ`&2;*N|@ Jb2LAb0v@.l N1(0%9 Ԃ27E ʾ5)֖ZPfSN b)Y%8yA} Z9/(Ҵ !,EB & X} jke>UJ{@#'])Ρc}HW' 87?`aјBOlдZ5 czhL@f L_};}/ehUL&`*rؔm45r  Yk/C Lw3R38/S)ir}6T !#a3)xkX "aT HAn1ƻ+ŋW)ݝ$wRptѴ4hkl +A ^-A t~t1j B! e#̧)V=w 6̼-uSn9XHA@nHAqMCdC?鯲9`QS00RnB 7 8AS<@HAR)0[X4!& hO1Ũ{-C)~!/ELz+a?E\4vfe}᯳,lHCĢ{K`)8R`,i2)F»<,d<8vO8n{t:8'O ǗfqIdaq兮*k{J?j_7<`A`JiK{(xWAw_ƃ`Nzrh+Uޚ(RKQ/Q8Mi~,ߚٙ\~S0 [V va<aRN0+>nMz1 dL 1~ [MOS_73UpZ5+ƌ?ѼE\E41, ="lA0*c1f&p. ͙F&bB‰4bʀ%nJ"VP/$`^gnqCX療Jrs:$l(܍YM\$K X4(߂`@VnWaX j4/'WG5AC#cq%Ã޿ρdzGw# GGP.[Dt bb,¥D. 9$("11[.&p?I-{q[&a.3vN6>/!p hC7ǽZ TR\ĸ?.ؾ 1HL{6%TPr Z5 Rd4+W"G!m_.RAE18t/]#w>FОXӑw3oYRq¾.馕F('T0' NG8cD䚵P2L*'lR@^FzP`Edy>%Lpr4dH;ccO1TݶD^ߠUPJ^LЧs+;yt~7'o59uooǧPx <]P2A~Q ; 7*ZմGkys{.F4La_Zdtd;f1+2ilwn$0qEӠ+b|}d9.nI_W8 j4WVVZ˝rkDK,y :Hy~*h"@6q.m#LӃ"xBCowN`t0͂_6ik&{t P[՚' 0M,굂UP"u߹2]Ú+ u*Svt~XwQ7-bCvI0Iq xwn &>K8Z{I`޵Z؆sa=>Uʏ\u?I& F*~Q!c~X]2Rv-I"Y/u(G` c|^YM "!74_X'TyTBtqYUhfRmo\cM3R'{x j8-VK*\>d[i4o'WU_Qu޳[AJ=]ۆZ!#I5R(n[cEqUTxe┼4y vƉ;˽ &Hd΀ܦC 3 ,`HJ{ʭo^ѶT#UД,ۊn{kv-vJJV0F]šu'cF u@T#>~+g&YW+:@#@ S$\XܢCd2x_QAIqa2  ؠBs)Lwaf,M6Cl26Q "2ݔPbo(lU`VM gȧ2G&fSL!y.P}W5ޔ]EYm6$ *c }Tl'YrZlsNFyc*hWQm:oD.]Oqkh0%mr 9GE1PPC*È[B HK̔1O)n"3MU w(v<: C(T2#N1UPv,mnGVNg341ӊLsVjd9LH N`~Kf ci%+N@CٽX. 
?ƮVK*h47ҏi:XwZXKl 6?Nē9!1?<(&Ya#a@ff8|ĭgqmc l -0b nhq8%lҿo/G?K'm-*8qpHj HE+&.:aӖ5a+V- ^VAFѼ!~owgx-n89~Y|o7 @"BDgHBa$o"2.p4OݛBbI't2ۆ|5#;hLsgL͝ v)!E ~ TA/N`< g 2N&zѼoHuhV&QVq= Xk'&>ew`* Ք,[P8IM4Vh(§ сk0oK+3HiFy#*\d0 SB@(*¿"[toq~Cki`ԕ-)q\DJ30uN=5`D DR$ C"@!wvS& .{ * ?xn:_9)OA.Hatم0gpƣ'py-ǺsIre2SwH=lg޽>y4RnR0~x()vh d &`9[ρ|SJ֠;uzc.za6%V Y+s0\6cxUUĠٕ[pyV 3u ߟbcq^'~2hOZ)}$0"Ipd t`4OJ9HAfTk}^ PPRQ%vC\WgNPө,Jv 虂l&-E[U7H I1 A:eN8q0s˺6.i% EMDLjzBt% R2P9}<8m7%w i#}vnR+1Ig*uK tS0W:?h?a[ RZ% bv+q(~S WY68=wuOA%L7NND0!9Nv0PI ϸg 2',tv󩱃e|~$Pe\DL 8@k6X*m`5nx7ꖂR0Xe=!5Ǻ, :Rо!3m]N`-mM1PfK3ԊA6vOJc(q !Jg}vn>cpx ƾ žĒP#|vK{W)Ō/y)xh0n NS`ٝL@3X;?yo)xvo~:sH|m{ 2TH9TR!D> IY$+HtM+b=4BTdNVU gz±.u0S kDjm]n)E%Xma>3,`q>]u".b#Kjx/u?]R!>5t9 ޯ]L}[rK;.)Wȱ?CS0*[Bp뷒7ͽ-{/Η$OcݐR-<8N#ԛM;LX (B90"HA UyP9BJ擣 LJ NbK Jjr,A-⎏ aVf6`|ܵ1&fl[>'_c LA&a $@o!NhĀJ_d" Dc!n vU>nҔ|uu,Cڸf*xvQg=i >Ct)-G@9.8M||;} aOS ߬<&<2t=`0*pw$Lpq+#:zc'UAA 6E 2%ILȔQTAȓI]a4LTONw$&%"6SV#eM 4.=|91k).=!GI=x#*gi {er=$)U)K]&[wA]28l^G/WA6|}@=S,Nӂ03#WJºh` ﴴhԇ@i-L}A(W8[tu@2ܝ>Q zEGLbۮƥv~-~]'im^*Ag܏ ^U$LzGdG/>ڥ/F]$LSvk‹إs*,hH392KU6L\DH̏LD;[5(@N(' >Y[tOSM j }v_[c@[A8FF]+8tׯ\\h" (>&!OW"/RA 5U1KMI[|Laxrح3LWA]*8K&In&*X&7Z L|&l&0[]!5i! L9_-LXF4SH Swg$/GXG"C1MUBmdgLCԩ .=|9B+IO˝Tиxeox` &c\&O`$ݽCvy 5Oc~wZ1F& &͗|ɮB`C,d>nY+Dzo1a '&MJjv3G[)͋j[xq.K\RXWeg_rZ1C[`~W;bekm3E;G,^FF<3RVWZ7tаnLXa USA>*y |5 :Vnzg@jTA (%M뷒 5[ ByC@H9c/REV0N۔; gaIeVy59a1)A;>l6W̯HI?R{C7UI(:-Y.`^*+Ĩq4vwk3KM. mxTV˜8mvI+.[hyȯ߭TA# >C;C~w-8ߋGNA#0?By0E.4 ]D"9%:Ʈ: T@4zE@ⓧ:T N$z}($WA]zrDb 6).N3);K(h5VA4-2% 6QA13zFi=_)W-dqb.Ei1u+}MUpX/ QkCLyx壟Gpk؃*Ak hjkLԤs*+BOݾ"x6UЀ]޾}xb+k(K`:_@xB]pz= ['4' 6WAy5ܚVٿ%{?2k$T#tNnSLѣ Հ;fHH!%D=6ERq"EiSoq-rb%qȺº4U3UorX"S&WÕGtyv*PVJ(Y9v2FLsT0]R2zU_g{;g\'rxHb"F^plk^4 . 
φ LPlj*s]b{۳X1 g_'#듿Nm}fVN&.?^L&kQQ\W] WA3,8\WUp#XG@rHpv맵qwoPIdD{{8R]XQ\O 2$ Q!mBbkJJ*>{jSAX/f*nVPaif^S$(6cqh`l I 㒒1ڧF4rvp!慹}ڰaRCe?kVЙS&E&{Yt]9 |2qgeqHp)H "sA˶MSV.`sUЈiFA2f3SAs*x!Ñ>-IEP~BQXtaZwҏI*yH OnX :āRQ~cj^~%V!Lj-'X ^‚ł),eEQۓU2lA^c2*^RPV"Z2,RnzŃN \Ss2PDev.xch3QxĥWsMf=bi6AHh'M4{WAu>.1ҤN*O`O Eu jqoلDl E"q6kWsH;kڅF|!|9lpǏ[.` t rS녶۳W "a#~t(<\qʑы O8 u?\}H~ 4HE o8y kKĀN d){ryT>#PTd-*h\YQ0Lf %!b$mBQNxJg4WAqV5VM"[SR2I@ Õ֨TuH>a\]]fݫ+:K{.xH=']5;;gI2Iǰ-I,Ꝫ͵F<:_X f]ZIۅ*7 .u2dpʂj݊?X+ 5_ nm.W76&F*Q{[E{4D @ҵ0>U4ToX*Id^xr|oJSB vJ@ƅΆ-wl&&!*, %6oxDԬQo&[X/_+hLLTT݂ vDV@EҦ!tqK*w3^͜ Ux]2idr *Le2wE$j]Jy;kQنմBz5Qg@ARDf L^Ӎ!d4+vxEv3ڮ 򻴑c: VP9V7\Xc VgfA YnD ʸN"rbaXsƘ*hb8pDnfPA-'l?ˍTspv{-0~Ŵ>H ms0EP'XeMYvjrXBRd_h+}?MJԔ<0"a=P* c[]k܊Nn _HNsVWƘ*hbhp<3&h@v vu U.q7DkDBK+x a INM+"qBr׵I0 Էdկg14C˯Ʃav_hSxu U55zO%ƶKKAEtYlF/^t#2\oWY}*RcyaBsI|c!By+ eԋÿQSP RBjS?&&Ƒ!mMȃT ȿZph.Lk _3~B @h{G7~"YLH07:-)0e*Fe%n`7vXeQĠ"ax-mj*fbG+@+MUX`_YS`uPVsNJ> UQ eY],Y! !\+he K|q/Y8a"WCۃ3WU[ 7AnfZjS?&&&&& ^?;HH_Q! @ +`z!Gқ$ƧWg'5?Bmr"hy%/K/RRTbûAlwSQ'׳ IԘ;hpeh[UzhEB$9b>{FJJ< 3rjE.kaH ZU`[]A8{#4E^;1u0(U /:[}:R9X AbTYt?_-Aj]TB"}¡M4TA^7qP=@S¼˸s8 yNCNS%A~'nڗ@˖,EkpMt ?vcږƯX2,(FO63-yIgU<ݽ,unɵޛUИ+}X-9_W34ަğ;SAW O.[%{SS`?Ґ"|D={e' =!-5};m0_(ʈgKp^ {VnBR+R*Zl'Ege)hIo zrIx(YwrrR h)&*(2ղ~G;/')L^>KDHNp=)HA UqU_ 5/IY7#;~xTjDNBA .ˋ#:)$6qU= n  RpQ^ݏ^x)8F )Дj|D ))ΊHMYJ0Wm5D4ND]hKޭrrRJ0Tk[Ρ ⧠)`M|<CHA A6 ' ` w6Fa JwvZ`w6RC x\ MGw/!iwb]߹휀]YN epd"Y-EF%[+MCdz3$ t8U4R!1X@/> OJqTB k#u v)] Tɕ\G| v,{*ArT"@4$ ' "k /l-k:!{HL(CIbNh&*n&ACN-rU,=hfya2 О=T[Ul!KOu*<kSѰwIBo .T+ Tã[TPIg*h bͲg9ZhDW2o ԉ8@`]Lq⢂T&#XYN<8ONK]ܭ3t -5#kCw<(7&&,ʺ*-*3T` \$%AdH9Eo|PYT.:5u⦂2WTArLc즩5T,m zpYPeߒ1\Q bQ IA 5h%Z~TI%SY窨f%!aV*3~]x~8Wl,*7^/(\Jr0uj ӸoSjE8E.<0.*{dqi"V 6 YfRu4aż8.qwVC y]4>=\%Dk^"x/ ͎;+ãBM86Ƙm8"4Ђ&/쬼 yU ZM/|1?^C UP7`Б@+=!^5R-_XьXeU.$'ٖ\b$9501tȺ q;1-᪣'RKN"xU!Y*-*QCv*XB0L~#h-6 Cļ8mvb-V$uL)#В_WLa"$ĩɺoA64k )+eRE~ ?wW[q$GAI&_W{.Zn0ۨBmh婥v(c"gkxP#=nV{?ݵ*HiRAŌ/ 8DNw}Sυ%g10h_Ya$Eh`~# .F(qʙZT(UjHjCT Qh=jQAaP棃;;!xHUۍwB^Mqfq*6ֿiaU:O~c\W\^ MrФw0^bNF*pQFD 
Yc8bƫ9kH^@<9#kXJR+ƒG"NN\T05 aUřW @ +3Ǚ?+xS`ִn| 793(4a~?y5!lqيļ:>8%у4GE+r$k+$4^xV06&qNqQA__) oPx*IzM$Ak_F n@ȲBPy-h8\A+P SlBa~<>}C~)u:*8F4-^6)J4#NiESUQB$*@!KH?n]PRRq6oUO9I_30ؼwupWXⲢ,_'< Z,6Di3͋jq-Z y5#N*>>µ%$0ŧTp"M2a &h`i`@ h ώ|S[a0TYЧ1PبtTRTK<#=< J&@@2S̘'NU0e*P%A B1!b@ s< *h%̧gՋOȎCYAù*flUj=C W\VYȭKA>줂#(W 8ʞk҈"q:ٍM UPrꨂpPe08Of=b(kUaVS<*8$ ?h{N{'WUA<>&0w`$qp#=1}D4鬠A#L"s;+谋*Ne5Yuz*RZTp K`C}xJ "FhI\mqK'kH頂# .6r"lWԙ0Dx,4^ tVt3ĽQGmI={פv*x40<*⏍<V^[Tǿů L_'#Z2J Dj\dhE@) QDѥQ 6D<5&\B[ƤK}hҴ}/_ФٝˍTf\nv38PBDO_}|hUP `(A1ec<nwK b+he*}i6fdvJXuU,TBBJٸ԰tfTP}XdT+" ]R nsKЀ &1e/N!Ҵ ZŔx(9izH U̡1%a9RA6zr(@UGۇQA&.Fߗnd@}\..S/jN+'QaeJ~M@xt*H +b}Sh`A}>0_x~45V zxɅB\iՒ6vHG>@.X:J_|fmLx7~TP/ NP P|o$7I*x Hlc(YtF& ^9ۀ Σ *imTT kM1q1*rϼpӭCep*|#Vͩ8Z U_#ݢ, PG9vgI_ c{a<\7Al 1 !@3Q,oǡ2Bx36T<6}SJ3mB(zȗBl~?/}f sbYϝ/񅚠DVP[ Yd{$ c9鶂g0^nYvJPu]%A(va9Rcxi_Wg4TPAC |}T~4 :6  }=w~8oi+ D}nj{RAt]{5`WguWoWq v #N?߃ 8 l:UQAVq+5fnRA/NXn!4'*>0Vê?bۑo(:]),2AW9N0YA+ж-l"OʍuZׇ6:(y){0G*뻽꾉aoa؛bnM>Ux.&V > \,7L8qUX>윒k`-r f)Eďad7:Ѽ]vr`J%Nwipj,P}8٥bV@(Hqt䦺˜&kV}N: 63Q!/pS. 65A}* ~3ϫ3cP'-/C4NCB}EDm[EHM Aw&BXv&'̹!y{3Mg! 
3,>=o9o.!V_s 5A3)3y˽qÈ]s#[1zKK<T=⚳G}R bWG[#aʹztRM☦>>ɗMYpT?J@#RAWv?agk,WA9ylxG- VjXFn4*l+%5kۚ'rI6nJ}BD.M=Z~[./ >Lp_m<}TPB+2ϨUnd*L\2, XR)yEm;<EWD룂&TVXxom[wyI>QQyd^'r} Vn1/ԅ Cϱ#`7\_(S7)U`tSm6A@iEXyX]^N{01&aDkYJ_xuuWsTyt7ݩ-j+覞@qϔb|B+jL##s: VRB*vk)nfgSa_hSl0łMd]̽Vxlٟ݁oY}]Ա o%[x>{l_[b`wAG )̔|*B[$itDpzuS&4RsԪ@lFnL+F.(rݣu8C4 })8gdJ * ^TЅ&NA"N,3,ӫY5NG/]]Q$Yu0bWmSS ǩ@±'|2 ?0|*4ʼn\yD-mY2 O/6E s錑z(P0f{_OOX_xK^>(EIL})LPǚG`A9k[iCaZmYa6.VZLpcA_ZA*1K MRA Bl>TopưElO5[]qcA d#y\_BA S>Ú ?!& {Gf9x&Pϣ l i}4՛'{8姝Xc'/X{ w(@O, V3zSAz;n\җĠ ZLP!bSA*p5걑~S*P(R+D{Ao(v,!(\mMDXp͋L7vهvO R0FURFt,oe IJ4s7jw4.ۯB` [Os,W}AO3(gW'urO"$.ۥѧt*XB(٤AFD'yV>f5SH O,K;@4I?_)T*P({R2<ڣ %,(Nd *(L.%}݌4HǨ=NO3XM]Is"]5;R}3zZoAf#]f D{+.{/`褀A*-QF>GTpTPBP(E(ׇNm@3z^?lVtV[pC(%ҖN^S‘2NP#zBlX0J PZHW]RxYO={ CVE@>mfOX%ҽG)uۃؼ#n#x RA{7BSGͶP4wR%k7ؠCX9-#_HS*HxtG *T( }W<:Yo5Т6)?)K6ްKP *8`LHS}ar\ &Ra=/$f"vvŲW"5W]iԃCUXn/zܡ,*:xuo3e`^ARQ&!{AATPHf72T]ޟUAbbJ * T} .sݭvKzniTKv٥e%5;:|s%Yؕ&y|y[4l8ύb$P*k:ICPfW˘3m褄v|zuxiGy*R7YBeW^$:d K Sw>. R6"]INI ; j"u8 eizV'|t_ *T( MAO۴Y]$u5 Jz: Q:Y.{/gUp`JW}^̲go+ˇ"V LV邹y5y3:}(iZ蔎iM\K d8~e>#{|7~a'?fg=`W.3PGpDӚ<4e_+C_ D$ێs4?;Ǝ *T(w7ia ? 1=m*1h]Q[/D4VI C0-$_!P"-[CW~|/_X,=<Ϭ3CQP>7b77|98qw]YCН͛"whpK8A|Nd"mPa#|.bEL}5ɍ@HQ~{%h6QL6|;Ry\tS ]acB"^:lDT-@#bkz4v*ݗD( #d:M.VBITdcfF,C}*P8(A-.q\ѠA!"i|"hߏ.֥W?[6&٘rFLxC_ro,MtFdc@Dݯ OqM $ِgKANLFzWٸ7J_ }/KwQ {S٨Oz(؅u~|ثz앂.7CY~yy+A7t\.nh1EAB!vFAnfLUmWM>и峰i?ZnGN( By(xŏSs Jj}Y8(Hˮ0A $耘Znoۓi/p2tQ)xQ);8HA+tv ^C'/>8HA+s"1y}'ä ]Z*ξ ) `SH+3y}C$ap(L*1ȾcR eM FNT&ہb5\ c?sNq I}?86 kSpj^:y'(XSuA5#'5t !")K R  @,)HAXR bIA@ Ē[(RA@TPA-T`KRA@TPA-T`KRA@TPA-T`KRA@TPA-T`KRA@TPA-T`KRA@TPA-T`K:o=E * l [* * l [* * l [*;FGq10Ӵۉ$آ6!J ]I"tEɪh7.t)d,\!g]$!" 
D$"2ֿIAgP$"2ʟ`;uYX (G)ug,]OSKMe^,sǏJPQcu6ؿHDLA""%Eu{>2-XBC"=2TkI:W^v|TZ)HDdRPylWRPkw0DDx JE7-K1A9{umrUQA n0̹RPbn#ڗ%J=ͧ/CBc"')HDd$SPAӵi[hWOB]َЩ@?!+ Q؍HA`OBc')HDdDS0Xvҵn18HYx}_Rp;,O1K>JO|\bSD@F>Rp^64&z"DDFI4pr\="^qm$s\X,b rvwգFS5\kl-rgR 99Cb %,"3m` ,j 3ɽ"UoG]"@Rp c')HDiI% 8~/\ 2rAURE =^V^LPiB }>y1YsޭZ(3M3\ mcgkzMk.#X*^ ?21Z) X1L$"2LS;|*}+hRPRR"8Oх`-c^JnHPLgdߘLA2S(3M&2"讌$tVuac3;ZyM }6*sU/sؒFϑ$ݦc&2`LA""2S@Nd v;8' HA%'N=5 }F:΋c c&2`LA""0.H}(Ur)]})Q JAF;0dLA""*=X^Rg9ur۷*i8`]uR9(Qo)>_BvelLD )HDdY`X,By!s  D΀w=iRM$p$}vG)hy}: })誉5` ߘLA2S(HAiud)Ԁx_8Tw{p<%яLrYsf5~y\x^pOA͵ Q|c&2`LA""LӻOpT4+`V̑Y2=D"12^'2S(LA"b )HDLA"n WoFH),T KRA@TPA,T KRA@TPA,T k. ph0 ؚ î #+HA Ƥ                      uYEaxgbht NS 8!88/{eP\ kHK R -)HA ҒHK R -)HA ҒHK R -)HA ҒHK R -)HA ҒHK R -)HAê:=lǨz})@ u =9yZcnr7Û^z~>XiN8)ntͫ)j:FpYy8[.{ 65O1fRe|5O 7`< 43M5Ú-S5xTߕۥt͛/^ػF8 _I[$j1D&;Da\X\(̅ ^ hN'ٞ=dۮK*]} )xzzڟ!k[)~bol%)(H˟LbS0 mk)C BH !b (6,xyg5 :Q*Šm &)Mm5XCy OTry̙9J+Cp, ޟWvF{fws[gchPtj~ O/?]ɽS-6'p}٬yOftF˕xR%:J;\QbB,$Bt3oRWb}p)v tx (4օbcE5e5AY Y7$#,O*> `Fy~VjlOP}[rR0L'$Bt-%8,ٳAhR0a)u>)8 4mM7gF3m$[=+-xڥ`xX& >IlM H=!8o1]O s5OlG!B`衱`^ a)mV&@ F`UlZ>beXsvb7gaL*ך6AߠI&I~AmbƢa_O&љkҖxJ,&x&6zG?^쐂ybBRP!D7R0=m)yПk`YE.VAmS0SZLw~K#io ddy6w4]) ;de:q/L*i0i4<Ӆ}k"uSp!$Bt!$w)5ao?lefA)ԥk3?%l0^G!"Dg˵?pHlBH !O`i^*6D2aU҆IBRP!~gY C!!+MmAokj . -`WQAqI_tpSiI n8ϳ3}ǡ)xGnZQezLQ+wNk]-w V n>_8p`DӳQՌY/ǿHA[x%*nV1ډY)j)Ы"l)xrd)@ Ptr 9;(5cjo^ٝa R ):M $E R YRɒH @ %),)HAdIA@ $K R YRɒHSpHʴSjpuRʤy1J7Hy Io1ƣb`5F[Pou- IA?g1jif;:^A nzd sNԎ ͽ-/a/]\X^HA~s,jq]DB "4&Sl.npA|t(7w*5tTzt4L+b-~@9ͣ)^R1U%`Jv/#UF Sj6{vC'dSjyb**sIaYr'i2H潓xĻ@HA)SoV"cıP34إ/m uUpIJ`9&<$/>w! 
,RS]-3,{6en:-]I'[i摶גʼq2)8|~-?^Eze\^ )8Y]S t_OAif!)R:a t)B  HA t)"Y R:@ @gHA,Rn0Q^%\.溵;0${0XAXgWp x+ V  @,+XAXV `bYA IJe+ V  @[VlsϪK&q#Ɨ9ADDOw*QyݗY)DbgS*m߽_ ` So`_"HrǗSPe4DD/JnU:پSS\PT+Sb ` W| :nY &VN3ӵ;nd^4kL;Nn/z -ƽ.P;k$)h:΃k j?+K,MKTZۙV1QS>^HooK3i6 G .T?-WV[)u'R6iX~tOCL@kr""^#hI$s`hSm@[ZQ~qJ ֕0*TVi $;X3 $"9_|Tqn܁gkD0Nn>PZҗ}ն)(]q%bQ`CxK2}E 7` PۧX`\U&WUaXA6 NՕꧠΛ> LA""|˅=qq.*ehЕ!(I%\MSR0 |"cӖiBѕ%Euk`N֗U +U@@UL O6fc \ -i^LAƢ]\˗C xy-Ol-NZ ` aT&nj5=yk 3jyTSpM]ܟR )HDD#)<J+g2wl _')x([z"yo4LAB .F#OF)1)x϶|^E\RB1So[fb ѼW^5( S/)lHW`"JP2 a q )d1)A3F a4&ϯ08sD,<qwv s eSRp"W륕SL#Ŋ1WsSa <,/v/)HDDۓڰ 5$ s aNA=4VHCS'G`S@K%B)80nāj)c J- CQDD])2`_8)X]$`O"R0)hqͿ . y*S0e'RVSl_0*nD> mK $%_R-Z_OAeov=gx5*E$H1x);rny L i[e) ΟY ^gaRJJDBSl*5S"0hKRpz*b ^ûػvq'6R!"nmHAIJnÏOƕ4|  ߞJBS7A3eDS$i;V>b^v{<Y1NA,NA cy cS1蜂y cy_)Ș7q 2ƘS?s 2I1)ce3>;a7,7iiNNcl6Ia)!4 ?r`S1)W$HtGm'ءG?a\a!0҆6sg&+m5X"nl1)})Z|):L˴cL.)N,t )iU꺆)ݿ4OVC#?(stETX& HHԵ`I0MwHLЙsjLc €1E0 _Db县h`<*LuĘbBZn^+Kk9{銐Ra<:)88cSRoe~]y?~3Xu`n8Q~ڙ ogDS_dVpqػU;oU%'L/ݽRv7)N`)9kLAZSnP2JmV%]V!j@zd4s9~ ŮzqݠGug kkݽpSTJv(EY`03 )Yyf%b-abn`\\%kV0.n4q !){~81)WgKA'{uK>܃e}q|Gbw+wּD,i6ϽiC;)|W46Vh$K1B#G[5E@Dș{qq @?.@9w+fvI+@Jl۳]A{rv,b"d1)*mk) h=zh .QbRQh~x &)T ZEv T!dh3$?LwrCom2eaa1Ʉ0cmYq;1`eE|0H49SpRP[)jLXo!:mqd1)/l)C{0T:Baև``j١z IMyiHDs;4)j(y8SP}=}FjHläw۩ԍZT!eF.SLAي))4k&"oDљלp<_=>& cyJ_ VV˦j@.j(wQߴz M% OeF'Qb撂^ fT<""4@;)BS[)Xh/SHf8S3$]5 usc2y}ӆu|M2NAe#/3 ],H=JhXdU%A'`54ktF$F1"B #ML]O)'DRJ)x ))c(/Դ<})pImtߡ FтOS:]F~NS՗mDSD`+ <CR0k,a V)7NNADz˦`2e%t^m긤%HiZ+OBKЬ[ ~Jy%j_]Sna@X Gᦵp7k'`[mun)eapXɭEX } =у^5hLR)6 8cSL'TGiԁe;S=złJA{z5[^f)CJQ|{+ўqroONA\ q}KFpr"~{VЯ"z0AR]A"qKAԽ?12kDI){ӢDpWE"&>\oe;}VxOSbq{ɔQHA QZWRՃ=5.W*\Cy[1޹`x{<=JEi2O J8T{ݮ敺KAnWp{“W^%[:ɣ e;)+LnNMAi[^~B{W^OJeCOR2 (G)}F:}pIfrvD\Wwo_D݈-t{n.\p<ؑ$W] 4 YwLt睰52 ( S@ϔ,R2嵑OJaHTMp,C-,Ҧ )F12b_Rs,oS{KHA0Ӂ++qHA0sr ` Rp` )~n )8ϋS `"R2s9AR0)F9ùt΅OHA0ʯþG "( RE  X RE  X RE  X\L|!*!(2gAp-(3 }F?X% hˢisOo JHA0)x|)螵dy? 
15L3tgbs1\X$\b3qqIB+Q|y VĻoxP-R pR%PSn} gHA(IA9R @Rpw7{2~ݘڱ?'_J^ݜL~KQGoԟcg+5rSbtD `HA(IHX'Z_1vMH3Lʶ t;g+5rS}MXR e?)bN6jM aG5)ԕ?礠?54*-h说iSfslGr|x= gHA(O\&GԦ.Ud+o ޷)x$krF)ջ: gHA(m͹K .Ũi+Iio )d&}9[ g_BO %S 6nzoMI°!{R&+/QJ2ZxW )_얂t v%eH|V Ν(ۅw)@ސP({JO` kC2L/R)8ӓg"2qRe 3 RPQ jE̹b~J--'QFoW )tdLdzjЎ]qaS̷ܑ|4 BRW7I};Q+س)LujO[#R #kacIA5@bTVȌ6;eIh6`NAh|ґΝ({ ^z_R 0R?ԜlǁeY;ƣnS*ʡvMFsgeJ39C @F MA:"NrŸ,g+윯FP(E 7o "gHA(`'HYT^< OHA( e)HA@Rq+!MAC HA @ [D0 . MߨA`N Yq~  `8 cmMb1]!+B?҃tHHO8O҉^{)%1|,亜Ir@(`v~ELyu#LH$lRJ$\yg)8 `L-p -NA(+~e<%yurg U3 bqEagj; `^9T{*^߉$_>|Aۏ_˧`71ϧrZyAHF`@!L)h>`1UlྺgCKA}a@A< ?Yl 3k`S16ʥY Qm\D~^p#Җ]+cq!?=N&ȒxM3KdHhHFd =OiWq{me 0ښ $ (-[S1v9qEOJB|SZp2; K4QY rF!}y@\1}rbS0N4.-,ӄ GPBkVƘ c &j9~RRewITɌQd ։.>ŊlIOw)bɈhD]K)B D<|<8cllv VS\d^cفߥF+S ~.X5 L쇄TapUJu 6ũ&|'\() O:jsM)c;eSp*r/SqCUי B\+SPB<:)hPs6IhuȎQt ¡CJZS~tJf @@k@Ie21?ZԂ6)>S=!>ئd)x&D`z y.x9_uěJ./g4#҅ɁhL|CAzk`d2P rlM=ZɤIԀVpk&}֘,<$'CRYT&VD j M9cDəd)JFhS. jP9I(1k ×MoPp2yLPJC)Gh*8дDmcQ ,GRR~#fCOAdM"鸈HAmHGໝ@YSM)c;eSb.~r=:?aZhVC!B ja݅nXx_6 c$ץReע lq 2NG q8wv``L$)'87WHA)x~R E nsށ葂ȒȒȒȒȒȒȒȒȒ\?kqtR t*gpwr5p7 3HBAMߝ%%wg@c/ ?EDrK)("JAR RPD$"-(EDrK)E;HJ V*%TVh^;HH V__8<{ $<\w~?_E""SuS"'@)("+Hg "JA@ !JAH` VuX"Tfb\.X3TF*plF|MjSN_%SdYVM)(ɕKKqsr"ŀ݄'M<ֈ^;)X%P J(EDrRp8DN=v\kK{FG 3*im&q(|o;_xl?ZN:+D捆wͦKU*r}U)(ɕKA}M+3qJ+-ϟQǏ^赝B7.E*Q ѥ`Ԁu>? {Ls iEQwqwĢWu{PB]j=8GLK)EDrR8M(7 S wSHYHMw? ٭c"ab,DEX X-=:HQA@TPA,T KRA@TPA,T KRA@TPAU+         Y* @ *}9/<ٱjSq3С3'8 2t QC1ա !S񍼵A 38ӧdhXzqm^ k na<"A P@ c yXr.A gn)8}lR O:FS ̥)@S'ǹ=T )8'Rpgzxrk)HegURR"1NcYRRi @ PT am1HAKAV|@ PT n2qD.)@Q)Ŀ| @ )椻YJ^  קl'If7rrqeHA?Kɬ/UERp}J]7U9)P>}޽ @ |^U+2?}|,)P"SHA@ M R~[Ǩ QEg ZV6.Ҹ1)jBrRۊiN.%) @ZRiIA@ %) @ZRiIA@ %) @ZRiIA@ %) @ZRiIA@ %) @Z>j-ԦR5&>ʰt.+`)q(6f)R)x>OCݹ̮mi)Sp{]+n;.wY `)kqަq ٷ0:Z^\̢K0rv@•XABtGyh< ?8t9zc)RlޞۈΡC\ mJ747g)HAH1G͎!n7d @ PB i㡾oR0)@ )8a?)P @YReIA@ /O_'J闂Kf.M /?[IAR 04)PJ-IAR ʒ(s  M 5y{ `dR~ٻc ;& CS rC쀀.(v`$KvZFZ@)]S}M2=<9MiIAR'=. 
[Binary image data omitted — tar archive entries:
 pydantic-2.10.6/docs/img/logfire_span.png
 pydantic-2.10.6/docs/img/rich_pydantic.png
 pydantic-2.10.6/docs/img/vs_code_02.png]
7v>iYt+7_Hf۾Y\t;axj9%CZF}OQ_YM+J2mk6r@yƚߴn&=ʃDd};M|o\鏪XĸPaNJ9Fq/II=""7󶕑AlA@6`1}}-uSOƾ2Ѻ(!_r#uV.9kY/oRsk&5L|Yctcwe5XZ;([w@':նa^=D9*}lDtVw&I*}լw$> b ^}׭9ٚSXGM[5Bݿ۔gI􇤋uPF@_U~9nlbyPS#=: CjCvjU^,?;9Yѵ+s?/ +*bNNW,odOhBQrRwkGBK{뤾J iW·GDlDUY(% ^ ߐމcD}:<:KX79FQ$w\ڈ!mMsL_9)xS6I2r˳lsP֕ Dε%Ji5٪쪧)"G 7@D6a3)]꺝yuUh(Q!wx'hXFg8gW"57WvPkC *UxmQ=7S˝LfыʈU7r6B5Ƿ톍7&|gDdael33,7+JRa ǚ^'I;\fZ]1UHGy$) }A"2]gxU"XFF$'w&:kjHWA^T9JD,&Rr 6"rmbxK+>7G1'^Ӓ :9^WaA=haRo^4)IBM1%,N-6 "qrk/]l,qoOmK߼Woc>QqևTxpMԝћΩ%^StV}I*o`\t܍x_Q]Q^yʹ*Wl־Cr7_Iy?gg#Ztz82؃}gDSUē+<9oyiBM,*Px_[7SVi׈9Y/#"^ay_J\\tK5_벬gOuNr#dC\&YaF1UmH(w=F;;r_Kճ|ݜyK_,ݞ2~rJ\#8vq : /pp3-b̹//U&}7bę,k\VWsjb#5V$'Sh(n塟" ɐxd.޸2iDIT?v=]Ʋ$VsD5 `IRp aOfq&?KY(D":ܘݾKD2Fo Kw68Zaj~)GYeUؗhq#r筽NjnCb_9H!/ut롇c^BКf.$t) Ivq&DT^xhݪ+ UGUMt|va'3/i'o7rF7WJM+wV œ7T?rOMvnd:Tڴ$)D8?KRj]|Rkk>#fXX:gjiz:dA0'S*YE㻙$B@ .))36-Ee6bYC\]ߩ =./֬k+b%;Ϙ{Z*e8EDUx6E\"_QQUUQQvcYGdn''D[o  u}ئjkM]{3 &cj_}/*=_eYiQtjS*my㍸J7IZdKԇRw^[-zwj /FP{gW/^]ra>=خLy^s~""Nn*Qն^Y~Ԉ \~5Mwߠ3z`5eVUw5&˜8ŝevds&3f̟$a *EңQSߙVj[G<""Qr"I И,}I'0/RN9aR~ r*p*ܯldui%9_XXڐ.%$I aRS-*;S\;][$-O/-.ͻf6Rp9Zc'D7Ω{ZP@Tp𿟯>ϧ=tC{wwTZv5SiuȐ 㔪dT@T1+e c>L*4 g$n8DU?g ?X[wk2!*@dmBÃ|]qXBinJ6ͮjqBǸ#^^myJ:Rx>ⅿ fٌ )ZTCo(ܧOriQk,;wզ5%igؤ&%Z\Sq-'M֢).(_v%o_7/2Tc wXpppIIADiiR]~wO.l:c꿾| (DU@T@TDUDUp3> E@TKZQe' pWp}B@&N}A1( MwrUDU{ CM׽[Gˆ 3w\uzxK{m䰌apCmUwWPػ"}]x-՟OX{'4N|{h/\~K E]Oz`{\mZ/JI D$Ϟ鑂)kJʖ%z=O XtLȾmVqjaJDU ""[sY+QݟJ$'\+'zE4e?۠49t/j.џjlioO-ȩ83`v8;:H#'\d>U'lp&V퟽Q#ڈ!Y&L5>xli,-HdW݃ǂ{l3f>1Կ^ިۡgԜmDQSt%ՍF}f/E4ܶ:z{&kN"eU 1ķ_Hd Ga.6tG])yW6'm!Þ]\ԂU64wtPW_G"5%z+CQ=?V;:F٪TN>dBYឞU3泬bNmDU_mLm1cUl8]fl*r^pֵ-R64Y&*zğuy:{~rn@6/Or Urr29TRWr)CnR\¾x%S0aK^2={ |UY \>/F [^Qn!Hv=_aP_+yqf]3]m \Άr>:~>6Jbh3s !QF{4*L>cZ(ݑ͗llE?S+FuM-:{#_WKNdbӒ|FVuMxux:P;9r񐲡ɥ]D>Kg9ʝ/Xui!}.`L:a\-@*x5ns_ݿ wﶞg<-r*miUwMU.]AvmQ}:<~ۥ9=UQwBBOhSy%[l#]yŅ<'ɗ*vdҋ6KD|DA-eەk8zkGaXSz rl"ğw?bQ~ ڈGT'ґw3z'MEorw 7j݂[;v*-n6®.ow<3[m$vSj~6M ]'-8ېMrKG}]=,EO 9βB@آm|Pp M5luc>[$9F3' h+;l9lstP8qL=U{/< yWW/ Vr3&[,HE>7i4/] "uy?͠6VKOQRޭ Y->orl!YDF$){N?P'[w~2+JW>:vs=<-6|egL9ߎYKOv ײwrw+N|3g j=L;Kz0$TىUX 
U^P>0'u´*@T@TDUDUp¨*  **   **   **  *#\hIENDB`pydantic-2.10.6/docs/img/vs_code_03.png000066400000000000000000000655251474456633400176060ustar00rootroot00000000000000PNG  IHDRgq]sBITOtEXtSoftwareShutterc IDATxw@OH.@I@+=DpժZC:_[kkjkmvإjp C%G*Pyw=wB B VB!0 BaRA!t{_MҪ29ToU Z8;(#ĕ i֊}'if4[JRq.r,Qu]#&,_`W| 7* \P)Z+( v- A1rI`=Fi3?9f#U%4ϴ+or. }/utˋ|io~e5R+374|oX4q3t{ʽŏNzpaӊAOM1#w_3Pt^p6,~m|ٶ5yf;okN~zJ6aY݊mʎ:o?}Q =0޻~_2S\jC掚A-[J-Զ;5iqBOۮ7 W.V@+Y*rLxo/kyg=aS/ Y.m5+r\[)}&gE7WnL#zMZt\lǏ=B?4j5f;U\RYN@ߖ>RHy\vҟTu5RZZMtW⠼όnV|2BA:' B1vmWV4++-AdYThC#6cݚ: MtQ0v-bofw\Gp %:UV{E&heֆ*d_=bxǵa廉}cA'c:/U}dB4n'<;OG:i^odk؍eʴVZW/ͮq>0y)wGվ"[jsc'6EN7c_VyN]GJ+7+ [~QK"i-|Ŭ˶y"*h<_dž\K9DȘJwlZu=1ek(-*-M iw$H($IJ:{YV5k Gߵ%b X*ߗf{[rվ$`">Lgi*"\Y 2y;'F|ZWβNP6 * Aj*F:Q򥀋G폫j41<\in>:ezᄏTCf5W[nUvu^ni냴bMѕr > )=;DK9ъ[m m&K MqxAjuw1DYE`g!$i;IkK At*l脡bGpjG{p͜5ݫz'F;  D翈31҉{R܎TLb)3vdiY-jg]&x%' y#RS џ[c+Y]gWcgaqu?!^Ujs]<ülwpw,8ϕ!T1LM$v|}"h8)om+P=L`7Q;{'= `(TATR6 1;^`6bsPjah*?pc[DAq ucé^>?OY*}OkF>ϰʛ`3IU4qm d{ywև!t$R{my^$65~Rq>>)57ª)nlh) ɢ7ڿ) C  IfaKqӧD(ޢuݹ*h8T5'ٗ䦍fݑ5X?v`z$8Y _Mx$.֏-HN3[}ۨk4?$)N'=J,4Ї! ?/Υe {4*gAvbr:w_4Г.<+lrh1k\fH7߬BjP:I4.xt4C >2# mBP@˦2(x,@6M_0etu{;7PPoퟺj_w<|.8/^pxdW6eڍuAUw>"VXq1_%[vOi<*e*#ws^@|)_Y"L'|o <۪@7_r!Ūɓy7Ǫ/;pf;h]}b>VuԓO5i?@*e^@Zrw]v#\d\qok["OJ(ߥzƂEgO(aK뿔[7v+A &Um-LŴiTzqQZtTE,|)~s9Bt7^ Bdfu֛+~k'ֽv]^Yǯ`X܉+pB]Z`8{$Eq^qoB]=vZmZp!BuwB!?!BTB!I!B B!&BaRA!¤B!L*!BTB!0 B B!&BaRA!¤B!I!BTB!0 B B!&B!L*!¤B!I!BTB!0 BaRA!&B!L*!¤B!I!B B!0 BaRA!&B!L*!BTB!I!B B!0 BaRA!&BaRA!¤B*KoLYСGg B![ų?9 ׯALl|흭MߖS{+{7 "DzfKfZ ݪmHrl}쏤1CBui^#GP4ʽkt]|Gg[,Jfr@hRVK'{S¤>S87$SZPro$mOb}GM;W^R߱ ! 
_#rI-|%Z5WFD}qOH[P;EFE,'^Spl1ɹ(p֬QT+r.tN09^ G*63q]h_ťoIt&\C!Po%*spW\/渹?ΕE Yklr_0U9)jb{.ȕ'~ً}`B{D i3c6R#39aՆ ~G<~*9Ac(Yxq+RODEe=JW^n?r_=1NK+9t1HJd9mOPa^^-U9ʋ)<;3.e` ʺQ˙U%vMYQ͕>ӈi?F.]M΅46וT؞Tx/=%.^^z(~{6n̈ >kbzG4e+B&`kܫS29Q vY*m1O|!i$WWj.\L@UM}숢Qٷ3gFf pV &j9T/AMb][I뵤Bj5ֶf&у p]O3 $ E^P<RZ҆\mW8b"'_+6ce][N(}^{qKcU++7.Q6m `kصl/_:;y( 7,ʍs-*{~yHӯ[RVJ53N&29o$Vg7>g \X7t"wGly\U:ը&,NZ :;ct˅pg/e{e6X,pݡDn|OVl|]m<)W3MቼiƙN7krVB:L9/戼܄Aޙ !Z) RgЃ_gA1Uy&ٍc*Ĉ%ݼ4GBꖝx9zV.LeEpXOEhϿ~~_A|ka6SL-Ƴ3ۻ+3rjEtw0+~[٤sbpY6tk=9p;9b^M<p}@¼ͤnW㖿0J.2dq㯇-pAqB*])RpX%Wlr|~,vCFۭZysfQjv#x FtGqN c|j2}~hӦQE5r;z(U_E-`*5W7ϫZ)`5Kstڢ?F* ЁN$-5a-~( ?鸹-1nB^D ZK./'tA4}tde񯹽`TG!ԋx~,æ7[@؂ c3_h+Z<.2m aOIL!z==1+jC(OXQd_lU!PoN:)I!&`ymc> p oiޔ\Xe ?Ě^{qKK w;|!7^2A#ܨESq* Io-L^`Rl k^3i|ỏR]|=6>BN*Cs^;̯w|7N7b'm;Θ%a9nsﺦtlߗ'?1Cch2s73.Z1畆u:~VWO֬j‚=^*Xʌ ›KYQ֜UEn> lXR$] <BnO*?&=h}4镓Ø[EV,V9 YF.ڂ!¤axʘ, Mӥ>eB.8o9&C[B!IfWn6(+9+9%ʝnwTn7o"B?rV,XmV;?eQVӂ҂nv=^MQ۔}xժCZotX U!B(5ңEP WjzyC%\O*Ս-x|*Ræx%,,/f̋ڛK6|yi&'a?4+&ÏXdr۵-Z;.ޝKP-B;TSwG~3z7:RvM|/)h)79/ؑ\w&mNϢ_wg?'7n \uyج'g_ƒ?sZ:?t>V^` IDATSlN:--;؋' 4*eB6JPPПRF!!BTB!0 B B!&BaRA!¤B!I/b=<'bn̏{qKK w;|!7^3y\&h`$4 j-0ocsb&0=`ZJ7_xd.LJp X-ivB w+HzLshXVL\Z gʑnE2?f՛YCJúeYWO֬jMroc=yc|ܭ<{䗦 rMGsh㔋 5x!B MX5rB!t[`y@ݪ<76k!&v䷧A!qx-B!0 BaRA!&BWq婧ٸY/6^ d A:iτ,=U$VBҿlLvǃ [nM @E}n7j^z_ʖ|g|p&~BݻI!B|' qJmjvķ4^cW\DHJ֊)tPV>f*):cƣ;4YILBlU=g<.4rMOf@ܨESq* Io-L^`Rl k^3i|ỏR]_p ye`L' ޼~ĀOhVl:fsDBJܔ! 
'oaBm?.C⧦ԮYUC:V#[Ieeҳ뗝8'qK-+qĸ=Zvٚn攢 הݣc*3s"`察j:0(ذc&o#GJrҒKwO/S~վ><B!IDu= C=_d AЗq؄ü}cx7KLicR8<W+!hW<g,yR`8u᜺ F,楙<BnwϚo~vygMi5Hm >oO,܇O+4tTqO/X0䫧|׉,7^u^B*WeEEE?!)"Q{222}1%ߕXnK(\ƶ;kKf )}CDќ@I|* {?NH̸Wg\9kl(5a?tjنJ~*opq2_tJ;/JiyLu穕wN) ^>YaO{\8+!Ƽat뻃txkoTϔpcj+O1e@9?!?9>(Sޱ}YTQ/igٱ-o欱)y=@ǟ^o:L3wnWU[IZs$qRJdXL*Z ScC=,۹FOo.f\1r`:Օŏ8`Xc |b}U%w}R&DQe'5]^|OB|wNMw1 FT;Yn̆`0VܡbнF 9zLK w˗XB]rVI6/)?׵i >(ܗJ*OmXypЂgfÕII ?A㖽5gVev$ U& diOKVK}Cyچ[v5xo'\V6.5CQ[jAxw@@ߎp@ٗOs#"]J@wŝgO&ןn/9 Gxc 6^^`*_i)ِ֣4*Ǎl8_np>'L}dPpU Ϭ4nهs5:f:vھppRu)3FFӭ^}\ysg%!j;U@!U[0fP]ZBѶǩ%*lmt>rz:/JX&ĴT<YZxdQeVZa$VJ]?x`Ћ^-9w4)}C}YТ5]=]2=!ѐC?(QIA<6 ,fe}cRoKXuV7&*GrjpS:iҽߴ_La0!h@'zY8'nA|3/OreFN1L_^ϯ }E>8" =ow^ؤ`{'+XT{y澿:sɳ+fvvf׿Hpϖ6G$7pcs| ?(&֭j9X|r.aƲ#^pZO}^^* &d8qeD'/yo7ɰe+dAL"쪿+);O^օ{ghx\,ij!&.xC'8*)isu-l I}zcgoh`'/>,Ѩ,fBg4v;PTIR*.0]ஹT쇻bT'ACG o@:={זFQ㫥voQܣ9 4T&;oa܃U* +a#lw_j>>Vf'>ѯ wgO.ѸpB҇49P쀩>~{ yAGs$Fn!CF+:]EUzSmv߾z_J&EboQ]Qy٧f"0is`wRR˹;llΰ.Йhm7^nRQ%v0ϥ.W BNՁ~hSM1аmovpNڗ%ۓ]B jQ.:==]:+/bwߪoUTN3Gs$Ow@RY}|X.{E>NJZPj*-$ ]ʼ;_g"Iio*ST߲'wYEII`TɮB>{C?|RRY}y5Czsk$v۾-ho>a](ڢ{49tUeaiN)xxw^c`̕ Nn"ÄT aZeÅm?G.<0z-g%SVHڟC⶯s%4]>Sf]읗WȈ|h%n Je{ ;Uvxö *ը:s^ Ce_+ W8`v?ηtg/).:$6Gq׊8;÷{Nʔr Ccl}wL&+վ0x/7*)+5ч)u[lӝn=uG;oӦIUV^OEP1Ɗn-gO_nP zuݥ2_ȣRڢEzR `V6ݼ@%O ,le *Lh?vQ"tҢR5[ձ&M57ߢ O\m2ljaɑ.gi J~'2jMU0V^,7s<dLgIUXa1>8j:sZ&u]aJ7JFIs{ly#蔮Eit܋i"^{.JmۭZiMOVJ:A%RXoPo8W]@ pۥ 9,"gWi&,wwKATUTT6 aHZAV^(mn5qufVQWz 0ZɢϽo"!Z/ᰚ$ίg n*J*dDk!$I$ $ @ZI A_;iC;}3PxsbpXuʶ]$I`Nڻ|Gsc,ZJIQSG;Ks?_qEEe'u' R}_'m@# '(rH8Q =bI[`TB! 'ֺVtdC! 
[eve%bXȽ@s~{'8-=%ܛp*a8[fm1ٝMnNEp㌒ ˓p XdһS JoHBi ;`䣑"zmjLivi&}{J5uz#UFO_rQiVŢ3fy8,))&)i/{ol+VײSSG?'>/#m~J{OO:ǟ}r݊ $]:q.+!2,\Պ%$^6V.[6L_0 /298}Le*XO*v`0w45KcYaRXG9R`**'~İh;<|ư?:lZo,:3XzH~&gش}_nPԞnvt0X @6޼Yjn9Vwٓ#bV屽{5V;fLu3Α "[$]K^ЮϬ'vȮ E {vi]]g 4m\t]{=jE[KE1EGvf;;|xqwPp߯x^x_7ɫe m;:+ @sK!8\kwj'6/w@?zdQV;gV'%t0]>]%#2%q{ѡAt:@}p"S6ඓfVLfQ_ߦ5(6Ryuf<Br̺s%Bnp7Mdhc`E29'ܛ^Z'܏hK~VpE`ehjZg)k"#*nҮ^~l:gu;j7|z\};ii,:[(pU==`mFrfMz}K TumorX.uo1T,>EQhvpriu7-m0A\)9\swyrIx[vŤ+|Z׀`NQ8ġ=b~Nq؁ֶTNQ [5ʫg}ZN T:f)uwUʘTz+)%?W.5d RFHOK}2-N/Z,8Qc<2=N~?A},HZ)|0,yȃO?9Z9*hD&%lJuEΞjMT/Qi{Cq4l` B&8+%^ҰAB%I/KTɇZUPnaqACOs"RL'5 IDATa~tUe`+% 9,NpjzYRЃ1oZﯯ8_) Ҝk7g6EqQ5nX//8jt=S)_w /s@sa_5||JMc7%#ƛD$u. L3=|/ P\osB&e$,wwpU>=Ǐ/:rgBNku@Mph5LĦFz9\0GةnVgz30MqZ2%K %Թ}L[t?[V~fkO/?].Lji~2H*?qI] %߽ɣZnE[sHnRB҂կl;&F.v,l!&}q6Ej wL$;v8LKNnX-'/ n]~pX V=ѥ[`넋T4OD'=9/_Jd۪զ|VV)\]һdW ] FEUѸOlLĂo'TWBug׭9s>4W-pZ 5U4OVmE_[woxfcd>{i`^=~akԔQW);iSç]&N]_`6j+lcdq1 ݬ)j P. ȦY' $=t*#=e'vYa~efڰchW);ʳ% Fjm}$/Z:ECYYyaN[M2IsuC ֊g7=^RS.V$SJ5mb{) {sţ oY`f=|btǝvDwe%9'+vi%|RG `Y-9jg*pdN^]WonoUM0"& -ZE剜NPa\q]wMKg_\CxLy{͠w'Xg*Ae, t>e&<rX̷ΜSn;Vm2=[x'NƱ2bԄx݁ݜwp$H>(߃AxoOSn7dx|0A0D=+_Dž2"5̛Igx̘N,*ݝ%K eiJKwa_{wE. 
7 fxk^eXiiQj_ʾKG^yVb(r rò.{PQ||vw3yg>3c)pTN5Ʌ-7V߻6= gA,|И8Y+TW876$&FaK섟ߜfD<$]FqƒbW_6=~t+~F;$$oq*G&TQh'gIITl^B#:OD{ft{5$!!JŒR{ NS QG>#~;/ICLouEd_ 쁮{* o3ễoϾFT_5x=Qᶵ < ~8ڟGR߬/[5j4W{bbnB"2T,H<Wn?X1|\rf|ݧXv1ί[9b#69gIʩNG@Ѿ9沽#mA1d{jOv[—-nݏ摓_8rT]~\²qr# E="#|xITTI˨r:fhǺ A;Xϭ$ QD~C)Owm 64Td5{"sXZjC M5N Ւ{o5 =/,96^U)o)2pJmV[/"Fg[>&\ŗ5RڒԾJͻp @R/۾Ed(0#b=c5z-m1#bbѬ@Q!3Q>(8o8Al(ܶ#;רz7q+-1c݃i^.}>󄚂#*Rl>ɷ1VvZDDMBHW/cގ1k"Y#Z& Z\Z#|xmS[Ob+s+cOQށ^"";cL*YcիnvAeJ,JV,iҔ A͠KMV59K^?cS~֟7q5„o1=[3b7%fy'+u|&b}٦ls仧/J%ƎDl*>jFS$^=?%mko5c|bYC{TCG&ϊ6w}΢#N-ߞWtm H7]ғ8bĘȦN>9KZ3'kϘ Xryo{Bg3}FO2̙UӪRX/Ҿ/isڬ!b*zdU'5DD+/.^Zz<Sy4̕r]m@>'f@٥(mܜu,eaCL%yFF!"bm܃Xp$T9̈́9E*[bT1Q\DVE >o|+ OY7*:]W+s"{]ն_m=s~RpáDA|.Pb.+SvmE孪bJcHh߹ oKq5A%,]}#d6ŞQlcɌy[4!1^؉kߜ2558PFiK?y?}3d/LU}iBDęVO;j蹛tI]jsE覩6&>QhOE"a]EI|9eܔo%b/ϐE˔})MoCDW7e6Y2_MM:}'}7kʄ$T-I>pʾ=fzbH*TڔD|" ӧR OhYW52v$8kg Z!P(1v8Q)P+]<wL9`ڤ"Hk\qF6^.Ys:sGDf}!u*[}}|Ee\HqRh֮Yܽ"XOXʪVqx GŻV {NsgQQ+)e gyFg\铪zF`#Iwa$7Z[}/u ‚w_qi`RypUkO@Ҽm-UExi\c|N}nA FϞTsY5ZtxvY^E!I'K!ЕTZGE:mх 7)B5?5iϦ_j8]IJ]?lPx8?\)+ձ0B(얝Ў!›8CٟGFCɬBx[+m%RM+Q&@J41ASf[݇ 73hT(o+3`c#C~M%glmT/IbVUEE#GD|LJ%*>L&rUQ0$ ""mєiY$K3GCGؽxZ\0|Լ/[Cr\?; NWsq'ٟGHxL=q']{T m6J@*%"s,Lݐh,8 bܳ ;\XrF9R&tlm(ìለ/ JơShC=q@#aO#Y-YVAJdgbyNlκQ7]7^%nȢW->0s1k mZ=ym^)j$"MY=Yԧ~񈈣SG<ū}C gݼNآ;6ݮӰJ{H_6FkT2q'$v9Ayjy .^y~t<6}%&0{U?oKWn% }drwO{"﯐0lfCDB s@j(<,zDDc 3E&4}O͕N| ݘWȉ aLSj+GUQ2NdGcz%~G=/޵ho?)ߥk/.ܟp=xD^<#Yg4Ҩgv8y0#{aƶ+Qa w _q|=]ʺ}VѮs|p ᝎ잭[^sSK*V:1YF&W>Z ٭wTC̄u}LՒ1NB٪kİdsib:웷Y>ɽy 9Pzsofڇ'XMyWgVUlI;gV:vtNTV +6le\k[=7}&jYt*,ҳnISə3gƤ\.cU}|뭴"Ū!AJ=d, G~7=C<gv%lMZsM8PmoF43crb6<ٔwd׼/WtF0mfN,OcnwL߳rGqx{\,%n_eڄ UϷ'.Ld =_jyƺ5ڳ2q[ҚԪ͹GLpܳ[oVb,ߕ᪕GLqQI[n/t?HrsT}sQ^.m|MG*۴GU?GЬ8ʹ[ hOA¬;rEc7#"͋8F-zqTraqtU)عcSp߇/:r4&,\O}r{K8[@ \crώjt_ܮ}^JdY$_XT܋rqŅۏ\_l""eX/[yY.ooLk23+c*ǴUIBs瑸}fÃ޶[N0\9;pXui6M/vceRq Ԟe<(Ҭ듫׆n_Vx9ӻHϮ7L#sm+krKBcNu4tS ; k__v9Č064x5{vu.> ӔǕ_Rku ?i6HzOv2=7QE/RH$GpWoL1}Li85?}*$,x|G+) oםrCnݮkDS.˚H[ ~xr ܚQʪѰ&_J\5FߏTlӱlI{v&C#V F}DgYz6Ә',]ӗV͖'O]]UIH[YsضQo m"NyڰR8>6z{\\H[}A~Njm  _HpD$[P4wH%kD$U͎(t勸20%(e+4|aLg'ݫC(uqA{_~ u4+\w!h~ qfޙ LFcH̙ru"$'{ 
bGT,}wsǾtIj(9)rQ!.dT#}pJY^m0e#?o!S- :v94wk=xG&rB=9Pr n:|z,Sg_zFN3K͵_22svXt.mqChϥ`߅)j?C#8xݐ#$ȣEɭ܈\"_[ ϊ=?.ZHѵ[힫Lɲ3ՠ+VRmsk]ٿ%:~KWsΧ_?x]C{$wbH_~.KNߪ?tZ7Ef"⌥;D/sI?^l"HYrDDkkSʓ4eղ/Ϳpzu }8/N<=mr*E$m:LPNf_ IDAT"Ȩ-+2D&)iuy)؈!S(p--y犫 M t{hMAAE?Kԭ#V᧴cS-Y 3󀞌 O;oᩈPj?ƕ{r\z񶆾]뛝[HUS 9,fQmᪿ IUHx7.ܑI%g55:8+q1U3^=QcjyM*Q_t|ޛN^2DS>6QՐw2MO'p\P4չNO셠Hg>ը"ۯfm3&ȣ?wakOjOf(CƲhy1I<{O%Y>!YonyĒqljm`J‘l$ [JoIZ~%#r/⦆IRi ɬ=|DQQ_-&A(y4Pf'ߒWvsDg$lc|?rsN-ft둑A-E&"aw#y35ȉTujk2Xna1Q,1 $aE5W ``'9vmOCO"=||NBD;jCke\GOYv$ѥqm)VlE?yg6m}V'sFΙ_}j>cw zW)G1c:_pnok1Xd5{ol~3f@茉.7wej|)h谎L${t~E;P{_a/݇W)>~>`_g_ZWT?lWou2Gc3OW{j,۷7_ j &Zժ)wj>LʲfŸL9kQhW-t`DVUʻd>ɕV#nwUВ M,0/'^qit4yqց6!CefV沪,JgiqDh1:sU8Kن9tmm(é3glL&.fû$JWy)Կ=ȫEq\*nZ םUOMsH]Ӕ%}ZLu:|OM]G֦?m0rw}On=/*t:T?Lcz:*yVrbHާ?}vslKeT5n0h9F Y] h=xv'es;zMmdNmT5Ds]b''nSUi+Q;OUc"/<{:V}NNSZMܟhZP^2wnŠ .9ؾޡݝ!` PC<%|[UsZntȚwӐͼ{ijq~%r[]ADD^FISU%W:xueg -D)I@D?UkԔo,8@R@R$$@R@R@R$$@R@R@R$$@R@R@R$$@R@R@R$$@R@R@R$$@R@R@R$$@R@R@R$$@R@R@R$$@R@R@R$$@R@R@R$$@R@R@R$$@R@R@R$$@R@R$$$@R@R$$$@R@R$$ﻫ?IENDB`pydantic-2.10.6/docs/img/vs_code_04.png000066400000000000000000001015021474456633400175710ustar00rootroot00000000000000PNG  IHDRKvsBITOtEXtSoftwareShutterc IDATxw@o Fa/ApQGu⦮ZYVu׽[uTEQP@ # 2~$"Z"Vm\޽w;# B} B! B!.!ϣ2&a?izOߺ4&IB~qwT8Yml~JazE7ߺc{S5oLܻ AOt;>%DņB}p(}/sR! }ޔzKR EH&~Dyyt >54Yϝʏ.jwzz]+=bIN7<؝ފɔ]~3B&L$3JJbi#;'iM)b`2^q߸e'na,U&AVԘR度mRT`wBf.DA9i< WؙWD%.?؃ Y 09ΜŀҒY LxNԎ(-=;⩶0k;1YqRVg-S"h1=Ê('.0 5Eޔ@j>t3BQ;OiB؅qy 289;w1 cQ~j)`3Ƨz|{.FWnZs$vQ BcV;;op W:y-Z>pָ/zϋM~ٽ䒵'Euhm> H㩵4ve׹gR1 h@+Β4^¬]Q3m*+܌T]ȷOhvm煸 d\+5UچdeI>p. 
Ѿ<GakqKlۭ?L)SҚ{v}O{j_aA6d[(qrBX.IN_~l?#ٝ39RG̓W,VAեuΦ{yjtޥaiQo|Km+E{qTqZ\;z` ZaiD\ `|#Bk] ^hi&HoMyN JJ DiRm5AIB[pgڹz]MXw@e[~vZ~)x%cc{0UJh4t!ْFi*wTTHF$P2m&5P\\q!)-~yY^t)<| IDy Gƻ{MRT&oUqw\(V>.T@ͬÌ>kuU6+d-bĐM+uyʄ2c+k*d,XjB mghFw9y* !_]$U-[J*݀R;iʮs"X  < ;4}ccm XE̕@]K %EFt@QD.2.n??7',/ _a>ЍU"+BܬHS;>Ō>/G#zZtJ}k'BY^Tf` ИagJ +cnHò,M GT41dl3gXAbmҼ(Bofl5 >1,M ),+@d٘j.fCIP}Qj #jS%UTeޗ?ÇNh@XטxSzUb(\ZN_G6MW@)[Y Y |ibcx|gDnBJėYg 6"(ꢸ{.z4WY]s#=g%љi%Ϯ3R^FPIeӘEg0hVUrħ4} jnM򷦱Y}9oY!(hdmL5˲~fQR,+FP̍8Vtg'm}*SylѺJ,4|6$1{sFLpxT|푁O6l+ c-\ĥѹP}G$p^](Zd5"9Ðmneflʲx`j@Lt  A5=&t&:H .ly Z~JKq@:TXU2!NN$}>6EP?lR`<™]Pv-B&ozFu8QhnV3&u=0l4#> >)@y)*27iEKiklFSK]|10vƦ&i(R났U]9Sq-O0oc !j[dIs$XpʯoJ4oYIA^<Qx Tc\C,@VMMtVpmcgY)g+ov B!3B!;B!]B!0vA!!B BaB! B!0vA!!BcBaB! B!.!!BcB!]B! B!.!B BcB!]B!0vA!.!B BaB!]B!0vA!!B BaB! B!0vA!!BcBaB! B!.!!BcB!]B! B!.!BT&wmB^E&f>f>lvZTz)Uق2}3ܬMQ(ECNW+ϕ7US/{nW!S૨owǎ8y謶4 nSܸV%+Q|uEy0H5BP3VV؉%b;7:M/6z^R#֐({w&C)K֩N_(|6}Ǚs aD3c{״Q6wY7gkoUh^}ӄr6V}.3O*7a؉jRCϰᚬnJWf, (9&O\iPE3$P}!,Єe`Q@,v1k ъ} 3v`bݵKsjjFvI}V#?CฆL'ozj:Qvco}>y@޶ˮU(C!bstJl*U&حSRʽӈ Z ol_*~\+ՀTV<00?,+%yiwd ~2$nWm#<;3J~]VP{οC0rV4RnÈyL*ojT.§߂-πڿeyOikgm "ڦ@Sg,0`QDOe*A~;f "/JKYt&Q ?;iRg=(,Hߞ*ʲ~k1)+\̰FW;pj*=_ PKJft +};#yA!O.b5{2=gɴ4*==*TZuq{8܌ 'eRf%ی_W'9eׅ9MGQGтǷ sETJq6TH mʶ&*W{yB+ֺV)=ьoE4'I}qiEEɫIsgs}UTji̯Y\0:Q̕mghFw9%B=.#05oq]hv=EE]Wi BlfȢĥrn ܺ]i 4#vQ(4G]Ƴ !wy]4QޣUl{_68]V=y|`SW 69w뚼;\oXlkpKn_(N*$] }Z%ښ{Y49hOWgЙd젋c\H#X5Fv."F4$V4_ar5dռWkcnoP@%,M=ԗ9&io5{#bVδe^%,!I 7FVD3D ^Uvԧ54ˆ& Ԁ>Beݖ I9s^JTplʊ"ygP Dzb_z$C43cgݡ0tvSTK* TȄ#nY ] &ArY۶Fӿ0<BS~jl ڀbiks}\i 3*@).P께bsM'XXvJEr k7OCG Qa󧢔-= 7jQ&3J#F[zX:wpIafkhI!ŤI{@㗚v#g+#wא9J--'@ ub@]'efsokeb̈́q˴;\pjDs!ػP;2O 2!#͜-c}jR+L}94C"Zn!2xhy==[j¼[O)2z1!{/}=rtE>r!PcmJ80o=JEumRmidv[a; '}t)423NYi$g ^fTy4#eX"CVYjeEȴ+Ϳߠ!/@|:/ \Z"~JRYp -lϪo5R+pz8m19UIzWCDKOGWFQUxJ #}M{P)ĥ%j嫆nvQ_C3hl#~RƸPVUqSJUpwz~]*TU$_{r@7RX͙PT=Vĩɫ~n2*jҫH;Mwd73Ɖk%x^!z{(X q+,ݜ 5p}ąְKA;J!ޞ߄pn+`h:Z|w #qT'RB[K@]BooXoc6S*oe*aӋBaB!B!ip}ɋ*<!j>wA!.!B؅F7{&O>F9>CBa򶰃:U/Qck^ؠWq+L8Y>1>SվPGXB}}xB!^.VN,^{֕lP9ꏎ,{% :ٗo@_nj\RnM^~! 9czOOvm 9V'Ban8Pލ? 
As@7pc.U ل\Q-(fF; @}RUCBc=ݱ7&} 1G]n֪͜ec8_ L,n 4 Yu+B!K"58*Gha{i/ٝO"M.RI-!Лz/Hr^RU!B}ZErQNc`kVVfDSW K]O۴)߽PV,B4OT€ڬv_^QPHwt/.';9PpezZCg_~4JD{]l֜n§ց,Cl\P|z\1ig"(>V/Bս,?: X|ʺ95~ [Sj7vrλk?=?i 3MѴ IDATGFNP~]lq؜fAUeqzZ:ZBc6bUɗ2?w+qdѰS͑F!Pk8::rGyzz<#zB-9=Kdd>]~dǬ6|`mS |㬮KGFDFF_Yl.T3tL{X[pa{xی]8؎)3WZ{k~g] fN4l5,1Z4s- )JNJʖZ҄Laj׻,.AF~-zD/mok8\xĿ}|:0'E'o{v$;!+}C6PV(Bj2 { %};ÉgM8-|C?+{nM%ELq̨>6W~L8 _=p͂#yW; ?_=\3]*<8~O256W[;3(5&bN֯}|2mPR~isnD#4ow1^Qϝ*R{>CA?Lin nɉԆ\Xӓ0~]xP혿3[f.-^{ 8c֬lꐕ? "|9 ]?*nЈer.vF Mx\Y 'SYsFwV]w|7P$Yt,KC\,2?e'RZw >kKT 2.r:cyAډGNQ&mwQ+ިW0 `0aƨΎ)ͫnў%1dqXjᓈyڞ!£olB{;HD| lǼڝXzgP//&t¡ל8ks{@~#a' aܛG$7#aGͨv <Θ:VrSػF1hEaNyS]ʂKOhC)G1pޑ!YtEt A i_}P0vA5^JRNUXCeFG|)_+iIl g9]YQih60О }~xkW;qO&@e7\/%K6 ~>]Z:>3#s[zv }؅ч|*(ʉ3Ww֓"(э'nW0FlB*H}En7wKjAޓd (nR _6b₤G|(Qo3F8|Ed\tpi/yME0_.'> (ݥ>4Us'8.¿_v~QGn ɻW O ү?a5 GAPNA wA8)`aJ_9u>ݘeyLV~u*"T;uhOg g_ (~ O|Vycgbھ0ϯ%ʓYQA*o^.Ru3[\+mP'4}ռ_޻t2p|ݜ55qxyص kHѽlV`= Ү߫|Vь9&bg+b2'mUa aall8,ߊ݋%%$\p`˪Qv~MW m2=0ҀDwe^L2!&kh'"FʹAKǞC֭8!\H%;:7c?3EO gHk`=(eZTCza-*a X7i`@[&ʩcW/TPTtq50s 4}'NhdYuu6ɞ} T9&coVˊJd=A-OLJ! Xv7syRLU(,w 77Σuqb;x 2̳<>:T' }>g;GCCohM^x >.ɴ5M֢Wtފ1M±m?lR sgP%Ay;gDKqkz҆'㇍F]趁;(cFZہlS*}eBcPQZќ.D+jEz,/RO$8AC.v*aC%b;7U z J~uJvq2U-wnĖ&/n;p<=t[VMv~gͤQόJ #A3x5P|fNl7IAʟ[]|rѫ&MLi'&h@Qx,`)r ;iV%s-L2j֚IO3j27bgjf)fF -=89@˜_L^o6%&D4oƗ{)bQLQ3y ZLj&]81.C (͙#Qߎ}l67),#j{􇨱kd9@Ts-&yг$dEB uMl5Xv c?eAʥ;ݚj,:9lxU 7~P^$b/q HWm[W 8í+w_ԭ~P>xu [5osڝu1hM\vb1x4N9Fjj-?K3d4,ʃGq|̿[}׸er,h²#sPZOѐ?},l2.V%BU1l;x[fc-<gLG!w|H{3 ,; y6aץ+w.y3 G~ځG1Z?#4#T @|O*66vfb5Go&Ow/;S:a^Vy0p LlWd*9~xK[ \}ݹN.|m s2w(`giC)T7<\f`fOwlōuPCØ 6r˭gjiR׭R ʣ7a:nX--ʌٻ2k>4w,s/8~})wAػ`Sc02Œ{zxDZ 1Ŀ/߶D*ڵ ܤɵl + ]߾_T~p\rqmE, Z_eo?<{F w/sER+,쩺 XcV7\;WS.)w tlx~2͌meev;IHB!s,?m ש%]a{w46n:\}<=jnnnf~zsS9"-v)R5<ВBJB!A4w݆X겳>^|#? 
9S 9~]lq؜fAUeqzZhvvp'&pz8m1H#BwUGZ~sQPkF,Z!~?;ߐ:ӆE<.G)x7Sap8gRog6X-.JЉ{q̨>6W~Lmiz3SRc"71̚2M>s`Y6lǼڿ)?~ڥ}?fǂ)hӤ.,Z !bnL[k#lA?V<ݢ=K cNɼ';wEj(:gǚl="4@OlHōkʐ,}|F6G[gy՛3F 4e}vzPG@C?v~<6Hw0eܐ6Y!ȹyG=9pd/'2m?c>E?>Zxa@*I$r DX]n l^Ö{+- dfh(`Q/ꔟj 4ok$XX5gɈgAH_5Pv6|9{ߗ O5_o _zn6D@ÔWZ~>*[~ƛ7"$Iej)9g?o^Ҽy [pv jme;9;/}r@2noܼZ`p|B-C\_nyֻ&D"( )W@k|"$EM{QD=D.SѦW's]Kqxt$_zj>;;{3G ̉Vgo(1%Ԑ`jaj#*af{CljM.r+ﶦz)WǶ}/XC_I׳ gc A7deE7qVAb4JOPpmdq澿kB ^zO_&:q-MM[O-0擀5q">[<m}yüZ0)6GaLyo7Y+;X dObR]cepakܘW6w :76OٺylaߕI{!'Jp7K}%*`ۻXZڶ 2!R:t|"Zp*hOg;Q`:t 6)|I1lmp4eGEm #.M;hF}F^qKcPQZќ~Q*s( d5B^%ir 8܊0qD [ɉ*= C-=,5dyޮ l]ޜV]=zn|ecپ?NpreXht+7ߎS:s1 IDATY^zVg% {{7JmgnĖ&/뇯J?ΚI 9 -K])s3s֔OeFⴚ%]81.C (9025*I;+^ !ILR)ˆiYwn>W #3`%=nuuS|SX&6G;x9 Uf_8l{wIYf⭇|2no f8~nia̯Q~s&7#M17tjkg}z-BսlEEnTHyyeZvv5XY~ Xv?S2Q`^OߔZt`wq Nnf=Phe76\9[]`h&䂌Jn@!474B|tAb1Oc̃ νԳK%&SQW;KS+73zr_}M d4ȜJBYlѣj<"[qI-11L\ybyG@CM.noj(Nյ⧢2=<]4Z>rswZ̩]6f (dY%A斵f1 PMU^.Ta8 qj*t6`:w`0RTvF,B8.$rs3^+|xgs+3q=-1σ4^Dι)nH+9`bM;Ub涳2\9"B~Hؚʁ!_RW d(O6b˚!!Ȋ~{3;oe/=|oiPe6x婡q+i' 2ȊVTU$_^LQˋ8chX)Hk}Ӫy5u @{bgqֶVXj[zԧOk_Zkڪ8 &2( n۞?{=q^rT⯾ sS?C7WHiWjwD j .ͤ鏉bFg" YEx<=Ҩ't4@ +_~+CYPVkޫȎw:SWăy'-9jHttt_S\:>-9Z)iEĻGGGG;*huZѠGFA#@o|;ᓷ{Řuh'GZ.h{e[^Ҧ?q’J7j>}r[-%͎umj+s3鴉?*πqsGtSUCm%EYGg !@MYl){з<@3fm<T{A𰘡X/m{˜J4))^^y\!|-QiԪ+.oab blsb 0j5B+4aDoͅ[ \d(PE KrZP wǕ9%NG+߬;8`BM~mז֚n} =-/Ƶyt| 4M 5l8)+{2raOj N\Xr[GS0#bli$2rj ؀IYK3T̥3U׍CB=4Nx-\Q3@a8 r0B;)p9Lua̙^̐%g4{p G6is_D1M%Fh Ο/n%?υZU}AzzyOJޚC q1B_qpN2}I8@_(7QFpԋ z"NaIM5 g{EQ~*]My1 >"G&䵅&=~XAàH>]9qr !kUrf JߪINK m<<脩ʅ E͖&جo5Ma.~Akxbl fNU{&S]saBN|-+ZT8Ϛ&(hnE($>.ˉKIQ|Zi{(9-J=4RP A@AјLrF{%gӼ;~7{r` |5NxoOny–/losFŨ,ҢMV 0֖Č fTj\,6x\\c|Fey3S:6$i|4)A`.C&*3Μii<`. 
@*35]n{F&&LBWC&)+zQT^`#{ '71F{+/m'Lxww*?礎ᖘ4~92@܋Б9-GS0!tbG-Q+2s*ha"]&^#/obx^?ΉL^kΦ)9΀fV Ve+ʆ‹F /M5ZHS.ڒ^\x.qüQhAUۊQ%c6bR$-yfٲWIsSCeN(`S@7f^-˭zx1+5V\+j0U)w֨n$HscQ{׊hށXKXoPMzMHڹda^Q6T61u SY<M\ߤ1riUIUNms(|x~!ר;Ft XǍ 縧iMYqP @4Ԙ'JpMQ";qCBg2Ю0+G 0 rFYO@>%}}az_`m36 B1.8vfꕚJ1u6Ǡ(R&DuscNJ/X!#&82̣o3$Gk.wtYryO?*Ni^[u&}?k)I 3- FQ)O Z YUiplܦje@d?YssKSUEo ]i`e9XV#aMzFOmn*Os\s}Ld9&T;Zߊ?{s]J7p+dn` ђJtAٷXa4Bo4GשT\#XeXn.AVm1@ݰO}$,VxC1*"&U= ~C'uŢlBv^R4=ukyM (<]I6ZyE\RPN7<@8&xOP_>up\ @Bvvw\㣝zqv]rVqQ^"@1wgoF]Q#\O8xԈM+m*u;.*\CR fr" E|N{( H_WG#ve4Q2lXt}[uRif\Y(d oS4qC\zoy8S&NwAsF#&ra60 ZӊM|~G[3E%|N]ESO繺4xs⁐ٝVeV^<:nւ87Tj#}/<=6sp-[L1SЦeee] sSiC'1¤ȫӾUJ!I< fn?0]:I)\s@n+3B ⸆ 㴖*ff,Ŋ}& S垻!BW}1=K+Zj 5 \׌5$4g@]̙6ĩťmy iކ@ K'ov-X|W h!@[k_WW)SP9 /r@@@ 7@ku@ @  @ #@ @ vA 2!@@ i@ ] #ym~ox&W %$D %o+ׁk'.\ZH^*{\={v?>piĨ[5"qax- 13W&nqԅs;~4;gOwG k?ha P羺xdP%eDwt),#!hqvK:wueaܲQ,Y_sB\S0gn7. ?xנ?MU|]AYOY$zjHӎq˧9Eu_UJL;k)t'IqVq=:X;|}wS!_''wƭ[c#$(gpwx,A|dx(胗܌tK3J#vA 3V'h!$eY4,jE}oXl?ksp75>,O5:,'I tywmDqqA'RzoİqUϛj[Sv Fdy!zӎrL;UT;%+D!$IY.~ a@G`>74ca45L0S6Yh3~ZN -?˄&*mNa[H*cf[7(4]2TnUm&L H$ilIUЪûyDǪÙS\! 
:}I*5l#jYoʻ/0aT[psh=0hq.@I2t.Ug3ntC Qc)' 0:x)M)ߍt3K5,79npcRȧk+߮f E^N^hbbPHܼy駫;Qh#_[Oؚ/rTveDYw(W_JBYEUZҷ7W"{_ϮKa{qxҙ N+N۱u5E~i|9{d\TPҺf:Э-V<7ڜmQ+>]¦eyZ⇳uYwW==|3W@v+v|TߔNaֻ wE h7&ɮ t!r b8FP1BMhE`2ݘV zӜqّ/GQ,m5:?~^2]/!?.\>6zvݮ?{$%s^º,˥TGUl(+Oj̜7fn:]\t#`@!ʓ6Bp Jh~cCܧ {NXpnڧ Y|4Y\ZӋwO^ n=?}mNB \r7] iqͲ%K޸d,޳9Q 'SSlLX x^d.9{;UW`7PӇk^yi*I^kFp0½b;XLT1vTL_.JJBk; _صH˪] IDATP.ɫ8]7Q3@_ ;'_CMGI!nhK2IQߤ1WJ- ܣ+42AӜVjz{_ lUˤJ\nP!DWd} ЄZ罽u3B SǸ SYv!>&tX+kkB͕trRp#ZqZdntTmal߸-3~O~u] .lBZ$ev/!2BTr@=ٱB{fuG41~҉t7-ű*{I'H /V9ev]f{6Τ?˻]2Fx=7fxPf}ը9;K f¡#h6BpL H;t6y^2(2 ]*p0&c"]{:3B j&5[ZO);QGpO7 rvTsk L[N↲s,b8bHWLwOL1ˏ& c`,F'6Ʀc\'xVYkI%=)-iLr="$>v7XZ+ӛL"-6 \sDh.v 镞IsFb%dtR&Nb[Ƞ[+ SBgƂRQ^BGgoM9?l=lv},hf, ;sg 悜Fߤ!n<Bw: :#Q(9b^i ~NyJZ,'t[!xӣ+M_k:wA kݴ6ӑ7Dj1hk(ͪT^#PVn?T7mD6.q3T_8˜KRwVc F&{LYuz$IA.oD#7ёP&UFu 7116Vv6_R(aRR_XUPu}yG4j <ũ60,p"KC@zrOgtUE[vѝf腔5&5M::Lj_h"b(ʳm֝پEl@[t:g*}&E)_U8rz~rS98}ǾYtwf^viۦż+/Y7^w`Z՛)QSnX% O+uŔ_0,cDF/=xaakg>E4ʋ FL1 OWI2}'F =ôo>3sϙɏIt6YU31M&|½]=}R="qԷGg˻o:2~θמ^-ll㢢F= ?x{BeۣHucpgy2xP,Nӣ;R:EFb 퇺\m˻i1rPrѨKBsǔ~=.f<&9oդvt|nΔ]5V JZKz 6s[gswvz?%[^-{ DԳyO;MAh- QG&lFIS#sG{~}n|a`3bTJE"T7@aFEt[%L~wZ,mG. DeztCƎm%DߗwKۂ!ީ(aIO{?)U uS~oKYbX'{:(k1P8B/0D}N:;>\٦^?r,r?ٚ48N'=kX<~Ejv#G}{pT#߁PUU|F6v-[,QsD^*B`/g.¶72/6]OH 2pe^_,O8n?n`9~/WWYă(8[lԛm?m+l@Ī$]E[\ ,OtGOw @Oan,qr% \g{c,nx3U{LN,@ɒ?  
Uꎾ)h$3qz~zܟ;Q xO/yW@\vS/^{Ja{+sXGޙ r|g}5]jCjF,Q!#\Ggq\Nt战ZWDoG7WrPaټ!,P..!lI1׊]د/1VroQevfݼnb>r#՚{.4ItzxGEl ⟣](/榺IolZcwi ӫ~Uyxgg#@_:iM#6wUйfZniN3gm|yo7e=Od?bjټyf㢥.+{uCc_WƲz{R?x>z1!Q$ ~)jWetfe?5+d׌ @_+%[4gw/O15C,__Y퇄2'`E~&}cЯ&ֽ,##r*Frrẓ }k窻z1OC̹ zd?Sm`wgN7-\+۲ˤ{y5@IXZKm[ Ɖ/U0$!ݴg* e@6|hb:F:lZfؘgc.\%lt6}۳4L/jLP^P-E8oRQK&ŭ2xEyCGF=v2|DY.\%lY>f/W|lfV̕BEh<'dItXM E\kFqvfd,$H?<=pgw_YK51j[ֆ>nVSPcͅPV01Tn vfb@sA+]~ok{PUph Ϛ3奖ߌ-@HR% Ceycn7t:mZ?H ?^HJږ, *ִٕ+$_6(*."-O}څұ!QHsˡ##7̟[_|@:'ؕ]<ڒ| ^Q)Evjxjw  }v@>8h2_-z4|woѭɻ<<3Po J 0;w0;˫t۔c􉨉Y__޾aAzwˢjUTgH ݋o ])!3(UW5mzkin=,Ύp A%35UA.j-PCYM &xLMIfT0jd}~7ݹ?p^%X_Op 혺GڣShg;zG{ܷO:Vdpқ) &`n@  ɻ˓v.L'tY?0%lNQYxBR}WY_ϥ(+Qo6ڑ fVa":]g jZ, b[ׯDqdQ!dPH߃ڙ7e j[\(_H)ӧL}`nQ㲾2x|P#}p&evp J=  p- ן6F i<ŭ,Aձ46~éH~X⫯ q:q˝nnjٛJaL(gpkfEYUի BO<*b_:T2r|UQMӅzC{308OOq1;2)l%%-)18o jB3[ܨJTJG'>z}*gU4oX$7JVĢWYC;pۼ "Qmdz6-<{laFJx)0xX u۲Ԇ&ǰU+F{`w"BoaIKl=֞} ܨ-R<o~*dWUߢ '}[?XqێTϻ:!ΞXoD@ﯾV& $i>OS7MR!6sFeɵ3p&TKpecm'?u~r,PQ:,6jnle)OD]-yz,k톯gE0R<,vGjPPEV{a!w]8Bwrv!ft.Dsp p^}UqFq*Co7XLv=*cc:-L?tRj8PIcg]utٖ}o%e4EfWb87"qˆgTjrr{d`ܑP%pm_M¬ C65gԷꌀ=TޜGE TFjU!/O.Qb|=»0l捬fmpPVmם8ݑ8흜mru_*yO|],'rxu N+V-h sSҼ}^.nkp7i/k/$ĭ ZU`h(l fƥu)2 )F%bI}tX~^<a絺JTak Ilwޕm*JSo+$E6Jф/ /RT1zIUzWm[X}Kw\̉_a.^kL o>%'~*sK1`;М_.p{sdGzl:q|U-~PIxBXV92c0P`3wFpmn`:4Jw됛xPrzɶ#88!3g̭zz@BsbIZiЉkR. 
9'azkɤp/J y""f pPOLdJt p`9;9O:\#j9>zʒZ3kXctΕ 9 OUv'W(B!NvtO;%ݮA&@P=|~܎/IxuGܛ  鹇+|m~b({dR+ߏi3ݪwzf!,@'` bD7;ڠ g7P5, ٭$u O{iʽIPM_/7ި#N<Si\ooO;ߵ;ʍa5=۲~k=UA+&uV]0_:ԍ\f /[bX wHCm>R&ٕtF=ҺKzCCg U;O4?_#1|ve@[}MÉ-C7=-+mtMǞWVFEֻ}/Z i ?J f_l"]i7s-m8ngkUCc ML88`X۪kjȔY2dž񆚆 NP--7uo1 l6X[m'$„`+lo*2\/1t\0@ۉC?tv(wԹp>A%fd]XyKpٯ5{I}O𞿪 @W^^g:K_j3.= i`̉zN?Tߜk_%=#uv _]Qp8#s3\tIoWSs h9EB߻pq_9ށ?7ag})=Рgm FPhP"tHO`Mƾ7]:^:6 _wy0Dk-8_I9K{0zIDAT#>S$Vwʏo>!]=Ǒ/Y`?/,ř 1_80X*_ 蒤a FUn0dd{[Un'8!}@<ߵཿЦ{vA06z߾Ƭ+>T,Ryśö́oVrG=i2<{{֟x]Ng=w؈k6z:k4xjjj]yD 7h8;ᮮ<gsťݻej;'v~ˏDݻuvLo.u)7szVqKWT붳MA`!mec%r"b;\y~c gܳg]l\_QHe:B{ =˹W{P מqqä%ڜh4} ElUVR;"ɍ!!RUVj7|iDFϾ\lh &ݚqЮӢտ\3z•d b0;eI`3vdR_76x>y6_pI{oE`7W9ʌL:0e ;s4"q]QWNkԲΓ$TVV,E%/*%+KH ;lߵ=Dm]1m׵Gy*Tr$3&nJ=IZ.9y<*|JW(pr Cn/SYZaPSI96,og?yGHbNӣwW47m>P9/$>555%7-Ȁg\.̭ ^:iQԽGRу-%'[K"iE6jaK5+7?WV#ۖqu\.m?ٔ_PB|yɕSl׸ߢ_v}G~`v_ͯLM5\}SQ#xyב92wӎ 2kCW{{V=76,͈( ;|5}Լ],,72#+4޲N#/[,3)ڪkCO 7uP3 ڎ¥ɷASOs/+Fz# q^ oͦס>=c!UcW/^<ݭHUTafૡ?YGHrrŊ0[.[5']19wvA-޸:^}!OiiJR]lttK/t=@SWmȤKVT=ݟxv_^^sxjS_X}h5b5}2թuORQ>nq[{4EL9""BubB>p/1{&t?[saʹi}ۗ'a *$> *EJ|Ō*JQ9i.`n$T%Dep%$KspL!wNrO-JD ++ח`rL]j}4;6zeeWrqIW):kWxjՂQ_Mw9B*?.TT@H@HTT@H@Hx ]UVV FR!!RR!!RRR!!R>2G ] >dug!Rބ. x_ێ+ LE#u:*ihhx/e5k?d2$ ͋R[VKKK"rp-auqm o!V.̰qĕ+WM%=zػw/_C~>q,?ր%?`lRqF, 1t޼jү0]nL'J-"ClCα$~]1eUԆ)>SԎ;~7 . 
{|̚5撝;w޺u>9_NL7S;3LX^NDmX(iUB['ˊ#1ӤG~}m>UHo޼ŋK 7n\QQѷlٲBߟ~ORQqµ]o !RJEDq}Y҄ӛQuAm_^[1ρ o}sM8eYl cacvR/s4U-4nx[eY6^LẏX:h$9:&l$#ŁW2Fޣ obk,*RhǾ+t<~/tNBBB֭CBB3m611ɉd6hРcǎٳ $lvVV֩S}||^.yݚ*--}:x{{kiiT۷o9rD.DQF蔔\x1$$D]hԩ"(;;{/KÆ ۴i~zzz}+))Q(=WK#Msn뇎1qǎ_Y3 5|z^u J$%WDLvY$~GzHu8U>v-[*|1[* {56c;uR>}3_?7FFup73.T#|>ܹs=zvZ˖- @ ̌rttLNN./2НKDϟkܸÇ_.氵էLRd…O~ID"ܹsʕ+r<;;:vhjj011?vΝi޼yXXX||<]xqDdnnnffvڲ2"JOO'OReEzZM;:]HdSN/W]bm:,6ɩ!{Tl՞U2/F:ݯ=tetC.T--Ν;7jԨQF+W$rMMM"ҥ˩S<<޸qcqq1)̭ H,--¢[n X/R,,,ӫP(:99ǃdr8]]]LNötwgfvz~уOaٽI*2 .> ~={&&&G1e2:FEEuv׮]/W_TTTkɊol\\܆ J̙3k  ?"bcc?B._ c} 9 UY^Ge%Y|K>s_mƭYq~ޔNq7GnՕ'[vM;5*6#u ^ۼn܏K We^߼TfU]޵se҈;D/?D%&F>Oe8]MDD쥟dK:Kݾ}…'d2>ODJ… %%}W^ZZڵk_S@ ~ mUUU/GguU(8P(p8ڵU`KK˴0TT*󳲲=zԯ_#GjiiT/R~}KTwaݴ /'zq)C-8h=MDtvDD(;79""v+&y Ox!9sTn=r;(ѳ%#F* [>T{1{g~)_y_׽ v,<8z6yԗ^ 6T%ɪ޵kכ7o\YfVVVEEEﯾe֒&Lx6:e˖Z:t׷E T}>//͘1C.[NwϞ=ۡC޻w}"֭[v% 㕖^|_rx#m!5*y^ WrER~kI I ++]+r*y^^^qM++'?%ޛW{dzB!;EԩSÆ Ҷm%FEE sB.PwFRGRA B*B* |(n FR!!RR!!RRR!!RR!!RRR!?J[[_: mmm&Mf۷WR58An+/±_n@@״iENJ':t ~9`qUkii,:_Kf oϕ=4~ԖGi1vO{jWD\4&e:ۘj3~+iii\.8''GCCۛjkkK$ghh666{wC$15kTש7s'N\|iӦ;w+**;wnYYٸq㬬|~~~K.^^k̙敕vR(Dݽ{w}}yIf͚uAGG'...((H"y{{xǏ޽;>>zݸqc.b "7o&'OVBEk׮:th^lق=/!!a2K.Æ :u{ƍOX.VZZ:e"RW\֭[>{S rrrΝ;'LrҥM6qܢ"q;v͛ڵ 9sJ3fLnnٳ555՛.((/AAA>ttt1bDbbbZZmۖwM0a֬Yqqq]tQhbbR]CLLvv_YYY1q1d$$dɉ\?² .@CT$df'}C=#o&"""""ADdgg*˽bѣGCCC7o^B,WTT7n\BBBppd "==zn:uTV %IRRRLL9laaaEEE="6mڄĈӧO׫Wy/:0;w㕗s8CCCT]]R.rRYnnn<\\\!ʊۻ8888???33S*6iҤܹsرc;vINNVQrrƍe2˻#R疈}Xu޿___А㙘TTTl|`# Idq#G;5M+V苴6=xZT2tll7o̙#JbccH$U͵*###.YCDUUU˗/m۶ŋ;V#,###"wuueXDt "244Tgjzzz-Zxp8/7'[,+))i׮]~~~{>rHxx}dJJJAAOVVFbb X[[DnnnͷBٺu/U*FeeeiiWII@ PܢT* D"ѭ[RRRbq\\D"qww,--9~9Te?TU#Sܷۄ+^)H/9} ·޽{͚5Rј&MԩSDTPPQ3ZUzzٳgܹ>/^lkk;v؂k׮,o``PTTDD;w T*6lbuVPP1bĺu"##ϟ^ڤIÇߪ^BǏ>}ヒ ym@u5c ͛7BKKK' be*((044Td2 y^^^744uև=ݯŞ9l}M긵=#0tR Ԕ2pǯf[2GI,++7448h߾= ®]DMzzzV,**jΝÆ 366prrxEEEDKKK]]$9::vϒl6Ū]֤IƍYq-Z|7zzz/DXL0 L}蚾9MOOWT|>?77֯__$U_YkI"y&oL{adddllnffֱcGH$,--(===11qذaBP=8JDG~[\ҤI D޽{+hu===޽;""; IDAT\;}F,f(H uX֬Yrʇ+|mmѣG>|822χR3gڄJI]  /8O ww%#--CVೃ    ] RuA]+ikU'*N6[%MxGpxB \fq>Z;{>}D!m^V~k 
q>pBR1Lpkpj7?WךRJRi-$/ʓnAn,LN|c's>K3M\(`ij`r:X΋1FIo˷LդȒVӊM+j|.EmJ\JzD$}g>n-sԚhRXy -.1rkԣg*>׽ܕ}BwRU!)@Hsή^JD$/+Kj.aaS;j2DfPqLuꎻ5WԌ98tݶM=Gܓςl|IFF|K 8Eul9f܌S TWhh0qS)O9wKO1[K , nm۾ֽ4j窌9l̵)Vd78|=s3֮zlfyV"Jظ6u L[w|~5mfHMe*-ֳ@L*#k93/;8/;|ÇW uKoh \ |1ܧiړ^7-u'۫W:{7:kdqQ:lrސ~=Gr8eWOպMK DL&~X|%TˬKj 1f/Q^R莵^P aC8qY2$x*aVu Υ"㲄N6qgi0ܽk*g'+!~)W{s4a@KoG`vӿveKr]^Փ:]nlV l~-&o}_ӶNm";׼p9&n@H(qQ^Z~o쭧_]=kQTm/?r>"etݠ_=W-"v?߀toIDDtcڗ[ulz* }yuKm@v^3N3tg;_R&;UZ& \%ON?O?oNemYDTU^R~Wqٽ^UeGG8___9O8"~<}ccM."j}3pXkΨH\il1E6h;gύ5 :t^Xdc摞I'@O՜Bsa?KL321.nj@D.e {SV%^ Ժ3_LeI=3> 7pOXkwyUU|Xyj Z8H(xK8j ,M&*'ft߸@lonJi?͹9I_Uidfn`HEDi.I1L(d/N~=5H_ψMGUDZ9jer*#W[煟_ug?puݣEDM'p7%gч/(^'rgb3Y3МE cw+ ߶)Ȩɏ˚0|\ϖ>pd1X{c30,sQ_G[[S4#+97o륺tJNk /ݿl`7;[mMMg>{;O30~ 0`$rۜ{-d47gm͜-4I^.)ȼoc{߽zwdao=+m""8:(RӺo+`@_jYނ~,t(gfi&WY8!Ʈ[ aHۂ?yk\_yyR/*ǭ`?ȳ6l\k}m,;+?Ycؚ#YO ?M1C|sKrOoYz&9Ǟ8( è_<&3Pk+%mdֲO'p{Xe[ٲtYɰб#0AU'A.B"9Qild:r"###N&JJ,`7$~?oGAUnRThݠauLM}skcfEF\bN9ҫK5+hq@'ֻu68BȺAEO;DU\e mU F$ e)L O(f ,VF,=ss 2X~NEZR.CHNhT l 7u *qO)0rvr QQ2{~wAi؁d{sdDτ% F|kLDzvV̘bE>SG41 "e񣻡 5<;:\ˊcc9LMv6kwu8yQQJfNlj>yFy>TuNyg~ 8ig,\Gֽי~4fދŮZ;p`ciac&Lez[aԯ55]۽iͲqU*IHUFt<^R/W5G)k{%sm-kl:ZcU--~߫{vp]*&C #Άڴ^.>k9~XU%dE"^HZZ2I~vъSxhh!"*8;02_/WlVUѽU>ZƬ'twlҦ.!}Þ;_SnNq"C4w2~"aTSF$k3` DzfomWƍ>0'Wx ^۸Y)%c6>5vҴQΕUHNe?&Lܸ*9zW'X%4Ur"j;^NAD"I@Z#aYYȠF+^^_ӺU?QUTECo0.;FM/X<ٹ0O`Lr(_ӎF٭Ld2bpP;g߁26e8%'yɶc2ԙ(ߑ4( / )|ϱFCup˗-7hf˗~X{: sK*yQqn]},Ve%IEeDKif WNEY׃7Ev}E "]7!t]u̐VG.*$~e^Lh3EUzfVefIףO#k]QQy={[:=LlR53zQG*^,?oΈǷdr~AEyrUǂ~m]uʫcw~jk̺b񉔧e,ee*I3Q^NdP&S͠BJ4߱ 0pAm]"-JLE^шCI_RL,~:ZtnŢ1z Xef/<ؽt\--H.T4yz_&[V%#">GATUt/2S|>ƹr#0boY)A"؛  91'߯u*ǸWZCR~scaZut#YeĤD\flJv *$ێE($9h߆v.udeI)),1bF.N 䕥ScpVVVlN3&w 6G~Z_SDZbOV߷)sVVֳg|plvh;M#{E%-%y>2ȼ5WdL}mu]LS}-)Dz皭?X340D\.S~la+8>HwbeV%cKֶ0i9: 'sLd!^HSA,fs_јOolY}ikе{f7.ߓ>Աtݤ5@BO #p|XexU;z[_4ZQž؇GRqM=½_*JKTT/ݯiߴVynnAldc]bQ[+V1wWJ}yÂili>IDCt,:t挹A hKҲ;yҴ?v(\P8C=3"y/ޑ475/][ٛPeYV{T(ء իljpp3ğ&zQEҔ<))>ۦi7 փdf^ڱHycjo ]'"I˨K-vq`E'W?(|ü@@MT\.e2JTVVVVVVR?73kBrK*w3RsV'Y9q|&/!â'*3ZiF `D USS- 3L.KDȩ_`HDDtlH[o[5߿/#"ʛ)T)_} 
8HQk*IS[v5BOsbu5uslKS1qt>B'!_dg#:_򠣂s GM[S&:no[2I̚Wxoژ8.IQp=T(cdNCvuN?'o.{}I7}n- <[Xۻuu;\-s=ݟם {שܹ&.ךduҏLD$KQ>]07@tm曟⣴[QJ\ny'<9Džgw#BGmՒl۬q;i䞫bi'<Ԟc{NzZ,?dmUXR{jU 4Mcni\'OHqWaUY n6[qzyGl+^jev`#9>SrH2s{-ʗgKT#&Ann.yߒ*lSc~Unv1/^l>QA|'ʉvzkg=b%1#xQΥU'Э?HfsB,9ljԹd}>~lEUqNқ IDAT}hwwyduV\fɥYqlAk\cXyGVV"V?uYo۴!,%tdIOusȉ3k>[*θqd"uߩ!QArG"KX"ÿ+d26h(UUfw;4i߳&3O^Hķ߯C݋GW>-lva׆ߌgmuԭ[.OٲdEQ$}E#d׳S}6[&-μ꣮UGUxx2gvof9p׆A&*\7~~L%W)du?OwnKާUƺK5I1X O*{ұ_?k%5zU#]Ft9ylkLXK߰/6֮[+EUpvbܔު2fLMLb W.HI$;߂Ʈs5p=JNSsa3a LMOSUY~F8#|dHӣKyqeŏ^w}zӺ2D.TiǴ8ɨGao6;+Ki!ѥB+KK5o…K,;Ó;dI:"iiRT~gdiZZTZp,2w27}Q87F['Nc] ҕcN߈w/Œ8eƭF L>}]Uvnzi[ΥA̔pam5=ɌMVϿ r22Xjz~.dJPFL}yjJGD+8>AxS+\ K{s$LǔvO"غ!E"b_}o9Z'/B>'sW !Dwՙge Er5ʉݎ3WYOݯȲɒd1 !TO´DEKDm(gÙJƆEE~ ڻ,Iڳ=o<]涬z^fڨߵcI s_^|%f [ss s닲rf˲o BH 6klhMWUAsVKeY{3&)mt iDM`VnjL279~"7mYNC?(%>&ƒ2zSF95wsr+ 05G_W_|X*^/JԜs]_t>p]W*h^ˣ]N$:G+7nY!GRۇ4y-z@Iۆ}/)A噍붯4<_+~kUSKm|xwa9rϮs-y?,K.ؙ{xn9MPd[khnw:wVV ٛn$v7N.\JLI_5֔W _-}#eS[w<->Mwt !OnHOk:WEo\Oѡu^-Rj765Z?]=uo5([l6I3LI,[`ACCÍ_+B_ydLL7aoj;u!\gg'!IF)nY or,۫Q%|>F%5wswGi/-ߺgiK5u;{`%Sɕ4gUg݁&mؾuWFte,qF[GsdWĕJ 8|bEqϻ쌵%R#ߘҟaBmIمLW댨Aߟh*95Y5c9ŭ{LкmP '}Z֥uZrZ.FP5zFMpxM?E!8us6kC5FOseɿBYNLRxC\CMYSW;L'E pq ƺɎ6]>Tyq֖wBzMWw6#n-0lZr^6ϭwlӅ+5`:lW,U^v1廓CMxLbA#J:_tߐSauAG.fK%&f@GĖԷJj*ǒ9Cg{3+9S]C:0@UUUy. EHtbjU' *SYp[ޛJ0hRc[=ƟaGSWJ% oXڛ΢HsTS16VaLh5k5t|[kj3'OsvCG3qKW;~'ԣ˂^.,3auWdQ7$|<.:+jFYP%8GJ.>Wi=tJ&7nLEo\e([Z{ ):Tރ컪 W*2o,WeRP[TI+}-P +t_ eyYAyؤ戻4)?7'"M<_`ݭЃzY׌=rzu]oVjX`ʶH$j>ɯ-J(Jt 7lhF&t:wf6PU%e=V; ,(ȳ Cdğ F ~Syz-( !̵P "B .Z~>}gdq&ۡ\bME_ON,vPUG>TPP'RUU{;ZPA@j!IR @`RR& F!YjTި C;sȻl{V7LX7Vd 6;dVhIp(А$# ҈}_ݑ<g8Ͻl@MkV_;Mr9}%)8pѨ# `"QoДhz?! ӹs*xnMn=k2ryO]W]kl21kfi)`M膴dI ņ~Vua=? 
# Pydantic

[![CI](https://img.shields.io/github/actions/workflow/status/pydantic/pydantic/ci.yml?branch=main&logo=github&label=CI)](https://github.com/pydantic/pydantic/actions?query=event%3Apush+branch%3Amain+workflow%3ACI)
[![Coverage](https://coverage-badge.samuelcolvin.workers.dev/pydantic/pydantic.svg)](https://github.com/pydantic/pydantic/actions?query=event%3Apush+branch%3Amain+workflow%3ACI)
[![pypi](https://img.shields.io/pypi/v/pydantic.svg)](https://pypi.python.org/pypi/pydantic)
[![CondaForge](https://img.shields.io/conda/v/conda-forge/pydantic.svg)](https://anaconda.org/conda-forge/pydantic)
[![downloads](https://static.pepy.tech/badge/pydantic/month)](https://pepy.tech/project/pydantic)
[![license](https://img.shields.io/github/license/pydantic/pydantic.svg)](https://github.com/pydantic/pydantic/blob/main/LICENSE)

{{ version }}.

Pydantic is the most widely used data validation library for Python.

Fast and extensible, Pydantic plays nicely with your linters/IDE/brain.
Define how data should be in pure, canonical Python 3.8+; validate it with Pydantic.

!!! logfire "Monitor Pydantic with Logfire :fire:"
    Built by the same team as Pydantic, **[Logfire](https://pydantic.dev/logfire)** is an application monitoring tool that is as simple to use and powerful as Pydantic itself.

    Logfire integrates with many popular Python libraries including FastAPI, OpenAI and Pydantic itself, so you can use Logfire to monitor Pydantic validations and understand why some inputs fail validation:

    ```python {title="Monitoring Pydantic with Logfire" test="skip"}
    from datetime import datetime

    import logfire

    from pydantic import BaseModel

    logfire.configure()
    logfire.instrument_pydantic()  # (1)!


    class Delivery(BaseModel):
        timestamp: datetime
        dimensions: tuple[int, int]


    # this will record details of a successful validation to logfire
    m = Delivery(timestamp='2020-01-02T03:04:05Z', dimensions=['10', '20'])
    print(repr(m.timestamp))
    #> datetime.datetime(2020, 1, 2, 3, 4, 5, tzinfo=TzInfo(UTC))
    print(m.dimensions)
    #> (10, 20)

    Delivery(timestamp='2020-01-02T03:04:05Z', dimensions=['10'])  # (2)!
    ```

    1. This sets Logfire to record both successful and failed validations; use `record='failure'` to record only failed validations, [learn more](https://logfire.pydantic.dev/docs/integrations/pydantic/).
    2. This will raise a `ValidationError` since there are too few `dimensions`; details of the input data and validation errors will be recorded in Logfire.
This would give you a view like this in the Logfire platform:

[![Logfire Pydantic Integration](img/logfire-pydantic-integration.png)](https://logfire.pydantic.dev/docs/guides/web-ui/live/)

This is just a toy example, but hopefully makes clear the potential value of instrumenting a more complex application.

**[Learn more about Pydantic Logfire](https://logfire.pydantic.dev/docs/)**

## Why use Pydantic?

- **Powered by type hints** — with Pydantic, schema validation and serialization are controlled by type annotations; less to learn, less code to write, and integration with your IDE and static analysis tools. [Learn more…](why.md#type-hints)
- **Speed** — Pydantic's core validation logic is written in Rust. As a result, Pydantic is among the fastest data validation libraries for Python. [Learn more…](why.md#performance)
- **JSON Schema** — Pydantic models can emit JSON Schema, allowing for easy integration with other tools. [Learn more…](why.md#json-schema)
- **Strict** and **Lax** mode — Pydantic can run in either strict mode (where data is not converted) or lax mode, where Pydantic tries to coerce data to the correct type where appropriate. [Learn more…](why.md#strict-lax)
- **Dataclasses**, **TypedDicts** and more — Pydantic supports validation of many standard library types including `dataclass` and `TypedDict`. [Learn more…](why.md#dataclasses-typeddict-more)
- **Customisation** — Pydantic allows custom validators and serializers to alter how data is processed in many powerful ways. [Learn more…](why.md#customisation)
- **Ecosystem** — around 8,000 packages on PyPI use Pydantic, including massively popular libraries like _FastAPI_, _huggingface_, _Django Ninja_, _SQLModel_, & _LangChain_. [Learn more…](why.md#ecosystem)
- **Battle tested** — Pydantic is downloaded over 70M times/month and is used by all FAANG companies and 20 of the 25 largest companies on NASDAQ. If you're trying to do something with Pydantic, someone else has probably already done it.
[Learn more…](why.md#using-pydantic) [Installing Pydantic](install.md) is as simple as: `pip install pydantic` ## Pydantic examples To see Pydantic at work, let's start with a simple example, creating a custom class that inherits from `BaseModel`: ```python {upgrade="skip" title="Validation Successful" requires="3.10"} from datetime import datetime from pydantic import BaseModel, PositiveInt class User(BaseModel): id: int # (1)! name: str = 'John Doe' # (2)! signup_ts: datetime | None # (3)! tastes: dict[str, PositiveInt] # (4)! external_data = { 'id': 123, 'signup_ts': '2019-06-01 12:22', # (5)! 'tastes': { 'wine': 9, b'cheese': 7, # (6)! 'cabbage': '1', # (7)! }, } user = User(**external_data) # (8)! print(user.id) # (9)! #> 123 print(user.model_dump()) # (10)! """ { 'id': 123, 'name': 'John Doe', 'signup_ts': datetime.datetime(2019, 6, 1, 12, 22), 'tastes': {'wine': 9, 'cheese': 7, 'cabbage': 1}, } """ ``` 1. `id` is of type `int`; the annotation-only declaration tells Pydantic that this field is required. Strings, bytes, or floats will be coerced to integers if possible; otherwise an exception will be raised. 2. `name` is a string; because it has a default, it is not required. 3. `signup_ts` is a [`datetime`][datetime.datetime] field that is required, but the value `None` may be provided; Pydantic will process either a [Unix timestamp](https://en.wikipedia.org/wiki/Unix_time) integer (e.g. `1496498400`) or a string representing the date and time. 4. `tastes` is a dictionary with string keys and positive integer values. The `PositiveInt` type is shorthand for `Annotated[int, annotated_types.Gt(0)]`. 5. The input here is an [ISO 8601](https://en.wikipedia.org/wiki/ISO_8601) formatted datetime, but Pydantic will convert it to a [`datetime`][datetime.datetime] object. 6. The key here is `bytes`, but Pydantic will take care of coercing it to a string. 7. Similarly, Pydantic will coerce the string `'1'` to the integer `1`. 8. 
We create an instance of `User` by passing our external data to `User` as keyword arguments.
9. We can access fields as attributes of the model.
10. We can convert the model to a dictionary with [`model_dump()`][pydantic.BaseModel.model_dump].

If validation fails, Pydantic will raise an error with a breakdown of what was wrong:

```python {upgrade="skip" title="Validation Error" test="skip" lint="skip"}
# continuing the above example...

from datetime import datetime
from pydantic import BaseModel, PositiveInt, ValidationError


class User(BaseModel):
    id: int
    name: str = 'John Doe'
    signup_ts: datetime | None
    tastes: dict[str, PositiveInt]


external_data = {'id': 'not an int', 'tastes': {}}  # (1)!

try:
    User(**external_data)  # (2)!
except ValidationError as e:
    print(e.errors())
    """
    [
        {
            'type': 'int_parsing',
            'loc': ('id',),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'not an int',
            'url': 'https://errors.pydantic.dev/2/v/int_parsing',
        },
        {
            'type': 'missing',
            'loc': ('signup_ts',),
            'msg': 'Field required',
            'input': {'id': 'not an int', 'tastes': {}},
            'url': 'https://errors.pydantic.dev/2/v/missing',
        },
    ]
    """
```

1. The input data is wrong here — `id` is not a valid integer, and `signup_ts` is missing.
2. Trying to instantiate `User` will raise a [`ValidationError`][pydantic_core.ValidationError] with a list of errors.

## Who is using Pydantic?

Hundreds of organisations and packages are using Pydantic. Some of the prominent companies and organizations around the world who are using Pydantic include:

{{ organisations }}

For a more comprehensive list of open-source projects using Pydantic see the [list of dependents on github](https://github.com/pydantic/pydantic/network/dependents), or you can find some awesome projects using Pydantic in [awesome-pydantic](https://github.com/Kludex/awesome-pydantic).
Installation is as simple as:

=== "pip"

    ```bash
    pip install pydantic
    ```

=== "uv"

    ```bash
    uv add pydantic
    ```

Pydantic has a few dependencies:

* [`pydantic-core`](https://pypi.org/project/pydantic-core/): Core validation logic for Pydantic written in Rust.
* [`typing-extensions`](https://pypi.org/project/typing-extensions/): Backport of the standard library [typing][] module.
* [`annotated-types`](https://pypi.org/project/annotated-types/): Reusable constraint types to use with [`typing.Annotated`][].

If you've got Python 3.8+ and `pip` installed, you're good to go.

Pydantic is also available on [conda](https://www.anaconda.com) under the [conda-forge](https://conda-forge.org) channel:

```bash
conda install pydantic -c conda-forge
```

## Optional dependencies

Pydantic has the following optional dependencies:

* `email`: Email validation provided by the [email-validator](https://pypi.org/project/email-validator/) package.
* `timezone`: Fallback IANA time zone database provided by the [tzdata](https://pypi.org/project/tzdata/) package.

To install optional dependencies along with Pydantic:

=== "pip"

    ```bash
    # with the `email` extra:
    pip install 'pydantic[email]'
    # or with `email` and `timezone` extras:
    pip install 'pydantic[email,timezone]'
    ```

=== "uv"

    ```bash
    # with the `email` extra:
    uv add 'pydantic[email]'
    # or with `email` and `timezone` extras:
    uv add 'pydantic[email,timezone]'
    ```

Of course, you can also install requirements manually with `pip install email-validator tzdata`.
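After installing, a quick sanity check (a minimal sketch; it only assumes Pydantic itself is importable) confirms which version the active interpreter resolves:

```python
# Post-install sanity check: confirm this interpreter can import pydantic
# and report which version was installed.
from importlib.metadata import version

import pydantic

print(pydantic.VERSION)     # version reported by the imported package
print(version('pydantic'))  # version reported by the package metadata
```

If the two disagree, you are probably importing Pydantic from a different environment than the one you installed into.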
## Install from repository

And if you prefer to install Pydantic directly from the repository:

=== "pip"

    ```bash
    pip install 'git+https://github.com/pydantic/pydantic@main'
    # or with `email` and `timezone` extras:
    pip install 'git+https://github.com/pydantic/pydantic@main#egg=pydantic[email,timezone]'
    ```

=== "uv"

    ```bash
    uv add 'git+https://github.com/pydantic/pydantic@main'
    # or with `email` and `timezone` extras:
    uv add 'git+https://github.com/pydantic/pydantic@main#egg=pydantic[email,timezone]'
    ```

`pydantic` integrates well with AWS Lambda functions. In this guide, we'll discuss how to set up `pydantic` for an AWS Lambda function.

## Installing Python libraries for AWS Lambda functions

There are many ways to utilize Python libraries in AWS Lambda functions. As outlined in the [AWS Lambda documentation](https://docs.aws.amazon.com/lambda/latest/dg/lambda-python.html), the most common approaches include:

* Using a [`.zip` file archive](https://docs.aws.amazon.com/lambda/latest/dg/python-package.html) to package your code and dependencies
* Using [AWS Lambda Layers](https://docs.aws.amazon.com/lambda/latest/dg/python-layers.html) to share libraries across multiple functions
* Using a [container image](https://docs.aws.amazon.com/lambda/latest/dg/python-image.html) to package your code and dependencies

All of these approaches can be used with `pydantic`. The best approach for you will depend on your specific requirements and constraints. We'll cover the first two cases more in-depth here, as dependency management with a container image is more straightforward.
If you're using a container image, you might find [this comment](https://github.com/pydantic/pydantic/issues/6557#issuecomment-1699456562) helpful for installing `pydantic`.

!!! tip
    If you use `pydantic` across multiple functions, you may want to consider AWS Lambda Layers, which support seamless sharing of libraries across multiple functions.

Regardless of the dependency management approach you choose, it's beneficial to adhere to these guidelines to ensure a smooth dependency management process.

## Installing `pydantic` for AWS Lambda functions

When you're building your `.zip` file archive with your code and dependencies or organizing your `.zip` file for a Lambda Layer, you'll likely use a local virtual environment to install and manage your dependencies. This can be a bit tricky if you're using `pip` because `pip` installs wheels compiled for your local platform, which may not be compatible with the Lambda environment.

Thus, we suggest you use a command similar to the following:

```bash
pip install \
    --platform manylinux2014_x86_64 \  # (1)!
    --target= \  # (2)!
    --implementation cp \  # (3)!
    --python-version 3.10 \  # (4)!
    --only-binary=:all: \  # (5)!
    --upgrade pydantic  # (6)!
```

1. Use the platform corresponding to your Lambda runtime.
2. Specify the directory where you want to install the package (often `python` for Lambda Layers).
3. Use the CPython implementation.
4. The Python version must be compatible with the Lambda runtime.
5. This flag ensures that the package is installed from pre-built binary wheels.
6. The latest version of `pydantic` will be installed.

## Troubleshooting

### `no module named 'pydantic_core._pydantic_core'`

The `no module named 'pydantic_core._pydantic_core'` error is a common issue that indicates you have installed `pydantic` incorrectly. To debug this issue, you can try the following steps (before the failing import):

1. Check the contents of the installed `pydantic-core` package.
    Are the compiled library and its type stubs both present?

    ```python {test="skip" lint="skip"}
    from importlib.metadata import files
    print([file for file in files('pydantic-core') if file.name.startswith('_pydantic_core')])
    """
    [PackagePath('pydantic_core/_pydantic_core.pyi'), PackagePath('pydantic_core/_pydantic_core.cpython-312-x86_64-linux-gnu.so')]
    """
    ```

    You should expect to see two files like those printed above. The compiled library file will be a `.so` or `.pyd` file with a name that varies according to the OS and Python version.

2. Check that your lambda's Python version is compatible with the compiled library version found above.

    ```python {test="skip" lint="skip"}
    import sysconfig
    print(sysconfig.get_config_var("EXT_SUFFIX"))
    #> '.cpython-312-x86_64-linux-gnu.so'
    ```

    You should expect to see the same suffix here as the compiled library, for example here we see this suffix `.cpython-312-x86_64-linux-gnu.so` indeed matches `_pydantic_core.cpython-312-x86_64-linux-gnu.so`.

If these two checks do not match, your build steps have not installed the correct native code for your lambda's target platform. You should adjust your build steps so that the correct build of the library gets installed.

Most likely errors:

* Your OS or CPU architecture is mismatched (e.g. darwin vs x86_64-linux-gnu). Try passing the correct `--platform` argument to `pip install` when installing your lambda dependencies, or build inside a Linux Docker container for the correct platform. Possible platforms at the moment include `--platform manylinux2014_x86_64` or `--platform manylinux2014_aarch64`, but these may change with a future Pydantic major release.
* Your Python version is mismatched (e.g. `cpython-310` vs `cpython-312`). Try passing the correct `--python-version` argument to `pip install`, or otherwise change the Python version used on your build.
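The two checks above can be combined into a single script. This is a sketch using only the standard library, intended to be run inside your build environment or Lambda container:

```python
# Compare the extension suffix this interpreter expects with the compiled
# pydantic-core library that is actually installed.
import sysconfig
from importlib.metadata import PackageNotFoundError, files

expected_suffix = sysconfig.get_config_var('EXT_SUFFIX')

try:
    package_files = files('pydantic-core') or []
except PackageNotFoundError:
    package_files = []

compiled = [
    f.name
    for f in package_files
    if f.name.startswith('_pydantic_core') and not f.name.endswith('.pyi')
]

print('expected suffix:  ', expected_suffix)
print('installed library:', compiled)
if not any(name.endswith(expected_suffix) for name in compiled):
    print('MISMATCH: the installed build does not match this runtime')
```

A `MISMATCH` line printed inside the Lambda container means the wrong wheel was installed for the target platform or Python version.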
### No package metadata was found for `email-validator`

Pydantic uses `version` from `importlib.metadata` to [check what version](https://github.com/pydantic/pydantic/pull/6033) of `email-validator` is installed. This package versioning mechanism is somewhat incompatible with AWS Lambda, even though it's the industry standard for versioning packages in Python. There are a few ways to fix this issue:

If you're deploying your lambda with the serverless framework, it's likely that the appropriate metadata for the `email-validator` package is not being included in your deployment package. Tools like [`serverless-python-requirements`](https://github.com/serverless/serverless-python-requirements/tree/master) remove metadata to reduce package size. You can fix this issue by setting the `slim` setting to false in your `serverless.yml` file:

```yaml
pythonRequirements:
    dockerizePip: non-linux
    slim: false
    fileName: requirements.txt
```

You can read more about this fix, and other `slim` settings that might be relevant, [here](https://biercoff.com/how-to-fix-package-not-found-error-importlib-metadata/).

If you're using a `.zip` archive for your code and/or dependencies, make sure that your package contains the required version metadata. To do this, make sure you include the `dist-info` directory in your `.zip` archive for the `email-validator` package.

This issue has been reported for other popular Python libraries like [`jsonschema`](https://github.com/python-jsonschema/jsonschema/issues/584), so you can read more about the issue and potential fixes there as well.

## Extra Resources

### More Debugging Tips

If you're still struggling with installing `pydantic` for your AWS Lambda, you might consult with [this issue](https://github.com/pydantic/pydantic/issues/6557), which covers a variety of problems and solutions encountered by other developers.
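You can verify whether usable version metadata is present in your deployment package with a check like the following (a sketch using only the standard library; `version` is the same lookup Pydantic performs):

```python
# Check whether importlib.metadata can resolve a package's version; this is
# the lookup that fails with "No package metadata was found".
from importlib.metadata import PackageNotFoundError, version


def metadata_version(package: str):
    """Return the installed version string, or None if metadata is missing."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None


print(metadata_version('email-validator'))  # None means the dist-info directory is missing
```

Running this inside your packaged Lambda environment tells you directly whether the `dist-info` directory survived the packaging step.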
### Validating `event` and `context` data

Check out our [blog post](https://pydantic.dev/articles/lambda-intro) to learn more about how to use `pydantic` to validate `event` and `context` data in AWS Lambda functions.

# Code Generation with datamodel-code-generator

The [datamodel-code-generator](https://github.com/koxudaxi/datamodel-code-generator/) project is a library and command-line utility to generate pydantic models from just about any data source, including:

* OpenAPI 3 (YAML/JSON)
* JSON Schema
* JSON/YAML/CSV Data (which will be converted to JSON Schema)
* Python dictionary (which will be converted to JSON Schema)
* GraphQL schema

Whenever you find yourself with any data convertible to JSON but without pydantic models, this tool will allow you to generate type-safe model hierarchies on demand.

## Installation

```bash
pip install datamodel-code-generator
```

## Example

In this case, datamodel-code-generator creates pydantic models from a JSON Schema file.

```bash
datamodel-codegen --input person.json --input-file-type jsonschema --output model.py
```

person.json:

```json
{
    "$id": "person.json",
    "$schema": "http://json-schema.org/draft-07/schema#",
    "title": "Person",
    "type": "object",
    "properties": {
        "first_name": {
            "type": "string",
            "description": "The person's first name."
        },
        "last_name": {
            "type": "string",
            "description": "The person's last name."
        },
        "age": {
            "description": "Age in years.",
            "type": "integer",
            "minimum": 0
        },
        "pets": {
            "type": "array",
            "items": [
                {
                    "$ref": "#/definitions/Pet"
                }
            ]
        },
        "comment": {
            "type": "null"
        }
    },
    "required": [
        "first_name",
        "last_name"
    ],
    "definitions": {
        "Pet": {
            "properties": {
                "name": {
                    "type": "string"
                },
                "age": {
                    "type": "integer"
                }
            }
        }
    }
}
```

model.py:

```python {upgrade="skip" requires="3.10"}
# generated by datamodel-codegen:
#   filename: person.json
#   timestamp: 2020-05-19T15:07:31+00:00
from __future__ import annotations

from typing import Any

from pydantic import BaseModel, Field, conint


class Pet(BaseModel):
    name: str | None = None
    age: int | None = None


class Person(BaseModel):
    first_name: str = Field(description="The person's first name.")
    last_name: str = Field(description="The person's last name.")
    age: conint(ge=0) | None = Field(None, description='Age in years.')
    pets: list[Pet] | None = None
    comment: Any | None = None
```

More information can be found on the [official documentation](https://koxudaxi.github.io/datamodel-code-generator/).

!!! note
    **Admission:** I (the primary developer of Pydantic) also develop python-devtools.

[python-devtools](https://python-devtools.helpmanual.io/) (`pip install devtools`) provides a number of tools which are useful during Python development, including `debug()`, an alternative to `print()` which formats output in a way which should be easier to read than `print`, as well as giving information about which file/line the print statement is on and what value was printed.

Pydantic integrates with *devtools* by implementing the `__pretty__` method on most public classes.
In particular `debug()` is useful when inspecting models:

```python {test="no-print-intercept"}
from datetime import datetime
from typing import List

from devtools import debug

from pydantic import BaseModel


class Address(BaseModel):
    street: str
    country: str
    lat: float
    lng: float


class User(BaseModel):
    id: int
    name: str
    signup_ts: datetime
    friends: List[int]
    address: Address


user = User(
    id='123',
    name='John Doe',
    signup_ts='2019-06-01 12:22',
    friends=[1234, 4567, 7890],
    address=dict(street='Testing', country='uk', lat=51.5, lng=0),
)
debug(user)
print('\nshould be much easier read than:\n')
print('user:', user)
```

Will output in your terminal:

{{ devtools_example }}

!!! note
    `python-devtools` doesn't yet support Python 3.13.

[Hypothesis](https://hypothesis.readthedocs.io/) is the Python library for [property-based testing](https://increment.com/testing/in-praise-of-property-based-testing/).

Hypothesis can infer how to construct type-annotated classes, and supports builtin types, many standard library types, and generic types from the [`typing`](https://docs.python.org/3/library/typing.html) and [`typing_extensions`](https://pypi.org/project/typing-extensions/) modules by default.

Pydantic v2.0 drops built-in support for Hypothesis and no longer ships with the integrated Hypothesis plugin.

!!! warning
    We are temporarily removing the Hypothesis plugin in favor of studying a different mechanism. For more information, see the issue [annotated-types/annotated-types#37](https://github.com/annotated-types/annotated-types/issues/37).

    The Hypothesis plugin may be back in a future release. Subscribe to [pydantic/pydantic#4682](https://github.com/pydantic/pydantic/issues/4682) for updates.
## Flake8 plugin

If using Flake8 in your project, a [plugin](https://pypi.org/project/flake8-pydantic/) is available and can be installed using the following:

```bash
pip install flake8-pydantic
```

The lint errors provided by this plugin are namespaced under the `PYDXXX` code. To ignore some unwanted rules, the Flake8 configuration can be adapted:

```ini
[flake8]
extend-ignore = PYD001,PYD002
```

Pydantic integrates seamlessly with **Pydantic Logfire**, an observability platform built by us on the same belief as our open source library — that the most powerful tools can be easy to use.

## Getting Started

Logfire has an out-of-the-box Pydantic integration that lets you understand the data passing through your Pydantic models and get analytics on validations. For existing Pydantic users, it delivers unparalleled insights into your usage of Pydantic models.

[Getting started](https://logfire.pydantic.dev/docs/) with Logfire can be done in three simple steps:

1. Set up your Logfire account.
2. Install the Logfire SDK.
3. Instrument your project.

### Basic Usage

Once you've got Logfire set up, you can start using it to monitor your Pydantic models and get insights into your data validation:

```python {test="skip"}
from datetime import date

import logfire

from pydantic import BaseModel

logfire.configure()  # (1)!


class User(BaseModel):
    name: str
    country_code: str
    dob: date


user = User(name='Anne', country_code='USA', dob='2000-01-01')
logfire.info('user processed: {user!r}', user=user)  # (2)!
```

1. The `logfire.configure()` call is all you need to instrument your project with Logfire.
2. The `logfire.info()` call logs the `user` object to Logfire, with builtin support for Pydantic models.
![basic pydantic logfire usage](../img/basic_logfire.png)

### Pydantic Instrumentation

You can even record information about the validation process automatically by using the builtin [Pydantic integration](https://logfire.pydantic.dev/docs/why-logfire/pydantic/):

```python {test="skip"}
from datetime import date

import logfire

from pydantic import BaseModel

logfire.configure()
logfire.instrument_pydantic()  # (1)!


class User(BaseModel):
    name: str
    country_code: str
    dob: date


User(name='Anne', country_code='USA', dob='2000-01-01')
User(name='David', country_code='GBR', dob='invalid-dob')
```

1. The `logfire.instrument_pydantic()` call automatically logs validation information for all Pydantic models in your project.

You'll see each successful and failed validation logged in Logfire:

![logfire instrumentation](../img/logfire_instrument.png)

And you can investigate each of the corresponding spans to get validation details:

![logfire span details](../img/logfire_span.png)

Pydantic works well with [mypy](http://mypy-lang.org) right out of the box.

However, Pydantic also ships with a mypy plugin that adds a number of important Pydantic-specific features that improve its ability to type-check your code.

For example, consider the following script:

```python {test="skip" linenums="1"}
from datetime import datetime
from typing import List, Optional

from pydantic import BaseModel


class Model(BaseModel):
    age: int
    first_name = 'John'
    last_name: Optional[str] = None
    signup_ts: Optional[datetime] = None
    list_of_ints: List[int]


m = Model(age=42, list_of_ints=[1, '2', b'3'])
print(m.middle_name)  # not a model field!
Model() # will raise a validation error for age and list_of_ints ``` Without any special configuration, mypy does not catch the [missing model field annotation](../errors/usage_errors.md#model-field-missing-annotation) and errors about the `list_of_ints` argument which Pydantic parses correctly: ``` 15: error: List item 1 has incompatible type "str"; expected "int" [list-item] 15: error: List item 2 has incompatible type "bytes"; expected "int" [list-item] 16: error: "Model" has no attribute "middle_name" [attr-defined] 17: error: Missing named argument "age" for "Model" [call-arg] 17: error: Missing named argument "list_of_ints" for "Model" [call-arg] ``` But [with the plugin enabled](#enabling-the-plugin), it gives the correct errors: ``` 9: error: Untyped fields disallowed [pydantic-field] 16: error: "Model" has no attribute "middle_name" [attr-defined] 17: error: Missing named argument "age" for "Model" [call-arg] 17: error: Missing named argument "list_of_ints" for "Model" [call-arg] ``` With the pydantic mypy plugin, you can fearlessly refactor your models knowing mypy will catch any mistakes if your field names or types change. Note that mypy already supports some features without using the Pydantic plugin, such as synthesizing a `__init__` method for Pydantic models and dataclasses. See the [mypy plugin capabilities](#mypy-plugin-capabilities) for a list of additional features. ## Enabling the Plugin To enable the plugin, just add `pydantic.mypy` to the list of plugins in your [mypy config file](https://mypy.readthedocs.io/en/latest/config_file.html): === "`mypy.ini`" ```ini [mypy] plugins = pydantic.mypy ``` === "`pyproject.toml`" ```toml [tool.mypy] plugins = ['pydantic.mypy'] ``` !!! note If you're using `pydantic.v1` models, you'll need to add `pydantic.v1.mypy` to your list of plugins. See the [plugin configuration](#configuring-the-plugin) for more details. ## Supported mypy versions Pydantic supports the mypy versions released less than 6 months ago. 
Older versions may still work with the plugin but won't be tested. The list of released mypy versions can be found [here](https://mypy-lang.org/news.html). Note that the version support policy is subject to change at discretion of contributors. ## Mypy plugin capabilities ### Generate a `__init__` signature for Pydantic models * Any required fields that don't have dynamically-determined aliases will be included as required keyword arguments. * If the [`populate_by_name`][pydantic.ConfigDict.populate_by_name] model configuration value is set to `True`, the generated signature will use the field names rather than aliases. * The [`init_forbid_extra`](#init_forbid_extra) and [`init_typed`](#init_typed) plugin configuration values can further fine-tune the synthesized `__init__` method. ### Generate a typed signature for `model_construct` * The [`model_construct`][pydantic.BaseModel.model_construct] method is an alternative to model validation when input data is known to be valid and should not be parsed (see the [documentation](../concepts/models.md#creating-models-without-validation)). Because this method performs no runtime validation, static checking is important to detect errors. ### Support for frozen models * If the [`frozen`][pydantic.ConfigDict.frozen] configuration is set to `True`, you will get an error if you try mutating a model field (see [faux immutability](../concepts/models.md#faux-immutability)) ### Respect the type of the `Field`'s `default` and `default_factory` * Field with both a `default` and a `default_factory` will result in an error during static checking. * The type of the `default` and `default_factory` value must be compatible with the one of the field. ### Warn about the use of untyped fields * While defining a field without an annotation will result in a [runtime error](../errors/usage_errors.md#model-field-missing-annotation), the plugin will also emit a type checking error. 
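Several of these capabilities mirror Pydantic's runtime behavior. For example, the frozen-model support corresponds to the runtime error you would get without static checking; a minimal sketch:

```python
# Runtime counterpart of the plugin's frozen-model check: assigning to a
# field of a frozen model raises a ValidationError.
from pydantic import BaseModel, ConfigDict, ValidationError


class Point(BaseModel):
    model_config = ConfigDict(frozen=True)

    x: int
    y: int


p = Point(x=1, y=2)
try:
    p.x = 10  # with the plugin, mypy flags this assignment statically
except ValidationError as e:
    error_type = e.errors()[0]['type']

print(error_type)
#> frozen_instance
```

With the plugin enabled, mypy reports the assignment before the code ever runs, which is the point of the frozen-model support.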
### Prevent the use of required dynamic aliases See the documentation of the [`warn_required_dynamic_aliases`](#warn_required_dynamic_aliases) plugin configuration value. ## Configuring the Plugin To change the values of the plugin settings, create a section in your mypy config file called `[pydantic-mypy]`, and add any key-value pairs for settings you want to override. A configuration file with all plugin strictness flags enabled (and some other mypy strictness flags, too) might look like: === "`mypy.ini`" ```ini [mypy] plugins = pydantic.mypy follow_imports = silent warn_redundant_casts = True warn_unused_ignores = True disallow_any_generics = True no_implicit_reexport = True disallow_untyped_defs = True [pydantic-mypy] init_forbid_extra = True init_typed = True warn_required_dynamic_aliases = True ``` === "`pyproject.toml`" ```toml [tool.mypy] plugins = ["pydantic.mypy"] follow_imports = "silent" warn_redundant_casts = true warn_unused_ignores = true disallow_any_generics = true no_implicit_reexport = true disallow_untyped_defs = true [tool.pydantic-mypy] init_forbid_extra = true init_typed = true warn_required_dynamic_aliases = true ``` ### `init_typed` Because Pydantic performs [data conversion](../concepts/models.md#data-conversion) by default, the following is still valid at runtime: ```python {test="skip" lint="skip"} class Model(BaseModel): a: int Model(a='1') ``` For this reason, the plugin will use [`Any`][typing.Any] for field annotations when synthesizing the `__init__` method, unless `init_typed` is set or [strict mode](../concepts/strict_mode.md) is enabled on the model. 
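The conversion behavior that motivates this default can be seen at runtime. Here is a minimal sketch contrasting the default (lax) conversion with a model that enables strict mode via `ConfigDict`:

```python
# Lax mode coerces the string '1' to the int 1; strict mode rejects it. This
# is why the plugin only uses precise __init__ annotations when init_typed
# or strict mode is enabled.
from pydantic import BaseModel, ConfigDict, ValidationError


class LaxModel(BaseModel):
    a: int


class StrictModel(BaseModel):
    model_config = ConfigDict(strict=True)

    a: int


print(LaxModel(a='1').a)  # the string is converted
#> 1

try:
    StrictModel(a='1')
except ValidationError as e:
    strict_error = e.errors()[0]['type']

print(strict_error)
#> int_type
```

Enabling `init_typed` makes mypy reject `LaxModel(a='1')` statically even though it would succeed at runtime.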
### `init_forbid_extra`

By default, Pydantic allows (and ignores) any extra provided argument:

```python {test="skip" lint="skip"}
class Model(BaseModel):
    a: int = 1


Model(unrelated=2)
```

For this reason, the plugin will add an extra `**kwargs: Any` parameter when synthesizing the `__init__` method, unless `init_forbid_extra` is set or the [`extra`][pydantic.ConfigDict.extra] is set to `'forbid'`.

### `warn_required_dynamic_aliases`

Whether to error when using a dynamically-determined alias or alias generator on a model with [`populate_by_name`][pydantic.ConfigDict.populate_by_name] set to `False`. If such aliases are present, mypy cannot properly type check calls to `__init__`. In this case, it will default to treating all arguments as not required.

!!! note "Compatibility with `Any` being disallowed"
    Some mypy configuration options (such as [`disallow_any_explicit`](https://mypy.readthedocs.io/en/stable/config_file.html#confval-disallow_any_explicit)) will error because the synthesized `__init__` method contains [`Any`][typing.Any] annotations. To circumvent the issue, you will have to enable both `init_forbid_extra` and `init_typed`.

While pydantic will work well with any IDE out of the box, a [PyCharm plugin](https://plugins.jetbrains.com/plugin/12861-pydantic) offering improved pydantic integration is available on the JetBrains Plugins Repository for PyCharm.

You can install the plugin for free from the plugin marketplace (PyCharm's Preferences -> Plugin -> Marketplace -> search "pydantic").
The plugin currently supports the following features:

* For `pydantic.BaseModel.__init__`:
    * Inspection
    * Autocompletion
    * Type-checking
* For fields of `pydantic.BaseModel`:
    * Refactor-renaming fields updates `__init__` calls, and affects sub- and super-classes
    * Refactor-renaming `__init__` keyword arguments updates field names, and affects sub- and super-classes

More information can be found on the [official plugin page](https://plugins.jetbrains.com/plugin/12861-pydantic) and [Github repository](https://github.com/koxudaxi/pydantic-pycharm-plugin).

Pydantic models may be printed with the [Rich](https://github.com/willmcgugan/rich) library which will add additional formatting and color to the output.

Here's an example:

![Printing Pydantic models with Rich](../img/rich_pydantic.png)

See the Rich documentation on [pretty printing](https://rich.readthedocs.io/en/latest/pretty.html) for more information.

Pydantic works well with any editor or IDE out of the box because it's made on top of standard Python type annotations.

When using [Visual Studio Code (VS Code)](https://code.visualstudio.com/), there are some **additional editor features** supported, comparable to the ones provided by the [PyCharm plugin](../integrations/pycharm.md).

This means that you will have **autocompletion** (or "IntelliSense") and **error checks** for types and required arguments even while creating new Pydantic model instances.

![pydantic autocompletion in VS Code](../img/vs_code_01.png)

## Configure VS Code

To take advantage of these features, you need to make sure you configure VS Code correctly, using the recommended settings.

In case you have a different configuration, here's a short overview of the steps.
### Install Pylance

You should use the [Pylance](https://marketplace.visualstudio.com/items?itemName=ms-python.vscode-pylance) extension for VS Code. It is the recommended, next-generation, official VS Code plug-in for Python.

Pylance is installed as part of the [Python Extension for VS Code](https://marketplace.visualstudio.com/items?itemName=ms-python.python) by default, so it should probably just work. Otherwise, you can double check it's installed and enabled in your editor.

### Configure your environment

Then you need to make sure your editor knows the [Python environment](https://code.visualstudio.com/docs/python/python-tutorial#_install-and-use-packages) (probably a virtual environment) for your Python project.

This would be the environment in which you installed Pydantic.

### Configure Pylance

With the default configurations, you will get support for autocompletion, but Pylance might not check for type errors.

You can enable type error checks from Pylance with these steps:

* Open the "User Settings"
* Search for `Type Checking Mode`
* You will find an option under `Python › Analysis: Type Checking Mode`
* Set it to `basic` or `strict` (by default it's `off`)

![Type Checking Mode set to strict in VS Code](../img/vs_code_02.png)

Now you will not only get autocompletion when creating new Pydantic model instances but also error checks for **required arguments**.

![Required arguments error checks in VS Code](../img/vs_code_03.png)

And you will also get error checks for **invalid data types**.

![Invalid data types error checks in VS Code](../img/vs_code_04.png)

!!! note "Technical Details"
    Pylance is the VS Code extension, it's closed source, but free to use. Underneath, Pylance uses an open source tool (also from Microsoft) called [Pyright](https://github.com/microsoft/pyright) that does all the heavy lifting.
    You can read more about it in the [Pylance Frequently Asked Questions](https://github.com/microsoft/pylance-release/blob/main/FAQ.md#what-is-the-relationship-between-pylance-pyright-and-the-python-extension).

### Configure mypy

You might also want to configure mypy in VS Code to get mypy error checks inline in your editor (as an alternative or in addition to Pylance). This would include the errors detected by the [Pydantic mypy plugin](../integrations/mypy.md), if you configured it.

To enable mypy in VS Code, do the following:

* Open the "User Settings"
* Search for `Mypy Enabled`
* You will find an option under `Python › Linting: Mypy Enabled`
* Check the box (by default it's unchecked)

![mypy enabled in VS Code](../img/vs_code_05.png)

## Tips and tricks

Here are some additional tips and tricks to improve your developer experience when using VS Code with Pydantic.

### Strict errors

The way this additional editor support works is that Pylance will treat your Pydantic models as if they were Python's pure `dataclasses`.

And it will show **strict type error checks** about the data types passed in arguments when creating a new Pydantic model instance.

In this example you can see that it shows that a `str` of `'23'` is not a valid `int` for the argument `age`.

![VS Code strict type errors](../img/vs_code_06.png)

It would expect `age=23` instead of `age='23'`.

Nevertheless, the design, and one of the main features of Pydantic, is that it is very **lenient with data types**.

It will actually accept the `str` with value `'23'` and will convert it to an `int` with value `23`.

These strict error checks are **very useful** most of the time and can help you **detect many bugs early**. But there are cases, like with `age='23'`, where they could be inconvenient by reporting a "false positive" error.

---

This example above with `age='23'` is intentionally simple, to show the error and the differences in types.
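As a small runnable sketch of that leniency: the editor flags the call, but at runtime validation coerces the value anyway (the `# type: ignore` comment is only there to silence the type checker):

```python
from pydantic import BaseModel


class Knight(BaseModel):
    title: str
    age: int
    color: str = 'blue'


# Pylance/mypy report a type error here, yet validation succeeds
# and coerces the string to an integer:
lancelot = Knight(title='Sir Lancelot', age='23')  # type: ignore
print(type(lancelot.age), lancelot.age)
#> <class 'int'> 23
```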
But more common cases where these strict errors would be inconvenient would be when using more sophisticated data types, like `int` values for `datetime` fields, or `dict` values for Pydantic sub-models.

For example, this is valid for Pydantic:

```python {hl_lines="12 17"}
from pydantic import BaseModel


class Knight(BaseModel):
    title: str
    age: int
    color: str = 'blue'


class Quest(BaseModel):
    title: str
    knight: Knight


quest = Quest(
    title='To seek the Holy Grail', knight={'title': 'Sir Lancelot', 'age': 23}
)
```

The type of the field `knight` is declared with the class `Knight` (a Pydantic model) and the code is passing a literal `dict` instead. This is still valid for Pydantic, and the `dict` would be automatically converted to a `Knight` instance.

Nevertheless, it would be detected as a type error:

![VS Code strict type errors with model](../img/vs_code_07.png)

In those cases, there are several ways to disable or ignore strict errors in very specific places, while still preserving them in the rest of the code.

Below are several techniques to achieve it.

#### Disable type checks in a line

You can disable the errors for a specific line using a comment of:

```python
# type: ignore
```

or (to be specific to pylance/pyright):

```python
# pyright: ignore
```

([pyright](https://github.com/microsoft/pyright) is the language server used by Pylance).

Coming back to the example with `age='23'`, it would be:

```python {hl_lines="10"}
from pydantic import BaseModel


class Knight(BaseModel):
    title: str
    age: int
    color: str = 'blue'


lancelot = Knight(title='Sir Lancelot', age='23')  # pyright: ignore
```

That way Pylance and mypy will ignore errors in that line.

**Pros**: it's a simple change in that line to remove errors there.

**Cons**: any other error in that line will also be omitted, including type checks, misspelled arguments, required arguments not provided, etc.
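To illustrate that downside, here is a sketch (the misspelled `titel` keyword is a deliberately introduced mistake, not from the docs above): with the ignore comment on the line, pyright no longer reports the typo either, and only runtime validation catches the resulting missing field:

```python
from pydantic import BaseModel, ValidationError


class Knight(BaseModel):
    title: str
    age: int


try:
    # `titel` is misspelled, but the ignore comment hides it from pyright.
    # By default, Pydantic ignores the unknown extra argument, so at
    # runtime the only error reported is the missing `title` field.
    Knight(titel='Sir Lancelot', age='23')  # pyright: ignore
except ValidationError as exc:
    error_count = exc.error_count()
    print(error_count)
    #> 1
```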
#### Override the type of a variable

You can also create a variable with the value you want to use and declare its type explicitly with `Any`.

```python {hl_lines="1 11-12"}
from typing import Any

from pydantic import BaseModel


class Knight(BaseModel):
    title: str
    age: int
    color: str = 'blue'


age_str: Any = '23'
lancelot = Knight(title='Sir Lancelot', age=age_str)
```

That way Pylance and mypy will interpret the variable `age_str` as if they didn't know its type, instead of knowing it has a type of `str` when an `int` was expected (and then showing the corresponding error).

**Pros**: errors will be ignored only for a specific value, and you will still see any additional errors for the other arguments.

**Cons**: it requires importing `Any` and a new variable in a new line for each argument that needs ignoring errors.

#### Override the type of a value with `cast`

The same idea from the previous example can be put on the same line with the help of `cast()`.

This way, the type declaration of the value is overridden inline, without requiring another variable.

```python {hl_lines="1 11"}
from typing import Any, cast

from pydantic import BaseModel


class Knight(BaseModel):
    title: str
    age: int
    color: str = 'blue'


lancelot = Knight(title='Sir Lancelot', age=cast(Any, '23'))
```

`cast(Any, '23')` doesn't affect the value, it's still just `'23'`, but now Pylance and mypy will assume it is of type `Any`, which means they will act as if they didn't know the type of the value.

So, this is the equivalent of the previous example, without the additional variable.

**Pros**: errors will be ignored only for a specific value, and you will still see any additional errors for the other arguments. There's no need for additional variables.

**Cons**: it requires importing `Any` and `cast`, and if you are not used to using `cast()`, it could seem strange at first.

### Config in class arguments

Pydantic has a rich set of [Model Configurations][pydantic.config.ConfigDict] available.
These configurations can be set via the `model_config` attribute on each model:

```python {hl_lines="9-10"}
from pydantic import BaseModel


class Knight(BaseModel):
    model_config = dict(frozen=True)
    title: str
    age: int
    color: str = 'blue'
```

or passed as keyword arguments when defining the model class:

```python {hl_lines="4"}
from pydantic import BaseModel


class Knight(BaseModel, frozen=True):
    title: str
    age: int
    color: str = 'blue'
```

The specific configuration **`frozen`** (in beta) has a special meaning.

It prevents other code from changing a model instance once it's created, keeping it **"frozen"**.

When using the second version to declare `frozen=True` (with **keyword arguments** in the class definition), Pylance can use it to help you check in your code and **detect errors** when something is trying to set values in a model that is "frozen".

![VS Code strict type errors with model](../img/vs_code_08.png)

## Adding a default with `Field`

Pylance/pyright requires `default` to be a keyword argument to `Field` in order to infer that the field is optional.

```python
from pydantic import BaseModel, Field


class Knight(BaseModel):
    title: str = Field(default='Sir Lancelot')  # this is okay
    age: int = Field(
        23
    )  # this works fine at runtime but will cause an error for pyright


lance = Knight()  # error: Argument missing for parameter "age"
```

This is a limitation of dataclass transforms and cannot be fixed in pydantic.

## Technical Details

!!! warning
    As a Pydantic user, you don't need the details below. Feel free to skip the rest of this section.

    These details are only useful for other library authors, etc.

This additional editor support works by implementing the proposed draft standard for [Dataclass Transform (PEP 681)](https://peps.python.org/pep-0681/).

The proposed draft standard is written by Eric Traut, from the Microsoft team, the same author of the open source package Pyright (used by Pylance to provide Python support in VS Code).
The intention of the standard is to provide a way for libraries like Pydantic and others to tell editors and tools that they (the editors) should treat these libraries (e.g. Pydantic) as if they were `dataclasses`, providing autocompletion, type checks, etc.

The draft standard also includes an [Alternate Form](https://github.com/microsoft/pyright/blob/master/specs/dataclass_transforms.md#alternate-form) for early adopters, like Pydantic, to add support for it right away, even before the new draft standard is finished and approved.

This new draft standard, with the Alternate Form, is already supported by Pyright, so it can be used via Pylance in VS Code.

As it is being proposed as an official standard for Python, other editors can also easily add support for it.

And authors of other libraries similar to Pydantic can also easily adopt the standard right away (using the "Alternate Form") and get the benefits of these additional editor features.

---

<!-- pydantic-2.10.6/docs/internals/architecture.md -->

!!! note
    This section is part of the *internals* documentation, and is partly targeted to contributors.

Starting with Pydantic V2, part of the codebase is written in Rust in a separate package called `pydantic-core`. This was done partly in order to improve validation and serialization performance (with the cost of limited customization and extendibility of the internal logic).

This architecture documentation will first cover how the two `pydantic` and `pydantic-core` packages interact together, then will go through the architecture specifics for various patterns (model definition, validation, serialization, JSON Schema).

Usage of the Pydantic library can be divided into two parts:

- Model definition, done in the `pydantic` package.
- Model validation and serialization, done in the `pydantic-core` package.

## Model definition

Whenever a Pydantic [`BaseModel`][pydantic.main.BaseModel] is defined, the metaclass will analyze the body of the model to collect a number of elements:

- Defined annotations to build model fields (collected in the [`model_fields`][pydantic.main.BaseModel.model_fields] attribute).
- Model configuration, set with [`model_config`][pydantic.main.BaseModel.model_config].
- Additional validators/serializers.
- Private attributes, class variables, identification of generic parametrization, etc.

### Communicating between `pydantic` and `pydantic-core`: the core schema

We then need a way to communicate the collected information from the model definition to `pydantic-core`, so that validation and serialization are performed accordingly. To do so, Pydantic uses the concept of a core schema: a structured (and serializable) Python dictionary (represented using [`TypedDict`][typing.TypedDict] definitions) describing a specific validation and serialization logic. It is the core data structure used to communicate between the `pydantic` and `pydantic-core` packages.

Every core schema has a required `type` key, and extra properties depending on this `type`. The generation of a core schema is handled in a single place, by the `GenerateSchema` class (no matter if it is for a Pydantic model or anything else).

!!! note
    It is not possible to define a custom core schema. A core schema needs to be understood by the `pydantic-core` package, and as such we only support a fixed number of core schema types. This is also part of the reason why the `GenerateSchema` class isn't truly exposed and properly documented.

The core schema definitions can be found in the [`pydantic_core.core_schema`][] module.

In the case of a Pydantic model, a core schema will be constructed and set as the [`__pydantic_core_schema__`][pydantic.main.BaseModel.__pydantic_core_schema__] attribute.
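As a quick sketch (using a hypothetical `User` model), both the collected fields and the constructed core schema can be inspected directly on a model class:

```python
from pydantic import BaseModel


class User(BaseModel):
    name: str
    age: int = 0


# Fields collected by the metaclass, in definition order:
print(list(User.model_fields))
#> ['name', 'age']

# The core schema is a plain dictionary with a required `type` key:
print('type' in User.__pydantic_core_schema__)
#> True
```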
To illustrate what a core schema looks like, we will take the example of the [`bool`][pydantic_core.core_schema.bool_schema] core schema:

```python {lint="skip" test="skip"}
class BoolSchema(TypedDict, total=False):
    type: Required[Literal['bool']]
    strict: bool
    ref: str
    metadata: Any
    serialization: SerSchema
```

When defining a Pydantic model with a boolean field:

```python
from pydantic import BaseModel, Field


class Model(BaseModel):
    foo: bool = Field(strict=True)
```

The core schema for the `foo` field will look like:

```python
{
    'type': 'bool',
    'strict': True,
}
```

As seen in the [`BoolSchema`][pydantic_core.core_schema.bool_schema] definition, the serialization logic is also defined in the core schema. If we were to define a custom serialization function for `foo` (1), the `serialization` key would look like: { .annotate }

1. For example using the [`field_serializer`][pydantic.functional_serializers.field_serializer] decorator:

    ```python {test="skip" lint="skip"}
    class Model(BaseModel):
        foo: bool = Field(strict=True)

        @field_serializer('foo', mode='plain')
        def serialize_foo(self, value: bool) -> Any: ...
    ```

```python {lint="skip" test="skip"}
{
    'type': 'function-plain',
    'function': <function Model.serialize_foo>,
    'is_field_serializer': True,
    'info_arg': False,
    'return_schema': {'type': 'int'},
}
```

Note that this is also a core schema definition, just that it is only relevant for `pydantic-core` during serialization. Core schemas cover a broad scope, and are used whenever we want to communicate between the Python and Rust side. While the previous examples were related to validation and serialization, it could in theory be used for anything: error management, extra metadata, etc.

### JSON Schema generation

You may have noticed that the previous serialization core schema has a `return_schema` key. This is because the core schema is also used to generate the corresponding JSON Schema.
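As a short sketch of that relationship, the JSON Schema of the model defined above can be generated directly. Note that the `strict` constraint from the core schema is validation-only and does not surface in the JSON Schema:

```python
from pydantic import BaseModel, Field


class Model(BaseModel):
    foo: bool = Field(strict=True)


print(Model.model_json_schema())
"""
{
    'properties': {'foo': {'title': 'Foo', 'type': 'boolean'}},
    'required': ['foo'],
    'title': 'Model',
    'type': 'object',
}
"""
```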
Similar to how the core schema is generated, the JSON Schema generation is handled by the [`GenerateJsonSchema`][pydantic.json_schema.GenerateJsonSchema] class. The [`generate`][pydantic.json_schema.GenerateJsonSchema.generate] method is the main entry point and is given the core schema of that model.

Coming back to our `bool` field example, the [`bool_schema`][pydantic.json_schema.GenerateJsonSchema.bool_schema] method will be given the previously generated [boolean core schema][pydantic_core.core_schema.bool_schema] and will return the following JSON Schema:

```json
{"type": "boolean"}
```

### Customizing the core schema and JSON schema

!!! abstract "Usage Documentation"
    [Custom types](../concepts/types.md#custom-types)

    [Implementing `__get_pydantic_core_schema__`](../concepts/json_schema.md#implementing-__get_pydantic_core_schema__)

    [Implementing `__get_pydantic_json_schema__`](../concepts/json_schema.md#implementing-__get_pydantic_json_schema__)

While the `GenerateSchema` and [`GenerateJsonSchema`][pydantic.json_schema.GenerateJsonSchema] classes handle the creation of the corresponding schemas, Pydantic offers a way to customize them in some cases, following a wrapper pattern.

This customization is done through the `__get_pydantic_core_schema__` and `__get_pydantic_json_schema__` methods.

To understand this wrapper pattern, we will take the example of metadata classes used with [`Annotated`][typing.Annotated], where the `__get_pydantic_core_schema__` method can be used:

```python
from typing import Any

from pydantic_core import CoreSchema
from typing_extensions import Annotated

from pydantic import GetCoreSchemaHandler, TypeAdapter


class MyStrict:
    @classmethod
    def __get_pydantic_core_schema__(
        cls, source: Any, handler: GetCoreSchemaHandler
    ) -> CoreSchema:
        schema = handler(source)  # (1)!
        schema['strict'] = True
        return schema


class MyGt:
    @classmethod
    def __get_pydantic_core_schema__(
        cls, source: Any, handler: GetCoreSchemaHandler
    ) -> CoreSchema:
        schema = handler(source)  # (2)!
        schema['gt'] = 1
        return schema


ta = TypeAdapter(Annotated[int, MyStrict(), MyGt()])
```

1. `MyStrict` is the first annotation to be applied. At this point, `schema = {'type': 'int'}`.
2. `MyGt` is the last annotation to be applied. At this point, `schema = {'type': 'int', 'strict': True}`.

When the `GenerateSchema` class builds the core schema for `Annotated[int, MyStrict(), MyGt()]`, it will create an instance of a `GetCoreSchemaHandler` to be passed to the `MyGt.__get_pydantic_core_schema__` method. (1)
{ .annotate }

1. In the case of our [`Annotated`][typing.Annotated] pattern, the `GetCoreSchemaHandler` is defined in a nested way. Calling it will recursively call the other `__get_pydantic_core_schema__` methods until it reaches the `int` annotation, where a simple `{'type': 'int'}` schema is returned.

The `source` argument depends on the core schema generation pattern. In the case of [`Annotated`][typing.Annotated], the `source` will be the type being annotated. When [defining a custom type](../concepts/types.md#as-a-method-on-a-custom-type), the `source` will be the actual class where `__get_pydantic_core_schema__` is defined.

## Model validation and serialization

While model definition was scoped to the _class_ level (i.e. when defining your model), model validation and serialization happens at the _instance_ level. Both these concepts are handled in `pydantic-core` (providing a 5 to 20x performance increase compared to Pydantic V1), by using the previously built core schema.

`pydantic-core` exposes a [`SchemaValidator`][pydantic_core.SchemaValidator] and [`SchemaSerializer`][pydantic_core.SchemaSerializer] class to perform these tasks:

```python
from pydantic import BaseModel


class Model(BaseModel):
    foo: int


model = Model.model_validate({'foo': 1})  # (1)!
dumped = model.model_dump()  # (2)!
```

1. The provided data is sent to `pydantic-core` by using the [`SchemaValidator.validate_python`][pydantic_core.SchemaValidator.validate_python] method. `pydantic-core` will validate (following the core schema of the model) the data and populate the model's `__dict__` attribute.
2. The `model` instance is sent to `pydantic-core` by using the [`SchemaSerializer.to_python`][pydantic_core.SchemaSerializer.to_python] method. `pydantic-core` will read the instance's `__dict__` attribute and build the appropriate result (again, following the core schema of the model).

---

<!-- pydantic-2.10.6/docs/internals/resolving_annotations.md -->

!!! note
    This section is part of the *internals* documentation, and is partly targeted to contributors.

Pydantic heavily relies on [type hints][type hint] at runtime to build schemas for validation, serialization, etc.

While type hints were primarily introduced for static type checkers (such as [Mypy] or [Pyright]), they are accessible (and sometimes evaluated) at runtime. This means that the following would fail at runtime, because `Node` has yet to be defined in the current module:

```python {test="skip" lint="skip"}
class Node:
    """Binary tree node."""

    # NameError: name 'Node' is not defined:
    def __init__(self, l: Node, r: Node) -> None:
        self.left = l
        self.right = r
```

To circumvent this issue, forward references can be used (by wrapping the annotation in quotes).
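A minimal sketch of the quoted form: at class-definition time the string `'Node'` is never evaluated, so the definition succeeds, and the annotations are simply stored as strings:

```python
class Node:
    """Binary tree node."""

    def __init__(self, l: 'Node', r: 'Node') -> None:
        self.left = l
        self.right = r


# The forward references are kept as plain strings until something
# (e.g. `typing.get_type_hints()`) evaluates them:
print(Node.__init__.__annotations__)
#> {'l': 'Node', 'r': 'Node', 'return': None}
```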
In Python 3.7, [PEP 563] introduced the concept of _postponed evaluation of annotations_, meaning that with the `from __future__ import annotations` [future statement], type hints are stringified by default:

```python {requires="3.12" lint="skip"}
from __future__ import annotations

from pydantic import BaseModel


class Foo(BaseModel):
    f: MyType
    # Given the future import above, this is equivalent to:
    # f: 'MyType'


type MyType = int

print(Foo.__annotations__)
#> {'f': 'MyType'}
```

## The challenges of runtime evaluation

Static type checkers make use of the AST to analyze the defined annotations. Regarding the previous example, this has the benefit of being able to understand what `MyType` refers to when analyzing the class definition of `Foo`, even if `MyType` isn't yet defined at runtime.

However, for runtime tools such as Pydantic, it is more challenging to correctly resolve these forward annotations. The Python standard library provides some tools to do so ([`typing.get_type_hints()`][typing.get_type_hints], [`inspect.get_annotations()`][inspect.get_annotations]), but they come with some limitations. Thus, they are being re-implemented in Pydantic with improved support for edge cases.

As Pydantic has grown, it has adapted to support many edge cases requiring irregular patterns for annotation evaluation. Some of these use cases aren't necessarily sound from a static type checking perspective. In v2.10, the internal logic was refactored in an attempt to simplify and standardize annotation evaluation. Admittedly, backwards compatibility posed some challenges, and there is still some noticeable scar tissue in the codebase because of this. There's a hope that [PEP 649] (introduced in Python 3.14) will greatly simplify the process, especially when it comes to dealing with locals of a function.

To evaluate forward references, Pydantic roughly follows the same logic as described in the documentation of the [`typing.get_type_hints()`][typing.get_type_hints] function.
That is, the built-in [`eval()`][eval] function is used by passing the forward reference, a global, and a local namespace. The namespace fetching logic is defined in the sections below.

## Resolving annotations at class definition

The following example will be used as a reference throughout this section:

```python {test="skip" lint="skip"}
# module1.py:
type MyType = int


class Base:
    f1: 'MyType'


# module2.py:
from pydantic import BaseModel

from module1 import Base

type MyType = str


def inner() -> None:
    type InnerType = bool

    class Model(BaseModel, Base):
        type LocalType = bytes

        f2: 'MyType'
        f3: 'InnerType'
        f4: 'LocalType'
        f5: 'UnknownType'

    type InnerType2 = complex
```

When the `Model` class is being built, different [namespaces][namespace] are at play. For each base class of the `Model`'s [MRO][method resolution order] (in reverse order — that is, starting with `Base`), the following logic is applied:

1. Fetch the `__annotations__` key from the current base class' `__dict__`, if present. For `Base`, this will be `{'f1': 'MyType'}`.
2. Iterate over the `__annotations__` items and try to evaluate the annotation [^1] using a custom wrapper around the built-in [`eval()`][eval] function. This function takes two `globals` and `locals` arguments:
    - The current module's `__dict__` is naturally used as `globals`. For `Base`, this will be `sys.modules['module1'].__dict__`.
    - For the `locals` argument, Pydantic will try to resolve symbols in the following namespaces, sorted by highest priority:
        - A namespace created on the fly, containing the current class name (`{cls.__name__: cls}`). This is done in order to support recursive references.
        - The locals of the current class (i.e. `cls.__dict__`). For `Model`, this will include `LocalType`.
        - The parent namespace of the class, if different from the globals described above. This is the [locals][frame.f_locals] of the frame where the class is being defined.
          For `Base`, because the class is being defined in the module directly, this namespace won't be used as it will result in the globals being used again. For `Model`, the parent namespace is the locals of the frame of `inner()`.
3. If the annotation failed to evaluate, it is kept as is, so that the model can be rebuilt at a later stage. This will be the case for `f5`.

The following table lists the resolved type annotations for every field, once the `Model` class has been created:

| Field name | Resolved annotation |
|------------|---------------------|
| `f1`       | [`int`][]           |
| `f2`       | [`str`][]           |
| `f3`       | [`bool`][]          |
| `f4`       | [`bytes`][]         |
| `f5`       | `'UnknownType'`     |

### Limitations and backwards compatibility concerns

While the namespace fetching logic is trying to be as accurate as possible, we still face some limitations:
- The locals of the current class (`cls.__dict__`) may include irrelevant entries, most of them being dunder attributes. This means that the following annotation: `f: '__doc__'` will successfully (and unexpectedly) be resolved.
- When the `Model` class is being created inside a function, we keep a copy of the [locals][frame.f_locals] of the frame. This copy only includes the symbols defined in the locals when `Model` is being defined, meaning `InnerType2` won't be included (and will **not be** if doing a model rebuild at a later point!).
- To avoid memory leaks, we use [weak references][weakref] to the locals of the function, meaning some forward references might not resolve outside the function (1).
- Locals of the function are only taken into account for Pydantic models, but this pattern does not apply to dataclasses, typed dictionaries or named tuples.
1. Here is an example:
    ```python {test="skip" lint="skip"}
    def func():
        A = int

        class Model(BaseModel):
            f: 'A | Forward'

        return Model


    Model = func()
    Model.model_rebuild(_types_namespace={'Forward': str})
    # pydantic.errors.PydanticUndefinedAnnotation: name 'A' is not defined
    ```

[](){#backwards-compatibility-logic}

For backwards compatibility reasons, and to be able to support valid use cases without having to rebuild models, the namespace logic described above is a bit different when it comes to core schema generation. Taking the following example:

```python
from dataclasses import dataclass

from pydantic import BaseModel


@dataclass
class Foo:
    a: 'Bar | None' = None


class Bar(BaseModel):
    b: Foo
```

Once the fields for `Bar` have been collected (meaning annotations resolved), the `GenerateSchema` class converts every field into a core schema. When it encounters another class-like field type (such as a dataclass), it will try to evaluate annotations, following roughly the same logic as [described above](#resolving-annotations-at-class-definition).

However, to evaluate the `'Bar | None'` annotation, `Bar` needs to be present in the globals or locals, which is normally *not* the case: `Bar` is being created, so it is not "assigned" to the current module's `__dict__` at that point. To avoid having to call [`model_rebuild()`][pydantic.BaseModel.model_rebuild] on `Bar`, both the parent namespace (if `Bar` was to be defined inside a function, and [the namespace provided during a model rebuild](#model-rebuild-semantics)) and the `{Bar.__name__: Bar}` namespace are included in the locals during annotations evaluation of `Foo` (with the lowest priority) (1).
{ .annotate }

1.
    This backwards compatibility logic can introduce some inconsistencies, such as the following:

    ```python {lint="skip"}
    from dataclasses import dataclass

    from pydantic import BaseModel


    @dataclass
    class Foo:
        # `a` and `b` shouldn't resolve:
        a: 'Model'
        b: 'Inner'


    def func():
        Inner = int

        class Model(BaseModel):
            foo: Foo

        Model.__pydantic_complete__
        #> True, should be False.
    ```

## Resolving annotations when rebuilding a model

When a forward reference fails to evaluate, Pydantic will silently fail and stop the core schema generation process. This can be seen by inspecting the `__pydantic_core_schema__` of a model class:

```python {lint="skip"}
from pydantic import BaseModel


class Foo(BaseModel):
    f: 'MyType'


Foo.__pydantic_core_schema__
#> <pydantic._internal._mock_val_ser.MockCoreSchema object at 0x0123456789ab>
```

If you then properly define `MyType`, you can rebuild the model:

```python {test="skip" lint="skip"}
type MyType = int

Foo.model_rebuild()
Foo.__pydantic_core_schema__
#> {'type': 'model', 'schema': {...}, ...}
```

[](){#model-rebuild-semantics}

The [`model_rebuild()`][pydantic.BaseModel.model_rebuild] method uses a *rebuild namespace*, with the following semantics:

- If an explicit `_types_namespace` argument is provided, it is used as the rebuild namespace.
- If no namespace is provided, the namespace where the method is called will be used as the rebuild namespace.

This *rebuild namespace* will be merged with the model's parent namespace (if it was defined in a function) and used as is (see the [backwards compatibility logic](#backwards-compatibility-logic) described above).

[Mypy]: https://www.mypy-lang.org/
[Pyright]: https://github.com/microsoft/pyright/
[PEP 563]: https://peps.python.org/pep-0563/
[PEP 649]: https://peps.python.org/pep-0649/
[future statement]: https://docs.python.org/3/reference/simple_stmts.html#future

[^1]: This is done unconditionally, as forward annotations can be only present _as part_ of a type hint (e.g.
`Optional['int']`), as dictated by the [typing specification](https://typing.readthedocs.io/en/latest/spec/annotations.html#string-annotations).

<!-- binary assets omitted: pydantic-2.10.6/docs/logo-white.svg, pydantic-2.10.6/docs/logos/adobe_logo.png -->
8 mx{am8W eu߻{TהBӔqKc C0-ڂZ̮(EA`-HjEfǛz_z\.WH\TW܉X`T_ %01WS 1rCɽI}pH5NDb0Lc+: Z!Ea,ʾ%Vc MiC W)p[VJ00=?h{l\t$@ V:\@tX܃i*߻]#03Vn뤐HL,[>b-]1%tP[[h4=^ ueINھ[} MYi'G(h[yqpG[ *\a18;5s Ƙ3!;/gX̋, ރSd2w#IR96$%\q++?v"J_e9 y4 B6oꕛqGS/?ҹvoҭ,p;Kzcϱ^y4XԒ#ÍH}-S(\s7 \pWT.M:gw<v|;Kirgsa< au^͹jEs _5I^)xnۤJ` kI|*L Mo0G7.zYay'WwoEtS[gz߻a۷ƢC!` RKqltD[k Od2λ|5`PG0Co`02LTPUEa S׬uގu 5 )  Zxxgs*J{_I, ZѣחB0őHWjaRBdae]s.K7͹uz,S&oDMM2 ƹD$b.L_ghՀz\S"gӭ\y&PgO5[o|7KpkY=_5_nT.b 0[ox`oyT1 ^4 3J>}ڴЎo5L=G*H7 T() 1 F/n[7aQ E c {a)K{3yk@^0ʗ";傾`ТG[A(+ޑISU4w,|~)`/\/s;&;I^W$L =9 D,MBT>Xbj7T|Rƈ5xXfJK@$)0BAiO74MA$+զ[ EQpIS D{ Vs/0T׍ 'mUSl#yٝ#6m;,g->锉p$ʪ(+lgqΐؘ1s$X j#%# R1)%+lÆך`@Ͻp5#\yi3rl'0u{*xlը*m;zCVu\lҌH$8h(Rc0Mmnjc;HglCzQi 55\W"6c 'wi- c]0.vrcмĞ$z$cR]ՕG*+>;VN>elہv!c )k60-X\[WQ{̶z_离&T%w{ZdɁ!ޠ'?xrϩ"%p^(ayV+y߅bIlI^HN߫ xN+;d̘i 7<;.Ff>sHC]n  Rxnj瓻~M0k P,?X)!ΐ@VÓVȷY 7KxBavՓgO"MɼSm6-_w v2SGuuGOr]U}rX3gN{$tz LD8D_7˛ uL@m&uc %}zW[O}~rkg8`zZwt/=S+T` "^~Lށ^[33[lyup^:wgv=Qės5o,K(F9w!Z-3<ןoNdMٿr\?N H 8U쭧z{HБb]4t-d5Z.\{YϤ{AtjPVOw#-KHȼqlV6q\?pL2 Í앰D[[/?~tM7? !>y1&΅gOæM'}=H7R/!X.]]WK&Ir܃U ca:3Ɣ΁uר~WHҊ3Oݮ}člCY=HL'Q5\EF\-[(ah4H}{ɽ>Z -Gju!$)q?Kq- X}.%!$:0w\si0My6onFII™x?55P4^v_ȃo%PUKLs䎓NzMS*4](5TjSo߿lv}p6p{> 1Xyw>VY ;*%8-S KT2}G #idž {C9TW>F ,F<:BH躊H$]W%0!xs"RQL0H`Y7U pEQtTY\|/3 /::5~Pu' Ms8C uo}*mw9GIIF@~^H8 ^|`؀qoz1חGsN?cef,` !( ) Ph4@@@Ȋ1IW9oBepw1(%pcXiʎPȀxͻi  @ĀNJp1/1^uuW8@*ەһF4 ;3 q>жާ"Cc+2<6 \@-evX ;H p8pp$6AeISp\NjxmT& luQu5Jd·.- 1)ǽ7xRڰcKPB@ZP~ Zrث\zzGt=|'[ Nnv=?~َm {kwgOklVMoNk춄FP}\-hyw= HXl`ޑ!@N[() ˏy3-}HՔ p?醆X_ >xz2 gjjJ0w 1TW 2z6'\>XUɤɶoo _!\!ؕ! 
;G{m]/]_yvQ+d&}d*m+ {WD\W|oㆽݎٿTAM2~qk ?/32-SU9s*sEi5vt :4MiSQm}zGJ`|"i+5/D,BwWc"PVB2~v{[WOlm؎7v\H \R&vꀐd!U%oqۨ++*"{Si{Z3tw%pOCJ0fm^ضcqէWMAdl~얛j{[*to|eEWԯN1|q__@v b$bFs!QUUXٞ=ok}޽U+ m_$f"c'>Hd-}{/ @?P*c+Á07{Rd!cO2iڎ枣\I$2wG~!ZU:eA pFpκKZw' >P;iמ&40:EO>ݻۇ5y2AooꪪhwMm銽{6)MS߯Dl-CWglTwW]u_aRH) *qrΰq>H!/y F=ghkm12.١W}r0ea$KX[k=J8 4mke˶cFu54V>gOǿ+*"/) $y^42їб~߉P50Nږ )2᫸^y'tˍg6@݇O{_8ͨ+a 8|_fZo ׫xqmj\ ?NLBþqOdϾ15g|yZ]=[fw4uu[)SF$Əqh%j˟_o׍PȀ+mikwyGO={EoU4E̞pDp8H$0a붖/]3{vw )J~S^O0M1lٴoEF>+?bcs=~Ѱnnf4doڴ'4ϟk="28.c۶:?ZZzQV]e+麆|8r 歐e Zs)hi۶\|߽&$`.4!a^Q&`Ș6 $3rϏYyLȣ~3 r._+Kc]224Mc}'vl;LͶ0]dZ(GrђP7emyumP8\[BU=IAQxEKxg:q?s[i=kUQW,oz W g26Dž~Z܄q\m}ww+ øqC~S+ʟtC[N#b$@g{e7V9-Fd ^rWW\۳N׵+xl6#{}uMɃ:J0$)o8$F7AS]* KO8?zgͮG +%Ʒ2'URNVr72٠BXmp3;8)D-w* We<0RCpSnfcXW|Il~cu%%(8#ܳrcQRJ+prWeٯv~ov oBoE`_"a6¼S7Xw5wr8A_`ܸ!8팙x%(?ظ`Rz-lw<7Qk|,Ӌ.9#w;/0 PT?IX}Pr{c}>5H%3,p1|U;GF"pCK1SN򈷿r[}]ׅK~^Eꐍ~t^T.#^{`@P?poiz~޺6xކis>g&N`WWrTV`5);uk@ה\/8g>hO<3?/,ҽ ( K%c^븹L|_S]lOW { Hd롐7[戢xm]w<|O=&}!;] Ўߞqk'$D"MJWzo 3hy|BHYXbNy)NA(`ICfʻaH=~>MM݈DSI)s\wTהk$&ǽʆ>c9}ƨs]v ;P 6 E&脪Zp܃ G1@V+xha~Gf/Χf5*ϜZѷiip[t;!E˘qWʻষC ;uδ'X0 >*ϝ",׋zQՒ?_ T~v o_w㸨+Õ=[wחI)g^q c \`~swʴ[kJӥeaWD ~*+K1w}֛[C%`nlt\/a- O#xEj'3oaxqS8^yetqg=w2˲ˣ7،P(secraٖz(+~/+ =~HLpΠjJ(O_ݕ4Ap8 |ճuKW/޹Es1@P7/uBhaㆦ_gO+/ {Fv)H3Qo#2Lfl'8@,NdLdfG0 voo9dG{^Ӕx]]Wc B"zs{{(+ {C{1,$`fb)pƠ7΀XV ޶ G0`+V.}q02i0OU9I Tpp:8Z0I`9w)Nqu~ަ. 
=L]œ& ={:J'2=lxvZƚbm[[tN{"3b/Al!$QXcO<}(/@Q@\PtCEyYKKC )oثc@"4-!3DGYzz =xŕ'̮.I:bJ]{AAvo:?])RL[Ad!zr̳fqڛn|[o]8\SqH#s)=#FԀ+ Z[zo|+w׶\؎@@ìYbӝՖڸao42k^x2OsϮo.Ew" @7ܷ^Hd܄:eJJے)syI4XcglҍDz\Ҵܥ[3G"iMۼň c$D:c៷xmw~ .mcs~p^}e'nڳE\.t<@PǴi#Q]]F]}aê147wgZZ{҇_'Hd hPT|k t-8~G8BM]iNꛑ;楝;N t0ߒNkΜ={xzc*mb:h@ NWnknaea XEU0cȝ'8_#G<ݝ>c*֭-:{8q˷mmfxj0dl}׫UVzιG^v19mk B!Ȟp @Oߒ={pƠp_Alşnzz.8ⅅk<9 Vs]t?u%,Q$N=m:TM]Ƨ֋Ҳp&A'͛ē3ؾu]*5غue=~Yuϯ9"  {]eK< }uOObu#ओ'㦛-) &zq'NElt0Gf;qOs1Zg̟yܹzU}:6²gW쨍Fr+וsg*umo,ZWSCWq ,]^\we2W[$Ė--լckm3BU;SiK7̷{`νvܰUyTڂtTMwfE(d/Y x#>sfض%a;bU幮mӴ88gؼi'oɢѐ] p9g|'S/z Zi=g8gXtb4S=]ҋ릧E Xin'LNXH$Oձz<꫅\,$JQ[_/C8[Z֧R&Kkw=tbSS{} ]P1afl/5ztg>Tc@G,<"Z2.N0/RV{=z81Ma _;#z㌚KnO^0NSZvŸdn?SخW?;MɑAf3zBÿ-NVmC9ޝ2Uh77~ݞzCWMaC!I%L$S&!B]RII4qro 񸉖yӦcEc,5Dקͭ, z^"㏗^~q:lRt]}xAt'u#j~Yu?M^L{]h=wziccQ~{ιGұ ћmcˎMz-4Ȳ\ Z֬Cqs\C;~VMSǓO.ߠwَ@ccebko\0o1l+ߴl8 MSHQYEYiR`eeVkGY|n̀LBeU n7sp0nͮTʄe9M¶ug]7th='͛zЕ-=/ W_5~G"9x)D„\|17o:U<Ys;?1 tw'\ׁ[Ӣ6h\y .pgս{;a «jANUlފ?,Y61+6i*O¤oWX]]q8~  >c²}x,O^4]?g/6t5֗Ăgcǎ\c}.ֻ, [zwll0a N.%''N'Xufoo{Į]36mlS-34`Ӧ&<Ԫ~WpΛM}wD"Ҳp"THd23F>wƌ'ySSxm O2|-wsgtJLTd2ՋC:m?c`*Kl_?mZYi?Si~퓟vVWCyE$W(*4UC+W]i3Fz- /hm/4هYlއ*+U=[$XDKC>?1>sɒ-H&MXW2twQ1eWWcn/J&^);/ƸCnM+طg Aq h'>eu?]Ynջ|\}ˍyՌ2bJFktĎ2pi<;ݘ O 8fW`* i+ wֈ>o҄ CQ9+Wvu`)i ǝ09]W]`~1cƨGF {KUUAv XnoUwO㎛ˁK2ƐJcj %K^";~vsfi_M= 5Ƴ8HLYQǵ;1u_=2pذ->cҡU@@ǔ)ñcG+2i)1Exs->3OLc~wj(}BR^zA˘G0 uD0p]e^}y{tNfƆF1aH˰aG߱tֵ%!V]]7anغוObk{{0r'_ /{Nt}p{.8I#aCkՃ_MM;^&u$ϭr)~5i3-HpoXYkKtCOd0zt-?a3gȐ'6llB:mBQ8K_NL|c9vOڳ;星Og x#|}u&( &ahPTD,=m-$#R?/]MڊmI&D]}_ƹw>ƢMђ Z3^\$q\ضMve -˒[J#6/ێ*W9uꈋmM$2{Ì8$T]wQ2?]%&7NXtuTM)lW}Ę 4T+Q\{ߞ LW6}Ѩw!]5G[Mgby?'?W+8jt /Zaw;pr}- Ar*/hXyS_2u̮MP7N2|5O5`4tDZ5< lR5y@8ȍK)O8s .<7۶ؐ7ζAO%#m{OwsK0jT]gQQQ=zݵ !\{:I!E.a˖mȑdlH)Džp,]+/_}I*2uks ;uOHdĩ:meKPU Kc;ش霫?sly;ÆW#4gOGѠ-qo}+DAzDv\A?$Ŀxҭ[E8#6}uf k$Ba3gǖ`TUݹ_u'R"e 'PUH ̟?Em\rŎJUSs9hhXn|/wWTDpe7 X y}nO|qh՟I`B,B:myySn.N?}F,_k۷6klLu1qx]]W؉PHxU#/ێx,tK0ܱGbFL>C+U'N#,l_SWuȮwhqcӥ%&S)A Kl=֎0 öhSO~אwA!ry+^eI7sƶZяLYyd_k[_,ח5BZ]qTVF3%]W y:)%!L>kpL>v!Wx|B(^ VVFg폎~%;ZÇC㝴T mHa eL-awA: (z0CLl偱_T‡m\fyaȳy 
ȝdGA:wzXzUkW0vd`VNvӻ2w\D~^-e[6l鶱),Gl[*3Ϛ*eرm-U4C421Xc (ZpNVQs4T9mX'v Mbi+<=[iޞM]qpƢ͐R7tpcp Xd;Bajv!,ˁiN$ٽ==IFa[vpΡi k삔Æՠ$[cG+‘+gZ+e ɔjկH4_؎+!PU7؂. --ӧb!x%Xz7N7^v,4][lݲ/⍣{HmZ5V̌ />y5i*AV'W _]x㴓 [Km8bLQlڊ$2{0}]e%pc@*ibԨZ| ֗ QWW͛?nR5:RzEic3K۶_G74Z{o>m xCd坱1\c~2jdX,dBWW<"xOLB  /Օ < o{,_`2m{h.2QYoBHA)20(`j9dr=V 8`jV|9$p}H}0;=C:)0rA+϶t1L\/þq:'j /{wޭܙ*qRQeMMz0`ޮ87p (˖mz 7HЎF7G$Á˶J)::dk8rճa0/8tCw+Hxurߪ9HcӦ}3"׾}[ =k/#0vlB!]]V FPOx'~]Px̜5[4'h=T^1yH)qEs?ƏG"㈷< FQ8R){Md() aǎ6C:6mlB2쌣#`țkgyM\@d#z{ e  \nc9x.9ν"Bg2pc̛m4n < 9C_o %!\3G.`̫2- ԡqh~=4^7^\ޞ+ȞAB`-5%TcjǮDa&Ձŋa)0kH*VM<]]q NYZB"UkW/ێO@PS {"x<'RYnK>rls'4يO0|)Pc'n?9K\11c둈o 1H&MFعB0M X )ض t+wo',يx<olR7?{MSNع5ǀ,$utZ"L p pAS€z^ (^q@:=pX @ ?vTH ȫavusZo{Uj.^j@rxs?x^W[w tuū_{uR\+_!1gθ2 :;CGi[_yiC*EXtME[[/K8bKx:'5b{jy-tCGKK2US䲎P;q;/سB-j604ttĐHdLdP[_}PR”#1-(VD/poO; , Wʎ3ưuk R) }(/TW$ &M3:zrkWBԯ Rn )h[.KcܸR7bkl6\vq d+ax hnhH$ )VQ[v^"4 )G #.,sWUegVϾ߯'R/fM3jt _y-R;vtuߕ\MMj/0:FgnMSq^zbx3Gaqϖ'<22 0oNn,ˆm8p yq% 0-ZV&+䭉vYW k鮩-Y`t~CG_{qqsE$@U.*,w%q8qK;qI؉lȒlzgQHr|̽ ,R|;Hn;wf9gΜW`6z{Gc vVUGLuA cYz`y3e?^銻*ϱD"~Gl>'`vC1CS}3B{ Dیz̝׌ڄĢŭ5z7Oੂm;HrD1'IP']_U5q07C=ǦbY RzȾV@1ՃE/.r͋3$\.g o:ч?f|xy=OB)^S98z;5S; R- =xrTTpHMG~Fq&k`ڴj_8͕ #pA>y R,*'^}moٔ[8xOz;.pĈX=J⥚bu8HP߾~oo`p؟SDC伖r!|p03g}Ǧ|I`Z';pܭ [o?رޞZI1AQd>`ۖΙ/u]v٢cԆNdRVqeO\fv ] #cz[pO /Xf*gaK?P[͛E\vbt@6gs }8r$i3xtٍDL̃trRQOEҖPoFzoĭwxp7UN禌|bW*N ",qhZKs9o2M _tO"O6}되}r*2F(/ݜs؎h4xCs=vێZSQ]YD.7jYbӰm EdrI"5LgY<dB͇7kWA7;(j8Ӱa wtJ6D7H&?iRlO-tK^" JȁQO?V}NR"CqLLmnrrT:7T09 QMpnǙ \Sb:Ȋ Yb6o>w"cp+?eHٜl@qM7̂H?sjiJ,`u_?[aD$T 728r +ek2nv:x+P%@<֭nRی/Ϭɏ~-774V vrΏ:w<7*{ݶ‹ᕗy)=!5Ow~-E8$IrCeU]˖̀$Id㈠0/0nI*>W]f.|*MS~4~w,==eC4:~l_ڝLlm!~ w/a]W{am)ͳP@s_l߼ȇTMaQT݈`8;ٮRAƋ*$hwWU V"icFil<.SM4TsG+,xUUaD"ryqD s[觕UP 9* %ܳk} OmEr7 M3,jjxBչɟ3ә}LI%IbNyyq [s>ȹI "GQUZ 3L@ςa_Zn%[VQi @?s \m'1y˝M?~5kJwUUö÷hr  ,$p֮[xőrErU{an ahVoϞS\^皚0::Nшax8\8$H$_rж"_# gh88 f2X.g"ΉS)wtnY6tŋ[͞ 'dъEۛ Iw\k]'F$č7H^Y_?z2O"򻓅H؇{h:N9'&֕n,a~aճ gOw8Me_9SBa[G1'"yay;SUwDy}ݲ5 qV|AT 6vFF~͋O&ȤEi._:f]Db|fS0Rіw2_{c )B=B].qrňloL+twn3~A ꐤ .0JEbd0q+z&sVAol<}9  
,**[8:f̬K&wmtDg(%-S:纖phh40C1Zp(߯~ŠY::oGaMH0cC%}|okR_[=_-[$g̨q+N7TYuJGLhjk6z W*d+p__+Ed:3]n2Lut>!I c"&WiXLdW_hX0L8\ ߶S ;zG(Dfs[[gy ӧW L&c׮{~¯;:N*lVKrx]kqşx;ĩE"#aHnm[d!(70.-{Oq wNy,au_8Gyv|Cr&$~V8uc&C"efkah0?xex8]?MP(w;㳥 `8{UdEjw ~Sy[xH$G^y}1˗@u+{4DsKx;|hh'Z4;yx˒٦XgZ$!% \H{PG$`zbl +wm*8ZI[#]b KC%rzX$P* l0jԁ[誵OP V9۩))U'CA_|a/{l~O Lbt⾹?ﻼPK؞^+Mv"wϞ;܁pK|l,l6=lY\Š?ztPVdxGix{~?w{'KtV椒ZMTp8/2wl'w^X[޺E:66V|];{nMؖ8ɠrA0e:9(P5M&\Ǚ0Lv9**Au%gP[ ¾>`EEt̶r;$EˤܚS9&ƽQ]Yi"=d1ZIIީ&Im1WbqL{LQd8> &%'|Yg;sLy(9ǎ G}~MEn&^?x(\gks]11vW [Ǚrlxl,M!)E8$IF"`UWGgm aax$Xmm'/8b4b3w<+O٦ZqijJftK937[<`s+eSA(@9s)Qؙλr?1tI>i曒Vj Ⱦzھ_cOߘY}"dίU [gmf/ 'wq(!kᔌ%^qP,ZS exJ~={V=:$жeY+K&\΀$!qUKޣ߹臶mرP*+Td1PeϺ2sL"9ܼaH㷿Ї,m{k|cڴo꺚D3hoo%xvÞS3KT s8MUʐbOe3f_##m./С(Rzh;<@Xު"CVB7ۖp++aP5QTEdd=qi]-8F% 8O=+յǚt&>?xI랤te[9dcHg!.;Ki8$ q躆BDڂgOǒJX*"ĐJ$Du3Q+?8{_|T85%^N,GZFkȤEPae._uXᚽyhH8/~,ViGIVg&r8 q>2\WᎿ-hA0EW -5|= 7u/د2nSOln``BSV8w390-=!K!xJ;s!ɓc]g^tѼ72f8V7\VRka%w32%T0aʋS7& "6$WwQrgժdAjky>7t'p̴䇞NHLҷ%^t)_Mڵ"ܝc`4AYP&+x7e]at֭-cc $KuDr&p  vd5噧w|5{{Z{@F[ *ƩeKvPWniUp(X,uȡMxV\mlL`LxO˶ǶhVcFNg<ig|sˁ? 
&⼸H|ބ*Kb1L^ ôgק*>rblֳ޾iq=$K8;w{j8V&I o=/~:̢2x Cΰ IWݻ|y-.'OVU:5|:?W腻B2cFpX>qJ,"ҩrgpOh&򨏌$qK(2lw{3" BaH&NʹȲؗg?I"Kb<uQk#7"(% Ef32p8?1{v㑮ᯏ'$O˯XjP*Zٶ#jCԼ3 ~mMW˒3š猼΋USYht$ā V/[$tm3lLM 4?3ʐɛ?my7:;ák A(͜d8C߉\l<}㮓A7hMUx}LSx!)eJ\U{t4<G0+xWț"ŧ`FjyKNch̛X={li_kW70$%-D%R|sΗM Jn5s$ Fw> +2/Yi( 8[9U]p?_q9v΀އNNYֿV,KsG;vV,.8`o [mv}9J§7[;f:$1 K}cB^pKLXQٜ ђ%o,'4/,|ϧAQ'&ص{n:_p Hi8Y(Yw"7 ݛJlyw}(*+76V.ٶ}'F1> TUi`?[t]sk38o>m_>?Uym c1`3x R?ϹpqESQ_ۺå&1]YĒZS-lCCS;[VUX]RMI)T׈@od˝1V魵nd2TM~{`4/g:ȣx~#dΰP()rg3yэӺ%heFt$3 C|f{=f)yx 8wܳg 4Mɓ:: 39xBEu>/6C!?/|lЇM3)HƘӴ3_2?>&,΄;b1kg7F1o~ @ &a AmOqo[c=ç >5k|w|gEA ?8nWnb^1 i:gmm{[J%{;~஻{nvEЯ3m9\rux7"9ș&\ ;5"ƯO}¾}LJ>A˹HzM}R~X4OU<'PwL1 oտ &cƞ1XJ >Ќ֢>ǔsO)O4'N^uKg\aCdl(lG+H ގSĤ3 ܶgϱl&K\m۶VU6#o X`pΑHJf1>n\|Ɓ؏kz͝K\}[עa_r+Ȫ 1ֿ=n3tiMU[MB./z2Df>a[N>ٷ`H$HɨX,߽GxlA& ꑨ?5ӻq3 5E/B+L4ţq'"k^$ɾQBܹM^܇p$XXI2C&CwנiZ,&fYa f P]ƌuҶmG߉I: e-_Ѯfyt.ʤ#hi_Qd$Ӌ*3 Bbc?3_A$+rͤ|[.lя%V:08h٬1a:ccL>g X1 :/Ƙ n3)_Z>7 \/ z+Zp¸鯾V\&A n$+6czܟ[H`*`ǑfE/[ 7CAO~AB>Vpq|_ۢ-&O od1&qEO7F ix,}ׯ;y0Ρti 2*+طt$D@}W%K#Sg"[VqҮ;wW^dT]A10ch ?a{gjXRUqMj̄ZYa1Vg8&08v5HfX@"ntUS*LCTz{/Ꮼ{GFOq4mHK`FB>lxjP5^t DL3<[qۊq ~.%| f([o?M&$6Yޘ;ɛ K!NSminU D](h9__JW++5x=x S懟g!ٶ  ,*+Cuض#H#'wb?ALE?×n*<. ;W9o~ O=K-޻M1yZ=x,/YI/R,R m&pF qJ%`B>3Fm9fD5~κ`=e{O=Yά{nDBXFxz˖JRe&7OV //_Mô8,R@46TUU={z.?9*iy0-䳧+8KapOL셤4١TZŇ&9#8$-{֫v1+q)#@gF F~>۾; vRh $z#(_Ȝ\,Npǁ6R۾hj5o@ڛ p{'wze իgt':;]LכtUqXUߘ"W;|zz_}Ef7>HH H$OK棿 ?+Īճ1cFRl!}뙘7m m4Oy9|{׮[m'w=|>5 t$IJqJy(fjw9yH *@ɱ/E!Icn<7k\+CykMg҆=6 k7&e^~i?ΣJU-Jɂ"":a2{}qQδ=Α6vEfI^my*OicWہ/ıa ĠiQWaXC Sh琚s^^s3%t]EmmDT62ƐpB>Z5 3E<-:QQ$0 ӦՊK'bBHj=:\x^'e8C IF"!+xǃ_z!!+2㭷^,^x~/룧\w7K#P 1~^$|J\y&9GeEp$K5-{v۹ݓoҹ*}ED 'FEFmsDԓ;;MfTSmqma0 U!뾙46B1ז$Q: t'OW=趻7o:x, /}W4ѝoYu's^i!ZD$\Ȱ,;VWݿX+:` ,qlxdESSy"1_|*Y<>t.*|,Qs\UWO= ` [{zΰh46a}pHlj#.&V8 ˂ԡK^L.gKaxm=Ud,u_82 iLUkf6ё$^ *BPUQ 4-ضX*̙ۄ~44VaSqUK 3p勱`a OHo~ȷ).dEF8::.|{/(xZ"*,un5@U"[ L~ʼn $j7 pÍW}6o:#S~$1Sy˯Xefbh(eKg"!KxbTE.,Yfjqqd jxqfp i 2G17 ; c;`^} tr7plVbl789Ra=`w)_}7/WBs'c8w`Yyc8wrgl7JAK]Òkl۶k"Z|+V̚}pQDPS{b68o~3;vtf;˩{Çf2ElOE{SFuu1ض]RQ__+Xx"T2|^.Vd D>gbeQ[~gS;7,Bֳp$Z\7u)`y. 
v lĒ%m4t'3O*|wA8;DkCnB<*̪TP)nBN\k/];w! @VD@@qxع(V-N$$ iH΁ZE.]Ƽ(w .d,&T, <>]-,мLfRڞd2Wp5ڶ i{8w0H2 j $Ȳ@P_d' 8*#3s{.k㟼 Mݻz E΁z݅wsx%mD"|u8G]}3fԡd݃,W.q&spu{"0bzNijhjVSA_4U)*Gۺ\gA\+12@"@8?AWކik $9  t>Lo()IFF⏷4WP̯A)\ lGN^cWRLpLԽ߱U.e~sd:GMmv$Y^;;~a7'ujhjᾼ8A³  I"Ҽ-HX`beI!X0T7τXdy z8kRۑ~Xc㰜3o9%ʊo}[L6ǹ)* ~س`7ȏ>n'x3S90)T \BRk\!iuVnJgә,ˁ% Wİb,$9X]Wk]3Omp(ppa >]š'm[~B쯟PbM]ecl,Y0<pQ^;wW6<;}V#OQ͑[V.'WO>a!&g3~5:,lX-%v2g ld d㠢"5u ע+O*(`oL~:<Yq͵a;o| x^-l,7s_"A0ôi5صkRyB ?P drmRDWVv_y,b#` *1t9Q[UUa|A~>loD˴*,]:ފ*B `Y"Z.g8r$"~h EΟ3kVX_?葇\\nȲL}p7zN$DHԏ69A c'ƐɈv}}jk"ؿobx(|o2-(18{ xm7L ~Es7 \<|!WIUրZ3ڧ.&Yx38Тۿp=m(K 3xBmɮY'o±HL|>`ϳ;{c۩ڸ0SNA d8v%oB?/[>ቇ&rY\tU!ϸ+iLfaSL1X$46 AU 5- NO:/IʐJgyt`EF"tMI0v>(C6k>D"~%p0hțf>j=kv\kkP - q ep+*qwu "%Âpx tu# MX8K|_|gry5I/6%^5e#țtd3y隷pqHH؇8mąG2)t{t iRN˝ILd8kE2EgՕ'?~_FF&/ s89w޵Ku%Νs7xVfmM>]ѮAw?m M,#}tKfT㠱]݃H ̚Հt:Y`Y6qxU᳎/ROcÈn;?8t^v{Xv,MS:y ֱmQLXp,2#bt4sE"3 r[1-KeWSe cή?câl,ceB RS*L2əLQCS ' lۮQd/I.9Lٶ|@UuG;1XJMC}Puy?,˾k֎5P^4 s)BY=7e8u{N/SϽ~,Ɓ]O80dL͙7ˀM#1a樓`%8S97.##7p+fǽVhMx[)5KNno9U^5"b?A߱㨸AU套߶&f9fl3k]\r&jQU9D $at4g6("a_Gޤ86lxƦcig555ξl*Q(#Ppɤ4 Ar'`wks,.I9?&kB˴dYar?g܀DEP kE :b\y+eYBkk-ʋgx6s 8E(AOcΜfTTly//hDgg?E/W<ܞ o1N֚7DZQSn C,Kap0MW }.wΐp&K*^}\4Em[TQU$YF$GMM/۷u*~SU3T*%絡2=Ϳ=UyXp:ɾ1(;D1+mہ" /k,G4(lK>g!;}>v yDn(Sz:Y\n"u-([#\ob)L &|G!`gw*|FꟄ$W1[ nܼHϯ?\P/}t9ɱdm a֔iG=CL^#"KS5]/$I,*9kSCdAc@&>[魵=/a^f酕a&Ǽyػ;G ;[MdI+v9SV~ئ-ۮ4um]p[? L]DPȇHį=/_{_'^(*44x?6pK/㦍Gfh&8Hr ;-;}Ǒ˙8aqD>`T:x߶( Yt]'ǎ Wqr(b<ضnvt1D]eYӑ*tw„yo9*Yro _#+2ShV淬y"yx*P?- M…e# r=3s>b֮wwcMMd*cYcwa9#ESE>eooMӁL`P@ /p#sgoSMUHdiaX>'퟿<8V-ixoضJU79hl]o\4mxr' MSɞ~s ~@`t"ULB>oLLKFE8'3If&_xq+g/ӄ9yRI=h+Eþ`(BYɮpuyp8=W]Ksh;QSO;}h˰>Kx>p7üx< 4=Lˆ*]rv D+;,wrP i2,x|%ߟHdlIoݵ?^x~_[W[}ppXt Gϵq4MA,Ɓǰر]= _ ȋݰ_<M!ah(~¾("MGp^hem3}O aL9xmSϙ&H=+V Lqk@5qݽ}?[qwDϞ͑x.kq rfO̿srB!za **sqs{ uKKJ(#\\iXhn_͛r0}z-'M 8&kS=c\6 OxbϞؖ,$(_>:9>ksUUPUؽ> iӑ>.\_JsS,\kM};@ql Yμ->isW`"֯"@ ?3ԧoGk q'_/aǎ|ހ76V 7=6V zWR{ Z4T2m|*[o۷v#o VV {vV9 ?mZ蔶q Qd?9jtá/lnmF}oY'Ǿ9gܦK/> ز?mwIq&!J {׾ԓ./LXH$H|Ӕ!#|fv<L1o:/ &$ . 
SH]jOl' (@?^^e+JRܔX)ȲLhݸ97¦O)^i)Khh()MEa ~]_\=3i]gp㪵Mb|> sN:/# ! bNupͫSՌ$yAnƄhձÇ>?8! $ÑJqv-_1!ӲkdΎ~t2MFRHGYfKJ/U[5B>U+ TZ~ƌ,O!ϢDd՞X^bO43/>WU$<̮+f~fGFBMn:ܳ?̳{j!alD_L`7eo~!gޅ~hniY-ˁ?FwFH&s~sϋ'[V٘(Ggӽ{;\}Pm|_yū/hD[Pb~ky_o>Lf Pȇ{_߯~Et ٭X:փъ 9W}]Ӄp8j|4̛׌ /Of^~7XōBgL:'<}_UyJǨ "W_9CkBd ݧMZ=qUS;˙ 9C?H~mOAY'fGQ| A2.<h.cn'Ra}Q"-4[s)^} $%%wr`lL] n6a ۷Ё8{E:38PXŭEAi,pWgy5eiڨ՟k/]ܹM_G0+MboP _t sĪRa6o9=>}{7=~nC>h 3 ,{a K=w+r&{*Ρ}VJ ' lݨ( |pҢQ6Ǎ;K**C[ M93֬qY,fEQ 6“sL`),K/[qzm]ra*B?=iƖ-X}۴Ü8sٲQ;w5ltdGpE@gr;),CQ$ExIsp0sƐKI麆ޞ!|]O`}{,U SJ%K۲bيe0\0_mϟ>tH TW.O$2ZlƆ+g¶g=[%ՠiʭ^bôI_ Đg0ԧCLS 5Qμ(XH%폂WxOECCE~/{lO뙫yK,*]'K(CW ~sK_;1vW[[;E>[%i a /|E8臬H8PUՏ53"RIӈ@U\ӟ;?|6H#~,GؼFjeu }OGd >uMiǷG]tA]r!_>t*dY•W-ma^!W7,ϯݱ3g6< /rùe9A82Xs i,]چf WtM``p_wu X##3KڦW;hϳyBoo]K,v s#&§ +,L{`U Pw,nfjhSZ[cN0\A~ jj, Cw6 ӽߟ7,,[8#tg># 2ҩ m9zZKģ6q{ kIf,#OޚgeZvi>pK?4>OFGSG p4ё$?ZTW3Ɇ )$Jp?naXU0 dȐHt߸O%9U-T@0IvlgQw4<ЇΎF"[#A8hm;6eYSu<ȼwʽ6m:YOr^{݊u9~Ty! q<֕5ᕒet+$29:Ap HYw߮lѳGmGF_:`1}/ZpIQ = %E1/K `1x(J޶D"~'L06ğ܄} 8>o?ٳ"~$ٍ#O핃#@׬l&cf]'\=:l~ѡx` Lซsc̛߂[nMS anMm*n-":;es.t!FPTy|9:t/J0s~dyOolsH}]iӫSLd}"x{?4o,"#k3LXUg=|e"~}Ba+Zޡ=|ᯓ8^" ۗ64V.}}]W_q|zk 3,o N4,IÅ0'p\~ëVͺ#̚t̞\ƃ_e]O/b*a{c\_UZZUdDz؎"IƓp,-%RSc%wS8o^"P|!΢byKTer9u`J5@:ajD VB^&e~ח-~7od2 Րs;Jz˯9I^Lϵ5OSCO OI8D!~asԯg1qfϞ^XFtj3 hcŏ|o>I8|dWܳ~&ұrp8,"jgQU~'ҹa&)~MU06~q[s9 Q ½.dy+v/k WDP<\1r]w]L(j ;wva]"al4Ʀ_WXzpӯ8$/Ѩqܿ*Ww?@e1y͚hjȿ}W|zμ9*+CxI2T*DXDĐ-.p䒉{afB>dr9#q%n I{&3a8dLbsiGʮl@F뾰pѴonxj 5۶ݯdڴ@ېaY8@.pU8bU4CV?ncS}TRl798˖=jx.Ik(qgHnp6(̒>&=趷;6"; f-F0:; pLu'@qcè2w^sb(W *a`0C_-~Lf˧hkKe~ۗC5t~U$Y!yk Jw# 7 Ugˏ^ .4 :+W΂+_=("8DBT)D7pl1yGJ޵zbEjճ'GQ$$,[6;?c|H v7 K_ɿ8n 8ُ_;:HHd0~˽S >U짃A Q~Jr.y,c勏xʏG $IBmmzG;[E)"Ms>37BPXM|_(Cgg?v`d LHݗ]f͜ o>:ݧA Iʄ9{ `Z귳g7 ƹ6 dwy_4Wϟ;,|>*J]ܙ̶n:F|WRۏZ^RR,fΪ¬ٍ3졃's˾H~65p7ALD0e-wK>S <1kkQQ<|u +Ugvv4UElA l~+sHzZC'tOz!u٬WyC^yZwqٴ_Qss\8,,K=V]~[pS)+s#LE?TCݳXwN`8R:dץ[QڂuJ$e TU-k^D8CNTztw/TXӱ^ ˂(X-uÍXWz(;ωp  u YTR9-@S,>]e3gιaXj<1j YCe t]Ë/ǜ9Mⵌ Yvymoxr/oޱapCU($d0spQs8s͵_tѼoLu.k0Y3ij^э}cD\uE1zimA:ȘZuYjw*~fmٌw{oܳ{$ȒOeViL>K0/[5B߉X< O"Kyk07˜Ȼ1:l,R!c d$Q5J8uFf1ᇎۘLK yB6+21׽ 
ˋ6fάy]4M?1~ۆ *`j)w`(d"kTV(R:gF91-jD+͵k3p8`0{NEXs$X X}l,3~rU}'F^ߒEނtNl%YF0NݬX9+|Eƥf~V̆Qa_wyqʶK6n:v fJ iil}/ڵ[diP A̝ی˯Xs0 -6\O&3o=1 rw8t 3דɜ~0z1cF˖x"\m-k|;tY +\[:^}^wh0㨫ߧh <Je$eYo5d3&1ĺ$1x*dw[迁ù]W=Zjs1<G}C 4m .\4momMD7eܹM= 2Yq}{{* y%vGMՑvΎ~ %ܜk2 \z eq,I>5M3>~L=&̢3꿺xy{L$u>U-raLl퍍PP1ss&bt¶_|lv̋3+}f\uu(蛞Jf37O8to{zkk푎=r3g]+ 3!,^QkggE?- lJee<̊3]|uuhGgAgOaap0~r?6W_bi10.)_{8cG֬}r^h Ea6*+C>,eg5޹bE{k$,m[mjVD?ÖgsƍqoaCUEl مx,X,=TZ,{Tnض LZ"CK!cEEpB-E%dy H`ׅߵeK'yKgmrP!O~9`#I)-R܄XYx6?g&BxֽeMA"! au&NS;s^K'@̑s$&Ϟt<7ٶ(mfUe7:rx{\p:eKFFnɾ?Xc#^`&ᇶ_[ZC>j;G(i5|l<=p6rl$d*QxIێŋ[wta^ 񮡹 _<}}ص ĪAl匾t:ߧ(c8,.lTSÅ[t6dOFk5Dc'2& sȚ&G$ )URC87_5~Jm& LnET:ΊOrnyIbJ٬ӴLfst٬Ib .Qr&2 U#IDATxwdGu￧6'e!AE0<`llc8 #B l*lުݻ0=t{oկNSaqaacnac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1aH翾 |DTqQa~ ~iOpʁ.p^C a"udyh ϡé  --H$ݹ :2^> 6,F|:n1?_tEqkɿDn"2Q?6N #9]&!":EbZ$&Z N $s1u}llEEs*u)Ԃ,.N$ BP"9UI~+ЂdZ 8|Oo}Lg@IT 43qH4(!\l.K:* k'90рsDfq1%&jh! L h::[/\kvo @׫322dxPu%DE Sw⚋0|ÞC1xyznpI=Lx*Y$0QGB WJEЧ𸳀}~PqȁkWq $Ÿ~u08 \&_v /f9or@^Ɛzm a@Ъ {sKv)~$"_2Q^4aUBW׋/;6fOn~֒BV]-]B]3(_ kN'J-;ml' ;΂83E4,VF#K@hBd?iX= eT_'m#t}W+ze# oޕ a1,)s7btg)e!΃+'#WĹyQq(Qs.8 Sχ%cTIv՟PΤc3"ɔKW +mF vOV5&轒DQUW[N|WĦU!DTԻbau4xjq M|Ģ"8?\|H[QEC4E}WqMW-V rp 4ZPIP@:7NG{/opȭR(GP!+jfHV+&n :rY0[A*#$?XΊҒһ|8:V QycdnjWT2iB>:Tq>q.%#3#1I-;X>P<ڶGP3T틈!~տӧU zDjQFtdVр#"D25_Q #7V!:P+d@a6q[&e>bt>/.F`8bG_ƪ#J/)|S#kif-U(PDQ<Xf08,@3N`nZTX=ɑ$&B3[ *$A:7JҕĬ~\: 1A7V'"J51 tҊLжα1A7V#1M45(EqD Nay?0A7V%fMlPP0 LЍUQbF_(bgYFf}Xuunct<΃&ƪb`}s4vG}:hW}Ơf}uGLЍUU3YaU *p0q%`U 1XU8c8Xp}*1D,".`U 1Xf1\V3++f[|P#Yưq+S tcdph`YUa`n4fһ@*00A7FK_6Vnh6F g(V=nXUanVMJ?-Vn#U3Fw D W-ć 1RXU8cqI*\ê tcp*2O}XU!an #1I,qzܽ9f[|h#EعJqKpؾ [n cڡڪeUlA<]\#D,pRk| c"@G~Tx6Q9(>rb3w і]8AR\u, E\4! w?E~mOkW^5V]vw%,8- ~Qѧ! 
'_2Fy*\P>NXMQTIiQy ^'ՃT~ EMXDn!ЇU}w{-~U@a :vt[A/MuQk)eRHHWo:K kHk.R=j-'M „ bx~Ve'r 8^}5#k0UBWNDϝ$ԨvAA(<d}B'/%Uٰa5m+2k'4-+ s.W}E巗"*:dx"״CmEf,o FЧXL ٲUAM|,Fm2?˱ -Qo!wC q Kp5pЎGYeH++PKQ/G"^I]}Ţm4W@.7A{`6d0_(@W[ɞNqk!Jk  :AiA |PH[Po%V,?huېJ0܅H;z ^4߯׫e.moekZhSqf!褯;Xyb`?G}JQAߙN.D*{oK@PRFȨ#@5Kz'!=\BW,1h̿]BV致VE#80$ZʻpT!2m(EUǪzDE2ElCh}ޭ85K$__$J g U~ZKK~$jBO#DHƶ8Eu$pktPKyH闇KehFw9 |Cøq\lӯ`c'J[eܼ6҃^+6#ڴF c(>X^h" 9z桜<7vt k"8)E9Q-Y8Ж97Э5r8ϼ؄,n'"t[v*rxŦ! ? ,nG:@3.1>,tsi #^bq;?257cMol uTm,bfF:~_ ?0FvAl_pC-0:բZd۸5<=$>{/ e_a#~_HK׌E c,psHK_a†q<:UVqp3UH~&H?/@|PP9.#;?ڲXY̋p0#"xg5˨V[.V=XWn4t4zqn)VYDzVYnU12^fҍŨW[.V=XS,w>p'ct$6ҍB;kwg1:eݓᗺƻY!xhg[8cMбίnU1Zk;WzVX Ыn^ 8TAU3}lף xqƸs*Vlr?v*FgTDv*E ;1K7;9Wv89l( P}$SP=' MJUߣ㴘YbyMHc1`4gHt\BԳn(~ɚ8^_?ۨB/B'iu_Ry]ĔݍN$K- X W»@ HsUFw,+<~wK qI!`A2c,T!FdbD EB%]j\\{ z|?qZ8ϋ {W|\*^cK7ƇTx8:Sׁ.;6Jh#ע 8H/=fK7ƌNUR3WTw<(4 mx .;g_ܝ鳅C|L7|Zġl@Z3Ɗ"v^K]ѭ|drmJ !{DB[vzAqfも4;,:O=Iͻ)}. ~53:ۑT%ď+M$Wɷ3ouz1S/߄~t$ Fo(',D8)*b7WA<&@ (VXÁsoD]c)~VrG;|ӄ Ik2_Z'ιrxẏN0a1.8JkK_׎n~P{v\L7' y9ﴪ\ERL M`.{kTݫ[\oUz|Amu䆙?,0bqD,Pe{іJjmfNf{㪊Kj5)X ?XC9H.UZKD\e 1M.; 82iPQbDA׈ |h9Ĺ.6߁1&$n:4XbQb!簔 䗣蚨$"xyj)b~eG117ke6L-ÍBxM=ӭ~G9 1&⛈o%*{J"N..ӝ-hpּ S\b* ZZ[^ Hy% ** 0$ ,ĢH>XGHB4\l*E@Uѧ%}i'D,*ZMURz cD𹧬OPٔPvׇƪǕW5UrT[*B hqADGWe-uencLXڏjVa]+hac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1a`nac aa&a1$kZJlKKl0Av 3 }*"b&*tz%7󱜦 (V+GvzfƮΘ+Z[X-%@VfjB _^IR@Q2r:Vi-9AIp/.!䱢1#̭/gR[hhIWoũrU$oskΕ,1oȒ{X;XQok6Ifiϣ@p#u[ .&$$tqAz~>" Hn>NG&AwN`ޡ h3BZbj&K"[q'n@_i-HA=wVu!Ի=(Ԅd">ލrkCG꫽;Bɏ_qW%*I8_{k/&l&~wC0al:noz_vF~y%sEɅXN_3Nv\\s%9uՕ7"T{) 'EZ5$;|︳y>" al`E j8$_Drҟ"O{lJQ,sq*z"\sjc= n}iP%ELw2~V<oZۈ%e;pw߉k.eH\LznzbӼƘPBJ~xE-FEϝ|E)&$zy't/xUBxЗ*GZl}ԶX;pNXWBbJI*,Tz 7{)v̗1/hRלJZ3W㒋" %7T^:Vx1֯ZI1*kA2bLݲ%en9H% _.7uR,\t8|:#P<&`wOxp3`aĹ~PMঃUԹ/<VMK`rь`K -!_K3m*(Rx[sd卤=lnXV-f RV|) ,mSgk=4aɉWw+'{^o]sP[[yMq7$y[8cꙒʮeM.%A!*W"4KVۏT9XA] =s?Z~mGst U/Rr WDhr%ށ"^uIBAΝ_Y14f?ȑ9^ʨ%Ŝ }<^<+(T)F!esIl}#|sW2T~mlx8 7GE'\ݚ?J[CZObTJz%臎^oo%qWyV҇>B#:?Pd!Kx{;&_ihυOFb>J_lJj$ރxr~LH6bDTTE:A%Gr65DNGOI2-Drr>RVЁbP{jzq>pnnUp&ո[jV#Ɇ9p喃p`6Ď{D 7U'C$%hPF{4rtHeCL@ehQ.a%& 
.;r/I1w} \R?Zyu^"o0XAC`Bq-f|Z|չU_Xwj~N/Z;w  $oO]4L|J~z;)l B(tG~E}Ej'{ϨpUvW ĹEJM?*VHp:v3G )bօ DRӟ _^x5s7śKG P'3xާ*s(*JHHZG8w{pv'0ȍ7 ŠXhJ/+~ʋ\Nԭԓb/:ƃ8Bt &JD:QX++n}B1Fs7K7}܎Vk̶\4q5dj5RX}8l1C--沛'kaXѮDj*}1FڷB}0J}X|ZD4qud^X6giYT:?zRif aDb'Co.b3F *8s7&qZLЇ`š \:t7=y3 c->%v 3ލ&fW[.Mۮ>fW[.g3{ lS~%yEЄ>n(UxcWB_e_>*PmAf=3w^ONîKPN$ J| $׮/$M6ELc. c(V~'É@YI9Syx@c "{Zv;/3s{*E\@qa p`?[!^H%OLUpUU;-&aCObFA|0anf}ppqn2@$TP(F 紿-!A&fbr8/O?߼k}@Ȟ#kn V= 1*xqeTXD$1T6ʳz͌"8a,AV2a\%*\ׇ,;+ͨ3?^xJilf2V sUt%!XPKG]'[: **#[HY}8@0n4?%e%͉s>$ /\9&!W969^CUV!o#`b{ZKn@Sr_塐o&ٽ+ՅӞ#h?PY 8>LAPطZ 1I/}q/XȉUT7sO$P¾'}44O*P,4!Aggގ?k'nz/21( $yQh/Fb2GOy  _BWDA(GP+TJt豕ҏp~{o}1vu]ž'*gށN؞H`QB0Lv{N%<:OEhsY_=HlNtT9f/(|A4&YHQuDGcUC+? CĺGMCe>BɁp>n9a}-D'd.TCSd~1jc!GG}؂n!3 (P $woCM˄\:b -lpDq,f| Z\6Z]> G%8r)yMUXTFn8Hxf3a^&1xG{{WE8ZQe$慑`9Ϩ7q7R zpg݅LU3G(έd+G yZq|X K)n-oBC642hSB5ިTػ;#>L8׶v(O M\q$_n8O.Rmw> *(:tsjQ[P͡.guT`k M!wH}?9sf:*{#)D]"SR VT +eIt+,aAPWEkB#&=F^$8)~ֵss %>x3^혠8Dv> 7 N ,kw6uNrBP"mU\t( {kC:G:~@ Etj?} F=ۋ{8Jt[Os,[?,,JTp"h+ȦIVGϞJ^KƁP\@[ 38MUmqig(j߸GzЎ  )WeM|o=LFr;s"LWNȍGp_ڏJnCȏBim; ˉј>qBq"Bs@kakhи/CoiWlj/QPAP4:䡈RQ4I(uyGtb^¬;\ ~&¥.YNy1KnhsBL 8-t8V;A:w#Z?PڇQAP1RVrn:jDe" ȭ^RSm,A?GQUqjEp%/g]RAX$ k@ U NQzCXjS. JgnA- 賨ogq tSmHV)]aZpeCC9&;{ձq sZ MsQn֣FpIB;՗' *-B~E<~{S|Z5E;TħDڶR)O0uɟwsq;Oo7tbfGS"mkK;mAEJ^ A|D| ͓WZ?L,@<&+MSK:p>{TQg$Gw3 jh``vfoRL\ASBqpJgx*"-sŨ&-.h:/[ $q8TWz6=:ƥ^D* ?;@F_Чތu3up$p+0GhK8uEQL$ZD\5BNo7B*q6+ۖz'Qh/*+' L&"?Hxw嘘H{,r*j/u7]"$h>9vU -'?w aq$#Ȏ4Ɵ{7:W#zťxɲf ϻ;|t<ҿeE:xga1-Nu1"I 1̣zK8Y[}UY?> -^2 I Vxe}\lN~7d\##I! 
['.Ye%dN=ąeNS)"\\ۻ|@\ 3xr8EbpGv{i?Y) $-ĵNBu| El)\K h12P Voqm3Bj }^7vbAz7o4"ZL>ϵ虎U.cgYWh%su}4| J$ ODsx7Gekc}#8/͒~Cv<<j; 栾i׬ }h[QzH/YF.D}E]bp.:ݩ3s守ja7}WJ M[ܨ mEˈiIEJO&1󌠥nOk.@s4(Q?!F^j[/Qw7l ) 9Dݲ a_8gHٕHlKZ]]Qs/,RPxPp@PMcFL퉭f:!tzM(,rYGTMvKON{(JHX_'ַͅK4o'uhޥO8FȽ c`\+U:Θ.ˆݫȎq_t,1Nm$Ϲ NP$#v¹2h Hlbu3qu O@8>Q8s8Ip(CfUqGR7A'Ze:nG ЪSO.N DBu@Efk'/5g'EEk rֺb=Ćruq娥N[vv%1Uz#J)\:TeͲT?8K}Q)$iPT}7'&"$jcyP 9Q}½G$MڧCļ]Fa r('"l3Tl(CL'&pB phT,NNҫa E6Ł1x(gVF@>&* HX(0 c$QnYI*} !Q}-!͎q1 (Y j I=#ĉr}vL&Ua.,uXr|eivDȤJy>;-g~ñ+=}0cIOѣޟrs@Tٳ/ .Swմ؆a% ńđKbQ^_5uOoULeZ\wfax:-h0!_,_PKJq&N`.OgXl0 z:>] b-╵ ^6|mۓaaut|6tz-ٕ B@3£,Ba|W#s uL`"hIit GrT%^^k =Z~0P X@=* U3ٗ'A D['m3ǹ)T'IBРpt}+>b/8?ؚlx,-"Maah!r+)`c^bL6'ԓ.<7PI{\my^",qmq* Hvt]ٲYWGc&N.b^16 >QJW5*,)r1@8CUBuT+$}QN6b 8 !".;pu+ȍ^\ˡMב%B֥D0IR,lm.b.Tdgd>'5G9 R0y[` wr+n;`Skt!A"* lu*q9 ÅUBt  r+p7){/B qbAA ;=V ϗ  3~!%.Rw#sWAM"_֦+9tua4l5&6|[C`>UC6'l\{D k pc$p%ȥ"M'Q5M8s P!!!S>^ cI^ |7=pGx3,D}< KZH">* '(%GU⸝,ײo$4-$EJBuoF^Jxxw\$Zj+xkŏ>(zg.#PrGZqT|^YNB^oZO`|J-/!i ׵GZ3pdu|1Ehj-dUbIbf's:{@38+OuMt r9{=i|$|\b;vT)EL">|Ji<,"R,F_ym*.䥪Sֈ&j Aѽw|ԪPOazJ̚=@ ^ގ*OQ-_2 h?@.^.I*|`EmŶpea>uz^|cQ(\W V"Sgrb`VcGTbiȉ<}-zKqMz WϣK Zx!`?.y㹈Y틻(?}\W!8g鐝cX-Xhpǰ'}gvGcUmg | E&JTt1>/9yz?b.Hl+[Ѥ?R*RssRbƆ:nՒ_IJ(yGKY/u&:Hx_tr7N&$ & ;Cf5 y\\7 "5W;_$9}Hx 4IykL9-k[Ћ/᳧2PKK-1^4E*N2Q>C񬽭aGPIIm~go0ZQVԁ."?q/&1AwD-5g (m!]ݷAu[RTi-ZB͈Fԧ PTx*(rWJλx$ITb I`ϔ_Jp4 bs"vW rHsB7?F"kښ$;L15_1Nܡ|v*"$ua:Ǡɚ Ӭ/{kֻa5T\%'*5+G"VAɱsQpGrs,v[(4=lW]syG c-tz}Gn=  7CzXzݠ$!Tw0Hvq422sR}R:7d;vF@v~HcwoFj#c]k4uoڱVi XRA~x[Qu v,/{lQ_vA7  R@/{om {64]Y PgٽN;rvU2e6:RLm%tk#Zd{o݇i[gmA{5O|Q/Oؾϻ/VOkN\K΋WܹqD!?Z;diAĞjT'n6d0=5q0f9A*&=ȷgaj$u(Ud^A!(fvw}ڈ/8~Gom HEc(D\pI>ypF؀-rH=nC.35 *lG$p&p? -+G]ܔ|rיN7Y& Yw]yR6z.8jϡ{h2V{aw$H_ӿ1YB#@X!WyǠ_V>9g#Zu7pÑL&}!baB/刜}Ct 8I~po^ CwQc4mLT!>hؽ!";_]v!KP c#SNQA]Ⱥu"Vxi)گ3͋coS7'I]X{W"h]h875yyWsO@S \sh7˺qZOFVh=GN,PD-krݎ%TNzZWڶC LmW+1SL#$4`[@! 
Aq1,sS|I< ou}4F-^Bk4Y7h^CG%$~v\]RVads;݇@NjG |*-  , Wkd<1!QQHHD%^e 0Z)۸2)BQE\؟| A\J"I9y=pHc;Uވ_[.‡RL NDHFn˃@W"< ݉JnznLy!h#Zt[߲qzVƉ\x;Q}zMg>f9ՉJPq6vzDa]5̼^[4?)ݟ&I@%O³"qN8Hv9OJRin|E9+OUsbW O5+*FO|1>`ϡA*\U(@xbq8pSBܵ KՎo>2Y5$ !y.Ԯ{qIv@+/w9~Ty[bW OZB#k_Ɍ_Oe@Y[o$V6TېڿYh$n ϗ0%Qm=1&+[2 Wm@cᇔe UPCWP(*NDz- E,߀fjDXaK%+K/%3⻹D̈=xawW_C  #륷U_UFNo h.E.O(auaaahċVksMYud,y)+ /vL{:8~<*&(+ qyg}?Z͖@bDĝ-BUпF%D{tǟCD/?I/Pfu9T^țD.;y;hDd. (LT6IRtD ǘzӇe_W#d6/"c>o@3,>M٪_GkQԿJj^&L2V, + BJG"dK$Em+ q/?;>q]O@zd"<A+hnLP"z"5/cDRu+gX -Fy:G@yR= Y#f1.4=?q 텂q$Nr.5z/ىh8$0piT}+*H #N?D-ܻ|"BG nbmA$²ڊ.F\ZHQ,DQEGĩ9dBL&3ؿ"=UE؜0D!g$-u" 8'+# D)t_W9spUoWpT}PЈz|~G?k*:<+[ܼI nu&%qtDNfxʦU\*ED*'"BQ/9NrAgݡf_v rW51%R-D bnF_؏lB2.qB$~Ya_yG5$D Dj5̨]C dʹ_}MZ QS9cu&tE"Zu܁\Jx3W#9Pm8Y g\VXݰq޵@Љ?{:B+_){/&CRɡ!F4Oxb~l\!M4..3)IOI_"_D1` Ey 겖jP y®-όws #.$FThV@2^H[W3$(;dDEB$rm>Fz 3†*ְh[n8\.Kg ;zqzֹ /TU@'p RqeXԊt bc72pA5m E2GWT,>/~&~ 9z7mEM,)+Dm:T&;O "}%հ *Ov=Ѽ\F5|E- ZNtm=Ehnz84޳ mֻ`D6_קC>&;aoy.GsGMCePdշp%eJ{`L|?Ǝp,NR;dn$ n/hrp~@|_ @Auqdg,<1AEvĵs @K\ү0\$8>4ׂH·]ckuuSX8#X;//1J*]ZQAU}R@YYr;~3{8GQ4DBm}{ݨ>Wpyxo4~p;9Ϋ Y$JI+#w/\2iUWO2H{Oa!MEnz!E ->}@awp_s\qȉ8)O.)^sdԒ]Q;QP—#4hIr >[gLo)Kh| ~Do7EUppӖlL~'+dI|2O@Z=Lu{: "C󴜣?N\Ab E=8Jߎ: *U,k]H]lGq>վч/+[GDPp{{C*񸤂ܣ.A/MriR w[tZ'7]Z$)<&g}b)#~"UpM{KۘoyS:"E{e?w+5sHSmVw;I%rPT5R;ŧ ŻɁC I!izr]QN#g \Xr]Ank<}?Qo"SE ( k6' Ax_Idfð5/ҼzDuN B$E\&. @%mfqsH rX]BT1&E%y}җ'qK>R\ӿ&N@#heImf)ʈA%tɻ[Ɉt< Sԭ^7߯_@E  ]y3$B:a->_Aʅ ?үgZ1בr.ğs}rzj-}ޭTתpbC=SWY}IIAl/9rߧ?,ʯ+n/I YgJ]i~B~ׅAr|\"vKY=( -,;2^s)D"I+3EtQwQ\vnt{M{KE"a-ADZ+:tuXR|nHHgR*ݏF!N_#ٕRLvK$_Oq+̯MQ\v UMq}yכ +xI%qyeKC0 nFϋΔؤ oZإ6"Ps4"^I_Vxp #;d5vލCtK1 wLU{`C7^9 ūTjR҉z](Dw߆Zލ S}J';}+$1ȺiliLtwυFa-9!*XYqGE "0Jq*ӛtCZ#ɪ ~7FZm=ڱq wJt&l+k"$ur둋 ߯Qd—"EU Sb F]׏ϵP;$5wC"*(^*hEɫy9%$IЧ})*m`|KXvJnjr4F:G0s;)@VXs@d>ki3Gcƶ*)yx֞go/Nz`$Nj\j1p@ P܁.b[( 2'F>E9"ޘ\3OW}; (m7x۫'ZM_=;%3RZ0Z'J|dEvP̺&}sL,e~4#򠍬9}! 
jkm8$m=p4ѬqL?le?PAUQ]Orp`܇V{v;ҮWqc9EڅO8Ϛ8H} XDsYD&+HT\QϢmj'"M+ce'EqN2\pmy,;.1yH>9Q˾~!:U x¸nd8༠Pu%L.BXn i ` ݇lU`?*{9N% dUVU9( ~foĹ׮۩-r;3:ާyJQDD|*EϬ{4A hDI!4fMd8ԅ-~RvENv 6*h_ke=Z/9j" GL>VD:')EwB{'h?kU"u1w$*g[A\S#3윇eOy4 =Rސך*mC!h|_|] KMp}.Q @ĕvS"MbI)%T3QR&c+yDd:4žk:=#'M(#e4V~e©/q`#D.* O|^2.t9ܞZ, n7iXSҊ0nO*o%A9{rp8.E}ŹtAQYD1*rQqחwvl%m~[`f5 (EJtwtN;)JJHyz)߈r:%N`1{flA2&?uZTbAu~D=í2Y,M8K4潫F"e'u-r>f(>cɅ" .t\ìBk(Y?CDCliM+Vl핈JFS^< I{Ɠ& aq>38bHH][+0"[zHq=`U ƀHMk%N2/~hV6!wFCᆏȒs(>Kb:VDӯST .>J!K+Y@4ЯJ/68VbsԽ+_> <y$YGU8DCD([OO'OS]\>|`x+zwӧ/DUŌ|"Hn`* T?Шߣ ~M7& .cIŹ]hALLB3\m'HK-Ot\#w k4_Ӌ7r?#ZDkXڝ;+XQ`bG'xor*1vcpz/_ADY떮JJDךDb[V8CHL.(\)*-!;YZuAwyQA%}Ҋe:*OѬbcvúx\.5xශ7UyLi{8ܺ\}À/="tyS3>˩vF zqq&L/ [q+ bmW}'WTbhVO$3nLbe۰=CV/ۼ91/AHX܆QzX,w6$G\BrLgg/_5*P'3)bEA$ _u×|9CWN>ӾygO"Ho[I@8rF=L_+"N*9pߤ]/YAHآkh~rb֩R}:$|}#E+\̩/^"+}Kl=|q27F49aqXzB~ '!d 1 )FYN:T߈ & /U oDe%S*}{DU?{<*-pY@'F”p$0ƕ(S$^W TֹD'W"J8wqc Yvwgr.x÷Vc]\bſ!U&3͇m2B< 'Bi $^]LUPA'n)*+M)ad aq+%m=ogʡiׇOpCoxDpqh@$9q[Tõ"%5u]@UH{[ƢI m*R+RYs^Ehu|S"1'ęs,$75>t;pKCoU#*My+pH,M+*EDՉrcGj[Un l<&WlN'>cdiHvH䑸FTD :_s?̇2q`q9ODlnwx6I}Ŝ-A7N ]݇!nwt}LVp?>NURO^'<! Q:}H7ɬ8l?r24OSG1uO_M-m^acC9ǩSk )(x{D'-POϥ*?HeVS6we& ߗޏh&%{;HqЯA㿈cy5 ϸ}s h6)/X+Vu~_,uk.n )T8n,5>+nW-{H~e[dv{ef'u(BWMniWP\٨Pԥ2*'[;W:tk?udpɨJc3r<[ѼہZ$Fjg;M3OloAheGkzҎh*Zh́+b7ۻnmw[O9 4ClycE!F:A'\a_WVC*PՈ&qy|:wk}ZQ'U\.ɛlȳ%%EA?aa &WC'VR67I_@23/'v7)QZJ}Cbi^$E 9Nڙl˯`!.^~oCE֯9ʟ] 3*\Aԗ2X[?B7_bv^ Ur+r ^Do׊:qf_xyK}_J^گ"l{}?!D KnZ]&5pv}@RxE1O9xnu+{#탐LJ}ÉIZcxb?&Nq7㛇A0ܭDP% Qu0t@}L(Q /&wɴ1{UW@S%P~?-mE25 硕-E;=~ߪܫjG} m NzQ^@u2Ϗ:flH{رw\{U>n \PG| BIиJr%hЧ|cWƤށ! ]nhh_۵(ݘ8Vo8'>BHFujǒ'Oc:{kx W(jĿvwn)/Hr4^)yP*K(fEF>ۨ 6g sTaN\V!6l[ݹb"h<H9#0Ey7Cr Muc qjq&Z's0nqߟ#"Bx1 |A+ n q A,:f,rGHm+}9v#-e]_3ẃ_`Sj/TΒ 6+fkwpO ճHe29q)32;Z79("W꿜I$ܹk}(?laן8T+DxڣSKDOy{K2? 
:%uR)Y5 ,Yrk^şD# "05ݶ¦sϕ۫.(AVF/1Gx.פ'P92Kz89@aUQw1,^meh/,4 b*%@ ʷQp3p8iT l)NkG?o>\jOu)ht[|Fk1 TLWpw&ukj(@薶ncCBI[hy8aǶaDY~O9'o]~%Z崗U%ޜ$;%M D$6]Po[!F 0IDw.s׃~HطA%@[.-4y-~zUf> _8e ȏ&Px B5!HoC {ftфRɟ D3J|ð E!q!݇r!L?H }f#,ߤ˯~ Re\(j3ᦎXT1y2xN?"&kKX+ٿe,zp~t+ 9.169h̩O-{_a#"jq^V4{Upv.%nq?Bx&xGZa*Bu}X)UHD{#yqܼMƽNB6Tm)JTDm*$TB% !.Bm,T($QBԄ$qUBɭ81o3gyqvsfgfgi$99sxЛ~5%'壥O*LXHJ=I?Dcz 6LY.ifC],w#XKt#뿔e8LZR?qLVTP|_Q*Ή&VhF[5ްګ|z7JR|XuGjqhqؖ6Tq(;mRϥ۟;89{ct2f+wmPUmWsž?$kJɮ6Fl폩v&y\QgI ummI1K$m%%ysUu/ڿZ֚p) ,Diq揳ci=*L?/iW t5cPh7͡> cd.~%u|MX0~t5gWOiN0}慑 m~pBvVŮ1FͭH ?P>w'o^_zxetiq:nc끜Y]]*1 B9Tg[GuHwfg~te΋B~jQ_EޛJ˘y1yg2uG7T1 )jDݡϫ.csIB+rnRJ,jfa;0'{zn 7Y<|POE-%*L;=6X45K9K/ [M&}J+l_HqCf[sܛaP,ov촮TDZzBt5' {W \GVACQvT&]vIƺ5DJoxM=ZJ7HeCHCNȼ[ӷwd "?=r'L$]cV|H;jݏHگaIVtlG_3eI=-U#y,W5c&"LRA6Hâ]Y[:Yܑ6yYrQUwꍪH,R0!- ./QљPwSr.N7{,\hzdFO:é\`K?oGM ?ԑbq&A=L"bT'ؕ{6)SZzbHz\akjv{Ҽ.;Jt7g%@ij^G~?NW iJlQ şLnR*.riSQחI!"Iby_ K|=/wtECaU>ԞAVx$ِ}_ҧW\u {/mN~MRRR{fOբlwPS0ٿK[=ɆE/ŋIafUzv ˫Rݒһ<ݽGxx/Ë/~mew|^A2ݫ{C#KҠIx\dSS.BU5jهH*%x^f'$X>u}ئg0{abux۹cR|$;>ʥMIg +ܖժ iaSbMj^떚c'-IBZwž/O,X'CV'J=_Is޹[lPt♹ݱ+vMUѠg2EUf?nJϹғ 1 Sv/f'e8 Nh|akJJbdEJoW[Ym%mՋj瓲8.ٌvqdOE(c0hlHpN@2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2@A  t2?mIENDB`pydantic-2.10.6/docs/logos/comcast_logo.png000066400000000000000000001117121474456633400206700ustar00rootroot00000000000000PNG  IHDRߊ pHYs  iTXtXML:com.adobe.xmp 8{IDATxw|;Փ;fLeN^4% PBJ b{ջJ =]|ߏހ>3I<#((ߴlw@QEQ((@%tEQE)*+(JP ]QEQ J芢(RTBWEQ(EQ((@%tEQE)*+(JP ]QEQ J芢(RTBWEQ(EQ((@%tEQE)*+(JP ]QEQ J芢(RTBWEQ(EQ((@%tEQE)*+(JP ]QEQ J芢(RTBWEQ(EQ((@%tEQE)*+(JP ]QEQ J芢(RTBWEQ(EQ((@%tEQE)*+(JP ]QEQ J芢(RTBWEQ(EQ((@%tEQE)*+(JP ]QEQ J芢(RTBWEQ(EQ((@%tEQE)*+(JP ]QEQ J芢(RTBWEQ(EQ((@%tEQE)*+(JP ]QEQ J芢(RTBWEQ(EQ((@%tEQE)*+(JP ]QEQ J芢(RTBWEQ(EQ((@%tEQE)*+(JP ]QEQ J芢(RTBWEQ(EQ((@%tEQE)*+(JP ]QEQ J芢(RTBWEQ(g;(Q@P u!ݿ_@46` hb@T#1@(삘sKtEX l6(JQ 
])``O`403y%Wb&RvJ`/`̄@[}2Ж+!vŪ#Al3>R?ɾ1/}Lo37(%*+pl$`le>^r_rTTUa~~g/TBWd,)|2xxXݮdlwĂO0?C54Е\v ppFag`n+K02W0Е\t&y{|sD^d"n(9C%t%L~m x 3r?~ V|{Kۀe#*+!,?>oLlݑ,BHEЕlm91iUu%ppQ;#~lvGI%t%[|YG>vGv~v)fg# `Th m81W+_5@G; *+{݉s?^HX:,C TBWsUW ;;Iu"cqxFx8E~fwbNo~٨(۶S_AkkΎq ]7H N ہөv; \).)dkո2p􇱘e4'WF4F粵DWn z#z$Ic82D&ͲDh^׍#UQ A'7vq<87KE^ uu&YƦM-45FimJ M)%ٽLh\B!/b/ՃB br /aPz |?TЕL;x ˌh9~Dt.^ElVRmHÜּn4v"4 p&`>)L! 3Kp:pVMp ڃ!{02 4 ř;{>ʚՍlF /a g&Wi"y|dJJ&9M1]41ljFΒa甹$zGTKpWW:쌣d_{;o ½2bZ}{M-B^|~M;DB#N4w3bd ;ʸjmeeR ]ɔ2= 3(Axl`tI7'cCtBɖܔ0]]oX;oemtv&()>BKⴷ( yG9#Ԩ%ǩdµ+mwtSK^ R⮮@ x0I?@IS}͹r~.;/,e5\8r$ ij"2? 'Mt.ewPe`S ]܈'ij'>*DB0 bxCs ieaVjc\wZ"ބéQQh|w)JGG+8I{m_=(P ]Ev+l!:/*UY݄a$oEstFʋn6zڦX45v_֛uh3#]3/tu&o!|}4f~#CsϏ);H¯,'W'furiy;*2͖ͭ^8vX`W0eQ ]Ilt$jXz7i=QhnWޏʿp:oEbM_gϮ+}#՘g`%9M]ttĹࢽDX/>(^tt.Y͢sJcǚ AӐ8AM^!j eOb])sǃ"RV/dfY߅Օdy+V >ķԦ/_1=ooy7XͨQhwMUK]}wrvQЕ8!]+ֱ訫wXMȔk6kH62vWV2bdINW|ORچ(gr:@ORZ%']O)>lt> K>B%Ii~:'NjN{ vcĵl"as8̷'S:qhn||Vv,tz~pq84w@JjP|ys6a; *+@)I%tegz #څ3\ޛCJ>ͺ/MDyETFj C2hp<뚭 UE!Еo9>?#!u5՞.xW:{>Or|No i$XQb6zoOUE.ZJaS ]2!8kïGsz\ݫ}Nx" k3UTVvY Z s6H*+_Ba†ffiW>Rmf^2xanCzYX4 <)K%t\m-w=7Vlp ;3PQ 2AJI$dͬYh5̕vI)*+;2oߘM۳ '%1ًl?vlNxj}6vI)*+;bLGeS}kep9Г1j?|o=E!gP(S[a5ivI) ꮫ숥Elc ͯ}gg!D*hz=R-i9?l뤱Xm5Jl/f†^'Ѱ -൹K ]k›Ŝ3kD M}8)%>f}Z}1w(ʧTBW7 tGoz="5:'wme蒥K {U!~ {X%k\⃷RTBWw}bevG)qU6{1m i[ =6tN&H$ |j쏒TBWwߘMQ#%gkZ]IuK/RBQK-?JS?Jo!,nWk}w>ӧj+_v5kq{jn6oneK'j)Е&`ji㕸J~%%pE+!|]]I6p47GYk"̤(JMrQ7N dK-obӆ;3č[(yN%tV.X= kn:h4áe:0ξ(M݁Y(zPlRҵrlnEYv9hﴺdQJJo\]OMgt]e=ub2Yv;hor+R ]ϛNh>*[ ܖNUltii+TBWz^'Q߂U =[4dS+>_JpTBM$:]] +)TBWz}(Y߄щp-Do$Yk;;tv&pF NK  3kJaR?JKCTS2X]p9̄LP`to]t+WP(d!vL!JY90b ξSb:B"ꣶÐ$)Ԕ|m"Lugu!@&>erjbIv- i&S(wj 0t[6vGc*+=,ХHH+/a|~H@өH8A"9J<V7Mtv$=k[6x()փ@$%t1*+Oh9S3>Bܫ=7pY7g 6P `*&D:; _Eqg,|phHuO. •[N8yi;ݥ$,~+G%tu##d8r **$O+tCz ×%#QbAEinm(h-@;aN;niPeq( d+_Žw[ r;ϙ}({@$^_=潩 hꁭ3  B!K :By8<3X_5*`0X| ԦeeGTBu'0z*ck`6}[ׄ)Y\LN 2qUd+x9JE /Fץ#BdB',7ۀ>^Sy:T bvW7 |y |l@888esm60xs~W#Ұ+R,LlC _շHVϥm75lƠ!\. 
]lߩDRγzvoݍyo88dlZ]_0]*[s$lr1G^5}~Jv|8ˊ1VnGzGY]Pɣ = YK|7$:UUjz(p p)0j# 3e[G%7 )w)\.Z <<~V yƬtH$*tu ~dTW[^K~}pp8_h _TPvm/&/oop3;;+G#;e@9Hw$6MH5*ma2j?0~J۫9x\v S!* |>OEJ!3b2$ }Ⴍw_̐a5*"OQVgb+K-bg']N9 טܕE0ײݑ~a&:+{]] $0a7sy졹8F*Q+ّ`ʞ|}*dw[;iWv'}XI +gRN$bo]K{H@`OS,b8˻+v>k0)c"& p‡ò3G(U~ VV+dń lw%_ ܗpd 0=gӦ)Kg(]~nͯf0ՌY) $ՃByn(d@vB\9߃-S'QdjޏxGfv?[of63܀ ?G-eEݑlpo;!L4d,_^Xˆ%jWݐO} [Ag#0 Jˉ(F6^Ѳ Tqd  ./"T` ZpjީZ NdmoYR3L&S8B<r_͠A!4mT>ўYgEsUqַ'htL HCJ\K85AAeEu谛e^Qbwnz|1};" ;{՘lނi 06/زހhB;??w){ok @ ^D QT(z<ڨq 6drvJy4w8jn6Ì p~?IhnbcsuQt1k[c|#:g%0>%n<qӇq N_^]yj YyO f}o]37mA&Ӌpt$:d|H$!rq?i8&peo/x{ gXcl_3z3e'lng:CyN0wK'[ڙ:CC .a&t b)H ~XcGq+ƑފL7meL7֒/h/AT#~ۏZșKj;-q}2ΩsP>sWQq lǎ1;xVgd+/u"Zn8#F[ej:ci3.oЙN(zNh lv~#pR _ٳ,H3;S䂁0BpR&+'1_ƨAx݈pn шlki!_׮t-_GS/U^p[Bt&JFM[]ȏ?En63lXŗO˦[ u]DGC )֥{F$tÅJa :1) RMyjLMjKX QA<;;{+QTE 'UR _!q~)G?rQ|8Ͻo0.=C #+?^PC{{, u̘~ioNn/m6g *_ =]fO}XS 8qb1?;| }4}B~~8΀FJ^BcIz\>h4i3˺Z(ʑ{t߁$g>/]\PS" fPpԪv !1xpf R.mr_zpiP3gnS[N^^ȡ,ֲ1fwk`oBM0Mt@3DW Ww"Q}цز=WWev֩䘃(vuLxWyqFV75uVNlBN>ubڐgne=KǬZ8Dr/ogA^|Uˤ{7 3goEbbB9p]5!shCG[@A<Ԓy:?GUg:2 3 ܑR=z5DAճiUcA/)]6l,UtP-l 8S۴Mma'e攵HUNV0KFEhhd=~?kXiN=/0/nWlY,7%gDo=10yoz mXtzF:[V_s U!8K3S$  zH{HEp[ݽ?ʬt dCJs}l70;eC>'+C `lYNMd^>\%z j,[Fե)*4c#3"A$*[%3m8QOZWrS%}=/oo# CMt~2ֆP ZXut~-?WMJI2c~J Z{¸y_iN1T2yk_}>|˴:k/0j7۶T21=6x ҴU{ES&oR4!2r(Ko#a=.ӵ=/EjKI}!_j^؞n_a omAU21d' %..?DkJɔ#H " rN8r@zj1rwNJ!8lH.;՟1ߓũ]hC2`E5+=#eog< %'Y>" ,I$RB;a|Zq{c 7d{ '?&h;I'- cbٸ.R& >\jY"r U+d<S u+/za9a  MIiJ*|,ZߩI7-vtt. d , pv3t![=tMZ>g UfJ^. $UŸ>9"SoUHsKǡ&@߿ i~0ږ>|JS1G$^hգ 6{0RhxYR7~gE|VLetZ2%%>Uӽ46tLb}=36ӾJ{s$殀]nߡOwӟ[Pl%*J9JĞ.2f9̠.@hNz xGgHX,EX,ʼnXQm'ߩ 8lkCK{tIQ\u3,?c:x/J?fUvwI@xxg>uMO)eHZd.{y#9ծO{}3I \b_2'_uV/%1q!`C[J3^v5b!k7D Wio?LTUJ$51Ǎm,Yej6h/;zzQпjĿo$䱱;o%1~{Uw(œ*qE*\J.gbc}ԯgׁC~l%[go`O:Ч6fZd:n(+!ߑQ{ʫ.?dG:s7]1ϻPI)Sp2/nmSyιy ɥ 됽r'ՍYEVpAJ|^!LqR>0q<"*!5R tf! #tB2ͮS/86:SL9)d}lDQXMaiWFZa9GzG~[$RD(VoO]H? 
+xC,o=)= y5 |a'lk'9/C>{B3; ̞#E!ee~q*B2G3r_Ϯ3G|.DjK3!,m·hT2 DYDvXSuEwsFJۑJu7ddǘªLK-V#lŸ\N{YH߰\i,b$yrKN%0j*4Ӌu4oF`.%>;aq:X5nU(ڶ TcbZ%MRf88~*/9x*4È'q[ UVCZ !ࠃSYeDڮOܨʼf[#ޚ`aNֆ ;׋uȦ-O ]RvM"BՋH(UWƾ/O,32+G*|.t]tuSē0:"5k4bV#;vՄ>(EF:dG38Rl x$gX/4RIGQf\Zpfg]HP3෮ !hhd`LB7_ZV/U <:0 I7kD+Lj O޳sМZe$YYޣOqCJI"4F,i}*$`uKY -o D7.qfCo6෰ ~ޛ " S BKs#tB2[*$ 4AmCvB&+OQ$ A$ f0vROGTD\^u{0 uϗogVUH&W8-gJ?jBވPgE"r'A萩$QL, =B @ggCuX1n] ,%Nڐ"c\M}˒բ iȶza:Tݓ. \%Yۃ#RU˥a M;j48,mVgMЕ46rnV&$ROq9DȺ:3#r ƍ#2 ;]1Vw>K:詁/'IDc Rr+S>0 )t;뤼<@|tuKSW 1jFğZNՄ́RrxHq gp%^S3f`4=*"A≁Ѕ0ߝ;u&XIKBy*$8-R|Kdre#S)EAò?PUȖRO8$뇰<UH&7I;5NK =m&ncP%48 COI

t*QhF32g.) yHq+f1c,͜:3]v%~ۜQ#d"J= *i "$g?lj9Lgر%Sz .RV'(.iG[ۙ1T!&aH-66зYH 3K+ȈF<-B66xQ1*?[ ʭC*"Abќ*OQ\Yq:UH&Ŗz]K~m8|85HQ"DRⓉ7vih^Y#R(/!uutp*iBUH&I![fݶ ׋h2HY>V0QT~%f9FWSQM=֤Fvpϵ^UB!-nt:*$yP*Ct ˼V#l;vՄhcÐLGD=$|rwuK|[1 ܕexVe+ Эio'D,H}:%a7'vv.|7\+NdT^QRbf  Ќ3 kyJ0#"]SH&CX[DNUH&uF|rB_h"Ǩ ) wN8+4<ӧ߸ ϗ8a5:(+ P\#Y+ݛLTAYHƭ<'~P#>E!>UShf+>f /H;2ޟ8HhSHcZvVnniqpPGj b,Gw6t29]Rl@"ErQz(>@[ kFFJÑ3l"@{ uh1~3L8cTPb)|e> Z*zxWiN@u@ok{@+$99&;@9=UP'YΒ**[vBy4̑Gﷶ/8o/Udr:S;"i9?|`cl]+=|eִ`M$zr򳏥h$ &xwEiʲHeNV0b%NUmNt$T!\uM]f"#qVgw;,4cqp:4m\o:UY(%Ukz r"kSM14@Xy1GXٚ#[zB_rk1ڢj=lj`ƦuUo0خhtFs T{ b뚞qxjWOihOrb˟ \:RH%2fw[ID[`ux, I! Ќ2g|~y]ACCSa8kGԦ T!`Hp oZ˦dD>i"Q\s1ASO9KJDq`9L5g**FZKxG AK)I$tKǖ4ѺCuhM0ft,n_%z+-U 6 4 omC2sv{ 2w̄I:8B2sU!0Jv{ڮdJ>$t\Uu5jˤD+Lj+<<~k$9y(*<~pd9nd˧R$_cV/CD(6cts@Č,7%L|kmޮxr܃Id\2HBO޵8Csc,Ǹyv*$$չ",_f,.#qmں= HaY?QuȤwtVpjeedn])$3.K1>i3S}B@s!c\WY:~gW2)g/}q lQrǏl 9ѪgM>{}F27e^q2^.̉ 2:L H'lG8'MZ: [1pCx~{l)D`qx8~﨡YYlt~tP^G]m,ljo1tx1ӎ^H׳ C%tiRge{e$,NeJнНb_0z֒ym~= 9?@vql" !n0Wa] %½" &hlbQ lhKҢUHn=c`H~wh޼l|%] s} Uct )N~/K1K F&0%WnW*`l\ zB  ͔t8%QB3B*"A0)$sBnS Mס&#s$wEb6'fy 6YV.jIsc <WanFlS JѴr"fcvLpxLoUEB2b!CCHvEb}P^?Y_/bM s5|peAxx~wb4v$ߴ=rI&'үoDxWyIޞʪ ֯91x+$ $kY& n|:*I!9h`x X=n|'~ovc?AyغVFO@RHaF,wuyIEngHLGv!WD̅+a.V(X 7{C*F+G~RI_=O`R6l猎.<ê_&HpKޣK]H$PuK2`[#zD8v@X{*pHd?a.DqyY:vSѴѺ%-'CW"Tw0<fBXe`ʆ~RhD_\  2f@߇H$ / :c{8>ɢ-B.R_1NsW0Vx= }wcD߸޷! C'i7o(]:Ǎ7ko/RYmm1F*eQc2YU}4!U,q g/vr :nπEB\Y"1׫b4KJCa=\2UV\0+{TU3PMQ5+mI^\ŪnWۺ(9yܟxM]F_{H aUk#"UA~WFR5 9xϭָ*$zJE9qr~} Ykr+/eceW_lD#@q0b"* ‰.pw0B wsnK"ڃQ:x?V^Mj g-bm:3i͝<+ghϛv[!-ax)OZUHf<֙8w9(oJ;RQ-l5ґKDͅPvp 8v,K0+fLOd;y}9tD6Bt@o=Fܜ:DtW6N?J] HN(Dh?p^VU3G٪=<~dm 7sBw<!WEqFdh`*VZꣴ-x^z ɜpB2-l - U>5ݾ3M1PCg`_ХXܺUY޾[YYCMdsSMN! 
pS1Wp%#L.c_4(e۾yIFd2s+ZG6#[AZm!p#n aNJ'[#6mF6-[ОPzhd:\$ķH((]ֿ0e$?~BsT6ܑRd m;#H = KU!]Eb4>ڹ,l]ˢl V>^pxÃCCJ$CAG*J{ kx_ux}!eS^/Mĝ-w,LTBT{/Fּ{ ZC;x':\@ShZRT4~ Da9Q}83w8(wpu<~h0F#%א| R+Ng]Oj@ j NYۑH0#Хa9!B2Ϭhv];T֫c02gm重KxaۼZ;E-@p;}xY"n_Yc̪Y[/UEC9b*g>3OKϴ#Sli !;hA'DSAм#wrT+ @Tt.֍A dk-ZX7ϳp;?v4ظH&1:7ٶ╗>wGu}B@ccJZ+Y/4AJ?;O;m<5y&/x Cx<ŸҠ3fМL.ۃ eÎg/cga>+ Iv4#<!| h4?7Py Am9ꟈ?3dG;Zu} e-=s-EZ 0mI gqWYmqsł[oPRCi? xǤ~CY !WU  q p)9ml_Uu5<<%jVË[%06Bt8׎9}fɏCLN+X ۡa͕Ks]ihc 0,#7H͟(/GJh@ 8'M,;4<*ޱl'{aN.ۺ%6m  yӽgX^lwxmQ> hCJCIC 93%Mc+x(wrЖh:rS}|Zi듳?׿ȹA2(*_9gn&s0!X0Ծqȏ.膌4 'oVHұloxrAלcjl|'ltF**H&y=(4ASciG7&xyq*$\=fn-xKY/&z1Xt]FP|Vo3Dnx(cMOcAtli y>!>,4B3*r;ŒċG9.B2Wm `;+Kff}mg 'ml%T<(S}aHK8 c9ȏ n7# i{C`::LKnϹDZn0j6![kY@+ ȶzK!IelmL#YV\SEf4~{ Pƌu7@ڻ`R1xXFܻҒny)\skW4Cں:'PT29sKx)XJBbW=&bpl!:*W\f맃1|| `l^`1FBג|1*/?aZӛz8r܃"iu T9dd-kۻ-O=[k8p\634*OI93Acp8‘WJ"AJ¡t&;8k am3OʝY d޼̜bHϤвCnXs.!p|\e4PH(|qdZ WY'^ 4DzgϐJG deJ?`~<*FϨ>I:YWMdrYf׸^<!R'DY?Klێj_/Z6m 0=j(2pOw"t!??iM 8\hh]"/'5t5 1 ;DO?(*P^ ZAc6@[zFۢ3:M{˥sWi)E%9<~nH"wnv <ޗ r7 X_Yԣ}1{AŸbB-BngyŽ_E8=c:dWys|ӳNN|sJcB ߨ¬᾽HxZBh1tX1G5r]k& 9hA'k&r k;veiz{:~5wWpp v; அwER}N}Nt-{GBd)lEλcYߖִ(?|Dt`deZ0לn0\=e{" 4MȣF[+[ӑE=d m1S$& itϒ]%Rg@y v)u\5t$9歅Mhhb)Yh4},4c}(H!-9`ƔYns s[tsGS̟?50ՋǪg %+W>2wMHPI%; J?ǜr]SN_r1=f9  HrÖCT]qIU_ý7QQ$ݽRdNx9kX hRP1{)\O@v tn%\<w$CqϽ-p`'y{gwb7[pVr4p4xg?;3֜ f.FS_8u@|t8KBD.:x}-bw hޑyHe` MM]LR~i3P?{FuQsFLֹ͟krq 1^PF ?<~7٘֫r/;>kZr6u# @.6Ɯ!n#}Y7<Nwb4`N06&,Ǩ\R.Lp PUUr7e 9z!Q>X\8grb|m2;2cqWaE%;j&piNJM,75%tpdëO@0TLI.mӮ_I98=,Fm+b4xޑ)?c:-5bƖE\]Goo1|D ӦSH/$SHN&w$3)unw 7HL&Ҡ[F"ZN*/ο s[IslX iZF싑 .TKF*F+x/mǨ]e4ѺjOZ0ןta$w~jUQj| R&h.$Z{P֙Ee^8if=Fu=eꅼ фEjTARںߚ~7"׀LmQ=!ڿjܧ1r? kwMp 3 'B| lY+;S .L~yF9qzsDBhA,wOʢ&q̔no~u35P[s3?S:ϞS%ݿO~>7v&ipC:7a|pr7BDǘ4?@cԯ3z֥D+BW}d9Lg!S u}?zN=l֒WҐQ=i!ҭ:=8WOfn73b+š ,3s! 
*6-iE;J[.W ]SRImrş0ޞ -2Ӝ\x;6hY&˿Ѻ 7O?IЖ度eHKR:^($ؒ&7wB0unmMr,d^bFo8e&%W=F{N̒쟫󻣄~ЉzGQS$ p̳֛b?rխk9h7s1"6WXi +I~ټZ M0W9)3@johS=d?iw-tka7O_>N)~aFGh)BFp2zZ1y@># ok'nb4=ZR-HeIQ3 ͓r[2@0ٮb#1Y_پ>K$ h:SP唩,q2-)~9/dA_M MJp sƧzbhv2Ę;c͹MsqN=,F՜hD+ ֧#|rsޫL$~$⟭too1bd)Ӧ,$v!^[w?~e ykw4gwufyHr1y9k=X-g |;ci%WF*s$"A_Լ,G);u%Gٻt!0j&;̴X,$Sۙ\/$ 6aY)|Hƚ{zLyB^jwDG^|B?J$Y"L%mwb3Zg9I!pL<}d#7 n7/w]f1 4s@'t03=dOT;כEXrLj9S3GRHsQ#ŕnシEspx :|y(ش Y-F#5[HHZq"%{-_!ƻK_HWS8wbqƚ{lKEoã yvB\ob ,*76nJzz=3z% wun306Gv6ehEfBO%,\"jcINdz uЕ=gTsFSs-OМh#CmG+Xpd]W#{zb!cq%Hmm+1?kNW x THD}RY a_aGdаkkRcaC h:QW~Bܮs;LyB|EըIrYAPTX @S _m!ކa; }7}0D۳`"VV+yB2=ϑ]=h,ϟ?! }f8uf5hj (–5j]20HAo۴{^Kd v h8Zte3ї?m*_%ͯ"W{Jn=XrܻGdի=NܨRz޶S[Xsi6޳lèb4@+Lr޿1,l[>Qjt0O\[8ż%\4 TKht1~Pꙧj*H9-D#rEׂ*|W<_ ^EcM3?ZwzQ3l; kZ!d4aM59ZкՋyw{r@'4F"v,SQEȕ`Xs)ta isfIO#[2׎Nҭ[Yk&ۣdD+9WCbFphy+ ]`ְ+zFEP{{ֺ i?Xs{k}0f5 .0ұžb4R B_eoO١!XHFЙ([p+ݺmϸ܉&Z윑my G}SN?HK땁Y& _ZKNq-2i7TLfZ7wBІyQ۩qW dt+=g.eֶYPg+MR)zqKϗz dj@{{۪G0 _Xsڐ)6?C7`ԯO4?[UցK#רXzdc+vצh=ng$C#3˕ G^M}? Փno\-̃2sMf1qWC*h芑xC[;dUkz>=P%)yq)ɐ{z]?2֢s 4ױW`nB}{a J${f̭s3 >>`i*y0ӖdnN|klLxnU$FIW)E;VKJ"x5%} =Xsw#$[y`ч"XqX?h.PHFhCg&sq){v)bA:=^1VCZ{^˱R?@RtWl{{ކs%Ƽ ѐ&]^湸:cK_Hߣ ȆRu1NjWNྒྷh̸k=sAN< cZd}ۄ$KU\D˦($#*e!7l<6@THޯ]}fXuf.HSl5>JQ~* bsKR_zZXsȮVfRѺ4ʆZ? 3֯B9uBeM@Gbu`7NI+jҶx3gCng$*d8EXO֊S;CP~ Siw%#zi]ޑO~&g1l/s{[r'h%U flhg |Z$ 獦җ#r= ~{(^def%eIy.Oؼ=Z*4KWQ'ҭyIjڠ̶v9sk,KPYd'o9j8B_QBRP`x{ohx3c͉j|ޫ#eDI5ʹ =7iʛb X>$s[7E9ٿTє%TV.y*C|V< |nr`ESCχN5JW綷}e?hIxY& n'73P/-k[;Uͫ'rqCf4xtKżeᑄ?dUI!Hul.fQz0JKrK~ks~h $t4SCL1{mQ_'KYx.8u1/icm_[c[J5Wd&4Zxtҳ} |ဋ%K',5e0fhYS)$hh 좫أ".+@nE4H ȟ?7a^WNǠe ONm5f`:/pWd"Ztd@=A MD+*G:% Fd@ U|uoXm0NWdZVq3yߤJ~>`yP.A(.S\ :HUw(۽RlNjHST4L`Wr&vR-aIRⲼfd 0: `֍RSJ1l)BT^)iX־f;ؕ "U˫?q/mk74` 3mL <,6Ĕ;3ؕr»Iƛ ! 
v%7I"ں_soEvxX֞ݧCfBh{=%dWnkyWASFJt$͟pٔxlj51iЫ0QUN0IxURWr0K&t6b-?vxfL[)45*Wr -8a䩼| "ܕN"?<<]I%u%$!ǣuFfWJ/zo/?8|jT:ZWqxkڟ lOs,v )TRWrɎ&!&1lJ^6>)Qєb&sIGjz4oOK˱~0 ri5"gީ+9H8 ^gg4 =OfVܷ;?B#j;c U=d^;v$w0^,w}A0)C~WrHBG U Q}N{5mkY8C 8}jTMhR18ks޸XINrØ}@Q)Y1[1:A1OYW<ɏNU$FYpohFB /XM.`0şŘwl 8{ztCx=UAZY5ncƆ (I B޾ ~5#m+Pu>)V/ޞ\#'ݧq%t7qOlZS=%tu"1Jn҄FW*F}#~ɾc .p Ig#,{ cנeѺ{6à#y|#ݫ֡ǹ~oyr?SL[FJtTޱsoci+fKT t rxt-r'bT^zv.--Vg+9GVcl~9P;HVؑ\nf݆70V>4O-SrPmby=WY{7w1rUuǿ̶ťv1Zm}ЈODSLAF#F@b4FBjBC!AI 4DDDmUHJ?s}8;uQIvfO/p6osm .7u<99__W6oE5k٬]^V>`g.Ts߅_ H:@jY9qm{aݵe=yͷ} _kdYΩS0Ձup㒁v:qTyS+p?V^8}wi*]cm<|;`]hugi^57\Í~-Cu{[qV4So*<@af&{:lYi`v+Z|[y#?hәٓ0w *}\6k/]D[w*-\֎|>?3URUU%e~7o=R}=LzfJ7טg> 5Bu-#=coCkwtb7w :7/Qſ~'8fgP6#<}勿:25Y5ξրwo=$_xzV7JᬼUn-*(hP- iK}CuQoѭ{͛;7 I+_^Π|7YzA8sJ1}f\ܢBGֽoQ].5]e({Dm5X~g-8}& De@ӓ)J'˩d=TJ#0?D&48Ζ w`7:z Og| ;+~^!$ܒIe=0;IQ?MP){Dm5S#So̳-EQwjEAT լB5!(W+fUF VVSͻbq33P}bD!^}$I]n1mө/_$I* kx;5C_坾$IS&Nݰv7%_v9F t0P }-eBej69^zn @I'wD $Ip] zӛH}pvIR$<֠7f;+K$i%ӟv|: |rp[c$ŷxxxܡ \Ezn]Ñ$px4]A_j56 ~}$i9?HvEs!F`\fa\l'IQ-8G<4*k5m҂a`[$I˔󤈟Nb4/BCVJ%I %IRtI0$`%I KA$).IR] $ItI0$`%I KA$).IR] $ItI0$`%I KA$).IR] $ItI0$`%I KA$).IR] $ItI0$`%I KA$).IR] $ItI0$`%I KA$).IR] $ItI0$`%I KA$).IR] $ItI0$`%I KA$).IR] $ItI0$`%I KA$).IR] $ItI0$`%I KA$).IR] $ItI0$`%I KA$).IR] $ItI0$`%I KA$).IR] $Iw? 
IENDB`pydantic-2.10.6/docs/logos/datadog_logo.png000066400000000000000000002032551474456633400206460ustar00rootroot00000000000000PNG  IHDR@@~sRGB, pHYs  PLTEc,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,c,,ltRNS?]y8Sl Bcަ$A^zWv9Z:Vs(Gf6Ro43OK>|bDe![x}yxOq+CDDVܼÆ'z]"""T,o'=Х"" _t%#G5iiZG ={RKOkGDFc]_xz󵮉7 ŒgV]x""C=|ы;%m 2z2ӣjpCADd]#<o0džȗmּ˛H.cDD'UάZ}9`MhK~݅) z؈pn^T!zq7HfK>wjqC햢Hr;E e$"RZ1S)1zD,{횕kjݛ|i5z̉$+> b>0=DDn_k!_pC7;oP_l(j? ;"LDwQbpbM)5Ggۃ(< 7n(>~pCA7 G?~a݊/!3ЍCDuyNo2W"iմ-=5l= ys (,>Pps b#(.b]% "ͷ+k8%o9Rf>Lj(X:/x-'lDwl*~HmtѶ5!ݷM>"2y~W4 +Qǜ_\; ID7e9gXh(?;|kycK^\M,Aw(w+^bvK'SэJD~qsU,ɞs-KD`gW9ѡ#o"gA.ԭWhjN; f{?O}ɻf&"-y"ewAҡj"R_S_D jno"ȼщHί{<[o 3 OD-4_(.;ܺp|#6pJ%@DNxjcΟAa})MKF R}MQr7kCŒߜ8?6]ڠ п$9}Q I q諅x\}ܢ]%B_>Dl5+:" ) 2ỏW>QܼtC&.$]Q5Gw(nA_`DAtN=w8J/4ͷ+G(٫&ҡ#2A/RB_zDа˘?$2y~|?}q*:MkwԪh|c$ {(w˓ȟ2^p\zLÝ]ї(!-]8kJ˔Wŋ\[5Ǭܾt֌|ܕJu5KB_uUtl>3UJ#>4NZC򍧚I1'Y"4u}qqHeg8k7Q+O쉾c yoE*V"p2#*>\F>Uu84SEW,{cV\J{k-|4=EKV޽j1T [>v݆d\"mkּ w.|}Q1nQCmyĮ_"]WnĶK3)7>*r iyGՌ[.E_Bd=7z±o{ t[ɷwT7[G+ADD5wq~;%CqGg| b")qB֧yqu=met60H$eࠋ{/8qh+U:yy/<ׯnn3NJ(ݹ3)]H"qNoń.a[_N a㫟ȝ_/Wy /k:N>'K.qkș\٥ЭNaKV a=v:,M}ϼD/WT7v %է5Wvtks2 f՜GGn7$?;y;=}OWBNCiEwIO [duۭ2OjQNC9  Ld߇κ|MCrzm|WFeVlY^yޮHtURStTpZ >߼#nW":?<˿%Sqq(tGw)w9W?T5r^ sMJ;u#:dl\62uMJYE? ;jh<2&%*W[z.Z.)+KKtyU9\rD'.|%9vDAfG0Q ݥ(wx"K݀PBLLjۣp!G )7P}]QAuև+Z#,N)D6q .IԃnM19ńAI}2CYY0!4bEC^x?B,qOaQݦNzȨdʿiBE1pUY:Kژ;Mp+mcȖG2rY[[;8VՈKfc$nH9(6T?+^褼j1H&^N˗{*<uyVhTJoSR?щyeXsѻf@d2}}"?`$P9(Bmޥ[4^磏MK6N([TMϦ.R Rݥ:lm42bsrqOK(T+&8/Q8tǜZ. 
ƽAVݥ-=MZC'c֊||?w*dԞ:6-#fjWc&1VEAR93Io|T ws7Z%~ɕ?z)tŎAMU:ӱ 3ϧɉӇ}@V֑{b5'fV~}caXcVY=Ft*{*ϸG}9ߞn^7)LFJLŞWY-ZVjMx'e:1/~?[?o#v(oajĞ \>Z UitJk\|C7d N7W"{jţnKѪ+K2 U^OY}4*X E6P͘UճeySF?J[4}\O3NG(%AAnSI[рUfdx* D~MC'(eZ]f+vPU#MDAU=u#RES]Tgw*ћ^ȅq$(ƙ/K_ŝ~񿫤JHGAt#s]wPifk;;q-niSȊ˔L"ҥKϾ^JGE&nS @dzܒ +bWx!q=>ю]PB' +bP Dw'Y7#3B$QqgOȳ\<z ߇1[DԱNW0=}d]H=Xq-I8g=bKOt:Ow8+OT;U]\n޿ ˛l5ݻv.QM@F Skeʑtc}kUEK\|(`t7[FI MO_MCW b2%!7=§ӃR/GnSӽVNSc5dk-E< rx%(x̘2 Gʧ~rrZM=2/>mng^ČhC%%K_ƫ`Y8Zn<2/@t|\*,6-27ɧ3K͖?љG_FIBP܁Sjj]S>h'q_que5g}>3e}."={Qqcu ȟi{U|(xVԽ)W_y۾\I;Y ` ֛#lBKsg@;u~3R{ VHA(h>C+=<vvM+c=9=VWd)A5Suw= GuُnSĭɜ6ؓU&`G'(;y310M!e1mN@𬰣Vۚ}"C'( 5} @c,#/<Pz]JTWU 8+}@K?3GͮT6VϊUU*)'t*U]L2'i`V]T%;VNzS @`:lr=3HvM3T\TJT N֧hwPA(pNAʃS*ÇFm.=vX1MA (pԆŷ kW?2PAN!9a*8[" P\nT:j_*GU`cӊsAۢg)t?{~*͔}ҿ~́G6C^K-:x6o NwB/)p㰬'wRbW⒔֢s;\l(HN~E)v1DM\Q*Χ{fov8JOVq7VYvHrSc\ŝf}msi [}j42H~)pև"';*Na?4R8yv_QOV'{;/lHxLET| C @Sݧ܍.ayDuJxcNY')s!hP[at8Iޏt㚕[G',y=()x}jWצJe#1͙oDOI:xeN%S܀nT{fE*GBDoƾ2[Sby? C DϑDw^r髓r} CSBgf|rj*ǻN0^td}qz9OGQqRGz;i:M_,b+Hǀe.Ructw3́^Lx_ɜ%JG嗋dOCNF1r;IX+\@;IP} { v>Mut<Z't 5`S$33y)K#q)x4^ d|^WHe5#h?O[rdAI t&5oo_*ޟHB<<0E4ڎ SB7jb_DڲGؓKdrtZʏ7Fz3'Tƿ} i+ƛ)^>R_袤reNC b+"lǣ#6N^GC6 ݨ1ޛ.Ij97] Ⱦ/:7k!ei lF}͉K={pqwʝgyRs50=7)֠5zXNoNh|4ˉ'U\:nΚ7tF2}׋rCV79aMo5u[-o?FIS0@7j_Cö`fE%×Fg nSkG^mٜ)ƣ5l3޾:lFnA; Zp6P?uy}]n\fl ݨ֡+/q2K#,synl'Z?@4X#.2 >=bٺ:T8t*vsh/%|4O} ȍ݀+pSȎ:~GմӡEc-p7fLA$Q3 ډ ^¸QMr75yPz'0Nto]wt0e;l{hIJSo @X8U0_ эnD; yvtgUgakiӅv=+yq`\u<lȠ :"zst;J 9YQxt)^Ƕkp~D5Pck}߷իl%v)s\`n%*ߥ#r}>jQU׭˅0d).JJ w"WrՏ[R?){zXe. 
JAأC N#r %.'m{0Tm4TlM+!uIJ֡ClfK/Gfr:pgNR<؇X-_%zy iЙ{1"sPr0HH\D ))y4}k\&(&[H|IpP\D*9ZBmHwVW]"qx`Ii4+tvH-Sw#'R?F≘=ZRYt4blϤ/zmxLm_Ejɓy@ov|߳i⚶7Df Xiw$ᶻWf^D$&7[:9gᥭ'أE|fg_YtU ?Kog}85Z|Y!%@yOE?27[&\s:u\K-cOS챵IRLK:dkA5!Y7gg:i I)4/3R@/" [2͢?!w&p{pxt6{em nzO@2J̴]yCIzZ=a79R;[:Rٔu~G7CMQI?ߪ 6CWzN=y'}.Eպ̗QAWwUΈ[ȶ9>frkSi\TcԨ;6GCVv1w㼙yN@ݽ&H𨾖ktt\Gt8dV]wרE7X#)aD Ss^3e%nA ~$U95[z6} 2Y5Ҫ~}H-NZ5FGDx*9N€mKU#@5YHN#^WKYb΂(- ~bST'nqbIPpXџSfln#)kv;rb13MCGD6Rٺ SA=t‚ew:${R;H}8PuܘU KttDdg6oRQg3S@˅|E^utDdǡOTxqs/Vyj1u No_=jEsY4(=A&k+Cg+l}lKC"[^;ku3՞Vv#Bfj+?Dg+lM\C"{;O>g{Qh鼧?:Yi#RlܢPVE <}2J M:YiĥXi:&\sqU2!3[4ޙdȮ4qnA'}=9m#.W!]S:}㼂2wz:Wy{_}us즒=zў;f$5:&m@}o5!C-tgU iP& Zr=:S>Jm7pȐ!Ւeљ0!>|ȁuix?:٢mqLU3>1g<2Ϻ:Tt*ԏjUȯ̒x=Huu)LU(`(2׺F a 9S%`Q!3,Ց'3UD{c"g_0g&{*kkщ*Eנ"gt {=5utJetPPG4]x=&d1:rtJȴ!5iBS!ԑ$sEASi 1 ="d:2Kaeej{tPԅ\W1Zm@Jܫ#oB'ƥBEXG Q)V:0eKFKSCNTX9 ϣǃ TOCnF穈՛Ϣ"s!9z8_iF<gkrl1HAROCY%{#:*rnz.#| 2ݚ:/:Q5J[:1Rӥ`03OSGvG'~\4CEnt-|X!nԑUdUw"7t1_KX! jȠ.pUG"Wt1_oP!jЙU"wu5^P!tud5tj|lȕn z(5ud@JtXrMWt:7TSGޏT-w# ĕݳ2^]-y :S5ʵ:*r0]׃C:]-{:CE.}OkǠǁ OWKvNU ˭a[o |k0!iЩ*qe[ak;]>v2zudtJlBtXkڮ2}6EX\O@E5vEWO )JikwЩ*Q2׀s c[j$ =5Zt*oj tX. 
j2D}My:WO*(A{2j.3t &2DވCu7O8Asgm.ȉ eɛwEJ,.y~?.Nt t8Tdy:i17_-;ǃ&4YXA謥$a:sҏUsqcEJR[F-dSˡc br&9`#4.ؑ^6Xr5:p$i5۩%& kO bXMlȦ{j5' :ܑ17bu|@ƌ p[%O :zkG^s{@/@ ݫ/agꏎ/364%!tY"N& ky~/|V؜^qgt.Cv}n%pj#gǵ/8Y"y :JX} CL`Y!܎SK3mro TJzCaoԺ)R;Oُam:Z3Ӡ#CjM?0݉NϫĮɺDnɋ.#ߤΨ":Xy&swgȑv5Kхp"BXG[uoG/sыcF bM'mz]R3c;P4lV jg7A#Gyݬپ8.m6j"OOskF.:93yEJ~teL `qGzR'z}"3J#ȑyZ5Cga'gH%tYʝ'483qKv;XČY#BÞ+eOVXu޶Щ8%HmX) ̸{5GjrT@GDm y| "" &BX^6ʤ2:̴<"6.+8%H)W^c7 %^B7k[Hn[Dy:F\}.SC\62&նsMM6$z9kdjha#A7k ?{^ N>)K ;ha#nX~}2; Icd6nHg*ø?A| b[7`y w7D(oH\*f@ޗg.{Q0&;p%Hff@j(tQi7۩xH?)!٘A)zaƌLnV+~]<z3*#<&y#WEC!$,Dw[/DJ3fzUC (#7 ` a؄ݬJ@ł*`"\u%$NĤ&י<<'XDCA@7ЅA@"[Axd {?GyyBfh12_fM*dqZ_U5Qkܾy9EF12ol\]h_: UCqDRLugŢE4¥#nք]H>M!ik$wfӲE4txD7k%3`E/ذs\rp 3^d"t&s~]tux> A剻;b@..bLݭ䃐}~i&˘'O1E&J ǡzQߤ=D۫x2B{^7v1Ze3} xt]; *o{;fBz̦~dH5ݖpu4ngs}-^E++2fM>8Z n Ȍ=Ugh'kRf_kNMg'gTόێp@w-'ofսp!\95ΞAb@(* $9iwĭO]ƪyU"қ5y UQ"#9S 2Fd$v\ljrst6V,?[t: ^>QSGصz"Ίy2Bdfud54Cb.NorfƩ˗~M笚{V;5ό 9zQ^f.V/p*]MO~+ cXDڢաWwgw5^mx)}}g@B)z)ꓑթƛTQ]uC6pcvgj.$S9/)o&BK7:i.y1ɯ QYBˣ~P]t2cnu)kdrǕTo1;Os~p4Gq/[])KAtH"h'hNOw}StխƢYvdL6,oyd蚓9սy]oFXO&nuBt$*_F%q;Ci^]t2ndyJS`8qWhO<%z袓9Gw7x^r[.8%r#1]b]t2 xg}V0bSm?Gt@1ݭ*%JQ[^NL+P!%袓9nGw + iU.xT褰I2fֆ2*Nл:KvJJ$RHk袓9ʡUN6>1?,N#,jt@ל VQq ){veD%;k 5'F}pNtϘ&ι"3{X>c҄z@s]s2PtP¥ ĥ$=JNUI: a\'-/f;rm/[sޑպ\sWO?RNl(\XH!$|}ȵjj+5/q 3f5'sAw5މZS})NZd?߶/ZSt=.j]Yktעh}?@t6} /AS"B{t n5Yv1\aƉo'nWsGkv:Y4Vf NnWc5տ_Wf?؍7]M=rKw:WzM)nW3X=]|gt.D1ݭFj/zܲhљbuGW wcjYUѣm@%=/<)0l#d0QrAUSOey}FMAwiZFXtx͵T kMnW4hhZtҾ0]j2H-t4[xኵQCI²4KM.bkx%ȇ.\< 7;՗+MnWS|꣇ʒ<Sk M Ƈ?gdʵ'BnObwEgz7`銮3D}M8Nݴ/]^>td-3 ToR}w^ظWR* [nu"},Z?"GmQ)u/1>Z$:2}j㋇йi""A:UBج46pt9/Lѹ/22P7of +2ސ9< r7W4wd3dv0+Va]sϕsc7Mk6|t UB m7Ò`ATry{(ZsEn\AGW 2ݮI Oå昧JTOj_ǩ.r8;.1}:OIGKt *It8(.\OY'dv*t }`ZZ2:Tm$ L)nWvkdP]4A  ]_2vSt @fm{ lc%nWCvLV %LEwϢk&\5.^tz7]CĪ:N$%nWG7΋ͻ1SܽH tq }CƔ 3kHTenWod=~)t }dSXR% vZХ%I\/%G6C9&t$Aթ,d_=]B-$/ruUmG,i3%{_?_vr&nWϸAGw"Ѕ%FgKHᴽ+ո]X2vlR7[_u%Cg@GϡJ&AgEjn~6J&ٍWʢ+H-Fr Q'+Huy;J&yݯ^BWkJ-t˫.*Dtz?AהLbZniK@_ZtM 0t)ԮG_KJnWBWB@O EtI$vIt)iƒx ݯBB-}.(dtzrB_EL2ݯ^]A 9dd~t)<9$]Ut)&nt9$M՝ RuG_x[ 
9`DnIpƯIi,d)=܆@"t5$գF_~DL2ݯG(mJyu d~~tҖ9kI&yݯ. Q AL t)$sћexx!J]J2ɱ~Ut2 F_ RDWL ݯBNKA=BIn6|"V7 /@ݰ޴E׏(ˉkAdU~Btَ]H2I tzt-DYG_ 2>CבLRݯDӗu d~hDG@_ "H& ݯ. QW[U$tz1DQʠ/ o*GFݰmD(f]?2&tz]@hoy!~d0tz~d~ ]@ ODݰ. o฻~7򘻋|=[{=9RVs_3u9<WQa ,ӗ. 黻NYy}EN_T=~'G@G򹪂S ݊nXC mӈ~kJ'gNMQCU)f֣15uK~<ʶ^{tz4]@2@G;fBcOVjn/ htzuk^|}vPʫ G Utɷ벹X9xog{bgnڎnXNCW|1޳l3knH(L—1s;mFȟ]_Ս PtztXtOt3XOMZ|nl(G7Wy$X*v8Sb( t;Im-K']Q}nXGW|`ÕwLW2 )ţ|fGWԍQt hhݦhM1{GXQC7gKH0ưd5Q7Z@S 5)t cv8]72A ntzt !)7`ćcTݐQ=X>D Ѯ_g@xB5 A5!) ʆDtzV]B$P] A1zY.F> j =l(A7wkHl{f띄YgUtz7]CҦpV@X^MW5tLwF6;QB Q4tPvz5$= PF '#eo ٧˧!C⠢ڋXz."2˜4V:*< +btIw=k}d;  +"Zh9|jɋՌ`2dhv~h^8X3r5EWC,khA#膕p.!hupJQ1x\+ItIˠ6i z->ɑ 莕dusVpyrc%"_C{*2\/:); 莕08pX`(;V2b'5v&acGf\UvWC{Y: E莕+rki~ 篶(nX0.V;AwQ-Dw^2b-auhX@[XgHj=U_v+ntIѨ*ip&>Uz Ź薕B7QNk{H枞>{*-+&G;/SDw:NsO=hIT£.ce\ s6ձ&>,~PP~ cxLvĴ¹$ByM %1w(u$o=ߏ;H $sPx\nYu$)?BgqLC:"g[VFQt 3WP{蛿dO*#B)EnY!o䦔 WPn|-8\S"4B# AdtBW%_)7{[f"_EU5&Aʺu\ρJa{H6W1o:Sө7!`-5&yݑRNܬ/A{hQ]d9#3 }CוB52 Pz!#f_ql܉Gv.L ݑbfgd4)^AHwݯ*Ϡ;R΁y "÷i0 2무9_\a RXl}}~C|%|&9 fIX` KqPtgF[RܴJ9_ܚ_-0L\k35-)∥!$0Z S(AVNtG|<;3YYc 1=]3t_nBwG|}wG gPrw#:/';RPɈos$2G^^EG@IUG4C݃V)58Gߠc}ct p- p^`A;?Gd[ԇ"f/IgޚRL݇Б|-}Чj?c}r3n1Jב 9u9>ͼ'XՑfy nQ)R;HEuҫ9pPib_Z%~In^e^v=K}*ۑCN݈ /f(N%Zm_d9Yޛt!pg*o\_j_> O0n,2"-OQ]ao۹n'?nYfPʻן^;)Y_8C귄ɵ0twCՊRZUE7u>%@'+ҕ$)obHo: Q> 'ҧhUQW fWke6L} U.WL9H jq9)H1˞{eγ:ơwJLWVy?ӑK+'Ip#:8 1Kzv!NBo;'1m\%ZH ʻڧ6]a[^$BGQ#Ox/ SK;{*V;*!Fׄ;v?Pn_ZO3K/G=Zȼ{SsN #or YӪ/Hk.b(T^O9kCa %L$6KYޕXzlੑN tn^,{N2^ky+oFI|)hvec>l5˥ʰ(81MuXhWg6a|',|.LpL(6B#L GWz49)E/BGwWYgRNynvVT`S_+%˭.cPg:D})hT+'GeU o~R~wxszy2aY_g 從DI~8ףģonSiX]5 kϪiG|Gdq1rҏ80:XʭQ8ޭ-o]%ɮ:J8q`{{dl'GQXj=(sOޅ׶vFk:[p^s7pAGI~Yug$Z&>q?95epl#=U|0;h|gDУ"7l:E;GئIC)GV wS} )0ۇv͛ÎZd[^,]#[)AZ tFE7]OxSMWn ߜk[/FGI~H"gϟ;0[|  nS{Qؚ>"ܕ =Y ;7?#;e%/M3C+!z1M6[Vէ>֤5קG֡֍@1Mr<-,z̍|[?<;AEg.!|m~F_cl$iB5JZ/Mo+\!Kr5oS(wTM6=.~}K9&ϣ-Re>mpa; EGIst"o4]GkTcֻҶ٪#o6\AWXTxBEԱGSgCicǿK (Ьת4>W{@Δܲv3ЖOڮGt7/ Y%Z; $>7ʭv;)$ವht7]̷{*P=^A6bIm88-My YfЅ8X J2iغmpVW Oٚ%NO7!ag9:^սHK;|}T-_qJ@GI3LELrU?nXvt,Nh|,:JpEB+].hm5sӅK^+Z (lF)0t;.= OZ.!L[B}SslH 9Cv%'$zd*lVPwyOZڿwEXQ'KȖ& 
QԍPe ! ŗzGF)/Eس"\_d5KFٮ7W_W).G'+h]*ZCd;rf͍|{)fu3BunѠg7ivj~`0_Mi]&Rqt匱}}wR-f+T_GFguƍCv߿Az?e&%üR7zdqqj#|ٻ󀛊7BIY.k"[ֲ Q%P%~WHvI5S$?kޙ9g̹;g{sP'WL<ڣ@I eZOCЊf?}T9US|Z$|UY.Xfޗžiۇ.RR1xܧ* e.-6zh\j`UNxp͔7_( D vRZVqߟ9mBV:6~k݋?, {=b;##&ӑs:LvsԹF&!_J~Z!]k|碯IXh$)v] rWμUe6~f .4!'x-unLyzvފr2P,HrSM,ɚSgZ|c`w>h(x@`^J=4Cj}4 m6M-zhbGbPm[ S߲3X&ߖBE=4D- 6]qo˄vN$ħξuۑÅF2aIJؖjBg? &tM34`(`[OBXFz(W\NjFPM,UwpOǧ=ޣ.W3e[9/UFYBlߖ|=ꡉ%e/TRڦOa *T[=)-Oyq8-OXN7/>6Jޙ#} /yA/RM,$NI-q?_AYf|,ci89]')^0>إ7.V\8G٫&`E5 ےB_ )!('+YE1WJ}f⸉e-0?إ7E=4w+UFǶiB -:BC乗Лm?~^i3ᲡH%3?/J4X$(c`j >q叧&g]` B5@2- ϿC {w)BK%q*.P`9`Xo7AxASv5-78hsxyNwWy A=uNnͱvΠ>G1K`CKufOSt) mwx`РggſA, G^m7n7$7m:F c^&[^Xzhb |SJb|lzvt+zJ>ps玄*g^S t,jqXѾC8?-zI1 ?WFꑉ+8' YMC%@/2_9{[ ^ێ7>1(2,h`UH%ꑉ 3\ykY~zIt0 PGa~gr??",ᦴ԰2zdbK wux@c7db:v{jXޏ pwPGX?gOb{ trB-5xzGVӨJG2j3&r X攃70I=.qf;::TUU%@O27` {-%x?`n/E{j܉谟 \6P[E񰁮e^}K /%/Ȍ|r\E1@_2ER Pr2.BySL6m/`af0UCy9|B/: fl^16HUހ]NN)z\xq/J}^5?\h@o2ϵʃ"""^*-p8,Xg;`pzTŠ?p?::0EE\ XJq ut`7NKns¡¾ɳ9#;&ok`NJVN >dux`Gb a8ıQ-ԃ6{Zyu7n`o`jז' NkRG0Ռk  S8k@\o 杋с!Qvdojl{`s| ut`~ǥ}${]OzW)f3:.wyMnl(]F:G_ۅz D:#C(feݕD.uXR᡿nmtXCX=CP1pEqG tl"~Qq-]74b&2A}ig^QՅ+좺{0++݁R%7A., 榺OkOB\Ԥx M̍/lvm@6SԤNcǒ=iHbi+Ux3GCy𶗥Vλs YN+/E7'\uZ4u]5|P~/!)j -2YLYw#GSn)Y܍6 6Do2It 3+œ ;j.7]HDo2oJsHY:H0WL|l8@o6֧upOH^եGOn|X0e띿s l]>:n=Ue"_MC'WXGq!%wg=VZ|VQG QFEwr1^ i/]VIlᄉ_Ջ::߈[>]c νH=l :~?{?/i:-Dtl%j/R[B)Vϰٶ"9 hUzi/*6VogJ5^ζVrS8*'j fLˤ`Cm` 0K- sLԘAv'2 6dwJHW\K:|U5F K=$lUm{1X~ԩջ謹Wʧ'yz,ln[,Gfdfu1 7hhb+Ȼο -K\ȝa<+_|`D ;h,vML?r$+g@q>R^<`V|@bj鎍tM WzBcz2utcsH͂{r^*`K,歨kEQ.4fr՘{)>*T'maus$u_`;½fXEvB~yȎy1,7|u, ‚eD!l qUWW4N2}^au'ut+2*kJ¹Z̃Yo'~$N{e>`s**՛SWQ87]LtwJ>j7-:0ok nVO m8`Ct |gZ9ӫл֣j?+SC7ˆ_a8tB[GgTǬ#X]Lf(-I%Qz[DoQtbb}]K͘ی_*JcȻ\іJJ I<tіxSk̗JC7lo'P!j[SګkZ" nXugox#Y눞,Kqt^Pw=փ~珈X::[e,\-:*y+l蒛߉l Ӫs)9}Uxٿ= NNjFgX<?6uNJ*XbGPT^NjArɻ#-=)h"Kԫ-~_FLTdi_TmEd/#Tyh6f\L0 ѳ9xwc-Y#ӬuޢNNY=V q(ݵiBep鍼>'R鄃PT{~󦩲nm~Ȼj%C褣jID|X::Xf0o(\H*:.ܙ<- *ܐ:`}TlIu[~m0XuGbi&38F Qg8wXoLRUOD#=Q'RIQ2+ͻ^>cMm˄! 
Tf]@G2@eL_#G%KRq@S ܣN mgw* MP Tuœ\$mA?hp x:Fd utV~ToX^u}\lmI^ƺ*ט}I І::l6l׷^2m Q J׌4FdA_Уw2s-}<2\[Q4ua?]To:Cȳ?VD$^6~mi%W -2Q:8{(aɩg\~V:JSGgV SC""J-zҼ$M F^}ΛyXy?B.OAK^=&Z !lMYK6QPGgߌUϦ{0m>Y,{:k$Xg  gp2Y$&Fl# 8I[l /7D[gb6^-=O3";ICŝj-e11F`!jWD#üЊ::oD /+ψݗvի0t0+1l].3/-]knb:V.~K.Bϲ252 W3LGKzS@m@-%61b`3:׭i\%u fZd[bҪ#yp$m 2h4eL'RgXzH/̅w nb:o,_wʸ:gUZFKj NܽL 7eb:u5V+{ƕ7wL9N<ɰ}7`ɮ\-FO'̟A 'K[®f8dU7xHN̚>7]/cзܗy5G6k>q]U-FcG<}ݛfIԻA ; nŔŧ?Jv32n`ױ Xڂ:k]cEX:8~q%F lTg:n?NBwᠹBxaE־ ehK܇!,fzk B#oeg׏64y > $=YN JKx߰j[(b#G)󜿤/R%E٠|O5yE0#"&3O[[ȁuz,_qwOpցڰA o1/rhKMXK1}O|B;`"zFFL8R /ZZcanJ'F^CBD=}OL'M62z`<WPwт!D+xs-qWV_ :l)6o!'GQ*}k-22|`W5V K_5XF48B6<o?з[zmh^S' ۄbٰ~aP<눚 U[Jbd6h^݃#wB4pzMF$ Ļ)KXKpK8 zxč>/ҁqu t/#N𰓊 7ehKMXgCnã((A/eE[ab:=t/[cԝs@/Jۄ$# )uC/Ux)ގ;Y,?~7w@G^1 +=2}d4/gYab:Ω 'O1_lXȈ1-pK &vtk/qmx,tpzpL VDVhPn`\:+ jNOfk}S */sUE[1bϪR=ʘ_QO̸:/,~["^&˄:~vh,8NY~*Z UŊwZ+^x+!<-X ҿUcx}Y3.EUpCIJ ,api:ঞ |1ɗz%]Y1\L[^L /y[n9!Col8x lڸ4!}W"6NV\\8l,-(|zNc4'ig)=f=rQ%s%6&9,Y/5J[{4ot״1^WKh:5T\RXqH|J\ppKYM *hXN!9.35T;nn`Zwl6 c=]PN~}̣e!{긠*J/HM;d䯩pK]Y6PwS vб vp=qɖEwta<Ζ?DQtԽT 5lk:'ˉ. 
cf/"T^؏$xS j|c]g\gҡ&tVO(dPgū" Lg c[Q}WۦbSX+w٢TK5Uz%qUJh wD*Wo9*zl@chvz-)׌+5Tz%aW))]KYCǜ{Z%"jj,@7NdAh&ƅҩJ/.]X63~ 4Fo!k,_Sx9:֩1.UueǏ>t۱@C~V?ȟK~b\fˈ{4*Wʧm|M0RR`|7O;y;ߣZW>22OpSKnojx <_{PV댟\N)UT hP3{śl:CKP?k̸gNdOyM1RS`}?GyȶFѥb\䜢k[R@AL &j KWVB53K!w)R5+o䌢Fqۧv)BH%3t2H᭝:.R#e֙tGf+r >B~0fndx9ոLU+Sqq T3tDY}Vb2) 7(8O&Π@os )=Mb{=7eU\qF R@ɜD)s!pdp#9u̚U?tbߊl#FFJ S\CV0vOr|'h|򊟄 Y-@ uەS-椓EXOMo^T|uzM 3M=f>ԗ;y;e\ܬh<"g,_o-#yA}ٖ$Ɵ//1-U9c]X ܼ)Fƶ`} 3&y;BR:a.HЁ_$/zd>=<=p.ni-9Pل[m>\=[(y6ȤM/7X3''|jpڒ>CTJWbՒB1FJX]'9`5WX0?tGFJ]N0SV&`_}EMGzF[nvi$/Uu*3^h:(0J+6=uw(U2FU'B:E]m<N~VZRw?,fQ։ /ԧ lʂ //fW~!%SW&`HmΘoe`V* {DHys_[!5ԕ ҁ3ś:R fN ]Xp;'tCp+0yoԣ`uu_ȽOpwc4{U]' ,+% XUu庂/u1ZA5 DDuRP@go*_诠PeNjy^qPF{GwBF@w|}R^_*N+uhm第VY['WT݅?كx[9X]^u}Oo憾'mF2Z{Sq+ 53ȏ7%_R{3ˌT|YAn#Twr)VkQ[_=|Y9έ:/X z}9tc ,Xg4G6 2EZs?7jƚ++!!{:h(]Rw6*}7q^)ߗ!sCpRlMJrrT5,ױZJGNzuQ[e_@P:fXDg1]ia"(ej]dŵuтG9=]&𵳘қRtNV2ry'[a_R ^ C̏=?e,By[֙ɧ|5{ .z=:p-ImO6 9bI6YLtwȽzmߓP^ + u|@@nҩuՇv/Z,xWWUe2izdx?J^_K 'UWWA>x[u (&T0t{[:֑?Uk*Nx[ms6Te҈W-G M)ڄtx3 ku'M.!_`f WVNRK#>ʁ0qZEy}O Oۭ"c!kd>:;&їw;\AyM5*Rq^0CP^C_l5gGBbԽ4g Xe,u:X- VHpl_ISbBDe5Voݼ^?/E׎hQQ%6ڵLPbQ"}ZV[@3UT5bpyL"UWBv=Ϛ o':,+u**K0L:^9COjLkʯư7T8›LQYu?yKDZBN!=MrQ ]x:Yꗝ޼i_ً9LvH)`VW*Ο$<hԝS^Ozfq(ev/rU*q:Ug4۩JԽE?Rԫ?m@ z.vMms~(xUr[$GEF t?Wg6XU:U驸ۣ =I9^F41&ʝxml->=vCu5=vgxla׎z2鴒:U骹QwCR"X;Ɨem| NOg,mEgu׼ST~>]gGJ .?M.&@V{:|GlqڪvceSۇy{ԟn#fϤ]Ns~}i_u i:ԝoB359~JifE.+*\vD3:YKtXyDe_-]sk42_܏ʟG=~:}TUVk=M=xod,C:+SMԔǚ%׋׷żN?o`\GFpۘje}Hrn"RC>zw8Ko u=}tQ6p\騊غG`̈́MRzYޤck?cHl &ABFL?P xשuefPϱ&JǔT ڸxuk?s~˄Po5uɭc&HK&}yY혆8ml|33UH<gL`'X#M̫lkyv ͼc6G| ^_ `=<ЩXYm ư=' 'B! +)amCFgFFu/KgJ3GuΤ vF&uV/ݥq $5ޑR{OO]q V=tT}b@JQKk.|sf+o`>qAV#=Cc/\(3~!p.ŧztahq%OC啦^߈HX;{snc)5"ШYB /OQ4MmH`-+r= KYXBo7+7PgB^n܃2h;f/['{#hAEwzf1'FZI6Pkj Wo2,W1{Qڬˬ\!LܥwxB Kk<2Yj?QXʌV^nïP\agu `ғ&bebp3F' vHY[ '{d6f}H~u8>oYfXR 8TBqUk78lӛC2EQ$]4/T=;nk#@fXjv]8s͊A1kbr2IKCtEѺ 9HuJv9`C_Eˎk2t^~a>iIKi >6 mY3X'(j{]"/R\|V:i)kv" ?-mJ& B:ࣟm2IKo1?Q;7B#esMI+)O/~GR0Gf4tow6]T։Osf_u+tW;w;u_yd'-Mg4VOk8nY1/5? ɼ.|=غ'4y5RMqGpʧL=w={TXl¶? 
MXϏ;݆Rz!wiZʫݕ4;ō*`:|AUQgOPm ]B~φ9,*QgO\ ڳ3['OMmrY[CG2F2 '>]Ws!#oղ# '03Y[nIHdŏ_'>hUղZ;Ș|zYcCM=GPKLڻfdz\+|~9g}.u3O|'Rrk;UD{YU(TCu=,_18ke1pp+Iw/WEɛ^vXSȭyoh\ ac#mLtF+Q{!SGOiۜ6DGJ;fBa9uy0q~iq ~UVPtQaT`ZR@g4M= ]kX1ߘnmzɋ f'mBD?gNhG):ZNnh0pR W9oAySz|q%3^r,Olˮ]I}+ّ&wnt"RGps~d.#s%' \pn<sǗ=Y _8?qe]z(~opZ-70GUb[u4M=.~\CNK=5TPm 7~gTzZ2/vl"mU7zTe>@Pt m3o4EO $ܫ2 PAy6Z9nҏpG qu53G-Ξ3wWͧ7i KPży㡺0oxX.7w0qtu>s&xc*'"3GUڊ/,.u}vu_u/* RrI [48a'S 0oE2Wǡr=sEU@4&PwL#E Զ7?]C;? }K&xL5c 8[O@ZR s@hN'x&Տ< &u)Q™#O 6R _pp:ѐ܂GyHK^&u! ƉUsI+xG)<`3R?Ϙ%MS"M#P:Z@h։.SCNnxW C.GGqxY@1SGGGu@ߡihUgǏ2ߊϢ""ⲳWR!Q7ލ^%pG qa/|D)HN n#WEON9? wջ=$8# Iÿ-'~ŷt1}8aoF ,Yv77"|' n[*}:Nƙt†zsiƫɩ80t]\{W 3[KT+qߪO<㹥@"A߯Pwxg@e"S =~j  (za1ώ'rYBz ayx =SEqݶ5 *佃/ߒzwMK͡܋/F>VK]{n+<9ض &8[z^*WW:slzOzJmԙ3Epx2|c8)+O=aIJXCYtg|֓:mN8RAըNuOA={ lT`5m ;INcf+8 h䝿jv $Gq7GW<$Խ7;}TiCcP"r|s3u@ MKUPk )4_nz 9{<155u@|*h=|ټ3ŖWw t5WYr u_L䐙;.{;u'ƚs~TGSG?z-uU+š8*4oYvCϧOzB6c2&~ށ$mzV Mbw_BV{g}[Ggxim)O{|`3CiNН:J*5ig\ 0Ut_:`ZNv7M^-޾;w/>q99>%ȐWueΜF{k!AmPY%!uHU~,]a١ ,v>_%zc9H3A3gD.e~^֝P}Z8jQmvz^J/RTګI:Jzӏ?,狢[7s*e s F5t+UqWg+owz=^ _Mu"$e٨Z:RIJ3V־4 תW5} +AxC>q˝JaSz +_.g.܅:m_3NՓ Uj!߸0u<8f(@f~%''"{?"v&{D!}S.3f41/0<;vـt},VJsm $GuK{}3_>S6uUV`'aJ&pef(`Dnymaؤ5uJ6\UAHo 7(߳uݟBWaQ7<6vԷF 䦗<9++u& ?졛z/5Gtm@87sŠ0(ڨ% &@r,Exa3SH,ԩŒ $3uL  $˯AʾsCsU1dtccK/*SjyI3qo5H j hz(KI&Aoe|A<&K7h'ӈKSD|~W #2VWYjMRV WzGhvP}/!}^|y~jgϞo EO!Iu5APWT}%J^k\]٢'w\$u[#naݴv:ƲQ3܍ d܆z+ym@/JO~4zKH>K:6)<璽ح$J2j S(%&g#X B%$S>԰M*Iv"0$45x]da%ad3AvEu`&nputJrN(ﴊtca$ @}nɟS۫l@]Wa\ISr2w&4OtOZ7>9;3(OQ[X7bNjV)HyJna헏C^N8>5qdx>}aEr $NQa=  0O7q&g $X7y;)d%qY $Zo8S1|"G{9eȡ'7͙Jy^Y;"&(@.@) 4 ,@kJT"RQH ~#@%R@LSȿSIQH\u xNH/Խso}b7f.kxo\:j1L75䐏%k'@;m3- 9?6S /E$H̒@ޒL:,L1Ja9Lމ Д}<-HߍPKⒽ@ZfSR˹v3~O B,,俁W J^''dKԽrc% D7m镼b>Qz R?; UQwK\:'@:Fa࿔ Fa,HzK],L=RH]hA[o ԏ c]Za,~i5Fa0IHtR4TaZuk|ւ+ڞeM ܲj-[4 u3Y D[ ||h{K헻zKvS.5T,_IG[.7zXɽ&ޘ@xlv(:f+6M >c#@?rUݥnH.ov\R X?V3Mu,<3G:ڏy9l@ Gu!n#ޘ@""6+y6_T%??G2kGlABHDup^89;]b,@TAM+HDļMd*@ēXBGPQi%-l"r  qY}_%RA]˽w`qX4m9I)56},w&J͐$6u%1Ȟ$\Ԯ KP3~-7gQRtqx:I3X4sxcA4\+pM2b{*|l r<_ﮬ#M 
//3Bg`,ד^QgoO9;1oz1@K+i;ޝ@"JXc5mr3w8B_a7< 돇'{$[x3g?@Z놗'w󎳪8X0:JQbA1 *6PDA,IDTI`,]Qc%Qc P{g4*FbA#FH4D ޻sgڙμs G^z.~eD! NJX9#}5yC…`r5Ƿy|LEWGΦR|Uy[8s6A Ha[SA^ޠ Ϋ&o"9&j- vt%0$Mn"l)]4y4[ HOn1ȩ)6ME- nӫaۭ5s>kQ#g\@NO>j.:'nbyBbYp_$o'@A]Z~ V3jA #4.tOJN7#WE] @@ggq[MoIopgNxcņ!l"㰞ےٝDƩl49O. Kn%V],Y>ζrޒZ:(m`wqhפ-%û~F`wBR8LiUb;?iaGԝ:/ 8r~u.̽%'X.roTB KY66NYhXUNw1c|}G(t5 Bk{(p_4X ׋I _6(bY"4p7`~;s]@7K2Nb&fgH i抰2\Nt2z:o_瑩d 8"߀{*m̨ұ"81rZD! {wD|P= )nj. 7t#*KoGzu '-f7M- $";5廵+,M]L|ri->Z{E! {of)-@*w  ^.lj~ys)UAr;-Tn ;hh;SjiE4N(Vْ}|RXM;)E6kYA4Cqk7uC@`$T_J@|n?Nñ{?$_#J]I KTfM$>ܢ2|R -'ؼD!hX>[R m+)\NVMs a]$I5I|y+c ojZ@z66|ेhH1=,&EI"{uP CW_5!,x2LaHeIВ>*N aB5̩\Hh%2^s77d,ڼ Sn\; G0{9 . ˒eW`2*+| "6K h@ ˳|w yU-6= \J>{Nn]E@fRj@=X~6+| ?[l-ti 2DoRϯ`ǽy%tᇺ-A7G8ޱ\HW $"RSEW>`Sɫ5$&ӟf:{@|^g Gۏh3aV@W $3 pRhd&hʇyof y\dij8 D# hߟypXllyGVWA>y^iu=mU@@)̙YfD# 赙yp C)3.anކ #sje6}8)RM}@hhR7t.CcM-<3[|C<шa&s|NRIZ*@@^e DÏq]4Ym0X7^*EC>\!J[T2E.QP P[yPKQtⲵW<KdY7Ъϲʒ7j>\!Auުx؇Z# KR4"ҁFa !EikxE~n * aeE.`F#THmNO5`}3"Nd2bW7"ӏR`VVA^\!bV( ݺE0abmQ$sX)lT$Y-ck v /M"ʇ1A Ⱥd>΍sv G;1+澋尶1uuH@ PV >cfUjg <r[o*Nq$fU4b+JXEXoFc]13?.j8QÐɋ]}!@O34}0p][xL5#uҏrXS 1SqQef4%^gR7*EE P@zُC۳7u^!*!7?9(9KLA"}UkLk♶dWv,'uV>#I/U_p'MvR/OǛ,d3۾lO&*AMI ( 64U$oDBN3Li Գqz˲+GᓉJ@PH>*C#8J⃗ >`Fm;_1Ծ{E"`WS k韜n 9]Y BTҎxǓ B>{GDxr|BhGͬ{W""!YCN~:xd 2fh+ u'vt}O2?l{WMDBb27:ȏrGD'2]OPYT) + Vϑ$ G|>5>K,{WMDBf׏rPJLV"rɆh4=B[]'Q@ H8lHޣ-WMDB27*]u@H$`Q&>yBT zڒf"yCہV(w]H@HCƹ? O@H5׬r"[YIhTTGO@#= Ƙ&oh h4|kjMс%ZC"$)_ӃAJeZ@W/u4kUo"&0LGg oӽ". ǑU/-Ǜd;nN'*q5 [m?HNd o"bOXG\VxE/lQ,Z@@En61I@PBuli2O@iD$ $= kА/o-^34e,9+L#|xAwpƠl{7qDG)Sa~ U='X,TW# iFlSªãȖŶLq?zЕ=XOAH|* nƃay M}+dck>LQ ㎄5j+/W3N$$ }@%K7[IH@H.|,7]@@y J@~@<=bF$ _L7z$oD:jFOtVL1DBr;e,wG4n&Q )QL 뚥~3w|&_F4pH@ȹM~ 2a' [:ʲ.64ϖoe vSϯpdK0áѝǫKDBޚEcYG)tA_-Ffw`|gȺ_OdὀwWxuQzWM?C@Ho+$ du&h3-86pF]ea$&`wThL<,RFͿ;ƚ#[,DBNcσ~ي7pPD0Bmac]Klx@PmNl9Ⴡ xGη D!A!m;wW YPZHOV[{ųifNKu"ԏ(+gdL )$&|0U8%^ cCف(YM<B ~nkT <+=2>;C،†LfrmY11ihkL$gD<BbN&>k *3ibfă4`4b#zP_ϯ@dDAlZI@H? 12Ya<vX4ƲHO[sU38ڰ D@ȁچ{Q/6jLBg`A9h3QڞE#b$ )6 # $,cc>C3Eiae$ $p΃&! 
eYIVūJX A+ %s)iEh-&&A0+켽He J$yL{IpI) D# tGw"b#>v Ya<7"ķb^F# ˈ_ D+^ 'DBëN:qF@PYF>܆fC:JQCbʈH@PhH.z*r';+AtE1?p4.ʲ2 pP:9eD$ (%T2yMNLOJe A@H-lM-tAjIz!7 wqaR"?9k]\|5ir%t,kmb)Zj F?tg aB3%I6H-HBbxќ_BU%d 9;`Tf1gMiܙA Iqh@:û[mty26"!1'9'g6y@P֌6PO(0.<ԘX<(M˗-Ve9B6$ifmt5؎˱$9@q5E_ {҃ռf;trI5@?*^ ]@H%Ku} bjN)X}u?m/ 0+{1m{𱚦) #:z{h󺍕l@B[àz $+>G" ǡ߂zMнri 'UwyW;֑]$m>sMғ.&v"\<>|AAX١3{y&aW (^R,<1y H[pҋKjH0|bPHB}ip(3zpہkxYR0+,Z,%oYh*2nNtaOI1 rY]fH3kF,TNIa li`/V~&gHޜ8R/|D3Ӆԧ؁tԿvwEbU Utd5$hŕHr%䌘be5뒳gӅF$o|id@ , T${oh}R?FH6)V9 \tu;[$EE *W83>wd5M %4K)}gVHAtcRtL|%0zfhBr(85uf]ZJs{;YMS܁ps`3eW#49SW@'?+"-% -KU]PSp.Ah?qQ/lk'kp,ifK`sI-LcgIW^F?Y+1̀gsv(k|[싦- ·jW^Vn%vlEOPxVM45r\\0xpPU5;Kt"Lev뢘;д0Ì;YCaJLv6NPm =gK[£SV֤  S=t,:-+2Ï6=?eAQ"D3^Ћcdwrd+")ABbݎUUB5:)ژw՝-pObx':YN=ZK(Z@F GXВT,eW8Q <)yUKYE1 ) Ҏ:sQ@v"^h\X5C#؁tb+j.}BO f;ܭ4#,%d T$:%rM)Z3lC#XR\t(dwKvYDA۹pR=D+P# r*h&ZAPy6N &,D-u褔p1jZW~V@W%:TOJ~{>O@Hpfqp灌 ֕PMs磔OhQR$oS;¢dMd35l5O@#wO7 }%3A| w^ uĉhx TĎ^zh`t89KX]"7!GOI6E4=xaPr<{—X{*\ H ZY&;&{m~'g VL7l;0prjO<.Xh+0޶bwBwkhȫз尔:RhĐuՎ3%G_؆Cl-o~;S$E!jK Ȗkh<犊K?.q5$פI pVWJQ3)RE< pr % gqSBIW)c. !nP?D`\n`j8<&u(*,J  HVf"5>#OZȃx Z_ב"Bcjڧ ObO &bok$ w75pl};7# kJLXQ"H`)ДpJy{>v# a+r]w?h#] tϰnЖ%A_a| a}NVv^'PXdi=Z31hַx%hP_Q5zё54#a$vQn{r+ &Z䥔EG@HqOͬe Nes jS8T]_? W݁hTّwZ} :HrAl| _8歈&rq")%?^bz~! bKfkԭ#+  tq.£89.Dzb+ek9TWD$JS>Xts]kZ6yinA!KF* #u¢euo3b$WmKD']/y|J2E&rgަjc+}Ukh? 
ܻeo ƕ|iKۍ9D+Xbx}hRn1jr+Z8EzyJL4:vY[bSKE=)\ED{k?b[G\L|ʐR2 %H|6H/z^|,;J_UDpjZ̴s5M@@QB *zK_z'Fj/ dUWfkd\X7YlZJ.6]L7♀>_y  EDK=y d>PNRdıZÐ"Oy }ޢWx v/y68ToKSv&K6uA:Fi!uczwF|fA.я\=V)ң]CS?*ui qu@I\$^\ڐsϛ[@P%){ Oi溒~_b6ij>DS?]SN6&;%,rkcЌo-qOT*n5uйQ+yG:}ER&j~v z{m2z=xHBަbqa h]% l }^PXa/K%ZyHMTX#,}[46 I71}|&jz _:g ]U|_2tbqޖOLG, ;Id4͆hPЧu1+/2\TD}GP]}WaKȬf UXA[+ |b/Щֶm5ѫs}:y&:&h19(O?:%Cg;t6 H3Wi><*Fgy=l߀?)7WNb꒻#,wvcDL &sz{up@Pj{|hFEa|F:jU^N*r |p /VV!j^ 6z3`qmx 0Rkci!B؁ِ)ZE.15 יF`!V5?2F[r؊l-U31-|ƧYԞk<|2!6?6ԐY{d42%Xte@3dȯoiR/ϐN@vQłUv֔b-?zá8 ȌMuCwLwi ˉЋL<4lBE`所 {k[$Z}yM>$#iAޛD+{ڜXZwaS[%-I*dOA(t+@%.x^Jyc[[^zN/\AߍYZv3F|x n̶HH)(۫}b.`X!x+;۫p \_*Y4fjd6R1rXpU."CA}P-7AWB/AuxԛC1$cuv~yɽk_ މ[@~7BH^#fڬp):s% wd?cQF.c҅0YB~D}G2o ˬՙ 㝬Je`q @oq\Vx25y#͘ӵM[B;gJ!.- E@~H5K <,0Ӷߙ?oTbtE XT5ηWa\Xϯ/ZՁVoEAGUZ{y5 Kli^9c`QCJ#ycYkLr @k+jל3oiQ]B5D* HȖȃ-tpsSX+KK]c-E g1z\u~ޮ'uꀢ*x\~]g( #H]]o2Qa) s<&7qnw2bd 4#eW4D@ޜkcaamBLҏE@l ~maQJq- !8\cƞ@}eu;@?30! 9 36 j2pϻa9D_~{jwbݾ1=uHsАd3 ۫p\@@ Pjdt*c^ HkԸùaB58Tlo 2yV'&e ]@e3|%bN(bZdw!7 \@L0ZlTE|'l[Y>l 83b~ev?:J3!G~je} ! ulu|-B d}MU ׀顽0,ovL.qۍtAչ#ũ+ G>2~y;9Ϡ0&#y^f5RgeZ U@*/; KׄjuሼNb]y_:t<2Ly]yՒ C*P;fdx@ rJ"yXjjs¿}Nq9ӟz;XQZu) Cgiy|Ԓ,2Y€z%1g{Y'6Ġ>7v 筥Qϩ,/- ~<º ;ǪYl͞@fQv 3SX1vSr85?$͋,&vuS!.w i 2$ Ne -p~^ \ٽ;w*7x GX=^)yc>С|f}3Wx'\ GX̹K!63Gdz^ %crOhsMohf S-Nv'`¸5M&iG9r%8.wOҹ}w|?7䄃7>i/k Rm-~ suZMo`Jzf/@B-or^8[gvFzȥ{寮k5˲ߵWuʮ1`WiN7 ]FK1rf[I,꣑Qz$3̚)-/]r-4!q8tU/;#/r>aI]6{/it[լf5MG3-;%qg|ۻY.G_s^f]v:?U8Øլf5qEM?ԇ:6ݺw>An_ 1 ^@IDATx G}ꞙ/-Kl/|ll|S!6lXe28 8 <}ݫ_{Z륜ب?vpH{HQIb  p\%b@X͌NfdtJzAE[բUV*ki$_VA#Nu4V,L<nR?C\C_<Ё'%òN==֪nYg|) ?d;`*S?RVi@@*á, Ze^Tzptюy )ǭ(_ܝ#[, :i7H8hHCA<uwcGH p,A@a@WA$蓎ʲWQ=^m^M߈v鞞sXOғ !r' Z+g~moHF\7EJJwE6z=Wla/+ Đ-yc(خ!#vhkwH/Qo{_} C6@HN]SR@ Arm>ړI Ұ7mU] q8_}|6si/iwcΔ P0$U鵱M:ll=]vٽZoϾ-C`zT4 @F~{îk=3)WJbi-B)\,`@@ }C=]F9g*kϖ{/ɇOǸ xQk}_ߓر_x sLǻ @TD7 0@ֺr[|r뤱:?]6픥j]X@0n2}w8zA@ Q,07^`\;_/|9s#P@0\_mY;Fr5 ? 
CTd/꟣Z%Xe/s@ OlL(a🌟QSM:9@ ""RQd-A꙯l7Ȍr>B:Y?٧GA#@RIoi_Or{+X@ B\9d -tC5'8*_>[D˒I G I^v|K&Vը><@8@ ,^i(mVKe"KV* Q=R ~#k5 \?d^*0+@ (;9D& XpSOӨ眦#d٢E>'S$01O%ucoMNA @qYtIcsޙo.f~p$YYy qϺo˭YYqfʄa " 5#2sﯶVQ~_*ţ`߲:okv/kof @ c?@r۬dd9Q @2 |Omn $)%_@9"Zzw(uQJ26bS8@`:`afDƆn%@C(s^`VX| C8\!U;WV}Ur6&<W t=@@=FgNuVfRƨBd7|~vS*DH@*"@֝7ҷrkӹ`2?>;\FO> s|U䅧4 X~&)8KVϨz}d[X۳@ |;%`mk_yޭ5;$f+#x 8@&34pQ2 Jӥ 8 hnᶧP$KB@" ("&I!@ߪ=z +{|(vI)lZU_@{ ?kI-UCeu[ 8~&)0ly`Urf9M&@-0*ݟʀ?ʦt 0]X1Xy -2Fİ @{eU/q+` $ 0>ܳP[2?,Sm:@ K_+G]{S=@@ 1Sd teΐAZY&K"P@h*szg/! " 0y{xTtl,Y'<@f$# _gDD(N =JUkZfRix @2ty_Y-:wܽWYCVemzpϩ7k z6EO_J0ݞ/ F!@ @.@  {&uoUtYl P>Tc ~0G(sV_S4.@/1|eGD { @Yahok}!ϒW5  @䖁9x'.e @8^(io[Mr65*AI @F0+?bYBr @m8Y;mj`eȪ C_;_s3>r7 @8Qz $^ Hr+jraQ@.W5ζP C@ Q ,{fzVkO%G0d@]f9o{;z  PgG_gRhߩhaK@X ʀoZ_}S#H VuKaTE  =V٫T-(@@`F}G3b#("L-@j"$V@@b#P9$BP_߰zs" N!@9A@I k@F_C04=ٶi-@ S>jk;G>\َ+ @8Nuov +@n p4ۛzRZ#,ڎ؂  0s]SclL-@`j"LXm)Zٻd]vl o,[]gq`@j8]VXJMR,<  R13۫jSY+8%vD HlSp |x"rnV\X&)YR8'{" 0sk%dL$|gD =gG }snQʿWɮN) G@:|ѹ?پ$K* wqaʇ@Z7wL2~e$L% @'C~yFٿK@ ^ÔY,{fk{.< ]/jg" @iF$y'_Ð*D]@k#P u;{NWy1IRÐ& U`@RwZ/w (-! F @nԳD ܣuIu,|Nt@(J/9˃Q?PK2t=;VKx4OC2C  @idzI_)珷W9 "@D+%VZ=$_$K!I@LI|wDzd"@Z1{ABC%1@@,CroXm>k_?)o@YA@ TBUdtmܷ;w@B _Ύu?~<@ !R@%jCz8ub? ؿ{>^;yd HfS gSXcWIF@y@@(` gov +@ VbU#n[߳hvF@ _|fGk~ğX@+%[ L98*ѠO J ~! ~1,xT@4D5ZkurϟL S aOUWr+V"@$D4S {ʹW+}l5VE@+#w S[;[DB@$L"0@Wv{7\vly,tw@@-SLH^Tswa 0yC2#&ټ  @0 WRwԗ4Q Z#M#V]?(/O@(zfGj^)HJ @$@IXsq֨r+dIx$ &Y+w Z]:4DD@D*l"]8޻NT@@JZ/z6P9@MUY<Β  9id\|:2s1`G) H[j32c#J*@$=h3}WXGeղpll 7@D*l&B@"BU}gl k>  0K!%ر^*!@lcN׏ɗE @He-U RV'0 SmD=n|sQh  &0o؏h/^[3(R >Yk.Kd3@@ H'Hor]=S\*"@"4io8ȗZ" ' g<|ju @HI2L-`uƁS MfX $V '%UvGm?<% PVE@ljYzc(bjkӱ TVA@?Y +?T7T^xZ;#4Nva-y nS2wXΝy'  KcsL8Nqy{WMͲ8/@@xҡعs} (T@r@ nP5ZI@@$Z=Y]Uŗz$G Q @ LK#xnZ9BkiB  eGj^A { :7$  A@(/Go㻟.CbAb$@ FIQ p˿Q22C  -5qXK!pSw<@#`x TF!q #ܻ.F9*]Gϑ@@ƇXcdې@@Z3 4kӱMJ+eWO%o! 
TN[u _zi=Rlpd-@ C*(07k `V84  zqYk'w>Y :3|x"8fͫf  T^y-uU>;p W} Xݾ<EȄ<@@"&%]/E}Y{;EdJFKQX}kֿ%y__@@vkkՌ|~OMp o2k͠  A@Zk*ClQf#@`6z z--R nZ  !vӎ6={h-OHV:EksW[RL/  @,~"<GUߨŲ 8oS w􇤄g5S*@@Hz?Gk.@ 5LhɾڐJUxX [`HaθDx!p=xs·/bMt̋K@@VkytזބO@Ck}jkec8@H w4 'ܔ34[)y֚Q[''@@J9U|Vw{6@ "\ydjxn~ٺ{  @z'GG?/;7  Nb17kLkg!)] (3  _t]>RʌbC"$@ BEVg.еqB_-nfNǖ  @|eQ~x?N2exkKվT' x  "MZ>UQXk| `uK?.3 H]@@xjyWU;&V( rmE+\VHLR@,5ܻ'%WB?H@@ SֿZ*(@#5}'Kwe͛e|g  VkJhcCϹJ%F@@i^ʾgǖƯMo"Pb%&7}#W]@@"- m5u:[.@W벓s WhO@@b,jC_.Ƹ\4N0NֵM5#E\bR4@@!wWß/k7y@)R?% +@@#G);}^=ǒSlJZi$?kM`MJۭR%I*:eE@@`U[xlgw,R (.i& `~IYC  %+ |hWu _pĺzK[ Wm~WG"u@@Vs0@!j7gP|@@@<[C(e[X&VYtl9WiYY]#r@@$߱k 8.LwȺ&9  UEZ!fksJiy9  G , Cs|KdX@c |_c/= ^   @e:Eww3425\?ítSrt  }|r7p"L+0]Oyn9#M@@* `ַOw>Q!#$@ BUeos@@Uu>ޝ.l@u y:[m3La@@@[ƠMPWOe2*kӹL굒LerQ@@@ O k$\3@@B.p'\| (!nVT9'Ty"o  C@+\zZiDO_l9[xwZRO@@@ Z&=ҏ#_XdEBrCXہD!@@&]ȥޭ`R%HD5/.>!)/URB@@ W}\pfʞ`o{4?#m  %/{M_/I$:zJJ:/0"  5Pku3s["W}r?d"R'   PR[ %=B@(tXͤ^p{B   UPx/2 S4Z_,F@@ <1eK|, }W+mԕX  HY3P7wt=Ny0-OtZ߽7I Nn)9  Y`2ޝO6~pe@}`k9.rs<@@-Х|Q JVx'9@znvw%G,d@@!3tƣ8)=TYk:{qOHu @n=d=F~1ʪM'us*ʗu2{{zLz*}ZƟ|=ik ,||~~ O$CDn2v ^i 4~2P~@ jK˓yР_'8R4i9;Fk(]/Wۆ*=r/U~uZӎv2Jf^knz=Iix<]~p˜Q;&{߰cLϰx9&x@ޟ&hY,J=8D ahlZx>{^h%938"uݵw?#e @&_hxӎ)G)WsjunNmѦ>2-Z鱹uzؚTt:#^d?+ lUJ%JC3cxU!%V[C @&h+ k\ek; kޤ5M LmFjd꾖A"{XJ?o$' ~00\A%dHUH/GW°>Y3dSݲ~4'#=y=e 8?K -osʰyֺ)YM=@0 H#@CuЗ+Aw|3I{[&YЬZ:]U}_KWA CF/%8@g[-=dhw: H :hH`|#!L.^I  ?WoO $2B8:gC}Ȭ?kG@i&GUJ٦jIYfFոюNMFk!I+M~$cA FH 8x}̨ 59 굮 ]`~Y7F# HpCq|?af&_ f~$aOݙVc|ysCIy $P hT'e Tc]fU?aIpFe]N&cǏ?^қ@iU0*C$ @Jz#ۻԎ^^i *#_n0)A&LI(x81F&3|:@VQ ;zlfoCHU_z#Hpʹ Ix|t_v}c ^%~i:<8dq! P^5ܒ.Z-c4Nc.\掝aemet*=~J&. FwPn1He׀xm&l;l@Vk8 qXּS&X @1vL-vZV T@@ Fi{wNk{.!7ůz) @%?>}sם 3%*c" r20nSoy@N9 KS.J][,SA@@]]-,p*}Elj TT B{䊴gG@`j_Ozȧtl}Y!ySh@A^9 l{6M"Q7[/z%@ S>xIݒI_j! 
[binary image data omitted: continuation of preceding PNG logo file]
[binary image data omitted: pydantic-2.10.6/docs/logos/google_logo.png]
[binary image data omitted: pydantic-2.10.6/docs/logos/hsbc_logo.png]
[binary image data omitted: pydantic-2.10.6/docs/logos/ibm_logo.png]
[binary image data omitted: pydantic-2.10.6/docs/logos/intel_logo.png]
=uo5&o}6Ң7:?qC:, |?cMj:[vw|:rL_w;},4M燏[za.;-ts٣D8DgNj{lq"}Z:v>X)!f9m@z;Gr1[FKw)Z wgzgͮ%^ѫx}p{WȱYw8[N4}W:Z]`;G1ݲT>ӏζP\xivEZ}C"я9kw }G# ũ~~9dvݏ~---wWҟS.-2mo|ψ'GGmGn+^w=Xe~7ea8˅9{HR[MkH)E92gki| ?21D|j5.b<8N''9w)y{[0N|Z vg mmh zc`5lZ0lɭgOx3=h/:&<-T <?N|=vl}nJgGq\4oA{z'nY9= )[Jc 6܆P#78W?澗Mڼ3yYuyOcSݧI:G s,/ԁAϙAAw ȳz?L2:g @FC)iu㭡w0=reVucFm7u cvZOK]4N;ar\fS_T7g7Ap0~զWꞽ+7FMf>/~ye~g4={5[bno2׏&|F9%~Vu]<\5a??:-<5Wwiw=9'3Oyzg05 zg~;.Kun?!1NZ!{} yX3.cF%.g5ꠛ)s]3}A{žN +OX>K)RJ)It\F__;ܓK<{ϕ3H{sP: 9}o{\9z5Rmvqσd)}|{ɖO&Je)UlӗճY?Wfv(,usε]-ԠzjkT.3WٵfN)W\@ ǚK?TzY|+Էӯ?[ g^}pQqD`='t-3Ksw-z^ 2<MW/μDoy,ik17ɖ7K09UR{*O1)7??`sܫ{%[tpndtߪdt @F2:сdt @F2:сdt @F2:сdt @F2:сdt @F2:сdt @F2:сdt @F2:сdt @F2:сdt @F2:сdt @F2:сdt @F2:сdt @F2:сdt @F2:сdt @F2:сdt @F2:сdt @F2:сdt @F2:сdt @F2:сdt @F2:сdt @F2:сdt @F2$I$?;2@qyJIENDB`pydantic-2.10.6/docs/logos/intuit_logo.png000066400000000000000000000305631474456633400205570ustar00rootroot00000000000000PNG  IHDRߊ pHYs  iTXtXML:com.adobe.xmp k,$IDATxy$AoVu;]Jk]E,  @`lc18|pl` 0v`i!ZVZrՊ=fwwtwU7uDΛUU )%$IVH3%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e <ˆ߄30 JHCv :8` v S@YAV\BaNj9;PZ%xA/4p^50'ǁRcܷCa% e 0BwyEظ R.PDԫPpN>5xwiYD(JP'Ie{I-! ݧ;],<,mm#$ES֟q|+O4[Q>a$>K%a?UKͺ S"o"!nHхr ~s1( Rkͦ.\ Ϊ w;ZP6ÝiސzMa}{ۄK ܰEj^ﱆX#6\ %[X: |\.wtg>'|n _tJzu'wqtw-[g Yt֬ 'S?n%rLJ "?s}sb|;y.Ȝ9t)! vor9| exSߝe~?X&A{>Mn5'}Os~&(S=L@/z͌hsw w8 ,/:N1ξ?ޗ֒&iDw6s nL! 
~kW$ wog_r^]fZmā&lXLN-K蜅:sw{q<\gUE~ԅpfFnOУ@W+~钚9(4?K܆Ӫ X݁y~0o,0_u }͜OjhHU;<n=)vp|Z3m>>6LkL;Qc+ebX{iS\~o=':/?[\@(FsgxeJ|1p!}B:-N v.tVxJLE [p"%3)VL)s)m T5w=Kw6<.lTs_ |l*"=rX |dikkP౩kTCxK, x*K]I8{-f۞b4d]и fNʐtNu}h.ҟjY@O5x yU_)@w~^[׹(P@ {E۠AԦXP4,a}'GE2:C}9=A4= :SԬ.jn|;GϿx]ty'2jqwi8|K>9 +C_kB=M~<)NIm8k4RzуY='2lz/Aljp|lVë`, -.?o޻O}7E"lKj-uBi|ZOe)a~(r/e6wbA R"/s""=>&ЁGkj fT`V w~9NCu(.w.ϋw*` _:$raX"OoB\˵|iC}9(:M z(M;^WV;+!@Y_,4~|>t6[hv~6[uF#}nqُr9=c 4qC~n\KoR6_OK^VM*"LX]V=6xӃ,@@wckj'_-K.zǗ[)su|l>ζ GpaG*Z t1[I NwtqY6pO}5%waև2YG<{ʵM%_=΋WIF; []xAs"_hhP ;*ş6\XI9yK{|?s ks')-"4j$A4d%vVu FR=)mm^ؔ+pQ>~*BӒѫ0;УKhAUSqYGt/ 'w,H̯5P_@OMpSiھfAu&W_=X{.SEMENqg4t<1>H V Ulqa-5p|#+uT%܆sUYa>6P_@lSoM9@?O~L>5p!ptB!c@@æ_3X 3@Ӥ,ϙohz3:Hg84ZkvoNrm' ziLjMա~Џv:PG@J-}PVK2KάO^VjVNLSڒS]Y6!xfQ\'xOuLq͔|\|@>zd2ltB~BUa5: O!0;@/]WO:G˒,`|2\Si}f=9}5;iwEPo0?3KUQtIJwxg'o01ԗ.c2g/0q[_L*Uu.yÁn&p91h0\~2#m0˹l7ղIB|Tq/o0!ԗ.a)nud %.1#i.j8XE;0j.hpϠ7, si1FtRwjP)|lP_@_JM3bu =cU1yq>{ |iTw8RWj:R?~S2 j!)Fwpn9puM{7ǏrvԹU& u}:š p"pUt:7Sp{F7k0 B@}kꔣ8u-QO>eYY[fTnt,4|Ҩ_pP7562sMdK]:jff,P{'E{YH-[[v]ө@d ]^o05;k0=o'YF:FnKKn'ɉfn2͹!錤VЌQaK,AQ@1ifK?+[ffy X±"j}JMfeqà5p40ΜMb3k5`e@XP`{yR吴51@9L;<hF v:GҞVk tʂifUwx,FR}UWA}g;cֱ󽳯 t*8y@[.H>f5Bպi6Rj9@ǻ^ڞ $0: 蚕5EB7Z[[2sk%ML!͕ z3{L'wiq \=d'%(J(:ڗ.-0fv$z_a 銽3- M<}t>/Hq7N)v9Wa⽝*ͧ]Zfa4{j{~\Hp]t'b4 PnBQ3R^ a+ C]@Ei6۝@kf&Uh BIBjh9Q-v{}eT[=@+^]gj(Au}~=fV+.< VeX)mq~ t]t[~qNӽ 9~cVJX=ΨI} "ħN4Y. %&Sܤ]%=QL~ %0Vn@ظy: V[W;JNB3vp{cS~l5|)Ak˚w| ;|d X-,kfe}a¨v Eo{SK| -v#@倗u |u~bYZ )W?л#Z n‰Mth=z㡩{ף3N0ХeV4-]x입ri.{PtIZ_ Ѻb;=ء bCf}n f]&چoU>j9>Omb&V]Zru8v5=(|NfhfnFM?t:Щ/|eYz=u'ñ0Ѭ/oݳM蚕E"$X :3Ē9;oSCy"$t'[l1P,`NOᙦ Ғ+ 91j햼ziXvvyqQl򙱐΅ {?# eH@/TVL^ێJ]ް 15 byZWT!_&FoK"NÂ&:Sc׵ٻiiHiaKK(¹+Ü[L|Kf+JuxKu]wXq{}&`kj #zkh^BjXmsRԉW(j~H|k[Օ=~-)BO{>i͐;4GڇuzпDx4v)|>z{:Q"UhfRnǵ{N?"5|s?|s^{Mڶ=~!f'ݭ!?3:)^-*+GSIR!<7v?*0 oxnܟ"?A' {`4XV|NF RhЅW6֔IjNggM@Ԫx3@g8KJL$pWkGkMޓ$x'\8u_|\'i W0t;胼L; P=۞ n a]Sw, (CSwkft6[-0 yP/Z/ht@D `HEs(-j 'X}E @oyM5ru1IItZ(#M'.QgBz‟:Ui)hTGHApiàSWh[AÚI-Ⳁmà蚙5Lj0ǚ! 
9Q #W|aAk5O!o/FH[;u :ڹ~}>X.Mbww剷xhm 嶱 P=b+͑t{3Cd+:_AJxs*xuQrw]_N%F 9}:鱖Rv+!p(X˝ę}mUeOu6/[-,;+)ro nVZ7"%^l0CXO̳$29F,ǀ7y>4XUH?Vl2Y)pW%|b\^ݓu0:yѫ5U sB$y('IbK] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%Iʀ.IR tI2`K] $e@$)$I0%I*H^IENDB`pydantic-2.10.6/docs/logos/ipcc_logo.png000066400000000000000000002055721474456633400201650ustar00rootroot00000000000000PNG  IHDR|E pHYs+ IDATx\GyVa^S;́@ Xأպv (VД Y3orz3w9'?Bk~eW, ވ84VUEYg0 ZrE#OUw:_0Iraѯ?'z̙;vdmM3uA&L.sCHQ%$Be[ݕ; @ O9$2,iXp(*ox0K<2@ X5z]e’AΠe0z>n{?{=|9tMc@  #VH{ARgX$d$tKqj=<;<@`Yyf(] \P\qЬe@B" [ot0&WC۾؅xh{4%nP)~v{pyW4؍gW 4؅c. uҔ:fFN .o훹@ ?B'z=h+D-qZ5Gx*_~@ ]ҡӰY̎qcΟ@n`k+0aъ1A<槤;.>5-je.bf}U@ af m؆p#3&TO^~}ў5E'?Cذi2p6$L8ğ`['n 3fNз:~5m8MZ6ѹpw r3BaCf.7 G)0 }kREBE TڅiϧDLc- 3 Q꭮`}kҫ@ͼw"@!Y٢Y<>gs]`sҢ@ &7IЬh6 Z2&2;ׯփڴy 0`sz8 7c[3o:|@`}!NhVϬN>/%6b[:8@`]!|p5|a۴ė:`/=S'G@ +r]Uū4_ǚf{o}hqiqšd2&WCXw 0!x3:ttg&1i`Kf^&@ _CxMWx3}x@Y< F0IM鮉A.< ozg-4N,&${ Ҋ ˋdK7 _v4u@ ~{˾J9'3{ knBDd4"X;ٗܧn~k@  r7wo߂5[-&L1$;Vēn /n1>/|ms3p;]bLڋ!E/=@ߚ<Vc0 Vgg_x橙w 2@ P8Cp>M ރ+ s7$\DAƒ<Gyln @)D=KJ!x#>|`Kl/󸉿Q"\ Gp9_:>ҴmlE']$d: ɞ6t{>h V0a'ƶR9kJ)6np`٢`^pQϞ0@ @ @ @ @ @ @ g83a;{03|kOnHAq:{Um߾,ZO%PEefQo"oÿgl(! ď|;S-5T.ys)#9'7K}r7ڃ}Sx n]4TЋRR.y(c朹x%z=cޕuV_9jf,?[VcbT\ɕey'D+`hdal#6vm4\@VzaK1="KN:2_g3ER'~q"7}G沙9|_"F`3ޞԁ! 
)2ri#f+>1?z-r=qU'=p샟K%ƬU $aVѮIy"Ix̞Cy;s'77eI.=D @[ElX6$SEE_&BioOkb}u!#L!ƭ4m5Qv֌WEsBW6)Ec}t˥/j :DVOn0ƗW?nx^/> fa; 03$9Rk, RIG2N<&>Ƿ;1(#+%~; /ixը'ɛKZ2ih2dHB1'7w%ؗ| %noi1ve:#ZO,w#`&I+EZZ/[glI dFy";Q7X8?ȱMGf*Z: eo@&Nzj@orⲋ2e4 2k3Y_Ǣ(`gMZ3`>}s]+(_^` KmOIMV7oVo)udxu { 0:<{.Z8I747yh8ϛwMbJ|gQe,bhC/Hz3Wqߨ6''m.qgX)u+4k8ix.>ʲ|zɨIOHZ1S6PG!_jtdq`/5|*q5pfMZQTLMbWz+cCX?5w`aDLTݗ*!5JElIw]>T*U_8Wn 1Tp$Hj6,DnV[k^,ոQEӢ;ҶRiOן_Zn< h z^w#S=o3ᢙ]Q.5Fgi=ї.nNM0U}~d+~x]vgWaz+Q!كg#u Sgfş*?q#OlXu-H W䡉Q7)qUi!>fkTƘ-)ϤhVE\$4-snF{?tnn%hֹә.Z7}ibIGx l[`js4D6buEtvUʕt~Qܧoh Wm~A\SWi5v {Σ(ĕU`6a/?q`qck|\aVE Rr$ޔtka,# #])UƾYDb¸ؘ)=G8*5+{uUG>8JUi֒9miH\֡&sSK^6#]ny/pǡwrq˪ZO/M&Pdt I݇i#98kx?i__sV/U1ar-u19Ӭ|5J}{nlsS|Pkm@Q*E2kR-A4מm^* Eq)!=iIIKw 6#( Jvk-9O8Ia*(iy7G?װhw20)#ZF`I\%.S - \MUCZqX(4T$AE:6$\[_qno/m<"b#anGv睾pR7vWn9hR6`Ɩ0Eƍqnuz㝇kKNm亳ٲqD~o;bxH٭Ö ⦇=3tgMLoK`7ז*xZ( gz _.%h |,ǁ͜}7^G ǵwFRw;ڮM^6`ү[4.X˲7 pᆘ{0kzIڿSo6W~׷YSv'`eTm ٲMO2$9`JgܸGwpy]+*/]Ղ͊K._-ۅa4-Ƨh7Io~b q +pr8ĺ%wC!y3 ɾf?eomq)8ϋ+ih]ٌav#n^[Fx¡p/D\%JE+Qo1,q]fz;~/ ]y^6@]vHMeźִMoE\+gWTʶXgZۑ s;ƃIf:gxY4]ʢM1jxk}+}xM Epb-tdYYO,1hwUtѵ[*/`&]lDڙ4$/%"Z*eqp]W"wCu%К,%]coX%vH&֋(_I%-w4ew]熏=>v =tM^Yod 2.O?q876C/J. 
ֺ+SIMxxBC7 w{ ^(֍Mj}Tgt"7[L]mi&Lm\c$H_磧su,OMwU|t;-=۾LvCFո+O̢ͺeLGvRnafɮ[jW8ܵ&lƫq9S熤1ak{XN3vh/Hym lK :ܔSj"@I3\+bsJf/F{R5I+#MKg-U& Htj`g}iR A۝f"ԍ fVPlI}2r-4Ě6XP910L6ee+7s*A2?f-z}[{0Gtc͉3nϿÅ0/Cdo6WQh3dQ9F;񾎷: i|TޖPAd +-[o}z[*z護}jlo,MvaXYG2~c~,1yd\2XfýaӦқƾ.P4|2hvls&2?xh=AN  T1c& J]mL{Kodx|"I2j?NNgc&t-NU޺Pңo:0k;`QK*Fg %6eO;E&oj,iǵyO19ǘ=v퇞~_n{]]6aɳўI'_M}:}ԆA^0C`7G[HvU d=g$fߔx xcwxz\ٍJ}Y=~BXlN>*϶ )_Q(Uҋfw8fT!̐x$6g[jk/?Ыƒ cc3X:v_b|j,pd|{fxIJ󛽹caNP8%'vW&#L ׈7r?h`.Gl"=.JlfFka{3ԯ'ccQ تd t:ܥz`+KykfO AwQ"x=m~&!γQp/578?躝Gx'^K'ԵѣTn(e h}wTiyְD6#9u𕺳SG{Аܻ­ NiXzr7X{ư?>}ͯ܇z8b s&Zi5^FwoJn>Tj& .A<WfMƖ\}iibJWҿL[{_M5=3FiVb7֤.{Iѯ>7ۅīE.I9}iZD/5J:gs֘gk>zy9 ڋT԰%Ktn3}D2X͙Jj*qJ#tn4J^ѣh6M~Ge`x'J/jf hr_R~ɚ-4BnK -8LxoJ|9KKu5ՓvYyWGQܚ_qW+A5|ִ@>'ן}#}(&|W.]W@epNsqNMt 621| F/4ЫAJJi@ͬ?'J?-0ǘrXV$vnJRfArڎx=X5ZnIH43EM}{H݆޲{˾[1> ֱ]W5䶬EPv2U5j3kA՞~4wޑusꍃK ldyHͽ zf Fff#jq-\n dijbJp-/|'j]w1?zlThҽ x!#2n[}4|aN#yvnhӳozŚK穻[͸8JŮ#򒔁Kh ?J%w.>ׅy|[v͎W^aC6x ځ]&o{kW!\.Zk;ޯ&ÓcV%Ճ_ةivWҭ}&uxɸE4_=h\ "7&G݋И {%n=ћBG2sܤz4V*WO5q)imt 8̸q8ƹ{'v9dggMOܟ/..4/:wjWwMBU{-*oc!ؚ̐Ҋ<<=kRq.kP~ 4)zWVS qTz޺Tzq?Q|ǫGj|i2cGee*HVs xFSk[S!?0󵌦-noF7;3Ϫ'eDvzs.4"^eGG^|ba!-yȩ /m##*ll!Zfd]iGi 3w\z Oͼ#=*ϊVѺӤB[27iOzw°?7㟺չ$#Fs=#&^V. 
?iݕ91v;IZ鄤]!Y ~qj,+=Ի^G/5Uxdd1%JSvU) /_,7˯ͭ7pͯ<$ ?gYmal }%vd4 A:ql _%μ{cO*Z*:^Zr*Ci3bCygJxJ%1y#V~bk%0N/44Ii-_f doO-?E{0%z nq{~rS-~c{!4'ZqLGܟM|=JMlJ޻h*@mJ+q$ޢp7')ڟF-e@ 哠N44Pz(iZgfӮaMJMl1lyYE'3ߴ/ ?jYc Sy$7IdYCw|NŌ-Dnl}4̃7yE_O1 3Vb;hPeřyKz~ja<&˶u1&%Ć qj4 6YjI/#'V(V({[ s}oa*dZ>'ˤ#hpLN8T h=0I[G@ i4V8Бg^^Aff^sE+}ZLαP/-Y 79|}Z󤤹Ͻg__sBCHDMscK67IJ)1QTzV/f=6=jۺMʊf>f WwKG?q>CyʨDY)'@Y33o?R7Ϭ i2Ե"M\2!sC`lٖ7!2&?؆QC:OUV,al C_jTAJ6{u<スzupУE^,"_efTgQHФL?2dw`-+>ṵ#8e%xff8A3VgSr*tP}0C׀OTUIz8_Dj!^A(0V7&7LO'ҞB5j%MEK6e⥾ *`,Y8}}5I] uL ^6l4ԌP&# o0 G2PUb;)ިٜp|DcM:ʫKyiMV6fLZۆëK:bQOn6:.<-ăcP5z]u\*eZD 1^o6y- $:@uz:@\lt   /E dMA YqB38$z(U3qM( O H2O*7 M,f\8=m!- 7<5{iii}tpL,|yyo״o>PTr76D@*Ӵfڇ&kEg33Xļa#C=:WTaK۪q50{OO: uoɟt!S:>ڑ4ڀzϦ 0'rP+r:Ihw@qCo+!?[Ůl(yj5=F|,-YErҲ|o{8u~4QC5Os[)[z`^jpѮ-J]U=/&| w塺Fΰ*[uZlQ̌nDԷrk-2:] _r7 gaB嚨-I yF=J$Άhs!\6ͻ˜VK e06Jb*-Z-é91䞸 PJLiGM2P:J%w8 )SYp˴5+ =~vD50aĽe/I(PɸԒn93Rzu`R|e/,TYįmُ&ҹwN(V{@}rx2NY]D!IlEu:N>4ݼ%sB$eg;qbi>:JyRuGӿLZʭׯ&8_'_,Y3@,lִ{6T@=DMa&O49r!1i`tVNO ivkjỵ.CΌnuKҽӛ&UE.KW=^4b `&u~5ߴ#w: 0^5iBMFtF0Q|l#2%f'f%McoԨ ;"n4e,UwJФ24= [bwr~M|.;L[+0uo- ]ɜf-*fD_&UA3X:K;ʫKѬkYm)ΈG(&D1Vt<.>ܨa${lFaĤ {FSdIQ+֑}han Q3Lr7j"#Dlׇ_Fw3]Q֭`ӬR'Gcvt{d*5xVfe V-'|qhBm/&aR!q*>5A-"μOc6B@bw!l{TD-])P}ԧB#;@D(jnDa# oQ(BpGDnW<@/GSjwFƻis}L M0<nkj5u4f7ͨ'ut~4- h; W~n3l z.J] >&zf !ͱ&N@Ϥ]/@P?f1YFBj$yϗ t` hNJ;Yr7 4vVܡ*lVC}YäfW(}L]w/i,:ݕ91µIJ:mؐv_tItGyuW?('*_~Q!P ۡ>Y%m0v_WtOV>43U5^\'uX3ޚ~ڡ>9h(B0C@S D}hB@} a_Pߣ(=F 2n@ ȓ ag`B@}E6oeC@'wC!) &ԟVZXc.PF! j!A_kB@} ad͸PEs2G4F1)WIo;%_h֢@;Ls]%Pk}h2f*z fhc!-@B! {G5pT"܈@ a y!  ASPLQS۠ݹ~n35ujS+kLhr7!~P: ԟV#kM!МlW!~{(}B@}vWr8&* m_Zhsi]@;P;~ML@@; !~ vL1ԟV!֢Hvr7!~yB@}Η [PȐ! 
@ Or7!!h =j Z~>ۼ; u떞8ҿ&~qQb:l?࠿=5DVehnMå5IAo̙\7n[sD&\fBh'=Wop&8ú,HjLu2֏A0{dg6o\,^4s-MK:nѬNǗ ڲ^uO<.8VoM놂=pي,O2՛FFM0l%ھ5E ː dJe=f/0E;{qFB+8~5.e M*-'~299ܭKߔ˺<ܲiIS7+J/@i90l$X9&C(uCPAUsD5kssPh,3m]'ISЦrOm{WAp#w]z֤߄B]/nC@5!>ѐ Wv.g@ $wC.PߣAi6_ۻsP搻!.B@}`B@4aR|lr7!~"#wCY,s$ԯY4!~DtyC@}B4ي3/)eXPߗIQ!@;`Xk`ifQ|`Rit7'Z[2/K͠~4뤩9V:gJ@=RU1OlV4^\2FA&>#:ث&C=EʏQUq5^рTw\柬4C wCS@}x`r3l@4i j)~|8*jMF: 18AlxV{)8nz[Jh oGʥm/1loΌuWm82 aQ[kOM$jW" 5c?i5e|)W (_>V$ki5.08s;I-m=t¬OM>t5.*7#b{ힰn>190`33¹s7@`}`19<3[à! K7X4I\QT\^Ùgumcj_|;rt+A(" kڏ&-a,͊Xq3xCGJTptn&aҝV |;gwV}i,UE]yLNR2a2;qr=TZqDD0s 1Xܚ 9kMF2g/pX̨dF&NuF '7MflеzQY!u4q9W>70!~C"j{pK6/ÐީQT0wCY2s+h!~0ͺ _(jnJ]]P߅!䝨9`\ if!in5ܼ'dؕŧֹB@}tD!]##wCvz|9Δd4"fyo6o`_ IDAT*;/C{:{&Rt#0aA5i z-W~4E TS"J/~o/RjYc:qIYOi2b r7;D)W J a ~4E6pT"Di0ڮ1$ѐJtViAG49:e {f}A(! @ Or7Cۄ~5t8@O.!hFλh~;܈ޏrG6[EY25 uvWSPZԯ)ke^H?|$'į]&eFTE9zC{WM aU@8gx)4;n!![@^t +hkP?ʋcgXuOY] {j#Ϋ1pG oն Bމ' AV!~MS["LxTt.Ԁg+w8f)71K&ɰ %F\# /z۵\.+[`(?x31DrEC@}7 @{.KTPǛЎ.w64YY&$̏{ Y]ng͔yY$1~ j@Q@}tK2E3铫"CX01pxsEm{:p_46̛Pi :_r9wqި:-Vwk} ߠ2JsAxB'LO_荌m O^w&+#M :Lr7!~yN3,gc; a/[c^Ō 2ot![ͥsTZ*V_2~±vTHU#Pq̸2ȔjޡqW}c a?]7 \ svΨ|+yk֞!-#Pr: sl]ʼ/11ڨ6Re` IVAYcg@4׆Z#rgYlr.۩7,lJbQZ6+2;"䳿o|nPuTuhrz~2-PD<07ď sN0«nzټMA#4Fc1`a~lդyyM 61yVN̕42ՔiY]gO÷{5)'f8mRb9s΢zc.BZ4iw !~M4bjH*2Ud;%&1ږAKJGwAt|y膰5"nn&\R8KRiT*)/yLct 5'nH[C4;Uġ:iE[{m8 D&7\,! SM]qUiIͨZY[kv['}qݗVi"C6^P9Ӡ:4)c2g[<` ^Xuౣu٠lzW2i P/t׮>iA3LBJA26u?TYtRpt(w&C#:`˯2Vv;;}ҢPsJm_+c[7&q̲c W0qW ̓]t/NN޳!;>`Aq '4 84_i 4*) mޚ]21N ${߼rJGiB}5ѐ9Kpsr+'_A!Gh@XQ\Z0~Jq{WRy Ir>pJ-SyF*`AaU{A-Z ^}b8%>,QTW.?lT!>L^U7PJZ'nIiBє! ,p糼tNy}C ЕH -:Z+W;{&08l3c-(5A+oZ}>as" !%aupavٳuo&:$Vj4IPtwu}Lƿ=q;A]sϋ;h TP>zUP\:U+?!IiBM@C_r@z3s%W5n͛ ǏБ9 g?5N <)MpQ kHФdx8B'4Ph1#sZ}SbfQm=}=MOr˹˾{_}ߖE^>Q:a΄3x̞nhO/pN?_ Mk\MЯf.]I,5s$ d>xNO(q=tA&3~4&aaq]ūooRs/54ǽeEߺYqm!Nm%&z2 ,̼}^w]Œݍi u@xիbx1ۻ%w Ks{Xb0I,Ok_5sܿkv7m^,O5,*xUI¤1~_Mx1I⍊7bv44nj!lD4AHr<@}k]τq_/|}Yt" Wl]Ds޴[V0b8fe",ԽpРtlFBGz[m=-s>e k-~$YJ^4y`ykOzi Ba20%iP|`,DyKg  7E]?<E`_}/Kv#V*, . 
*3Y7pȰ- S@\T*'*(GESPbRQn&1L30}}/0.qm 3D\W}=thf6~uWA欰2<&:p}/"a "gB"enh6n 40@k@yƣ^Erqdrj?n2~RbUc)* y\ŠP- p -ЂLhpxK}6cӁ bU4T/X){)!QJ!+엑Ҕ/&2 1)#&(%`/ҁ #B'2߁$z_W0UT? sM t񞟴Inf%"/"em݋_V#IU]Hv!XH6g6: ,X<S3ԒN30f %(l?PAlX7hyQ7'&]ZAЪ-Y+9PͯQtbQXEAI+0{hdeY J ,/h!E$pl?ügל=iUқ{kvctIZGqkHSXH  @ $}sGakaqيEwD |J?l>gu&o[ qlSVK_Ϗ*sBnv8w=qUSbʍxs=P]Z68%/RJ$GeidVq* BUDyNRL Lb xM>񙕇)![t&aß00y*3OyEd[ (}`})Ԍ.Y4iB}JSdw}&_$ހ~h[=Cm]LG 2sICӿRr(?+hr .ʛ.M )/zMupDž>7m$/ΏlM8^R5|fjZ~ g3{08d)#Bzċܾx)5 I6;%}ϯ:.x "s l IdĢB Bdl4n]'MIƹ%w&} xbN\EYfIht]ffx~e'KIU|PT`4Xv=s`r'M2SK-mP>= ^]l=/z?%cukd)0im{_e?ZLAX= =o?&MqeMl͆{﷬y'ErI0إĒEmJ~‘ES~TS.̾of߶v6=漶!o67m}B&!ʅEU=| m8Uh}1)1ڑ}94'iY6%Ij4Kf&Ǔ>hʓ ՓsH l|xͲDc-4BSJփGc"oYzR raW/ayĀ6hCHFS.|~m',k:gdKܢJ>(LʔAcV¼a`}%2)2N /h0邰j Aه<"it4 ᘊ8LE36$Y瓟M3 w_6JE]o(o$[d _S )Yfoܓxɍ>wVKBTn^XXG~LSr22l/$ &M)dx(fyh~Kb `<#b XB>4ܾX $)"ɉ CESZ\c*;o/M.Ȝ#^t%96S* $ Ƒ_0GdZtʧ//&",D'MR IDAT,@d]W;wbvp=Op2%wZWRxɿv!",%2at'ݏ e<]?y35cwsR LACEάMF3SM؍t+0јc|g[ivp|\@#XIPFh{feX)Q a4z o-!~z>oG:7dհ Ms+?cuo'%](3҆pr+ ) ds1g^z(zv/o_r2]: œ BT,$—UH@$_28<#^_{psov3z| Pf}w)"Si&6nOSw,0ƵkAz@.ahғOtO߶`U_E7ه41KF܃b#UU%qdu-,ܣ޲\Α˽e'Hp/eN E]\HDgJd2f*rB(yO*b56YR ק"^.a8n\sNqW-kMyxa 4 - NB祒;bD5X@@?2 ޳O;)7:oL | o4!1shs&jk灭pp!Rl$Q8aY D?5ݞSAXYFjbfbs/xT5wm͸ɹ`bd#3 C*֋^+C+ZE>nto[vvyګw܀)Fիp [,8֐NIP\SLzI>5+l$3*{ql; kyK M11:#׾*t)4ɳ˿c+iapEibIJY-BO {dv7a'޼'yQ\0\+udſn4-n/,fIPMb&$ê؝$Ji]m.:{ Ԓ?>_YNrrw֜ROjA*c:Bp26-3EgY€`]f;:;.Mq"g,B&q r1hZKrn}Ӻݙ$k^M46%IpF!ΰ+CoJ~Øo]FʻU.KsDnt9޼4:^=+P #2$O#épf^0r[6D5gUnsWmm- , cp.z/TF,w>ad?m`wMA̒2$iP!(Drx91$%B*i˛ŽI( ‰FbME)Np&ԧH"ITN ҄)R8I ‰AP"E' RA"E)HD!MO"E  )RHq 4>E)Np"&ԧH"ITN҄)R8i ‰BP"E'. 
\"Ĭd,"EcE*'^ń!'Zi3)[)R(C*2i4*#I_)RRA8Px~̕K*q1 u#+:H"[ gZ=<<#4`19Ja])RH"E)RH"E)RH"E)RH"E)RH"E)RH"[ 'MJ~7 :gn}.w4;]A7JrX X9`Kh8f\MG?b?f9r1TX-sp-p69<ɜ9rky0e.y ~c(nupmp٣yƉ@_ 9c)OOIsq>g;f:g6]}λ^/?7>cb*bu6ϖXL@O-{zW{.`u>umKirs 9Ow꯼#K0PʛI-y/mcwU6rƟvdwmy1n\ξgɳPhɼ A,jl_o}byXm: =M,0F<]> .tzwܥCftX +|bms׵ϝ5WR>h2kb_>ooZczl~0sH(I1N}ius/w9RFC,z5̧s=Y9Ȥ>tg项 @CT_ܝȩd6j ~-١,6W2ԇ]9VgQ.8.m Q302C;7<Wq}ˁu>mK_vکs-DAtvdG"z][3amLp=t빻"[g/qYqC=5r]߲s-Ws/&-l>X:-Ra%fZg7|ryK׬{ə_}'Z'T'&=>?ӾHE!ʴ]5ŚÈ3r֖+X`>;t-+LrA~[ xK/>1Љl˥ӟn Iz#}Q/pú=|!?]hڑ $ȬA, bUo7N T7޳s)2o B&W߳}ۖ-<54眛S;q}|K񻞞LmY DVI!n3l=P!Wv@\)hЮYOVN"}V*qn:[AhrY`_]W5TGtvD=qpij#AqV_Sw SM  9DI};31gX `6n#O <(is 6aogh0do2 |b4PTA<%bpŪz3uW A[φ=j4fcG.0&{P9TI!Ey"BgԕB:9Cy١>N:9H`π= lBlBUc@r3HB'&6tj +׽RvD(^9us4Z\`&T%Й\FY{8 6 6؄#_1G1Ό:&ΔgN/Ёk^ȗ?y7tIXhnmYCA^{8,8-Ea29f6"2Dg !,FlǸ;nzݎ}.$ 0cvr & W{f%s``5V36TOң$|q^W-82< zfWFV ]e.3gۜ1Gkp^eIx,P]6<&"wC"D"LnY0_qE;ytL?k X3+Xs Z^"K̐d=ff/Q~櫪hLJ>86^3KZKo^Zޕ69oEc,63v.3͑ws*y 07Gv/.~Nm:%5ˉxt7{r^ =ֈذ]&ZE&\JoFo{9[O{em@萌Ox[G(!$=Lsv@ogCqf+)ka:.k Iud!:&=c0bC L-vaz:Tql|τu$}ڜ&p]/|Aߝ6Qp" 22.^{y[5Wq~:qզ1$!3L˰gwfKyN O2`N-nuDv)YW6-k]mOkJh{ z*N8A3vMVŌ]x_CWWzl;m 1Vc̐Xէ ^ xWQ)g(sq`܁K;Î܍gnɽm_n\=+\XdaOt[wT< yZSsٿ|ϱ^@j١%{B)Ws>fWrok22j7s -Q[SF=792 }Ȥ9-=7?zEs萟޻q )B&1xA"8ԙ(0ko5웁rwDZ;vx#=Lp@YnL~vk#1 Ѣxf;v`c"1OƼ. nsl^2 *yͫ\t~Xh:x0uL ?V%@vb|y';+ Z!B bcbDld"tEe +. ,38EΊ0D"\VsV/v-oHl͚dJQ3hf^yHTvNM ߀MLjVf`ɿcYYTӊ%0*CB iҫi{;x2 c +x5X}osd'u!aN{5y *(a14b.L0#2䄒X!"Vq ` 8lP’2TU)=4HȝzJ }ELjf,;ƺ8+i6b1z0e724Ea`lǖ̇:-(EhȹPyq"v/[K a{+ҋ@NnA%E X's >Ai 1q !6;n-Ƒ-KP}X(%ƙ~ʿ{\?=n{_oV*KN- .x#lky/+mn ttCejgi;,-ʹ8r_KI ;N Y%Vոӳ={/&c;4Y6,#x;'\wΌ1"Bc}5,dϾ$5>~ ,1sn;Z~|Yg@yJiFPu)nE_W1 "r! 
Do[VQ, ~r:Oƽغ0 df9`M_Q_8*u3j|!yzB\=x>),8Ŷ9"VyBˮ\c7/Aa33}aGU("C_a}o_ 2םIX66}\Iq8*7_] ̹殽n`tu7"^A \Eܵw#%v[O&",d< ߏ=.G҅jg]Jg~c&O o oa;ߙ93|pD(Om_WY4?j1Xx~^KN~s[?0f9to=`gmp=\83XjR%zc!̂nGn[V!0C{t;1xkÈ^z,i}6إdq4mv5DM45V zQϖ Û7BCLgZ$Wgp5#fy<3> }ʡ]f>3 IDAT5YGѩ`r.:lfw:{EԘ[f?;EH^zO<_ap'fH>+",8w"_/D4ltO۴v^HPd2{08uGϘ%gW8cQ{elBwyH=]osLr-msC* uղk$) ,Fy_9ܸL9FO2ĺx f8SiBgU[=B3;ײ H*b#&;1)D`jXkh]}&8E˼qZ vVp檮S.ƣ(_%B9~X>/>P8A>IM^laZ..58Up0u$8Xc1+K<*vZL{~ڎf?:܋f~QbvU#tP+-.`|_ҥHAQX\Zf2AOi?nĻofѣ'TXʒKr* Oy<+}=1?Pi˘./UK(q#P暻f8W6aq/>܎$-A@KmT衋.dX[`m 6CB]tLNǰvD8.uJǙCFqЧ_-sBY'\Froar^{Z,d6mfD)*Oˆ{wZ [ ,䶚,2JKzn}ykĕDIteb/hp j3?}Ce>d_9g`~Dh63&C6ȚgAC Q&b,M.9՛\iu_zw j WxQ#e>ma$+l3,PaxE \'+b:@s Z[>tH-0EG~=N`?)9/G|gCiAe o/JvPHzA`>Y5=IYG?9{I][%U7~x/?n~z8헀'okY3Gq߹@ -SIa.9"sXu뮞-W-9sd/F(EC?Bo.3IA3p q>-*RKYTt91+bՀ}asSvoLhSJP!+Ewt5]w/G?_`@nh9s{kzi+Eu?>vއ=fzf$Wt s1%6"՜xC'dԻqE͝dƮ+]W.=$Nƶ %MM3ʿj9 .Ǿ v3daD'I ߻!@dtƎp`K'1 qE ^/lfXaJq7ɾdb3<)ELH煮s 2>f{M8m/4n, "<'π4'F6!υ7>.F[1 mVVx)E=1Q2SW뺟@nഒY^6V3ۊYfA;z:_[{:#(?:Sh^a2ԖMW fY#,A7N4%T$wMOgfፃcV??];s%HJaac-3>}ίy-wbna 暍ߞ, 5\br>:S`v`5P4 87܊Uw7UrA` $ ISؖ9-k6<ZE2l^|==?2Kdv0<4YXeTPt oȄ'qizek|{Rg/t E2טsQRP Mt f ct<NJn#oz_ ȍHz+8Fe_}|+8?ܢb%sţkՕeMٸsIƋS +?G 5W;)nyIњmT[o\}Vݽ^ᇟiq!f fܤTG-.wK=>I`n\ FC"=O#~h`}oϳ܉*Cz7 ' 5w(?_*=_ndϤJS] \}Og#eElͲq:`kgyNd Bu-͊}TY&pdp6-3Μ25Z'V| efX4dȌϻX4dpurNt$Gt,8.W[>w>\ ;78QjZ Mk686\a8Q{<;P)ٴ5w96_{意{ٽ7|2I/z[ϰhа?l Wf}PJamrl}/~O[kO1;IjǼ;2+ :H"=lNn6%C,3? L^w`(0x .!{>}׌n`&rsy҈{ofs,Aa~=pzs&y|,zL7{'7 *܊6GM.}.w>a=sr'"jRI12BAKs;dC}"I}W6m+D3pguO; b҈ˌ$DRҴrWϿ2ctcz _|)IuGqY:;;Dͩ i\66*ަ j\I z νA5{'it)zmf FY8Ɋt s_&b]`W%&3^K''ޖ?ϗC0wy璳 #Wy!wH=bGfLwqBwWVx`8\"l Ts ؅7Q%tR_} dN;c1Z?K"PfQ8ai5)ie>ޯ~KdvfNNj@<:R @ɾaABddxts=z!;_,KTco>/D+|އZGAt =5mO5 EBn;FsE)EXW?N|/o.RY [ U.G4,2w4Y;%Lz. 
Roȍt82G?p/8pƶy~Ƨ ;pc#?~IO C0PkMMxMLc ^+Z "~ P)J"*C>BK|C.kZmGzFB.4u{۾yLG̝ȷO>CYR9vjr^_d?>y?@E#(=qzuw\'1[#ⴽ~9]uۦ ?%}xS;Od璘zb>on [n~Ջxo̍axc˄18 cwSxX93b(=?/GkqlHI z S&Lx ][=jQ~ELϦW,@D <}Ξ |vY]Y6O^|4/]-o;+S?s0n_NZ{/ l_ ݷކ^CM5>Ϗ%ci~kn졘S5.?789`c_uɯӷs$`AAisC`,f3ǖH2MyE'2iRLeĦ*z񽛮Z:ޚw7Rd*U[& \Ev1u7349(h;gE\17Ɉ2fA@ol7p '4e}3775G }T|7(kY ԈY(\)g9+I罩`ۺu蓏gB(h 9 "^'z/=E(\v\\`ٛ ⢠Krڗol[\kh5XtzN\#jD >{ÁwˇΚp޸;ถl)dǾ/gˇM630,x.z oK9Y2 "+[x7>.z&']bØtl9qv, 3@XY>Dj fadAHuCO*7 fdnY3#z3B7i4pݓ,uwt&2xl 3/晸e'0+0ٕJN1 A(s«bs8M뺶L뺶kE$ZT);fM2 pէvZN% B %s1wĵk6>(ήF.'$5Yhb^}&rFCޠ_NF&z'qX IDAT#!'2? `.XYQF/ ro}8;uL %b3ge"D|DˏXY|<Z`NscNL{7-yj^W>&+Y$lI@dɺ}Y kqBɓ4f"ZzwVeȸd2{ZJ>%'R<;W۹tgSWmؙ]m96!h "PtnrSO-ޤAM@ZL&GV{rފue7 o{faIl| ;1pLyus!A،{I!oh2&RO\o{fXqY$†&{e7v%r3JV{ohF,#Y|FV %]4V!jxۿm@_zR3Y +&-8H>iI +o6zaLEojmuݳM^Gl1؎7M*:+9`\A!p$ 7+Hoo tVuuH rgTfY pϞ2wxY'ւh҇0mؓ2!D3m~l 88^MRArL3ꞥl) 3X*>3;(i@+o|Mjh M$7 0,&O=Hb^J9i+)6߀Ų `9#`c՞+84Tc^addY1Q8RyDOb^tnhuwO:#/(("1#sf|s0 {Cd"wPX}|''$_&bP g,`q}&Le$|%;*H8 em@^}$$nD[zDםYUE# M qd< >kYnC'\e^ oXbFW5,c%&>F3=X !J})bNy, 1-SR,9hx%Ⴁ\ϘgZP&s'O:v JUI82-ČN4gOUN=7_{^z񍏅쪍Z;,(* ,z1&W %GYnwλIxU&Nt3c%CKzn]{FV2˭yv^ʼ2ȸs=Vjw<XE3 UT`4\ܿN hN8f(}S~ɚU -<,hZG3y݁+kza_z؆0 "5z9ar!+ii9 ”Wتx/U8{STVîXXJ׺`2A!bW'r8n.L #ü:cAByφ js2%rNRd,V D&G^%gKOF>A*w]Mx1 R*JGT ={,y-go ==g-)ca ״b:[o]PtgdE*yt/[UO*|}9lesn7,Y_\|Qkx5ơ<(ɑaKCL9]ۇ24?ee4Ll eF^㮇&~z$@V՝^Ӽ^۸+%`FA8=A_Wm3jLlRo^`f?-A͘6yJ?xOk 2hDʕYv[/׆=eMjM9EݒsP.q{шɄ -1N ߰85F1bF)Y?ɉEgݗhW8#gmvQ͑YAnr~}z^'Kv͍Ih4ь_Fc"3qA%:pM9#\.:i<(bS8R8@2tC^$5 uqjbB{޲BH< HJe}WQ*TwD,P(N{`lsi7X\D.r_|[[Zh'y1fsVmtHܔIsAHgIz~W}9tJ QP0"4#XnA{cD18Hgoi{rCLUb-:_uL T#L9rSl`$GGHP<9s&S.Mnɛ $NYPy\V\rRI`Ʋ觻!.hˆ'TA1Huu=ăt+r >< YL O7I嵋Woq)BAyn"Yз{K>廙d⪛v`<ZV,gQJ}@[HsҨ0ÚXߤX:X̕܇&A#" ޻t$_Z^$K#M]vǃwHڽZ[֞5<7w.h62I=Ft2%G:dMj)7-I:$Jqa"CIʟ @ʼn\yuJSMs~Q% TO[zJdWi@kq7#4f6;-$96ECc^X6:K ɼkfYæm%/<⠙¯;)0eGoILjV![>&|۷޶l>2EQ#c(cG$Q$LNK;}S()q1>zوLr2&&e e9]*6BQt `F|ѠԞFE IaE$G=>CF {[o~ꋑ:% 8h)z@GcoɈ% eK&Nj+_C;v9~bhldumȸP ؏M,y {_qG}9ǟ]>ʧ?Sli$#PM2[]\Xm܅xL*}1YvN̹sX Uu;P7-ӌG{7.q*ڰ?,fc<H>֭;=lЧa+_՜~3Cs3;@H8%?aփt;6]godNvHX0rݽQ_~iq 
1cׁc8X԰JA!0B[ug?x#0IatM70vX1=q ”W {F;nzA>nretvW}ݣMŠg"-EtѰ^~Dib 'L]ّ&,s T׵7`=$ 0諾X,aӠЙskmvI70MMWP I&&3ŷ\t[G4S;&J=.VݲGzdߋEE=^e{3Wߵ#tXqcI#Y{$G{nXR-'5v'rW1k>ésc5Ow٩>?%4w+qM?ICzAL4fiKJJJif[1 EB*p&!$KA0u*Fٕ=􇕼D1I2h!l!&-M@'JS!ҏJ{aKce<ŧRRRRRr#,*zk "-hvnHb,f3;$ws,))))na{9Ų ~2`Dh5ǰ$$:QPU(II+%vչn{ռ߷s~?nc6$͛93oOkK-Ϲѳnڑor8i ϦƷk a)" (Hꀣ͈DP:}yYO-KgCG͎Ŝ4 h7o}Op _Ql>g Ʀ?QxXjJz'M>ozNgۡxGIII1sUoML~v>pa MvZyYF4f28IB۱LidtkIHgZgj! M& (O[>@Di͕r;V8=վ^u4x\3}x;J,Z7/Wꮼ eExQaL4W" CR/Ƚ3k@h8<:oR%^@=wY Gy?(9e_v݃#bh`f0HWgO޺y^+oxL$aN63@'0, _б5}&^1xWDSIlNwN97{wsX/'>z$#Dv޼1lİwGJx0`f97bA}CێpJӢ́\|C:ɣ_k5{^&EβytLxk;vMYs^q:_~3Jzͺc6Ѯ^CB01 oFC^FXO1&obtd$# c߻ 53l>:9udxx(Gsv `ktSJGI@cqB2vLک@TVR?uO1;Zlyp;IXCSv|KJJJN% e~I +%oDd]'FaӇw 4_"T@z[]7e%bDwP+٫k {#qS o;jpyr^7%BoY5 '?U%IHፎ>PO0.QBpS۟'2& / &Dt'(`B}\ok'l;=5֡xIM5*A pFt#6HT>:,JU^_/a[c6Hazc5/tQ$>O]} z?Mp龜546~𡪻hPy$I.-+=b}9LLLr̝s6KAAK!eA~͕.ܼ=?~<1퉎% Ax ?89g-)))5}UŘy/4d+gEa۝}pSy|lQNluxWxuT^u:}dǂ€L mo zj)YrEo䶼7G:DaM6|bs˻g@5rK22fM) kㅛ{0U2W7D@ ZRmhgL<î'7m=3.Xg_CEsT\/k}L,E~5v,L@mqmZWV)))9](<B&-@21>@s֑NsZop;//0`iu v! 2 1wFqK/檇$5~Dqc|ZӼo,JHW<ci ^ 0mPqՕh-8*ʿp!YhoX0aO ؋ǗXNqGwTZxem8lZ}en^&萠n<{2q<µ!0>ݐ\L<X+94"Uzc:`8uCWϿ*~ۨW(&-'a_Ng (mUhǰ'H0h9x4? Ų`vNg:t1,D*3ZR_p(EV +* ĸ#43y/_3Ϡk;{ؽoq?B_JύLےdFᅵ^%&Crc6~b&S`w>G #@^&+&vo׍S+4c&C7y+-nH8*鯌~JùOzdUvH3Q :Eo Z$J"JxBϊ"3 i??Ń?wڶB, bH𴍊yJr4<˻,f(}6-)))yP H[.mj|ϭ@\f6.wÑW9oRҵ7xM;󏺯Ĉ% !qd7%&8g0->=R~AU_Cm.uա\-))9%QOb@E! Bq#$P dTCd؍5+B 쾜ni%SdttN[ӴңE"_"0e%k8XK++X;/|0Dpo˧7^=@/ T_uӎqko; KJJN*JbUOд|]o0AuGgۗpJRd| AZ~KX[R0Z=6dct_FaR {I&+3Øߏ9,iP8S T~sܛ,X7@&4zx333.߸`A9:XJ:޼IT) ـ-c ;ƶ\oFeYqwm%ioLvnf%%%%'SWYƚ-t*aGiIBMI4Kk aVL!Ő~xC}pUN L:1>6K"[[1S@b]ޟrVJJJJ~jN!\sp`) XT-!sz{mF~Cz˵gm}/eϫWg[,G>y ]~A2&” c&2E(O4 v cAD+&~a`tֻupNL`w7ͦw?xJ%ƺSz]I[)RF c&G!?^-GE1T+!|P]x S՞j@܏g# \ՎJJJpo:CݴZI@UA l!H;3ʎɉVźJ[ !GT&M$R$Y"u|N~8Aռ AZ`$U2'tO#RYL4,)zE8_4 @jJፖNG8'j haIWyoy9!_s0Ԓ49W]#E^LLiP/}Uui=CՍ}C$hvX)g+# BN;[u(8O]neݬ҆$DӖ\fK:v{vJ Tm{vӖ,FҿHHQY̞ftpڸ ; ?BAd%kϹ#;n4BM jB8^! 
3ωB%%%t87^ Gh"2Q SIe^ !8\tzZ_9#ڐΒUrPT JpDh @/u0fg%;Lr_zײ@t6`I/ݴ3ǘ;,"MLl-KE2ItQ;\୆=SG>tc/{lDw{l=-X q}E7c** ׄA!Ie_9!!ƅO9ׯT -'4p/o&Yw3 (x{Mk'~W0EOkDfsg e6`p;QeZ6]`mc%$4 Txc(+].k}Eʭ bhPmY`km֒ӴBE ;W~tN7hh_*G:SIWShjkƍ?t8HWaCqԏf3"g*dZ`Ø8G!j֍Ld<|S[ҼaPJs-1_iG0ϴ7Dx#B`ذ/Jj%m8جC'1 f6aȊߞ:2j'"` 0|'|Ϩ' \gup"F(xW)0G[3sj̸("`h=$єi )JCF#! 5ōِǁ{d{0'EY<0"f2@JDl:鼠6lίن >&!An,@%>SPhāY VA>`/uOI]B „n)E[5ibX$"ɮZ5lszI.tB^TCН{cV"SѮ3fEJJJJ6N!igzMߔA 2'}H!#pz-B_{TUkأ+}ԍV$왚Z?0>aώcQ"y,q8dO_s~ڍ'zrk:s4 Իȫ }ϦZiMT=j'  lg8(LЊСU[/lcCЬi[b5 NzCsi$ZZq?!"xEXZr"FN!ڸ n#; =<8 oc؄Āp_3LĜfˀ\h?aԔ8$g2-ܲ\a*!+ 1CP r5ʵ^ef.X;sh"2#)ȢفT4DK+2ƓR}D&=AUy-oMmyk<ܤ|Hbҏ4DA*J楳̭'c'=e 1# ;6ooag3_%%%%N!|WcA>`X^ V'bij&%YUv B>(8lX {p܈rM2" 3 )bG̅v.J!zKw#V m XTQsgh($Kc.&"x=KO;` ʐ'Ǒ+ѭ]nܱw=꺿YعPb / 1*߰p?PUat?z wyvs P$ItgI]_qE(@pKaIIIjo~)jXcf$9wcn "Y5["brÈ.º'⥘>KWл3 @0hUr([N0v4;վTv풞  ["SzA,v&.x]Y ]*W.?*tz C** q EF`>-dr oVGC"ǑSLJ?EBT3]u$-RN4;\7E@Ep FDo(,Fd$0]IՇ};dy $ BSN;b[9H!ßJlɳ]r&n}\z1/ ;B뱛ZֆZVÜ焴Z.yVHӡ9~ye6[ Jq)ƞzҒvAU9ܕ j`>3/1#ث1c/J4/:p(6Mdk5>v-WNj=4+03X`IIIgN YUtb4 B:! r `- \C([84IDT01}~?~@!I{RI/B OJL/q&>Zߟ$gw\z, HBH+W9:`<>*X IDATwBnm #*='V)i0sO"o~pL/  57]k26R (H '$+0%!<}% #CG&Kq ᑩh`}ςC!zm`d:oHgt"#\Oh?xg,qWg TYw;P, ފcF|Wfc=UxF/jIHM:iyDeorA:KkgZu|fOJ̛vĴ`+h@To?Aw|W?i/Y Ӡ֬|ڥYYsl3 ;h{1voqbgɪSʽm1!/xDg ʞ%%%'n6. 
SHˌM06RKCԿOtCƏl,VG|`CEPڦfL8RcRh4bHWG:~PsfZʜ4p4O6C"m[DRlr߳H҉lS !ۇtˉ ES3>AJJBywǖ⣙RžPS6㖾]$xx)@{ƽeђ)dy|hT =*X I|U"IHX"idLHB XQ d#ȫ-APzWJc>SC{ЄNr p'SMHKi-L$3,U1Qņ7P$}yeH0PT0^7}jBr͓5ߊ<璤NO6Oُ40)'e^QݸusT],/vQΙ;oAيqJn8n<4' A#h,8|ŸhkO ؋նx׍Rx7ErӻGGCWL=IZ Z_t1 gQWq[}1}]hN--1%VmPȫ1z^X Hk%6M3y42A]ׁ̟Wv8.FBktfj۳!=)n ?S(0>8Qx"ߑ1~O&ΚQ1v՝xSURRRx_z6bb7BįzABL*+D:m fXhSq}kϊˁճں~ CR IaM֊yt݃}q!0I r5:HTO$-Dz%b5qg1gAimT>]v>`bt;ڭLbo{{n>Ѹ'f1㋳xdžF}1>%K6,,%W̷9 / 3PK( q072kiQH(xbدÂ|:grShWބǖ#qOuff rmu«2`;"2j:ŴzGHQq@v=D;7i1`k::pM<)3a:IƐB!"RN @t xv퇷Ot/m:[H dNw7y)(& 4xښ,'run}eE#DXClqrnϣJl3!D]>I E0!~˚w]3{8sWP].<pAsXiiJ73DCxO)me(QBVeKJJJN&~wֶy oA8_ȧ#::U7M'xi4h݂5UW]Gq˻i2vd1Y۟C{4"J4%'%#u{i*9 31C`BڣIQX!LZgm7D}A^(ETZRIrmLS9+:Xt+>f"Ж}l:Ɨt maJ1Ѧc-og^ sզ É{ꂚ𘤨$CЭXЭXSV+))9Rr=?ѬCà_PRMUIlp܍fg84«>Ȋٲ`MQH p4&ٖE0p(!PF^Pj"F3 Ƀ1pWF("_dxj;g߰\Cѩs&d+>cӠ6w@m'y-< /uEbMb"XSZ8bO[gz~JȗJV2ӊaxВ'O?^dqj љc;WLS d2uÁ|usNdnwzw=DĜ_\wۖn.az[V| t Qz |Du=vmkfJehs0k/\3Eg"q4n!9F=9WEP=;<⣭{⦏}+MtI~(-DHn>Q?^dqJ N`fgp f/q*]{^횳yr0s0TI(ɴR[38wb,c}͟ʺhWn?fg7paiཱི73%F@,A;ZsK㶿 ?I SZ<yuuƋ嚳-ל5gD!s{_ֶs) !-v}QѶ) ʺYQ-}֟"=VhaII)ᔗ&AoDST1a:Ή_Z?g<_S$y-/5[NP\>B wSZwn^Ƈe?ܫLC W 7>4z˻|ه^wIu?Q̋){묧t2IQ<-xX7g]~u]&'3dsN#h۞ xJ~^%%%%ϖSnxךl㍏fBIa)ʭ '6w0HQ9m9IL ֫06P| _4 ?nqgo$FצN úOQh!;o%zk2=Z"}} j^&6~}@j7ʝBk(z=L$Qx~ǹ}9+>=$وi"z{vqCĸOvk=v f 2^+gm`tK(]g?JJJJN?IG~8* W!~X*4ໜ{ !Ob[1Qh8&b8Nt;˾F>Myg4|^ RܭaJYT#t*|Ć뿱( ?V#WV$tW̶'f9h },A! =K7U̡3%۪ կ=TR-IfBV}vv^{HWHɯw׳糘],C≈PC s:-P+pjzp1CNT\ f>O h V⽠GpeU]؃Cnni+3pX7OC@ Sdqs}hxで`1.!_rruw^alF@?8lӰh$' k7t,C-$?e%*#]82ƃ!Vwu1(R^b[LVV[D@/?z}-k<@Nr eysKt$Д .J[1o7ݯE}Mϐ;!½!+K1DY4_wSc}#0)cvfkIҐ$IB:[)4W"30)+Lt p4DAe}}Ri݋lSp .v͜r=0>h8`{Pش'\WefϋYB~%9Asѕ#|}[a #\Qe0gV703f a?_Ư'ƃS&G)*w\svwٍyOj4y~j2Ï]u(EdZvlyCZ>zYϳS_$IYGxgS WUJapG>u+ouc9G\n㙶0,f3\p`c^zzY,8v4sɭ[J`M,Qc{:q=]0#︵oWa(Kݳ#k7hh!M9 >V"' k[nq~vO**Fuom|wzM} :GF! 
LTB m5~\ۚټGbXe!Yݪ$}N8 G S'IWJ5BUE=rRZ>"b\g>1ǾR$>,q%KVܲbM_-|o[NH./˚Z{ "}٨qS7s9ph)7ѠpL?c}P 3Pˣ;@KUg.7GPU5LivǸ}g5Ԕ8h tDPOhrh'?BRDX iv%I$ *(iصwRv^Go5 4C;+NY _ d(^ t^)O?ןݞ2FZ (0WKg1 U΀Lb-EV*7'@g.'F#r%-;țYV(^ zg@03ǎW'G@ j=vt;b {^q{'k%kMTEt3:,bf$e*UX7a' py>#=OBkHk $yLp\l4`r@FR8s yn pX$T6=Pَ?υbQ<J:|ߵ噲OXyzK>/оwZ0\(:ML..U n h8CAMj~_zme\hc;Tb0tϪs$IrL]3` lZwa㻌%f 8|ѡ?Zıy6_veAt$YC@LGYWޑ7Tjga(Nd@\бFeĤs/܏ F9X%g<Ģ&`LfƏ !bG`0{Uv߹ja I4.N5jղeP2j8D:vsy4OC#t\plFsBw]pD2rH@1'I&\ \wVa6oA9KG=p;zlykN9T M9TkpdpX🈡\'w0}wb #]E}; Dg6`F\f o3էv~bzᶡu* K0 {!w!y62Q@"Yk!I5^A xK_n`Ц?v H#F0# % gR=ґU %_hSS9o([a቟_t KJ1IDATxg$E $PP$ EAE1T P0aBAE̊ $J\ⲻv鮮S\W_{f&]]uXܦPLallvt IKK.if۲?f}3/L4&i$+~IwK=>WN aHz^HL&龜&I˂r{edAzzv}4eY0EM2IkIZ\1烴ht6B{]*-&zq[^ ܶ/h EoI_t5Q?!_ۮִ{cmB= w'Czy5˾S yHz[se%IHZbݯ'Ib{$,isI[HzgPҙm>ٙY~6UKivϙ/5Y֣2U33&[ Z7$|g.J:MϚ>!Yt5]Җ~T kU)tMI$֫q[VXҎCm 7MwJOvP7!~w('ΖN+ eg;QD͏wG;g % zf;L>K툟p[^َ:]uV͞=YRv{rm -h0%o"m[Y[ Ps=Iv ?P;BXVHmv & snl q/1ɦ5L[ʼvTɛKGпFԖ?xn{Ax=IJG;kuM f|~Rat\c $7&v .!юGnvA(W%~ IZ@x=&<H"*Yz%i#KlK&KGmdoy!AR_zC֙xsNlF&,}>Հs4M<rWIs6ewoٿގ5Fgԋ O4 sz=l'JyztJwz_YҾп~wfZɲ&zͼHZx4 N޵X.s̐%%ZYҊO'zz2}/}fE] >=]]g;EvK 'A<r8x,i_d)']$)[acIO*y|I?&%I:B׽޳d@&F{^^,U&yLrsY:t5 g/yqK7plyg@&e!;@zB3XIvL41ߗMs'i!7O:YdY/Ī1dfަl(rYO[(kd3EWyS9:_&\YO][t2SeQeø[O(2uzWCS6|ڣ-!I){dh| ߟ$۹sy#Cd;]<;ʲn&igI}\9 ޲E\ qѦUvXy, &9zryS^!A_sՅzlu-eDo.l)=e۲nWȾGJz<2Zrtɲ]9;O@~R6s![WT?'V}S.s{DN eMtJA9^V—v6}OҦ5=Nn첎jko L-ۮn z\=E2 ->6udG!E~u?]ȩN;ri0/ 93"o)eiCXU9%~amC*0 jA7ņrj}ɣΞ5 .kS@Uzȹ̀0ۿ5a[Tx+>?(uaNݷ˺]C;7UGҫ\h;Im6*%l:/_m_9 36YuPG=Ud~OHZ%L{ٺZmSoO 1sr_U徝Sϴ)_HՕ̀>C?ZlWZ2c};fI7X%dzh_xvw}zާe:`}وrY2w '9 |p7`YŒ>|5wmY۫jMSAm8QK5?];o۟ϑx Im-7en[A@7M7$eGeF{du}z4`l'[.-KWe"ViG+)^@+gIo}FPGo-ʠǦտڿA>ƲMHr (z!U bO6 >?~ֻQ33`[4뿉yk;lj=%]?uCMGtM;F#W{Tyw`M<ɖ-ɋ0B}Xaܠ򳼩jvі1|hz"*lVq>~u5uV8/R~imP&`6~P7|j6z+$}醔81)|&āCfckG$}mmIsMR.p?*6jo/H֓e"jy;Y_nK'~]=g[el]N~:[z\-$u,t'=GSlLg}Ox=KO a!tɡ$1._.[~j'yK6189ԘxVjH򲳘2M:{T뤾Ri9Ӓ@jry\>[᳾]nXZֳճ Y/94v}\u1IQՒ$g_VL+\OLoUS iY ٲ`>Kֳ>hQvpK$]-Yekw!a(tݥ6wT.G j7sW1RИ*_d~,KmlTQ&8](L~oI{l:rO*r&phArO{)rw(?Gu/߼Ϣ 
S4{~-j#vܿsI,?v٬d>)[/@|/I꣬we鑴]ɠaDų'[@iأj輮p|4p.ˮR\=";Zko^p,ٚmrlklsV* p޾g*grI~'U! rx'T}1&;0.Xx9.=k35Xw2)~e#|u)>p,h^r7ˎ|$SNrv,MUTe]GHѵ7.߼c@[˂dF|YqI^3VYU)'CdDܿ2$л}л)3X7e*s+KS Xb;{^9.1?3}ic&ily }]a(̆Hؼ^7n/iнSm)ZWyTzPT%4u3dg36=fIQvHβD+E(gTBEe,T3gk;&畴]/weK)`=Y?pܥ߷Q8ewWG]BWR:SeU% o& U>No+ys͑ڰH<yxlu"4Pg9r0yfɂ8qq=w5I r/FR%On;ST>ce%=:<*AJ>u-2jaU͎;qٵ}F\ćYӘR ϓI#Wɖ:+>X:\g^_vV[:LDSu>&",eKZ;dOdz dfQٹ4Gku YW˨Gi,,e4+*|h69S T~v=5˯˕4Z|=*~W2]W/Q%-ZEMɹ"M*l؉4l@B6pˀmIikM~l*V ϴA_>&fAI3U>`)?}wYvSTg%S˵d] آ,~=LVs:.A +5CSϔ]68}; ,d+GT> 'ozV笺ߩ=L7wS1r5u:Q[,2{ dG]o \W}*^swTek1$G6τ]!! ) %)P=~ZҞ,!d1^Udg|!h?ӣy*BCAT~U7' /_4XR7{ .p=E)?T@?_y*?=Eٲ.[YI=>IfFܦ&;e_\فѣΞ7[+ B;;{)aZ/Ty}95X6lhGȎRz{ٓ:uSر .ʩ?dɋ;Ჟd 9G='PG!Re8/ܟYIip'e2&3cܱfl;E۰l̲3{#5 +kujf:oU33R\BqYQO_2u ge"c~]'9+M3tv S55[+*5ʮç= jˡ,wsUQAIv-R3fK(_IzjຶP`*ܠrKձ4e޲q$y陁>mع䔵m&{Kǐg)@U4x?٦ٲuE;dul&[7 $ ;J 3d ! p!L-`N=>N [(s]7,O=}k6P l͎\_ſ5Y O+} 4u(@ftLV{Q\՝ӆIHm֑O,*3tᴥdq%6y,EUlwɂjw$q˨ *JS^dU.GS37DSeGټU|w_Α,ohVޫrK6Ц}_N*\ 4Eո,ݭ芩f5*yz(q߻[^YĊ[ކ"zV}O/ikz_ކ09&^̐oR,K%V6%l9]d2_ٌeTi߁Tmd| j>Zu7Vjk0Zme *Õ6r -ĥ eu_/[޲ɺ *cs);0Daͩ+!ڲԳ[Iz/ɟ&ޯ]e!$;^~ lU:kU?5P!3Z U%;K6HseŖ ^E6l:Β9saN%X+o%{g&swlŭ7ϾOty;v Y%?D5e}nlYź-9^_{˂eeI.*nj s5H*@6#I>[3e%C""MLԝ9Β:H!A/Ty+$Z/ec(BSPҗ$=&\GO$u~'K=O,vOͦ;N{A3e{S%Q 5=+N{,kfˎ"n|Qttـudݗ+Ȃtَ1Y7Uog֕(-ޝ%;Xy\t ^*GbB/u/m+;X9wɺ/Rh"-';Z]M6dcl0ƽ`~8`hMt0p~0"t"@@ t"@@ t"@@ t"@@ UW[ eYٲs+VIV/[nYX(lKzla"$K=rKYGo~lߜb  vBC֖&[ Y^5l[IJ_*1;#I7o-$#iVZm$)tپU[gu_/vxXL %Se*b̓,)U*c@^( "}Í嗿,[>1=Ii%}XHk_IWD=e-+,FwCII@"[yF-i I[H:@ҩ~)rIiByz/@ ު_Izq+邜![zI=s~Eҁ3鎲֒\'闙r~,[VGrPj/R}sM`ۣN}ɺВ2oRcҐuhϛ5qrsynOջx?xcecd=I˖.Z걇ը{T@]7gKzrK6͒nMyK/<82E6 *ܾYtN?_' Kْ/ iOu&^__#ubY>a&Sp[ eh<_^zX[RLgU] %Z!K>FsFl[Rީ6m0TMܯ|zȲ)'9ϒktOK{7̴'*zя#@[GҟSv15fƽd-VvR݆vֲE{d׻GiTȂ{0!~rG5H3j7񵜶mRT&~Ȯ+{ߓtafv&K̳^j͚8픖5 eѣk2E5 @+dae>̟kfg,/w7{O6KMvZ횴d `\_4~}[,@g(3޾Sq״q}FMV]er+z {*%莢18[(s͜u%ql˫F{}sge{*ڰS>F+g6dy7/=UsZk]Oib?j&agȒ007|RY5x'r8#@B;]/WD/MIT1&{2%q[˂T,%;UalTsxul*#ޒ^-i/EMySdwϕU,>#PSi6:(솂ϒ}vUߚCs6l%iے\"K%~[6vr,!ٿt{sM }n9]- ʏ|||źC#UF\w51{We;Cd D';[H]v)[іg\eWW,c9eX>m5 4QehyKۥŏHPX}<~_S.7 k|[5dGɎ(ﻳ싳l 
#@O\,^.yg%jxYϮPw(.tm$l99|ɨaF^o(ȴ •3-OufJJNLkߕ@AyݣNۜ-X[ߗΙ ׵6@Tғ% s5MQ7~EىS63rUT{.~A[oXjz,V۾rKm9ӎahkʖz K ֭'y}ljz{<M6v|.ZlA[/nPKPr^ϤJ@K:َ 捘M[Qd+G=k=k%姟ft,/taNfkbʐ}L!ݩCQy2hLފkclp_; PyN[ w)=(Nok2߀0Bօ]w{fҫe* FuߛkGܖRu<*~BճS&a%dgdwf.fiZ$}VAY{oP*Qqi},}`A^| ne-_@՛fiVQ JoJeGT<|Г9MtIQ`u镺ʶeB~]bǵҟmE'Z dGobmtReВ)D2rV6g;UŲ,uQC$$R.FW55Jr-f.O]r/ZCIIf.NW>Kn;u'UvlUޓ'k *N=lsrlk]rR^wjiS>Cl>]W4 نM -=wrrO7jrllh4ʄ 3izYʐv%w,$K~tdfyu#r駻"o]dK-uO6ch>]{uF |V<"KP`!'~leS3LჁkQ~#/Jsֱv@}Xd`QPV7b"_cbvy6uK4 Sd<:k:OJz^vvJRo~ޫAđϠU>I{.IZ?dߧ,|ܒ˦9Vq,|܆˟';k[rZ:q_ܷ݃aUո3jʮMg-P|!EN&g]W=V$sZXy% UQ^@) 3%sIc4 o߫]:YkES#%dtE-=*)>"iML#I{lϗ["%L2*lҥYXe_/*_kFeK:nBӃmϔteђ?kֹ3w_Bm+YL3m- 2_]A {E z3_"Qs;sT| lL6KF7 mj*>U%7ԎPzʬ-mZ]g; 3uiHg=wL䟆rà 'sM;ٲuz7^Nv>ܪ -*nk/U%,mn_&wDFӀn,[eXI&(̇ ^'vc0v߆>Hv{H͒et+@L;/KT~Xݚ& tl;Gq QM\C_^ -YVN92彴ֹm:-~M_w-۟ Ӊ%$6һV:5q3sJR+jл[ #8m\@[(_ TGW@.p]M2U^.SNA~8Wjb]9;dkjИ4Ȉ|_pGBA6p|my\7WOW=#dۧ^Sk9M=I煨`.َK䲒U\f"/-鞂ErLܦsGf7 Yf:E MdgH=5?w mM/\[Օr_-n߾R2Dy^S\x@6K:m *n[Aq;9h:Lٵ켺o\mr/{`MSKŖI;435T:U^4uMTi4Լd=j٭jnchTJk3mUd=Q5֫ٱbYzPf}.YJIJ/RYw^^Agw)5me2=f [:ƙLOz|{9_Ëk]俩:FܪϼCZCgsy[cl:[y]K[Eud)sh6Gîrl伤]#/EVP֣ @uY[%T 75@I%f9C M쭉mrLjÙ:Fİ>pIoo X6LgɾsEݢԯ]d˱nS2=-0^p;t/eLIe%{pVA,ûAv&Kndk:XZvߝiSkMu(%Hs-w2mUsg~eL#*s߃cuN \GT~-= P_arb6 {`^,d^ORmdǓzH= sn 1T[ n2fv%%\V6RA{-SM/S+gR2/UMHRY5G^R攛l!ϞWd;:`e\Kۏ5\u5Ǽɳ Te9W-`ՓuZM-{=g4܎7g/sѩPE׻P]Ϫ\PNvzȀ.J=$!UϥOcBuOONJkAQQnW(kN9ev#l9mۻf`BuHv[t@uڳE7dcXn'=Ju]vxNh'[g'濩.5ХM?r\A]trs7 GU9v2}m*14ܨYUx,sJNyyMľvm7[ g-YOȎ&eV";N6ki<_f®Sn@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@D@DZoIENDB`pydantic-2.10.6/docs/logos/jupyter_logo.png000066400000000000000000001725221474456633400207470ustar00rootroot00000000000000PNG  IHDRߊ pHYs  iTXtXML:com.adobe.xmp *IDATxw$Gy'lA'H$ E6`lmpgml`cc&GB( t:qoÄziuݝݝݫϠۙt׷ꭷWT@ @ =A@ XA@` =%@@ Bp4^v8zg*;wb޽z)YDP{@DOD(yO"Q"V^}GtHI{1yk,JȾ}X[Y'5BI +|Fns[» S7P<CO\/E 5p+U$:*d!k}&~"['Ӹq*8Fio <-ɐ^^[F?Lv`EEi>M~C1H^W,ulm^׏3 Zكɏ>Kdsu\?Rh"V)WA|ȭ޺`m&,`]-@  
y)Gp*Y#:09Dt.` R o8A@` #"iJ200xbP|aV}pȴF灟vK`Vg͚5e˖XUMRig k1.WiLN %Hg,[+V|nʕ4MSeYP> x林 aK^-Ƙh|||D>x?b[ˎy9q @` rMѤ yAU ɏqA?2'3@DEИA@`I T_SޅߨG %A@`G"|l85nY@ @I%FMV!U~GUlUedK/Iz{;wd:A@J?BYRT,^n#r@'][8~Xk7#*XdS+6a}|d;%‚YhfabbwJ,iw @ m}Dž-:,^ݻ mtQUD/eY6-cWw_y}]k"ªU_[L $R$>uZ0s1$E DU]DپD=&a(yEt%->lY1BeYߵ;Dfv,V$(}yn_* 3JwT:w>D=^<1%Wb˔px=K5tQu˪H@ 8RJ~)p@䆻}ԧ @ pX^^/!+YͿ]Dc, ]hC#ݕ)(U{"|C }->'^ @ pdKo8EW|$W`A`Pv%_׳ ";kcڈJ@ T"3˜Ss -0)a|Ƅ@ AcXv >==| n}uk A@-Ol䗜왏`?rOЈB! _gF@ hOAQ)!1>U٢IUED[ku櫟k@(z}\~ф^lm"<~N%W}@2q~/+18%rtF"sWY.ikV3#z LjUuggx/e1saL DKFqWq9Du"NKUEd@ϐ @ 0 qv >l1$Us}gfll"1 m$z ?+=Z`Lt|>}``j5=hМ?LV b> ]kD#w;ީB'>q4J}g4qe@:E垽$W|eTY <#y*x{L}Zr,K/Kc ;hиu{1%k3'z ȤhX@d YRX ʰz+hIUJev"Ȑ+U U+E'wۭ#.<;}n'6#x6ґAJM?z^}gPr6Z-Q1MSSen$z 0 à{/LkEd Țspk=tm6vT:x>qU<1R;5ɳNہf]z#-:_#F/3e/~g'A5^D{y޿ttPC=2A:O8$N/gl=MA ZC0(Qz՞$|S;\zC@(NʈW{#~Sf`3p _IdX2*u3BdqgY < *Յc)==_3@yiQ+=)n}P3 @Q9ylr WR߈XK%'bBx\Eqm38ePq# F`}'l^Xt1ޅb=_CT(/OiSca` (nU˝k4FAU.uM|~)85#73F4ZHtr,b9bCr0TOi{ F;_4 , T{M\Pg.q~jz,+HcnN'e4>Ehx}u@/+ܲ? ̅ piX^f!=pL ,@t4m!'`~+fx ox o[ m07d= z8~5? 
AҽB (0Q\IWFYk W%C}1@@| "rcį{tAT*v͚typ5m\D䷢(𯝶70(>N8'8~ͲNt:f=׭)rf?|s BHs{x%+dxw0'q@D&I4EQEU'80I mΒts&IƞtUD Tow@m52գc"@?q)z}[2ݎJ<~|xi|ct#1 jLRnLFl d:1'&\V3ȲZRzKQ7V8=1"fvt/9(T VeB/( [{ȵ+hτ ^?gvω zmqr޽#Xc@~І,U+9|WJSU}{_S^mzo~ p"4:b)yO}(frk(ٵ X^-6uE"#14_CCCWW]q7߿md$IBE{T|.]h2]9_k3f]n=/jE 2gd$" ]1T+ķr?VE_)ڕpk+(hof@cD')ɿ_uI}5Tkc^u-VuveEV%iUB@٫UT֚Gyѓ$a9risnAYtGSՖ>]Ts?󟌏?tNPR[t']酹*p,XLރ"ym7zHA>rł&J\G ŧWj'+Ӯ}#"vE<2M 5'b}7p38xjeA9VjA񾪊1B |muc عY>w!}{^ɽ[:nsE Dyn H}F3x0Y*ƭJPRCQ/E$*dpiZfM?x-orW :0XJ8U}pm SU}k- t'I2},c|bX{I8I^cD>R(̯Vɽ*Aٹ᷍@ֶ<~@hT<ҫO[53G4"cWm~fo?-޲>h~آy~[oXl۰gȅsga@ʗ5[Hg1OW K^M!YUzN\~EOs9 78iPx$ Mtឪqpy'AWfѹFgD{z8w&޹EŵY`5FQ?W ޻T9I<t0 Ȁ 0zU%+63-tGTln79sc ֮rZ9u;}/X^gw'gy2#yOV3ݶ$EIk_wD/ 3YZ|o?o,~-T3F:hVpYmHl5IHUsc$ 9]n8u%}׍EkөU/U4ܹze*!9:MlORt7\~a]:_syVu^/JyyavvVjj0sl4F\(9z^APTq=Wt")}vz;sE'8'ݙNcUDJ;Yf4ʳloa@~:ge4-/)oJVO`MTG'>(t̢Ӈ:lJlj}"s&蠤Dwaת.È{*NjE#-ث;}A򄗭i ^;a),NZefrڪ+J`f|`؂"Y7^+i9,; ,08?I;]\ntAI}ἕ;]>-V`˃tf0:.sR:}c JP19^t}f-id v!4(w&ֺ͌v|D"xDEWF]js%@yS~roļCRdR<^c|p5lf.ZgHq h$:.0)bW<*H\iY$ 7bdGK/Zx_:իKsEJa#,.>CJ/z:Q\P4oRHFAGٻ׫/<{7\x.3?J %Y+щLT-V5koKدQĆbEKQ,m~SܲRƹgx;C>}c}Sc}ZH `=ܝ+X(?q4?Q2F?}qK.'|5(2瞻.8J慶}*a/a+e..$ØOѹVG6V1f~[(2xR*GyW#' ڨypfctڇ֕睸#?::=o^SЯFvmn? a;LB<'kS|{}Z@`Q!XA~GT $3pu7/^¹|@~:4\_t3ݠ8aP}Qf7ҝs5k\լ>XQϫ3Fjci?0C"0W/z+gKgxLzb89m'9rOz}qՂY\rϾ,NWb8wz5)u>D,~x{wC@`AQ#Sʏ 㿋+U/w#98 Q >̰/2Zk3w!{)/-v4<<1&*{9K6zg^eB#v%s9Kw&"8c[~B5݋crh|^yC#idqQPENx}%n04IO!0t jz{ejemNjMm Zy3t#0-;"oXk\D5kD0#Pu+^ּN48EF_[x奶GgE^Tf,] $~?jW懮r l@ fKy#$Z#k9]NUg}ȼ)ϺY@2Eotˢw^@ExM:MJ!d\҄_04KunL.YK[psŅo@_ 尔li},o1K7 9 |@pu< \{$!$p}sBWM/錕@`^{0Y(>ϴ|k_Ձ9CO3ć^w 8<),>Ru mOAG*qٳ3P%FRxIÜy 8 _mƅ6)9F8fgS;}\0{ bX{8`i}h>t4rs:qFWHP0#p/TYJږ hTNy 8 t59P*@-PPoȴ8zcԣ>:sn booA*@#1gaҫ 8L;!C`H ?ܺfN9dr:׿7yVgļ"݉ozF L G} Qq(-b}=\M %CPiUʼnbIB R!&":^i,gBUhʴ (B#\DWOYzA1+Eu+P!`XI}QE$9N>tBAJXdWo/o7y 7(D\  }U`+K;W?Ȟ0JbNB["0 &?5x$ &AgIKbw^޺Y]X,wlX.cSb~8F nTnM.B5fP=AsQ#В撽,if-$M F"']p#z Z(!SE 0L`ҥAa=e*=3M(*ySUv! 
WUR 2ǍLr _⨂1 āAENYF¹ mzW#+ﴹ#qJQc]~ h 3#E*,Wqf:p=3;`s DMg5d4yѢ=_bwWʵl\j 4i.j7&g3i~>7FahȚ6,Úޒ{/BcUc>MZ1n[Ybg@fvPi> ZrX%VP+HKT?PϏh u,^L։dl3׳2=RsØ; B[8ɁG PWaK0'+S }gUI`~ pwFqS-BtՂ慶3e|͟?qM᱐H_#JJ :, w̙b*BF ?Y9n id\ּ̰݂n@4QRgՑ*&> U41V2|2In__mL8܉}?(z_icqzՂN]hfG%GD׺yU}7>ZFlȣt7Jk]"@^"Tzo]ݕ mұBW Ğх6aVhŔ 7.hAjn/Ḋ6*9@`ѠQ4vy?]d}T|X※Z>eM0]ZGS{>PXd]]6W*DZ);X02`BjQGG\Ǽo1\IW bƭ/%}J犹MSl8A{]r|!x h؁ H6,UK'ci^[/Zhf:zk })sEh;+nT\MsAKQj\ XҨPﭚŸk 6 ]-7'fO < hbǻw;WW7~yNA@ 0R!}"!a9U@ٙ+Z+w]h}Θsi'UZuMr>QG{l#.tc {V?QLII7_CYNB&]-i4&L ClGDDAi&Un{ԀF sM4-*5 Mf5P)<ܔKt$ mBuwz9smu$_ͨ*}@p)P#7M}LZHgx;ngʻڀjA9/9DqgrsR T{ LU_Q̗7 ,&ƞ]rs/JE͢I{},xYDN9*`@n7q%}n`}䋧z F5goƞiݭ(l\O#^BՂB05Njˎ̊U>4&Elт n@#hH~ωƿN:=]On +jA׽_F#A|FݜܔȎH>,F< ~GE{T>Y1h8bCetM, GGOJj o:HP=1* uOy ۹x8Y7Ayh:@W 3V- GDE$#Zntn{]oS@+.ΉGqu'xukw1]-sGڄ#܊7_+|]jϾr!Ek $[b_t_br꫎_ T;{] ,mՂ.ymM8,j(ˆ͋]v+7MٙaT@c~9=稜#6B2tuHO=;Zi mrl%"5ɉ"hiV&>.f@`PA DCV. gO}ڝӊŅ6MZKv, FA||FKukUMTwkѢb:D$:ШԊm@f("s'_r jw m½Q@?EfP$y}1b⢚ﳦBko@Dx_Kvx-GW zzҩ m}PL3c1+A՛@ XD bd?ڏ dd4B_%>'AkDjAWcڄ{P}1pb'eBH  $I*j/? \sKw,g<3H^#Pw%"A,MrxEصX0^ƿKǨ{OuK}iN$@S]-辋<.mERM s@GQbphy*&9+H.@frH0=cA9bAh/_<9T-5J@H~*{ _Rt;Tݔś:5@FOW z=8/ /0N.uVrQd=$iˣ4rKkࠞ'`M(@9f9W=ryB_8ю7di,ηZg]vM| gSn53Hh`JXw}&%Q׹}Zл8yW;ۉ3+,=6:GCS3P"$ IjC8pXJE8:oSBJ ܓ)xQ7XQ~63zjc f'Sq+ci"JRH{~y`brN(TlGPr,?Ħ \\>gS}Iq ̃1%vߨYɻ36o.rE3Ms6n?#KC}qP69p9C1Q7 xcDƳ0[NBX#u u_fv#K:t2>} g16m Ja/0P 3 wSWG5|nܵoU(F_/PC5Ǘj x8}6G[.ǥ2q\~;u)ųf㥋9'? 
+XwokV}:gQH}Q$w.wonD/u^,a]-y(bPxBc@*XGbcC>̾jӃV&xPr/Zz*Lx)/XUF&|yJ~v`KTaM!b൛ka?JFp=ȱcLV1*%prgbMPamTחYjAO Յ9hfgvmˌ\I0{"QX8r"Cl\,bs&deeV c2rxmz(x-"Ϫ"+7s|9T͈*OE-ٕFD()\q}l(gVܙH28wC*5S\c]K:Aq D+ce4uDb;:4:#CqDgzG%w2Xq 1'J|iZo v c XR/YGV{`9?y?'8muJH md~py#UI;& -q5;쎄/[Özqʉ*n#sBB[>g=YɅye"Ke"'uRWr0[XnYlF/z>i+qut:9s)qQ #R4D;wq!{>+cgpmrIW zYRcg"mΌ݈Hoң5(֡Zɑbzf~Kc8 pgĿl }}9CIw̻ G9e 6_ݱJ n;-aX-g-YwXg9Ɠ$Uo-_۷_U`u\cY?Ǣ [1.%>]|7.;g.õm:W41(ޟyM=؞G)^ҏLW zTZ:b?*C~ zA%9h2rOsX =B ;}-Yn2a~ FS0 F9Z#Gb=;BZ]>eDrV2JvԵ>]Z~bs=J8vr|88w{pUeD<í?=ZD_ o-?r9W^h' Cɛ}`t.yXZ+{F6/4¡" :QȌGLj?Ur2Yn؝'ʒ𡌨R+RuG*DS,wn[R) :KsxR~'ge2xψ!g.I؞Ģ$Y&ATYR)iT.n0I<, 1$Uu-*eWsLZ>{=YԂOE(5JԸZ5,pU7rDq/jr0wV }YD()Yϲ8ï-}efOqT,[%2Q\g^}wZAxkb]fܺ|AȧGz]E|Vœ8>+n2 yWiA fg '+U_3ݖkIX31sNSx%O!<4s!U< k(F=3xn,v, Ic>}9s'j/lUGۋrX4ĵ7o,T׈ZiJjHZ8(ϚBqq8R.X9n`&Y4`vձEA6#"CbNr|-O]Oƾsߦ9Niuc㾗F񊱖F-~X%P"(c_PqPrlMN.7jj0'|G|p_~Wݻ,밶lIġȚX6$9(' G59y˕c KFo3N(8 VnP2Uo7W Sʶ{ "&#WZxktM=tO]7'coS$<U*4#)@r)Xn|pf4hgƿr XC,4,su?>:P :P%܌}ע8D2r6׊\_Y=A#{xnNJoѪHF7͇]-j;{2'/-[^8?UdOީ1] 6 +'r߾y ˢ:}Q߁@YW1jF$y*e(Yn<5wMnk9c<{x?CP +~ t'$uW{5 h()@ m4WiOb<6ux3e}V!>y @X4ZJQX~LU*1qcP,|a{k go3QX_T )Dkyn0CMy<LV%QFR+ցyy ե8uJd.Ʌm! Xr1zƵ#|wrn/Zb# cHi2l-EE)%vE%n F9k/p#b3h{6Uv)ﮟYaǩ. 
_Ok$6~$[h&@A$NΎ5|u7Uz؝,s)5J/9j*Z)qX^"$)l%~# H΀ .]ߒ8T%K_QdY< ĂBO!w֏gVuN.WFwcD9d, @ 1ʉ*ӈ&vpN]yDNqqzub8Ŏ4Jè =Ey v` $5, BmUx renqQ[ԭa4 }ʎLu=DŽ=d,#l.՜o˷lN' "xCʪB<X,s=LD΋ι(زe+45j<;k81$YrS{CNk7atY'_xF"_NFkMvpnSs yƽS#oFM/_ '2",sAz.H9ι^9(J$(]UR*%9֒ ^zJw܏(52A)11)-žkӃ۷C ]A7,D'oSZC\ߞZ4LK[22K e)=!,s1-@{i N[ xDM;";@`tw@ pLAs$ M@ еUyۚ @QՃn n"z jZtkBB?8 @`QЊA @`-.Йt#aZ I@7F@ AN@ A@` <G!y`-ԁ#bCJt[Vя&iHU5wa>IRرNB=a|p0 eTUG1`އcr.TU}@Eg U]G-Ȁ}kcA_ < Ȧ(Vڸm۔ , Ny#oI?.gZ |jlZ9@mཿ" v^fG`^g!mc̵3ޓ)OQ5,mcyoچv< X;+ȝ)+и9}31Mgɻ꽿l|o 賧C.jwcU}SzhJ7Ūi7fy>?|$f1؊-!-a0JUg=&|5ιq|T{$I~||tCIӔ+W{n* Ba}s^;o DKfz?c,fI4o> W:ܛȱczFZ1樯fVmiё^#͝fSK|x1m~qhh `4uΡylU{#ݎ1ics<)r)dYvL^[6GBj?ߡux]ۛv/M;,#24% AqChyNOOIL4011|``-R0w뙘{OP cqv--|1iDY~8KUHyN;[n=b67糿^vUuh[3AD^Rwu}>۸q#֭ZQhWMijͳq8|V%"O艚LV /뮻b8sV|Cɧ"VU$ᦛnX,q'|ڵZVt 99wV1֦A7Z]`F)=== _bxNOB$Ilm98:訣sh\+V|XD^iG51s9 xR~z;!ui#6X,w^vqm۶mmZr0.㶶U)[ȱX(R5=""0kc;҈6T*wqݻ눣U%˲GQg펚K^t 7ƐeٌGI<:ڙƘ{{{1ܒa9H޶4Tpasvv Ids<jyN72_U/3ƜQ*veY6mRN8z:'-A8>cǎsYz{۶lˤ OW K~;wykqYZ?g˭_.?r{{ƿ}3v NE]WD.AZk?sq q#s8^ET5Y=Չ8oh}B@e m&"_Rء!rWuxn#zi˷q]ƫ8~4ˁoEFjZyN#9E"LwquRܫJP`bbkӛ>f&l_ "M6nC)]“֒ٺus jEj;w6BTbhh'QM$IJKM[AxlO=z}{$Iph3Ϲ|,˾<*4GGI^(eن4MjӅ1}(zQ;\W]ua8S(JGtNz^O>p nUwy;J㰬'IVaڊ(J<^$ j%7|qS9^V(>qgyHy3O*0sm>%Pi,3}ZK֬Y8cttP[#fUsU}taySh{-;k={v^ނ6)3پ) _xO;|wΙzET*mv0H $:P4MܴF"SeY|XkT*j5]wu0s]vv__ߏKc bOvÆ LISik#Mӽιꖞ%+=Н̸D3xꫯfǎdyNoo/˖-;Buνg޽yitVgYvq<9Z|3:~Rik4if#q3::ʭTJ?Z>-oj?y}yrjVիټy3崙^qooi޿Vr .wʕO~;ܬ u+K[錒g2k ֧1_شiƴ#y2::zuy6K26צsTAoqͯd95lJsӭv7"J+?f*Z˗zvNg#,{ƍຖi֭[ v˽=~m&"GJa:>/QgϞwdY#}:|_Ѿ4u&ebO:kZSGz+YAf<4ģW;;m) xt409B9M{5͆tzb,x1f=#msj[et]wSWS zkd6ƯGU,"l7(\mvE;Vphާk+:?@)uڹfvq|s}]wzv+AȴfC,߭Nf&ֈp# c۝sa9c`y~\9Uy)oMO){o~m`͚5tI Vu%z[baÆFE޽9cccGfX$ܿ?p{OR9ŘnGJƘ]WNqྥ&]OL+߫sܚҘI1MqFQ5-ts@Ik9nc(wsߍhoƘgEQskn; ݜs[ntk- vo1yafpM$,2|'fN۷o?_0[jMLfO%)"=oպnm_DdŔܷStnZ^N~K(eYz}筀ݵF;y-""o׶tj%U=쐳hGz?'5xES zˆ6F}+T3NB,~oyZ>Sݓ_(CCCh ae}dρ4ճ쬞kѶkEOq?i#nމNP=ndsafF 4M&7J_[kns^y\z$[[ Αš{ S'kDy[ASSQןL(MDLJ-2վ 7>9]1"R8ҊZw{EuX|y+kNGkޜzN9vm5oBZA惸X,hcVU pK|Z=Nxl=}+-&&&dKUw DQv4MY|oYG ؊㘉 
ֲܶlٲjg>4cFFFVM{>k׮nw(M/Dzç74;N9{-;Dg;L>ۉn;v{&&&:ef˾[m!j85JTnZVrkcZiVU1cN;4hʏ+ƩGs {yP^p۴?Q:?MgOLLvvbb 6\q)=ּq7ɲl0h9jb3222>::j=lv.y>eO7[o]WV(IpNisvH^)-"B$x㍷O^;cp SVg~h9FeѶk}߶mAlsB޽{_y :4իW :z\m%4Y<9yy $Sm䜣\.?NUXHE1eGu ,*Nlivnj{#A֒}]w{(R1n8oZzmX)">fҐ}s٩ZЋ;?* ^M^{8Ye}TS)S=LLLLtqj8x}OigD=im-eY8Og..IIZפs_UtFFF^qV8>=vӼiǫumM[:]ED́9]L997ީi$=8o{6soEqsεRUMDln . ;\U>FU99Zc\bm;k)Tsnڧ31s[ n.ppwqǬG "BmƍmwU k-io߾i׼v 0A8;2bYoJm۶T%AU%\_ڨZ6::rkQ8<OF3GO:iwWКgo?v1G(:4nJYAN95":NC)='q6h/4s~m&}yQe̤GOD^t6qUQrgݿD"n8= | V G3ANt5H;6MG8Zom{>9Y8)zn;, f9ӽ W0-Y[LA5ݡI]NezNc"2Ά,eٔzSND^ə4H)מ;vlkk-CCC@+wUnu8[S5z5G! m?{b)If66yb[CPյkz9i Qkz잞t>&FA?sf}/eYbVr`igڼOkb% 賠ݛQUN\͢""j?g4N?T*u@| "v\Ї!IWZ5+1Ɯ)"nsN?JүgO_^礓N-k[uzKHۦ@{ܧ,{Ir離'l۽|:N9J$M0ɀ#`G 68M rIHBYw ]ta6ny-z>UO=S._lښLvs񝩄}L'XGD*hJ7nغumjql1f`1T$I6DQ4άաL."lڴпЦt:팧vӰU}7/;ZKRyte(V`>x彧 Ikȣ066V@70e'i,ݭ/ g'Irhtb}Lm{'M7%HZZ9mwι&&&Vf5R(ӕu[<ϙh.{<fz?* my18ػw/y6^cc]ϔs&'' P.[-1z?OsPsDN8:ҽuϒ$a:L.Q]E^eJx> %03io͞Z/swj5s%Mөc$93Y{$͎aig̽t1 Q/ȓ鬚~۶mu+:w< Ƙ Wsqsz6pYg100SHs[Uoũfui5ʥn-cڞ5.DQċk?z.:sx0;_πVg?,Ɓ63EcdebbZƪUn$aΝ{m@'|6n*u4MDd8=hNf(pmLcC FR9ta!si}'zbbm5w`{c5juu]M7ĹΖ4)r7Ǣ=y>Yg;!#fL v٦[w`3 񢾾ӁێeSˎgk xMUZw0;π9 7ส3^G֠Z2>>Κ5k":J 6zjZq 8T*=xWPU2B!ZV`Yx[GPvc+VCFAdYv޽{200pm<кu(z\;c̿M_3ad63浴Uogv?\ZuT)"_^?޿h%yS,w5kޓ$ԴrLV;#w1B,fC^ m|Av; ӽG b205Xkp-|X$"JC: =ZEiv3ߏCGZ)EEL=SjbWDk5kQ$~OO?Zk?rh3zCgqflMA3ʜJ+k9s1xk)`6_kq/;.57TMp-m- gSVqA&''ٺud㑞x`fG2uzݬOQ{7Yf͚㞧e[X38PSRZ7SGm<Ϝ-"O/N휠1r(Ouk-BA.թ] Pt]{8梋.vT5@^xRy1?͎ i&;vH^h=y<\ "n:ϵsnn7}ݔJC?}߫ۜs[>,'|ٺuZxپ};{{sG9ΔJv'i2886lx\Tbǎx4]m1j'yAnsSfkFBy^onn\RiY8 qSch:Zkg)"TiG7It6g{ƘyלuYO9f,z?5D-!(˳~nj^[4=5TdU8>uywel'9Vmr ֭N;6[ςx/vpEjٳS\ O;Hj댌Do499488yHd:K### Q(HӴ9iι3-`3oU'cql,P(< /~qĜ,{֭)p ׌ ߿cǎffHm](JU۪1###-\wM7}whhPG$"=0A!===?fΝv;GX,>`f5 qSTPU6{8r$zkF$ݭ9+vUZ=7Ƽs1㭔L-ќy6vru3p^|1 h x5oUebb{{9w;֪/ZB[b~tk ъ̞o,4MGT櫍Vji~-uKd Y]":&SՎs$I~{Xk=[nޣ=4зw2i2(VtDrw1C}/nܸ###WdY&_A!yS(jx^l||-9UZȬBnJs'~ZDNuB;v۷5YjJe"e܋1Yz3#"霻6[3uK>TEI8fDGCsrs3>"S>UmWQ/wc흵rGI}́_sy+W:M[k{V΀0ϐTV~[kyx;BR!B2|d}1וּl:Jڵkf͐i0c]h5k|XU?!?( {љ| #kU1L(@ϖf7<QLUOq)Ym}W===%w}VU?_ןn c>7c 
ziuY>/V099V)_c>?Ycs}Ƙ/EQU9h "W߱>};wHl>M{ۺn4M1+3Yc(cjnPsιwfY6{֊jWU_7As}[*sA3nωkU]ܣӊCطo2^T, 3K a`>N<{1E'汽qX>wSØnqcmztkv¯.J* T*_^NB/F!AbU\UfԲXco8qwY¶Կ> TeA8փ鯪wfEc311񜾾>ijEdX:]4M׵u;k{?rhOk-6Mӧj4|脋p2pzψu?\.azejG9?fyU\.*6V*Y[X,-phynhmz} 0*"vz駟]wllӱslQ??~xw4<(9zKҐ1f(sDD|@8/ۺuqw8f||}uEDPiM:Qv',sll%3)rOyJz´Ns~:6iS{of/j7%I{NJ؟R7`U} ܪ1 ShlUdc Tնw:L"28km]wΙV;Qer˜dtLX,~+-ӧShfvZ+H4gucL8G{zzz8s꫏.eWtkg %s7: U\󜞞CN#B}IUzy < {8D }vvvMӥ˧zcJ )1G/wEm~/ff͚5k~޽O8gMu_DZjb͒q]5g'B4QգnY~6l@\>ne7U( o̩ 8 o.LkV}۷oPDwc[.z6g9tQkDv?tswN׮|˖-9Nt<QՑ6?[ήeNLLl9_>22(n0ƌUկu[kjT9 @kv+Jtl~o;Y{ ;ϋR׿!"+"O7{o9JPR |mBJ%Ƙc\/\g[XU)Nq L16oI'==@DRGak#"ιΔ$H$;X3;T*Gu¶,rKWLd:fLbqGU\Z,>6_%Fׯ_<ձ9IPEcxh 駟γ700mv%"iBVvK6:q(ljA/gqƘ4;i5*"j5N?i z_E w6ذaGFFFFGGۙ{oMI:qW,{=Ƙp.vsoU7WB6[9ιmN޶֭'&&ܪJecQ]Q*yibxW/pg+XkwܹRmg\yƍ{8<3$ wfs|jA W=V^-[m۶~yGQy>%ƘlUɪYm1fUx策{ӖC{dY8ay؋`,g笁)XkC>vWUgy>1TUq$Iٿ roKY~=֭cmþ}Y(UY; Z^6kjZKG1bYtQ;pksy3tT`E8::R"rEQecTEfxQ)wyT{~*]s9jLNݻ +WBysT*}9/ccc(2h1]_Puqv/w{X QtbellXmz˲EmO[e}ٳv[σbc{_E/ҁ\UWկeƎ\bLӬ9 p]LtìE=|gb[ryhQFGG?} '|Zprr%'z8\Ѐ8oEzz^^#eml XoU}W=CSSb1iߕeً_canR94M?jժmy3<<OD.j8K",з+qM/1O25wU}Ǻ =/+]Nk7ٝȲSZ7x`LBaV~,*rMccc !"Ppu$"y1f=Ҍ%S{V QjDZSKi"r\}49J#M(*4 pܥXkཿ9w/ԙȡ%^c̡}gzo"㗪ꌂEC+}!ևiTwo 4v; 8Ƴ1Ԝs#qEѮjc|؈rX`voTogcscm^4N(GDS1cƵYp>[G.N z\4GY;STD̴j 2<<S:"1ǚ{RUҠV#r֫[9 !v!#rZ`J"r|02bZ'I;]ԀG;|kj.j*k=~ɞu8̀FDdLẼ=wK\f||vcIe \U>&YCc2h\1&UuXUcSo~-;x0]-cccQ(b;4gg--|wwyfN룧|;3>G dR;qWn{.,d\ Miiμq211q(h:ޯ6?s1a^ǵCjd5tMLky'jAoE.{)L jinn7y̝ZMg#Z 6P*VZ,jЩ|م5̄[M~cboaJZ^f,vZၑpq|(xKs05MvDQt\pgӟtVYLu-,fjRy-Դy,^cX;5omv͒VTs^\&XtC=IN'hX,^]*f$ v7pCWVsZ(Zb)䔼.C+0IZ +izsxgDnum¢tx w:{y6\zj3s-FZi=d)k׮X,vj|:{$IBTBDT*DQ]wvr7SglLhx+?z plh'M=IUOo|P(쟫NT*q'.[A jAZ[UR[o%2N:$hE's4M M9v/K ,:Aqf1Sscƍc,7:v~8M2ZBq{Z&x4dyUW5v OjBp\E^gժU?Sn3\(سg۶mQv XtZ|Zkm͙دN\ژelذ۷ 9fkpǣu"YG=?Iϲ%qځ?RdGR[f)MVPƘsqϫgc088\jw:mjmV|166Ɖ'駟zvC\~={رc.VZ+XӃ@YԂޢ*@$]34QUT?}mh"rhkϷİq>ۭ5oȡ< Y?{ii6l`ժU<"dN'ajByq ===Jdzk{o|r 'teIzk<P״ɷfCB[UʺEjSwwZU0j5VZ{~<22'R 3KBЍ1? Y}gbb[0)9I۹ֲGΧЩPQ199伋y3vc$vLcccoÆ [+X,)y/w)plPbM3|9 bc1LMo$_5ߩLVZuh;ȥm=|cZ2111bL*wt?/"$UeT`"P`277qBVqrb/{azA? 
IW 5K`^0]l{;KZ Zkjl߾A 3wiz= $ j]B8& y_l}1 v>s1t16&A"q9 GxxP]-~pjι572Sä(`BG =,>;?6!I/;^97߉׎7%GXi-E7sFBJ{>[EVcӦMlڴi!RZhQP>Q(ADn hƪӖ{y;@ "6oՂ,'Y೭#3hyȩ #Lyz?;>ο'lbtxi#C˃_U  5(>wvnݺ@`.QyĽA^Q SѪGq)Ղ~x92T}S;\xks7x븣Kbˉ#y"{ZU$"N0+( +sܶQ&ep3,kmgåW0쨦мY R[zf˒tٕ]E{pV/Yt4ޕRqŋ(uopuKuW*OZi kOYi)\qc,3K_ &ya3+& M [E5M!dґo*p^ $`"xƚ~1ժ<8Z}УD?G䉧P)T!YM-xp\IR 2-@T" ~}R ?g"' {\1ac鑄&=D0$)q]U i,B̛7ge\k[τmi=aeS43[/nȣ0kBz6\?e5N+V8BFe$TH?[1HUYE 3R3YQ踹Qh~`AZ=KNse+Z~/MrBa3߻qk$DXQ޲Vbn݆'$ģ8C[iB.^uٴxHH)E+G{/H=ܝ*M'=A kIlȽgk9$^k\1I$ `J0EhJJs4}c_ *_/тńYRn'ӈNTrk1zhr1~,Zuo2 Z8oP8($Bb=\M$2%+=k:z;Iq(i%{yA6ܰG|n_[cVĞI>V8go%n WRfJ0!Jc8) Z]`^ C7y .ވR^auk=?cIa,9=6k+ =ja^6 W_^)qI_qh^Eް2vGnyS( 8^a'a'KdbYQ:EZW ͥniĐeJj9*Cn<.bwI%ZxcӜ%*dRY?c!F$y="7W l\/MDq ~up;W z+1[b\'KƸv${{ށ2wOD$VY8LgFH YNK!Vl~)PsX FbRgQ beLƖӎzOR/3 aV&tmv'KF3oX9N{abzO$"`<.+]^WNJj"ZսkZ|nOӶ^V' '0ZU3UH=d9zD9r\2T55v C.l!&+δ1+Q**ZcP~7 Ė2 %!`g I|=0+[sGܿFD0^]}r%JZu1HVwsjvU#6s LVƎH8;_ƞQ0BVƙcyՈkW4+LZo˻-U XQrbְvYv)M[ᒁPM# 6#Xo>ռ@(oլC;Z X4|TQi8;iJkTx 91|cxƆZmzIf끶h !8W:'=F!F30sF8:ko#ڽt}p ڢ) J] JMpOʌ,H*UA Y 9Dx;þ| -'+Xgֲ. *klRL Q(@*1i7 @[zJ'빅-9Z$K(T~)]WŚ 1qc.ǢŇWo|,WR`ZPWOe*f`kqK(S!UC\%<5i݇"(OYW=IPOK7iU Z- 9^q0jl?Xi2"R4J`er[$"(bMzD%y}#w*f:cbaQ z=uP%QA_rjN9_i=\Ɵ[d(%'n?MhbYoDg@LDPp9C˚-Y2J{o0G\/_s)QPWHȽR h;|cgF&kЪEnFEm|U4fb#z-U=aCoЋ5Йf͢tTx$i@5kEn._[% 9+٦4!{҄x.1ջBy5-bR&h`qCVl2~O[yݵOjuI#) eE6%5~mv^n}Zz 2 tj~M:X$1RuݴޣȰ eQ 'Sy4Vx؇)F|W_rN\12@m/ uCEy,fMK& & 4Tc/MnVd*^v?ۻ/ZuIJ":G'-:ki82^mG?&x翘}q=Աg;Dx^bˣRy;ET5%"hz޼Ǽ> !I[uT`U8K㜾rP׊$Msf8E5~{630{vcBJ0(ϊͯ*8xLE%oHdx<ޭ4wŢ=Yya0sEE6F +$w.gߣTzszGY='a8\\8  5X5ΓUgQ2+OcY y o?.~0YwFW2 y"'3W=7vB\/#tXn~/E)S!j-Vmޘ\&̯-*)4 ZHAk<o#׎R~6慃YLsqr)$^f҇11s$nk2q٪a.[1գxu08g(jwSֲ}OϳwȾaiz9]Иc߁> _q: QƖ mXsUH.Sotiъwywm޵MZCfZ?%TaearGR*!KE/κ~x;6*ԪKr^;qK@1?cKz'x{w/ȼ%|I""qLrbhR#ZS{&þ ~qh?7Wr KPZ* E؝%=fp0U䩃(0^iF/)GT(™%ޥ/B̍b3{S+FԅǮwnʥ itb^G3OJ"PQ_/w GW .|\يw52#kU9GqW?jy9@iH_oQ׈o6E(F0kZП209RdƅV8qLE ҤȕtoEi7,r| vuI^4b:m@tCYnyU-_ {-mߝWߥF= Ņ6+ j I6XOT-r49gt4jA10/٠4 **x^-*CU@ ("94J(xGj8^ifɪnZt.~(ݟ5)գbÈB:MSz_fCy18Ұ~-1Rq+7A窼Fbvނb0iy/β?] 
ҙ|F'}xymD_oRxwvڨ@ 0`,K%e5*_\ \S*ok_Z6avǨ{efO1;mNmMJ%g^@`1!ؕ PNVc Xo{BYRt D@7~ٕ1e Q A? 34} <"{=/Vu\F$H{˕7%8cLL-+8joNt``&g~*~@+hGx=<0S@V&NJRW72]-ifǐXyYW%S2pyf5od!%%0V8Wzy ZhlB֏d/v/,Kic{J fi1k fQnӋ~`t| XXW/yaQn 4pS}ZtbΆt/5 ?8xP ѳ+&i9:5tDT{>}Ę趠:5nK$_AhJ/sRM>^!{a\UN[jAg `~hzKi|cf/Twbڮ>o-'<"_!w@`˕D&2OXwHXZ1ɷ/iڣ+qf<2my`DE(PnG_ 7u@`N8,Zd>IEjDծmd/~> eN`W;mJIw z흶`~H,Mַ+\)DGG;Ѿ k5_TkSÎLLfS@WhcŸ.~ŏMv9"xNh݂¨ҟܓ[K݊$fHͿʯN^)Tx]gRo|`ahXœ <x c"9uFBrm#;髺N0*6AO%~zNW:w:\KOcT~|X"':ZqQX|,(Ãؖs=kjN[8vڀeך%Bc2.UdjMM ֮vU:g3Tz>Ջ.%1"~u{;y Nw vڀ&Mi'bvR;H9 N˽\,z^"?nkԾ6,8pVELlt+.P>\S>\QuX[83UŞpzJ)BoT:6ZkͿN1;xYΧQkMaТ)WލL'8=2(^. GAeVp"ƁGk[VSRG`g{Yu;$FF82 V~g;E.s[pW+ 9_|*` `vpi On鷈|zTnџ?@@ARρN"LS= 8KE9z7|Q[E"q9H uڴ1nAN!|JqlGCD &j+~h}0I+:/ Te#׃ܭV~V&!ɲI\`r~C s~33sʠi 璇20 v?rst\vX4_݂>i:HkNtxP$}`VQEc!ڃT~c^=Rul+r6f{2~F5/':붢VTt- F5C[r+$q9@eZBtNA3G)ru1baQ,fS46KlmkóCT(FE٣Dw$Lٵ8]QcW* ^ zot[g֟v󁠶g0^DGeα#/>[B\kAN=˂Vq7eҏr\m]xƓуgXIixa/ƀ `k@c.=*4 Ek@/*‰@W;J5_p NI΍?ܽŃeÄWrW&'AyT.ͺ9g8 8UhO4m.4DjQ 5 "Tkx [k䷔Dy$JA-"h زjoQr4!G);K! Fã[:mH`A_ +r^D}q$);4jW@W'H/tַ+D/("!YWXL^oL%Έ/~iC'"FD)\@ 09 ¥M%07A_(rN ޲;XTo }Ә8ܦUmeq^i{sK%`"?{& _k x(Z?_;FFjj >=@T+F+D/ * 7^!)ɠa?%Lz$*RcM^!f}蔟FQ+hd2B16߳yc&t*止VQ EV>+wv _w @}?c6]tuNp=+VӃ3 }ƚ3Yz+7ÁFIM1\2tؖ-IQ/%}>;dvڜՂ~#uڄCXeg(\_|<ťg_Ο_lj}15OBN]im(/5j ^*>DK I&r= M-7ZW2ip}'yCL hPEkV@@(s1n_qrN;ՈD_aAm?7.I2,֮D onwv5f]fN!1'` xR ADTZQhKI @$Z7A`{υU88o9iG&yHļڣZ|īB˄'U,H-x ]@%1"=m<ϔw楑@ plAnyKpG.F1+e{6=|_g>σ+`acџ>%wp&Բsm-UOxJ O=nX,KYQ=.xL3]/.Am$nyyT @[A_D"ԖI3N _{7$<6|X3UdpK!^,7}=9_L܉ͽ[A&7KA4oy29Wߠ"W#;kQ`)z%" BzFe)O.ORJGG ?L_[Qi yf##< fG ,)/!a`nFW}Dq~0m4W41h$?$Ӭq*  me iK K/r>N~ /GSEU)o]źoiJ @AIO3QB#ܩa/QUc;9~_x›gmFAu/苀;楡@ilpP}W_R!~40oA_j }?U"v٫k(W#ՄWQT37T@YFʻlT*= Di Je"ĮFo:>/옷Zp# bO|ǠO8_ysA_:Lî̹ξ3)Uo,*H*|,m݉> *_ѧc/=Be0Pfbך(Olè_ha/Q>JD VP$Ud/uڤ@pbXY "\uoH*{mz$im`" \`!c@D$zDA *n%zhcsJlv$XPQ@!GĿNo$V( &Mڋ." 
\@=p/B:F>~|٫IC<|b}da"Шd'~_gP#7U9Cɲ^k;ԪoRcnm@A<Cpˏx+YGfU ikJQt̘AU2cI$OLWX@`A<!Rz*{oE|71һ/xCV)-@{b/S=b!< aZtYG"*ƷqD>wc+۱y}b _>^^`=P v}H+ @^.@`1uڀN)3TuXzp@ ]>N`*~*[Ok#"X„P䧑?M%C54cHDUm9 +wuO'V JE"9G~^U9*b/>w9"שK2MΈfA`m d}C^S1y** _ڈ +")|z\R%!"W3O,{$sИMU?i {TDi^nT*+w1sV:UMN" } =]gs`v4vYXmHXA_J+Wo TC=r+v{$I@_=.gtfzNeVfk]TU3_֩Ot9O `}g Lf-&oXޙ%5C>a:Ǩ*1Mz]}k-gTտvy ;Wo ݌}'S>p7 >Ft]ᥪ$}60]iĐ|z} NE˗0CDU%s{ND*QFDx~NbĚu|sRy3FUm{NC묜kH&wXqVUkunxkM_vܖI~n̤<܈yRә%eHr1r p՚UES7G# ip^ruw#"1"UxQOou{Nk#!6&]wC}+OEcqE88 XQۻX@!ժ \܎p5U72 oUт+A6^rs/GQtᑢ׻)r/NQ_fL|}LV ߱ v|rdۃ ڼJ*ߨc"+,z6{&V)4]YgR}e64ӽ|e'wkZqA$1448oRϨ)%J xc<@`^jA߲eKM{sOjS-KsDQM6}_1s%1(Iüxˊ cJ b$zF6nxAT4M;fB""ȿ_߿=qUbzǯ~7X<f@ 0 Z'&&:~oo/ιYʳ#ދkD̪!~"=xa%9P.fCW 9_W266RޗG6CdS?)wU-,@ Mt޽#YOO(W4/TwOoy-Lw1m @ &]-g;Ү`8^%]720V^N}yϻX5,N橽@ Xtv~1$E8'QZslllr>1 kyF4B@-Z,xn?ٳgbq6jZ7?wA.<  =OW ,z_2Z=Qv__߼ :@V>G}. y@ p,Z7lذmn۶qwJ%6_uU5D$;c3Tժ[׻bs iw[&5@ x(]-]w݂x$ɒ> 44-XUwrCߙI;O.aՂwm/%IG@U) W-pM6<>.]EnCb LP(,dssQ󣣪t]Sz wp0ֳ@ t/0gA%<#t`@n<+[FĄ݀@ \r/a9c!"9kV]hyͬP)&@ w9#QUj\p֡: s>@@Vuʜ@ ZR`Ed58f׮];g*ͬz&W}'@`yՂjgvYDDQĞ={:y뿐ha]M Ղ:`-RT$I:m4\U_[~N]˜,,_BPJEBWU |&|w QDIE "E~0>;oxQv0eKW(A{U)@Om?^$N9pn`wc }!E`-bDd6N^sKoz{ XVtG™އY1f6< {_S{"q;mX ,]-_vsJ%!%[,ʁO_z7/p遏N FW z03,U%m _cR,mTی.(VGFr {XүwүǦɸӦA)ȏqݟjz Xt/,YF ?\|N8p7'6)杮Wu"R7ji#ьb>u`mV B*\َ7,Htc0Cˊ So>N8i@`^jAj ^ ~"/>[=U5yi@`jA+~Q-ZXD^@ htkA'˲br-7qkWXiSfDyǿI! 
=6) !"?nYƻgՎFO6ήS~ݓNsNW }<+Ww{cဨ>ξd- q@`ՂӳQUZ<e~i;fKeG4vbSfњNsJW u:v^ZicLl4a Ϩ$elՂik-5?#"]@Eo3sŊ(b ~J"tڤ@ 3ZЇ;n,Ƙc~V(ݦ-Z$It_RFNbglI@ 0gtvڄTcl޿d||\u+c=S$wژ@ SZ_u>:'p^[TSV4^.Pկwڦ@ ̞ 3|ssX{6P?;_Uo=@ _ֺyV~ڵjaNsQ7K4-H #$2VX&''^iDeޟrI@ Cc yezT^j*TN!15KU[u_W``y$ QIDATxyde}/={Ͼ00: ( FE/ƀkxo\rcL&770Wq QA@ea}鮮yZN63g癇,yw1ƀlPM(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1@'"":Q 0ЉbNDD t""`(DDD1 WWOjH৯؋.Ĝq'4^ 9ԇ7iӺ ([ʌLjƘI=IK &u,մH$.8{|߇<u8Τ_/c JR(J<Zkc*WJs82 *t '`ttƘ){_S5$JR،1At: lyj_ yr\e_v{Zk(U.&տ=OJ)(*-p]DD?tR?FGG^׾ػw︿)P,1:::昂 |\E.15|_y;RA2DXDu? Uα B###H<X,yn'Juq=7{۷oGf2|GGG/^l6[.E(8DJ) #L"HT[ghՊ `. fQG)*ceZk\ Q(X,qIo}icztTu@?(@&Z>{&z TTЍ15m| aێ0"bXq>z]zlp/:::F&zLV.CPssA7R Cww7 ԶDBƘ2Ƽwjk&NGST0% C!~1GU=@' ?nC)yw$e/n'͢OD٠8AGyD-X===΂ @ww7ϟbqr]RA.;3zZRkD"CN3D"1uݽ-ۻ~ }RjTV{z;J)hGP.;cƜ9s&ÚR \}R By}\z帣h@#"~[Xlj1㠧gRbA>TޅJ%̞=c{]f#U W\mbGFFZpb?BaRQaw(sMd2M=څr.QJWJj{=Ұl 4c1}d2 юر?O'u,˖-;:-Lj;i&lݺu̗uƍs"GVP1LG!v;q o'J.yhaq>SOdEpA`˖-[bqJ)}AP'"ȥ.`4cc/u羉<:::pꩧNX (ls@RNhg[>~Ɖ訠8*v=KDGK?7\Ш/ ܲj*R))J(jw[J7ƬvBc. 
:ͨZ}y~XD"u&UJ!/TI's=wvFV{c~0Tueӡ)"8}^(&sqkZoo/q7\rɫ=Gߟ}1u] c_ rI={6:::P(t( ё 8"2~1f<ŋL&}>"c|LC2iƅ}}!aU{,"/R p78Oȋ|Tz폎UJ}GGGnN`ҥKN9朎ZT*ǁ8̊?x 'Tm6E<:;;aK DG+5\8NU1&q7 W_C ˜5kH:8j?s]E29溥l6k\h}a+X[?۞ntl dڵ={Ex m1e+"QKYy^m۞Ic!-UDRD"{o{˗/#1٤ `ݺu{}q8%d e˖aΜ9,A3_;w~BTiNSmA'Ex<LD~uT [lSO= ZhѢtwwםҖ]T(f¨TNٳgLYk]^RXDWW\ׅ ,hue7IyYX*ޒ_.4\`=J笪D<ƘLw˔RoZ_b[k{q%`}fl眪:}DƘ|?CDl{ttt ec!a_q֔RopL`?oC6govIaӔ351ok\.YfZ|"_-"(Jjf.p^d2\nB%Rq׾vԯ022###*ߟIAʊN.':NWXnЄc.\x>j :~+k"_u3s˭W'AQ t"JҸ+RJD?Ӄ 8Eic̿hQJ] mmf %_."[YJʤ-f"lN;$" ۸ʅ ῷ8ENj:::fQm9"rl'E"Έ]@ z ![cZze/ "r1wXR.CTr'Zlixx>vFr۰/0uښF\)7bZLF/k1׈ȷ|U7Mͧ]6ԷajpNw&DfR)uG=.tww#Jy Y 'y!`$Ƙ8@."&i\ ? ct: bqZr}- RZ}DYytv K=lK.}mGGרw7xmE!՟f7Ƙw%OH=9Zˍ1pS(H$4m»/Ԕ뺩|>Y 9J޽{[ɒ%7nzǪiϓd݋dP@R7aMYioUJM8m+kW) /fzy~ttG?88! }x Կ+^=͇ݐ}= kURv'Ck==;Nk7FMXR3(bm(PqBgg'fjBXVJ)x5G2եzDww{{{Ϝh?CI)T>?'>ضs;?lm};c{,Rgu֫Swޅ C^=<-JF)WD~C11<A+"nGa$Ir;=4m :M뾽ٗ=$S:;;OBWk]KPh8JH&*J];<<0 v]lmtsl9ٞJ*aeXdC؎pJU ӰUTv~.VyIF  zTDbf>޷ni1eتϴ;}Bz8~vy|߿X,YXk|F۵%fs$cP(`A` 'qaddp gܨ:Tc>roInAJkY/aYn9X楺VI0of*dQ۶êt?nsnš~h?veyaӴ !kRJmhyOuݶb^tuu+n/lJRӋ0|ڹu>yd;J=r/mƘ)<*e'&V:P5P8_:f͖̙t:{"H;c+QpƘE9cL>t5c;MFu/3d( ڬqXD~T*ǙTeJ54;VrcWMFAyHkz8Q}- "rDjn/C/pfӴET*}ٗ30NӯJ$kwT n[v8;mNQCTZOpOc̏y^5YLSJ:c<յu7V5z[:ݣ>:ӣΝuƘL&`xx'|yMiXg\׽EkzYD5Ƽ{1mk_yޕTf1c֭cνvўC:dB|>_[Κ T*}0L^ǹ$h7uѤA)JNF:,hvϛØ6Gy:;;m'f2AW*K5Ei8Y1V%c 4ӔZkR)$scsZ|n5lM$) 'Ap]2|W2vt'"'"wvvvk׮Ųe0<ەRȕA| -:Za@i۷/+1 .8c…-o=S8餓ad2"rRbsov㠫y0`ddzww?9qCRȃԤ444k׾%/yɧ xxϞ=}d{MۥD$j?g$h1(Zd  `9PDyz#U &lč~Q 'mlx뻝C ;w/ڱcڿcpO{ 3-f KJRْL,1###Jf͚ue:^e>{R1u"DV[>&"\1fQ>hLe?gFGG"HT&B2VJmM|@6n܈+V'Hi:AE]*"Pq x xaPI[\===+iY*q]]]7,\Tj\OfZi\CM/뇟Z׷>cp ڜ!!˽;]׽(<s]wF)իW㬳BCmVxZ1>+a:D0F{(e1P007SSM/A;@Ep]Dgn6.A\oyGz'1\nZaPjJʋ!nAu1[(>ݬs`@&90㪪L&ϯWJ_reMT*5ƾ!"t\Jѡy \GkD"t:οG[Lsay yJxD[7BU7s1-c嶬^zG( cnJ5T Uw@<EA1\,"(Vz|P@OOϫ/ƘDLf\T9 &A/ʾpۇ%1@Cà*g*&UHzL7Zk̝;2\c̋}0GFF~?===;}oܰ5c .P(ԟ=}nxxL\[ Pkrˢ^LR{1JT iK$x\lldu_pE6i;^zW(Szw]oo/r˾dmAu1֟tof~w ^[˽@7¹|oڻ( MJ*i-KVbNN3s#7vjRʇ0J}qppڮ]E&Yd߿/ܹs=ODNLR7k3O<={TqŊXdIvCT?.qeq+]!".Bnղegr]RQ(&< vd2Boo}<WE1k,Wzo~°  /l^u1+RZKk=;J5iX|M7WJVBpL2q[bR 
asdegD"1"<+bsc}VjO*SƘeApiNaW$1SNu\c&|l_־f ! Vv>u꣙Z td<#sѓl> TJ8p~T.xUg%c̻RC{TPu5kq rԱsLU%A\8 ¿ٳmi׮D5k֬!ZbKaAOOύ"rGGRT%Ы՞-yIU900 6|JBOOwb%]}pO>с9sTᅦU<38cyW?pr;iVMDy?ʅ^\.\K/z]l~Ռ =6\ 9>Z7s<uZmX,~wL]yw[1{f/-ƘKކ59ަf۠O$شi9x}0l{ZFo*&=Ӭ p߿. ۂ ƘΚm}NGT} OA/w_gS<~Ģ~0@ojL]i3ve4cSٽ{/{YoŰcnpݕ Bgcr狶ӆBP8,KzqǍs~S17>nݺ^mL;DsJ{-cߧ38}}}{6<<|j3"@T;zTj8}h'wm?`+$ `mkU"r71l,Xp OrZ]bԶkNu''6ڿojQ^6kc6j+nJKudrSCl rw.Ά5Mw4.Cӣ ݕ/vGG#~ki:3X؁YfUv ЬV|oWZÉD꒽ {ov^f͘ Ȇbu%KR&m)s ^x_/$lقgyZks=uqSΝ;nDڑk<cʓ8α6";, 3gι- yE}l8Dž׆!" xUvm!fw'R+LM0f72*} `Tm jvUy/@o^a(cժUc֤}_iՑDk!nZ?d4*_|O- #TK~Okn[<ooqQB%<3Q.;YAwSfcFFF*Z^y<>׮G} %6ܪFDAcyMC羪T* hpH$1fm!x8^(=gkxvR |Qn'gjJܫUjvrDRr&{ na3coruͪl'H|ιpS]h])qs.R'ue'ζe;,Mݍ|;%|a^lfUɃ4qMTx[-%7"]}<^=m; `z6{v oѨtcA5̈́ߋQ^s=7˅ex [kq*7JH1["͞v T 7o劥y]͂( 1U]Uu*@̄m\TËb8Uo31@jw޼yZ#Ͷu^M@ @07w7y-P",jklzgohy"Z'Q\.@iQgǍ1c^Გl6O<_|12 FGGa\߫.qD {8Fl}ޗl_b8fmvHq0::^&ƯS=ٳgz^ُF[]Ӄٳgc<D###w˖-JX7K^_*.6Ƽ@#X;+>a-:Dca, t:rٻxXbPxiP$Z P(`dd|>Կ?Z]%30 RR-"H&xdp'U"m]?P gUaMLzJTqߏAqfq^B99; akW\~g_=Wbd"=ۼV>zLV-EMW{:M;ܠZf .Ba/_~ۓۇ͛7}@T"3hULsD"fn*GU;{Ap]w\Ro!?pXXysRǴֿ#t^d۸H$fϱfɕN7U/"7;K"2+jsƘV ?]׭Np!mUB}Sn2ؗDDk}QOOOzavPCfv8]0OR=P( TF>rgH+~ ڝ@yZjW@XD"QB| 4\Q'z1(oڝQ{}f7SDk+MǺv[mmꎌ tiYJZ7 zt Yv bttVudۣFI$Xb֭[<Z\Dd"zJۚ9}{rQpdqQJcJګpT, @WWRMjQ~'nZ;kʯ>ncG xKu{~x\Gmn`]ƵyWw}lZ3gy`8:*FwA\8N9."+}{zzSOO1)l؛ƘgMjP*0{sw`I':KO:VJRy7pJ)~={verVYYX<G qE/zQ-iZwc c+8ȫD @D%Fy1&!" 59KI#lV[N!ͅQ*M鴯cFLՆ_z0oN7(];}&J}%Zm8&-AP}N)bŊ7wq͛hѢIV؋tvJ 1%DWWטz5P,_̙hҥZ0q'l/X~;|I@XDOOz{{+Cwqr9??FmncZ럅~[ =ok&Ƽd2 ++Q%LiUBolBMfuuu5}أ~GFF|ɿ~ ߏ(SlIbxǶ(sV,J:c[i4GGG+Np'"B׽ur[ @:F:F&d2زe \EO +fY+W\8d2 c qtKVmTN3ޑRqL&ؗR+vz1f~nCkRcum6i,^\5ӽY8;^+jݻ?ϷT q |>Nɟ FGGOV.-¬Yۿ%cm 7x#rQm鵺vA4JٚcwL֭üypm~7?NsywxWY 9v&_g)/4@8r~( @__]"RQ{kcޑ}&u u?L&O$?‹ro2}T^v#"]T@ZP(8x~ʕvH3::>G6EWWBc8J)dP24wqPJa֬Y'Ů kڪ2 ;ڡ6s@ĦM]]]7]r%essܴx}WlNs9ũ;v]vx{}q:Ͱ c$v f4cJ4VX1Ƙ}1fD!J]_ԖpqT KML$W GEXt:}y_ SkmKn{c`;ŵ:_6OLb"O=K/|ήF'2o? ٪J7̞=/?}JsUW]۞Nؽ{MDc"UݻqǮhv- Dj F[T1ٳdNǯOڎmviXj2L;Ӗ8d)h9Sa8vpOO\m:fW[gC7DJ{ƘYiȲ0CBأ=ИADe˖۪U^uQ,]kN5DZ g[u6NgR0j'cG[=jYe0\q;H$n*Jt{zziӦ[y'|߇K,̑wJyǵo3jV<(6o. 
aKǣ18ΧN wW|ߏou}Rv]w\d;ݶaW"8 "- ?Z*>v馟-:A*#0/_~իo2эRj1~o)Iӛ=|SƘזJ/!l!iUf+-ZYf7cFGG袋/|s0222-JU l"g1 [mOk u;,XpW mN3< a9ڵk:ݪa7}鏔Ry 0::׿U=88x1QoPk`H$>z[y@k SGxAFjeƘ V:Vo1-{)cr]qƘ+E~ccdx"2 jcWT?Sz꩕}Z=u1Dd1w jr[;;Jss5aŢ|CCC͝;7^AGh &){%> 8CDpgNezOoO93ቔF5~uό: X,.SԯQnjqRƘƘ4<8+? }X,NIZD3Z}Y9B:ϣKl:oyCN3ƜV} ًPoo;*w}{oRoZkR) tTJ)𘖈Ȓ aQk ;3۴i.inMoN5r:pFmG4έ:R{oR߆9K"b\.ʅ=4>ZPw=6:;;+Hxa*͞$"' ;4=V(^y^exDG4zN; C6{sgmo{ۘoKR) d2mWa;ɼG/;QXlN8D0YMY?3ƴu}@tH4j#Qc֎XQJP( LWhK!ߎnV 8cVP{?3L`ȏk׶ճ~i w뭷~1ʕ+ kwN˩Tu/g|Wkm9\ԊƘ15$>9ml3mDXV7c+"cNESBxUr]w݅{-Zkr¥HL=t'ػwMR=kYN #";bx1/D of9?"%gq{1zN/RͩbXSro}Q+X`8:@Fd* Bkmki& u, wfYqNp6tH|ϋȈRjz[Oȓ.PNP&UOcPJs7ba+Ddyb3,)DdVc0b?Db' _߇VkTk}y'jc:cIyc̨ vcfӉD~z",XT í}7Dn~8ZK yZ?`IDATx}dWU؆13m 6&CHTK $`1B.b *U#7a&Ş_sa㼟Y|z>^kmpJ-8Nrglp8 q8c+,qWX \a936rglp8 q8c+,qWX \a936rglp8 q8c+,qWX \a936rglp8 q8c+,qWX \a936rglp8 q8c+,qWX \a936rglp8 q8c+,qWX l?\R\3=Iv& KCd=ԙa~>)neTqFw HLwcv_3GCBIUn%\4-wyp 2]d9QE_$q'kAG<q^nVNs<0bp(8[XCtJA5ZArOכͮC 1>bpBefttxgl觅519K:#3U |ܱP^;nS Ke6M<mepw 3$$mn1ej\0DWi9vgԸ&j=H0ɷc| f[ϿJ8k WXCB( (cV|&,\98xd̢[:Vma6Ve@0Uݾm1+!0nD*iaqكG1,r iB~p5+!n )Y%TF6M&dZho?eep;1^׎&U\FC&rh=؋rfLzpHrVzuX J's -+6V{= C6I#Qp׸rU\a Fu& D,Pݿʴ@S7XhY _X ܗIB&.oZ˚L8Oo4cEzG+=-x/]}Ԃ ?&svsL0g)CzΘFUI>a7EnH#6ē{۵4X2m+_;++xCg {nV)B`J9h5.}nnG?kmdULF&a NkV8j*"5g'˧F&F jؾ};#$6(j4|〷BLaOS]; Si$h4*pcfKYAq=|G  u 5V!8  O8l*qa8/zΠQ3 },j2)bY-~Vor:8 kpI"īze$4fvx8w @- $i8q?0T^ok[3 ׃N#1X !q/8 8Bj堂 `" Qd$j[\Hqt jA$"8SL@=i7طL4o/ Ld=yECgw TNDHT(Z* ᇁN-%>F9'%f<)X/tz58qq%+'T[4+|ܖG% fcy6ڭZ!9eJd\SҩC-2x@4q㸷aUC^RѳM^WV3YU+[:[B%J+!?o&t?Pd/6&I3{^eaqGIU ė '5&hԏJq8_xeZun[󃨓8Ǡª)c+нJMv3ne?(Z532Fj3F89re=0"W|Xl_g3X.V%T* ޴)[ͰDzh X'Ivٲ'wBq%, v:Xgn蔬y]C_|1/{AsKb~Ϸ A/>~qD Zv/ܵnZ/ZP9Sàj"L6ھЩ,q AZrr* 40UVV>C?g,ݜ)/`ڌTM9Q;ƳPz#ti?hZIr0aDŽ.Y(Ќ?SwY㢅uv]43.^h[2^WfI#+StY!c^`X6a\uoZ{Uj: 3:o%V}u㙱WXBFE c[όIc | O[ŋMdxF-\;j3QRYUdFYSXo5 ٹNG<xcu:L$6x`vebkeY)gK9^k4$??%Va{ `[%/’eESCÝc#j&Ba8c ̹0( KݱVư򺩗[~tr8αθ%ϋ1֣C5]ᓰ4?sl3 KB~)CnwSw"q88c%ny|t-f?\)tN^^!oX28X+,=MHkff(2 ˝QI2iPp㈱VXn_7]tLI\FqhtwxY9㯰粎ęsxW56ȰMeDܠuWX%kJAf^h \}k%` 72 
89cfGf+GIo*u'W;gJٲd#1ՠ,xJlsvkYj۫pȕ >̺G3 qx|BYjZ)EmWuw{ѹH<3!5ǮWo^9qqʄcV]$} /݆pؕ1\ <ŖӅdFu`UqJaԂ 2cc5) T$-$M2Bt ;I'(xS>9\K2{^ռuPuscb,VkK%>O{vdIEޛ0lM͋x)0-gS n}7Nw bpn^:6wypccNaؑu0l>5/VGg'3'oM}f[foq㘲a+} شԱ=LRKؾU5Z 8gtE-&爧wYYl#n|t]Hs +V$.`&Y_'^):Ԛ)Ռ9ہ?n8N7 Ԃ` , ]UX>>h|{qB\a-ff<̞xƳmntPρv4Z-7;΀qreB>j4lxVTEfvK|KhⱍvP㬄h1H; !E19Š})G,}`F29>觅5˶S="-;y#6L-i<@ 7:܁$%{&t̩+6MK7ݽĵRB8 VɂìdAY}Vs,eDD3 6#ta~Dx~/1;W[tjI}zX=OhaGdYazD6tC^zn.T|ٔ*G+瞥9So5 o=,zWQ̄j9(a˦0 nZ'= $ύu,R&S.>5XR^1ؿezkφYĂ魧5_H$4A6qY%<>Z.U,+l(f߬%\p_ZP9W [ I> uCE~oםZҏ]i0,v,Z[5lxRNڱ@]ӟ4+k!X~#~0.Pܷ{GK}ѣmaW]jVW1;o09.R?}\Y|m RpR/Ug0.g&Uޢ UoG|j), k*̈Yܯ V\Q^Ax}ToƔdmnAesȶ.d2}{U@W n1d]b3߲m`o @g-dٴaRqPNCeuEIBO` >{j_dp8lGJD=zssRtVl}2d=0fyPuY/*t\8^ 3w%SBy-)Th9hGgG{̝M r'lc̜zۉjXNEt1|Lٌ|!{Z 9A^'BM[˔G$!"~g6ȷ̞)t^hTxB ZOF[d$Uf09D^v|oe,p&0!ܖmYSҗ1S^(>X!7ZlԈJ=Ҹ|s}+^ьk0!Ih>? o$儏V@aak H#tNBv YO5xSK[.lTտa"ߔaVٵBnf#g_<:쟞)!ǐ1e>4[{j$S2"lftó#[t&ωHigݿe&?5{H*$tv;2y^b!~] =C`W/2mdv%pbtn}a5MB/-#T-&B3ȇd$+tSܿ]#Cab\>|`O}S0?ؽpÍ4m=¬o7`/9}^vr6qϥaۮ?kk5(J!#Kgy5~#.yPo5g5GBn]YEd3s[fjт5:Jdtq&1Ƴ0v!xg뮣2O[0{*Q1'JvEݚ;2èX՜ 7r詑z-]knɹBө{//+҈5 kLfi}gs^yvVCQt5WK[i6nQ'k S#tn[I)UvJ4 ~(l+&G$i&,wlCص&ˍ)Sn5.Hܿ{?;fUF8?I}2Dz@Z~n<K%xdr`qTl~tbBe1)n4jlc%./+G5Z_rݕkh? fI,5˻~ m`~$x^&zl ޸ؘi;Y`lL0eUٿefLoTR%c|x潻yvMS{w߻yvڻN8bd/eyl1\l٠~brÉ /MaߗaWsՠ"+ 'KhLG6HRõ1?=-4EdmRSkL-11dQ'4#CWêeZZEA1{znRvAG=kj{={ؼw.J\ t]]n8 "dQkzCokE`68_|`ÒY-W~VΩ!\92UqX1;tOgTS Uʦ/x_6Qw7gtM=k)&r[G^]曰녮(ɘ^gbT̳p|O0<cJ._4f%~obn潻;s:mݑػ\Wmß{?=!Ō[~OY3vpZf]fl2iY1{Q}rm^z`~vϞi;h֊NE]{/-=W0i+l~jAo[x,0܅vǭa?4­ n9*fv21:O6lvS,A9g0%uСtYZ,q/If1l<\ *_3[;NYkO1sFv2p2fO5ѩ]B þ,t:icß>!Qd7V0 \-`&aW}.Y-]PzRKgU2[_8$!n6cgρ-l<ٿet~1Nm S׻.}h m_Ĩq%lVyvwOCz^r%Fe+#mW]}ʌ +!ۉ#n_NUkAeJeUn. 
cWH#ZǤLIK Z 8iAMy.֫JY#xb{tȮmYwb6:1ܖ `G!f09 {.3%th-*ό"וdN(X}Ē*ذ& nTsON`J/OD<Labג+$BVv)ba;Io0f 2p%QfFqLO0 I1Bs32>\-GⷖL2>}ˋUAQD Go&37>nJ; oԓĤP'o|K2\ }dU;[wQo5?xeUP1_2ͅr<~xxc0ʩ1V~SźYŕWZ P ?r_jXAߴwV36Bwlc2}n)BnVFyT>tp0+wxod_g!-"0Z9z+OC-U>ٌ)6YjQc~SҷI9<5;[fl-KB#bЌAIC 7=߫λx|@H1mg"+jA9ڥՠVs䎦0zL̯'tT:iưms0+uubz|v.<hE mY**^]mׁkK|b)s#Q&Þ""c)RXQTcdDY/%ARE=w`;b>4Zv㠸ۭÄ/ #%ʈ.eo(,Ӂ]C"}ײjAp+ BօǐV+J ;FFtݓn|Ǣ{M5);v99ʢ\/۲%ZmSK9pTz(փ&߸iR_\M7XVTբjk{Izzrzt-4bjɮ[o2BQp//(O= cK[ٺzayg3þ(Ɍ}(|٤c!dZ50.3.Ei▥Z0l;VRJ9ÿ\,f;rd4/;j&#}OsVLiw=4{~h{ pY|F20_Oq/+\Y-^ԕ%HTf_3۩-#>O 6|'؟-KK)J?UnMEeb#{'f,a/8|yMƳDdLݿe۫?jy-[f.1  ݣ3/KVo> y7iMөeByWGH8nzICE9j[?Us e[hL|+<\RQpao<ۻuQlq>8dp^QCDC ϵmٶܖml{/_<.o(lqқLZ N jG^#!"gɻC-c59[W̼ӗP[Xi[%zL_|jP8c%ZasݝSW}-:c2y}'p+.9WCJMo_h#F-3x!JT1{sCS>,%#9ݫB-JȒErG$htoD ( FPH;?] j7O\iXrNdQV=v=K>Yof$îĸҠ#B)3; rd#-u\E$hcؔ(N7C.e?lߠ۱Nsw$ *w7<60\184u71uKv, 7jdf6rYБ|jﮣZ|XԜjLYVZRn).B0tO(:zO:<YoߎJ^AI+KF]Z2%#&z;]Otg!z=^a ׏~<{gm1aqΒfgf]QK1ۄ^eLZ3cM(# =<ᗁp,_11M?vTԢf ' vE-W'"< oJz3>ذ] aG*f7 ٍvϔL~omk*Qy@Q~s&_w2]6Um^ƶ|%G6.~jw({xqC)Ś˜A|ߓDu36~gsd'Q\x8jʹfw uJLJ<7MI{ < 젅/I)30=Dn m.]ڸ>yrr-Ѥe{kwl1!&d[ =8xD|ovS&yʑ lOz⑪Ɉ5{YHGj⶚z9E5yz <,3?Ds ,6!2uCIH` K%WۭWRWb܈Hֹ^%%->8񣛄Nލೃۂ(~6y!@x}MpĪ(r&4SIo5d?fGpO8_I+_N6 8$Y'lVVӖV#SD[70RB~(y,wDW(mDn_W>akuݱVJx2g]nPNB! 
G2K˻o2PeFYy܁|૖9Fv;n|8㑏'2]udL/e*TTC1 {=zyHfK| eˢ{(Sdze籩k>eקHݺłzo#"uuHb9vAiIz](in2l~2V!ӧjtVxeF}d?»|^x{4ⱂǂ=yjNkSoݵb[0i}t WL؅j\hLhХ"8hƝ>'ӧlUmnz[?!R8<7m, NjTN6k rò:ۭ6LyV[!؀q 7;Ďal$ՙ1*x& F tq.I`|{|ڄL4Od$DC2ՂqZU}zO\rհyW*XG45rθ$45t88OpYp.e~s5U 8+`{yX)?&-L\nGTpWX ]S=h:ak;ֶ-q%2VBoX?mLx +Lvlx(dr'[X2m.eUj6ڭ59qe\aEHz;bCf|4I,are²EY HVav jL L .ςrU»K1fs)2F{1cU+W8Йi&&z5_ $JKUŀh!To5W)غ8p ذAѰ!^(tmZ'}m& {钅Hk 88)ذVa̬{to Dwqiݼi8e8ǨѽT6oOI\h^)8NߌªWґcHR5/ n ՍVs:st(a-`ݑ@_E:=\Ar+ X6@RJ~`J$Fyhk8ByU l}?,KOZnF᷐-{wC8}3'OLgrkq.l(W"I[jﯖ+,̚%+/>iDLHìXKǿ-'ܘC`9kjP9n`c_$}fÒ6.s,_ڈ'7 qغ5A 6~.sF5W *qc-iP~eUq&[Xr"[-`ƧV_K]P.ӱRЍ.^Fix\s4urgqƲKX *?g&&e^yUq\vY +6hNٟuJ\l.,qG \3-…J+mJ#xu:*ePo:)sq{ٺN99"֔ª )AxT~d]iw0g4mt1+-qcM),)O;=B= qpݠfv0cx) Pq͚RXf<hb jdv~dOIApN5:Ȳ5${2,?/>$nIcC*qEXS \Bv ^F{8-74: IOF,ΚQXՠȅT'ٶpɾ:L9m|?}28ӛ50JHS8ktqߐ8CcܵqV535:L,[I.cunƱҭfB5Z9rp*s VfִHQuV}8 cd3&k*_^avavip;P%гrg@\y`#D}H Nir1$jzWqȘ.Ro74(Ŕt+=oiese8k518RJxmJJzv8Cb+,yě ?M&3i4۠{ {rߨ8aM۰Ԃ ItG2{+kQ VsM36"VsS7@sg&DEf3:vjC<RWV6VZlu+ N+zP IR!ӌO[qwkjL"^< n9àq =`w9.Vj8Ӱi @d1lW 3IDATxwUǿwnzO@BwD:HbW}UĂ(R)JU@DBm{m<)IޜΝ{ޙӕ`X,e`zbX6+bV-b)[,RXAX,nX,K`bX," bX,EtbX+bV-b)[,RXAX,nX,K`bX," bX,EtbX+bV-b)[,RXAX,nX,K`bX," bX,EtbX+bV-b)[,RXAX,nX,K`bX," bX,EtbX+bV-b)[,RXAX,nX,K`bX," bX,EtbX:\\^N'MY$vM6/m4F⑀L$7k0w'x]"mcQhCl(,D 1L.1)mI܊HAE|1j N@:~x \bGc*@D UJ@/ Ġb1FoW QŤ'(ѨFWR*fD ( Ai  diGIBgAePA)mѥ1T A% x,9- b(G#Q0߭<&h#(* h hP@'TDB T}HP1"eSPIiJ*ԏDx-hK|'e J*<4%l1Hԏs Zo z"6L sJa\TIiExACg (`u"S5'D\$An$A(d"M22q6)sPYU>n q)#юv h/ QhX<~2b18 Hp%Jh`](MŴUCy1mPCƐ\ IH`3h r5bZ!y*LE %@.O"-ЍHЎTQiCjd11TR2w]Ȥ}#5@B" U.H. 
Ȳgd鿬: |lP+U4M@=P GEjO/ы)Q&e cb)`3``*ǃS4( RL9/Ga˶ ([^pգkQ?)J>Yvaݲ!Qp`8P-@eM8I—C3_WUYUcLRs\wJPoV8ZCb:V- KI"(=Tҝ+"+jʕ CBDTqh ѲwY!-KObR(R* CN003* +ܗa(WQgw%.#y f 5+O HࡌﳑOn)dT=]}6DXAJ42gc|=P壀Qb8V]'?K'A:":1Mtby{GѷDZ,+A`iT +Q*D.:J+IU`rwQB^Y,}tKϱRU:GRg#tǶwP L /ieGܖR F1/(~~FrB݂2q@i[޲׽Ų9[*Skh_̅w{}~S_/cX6ו;/Zφe $7forW1)7-5y@Nc/% NU^l_qLtˆ)b?XK@pYޱv#]ݦD7^ĕMT1|0m+9{Q1i)^[bݲnLQ'䫂 V X 4**C s.<]&H@8ETU Yy% 9W[nY\NQ3ޖA>e܂{MD"k͏zy^FR(2B#/FU=->|oBYATTE `rK"JJA]/ǀ\)J=`!@},ܧc*XoɁ?ne-tˊHP TEd4L]T`xŎ:V99z5FVЋ-l-=J) о(KL`LjP ~>mԵAΓ/d7,H״uFL2?aDxzˊجy!q8i=V Go8 z@MA$`k_;fԐ}s.gՓ^-YG<2:FEIT"B"s pĮcrtmQum5etNYVܣ6D@E3{,=l^PA @(D[S`>QzB|J5!J)f7pyN+KZxkGDT#7CJr bIc+x hhi:wMܔJ#?qK8zRǔ|D(@$$ 1*-A. VܚO+QGq_^![9 ?Ҟ-@:pmp7sBJc4[(r [jR^>5l#,Ե3s~30fXN&[fCKaA7ap D]Tx]6S{]QC47,݈r=A=Cìx$H* ߻&17O* O5q18<)}mkDAu:mz>Ѵ5e0>`s29W^I"u1oъdC^?H>}ш.3H:&&?_bc7j }&K㉒Zb^̰7گ#+ %fDr[aܥ/,%si]ׯ}v۹Ϣ+$sGpѓ-/?ԤpuzlVc*h~i&w*Kb|Gkw @4^@Ex ̤Ѩi fo^415:(8|oĈry>HEifֱ`V=Fl4rE/s37=>ϨŭHl쟰FL!!4䈸FU,]l`W+%~jpT`W^)\nP AF̙ 6@hG/t*JjPF9$jUӽ7ⰰ|d-OI8r{&s_Q[hYW]BGyZ{+6' ,5Y^ 3qRGA>@5lk+'4'(N#Muu8de ቾ^e# x"RLF俧6DToVJq 1A !]z[ 86fLk.E0y_,;"NF~ʔ<{38h{LUVn9KR|@E]Ql[wԠw$,miGU*:D [}$z"PnωUpߡ}? zBAFE8Dm9_.A+=JX֍1".)P5¬ $"8ԻFxWj-&ȎxuQ-}x3xMz͙6,jQ?k8_L5ƏO@=(H dY R[Ko$<ܪ6ayƔoR6@\aC$Ow^Eudfk ֤'<;R оr>T&Iz-ŀu#lþ^B@ ˅/pdK=@AǝQmA ]PyaD4f{dnx|^Ing~ L̛s4dz)80pS D\&k$ e4aBuPyV{ 1QHCrcc}M Clۼ-w_L9rOӶ(e-v^tR&I }`}ۑ0#];s,(ETOET+ۻF/`*:G0^dDi%9+s3gQ =<8` S/}ΌO+#=^o|T97”4K8UYz9E0N7ܢ0}Z ZAaDyUk,G+HդyяxCvQ} o=l>㇗LTN[F=W#ueg .n%Vqsơc4Z1q忚Ѳ$JPAbEg Tt2b4:].Ьm`P@e_ŲvtG7z#w[ަ |v&L̴i7RQ֜!u#f}˦ѲM)) Ʌ}r0 !ʁlC(%av}XA@WP tX:J}C/ҊMqKb\^,8;/r^Z7ng\3Mi(tz 8JG ?muۘCG~a? z7"F Q l 4YtNHN9x"ȍ }km㥖\]&[ū%jWܳN4 ALLJAs"AwoEgrN-jq7sUw#i#Pn1B,B$4ty7_EcKt5{qPc&k˳6YX΢9 $ F^c-jT_)Jm9@_)1tqc6>}c,,¦"l|/dJԴQ cL&Yh H(+S;0B6ܚ5G%C- !dXq6~2@2[q1ao!^bm9N~p5^" ˸)k܍nQ@@؏{+EsS_wtS K`z~ENp]h;A˰!ڊRT,ԶI -eΣa[!RBeiTSWD[46XTM^̴ |0h\(OKx88z0Qeq~pT9I` Gels]=DG]m}u{|sQw,]LjIU|^AoWmonߠ@%XaG3? 
[binary data omitted: PNG image (logo file; filename appears in an earlier tar header)]
[binary data omitted: pydantic-2.10.6/docs/logos/netflix_logo.png — PNG image data]
[binary data omitted: pydantic-2.10.6/docs/logos/nsa_logo.png — PNG image data, truncated]
3<5\QG3=Ν>[޺E"[?Ñ1)-.>$uyB $_IS) R|X~|^f3-1.1/U^m*k& ,XV*=8 gyˍS&}})lcDY>|n]#5o PDžTA0ݕ2|noշa,$aXTbQ$i1*Clڟw_aԩ|\98ujPb}ƌBn(x^=P%(vb<=[IdI[Q_oPURI-g]c)r.ƔgN0,±#m]#ܮLZ$=DG S)\~6|ɓ'3{Ml3侗v׆38&af3!.֠:4$Rt|?|k;$Ĵd;%)%I̭ 03Hv0$!qpcKYQauv0?#85Ep*Bg@MUڼp +*k,``+Eq}W@W$L ]$1}OL4 O8̑㇡8g#k#YԴ410AEdsm+SFГScZ<{h534̧kxkNR'Ivaն 4-`4nPaʈB>Y/o pj4ǚ Ǎ)i%ld0Rw!fG/!$} ?g㸁T(֍4.Gu%< Otw7r]w{*++yy+67ʷ~ˤI(Aaw+Bsŗkiѝ^D)r~]:uQ6yX/3\wlQOב4L& %OP!xMx)BPYYn?K1&!x(KOR 33\:Nqq1˖-e|n)ͱ:W2T2<͠HaWwC>Ls8A2eSkڈ,.:Vհa[Ӓ|h wN)%FK8$ Pv䑶AmhC=ǎ.>28lPK77L]' lȟP^_O5Hq{i_ f@^Ph4w^Jx^].S?oq+ ~׻SHD~-CZ.m.8{(]54F8)%.T3"ѤlO$9:B@q|?C29 jcP*cI=W~ᥜ1`4Iρߥ hp,ɦHip9TRE]ݗ|̨"b)Sפq;/0R]A˟[R@*c۟{^h:<+ޠ#-asOktr<*8-Y;R W¼ulJ{)RcKcs2W`X,gSdL , Y'Ӓh´,hb͖VT61v@M]$S#˲MTJ3YA>oi"!qphD&j #Ǔ9ymW>yg]! h_1yd:؃￟ݞxAUwTB"ys\~AdߒJHͰ8#]8kE׻W\J&tM|,[l'VFZ)߇yAʲI?kdKtc f2p_3~F4sEM}5ٽW8_|gל2as@]0N4a ʶnk` na^Fu}P? n`$IckDXPޑ'7LSW9vL"H:(퀪(c)j[ژ9f :i~H֑0L>&z>DP(~3F.B 0ٚSl/RdDMvaZSGsu~n8cR9$ٓߧ߬auu oװ$9~'*XǑ-ͷ0n@6EhmhVc>7?0a -Fy NH $5~fw;PP4xvoyfnvƎ_W|<3|7[rO?4---ݞ)3g,f2qf}_A3{6vS<|xTYQ)%foimVvm:?禛ns477siQY̧Xg7,Au!-I/5 ;Ɂ@0Rt~ a{iJMdx84PXppôv+4҄UU6~Lm~'i XQa5ܤ KJo܂iub*H"EerSM]g4blݣ/uw7aޅ& `رz뭬X4i|M{vÆ thyh@*nqhiqw DZ2o9# 8JKKseAWrA^.RB@ʐl epHӊT16z;kSJUW ZI°I*b*+.Q^ d/w0jhy@EtUܶsjy]!%ƌLmU['gfHqP= N6 ׳,}M.FWU>Ylzx!1' 0-ɬڈ3)& ʣ%Fck] Yqu CdfvEN4e͓Ѿ1s9'ho 7+xWJp ^?X=H )BK7Npu}:d7 8sũ7.1_l:o\@+ȸq1c?1DQR,Y~SO=SO=a8IfG&>^ vA2b'H%Z2ڠMRš+>TuM%ĥx]z玪(bI6nY߆53͗k u-1RۡsN+lo>8@a搃F+A,3}`WS?OY`  ZgO {sJTǩuL0-Ik,ŅU 2;4yHܠ$)v־5]^L zvꃱMzѣG3|p ƈ#8øꪫ:U@z<.[vZ= ]i R ~bBAb Q+O$M Ga4ϋM?6qB"e:: \: ZȆ'ʹ&MUa8S]->(@[Pv(Gpl.?o76 /Gaw׏1cpp(aV/ׇ߭c}.! 
M/˘l QY>hJiM[WKvCI08uW楧Cʰ8f$ ZUCbKbCJl @r0>@LN_L0"In.ՏbjPt?eycU."9UB%JZÙH]!rG4O!E^ &S{)lh.PO~69ɄA@Wɰ~tUpt@[WE?RJmiN&S\rxi{SA"ivȕD g5ϗn>ߣBCK˶I« sILBh%>o=dU&ddf}]^DAFK Y6W1~_sq~lYY3k֬^YgSw…,]]0ga:Gb]&TCI>=`6o !!ŐN33ٌ䤴W\q999e|~ʔ)陻Rʹn/G$5\)]Ta)gb@8B҇Bs8IU}S#?`s]wC9ڦQSXiϋeM:4 WUE!HQl{腙.&^˟Bq:!xG1bW\qwq^{- .c(v?~+ DYY'pgqxLY_n8 ^EQ=z4wcǎ%7L: /QFܜN'il2y;sP.JHBvN)#nidqOz% rp*sV2wMM<8ĩ$&.J8A69aa !0LI?Ē&Q# Rœ&#˲zU-qrRH[$$iZ,Hk4I\/#ev[Jy.r^G{)!e1Yǣ@nAU$1SacGq$ zH4/niW'L~;|If̘AFFgcF=@YYO>$݁v0a'v͈:$h\t'Ta!GUCcʖ2mN=4YHBq|ese-.B-Zċ/G;gu_|O>df:$upC=D~>IlstXVA0Iv[\4apJ%yӵ&C[L "uu4&4hmɕs=7Hcc#K/b;9t)TDz2\ ӡ, ˒4|\wysN T^WD{.5N(Bv' 'MY*6lk*7- ð47܌XIC !TcC+6:ج K)%Y_ KJ%@\X-\pKSN7d…G? };&Li_?쳽UUeҤIdffO}u|eg"-1"'3NfwF5E 8Vz(?~]+;W_͹瞻9;pEqE~z>.\… &cYN<&LI8s$54OH ä)e8I+hSdj)RFbk,EI7aoފǩQRR`ȗlj`pQMTԆX s:'-) SmԲP.Tͅ,5dXLΝܿKuJ߅ۡ7;z<BQ ٻ{}!;;ɓ'sge~T~wBO$"]䰕rKO{'-ej[cms+_>\ŷkx5v]E$apû+ Z"qU ib[/*^4LȲ,|SFs=h¤8ôżm%Y#QMӏ }5ڑ8>~^&8%T8M5Km2%44Owq-e!7p/;v,3fLqM7]w#<ҫs|>_t𖜀#s8FձK@(S{Ш'Чm$͋:OQU& h]8ՙ׮]… 5k_~9 G?}Y?2d?OygYb[neӦMlܸ-[qF^{5n^s0 WVlsvwS߇)J\2\*ױt- v;/N[ujl(ʳ;Mu;T6մRgB٥* #<C4U!MKZmM) Ed)G$CWUAPApM! Ƌf4H&ۮ*mW⑐asFNCkG eK˩3׌LƑC=F K~K&/,-!ǛtՅl%nϿkqp9tzǨչÇ1w4D?RՙBumS!7}ns(&;裏:nhhN`Ĉ~ >1F7ve{weJdR:"hz\Į^ivu/em#2<.kp9T}v1N]9Uvx\z~vk;[3Ͽ;3nSڭz;\sc V%_O$}=;V^zH ֭YwNB^qi]S4 P*8}"akaX5NFS8*Զ.60@(⳥pv{ MSFRs _~M6푀i4;L?i |%'#TGt!T}ݔ+H_fV+z*D'eM<4讨a͚5u];OGҷ5Fxc&r.Do^p͊Gk,[vp7 2`X<J$[?%^&fה*/,^YYD\nϿ馛:H&д_ޑ__8hY0Si+B\>n[JaJ\]6Ti՝vbŊ|嗽~)]wK.&8E:V*Dh+lN"PyjQ?!权%Z)$IY,.JU}Dʤ?mv~kH<ŀB?y^0$Ǎ-aBC;>"0 l#/Gd$uTԆ5Kmss.Tt ŽŘE q) j/g4$Lg{xIK A?, i߅^ȷm,\DHM,+ǭrB,) FxvunIGؤ>9m_ UĀoϫd͖fz')pQUx$]Ÿv{&ljl}7ui u%oA&saw]4gqǩ<›_nj7tynyyy~`0ȃ>9 6ÇqPMJ67%A&CSV C~Sq^o1RNCtnV .~};ü⋜}ٌ9_|}gOxdzG)Зu|!P˝HY =G10vǡnk F3.kS900-@F[Q*phj{uʪVW7p"L#Z)1}T6IMoNM܅yNJnqC tMeNޞ[Adlarj7oa{SOd¡3y@{%yuLpAs8iu^j#[#.efvѽ3E.V0%qRg/J"! 
%M"wKO*\Eh+ݞ{ww`S-.rJ֭[c=1O)(zJQ2RTEzjdfAuP;ЬTgH\GA*++y z¦M_ʑGɔ)SAtB6Uɺ*I ,.SA*iPp&XjZb t8 ۆtXi&I#=IV6=rȋ"lcْ- ,`tYdiߥ՗x[^A~/49䅙p;Tv-@s89 ^ detI(vkRGpDs@yz UU;ScsM+@Jq[gye%9^;x8\ʼ\NC杯·mǎ));[6ՄXAjCSXestMaVp9{rr3\$vOY^''֏k)w֒FI3f"Iυ;޲Stc=n]IJ/̖-[ o~׿#t Z*mO73<(]'322x)))'v& B I7a= jh#qŻW䚪qj|Xm~;r~O[BQx$4)t3mt1_.DFNGgm>LVű"MSaz毫'3,N=bey]2vM,:!L  o+F*%K0kc.o!k{3D "[>_W^}477x JG!ͣ90jJbիMӫmB3[?U&E.}DxCdffr/|r0}zDwy}{3뮻3g^-hKˋHt;sB",Gc]jg*8:ķkl#'.[gFIQY MQ& 56 ÐvFrhh#t_#-,"I<F]V.9i_<Щ`p <Цt6cܚ5k4iR'%ロロS%xW>|8wq~!/"^z)?YTO"er\|-s $Rf34LnanAJ@uPQL7tf!*/.+;*v?s;PFQn9.Do/~?maο+ BHyfM1OtѫTѲsHz`#ʸ9s&;Vزe >(3f;H jDu dS,!; ayTE3,+ҩj&2[Bck-.n=NVU51g9c\ MeF@Lwp,Esxg-O(–h$ʳ8l`.A8=eyS~[+[ sV`!Hp׌z$nS&piQ[[ѣ930`O>tW\9S9#9쳙1cox[SkZʴTѡo64,}ٍ ) ,)p6[2$%V7xn]niXb=^z)cǎeʔ)Qř3GT"X!бc-n}gho-Tk1:&G]SGqGqwq| y۫q-ZĢE;9袋83)**pX4Hߙ44{ܿm<{C* 4[ooC]).XӒXVE-قpv6+ECok5\:SG:, .U4|kn!ޮ)M8!HOV48MT7(q8+L5&2Y, 3SVfX'Ml۶m۶7|r;<^}Nabc<3zl~سBAAGy$v>;t(,uv{@b'Ln.ô:E}.eͥ;y 'VB8fHievh"29cb9߮㿳7pŇ*tHIfpe5}_ndLlJr4☦}w@VV-95^رs & enjdSmVn%4z=OY4ZNK&M^YQ䲶!7pv!rUWC?!s󬨨`ٲe|G,Zs=nӦMTUU;A܅Sۙ#M=jC3;+>ZE`ҫM1EȌݍ&Bseyo ओN⩧b<3r){;fΜ5\رcx뭷\[V"[>Fq@J(Wʪf"q[굯Pfy MZ3ƹBS4km\yN ^}Ugߢi]JynN9:~aN+7ओNod2;pwrJb q,(&IsH\e:F']VgcI5<E(<l^"sLyZsvk U!nGGYkfXU̒Mi$s֑)I6H< ˲*b)} S& ݩ7 Rɞ íQc<: RNDqf)f6UW]؋{d^|gwc3>GyKc6I_|؞(s]P-w&f Oh)ߩґW1t_9ɽ++`ռۼ,X`mhh7sR]'CRh~H$9..a !̛,0S)an=SP&Ira:TC˶QsItnw;Uli!Mq剥w.aJ^z`/)"7:\Xʺם>Sn0%>Nn;1ª,9@(nz:tرhPLIbBA5oni@"LZ E{w@9̑ʘ~Ը9^rGs8st]qꩧv?<8| $)5gm܂[ xh"fO^W8VT.>xڿK*eu4"G-|x G 3G 6J=Ϭ0}籣BY.>Xھo^+ON%eIb b>\X͉#'E(k+f]0 tPԇ{44‰¾w8Sl<~|T% Ea+ WC`]{Wv/cٲe,^x^{N5•w8z H5B8gsORwohbq3]*(Nd2`ƛqLhoڟ1br g?f{5_.8•oCfHb hB{-2d{Y;_I*UU+m|Hq& r_vͪ*$7ŧK^,=;/Bav}I~g;&B r2\x*)C9=pJv1zjh=?a%"hybϋ@4N ]`֬Y_~oii5?ߡjc&n[-hcoGE»aGHz<l{oϫۻv#lӲSF; |.|?gN*oogjU إ?\`7) p]]GN!hƙ<}py 6S{)+\X]/.d|$C H ^o*r%8na7.!^oZZZxw1Ø1cKx{,St{ @IIIEYvaGI3n7MwvH̔ze)kv I,i([OjPA(;xcʆ☉kx@hnhA1L›ĩ8 5RZś%- MRSGrKn`}=SGTmωKW ESvD_oVpbFgؚh5b`Ɨ钭|d+gY֦H ! 
uZȤ.';;A_.L !(;B_oOÙgi[y嫕tD&zU)4X+D%sPd!Ksk(-E}vcBP1yx>sE[8fd!1`u-f5Q1OX _z#GהOvZW1 :5ɎL[̊ff.ھoEe\NAҴ0MI}0NQ-:HzY< 85U;#1]Ĕma|hlM~{ ..;n0o507 K TGr!Ǔ$PYD3zee%gf֬Y̜9M6is Guvcƌig[j/R~y晽Ù=Ȗv3ӭtluBv6|^Lnsx;?䥗^wi[:CӷE"ղc=)^\ʚOߕ|4Rx4%TVU5rh)"Kv{*3mk94?mOngKC/9Ӓ0%}~>*"Mv q{+t3qHxvLQ;{#`{w\ru-1n~>1{<\p@4Up|o v@MIˈu,?N3T@}[8(JSk/#z),54HC"J ']>/RJmF*i+FHyc//*Ҳvvv}gM|Q"gTpIkZZ!) RRɮ$j֓ ױ~[{.Ds(I^ͼ1gg>鴉e>$MD1s#dZ;󗖔dyykȲ,f->PQǩM(ʹt pRs;5L,sEUU,KܻmBy!@ /$!p6`pF%ͪ9q̮+YfN3~!];p<hq+q<@(׈5A^ 3d&-*{KBTMᒶrSc~~Iv휵ü%|mp^-kId&? ų̟._ηn^|Y C@^kiJ`f#ulǓsl5޽\qף1(6AUU,13([c͌8Ci-Fv1tzB/zu49ӹ KXf ';6ի?>ch> #:;qa6V G.uq]mZ\ʚiɄ 77Jd֥H}ƆeXWg9=CxzREHX o)Q|5!`(!h\n6b`,p/ I3x 4j4y`Hq4KŊR_)EWK} rv[ưYqK!g`M7gHU?{C8p<\sm~p.PE?3֏}kJ\){ %] H\0#lE82g. lXN\ 8?VԹӕ%Nl |ocRա1brcr\~~B(J{c*R)Oe c(9p"`Sz4U[DB6m䩞ZBVCOakփ K,I}.PJeek) v*Z[UkO\Y˙$rS s^IdP_%Ϥ;BٙJIw+A=O^vL}/53^{-dBsٸϪF¡֏>zOXiZg'0W3yfNfS" "%5϶TLl ڤrC;\nCɢ!l򾋗X E'w $sbH c={R.C*ˈ0x߹$h.7<_qZ",n"sФO,[ھElyvT^w;ygDڦ5]ˏ'\M:[[?Xf3=,jdng[Ua.v,ea;% B J&?Ҹ^ڵkJԋ./+>X ΢՛m XlL+4Ly7nx//Rn:6l9駟^^eKy'z ,MH.\#sMHeGoo:0YKS4 7?vy_ 1پ8 ѐϥb0oIs-g/ ԈTf /˝r t :8!`^8T kEB1[*^\Hb0V *MKxJ5!!$}!V&<}YMG~9$Ag,Z"n_?a`<߼i5Y-ΆY8uwo9rʙ読>p2\xT֦PH`S^ضm8vؤ=&o{ۘ?>gqǸ馛8x?t8-K.73ҏMg- AF9Pn0_8_X>8$2l 19DC@J͏y0yfrҜbF_IƏًRp9r]L߽-bv3g4]ƈI[CA\n}0ထ׹mO) MQ7uEy.p#f?U0(EAXK"*k$s6o- O{f0ASpDW9)|F%kg|NsGvsGho .⥀?_n.Asũ-Q?pL1a9Х";{#&)T-Op,:1JȤL]ZVy-  q d]̬7GGr PJ]mI '7y<<_iy晬\.5kְhѢi"_](9W$gOhnJlv+Ng]GosWd:,/o={<'Ef)Jo :DO#*3T ьG2klvh _.X .o<}_|yJk+WWd ;rcуܽXE0X8mѢÉ߿׬Mgs8- \woyݬےʓ?78gL/.xJuIgsh0MGw>,C'vB,hRӢXD5AvydW1#NH"—Z]x hfo>d츊oݺt-:y;_lzF*44F|YmQtMNb{*Zds !-1yN EDk jl66L_ƀ (߁miȔ?W9\}%G/yAhZ'“`kiJCCfҏ8?V` H\5`J44%U?_(شi^{-W_}5˗/w u|'qPǏ617vGI"C@:x'%s􎦉 `m AwRLޑ49Mai{isnghO*|ikoA)9g婜Sq||c=m"h N_s禛n><ZVfPBJ[J?t'W(P*B!-˧/&K6dh,z;g~;x%|>(k1l;v~ho 7H!Hfl"A$6,m'e8e1vؾ9oZMvASgptrt_HsV4Y,q^ 3F G|w@3>^'> yoXw36#2 ϙ\vw0M8$/jReB'g)RIemBd(a3`Đ?nEny0g,Aw[D.o뷝ɧ޼\^/j󁋗r/AψLMZ|w; 6霋&m "Is'r\,ò=dhzjqcd$O .~AcMcf{Rwjs90Ӥr3#(i?𥎤GCDmLSnZ1~Of\~|+_{e۶mvm|/~NvS|'X{|s! 
=T x%C)g26_ ?峛 /L۬¼]O!p]F`FsGF<]ayN.[7"R.QU7[)E[,]!|5;LZB 1Ig}tb۱f _-J-=B~;2VTxGw%9!l]m&+f7Uy|@/]svjqJo W ylmd i=@ 4Г/؟g8p6wNVm9`̒Tl(k|램yx&G26q| s{V19O? 6]$ +!H{eifTGt-tlaB?*=c 7.x^ jfʳɍnC~%bl틱k0B8c/Wj`\ r<pPGdkM 2w8c`ێ,ؔs L*yyl M28wUc`^@K4Pshk, J/o??Bv=˃zi6B)OYhdvie O߷n矮YkXem2/ ;%_\/\D ]Hړ]Bl?k3X2G&pxڪ*t^po+=` jy&:CQ'SRB=:X?#޾Pr#p/X| MɎ'H5BxW!^Usqn7z00$:QãUR0g&z 'UhNj`Ha>x._"̞c{<3Cԋm5) ۊ{p d{r6#SŻ.X{X03K;.cy{Rp<* vWr֊Nv3ŀ0aTs,I9A)N\]'G:cyDڮxl"4h1XZ )84rX; 载|򓟜<{V9]u֮]%\\ jJ?.Hv0RVIi^vlyd۫d 䖉XIVk!DGyiߕKLe8mk0 5_P p0`;.6 +=:MxKgJX$sdǗ0m gNc(^QTʷCR3[œrfUڄ_bw2h*Xx*b\Y" vOnEo=i1׼ p!D'vɴ_ڪٮ֝ BJT!08kh{iPV@6!$*avϜ||;?$w-xhܪz=>pJaiBPnOڛc0A yd c)Gv pHliuڹwk坌{C,krٺ/19]|-kIebLlf4,7m˿X45t9Cvl w#2x?Qaa 0|,=k]%G|@,3,.$U'ft=/i4<2Dh&]uwf\s5X K,X^i%\TوAnl7Ҩ|vwXR}O'ZPB&sY葮N!Xap?.T>C|OEaV5t-ocY{'!u ;rY**O)fFkQrO_u2Ahmk7/ߕg#4N+?-Y]&Y17<ɘ".=yc㇄CtլZ75oݿ '2Y,[j>{ьAvY5䉇ܶP-,k#l~}A=A0Tgabf˦ȮgK/+E"{.\Iyd"|W&,.lg'sI3 t29|Iw#>~O&@$E<1oA|׬UQzgذ tq=ܿimw#igdC3ze${تR$FfZ+rs9ьCsM-dC3QFg3KxTS ٘ה;0蜰OX-Ro˶# g\sb2V#dfs^/tnVn!H<_q=.aVxG\R9 9-r7jEm NU4BF{3 #r4hqAK*ZXʚ0E4&/OPdPg;qGbĪ/4g͚źuxrhN:. 
6p _1<|ǵʥ$t[wS5RxZX4y-3O輗_vfD).b98|-\d)q&|5M̓^m&S|)e:H"KN_I:l!/9#xQu+ikK)ho q[b܅hr8Τ;W@K,6(8uMGS eg :筚Yϲm㜯enql$mYC>:Fs$H:0-Is}B[ XSs8gф( *dm3u+W?4"A=|8yn4!B 5Ef|"*pC|d͂Z5i?@^^OD9m1̼Lh,]Jy7u!˗/8?ؿ?<$8!w-fI<$Rshvl!Ozr+ѣӣq4|+=WLeJR֥}7fPZ4u92/{d,]3r) 2z H!0N>vCt6k֓8l7O Rlqcуr&,jg$9YvUBֆ [pc> %]~߽x0q,gc׵}/\F#◎]+ Oك`jp9 dz1Zb<}Rq1~|nbgI"&S!h`ռfҹg;IKG -*3tsiTT`Hy:P-4tWJ쯻g @ww7o~wٱc_׸k{ Fd҈*`pmc bgHM,Z=p/~/_PwH80B( U]iRP~ IMRrO04I2cX}c|ռf-{qSGik 1X1/l3O4+ KOGlMz9"k{7o+n~q;MI-()9V s43t ]2z0#~4٩/x_˗/9.9C u:Pg5a"#C/'ϯZ 92u^nG0x K\ZWK2t)9%-փL6u%bo$ g67ޱϾL]c4CJb(4V{/ZBGc(ﯞlB`M( G6#2ikC Vu1Z^gtN~/.FqV tW_^wbhŃ8؟ 21uO[ܽ(`m#,ŕ߽c[ & hwPkkAp/r&]'o9PTz3iɿzY۩裻Jh*& MH.Ƃ% Bb;矏i@ Zj@w&i]}]7}RļjrhV={Cq7曫nfphF ǡ+F=|@̔ڥ INB:R%Ϝrڒr v;qjr<hh 1_>Uc8ufpЊ>.?k7}Z\^~Y oj{XMs羭=OM0)dž<'V/y-(E^~LΩ[Q)v<ҖCStz$4$rd-X< )$O ݐx`pu>Fn;u%h pA8@?T ˲b1.\Hww7'4 CUKz4KPۼA B(izxf7m?<_:C9(c{ MV/ q/9o~f:,/p9 jI͍"Q=8Wki-;E2Q-피uMP Ar#b.,B*t]+kB$=4aslf!>w|FKy_$[Yjw<ޥR6sQۼy4MRJ2CԯF4 ظhX57_Y>3<7n$+yܛ=DhhH/i {êzO=T B>{k1>*xA(P[&2d=/Zʏ|bt ҋ*J $y]5FQCcaW I~峟,V#v۴O~isWvZ~>#=o (KxNjNQ# ?8_իꪫя~Ν.9$/%v4t/KTh2t]ַN-d%IQԭZQZP.=mI;g(UzO𵛶JW0d"&#\)EC`ۡhfdž2?0=y+g2<:> k9l;gZU8saMϒ9,pOA|b8m؎i9aZbPs%tFxESM bYۭٖ=ui8J9;=cfʫ-ȿf^ˣ'j Av ;yod|_fh-l>.ROث=wZ͛DyXpɒ _iS<{9C;*`f})t4rJ3gռI- Z4FL/I:Ư⥃dn7H.֮]M7BmH=R&]Ug#f4 ǎ;xGټy3U=Q|'qY35@23$'.f#x!PLBw\M4ں`aKb!_7ޭ=UӖtW44R0gK!C]Vg4yp{/(gVN_6\ ]ܵck^qp]cGqܼ~g4аwvl5PIN[;ͦg.-\s"\#B`h,3O2g$OݓW{lŭ)^H[ukJ$g =+44"Jq@ʤ7`Nw:l|nݟgWR\*d7M\>lBrk;: $ʳBЕR<ql2?|>#\[=8;^<'Q DZxҘf"h^y=+) s rRNJ!.#s:\uOg{bA_3ycXiY=F9F9G -ey8+:Ղ/2t!m9h5 H3ߗib])05AP:y7kTNX< ĸLﴏyf>TYW]uմHvR &7Y_4l;4<>buiJ*9hf3B<4硇o橧vU}7݇E|!=@Kt M0dIT!Y`{ ]Li'8* ej¡?XcC)tE^,ѡM!;ϔHlۣ&_q{/J)bHXjHrt<, r}d"Cdau5}=r'9Š,y.z3b9]ܿi3 csbjŜ àt @T;RR3PP4l9]mD)tMTeo}*U$y$LpT> O4q8ܺEcc#a8d٩&8[>W&'F6Kk =+% CWKRu]lܸo;vL{ 08Xz5k׮eÆ ̛7ڮWvɠ|ߌѓqjR)BwZs&<7t)$HSž`-rB:^Q4Y:96`ݱy90)nzܳO[^xU{T.]K>3>/<2NC*AaY,HH {2o?īxQ0>xl%sWd_8wo>ʪ-s zF|x)x`[qh ;ᔅm;No74fE+V:z:44jP?$5t!+`r2_)bۚRk.B~|?BMgutIMoZ駟rYfqsꩧrꩧIkk+HM6exx>6m?^3[{袋[ uRҢ@zn)2Vrq%TjtwwͺuG?0?7={&OzJЂ 
*9Isż>g#I6)B^H眚sRhRdKB=gȲŃR CwKHkRL̓ykV׬YwmƕW^9e0V;vOz7ofӦMuQkL9BSg6!F]]ZUh4:6ACCo|׾]wE8H) _@1Cץb<3107/G]kMܪ>V(R2e0)rJ~H|Gʜ2Ϟ!5n7\G+HelIA_i+sڣF2Λ"v]B2+OPH%S~Wq>Ɔ{]MU.];Sro[j0A{d(mϽOt0 g6%('̪ħ]" WQ tLkb5h1ҹJ':'Z. ŀ.p_{زe yk^4g?٤W?Ze ӕ@S2P w}SnSd͚5\xᅜ{E psÐ'b0m%u,  }u1lR]u'gSfuUɡm5H\ s;͛_m)!Z%-FKV[COy C+Js`FSoܼ ꓋\JC[^HS4@HɸE Ɏ%>phix"a.cgF,ǯ׭w=O㜕3!⑝l;ܴ"Aߩ\L|zn>CuDLYt>spB A(jOz.PI2qCsRHvfNz//w]5\y=)˲j:ifB 20\+_զP9=f&~zv=v455q9pg׼WݶA 0=ZpCxvCI5~%CogΎ*F:N ]sJ:KSGJs躬a]~7K 8޻h:8ep9 |Ml? 7={{QRu63[rgw6`u#aʥ8GbJS@A kWp? wfL#C++=#iBfYOypU_3DƮ~RXQr]T񲞸>+ 70h74HetSbUHe'q3X8s!vUgwswz!.K#fPʩ_{v')TuvpY'7gDHP JX)zBlW˓4Mi*mIY[,O^]Csޑ4+4wo]SIdNdgKeF6U u29/b3-3z7>s05RNlg#qo򳼊珜 ۞Ĭ]'c.Ym}LBF2ks9"+4N$7=,{ !Sg^GutxH"Gk,X5, 1rdmrxyHAK} s/v<2[( b႞g'>4HBRN77Q#ݻ\_!{9.o|IJˎLDDQӮ<*qsws-pgT??kx'<@bts%<`KylMHl)v/k/sZ2IqTLaZg_pPr&y t/er#4M4yYl~ o:s}c\_*t4SGx/FX9ASXdk{K߿۷ ɹl 6i71uIh{_lk7=S ASC }߸a{z|m7oZ]1Rsx7i)#ۍ%suInq6u+e=tڵh*VP i'cBK{!u<'gرkY5~)[7o\s[ǐzTYw U"b׾_r{p||믿a|իACY('0b'L]zC eh4%B&g4ĭKX Чȶ}O'COZOQOx '8WAaŜ&.Y;pɳ}c> Ek,珍܅(ẓA ߳4*ч`< s~jdv[-a29GM3t+eNGcC)@`ق1h9mI9ku}H1a9GBVkr H,Cj(5\q 讧|iծX*Wp*ƚb,c9]*OC:}>*'o|c*Pu-0B `(E45P8\Du\s ۶mFn5kUG^P3hƄؔS]rӚ"740uzͶ3iℨ0ZPoz97'5uɹ',2=--ۉuv71KWץ۷m'g{_ ZN@]D:ߺe,n1U<7hnR9=:7oox\>j0o|lF~DHemx؎)*ٺd19a},#뎮Ae٤r WBkS鹗J= ekHAJF5B&5Pi_d]Rq-ՄY)nx 2bhURw?Bя~;vI|{c =eԽ%A'5Ț˾N7&*qݭ},)vJM{S݅+y7f-z'طt\[Yȿo] ]?}9j04&M!.vE Ug. g{&}!NdqGb?__:Ut*hY=˫ОRj}z)*y{L>Buz9.]Z~9Qu\+MRhIi:[9v>)Br9կb /022B:]*(=t{Hׄ"HƲq/(LVdɘ%4 bBTr ~.1X8Ԃ'cii޾Ӗ|G|'cim3lvC?{7k7Ծ!jo^C{ $CW1](zF1'09+*ޜל=XO_g0U xJKA,v|75f4U^h27aQ,COe2u ڕPaRKp. YZBó5w)iiZs奀M˲jK#R5\[4鞢 O1ſۿ|_kpFs+gO<;I(I"@&?RuŚҹ 41u@22VR tObɽ4_awm>ʍ(|;71lJxsiR6pb8-GQ t7r.N_։{epcf߫>$ph냌ƓD"~^ OzaVmH2.㢔oI WN,g{c帬ezd,³`uF!~Bϗ)SbP$r{ 쯁RRbll2dAb 犅 o}|_?a}}}9s >g]=IҎo;M&9 QR i鸗iRL.(3_ka$e__}Gdkq9¦l?4Br {'i96Mggng߻)/+k|̮(>s.|z‡1td/{т1ZbwsQv1l3s棴Ƃ|tFx`[oE@ϒ]4Yy@,CNHe ]TnW=0eSU2N`yؚkX? 
ZbVj tOM}8IYG{KZ [l)& @n'׼9n*K+W~75+~P{>DqD'I,S8",-¦nU꽠4ieĀ>KAuD(T13:Uy W5}OtsђzhG ~~isY՘1hkb9?p߾ٿs+ Wihg ܉24hbϱ1dmH@'u"Ad$%,G8\q>2bmr-,ǫAG*P|]*ifJ)tM_ixmJ$CBa찅ij5={vQmϞ=Ӳ<}qM7;VmЕO{psM7qsOz'Xf _'L%w)!aX^i%wsQ#lT]@W(AMVIqu9t3mLjPO^4ȑɹ8G*-ï9e68wQߩͯ6s=n}[{3Ξc!p4չzh~}/a蒦|PUh3rx R1i+ ZnE߸\ϜKlUn9.bq_̖M`R"cXrE]'HRUHS50w\,UA_lYe =T;+^ }"etI/|իW|G}rPʡ "HqOU2f)HKщ1KЦ^^rקQr¯ bM}4ۭk}s,y{* bF}Iz_NGSmý R@)EsC96Oۑx@WO͠ Y悖9drΔs'$se4=BTJRЕ"h2״Uh91.](O|>UW6;0{BB r( 砼mr'ϟ&PtM?xSN1N@gK~lݺ̛7oxoۜ~\tEd P]$H J\ Qih%wsjr!&51e?{.܋z^']<~OنaS#WK&o0UZ+ʸ|-k}t a~5Αl2wCz6Gx]Jt]cBќώ?3\ILYr?NkfJpM"[LyYYrOd3J5ׯFM{tMh"W=]q| AΕDM?_^_K~կxӛ4Z:Eh<tO5MgK_RŘA9ڤ97ȫ%It;1Mt'UIE9M%ZrJڴ3tUKe7s 1>}E'hמgr2 |5%WXOi5` Y%g+"kK3ȹ'% ;F|1B9Lww7|IEn:ַ\uUS'kK uLx+ןg\{lݺ.4kUa]T1>:;h4\5M R_ΞC;B隬UA7|JE)#Nm_+azdmy2 ǸݴkGދ-bѢE\q|3y.`$~]r33gNA @eҎed6|}c%iIT9M{~p7M]+&Tn\ϗk YT /Rع$R踞)`4c,gA0.3ChR7Q-㫷Ma"*ѯ\y]=T07,Z2 P67]OtOB?+L"ʼn[B[ݡ8\wx* s`zSlxŀ^9go(Y0|9Чгvt\pީ>.%9-^W /OXD3RZ)ox(B$1?Hc#=O ) !DxX&Y. AfEI^Q.\M~LI.+abcP Wכ9+'|^xاtT*0###pQ(z׻d9 >0::0Db̹i\}ՓE#|4?NvZ~/<]覯] =@OȴiY˩dN%@ 9nq7Ȩ_VWWDy@ds.!(P( D0MY3M1w`W&h7#\pǶg9c:̘:~kCyy6iSĤ,n{O138ẊX`fKeyޤYDƮDRZ'uHd2|9b{cOdjyyb\9~x)I(Q;,U+CzKxM,eUH+]y{˜9sjlV'n@Ǜ^@/Ϝ4ccc 300P q"H044mO+CSSdF@/9\*u\wb =hhS܅c,ߤhNR 2V~@څ؎JGl%R?_צ :~(ΥQui_ C E=|C%pOv*?4}{yG㌌0>>HU%9!G-{x.kRDaSl5ҦX(^CT+\`jlWYɆ^.4"C?^ !}9Ҷ20g hd`緷,쟛*8({P̦!jSHeF4dd1Md!a^{ )`1u50vt(E"χcJaH@/xTΗsu̳:G4TK[tz(TTSq*۸|ɴƂ 'U*"Ap@g8]Ca ~C6*+0oܸo~(v9ehb r^>~vMoo,rI$ATKs~# IE7f)d!TZ]dmTahS+ɲ `K$8%:Nk< 'Hst>_+^aMV@*evzc4ExO@<`>/9nc'eoev u SJ߃rxHtA~I:npoYeFK85ȒHSV#4j/z}-eFshY>j*)NTKܪa;Z߹p/W~dƦ1lV~yJ05[ gZ.2?bjcӂPE5yr+K/s9"tww߹~@Wj?4 t}zm='or!Bp|}-, s3Qn])t&84(^mnv"{ .j%~RHqSp =_rϥ%g+n TRo6e8%hD"14+4dmM9=?"!3~F?vmvZxnz"c;&hY'ȏ%2v]y6.gZ.2},db]CUy]wu+'ƥ2~Isk6G|w hLv+Lal |'F:-F>#BZ%їd|;[7u]U1 dSTZ=jL!J`hbPf̘AKK MMMtttFSS466DSSXlұ*i?/}R 4O &L]L~h_,ڐY_TF_gz*L&&k*? ߜx夸iJ3t!'@8hTTuBEC/:p"~z]dz;:Y_p[ B3:/2&-b bIJ]fE0tx苳vAy3crHW!XNH7V&sDՃϮ#yυ<ό|#ͼVF]ǿE x#{(/ U ! 
!}{:th<::[">OpW] ΍S7;<|)?׾-ESЅ+?M"ɫuRֆ igEZcR`%U:uMRgOEz^Fє☞ yh2G,dh8W )Us[uRY;_qVr[>8/]?2Goe|,.чYƒ9#fqԱ5xڟ9>}v}C\~Zx9%LgS s:"`nGvILI@rtmh}!@ec|Oܝ U@&琱q=¥6-"[;) X:q|._NdWAHHԉa] J ZiCH2;n߾}{G?bllls ҵޝʡ|+5k0w%_z |;w;vBQ j%C,K8h5' WP`1s#D~vu)1tmz=11k\Yr_ڗR`9{@׊Z =q=8EqZ`$) 4p<(W`f2"1Gnxd㔿5s-)@`Qd(-J&6g7qҜf9tVi`1v1bSNL9!n)e4MqR٧5e$t잧h(?䖘x5+J A?Q#y zԼ20RGȗ'C߸q#x;Xj_ןS0R~]7 5 +wP!;;tJ]JsJ֐?~GNJWuK͵?RA5M{S_Y֟ )O+gћF&0 ~_mD^ t=T֡)X5h~Q %fR3BsC[180/­w%Wo}w;ij} vĘc-y[KGS3]!GUuIrY50Y%snӘh{G|)`OPBLl@Դwp]UAmҖSg 2t7u6b!ymveBzD L1Z^AJà/Jya ڙ0wfhhy[s=os?T%:/jn)+o-HF)At0G9^opFmQ-'eJ)ˍGCo;gY)βּ|f)fb22CckM'u9wC$36#ALMqd4irO,25Z5p߶%U&<`>~C_PU+p\{w_Q(>|I“-1E~KuY%ݍ4Gx0cé"wh噃#KY\ B f,SpοOikkT\\fvB8M'45&Ϸny-e.zLT%sukbQ=V"}z7Նd}rc:~JkcYv=6!Yb蚺zEl=>Utg!#֛_#:^E[2B_.܆`PД= 2QfKz4``j(~ k&QMC3x2o_{d'Fsww_ɾӳ|?\:.Tɗi:!`lsՖ686%wpe}i-UhTjV"b/aelytWPb$mmP€{lY|+l"7 }:RQ ;{ON7Mq V)YnC#,Wx hKu'?W__o}IoV p?5Yλ?x|+ _X]t\x} ~Mdȩ= DLBuBy BI^EXPfV4JuzB4[ -5=|l 5uxDBe#NG:Bjc3e1;,i.Ƶ[*G э*#W0h $޿SsEC'n{'3; f6%L-[RL- H>g^GW6DɝHDuJU74iUVԢ5:Zyt-N+Ί~ U wnH M!]C3l|aNwf7_UYhX`gǎLRnʏ~%ր'͡I5Fx͒1S Krad/0(3 S\>g VtsY9CMW[؋JJ|ݘ L Tx'j3JQJU"fc#ߞd88<<U!t.sA]"JG6I۶hr|ʖ<ϝÓǿp ޫ_yݧlCT|2&ұp򑏬>^2yO53'q/.d5Nԅ2D ѿSЊ}W]GPkv|MNInhj7I6tMASVf ACw:I M 䘩 gJ$:5 C55Q=t\ư*N6>tx|67\$չrS;3@{*0])3n%ߥ +d/YA`D][6KPj.̂ <\+([.Ͼ,of=h>Pշ䱩>;nE؆sTǎbyCsdꄥJ8Z? 7'X~h<{r]le—$y7uх^<+!dc`"G]نF85NԙP/$ BZ;"\TB*/(d#.}q=@5V/i>}J<~7 7pxeZ-U% Z -LBEUo# ;wor9|AzzzTr>٨ EvSmWMnDDՆF{89V`y:npʹsݲJ>ZFu-CwUX]4V3OY$iWFM U5[;%dTp1V.[78 %vc‘63ķٳ(}9;SW^w {6^ȌОAqhό."ik2}uJ^,f2m\2ᛏků v$hOG&Lbʓǧ]HP5藿m =:-3:SiO,+QSkTJװ^zϬrXEƌ_*.xTo*,FF ~kA J9|$ёrX._HA@&=abH|YEC eX@DVd` v O}|#睍 +/=\?ȷBEү\Ѐ7y% |A֯_Ai*Q śK!Z{xCmzU[:HG*<LU@h}Ѻz[!k k}kMfd⵻TA/nSژ k$6S2wilڳ߽\hH\nhQ KǙE LXX'eND w߲L}cmOo'^04UjTG 39bW<0 |Ex7Z1O-}iHm?fdaB"x*f4|ժX˿ߓM%K :yg]57HHW hnRصGMpzPM ^drR'%\QVh 3_%TT#>яr |B|qVGRJl~т]:rM --TW%p+(Fбcx^朲+(FE#jj<068ú X\ת3QA %4&f;o h4c< lc+x+x, 蚺|}¿I< Z}1_Z. 
KuRM_>Ϩ:;TjK7PN(Ug8^/X'0]yu7wdBGSs% h̊X\W_я)9;tm|B5t]EQT-޹c+p8XPJVX*g.uhC(&*Xd܃ϧ-?gf:1WehK-4eee6+ y F>$j%wqXdž8=Qd*_=iL8^@G*_S).3R}ytEBwx(z Fww[>yjjJ\f~~y|ce ԧ>7 >Os7_疑^" -yMHZE3h^y-V9crroRimwrqk0jr1t K!=9#VA5Hd#lcjD2&i]SPԵF})NSF,Kan)|?ɅP-jy^5lh =)o=|7uG%QD6gf_=xܒ8.;F⛳(`hG(f8{smgK/kOJ̪JyW,b D(c`3_r:Ҧ!dt\P<^Kj*EՓ8 k(z^ĺﻛ2W/f>{ T45J+ H @JK6{KE/wwE~FFg~/ApZtg9;]fbJO.z>R %5f3s)@ rCh13idqb˿c[n? :[΂(j0HEL|Eu|P.ip;n;c.@h1:6E>V?'V;|.89^h갽yו MEk ˈZǩkJXckᨚ*=)]d̀KJqowzJcdgu~m 3Rtd3P1%FYs.֚J;x6D5[O.\(b1/{׼5M}_dΝM"p ^ *B2I \W;vl}3]zy#j*#+}\_6,A*!N"Kt|Z©_T!p\atײJ޻(Jqقz%S'OcXeJ:SvCnVzW:EQO(2lcģІlEr]IDYkjʑ|!:8@o|ɡlN9'xgC2Og>ɳg\6☦<t?t#gy-֟r|UohHJɘiyų B$UVHVtkvz]SlQJ+,kv21Wm\;3I{*\j,&5Ip=򣩂Iz/^2J-PhJr D5θ[#:%Zyq5p=NØX,ۿ+7ַnP.P)pIEbȷߙk9ggy|H;4@=}|y1֟$(OpguRX}Yw;X@%;CgÃ{=߫0jAPՕɦ9tm!|Il[a`u|'Ϗ7|CD l$@d:ʗ>Sgqj|O?S._{)oȐ83מe,ЕwEǬU ?jqfD\SلD̗W A@n[P/9Ju-t2={1Er܀PC)q_ 3p( y+P8!j& ?PژX}؋R@g4CdbNku|o{?8~;{{~gx,]IWE(ZZi'?oo}\E:͢Ch E_EYI&Lb5(M̕bώ rυߥc-Ҁ_-z3WقֱH0c11VENm"M4JEdIn-OwCtTOYW-?]O̗zbHncn_Ļ>v%ņ' ߺ|IypG߿xc>'?i']7ofKoǏL' 'I,[fY= s!FGeD;#gO,<_ GLz d2`ha~K5U,* dػWlhź3ܷoTddEJ ]!*GO[thЛJ/T"=T.IJ G%w_WWBWWWssz+y{8x9ӷhYW 3rU+~a(Ow녀BaѓŘh=>akKFjCct`bFg~Y=.5Ǿu}}LW^(BA'D6_AѰ- ᖈ(Q,4<|#;8lIG5P-Z0)?p6{떦|STlS_8 R!ՂdTg{љr#C_|O GJse |_'i2| \h|qw}swqv?ٳkWb|SmsR&~uX">OWWko+_ mȿ: uIY^aXr?c{:!Fך7_l"anP#έѥd`dn BP54u/0hV[yx3[>j$s}/޽ ~qtӼ6ڳ)|\_{=Wq+6*^LWy+7db"@L8F~[ n(2&:"8J\ɦPu:|_.slSpT}*yÞlGMIb^hn[Bb܉²Q(C[L,⸡7eznx)_:MI)?͎;֬f a:$FOI -=9d: v_۵k׾zHA+ӛ6}1^KC3֞`ڞpm.u}W/'=sEWH26*-֓K*^0yI,%ŭoz>N v:i1Ǻl WmTou [SW0u|y̩@273[vٵ.˛]Gg:)*c~ Sа寕tj`A<}}"޽C?3P?@2Y:\/+CWLJw[tչwOT,wqSÍ~(7AS @JI6.4*7X|Tmm} GV DUU2\z6Q+H#-u~r-o6盞+k$nBJ|M -R,#/s)N䑁sNĂ/o暯hH|ߧX,R*( Efffaxx6~{X!mE#UR6n= KrcG*/CB\pb,>^\2(yzђ"G'yDDDwv[z6w s+k$7k1)n7U 3) UL si*z6ӥ#9RHp}@ lB XecsoK (ı)~(ofٖ%]tp1Df*A:|ur-"ʞ-9\/h]t2OמƝM I!]'?gųx8of*ދH>4UG;φJUoߡ'U#)Q@|"/x@5"j8BLhl$OP hu)J,U::~L lD5#΂o7vj,a>mcV _ z@F2* S=d,Cg+3{9<<϶̊q\d|.\NItXȉ"b\q[ۨ<@5U"KS3jxV &L V8"(V]}~Oߪƶ/hH2N@e>SA5Q?禛nww Y|ge\u]*R.: E'p 4qn{s> cxxs{דLw7PREJ[vÑᏼ-iM5U3̥HtZ)¾a 
媳ui<5e伕!$K2T]BVN",[j8ϒ'48rUL,ϡ aꊋ&Im՟UT ŷԣL1N6*oWwa&m+ ~2bBcdbqRy+gyuD #yJv)x v$HD ?FP[~_HjTka s'')-% G̕Lw/Gt3 S gg(eXݩ"g&D Y칔l Z3髶Go[\lj'ΐ+^^ *f1xz6 pw?3;7e޽رc3P?Q-u{ul᭎tUCKdWYr HL&fR)4lt}eXywxHJ (y=peżZ] = -}j猵b:,[I# ;2CB_xH4aZd2I[[tD"Aww7t6:;;ikk#4O$d24܋zb }@%aZ d ?Y/Cr[O[ 5G˵+НbڪPrsZAf{O4={c;gLn9K*YkYK4mדjPMТZa2VR# sgiQmlsb!P8Ǚydžl\4n>2wOwW1Pezd"F 7 C"N\:4itulg`xJA"ՂE '-7=œǧ~f kEh-Tiט DD6S9ё<]GV{b,ghѿh>}1Ҭu̗l%Lч_ RBO#u'Nj8Ł t$rB!i5Db1C}7ʫ/ r 1T%`H'b`)^uY/ =@e{h[>)]ɼEGek ~z]貑*tt!*E9}Rt`1>WaSw [g" ^#UHt\)q~#WNrExECY! $ºhǫ+!\jzlk/aKنڼƻ/xS8ScӢCXH-)B5cϞ=lܸd0'}t ftmQvݦlttf™" U {BJ~ 2 7ӘqfTNL L];bB-tZ[徸䮪ʊ&/;XO:(=H_k1RgfYߙ$jj̖,qy%$wj7n sR 6 2W(a&1K|'x;kOˤJFL\ߧ\ytgP0_Ur H :e![,ÓL}Y md ޅ4I]x#Kޖ+4vӑ&%*r z & Hm*q-d Ta>]3 r+b9et و\Φx,qN E]6dmxjub^uȚkFHN{v}NΡ))_1B[S5Ts\[[z $dD {DԛbQ%"̶'lQ~T]]y\Y늨ܗ#Xb\<_JJs^C#z%dt 9JUFfRW6i )u08pCS\yv;rblOБrs#-wYR< SpSĻ6!Z^ zv>:Sfd̺%1^lfkA@bCWٙR8OXߙ`dLTfqVhx^GlIlQ.NlVJ #k~=ew^<4B5=duLCB"Ev&Tߚ?#Ʀ8B{ #a=Usa暭df͸EֈW pEAY przaR^[-tUa*o!o01΋!]>D[LY35ͣY0О@[eѲPW;ڹ#>_r铍x<oEV3꓋6WBH,`^ U6JzjO.~uE]R |95g2((zV;n̋?! b\ Cb"h~M1RYSU@&L}nc]a[%+'l{Ld.`\Ay]ה0ح"䮩M[^r]gˤ 'K5t>zu '9)DiO*ggg5s /\sn:2|{bky}).ټgβ~MWlbP$HI2Dzmt#vn3=;)U,>|=?ʺn\Ìp1f֬p|v"d`B¤"=E|p zb7;g5U4|ws9@ T@'u.e[88vh'U,OagW#vYE4R MfrZ-&]{0S+6k=_6 #)T4UA]!A%}AKFf'읧ңt8M+gl"D/~ٰ-g3$xɽ˸|fټa[8wttvQVɥٺ}'Gy W?mM$>-] ѧ9}fݗnd['B5"F(-j<%e=A_.C+cRn H."/:dCdf0 8K:TjѵxӅP\j덲}#7\љ^uJU{oHA0$)9f71[ux~K?`Aɮ]UVfW&frDl"<21CP D1c%z6XF34v|vor|J\27ndKkf"JT{/\v lo~>|;6 ;ϽFMёMs#d2ib]hN&a^wÕ/ۼrNruWОk17K/PyeS\rxi1'~2ʔmwNyW/s|l*(\J 3Z>{b*׮׮CD&4%@U䱖z_@J03;UuURU*d,9 Bo?v'j^DT Yp?=HUS ͂?)"M0^C740h~*NMˠ$ca<tS\U$_ ՞:1M.eҙ]*D ]CWnS\!hU7@X=ĵW^%kwͷpzxdo HY{ҺN@U#UL-(7d-G.3WjDhbQ&櫤cƚA]J0t"/v/lNMk |־45#E1 +)Q* kili/ӟRT@xEīpJ~Iܓ(/} <_iq)Z 2zN 6P /H>t@OG'Bۯ̠\?Ov_'e^g lxBP=91A->kzb-D E %WCIrE:"qVٕGγ7&-"Fw6.V\;JYzttv1:1˓^%?@6.k<ț_u13;Kg{TUu]8F$B-\ {vl⟾}W!%~Nm#O>GTftwv019M"Ht6'KD[U,\-i86ln A jN~~黈xP|gL5s]WeCj=u& QC[u;U 7*;C#gX Jas+R`*.: FpIG)@̶]04~eZ@rD(HZG(5WnrQ"-z p|ݽy0=. GU/繓3? 
*|yr7#~'MuY/oTjSQcUE42@7nij-Co0EMuqPkP*x<Ɣ=O;Nyt%m@1UȗmvojlTmHE/ܼ>tjyZseI^q=yg_߹LW_O1uZUU֭[O{{;rl ZamojgyǛ_?;8|s2۶l$ݏ,-AUJ-I[Uzy/~ fDda&ݗi>ZD:r>l_&)TץU9ϙbEM>m2u1?>F6a:! Dה.dBw6JԘ)ZW*"-3JDH=Ij$S> |:jd-n?wx[P3bF\"Q#]G@OmDO# PIT/"}鑎Ga>yS zIm~?v%b %&X j`@.aҤ߾7 a/畡RKFxVCWj֩֒ 9L0Ecν<{rH *8y{S=ل2S`X!1-!ܡ%n9ugyכ^ͱSCO*'QVfMA@:B xb|xbWGO3]p J ѿ [^1gNLsvĖ4FIɒR&.Ў7"5nXs3LF "NI3L%t\].mq>xUw=9#'HGW$hTn*oI\)3l{o[,\X"i mq]Ņq5B`d^>8p"A[pWys#7ŲL,ˤZ`Y愄w+oa.#p *;:8՗r tM 8>?y^M?^PuJBhuuGL]cq(rc6wО3pfԙ)|K>DQZ9Dnhʲ@3DP$GE;PХEO^uJU>>Ʈɨ 6RJ:3Q/@J ;C MHF+ή˯TF&8}vv$>MW]JWg)\"J:er>Joor0v^®w~M?u|C\{=qDZhãu"ܿ*}mqfjV%azvͶNe|T}@!q.(1g{S'Z]_M\3SE,_V^WT_/{3]HFu8O[jY?W gw pph>':TL j/\h^fc[S;̶KŃZ N(2hST3)*o&Q[E0.x8 uY(zIDŽŶvҽ&izH `P1lm#>r TX  M2CS ;޸?H w?l5N֙읾4`NMiB^d"Z0C t*&k O){s;_ԈͥPq1Fw6 ~4"b#\}6>Jg'`Utuv H$B, e24dR)J2tJoz+6?sٲag! J믻\G,9CHD481V`Cw1_8>7_ 治 U W^ripxT_qu:ӑjLlp,p]֎\aД)*C |!YI@ypC0í7EHI}ڱ!|{9Ab"Fy!4cS.wMf8y)SAQE>kRP&)e&jآʉBw'͖T#[BS]JH Nf־ xF}ZE _]u]]Vr7u|fZᅗ]<"`!!y ~ d CО`ꡪd$Q'728ur} >,{.A_W;#cS>~}͍T,vCU4er68j]q/F$qyfn=Nng#11>ΕJh dhgb]( ))5[be{_d{ȅ0O :tU<_r>!QLφ|aKQ\_-0K3 }ט0xM7ʾӳdVh \*,;^ļƞ$:Tl#Z`E@TDrB~u_ R,+^ennE?NqK(rB"u9PT46g[2kk/,HoHnlW\]]E-"4͗.Nb5<_ %cRBԨtMkS ɑ*U[ډn-CWkj}CxË ~L־4+HD&Ũ˾.>Rc]ej>Bt`theڢbEWM'ԓ+ũHJvHx->a-! cg0tc"px#l4H:#m!o誠X019Nww'|iR,t]Ƕm,8h0muK%U\+/l6:rm;y.8mLg&*_ #s80Άґ [13esu$j>L,5 Ngd0kGLl_k(e/0)95QhIDk]z8n/z+EyI,+CKjf#MYuht 5Y'FSL\ћbdz0;_c=z9\ xuy@} eo]#]M>Z^7 үy?OHǕ>z |Yw~adH[s+Z[*M uPJPGД)̗mN0/+kie K,G뤸EѹpQGL8 p~@Nb ~M4lK/Br(W-|ET @4z{z}l6K2 es-T*Y&'&S3뙚 t`˦ ϙpbhTmT'RrN!ɨY|_hRRܰAX/BZs0BzJ#chԤ-/eHlfOW[P`l3ОW,̉?: Rv eM1}t3^丵~/P̃@6HqTw]UNO,إ^M dS\f1RQc+I~`zM󽪪Pɓ6(Q%_ Z`i݋q}g95QdSOl 0T %e~eY<}bgfI<Gj lJWW7;yknŵ-8sErmP~4u] C=M!`&dJB4,];v099帔+U2$.w/7nn=F Unێm۵bo>z/bcdlG2'$'Me祰]^9|#g|7dtRqP/m<~tBi ԎM$P6bkl'톍Y ͑~.ӵHCPqɨ5JΏaxTb-SKxҀWov`75v5AgΜ!HpA_)kh/@<[JlAՒMRl[ݩ=[8W}? 
Shn%nXv/:*uY| J5@J4UhteM}@JΥ( JWŊfuD ibGt=7747ؘ1>WXuYeY`fQQ=(w@Ud'js'C4UadϜT eOבؾe#OazB*h)%Xj뺍(,iJ2X'Na;ʭ7\$#ctwusvxn/U00;&Mޚ%kv aB}gY \Bm<x- nH"U\^LHL<;]|teqe㞧贐o"se+6Zqx8ܼ|zb*Ez皭BçGU $ l3=Z=$詍/1gggw@>QUoMNe !| [)h<WkuOl]J>鼶&0#*IkHO} K[H4Ea[oLĩ1M]e?l}=m15sfbh߱#I=ɘqS-3%6pv:&ife< q> Q'"<%7F)!7jPtglVҖ-l^e]I{?:"\yո^SX MuD"A.Z2|,۷nX(.nP*y^?1T @PK02:$$# _{04d6ኍ9K60CCsTDngfW U "[}@.X|dR7I4bi?Ŭk2^.y?yle 'M2aO6_qp3[ܿz[^7Z!x2}GeO_U?z5;;;cy000$GmńS8__2@1#FD6m J󦀯H v<2E i+Jǿa T[ߕj-|Ϊ푎++,p?QJ$#rqף*ɩirǎ#.th4( a jJGGH8*lMS9x׭̙!:slݼ7grrE}.E6iq od?æ_vtܠ+ellrٽPqqg7"xhBaccZ =}cC M [$q":>#̗\Wkqx"Ahpm\-01mؚ^`oqc14b|~^?f]P<:$;&yXBhөIFTE GGy6NtbW[e^dyxPqmۻa&uwr$flߺ'N088@Xs=֯[A000 A`YU"]]Xtvv"ȗv dqW887clӑپpNt\7OŭUѓtY7nI&|#!ַ駟^9*ak?A:ł=w $ipV%ss"=sxƙ?H=wQV˸y;߉ | h<E[cSµˁ\">Jhi,5UPu| gKdǖR6ư/ʲ3 o5f9v˒¾Fg*LlPq"} $5d]-~X _]ت}`!¬dy|M;y+702[e~~ruwPTVd2Or1o}__cϞ=\ve:79urFEoZV| } g \\ PY3`DeRxf6o6j.?W(ny'~(*я~r̯گhNJv߈P#H"y (+ _L]4Х)GtL9=.vonBD^W@[ A8e%"$0ZM84ε;M^dkf瑙!=rԂS<Vvo߉*+.Q]%31ܗ"x,n=SLNr!DȬqEGG|ii<8BHAQ0>pRES,ghN %9ArOu<$":)|$țw35ʖ,X^&tvT#C]Tl}zs1;2E2fTD/RuZkXculOWy阉9H\JonaXJɿ=pP$ЙY>Y"f3[v}r߳'D8֕. OyGMҙ6S}d߼~Ḛ>#h_җ7ɦMطoΝ;HBR9th!BxppgWrmіmJϲ4@~Xh-$5t3Ͼhh333\{J𣏕Y]Hm  ӗPz̬IxG(K ^'C潧f -־ j}Q' e*]VG2j`VϓXB@B/*K)8tvT&Lj`isB T̠LOќI%"?;h;ܰ;\>N"V볞K%"T+V]B{=\J54wI}nLl&CTBFT*$Lqk,)ecN6E<\.ݍҞˑL&834Doo/R ˲2/і4#'(Y.M;ޟe*o9tTMU89^dll㚭 O([. 
5K8{!*$165!ʖW|Ӯ}gyYbwjnXw':8f !=sQ68+x5}vE?fiKlOS=&Uq(T].ߘ#{jDTQb5 '컴asvG'&F J,yK3A ?lxE^mK_86=_mRO d( Wn}Fd.71qjHPEyyѭ5}vv[B۳gt\bf|'NDNv\h<;Zo<~-uV&''9tCCCt˞۷p BڔjVl'[#=wb7 L+~vv!0 /| vBYc4s3B I[S xP^q $`<0yt}Ŧ7.)EXn]ˮ3֊Ο><;$ E}iN175NO)U]v D<׼aBڏ44]OqU v\U+AbrU7)E!ss9t@UUlۦq,RD4EUUt]GӴ~Cc(Zo=\[i219IGGQ(=lA?t;oqNheυ*"CpPrT5qBHDTH -`M>-F*[o6ƍٳg$Ib|SjB8x'[[ܴn;@DPV 0V)iOŸo(G=l&*EpD޶HD&8 ģ:^h3]l VփC6ahUSTVlC8Q-<5eIFT{0ZWT\ 2kPA}'n#,u_X4ɓ'|՘ɩӡd"T)D@Ul62- on]Qs"euN];%qi6ki+fc]ץ 2@QEP(,7lm֭c\}8}4$[]#G-=ycrzSQSQFfʸ~h ڬP(\2#LWOt:.2 ; RC$!yM]X*ࡃRWm]ЩY:n7ǦY)Vp#"7sv{5P/T5 ڷ *$.]jKHn4MT *!#"`Ϟ=!V ^ss[[WU|o|c#c?3P[zT 1ˡt=D_,Fgn?ݺz!P oxCg^74v=#?s?8|k_{njv##ƍQpjKtI~."Ӗ0a`qͦ,ULQLGID4EكiLk}o~`ҷuXiSF VCPM8rѣGkn&7aoϞPxG<w3Lo~| Th  m^eB Ջ*Qd̠x ?;}FIk7ׇlb7L =4IČ&qr|i!K|. 5VvD'qWq) /z+7Tuq<1rvߋ/Uش*0_r0;rS;Taؖ}QU 4U/dd%djӉ'ذa=dژl8q|#L&)JTUD$G:n3 hH$(+BX$!KoUs=؎rT0Tf6Ϟ&7P7,_Η&粕 D(3AFwU7D7^׸/_=NPBңܶ{v|YREGM S`Xוl>tiZsf^BU }%(BZ:4ö=K-K^կ~??糟,SSS||{[nmn|g߾}?~\ kY0hErH0iEHƊ`,P dK)%>('ٶm[#<+~~ __ϸٿWW\}/|o /t?m#by\ "\"JH2 '#= -|otkڬtX-HEєfar`Q/gk_ᩰ7+ϠK tq\soj^*1Sch[rH5zjT23 GO05T~mXTXx$D o.d}w=5tPX,C_SSSi&''a~~]ӨVDatvv6)XyeY5"1>1A6R===mfgf:HPv|zo,?3Kg F[ ,FP6 FJ8ˋҵ=3 O񣽣ttӅ(;M'M5ėn>1Ckz~9]|٢l=*iex&ihbEc~~__m;pw|_OOx^SO=pwp!>Ok׮>c^y@fn7^NJ.+og~Pš|鸊H |+Pf*:f`R^ TC/!ftgc|![04s+{ulYm X*a@FtUPYżzKΊC?ڊ%P|M_fEEh5:=6B[DP(Y.}b_m;=Xm|`LnB,?:-!P4r9N!j6n͐P(`&SSSAP J!DU&:i4MUl68 $#Ͱk$IN {{{)*+6h)Y YWlQuЖ(RynF:= 1iAem \c%ݸ4P$NAo^|>׾5y(z@8y^#mEQ, u<|駟&H R088tB =*?mu _틅.|T:1 UY8No'0bZЬ0%g]5;[ԯ5t g& vk#_v,\ʬZd~zVTlk Uk{;x$v $f! ~w 7_mW19_mYv0 yw&lO&-^@o[ Exݩ¨O %F4|)zCCC0<<`*H$B:fdd$qV64 msss399%;.駟fƍ3[2ɰG,dQCgd̾ӳlc@鼍DҖ3\%tz߅0VDU SΝ$w07tR{OgQCrM9R1BšTu9pfv5 z~@5 .Tfk[zSɳ,CS%:Ert]"884Ǎ;{ HIB}1dq_:pU޸e鹐%b} Tws g?' 
K_RG>_,sww7x|OB5ZOޕD~m+T{>z̓>AYX;Uf?p=׼׼5 h4Н^ OCwu S,)x%/o #*B4ްun+3S1H/Q̅@\IGU~w1JUΫ瞹 |kb#GDIFdX+M41SDQj&L6.dUiУܝw]ei{f2T^("V"bY`-议XVŲ;(Bzdf2Ny~LfB&$뼒9ss|*kv{op>>Fp EB>-zJ2݇m?@[ pIet?w\lz>lMU(MvRբ9-џ¶KЁFٴy MM(BX088… d2A j ׋22000q|Tk:@<˗uVP7tP$NP"0K\?=z-: )9nMX+T+=JS4xDib rh8ص*ƂvP*;_ ;\q\QƘUsS<8Xs󖶸ʅ#|tHE}&3T"8en[,uӼ.7~մiXgI84]WBɖAק$:rClA?ܝ*7Ip~bak_0|Hi̙R)\7蓡u>{*&noSl>q9RpyV4cEQ]4OӉܗ[OO۶mc׮] N/~ -[or-tMDQXhK.eΝSBçZ TtNjs̒"ӗ1MZB<~/+pRs2~70_C[ 4  \櫾lDK"f:mc~|^kdK\{ELlw~͝ m Å( xbDt; ,S aUhV|<oZW*64lFpecx6ՙ?Rfk?CQhJ.g0f֌vI&^p{4%͒J%jAL4;χ㩇M$P*޾>ΚQ(7w.gE)enm1Z}8 ȗbRBDfD<)'F`+DŽ~??EcmLfӝ.; nKq eRbŜdn~*5\yr8sa#Rt2f#4Fia{"R2d [r \ٮ6s#)<|M6сp9_϶mxG`r_2}BQp]?-=)S':<^pBFioogƌuc֭l߾L&0_=\s \뮻/Wqqfb޽crG`@qhu}EqK dT(Lpf.YKKaZphq PY*3Cz= b='Q# BŬ{./e]S$1X#Et>]e [񍽄} M c\ىjEƩT.|#\ J0"S,1 rrȨԨߣ) uءa#Lb6nqbK[;d  >m]Y3e_J8ژCS|臣pi 1 ,M [T{7tݨJ& Eiop+5C= 1HFܴfjQLKX6}&˲k_Ove峒C^tf42AHj&?`{ټ~Q"qރ_w~O??Ιg֭[$M'tmmm .o}`Ɣy; P'⤗G76`^|QLpKUXRQLв%TMi$|0 f̘̙3 T*.2~nhկ~5ඬ~ bڵk|gM,î!Y3X=cBIٷy )x%VfƮ1-Xü݃qmU |-O#7i W2X9-y3 GjB׮]o[^VZ0gfه o\.J9-}ObN.U@z=D~QF9i ?{(a455=sijrҹ jʴipG!4/1f)ļ&W-G`Om8@"٧Lĉz%apW§cCeonR/Ӓ~]/  "[ւ^=}( f6)3nZ6uSxv(oj@͆V$w ^Bbq[ucQ3AMfGoleD"1`:MCCC=ܞL&r+ɪk+P!jeYB!sy])JS.G{t*J=7o{mh Å*laq{ qz- z5c>G\[+93c Vϊ+ FJ5| PXGt5X߀i$GBzd5*#2 nTe͖>=M>@Uj1?Kናzi WHص"G]駟455'/Z\.=܃2^ ұ&lWBT}CÇ_v.VAcrALsuK^ϼ9IU&k/bhVDb i*eK4F}5) yZDl![tA$`M#oΌY-JCKuD#7ܙ?ϊ!p˫RTtu5CW.U= b>@3VmKJP(U㝝u]Q-z4:D9tp ]#n! xP1.;LرÍt^WFJ<ҏH>gzyPs,b5lFqj!VR}O Å*_l!f[r\%bj6q?񐇁lbr }:2#v/VLf6!k:<D1-u {ɔ( `kK,aGt]s&c)0.~}~׮'ka20;m0{Kh.0 ^ ֮]m[R)zg8餓я~*:G D`%X4A2#VM{qW\H)U<c^Ъ*ȕLDHTW{O~>56ɍ4F|./}EP1m[ĸ`ZUlV{tΥSRat95,Hè[$A=p _2攑[kc)Q;8t}j6STJ+Rԛjc^>Lf. 
[binary data omitted: PNG image files from the pydantic-2.10.6 archive — pydantic-2.10.6/docs/logos/nvidia_logo.png, pydantic-2.10.6/docs/logos/openai_logo.png, pydantic-2.10.6/docs/logos/oracle_logo.png, pydantic-2.10.6/docs/logos/qualcomm_logo.png]
KifM~L}`~V{kLkq4rB *Q%sl7FfW * B-J@b`8_I-_u(RX:Vͽ^<Xyl8 ;dz7H" Ժŝ5lðHbYT!Zfj77PaޔWupV.YsŀJMU΍\_cCy>Ū)/ :Ľת}7RD^hW)y Fh2C(4|iR)}ewAW^}hg1 ?о:m34=%ZqޭL4QJ!WZ[ȅsm2y0ad[&9B/,l. [Vv4|irdg"_fk7YW֭)FWuy1f N~ 3o R.Q$@[;OQJ. ni D -+aZ"Zԡ4RV5b߇S*q*5>OXq:HkߊBtubUۏVGtr\u5ڷ ORi`vvU/-S hr#-=slx*([nMРݯ*3,(>7gCd+]mfч@P&F)ꝸUn (;, "&p5pV}~y}G~ug {ۛociyo%wstksHo}cG?UVVsK}4sc8d3;ˎrzXQy'P9ܒT &5/ ]yitO85o>Ѻ f$ t<OYCP g?[ڗ z X{9'C:V;~|WRl=?Lam8:P08RYg^>zo O9i5te'#Fk=F`CG2l5ÄYmC[')[ Miqm6k$kWa9vt]67>~];|O|~lr/ E 3[7+k&(Ȃ)gW&.5spb)g4;Mp̴:Lv&w 8| Zus'N9-SmԵg-_C-Ɔ;\t2:clom *5+1eoa5<^y"Ǧ߀ȶnNhrY0R17MI bjA lwu脲$ztH9Ή"#x1xg KZ Oj V+YXh6*KӃ&@:Sqƴ.v.#pc j IlMXU4~o1}RBo*ϙ_?G8}ǖWjY}WyKh=M֯Lٿ X!r-z.O(#_C.|>~6k[tp}qL[F=!zl:tJ0g'pmS^^on]M9ڄѕ:j;Agx/q_%[hȨ;כ5k ej6W>ha[`MBސ6c#MVdQʡ70"[QsB-f^Z֓V;P|sA[c0KrZDS-;[Xytʹ j',7\іWdϣ7N Y3ީ?G[(j3~#7?|7j"6"% SɌ -<{J09#m`]@&GWc|+{zpm?bF[(֯xȒf2YT٬yl P_$4YN|JZfz9‹Oؠ7BKuYw_8Gj)W7fTћ{QL8@̝1!7|0!߱[^P%qOIڛ/~^K:s# Nh{C_Tս o2tqAqI|Dʤ+߹4n>|^B`/Oﺑ\RN 7Te]HGM lhOZ _3\gۚWPZW1e7˕*Hx/]o6`S[{# Vxk{R-)Nbrw9rpmt9@/3*&ōhr\K FvG}9TPk0AC*aftu`?+o6wo:&1G${y *vK1[xāKP${AK0[?t;rmVnG@UiM5~IU޳ $˱k}VUw]nSK%<9>g>Gix@/8;ֺTpyF~_XИ\}qV㼧]ulwod  + U'#-kw?kD0h ` 3 `OZ@*Ꭽ/m|(;cJQJ-o?1umA/ϓ3nA(Ҫ#H(r;yA핋N:>ao$أ9Mr){~3 5MoLSyDFu,;^"Ҙfro+͡c+sιaOʧ^'ʤ^&p¹\f_ E}IHjeo܃<=rͻ_AZ0U!z=EhJr T`H[;|KL))GvoK17WHwi`aթ&BS ]lݍN+D {}Wwޭ,axl/"wF-"K'Ң۫WCm)%~7u~֛ͱ,v_z-$'cs9jhY"bHɠu'P> k CN)3[#⎝uc8ݝ,Ƿ1U]*׮(zbfJ} sҵas>pT1'&Stc7MjJhPm'x} EkqIk/7<7l:ǡc+G$pdU;pĸ X5p ]O}X[r(%4u;xBpSx|qEwkyD kP@}4?p&KKk /(** 8~~N^f6x\Uj#ۘt$m Z@glpo T~KloMnqP`.zߦR4KMX'n鳽ClJ|5`ZwġQx> \/ai_*L15\pjw&tz V \3WrN(b+RbNK^9 Ë+R¼>ϟ8}ǽ) 0 'Cpq>9g;Z$2ߘrO?ȉ4*\G|¶α-VNgl4([+Lv!Uy„}TE|{{8YTiNgrE%`w>⡕'ݮoUM(* 8 C668+|Yq5QT 6Lw1 0T阬s ?H*[f('M̿[f k6yQRZP5X4)8ހ 5'cu퍗Ҥ5_s{d'&stE6j|;KQ)+}|3fpE ;[X5;Zs(~W{CSgWK?k}SU;o^ͦ_s2[%Ad?P TL8*+(o XG7!d=8w]Bλk*:apM!E&:CvL.VYdgK~l*Ǽ'ּD!IZh\ngW;rYVU΋i܀ M,6m̰pmd@$Tv7$=`mYUv"b ;7;MfOV_Rˋ[U{R¥HoԭUF4#L+Z@Y`~v?W~vZZD+  3N|֣ i5LF_ߤPcⴹf:*L[I9gxS.\lT.)p1Rn$3I=]{/3 tCCYFZJUz/UL(2SQ)s2{63wiIC},CZaqM[l lzv`yMz[[{Yjg&KuS7 ܡZc hO7G)y#/Ѣ\ѸtvY 
BPO]Wsyy2Q~z! !G5ZYm?ja[V Y A<7Lߕ{oy U5Soj>wu]$պ=jb+HŬz ` ֭yV/AsNjSWYjUIaƁf>65fE3-B k,%;|W ӻ,v;qxeP5Ȯ~xZ.Q|fWcUNҚX=+*XQ|x͖"Q: 1!}3f/۹Zf)zoZ+ݻ|Whnh-0U |R8lpc~2貓7&dIcV1_]DX9YeNqHi/-U&t FIWODo޶HWG8)1k}nܚrڜH9 [=JPwcT^%f.U8m 3tjw=*OT$3uL‰P~ԒV EཉSa/*Q&VA/TQD},F[M~G^dsJͨ0뾿NB"jG!Wh"+(`u&&^?Vl*^*rGUDC5W s7cgm-q!OlV߭?./ R<)5je|g!戾ce{iw=L'/Z.7VC":Ykk#&s`[߻/|F_?8o)S2Zyz/>AXg.[dl. z%(;9~YjvC3h+Sc셨`Зwin}C帅mFvdl}ݞd֛Bnя_E9**SDqTS(EfϻFx쏇bQwҊ"J޵2^yQy"p?NphjLROwh^[Vt'~SU`_3W!{7x(T1^IncLGiC1ʽg_j+T M"6 M-ZyAqIõ:QE'3w 5  ^u* .5a]Bl:Hz7Du9,(sU+IڝdI"ۙCstk7aiwbT}*ԺRس6#y랋͜S9'%Ʈ.νv&H!tU+vUA] 6-nX"R)"E,QBE* ďıc{f?sDZ=;j;s|yS3_xW.n~>kGOob>z*ThX/*^6kWGG\/k7g׎k1[Nno5[XΛZv.i"J hҍ.ߘ/meF9zk]]mlT;<e5֞D_[WOv.J~D9QGkqz~6mY8}\/OrSvqO|ٗ͘g@a]7>jDhDVan7F_׎=:;җn4Ѥ_l]Z8vG7*w,]#նoթF{.J}fD~m4a>Qʹv6O֢qD,?S9>4:j][}׻mj|/}rxbl4XoGl"vE.%.t6j|{;ıGbٵC>~pΗ(ߨQQDi뗏6c8ENDR(M[~j| G:ʇhz{Aw_fݿsyؗǧ ޙr[JS~+-m~xϫJ3|W6mv3huҗؽ%qMwvȏwoi E4v=jA'./=;'~ov9ǹD_(G/ulޗX?1OwJ>:ܟ?XXqf`}#P}uj,k{u_a.X~F9wZkq4̕iD7->OÕΑKmSNzߏeu|SCwM6FJsRcіوO zDDD"VDPjD߼5%aR֖ߣ9_F|Wƥk4ء zn(=U1htߔhh{^ZbYϿ7?{}SbzlMxKDoO_nwԈlE="b={O ҡ zKDrmqa/~yOb|)fIkq1j/ס zDbocզqswc>tw]%"FX<Ҝfn.^?Ĺ 6J5\&z8J AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ AtH@ A{`ׂIENDB`pydantic-2.10.6/docs/logos/redhat_logo.png000066400000000000000000000516621474456633400205150ustar00rootroot00000000000000PNG  IHDRߊ pHYs  iTXtXML:com.adobe.xmp *VNcIDATxw$UߩꞴ].$H zQ0pE\P HzD@Q@9H޼S3=t靰==g멓ɷ H"""@)H(EDD @.""R tP]DD"""@)H(EDD @.""R tP]DD"""@)H(EDD @.""R tP]DD"""@)H(EDD @.""R tP]DD"""@)H(EDD @.""R tP]DD"""@)H(EDD @.""R tP]DD"""@)H(EDD @.""R tP]DD"""@)H(EDD 41SU B5. ~6 DJ~lWe-JdυuKA܀<1"c9=XW>L*s27Vǒ<9&[{#?>$y*>QeIχcF!Sq][1=/9Fc*u骻_9*1w#ÆQxx1 ~/ucrʖ옲s_=y_ s(vD /^a <`0[HF)E m15c}InPW}">HfÍ-"?a ){h"$'R. uahpE ofLADd2}~X3$D: 8S "8]Z\RC |'{u &xi"Ώ]h^&I86pGRF02@(i^שN1|,m BPKjZ6ntx}h G' ? 
oG 76uό}g Xka&Ql}1<ƛ}-IUx85Йjpn+x !xG`~e) U{p6nm[cQ_;@1-kz@PpQZ\L]@w .u6m/  vfeL:ɅDp"+ ."S@yF;$Vuvہ7P^1pL N 2Y ti_|Uō<,$ 9]\~7[cx7 lP_|@m᪺3'܀ٯ˒Z;cxw xG.-aicmeLEvtnÝ^Ջv] ߫P p yDR@Vx]iZ>[mkgY.p;­pz R.Hj0v `+ r18@1nT@@Oa 7GnMu,H* n=]uAmնjp[ |pυPR%wߡO`"m@π0;B4 Y5;[w~`~ SeaK o>bi $+E_E G1/4oXo͟Twpc7X3n },^E~/8w,?eڴ&L 9,52`)cBi뵋]7oc`6PaQw\vpd GnsCПHBFV҈X _uJO. (qK-#W89v+F [ `rz+Uث @zNjȣM|P?-墉L)V92~-k?iID&`?~խ"##rkk+1vɽM9aobXE|nD dk cQL0Ym~woV\d61j%ԯ v~` <\I' XZEFirh"Rl8>laojm`qmflU;i؇-,E/Hq}u`ޖW!]9%S,|/P Œi77r?R' Cva$[Tj⮖P[{ZgHDzt{][[4N$[ޓU8XZ$M>. @6 To.L<z\@ ) VƵª$@o2VK @]Hp( K<^:g7Q2Q\yuD$Gy 8HJDIy"EHGs-^)  %8Pa.o=gMoju+gOfX(`xC\BxMII ['^x7yn P#cD vs_ :I˨$pp@ְp)n) Л MU"5=]MD ઴"ͥ@oUp:} =M~x%+" i~L@n+ZP7LX.de8(9;}b@ E4QoiC.v~Ѕw+E$o퇭.t%Kr]6SOA~| iCD'}L*, )Hf⦫Uá "צ8.K#P*)y,jvիL.ZXvASu )pW|+b`2ifpD9+6Y )yhanPD&rݝB>I1.g=Ghg5HD .:!poaxi(Ed-WidK蓐LHvPUTi.88;圧@2N9E11bшܟ'M> ai6[DeN}y:4܇ :M*W7t.D"2-`<76S@_~pPHs!ܞsGb\+4 I0仹;1EiBDNiQ0e k3~WZUN -@=9Ō:U <+Ed, PCj:ǯE2= 8"M}bDvaڨ> 8J"O~.5}. b $ MLWḴ""ؼ7Y5 +y2I_K?U_՝2SDZq[$ R>1l1]7 Vq~}*C¾y2la⼷X%PÉHJxjcQ+9 0 vaG;稼"Q rqnNI<\822Hq0!pO2YM2< ^UD}EiTʦC>E3Dv 6UH y>$bYQOzmmwLcG$"a l ܝvy}frYcᰴ "w,&#]>MI'7Y%J"R<},*`*zP%5uSɈH+mH|qFހ& 4pւj-}ϯij]mz [ڵt6."-d Cx>s^#\(TO}2ޕ[wᖼ: L M@o2n@HKU'i< c)Л,+֣ ޯN>=.?$Z%5IWH;ݷ_)"m`k;]d'#[ «ZDDڋ ܕnI/)- +iX /HDCҨ.ӏ@odj\2PV~E)"f`޴ jEGW]?ں"y_Dv5Xab"tdRh Z+^ݺ<CBB!.ԷNoͯ8DDШ"?R{k| 8yxoNDdj,lZl[\+4 {5řq!"9-|8(F-]._|7 zH~+<bDVl%?̋;[1"31lP@1F&xX&~;r`@yEc)"2el]-6?48ND5آB=|9(SޏMez?xy {JDrϰy `0M!5)m%?>y{@H-}pUk10֊tOI,^¤^5Z?-ŠG"~p+V N*E5ŸWąΆ91RDϯE|Oߠz ~~d@{PHq;EJoy@OY?:^h2w,W ]DhvpӿxUԇ. ǏʍSy39u-"jv(7yC}`zɣ{#MS}ލEݗ ZTCϐ!\uLD?c&lAC^0Aq0$|Q٧%|l)&uÁuv䵫RMBp" 7σy@"Q vHӕT|M@OWpK'vWxx o4;̛nti! 
` &\`.gyyUᑕ]-@ v}bwk(kr"] lca 6Lm\@o llh\P S5g/ j 'F|a_^2 sPg&u-a= [ٺ/`9 Hnc]hi63\@+"_s;=з-~X&pht/?79y=p1A5ui& Dys.rR@E'O Was sWa_uDU\0PHS0{΍ t[7hpaa1#!0 0# ?suH6ZXEiZ%@o13nvo8}3y?B4VB@|sq+EQuzr<} DHƜ hL0x>?m gFDg6)2.iptZXvh+myiHʟ#\=Z v0?*`_ׂI3y'"YW`QO]FĴ9WD F=Swmgp@1[ED,vaq!nj>Kdܒ^Vp;yEW*"%8=4j81E*eN3PP`u-;i`v~ 5`2Ԭ#"k/Y İ$n- 2 m tGY7ȭ/it9=dp0ލЪr"2I}=kw R1#]5ђԗdv>{~_Dd]['0Uߓ0qz.Ě`cܴo "W.C[-[#K ?OJJ 9^ f+,1DDLOX8=/b%)3ꏃ)z`XB]D^(Jng|ro9BF>%Q.dEOF'?L">z 3M9ɬc*Ё|ܢKpQVX @5F-YՔVsgrd5)m䞻hv)_"l$e/t 58W:7EC.%`>m""_d*tE޲5wg5q:F>Ys-.+p}O0'+B>5Fj ty-?4Z0%ck")*\0 80o`1- ݚ_jn/%oi ?e^9k033MÁ^YÝ*pM護 _L0m;ŕo]hw^F=> cwZcYVjS Nɵk3vAi>wX@wKB].m Fp d$_iuk/u<2R_;[)"sPE -.Gb߁_Տk1]MjYcn^xpX:E\5|ZSoH!݃Z~A\m HY^x)na | p6|bi4/Aژ[ :p8;w,ōRO@vN n-t5ה ׌ zzT`6\th-=7~ٸ7O@Dr¿_#)w-p9}#Eɹ@1§tLOkytpſwuЋj"yC/mW=b9##ԋ@}y㛀'ck~ ,-؋kGB}/ 4qA<;yqGWW+]Ǻ@W ]-mMsɺ(9qM1b!h$ȳ8ͬ^>(JlmA8;6ŝ!+=`udz2p*(p[5F>p[?̍`N>uyy'/Pm]$ku3P}I35pA pnpW9n0dH ̇kW'C` K7|\s]$s}y큛qe$ȋ8m=7ۂ~ i?t_*'0| f?w S#{~tn4pcUjnϞw+~}xȣNq5Կ.^~<|7Jx>= xNxdqM}\뒬E;VaW]}fƷ;M绁7VY[FA>z20%ݎ۔`JFj vxvo;,MY}W~5}ju=Jzk;eQǯ"72z|\]D/iƴ@kYW+3Oרܭkr3CϿc\)J氿׷i^^0^ܬQ^ۗD\9:{df`=ga8BjzqXwۅ@.(K[m<ޯ3pbTA{n"hއ[CI%j'>6H 2u!?}=轰÷Ymp+|7qܖT< 5n'=qAޏjQb su5/;.5H >[TҒsq'jY`}hi{ck R+4fM8 8+> YfI5\1!uMҀd B[ݗz?4 p!nqGjsPsX!p&nGigI 6Ãe2ksR+4,gCKeL \ 7q½ 5Kg;|OVy17n/ᮟe-+RHS fTʻ^iƝNMwvNbqk: d< MJ?$rV8Z)iBC(mPo t{>P]: \WS T.ľiP>5 P`3z8p7#OKv)w{ީx 9=2p[7J .{fBW[YHڥ8"FIilJ7̏l8M?)펣u&Ánw"Ivg[}7.nn?qt I%]He*c=vD&~a]פX/S1G/$7;,VV\N)5L?dU!@F<Ncxۭ[6bN{7?i)$ׅ޿C N;3'~(< *oj-1(MP< v o c8#x7ŲH%> znpg->)ˀ_j;丛 À፮Mv`/oY+v-^(nU)d}s8sn yO}n^{J~X?'{QKY[[͊kzʺ%`0B Kcrs!o΂qFJHB \Xk?~ &w\& d-5K#Z: Áw==Pif׳0_t'+EQ?m 7XiP: m*VMX2_15ɠyhYIe忶xm?WupH1ֺ듕Y_eda A3pdg<Xj:LdB3qsw6]n ,b⃧/LX\.2ykCyYF7pIsc=#[eRdXjm)͋ߖ_W_Սp'Ԉ|ϣOW!oh[;p#ԗm@OO +@4/NcB/+ 2EcOE%֌ .f!nDT[R[TkDL>/z's.ն '/SAfDznLpjюQMw@~0Z˅1ĸ0{x k_ &Yc|F6IN3i`R)܊h<nwCr["X`}tV.:kKƩ)?\&ɑQ@30WiRm]2[[a#HQ wu'Wt%+pm>j[t ]/![?! 
K3<>LJ^naYȅd:kr_1N㦟L(Ն!AwRD25:f=mmW֦|e| qe@#ӏED&lnMc~~!FIS@J_?CmCYflnv pk-f8+ܪ@ gQ8ɧX,=pwU[5nO\ [nTL̨;A8D&S;`M`t}Z:VD^0 JˡIy%K[dн4)9ˀP vEDFYz¼94yCv75Vcm*H}z? }w=^#"hKWӠQW.+iA'kXR%CeKmЇ ";O {נz:T>3tr; .LXxvg[:p= IRVzi{8u9m4P ( W ۰̔peiR4C fK [jCovâm[mb*MI4;ǰx5gں4K~`THw]7Ғ_0&wZ4S; |-d,t:D)>E sp"xrtNd5U'vQI-9gvzґE anw npliSOV"xgS $[?z#CH b^N\M?=5S+X gढ|le{|}ZDQJV@w> J3˓}nEM(_Ctp uHPf.5"f@n`2ki7z)>nm"'CA3>O{&DpR?|#z6@:D># j]${лN\t:^\72~n}52@D%f0nlWP0j"4`C锾p/M#}}e/NkA$>H΃zcA?gM'M}`'p^@1] /;)M~ s,xM\>hE,VmlCiăN7l/!j-URbIX\OwY9^/WSb0@ؼ n+l--Yaq5<g?qRlft $Y9 ^UL%;G/mb[2_.Y3vU0_.lG`࿠oAQ>nK2ZzW+ܓ-S"=U~0JrZD߆'.ػS,HêoJP@7՘όV:I-/%Ƭ>AU` vD17dŰ 邧'No;Jx^Lɺ1eP A|D&~ ] 5nj"`|>6J tX `AKH PI:ԣG U.Ե! EA Goin@X [EpxZH>ĸjL DC\].W'R[ ϛoY[Wz%,nlOR<wi tN` Fٵmtp Ӗ_("wpռv ߂y#tЯ!:zW&fnhiwL#5#Ep^p),_/^jͤ](  н_i t!>Q{Fg E$?tc0+(] _POuΓ~*wtilo>(^} u)$0$cέeSB5wɟx3.3Fe?,<3I3`\ѩ@ m&yd)P -/CM`N/'Y!`%/#S)HŸ3ᬑoFAt-Vc '%;`B]˰xߖ_צ..O ")3 Z : 󬻛%-0t|Kr4p ~A9 xm*IQLOq 0t1۠g +բuz*^j`|i2pQ0 tT $yY:yFO@bfvw+yiz}CBwwV<}5lfpc7(E%wN1Ԯ! ¿Bldm!: .i>]/I\zu.BdX>Bjk! !+!=\}z[ DuUs<ϕ?Zzܓ@// &ԋ:Sv=Tw-UϿwp! Z+t%/ e8*^:>a +"Kbԉ<@n0OBx $σj+Dox_(lX x,a2mr5k N $65Xdᆙu,q`'kBPCyx9XNcd_x`D w*G`L#0t` qtC8 ̌ddz_R.B/ec=:Wt [ =-;jpqIo}nrw]^!. 
IW([ ,+L' \\ l7H穯'*iD lc`3ජ[pH孻a[(t^jܴ""""b^@Oު""R@̃zGnRL8qk jEm@@  u):@OM .8ºUDDD,n7O6]4]B:8@>5,nmQ(""`}eT^r/7B]DD/)$pf v5B]DD2N> p[vlXX V.""3XU@a~P yO}ޔvYDDD,킈H2hA [gິ"""%*꺯1 kX 5hNDD!kzǟQWZ ׇ^_+Л(!Ñ])kpB~!|_M""l~#PBͻ e¼~vDD$[ P @9 8nnDD$K2nA Lpkp#""@jC 3R7@co@4;s0/ Pt Kpnk ò[7]i= pj@4 ܲ@{[.V/1 ܳpcp HA]nt5M[+e6iIDDG`( 'a A5v< `}> Ee ||#p""#1{n`( B@w_RM]D$_,`sw"BpVI}5&jjW{2`mryDDd~d{|k?@L,||໨]D$Sb_Okӭ4> _A%x"3&q8K,n)GDcY tkOt1pJy .H'a27 2g ,薤].N`2|4dͣ5]bobioπ?x)Ē+>c EYOd .LsTC)v^biID|CU&kZnTCk/`+s`Kv-_ iqTCFp&<)""kn솏{ $#Gcx<2d pnei@Md!!|iKD$k"`6cj2 vō.HD@7|./ e$mYx[sxLIK Yf>]_A~``nNL""ipkكv"W>3 D,8۵7]Re^8xW[ t?(,UHDP}}a~]2'BxK ]&FXxm!doW.doZ]WTKսdTmvɴ+ \ ?_)%oVn}B=(E$c*1"?*%n ?8 "O[Xs;7RKT `u1,4V."7[uR.bY?UeY""-[ wmv t),wpJ ##N""^Zy=tR~,KT"R5 8%0tGý),/"`s ]PKK,|8&!|aW"iZ..HB.)ׅS>_ʰY/ߏĬ)Ef*Wmc?]6I7pY"5{"8+K!ں=`D+igM"k`F-X/pD`A%ys,vF.2=.8 Xxϧ]0ix 4jR41xc ݀ȕC'`XHY5wX,/O &"0#3-(Ej!|:j0[ <\wIre !Ã֦vD:Ȁu O qNjH K HJ \UX~b6U4\eߊ>SH۟.,?r~(҉K78[i>E2MnF0"2u[3|?op{5(E2̇~f`qx^eO"YR구* uBs]$ƌ_ !jaa[p-&"28Z nHVy t3b0^fkas`aiB7p@;vHX 7;e6*/RuUKCpW|P+E *9]] 1m [V$SC/t pa~[ߏ)Z(E:Su_lQ;:GH} [.4'7H3 -n֩KLc:%N 6K- ɓ@=ϗXr6vư+\}2m8,\ÿԌ>u tǀǒ} 00V(-5^eqknVnF$)E!o l^[gvzVLMwFwH(EijT-bWob8Yq\2͂5DOYg #,j-@k>x F ZvpM}shnM."Yp3 llvlgaf[9_?m+pO[# V{ ƭw"9* F ] /,&n׷0+ ƸUf 'GgXn3pJ~ވQ(Q t)j&Hj5FB y10ú[/y?ff V <V@q"B<rxaEGɗ,""0߾m0~ߵObQi\oYy?yQz?Nu~h}]/}Iw?Q d1 .XYtq|oeB}}F wnF*D|U vv. ʿG/R =Dr]Han8yS<kVc˔GZm\xK%qo;pi ,?}y{B|+ULCO"!~-{|oZCb [~-{y6<Z(n@;3`d?k5H`ޘd 7ɬDWc9&[ R~jjȖSι䚛=b%\JE 7Ts-V{ZjV[k^{뽏`ޙqL32lϱ(V^eV_cmevw@N>N?Rkn[oY#k+keJY]W[IVXLWeʙo!)g rdYq;(c-s)o.2町9RǼ"k[<^>]zf=?'uC,ޡ)~6Tj򴂈 ⢋EGݶ+ YZ3Civ.K\*!'lV{н'>=qL&¼JR1O|[U|G&SWQi˷>wbߎ,a2P<,=qm1,8&X;9XNzk y9{n9l'Y/Kj,:j _j[Xǒ~:4oϔ]WG3Ozqm8BR ط'54xz`ޢ>.IDJsr> um#Då%tmƥg{6U(v"wLk.W4/̈-Mi:INg:3a$_&P=mS]}Zt[F^mk8wuj@rQ QhykF՛+1Z-ܢ fP)V;5nڣYvVY()gElɲ <]1+iꙹ\{Aº:,ʉS (9+3D#E;4~RVX:h;DE|kIBzk p i5z&9C}]Ϭ4@PnBt+M}ǃ,v/}4Emő6ZZaWC=N~Ѫ)+8m:xа0+.  8 T/A{  YXܝ3ZIx^d"֮tt}C " 5:6C1Cm =-B '1qUJTXmZf`3pgsӶ_M>NfJS$. 
kJƔE5qX6kGt K4A@FξoϾZxiPT."1.,nh=SSb R]=a |i {qH _%ǦfY5q xLH#jvuBlmdi> 0Iz<`TV 7~*7aZ ƽG Xܾ0p‡*(7N*?fT}BZ4U ^ Pw)uP?C+T$z.Z)4G7H,,)cH! ]8>INvȎXP"K6TaDI“[ &w Sg* X1~/`nL!XQ˕C-`є{@ )d-5 @>n=bǑ QB\#FH3 PdT` !ꨉ^$`a> h /ύG‡e) Y=[vv_PL qho[CιZw\`6Һ3mZ`Lɧ3 'qC$bfOPqoA+|(Y*+oPW {j&yhn"DP] ž[(8,FE[%&B5;΍"ZhNiG/_Jn@U&9C&h ʧP]pYK 7A&kKO݀ra.]<⟸wIAq_P%́C`uۤȃ<B״ +aꄱ|)@Ŷ@Q*fh_ xFmya$y8zqYrI0t`4H)Lm8xM= wGHZ&W QS ͟?ihlj;e ,@H}y07\DB`%4$# Cg.!T95qGjDE0;;ɓ}_x {!TMI\)X@8 )YHgnw/v>f̦S+6 I͐3G-P6tQ:>8I" MzJ|Q ,K 86T-SvΠE *`҆ۢ;c#˓!A[d) UjKE8UՁ [M*tv)IS/ H1݀@8@thbiu~N;'O݄,Nm|S>:KՆ*BOzb(Y.R5" ,(=pAT=A m :f]J{wJwƂ$ e^,%Ӏ6m@3(E/WٟZA:?[[V!r&.Z*(t#kT`nMѣMH`˔i3 puC!,h|)/!Kz$Oܷb|4$d!\ݏ[?AWӯۮ(:ClE\iQR(ܵ $m唔v^^=;fʂ/b2s@IXId Sv[7QO;v˓Lxˆ9!.y! LTu& P2~|emW-v~c(Nj6΁8S-K:v=씿F[9?!m E@"A_Fi#. s5B:BJd086`M 4"Ҥ_/{}#:$6UJxZD/tLԟ@FH]wNKz@H[ն N%$y:%oB2J+ pR*hs FK@ Ȇ@i{L~Y,N \b-L8B&>o0"D|F)&E6=$4#jƶe}5Z_ muV2c~1L\^[.XP?YJ&F\|5 u vA"qcti'm>̛v ~MGMh6 @"AQw8~$=()'dY/:ʒK"}*GM d6qEܲ  n:ʸPw~2jM2ʺ@tXԆaj'ЅK%^Kb241iґ9t9}UN>,qs:1/0Вet?}rhpo`l ~ )^X[O#mW }گij1G [S N5miSE~|t$' Я4:+6E!T%ږyjl:+ag΁I=~;G3u;^p&rh$C⋠N%T !V3+e˼Տ$Ϲm˦ diDl wh]\=Ǜ(jiUWJOA;/8G:ke[Oa^ԥ&Mޖyݸ_+RzlMn9@W-( M^,.!rӍ +ң>jS ()X,zvJvRJy'˳j2\ K&00qKidsL+ܛ[Zd90=7V*niWeo=E'0c?dPyV2H @>y*#+ K r":[jvEu4\^"]7A7p#iԸ;uƻTZ5!kQ'oyg?k{eEʼ zvLfu}p1Y31[e1[-!4K3,VYU-'wCJgFp\3[nj9<ũ˧7:&%1 PҤOV'BpMW {ݟaCwucVg8^FwMTe?൤nZ-7{G'^4pN @`.hb.3Y/l'0aUl52#8o| s3A{ƔJ^r\jj25#T4[aCʚZ:\<^yIXyG@Gfp`KX apx_7~/ l$_nYOo,K$DG&a}8HNdh}k> ^ 4jV/8GQx_v:݋ RJy?xJ?('M Zһ,rN|">5]S"=y RJ>-iQ km> q̰] D~{ G!5U;~˶_7ڣ:%np;'2)@Q-iO#Q~z;VW4Y#Ve%.]]ֶO_t3wsRDٌ/8eр"X *<ӗol[m3@-kc$L/͘dۧ[ad CQ/>[s[ z5bbP0M 5WCm;W9i?;<y~v ,~x#m}rM3~n嶩XaV ף/>.Rё7?mi<Ѿsu5i9妁?í'_I& [2]ʺ:/m<<^SG=R#& /驑~v޾Jm_8jny_ʆ8ԛ/oJ;XzɾsJ5D7I_^`X}lӍy'M.*{R=˖ Udxj^Jo1ܗY3##հ߹뭃Vj5\gg sMV/ZɝE.H262V |5q;%bYX=}'.| kOs__jڅ9ژp]]WZqz^ڰs\*=`3a q1- 3_)đQ[Uk tX]9e֖ j ƹXz^k 82}`h|PI`v.(emߠg)\EumpVߟcll~MfPW^iv^CXmNonn$<ώJ#&)QVCA.t2 鮽n#hՃR'6[.&o'9 [ʖEpzDw'Vϭ}AieA ^ac %ܝm/atɘ W;l'~2L$7Ϝ\wc`v`I\]R=rV, 
PMę]lEo~_f*CV7D\Vn)c/D]1UۯE/:8pѱMsEfZMKK3H=|(GE0*<=-/Sn@.M5U]mZ 8r: PVb 'U}޽E䀧ν}sl蠀g9Kp]"}o{)4UyEEe~dl!p|b!^\3(|PexL,n)7>yyc/_{t:!xp@BXa3768mHxd8e rOL=SYG},n"ӐN:rxkK7΍*SP-3hn>ɱWyV,Aa*U"(~dq11Wl+u ]X ?EeK/#~XNYFЬO:Ì1Q ʸx_\,mTt5epgKѝޭJ)Bݮ_b%*`+Dp1 9첶aFGMPWi:v*f\WH.敟nf-5+ ,~:Qf%9} f>^7Wz5*i{7H,yJb%) pg~fKd9PȺ;|pS/sHUO=v]?{O^Z+_q;C})lbIwi96n7QϜqbWOs-%Ep19'K+JSx&[e 8ꜺNɻsb] z H)oDb KK%'%$Nֻߜ ${Ķ,ޤm|--(ߡ܀8tcq>1 2ksSb=9-l=A@oia4[5N 8gH\2zk4pA !uޱ1/ *ְ|$_k[!EywUa~>f uv:%R@'ՠՍ"nGC%a}n IQGwnm gETQ 72m?ŜXєP](pM.O `usϽ ΐh G.];V j?] WZ&vnfo?5Wb J#8%ΈTĂ5oI*;%i:t a0f~X*gxjFݷRą ] ½}d;:u).x'Jan 'p-u3qt: <:SCr+֛ڻo8g3JU-#~0?>ͽQ|a>'maXT@Ҁ3;X(/G ސ=!p%ᔿl1UKTWOxv^Mhֻ](` =eR@ϑqx{Op^$#ܲoFW*憈$.48Tc8{7WC0IpdY*G3ƈ|^vyk1,W {xx?#z%Uӓ*uw\qvhg܈޿~zҬ:.zoJ4lN[R3k?H LԶE\)'Meݾim{k*۱~~&a?x{lOIR*PCfUJ߸NuOeJ9|紌 $֯_7zÇexnTnzu*ˁ=I;'SSx 1Nїy {~u[d?x n,>gt'mo3;?sܠ?,CWD{9G"pt+Hrv?D q'P'n& Qp4Io+Q [C`p) ЮoS ksi54H)(H>>{ tXs 4X;R#.PFg:TE8D uH-*[l6z@'$\*@LJ@ΝhQ)2pU*ڒfG/d&!g"[XB8naT*"zݽEK(i69-}*@ #@}1.PHKmdJ+9/3GO p A1S|Y$:__3? Tq3+Dٽy0S`PutSO*h$27Hz9 y^1Ͻ^t"9! ! -MhK P%!@$!^ǃ2AFu8$3IpLx %AUEn'47W spoàH*{!).h>m/I--sƼdViK!zvkK7gWSicgE+DX].BtI7"G{txQ4/j3Rr'1$_ؕO{?q_{=4ghn_r<A 6Q(+ vf;y@"Ax ьNɢ? 
n]xMnι[ck(l^,h7;;;3^}2pnxd:܀n ɀq?b|ʿ|=QNQq^,RIENDB`pydantic-2.10.6/docs/logos/salesforce_logo.png000066400000000000000000000517401474456633400213710ustar00rootroot00000000000000PNG  IHDRx pHYsaaøtEXtSoftwarewww.inkscape.org< IDATxy|\uwi6&i)J}dPTwz^^q^E/&V 5I%I̜P-5͜93gf~>9Ys """"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""".8X:QX[;`݉KЉ]؆1;q؆qWs}""|6{M #9cG5Vc8 fd(|q Pp8֜1~uM`1.!3̪o9&!E @.PYXb`!%ͰxwH>Sk75W8 `!; |Gn G+'b[HD$(V\SHeX)\Y-3&>1DDrl`A8'~ yIl1X"1 w89c#fLw0""B 5]?%E܅73mvJ2wQ5bɫ&oi("A)J1N`ME:#&s"%nN;<;U`r91$ JM3(`PY`;k0lŰ]IiJ_""P)M ތ*gR'p %?ʑ~4L%`Nd GŲcšJ_g&>j-%}Pw8C̬{H|wgaDcwH^,g,↩~%/ܦs1v`ߡȻ+ayw(pE=%gc%="`3A 툤F @:9pzQ'=koJ;o:e&jJ8X{?&6HQ䵹c=0P$\_@<3`SX;w8ڇ1R>iI @4_~"i,!A[D 8ɷS;cÕFJѿsI*J 3M#bڳӭ|&=w " w(q[p̪##WxCbfakoH)pc~㷱0WmX!}*mWkK5ȯ^k`~2T(g^Ï|0$+tc]_Ar6zw6|@5o%pߡHV>̺G<9t7e{& 8Wr7D$]t9y_֋ [Jg4sf_bf~"J2faHVñ1Ѥ^tmX9>ѝ~"%%]7.ǵu/&tܦO?$ $@D`_on; );!v3'_ ST%3!v 3'-;/(ثh$l$9uЧJ= 2A|;T)8 Pw(6855 {qb>͌;D$*1:DWKjv&ճN ^? ;T `NCFs% Y4j CO_3~"=0n; +Kqm'wYIM5|oC.`7Pw !|jw" cpx (; cgF\qk%o)5`ߡH^n+$-?0Tqchk ̟CDZ NqURGC'b+1;{FN)m~"r Cg ܢ_D2b0H s; Jeow9NgdHp%̪@D;+&jEAfs3~ hR]B5Xcʀ"ro+'10m@ƶӂS-!Z/?fF%c3~UP8c&1LRhhB+f&%)`^`; ,3[~ c r{wl׷o]1! '&P-_ o: k#F%; 4rs Xg,fV}A 8k\"dW(=+M,o+tG\"`e. OcyykP`N#~!"Y{iM\5{&Pز*}cff~O3 CDdmDr-|8` @8ceF aF"u޿d;Yw{y a `>V/L.@O",=Y0, 3rPp7N̚`rI~%3; ^>KxW0`/; :0w0*9PHDD [0`M4#""TW3VS#?zn,; w(""b̷Q{߁dxbq>AXy0([~$O3 WRw0~!0!ɴ23jH~ }5""X;~IXCgh-5ވC29 S CDDSD3ݰ)vc~ ""y\Bk@)s!H^6_);t!cpL. "",K f7rBH!`0}~M!#m߁x%7oꛚ*C!wLm~o85""O0ww SADD0c -M%YX%""3k.bƃ%i~ ""-b~ܛ ZCDDd;q̪@ʽheY""j3g~V%6p! 
;7r0dcNTC'~ ""21毛w ɽrr )'@$;)FH|f ߸}Kߑ $PA!f&um4(;}r+8ADD$ bn Wn%w"""I1v4w{V@UE8ϼMYіV`L!Y1)b~ ""Kuȭ @!̚3q~[ 5~ ""0# w@n%@DDAtįV`g,an~\:+o}]E/[ @+""X/k jDDDs,h<%̱H7jV`zDD$?Y37frADD$mwmHJDDDҨ2qJ,~ ""V0gt_&h;4 8ɭXPpNKr+@DDN:O[ a!d11t>K!d5Mשs+w"""t6ONljs+vw"""t4c,w"""Af7M[w"""$|bDDD2ʚP)s0JDDd)#h򄹗@DD˧'ɽI{k|H^"y# /P28shuM$[tPɕS9Iȫh2(V @ 8s[( tBrxeZh=<'tc3H(+vp_*'9 G05JYpڈA_.>Ls"/ ̜\WsXE8sqd"9̬x*'9ս l; Iʂ_M7sxeW/ OZ^k82r e.#~!"}pb K.Q R:}MtF5 CZǧrMw"·&pϙ 2í43s&TbRYAJCҐ<'fq簫aK#,ne<(0pdUg*bZe!&( CkawDXzXӖRU9ktg*.{簭;},iDHu]IP8@` ^ƮU'1KO ;3G2ȢEACi(@[CgԲ+(oFxz[Mi Ԗvǡ=2p޿czeeQ( QV/ᤚ숪0C/ Q_By댱[>߿a*䔚 돩2ݘzb=}kڣaю^{3oY=$7RyM_{ c񮲐)e|aj9˓ϹִEccw7tԟӟ*|V<6uU,ߕZ|7N-}z8q6<"'.Rjx/V0]'_ۑp|h9ǻx>^{;E;z\qp,[XҜʓk R>\[rO(m <5]C.X'*弱 "?5 o HG4~\*ѕ\:!C7"l+]]0K/Oݻ-]1Fɱ̜\pgy`CןTSȏE]ȥsغ+GU$`[Jr̝̬6zA*j_2,8mw18%c.`%yj._-aFs`0CY~>VM]7[{UW.Ŕa57[ q%A*>^{;=Cߣeo@x⼑|見MR+ y#ހ\8'ϫIz(ަ.W=8x|env=/ԑqh.tQ<*!?9ߞ<ɱi61/t,>Co;i"=UvR ;&3OPnTx<}Z<Oק ˲87Ɨ]撌.Ⅳ8+Ow_?T*n;*$ ^+fZzp\u/F&5&QR U'z@v% 6'?$[|俯{v[~lKAOAYa](zN ǎ\Oȸ [CEҦVij`}>"HJF* <؍/ZJgJ㿡3skVFeHr;{b<%# Yxf5EiuΑy9PBЏ3{o<yȾbJCW",ncKW c`DaF9uu27v%d{wk(/|lIMMoI8zbRQ;lu,Ԗ:,;aaJf<<1KcGTF^U}$ 9@2b;NT"lq) 28QҖOB=:]QUkF=2`Fp+%A=:!ޘt4dPbʰPBC'9'bn4}yv$sڇ'Nu˘gSͽ|iA+ L,.[ą)N= ;}{b?5vuvcY3zXa>6vܦ5``RY-fNvנC[mINՕu%#9VuU\?('.ߦ3ҩXլj 9ŰɆ(_^7v ⠾,q# 8wL!M(at,?|bKWN<7-]1ڪ(e|h%u[ǝ Ŧ.'4 Յ|k'jD& /m ,OLc=V#faTq n[5S''fy0 L.sN sޘbFc`i#\O]ῖsOS'L 5\  #DI Ov^@f GqRᣛ?/.W r|p~S5xS[{Xe IDATށ<Or~O7zrz`YȰq7>JƗW7r12uQN{t{P)5\3Цn> k6m?9v^k;E=!Q`c]ucwD-y?.ٱ|`wIu\G24uF^ۑ %lX7fsWrMua8>^_N|^W_ѺRf:U /2{m愀sNzK\s6N"8O½1ˬ[nI;9/[ Q&}ig/ýM]m[/|j*]5[b׍??w\sBuenݸxWROE;z˻o ?\݆=sozU廒Çn{b'~V~gL.K|:VnYٖ3dp=,ɗZfQKҍ?/~|ynWv%tϼoqU[ӇeS% 6D 80ܷ<44w{<2|kwY/R{ߖtwgyU/qzHҡ$dqjI?}gv&up;isQ~t{=JwGO|cum$M}廓U\?V^Nr|/㿷ڴC6~eD\=uk4RWCmw'u6t=&BNBtI\3XA`/O7O{6k?mMgt/Jvx;)U [?E_6d{Um. Ee/u5bqs/ \Nc{WU$rÃyrk4o_8]]ܾ: 7n?c˻R[o 8dWObe ח`  avYzpa}qML>'frw];qҐAj\$+Ki+`a Ĵ'Ǝz>s{R3m7 |]Ԯ?c\;h8|!'DT?5\ ϱŰdfKY==M?.u‰Յ/ x-jqU3ӥ ]qxEt;. 
8<23ɧ>ZMzݶ]wEkytswiZw ۭ'gGi_usoSR&2zOJH  arF!n;\fRi(s_3F G2YngjiwkӲg=\:F6‰gM[vo.67~ʻ1 \L6u,qy=1o펱-gVi%tsܝxn4]/VvpɼSG3=T+ 1QEA0NfL8cTϞ^<ri[sqž'nnw'yJCb€' {c9o0e!zXb 3x^OmM|{4"Og'33" ȣj83F7ɥ\;'tw:xlK}>,pޘ"Q1# 8*d?5viIU*pAe\Pn➦nښ+JѕY݀"0߸^ Z5-k;*._YýM)U{pCm<>dM{%~^l2 L:{tޘ"`SQWHosזj`Pr8b]w)\< :ϭaӇ*]N]ßΨ|Z)fvv /fqu]W[F]Ml`"E,=.ϸK'fӖdlq>eMeQ I5̊.|rg? _=l+.ßϮNy)B,h 3r]g&sx;={- >V_SKF%%HJ'Y媑hz37-FE+seڅ4|ڟ{N,e$D7 vFΟvB喡ܫfÓӻ4Ť<_+&͊Qxtg:zxU͢ Fy2~ ߃ϱt_.[LVꦌ[.8Dl=-@0rI~knctdٮ>e+?y-r9IhaQp3Յpou8|Vo'qwL,n{)b6= =)H(L|mOBL(XQM9;fܶV S<+:8c]m\Ӈ* W[ᩭ=8P#9wLv$nnпK] U`<<'jOÏE㊙1 &|@ t;jk#Iwq#^$$Yn֮wiA\4N&I?Z €xYo O.igi~e'A@m]}YoV3$Ɨpل"] <0,8mm?7 ;ziMlv<=)1x`C[ĥx\qRCٱ}kX}j? {bX'[Ua.zK[]li]4Mz/.s|v['[S9m/fNT 3==g\NF.Ǟkឦ>";cnv+kr;2gj$q}<:'qZp=V5F$d j@?iqx^! \]oqffQI,I&2Ә ?!SH[N|?m[0)1AW5tO W+1j+Wv1\Nܝ|KǗ$4Md'foIagJVqpbE]Onj4$zw S0oYfȿn'v.:eXx}n6YW"7 OmiƱؖn.zj'=G60`༱V \ܰ7?ǠӂYn~6đn68&K^pi#dgL 8M cK'c.oVcKߜ:y׫-3]ᢧvr[V ݩrzUMg\8W}E'B1r=1\`li<]}^SG%Y˷bŞKޣ;f-v,׏M|7^3ih[YIf@L\d }-t1U]/%nn}0?!PsNh}xXٹ_:<% pM},.љ\M>:lqF'D4q3 `dWcGUIGr0oTj7gnt׷%{d8ݲem=(;]GO ܵ3 +)<+pؗҨ,]j7e!_ѱfǷtZ 2$-We "xdsǍ(2CgN3pzk<][1l8 *5t݌\06ftUQE5Jv8YNsR|%nJ{5An޺NLU4M9}nfs-+j5[|rJߚd98.qȰdK-Xh3$7_Apͤ2ÿO`ecē5K=z+rONW>b\-=MҾt0f2KnxwYGjKx1|aT+ xrm?4Ĉ8=^1g6uFYi;`&Ԗn "?os+}pOgvW9,RKzvBJ+zO]Y?QKڒ aΩ#ɾ'fZsx%)5,5~:{,3:[\uIO󻓇sc'?@h箆΄n QUUVE6~x'+wV5poSlM. 
pxUU9 )s񂋆ޜB*Tmbac7mqY0-UOOlv렛@=XUmwe;_w1I0o~` lcNִx8Bqx9iϱ|a.q+h]߹| S[{y e!Q 9olA[y;xケM,f d2[zyf[Ϡ5R&8ia]# 3QQE=8,Ndq2&4^@BEAm۸ᥝnncY Z}c8wt,'f#c8}1{ hYdLqA&\Jhڇm83oZέ' >1m= x}#9wLSVv9" +P2q?eMqk.,tu=1˒>a]{,-C1Kƻ0§v򨋢Aw>"nB7e08 G%jcٮ>a[C[adQ%!^6_}u7tKFstx9kӳ'0/]8:5]QKc) AO1 mt~1 xQnF_"ǡϱ e!Cy8@eA n6{m7ؚԵ*fMM{lj+&$4f<0㋹o&AkE~{p  'V9H_=oQ W>̋JzXUA e%!'`U G /pCcx/}[\@NO̾9NhUb羦.'?,Z)u} nf6MOOW}ZZ꤆ 1'^1ϽvYdL~@~0c ޖM]f'߭=:*]Q4Pɗ= @ִo^vWՋZ[![/k0,{sD+:.%{`r 7rZ7~}wHzdmq;]oy6lqy`Cs:\rfYx.7l9"\kgԲ2d`Cg;wrYYϭy:f㑪UaW44u'vb<'86=h׿huċv^m̦ oqciL\Nn_“g˭ x%d֡Ʈ$[b\Nv MvMŲ]}<ͬ:/ŌE-9.۶`p[b=63t qffK^I˒`>{ig/m>6ue{8X>.bü=u\Ff߮?o+T-'_jac웺b\\3>V-isk \0ˎ?Gkq#_ouQ~|y9孚BB=1Yil-ma(T G1&* W] į92,r&v -0wm'^6onxߘ"fN.q cau[}aiKZV/sjY-fR)WL,EFj ]<v&( 0sr_ĩ-gJTc[jF8)d^  /J9cTQRN" mUK=Jo9)m\Mu"T¥JR*g,i/{++̜\ʇ&^Wn 1W2OMblғ!.W1RaA9]>>{2*0UZf\IPҐݙzcΨeWCcGƎQֵGykWgף\"dmetu1W8FpxU %A&^@썱X+›",6UQ1ETSaaSW:>Ξ(/ŝ}vDK7e `( W[ldYwj*'H_Ş:hvn$"ϱلZqhާHs'Hl&NI"NibOH,Mz Ӂ/UD|qS^d6!b1S=Ib E$wM S[:ȡj H )?CuyqQMwM`D$%s'J,p^\TDrO'/($IM^( Q-qmN>x/T2|LwExg?;- -xOZo&/"yPE]Y!H@"Cם|zg;_OTgÃҐs uc:%"ip~WƔCY~hֶGyxs7M16v( F8~DABDdEJXtSՏ[$_\wP Њ0V[;{"'2XsO*닋?'x,/V0aO>p 䉳F1!NE?7,wYEƘs|D&Ʈ*UIDATLDrµKR>xs5O$Q3'vF]|"$dbbj @{a֋ي$7s儴TrXbC"N.<6 ,.d7MQom(t{N>Vgٗwq"]caofNV?Lt=CBCN.\fu;eT[$--1{V| n A$DJw̲/Ut>Cbu"kbiCDDDtߟ P ""Vv!Mz.WJ*I.%0h폈HĬL],Ru6'1zPO2Ӟ@=И~3}&:P6Eoqa @(Ж8DDDN`f+HC!%"""6&. LU ""k?Cp.iT @DD$Ǹ+Hދ42sfI``Y8DDD\[a@ Q "" 3kw{%8DDDٛDw {%K@(#[E? Sd_%WV'\F_b 4!""ob`b~2cOC"""bfC~r '@,_gflL m2C|~w$`6 <`93jwn$8fs]íu^xH73[~Gm("""9_ef7$QIsq̺[$'&Bm\W?@|pq.8+[MV qf1oBrC'2uCIUj 0t$""⊱) 57B)W. l̨@dR>M+CSGDD$lQ@1k׃XDDDyp= 7k0ʓ󉈈kuH:pnԓsu|o f{w5+&k~×<=Hz̨{@2`NC6PyEDD̨-X4f͹DDdHboEefC{Ɯs$i5a߁-J u^dOf?w("== |>mo|k';T$}OўoyaR#Q{.8""",DZj;l`^|ڴ_GDDNOLH$}>OQtT@DDdY1~'&NI 8O\ODD6L`>3&w0,3 |7c|тErP>\p 0oX.5ED$W-GT*XD[+Qw?s0fӱè ƲK̛ 8}ݼ6c #H B)&SPh34d& 3" v2y5UV ՖڂCe7 H4@v}y9'ٻ{wΞ>O3k㧗fX3{=$ר~㨏شFH 8 >o qpO=@}odx'{>^{ " }\_}e\_U`]G6HR9 UEmCT i{+ܗAJ!|ڃND HzPI*gϤP? 
0E>u$a򸘥]CT,s^ؘ:E %;X3T<1, ګK S$b|NIiF@I:}Lݬ ySf][KR4*z:D\+oT&f2$iTV]a/gwN&#T|ͻ0{lx8u$%-^C1~ LM#I+n%4.gKKTF9M"I~!*g-~$ُ{x+ y1!ޑ:E&+_I}S|=Ƚ7RH8CȮ!*lEO)4F/R \WyQv^A#~)t~HJzj}CTn` &?.L#Ih }澜:DWG?WOo)Ecp5Yʹp$v^A? \:EޠFvwF~.d7@8=u*#SzZo7JIZh ]W*;?ϔ5ĸ~5 i&;wMfd~֦f0p}}:IRi*""[;?I ol}WVƌٻ.SUBXCgKcp([O1e03uZMDc>4j (0yL=:G/E6B+uZL>_ q1\̙1 y"YADp(M/`` OkWܿP.\?BWYjev' 33Bz:i8JX?:86CYB| Y/]* 2XmSTvaM^g3d6ml*!OdA 5^&/zžfeF˩3TjsYڵ'us諮:C%]p(!'#Ə@\*uJ(W3$ :C4fW *=Ɵ@T7@*/uPVy~+u ,7J!ib8ҮK:Drh5 C0woNQ2sVЊz~NQR_#$Ъ)S?<:EINR sheK )D!} n-07l|lLjN?OOc^/ⳟ/vһzG=KZڵ{GShE^%uGi[b;w%eS4.A\Nk]I@oc`}L1GXI@om]):AorCuGI@G  #뽱$@]wI_!`0O[%+:>1Zn`JI?z훩C$+:>!DåRldxN+:qvspx%!|hK^&1O"xt6x7l ySZL62eYt1@c٩ t?)sT|yS4mp/^1ZXTW~{Zx*QgV"Wa! `IDATxuT[@ADE?Q.]6ǙE-k\ܙsCzIeˀ}>}:III >$Tx4g" J% k0 R;$'%POuM5ф,nF B*ސD le c5B*^I$I|~LJEc)>I01 bilh 6%LL\V&[6'Y|(6 iYl(JɌdds ,SrRPǘSq ?7UU^eCVLFɱUUgE qW#-MUm#YiԺU,vjNHȫJ=`&^TZ& q<$%R4|xXy IVu^7r(f4N*ݍ،flf3M E5Q}&,euMF2bc)m51I~,&!PX@ORLD`EV04z4d.>?'"CEm7Ū*1[j]+8%MQ$ xV쪦m&SlmB fɮ) IaUYFBdIY ) MΠ?halz&Q1.玷9ej0Tk3Zf[A݁zjb5 $ $$1 pPc1T\X%#X7-, J(FFodPAr fD\$i+C2IB "L'RkBu{I4ب0:,X6j*0( SjlDvj[ jQ4 1lg@Ad?N&, Y&|>?$?ŀ(5 ƨ(R ɇ99 B @R0585 &3 I T7gwR]QFn}~E5H@eIRduB~N:†x{'x1\lm@L$JX]G80 C Ѹ#<- [V(Jϡ= UHH`3e%v[bwڨJN~Y)ހ^QL%U!IFŀ`D8TȒ$A(NɕG{uS]?L $US3PZ(&0Hwpdp5d x}#4UKV w( *ޯ`ZK Y1(JQV* dP**Ze5 %I7( J1+#&ce4M#i6S ]utv! 
&hdAEE }M^O^jyU59AU5 AEUI w (ݺygo3[;Dj@8hh䂚+x$lYB`,LM(%yfd*JI1bMH&3A +E1 ln (Z(5u|j5}=<83PxJB$%IX- t;@@4yn&۟b,c !T`E^gSL[eYSV \* Mf4M,Cti]ut"ьhj2!KP[cӷxy4- )BdYXϊ~kgkO  GWA#`0` !bXo1ihZo7Y`44{ن~7'#{޲$cӬ0`0*P?fT02yB!P2#(:{@—  P`TX̛&2b\d7 kLRn7L\z@ttZ.: $Il0b5q_45 B PdP,KH4TUqvz#Ig* 5eIն̩jhAqlJ&$ɧ{u肮bf+~5L0*9 ??q "q:z:}/@+i~ FQV\#Mk֙qعw\l0L^OC?Nۡ ^$I &b,|ަԭ5Jo5vPB}m.sm%b i4 j\ž:jd0P4ٌŊ'':D]u4,c5i6XX=zlup(5r1Cg/$bś~i*.O vPxC~d_,M8;I肮ӥ$ `3q|cpkI'r[" "FX j9ZXaaѤiքvc!׍NCtNI1p,7_X{BMSx)pb;z::Fe4Mݚ )FϦaZ#G\0[PK NޘQ1`5Yq-T]95,/ttH-\&$I\?|r׉6UkM?;,֩)|ɂAVNG N +LuK N,w5dTU쑚I5l'lW(I%U%bAmF89');Ql51h*肮ӡh,X"ͯZWq^?q 7۶{.L$%hG*q0%G G5ּ`uN5ھv}0[C!UYju:]u: b DT)8p]L8,Dd鴌$AU Ǎ=b_BJ账l'_)N(occ?K0%NwZJ);u:]u aKVbv*r=b?ӤWe ϝxIqhGx9۹O֊ƋT]V!<.>vLzN Nۣi"ָDqMIK/:%iEfU$$߾I {(7,  4u%XsB&}4[CJ0]MА3u 䭹pKjz;4pY7mKg A׉R$PC!*+sgSyba-$7^Y^]'hQ1`1l*9lseeU[ {$#uzx ~uܒ۟`c8b;pp{1"G do,ͻ t[jLR]lc5NtkD DIX K._Zhٿl*xVX$lN]YNu 6Ƕ_[&.=X)49i3$ dB)q%TW\9cv5t ]gX- Ŏ'//<&OSCTE] COS=7~~^HiMWn|j-} '9nָO֘|B*eiNX 2/[--m=wfrIڕHڰ0,?!#ʭߦ$[l$m(y6}jjq,hŮ.k;XQtкesnPT}m\hB-Z;4~}֑D0%UԔ^X뫟Y^HHi;Hgw]_4 EV1XiTl.+8P"B[u$CUJ(ZTOahdl9I>"{`fylAej%t"qX*#o?f/rrFx@k<__zhbbCfo\mAծIi!I.:;EtitFN} xU#uT碩sy +8Spյtv bh3o玲]UsE2b7YeIei]u,iDnd>4rEY77.Y;j`wG4Fq6'5M n\e3/|f+X=}HT2l oMLAU֩фE1ljӖ寝i z9k4foY}o 8bEU/M!DMAm;͇gr@Db< b!HzTހڊq? 
{>}A>GP_Ju$bn: qBUsc{aԈfq@UCSPJD,hqEj3ݓjqHslܼшkBy.+oL / NǶ4?[k_|K,$z'FNiX &vo:qښxtd!^BJ8,~F*mH;11f+قb'$K3X JNA=D{ vEQ#O~/ ^5F Ti5{q||^~AEZuZjO)v_Js]xeGySg!cqt"tAdF́`dy/ݰ&$bM;PA{8L& 1V;)XB|c-vbVB?I6' ml&3rݓe94`NDSU~]ٛWdi4ގF3ޜW~;X$c `}簥G(̘dtBBD/ yoa_}2) 6^ h3cOސXbD:>uMHEl_>ɂM=.êʭsYYz {te4MTPR[~菛-<$﫷1a ;X^s&-%.I "u׳{㪯f1iŽTgf:w6Э.Ng$do+K6_gubR  *1f+)xm^vg ~A0('pһqܩ=NW+qC6AVO=moCDuYbΪ޼yՆä﫷5a5L#EVx{_=TžAR:8b:zDfN'.Su1[T8jf˚]$ZmfJ3O0ϝb8d|o'9ԩYn+4cuP]S6ſ*odu$h zАt&f*TPUaO˰G.N8{̶?[pq5LY`IE*GU02cSLЃ4['=b|{kK;zTƖfrM 2lXҢW&c0(]ԣ.譤SZ#o/y}&-Vt$нe% l~_ѣjsnc}gJk"u%Y17o v;fc[TUJ#ϖ?<٦Gp{|lVG͘f!֧>g,M}',1#Ћ!5LfL>MyׯV!!U/i=-X{y<:zDL5gCvOsNtG,5?X5,vREtAªJN\ U}Ғ^'듗Nt*r=] ;x,MDi3&[#^StA aU%+>j;뎯^YVuC/aVT0UP] 5qC(%67xi"ZqAcu-ڪfO3NCGӢid%0٩-iN!!]#՜>vVnͣ]OiC-*6 M.\0Ⰾns\ѽ/hmI'OnTfMF4i% X|Eo3~S^h肾;oZw7y4$yx|q~]gSVkȯ`yI+W[|K?D$-c/$j63_θoEي11Y=铒>=蕔NnB}RHuo?x6gMG>wݙ_{ :{n.taD%%aq]7 ˔w}B}u*YəL6{#rƑctAyiec-ˋ7yXZePU&\0a0f==QG +yP$Ix\ix\3{2$'g`Z7g&=.izj$6=.;h*8bߙ3eZX[WAuYyi(,q]F|,n%%[ϫm34H r關W$0~?7F~$kyR/1VQb̭x!>wXGdսG%A% >39C{fde-yQ}5w}$de']Mg !kȋsٳH{ftsl( f|f_ EŞ{|2({!1KF=߼&!h$YlWCS#$2zXN2cFnTNxd] Mg?Iu3n1:KqV,r0[{g ;SepVϨ)'<@ٴ׫_s+n2%tim-~;;NV^MIIH0Z!ɍKZS?7*,51zݹ;;ws>Kg/ "R"{(B W0\7(~hAN#QGrԠ촗~ yc>g7͞;3xk=m2(t fgÇ_~mrlQu[v əs@8=ljz9MN3~Ot9OhJ&NJpǻ0wYƶډ"`ʋDPb[|VKc鞜Ѧ^Tϧ=t˼n .~! 0݀FCtibbٹ/=t&.)>f%"x.6!WңFvcvucР^:RV]Ff!{y\ NayKKٻȌom*ڈiI&ܔ:O j"G5D amt<}DbWٝ MD_õb7.s,9~{`=d䊜.$ xql'䪑n9, d ΙLJL*ꐕαu)]4,3[7֭ε*ߞX쌨aH'?N#ΤWR8s|=kl=<}Mpv?g!\+"I"֠r+|;::Dwjޜuu tjk t#}s%l[[Sz=DtJMm-'@;ҭl6NtXPUȼaJJ竫cc7Ybs>{a6KJ9}L;osp/9^,C]44Uh21h ^oԓÚE6ipάܺ!^"MJ!;gNfg1k;xᱯ^ӻ dWKkpo̙=>/?x]xvzeQ{*//M3n3o (_Bz. 
#8".9 ]ob62(?Փ- [Uc?KĭUeIB-oB\g4׈:{i?vz%AY̳sM޵w+>2\A $&oo#ry#t>x@=,FdZǩdxFI; !"N.H7`_ ^e'D|mMM_W]Iܪ`I嚬 xIZ!(90d/fqPE{|KeO!AoMoQηskFV%R俬P5nH'3ku6_Ru;g"_30=CKn7V1O;Cz׬Zq#\d?̟1z~YyRQkϚ'(%AAOU![> %"ť 萎8,qoD:R&@~Q2XI 2%EҢDď#?5K FM۸V F٪X5vMݐvm|\Tm;>zKv^ݕvdvYx>SD81'~0@JbuQi5(_Aݩ@W_Qr%W] _ߍox$CgUE@jOJhSSW l_bFMf~{'u 0EoJ,l_j"Zx1+ͨrsz,/]rmolQ"๻:9r紑Wu4'7T0(Co Gv+;&qOL[@/]PPm)syoWtNպ1מ) @ !_ݳrrvQ:Dise+'fJ)!'|Q1p|`" Q~tPT*ϒH%uZZWDS/.w~sy|/+S^^])$WeECjuc$>ےNcW#mD^k#"'E:Kp//UBRa (Ezp8u-~ZUyJ͍Q)iyRuIըjU`ղڞd -h0劣UMܖ+Z(T,/`ϵP =(ҕ ^\^TneMc&EPn=#DK{WКZPDY>WZ[4R)cZ |Z.&xe9%>>AD%Ej_`S|=4,ͽ\*;R.SL,MjڽdA nuzWSlE_)Ul 8~ߟTSu`/sIDUNϋ~ޑOъ始=`aWu)OH+;墼B7QpabYSDb,TxE.+ZZ7.e @ w"s{C)t(IJV BO#驓w{AD;yYw|xO=A(7G~>WDfRBb24E fW;ȯװ}c̈r RtB!S";)wI("+Cd~+t 5.T[{N_A?-oDȩM{Ђztvxc*ךȽS^s>;A+w8DϤoLt"U9@Q):aruwo&Vɚw$ș1?p1ZozemDǶ2ev2TƮ î8Dp&:s3Wz ~MR+P5GΦn"P~Pu,ؕAZs⫺LHwbg@PT7'O_z(DiAT.wS(}dO*= .1EsllFK)jQSA!4wx|tˉ"xWu[LY6r  9 wg8(|LkWGwqjWA萨@?Wb赉Dեs~AcjY[ V^*QߞMNo(Q;_hMơ//-BrӉD`.# ;w GאCt'jA*|+zHgqMUv.Jfi$>H\8ǻPTQͯ*-} 7e%һ)"^]hC0t ί`JrψȿF1_2-nP^ ?VBYxrqk2y!h+A'u8]!E^tVث.PFȧ\n Q%A䧤(82}%;aƢ1t.w ?2(U_]YQTBonKPAQ&MK봤:"n 1\"W#{Ƚ!>3[~>l\:sW(vTeR: Z-n49{QЉ=1Wq9 v&8]uIř oU;_p|dA(F.~1Q- ըi{"Q@ۇ=HF/uW@< {? ʏ^e }wX*DămfV D>;| 䇲MyMPCϞي@R37+NyW䨮[,zEboZzWpx^;7-i0-r/Eb:+=*?b<1R+Ka⬈8>"?w OAm;-/xFQ o}qoڔri3VԪVuTO#5QٹzmUqAΨgF?Gw۶36];xRz>*u䮦2!nuxd"|qw͌2EA/Rՙ;WW$]r[ιzlJ"$u䅇{C˔ "M@> ضy)'C~R^z_[k ࿦~~EǪ;$29U˧*E,M! ꄙUz O}<| kmVtؐ3wn{zĽrȀMC p |-w"?~mƼɴJ(? :D1~ ::c{R݇=O5.VW?ZvEWHzt{&Skq  zpNi(t::yVzxd/'), |ɲa+3 #HsG?>l@u:@)]vSUGY.&W29;dnf?_NdHnxRq:Nx8QoC] ?{R >f[iSWp16Um'=Pj?e^oa`芢ZKc:s}ARYV9S"RP8IpMu@^lfR}wG6jQDUO! tT"R w^dP=|1~gH9~ zz,FuoQ}%)7o@"ҞgtI~9`hueʴ8rRu "G7F賣% kW+8L|o,կINFfL TΗ/? 
r<517@9EuEbnj0XizR}Lsw*t2p0b- }@/gQU7vH^ZG1H[x̘OMƻР@ߚ@\NpE;?a^Ϯ8굅!pCE.ȴAIe$zAq!t"\~&z$#aq ): ^1(5R>MFDDu -?iz+_ rgK՝@}Z{Y=D\ YjL bGGB\tC AWU'B5SR\؁>fWעsA:bC x-O- BH)g,(,"#=B1σ 솓;LF?ߙ._A|؉>ydn97BO?$feRIsRJg1̧eD=*0m{f]wʭ(Uyk#5.ФՐn> +ؽ̚o.D\+g zIϗt~'μkZwCEwI<`,"ޖ߈rD/}33 FkDnn&J|Aҷ% ~{ǣ(}RP?5/'1Kcl7D͵g)so:[.>R{)˃ھmXBnwu%,p5U~cٸj{C \,ԇP6pff6);#Kspymk2Z /~p!pCD9}t:W偹ᩜ_qv;kmw}F8 8Lpz•jH(IE AUhڙzv=$'q錵&Hא{̖qm48wژvf/\iAYIlw^ Ogh(_H\v2Y/\,"e{ >MOӶw33t#G~4rFS٤@#";#|VDnA*73k]o"\D~]UMwYs@*7D7F;Y*73k]PU)U-wݵI͉/o8 "xy ]Y⪈2r ff:$xiD,mw&k1/!˛*ʧ_J6qL؁W(+B<ɂ[ow6M!/moQ*_Y}xGh'7 ׏\Az'Cz Vǹ\>l =% [-b[gtE{u(ǚU@>Fğ#nJpWHw8b n:$gU_ Kgff6+]P7-%\ 6UÂ@ tqM*~|<"-j uX6Ɓ+s7Q^P 1O(z? ETb!5D\\B*}nffSD(333њ9 @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kY8nfft33p5̬fff @733kYFD_u:IENDB`pydantic-2.10.6/docs/logos/twitter_logo.png000066400000000000000000000326251474456633400207460ustar00rootroot00000000000000PNG  IHDRߊ pHYs.#.#x?viTXtXML:com.adobe.xmp Ϡ/K/iIDATxygs0ȾfRoٷJQR!)*􍴠B%TTZ6a33s\f9<۹]3\ss,9eؤI$I6CvI4t$I `K$I `K$I `K$I `K$I `K$I `K$I `K$I `K$I `K$I `K$I `K$I `K$I `K$I `K$I `K$I `K$I `K$I `K$I `K$I `K$I `K$I `K$I 0`ߎ!Iu0+p瘻q!`7٧qqs ճX`d#Q˅ H9w$i KRXXXsR5_$J1 >.j ]X#/,2c~u.J,tIm(*ؼf&8:n"NtJgKj[ف;Ǿƴd*.>l ֭:?{:)JbKjIΒmVO߉Avj ]R ' 1z޿\ O̤L |h'paIu$p(?oĝe>m3$+56nI,tIu*p"4/ !(SuSӴϲē..,tIu>gv`/`8`wb 9?ggNѻt>wPOjKOt#9)E}۩C=.VN# f,m#1/yOH ~z^BTE?F&gi)oьxa?Ċ}9%U#oQU/Ajb{?Mu/ H+KFT,@<:VIG";~|Yti\ ]R7k"#woHϥEcȣ{X2mFLA;4;qĨ;~Cb+ܮ%e|Xt,wG$Db'8kfv <оCu1j Z?` \؍P=V!=0wrIݷ;1 [ѪWtpbۈu?9ަj~TA$u0`(*L@?PGwʺ9N!OWU?Occ#wIY$ub}s\WQsfvJ?!6pJ>>9=CWH*Fҡ fQ5U>w+zĊx;K|G(S{(ab I;]61k_Yz[((w؃IE*Un*>.8'V<01%)(U(Aps)Ƒ4DrbL hIbD\'"O^Bv?nb+񅥑4gDJSQopaĻP1-L l» 
,Upb s!f>lJwJ}~bፉ"ʂ/E٧ d{6"ީ,g&b W@lK}n``kbM&1G}B_Xρv/A \LISw:`_~(.d4pÈW`*RGo3HqWw&qBTs8<.Kx) &Ġ{9+?Q@޲ l]: X]:;XCh֦=/{y?lsJ۱Rn2/Q+uET)xq ~K^w3)R4;Tq_̸p s1AR$Ib~.Q >V.:R[ \@ޘ.@lA/:W+t=,BӀ2]+ZWҜR[d*VAOe{(15r%_Wj!Uh*)Q5O|z`Ku47{ix:;[蛗x \ߥ0WvvnѣBk>ClK\IoVFY6n^JoCHw +-:1/1O=TEnr$Mۉ!RVoPukrʛ/F! ;e]c&(Ї+U͌q( q! );d ^{*Sgd ̑BS~b6'^0A׀0?pXv)n!:|v`=`wbwv>wFڗQuYBxuvN$GeeןFD^LLK4019TRGpaʪ>LSvE0z|x#0&7TCH7?Z߉o3?p,UĴHE?⑞2<1ng~B_p3,R B; jk d ezC,+[Wg*(_*3ŽN|(@z. >YNj9CH.s^b[ 1T]|2x߱{viŻs9b]8ģ͉}Bfy{\>/|x,` |-;"b'V$l וdSzXLf. > r;'gjwZeJj ,F܁oB]55cE0%R`ץRˉS#uK+@xZrb `Ug!4ՌiĿ$gc3-GO>Siα,1{9Q,%曚??o(مwo1R\m&ZX`Ob;ǻS[" }[BYuM{ hz*B!$_*ED{nD>|kB[j%R`,H`?`=pB#?DdK/bE K_(7jd^Hʶ?%9_O|ؓxF/59CH-?%=fB;A+&gQ߇+$9_|6ږx~$0krU eŋ. >[ nO΢Y%;2o+B_5lybաsܤ9CH-25o9J) bJss%gQe}$okڕABxk-weQp` fQƎhwp eqUYTH-p5=qnX(%\CS'^g(*2S>J<:X87JDvv15eknddj?bReQ/RA-7Cn``4a|!lCFp]IQ, ߰wruϼ7mc> QVPu40NnNu70wv!_e%]Kp2QbwdS(П%W زu8 \# FY1'Z2ph[.' ޵Z9Bk;~JL=<DUݤRMO,SBx{' ,GS1cvf^>C139ːYx=u<).CHw-iABػ[7؝p  ρCtSم~!rT"6`nV#CDf趲 U<+X;7N+]꿅t[مpF5U9/p2Kf@9t[FO,TmJܵCTq -NO,|%`D䒽R5n݆BX^TWĆ[nqPCqmv)vm 5xʒqKZg:jYp3140CH51!;@ep$~}5}cRB>K-_Ljjj 7IY@M`ہ+3CU`tvI9~ILaLrƝ6%e ]j;tgPN gs2=@R;t Ov[s|SX?KihvI9PBWwBlӳI̳eWe* i.AX [w(RMTJCOz*ǂvb0٭Ġͺy9R1RmU*}6$!Ve-DQJp(wUB,?{ q#W".M[6Z $;ԇفu;G׉;ہ';o'#%H qnjE.zg疚7$UM"Vjn`1z|,iP<+E^uE bͨ}ym\īqD#uov Ir/W큹H%i~oc;c\7ned*h`obISsHM[B8ELMQB,y$I4!;@ա_%IBvnC\BE.y$IC1N[S|fa-I*O RBXvIR+bIco_]Xgg$b`\Ե>$I׸PB $I\v"Խ[KOe(B t6I;   I)tKm%Ilv"UGtxJ꣑;xU3w.:}9xnK&2>r׈G?bX$ILv"U3z|1G'V$ bsN u^'wa`]5_GTOd(JQ>i/2ex*Qv{4;@Q*퀿wGm ?pnIR}X"@, \ ,XsJG4(8x\~p}Gpb]8$^P"W{Kِb_wKk{uIjiEz+ p:FV#Y8%I4?r(c%x 㕧}*p*;`J=LQ" =c Z&$QljՂF+|q 9%I{0;@,*XsL!Ij& }T17n"#I*T䠸N $Rcb q|-IWύUd?K D$)#32,I4$6P" /IRXCtW$?P4 ]ޡѽJl$ey`]w| I>Zevх$IYnP2 !Iܒ ek%\Gܓ e3#IRL%kQ,:$v0*;D*K$I],ef-$5S*$|UVkI2pSvYтz$Iq#jvY/x=IRq;[*z1;@.J$P u쒯)IjOd(Sمpb5%Ie([iɺ43K$PB#Rҵ%I0-2B8)ڒf*;@BY%Iպq[g%^_Lܲ;|}IR<"Cv \A|ppvIRcq;To!I!T@T{7CdJBTkeTB$IqqvLU*góCHj TGOe$V/e$NC b !$I -^PG `Ob>$Io٪Zf$… Z$&kZڅ)ʺx,;DTПBTYPU/t%a!I$wԡG*7;DUԥ!IsTI] V!$I^T'!IJw3pGv*[|2;$)'SBx<;$)͹2 
$]Q5u-til[f$UTB1;$4sCTQ o<$%ĎL D`!;@U5![';$0cd&: XTgKM+t%NPeM,tRtvIR ܐʚZgIjST] <`c Iu685;D5nn"ICT]  `A$IvBv:hKNvIR=\V?><D4]Pm,t 5g$MDom-tǁ̀óHt6>M6zCHgBWkI[pXoOl $)ϱBĜpXI0x6.*pzrIj_˽j,i .9$Cԑ?H"IMGudcwYvU8nύ#Iqp[vo<1o}u`Wbt$iPgzwA_ 0.7$!BU$~,b!%$iJB2Cfj+]V{SIR ,\J_ؐغZ\ !`xv\9 lA_ XX;)$߉"Ыap&p91DT#uA[g%E e̒d?7N[ }sbہg/a!`AW#ޣX Tcfhu&S `t_&J"Ϙ8xe1oDyX _cIRܝ$m)'|>=k2)7wt)$b0Ms\whS_L*X[ Gz쒤|5;@[ûtI*W*A 0K"<M^qTqR=y%Wa$G_Fz8Wn8 2;DY}$iV[ܐBjJ`ݞ$fۍ~BjKBAtIRYS|>;$īZ%X}8!;$Чe|.Ir pjv }>@*lz>m3ԗGC-\B*n!vz<B*4% ^>B*HfDYsd: }BJv1Ge'+CHRIniC IjIM*B#HR/;Bǁ HR <;BR1;$u8!zw<l%5Ǯs!zRTлG]R]= Bgw#Zֱ|Dg9`CA$iЋ389;$__eY<4 dX$Uk!44zy&UzG* &;B/xb<{dPXyN"櫻lBeGK*GDzC,j&pMvI-> :n!V 069f:82;aW/UHjĖj( > l \ERbYv1,C!T, NVSx%`}D_ _F3^f#zXLDysJD*/V!=X0;B/_h:IHEK"It5AXji= l\QNI1WpHI_A^8I׆"m ~K,cxIUs5Ib:5,~]\YIR׀&gR v G$ϒsH0y|=0w_L"UF7jM||@$uǀwzk#y%SFT##\c/%n~E"FXB8"^RB,=*pW(c#[$xIJҀ{U%^WROéi؎o/ h]x07To_lC<01rN̄YD8R3dzˁ$5KZ3aˍ#5K u +'/Ƒ%O%֪T*z}CqTW_ T*zÁύ#IfY \R\=^~lK35ɍ~lHK8R 0@u݁&!hrpޗ{oƑZ|`ð̥tuC8α.)b{3CI 01,b {v}8 b5a{X4] jRx8sBܱ.#B { F/I񳁫Hzou͈r߼sFFjnIզB18s,L#~\RY%~.Qϧm,FaXXc[H%usOƑmm/N !J~\`=F܉_\A5t P."{Iꃅ=;7,M52Q"]$M^7yD ,̖P${dTOzy&;TD/l, :6b3Ij =+Xv8 ɣĻAlOk$f`z9z[s_xQ)]ӗ2K*^wu$} W,, ]f{x xv7q=ء$%>'njĝĪwK^V "'L} xXfV@I y&|ʯO</"zbc4q=(':>s88MRmYSrOADJ,Éwv^w&9<eFb{ϱB2xg Jx|^$ eK~ By%[IDATxw|dUϤmK[& "݂4T@APEE4.e${3wL7l޹f6sL6̖nu}fffsfff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nfffff)nff g~E#@&CБ6n]n :2^H,#ZFrznhFsyϷC^.333}3/4,@s{ὅOih#NKƪ'_ Z233l{fG2}Ȇ8.#9}S[3gwlAF4{rN}󀧁ӀqyϿ A-CaL3p.j/af[<ۗRL0  /;$`D~*jWz,[x{2\A fffW79oV;ԡN/ö߄wEL433+zgk, ܏W@ 
kfffгd4^O%kp&MG>8\QGCtl2!(Qn O=rfffB =KơLso땫333zu47GwG4F MyǍYT8fW8YW̊i#<36:8Bn~}NT:/zhg5`;`n`+`0p_affTz,C,Wk|u\|x8u_]1,\3*;x-|楗+|U =ʜ ͏'Wn̬ 2 <1zڳm8/cÿ_C˾ ><7eϏ9*9h)Mr> 1d]Q7:'|n&C-rF7/D7e fެ*qQ|~zxx<NC-8( AУ'QwPFh,t#z#Q> ɷ?,؞Az`$J+hE]h {g@۶_8͟': ZfffdNᒋcdͩ33 얆|C%ZPTT1ҹ4`("0VA-WuQ=,꜏ɵAЍ4}MWƹ̬Q{NQ(OF`+`9"RE^ƣ`jAE|@Zo\ј{QkbgffVZ2gȫ(9l"Z|kxATK5?O(ςx?zx;<j>'w78⋫t>w:wI8YQ5 vn@cz=rA5OFGC-BPU$8WD:U﫢ĵhřuzSa)t c@?&p qJכY?V}^۵R -Qz.[V`+lCd"#ъhG乨58Z42ܜQF|Oh4oE`6F oΥsp4p/vc5 d1;T\C_OFs'"&Ncf`Crաe(s>jᯊVe; 6rˡVKhUy_Gt3pt4ꎿԴPvW%ff֏,/j_aZWPA$Ǩx4T"4%Q#ϡ(C(u?Z_-gz'j!G]ǽoeo%7F9p/&7~sPK%}\M_'?BsϱjWvYS:tAC]rVߡ#ѸyێN@ȍSP¹Xc{YT}no oEۇ羆э@Ssfn*/󚙙O [蚏ޑmgpӈj ~>?'o۲(C| j-f,0X1zPK{"Lj=7CjTv4~kcuRӀYc KW|43ɛ%EG~EU -z,V<}h|-ȭS?8,|{PNiӿ%+-@qMIoc4ZnaT?/h+JZxvA]pg?(Ant0\W33X5Ctd-]^PQPIߡ')dWnNn(m( tP[-/xhQihŸ %\|ZQF"M`'E.]H%TwE}fffjBϒ!Z|sߨ[QyύFF;hA\dUöMP`ߘ->oXG|PW}d_Jy{}WơWWo$ _I/fLTnQ13~}>Cbd |`r v4 G{ >M2(XU"7_Pu懿ZC k mEy J+4%ME5֝nffey@_6CS#6UYwyGZMCP@;Mw%%4kaɂ2[ݭUQW-(s33낚&/ɬ):z~;3U; *"jj9 QkL Z<$f~FSQR>;,Y9nZI]̬?y@0wg<[ӟD{}4u@-㿣u\(s%ԪZGķ@-f}?'W&hM(Ia4W] q@f7w]Km*BAt,݁kQW (OFmJ(~ϢTBЂ-_ g|՗GC]ÿޮB7 XhQNx,jгd<3_YqջempToBZY4!`E c(Fs؆JNE@m̶(ޅZ|+7BOPyjC~Jw%Mؔ̀Lk+| (GF˽-<AyrA|6ZyL }XOEWt3Ơ xPM? wP"^(^?--Vc[ Lx/̬Y@Գ`l&)i5נLP65 =x%=DhvB-8h: KninCQr˯ދn$oe[#P/ q(Y5 lCI7Vz<4< lt qcQcy̿ ‘\/ż{oՑkx4mO173~&4 dybMtd+uQ5ތZӐƎVM 2 _BfPW{ԅ0VnA b4l'4Jx;0{Z;U`<&_>/ZG9\OPpFAu7Gk.t3] ~a8&emAcQr4mK4H8N3r ޟUQ@}u[v5wa(~.* ?2juK/dɕ[GQ倞6?[ ug6;P7uqj8zQ9Zs ԝwhѕAYAyjq}u? րnF!W(f#t?e-:n*|sv޶Ma(&eȰ+woT bZ7'ɕ|]oe~QK}U?42 ߈Z ȯPI(Jڏ>!tC1eӷ= _TƵYU5g ơ,hӓCգݠk)4>U4WO/歯J{nuPm ]!7?}(0gL8!j%.Fq~}=\>7 M73Ǫ7ɷW"}x+R?en84ErfQ9(7>uVkC]uhLt0><On{OsG,+Q7E?/h!h़%]z>YVZb=]]x%zaۆ(>~4,Q0-*lZfeoƷ/y׎@AwNض y<|Q8Pz!X  e\332EdI^@^Ch'%6/<M;v&Jrĭ(S}1˰$tgn砢0Mr"_쉲ޟDIok +~8<~u?ndExQ3Urdz . 
Zt^4M [dȵwEh칰z CMy6,|:PuR(}~8m(0nPv&W}_hl~lޗbRYd!DžP ~uiZ+ZYm hf:g/B#vm|& ;PE4Pм*jPk}1 I뎗Dn<{n"|̠s:4|1jCS ~klȍChPջYa[1z̬Z@ +J1*"jބgAr*J.k%$6e2W"W=I]jGMhuCq|P?8䖁%E4 ͟/J73Z@ϒ.SOcw(W+Z+<ۣUjBV@oJ؃khEϮBcYruG5Pн͋e Մo%w'ZtFRg3ϳ'n}23~z.Uax<MZs砱, {`u×vT'~jyGY}[ ι?DK:"^ kM3ᵣFuޗ O[}4(#@Bٿ޾A-6Tv_hzkP~(6M+H ~J;.l> uɿ`^E79hp1gǣ)saטY?4a!jNFE3hi:6~ jվƳAP4Vt-:%D( ~J{u^{][z47 ' y13~jiQu('?a8%@Uܢ+XLnM(HVY{h,5EyeB)õ8pmb3 Lj DݞsG?mGjz\0̖^KC@VDڇZ_EY`z1^G.5ÎUI14 ;Q_mN?j=g5eQtѪgw7P+|ꚟ 7UeCnLԲ&Z)cz IJY?4h_V\~/jTjWuGZP(h4}'X'hz~ m_G.+l$IhBcțQ@~/oPoE[ixcP} G=313~kir e} po#A33>\r@ Gc> .eߟB}v}Ui{ ؞ h-(&2|=ޟBb޵F%e݇OCwGM7۳47 fdˊ;{5'oA7 h:ᚚ}hjoD~sZffU!C{E w7(Zwi\QUuPܒђRBt[+ %C7mϟõCˢn5eOG B4\, WCOYT\gCԫlQSKCR$E jU7EQP=QG5ݯ{%EsO{^Tf1voFb4M4.M޾jv{VpF-SQI׽P[P?*z~{jmQ,CA{p+O 5<3nF":u܏J33תʰ 8zQ+w | cWy?AsqQvPrAp; !R\ԯZV( ;*NquFԻZ囨̝h4ka3w/hKkf+s* 柠Vy AnÿA uϢ !ElcPK{߾BcWhp!d-%T4H4p5\X33ǪWG Yo+kQМ2Fc뢮P02fZhԊ U5j}oUp:a~M8;PQb(}*X. _ ҠP(4>?0 E7* QsU v{VpO`قq$j.@ܗn-Q`Muyc,?Dgɨe]I(1wEmXԵ?e_jVxG!8sU.KUffT^iLR;'[uGFCODyfffScCFu+uȷ¿Q:E 7,w^>P]eZFkF+F.Dn#?e5X4='i4 ?h}9r%m@Ei&}q{+#g*yк#9vwD~ZWlj_%=ƴxPwJJ&hk9x77A]wjm2Fsע!7v-*b[{`ffICϗ1u ؂g&N[ǢP}-`.P0\m0ԍ.D 4.)%ME-qXPy4u?vGnO9+sQ5ts=K5[>-+0iT%; QI2;4-Pw[8M+uܢ'QnFc[:eϯ?sPW h>&A'`~,62733+f#NK6]~J1rq~J?PO5au|QMsjAlGиh{ vcOqМAT%͓o <?J|'̬ߪY@hF%5ף >`P7Ψ=\ׂZG j)Z 7y~5zFRCs7A=gkk GP|}`ff[Mz{b5c>s]@hQ%]},um_ƫ@2O#Wi. 
wE -9P7\ ':j!˹y ֊ DSMz̬i@d2[4ۍ hLΕƠL|,hTD4guGX8~YxM pY81hz<3<~ ܇znA]?B|lwX4/aP؊/?,w1>4?[t*sJuv-,@#P~hxh<+hn(؟ZCPP=Q (k̬L~zcvXz1XˍKԂBŜs&*zJu˿fu> M;%䭅n"~ ATЖ]Ls`Xajf:w<7?e %]Pa;: eWn*FC;|QݨxgOkffZ fNaQ[1ej(oT%|04o(_IyǓ J]}A B$(UKʤ733+#ln8kh|W7Qw(CeB ET  Q+d#I|^FS֢^@S6A7w43~:ųpU+㽘{PPߕ\}_rPuu^FjFcޅF'KqX}jE=V~j㚙YЫ}qBWz\{qh^((K=*݅@yȾ45@ۢzOM33nՀ1gt6XvgU/=hq(aw(ɰϧ|ԲPIkMAk>,h崭Z뻣u7W13~W:4 fK޹z%תt;z6CA˙^Z8XT;~fx%ڍG7FAOcffNMO1sǬ9jk| ,uUoZu X#Q,|=ڧѨtp+P6Q̬?ZmP6]n}޺EhZu&ʎ¶5ɾ]޾+ײRܟPYz{dޢC-3DK0<eA-4:m̬L@oXLK6Yn޾rME-,PKۻwiffo%-}^zk~lr^jtE hnȖ+~>1_̺hZ[ nu@s/̬LZȭi|~c>dhffO u#ڳ}f6f(|9^>Y0?Em{R:٨ܓha^(I4/܁YӧzKP>.N}/%z_ EXrFeP:c繁}&tdiiӓoP4{4|z~Ӂ/:#=ˠ|<-^޾rhamw )Z6%j Mybҍ,hM޽1W(}O`vDmj\YK433iVY493~C}9@}؇cDԂVhffzu4⹏EozfwR 2^Y 4 b7xmý})\ LH>dffVLvg lNn? * |<[gff5`ʼwxo3 -z0h{"гtTW25/̬Pg2q8y-Uz%QGS |+Y^ Wy}#qz33镀0>׭s33 y@0kU̖j5 vԤ[ŠjfffQ%ˠ|8M^عYE,g0'?ѭs33 I@ϒafg睛U\Mzu hUә;5 1yֹY5T{GAM#ϻy޹`ZP) yx%KAÙ09^j&Fh`8 Q@@` 0xxxCQk+0 ˠ;Њ xQBzS}3O2+ Gwّ O[vAقmCWRlefr]x"Mnz|%uoyg.n2A7 .ض tS:x 8T ;hovپ ;_>tJX CPRV __FR0kmgL`$`] %^;kWjkr%uKͬE ,==* KUSqc- [K<wTEYey 4`1~-gyQ_|͗zp|klu'g= }ԡ)2s:8n5،4b^\ ϢSPb#= ѫc&<񾃺 y∄3U22ǘ}V,O3)Ur+')nK]-FCdQG7to@J-ݺ{sQ}7QUt-hA׻9տ2R؂ޫ(Oc"\~]s?З~Uv6kPi69V~|wQ0k6VJ8y~p CN26g`\¾뢲)TT?`qޛN ܍ޗV*gɭ?R|r]QX27Wu;֊1;OoG7+dhC7֥',9m.|݊ZMT&UտE>nFZF|ѠBYyF/t@_zmփޏ/%Y86hơb S2r-Ԃ;Mg=fI7 ԣV8NQh!#P/Kw2~ [] V)Q%d g =ҍsEuV'yᛙ+P(.~ljӄM9h9jXlw ݋r'΢^H.}B 9$- dR=Bԣ(po c~Bn@9З.T<ǣ1mt{nkEsĞ>k:Kp /H}wF] grS ;C7qvŅP44k*`)t0ʝ!h)mOl>+c[Խqη6jZcOBAV&u5D-p%f'0WOyJِn+aįfK>wj՛55T"p/)vaFv5/K3Sb K$x~E`.i(l2솲׿L|D^wh|pC4d $gz^CYsQ4ܰ)GHn\k%%ϐ\7=}$gNљj^ԣVNe &yYE}\bdM5]Otm 4iGt<3]3Z['\, Bَ)h눯=.J29FGY[Nm/d['(=}JlzqmPε7%aB'y17<[ѴRVDcG8?C\dlZ\Dk]!fѴJⅨ7?;)ٛݿϢ.fϢnh[hvLF܉baWq&K)/znA7ԣ7Ha^R`$o?kք'{r51Lr!)E*@eQ()(rp:!?__`:;Weӕ  &S*1)$(I%';klTnbS).OWZ)e%J\4_*Kr>@} r@%?Ҏ~Ec&~`tKt%Ky5hf0Jnŝe]‚ o?)QJt[ BSS5Mׅ<> NHWptCUz1N`oQ8.ķ`݇;f weFE\'VC؃cWKC3@~Fy}!wT\ 2Ky?Czf~Cs'm-=PTY 3C/LdOw_).\1N;>A$tSRNRdɩHVEoC|U^X ϡ}-QCHƢY}]RUKHz8Kgao7G|/+]o<(}9yT΢{O :OGCmMOI>3h^{\QhY:畨۷'n@]ITOς9(0>g’zF~.=7>ϧO2Z^K֒~F1i52$w  
Then you'll want to do:

```bash
cd /path/to/repo_folder
bump-pydantic my_package
```

See more about it on the [Bump Pydantic](https://github.com/pydantic/bump-pydantic) repository.

## Continue using Pydantic V1 features

Pydantic V1 is still available when you need it, though we recommend migrating to Pydantic V2 for its improvements and new features.

If you need to use the latest Pydantic V1, you can install it with:

```bash
pip install "pydantic==1.*"
```

The Pydantic V2 package also continues to provide access to the Pydantic V1 API by importing through `pydantic.v1`.
For example, you can use the `BaseModel` class from Pydantic V1 instead of the Pydantic V2 `pydantic.BaseModel` class:

```python {test="skip" lint="skip" upgrade="skip"}
from pydantic.v1 import BaseModel
```

You can also import functions that have been removed from Pydantic V2, such as `lenient_isinstance`:

```python {test="skip" lint="skip" upgrade="skip"}
from pydantic.v1.utils import lenient_isinstance
```

Pydantic V1 documentation is available at [https://docs.pydantic.dev/1.10/](https://docs.pydantic.dev/1.10/).

### Using Pydantic v1 features in a v1/v2 environment

As of `pydantic>=1.10.17`, the `pydantic.v1` namespace can be used within V1.
This makes it easier to migrate to V2, which also supports the `pydantic.v1` namespace.
In order to unpin a `pydantic<2` dependency and continue using V1 features, take the following steps:

1. Replace `pydantic<2` with `pydantic>=1.10.17`
2. Find and replace all occurrences of:

    ```python {test="skip" lint="skip" upgrade="skip"}
    from pydantic.<module> import <object>
    ```

    with:

    ```python {test="skip" lint="skip" upgrade="skip"}
    from pydantic.v1.<module> import <object>
    ```

Here's how you can import `pydantic`'s v1 features based on your version of `pydantic`:

=== "`pydantic>=1.10.17,<3`"

    As of `v1.10.17` the `.v1` namespace is available in V1, allowing imports as below:

    ```python {test="skip" lint="skip" upgrade="skip"}
    from pydantic.v1.fields import ModelField
    ```

=== "`pydantic<3`"

    All versions of Pydantic V1 and V2 support the following import pattern, in case you don't know
    which version of Pydantic you are using:

    ```python {test="skip" lint="skip" upgrade="skip"}
    try:
        from pydantic.v1.fields import ModelField
    except ImportError:
        from pydantic.fields import ModelField
    ```

!!! note
    When importing modules using `pydantic>=1.10.17,<2` with the `.v1` namespace these modules will *not* be the **same** module as the same import without the `.v1` namespace, but the symbols imported *will* be.
    For example `pydantic.v1.fields is not pydantic.fields` but `pydantic.v1.fields.ModelField is pydantic.fields.ModelField`.
    Luckily, this is not likely to be relevant in the vast majority of cases.
    It's just an unfortunate consequence of providing a smoother migration experience.

## Migration guide

The following sections provide details on the most important changes in Pydantic V2.

### Changes to `pydantic.BaseModel`

Various method names have been changed; all non-deprecated `BaseModel` methods now have names matching either the format `model_.*` or `__.*pydantic.*__`. Where possible, we have retained the deprecated methods with their old names to help ease migration, but calling them will emit `DeprecationWarning`s.

| Pydantic V1 | Pydantic V2 |
| ----------- | ----------- |
| `__fields__` | `model_fields` |
| `__private_attributes__` | `__pydantic_private__` |
| `__validators__` | `__pydantic_validator__` |
| `construct()` | `model_construct()` |
| `copy()` | `model_copy()` |
| `dict()` | `model_dump()` |
| `json_schema()` | `model_json_schema()` |
| `json()` | `model_dump_json()` |
| `parse_obj()` | `model_validate()` |
| `update_forward_refs()` | `model_rebuild()` |

* Some of the built-in data-loading functionality has been slated for removal. In particular, `parse_raw` and `parse_file` are now deprecated. In Pydantic V2, `model_validate_json` works like `parse_raw`. Otherwise, you should load the data and then pass it to `model_validate`.
* The `from_orm` method has been deprecated; you can now just use `model_validate` (equivalent to `parse_obj` from Pydantic V1) to achieve something similar, as long as you've set `from_attributes=True` in the model config.
* The `__eq__` method has changed for models.
    * Models can only be equal to other `BaseModel` instances.
    * For two model instances to be equal, they must have the same:
        * Type (or, in the case of generic models, non-parametrized generic origin type)
        * Field values
        * Extra values (only relevant when `model_config['extra'] == 'allow'`)
        * Private attribute values; models with different values of private attributes are no longer equal.
    * Models are no longer equal to the dicts containing their data.
    * Non-generic models of different types are never equal.
    * Generic models with different origin types are never equal. We don't require *exact* type equality so that, for example, instances of `MyGenericModel[Any]` could be equal to instances of `MyGenericModel[int]`.
* We have replaced the use of the `__root__` field to specify a "custom root model" with a new type called [`RootModel`](concepts/models.md#rootmodel-and-custom-root-types) which is intended to replace the functionality of using a field called `__root__` in Pydantic V1. Note, `RootModel` types no longer support the `arbitrary_types_allowed` config setting. See [this issue comment](https://github.com/pydantic/pydantic/issues/6710#issuecomment-1700948167) for an explanation.
* We have significantly expanded Pydantic's capabilities related to customizing serialization. In particular, we have added the [`@field_serializer`](api/functional_serializers.md#pydantic.functional_serializers.field_serializer), [`@model_serializer`](api/functional_serializers.md#pydantic.functional_serializers.model_serializer), and [`@computed_field`](api/fields.md#pydantic.fields.computed_field) decorators, which each address various shortcomings from Pydantic V1.
    * See [Custom serializers](concepts/serialization.md#custom-serializers) for the usage docs of these new decorators.
    * Due to performance overhead and implementation complexity, we have now deprecated support for specifying `json_encoders` in the model config.
        This functionality was originally added for the purpose of achieving custom serialization logic, and we think the new serialization decorators are a better choice in most common scenarios.
* We have changed the behavior related to serializing subclasses of models when they occur as nested fields in a parent model. In V1, we would always include all fields from the subclass instance. In V2, when we dump a model, we only include the fields that are defined on the annotated type of the field. This helps prevent some accidental security bugs. You can read more about this (including how to opt out of this behavior) in the [Subclass instances for fields of BaseModel, dataclasses, TypedDict](concepts/serialization.md#subclass-instances-for-fields-of-basemodel-dataclasses-typeddict) section of the model exporting docs.
* `GetterDict` has been removed as it was just an implementation detail of `orm_mode`, which has been removed.
* In many cases, arguments passed to the constructor will be **copied** in order to perform validation and, where necessary, coercion. This is notable in the case of passing mutable objects as arguments to a constructor. You can see an example + more detail [here](https://docs.pydantic.dev/latest/concepts/models/#attribute-copies).
* The `.json()` method is deprecated, and attempting to use this deprecated method with arguments such as `indent` or `ensure_ascii` may lead to confusing errors. For best results, switch to V2's equivalent, `model_dump_json()`. If you'd still like to use said arguments, you can use [this workaround](https://github.com/pydantic/pydantic/issues/8825#issuecomment-1946206415).
* JSON serialization of non-string key values is generally done with `str(key)`, leading to some changes in behavior such as the following:

    ```python
    from typing import Dict, Optional

    from pydantic import BaseModel as V2BaseModel
    from pydantic.v1 import BaseModel as V1BaseModel


    class V1Model(V1BaseModel):
        a: Dict[Optional[str], int]


    class V2Model(V2BaseModel):
        a: Dict[Optional[str], int]


    v1_model = V1Model(a={None: 123})
    v2_model = V2Model(a={None: 123})

    # V1
    print(v1_model.json())
    #> {"a": {"null": 123}}

    # V2
    print(v2_model.model_dump_json())
    #> {"a":{"None":123}}
    ```

* `model_dump_json()` results are compacted in order to save space, and don't always exactly match that of `json.dumps()` output. That being said, you can easily modify the separators used in `json.dumps()` results in order to align the two outputs:

    ```python
    import json
    from typing import List

    from pydantic import BaseModel as V2BaseModel
    from pydantic.v1 import BaseModel as V1BaseModel


    class V1Model(V1BaseModel):
        a: List[str]


    class V2Model(V2BaseModel):
        a: List[str]


    v1_model = V1Model(a=['fancy', 'sushi'])
    v2_model = V2Model(a=['fancy', 'sushi'])

    # V1
    print(v1_model.json())
    #> {"a": ["fancy", "sushi"]}

    # V2
    print(v2_model.model_dump_json())
    #> {"a":["fancy","sushi"]}

    # Plain json.dumps
    print(json.dumps(v2_model.model_dump()))
    #> {"a": ["fancy", "sushi"]}

    # Modified json.dumps
    print(json.dumps(v2_model.model_dump(), separators=(',', ':')))
    #> {"a":["fancy","sushi"]}
    ```

### Changes to `pydantic.generics.GenericModel`

The `pydantic.generics.GenericModel` class is no longer necessary, and has been removed. Instead, you can now create generic `BaseModel` subclasses by just adding `Generic` as a parent class on a `BaseModel` subclass directly. This looks like `class MyGenericModel(BaseModel, Generic[T]): ...`.

Mixing of V1 and V2 models is not supported which means that type parameters of such generic `BaseModel` (V2) cannot be V1 models.
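The direct-`Generic` pattern described above can be sketched as follows (the `Response` model and its `data` field are illustrative names, not taken from the Pydantic docs):

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

T = TypeVar('T')


class Response(BaseModel, Generic[T]):
    # In V1 this would have subclassed `pydantic.generics.GenericModel`;
    # in V2 a plain `BaseModel` with `Generic[T]` is all that's needed.
    data: T


# Parametrizing works the same way as it did with `GenericModel`:
print(Response[int](data='1'))
#> data=1
```

Parametrized usage such as `Response[int]` validates and coerces `data` against the concrete type argument, just as `GenericModel[int]` did in V1.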
While it may not raise an error, we strongly advise against using _parametrized_ generics in `isinstance` checks.

* For example, you should not do `isinstance(my_model, MyGenericModel[int])`. However, it is fine to do `isinstance(my_model, MyGenericModel)`. (Note that for standard generics, it would raise an error to do a subclass check with a parameterized generic.)
* If you need to perform `isinstance` checks against parametrized generics, you can do this by subclassing the parametrized generic class. This looks like `class MyIntModel(MyGenericModel[int]): ...` and `isinstance(my_model, MyIntModel)`.

Find more information in the [Generic models](concepts/models.md#generic-models) documentation.

### Changes to `pydantic.Field`

`Field` no longer supports arbitrary keyword arguments to be added to the JSON schema. Instead, any extra data you want to add to the JSON schema should be passed as a dictionary to the `json_schema_extra` keyword argument.

In Pydantic V1, the `alias` property returns the field's name when no alias is set. In Pydantic V2, this behavior has changed to return `None` when no alias is set.

The following properties have been removed from or changed in `Field`:

- `const`
- `min_items` (use `min_length` instead)
- `max_items` (use `max_length` instead)
- `unique_items`
- `allow_mutation` (use `frozen` instead)
- `regex` (use `pattern` instead)
- `final` (use the [typing.Final][] type hint instead)

Field constraints are no longer automatically pushed down to the parameters of generics. For example, you can no longer validate every element of a list matches a regex by providing `my_list: list[str] = Field(pattern=".*")`.
Instead, use [`typing.Annotated`][] to provide an annotation on the `str` itself: `my_list: list[Annotated[str, Field(pattern=".*")]]`

* [TODO: Need to document any other backwards-incompatible changes to `pydantic.Field`]

### Changes to dataclasses

Pydantic [dataclasses](concepts/dataclasses.md) continue to be useful for enabling the data validation on standard dataclasses without having to subclass `BaseModel`. Pydantic V2 introduces the following changes to this dataclass behavior:

* When used as fields, dataclasses (Pydantic or vanilla) no longer accept tuples as validation inputs; dicts should be used instead.
* The `__post_init__` in Pydantic dataclasses will now be called _after_ validation, rather than before.
    * As a result, the `__post_init_post_parse__` method would have become redundant, so has been removed.
* Pydantic no longer supports `extra='allow'` for Pydantic dataclasses, where extra fields passed to the initializer would be stored as extra attributes on the dataclass. `extra='ignore'` is still supported for the purpose of ignoring unexpected fields while parsing data, they just won't be stored on the instance.
* Pydantic dataclasses no longer have an attribute `__pydantic_model__`, and no longer use an underlying `BaseModel` to perform validation or provide other functionality.
    * To perform validation, generate a JSON schema, or make use of any other functionality that may have required `__pydantic_model__` in V1, you should now wrap the dataclass with a [`TypeAdapter`][pydantic.type_adapter.TypeAdapter] ([discussed more below](#introduction-of-typeadapter)) and make use of its methods.
* In Pydantic V1, if you used a vanilla (i.e., non-Pydantic) dataclass as a field, the config of the parent type would be used as though it was the config for the dataclass itself as well. In Pydantic V2, this is no longer the case.
* In Pydantic V2, to override the config (like you would with `model_config` on a `BaseModel`), you can use the `config` parameter on the `@dataclass` decorator. See [Dataclass Config](concepts/dataclasses.md#dataclass-config) for examples.

### Changes to config

* In Pydantic V2, to specify config on a model, you should set a class attribute called `model_config` to be a dict with the key/value pairs you want to be used as the config. The Pydantic V1 behavior to create a class called `Config` in the namespace of the parent `BaseModel` subclass is now deprecated.
* When subclassing a model, the `model_config` attribute is inherited. This is helpful in the case where you'd like to use a base class with a given configuration for many models. Note, if you inherit from multiple `BaseModel` subclasses, like `class MyModel(Model1, Model2)`, the non-default settings in the `model_config` attribute from the two models will be merged, and for any settings defined in both, those from `Model2` will override those from `Model1`.
* The following config settings have been removed:
    * `allow_mutation` — this has been removed. You should be able to use [frozen](api/config.md#pydantic.config.ConfigDict) equivalently (inverse of current use).
    * `error_msg_templates`
    * `fields` — this was the source of various bugs, so has been removed. You should be able to use `Annotated` on fields to modify them as desired.
    * `getter_dict` — `orm_mode` has been removed, and this implementation detail is no longer necessary.
    * `smart_union`.
    * `underscore_attrs_are_private` — the Pydantic V2 behavior is now the same as if this was always set to `True` in Pydantic V1.
    * `json_loads`
    * `json_dumps`
    * `copy_on_model_validation`
    * `post_init_call`
* The following config settings have been renamed:
    * `allow_population_by_field_name` → `populate_by_name`
    * `anystr_lower` → `str_to_lower`
    * `anystr_strip_whitespace` → `str_strip_whitespace`
    * `anystr_upper` → `str_to_upper`
    * `keep_untouched` → `ignored_types`
    * `max_anystr_length` → `str_max_length`
    * `min_anystr_length` → `str_min_length`
    * `orm_mode` → `from_attributes`
    * `schema_extra` → `json_schema_extra`
    * `validate_all` → `validate_default`

See the [`ConfigDict` API reference][pydantic.config.ConfigDict] for more details.

### Changes to validators

#### `@validator` and `@root_validator` are deprecated

* `@validator` has been deprecated, and should be replaced with [`@field_validator`](concepts/validators.md), which provides various new features and improvements.
    * The new `@field_validator` decorator does not have the `each_item` keyword argument; validators you want to apply to items within a generic container should be added by annotating the type argument. See [validators in Annotated metadata](concepts/types.md#composing-types-via-annotated) for details. This looks like `List[Annotated[int, Field(ge=0)]]`
    * Even if you keep using the deprecated `@validator` decorator, you can no longer add the `field` or `config` arguments to the signature of validator functions. If you need access to these, you'll need to migrate to `@field_validator` — see the [next section](#changes-to-validators-allowed-signatures) for more details.
    * If you use the `always=True` keyword argument to a validator function, note that standard validators for the annotated type will _also_ be applied even to defaults, not just the custom validators. For example, despite the fact that the validator below will never error, the following code raises a `ValidationError`:

!!! note
    To avoid this, you can use the `validate_default` argument in the `Field` function.
    When set to `True`, it mimics the behavior of `always=True` in Pydantic v1.
    However, the new way of using `validate_default` is encouraged as it provides more flexibility and control.

```python {test="skip"}
from pydantic import BaseModel, validator


class Model(BaseModel):
    x: str = 1

    @validator('x', always=True)
    @classmethod
    def validate_x(cls, v):
        return v


Model()
```

* `@root_validator` has been deprecated, and should be replaced with
  [`@model_validator`](api/functional_validators.md#pydantic.functional_validators.model_validator),
  which also provides new features and improvements.
    * Under some circumstances (such as assignment when `model_config['validate_assignment'] is True`),
      the `@model_validator` decorator will receive an instance of the model, not a dict of values.
      You may need to be careful to handle this case.
    * Even if you keep using the deprecated `@root_validator` decorator, due to refactors in validation logic,
      you can no longer run with `skip_on_failure=False` (which is the default value of this keyword argument,
      so must be set explicitly to `True`).

#### Changes to `@validator`'s allowed signatures

In Pydantic V1, functions wrapped by `@validator` could receive keyword arguments with metadata about
what was being validated. Some of these arguments have been removed from `@field_validator` in Pydantic V2:

* `config`: Pydantic V2's config is now a dictionary instead of a class,
  which means this argument is no longer backwards compatible.
  If you need to access the configuration you should migrate to `@field_validator` and use `info.config`.
* `field`: this argument used to be a `ModelField` object, which was a quasi-internal class
  that no longer exists in Pydantic V2.
  Most of this information can still be accessed by using the field name from `info.field_name`
  to index into `cls.model_fields`

```python
from pydantic import BaseModel, ValidationInfo, field_validator


class Model(BaseModel):
    x: int

    @field_validator('x')
    def val_x(cls, v: int, info: ValidationInfo) -> int:
        assert info.config is not None
        print(info.config.get('title'))
        #> Model
        print(cls.model_fields[info.field_name].is_required())
        #> True
        return v


Model(x=1)
```

#### `TypeError` is no longer converted to `ValidationError` in validators

Previously, when raising a `TypeError` within a validator function,
that error would be wrapped into a `ValidationError` and, in some cases (such as with FastAPI),
these errors might be displayed to end users.
This led to a variety of undesirable behavior — for example, calling a function with the wrong signature
might produce a user-facing `ValidationError`.

However, in Pydantic V2, when a `TypeError` is raised in a validator,
it is no longer converted into a `ValidationError`:

```python
import pytest

from pydantic import BaseModel, field_validator  # or validator


class Model(BaseModel):
    x: int

    @field_validator('x')
    def val_x(cls, v: int) -> int:
        return str.lower(v)  # raises a TypeError


with pytest.raises(TypeError):
    Model(x=1)
```

This applies to all validation decorators.

#### Validator behavior changes

Pydantic V2 includes some changes to type coercion. For example:

* coercing `int`, `float`, and `Decimal` values to strings is now optional and disabled by default,
  see [Coerce Numbers to Strings][pydantic.config.ConfigDict.coerce_numbers_to_str].
* iterable of pairs is no longer coerced to a dict.

See the [Conversion table](concepts/conversion_table.md) for details on Pydantic V2 type coercion defaults.

#### The `allow_reuse` keyword argument is no longer necessary

Previously, Pydantic tracked "reused" functions in decorators as this was a common source of mistakes.
We did this by comparing the function's fully qualified name (module name + function name),
which could result in false positives.
The `allow_reuse` keyword argument could be used to disable this when it was intentional.

Our approach to detecting repeatedly defined functions has been overhauled to only error for redefinition
within a single class, reducing false positives and bringing the behavior more in line with the errors that
type checkers and linters would give for defining a method with the same name multiple times
in a single class definition.

In nearly all cases, if you were using `allow_reuse=True`,
you should be able to simply delete that keyword argument and have things keep working as expected.

#### `@validate_arguments` has been renamed to `@validate_call`

In Pydantic V2, the `@validate_arguments` decorator has been renamed to `@validate_call`.

In Pydantic V1, the decorated function had various attributes added, such as `raw_function`, and `validate`
(which could be used to validate arguments without actually calling the decorated function).
Due to limited use of these attributes, and performance-oriented changes in implementation,
we have not preserved this functionality in `@validate_call`.

### Input types are not preserved

In Pydantic V1 we made great efforts to preserve the types of all field inputs for generic collections
when they were proper subtypes of the field annotations.
For example, given the annotation `Mapping[str, int]`, if you passed in a `collections.Counter()`
you'd get a `collections.Counter()` as the value.

Supporting this behavior in V2 would have negative performance implications for the general case
(we'd have to check types every time) and would add a lot of complexity to validation.
Further, even in V1 this behavior was inconsistent and partially broken:
it did not work for many types (`str`, `UUID`, etc.), and for generic collections it's impossible to
re-build the original input correctly without a lot of special casing
(consider `ChainMap`; rebuilding the input is necessary because we need to replace values after validation,
e.g. if coercing strings to ints).

In Pydantic V2 we no longer attempt to preserve the input type in all cases;
instead, we only promise that the output type will match the type annotations.

Going back to the `Mapping` example, we promise the output will be a valid `Mapping`,
and in practice it will be a plain `dict`:

```python
from typing import Mapping

from pydantic import TypeAdapter


class MyDict(dict):
    pass


ta = TypeAdapter(Mapping[str, int])
v = ta.validate_python(MyDict())
print(type(v))
#> <class 'dict'>
```

If you want the output type to be a specific type, consider annotating it as such or implementing a custom validator:

```python
from typing import Any, Mapping, TypeVar

from typing_extensions import Annotated

from pydantic import (
    TypeAdapter,
    ValidationInfo,
    ValidatorFunctionWrapHandler,
    WrapValidator,
)


def restore_input_type(
    value: Any, handler: ValidatorFunctionWrapHandler, _info: ValidationInfo
) -> Any:
    return type(value)(handler(value))


T = TypeVar('T')
PreserveType = Annotated[T, WrapValidator(restore_input_type)]

ta = TypeAdapter(PreserveType[Mapping[str, int]])


class MyDict(dict):
    pass


v = ta.validate_python(MyDict())
assert type(v) is MyDict
```

While we don't promise to preserve input types everywhere,
we _do_ preserve them for subclasses of `BaseModel`, and for dataclasses:

```python
import pydantic.dataclasses
from pydantic import BaseModel


class InnerModel(BaseModel):
    x: int


class OuterModel(BaseModel):
    inner: InnerModel


class SubInnerModel(InnerModel):
    y: int


m = OuterModel(inner=SubInnerModel(x=1, y=2))
print(m)
#> inner=SubInnerModel(x=1, y=2)


@pydantic.dataclasses.dataclass
class InnerDataclass:
    x: int


@pydantic.dataclasses.dataclass
class SubInnerDataclass(InnerDataclass):
    y: int


@pydantic.dataclasses.dataclass
class OuterDataclass:
    inner: InnerDataclass


d = OuterDataclass(inner=SubInnerDataclass(x=1, y=2))
print(d)
#> OuterDataclass(inner=SubInnerDataclass(x=1, y=2))
```

### Changes to Handling of Standard Types

#### Dicts

Iterables of pairs (which include empty iterables) no longer pass validation for fields of type `dict`.

#### Unions

While union types will still attempt validation of each choice from left to right,
they now preserve the type of the input whenever possible,
even if the correct type is not the first choice for which the input would pass validation.
As a demonstration, consider the following example:

```python
from typing import Union

from pydantic import BaseModel


class Model(BaseModel):
    x: Union[int, str]


print(Model(x='1'))
#> x='1'
```

In Pydantic V1, the printed result would have been `x=1`, since the value would pass validation as an `int`.
In Pydantic V2, we recognize that the value is an instance of one of the cases
and short-circuit the standard union validation.

To revert to the non-short-circuiting left-to-right behavior of V1,
annotate the union with `Field(union_mode='left_to_right')`.
See [Union Mode](./concepts/unions.md#union-modes) for more details.

#### Required, optional, and nullable fields

Pydantic V2 changes some of the logic for specifying whether a field annotated as `Optional` is required
(i.e., has no default value) or not (i.e., has a default value of `None` or any other value of the corresponding type),
and now more closely matches the behavior of `dataclasses`.
Similarly, fields annotated as `Any` no longer have a default value of `None`.
The following table describes the behavior of field annotations in V2:

| State                                                 | Field Definition            |
|-------------------------------------------------------|-----------------------------|
| Required, cannot be `None`                            | `f1: str`                   |
| Not required, cannot be `None`, is `'abc'` by default | `f2: str = 'abc'`           |
| Required, can be `None`                               | `f3: Optional[str]`         |
| Not required, can be `None`, is `None` by default     | `f4: Optional[str] = None`  |
| Not required, can be `None`, is `'abc'` by default    | `f5: Optional[str] = 'abc'` |
| Required, can be any type (including `None`)          | `f6: Any`                   |
| Not required, can be any type (including `None`)      | `f7: Any = None`            |

!!! note
    A field annotated as `typing.Optional[T]` will be required, and will allow for a value of `None`.
    It does not mean that the field has a default value of `None`. _(This is a breaking change from V1.)_

!!! note
    Any default value if provided makes a field not required.

Here is a code example demonstrating the above:

```python
from typing import Optional

from pydantic import BaseModel, ValidationError


class Foo(BaseModel):
    f1: str  # required, cannot be None
    f2: Optional[str]  # required, can be None - same as str | None
    f3: Optional[str] = None  # not required, can be None
    f4: str = 'Foobar'  # not required, but cannot be None


try:
    Foo(f1=None, f2=None, f4='b')
except ValidationError as e:
    print(e)
    """
    1 validation error for Foo
    f1
      Input should be a valid string [type=string_type, input_value=None, input_type=NoneType]
    """
```

#### Patterns / regex on strings

Pydantic V1 used Python's regex library. Pydantic V2 uses the Rust [regex crate].
This crate is not just a "Rust version of regular expressions",
it's a completely different approach to regular expressions.
In particular, it promises linear time searching of strings in exchange for dropping a couple of features
(namely look arounds and backreferences).
We believe this is a tradeoff worth making, in particular because Pydantic is used to validate untrusted input
where ensuring things don't accidentally run in exponential time depending on the untrusted input is important.
On the flipside, for anyone not using these features,
complex regex validation should be orders of magnitude faster because it's done in Rust and in linear time.

If you still want to use Python's regex library, you can use the
[`regex_engine`](./api/config.md#pydantic.config.ConfigDict.regex_engine) config setting.

[regex crate]: https://github.com/rust-lang/regex

### Introduction of `TypeAdapter`

Pydantic V1 had weak support for validating or serializing non-`BaseModel` types.

To work with them, you had to either create a "root" model or use the utility functions in `pydantic.tools`
(namely, `parse_obj_as` and `schema_of`).

In Pydantic V2 this is _a lot_ easier: the [`TypeAdapter`][pydantic.type_adapter.TypeAdapter] class lets you create
an object with methods for validating, serializing, and producing JSON schemas for arbitrary types.
This serves as a complete replacement for `parse_obj_as` and `schema_of` (which are now deprecated),
and also covers some of the use cases of "root" models.
([`RootModel`](concepts/models.md#rootmodel-and-custom-root-types),
[discussed above](#changes-to-pydanticbasemodel), covers the others.)

```python
from typing import List

from pydantic import TypeAdapter

adapter = TypeAdapter(List[int])
assert adapter.validate_python(['1', '2', '3']) == [1, 2, 3]
print(adapter.json_schema())
#> {'items': {'type': 'integer'}, 'type': 'array'}
```

Due to limitations of inferring generic types with common type checkers,
to get proper typing in some scenarios, you may need to explicitly specify the generic parameter:

```python {test="skip"}
from pydantic import TypeAdapter

adapter = TypeAdapter[str | int](str | int)
...
```

See [Type Adapter](concepts/type_adapter.md) for more information.
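Serialization works through the same object. The following is a small sketch (not part of the original examples) of the dump counterparts to the validation methods shown above:

```python
from typing import List

from pydantic import TypeAdapter

adapter = TypeAdapter(List[int])

# the inverse of validate_python/validate_json:
# dump_python returns a plain Python object, dump_json returns JSON bytes
assert adapter.dump_python([1, 2, 3]) == [1, 2, 3]
assert adapter.dump_json([1, 2, 3]) == b'[1,2,3]'
```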
### Defining custom types

We have completely overhauled the way custom types are defined in pydantic.

We have exposed hooks for generating both `pydantic-core` and JSON schemas,
allowing you to get all the performance benefits of Pydantic V2 even when using your own custom types.

We have also introduced ways to use [`typing.Annotated`][] to add custom validation to your own types.

The main changes are:

* `__get_validators__` should be replaced with `__get_pydantic_core_schema__`.
  See [Custom Data Types](concepts/types.md#customizing_validation_with_get_pydantic_core_schema) for more information.
* `__modify_schema__` becomes `__get_pydantic_json_schema__`.
  See [JSON Schema Customization](concepts/json_schema.md#customizing-json-schema) for more information.

Additionally, you can use [`typing.Annotated`][] to modify or provide the `__get_pydantic_core_schema__` and
`__get_pydantic_json_schema__` functions of a type by annotating it, rather than modifying the type itself.
This provides a powerful and flexible mechanism for integrating third-party types with Pydantic,
and in some cases may help you remove hacks from Pydantic V1 introduced to work around the limitations for custom types.

See [Custom Data Types](concepts/types.md#custom-types) for more information.

### Changes to JSON schema generation

We received many requests over the years to make changes to the JSON schemas that pydantic generates.

In Pydantic V2, we have tried to address many of the common requests:

* The JSON schema for `Optional` fields now indicates that the value `null` is allowed.
* The `Decimal` type is now exposed in JSON schema (and serialized) as a string.
* The JSON schema no longer preserves namedtuples as namedtuples.
* The JSON schema we generate by default now targets draft 2020-12 (with some OpenAPI extensions).
* When they differ, you can now specify if you want the JSON schema representing the inputs to validation,
  or the outputs from serialization.
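The last point can be seen with a type like `Decimal`, which serializes as a string but accepts other inputs during validation; a small sketch:

```python
from decimal import Decimal

from pydantic import BaseModel


class Model(BaseModel):
    x: Decimal


# the default 'validation' schema describes accepted inputs,
# while 'serialization' describes the dumped output (a string for Decimal)
print(Model.model_json_schema(mode='validation')['properties']['x'])
print(Model.model_json_schema(mode='serialization')['properties']['x'])
```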
However, there have been many reasonable requests over the years for changes which we have not chosen to implement.

In Pydantic V1, even if you were willing to implement changes yourself,
it was very difficult because the JSON schema generation process involved various recursive function calls;
to override one, you'd have to copy and modify the whole implementation.

In Pydantic V2, one of our design goals was to make it easier to customize JSON schema generation.
To this end, we have introduced the class
[`GenerateJsonSchema`](api/json_schema.md#pydantic.json_schema.GenerateJsonSchema),
which implements the translation of a type's pydantic-core schema into a JSON schema.
By design, this class breaks the JSON schema generation process into smaller methods
that can be easily overridden in subclasses to modify the "global" approach to generating JSON schema.

The various methods that can be used to produce JSON schema
(such as `BaseModel.model_json_schema` or `TypeAdapter.json_schema`) accept a keyword argument
`schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema`,
and you can pass your custom subclass to these methods in order to use your own approach to generating JSON schema.

Hopefully this means that if you disagree with any of the choices we've made,
or if you are reliant on behaviors in Pydantic V1 that have changed in Pydantic V2,
you can use a custom `schema_generator`,
modifying the `GenerateJsonSchema` class as necessary for your application.

### `BaseSettings` has moved to `pydantic-settings`

[`BaseSettings`](api/pydantic_settings.md#pydantic_settings.BaseSettings),
the base object for Pydantic [settings management](concepts/pydantic_settings.md),
has been moved to a separate package,
[`pydantic-settings`](https://github.com/pydantic/pydantic-settings).

Also, the `parse_env_var` classmethod has been removed.
So, you need to [customise settings sources](concepts/pydantic_settings.md#customise-settings-sources)
to have your own parsing function.
### Color and Payment Card Numbers moved to `pydantic-extra-types`

The following special-use types have been moved to the
[Pydantic Extra Types](https://github.com/pydantic/pydantic-extra-types) package,
which may be installed separately if needed.

* [Color Types](api/pydantic_extra_types_color.md)
* [Payment Card Numbers](api/pydantic_extra_types_payment.md)

### Url and Dsn types in `pydantic.networks` no longer inherit from `str`

In Pydantic V1 the [`AnyUrl`][pydantic.networks.AnyUrl] type inherited from `str`,
and all the other `Url` and `Dsn` types inherited from these.
In Pydantic V2 these types are built on two new `Url` and `MultiHostUrl` classes using `Annotated`.

Inheriting from `str` had upsides and downsides, and for V2 we decided it would be better to remove this.
To use these types in APIs which expect `str` you'll now need to convert them (with `str(url)`).

Pydantic V2 uses Rust's [Url](https://crates.io/crates/url) crate for URL validation.
Some of the URL validation differs slightly from the previous behavior in V1.
One notable difference is that the new `Url` types append slashes to the validated version if no path is included,
even if a slash is not specified in the argument to a `Url` type constructor.
See the example below for this behavior:

```python
from pydantic import AnyUrl

assert str(AnyUrl(url='https://google.com')) == 'https://google.com/'
assert str(AnyUrl(url='https://google.com/')) == 'https://google.com/'
assert str(AnyUrl(url='https://google.com/api')) == 'https://google.com/api'
assert str(AnyUrl(url='https://google.com/api/')) == 'https://google.com/api/'
```

If you still want to use the old behavior without the appended slash, take a look at this
[solution](https://github.com/pydantic/pydantic/issues/7186#issuecomment-1690235887).
### Constrained types

The `Constrained*` classes were _removed_, and you should replace them by `Annotated[<type>, Field(...)]`, for example:

```python {test="skip"}
from pydantic import BaseModel, ConstrainedInt


class MyInt(ConstrainedInt):
    ge = 0


class Model(BaseModel):
    x: MyInt
```

...becomes:

```python
from typing_extensions import Annotated

from pydantic import BaseModel, Field

MyInt = Annotated[int, Field(ge=0)]


class Model(BaseModel):
    x: MyInt
```

Read more about it in the [Composing types via `Annotated`](concepts/types.md#composing-types-via-annotated) docs.

For `ConstrainedStr` you can use [`StringConstraints`][pydantic.types.StringConstraints] instead.

#### Mypy Plugins

Pydantic V2 contains a
[mypy](https://mypy.readthedocs.io/en/stable/extending_mypy.html#configuring-mypy-to-use-plugins)
plugin in `pydantic.mypy`.

When using [V1 features](migration.md#continue-using-pydantic-v1-features),
the `pydantic.v1.mypy` plugin might need to also be enabled.

To configure the `mypy` plugins:

=== "`mypy.ini`"

    ```ini
    [mypy]
    plugins = pydantic.mypy, pydantic.v1.mypy  # include `.v1.mypy` if required.
    ```

=== "`pyproject.toml`"

    ```toml
    [tool.mypy]
    plugins = [
        "pydantic.mypy",
        "pydantic.v1.mypy",
    ]
    ```

## Other changes

* Dropped support for [`email-validator<2.0.0`](https://github.com/JoshData/python-email-validator).
  Make sure to update using `pip install -U email-validator`.
## Moved in Pydantic V2

| Pydantic V1 | Pydantic V2 |
| --- | --- |
| `pydantic.BaseSettings` | [`pydantic_settings.BaseSettings`](#basesettings-has-moved-to-pydantic-settings) |
| `pydantic.color` | [`pydantic_extra_types.color`][pydantic_extra_types.color] |
| `pydantic.types.PaymentCardBrand` | [`pydantic_extra_types.PaymentCardBrand`](#color-and-payment-card-numbers-moved-to-pydantic-extra-types) |
| `pydantic.types.PaymentCardNumber` | [`pydantic_extra_types.PaymentCardNumber`](#color-and-payment-card-numbers-moved-to-pydantic-extra-types) |
| `pydantic.utils.version_info` | [`pydantic.version.version_info`][pydantic.version.version_info] |
| `pydantic.error_wrappers.ValidationError` | [`pydantic.ValidationError`][pydantic_core.ValidationError] |
| `pydantic.utils.to_camel` | [`pydantic.alias_generators.to_pascal`][pydantic.alias_generators.to_pascal] |
| `pydantic.utils.to_lower_camel` | [`pydantic.alias_generators.to_camel`][pydantic.alias_generators.to_camel] |
| `pydantic.PyObject` | [`pydantic.ImportString`][pydantic.types.ImportString] |

## Deprecated and moved in Pydantic V2

| Pydantic V1 | Pydantic V2 |
| --- | --- |
| `pydantic.tools.schema_of` | `pydantic.deprecated.tools.schema_of` |
| `pydantic.tools.parse_obj_as` | `pydantic.deprecated.tools.parse_obj_as` |
| `pydantic.tools.schema_json_of` | `pydantic.deprecated.tools.schema_json_of` |
| `pydantic.json.pydantic_encoder` | `pydantic.deprecated.json.pydantic_encoder` |
| `pydantic.validate_arguments` | `pydantic.deprecated.decorator.validate_arguments` |
| `pydantic.json.custom_pydantic_encoder` | `pydantic.deprecated.json.custom_pydantic_encoder` |
| `pydantic.json.ENCODERS_BY_TYPE` | `pydantic.deprecated.json.ENCODERS_BY_TYPE` |
| `pydantic.json.timedelta_isoformat` | `pydantic.deprecated.json.timedelta_isoformat` |
| `pydantic.decorator.validate_arguments` | `pydantic.deprecated.decorator.validate_arguments` |
| `pydantic.class_validators.validator` | `pydantic.deprecated.class_validators.validator` |
| `pydantic.class_validators.root_validator` | `pydantic.deprecated.class_validators.root_validator` |
| `pydantic.utils.deep_update` | `pydantic.v1.utils.deep_update` |
| `pydantic.utils.GetterDict` | `pydantic.v1.utils.GetterDict` |
| `pydantic.utils.lenient_issubclass` | `pydantic.v1.utils.lenient_issubclass` |
| `pydantic.utils.lenient_isinstance` | `pydantic.v1.utils.lenient_isinstance` |
| `pydantic.utils.is_valid_field` | `pydantic.v1.utils.is_valid_field` |
| `pydantic.utils.update_not_none` | `pydantic.v1.utils.update_not_none` |
| `pydantic.utils.import_string` | `pydantic.v1.utils.import_string` |
| `pydantic.utils.Representation` | `pydantic.v1.utils.Representation` |
| `pydantic.utils.ROOT_KEY` | `pydantic.v1.utils.ROOT_KEY` |
| `pydantic.utils.smart_deepcopy` | `pydantic.v1.utils.smart_deepcopy` |
| `pydantic.utils.sequence_like` | `pydantic.v1.utils.sequence_like` |

## Removed in Pydantic V2

- `pydantic.ConstrainedBytes`
- `pydantic.ConstrainedDate`
- `pydantic.ConstrainedDecimal`
- `pydantic.ConstrainedFloat`
- `pydantic.ConstrainedFrozenSet`
- `pydantic.ConstrainedInt`
- `pydantic.ConstrainedList`
- `pydantic.ConstrainedSet`
- `pydantic.ConstrainedStr`
- `pydantic.JsonWrapper`
- `pydantic.NoneBytes` - This was an alias to `None | bytes`.
- `pydantic.NoneStr` - This was an alias to `None | str`.
- `pydantic.NoneStrBytes` - This was an alias to `None | str | bytes`.
- `pydantic.Protocol`
- `pydantic.Required`
- `pydantic.StrBytes` - This was an alias to `str | bytes`.
- `pydantic.compiled`
- `pydantic.config.get_config`
- `pydantic.config.inherit_config`
- `pydantic.config.prepare_config`
- `pydantic.create_model_from_namedtuple`
- `pydantic.create_model_from_typeddict`
- `pydantic.dataclasses.create_pydantic_model_from_dataclass`
- `pydantic.dataclasses.make_dataclass_validator`
- `pydantic.dataclasses.set_validation`
- `pydantic.datetime_parse.parse_date`
- `pydantic.datetime_parse.parse_time`
- `pydantic.datetime_parse.parse_datetime`
- `pydantic.datetime_parse.parse_duration`
- `pydantic.error_wrappers.ErrorWrapper`
- `pydantic.errors.AnyStrMaxLengthError`
- `pydantic.errors.AnyStrMinLengthError`
- `pydantic.errors.ArbitraryTypeError`
- `pydantic.errors.BoolError`
- `pydantic.errors.BytesError`
- `pydantic.errors.CallableError`
- `pydantic.errors.ClassError`
- `pydantic.errors.ColorError`
- `pydantic.errors.ConfigError`
- `pydantic.errors.DataclassTypeError`
- `pydantic.errors.DateError`
- `pydantic.errors.DateNotInTheFutureError`
- `pydantic.errors.DateNotInThePastError`
- `pydantic.errors.DateTimeError`
- `pydantic.errors.DecimalError`
- `pydantic.errors.DecimalIsNotFiniteError`
- `pydantic.errors.DecimalMaxDigitsError`
- `pydantic.errors.DecimalMaxPlacesError`
- `pydantic.errors.DecimalWholeDigitsError`
- `pydantic.errors.DictError`
- `pydantic.errors.DurationError`
- `pydantic.errors.EmailError`
- `pydantic.errors.EnumError`
- `pydantic.errors.EnumMemberError`
- `pydantic.errors.ExtraError`
- `pydantic.errors.FloatError`
- `pydantic.errors.FrozenSetError`
- `pydantic.errors.FrozenSetMaxLengthError`
- `pydantic.errors.FrozenSetMinLengthError`
- `pydantic.errors.HashableError`
- `pydantic.errors.IPv4AddressError`
- `pydantic.errors.IPv4InterfaceError`
- `pydantic.errors.IPv4NetworkError`
- `pydantic.errors.IPv6AddressError`
- `pydantic.errors.IPv6InterfaceError`
- `pydantic.errors.IPv6NetworkError`
- `pydantic.errors.IPvAnyAddressError`
- `pydantic.errors.IPvAnyInterfaceError`
- `pydantic.errors.IPvAnyNetworkError`
- `pydantic.errors.IntEnumError`
- `pydantic.errors.IntegerError`
- `pydantic.errors.InvalidByteSize`
- `pydantic.errors.InvalidByteSizeUnit`
- `pydantic.errors.InvalidDiscriminator`
- `pydantic.errors.InvalidLengthForBrand`
- `pydantic.errors.JsonError`
- `pydantic.errors.JsonTypeError`
- `pydantic.errors.ListError`
- `pydantic.errors.ListMaxLengthError`
- `pydantic.errors.ListMinLengthError`
- `pydantic.errors.ListUniqueItemsError`
- `pydantic.errors.LuhnValidationError`
- `pydantic.errors.MissingDiscriminator`
- `pydantic.errors.MissingError`
- `pydantic.errors.NoneIsAllowedError`
- `pydantic.errors.NoneIsNotAllowedError`
- `pydantic.errors.NotDigitError`
- `pydantic.errors.NotNoneError`
- `pydantic.errors.NumberNotGeError`
- `pydantic.errors.NumberNotGtError`
- `pydantic.errors.NumberNotLeError`
- `pydantic.errors.NumberNotLtError`
- `pydantic.errors.NumberNotMultipleError`
- `pydantic.errors.PathError`
- `pydantic.errors.PathNotADirectoryError`
- `pydantic.errors.PathNotAFileError`
- `pydantic.errors.PathNotExistsError`
- `pydantic.errors.PatternError`
- `pydantic.errors.PyObjectError`
- `pydantic.errors.PydanticTypeError`
- `pydantic.errors.PydanticValueError`
- `pydantic.errors.SequenceError`
- `pydantic.errors.SetError`
- `pydantic.errors.SetMaxLengthError`
- `pydantic.errors.SetMinLengthError`
- `pydantic.errors.StrError`
- `pydantic.errors.StrRegexError`
- `pydantic.errors.StrictBoolError`
- `pydantic.errors.SubclassError`
- `pydantic.errors.TimeError`
- `pydantic.errors.TupleError`
- `pydantic.errors.TupleLengthError`
- `pydantic.errors.UUIDError`
- `pydantic.errors.UUIDVersionError`
- `pydantic.errors.UrlError`
- `pydantic.errors.UrlExtraError`
- `pydantic.errors.UrlHostError`
- `pydantic.errors.UrlHostTldError`
- `pydantic.errors.UrlPortError`
- `pydantic.errors.UrlSchemeError`
- `pydantic.errors.UrlSchemePermittedError`
- `pydantic.errors.UrlUserInfoError`
- `pydantic.errors.WrongConstantError`
- `pydantic.main.validate_model`
- `pydantic.networks.stricturl`
- `pydantic.parse_file_as`
- `pydantic.parse_raw_as`
- `pydantic.stricturl`
- `pydantic.tools.parse_file_as`
- `pydantic.tools.parse_raw_as`
- `pydantic.types.JsonWrapper`
- `pydantic.types.NoneBytes`
- `pydantic.types.NoneStr`
- `pydantic.types.NoneStrBytes`
- `pydantic.types.PyObject`
- `pydantic.types.StrBytes`
- `pydantic.typing.evaluate_forwardref`
- `pydantic.typing.AbstractSetIntStr`
- `pydantic.typing.AnyCallable`
- `pydantic.typing.AnyClassMethod`
- `pydantic.typing.CallableGenerator`
- `pydantic.typing.DictAny`
- `pydantic.typing.DictIntStrAny`
- `pydantic.typing.DictStrAny`
- `pydantic.typing.IntStr`
- `pydantic.typing.ListStr`
- `pydantic.typing.MappingIntStrAny`
- `pydantic.typing.NoArgAnyCallable`
- `pydantic.typing.NoneType`
- `pydantic.typing.ReprArgs`
- `pydantic.typing.SetStr`
- `pydantic.typing.StrPath`
- `pydantic.typing.TupleGenerator`
- `pydantic.typing.WithArgsTypes`
- `pydantic.typing.all_literal_values`
- `pydantic.typing.display_as_type`
- `pydantic.typing.get_all_type_hints`
- `pydantic.typing.get_args`
- `pydantic.typing.get_origin`
- `pydantic.typing.get_sub_types`
- `pydantic.typing.is_callable_type`
- `pydantic.typing.is_classvar`
- `pydantic.typing.is_finalvar`
- `pydantic.typing.is_literal_type`
- `pydantic.typing.is_namedtuple`
- `pydantic.typing.is_new_type`
- `pydantic.typing.is_none_type`
- `pydantic.typing.is_typeddict`
- `pydantic.typing.is_typeddict_special`
- `pydantic.typing.is_union`
- `pydantic.typing.new_type_supertype`
- `pydantic.typing.resolve_annotations`
- `pydantic.typing.typing_base`
- `pydantic.typing.update_field_forward_refs`
- `pydantic.typing.update_model_forward_refs`
- `pydantic.utils.ClassAttribute`
- `pydantic.utils.DUNDER_ATTRIBUTES`
- `pydantic.utils.PyObjectStr`
- `pydantic.utils.ValueItems`
- `pydantic.utils.almost_equal_floats`
- `pydantic.utils.get_discriminator_alias_and_values`
- `pydantic.utils.get_model`
- `pydantic.utils.get_unique_discriminator_alias`
- `pydantic.utils.in_ipython`
- `pydantic.utils.is_valid_identifier`
- `pydantic.utils.path_type`
- `pydantic.utils.validate_field_name`
- `pydantic.validate_model`

pydantic-2.10.6/docs/plugins/conversion_table.py

from __future__ import annotations as _annotations

import collections
import typing
from collections import deque
from dataclasses import dataclass
from datetime import date, datetime, time, timedelta
from decimal import Decimal
from enum import Enum, IntEnum
from ipaddress import IPv4Address, IPv4Interface, IPv4Network, IPv6Address, IPv6Interface, IPv6Network
from pathlib import Path
from typing import Any, Iterable, Mapping, Pattern, Sequence, Type
from uuid import UUID

from pydantic_core import CoreSchema, core_schema
from typing_extensions import TypedDict

from pydantic import ByteSize, InstanceOf


@dataclass
class Row:
    field_type: type[Any] | str
    input_type: type[Any] | str
    python_input: bool = False
    json_input: bool = False
    strict: bool = False
    condition: str | None = None
    valid_examples: list[Any] | None = None
    invalid_examples: list[Any] | None = None
    core_schemas: list[type[CoreSchema]] | None = None

    @property
    def field_type_str(self) -> str:
        return f'{self.field_type.__name__}' if hasattr(self.field_type, '__name__') else f'{self.field_type}'

    @property
    def input_type_str(self) -> str:
        return f'{self.input_type.__name__}' if hasattr(self.input_type, '__name__') else f'{self.input_type}'

    @property
    def input_source_str(self) -> str:
        if self.python_input:
            if self.json_input:
                return 'Python & JSON'
            else:
                return 'Python'
        elif self.json_input:
            return 'JSON'
        else:
            return ''


@dataclass
class ConversionTable:
    rows: list[Row]

    col_names = [
        'Field Type',
        'Input',
        'Strict',
        'Input Source',
        'Conditions',
    ]

    open_nowrap_span = '<span style="white-space: nowrap;">'
    close_nowrap_span = '</span>'

    def col_values(self, row: Row) -> list[str]:
        o = self.open_nowrap_span
        c = self.close_nowrap_span
        return [
            f'{o}`{row.field_type_str}`{c}',
            f'{o}`{row.input_type_str}`{c}',
            '✓' if row.strict else '',
            f'{o}{row.input_source_str}{c}',
            row.condition if row.condition else '',
        ]

    @staticmethod
    def row_as_markdown(cols: list[str]) -> str:
        return f'| {" | ".join(cols)} |'

    def as_markdown(self) -> str:
        lines = [self.row_as_markdown(self.col_names), self.row_as_markdown(['-'] * len(self.col_names))]
        for row in self.rows:
            lines.append(self.row_as_markdown(self.col_values(row)))
        return '\n'.join(lines)

    @staticmethod
    def row_sort_key(row: Row) -> Any:
        field_type = row.field_type_str or ' '
        input_type = row.input_type_str or ' '
        input_source = row.input_source_str
        # Include the .isupper() to make it so that leading-lowercase items come first
        return field_type[0].isupper(), field_type, input_type[0].isupper(), input_type, input_source

    def sorted(self) -> ConversionTable:
        return ConversionTable(sorted(self.rows, key=self.row_sort_key))

    def filtered(self, predicate: typing.Callable[[Row], bool]) -> ConversionTable:
        return ConversionTable([row for row in self.rows if predicate(row)])


table_rows: list[Row] = [
    Row(
        str,
        str,
        strict=True,
        python_input=True,
        json_input=True,
        core_schemas=[core_schema.StringSchema],
    ),
    Row(
        str,
        bytes,
        python_input=True,
        condition='Assumes UTF-8, error on unicode decoding error.',
        valid_examples=[b'this is bytes'],
        invalid_examples=[b'\x81'],
        core_schemas=[core_schema.StringSchema],
    ),
    Row(
        str,
        bytearray,
        python_input=True,
        condition='Assumes UTF-8, error on unicode decoding error.',
        valid_examples=[bytearray(b'this is bytearray' * 3)],
        invalid_examples=[bytearray(b'\x81' * 5)],
        core_schemas=[core_schema.StringSchema],
    ),
    Row(
        bytes,
        bytes,
        strict=True,
        python_input=True,
        core_schemas=[core_schema.BytesSchema],
    ),
    Row(
        bytes,
        str,
        strict=True,
        json_input=True,
        valid_examples=['foo'],
        core_schemas=[core_schema.BytesSchema],
    ),
    Row(
        bytes,
        str,
        python_input=True,
        valid_examples=['foo'],
        core_schemas=[core_schema.BytesSchema],
    ),
    Row(
        bytes,
        bytearray,
        python_input=True,
        valid_examples=[bytearray(b'this is bytearray' * 3)],
        core_schemas=[core_schema.BytesSchema],
    ),
    Row(
        int,
        int,
        strict=True,
        python_input=True,
        json_input=True,
        condition='`bool` is explicitly forbidden.',
        invalid_examples=[2**64, True, False],
        core_schemas=[core_schema.IntSchema],
    ),
    Row(
        int,
        int,
        python_input=True,
        json_input=True,
        core_schemas=[core_schema.IntSchema],
    ),
    Row(
        int,
        float,
        python_input=True,
        json_input=True,
        condition='Must be exact int, e.g. `val % 1 == 0`, raises error for `nan`, `inf`.',
        valid_examples=[2.0],
        invalid_examples=[2.1, 2.2250738585072011e308, float('nan'), float('inf')],
        core_schemas=[core_schema.IntSchema],
    ),
    Row(
        int,
        Decimal,
        python_input=True,
        condition='Must be exact int, e.g. `val % 1 == 0`.',
        valid_examples=[Decimal(2.0)],
        invalid_examples=[Decimal(2.1)],
        core_schemas=[core_schema.IntSchema],
    ),
    Row(
        int,
        bool,
        python_input=True,
        json_input=True,
        valid_examples=[True, False],
        core_schemas=[core_schema.IntSchema],
    ),
    Row(
        int,
        str,
        python_input=True,
        json_input=True,
        condition='Must be numeric only, e.g. `[0-9]+`.',
        valid_examples=['123'],
        invalid_examples=['test', '123x'],
        core_schemas=[core_schema.IntSchema],
    ),
    Row(
        int,
        bytes,
        python_input=True,
        condition='Must be numeric only, e.g. `[0-9]+`.',
        valid_examples=[b'123'],
        invalid_examples=[b'test', b'123x'],
        core_schemas=[core_schema.IntSchema],
    ),
    Row(
        float,
        float,
        strict=True,
        python_input=True,
        json_input=True,
        condition='`bool` is explicitly forbidden.',
        invalid_examples=[True, False],
        core_schemas=[core_schema.FloatSchema],
    ),
    Row(
        float,
        int,
        strict=True,
        python_input=True,
        json_input=True,
        valid_examples=[123],
        core_schemas=[core_schema.FloatSchema],
    ),
    Row(
        float,
        str,
        python_input=True,
        json_input=True,
        condition='Must match `[0-9]+(\\.[0-9]+)?`.',
        valid_examples=['3.141'],
        invalid_examples=['test', '3.141x'],
        core_schemas=[core_schema.FloatSchema],
    ),
    Row(
        float,
        bytes,
        python_input=True,
        condition='Must match `[0-9]+(\\.[0-9]+)?`.',
        valid_examples=[b'3.141'],
        invalid_examples=[b'test', b'3.141x'],
        core_schemas=[core_schema.FloatSchema],
    ),
    Row(
        float,
        Decimal,
        python_input=True,
        valid_examples=[Decimal(3.5)],
        core_schemas=[core_schema.FloatSchema],
    ),
    Row(
        float,
        bool,
        python_input=True,
        json_input=True,
        valid_examples=[True, False],
        core_schemas=[core_schema.FloatSchema],
    ),
    Row(
        bool,
        bool,
        strict=True,
        python_input=True,
        json_input=True,
        valid_examples=[True, False],
        core_schemas=[core_schema.BoolSchema],
    ),
    Row(
        bool,
        int,
        python_input=True,
        json_input=True,
        condition='Allowed values: `0, 1`.',
        valid_examples=[0, 1],
        invalid_examples=[2, 100],
        core_schemas=[core_schema.BoolSchema],
    ),
    Row(
        bool,
        float,
        python_input=True,
        json_input=True,
        condition='Allowed values: `0.0, 1.0`.',
        valid_examples=[0.0, 1.0],
        invalid_examples=[2.0, 100.0],
        core_schemas=[core_schema.BoolSchema],
    ),
    Row(
        bool,
        Decimal,
        python_input=True,
        condition='Allowed values: `Decimal(0), Decimal(1)`.',
        valid_examples=[Decimal(0), Decimal(1)],
        invalid_examples=[Decimal(2), Decimal(100)],
        core_schemas=[core_schema.BoolSchema],
    ),
    Row(
        bool,
        str,
        python_input=True,
        json_input=True,
        condition=(
            "Allowed values: `'f'`, `'n'`, `'no'`, `'off'`, `'false'`, `'False'`, `'t'`, `'y'`, "
            "`'on'`, `'yes'`, `'true'`, `'True'`."
), valid_examples=['f', 'n', 'no', 'off', 'false', 'False', 't', 'y', 'on', 'yes', 'true', 'True'], invalid_examples=['test'], core_schemas=[core_schema.BoolSchema], ), Row( None, None, strict=True, python_input=True, json_input=True, core_schemas=[core_schema.NoneSchema], ), Row( date, date, strict=True, python_input=True, core_schemas=[core_schema.DateSchema], ), Row( date, datetime, python_input=True, condition='Must be exact date, eg. no `H`, `M`, `S`, `f`.', valid_examples=[datetime(2017, 5, 5)], invalid_examples=[datetime(2017, 5, 5, 10)], core_schemas=[core_schema.DateSchema], ), Row( date, str, python_input=True, json_input=True, condition='Format: `YYYY-MM-DD`.', valid_examples=['2017-05-05'], invalid_examples=['2017-5-5', '2017/05/05'], core_schemas=[core_schema.DateSchema], ), Row( date, bytes, python_input=True, condition='Format: `YYYY-MM-DD` (UTF-8).', valid_examples=[b'2017-05-05'], invalid_examples=[b'2017-5-5', b'2017/05/05'], core_schemas=[core_schema.DateSchema], ), Row( date, int, python_input=True, json_input=True, condition=( 'Interpreted as seconds or ms from epoch. ' 'See [speedate](https://docs.rs/speedate/latest/speedate/). Must be exact date.' ), valid_examples=[1493942400000, 1493942400], invalid_examples=[1493942401000], core_schemas=[core_schema.DateSchema], ), Row( date, float, python_input=True, json_input=True, condition=( 'Interpreted as seconds or ms from epoch. ' 'See [speedate](https://docs.rs/speedate/latest/speedate/). Must be exact date.' ), valid_examples=[1493942400000.0, 1493942400.0], invalid_examples=[1493942401000.0], core_schemas=[core_schema.DateSchema], ), Row( date, Decimal, python_input=True, condition=( 'Interpreted as seconds or ms from epoch. ' 'See [speedate](https://docs.rs/speedate/latest/speedate/). Must be exact date.' 
), valid_examples=[Decimal(1493942400000), Decimal(1493942400)], invalid_examples=[Decimal(1493942401000)], core_schemas=[core_schema.DateSchema], ), Row( datetime, datetime, strict=True, python_input=True, core_schemas=[core_schema.DatetimeSchema], ), Row( datetime, date, python_input=True, valid_examples=[date(2017, 5, 5)], core_schemas=[core_schema.DatetimeSchema], ), Row( datetime, str, python_input=True, json_input=True, condition='Format: `YYYY-MM-DDTHH:MM:SS.f` or `YYYY-MM-DD`. See [speedate](https://docs.rs/speedate/latest/speedate/).', valid_examples=['2017-05-05 10:10:10', '2017-05-05T10:10:10.0002', '2017-05-05 10:10:10+00:00', '2017-05-05'], invalid_examples=['2017-5-5T10:10:10'], core_schemas=[core_schema.DatetimeSchema], ), Row( datetime, bytes, python_input=True, condition=( 'Format: `YYYY-MM-DDTHH:MM:SS.f` or `YYYY-MM-DD`. See [speedate](https://docs.rs/speedate/latest/speedate/), (UTF-8).' ), valid_examples=[b'2017-05-05 10:10:10', b'2017-05-05T10:10:10.0002', b'2017-05-05 10:10:10+00:00'], invalid_examples=[b'2017-5-5T10:10:10'], core_schemas=[core_schema.DatetimeSchema], ), Row( datetime, int, python_input=True, json_input=True, condition='Interpreted as seconds or ms from epoch, see [speedate](https://docs.rs/speedate/latest/speedate/).', valid_examples=[1493979010000, 1493979010], core_schemas=[core_schema.DatetimeSchema], ), Row( datetime, float, python_input=True, json_input=True, condition='Interpreted as seconds or ms from epoch, see [speedate](https://docs.rs/speedate/latest/speedate/).', valid_examples=[1493979010000.0, 1493979010.0], core_schemas=[core_schema.DatetimeSchema], ), Row( datetime, Decimal, python_input=True, condition='Interpreted as seconds or ms from epoch, see [speedate](https://docs.rs/speedate/latest/speedate/).', valid_examples=[Decimal(1493979010000), Decimal(1493979010)], core_schemas=[core_schema.DatetimeSchema], ), Row( time, time, strict=True, python_input=True, core_schemas=[core_schema.TimeSchema], ), Row( time, 
str, python_input=True, json_input=True, condition='Format: `HH:MM:SS.FFFFFF`. See [speedate](https://docs.rs/speedate/latest/speedate/).', valid_examples=['10:10:10.0002'], invalid_examples=['1:1:1'], core_schemas=[core_schema.TimeSchema], ), Row( time, bytes, python_input=True, condition='Format: `HH:MM:SS.FFFFFF`. See [speedate](https://docs.rs/speedate/latest/speedate/).', valid_examples=[b'10:10:10.0002'], invalid_examples=[b'1:1:1'], core_schemas=[core_schema.TimeSchema], ), Row( time, int, python_input=True, json_input=True, condition='Interpreted as seconds, range `0 - 86399`.', valid_examples=[3720], invalid_examples=[-1, 86400], core_schemas=[core_schema.TimeSchema], ), Row( time, float, python_input=True, json_input=True, condition='Interpreted as seconds, range `0 - 86399.9*`.', valid_examples=[3720.0002], invalid_examples=[-1.0, 86400.0], core_schemas=[core_schema.TimeSchema], ), Row( time, Decimal, python_input=True, condition='Interpreted as seconds, range `0 - 86399.9*`.', valid_examples=[Decimal(3720.0002)], invalid_examples=[Decimal(-1), Decimal(86400)], core_schemas=[core_schema.TimeSchema], ), Row( timedelta, timedelta, strict=True, python_input=True, core_schemas=[core_schema.TimedeltaSchema], ), Row( timedelta, str, python_input=True, json_input=True, condition='Format: `ISO8601`. See [speedate](https://docs.rs/speedate/latest/speedate/).', valid_examples=['1 days 10:10', '1 d 10:10'], invalid_examples=['1 10:10'], core_schemas=[core_schema.TimedeltaSchema], ), Row( timedelta, bytes, python_input=True, condition='Format: `ISO8601`. 
See [speedate](https://docs.rs/speedate/latest/speedate/), (UTF-8).', valid_examples=[b'1 days 10:10', b'1 d 10:10'], invalid_examples=[b'1 10:10'], core_schemas=[core_schema.TimedeltaSchema], ), Row( timedelta, int, python_input=True, json_input=True, condition='Interpreted as seconds.', valid_examples=[123_000], core_schemas=[core_schema.TimedeltaSchema], ), Row( timedelta, float, python_input=True, json_input=True, condition='Interpreted as seconds.', valid_examples=[123_000.0002], core_schemas=[core_schema.TimedeltaSchema], ), Row( timedelta, Decimal, python_input=True, condition='Interpreted as seconds.', valid_examples=[Decimal(123_000.0002)], core_schemas=[core_schema.TimedeltaSchema], ), Row( dict, dict, strict=True, python_input=True, core_schemas=[core_schema.DictSchema], ), Row( dict, 'Object', strict=True, json_input=True, valid_examples=['{"v": {"1": 1, "2": 2}}'], core_schemas=[core_schema.DictSchema], ), Row( dict, Mapping, python_input=True, condition='Must implement the mapping interface and have an `items()` method.', valid_examples=[], core_schemas=[core_schema.DictSchema], ), Row( TypedDict, dict, strict=True, python_input=True, core_schemas=[core_schema.TypedDictSchema], ), Row( TypedDict, 'Object', strict=True, json_input=True, core_schemas=[core_schema.TypedDictSchema], ), Row( TypedDict, Any, strict=True, python_input=True, core_schemas=[core_schema.TypedDictSchema], ), Row( TypedDict, Mapping, python_input=True, condition='Must implement the mapping interface and have an `items()` method.', valid_examples=[], core_schemas=[core_schema.TypedDictSchema], ), Row( list, list, strict=True, python_input=True, core_schemas=[core_schema.ListSchema], ), Row( list, 'Array', strict=True, json_input=True, core_schemas=[core_schema.ListSchema], ), Row( list, tuple, python_input=True, core_schemas=[core_schema.ListSchema], ), Row( list, set, python_input=True, core_schemas=[core_schema.ListSchema], ), Row( list, frozenset, python_input=True, 
core_schemas=[core_schema.ListSchema], ), Row( list, deque, python_input=True, core_schemas=[core_schema.ListSchema], ), Row( list, 'dict_keys', python_input=True, core_schemas=[core_schema.ListSchema], ), Row( list, 'dict_values', python_input=True, core_schemas=[core_schema.ListSchema], ), Row( tuple, tuple, strict=True, python_input=True, core_schemas=[core_schema.TupleSchema], ), Row( tuple, 'Array', strict=True, json_input=True, core_schemas=[core_schema.TupleSchema], ), Row( tuple, list, python_input=True, core_schemas=[core_schema.TupleSchema], ), Row( tuple, set, python_input=True, core_schemas=[core_schema.TupleSchema], ), Row( tuple, frozenset, python_input=True, core_schemas=[core_schema.TupleSchema], ), Row( tuple, deque, python_input=True, core_schemas=[core_schema.TupleSchema], ), Row( tuple, 'dict_keys', python_input=True, core_schemas=[core_schema.TupleSchema], ), Row( tuple, 'dict_values', python_input=True, core_schemas=[core_schema.TupleSchema], ), Row( set, set, strict=True, python_input=True, core_schemas=[core_schema.SetSchema], ), Row( set, 'Array', strict=True, json_input=True, core_schemas=[core_schema.SetSchema], ), Row( set, list, python_input=True, core_schemas=[core_schema.SetSchema], ), Row( set, tuple, python_input=True, core_schemas=[core_schema.SetSchema], ), Row( set, frozenset, python_input=True, core_schemas=[core_schema.SetSchema], ), Row( set, deque, python_input=True, core_schemas=[core_schema.SetSchema], ), Row( set, 'dict_keys', python_input=True, core_schemas=[core_schema.SetSchema], ), Row( set, 'dict_values', python_input=True, core_schemas=[core_schema.SetSchema], ), Row( frozenset, frozenset, strict=True, python_input=True, core_schemas=[core_schema.FrozenSetSchema], ), Row( frozenset, 'Array', strict=True, json_input=True, core_schemas=[core_schema.FrozenSetSchema], ), Row( frozenset, list, python_input=True, core_schemas=[core_schema.FrozenSetSchema], ), Row( frozenset, tuple, python_input=True, 
core_schemas=[core_schema.FrozenSetSchema], ), Row( frozenset, set, python_input=True, core_schemas=[core_schema.FrozenSetSchema], ), Row( frozenset, deque, python_input=True, core_schemas=[core_schema.FrozenSetSchema], ), Row( frozenset, 'dict_keys', python_input=True, core_schemas=[core_schema.FrozenSetSchema], ), Row( frozenset, 'dict_values', python_input=True, core_schemas=[core_schema.FrozenSetSchema], ), Row( InstanceOf, Any, strict=True, python_input=True, condition='`isinstance()` check must return `True`.', core_schemas=[core_schema.IsInstanceSchema], ), Row( InstanceOf, '-', json_input=True, condition='Never valid.', core_schemas=[core_schema.IsInstanceSchema], ), Row( callable, Any, strict=True, python_input=True, condition='`callable()` check must return `True`.', core_schemas=[core_schema.CallableSchema], ), Row( callable, '-', json_input=True, condition='Never valid.', core_schemas=[core_schema.CallableSchema], ), Row( deque, deque, strict=True, python_input=True, core_schemas=[core_schema.WrapValidatorFunctionSchema], ), Row( deque, 'Array', strict=True, json_input=True, core_schemas=[core_schema.WrapValidatorFunctionSchema], ), Row( deque, list, python_input=True, core_schemas=[core_schema.ChainSchema], ), Row( deque, tuple, python_input=True, core_schemas=[core_schema.ChainSchema], ), Row( deque, set, python_input=True, core_schemas=[core_schema.ChainSchema], ), Row( deque, frozenset, python_input=True, core_schemas=[core_schema.ChainSchema], ), Row( Any, Any, strict=True, python_input=True, json_input=True, core_schemas=[core_schema.AnySchema], ), Row( typing.NamedTuple, typing.NamedTuple, strict=True, python_input=True, core_schemas=[core_schema.CallSchema], ), Row( typing.NamedTuple, 'Array', strict=True, json_input=True, core_schemas=[core_schema.CallSchema], ), Row( typing.NamedTuple, collections.namedtuple, strict=True, python_input=True, core_schemas=[core_schema.CallSchema], ), Row( typing.NamedTuple, tuple, strict=True, python_input=True, 
core_schemas=[core_schema.CallSchema], ), Row( typing.NamedTuple, list, strict=True, python_input=True, core_schemas=[core_schema.CallSchema], ), Row( typing.NamedTuple, dict, strict=True, python_input=True, core_schemas=[core_schema.CallSchema], ), Row( collections.namedtuple, collections.namedtuple, strict=True, python_input=True, core_schemas=[core_schema.CallSchema], ), Row( collections.namedtuple, 'Array', strict=True, json_input=True, core_schemas=[core_schema.CallSchema], ), Row( collections.namedtuple, typing.NamedTuple, strict=True, python_input=True, core_schemas=[core_schema.CallSchema], ), Row( collections.namedtuple, tuple, strict=True, python_input=True, core_schemas=[core_schema.CallSchema], ), Row( collections.namedtuple, list, strict=True, python_input=True, core_schemas=[core_schema.CallSchema], ), Row( collections.namedtuple, dict, strict=True, python_input=True, core_schemas=[core_schema.CallSchema], ), Row( Sequence, list, strict=True, python_input=True, core_schemas=[core_schema.ChainSchema], ), Row( Sequence, 'Array', strict=True, json_input=True, core_schemas=[core_schema.ChainSchema], ), Row( Sequence, tuple, python_input=True, core_schemas=[core_schema.ChainSchema], ), Row( Sequence, deque, python_input=True, core_schemas=[core_schema.ChainSchema], ), Row( Iterable, list, strict=True, python_input=True, core_schemas=[core_schema.GeneratorSchema], ), Row( Iterable, 'Array', strict=True, json_input=True, core_schemas=[core_schema.GeneratorSchema], ), Row( Iterable, tuple, strict=True, python_input=True, core_schemas=[core_schema.GeneratorSchema], ), Row( Iterable, set, strict=True, python_input=True, core_schemas=[core_schema.GeneratorSchema], ), Row( Iterable, frozenset, strict=True, python_input=True, core_schemas=[core_schema.GeneratorSchema], ), Row( Iterable, deque, strict=True, python_input=True, core_schemas=[core_schema.GeneratorSchema], ), Row( Type, Type, strict=True, python_input=True, core_schemas=[core_schema.IsSubclassSchema], 
), Row( Pattern, str, strict=True, python_input=True, json_input=True, condition='Input must be a valid pattern.', core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( Pattern, bytes, strict=True, python_input=True, condition='Input must be a valid pattern.', core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv4Address, IPv4Address, strict=True, python_input=True, core_schemas=[core_schema.IsInstanceSchema], ), Row( IPv4Address, IPv4Interface, strict=True, python_input=True, core_schemas=[core_schema.IsInstanceSchema], ), Row( IPv4Address, str, strict=True, json_input=True, core_schemas=[core_schema.AfterValidatorFunctionSchema], ), Row( IPv4Address, str, python_input=True, json_input=True, core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv4Address, bytes, python_input=True, valid_examples=[b'\x00\x00\x00\x00'], core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv4Address, int, python_input=True, condition='integer representing the IP address, must be less than `2**32`', valid_examples=[168_430_090], core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv4Interface, IPv4Interface, strict=True, python_input=True, core_schemas=[core_schema.IsInstanceSchema], ), Row( IPv4Interface, str, strict=True, json_input=True, core_schemas=[core_schema.AfterValidatorFunctionSchema], ), Row( IPv4Interface, IPv4Address, python_input=True, core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv4Interface, str, python_input=True, json_input=True, core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv4Interface, bytes, python_input=True, valid_examples=[b'\xff\xff\xff\xff'], core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv4Interface, tuple, python_input=True, valid_examples=[('192.168.0.1', '24')], core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv4Interface, int, python_input=True, condition='integer representing the IP address, must be less 
than `2**32`', valid_examples=[168_430_090], core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv4Network, IPv4Network, strict=True, python_input=True, core_schemas=[core_schema.IsInstanceSchema], ), Row( IPv4Network, str, strict=True, json_input=True, core_schemas=[core_schema.AfterValidatorFunctionSchema], ), Row( IPv4Network, IPv4Address, python_input=True, core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv4Network, IPv4Interface, python_input=True, core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv4Network, str, python_input=True, json_input=True, core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv4Network, bytes, python_input=True, valid_examples=[b'\xff\xff\xff\xff'], core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv4Network, int, python_input=True, condition='integer representing the IP network, must be less than `2**32`', valid_examples=[168_430_090], core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv6Address, IPv6Address, strict=True, python_input=True, core_schemas=[core_schema.IsInstanceSchema], ), Row( IPv6Address, IPv6Interface, strict=True, python_input=True, core_schemas=[core_schema.IsInstanceSchema], ), Row( IPv6Address, str, strict=True, json_input=True, core_schemas=[core_schema.AfterValidatorFunctionSchema], ), Row( IPv6Address, str, python_input=True, json_input=True, core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv6Address, bytes, python_input=True, valid_examples=[b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x01'], core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv6Address, int, python_input=True, condition='integer representing the IP address, must be less than `2**128`', valid_examples=[340_282_366_920_938_463_463_374_607_431_768_211_455], core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv6Interface, IPv6Interface, strict=True, python_input=True, 
core_schemas=[core_schema.IsInstanceSchema], ), Row( IPv6Interface, str, strict=True, json_input=True, core_schemas=[core_schema.AfterValidatorFunctionSchema], ), Row( IPv6Interface, IPv6Address, python_input=True, core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv6Interface, str, python_input=True, json_input=True, core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv6Interface, bytes, python_input=True, valid_examples=[b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x01'], core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv6Interface, tuple, python_input=True, valid_examples=[('2001:db00::1', '120')], core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv6Interface, int, python_input=True, condition='integer representing the IP address, must be less than `2**128`', valid_examples=[340_282_366_920_938_463_463_374_607_431_768_211_455], core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv6Network, IPv6Network, strict=True, python_input=True, core_schemas=[core_schema.IsInstanceSchema], ), Row( IPv6Network, str, strict=True, json_input=True, core_schemas=[core_schema.AfterValidatorFunctionSchema], ), Row( IPv6Network, IPv6Address, python_input=True, core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv6Network, IPv6Interface, python_input=True, core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv6Network, str, python_input=True, json_input=True, core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv6Network, bytes, python_input=True, valid_examples=[b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x01'], core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IPv6Network, int, python_input=True, condition='integer representing the IP address, must be less than `2**128`', valid_examples=[340_282_366_920_938_463_463_374_607_431_768_211_455], core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( 
Enum, Enum, strict=True, python_input=True, core_schemas=[core_schema.IsInstanceSchema], ), Row( Enum, Any, strict=True, json_input=True, condition='Input value must be convertible to enum values.', core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( Enum, Any, python_input=True, condition='Input value must be convertible to enum values.', core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IntEnum, IntEnum, strict=True, python_input=True, core_schemas=[core_schema.IsInstanceSchema], ), Row( IntEnum, Any, strict=True, json_input=True, condition='Input value must be convertible to enum values.', core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( IntEnum, Any, python_input=True, condition='Input value must be convertible to enum values.', core_schemas=[core_schema.PlainValidatorFunctionSchema], ), Row( Decimal, Decimal, strict=True, python_input=True, core_schemas=[core_schema.CustomErrorSchema], ), Row( Decimal, int, strict=True, json_input=True, core_schemas=[core_schema.CustomErrorSchema], ), Row( Decimal, str, strict=True, json_input=True, core_schemas=[core_schema.CustomErrorSchema], ), Row( Decimal, float, strict=True, json_input=True, core_schemas=[core_schema.CustomErrorSchema], ), Row( Decimal, int, python_input=True, json_input=True, core_schemas=[core_schema.AfterValidatorFunctionSchema], ), Row( Decimal, str, python_input=True, json_input=True, condition='Must match `[0-9]+(\\.[0-9]+)?`.', valid_examples=['3.141'], invalid_examples=['test', '3.141x'], core_schemas=[core_schema.AfterValidatorFunctionSchema], ), Row( Decimal, float, python_input=True, json_input=True, core_schemas=[core_schema.AfterValidatorFunctionSchema], ), Row( Path, Path, strict=True, python_input=True, core_schemas=[core_schema.IsInstanceSchema], ), Row( Path, str, strict=True, json_input=True, core_schemas=[core_schema.AfterValidatorFunctionSchema], ), Row( Path, str, python_input=True, core_schemas=[core_schema.AfterValidatorFunctionSchema], 
    ),
    Row(
        UUID,
        UUID,
        strict=True,
        python_input=True,
        core_schemas=[core_schema.IsInstanceSchema],
    ),
    Row(
        UUID,
        str,
        strict=True,
        json_input=True,
        core_schemas=[core_schema.AfterValidatorFunctionSchema],
    ),
    Row(
        UUID,
        str,
        python_input=True,
        valid_examples=['{12345678-1234-5678-1234-567812345678}'],
        core_schemas=[core_schema.AfterValidatorFunctionSchema],
    ),
    Row(
        ByteSize,
        str,
        strict=True,
        python_input=True,
        json_input=True,
        valid_examples=['1.2', '1.5 KB', '6.2EiB'],
        core_schemas=[core_schema.PlainValidatorFunctionSchema],
    ),
    Row(
        ByteSize,
        int,
        strict=True,
        python_input=True,
        json_input=True,
        core_schemas=[core_schema.PlainValidatorFunctionSchema],
    ),
    Row(
        ByteSize,
        float,
        strict=True,
        python_input=True,
        json_input=True,
        core_schemas=[core_schema.PlainValidatorFunctionSchema],
    ),
    Row(
        ByteSize,
        Decimal,
        strict=True,
        python_input=True,
        core_schemas=[core_schema.PlainValidatorFunctionSchema],
    ),
]

conversion_table = ConversionTable(table_rows).sorted()
pydantic-2.10.6/docs/plugins/devtools_output.html000066400000000000000000000067251474456633400222170ustar00rootroot00000000000000
devtools_example.py:31 <module>
    user: User(
        id=123,
        name='John Doe',
        signup_ts=datetime.datetime(2019, 6, 1, 12, 22),
        friends=[
            1234,
            4567,
            7890,
        ],
        address=Address(
            street='Testing',
            country='uk',
            lat=51.5,
            lng=0.0,
        ),
    ) (User)

should be much easier read than:

user: id=123 name='John Doe' signup_ts=datetime.datetime(2019, 6, 1, 12, 22) friends=[1234, 4567, 7890] address=Address(street='Testing', country='uk', lat=51.5, lng=0.0)
pydantic-2.10.6/docs/plugins/griffe_doclinks.py000066400000000000000000000062111474456633400215460ustar00rootroot00000000000000
from __future__ import annotations

import ast
import re
from functools import partial
from pathlib import Path
from typing import Any

from griffe import Extension, Inspector, ObjectNode, Visitor, get_logger
from griffe import Object as GriffeObject
from pymdownx.slugs import slugify

DOCS_PATH = Path(__file__).parent.parent
slugifier = slugify(case='lower')
logger = get_logger('griffe_docklinks')


def find_heading(content: str, slug: str, file_path: Path) -> tuple[str, int]:
    for m in re.finditer('^#+ (.+)', content, flags=re.M):
        heading = m.group(1)
        h_slug = slugifier(heading, '-')
        if h_slug == slug:
            return heading, m.end()
    raise ValueError(f'heading with slug {slug!r} not found in {file_path}')


def insert_at_top(path: str, api_link: str) -> str:
    rel_file = path.rstrip('/') + '.md'
    file_path = DOCS_PATH / rel_file
    content = file_path.read_text()
    second_heading = re.search('^#+ ', content, flags=re.M)
    assert second_heading, 'unable to find second heading in file'
    first_section = content[: second_heading.start()]
    if f'[{api_link}]' not in first_section:
        logger.debug('inserting API link "%s" at the top of %s', api_link, file_path.relative_to(DOCS_PATH))
        file_path.write_text('??? api "API Documentation"\n' f'    [`{api_link}`][{api_link}]\n\n' f'{content}')

    heading = file_path.stem.replace('_', ' ').title()
    return f'!!! abstract "Usage Documentation"\n    [{heading}](../{rel_file})\n'


def replace_links(m: re.Match[str], *, api_link: str) -> str:
    path_group = m.group(1)
    if '#' not in path_group:
        # no heading id, put the content at the top of the page
        return insert_at_top(path_group, api_link)

    usage_path, slug = path_group.split('#', 1)
    rel_file = usage_path.rstrip('/') + '.md'
    file_path = DOCS_PATH / rel_file
    content = file_path.read_text()
    heading, heading_end = find_heading(content, slug, file_path)

    next_heading = re.search('^#+ ', content[heading_end:], flags=re.M)
    if next_heading:
        next_section = content[heading_end : heading_end + next_heading.start()]
    else:
        next_section = content[heading_end:]

    if f'[{api_link}]' not in next_section:
        logger.debug('inserting API link "%s" into %s', api_link, file_path.relative_to(DOCS_PATH))
        file_path.write_text(
            f'{content[:heading_end]}\n\n'
            '??? api "API Documentation"\n'
            f'    [`{api_link}`][{api_link}]'
            f'{content[heading_end:]}'
        )

    return f'!!! abstract "Usage Documentation"\n    [{heading}](../{rel_file}#{slug})\n'


def update_docstring(obj: GriffeObject) -> str:
    return re.sub(
        r'usage[\- ]docs: ?https://docs\.pydantic\.dev/.+?/(\S+)',
        partial(replace_links, api_link=obj.path),
        obj.docstring.value,
        flags=re.I,
    )


class UpdateDocstringsExtension(Extension):
    def on_instance(
        self, *, node: ast.AST | ObjectNode, obj: GriffeObject, agent: Visitor | Inspector, **kwargs: Any
    ) -> None:
        if not obj.is_alias and obj.docstring is not None:
            obj.docstring.value = update_docstring(obj)
pydantic-2.10.6/docs/plugins/main.py000066400000000000000000000367721474456633400173550ustar00rootroot00000000000000
from __future__ import annotations as _annotations

import json
import logging
import os
import re
import textwrap
from pathlib import Path
from textwrap import indent

import autoflake
import pyupgrade._main as pyupgrade_main  # type: ignore
import requests
import tomli
import yaml
from jinja2 import Template  # type: ignore
from mkdocs.config import Config
from mkdocs.structure.files import Files
from mkdocs.structure.pages import Page

logger = logging.getLogger('mkdocs.plugin')

THIS_DIR = Path(__file__).parent
DOCS_DIR = THIS_DIR.parent
PROJECT_ROOT = DOCS_DIR.parent

try:
    from .conversion_table import conversion_table
except ImportError:
    # Due to how MkDocs requires this file to be specified (as a path and not a
    # dot-separated module name), relative imports don't work:
    # MkDocs is adding the dir. of this file to `sys.path` and uses
    # `importlib.spec_from_file_location` and `module_from_spec`, which isn't ideal.
    from conversion_table import conversion_table


# Start definition of MkDocs hooks
def on_pre_build(config: Config) -> None:
    """
    Before the build starts.
""" import mkdocs_redirects.plugin add_changelog() add_mkdocs_run_deps() # work around for very unfortunate bug in mkdocs-redirects: # https://github.com/mkdocs/mkdocs-redirects/issues/65 mkdocs_redirects.plugin.HTML_TEMPLATE = """ Redirecting... Redirecting... """ def on_files(files: Files, config: Config) -> Files: """ After the files are loaded, but before they are read. """ return files def on_page_markdown(markdown: str, page: Page, config: Config, files: Files) -> str: """ Called on each file after it is read and before it is converted to HTML. """ markdown = upgrade_python(markdown) markdown = insert_json_output(markdown) if md := render_index(markdown, page): return md if md := render_why(markdown, page): return md if md := render_pydantic_settings(markdown, page): return md elif md := build_schema_mappings(markdown, page): return md elif md := build_conversion_table(markdown, page): return md elif md := devtools_example(markdown, page): return md elif md := populate_pydantic_people(markdown, page): return md else: return markdown # End definition of MkDocs hooks def add_changelog() -> None: history = (PROJECT_ROOT / 'HISTORY.md').read_text(encoding='utf-8') history = re.sub(r'(\s)@([\w\-]+)', r'\1[@\2](https://github.com/\2)', history, flags=re.I) history = re.sub(r'\[GitHub release]\(', r'[:simple-github: GitHub release](', history) history = re.sub('@@', '@', history) new_file = DOCS_DIR / 'changelog.md' # avoid writing file unless the content has changed to avoid infinite build loop if not new_file.is_file() or new_file.read_text(encoding='utf-8') != history: new_file.write_text(history, encoding='utf-8') def add_mkdocs_run_deps() -> None: # set the pydantic, pydantic-core, pydantic-extra-types versions to configure for running examples in the browser pyproject_toml = (PROJECT_ROOT / 'pyproject.toml').read_text() m = re.search(r'pydantic-core==(.+?)["\']', pyproject_toml) if not m: logger.info( "Could not find pydantic-core version in pyproject.toml, 
this is expected if you're using a git ref" ) return pydantic_core_version = m.group(1) version_py = (PROJECT_ROOT / 'pydantic' / 'version.py').read_text() pydantic_version = re.search(r'^VERSION ?= (["\'])(.+)\1', version_py, flags=re.M).group(2) uv_lock = (PROJECT_ROOT / 'uv.lock').read_text() pydantic_extra_types_version = re.search(r'name = "pydantic-extra-types"\nversion = "(.+?)"', uv_lock).group(1) mkdocs_run_deps = json.dumps( [ f'pydantic=={pydantic_version}', f'pydantic-core=={pydantic_core_version}', f'pydantic-extra-types=={pydantic_extra_types_version}', ] ) logger.info('Setting mkdocs_run_deps=%s', mkdocs_run_deps) html = f"""\ """ path = DOCS_DIR / 'theme/mkdocs_run_deps.html' path.write_text(html) MIN_MINOR_VERSION = 8 MAX_MINOR_VERSION = 12 def upgrade_python(markdown: str) -> str: """ Apply pyupgrade to all Python code blocks, unless explicitly skipped, create a tab for each version. """ def add_tabs(match: re.Match[str]) -> str: prefix = match.group(1) if 'upgrade="skip"' in prefix: return match.group(0) if m := re.search(r'requires="3.(\d+)"', prefix): min_minor_version = int(m.group(1)) else: min_minor_version = MIN_MINOR_VERSION py_code = match.group(2) numbers = match.group(3) # import devtools # devtools.debug(numbers) output = [] last_code = py_code for minor_version in range(min_minor_version, MAX_MINOR_VERSION + 1): if minor_version == min_minor_version: tab_code = py_code else: tab_code = _upgrade_code(py_code, minor_version) if tab_code == last_code: continue last_code = tab_code content = indent(f'{prefix}\n{tab_code}```{numbers}', ' ' * 4) output.append(f'=== "Python 3.{minor_version} and above"\n\n{content}') if len(output) == 1: return match.group(0) else: return '\n\n'.join(output) # Note: we should move away from this regex approach. It does not handle edge cases (indented code blocks inside # other blocks, etc) and can lead to bugs in the rendering of annotations. 
Edit with care and make sure the rendered # documentation does not break: return re.sub(r'(``` *py.*?)\n(.+?)^```(\s+(?:^\d+\. (?:[^\n][\n]?)+\n?)*)', add_tabs, markdown, flags=re.M | re.S) def _upgrade_code(code: str, min_version: int) -> str: upgraded = pyupgrade_main._fix_plugins( code, settings=pyupgrade_main.Settings( min_version=(3, min_version), keep_percent_format=True, keep_mock=False, keep_runtime_typing=True, ), ) return autoflake.fix_code(upgraded, remove_all_unused_imports=True) def insert_json_output(markdown: str) -> str: """ Find `output="json"` code fence tags and replace with a separate JSON section """ def replace_json(m: re.Match[str]) -> str: start, attrs, code = m.groups() def replace_last_print(m2: re.Match[str]) -> str: ind, json_text = m2.groups() json_text = indent(json.dumps(json.loads(json_text), indent=2), ind) # no trailing fence as that's not part of code return f'\n{ind}```\n\n{ind}JSON output:\n\n{ind}```json\n{json_text}\n' code = re.sub(r'\n( *)"""(.*?)\1"""\n$', replace_last_print, code, flags=re.S) return f'{start}{attrs}{code}{start}\n' return re.sub(r'(^ *```)([^\n]*?output="json"[^\n]*?\n)(.+?)\1', replace_json, markdown, flags=re.M | re.S) def get_orgs_data() -> list[dict[str, str]]: with (THIS_DIR / 'orgs.toml').open('rb') as f: orgs_data = tomli.load(f) return orgs_data['orgs'] tile_template = """ """ def render_index(markdown: str, page: Page) -> str | None: if page.file.src_uri != 'index.md': return None if version := os.getenv('PYDANTIC_VERSION'): url = f'https://github.com/pydantic/pydantic/releases/tag/{version}' version_str = f'Documentation for version: [{version}]({url})' elif (version_ref := os.getenv('GITHUB_REF')) and version_ref.startswith('refs/tags/'): version = re.sub('^refs/tags/', '', version_ref.lower()) url = f'https://github.com/pydantic/pydantic/releases/tag/{version}' version_str = f'Documentation for version: [{version}]({url})' elif sha := os.getenv('GITHUB_SHA'): url = 
f'https://github.com/pydantic/pydantic/commit/{sha}' sha = sha[:7] version_str = f'Documentation for development version: [{sha}]({url})' else: version_str = 'Documentation for development version' logger.info('Setting version prefix: %r', version_str) markdown = re.sub(r'{{ *version *}}', version_str, markdown) elements = [tile_template.format(**org) for org in get_orgs_data()] orgs_grid = f'
{"".join(elements)}
' return re.sub(r'{{ *organisations *}}', orgs_grid, markdown) def render_why(markdown: str, page: Page) -> str | None: if page.file.src_uri != 'why.md': return None with (THIS_DIR / 'using.toml').open('rb') as f: using = tomli.load(f)['libs'] libraries = '\n'.join('* [`{repo}`](https://github.com/{repo}) {stars:,} stars'.format(**lib) for lib in using) markdown = re.sub(r'{{ *libraries *}}', libraries, markdown) default_description = '_(Based on the criteria described above)_' elements = [ f'### {org["name"]} {{#org-{org["key"]}}}\n\n{org.get("description") or default_description}' for org in get_orgs_data() ] return re.sub(r'{{ *organisations *}}', '\n\n'.join(elements), markdown) def render_pydantic_settings(markdown: str, page: Page) -> str | None: if page.file.src_uri != 'concepts/pydantic_settings.md': return None req = requests.get('https://raw.githubusercontent.com/pydantic/pydantic-settings/main/docs/index.md') if req.status_code != 200: logger.warning( 'Got HTTP status %d when trying to fetch content of the `pydantic-settings` docs', req.status_code ) return docs_content = req.text.strip() return re.sub(r'{{ *pydantic_settings *}}', docs_content, markdown) def _generate_table_row(col_values: list[str]) -> str: return f'| {" | ".join(col_values)} |\n' def _generate_table_heading(col_names: list[str]) -> str: return _generate_table_row(col_names) + _generate_table_row(['-'] * len(col_names)) def build_schema_mappings(markdown: str, page: Page) -> str | None: if page.file.src_uri != 'usage/schema.md': return None col_names = [ 'Python type', 'JSON Schema Type', 'Additional JSON Schema', 'Defined in', 'Notes', ] table_text = _generate_table_heading(col_names) with (THIS_DIR / 'schema_mappings.toml').open('rb') as f: table = tomli.load(f) for t in table.values(): py_type = t['py_type'] json_type = t['json_type'] additional = t['additional'] defined_in = t['defined_in'] notes = t['notes'] if additional and not isinstance(additional, str): additional = 
json.dumps(additional) cols = [f'`{py_type}`', f'`{json_type}`', f'`{additional}`' if additional else '', defined_in, notes] table_text += _generate_table_row(cols) return re.sub(r'{{ *schema_mappings_table *}}', table_text, markdown) def build_conversion_table(markdown: str, page: Page) -> str | None: if page.file.src_uri != 'concepts/conversion_table.md': return None filtered_table_predicates = { 'all': lambda r: True, 'json': lambda r: r.json_input, 'json_strict': lambda r: r.json_input and r.strict, 'python': lambda r: r.python_input, 'python_strict': lambda r: r.python_input and r.strict, } for table_id, predicate in filtered_table_predicates.items(): table_markdown = conversion_table.filtered(predicate).as_markdown() table_markdown = textwrap.indent(table_markdown, ' ') markdown = re.sub(rf'{{{{ *conversion_table_{table_id} *}}}}', table_markdown, markdown) return markdown def devtools_example(markdown: str, page: Page) -> str | None: if page.file.src_uri != 'integrations/devtools.md': return None html = (THIS_DIR / 'devtools_output.html').read_text().strip('\n') full_html = f'
\n
{html}
\n
' return re.sub(r'{{ *devtools_example *}}', full_html, markdown) experts_template = Template( """
{% for user in people.experts %}
@{{ user.login }}
Questions replied: {{ user.count }}
{% endfor %}
"""
)

most_active_users_template = Template(
    """
{% for user in people.last_month_active %}
@{{ user.login }}
Questions replied: {{ user.count }}
{% endfor %}
"""
)

top_contributors_template = Template(
    """
{% for user in people.top_contributors %}
@{{ user.login }}
Contributions: {{ user.count }}
{% endfor %}
"""
)

top_reviewers_template = Template(
    """
{% for user in people.top_reviewers %}
@{{ user.login }}
Reviews: {{ user.count }}
{% endfor %}
"""
)

maintainers_template = Template(
    """
{% for user in people.maintainers %}
{% endfor %}
"""
)


def populate_pydantic_people(markdown: str, page: Page) -> str | None:
    if page.file.src_uri != 'pydantic_people.md':
        return None

    # read people.yml file data
    with (THIS_DIR / 'people.yml').open('rb') as f:
        people = yaml.load(f, Loader=yaml.FullLoader)

    # Render the templates
    for name, template in [
        ('experts', experts_template),
        ('most_active_users', most_active_users_template),
        ('top_contributors', top_contributors_template),
        ('top_reviewers', top_reviewers_template),
        ('maintainers', maintainers_template),
    ]:
        rendered = template.render(people=people)
        markdown = re.sub(f'{{{{ {name} }}}}', rendered, markdown)

    return markdown

pydantic-2.10.6/docs/plugins/orgs.toml

[[orgs]]
key = "adobe"
name = "Adobe"
description = """
[`adobe/dy-sql`](https://github.com/adobe/dy-sql) uses Pydantic.
"""

[[orgs]]
key = "amazon"
name = "Amazon and AWS"
description = """
* [powertools-lambda-python](https://github.com/aws-powertools/powertools-lambda-python)
* [awslabs/gluonts](https://github.com/awslabs/gluonts)
* AWS [sponsored Samuel Colvin $5,000](https://twitter.com/samuel_colvin/status/1549383169006239745) to work on Pydantic in 2022
"""

[[orgs]]
key = "anthropic"
name = "Anthropic"
description = """
[`anthropics/anthropic-sdk-python`](https://github.com/anthropics/anthropic-sdk-python) uses Pydantic.
"""

[[orgs]]
key = "apple"
name = "Apple"

[[orgs]]
key = "asml"
name = "ASML"

[[orgs]]
key = "astrazeneca"
name = "AstraZeneca"
description = """
[Multiple repos](https://github.com/search?q=org%3AAstraZeneca+pydantic&type=code) in the `AstraZeneca` GitHub org depend on Pydantic.
"""

[[orgs]]
key = "cisco"
name = "Cisco Systems"
description = """
* Pydantic is listed in their report of [Open Source Used In RADKit](https://www.cisco.com/c/dam/en_us/about/doing_business/open_source/docs/RADKit-149-1687424532.pdf).
* [`cisco/webex-assistant-sdk`](https://github.com/cisco/webex-assistant-sdk)
"""

[[orgs]]
key = "comcast"
name = "Comcast"

[[orgs]]
key = "datadog"
name = "Datadog"
description = """
* Extensive use of Pydantic in [`DataDog/integrations-core`](https://github.com/DataDog/integrations-core) and other repos
* Communication with engineers from Datadog about how they use Pydantic.
"""

[[orgs]]
key = "facebook"
name = "Facebook"
description = """
[Multiple repos](https://github.com/search?q=org%3Afacebookresearch+pydantic&type=code) in the `facebookresearch` GitHub org depend on Pydantic.
"""

[[orgs]]
key = "github"
name = "GitHub"
description = """
GitHub sponsored Pydantic $750 in 2022
"""

[[orgs]]
key = "google"
name = "Google"
description = """
Extensive use of Pydantic in [`google/turbinia`](https://github.com/google/turbinia) and other repos.
"""

[[orgs]]
key = "hsbc"
name = "HSBC"

[[orgs]]
key = "ibm"
name = "IBM"
description = """
[Multiple repos](https://github.com/search?q=org%3AIBM+pydantic&type=code) in the `IBM` GitHub org depend on Pydantic.
"""

[[orgs]]
key = "intel"
name = "Intel"

[[orgs]]
key = "intuit"
name = "Intuit"

[[orgs]]
key = "ipcc"
name = "Intergovernmental Panel on Climate Change"
description = """
[Tweet](https://twitter.com/daniel_huppmann/status/1563461797973110785) explaining how the IPCC use Pydantic.
"""

[[orgs]]
key = "jpmorgan"
name = "JPMorgan"

[[orgs]]
key = "jupyter"
name = "Jupyter"
description = """
* The developers of the Jupyter notebook are using Pydantic [for subprojects](https://github.com/pydantic/pydantic/issues/773)
* Through the FastAPI-based Jupyter server [Jupyverse](https://github.com/jupyter-server/jupyverse)
* [FPS](https://github.com/jupyter-server/fps)'s configuration management.
"""

[[orgs]]
key = "microsoft"
name = "Microsoft"
description = """
* [DeepSpeed](https://github.com/microsoft/DeepSpeed) deep learning optimisation library uses Pydantic extensively
* [Multiple repos](https://github.com/search?q=org%3Amicrosoft%20pydantic&type=code) in the `microsoft` GitHub org depend on Pydantic, in particular their
* Pydantic is also [used](https://github.com/search?q=org%3AAzure%20pydantic&type=code) in the `Azure` GitHub org
* [Comments](https://github.com/tiangolo/fastapi/pull/26) on GitHub show Microsoft engineers using Pydantic as part of Windows and Office
"""

[[orgs]]
key = "molssi"
name = "Molecular Science Software Institute"
description = """
[Multiple repos](https://github.com/search?q=org%3AMolSSI%20pydantic&type=code) in the `MolSSI` GitHub org depend on Pydantic.
"""

[[orgs]]
key = "nasa"
name = "NASA"
description = """
[Multiple repos](https://github.com/search?q=org%3Anasa%20pydantic&type=code) in the `NASA` GitHub org depend on Pydantic.

NASA are also using Pydantic via FastAPI in their JWST project to process images from the James Webb Space Telescope, see [this tweet](https://twitter.com/benjamin_falk/status/1546947039363305472).
"""

[[orgs]]
key = "netflix"
name = "Netflix"
description = """
[Multiple repos](https://github.com/search?q=org%3Anetflix%20pydantic&type=code) in the `Netflix` GitHub org depend on Pydantic.
"""

[[orgs]]
key = "nsa"
name = "NSA"
description = """
The [`nsacyber/WALKOFF`](https://github.com/nsacyber/WALKOFF) repo depends on Pydantic.
"""

[[orgs]]
key = "nvidia"
name = "NVIDIA"
description = """
[Multiple repositories](https://github.com/search?q=org%3ANVIDIA%20pydantic&type=code) in the `NVIDIA` GitHub org depend on Pydantic.

Their "Omniverse Services" depends on Pydantic according to [their documentation](https://web.archive.org/web/20220628161919/https://docs.omniverse.nvidia.com/prod_services/prod_services/core/index.html).
"""

[[orgs]]
key = "openai"
name = "OpenAI"
description = """
OpenAI use Pydantic for their ChatCompletions API, as per [this](https://github.com/pydantic/pydantic/discussions/6372) discussion on GitHub.

Anecdotally, OpenAI use Pydantic extensively for their internal services.
"""

[[orgs]]
key = "oracle"
name = "Oracle"

[[orgs]]
key = "palantir"
name = "Palantir"

[[orgs]]
key = "qualcomm"
name = "Qualcomm"

[[orgs]]
key = "redhat"
name = "Red Hat"

[[orgs]]
key = "revolut"
name = "Revolut"
description = """
Anecdotally, all internal services at Revolut are built with FastAPI and therefore Pydantic.
"""

[[orgs]]
key = "robusta"
name = "Robusta"
description = """
The [`robusta-dev/robusta`](https://github.com/robusta-dev/robusta) repo depends on Pydantic.
"""

[[orgs]]
key = "salesforce"
name = "Salesforce"
description = """
Salesforce [sponsored Samuel Colvin $10,000](https://twitter.com/samuel_colvin/status/1501288247670063104) to work on Pydantic in 2022.
"""

[[orgs]]
key = "starbucks"
name = "Starbucks"

[[orgs]]
key = "ti"
name = "Texas Instruments"

[[orgs]]
key = "twilio"
name = "Twilio"

[[orgs]]
key = "twitter"
name = "Twitter"
description = """
Twitter's [`the-algorithm`](https://github.com/twitter/the-algorithm) repo where they [open sourced](https://blog.twitter.com/engineering/en_us/topics/open-source/2023/twitter-recommendation-algorithm) their recommendation engine uses Pydantic.
"""

[[orgs]]
key = "ukhomeoffice"
name = "UK Home Office"

pydantic-2.10.6/docs/plugins/people.yml

maintainers:
- login: Kludex
  answers: 22
  prs: 112
  avatarUrl: https://avatars.githubusercontent.com/u/7353520?u=0934928aa44d3b75af26ad47cb0227c8af30c8ec&v=4
  url: https://github.com/Kludex
- login: sydney-runkle
  answers: 36
  prs: 292
  avatarUrl: https://avatars.githubusercontent.com/u/54324534?u=3a4ffd00a8270b607922250d3a2d9c9af38b9cf9&v=4
  url: https://github.com/sydney-runkle
- login: adriangb
  answers: 41
  prs: 199
  avatarUrl: https://avatars.githubusercontent.com/u/1755071?u=612704256e38d6ac9cbed24f10e4b6ac2da74ecb&v=4
  url: https://github.com/adriangb
- login: samuelcolvin
  answers: 295
  prs: 399
  avatarUrl: https://avatars.githubusercontent.com/u/4039449?u=42eb3b833047c8c4b4f647a031eaef148c16d93f&v=4
  url: https://github.com/samuelcolvin
- login: alexmojaki
  answers: 0
  prs: 18
  avatarUrl: https://avatars.githubusercontent.com/u/3627481?u=9bb2e0cf1c5ef3d0609d2e639a135b7b4ca8b463&v=4
  url: https://github.com/alexmojaki
- login: hramezani
  answers: 22
  prs: 199
  avatarUrl: https://avatars.githubusercontent.com/u/3122442?u=f387fc2dbc0c681f23e80e2ad705790fafcec9a2&v=4
  url: https://github.com/hramezani
- login: davidhewitt
  answers: 2
  prs: 40
  avatarUrl: https://avatars.githubusercontent.com/u/1939362?u=b4b48981c3a097daaad16c4c5417aa7a3e5e32d9&v=4
  url: https://github.com/davidhewitt
- login: dmontagu
  answers: 55
  prs: 315
  avatarUrl: https://avatars.githubusercontent.com/u/35119617?u=540f30c937a6450812628b9592a1dfe91bbe148e&v=4
  url: https://github.com/dmontagu
experts:
- login: PrettyWood
  count: 143
  avatarUrl: https://avatars.githubusercontent.com/u/18406791?u=20a4953f7d7e9d49d054b81e1582b08e87b2125f&v=4
  url: https://github.com/PrettyWood
- login: Viicos
  count: 94
  avatarUrl: https://avatars.githubusercontent.com/u/65306057?u=fcd677dc1b9bef12aa103613e5ccb3f8ce305af9&v=4
  url: https://github.com/Viicos
- login: uriyyo
  count: 93
  avatarUrl: https://avatars.githubusercontent.com/u/32038156?u=bbcf79839cafe7b6249326c3b3b5383f2981595a&v=4
  url: https://github.com/uriyyo
- login: lesnik512
  count: 21
  avatarUrl: https://avatars.githubusercontent.com/u/2184855?u=9c720fa595336aa83eef20729d4230ba7424d9fa&v=4
  url: https://github.com/lesnik512
- login: harunyasar
  count: 17
  avatarUrl: https://avatars.githubusercontent.com/u/1765494?u=5b1ab7c582db4b4016fa31affe977d10af108ad4&v=4
  url: https://github.com/harunyasar
- login: nymous
  count: 13
  avatarUrl: https://avatars.githubusercontent.com/u/4216559?u=360a36fb602cded27273cbfc0afc296eece90662&v=4
  url: https://github.com/nymous
- login: ybressler
  count: null
  avatarUrl: https://avatars.githubusercontent.com/u/40807730?v=4
  url: https://github.com/ybressler
last_month_active:
- login: Viicos
  count: 7
  avatarUrl: https://avatars.githubusercontent.com/u/65306057?u=fcd677dc1b9bef12aa103613e5ccb3f8ce305af9&v=4
  url: https://github.com/Viicos
top_contributors:
- login: PrettyWood
  count: 122
  avatarUrl: https://avatars.githubusercontent.com/u/18406791?u=20a4953f7d7e9d49d054b81e1582b08e87b2125f&v=4
  url: https://github.com/PrettyWood
- login: Viicos
  count: 105
  avatarUrl: https://avatars.githubusercontent.com/u/65306057?u=fcd677dc1b9bef12aa103613e5ccb3f8ce305af9&v=4
  url: https://github.com/Viicos
- login: dependabot-preview
  count: 75
  avatarUrl: https://avatars.githubusercontent.com/in/2141?v=4
  url: https://github.com/apps/dependabot-preview
- login: tpdorsey
  count: 71
  avatarUrl: https://avatars.githubusercontent.com/u/370316?u=eb206070cfe47f242d5fcea2e6c7514f4d0f27f5&v=4
  url: https://github.com/tpdorsey
- login: lig
  count: 49
  avatarUrl: https://avatars.githubusercontent.com/u/38705?v=4
  url: https://github.com/lig
- login: pyup-bot
  count: 46
  avatarUrl: https://avatars.githubusercontent.com/u/16239342?u=8454ae029661131445080f023e1efccc29166485&v=4
  url: https://github.com/pyup-bot
- login: tiangolo
  count: 22
  avatarUrl:
    https://avatars.githubusercontent.com/u/1326112?u=740f11212a731f56798f558ceddb0bd07642afa7&v=4
  url: https://github.com/tiangolo
- login: Bobronium
  count: 19
  avatarUrl: https://avatars.githubusercontent.com/u/36469655?u=f67d8fa6d67d35d2f5ebd5b15e24efeb41036fd3&v=4
  url: https://github.com/Bobronium
- login: Gr1N
  count: 17
  avatarUrl: https://avatars.githubusercontent.com/u/1087619?u=cd78c4f602bf9f9667277dd0af9302a7fe9dd75a&v=4
  url: https://github.com/Gr1N
- login: uriyyo
  count: 15
  avatarUrl: https://avatars.githubusercontent.com/u/32038156?u=bbcf79839cafe7b6249326c3b3b5383f2981595a&v=4
  url: https://github.com/uriyyo
- login: pilosus
  count: 12
  avatarUrl: https://avatars.githubusercontent.com/u/6400248?u=2b30c6675f888c2e47640aed2f1c1a956baae224&v=4
  url: https://github.com/pilosus
- login: misrasaurabh1
  count: 12
  avatarUrl: https://avatars.githubusercontent.com/u/1271289?u=b83b0a82b2c95990d93cefbeb8f548d9f2f090c2&v=4
  url: https://github.com/misrasaurabh1
- login: yezz123
  count: 11
  avatarUrl: https://avatars.githubusercontent.com/u/52716203?u=d7062cbc6eb7671d5dc9cc0e32a24ae335e0f225&v=4
  url: https://github.com/yezz123
- login: StephenBrown2
  count: 10
  avatarUrl: https://avatars.githubusercontent.com/u/1148665?u=b69e6fe797302f025a2d125e377e27f8ea0b8058&v=4
  url: https://github.com/StephenBrown2
- login: koxudaxi
  count: 9
  avatarUrl: https://avatars.githubusercontent.com/u/630670?u=507d8577b4b3670546b449c4c2ccbc5af40d72f7&v=4
  url: https://github.com/koxudaxi
- login: cdce8p
  count: 9
  avatarUrl: https://avatars.githubusercontent.com/u/30130371?v=4
  url: https://github.com/cdce8p
- login: aminalaee
  count: 8
  avatarUrl: https://avatars.githubusercontent.com/u/19784933?u=2f45a312b73e7fb29f3b6f8676e5be6f7220da25&v=4
  url: https://github.com/aminalaee
- login: NeevCohen
  count: 8
  avatarUrl: https://avatars.githubusercontent.com/u/70970900?u=573a3175906348e0d1529104d56b391e93ca0250&v=4
  url: https://github.com/NeevCohen
- login: kc0506
  count: 8
  avatarUrl: https://avatars.githubusercontent.com/u/89458301?u=75f53e971fcba3ff61836c389505a420bddd865c&v=4
  url: https://github.com/kc0506
- login: layday
  count: 7
  avatarUrl: https://avatars.githubusercontent.com/u/31134424?u=e8afd95a97b5556c467d1be27788950e67378ef1&v=4
  url: https://github.com/layday
- login: daviskirk
  count: 7
  avatarUrl: https://avatars.githubusercontent.com/u/1049817?u=b42e1148d23ea9039b325975bbea3ff8c5b4e3ec&v=4
  url: https://github.com/daviskirk
- login: dgasmith
  count: 6
  avatarUrl: https://avatars.githubusercontent.com/u/1769841?u=44e83d7974f0ab5c431340f1669d98f781594980&v=4
  url: https://github.com/dgasmith
- login: Atheuz
  count: 6
  avatarUrl: https://avatars.githubusercontent.com/u/202696?v=4
  url: https://github.com/Atheuz
- login: tlambert03
  count: 6
  avatarUrl: https://avatars.githubusercontent.com/u/1609449?u=922abf0524b47739b37095e553c99488814b05db&v=4
  url: https://github.com/tlambert03
- login: nuno-andre
  count: 5
  avatarUrl: https://avatars.githubusercontent.com/u/6339494?u=893876f31ce65fa8ad8cfcc592392a77f0f8af38&v=4
  url: https://github.com/nuno-andre
- login: ofek
  count: 5
  avatarUrl: https://avatars.githubusercontent.com/u/9677399?u=386c330f212ce467ce7119d9615c75d0e9b9f1ce&v=4
  url: https://github.com/ofek
- login: hmvp
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/1734544?v=4
  url: https://github.com/hmvp
- login: retnikt
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/24581770?v=4
  url: https://github.com/retnikt
- login: therefromhere
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/197540?v=4
  url: https://github.com/therefromhere
- login: JeanArhancet
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/10811879?u=c0cfe7f7be82474d0deb2ba27601ec96f4f43515&v=4
  url: https://github.com/JeanArhancet
- login: commonism
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/164513?v=4
  url: https://github.com/commonism
- login: JensHeinrich
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/59469646?u=86d6a20768cc4cc65622eafd86672147321bd8f8&v=4
  url: https://github.com/JensHeinrich
- login: mgorny
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/110765?u=7386b9cb55c1973a510d2785832424bc80e7c265&v=4
  url: https://github.com/mgorny
- login: ornariece
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/25489980?u=1e9b5cbbbb1516fbea6da00429e4eef0ef79e4e6&v=4
  url: https://github.com/ornariece
- login: exs-dwoodward
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/166007669?u=cd5df427a775972595777471436c673e94e03a1f&v=4
  url: https://github.com/exs-dwoodward
- login: dAIsySHEng1
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/45747761?u=c1515d2ccf4877c0b64b5ea5a8c51631affe35de&v=4
  url: https://github.com/dAIsySHEng1
- login: AdolfoVillalobos
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/16639270?u=faa71bcfb3273a32cd81711a56998e115bca7fcc&v=4
  url: https://github.com/AdolfoVillalobos
top_reviewers:
- login: PrettyWood
  count: 211
  avatarUrl: https://avatars.githubusercontent.com/u/18406791?u=20a4953f7d7e9d49d054b81e1582b08e87b2125f&v=4
  url: https://github.com/PrettyWood
- login: Viicos
  count: 115
  avatarUrl: https://avatars.githubusercontent.com/u/65306057?u=fcd677dc1b9bef12aa103613e5ccb3f8ce305af9&v=4
  url: https://github.com/Viicos
- login: lig
  count: 103
  avatarUrl: https://avatars.githubusercontent.com/u/38705?v=4
  url: https://github.com/lig
- login: tpdorsey
  count: 77
  avatarUrl: https://avatars.githubusercontent.com/u/370316?u=eb206070cfe47f242d5fcea2e6c7514f4d0f27f5&v=4
  url: https://github.com/tpdorsey
- login: tiangolo
  count: 44
  avatarUrl: https://avatars.githubusercontent.com/u/1326112?u=740f11212a731f56798f558ceddb0bd07642afa7&v=4
  url: https://github.com/tiangolo
- login: Bobronium
  count: 27
  avatarUrl: https://avatars.githubusercontent.com/u/36469655?u=f67d8fa6d67d35d2f5ebd5b15e24efeb41036fd3&v=4
  url: https://github.com/Bobronium
- login: Gr1N
  count: 17
  avatarUrl: https://avatars.githubusercontent.com/u/1087619?u=cd78c4f602bf9f9667277dd0af9302a7fe9dd75a&v=4
  url: https://github.com/Gr1N
- login: StephenBrown2
  count: 17
  avatarUrl: https://avatars.githubusercontent.com/u/1148665?u=b69e6fe797302f025a2d125e377e27f8ea0b8058&v=4
  url: https://github.com/StephenBrown2
- login: ybressler
  count: 15
  avatarUrl: https://avatars.githubusercontent.com/u/40807730?u=b417e3cea56fd0f67983006108f6a1a83d4652a0&v=4
  url: https://github.com/ybressler
- login: hyperlint-ai
  count: 12
  avatarUrl: https://avatars.githubusercontent.com/in/718456?v=4
  url: https://github.com/apps/hyperlint-ai
- login: uriyyo
  count: 11
  avatarUrl: https://avatars.githubusercontent.com/u/32038156?u=bbcf79839cafe7b6249326c3b3b5383f2981595a&v=4
  url: https://github.com/uriyyo
- login: koxudaxi
  count: 10
  avatarUrl: https://avatars.githubusercontent.com/u/630670?u=507d8577b4b3670546b449c4c2ccbc5af40d72f7&v=4
  url: https://github.com/koxudaxi
- login: daviskirk
  count: 10
  avatarUrl: https://avatars.githubusercontent.com/u/1049817?u=b42e1148d23ea9039b325975bbea3ff8c5b4e3ec&v=4
  url: https://github.com/daviskirk
- login: yezz123
  count: 10
  avatarUrl: https://avatars.githubusercontent.com/u/52716203?u=d7062cbc6eb7671d5dc9cc0e32a24ae335e0f225&v=4
  url: https://github.com/yezz123
- login: Zac-HD
  count: 8
  avatarUrl: https://avatars.githubusercontent.com/u/12229877?u=abc44dbce4bb3eca2def638bd0d4ab4cfef91b74&v=4
  url: https://github.com/Zac-HD
- login: layday
  count: 7
  avatarUrl: https://avatars.githubusercontent.com/u/31134424?u=e8afd95a97b5556c467d1be27788950e67378ef1&v=4
  url: https://github.com/layday
- login: MarkusSintonen
  count: 7
  avatarUrl: https://avatars.githubusercontent.com/u/12939780?v=4
  url: https://github.com/MarkusSintonen
- login: pilosus
  count: 6
  avatarUrl: https://avatars.githubusercontent.com/u/6400248?u=2b30c6675f888c2e47640aed2f1c1a956baae224&v=4
  url: https://github.com/pilosus
- login: Kilo59
  count: 6
  avatarUrl: https://avatars.githubusercontent.com/u/13108583?u=0d34d39c0628091596c9d5ebb4e802009e8c4aca&v=4
  url: https://github.com/Kilo59
- login: JeanArhancet
  count: 6
  avatarUrl: https://avatars.githubusercontent.com/u/10811879?u=c0cfe7f7be82474d0deb2ba27601ec96f4f43515&v=4
  url: https://github.com/JeanArhancet
- login: tlambert03
  count: 5
  avatarUrl: https://avatars.githubusercontent.com/u/1609449?u=922abf0524b47739b37095e553c99488814b05db&v=4
  url: https://github.com/tlambert03
- login: christianbundy
  count: 5
  avatarUrl: https://avatars.githubusercontent.com/u/537700?u=7b64bd12eda862fbf72228495aada9c470df7a90&v=4
  url: https://github.com/christianbundy
- login: nix010
  count: 5
  avatarUrl: https://avatars.githubusercontent.com/u/16438204?u=f700f440b89e715795c3bc091800b8d3f39c58d9&v=4
  url: https://github.com/nix010
- login: graingert
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/413772?u=64b77b6aa405c68a9c6bcf45f84257c66eea5f32&v=4
  url: https://github.com/graingert
- login: hmvp
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/1734544?v=4
  url: https://github.com/hmvp
- login: wozniakty
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/5042313?u=8917c345dcb528733073ff1ce8a512e33f548512&v=4
  url: https://github.com/wozniakty
- login: nuno-andre
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/6339494?u=893876f31ce65fa8ad8cfcc592392a77f0f8af38&v=4
  url: https://github.com/nuno-andre
- login: antdking
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/2099618?u=a9899c1fea247d500e5368a1157a392bcd82e81d&v=4
  url: https://github.com/antdking
- login: dimaqq
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/662249?u=15313dec91bae789685e4abb3c2152251de41948&v=4
  url: https://github.com/dimaqq
- login: JensHeinrich
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/59469646?u=86d6a20768cc4cc65622eafd86672147321bd8f8&v=4
  url: https://github.com/JensHeinrich
- login: kc0506
  count: 4
  avatarUrl: https://avatars.githubusercontent.com/u/89458301?u=75f53e971fcba3ff61836c389505a420bddd865c&v=4
  url: https://github.com/kc0506

pydantic-2.10.6/docs/plugins/schema_mappings.toml

[None]
py_type = "None"
json_type = "null"
additional = ""
defined_in = "JSON Schema Core"
notes = "Same for `type(None)` or `Literal[None]`"

[bool]
py_type = "bool"
json_type = "boolean"
additional = ""
defined_in = "JSON Schema Core"
notes = ""

[str]
py_type = "str"
json_type = "string"
additional = ""
defined_in = "JSON Schema Core"
notes = ""

[float]
py_type = "float"
json_type = "number"
additional = ""
defined_in = "JSON Schema Core"
notes = ""

[int]
py_type = "int"
json_type = "integer"
additional = ""
defined_in = "JSON Schema Validation"
notes = ""

[dict]
py_type = "dict"
json_type = "object"
additional = ""
defined_in = "JSON Schema Core"
notes = ""

[list]
py_type = "list"
json_type = "array"
defined_in = "JSON Schema Core"
notes = ""

[list.additional.items]

[tuple-positional]
py_type = "tuple-positional"
json_type = "array"
defined_in = "JSON Schema Core"
notes = ""

[tuple-positional.additional.items]

[tuple-variable]
py_type = "tuple-variable"
json_type = "array"
defined_in = "JSON Schema Core"
notes = ""

[tuple-variable.additional.items]

[set]
py_type = "set"
json_type = "array"
defined_in = "JSON Schema Validation"
notes = ""

[set.additional]
uniqueItems = true

[set.additional.items]

[frozenset]
py_type = "frozenset"
json_type = "array"
defined_in = "JSON Schema Validation"
notes = ""

[frozenset.additional]
uniqueItems = true

[frozenset.additional.items]

["List[str]"]
py_type = "List[str]"
json_type = "array"
defined_in = "JSON Schema Validation"
notes = "And equivalently for any other sub type, e.g. `List[int]`."
["List[str]".additional.items] type = "string" ["Tuple[str, ...]"] py_type = "Tuple[str, ...]" json_type = "array" defined_in = "JSON Schema Validation" notes = "And equivalently for any other sub type, e.g. `Tuple[int, ...]`." ["Tuple[str, ...]".additional.items] type = "string" ["Tuple[str, int]"] py_type = "Tuple[str, int]" json_type = "array" defined_in = "JSON Schema Validation" notes = "And equivalently for any other set of subtypes. Note: If using schemas for OpenAPI, you shouldn't use this declaration, as it would not be valid in OpenAPI (although it is valid in JSON Schema)." ["Tuple[str, int]".additional] minItems = 2 maxItems = 2 [["Tuple[str, int]".additional.items]] type = "string" [["Tuple[str, int]".additional.items]] type = "integer" ["Dict[str, int]"] py_type = "Dict[str, int]" json_type = "object" defined_in = "JSON Schema Validation" notes = "And equivalently for any other subfields for dicts. Have in mind that although you can use other types as keys for dicts with Pydantic, only strings are valid keys for JSON, and so, only str is valid as JSON Schema key types." ["Dict[str, int]".additional.additionalProperties] type = "integer" ["Union[str, int]"] py_type = "Union[str, int]" json_type = "anyOf" defined_in = "JSON Schema Validation" notes = "And equivalently for any other subfields for unions." [["Union[str, int]".additional.anyOf]] type = "string" [["Union[str, int]".additional.anyOf]] type = "integer" [Enum] py_type = "Enum" json_type = "enum" additional = "{\"enum\": [...]}" defined_in = "JSON Schema Validation" notes = "All the literal values in the enum are included in the definition." 
[SecretStr] py_type = "SecretStr" json_type = "string" defined_in = "JSON Schema Validation" notes = "" [SecretStr.additional] writeOnly = true [SecretBytes] py_type = "SecretBytes" json_type = "string" defined_in = "JSON Schema Validation" notes = "" [SecretBytes.additional] writeOnly = true [EmailStr] py_type = "EmailStr" json_type = "string" defined_in = "JSON Schema Validation" notes = "" [EmailStr.additional] format = "email" [NameEmail] py_type = "NameEmail" json_type = "string" defined_in = "Pydantic standard \"format\" extension" notes = "" [NameEmail.additional] format = "name-email" [AnyUrl] py_type = "AnyUrl" json_type = "string" defined_in = "JSON Schema Validation" notes = "" [AnyUrl.additional] format = "uri" [Pattern] py_type = "Pattern" json_type = "string" defined_in = "JSON Schema Validation" notes = "" [Pattern.additional] format = "regex" [bytes] py_type = "bytes" json_type = "string" defined_in = "OpenAPI" notes = "" [bytes.additional] format = "binary" [Decimal] py_type = "Decimal" json_type = "number" additional = "" defined_in = "JSON Schema Core" notes = "" [UUID1] py_type = "UUID1" json_type = "string" defined_in = "Pydantic standard \"format\" extension" notes = "" [UUID1.additional] format = "uuid1" [UUID3] py_type = "UUID3" json_type = "string" defined_in = "Pydantic standard \"format\" extension" notes = "" [UUID3.additional] format = "uuid3" [UUID4] py_type = "UUID4" json_type = "string" defined_in = "Pydantic standard \"format\" extension" notes = "" [UUID4.additional] format = "uuid4" [UUID5] py_type = "UUID5" json_type = "string" defined_in = "Pydantic standard \"format\" extension" notes = "" [UUID5.additional] format = "uuid5" [UUID] py_type = "UUID" json_type = "string" defined_in = "Pydantic standard \"format\" extension" notes = "Suggested in OpenAPI." 
[UUID.additional] format = "uuid" [FilePath] py_type = "FilePath" json_type = "string" defined_in = "Pydantic standard \"format\" extension" notes = "" [FilePath.additional] format = "file-path" [DirectoryPath] py_type = "DirectoryPath" json_type = "string" defined_in = "Pydantic standard \"format\" extension" notes = "" [DirectoryPath.additional] format = "directory-path" [Path] py_type = "Path" json_type = "string" defined_in = "Pydantic standard \"format\" extension" notes = "" [Path.additional] format = "path" [datetime] py_type = "datetime" json_type = "string" defined_in = "JSON Schema Validation" notes = "" [datetime.additional] format = "date-time" [date] py_type = "date" json_type = "string" defined_in = "JSON Schema Validation" notes = "" [date.additional] format = "date" [time] py_type = "time" json_type = "string" defined_in = "JSON Schema Validation" notes = "" [time.additional] format = "time" [timedelta] py_type = "timedelta" json_type = "number" defined_in = "Difference in seconds (a `float`), with Pydantic standard \"format\" extension" notes = "Suggested in JSON Schema repository's issues by maintainer." 
[timedelta.additional] format = "time-delta" [Json] py_type = "Json" json_type = "string" defined_in = "Pydantic standard \"format\" extension" notes = "" [Json.additional] format = "json-string" [IPv4Address] py_type = "IPv4Address" json_type = "string" defined_in = "JSON Schema Validation" notes = "" [IPv4Address.additional] format = "ipv4" [IPv6Address] py_type = "IPv6Address" json_type = "string" defined_in = "JSON Schema Validation" notes = "" [IPv6Address.additional] format = "ipv6" [IPvAnyAddress] py_type = "IPvAnyAddress" json_type = "string" defined_in = "Pydantic standard \"format\" extension" notes = "IPv4 or IPv6 address as used in `ipaddress` module" [IPvAnyAddress.additional] format = "ipvanyaddress" [IPv4Interface] py_type = "IPv4Interface" json_type = "string" defined_in = "Pydantic standard \"format\" extension" notes = "IPv4 interface as used in `ipaddress` module" [IPv4Interface.additional] format = "ipv4interface" [IPv6Interface] py_type = "IPv6Interface" json_type = "string" defined_in = "Pydantic standard \"format\" extension" notes = "IPv6 interface as used in `ipaddress` module" [IPv6Interface.additional] format = "ipv6interface" [IPvAnyInterface] py_type = "IPvAnyInterface" json_type = "string" defined_in = "Pydantic standard \"format\" extension" notes = "IPv4 or IPv6 interface as used in `ipaddress` module" [IPvAnyInterface.additional] format = "ipvanyinterface" [IPv4Network] py_type = "IPv4Network" json_type = "string" defined_in = "Pydantic standard \"format\" extension" notes = "IPv4 network as used in `ipaddress` module" [IPv4Network.additional] format = "ipv4network" [IPv6Network] py_type = "IPv6Network" json_type = "string" defined_in = "Pydantic standard \"format\" extension" notes = "IPv6 network as used in `ipaddress` module" [IPv6Network.additional] format = "ipv6network" [IPvAnyNetwork] py_type = "IPvAnyNetwork" json_type = "string" defined_in = "Pydantic standard \"format\" extension" notes = "IPv4 or IPv6 network as used in 
`ipaddress` module" [IPvAnyNetwork.additional] format = "ipvanynetwork" [StrictBool] py_type = "StrictBool" json_type = "boolean" additional = "" defined_in = "JSON Schema Core" notes = "" [StrictStr] py_type = "StrictStr" json_type = "string" additional = "" defined_in = "JSON Schema Core" notes = "" [ConstrainedStr] py_type = "ConstrainedStr" json_type = "string" additional = "" defined_in = "JSON Schema Core" notes = "If the type has values declared for the constraints, they are included as validations. See the mapping for `constr` below." ["constr(pattern='^text$', min_length=2, max_length=10)"] py_type = "constr(pattern='^text$', min_length=2, max_length=10)" json_type = "string" defined_in = "JSON Schema Validation" notes = "Any argument not passed to the function (not defined) will not be included in the schema." ["constr(regex='^text$', min_length=2, max_length=10)".additional] pattern = "^text$" minLength = 2 maxLength = 10 [ConstrainedInt] py_type = "ConstrainedInt" json_type = "integer" additional = "" defined_in = "JSON Schema Core" notes = "If the type has values declared for the constraints, they are included as validations. See the mapping for `conint` below." ["conint(gt=1, ge=2, lt=6, le=5, multiple_of=2)"] py_type = "conint(gt=1, ge=2, lt=6, le=5, multiple_of=2)" json_type = "integer" defined_in = "" notes = "Any argument not passed to the function (not defined) will not be included in the schema." 
["conint(gt=1, ge=2, lt=6, le=5, multiple_of=2)".additional] maximum = 5 exclusiveMaximum = 6 minimum = 2 exclusiveMinimum = 1 multipleOf = 2 [PositiveInt] py_type = "PositiveInt" json_type = "integer" defined_in = "JSON Schema Validation" notes = "" [PositiveInt.additional] exclusiveMinimum = 0 [NegativeInt] py_type = "NegativeInt" json_type = "integer" defined_in = "JSON Schema Validation" notes = "" [NegativeInt.additional] exclusiveMaximum = 0 [NonNegativeInt] py_type = "NonNegativeInt" json_type = "integer" defined_in = "JSON Schema Validation" notes = "" [NonNegativeInt.additional] minimum = 0 [NonPositiveInt] py_type = "NonPositiveInt" json_type = "integer" defined_in = "JSON Schema Validation" notes = "" [NonPositiveInt.additional] maximum = 0 [ConstrainedFloat] py_type = "ConstrainedFloat" json_type = "number" additional = "" defined_in = "JSON Schema Core" notes = "If the type has values declared for the constraints, they are included as validations. See the mapping for `confloat` below." ["confloat(gt=1, ge=2, lt=6, le=5, multiple_of=2)"] py_type = "confloat(gt=1, ge=2, lt=6, le=5, multiple_of=2)" json_type = "number" defined_in = "JSON Schema Validation" notes = "Any argument not passed to the function (not defined) will not be included in the schema." 
["confloat(gt=1, ge=2, lt=6, le=5, multiple_of=2)".additional] maximum = 5 exclusiveMaximum = 6 minimum = 2 exclusiveMinimum = 1 multipleOf = 2 [PositiveFloat] py_type = "PositiveFloat" json_type = "number" defined_in = "JSON Schema Validation" notes = "" [PositiveFloat.additional] exclusiveMinimum = 0 [NegativeFloat] py_type = "NegativeFloat" json_type = "number" defined_in = "JSON Schema Validation" notes = "" [NegativeFloat.additional] exclusiveMaximum = 0 [NonNegativeFloat] py_type = "NonNegativeFloat" json_type = "number" defined_in = "JSON Schema Validation" notes = "" [NonNegativeFloat.additional] minimum = 0 [NonPositiveFloat] py_type = "NonPositiveFloat" json_type = "number" defined_in = "JSON Schema Validation" notes = "" [NonPositiveFloat.additional] maximum = 0 [ConstrainedDecimal] py_type = "ConstrainedDecimal" json_type = "number" additional = "" defined_in = "JSON Schema Core" notes = "If the type has values declared for the constraints, they are included as validations. See the mapping for `condecimal` below." ["condecimal(gt=1, ge=2, lt=6, le=5, multiple_of=2)"] py_type = "condecimal(gt=1, ge=2, lt=6, le=5, multiple_of=2)" json_type = "number" defined_in = "JSON Schema Validation" notes = "Any argument not passed to the function (not defined) will not be included in the schema." ["condecimal(gt=1, ge=2, lt=6, le=5, multiple_of=2)".additional] maximum = 5 exclusiveMaximum = 6 minimum = 2 exclusiveMinimum = 1 multipleOf = 2 [BaseModel] py_type = "BaseModel" json_type = "object" additional = "" defined_in = "JSON Schema Core" notes = "All the properties defined will be defined with standard JSON Schema, including submodels." 
[Color] py_type = "Color" json_type = "string" defined_in = "Pydantic standard \"format\" extension" notes = "" [Color.additional] format = "color" pydantic-2.10.6/docs/plugins/using.toml000066400000000000000000000052011474456633400200640ustar00rootroot00000000000000[[libs]] repo = "huggingface/transformers" stars = 107475 [[libs]] repo = "tiangolo/fastapi" stars = 60355 [[libs]] repo = "hwchase17/langchain" stars = 54514 [[libs]] repo = "apache/airflow" stars = 30955 [[libs]] repo = "microsoft/DeepSpeed" stars = 26908 [[libs]] repo = "ray-project/ray" stars = 26600 [[libs]] repo = "lm-sys/FastChat" stars = 24924 [[libs]] repo = "Lightning-AI/lightning" stars = 24034 [[libs]] repo = "OpenBB-finance/OpenBBTerminal" stars = 22785 [[libs]] repo = "gradio-app/gradio" stars = 19726 [[libs]] repo = "pola-rs/polars" stars = 18587 [[libs]] repo = "mindsdb/mindsdb" stars = 17242 [[libs]] repo = "RasaHQ/rasa" stars = 16695 [[libs]] repo = "mlflow/mlflow" stars = 14780 [[libs]] repo = "heartexlabs/label-studio" stars = 13634 [[libs]] repo = "spotDL/spotify-downloader" stars = 12124 [[libs]] repo = "Sanster/lama-cleaner" stars = 12075 [[libs]] repo = "airbytehq/airbyte" stars = 11174 [[libs]] repo = "openai/evals" stars = 11110 [[libs]] repo = "matrix-org/synapse" stars = 11071 [[libs]] repo = "ydataai/ydata-profiling" stars = 10884 [[libs]] repo = "pyodide/pyodide" stars = 10245 [[libs]] repo = "tiangolo/sqlmodel" stars = 10160 [[libs]] repo = "lucidrains/DALLE2-pytorch" stars = 9916 [[libs]] repo = "pynecone-io/reflex" stars = 9679 [[libs]] repo = "PaddlePaddle/PaddleNLP" stars = 9663 [[libs]] repo = "aws/serverless-application-model" stars = 9061 [[libs]] repo = "modin-project/modin" stars = 8808 [[libs]] repo = "great-expectations/great_expectations" stars = 8613 [[libs]] repo = "dagster-io/dagster" stars = 7908 [[libs]] repo = "NVlabs/SPADE" stars = 7407 [[libs]] repo = "brycedrennan/imaginAIry" stars = 7217 [[libs]] repo = "chroma-core/chroma" stars = 7127 [[libs]] repo 
= "lucidrains/imagen-pytorch" stars = 7089 [[libs]] repo = "sqlfluff/sqlfluff" stars = 6278 [[libs]] repo = "deeppavlov/DeepPavlov" stars = 6278 [[libs]] repo = "autogluon/autogluon" stars = 5966 [[libs]] repo = "bridgecrewio/checkov" stars = 5747 [[libs]] repo = "bentoml/BentoML" stars = 5275 [[libs]] repo = "replicate/cog" stars = 5089 [[libs]] repo = "vitalik/django-ninja" stars = 4623 [[libs]] repo = "apache/iceberg" stars = 4479 [[libs]] repo = "jina-ai/discoart" stars = 3820 [[libs]] repo = "embedchain/embedchain" stars = 3493 [[libs]] repo = "skypilot-org/skypilot" stars = 3052 [[libs]] repo = "PrefectHQ/marvin" stars = 2985 [[libs]] repo = "microsoft/FLAML" stars = 2569 [[libs]] repo = "docarray/docarray" stars = 2353 [[libs]] repo = "aws-powertools/powertools-lambda-python" stars = 2198 [[libs]] repo = "NVIDIA/NeMo-Guardrails" stars = 1830 [[libs]] repo = "roman-right/beanie" stars = 1299 [[libs]] repo = "art049/odmantic" stars = 807 pydantic-2.10.6/docs/plugins/using_update.py000066400000000000000000000016151474456633400211100ustar00rootroot00000000000000from pathlib import Path from time import sleep import requests import tomli THIS_DIR = Path(__file__).parent session = requests.Session() def update_lib(lib, *, retry=0): repo = lib['repo'] url = f'https://api.github.com/repos/{repo}' resp = session.get(url) if resp.status_code == 403 and retry < 3: print(f'retrying {repo} {retry}') sleep(5) return update_lib(lib, retry=retry + 1) resp.raise_for_status() data = resp.json() stars = data['watchers_count'] print(f'{repo}: {stars}') lib['stars'] = stars with (THIS_DIR / 'using.toml').open('rb') as f: table = tomli.load(f) libs = table['libs'] for lib in libs: update_lib(lib) libs.sort(key=lambda lib: lib['stars'], reverse=True) with (THIS_DIR / 'using.toml').open('w') as f: for lib in libs: f.write('[[libs]]\nrepo = "{repo}"\nstars = {stars}\n'.format(**lib)) 
pydantic-2.10.6/docs/pydantic_people.md

# Pydantic People

Pydantic has an amazing community of contributors, reviewers, and experts that help propel the project forward. Here, we celebrate those people and their contributions.

## Maintainers

These are the current maintainers of the Pydantic repository. Feel free to tag us if you have questions, review requests, or feature requests for which you'd like feedback!

{{ maintainers }}

## Experts

These are the users that have helped others the most with questions in GitHub through *all time*.

{{ experts }}

### Most active users last month

These are the users that have helped others the most with questions in GitHub during the last month.

{{ most_active_users }}

## Top contributors

These are the users that have created the most pull requests that have been *merged*.

{{ top_contributors }}

## Top Reviewers

These are the users that have reviewed the most Pull Requests from others, assisting with code quality, documentation, bug fixes, feature requests, etc.

{{ top_reviewers }}

## About the data

The data displayed above is calculated monthly via the Github GraphQL API.

The source code for this script is located [here](https://github.com/pydantic/pydantic/tree/main/.github/actions/people/people.py).

Many thanks to [Sebastián Ramírez](https://github.com/tiangolo) for the script from which we based this logic.

Depending on changing conditions, the thresholds for the different categories of contributors may change in the future.
pydantic-2.10.6/docs/sponsor_logos/
pydantic-2.10.6/docs/sponsor_logos/aws.png
[binary PNG image data omitted: AWS sponsor logo]
m%:em޵zgWCf=$e6(F`F;Q+$t ٕWeʒM P^).@f꿑.GA]vNӮ<SI|] y\p= 9oѧ9*qF; 6nt[yGwO[;T0c|vu,,R٦^6*?sV< luFV2MO')./P$mԝdJ5)܂=AvPRX!b d/¸0٘낹]"w(3F^?mOܕgO>{+.Av31?9w WܦMIw,gOhp(-1sITBh{?ׯx:Z6\X =>YfL/pr K_`d-*Fgַ5ۓVleFS/^[jvfck-fIَ8>{/<_)?.p2㽴m^RŲ(ֺÙ4Y2duTFqJ&hsE@fjQXڶW{wL j"[د1ψPq+@Ύ3k3 :I8=RnHqpe־w2}ovm:┩!L}ZM:S۸;#{?ˆՂW:H=fNJ2mȊJ1-4bvhA)H"ttA+r'# ώ<5[+gZ3dwP\d)fV:-yn75?(d)m%eeNfk2B1cLv);XAS>#`*sSk;   Զ&S(Ě@ɫ8CQY/\XOǿ3[蒓ږWCѷS28%MΛ^XrW偣gi}x-r.N/hqgu)b6ogo77%Nӂx4)i4AH25epg^7@ug33 dS!m bB"yAmu(4o%>$Lv@q¼k;| /<hU֒QiR-΀wXgMTDԚ{G[YֱNN8Zψt?YW IBi`foiGRW U$Kn+'#773+\=u 1Snx4DA+}S)/?svsZluIM^g 5rA0@Ge 3Ń+k>sqͧ/2Rms 8h29?s^۹zy3ll#mlR"japHLV+0v[G4;dbhTGD*@t-I{cOǘ ظ(mlmHZ }S|JsW=|PN ԁ*+qu9۫w7rnWʢUG*,j;:B7(]i+FLX-fpFI)TB_$ @x97iR;kV&1EGDO!醎 q +[\ŋʯ>#|h\a2)nJ$;;Q@pA>t9|ߔ}7C/aw|hw>AiDdq?|p}W>*yhnG:+Zg՛ :[Wޫ ޹i~ ~tc\;sR6FL/W`q$imUn5 R3Kϸ) 3UDںC!L*p!,Rʋ.(/z.,Ll,:i'Vx>l[񩓡]@DiP)؜ZvwNIt\t_#ަ `^R{ig4~:}uz2I(UXmG6caSqc;Ռ77G޻x q 36@8k:Ye8>â7.GZ<_xpa?)),؛mY6,\ed>Ov:1( pioZQ*U 9T ''9]UNuf5XM[blFc=lJxۿޑ̷ cpMiƾ{ IvG赒b+y} ie/)Vqkw%+"7g, yLȚtuBցNrI egX̜y9瓐\X̠ 99UTt9S*k[lYY_;KT3J 0#:z8sfklzcXCI5SwXҡ}~>}9]JhJm!'V Oìp.b I;TFtjld0=//TͶ =O.Hz DeR_8謫'@$A>r3bbbD#̉D0ԤHL\ vA"c EXxqRRWW'8qR@4 0A>c` 4RSbřK]೓ K_h`,- M޲0HHH]0@MK@W\bDxjkO+z PVV(9^/vn8Hb Lo6[+YlwLMlhpbWk &''EC!4ip@ooZN'<3ɉ R%99fsGFFrݙbQwH A/Wa[iτb9᤾)S6 pԫH(xмMM.LL0P 15݊Dѥv0lciol"6VFA7\ 8"ܢEiVzR_wwj3魴e]]۶61x咏=trô61mJ%Ҟg>ydc ~$\rr +rr.E'4;kّLG6xvaw S{8}RΊ'nrV0a`)X+qᅹK  :EB4k(*Z+u`GG;j<-: 2jpZLL &:tnN 4 xdL͉ t:,[Cr9rԎ!͋l-X0Ut)B w`YXpe:͕HH3n]Y rO9!@G~#EGGĈ5kE#o͐QpfW-FFx?1IXX WAYY9F KJJ$]A ~HL\ !jkbѹjYtѥ1vY.%1̣E`V!#cRao-lEB#:WsȧNg-%RAcUaLrDFᡃP[kGRB8U]-]QHybV8j1>ΉZarNc}(|d Lݢa`g}CZZ:,*g.E^88aT"'gR8 i_D-[ +ld: s4}hk$nc& z+V܌`xlo;\hi+"=Z[[jp.'(0~\yw(x /]p5`8m?;(~Ywwj;!popثqqѥ K`G9"oo? mzN]Jah34E]BIXX VT^*8X |C.Z ~D2yVJFȟsӤĮ];Qdo0hw^톓 `VaLѥv0l]  hQ̖Ji$]$m8fYVV6*+7 9%Et)Bz4 ""WԄ^Dt6ӽPe )8,̤>+bcD#LK3 Qk^\bDx|#[w6AB!dǭ/Dx6 Ɔ#E2oB2$%%d #aoGw`T"33Kt)B (-+GaaD#DwwjDlX+#%ѥá3q0$^Ǫ"L刌]n7\z y3(X.eOAF4501TA&Y(!A66GD~!A7\ +hEiVvRt_â!" 
~Jt@YTSݶO] S`XDt)B멪A"#PZj5Nañvѥ}̅bDA׋֝-hhpb|<8M[6]" !jpMD_".NF74;)Z)>EQ^:xEEkXHe0PsJD򘘘@S Ci SD$Al{e>50ӨqvDt)DDonEl-UXP-@A4Գ~/,[C蒤 >Ca6 g ~5v 5(wnKMRrGDq###UoîLf6  !DśonElX,X*0FFFrcWk &''ECD::xAi`HCê"L刌]RHaءPScRΖxo^fMpm8y[t)DD!gtt5vݳlJp:kGD}}SiX!Qhre^ry|[V$&Qp&x=]GD$TaU$\z"""E""" 1IHB DDDb ""$@DD$!""" 1IHB DDDb ""$@DD$!""" 1IHB DDDb ""$@DD$!""" 1IHB DDDb ""$@DD$!""" 1IHB DDDb ""$@DD$!""" 1IHB DDDb ""$@DD$!""" 1IHB DDDb ""$@DD$!""" 1IHB DDDb ""$@DD$!""" 1IHB M6']&~ӟb͢ ?kiiABB2欿k׮=߻Xյz-|_UNx.CIIIK rrr(9KHHP=D4{HB DDDb ""$@DD$!""" 1IHB DDD 499 #®MDDH^}U\}բ Y~~>N8k)I DADAB^GZZXS,=zzzp1 Ett48 ::044qbdd "-…^  mmm~{BBBk>00SN0j$m7 U188өz jjEqq1sV8x r}vtvvjZxl6^x!rrr|?zݍ'NСC8pQ[[(XV?44qwkANNuuuxWk>a)))QZZkע .sn7>Çc߾}l&J W_hW^z S~m^(u]S-rf5| _+/۫Y}^WQ .}}}ռxb^?TW_}Uz_illT<_kmmUo*LUݻck~_Wf*>|_Vv211|Qjkk;SIHH=!C*Ǐ쵶+qqqsaJOO&߲eꯉNS:e߾}t.O?,YD EHOOW~*|}388?T עux'}^_SIIIC= z v͛7F?0,ajz:Ny75UbIzbddD8t r߯WXTJJJ|~=Z|;2Lrkzn.nr=(¿'!Y=d ~X;22wq&W|~aaa=EQ߯ZJ k3~~3<3,܌׾ꪫp>㟇瞃`P}z =OMHH7L&uh%66/ /.>u]'sK/BoCW|^;)**¶mېuCFFyTVVEt9Gj*?u_9emooM7݄NS}G}k֬91_PUU]]]}z7x#~Ox122?Om6o7ͼ2fe;裏{5~_˗_ ׿G]N~b6-?ϒ_jVÿۿ}l.HVU_ŧר锗_~YW8"^̧6^^y ^;vlNij?Pk``@ILL*fyίT:gu ^WimmUnݪ*N??fƍxO}SsWTT(ccc;88$%% |/`VG\\w^/O??Q:l٢Y]oLr}>o 22R)))Q.RoW+o7xCٵkr^^u~SR:Y]kk%;;c[fC-N O~'P}{Gy/`VǪU3={(>di^Qp3=x(r뭷 0h%77WX,ʵ^|_Wy^PuիU5+_kjj/JJJOV}Zg`{ns=կ~%Qp y~TUM7݄9=5\˗_4{e3::#Gȑ#K.r4h6oތ~Z_~yn֭[}1|Q455h4|r nz)K®o|;wT'?Iu8tМ7>>͛7 UC)..W^y%y8q 6mڤz]f59j^Z=?""6lPF(`b_W5{w3z)hd˖->?^P8..N}$Vff&>ࡇ»ヒSNOnF,X@tyG__u***TX={aݺuvR~ӻF.?>\{p8oVGG/^'%%ש'OÇtRU5Z 3 (**BII JJJ`X+y믫^#**JQee%֬Y~eT jehrOy$/݇i |Q>h5Ok i5<V+.y_u4Y't:;@ַO|~o~l1kh1O;ZZ%]uUxpmޟ~>p8zUtG A?r0lٲ)))Dk k`rwW^}}}xWp#557|]kh-h  _-ŋgW^jSN%55UZ4utt^nƍxG4]wrr x뭷o^?FM !^ Ag?~|3җg}5{=u,]Tqj"O;|5hneͶ񍏏c۶mxg9V[F;::zU|g)TjfԦM|Eq8wuu>bTWW h,{nk~#}1c \_DEE}Zp.SUǵ^sUl@ )))~[u ]wOt:/ڡ$4wXd5pWԿu3M/b *!Bkp:>%;\rvddd>;~mޅc=jVRRyvmsz-܂2;UAsS^^zo>+T_ߴcNSO= />9U|s͛5G:47IIIeeVV&cC>PFSSӜ w}ɓ‰'pItvv냰vxc?5ٲ㷿ݻUomꪫp7b˖-sz޶m[oaƍ~_=y[8z(&&&Պ>v1Z"qF$''ʕ+E!V_ƽދUVFLL 
y\~1::?9?@FF222vڳVFC@UU^ԩS9t~_^|I,_|ַYRNNx8N?~hmmœO>/{088z4{ZLpLLLs=w .K֡brrTw}O?Y;cccMMMs9ǃnI_$w;| 7m߽{7^g?Su}__|Qtڷo&\vehnnO<͆LNNb…Xj6mڄkF>*""B53׿ӟupmnÞ={~tww#** yyyXnC{z!477k^0cp7oFvvulق?g3ۍ>պupy{࣏>J\~kOGEeH VOjPh1J4EQp7cΝ}4.2U!  _~e|_ĄR>f׮](//G[[R@ɤz{shj|s󆇇qWe֭(++/_gTTT`֭8p,_YYYV&s=/8ʑg`fin9եi͛7y^G?ByypM7K/ hxMN<EQ_WR᪫Ҫm6[o:z{{q뭷bӦM $>>?Y>>(u wt9jDb|_?oAWjw~l 4GŦMpea~v?x g }AWO? =Ouf3>( ^~e <󌦝غu+dggАf.<U744_~ԧ`4p8>w\.|MqEi2+P(((@UUl2oć;[oŋq]w8H .Daa24~?~JM޳g&UVV"22R>ؾ}&k^ K.=fflQchnnF}}=l6FGG5lVf8Δ|Uktww'> XV[F)))g^Ǐ{Ӊ:l۶mV-^+V֎sKjyiwzZ(--Enn.rss6;!^xֆ{pAmqHT1E'!$hl$&&"55 NA?p)%X`,XqvĉA=TEFF"550 Gtt}&&&ۋAw@DD$!""" 1IHB DDDb ""$@DD$!""" 1IHB DDDb ""$@DD$!""" 1IHB DDDb ""$@DD$!""" 1IHB DDDb ""$@DD$!""" 1IHB DDDb ""$@DD$!""" 1IHB DDDb ""$@DD$!""" 1IHB DDDb ""$@DD$!""" 1IHB DDDb ""$@DD$pօj'D@؀qIOGQ\UNa&DDF 11QtuXj5v4Ҳ^WtIDDR[ր!Y hQabE*858t蒈 K Z *%%%aӦKQx 'Ov.(d-Xr3-]JcHff68tjjl]QȈFqkC׋.'$0h,or.ΝhtճQH^UE0%0QRbDA*\ڂIeea6[ ~`Ġr= We ,dffbTѥ4?HJJƦM}Q88u蒈NRRL&6 eeM5 >144$$""bbb`,5aa<_L!?yy(HDR GQZ3!22Rt9adQ0?سgE]ѼtX K% H@xlĿhZ8lhk 3| .]Drr .3hog dU/07 8f xre!t:rCtҥмMM.LLL.hXDDJQ6 Qt(**]AHM]$>H.ANACD.0eXjt^/v4c=z˙zKi0`*7#?@t)´ݶO] S`XDt)B9 m;D2'A-Zkzg.EIݻZ .$c ~]۫qXR|`Ӣ<w4D4"""zKM] jE +AQ CA*+_Pٙ`Ztt4Jʥ(췤(0eeeZ)).E ɳYB*LKJJ$SmGfߎjJ!fK%23D"D(`ZVV6,*\pgNO7 :j022"" "KˤmAM mG.eބtf0R~fcchjtDtN(.^#;+ndd.W=Zw6|?UiwY.L_bccEwعz./ d߷ ]]pثq1ѥQFe$.ECƆNY.LKM] g.E`=hQ̖Jivv^'.Ei%Z#qѥ199]vY1 0mǡSg_r3bbbD#(v4Ҳ#hYDEEdk֬^/_LC4 d曣z^t9B] idMNbbk3c8 P^n !?# ~66>-piiX8Lw10/"E`V!#cR^c<faj`%D"ݭpb|D`0TnF~~Rh0K(*Zu2DFF.G6 hxWGt9A`bbb`,5a5Rn^8l$ z+Vv;BgR?0(99fK%rr.Epm8y[t)DR[V$&&.EP *>qM"QdoP]Hm0h`f`C49i[GGGĮLNN.'1hhzf:l34 Ji~0̃x =p:kyH^UE0%!H3)LpvsVHa`)X+qᅹK  :EB4kttaa\ D" )L"..Nt9B@ CA*+#Z s0GbG --;zECd8q'Q`INNbENѥ199]vY1N & A*`@yKfUFA">>RuێͶ}K!0E`\ ѥǏu.$իHm;'9l9**HLst(w#tlEl ~nx-T0^UE0)"5]wQZV.#cchjt䳅X%!fqMba ] xeR7qHWc{5蜦.6۞>,g eSmd2SbcHiѢ4XUX,aۏV ==EB+--k{{{t@g0`|LvwwjtL:ea6W"!!At)ttnɓݢKFq[ >5pz:ӁSZ[[jp.$&&RV^#^j&V޲娨 1QI}5v . 
9KD"Z[099)ǞN#m0&8qYFy`AofTifzlElL[lڎ]&i3z=VQQQR|byoT+lԖS+E" @A4sb{5?& q 7K`\/-9 mmGDP`2vt`K!I0_)lҲr!,,Lt9B--z'DCa !=(XcxXF%K)g̤Z .$@BqoQ;DhtXLGamӧDBc .EP撕 R==!(-[pp]t)HJJT!]@@mQ27 d QZf Fzg-Qna͑(%'lDNNRnlltVYV; (vN] 91Pʆr=RR.]0n7\, ^qR * *+a`Wu$'l"'w:DŎ&ZZv.h((MO,Yg^/Ah;r6vϘصk'ꝵ> >  _ afNȼ^+""WGmG($,Z E"(]rtlEl ~GB HLwS QWI}Z(z*,T.@~/|@!fxeR7LjLDs@!/)) &gӱ>ĈpywVD@ҘFW ,L{y”8lhk;,"` t~|\m__/jj +W$ꝵػww@H4HJNGc X*v2:tjjl>Z83Kt)4:;Onߎ'DB$чdeerSRDB>c xS111!fG>b :h3b͚bzs@Qb :DhaV!==Ct)t]pثqXR-[يy'Aسg!"h-񠵵'<r 11108O3]Qb @RR2,J.]JHko? N] Qc PVV6,*\pCooZC!"t:\fN(ڂIyu(.1"<<\t9AbV8j1>>&@4 0q,MMc`_t)D!ORSbřK H]]۶ĉK!.r=.% n85ؿoR@$'ccchjl@KK3 IHh5qpfR_-FFEC$-`A[\t)(lm9}Zt)Dc iZ#-=]t): K!` @y˖b UPI}D(@ ٌFW=ECDg@QZf E:n[t9Dt DA"99fK%rrrErFSN>`@da\ DE]] ' 4}PBjŎ&ZZv |@Ħ'3BI}ZqRQb /9 m9(1E`V!#cvuua4]a AS7 11Q:n78($1(^UE05z<4hDcBQI}Z sRQ(c DRRLOlo? m;N>ʈH"dffb… ===qTʈȟ$鰢`%`=G$!""" .HB DDۭAփ\1$0$0$0$0$0$0$0$0$0$0$0$0$0$0$0$0$0\R?p'rIENDB`pydantic-2.10.6/docs/sponsor_logos/explosion_ai.png000066400000000000000000001005251474456633400224730ustar00rootroot00000000000000PNG  IHDRx pHYsaaøtEXtSoftwarewww.inkscape.org< IDATxw|?l˦^B"A)*E#rATH `(_H"- HO0!P{۔M.n=+MΙydw9sseYB!Ī!B%BBBB!VB! Q@!X!J!+D !b( B%BBBB!VB! Q@!X!J!+D !b( B%BBBB!VB! Q@!X!J!+D !b( B%BBBB!VB! Q@!X!J!+D !b( B%BBBB!VB! Q@!X! 
Gvv6H$q@5wxXZZ;vCFIӧqI\x())y0BO>΍>yg}Nc),,4f'͛X|9ă !JIBAAnܸ]v!11EEEV}}uu56l؀"X~~~ ?^*t^^G͛7od2Ν;ؾ};: 7n@qqmDUMM vލǏ?;w8/Sصk|r !33EEE۷hl/JjŜ_cXbG0aP(8QT0ךSGsQIRcy&㑞.]@.>)899U/ٳ'9˲())˹ !L&ôi_K14 Ξ=1cӼt:qP(еkWMy%zcrIN׉X,ƫ˗#((H1Xŕ+WoRׯ?v㸺"44"yujkkyU4۲,Z4 "!\8RcǎŞ={кukA`Yn/m۶AV7ا .]|s Lɖ\s ØHKJJ ^xFuuC"( &sظq#u&xV^^̙oJ^RjTTTpnzoܖaliDbb"ƍ3fԮ<brz?<$a U O>Aee]x0sakk+(&SyH$y%[bz9!,C᧟~z m6DEEaѢE&P@L0 1l0TTT?ŋT*ۆo=zT#H$zYx9/%Ai&,"''1Ə۷՜b( &YfX~=&N(.k׮Ell#55 @0|JzKܹW%@-A4={СCO|ܤpiL6 #Gٳg) &b'VX3fIqJ%6l؀ӧc۷8ڵ3gҏ'Vʕ+kf^!bٲe VV1l0޽"$ɨ 1y7o|}}1g^eljطo%KTTUUq>c۶mr 7oC"@,?H$PjcꫯxyDׯƠjLC"@Q%,899wށ#>c>Nݻ 6 99s HewOSUUūa]]6mIR# CHH<<<GG;8^x?t҅w['J1n8DEE02B H7x8q#tbbb0 R/// :kLtZwݻL&PRR"9H$BTT^' J͛y/ÛoiV 9Ĭ`ذaطo:t "ߕ+WpebIܾ}sCT*JJJp]dee5j'^u^ny!l߾לLyE( faDEEaӦMСC/$$?!,Ч[n}СC !Q@̒H$Bn?\+ TΝ;C&M!++-[bܹl ZogϞ3f̠I-aвeKlذ vN\F鉏?222qF^ ^z)<עE \Ǐޏ/Ln3W555(++=lT'`4 ֭[ǫpo~hMBb|||zj kB'z=nS*//GYYxVZ!>>g?~;w5kfb1#66z;.˲|2L'O]( ^6Dǘ1cLJJJyfIT1f@%Ģᣏ>ƍc߾}#Gڵky?6 rn:UV.NNN߿?~g$$$K.&ܽ{71D"G}DwdQ! bq$ F a 77Wo.))… Q__3gs]Bm֧O|GjP((//GUUQPP|磰y$ Э[7_B_I ..W)c۷/hi2 'OF !A A||\8po$0k,}͈y) Xz52jeشivj6Ytzzmmm1uT,]TxeqL4 ǎZt!xZ#b\?a0` 4Ȁb87obĉغu+?04i;j.HJJ<]&!**?:u*6o,xgk_{&]!PZZ B5ǚ5kƫ?fϞ ;;;EFaQ`1yd$$$.T"H0n8,^,64u00}*++h"L<999UQQ ̦j͛c…T5Jo~׮]nj3,~255H0p>~-^&Rc֬Yzݪa|G|||jԩS_yH$9s՗K&3PWW-[`ҤI6o>}:lll 8qܹ3$~;cc˖-xW1k4޽:OAxeQ\\+|g'=9r##iP`j5֮]XAŭ[ڵk1zh`j 988SN0 |||zjL8܋>}Acʔ):t("##$ZFZ-v܉g₷z9#~=I#..WƟa˗/J7n@II >>> |>777,]Xl祇666pttDhh(g\pIA>}Zա555í[p]ܼyF``ŭ y<^1d 0@QҴ(0Qeee5k~^Q_} i:uz=0ggg̛7...Xt3 f͚}Ba;nH[[G&u񑶕VqR|rdee>*@ Ƽy.+1tP,[JVk63'ѣ^-L:`"666߿? H=aΜ9û޿T*ŸqвeKEEHӣ$%%aԩ8wxWf͚FJٳgk׮wqu흝ѥK_.W^.] 
OOO 4&VBvv6~ݺuԩSXJL˲8<}A[[[L>߹ssΝ;0 tRD<۶m+m۶ ]0ḺcpQMfqeB h48x fΜ$xyya…x7 W!%%ƍCRR҃7|Ʋ,Ґ˹O.]xJ,ňŢExMcÆ ða !ALRƴiӐ{@@VXI&5⟘'ʕ+x^[[ko||CSոt9r-!ARa˖-+᭷޲Y:Q`d7oƌ3xݹ͛7cԨQW8x Ə˗/?Myy9.\_F ERztѨļ\z_5?#GD߾}Mn&!@ CP'^"//OpMzP*ooݻOA_XXwyWR\\жm[FDLNêUp^0{lA,%Fbȑ֢+J,X@Ю5k`ٜy&f̘aMl:wVZYHeqq8pW?a0eA,%Fヸ8Kgggc̙g6?QZZʹ˲HMM5ǎվ_~4kdTUUDz,v؁={@b\www|xg9s_qjT*6lཥD"믿ɓ'  RSS9uppAmਈ, ZV ׮]ɓ1e-B KL˗ٮ]x1 FEEO3d _9s ~lvvvl6mѣG_5[ZZʪjcL@yy9+{͍5kjZcj%&رcA=vrr2Ŋbuuue-[7+ҰH$bCBBɓ'[ne]F}+TVV?i.]6mb+++hFn߾%dzV#Gڵt߼ysvl]]_S*YX777]v Çyyylmm-"1"R&%%Ce]]]?nT)**=qP(+"ò֟h4o[oZ|L&ĉL~s +0h }Bmm- JѻwoOq) Ç;#嘄4%… ;O"m۶F{G@V iii;w.OAdd$KӧQ I" <<m۶52d2ڶmh ##CpJ%^'O -[-.!fFLXee%>3AR/0 }>,6o- 1E/@/|g;^00\Ν;HKKݟeYD"8nW^. IDAT[+ D_Я_?Jq?,77'N_6mˋFI3PPP)S舱cbժUzPaܹpBƚbgZ W^]qwE- i+))A^^PSSjpss3\\\;l%f"55cǎťK r|;;;|ᇘ1cކ/={@R[l\CJOOx]vo)S͞,8~8.^;wUUU}wwwxyy!,, xСCFmRF 3,+͛0///|7n^ް:ׯ_ǤIpLb E\\Lãj>ѣG 22ӧ,J`߾}8x T*y82 D[[[7 6mЖʦWZ-{6((HoMBBB؝;wNc?vk۴iæ|Buu5g6**H$zy9;;_}h4lBB;zhEo+aط~ph+̨T*vݬCޘ" cO:*y*=v`AoߦB=Ĥ_~%۬Y3V*6=͞={ؿ˲..]bǍzzz@RC7!!!ERzߛJ̐VeWX ~C6oޜ=yވ 8^#آ"Cq;k,GM&YRi_dlzzzD`iO@0Ry SNرcgk4c(**ԇeYdddD"iT ;zݻ#''yyyw-[Ē%Kc(eY( ̛7_|m۷q)uִ*ȘFPlLL +evvvΝ;\9s&kkk+(@ɓz|U1zvΝln8$ 5fZ͞:uٳg?weKJJX3J\NN;l0V$ z'Nt vҤI\.tnvӦM&1,J:y&xb}eKKK֭[P?eccN:-++3c(yyyaCCCD^LIIa_z%A[ `_~fhׯ111#>>>6魢OX'''_%HQFFy%"99ܹ7a=tN~:۫W/wmڴaO>M3P*ݻaWll,h|C_R:uIn9nhD6mpE.//oFdd$Fرc6mO`?+Vzٳgރ=J% ۷:N1<<<`ܹ"= =pETWWs*#OFs,XEE,X $=!"ÿ/GH8z( Rɻ\.Gtt4>s=˲,._#%%牯/?֭[-OZY0̙3GD$#"7o5k`ذat' /twpp3yfmVYaЩS'!88s|[Q Q`|||zjDGGC,\=ٳC5!_.\K,EkDxgi&pw^$&&,.kG hѢVXbǷ~ş&P(gTTT憏>111D;;;N} w^A#aX: ..z?T*ŋ/͛74 Gx̛7zUII }9lmm1o;;ϟ7pTև+5eL0A]+0n8HRDGڵk۷o_|рqעE DEEqj[SS#h2tX)WWW,[ #G|g;vĎ;/QB˨V$!&&drgϞ>G둔D+fccKbt{sb9 Ddd#cǎ,8"BV.00VB=F"__]LBLZFii);v̽DDDpNPVVf %Vaaݺuh۶#?xװqFDDD!BBT*qj0 Zlirv=<<8!*--EUU#.@vtR4klmme˖ψBjh8d7 b⊊ A'3BPbRb1^|Et:L0ZӛL(!H.L>IV@( 0d,Y0n8OJVϓh4TVVrn"QbL<h!&L& 4FBhȸuWOuGG0 CBL0@Vn*7o䵶րXJ!LrW\4)t:\p՜;::QYJ!Lk׎v&UHgΜ\ʺP@!fM6pqq655$={suT( 3%ͩmAAgh4Xj***8_*;J!Č=sn߾$V9rO^*M6TD(гzTTTءB3<2ޏxT*Gd׮]y}F:::w:Q=Ν;jrn/2"""hlݺsƪ˱e^ <ёC @,ȑ#/|U]]+WbM!ĚbXx1 
˲nj3rJR)FEh((??˖-< WQBA/9999s&~7t;۶mײeKL8f%t:8pΝ֭[v횁"X3GGG?wšC8W㢪 ~-&LǏU8q"'J(,,{RDFF"X3a0d̘1WT1nܸ!b}Bɓ'1x`;10 hL46&3 Jضm 9˲zͲ !a111ի>JxWfܧjqq`̘1?xXf0|89958h O.\1cp-}ݱk.B=ϟɓ̻D" C...ppp0Cmm- rrrp?~)))mACyf1%<(J̙3֭4L־}{!Ġt:>W^yIR#00ŨDYY PXXץvvvł h P%@_^_,7ޠ?!D" EatZwݻ 1;`֬Yto"4‘N×_~ɩtFsTxb1113gS988`ڴi裏 .Â:88`ҤIsTd>}:bbb Ɍcb٘;w.ob*lڴ cƌX,sdtOUV ~`͛7ǢEꫯҰׯ_ǡC̜>}:\]]!pcooYf>Db}ሏǨQo$pfAw `̘1߿"⧟~BN&L'N`&X#Jp :tHP_///M&!&0puuE- a+W555u 4h>*^:"JѱcGH$V2BowŞ={ߡ_po̹z+0ׯ/_;cvee%:Kr +N}̉U<(--T`QBߑǻ0k+##0|֭駟sΏmqvvѣΝ;s:NCRRJJJ877V"==w* I!|t:D"j WFvv6V\ĶZoI$L0=zxj={bݺupssorrXEPTT$(""󗛛7npn0 _cƌ ^kkk9_|r9 ©mdd$yNmxU,4'V$''~>%ЩS'|C!kru6mcуӣj˷CCCѺukNmE"bbb8oDrjgn"v>"3S !eddp{?|rWT*9O޽;ZЩS'Nm\ĆU$ɼP@!O Z`;aпGYYnnnpvv}|GGG΅ YE $sڵ!ˠj9o퍰0 vvvjlXX|pA#B,˲.@ت|aĒ/"7C !?T LƩ ոrqq\.ur\~A*r^2hN,:' !XT[YY)|B\|S[#΢RA}||h `By T wwwNkxĪj{xyyޞױmmN,>hY IDAT(++ˋF!)?g㥥(((utNU% |}}9?Ǐ?xIքRГ7%Ҁ`NjkkIIIk-lP]] 6 66 S"ݺuXFH$<BU˖-9% OF~8Q8wiq%$%%a޽8rHVZYlQ8MXEqq1uIR(*B͛7}z5 ߿իmmmѬY's!h-ǎ-`Z-233yd0@DbYr9 8t.\O+]1l0AÚ&,rG$. }NVcŊjj^?jk9a702't:A;8d2 "B<>,<==9qƏ* W\رcq^oHdd$gcc{  !wؽ{wmYő#G#F{pvvF^^222o>dddp>;ڵk`HR^ sno,6? /OOOL!_|GEUU>ذa;HR(J.Խ{wNw议h֬9UVgY#Fiyyy BL"C ADD~: }񷱱?ϩj ^Bl2 XlPQQ!h`DC! Ǐo f`` zͩL&C֭l _G&bBA,u'!ѣ1p&KΝ;sj+J D"ڷobEb!{4@!B8::wm:*'Nr111ضmE/{JB+!?a>0h)?boo;b֬Yp֬Y;"\fYb("'OFjj*n Rsk3g^@߾}g}{XJH!z`NJ+za+VM6⊍T*L&\X#G !4 Νŋ#00۷/~׏qpp ]bG;T{XBM8ላÇT*yH$Ą 0eڨM,6ž$Ҥd2部(9s;w瑞Gf닞={b„ nȭ^yVBX,F޽ѳgOܹswAFF򐗗DzzzpYl$po+Bab>feY,sW}"&,CBi gYl%d !X MP@!P7t'b-,62_H@!#MjB!b!Qj b,6J!ka B!Of FC#B&BF4 B VBUƆweQSSch!b ~%%%zB1=LfQwWn:ף;FKۋ^=W_}qqq={6̙0c>ٴjڢCz=lPсzh{ ;; Z!!!Bbb"@ Ѵ 666L"hnnFuu޳(--r888@PfBBB͛Pjxzzӓ[L "##!`4ۋVFD4nBk0===hhhѣG!PbƌD\\Z-y3fHPRRbR(B^^ R ///̚5 gٳggg@Ri]FGGC&񸠠MW^d2""ߏ6 77{ @dd$bcc( ,, 6L+"##-ϟQ턈[C&M]DQDGG.]K. F~6m*mcLҸZ466" FM]<"##q5444@ӡqZݍny˱i ..Nr0L(((` "{/^twwCף @NNP^^.b2l.Kc2pE\FUM}nnnpssCdd$/;GPTT+Wʕ+^G{{;Z[[a0&SՈ2l.@|| "777$$$ !!(`0```Dnn. 
pE֢限G ~d@V#44Try@Dd] @TBT3f;v***PUUrtq-={1& 33S8NZب2"")))HII(CWW:::ֆ\ʕ+144d?`'}uڵk-%@DDc'\\\2:#pB_ZZ| ^h &&V>&9T*zzz$Ɂ`` "A,X :;; ^N"\t =FEDD@R0Q&$''ErD__lTY<<<ذakmmEaah Y҂Yi455Iֆr̟?Fxݚ("Q__N7ȰgϞ[; NNNHMMEaa'Od "fAZZFbb"DQ0T*'<7-h܅ DDtc ...?`g 99UVVM >>>HHH}? ?z!""v`͚5AWB٠"""e`ܹ'JI*Ha""./^,y\ww7Ξ=!TEDD4~2T*,_\rGɄ3gΰ+ MyvA@llEPYYiƏ]HLLDllqؿ *"""?v<==xbv8q6h|m{ B򸦦&٠"""a`ƌHLL>> ETTVV!d#}4$XqDFFZThDž pI\phhh0kJBDD"##kbܹP*6ڵkذa%뮻{aԩS?}L e2|}}G})))h4` }Y+0:99!==6ȶL&qa(//} ca͚5}?]dP(`4q1I;.> )gpp/~ 8p6dB}}=N<,@B.sq+zjdddH?66JKK/?֭öm0q=PTذaEc石1M%%%O}M/هGC:a <`0~QDDV駟bÆ 8}4L&ӄ"" O<zc/AVc-,,ę3g\ueffGIIɤ:Ϥ:z)-Ba'''lܸѢYш^{ 6hlL&=[NJQ~{ク| ( [`Vh m۶IhDnn.n݊/2w|cǎ;lL|{[oUGD$MUUlقt R '''888@W#YɄ! `0XfLdٳ3gδ5_`E-y3x ?AeDDtuu';#bXd ΝPhZ8z&@II rrrpE\|:*W\={@ьZ/ AEx',ږr뭷"33I`˖-4w}7~aˬf=(++î]pA[Z:۶mo[8;;Z/ 7PYY|YYY {AeDD)..իQSSc\Xtz/|zcΝT @hh(֬Yg]EwF[[ *#">ܹ/?lذ.-[6Z-~cشi{ɳx d2b߾}<^!22fbk"w'N?odJ|;wDHH޷r9ñzjh4X 777$A+#٠:`7">>ޢ999ؿ?ѸEGٳgs-kʾ v|M,]T'">|آ u f1c6mR)yhq5TFDZ[[c zooo<ڻ$%%۷oܹs%3 ؿ?mT`0x8JKKo>5 "+Wdd_jjwPI  ~آCCCx7PXXh媈nxx?>Q/::?̢͞Ef1 طo  l޼aaa?DD 8|ٯ_nlXэrlܸ/4.;;}}}6>0Hgptth޽{qQ.$"1Fc=6[<==...fAAA $&c鎀6l߾MMMVKN2CƪUl\XjٯokkCee +$ ?\.y(/슈lbhhgΜ1* ~;T*2J͛f{zz1`Hlڴ Ϸh ?ʺkjj2VE\\ܤR*Νh^o4QWW!W6}1X=<<<,Ԅ?JDV`vQhZW$MXX֝J_b Xf ֬YcQzigsf>>>6HBWp ,ၟ|y_ Z2"W}}}fO+ ښk/L cj@~;"^(rR9i0LE ce˖15xiŊ^I>444)!Iyl0b3g/ʕQ*fBlmmECC+ W4}1L&<%KX4^Eddd7ߴreDdoJquW$]iiY@.#((hRc*<+jEQ?8qbR>#>zj]{{G"00kƀJnlݺ4Z__wreDd/>Uh4qU+//GuuYUՈqE899ᡇŒ3,ƹsDdI>}z<EyyyfdヘW51XQTT^z% ػw/[dF҇^KaU{g6%KX܍`e˖-}gb+WFD`…^ Uc .ܹsf;a5T*~_[nyyy/ivKO"+VI7̈́~hooK/d9s`ܹ6jc3ggxȱ;v`beD4݅J:TEdgg^@GG +vF| .]dAlʦ??D1EFF/+**Btt4f̘.DdL~|sεx%JKKm6w@EGG駟Fpp+8`#/̚5kCii+#L&a刏4nhh;v>.UuxPPP`Aիdgl nnn8~϶MMMXb >ȾyzzB!33S dBAA:::777ى?سgٻT*v eOlH&a֬Yt***ݍŋUt#}I4o4qEddd 88Vo҂gy۷o7{L&SO=|Ъ3q\|hDII 9EDfquu(8yE>p #::~~~V< {=I'^FJ0.<=='Oݢk "33III@"3gB!??ߢȽ޽{///x{{}QֆGGfffgxqwϊ}gͫz. 
`޼yصk\MGEEEؼy3 |pu]Xv-n>E HK IDATKáCpi>mܸwӷ`Gx'%O}} >^^^VQq ?Nr9c'P]]:dgg ]]]|/A}C 0b'?Avv>(^xx{{[:"駟FKKDc6V;v;\.rg>>>LJ~8.EEEdHMM<\.Gll,qĉ)a /26n8ag ,, 8ŧp dffBbLDtC  ¥Kƭُ%/b˖-V߂H0A̙3؈˗/[*XE\z111lLD7%8|g蒾{<#l~fc\0qF9rسg/_nh:3< w&Y[@@}Y<@380 㑝=չ(((y` h,Z[[a4'eOVŋ4 ^{5w}pppZCll,N:58DE4L&_غu+|||0{I ˗/GBBjkk2A@Roǎ;b G|0I/1]g޼yxw0sL+UFDjooLJ~7xW^0̙/] 㭷»ヒqgpp0}Q##A"=nhhXr%WӄcA@xx8q&dgg#22+p QQQطo jX` #9D@7&pwW^s_Q-[ѣRD))) ?%顡!cjMd d2'Ny?nWWΞ= Z8%L48}===pttĒ%K sM )B.#..ZϟGooى#G prrRDCCC0{VME %%aaa-;0L!2 111P|=ш,c2N&!22cgdawY8 SV-i ҷĹsP\\Txxxpq Dç~*i 77/Fdd+$w S HHHNcAѦV.#::Eg 4P(deek4 x"^xZR"jooG}_xQSSC!-- & pww/ZOOO ;wFѬ1(+V7 ӄL&Ì30c |QQYYhZDDDMJ("++ A& HOOGFFGGG\dddq́Zѷqӌ(/K;w䄟gOФa4qڵ {%300?q#&&}0݋~j5݋UVYo0 "233v]B;=͛gYbppo^yTUUYvb <#HHHZ}Xh.]$iܝwiF_D7GӐ ʕ+ۋB 0L(//p #q8p`LinnFNNߏ"jxyyyF@P ǎHሏ@)i/\b ":$$$@V[Z"iA@||<<1]KE!77~)_A9u/(//GAA3hkkÊ+an`srr‚ |I JndhhHOO'BCCy;'''T*8qj[!//ǎ ___ :99!88ԬHMM, pttĂ @nn. *ZZ:.\0{hD^^~6"a `ƌ7o***PSScGFFT*yO? \(bt L&Cpp0Z'ODKKRIJe˸ld2aӦMGee0EQ^ÇQ^^hZq #//fBMM >|GEq͂7DQDFF ARR8FV`J%n6!33S9p)8l4 9bu.7ֆ'OԩShooGXX d2/N3>===+WRFDJ‚ |Z푀(hooGZZ;???2Mi4f ^ӧOctvuu?>c,B"::9;&T477Ī7Ltzg{Ƒ#G# p1TVV"//f_d2k׮e *;'|rV^O> iyyyA&/dwvvʕ+OputR7N<)笩 HNN8;;c…HLLDII HSLWW? 
.et%5={RHSRt:@t3#'qX-n,+W"<<eee3IсK.ѣ8wid_T*q1I]K"88߮UVaٲehmmи> -@dl@JJ n888Ҧnb8<΂tCZ%%%Z `ժUpvvڿ...͛pBNgGc4<<"_;g" 4fXh͛6>a4QUUt\.gqvv| 3gDBBw ߎŋeee@WW V\ɿ$ NVVDQᅬ7|ϟ gƏ~#̟?puu&"z!޽[ҸD8q;(v‡~ׯ[W`XbCI*A˗eee{>x ?~.]Boo/m  Ç%M777#44rq뭷B䐭}}}E+V$ Y Djj*6o RZ")Ʉȑ#x뭷PZZ:nٙ 씷7pyFUVVnF1{  … hkkCGGUgDQDii)RRR0sL]?Lwww,] ,B@II ø`4ۋ˗/?DVVʠT*hW( hV耳3-Z$L&nVZ AAAiF ظq#O$1͍4Zt)/^z kF555§~ÇJr Z \#G=#e2PQQ+W"00{{zzb޼yXv-҂▖h4p[ ƍRDDD6mڄ3gB֎{9Oƞ={矣P*ZASNIۋN[nLd2\]]1g(ӢYL(,\k,@bovv6ٳFFT*jpB,^pssL& s={ի6S$"ZZZW_}ϟP(m۶aݺuh4"0Є3L8s v؁SNqKJ$$''c Exx8Lq===xǰgI{vڅd4qy8u2 x衇eI Dp%mmj,c٘7o̙oo.$r RSS%=RTؾ};~جbΝP(uVG?ܹs,@hD]]v9l~"O^LL\h$%%!55P򂧧'<<<¬>0pww ɶmMҶ0\x>>>6˟jkڊe˖a۶m<ƌ&-шJ?~>u X899GPP1c !88xoFMM _k׮ڵkG}}=z=a0닐qx'6ѥPnn.x=FP_įq=<<aɒ%pww=icIoONN^{5BM0|NNNP*P*puuVEHHZ- j8 k٪(#F#DQd0  l0 N^NF477׍|ptt]w݅^z Z2шW^y>G3gݻH40Дb0PXXHKKƃL& WWWnnnPTA`$r(J888d2ahhã_#ۋ}J&aÆ x&ִS\"i֭[}v>)azGzz:ǽݜB}p=Lt)7{(]\\dVFdlDSH{ը(lذ7oV'Zfd2ׯԋZ-PRRb^իW/M9 4-cXj-[Yf7|NMCRaՓzJg8^8űM)|@Ӗ(#G UUU  &&{ERRDrC& ?ϰsNIϟC٤90]Dqq1pc˲g|3fLt)7˗1r۷o֭[mXu1]1LD{{;Ξ=|26M[֭޽{R&2 wΜ9~HU dL&jjjp%dff/_FkkD6 OtSB ;L_& v֭[% G}VGd D7Q]]/"??%%%ס%7r6l'Mt9 bҥpႤq> y6IL( hmmECC󑓓cs @EףeeeDuu5ZZZڊI_>rxyyAV^^^jٳqmM)ݍ'|oMJ… caT!ldxxhkkCss3QWW:TWW555hhhFh4j Bxx8|}}GZY*''˖-COOw&""۶mȆ@4%07``` @cc#Z[[ىNȏɀP(v"Zj5<==@hZhگ4/{!"~i_Ɩ-[eh4BSߏѝ#k ς @.CP|rP*pvv3{/[$"_W/RRRm6,X]̠qE,]...]ј1!X!""C DDDv1!""";@DDdb ""C DDDv1!""";@DDdb ""C DDDv1!""";@DDdb ""C DDDv1!""";@DDdb ""C DDDv1!""";8IDAT@DDdb ""C DDDv1!""";@DDdb ""C DDDv1!""";@DDdb ""C DDDv1!""";@DDdNG1[IENDB`pydantic-2.10.6/docs/sponsor_logos/fastapi.png000066400000000000000000000340551474456633400214350ustar00rootroot00000000000000PNG  IHDRx pHYsõqtEXtSoftwarewww.inkscape.org< IDATxgt\WKuYŖdˎ{-W0@B(yBz!!!Լ@  !e˖܋dYVo3~q3ky-ڇ \ Juz\ Qp! .D(B\ Qp! .D(B\ Qp! .D(B\ Qp! .D(B\ Qp! .D(B\ Qp! .D(B\ Qp! .D(B\ Qp! .D(PuDe%&)->A]bcQIRǣXIR8FGUVIҁ&~|ii$շE ? 
rQ0扊R.꟒iJLRFB?q\-ͪhlPyc=u5V]5Je !}(8G#e*kk@J>]R~+u]b%6NRNkiѶ*m>XUWiku *JJp, B]4[rg+7FtnJ8brlFuҨnY{5-TUr-/+Ѻr.S D#p z$%kzvo[5&%$Z {kTKkAn-WXP=hFޚٳre*:x~m_%dԎֱG:hLFwwNWӲOS}l9_ ֧{v[:q(@%jZvo]0L}SR#rHޮy;ZG8t]oNWq!|SҢO[ۋ][c K({ՃsA9L/)_^޼^/o^]֑A$얥9!#5$uIae@XZ8) \??W]3 |~ݣWߋ DWU'atq:€;VYGBW7F׵CG)96::y=nVWYJOѹôvzpm‘(pٽ1jސgW_6ѯ G&Lӌ|~ݱE/[F@Dxtatv#d+E\iE@DJL-su\e&$YǁKTꩂˆ5z[B@D%UwNחs(c$VV"@@DJLMmc'ÏUPZ_a-#!Q??U3QI1ȰZ?ZP\(K)qyt~;]iqq^s@?^HW- a%)&V8@@V랼z}F(aKѓV4(@P|g#./P`oBV~9<: t>_/lZ,q ! Q`&#!QM[FO'*:RUMzlbrR5q PrѺyt|:z*훭e({μPӺZG[ۋtp!C@H'֍#NjշzbxmAF@]~s%: ݭ?yWQ`MvR5[sXG"Nϧ}K穙M b=}HHDRɻWlC@@?s3: ^_O\檅 W_:?@ytiZnqtZbL~5<{j_mc'YGp::%7+[/̾\3_A__IqA(ho_~z&u0UKUXoNW|US{fSJٳ/μG8FUS~zu H%E^yn1: "'էK]u3: OWiSu 虳._νL1<}+Խu=::a/%: qSר{"pb?YX5-1p//EDE>[?qu Iz|>ӻu%|vu DuezyzXywu$=9k4: "K[@$P08cַp/:Toh!EYGAQɳtDŽ1^2߅<ztFQ$u$??9E1`$1&Fo^η054޹ JND̄$yuO`V+ialr^z# %譋h0y#`Ga@PQQz5ۻ@oo߬%X@+ww @sta1 ~I3}5[`@@7Ny}F嗖X@֫\1ݭ(fvz ca|~?kԸxy^07%U/̾\v"^޼^1_SDO^s4lC cݟ7" BԸx(8 @HHϿR hrǰ =Ka+zAUV]evu$8 @):ӵ$V=56C ꡇ'* wkWjwmu @ǣϻB11Qp @I\8$ 1"#eigYQ(ag3hF=ULu123IlG7GZǀն4+c$#): $=wΥmӯV竴:\{bt%|υ @x싕M3FhnW/ 8ru 3VbUKU`g'a`GR~u 8\US~&: IJӜh`Y-!:'+:p}ڡcE scԚe1cjyJLZ#)q--_f12Xp- O;K}ZǀkWXNa5{OLP LK5:\ Ub8gκH]bc <KQݵ5U1Sꗒ&ϲ:I-:\ mmc&iT,Bɺ+wu 16/fkU(!g(-.:\jzcrn A՜1_6X:3Ul4?M!saBjc2,=C7ZPUAh.q_<1É^`*=>:\`Yiݱ:=NH LHҷ"w-Дӭ#8 ̝D,,٭vmy}^}c8 Qc%ɛkM區@s+)&:\_=;c!1gݻu GPd}cxpYGBI3#8  nOu {;h1:W_M\HRL,k[GL|gdA>۫on+}1sh`ju G@[LK)kp/OT .0TûfXǀ eVﳎq޸)>_/KӍ#YLjxNؽfc.RQTZmc')㱎(qHEy~=ӒS91"2t1XMU1ru 3Z4(uC耯6\u--zMtB6&dek|Vp_,׾:@1Q(tp-%1ҧKgQ(?/V/UYCu p4pPaJ^l4APhn/V-D9sa(0v @ǣI=xUPQ׶lrz*csO\ <}'@8={[G,/۫Y\%3!IC3c% LϦ ͛+f 莏pSOYw۱:JǐnNνys#E8> q?hA.kvk MPVbuC8"%L(IX=8 P>ڵMswZ\oF6pĘ iq_<$a!P2[_pohmFf=JJapw4:-:F=rX:{6:Nbt7ߏ=q^Nn788du !t61bMhNn-D^_?Z:BlpZW}?"$jPZWapa]3 XUa!v酪Ldx:RҬ# B|Ѳ1bҺ!#c[G#烁zpVWY@;?Oe>DGEOT@M^]:BlPZW];?IaCz%wQcu+:B잉3#K{ tDUXl!6(ʡ?P:˵:B\HaO!Shhd{u~)iGlD8h'䫴:BI3~w~,Uy1bRu1I,D8$;KvXT 1bwP,k(e$$YG@jjԯV[@KI\ n EtU75Z@3qk' k!F[T46I雒Xw D+:B잉wD% $ fo}~]:BoJQ? )3 c+:B\[|C h8Z=nu XߔT}yh #p5Z@X ~IJ0怞]:BoJX]y@Ou-Pk! 
ә,!} xدl(%U_kA>@_:B쇹 )h8JX:BOT}YH|p|-V׹3wӡ  IzOe]e^޼:BgR}e8ӿ q[@=q |S$VPQ7mԅa %|BwM" ~Iυ$ib}u X.q1B1\/fvz{ZG;'LDPiO8E{w]۬c z&1Q˽(}H,aw;sB|S$!/ޥO찎NJ֍#[ǀ&R$ւ ڿ+9nEH }sﴎNJ֍#Yw+~Q$v/go4%Zǀ? $Hyc IJS$V~I/w0O(0֍/-둔]} $:Bc2C |SDp7ךR̄$!} I**^_J7aYǀVD$U4[G@i/LH`a|II#Zǀ;qijmLH7qQ$5z[Uzӵ|zt"0LLyK0pUWY@e$$ꛣ&% al qFo~J??U)L8R\WkA۵+:B,#!Q7ε0SRD8l{ UO\b`) !;jXG@8)Pzu@YC*Xn7nc  UrGk+c]P\6iqhvVﳎVbmമQMZQWYQv emGeME) BkU7X 7}Y:FHVGapvFk~?("u D߯+c QxDDMVWl> ([W]  1=x{, Q|~V﵎3m:Bء’Dj\n{XX ξ3v?ڬ2pJXG;,/۫VapT"@8m$uKHduD8֋vZȲqQN[LdG-ٻ:BX/]bv+z> lPY{nSmc')3!:" Op~I 88>vu Dy;#- I|su=f2XG;%}suE8wwl^rlXS^=u51$vVkr:fYG!d(O3ɱg(!5-r SWS-1I1ڵ]^uF8&Wsyo1X?5 @e cc]t˷Fhwvlz` n[WY5c= @lҲ+$X@{yzFB "^**dtΚRpk# @}1Gyt ێ|I9|O^2,7UϤ.1VUQUuAhݵ5Zђ@%xb=tۇN<^&W2?:-o @;yZ}>c$ Ӭc +cD @;=Θ榑t WYG8:>h@ $xbxVҬZLj8xoDFi)1^**T ׶Z}>=~u %xbScx1zvj t7F2V)SZ;A۪ۭc);&0ؓqN=q8tZ׫7(ƶM*kDxGwtk[6tBSn1^0~]:BDtғkmDxGǩY:IPmXl!Qg+&88۪mcD< @,Ӈ;ZsƩ_Ju 8V9,Fʳ8 @vmӪ}1L׮P- ~zu FGv#^^ @Pk%8FSת:cPՏ/hG@|ztBBԶ*@XjXׯV[`->O1#p^=jQ k~Հtpg Wj{CO/[`0;V=c$o\ +c&<| ?7˵$^ph0wu 8@]K~ru ǢKEZWYffG@ZCһY@kzu[p @=vZ {_WQUu נ:P L}uz~Cbܪ7'*|&B0p[LwnߊYp }u"CF047dl,KE1ͩ褛羯5c0͹kQF'7e̜9:" 7k[IPF]ޫom!k*JO޳SD奺S2!Z" B?8GcN/[wvl6DO_uZGEGE^7|n5$Fj[8ʲ}҇oeVI(ILiVP([tca4\c365-ͺ헴: :~n\o%^5QAwM1W6uއz{;;# ~]Wl,J]G=b1:Z[u廯r:CmMtגOc (QR_xA;jXGA9x??NmEu?8Ȯjߴ: "jb1{u|Q )7(L8%{w_U: @9p `ɭ9}A?Cp}źT•81nl:DqSu֕ェ&: eh?N`gM{EcOcQ]t{XV]x^;k (.--3g_YVWYGAQ\=;t[Wusu{'1X+ןמ:.q ,(٥xAQ`q*k-(.lxyZGn9o!.Cpr B=zY@yo]Kr! 
KmU* cGxPWZyKȍ(.J^ﴎ |]?9!Ep~ozq:(&Ͳ0ut(P׫>̟oAraAz-ͺW˭ P IKz`|}wӁ%:Q&(/%os`\wfmViiy^(#]t&;kގ-_cPp\+i]۬;PӲO~IXKyYx*t[/bpdg"knTҬϿ\ ;qBKw_a0 ̞}c 6ДW?n0G@HS^yNQFps&My9SmBS_}Nm`pnՋ-~\|UAEGX BDI}=:L{_QQ1;j]QaxvKR}ku1د3_^H^u8KjvQ7k%sUlC@Pߜq2eMEw_Zb5o_7DFo̟IO2ϼ@}ZG҂]wau!GFWL4I*k]K>ճ8S!CasuaAQ3->~^2Wq2b: Rڦ|eQRͣsӕoJݝ7Wl^o.G@LHҽf28LmK~*O?^HM0@@CM9]b`_u~ tę`(@5{zn=?_uqcPgSNٽ[GNKEz`zd:W_(1|~^۲Aw>D " S) M^^شVO\ ~(DY=t)Ґ*nns艕K:nD)iQ&(sdkkWwV:a8Fj\3F??U%Xǁì*ߧ_Z֩糎trtq:W_q:۪W6o Wj~.8@@QhCһk9c1ꞘlbryC-\=p, \!}atՠ5c4z[ֶ"=SJf: \g`j3V q`/iQnucVT&H@Pj#ein($vreqo0A Gr+u~_شQcDGE^}9 KIZQWon+˛kcG005]Ky}(㱎hlпwoǻ훵={uـszk{>޽]sP mB:ax ޫfٽHd>V’ZPKw:(@HJ4goGdɠj[T Jvian/@P J4{OY5SqcT55LZS^}*(W  bշKFvRnl蚩ݲ-|Y[r-/+ѺrVL|CD'Fh`ZW LMOZW NQWoj*m=_[TR_Zm=_*TbpjѬ:)iNꢌDe$$*31I LH:OLtt5A *oWYCٟz*~@"DYhy'ૌ٘%6N-͒ږ:pLV5z[@wI\k !\ Qp! .D(B\ Qp! .D(B\ Qp! .D(B\ Qp! .D(B\ Qp! .D(B\ Qp! .D(B\ Qp! .D(B\ Qp! .D?-ӳ6IENDB`pydantic-2.10.6/docs/sponsor_logos/jina-ai.png000066400000000000000000000242541474456633400213160ustar00rootroot00000000000000PNG  IHDRXۘ(sIDATx U}Ϛ9p#EoϢXT$*"B9, բbsrA@-U BI39R${\<ȞZf-O98 0p@` &L8 0p@` &L8 0p@` &L8 0p@` &L8 0p@` &L8 0p@` &L8 0p@` &L8 0p@` &L8 0p@` &L8 0p@` &L8 0p@` &L8 0p@` &L8 0p@` &L8 0p@` &L8 0p@` &L8 0p@` &L8 0p@` Ȼ V)|e4[̔ddRky^Y-UO}?rK4:] {`"2ѩ8NM3גgt^ύ9̰lB`%ʚW2TV$EIb)amzh#f! 
TJkRZ]$ Q9=uiqw"V󊕏D/~B\W5K ;'lݢ\iNVv'b.GyIrw^'KpH2/6Ǭٙ`o@3Us;;O١oْ* >-P-&J .Q?[sKFVn>f^~]L=|o"u)1,'#L)i^ry`ϱJ+-V#,9^a"v>y`(GUՒ*({eIO@1$R?=^nyײKf3K,p=LN̝5T\~FGw|}&ʧuZ)0h~޽|F=&Jc%p-wJ9U)y`( k<o~afNFLLOן,c^)c GlP RL4yjz'c˻G`}//?ZI^G%(0lit(ydߑw)&DY+;\]L]yu: @Yp¢Y>]]F$؃vay#kDR=,;C~ey͔졊Gw)(HJ>gzdynuR.Vli,TQ0Y}SepG`N5T$1نͻL̺tL$cD}(J#0Q|as+3S6 ke.;ŷn&NsMrl#?ʻ*›CEP2?3g,P"&v?}XAdӼ-y`rJ`xh#2I@(dXok,Rдy_5 ɻC`<^#'Lcٺ:2L0/l"Q#.,͉] +a)>%}[=#Uޥ=&Jż!Yrvsi2 <>w 1'\mIR,]Z5'rrdLIo(e ssC2V g\Z%uy`9tZEzq=hZX' c^K"Eb̜FOJz6/ʻ${ S&dj|Fw"+!;mdKk$1a^ תf@vc*S[n^~:wp@`R%$ISSYgܣ{XkdE; $o$Z]^jR@mO zI*Deٟ?[Y2%{zj3ԌV.S|( om]oNX}_;kP<&*޽|eeѴZM^g4z&Ƥ'Wm/ϬԈk & k<]w%{B-56+ 8xPsѐòz@֬U`(Ϭ6L:Ĕd.I%iZ 2fWRFeMoxV&L8 0p@` Ȼ@\`Y;C7]ZٮFcXfm…SnVn #h/c%c"5i}~ Ǎ7NW9c1c.{xddU`m$ۡ("i|-y%Z^P(k"IK-BہhV_ʘ{9264U丹 d홒 PQQwOZO_x@ƼEQ^X:.W̵H:CRO2ڔ61Ʀ*J$D߭6nxBҿixQ=d/[tIHc^#c^(z'kk#䜚}&\aMږR6,Gh\'k4<|fǏ/tktBX /ڶ턎{lwL/:~IM)^+߿Mcc_A_j͚z%VuV1'$N'RF|\{jEѿB]y:[H{0K6+MWyF0UE܀s=2洤ᬂ{(/>豣h<7{* ӗ 8QcciAZ/:,P_x~{@/MΙ"\Ol}0'%?S}[˖f 34-;t Ӯnګ # 5ۘ׋߻|6@xZEGtG]Z_S_7C 頃.3|+31)Rkߞ|!0TUhvJ_שޥN2zR##ߖoQu>(.<2w__g]WW,Ee 8_]73vK4d.#&54kևWu76F 62jT+WɻIY.cί61l%:9#0hdNc]|ҡ> HApIz[[ISsy*4?-[ޞw9fDLL,$1׬wIiժu=oNigկU[, :5crzj&BmzyGV({ij+Hg-c>UUnbBaMky󌮻nZe cl{HD^%VD|uEr +r0||ɾSsͺEy'/*{8M*cOfyj`NRdʈ>B~N{]#$Q4[>tEco^@`bM.{Qvkj@۷%k?BX>toa;y0TDA`:=MrS_?2kywuf8$dfTK(k{5[pe+_i'ROϱ:$i&Ւ|$%Nw(_axrr̬4MKIA%0'ȘOhPcVAzwnE`f=ZD5 p|70<]"evu%cNĨ1jd1#TWkyzz1y{|)ڱUIE:>ɛ3DuEG.Ƚlv]e.174D`^52oNJ9LMVlJo}kF`f%yqN=LT[A95M >ñ{"]-3 01y[p)K{|g+k;FQ_f?UòVJMfVfUX: -'jdvF$Is+VtJj=2fn!01YVO<ٍF}'cX :8CYjÌ-X=LLy@7ݴWyښV6sò<fi g74hxmd)<֎mtX6GKLص'xJ{Y.$<;CY,0k2xCYl͌Y!0j0&+n\It=^v%Pm'ý0' ̬ZUarvc*1~3]vd),)dj{tz4==19z|?yoߚ VV߭&9KApxsTQ-f^]۬81āimm&ƌHj9Ð,z[-6.3+̒YÒm|?dJUג<4)(+ =7Ș0˃dj2}kᣲv;YfV;$=LLU޶wZ'i[#0RՅ hE6$I4LL C:aZڔϫec:*e0J?@;.c`rWJ{)z@׏Șy7&&~kICa8ʰlYa%sJI_w.XP!ڴ̙F;vXO8;i y^i[tu23*kCQѲelN ihhzzf՚iF54Slc=l{{%B##Tm5]_A)1/vk/iǎK:$R ݢe8LApUWʘ UàAlZywڥK(z ϾpkzJX;w @q&(c>Zv" vd+EoUAp( F퍌9Dw|j4vٲo( <퇨_1;.j4Vy 1Q3QZaE)㓞b=|C˘Kܡ( Qa/}\ڥ!S%k; ÿOW[-mxyA{]oAdM YȠ %KWQ֠|&qhFy$3Ӿ󝿲߾m{+Qq:+9ar 
Lt.>e#s%/udecyTV__$cf_T] &ﯱeRYWyHC. o^|SH)W3]PiKl_VkCŒ˯a.=Lw&cBe˻dբEwKaeW +LsBREcyQHƌJZw@i̿U]I,]d^NIvf[|kÒtgl]f!B:/(c ð2vɒ}.P[GUŋ!S/~Kys*tc.i-qғ㎂x@,Y^Q4|u$0EITw1$]K_w9`mz{?-kWH&4y`%Kj ^,HPqY{V2Qi*O(O3f3< nteANaaܻ\!kn9ͻ\;.Z:/-+4;/o{FApwޥؤ(y!vGpᘌi~TaȺ&mydmZe wݑgT3(:E* ,|PYsZ_=dMUAkM6w3 … j_R*wTw}c+DC r}S^w)mŊZ,>15lQ=-,۲@:I:[Y^we0ȘLUMTEhѯuK%]!c^iNj7;Eܹղx/tէV7ŗHvi?;>AjTҵx*g1yݧ}IvV-Etp 1Jɻ]K)Z׿z<6!yQC_w!OcbtMK4EgLƍwdqiٲ|gXO$鍊$Ml¿3…%fiZJfIi͚[t=F-0Ĝ9V6EկIWIږd0K~(j5%d{nZ۩@f>dffYRӧkV|s;C^{[l~E^;M3 3Pҙ.RjflkK6~}CIr~Po6#I('Ik{K@g&051vh 2=NO-vM n3_#vO31~Žʱe?HGY{z{˽-Xs]t?Y"E29=ɇgUL_]sbQCшdvyޝ2f٨sݑY' vZIs ^Iߓ^`>m& ]ӘF@tW,y5Oh!sh5&Ё9yڼ@##(^0gΜAss3ktՈsZ~ɲe˖ ??Ag}]8!H !uuFH&9ήR]?Bih=D<Ci T“@HP5Iq1a~Dq1ZZ[D(E&2|~?p8?p5ɲeK J4 xxH h8mmFH$W2͘>G6u֭9+t:qׇ?l:L9W5Y1d̚5 gİÑgUUEKk+=~ f97pCNٗoٓsN4 Ə7}|mm-z>r 2{sq 8y:vaǎhkkDz򏬇ڊPxܸ$Ip\(+-E K?$!3 h ipyP~;wqYB!(d2`^p8e#\ja,(ZQSS'Ob޽xo9s&]qǪU}7z=|߮Jp\y\|CI ˹iD$ EE(>3OG4͛￟иcGɹg}6@$|1i$hnnƮݻԔʕ~`U^^& cժUرs'^}5ܹh/@f `[ I(--׏ô2t55˧mk5 {E??d0{`ʕ(//Gu4b1cϞ=عs'jjjŠ*B"^//|3f ƌeK"HԩSo P[[kZt:paذaF~Ox[?O_RF|sεRO1}:JKKsr.ǃ \!??_zQ<=ǥmmE,>tƍ f2 ~FӼ$Ip8 _Q!լ#kE7sd܄@*1￳j>&Lի|rAEQЈzq$I2y2z)ӣȲswؓ o|K.Kf !lҜ-r 8{__)cXhXun,“Y;N8N\.nx<lĈT᮪p*\~SUx7WNu@Uvrt:!r0}K.Žދ뮻$A4Ck0++~U }$I=>@?+^/կb9})פx9=JL<9紪{K;עEQ @II V\3g"/?Cv!  //(;]( 7wwUG!//>. 
pydantic-2.10.6/docs/sponsor_logos/salesforce.png
(binary PNG data omitted)

pydantic-2.10.6/docs/theme/announce.html
What's new — we've launched Pydantic Logfire 🔥 to help you monitor and understand your Pydantic validations.

pydantic-2.10.6/docs/theme/main.html
{% extends "base.html" %}

{% block announce %}
{% include 'announce.html' ignore missing %}
{% endblock %}

{% block content %}
{{ super() }}
{% include 'mkdocs_run_deps.html' ignore missing %}
{% endblock %}

pydantic-2.10.6/docs/version-policy.md
First of all, we recognize that the transition from Pydantic V1 to V2 has been and will be painful for some users. We're sorry about this pain :pray:, it was an unfortunate but necessary step to correct design mistakes of V1.
**There will not be another breaking change of this magnitude!**

## Pydantic V1

Active development of V1 has already stopped, however critical bug fixes and security vulnerabilities will be fixed in V1 until the release of Pydantic V3.

## Pydantic V2

We will not intentionally make breaking changes in minor releases of V2. Functionality marked as deprecated will not be removed until the next major V3 release.

Of course, some apparently safe changes and bug fixes will inevitably break some users' code — obligatory link to [xkcd](https://xkcd.com/1172/).

The following changes will **NOT** be considered breaking changes, and may occur in minor releases:

* Changing the format of JSON Schema [references](https://json-schema.org/understanding-json-schema/structuring#dollarref).
* Changing the `msg`, `ctx`, and `loc` fields of [`ValidationError`][pydantic_core.ValidationError] exceptions. `type` will not change — if you're programmatically parsing error messages, you should use `type`.
* Adding new keys to [`ValidationError`][pydantic_core.ValidationError] exceptions — e.g. we intend to add `line_number` and `column_number` to errors when validating JSON once we migrate to a new JSON parser.
* Adding new [`ValidationError`][pydantic_core.ValidationError] errors.
* Changing how `__repr__` behaves, even of public classes.

In all cases we will aim to minimize churn and do so only when justified by the increase of quality of Pydantic for users.

## Pydantic V3 and beyond

We expect to make new major releases roughly once a year going forward, although as mentioned above, any associated breaking changes should be trivial to fix compared to the V1-to-V2 transition.

## Experimental Features

At Pydantic, we like to move quickly and innovate! To that end, we may introduce experimental features in minor releases.

!!! abstract "Usage Documentation"
    To learn more about our current experimental features, see the [experimental features documentation](./concepts/experimental.md).
Please keep in mind, experimental features are active works in progress. If these features are successful, they'll eventually become part of Pydantic. If unsuccessful, said features will be removed with little notice. While in its experimental phase, a feature's API and behaviors may not be stable, and it's very possible that changes made to the feature will not be backward-compatible.

### Naming Conventions

We use one of the following naming conventions to indicate that a feature is experimental:

1. The feature is located in the [`experimental`](api/experimental.md) module. In this case, you can access the feature like this:

    ```python {test="skip" lint="skip"}
    from pydantic.experimental import feature_name
    ```

2. The feature is located in the main module, but prefixed with `experimental_`. This case occurs when we add a new field, argument, or method to an existing data structure already within the main `pydantic` module.

New features with these naming conventions are subject to change or removal, and we are looking for feedback and suggestions before making them a permanent part of Pydantic. See the [feedback section](./concepts/experimental.md#feedback) for more information.

### Importing Experimental Features

When you import an experimental feature from the [`experimental`](api/experimental.md) module, you'll see a warning message that the feature is experimental. You can disable this warning with the following:

```python
import warnings

from pydantic import PydanticExperimentalWarning

warnings.filterwarnings('ignore', category=PydanticExperimentalWarning)
```

### Lifecycle of Experimental Features

1. A new feature is added, either in the [`experimental`](api/experimental.md) module or with the `experimental_` prefix.
2. The behavior is often modified during patch/minor releases, with potential API/behavior changes.
3. If the feature is successful, we promote it to Pydantic with the following steps:

    a. If it was in the [`experimental`](api/experimental.md) module, the feature is cloned to Pydantic's main module. The original experimental feature still remains in the [`experimental`](api/experimental.md) module, but it will show a warning when used. If the feature was already in the main Pydantic module, we create a copy of the feature without the `experimental_` prefix, so the feature exists with both the official and experimental names. A deprecation warning is attached to the experimental version.

    b. At some point, the code of the experimental feature is removed, but there will still be a stub of the feature that provides an error message with appropriate instructions.

    c. As a last step, the experimental version of the feature is entirely removed from the codebase.

If the feature is unsuccessful or unpopular, it's removed with little notice. A stub will remain in the location of the deprecated feature with an error message.

Thanks to [streamlit](https://docs.streamlit.io/develop/quick-reference/prerelease) for the inspiration for the lifecycle and naming conventions of our new experimental feature patterns.

## Support for Python versions

Pydantic will drop support for a Python version when the following conditions are met:

* The Python version has reached its [expected end of life](https://devguide.python.org/versions/).
* Less than 5% of downloads of the most recent minor release are using that version.

pydantic-2.10.6/docs/why.md

# Why use Pydantic?

Today, Pydantic is downloaded many times a month and used by some of the largest and most recognisable organisations in the world.

It's hard to know why so many people have adopted Pydantic since its inception six years ago, but here are a few guesses.
## Type hints powering schema validation {#type-hints}

The schema that Pydantic validates against is generally defined by Python [type hints](https://docs.python.org/3/glossary.html#term-type-hint).

Type hints are great for this since, if you're writing modern Python, you already know how to use them. Using type hints also means that Pydantic integrates well with static typing tools (like [mypy](https://www.mypy-lang.org/) and [Pyright](https://github.com/microsoft/pyright/)) and IDEs (like [PyCharm](https://www.jetbrains.com/pycharm/) and [VSCode](https://code.visualstudio.com/)).

???+ example "Example - just type hints"
    _(This example requires Python 3.9+)_

    ```python {requires="3.9"}
    from typing import Annotated, Literal

    from annotated_types import Gt

    from pydantic import BaseModel


    class Fruit(BaseModel):
        name: str  # (1)!
        color: Literal['red', 'green']  # (2)!
        weight: Annotated[float, Gt(0)]  # (3)!
        bazam: dict[str, list[tuple[int, bool, float]]]  # (4)!


    print(
        Fruit(
            name='Apple',
            color='red',
            weight=4.2,
            bazam={'foobar': [(1, True, 0.1)]},
        )
    )
    #> name='Apple' color='red' weight=4.2 bazam={'foobar': [(1, True, 0.1)]}
    ```

    1. The `name` field is simply annotated with `str` — any string is allowed.
    2. The [`Literal`][typing.Literal] type is used to enforce that `color` is either `'red'` or `'green'`.
    3. Even when we want to apply constraints not encapsulated in Python types, we can use [`Annotated`][typing.Annotated] and [`annotated-types`](https://github.com/annotated-types/annotated-types) to enforce constraints while still keeping typing support.
    4. I'm not claiming "bazam" is really an attribute of fruit, but rather to show that arbitrarily complex types can easily be validated.

!!! tip "Learn more"
    See the [documentation on supported types](concepts/types.md).

## Performance

Pydantic's core validation logic is implemented in a separate package ([`pydantic-core`](https://github.com/pydantic/pydantic-core)), where validation for most types is implemented in Rust.
As a result, Pydantic is among the fastest data validation libraries for Python.

??? example "Performance Example - Pydantic vs. dedicated code"
    In general, dedicated code should be much faster than a general-purpose validator, but in this example Pydantic is >300% faster than dedicated code when parsing JSON and validating URLs.

    ```python {title="Performance Example" test="skip"}
    import json
    import timeit
    from urllib.parse import urlparse

    import requests

    from pydantic import HttpUrl, TypeAdapter

    reps = 7
    number = 100
    r = requests.get('https://api.github.com/emojis')
    r.raise_for_status()
    emojis_json = r.content


    def emojis_pure_python(raw_data):
        data = json.loads(raw_data)
        output = {}
        for key, value in data.items():
            assert isinstance(key, str)
            url = urlparse(value)
            assert url.scheme in ('https', 'http')
            output[key] = url


    emojis_pure_python_times = timeit.repeat(
        'emojis_pure_python(emojis_json)',
        globals={
            'emojis_pure_python': emojis_pure_python,
            'emojis_json': emojis_json,
        },
        repeat=reps,
        number=number,
    )
    print(f'pure python: {min(emojis_pure_python_times) / number * 1000:0.2f}ms')
    #> pure python: 5.32ms

    type_adapter = TypeAdapter(dict[str, HttpUrl])
    emojis_pydantic_times = timeit.repeat(
        'type_adapter.validate_json(emojis_json)',
        globals={
            'type_adapter': type_adapter,
            'HttpUrl': HttpUrl,
            'emojis_json': emojis_json,
        },
        repeat=reps,
        number=number,
    )
    print(f'pydantic: {min(emojis_pydantic_times) / number * 1000:0.2f}ms')
    #> pydantic: 1.54ms

    print(
        f'Pydantic {min(emojis_pure_python_times) / min(emojis_pydantic_times):0.2f}x faster'
    )
    #> Pydantic 3.45x faster
    ```

Unlike other performance-centric libraries written in compiled languages, Pydantic also has excellent support for customizing validation via [functional validators](#customisation).

!!! tip "Learn more"
    Samuel Colvin's [talk at PyCon 2023](https://youtu.be/pWZw7hYoRVU) explains how [`pydantic-core`](https://github.com/pydantic/pydantic-core) works and how it integrates with Pydantic.
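Such a functional validator can be as small as one decorated method. Here is a minimal sketch using `field_validator` (the `Emoji` model and its field are invented for illustration, not taken from these docs):

```python
from pydantic import BaseModel, field_validator


class Emoji(BaseModel):
    name: str

    @field_validator('name')
    @classmethod
    def normalize_name(cls, value: str) -> str:
        # runs after the built-in str validation, so only this one
        # normalization hook drops out of the Rust core into Python
        return value.strip().lower()


print(Emoji(name='  Thumbs_Up  ').name)
#> thumbs_up
```

Only the hook itself runs as Python; the type validation around it stays on the fast path.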
## Serialization

Pydantic provides functionality to serialize models in three ways:

1. To a Python `dict` made up of the associated Python objects.
2. To a Python `dict` made up only of "jsonable" types.
3. To a JSON string.

In all three modes, the output can be customized by excluding specific fields, excluding unset fields, excluding default values, and excluding `None` values.

??? example "Example - Serialization 3 ways"
    ```python
    from datetime import datetime

    from pydantic import BaseModel


    class Meeting(BaseModel):
        when: datetime
        where: bytes
        why: str = 'No idea'


    m = Meeting(when='2020-01-01T12:00', where='home')
    print(m.model_dump(exclude_unset=True))
    #> {'when': datetime.datetime(2020, 1, 1, 12, 0), 'where': b'home'}
    print(m.model_dump(exclude={'where'}, mode='json'))
    #> {'when': '2020-01-01T12:00:00', 'why': 'No idea'}
    print(m.model_dump_json(exclude_defaults=True))
    #> {"when":"2020-01-01T12:00:00","where":"home"}
    ```

!!! tip "Learn more"
    See the [documentation on serialization](concepts/serialization.md).

## JSON Schema

A [JSON Schema](https://json-schema.org/) can be generated for any Pydantic schema — allowing self-documenting APIs and integration with a wide variety of tools which support the JSON Schema format.

??? example "Example - JSON Schema"
    ```python
    from datetime import datetime

    from pydantic import BaseModel


    class Address(BaseModel):
        street: str
        city: str
        zipcode: str


    class Meeting(BaseModel):
        when: datetime
        where: Address
        why: str = 'No idea'


    print(Meeting.model_json_schema())
    """
    {
        '$defs': {
            'Address': {
                'properties': {
                    'street': {'title': 'Street', 'type': 'string'},
                    'city': {'title': 'City', 'type': 'string'},
                    'zipcode': {'title': 'Zipcode', 'type': 'string'},
                },
                'required': ['street', 'city', 'zipcode'],
                'title': 'Address',
                'type': 'object',
            }
        },
        'properties': {
            'when': {'format': 'date-time', 'title': 'When', 'type': 'string'},
            'where': {'$ref': '#/$defs/Address'},
            'why': {'default': 'No idea', 'title': 'Why', 'type': 'string'},
        },
        'required': ['when', 'where'],
        'title': 'Meeting',
        'type': 'object',
    }
    """
    ```

Pydantic is compliant with the latest version of the JSON Schema specification ([2020-12](https://json-schema.org/draft/2020-12/release-notes.html)), which is compatible with [OpenAPI 3.1](https://spec.openapis.org/oas/v3.1.0.html).

!!! tip "Learn more"
    See the [documentation on JSON Schema](concepts/json_schema.md).

## Strict mode and data coercion {#strict-lax}

By default, Pydantic is tolerant to common incorrect types and coerces data to the right type — e.g. a numeric string passed to an `int` field will be parsed as an `int`.

Pydantic also has a [strict mode](concepts/strict_mode.md), where types are not coerced and a validation error is raised unless the input data exactly matches the expected schema.

But strict mode would be pretty useless when validating JSON data since JSON doesn't have types matching many common Python types like [`datetime`][datetime.datetime], [`UUID`][uuid.UUID] or [`bytes`][].

To solve this, Pydantic can parse and validate JSON in one step. This allows sensible data conversion (e.g. when parsing strings into [`datetime`][datetime.datetime] objects). Since the JSON parsing is implemented in Rust, it's also very performant.

??? example "Example - Strict mode that's actually useful"
    ```python
    from datetime import datetime

    from pydantic import BaseModel, ValidationError


    class Meeting(BaseModel):
        when: datetime
        where: bytes


    m = Meeting.model_validate({'when': '2020-01-01T12:00', 'where': 'home'})
    print(m)
    #> when=datetime.datetime(2020, 1, 1, 12, 0) where=b'home'
    try:
        m = Meeting.model_validate(
            {'when': '2020-01-01T12:00', 'where': 'home'}, strict=True
        )
    except ValidationError as e:
        print(e)
        """
        2 validation errors for Meeting
        when
          Input should be a valid datetime [type=datetime_type, input_value='2020-01-01T12:00', input_type=str]
        where
          Input should be a valid bytes [type=bytes_type, input_value='home', input_type=str]
        """

    m_json = Meeting.model_validate_json(
        '{"when": "2020-01-01T12:00", "where": "home"}'
    )
    print(m_json)
    #> when=datetime.datetime(2020, 1, 1, 12, 0) where=b'home'
    ```

!!! tip "Learn more"
    See the [documentation on strict mode](concepts/strict_mode.md).

## Dataclasses, TypedDicts, and more {#dataclasses-typeddict-more}

Pydantic provides four ways to create schemas and perform validation and serialization:

1. [`BaseModel`](concepts/models.md) — Pydantic's own super class with many common utilities available via instance methods.
2. [Pydantic dataclasses](concepts/dataclasses.md) — a wrapper around standard dataclasses with additional validation performed.
3. [`TypeAdapter`][pydantic.type_adapter.TypeAdapter] — a general way to adapt any type for validation and serialization. This allows types like [`TypedDict`](api/standard_library_types.md#typeddict) and [`NamedTuple`](api/standard_library_types.md#typingnamedtuple) to be validated as well as simple types (like [`int`][] or [`timedelta`][datetime.timedelta]) — [all types](concepts/types.md) supported can be used with [`TypeAdapter`][pydantic.type_adapter.TypeAdapter].
4. [`validate_call`](concepts/validation_decorator.md) — a decorator to perform validation when calling a function.

??? example "Example - schema based on a [`TypedDict`][typing.TypedDict]"
    ```python
    from datetime import datetime

    from typing_extensions import NotRequired, TypedDict

    from pydantic import TypeAdapter


    class Meeting(TypedDict):
        when: datetime
        where: bytes
        why: NotRequired[str]


    meeting_adapter = TypeAdapter(Meeting)
    m = meeting_adapter.validate_python(  # (1)!
        {'when': '2020-01-01T12:00', 'where': 'home'}
    )
    print(m)
    #> {'when': datetime.datetime(2020, 1, 1, 12, 0), 'where': b'home'}
    meeting_adapter.dump_python(m, exclude={'where'})  # (2)!

    print(meeting_adapter.json_schema())  # (3)!
    """
    {
        'properties': {
            'when': {'format': 'date-time', 'title': 'When', 'type': 'string'},
            'where': {'format': 'binary', 'title': 'Where', 'type': 'string'},
            'why': {'title': 'Why', 'type': 'string'},
        },
        'required': ['when', 'where'],
        'title': 'Meeting',
        'type': 'object',
    }
    """
    ```

    1. [`TypeAdapter`][pydantic.type_adapter.TypeAdapter] for a [`TypedDict`][typing.TypedDict] performing validation, it can also validate JSON data directly with [`validate_json`][pydantic.type_adapter.TypeAdapter.validate_json].
    2. [`dump_python`][pydantic.type_adapter.TypeAdapter.dump_python] to serialise a [`TypedDict`][typing.TypedDict] to a python object, it can also serialise to JSON with [`dump_json`][pydantic.type_adapter.TypeAdapter.dump_json].
    3. [`TypeAdapter`][pydantic.type_adapter.TypeAdapter] can also generate a JSON Schema.

## Customisation

Functional validators and serializers, as well as a powerful protocol for custom types, means the way Pydantic operates can be customized on a per-field or per-type basis.

??? example "Customisation Example - wrap validators"
    "wrap validators" are new in Pydantic V2 and are one of the most powerful ways to customize validation.

    ```python
    from datetime import datetime, timezone
    from typing import Any

    from pydantic_core.core_schema import ValidatorFunctionWrapHandler

    from pydantic import BaseModel, field_validator


    class Meeting(BaseModel):
        when: datetime

        @field_validator('when', mode='wrap')
        def when_now(
            cls, input_value: Any, handler: ValidatorFunctionWrapHandler
        ) -> datetime:
            if input_value == 'now':
                return datetime.now()
            when = handler(input_value)
            # in this specific application we know tz naive datetimes are in UTC
            if when.tzinfo is None:
                when = when.replace(tzinfo=timezone.utc)
            return when


    print(Meeting(when='2020-01-01T12:00+01:00'))
    #> when=datetime.datetime(2020, 1, 1, 12, 0, tzinfo=TzInfo(+01:00))
    print(Meeting(when='now'))
    #> when=datetime.datetime(2032, 1, 2, 3, 4, 5, 6)
    print(Meeting(when='2020-01-01T12:00'))
    #> when=datetime.datetime(2020, 1, 1, 12, 0, tzinfo=datetime.timezone.utc)
    ```

!!! tip "Learn more"
    See the documentation on [validators](concepts/validators.md), [custom serializers](concepts/serialization.md#custom-serializers), and [custom types](concepts/types.md#custom-types).

## Ecosystem

At the time of writing there are 466,400 repositories on GitHub and 8,119 packages on PyPI that depend on Pydantic.

Some notable libraries that depend on Pydantic:

{{ libraries }}

More libraries using Pydantic can be found at [`Kludex/awesome-pydantic`](https://github.com/Kludex/awesome-pydantic).

## Organisations using Pydantic {#using-pydantic}

Some notable companies and organisations using Pydantic together with comments on why/how we know they're using Pydantic.

The organisations below are included because they match one or more of the following criteria:

* Using Pydantic as a dependency in a public repository.
* Referring traffic to the Pydantic documentation site from an organization-internal domain — specific referrers are not included since they're generally not in the public domain.
* Direct communication between the Pydantic team and engineers employed by the organization about usage of Pydantic within the organization. We've included some extra detail where appropriate and already in the public domain. {{ organisations }} pydantic-2.10.6/mkdocs.yml000066400000000000000000000274031474456633400154440ustar00rootroot00000000000000site_name: Pydantic site_description: Data validation using Python type hints strict: true site_url: https://docs.pydantic.dev/ theme: name: 'material' custom_dir: 'docs/theme' palette: - media: "(prefers-color-scheme)" scheme: default primary: pink accent: pink toggle: icon: material/lightbulb name: "Switch to light mode" - media: "(prefers-color-scheme: light)" scheme: default primary: pink accent: pink toggle: icon: material/lightbulb-outline name: "Switch to dark mode" - media: "(prefers-color-scheme: dark)" scheme: slate primary: pink accent: pink toggle: icon: material/lightbulb-auto-outline name: "Switch to system preference" features: - content.tabs.link - content.code.annotate - content.code.copy - announce.dismiss - navigation.tabs - navigation.instant - navigation.instant.prefetch - navigation.instant.preview - navigation.instant.progress - navigation.path - navigation.sections - navigation.top - navigation.tracking - search.suggest - toc.follow logo: 'logo-white.svg' favicon: 'favicon.png' repo_name: pydantic/pydantic repo_url: https://github.com/pydantic/pydantic edit_uri: edit/main/docs/ extra: version: provider: mike analytics: feedback: title: Was this page helpful? ratings: - icon: material/thumb-up-outline name: This page was helpful data: 1 note: >- Thanks for your feedback! - icon: material/thumb-down-outline name: This page could be improved data: 0 note: >- Thanks for your feedback! 
# https://www.mkdocs.org/user-guide/configuration/#validation validation: omitted_files: warn absolute_links: warn unrecognized_links: warn extra_css: - 'extra/terminal.css' - 'extra/tweaks.css' extra_javascript: - 'extra/feedback.js' - 'extra/fluff.js' - 'https://samuelcolvin.github.io/mkdocs-run-code/run_code_main.js' nav: - Get Started: - Welcome to Pydantic: index.md - Why use Pydantic: why.md - Help with Pydantic: help_with_pydantic.md - Installation: install.md - Migration Guide: migration.md - Version Policy: version-policy.md - Contributing: contributing.md - Changelog: changelog.md - Concepts: - Models: concepts/models.md - Fields: concepts/fields.md - JSON Schema: concepts/json_schema.md - JSON: concepts/json.md - Types: concepts/types.md - Unions: concepts/unions.md - Alias: concepts/alias.md - Configuration: concepts/config.md - Serialization: concepts/serialization.md - Validators: concepts/validators.md - Dataclasses: concepts/dataclasses.md - Forward Annotations: concepts/forward_annotations.md - Strict Mode: concepts/strict_mode.md - Type Adapter: concepts/type_adapter.md - Validation Decorator: concepts/validation_decorator.md - Conversion Table: concepts/conversion_table.md - Settings Management: concepts/pydantic_settings.md - Performance: concepts/performance.md - Experimental: concepts/experimental.md - API Documentation: - Pydantic: - BaseModel: api/base_model.md - RootModel: api/root_model.md - Pydantic Dataclasses: api/dataclasses.md - TypeAdapter: api/type_adapter.md - Validate Call: api/validate_call.md - Fields: api/fields.md - Aliases: api/aliases.md - Configuration: api/config.md - JSON Schema: api/json_schema.md - Errors: api/errors.md - Functional Validators: api/functional_validators.md - Functional Serializers: api/functional_serializers.md - Standard Library Types: api/standard_library_types.md - Pydantic Types: api/types.md - Network Types: api/networks.md - Version Information: api/version.md - Annotated Handlers: 
api/annotated_handlers.md - Experimental: api/experimental.md - Pydantic Core: - pydantic_core: api/pydantic_core.md - pydantic_core.core_schema: api/pydantic_core_schema.md - Pydantic Settings: api/pydantic_settings.md - Pydantic Extra Types: - Color: api/pydantic_extra_types_color.md - Country: api/pydantic_extra_types_country.md - Payment: api/pydantic_extra_types_payment.md - Phone Numbers: api/pydantic_extra_types_phone_numbers.md - Routing Numbers: api/pydantic_extra_types_routing_numbers.md - Coordinate: api/pydantic_extra_types_coordinate.md - Mac Address: api/pydantic_extra_types_mac_address.md - ISBN: api/pydantic_extra_types_isbn.md - Pendulum: api/pydantic_extra_types_pendulum_dt.md - Currency: api/pydantic_extra_types_currency_code.md - Language: api/pydantic_extra_types_language_code.md - Script Code: api/pydantic_extra_types_script_code.md - Semantic Version: api/pydantic_extra_types_semantic_version.md - Timezone Name: api/pydantic_extra_types_timezone_name.md - ULID: api/pydantic_extra_types_ulid.md - Internals: - Architecture: internals/architecture.md - Resolving Annotations: internals/resolving_annotations.md - Examples: - Validating File Data: examples/files.md - Web and API Requests: examples/requests.md - Queues: examples/queues.md - Databases: examples/orms.md - Custom Validators: examples/custom_validators.md - Error Messages: - Error Handling: errors/errors.md - Validation Errors: errors/validation_errors.md - Usage Errors: errors/usage_errors.md - Integrations: - Pydantic Logfire: integrations/logfire.md - Dev Tools: - Mypy: integrations/mypy.md 
- PyCharm: integrations/pycharm.md - Hypothesis: integrations/hypothesis.md - Visual Studio Code: integrations/visual_studio_code.md - datamodel-code-generator: integrations/datamodel_code_generator.md - devtools: integrations/devtools.md - Rich: integrations/rich.md - Linting: integrations/linting.md - Production Tools: - AWS Lambda: integrations/aws_lambda.md - Blog: https://blog.pydantic.dev/ - Pydantic People: pydantic_people.md markdown_extensions: - tables - toc: permalink: true title: Page contents - admonition - pymdownx.details - pymdownx.superfences - pymdownx.highlight: pygments_lang_class: true - pymdownx.extra - pymdownx.emoji: emoji_index: !!python/name:material.extensions.emoji.twemoji emoji_generator: !!python/name:material.extensions.emoji.to_svg - pymdownx.tabbed: alternate_style: true watch: - pydantic hooks: - 'docs/plugins/main.py' plugins: - social - mike: alias_type: symlink canonical_version: latest - search - exclude: glob: - theme/announce.html - plugins/* - __pycache__/* - mkdocstrings: handlers: python: paths: [.] 
options: members_order: source separate_signature: true filters: ["!^_"] docstring_options: ignore_init_summary: true merge_init_into_class: true show_signature_annotations: true signature_crossrefs: true extensions: - docs/plugins/griffe_doclinks.py import: - url: https://docs.python.org/3/objects.inv domains: [py, std] - redirects: redirect_maps: 'usage/mypy.md': 'integrations/mypy.md' 'mypy_plugin.md': 'integrations/mypy.md' 'datamodel_code_generator.md': 'integrations/datamodel_code_generator.md' 'visual_studio_code.md': 'integrations/visual_studio_code.md' 'hypothesis_plugin.md': 'integrations/hypothesis.md' 'pycharm_plugin.md': 'integrations/pycharm.md' 'usage/model_config.md': 'api/config.md' 'usage/devtools.md': 'integrations/devtools.md' 'usage/rich.md': 'integrations/rich.md' 'usage/linting.md': 'integrations/linting.md' 'usage/types.md': 'concepts/types.md' 'usage/types/secrets.md': 'api/types.md#pydantic.types.Secret' 'usage/types/string_types.md': 'api/types.md#pydantic.types.StringConstraints' 'usage/types/file_types.md': 'api/types.md#pydantic.types.FilePath' 'api/main.md': 'api/base_model.md' 'api/color.md': 'api/pydantic_extra_types_color.md' 'api/alias_generators.md': 'api/config.md#pydantic.config.ConfigDict.alias_generator' 'api/pydantic_core_init.md': 'api/pydantic_core.md' 'usage/types/booleans.md': 'api/standard_library_types.md#booleans' 'usage/types/callables.md': 'api/standard_library_types.md#callable' 'usage/types/custom.md': 'concepts/types.md#custom-types' 'usage/types/datetime.md': 'api/standard_library_types.md#datetime-types' 'usage/types/enum.md': 'api/standard_library_types.md#enum' 'usage/types/json.md': 'api/types.md#pydantic.types.Json' 'usage/types/list_types.md': 'api/standard_library_types.md#list' 'usage/types/standard_types.md': 'api/standard_library_types.md' 'usage/types/strict_types.md': 'concepts/types.md#strict-types' 'usage/types/types.md': 'concepts/types.md' 'usage/types/urls.md': 'api/networks.md' 
'usage/types/unions.md': 'api/standard_library_types.md#union' 'usage/types/typevars.md': 'api/standard_library_types.md#type-and-typevar' 'usage/types/types_fields.md': 'api/standard_library_types.md' 'usage/validation_errors.md': 'errors/validation_errors.md' 'usage/errors.md': 'errors/usage_errors.md' 'usage/types/extra_types/color_types.md': 'api/pydantic_extra_types_color.md' 'usage/types/extra_types/extra_types.md': 'api/pydantic_extra_types_color.md' 'usage/types/extra_types/coordinate.md': 'api/pydantic_extra_types_coordinate.md' 'usage/types/extra_types/mac_address.md': 'api/pydantic_extra_types_mac_address.md' 'usage/types/extra_types/payment_cards.md': 'api/pydantic_extra_types_payment.md' 'usage/types/extra_types/phone_numbers.md': 'api/pydantic_extra_types_phone_numbers.md' 'usage/types/extra_types/routing_numbers.md': 'api/pydantic_extra_types_routing_numbers.md' 'version-compatibility.md': 'version-policy.md' 'api/pydantic_extra_types_routing_number.md': 'api/pydantic_extra_types_routing_numbers.md' 'usage/computed_fields.md': 'api/fields.md#pydantic.fields.computed_field' 'usage/conversion_table.md': 'concepts/conversion_table.md' 'usage/dataclasses.md': 'concepts/dataclasses.md' 'usage/fields.md': 'concepts/fields.md' 'usage/json_schema.md': 'concepts/json_schema.md' 'usage/models.md': 'concepts/models.md' 'usage/postponed_annotations.md': 'concepts/forward_annotations.md' 'concepts/postponed_annotations.md': 'concepts/forward_annotations.md' 'usage/pydantic_settings.md': 'concepts/pydantic_settings.md' 'usage/serialization.md': 'concepts/serialization.md' 'usage/strict_mode.md': 'concepts/strict_mode.md' 'usage/type_adapter.md': 'concepts/type_adapter.md' 'usage/validation_decorator.md': 'concepts/validation_decorator.md' 'usage/validators.md': 'concepts/validators.md' 'usage/types/bytesize.md': 'api/types.md#pydantic.types.ByteSize' 'usage/types/dicts_mapping.md': 'api/standard_library_types.md#mapping-types' 'usage/types/encoded.md': 
'api/types.md#pydantic.types.EncodedBytes' 'usage/types/enums.md': 'api/standard_library_types.md#enum' 'usage/types/number_types.md': 'api/standard_library_types.md#number-types' 'usage/types/sequence_iterable.md': 'api/standard_library_types.md#other-iterables' 'usage/types/set_types.md': 'api/standard_library_types.md#sets' 'usage/types/uuids.md': 'api/standard_library_types.md#uuid' 'blog/pydantic-v2-alpha.md': 'https://pydantic.dev/articles/pydantic-v2-alpha' 'blog/pydantic-v2-final.md': 'https://pydantic.dev/articles/pydantic-v2-final' 'blog/pydantic-v2.md': 'https://pydantic.dev/articles/pydantic-v2' 'examples/secrets.md': 'api/types.md#pydantic.types.Secret' 'examples/validators.md': 'examples/custom_validators.md' 'architecture.md': 'internals/architecture.md' pydantic-2.10.6/pydantic/000077500000000000000000000000001474456633400152465ustar00rootroot00000000000000pydantic-2.10.6/pydantic/__init__.py000066400000000000000000000350141474456633400173620ustar00rootroot00000000000000import typing from importlib import import_module from warnings import warn from ._migration import getattr_migration from .version import VERSION if typing.TYPE_CHECKING: # import of virtually everything is supported via `__getattr__` below, # but we need them here for type checking and IDE support import pydantic_core from pydantic_core.core_schema import ( FieldSerializationInfo, SerializationInfo, SerializerFunctionWrapHandler, ValidationInfo, ValidatorFunctionWrapHandler, ) from . 
import dataclasses from .aliases import AliasChoices, AliasGenerator, AliasPath from .annotated_handlers import GetCoreSchemaHandler, GetJsonSchemaHandler from .config import ConfigDict, with_config from .errors import * from .fields import Field, PrivateAttr, computed_field from .functional_serializers import ( PlainSerializer, SerializeAsAny, WrapSerializer, field_serializer, model_serializer, ) from .functional_validators import ( AfterValidator, BeforeValidator, InstanceOf, ModelWrapValidatorHandler, PlainValidator, SkipValidation, WrapValidator, field_validator, model_validator, ) from .json_schema import WithJsonSchema from .main import * from .networks import * from .type_adapter import TypeAdapter from .types import * from .validate_call_decorator import validate_call from .warnings import ( PydanticDeprecatedSince20, PydanticDeprecatedSince26, PydanticDeprecatedSince29, PydanticDeprecationWarning, PydanticExperimentalWarning, ) # this encourages pycharm to import `ValidationError` from here, not pydantic_core ValidationError = pydantic_core.ValidationError from .deprecated.class_validators import root_validator, validator from .deprecated.config import BaseConfig, Extra from .deprecated.tools import * from .root_model import RootModel __version__ = VERSION __all__ = ( # dataclasses 'dataclasses', # functional validators 'field_validator', 'model_validator', 'AfterValidator', 'BeforeValidator', 'PlainValidator', 'WrapValidator', 'SkipValidation', 'InstanceOf', 'ModelWrapValidatorHandler', # JSON Schema 'WithJsonSchema', # deprecated V1 functional validators, these are imported via `__getattr__` below 'root_validator', 'validator', # functional serializers 'field_serializer', 'model_serializer', 'PlainSerializer', 'SerializeAsAny', 'WrapSerializer', # config 'ConfigDict', 'with_config', # deprecated V1 config, these are imported via `__getattr__` below 'BaseConfig', 'Extra', # validate_call 'validate_call', # errors 'PydanticErrorCodes', 'PydanticUserError', 
'PydanticSchemaGenerationError', 'PydanticImportError', 'PydanticUndefinedAnnotation', 'PydanticInvalidForJsonSchema', # fields 'Field', 'computed_field', 'PrivateAttr', # alias 'AliasChoices', 'AliasGenerator', 'AliasPath', # main 'BaseModel', 'create_model', # network 'AnyUrl', 'AnyHttpUrl', 'FileUrl', 'HttpUrl', 'FtpUrl', 'WebsocketUrl', 'AnyWebsocketUrl', 'UrlConstraints', 'EmailStr', 'NameEmail', 'IPvAnyAddress', 'IPvAnyInterface', 'IPvAnyNetwork', 'PostgresDsn', 'CockroachDsn', 'AmqpDsn', 'RedisDsn', 'MongoDsn', 'KafkaDsn', 'NatsDsn', 'MySQLDsn', 'MariaDBDsn', 'ClickHouseDsn', 'SnowflakeDsn', 'validate_email', # root_model 'RootModel', # deprecated tools, these are imported via `__getattr__` below 'parse_obj_as', 'schema_of', 'schema_json_of', # types 'Strict', 'StrictStr', 'conbytes', 'conlist', 'conset', 'confrozenset', 'constr', 'StringConstraints', 'ImportString', 'conint', 'PositiveInt', 'NegativeInt', 'NonNegativeInt', 'NonPositiveInt', 'confloat', 'PositiveFloat', 'NegativeFloat', 'NonNegativeFloat', 'NonPositiveFloat', 'FiniteFloat', 'condecimal', 'condate', 'UUID1', 'UUID3', 'UUID4', 'UUID5', 'FilePath', 'DirectoryPath', 'NewPath', 'Json', 'Secret', 'SecretStr', 'SecretBytes', 'SocketPath', 'StrictBool', 'StrictBytes', 'StrictInt', 'StrictFloat', 'PaymentCardNumber', 'ByteSize', 'PastDate', 'FutureDate', 'PastDatetime', 'FutureDatetime', 'AwareDatetime', 'NaiveDatetime', 'AllowInfNan', 'EncoderProtocol', 'EncodedBytes', 'EncodedStr', 'Base64Encoder', 'Base64Bytes', 'Base64Str', 'Base64UrlBytes', 'Base64UrlStr', 'GetPydanticSchema', 'Tag', 'Discriminator', 'JsonValue', 'FailFast', # type_adapter 'TypeAdapter', # version '__version__', 'VERSION', # warnings 'PydanticDeprecatedSince20', 'PydanticDeprecatedSince26', 'PydanticDeprecatedSince29', 'PydanticDeprecationWarning', 'PydanticExperimentalWarning', # annotated handlers 'GetCoreSchemaHandler', 'GetJsonSchemaHandler', # pydantic_core 'ValidationError', 'ValidationInfo', 'SerializationInfo', 
'ValidatorFunctionWrapHandler', 'FieldSerializationInfo', 'SerializerFunctionWrapHandler', 'OnErrorOmit', ) # A mapping of {<member name>: (package, <module name>)} defining dynamic imports _dynamic_imports: 'dict[str, tuple[str, str]]' = { 'dataclasses': (__spec__.parent, '__module__'), # functional validators 'field_validator': (__spec__.parent, '.functional_validators'), 'model_validator': (__spec__.parent, '.functional_validators'), 'AfterValidator': (__spec__.parent, '.functional_validators'), 'BeforeValidator': (__spec__.parent, '.functional_validators'), 'PlainValidator': (__spec__.parent, '.functional_validators'), 'WrapValidator': (__spec__.parent, '.functional_validators'), 'SkipValidation': (__spec__.parent, '.functional_validators'), 'InstanceOf': (__spec__.parent, '.functional_validators'), 'ModelWrapValidatorHandler': (__spec__.parent, '.functional_validators'), # JSON Schema 'WithJsonSchema': (__spec__.parent, '.json_schema'), # functional serializers 'field_serializer': (__spec__.parent, '.functional_serializers'), 'model_serializer': (__spec__.parent, '.functional_serializers'), 'PlainSerializer': (__spec__.parent, '.functional_serializers'), 'SerializeAsAny': (__spec__.parent, '.functional_serializers'), 'WrapSerializer': (__spec__.parent, '.functional_serializers'), # config 'ConfigDict': (__spec__.parent, '.config'), 'with_config': (__spec__.parent, '.config'), # validate call 'validate_call': (__spec__.parent, '.validate_call_decorator'), # errors 'PydanticErrorCodes': (__spec__.parent, '.errors'), 'PydanticUserError': (__spec__.parent, '.errors'), 'PydanticSchemaGenerationError': (__spec__.parent, '.errors'), 'PydanticImportError': (__spec__.parent, '.errors'), 'PydanticUndefinedAnnotation': (__spec__.parent, '.errors'), 'PydanticInvalidForJsonSchema': (__spec__.parent, '.errors'), # fields 'Field': (__spec__.parent, '.fields'), 'computed_field': (__spec__.parent, '.fields'), 'PrivateAttr': (__spec__.parent, '.fields'), # alias 'AliasChoices': (__spec__.parent, 
'.aliases'), 'AliasGenerator': (__spec__.parent, '.aliases'), 'AliasPath': (__spec__.parent, '.aliases'), # main 'BaseModel': (__spec__.parent, '.main'), 'create_model': (__spec__.parent, '.main'), # network 'AnyUrl': (__spec__.parent, '.networks'), 'AnyHttpUrl': (__spec__.parent, '.networks'), 'FileUrl': (__spec__.parent, '.networks'), 'HttpUrl': (__spec__.parent, '.networks'), 'FtpUrl': (__spec__.parent, '.networks'), 'WebsocketUrl': (__spec__.parent, '.networks'), 'AnyWebsocketUrl': (__spec__.parent, '.networks'), 'UrlConstraints': (__spec__.parent, '.networks'), 'EmailStr': (__spec__.parent, '.networks'), 'NameEmail': (__spec__.parent, '.networks'), 'IPvAnyAddress': (__spec__.parent, '.networks'), 'IPvAnyInterface': (__spec__.parent, '.networks'), 'IPvAnyNetwork': (__spec__.parent, '.networks'), 'PostgresDsn': (__spec__.parent, '.networks'), 'CockroachDsn': (__spec__.parent, '.networks'), 'AmqpDsn': (__spec__.parent, '.networks'), 'RedisDsn': (__spec__.parent, '.networks'), 'MongoDsn': (__spec__.parent, '.networks'), 'KafkaDsn': (__spec__.parent, '.networks'), 'NatsDsn': (__spec__.parent, '.networks'), 'MySQLDsn': (__spec__.parent, '.networks'), 'MariaDBDsn': (__spec__.parent, '.networks'), 'ClickHouseDsn': (__spec__.parent, '.networks'), 'SnowflakeDsn': (__spec__.parent, '.networks'), 'validate_email': (__spec__.parent, '.networks'), # root_model 'RootModel': (__spec__.parent, '.root_model'), # types 'Strict': (__spec__.parent, '.types'), 'StrictStr': (__spec__.parent, '.types'), 'conbytes': (__spec__.parent, '.types'), 'conlist': (__spec__.parent, '.types'), 'conset': (__spec__.parent, '.types'), 'confrozenset': (__spec__.parent, '.types'), 'constr': (__spec__.parent, '.types'), 'StringConstraints': (__spec__.parent, '.types'), 'ImportString': (__spec__.parent, '.types'), 'conint': (__spec__.parent, '.types'), 'PositiveInt': (__spec__.parent, '.types'), 'NegativeInt': (__spec__.parent, '.types'), 'NonNegativeInt': (__spec__.parent, '.types'), 
'NonPositiveInt': (__spec__.parent, '.types'), 'confloat': (__spec__.parent, '.types'), 'PositiveFloat': (__spec__.parent, '.types'), 'NegativeFloat': (__spec__.parent, '.types'), 'NonNegativeFloat': (__spec__.parent, '.types'), 'NonPositiveFloat': (__spec__.parent, '.types'), 'FiniteFloat': (__spec__.parent, '.types'), 'condecimal': (__spec__.parent, '.types'), 'condate': (__spec__.parent, '.types'), 'UUID1': (__spec__.parent, '.types'), 'UUID3': (__spec__.parent, '.types'), 'UUID4': (__spec__.parent, '.types'), 'UUID5': (__spec__.parent, '.types'), 'FilePath': (__spec__.parent, '.types'), 'DirectoryPath': (__spec__.parent, '.types'), 'NewPath': (__spec__.parent, '.types'), 'Json': (__spec__.parent, '.types'), 'Secret': (__spec__.parent, '.types'), 'SecretStr': (__spec__.parent, '.types'), 'SecretBytes': (__spec__.parent, '.types'), 'StrictBool': (__spec__.parent, '.types'), 'StrictBytes': (__spec__.parent, '.types'), 'StrictInt': (__spec__.parent, '.types'), 'StrictFloat': (__spec__.parent, '.types'), 'PaymentCardNumber': (__spec__.parent, '.types'), 'ByteSize': (__spec__.parent, '.types'), 'PastDate': (__spec__.parent, '.types'), 'SocketPath': (__spec__.parent, '.types'), 'FutureDate': (__spec__.parent, '.types'), 'PastDatetime': (__spec__.parent, '.types'), 'FutureDatetime': (__spec__.parent, '.types'), 'AwareDatetime': (__spec__.parent, '.types'), 'NaiveDatetime': (__spec__.parent, '.types'), 'AllowInfNan': (__spec__.parent, '.types'), 'EncoderProtocol': (__spec__.parent, '.types'), 'EncodedBytes': (__spec__.parent, '.types'), 'EncodedStr': (__spec__.parent, '.types'), 'Base64Encoder': (__spec__.parent, '.types'), 'Base64Bytes': (__spec__.parent, '.types'), 'Base64Str': (__spec__.parent, '.types'), 'Base64UrlBytes': (__spec__.parent, '.types'), 'Base64UrlStr': (__spec__.parent, '.types'), 'GetPydanticSchema': (__spec__.parent, '.types'), 'Tag': (__spec__.parent, '.types'), 'Discriminator': (__spec__.parent, '.types'), 'JsonValue': (__spec__.parent, '.types'), 
'OnErrorOmit': (__spec__.parent, '.types'), 'FailFast': (__spec__.parent, '.types'), # type_adapter 'TypeAdapter': (__spec__.parent, '.type_adapter'), # warnings 'PydanticDeprecatedSince20': (__spec__.parent, '.warnings'), 'PydanticDeprecatedSince26': (__spec__.parent, '.warnings'), 'PydanticDeprecatedSince29': (__spec__.parent, '.warnings'), 'PydanticDeprecationWarning': (__spec__.parent, '.warnings'), 'PydanticExperimentalWarning': (__spec__.parent, '.warnings'), # annotated handlers 'GetCoreSchemaHandler': (__spec__.parent, '.annotated_handlers'), 'GetJsonSchemaHandler': (__spec__.parent, '.annotated_handlers'), # pydantic_core stuff 'ValidationError': ('pydantic_core', '.'), 'ValidationInfo': ('pydantic_core', '.core_schema'), 'SerializationInfo': ('pydantic_core', '.core_schema'), 'ValidatorFunctionWrapHandler': ('pydantic_core', '.core_schema'), 'FieldSerializationInfo': ('pydantic_core', '.core_schema'), 'SerializerFunctionWrapHandler': ('pydantic_core', '.core_schema'), # deprecated, mostly not included in __all__ 'root_validator': (__spec__.parent, '.deprecated.class_validators'), 'validator': (__spec__.parent, '.deprecated.class_validators'), 'BaseConfig': (__spec__.parent, '.deprecated.config'), 'Extra': (__spec__.parent, '.deprecated.config'), 'parse_obj_as': (__spec__.parent, '.deprecated.tools'), 'schema_of': (__spec__.parent, '.deprecated.tools'), 'schema_json_of': (__spec__.parent, '.deprecated.tools'), # deprecated dynamic imports 'FieldValidationInfo': ('pydantic_core', '.core_schema'), 'GenerateSchema': (__spec__.parent, '._internal._generate_schema'), } _deprecated_dynamic_imports = {'FieldValidationInfo', 'GenerateSchema'} _getattr_migration = getattr_migration(__name__) def __getattr__(attr_name: str) -> object: if attr_name in _deprecated_dynamic_imports: warn( f'Importing {attr_name} from `pydantic` is deprecated. 
This feature is either no longer supported, or is not public.', DeprecationWarning, stacklevel=2, ) dynamic_attr = _dynamic_imports.get(attr_name) if dynamic_attr is None: return _getattr_migration(attr_name) package, module_name = dynamic_attr if module_name == '__module__': result = import_module(f'.{attr_name}', package=package) globals()[attr_name] = result return result else: module = import_module(module_name, package=package) result = getattr(module, attr_name) g = globals() for k, (_, v_module_name) in _dynamic_imports.items(): if v_module_name == module_name and k not in _deprecated_dynamic_imports: g[k] = getattr(module, k) return result def __dir__() -> 'list[str]': return list(__all__) pydantic-2.10.6/pydantic/_internal/000077500000000000000000000000001474456633400172215ustar00rootroot00000000000000pydantic-2.10.6/pydantic/_internal/__init__.py000066400000000000000000000000001474456633400213200ustar00rootroot00000000000000pydantic-2.10.6/pydantic/_internal/_config.py000066400000000000000000000304041474456633400212000ustar00rootroot00000000000000from __future__ import annotations as _annotations import warnings from contextlib import contextmanager from re import Pattern from typing import ( TYPE_CHECKING, Any, Callable, cast, ) from pydantic_core import core_schema from typing_extensions import ( Literal, Self, ) from ..aliases import AliasGenerator from ..config import ConfigDict, ExtraValues, JsonDict, JsonEncoder, JsonSchemaExtraCallable from ..errors import PydanticUserError from ..warnings import PydanticDeprecatedSince20, PydanticDeprecatedSince210 if not TYPE_CHECKING: # See PyCharm issues https://youtrack.jetbrains.com/issue/PY-21915 # and https://youtrack.jetbrains.com/issue/PY-51428 DeprecationWarning = PydanticDeprecatedSince20 if TYPE_CHECKING: from .._internal._schema_generation_shared import GenerateSchema from ..fields import ComputedFieldInfo, FieldInfo DEPRECATION_MESSAGE = 'Support for class-based `config` is deprecated, use ConfigDict 
instead.'


class ConfigWrapper:
    """Internal wrapper for Config which exposes ConfigDict items as attributes."""

    __slots__ = ('config_dict',)

    config_dict: ConfigDict

    # all annotations are copied directly from ConfigDict, and should be kept up to date, a test will fail if they
    # stop matching
    title: str | None
    str_to_lower: bool
    str_to_upper: bool
    str_strip_whitespace: bool
    str_min_length: int
    str_max_length: int | None
    extra: ExtraValues | None
    frozen: bool
    populate_by_name: bool
    use_enum_values: bool
    validate_assignment: bool
    arbitrary_types_allowed: bool
    from_attributes: bool
    # whether to use the actual key provided in the data (e.g. alias or first alias for "field required" errors) instead of field_names
    # to construct error `loc`s, default `True`
    loc_by_alias: bool
    alias_generator: Callable[[str], str] | AliasGenerator | None
    model_title_generator: Callable[[type], str] | None
    field_title_generator: Callable[[str, FieldInfo | ComputedFieldInfo], str] | None
    ignored_types: tuple[type, ...]
    allow_inf_nan: bool
    json_schema_extra: JsonDict | JsonSchemaExtraCallable | None
    json_encoders: dict[type[object], JsonEncoder] | None

    # new in V2
    strict: bool
    # whether instances of models and dataclasses (including subclass instances) should re-validate, default 'never'
    revalidate_instances: Literal['always', 'never', 'subclass-instances']
    ser_json_timedelta: Literal['iso8601', 'float']
    ser_json_bytes: Literal['utf8', 'base64', 'hex']
    val_json_bytes: Literal['utf8', 'base64', 'hex']
    ser_json_inf_nan: Literal['null', 'constants', 'strings']
    # whether to validate default values during validation, default False
    validate_default: bool
    validate_return: bool
    protected_namespaces: tuple[str | Pattern[str], ...]
    hide_input_in_errors: bool
    defer_build: bool
    plugin_settings: dict[str, object] | None
    schema_generator: type[GenerateSchema] | None
    json_schema_serialization_defaults_required: bool
    json_schema_mode_override: Literal['validation', 'serialization', None]
    coerce_numbers_to_str: bool
    regex_engine: Literal['rust-regex', 'python-re']
    validation_error_cause: bool
    use_attribute_docstrings: bool
    cache_strings: bool | Literal['all', 'keys', 'none']

    def __init__(self, config: ConfigDict | dict[str, Any] | type[Any] | None, *, check: bool = True):
        if check:
            self.config_dict = prepare_config(config)
        else:
            self.config_dict = cast(ConfigDict, config)

    @classmethod
    def for_model(cls, bases: tuple[type[Any], ...], namespace: dict[str, Any], kwargs: dict[str, Any]) -> Self:
        """Build a new `ConfigWrapper` instance for a `BaseModel`.

        The config wrapper is built based on (in descending order of priority):
        - options from `kwargs`
        - options from the `namespace`
        - options from the base classes (`bases`)

        Args:
            bases: A tuple of base classes.
            namespace: The namespace of the class being created.
            kwargs: The kwargs passed to the class being created.

        Returns:
            A `ConfigWrapper` instance for `BaseModel`.
        """
        config_new = ConfigDict()
        for base in bases:
            config = getattr(base, 'model_config', None)
            if config:
                config_new.update(config.copy())

        config_class_from_namespace = namespace.get('Config')
        config_dict_from_namespace = namespace.get('model_config')

        raw_annotations = namespace.get('__annotations__', {})
        if raw_annotations.get('model_config') and config_dict_from_namespace is None:
            raise PydanticUserError(
                '`model_config` cannot be used as a model field name. Use `model_config` for model configuration.',
                code='model-config-invalid-field-name',
            )

        if config_class_from_namespace and config_dict_from_namespace:
            raise PydanticUserError('"Config" and "model_config" cannot be used together', code='config-both')

        config_from_namespace = config_dict_from_namespace or prepare_config(config_class_from_namespace)
        config_new.update(config_from_namespace)

        for k in list(kwargs.keys()):
            if k in config_keys:
                config_new[k] = kwargs.pop(k)

        return cls(config_new)

    # we don't show `__getattr__` to type checkers so missing attributes cause errors
    if not TYPE_CHECKING:  # pragma: no branch

        def __getattr__(self, name: str) -> Any:
            try:
                return self.config_dict[name]
            except KeyError:
                try:
                    return config_defaults[name]
                except KeyError:
                    raise AttributeError(f'Config has no attribute {name!r}') from None

    def core_config(self, title: str | None) -> core_schema.CoreConfig:
        """Create a pydantic-core config.

        We don't use getattr here since we don't want to populate with defaults.

        Args:
            title: The title to use if not set in config.

        Returns:
            A `CoreConfig` object created from config.
        """
        config = self.config_dict

        if config.get('schema_generator') is not None:
            warnings.warn(
                'The `schema_generator` setting has been deprecated since v2.10. This setting no longer has any effect.',
                PydanticDeprecatedSince210,
                stacklevel=2,
            )

        core_config_values = {
            'title': config.get('title') or title or None,
            'extra_fields_behavior': config.get('extra'),
            'allow_inf_nan': config.get('allow_inf_nan'),
            'populate_by_name': config.get('populate_by_name'),
            'str_strip_whitespace': config.get('str_strip_whitespace'),
            'str_to_lower': config.get('str_to_lower'),
            'str_to_upper': config.get('str_to_upper'),
            'strict': config.get('strict'),
            'ser_json_timedelta': config.get('ser_json_timedelta'),
            'ser_json_bytes': config.get('ser_json_bytes'),
            'val_json_bytes': config.get('val_json_bytes'),
            'ser_json_inf_nan': config.get('ser_json_inf_nan'),
            'from_attributes': config.get('from_attributes'),
            'loc_by_alias': config.get('loc_by_alias'),
            'revalidate_instances': config.get('revalidate_instances'),
            'validate_default': config.get('validate_default'),
            'str_max_length': config.get('str_max_length'),
            'str_min_length': config.get('str_min_length'),
            'hide_input_in_errors': config.get('hide_input_in_errors'),
            'coerce_numbers_to_str': config.get('coerce_numbers_to_str'),
            'regex_engine': config.get('regex_engine'),
            'validation_error_cause': config.get('validation_error_cause'),
            'cache_strings': config.get('cache_strings'),
        }

        return core_schema.CoreConfig(**{k: v for k, v in core_config_values.items() if v is not None})

    def __repr__(self):
        c = ', '.join(f'{k}={v!r}' for k, v in self.config_dict.items())
        return f'ConfigWrapper({c})'


class ConfigWrapperStack:
    """A stack of `ConfigWrapper` instances."""

    def __init__(self, config_wrapper: ConfigWrapper):
        self._config_wrapper_stack: list[ConfigWrapper] = [config_wrapper]

    @property
    def tail(self) -> ConfigWrapper:
        return self._config_wrapper_stack[-1]

    @contextmanager
    def push(self, config_wrapper: ConfigWrapper | ConfigDict | None):
        if config_wrapper is None:
            yield
            return

        if not isinstance(config_wrapper, ConfigWrapper):
            config_wrapper = ConfigWrapper(config_wrapper, check=False)
        self._config_wrapper_stack.append(config_wrapper)
        try:
            yield
        finally:
            self._config_wrapper_stack.pop()


config_defaults = ConfigDict(
    title=None,
    str_to_lower=False,
    str_to_upper=False,
    str_strip_whitespace=False,
    str_min_length=0,
    str_max_length=None,
    # let the model / dataclass decide how to handle it
    extra=None,
    frozen=False,
    populate_by_name=False,
    use_enum_values=False,
    validate_assignment=False,
    arbitrary_types_allowed=False,
    from_attributes=False,
    loc_by_alias=True,
    alias_generator=None,
    model_title_generator=None,
    field_title_generator=None,
    ignored_types=(),
    allow_inf_nan=True,
    json_schema_extra=None,
    strict=False,
    revalidate_instances='never',
    ser_json_timedelta='iso8601',
    ser_json_bytes='utf8',
    val_json_bytes='utf8',
    ser_json_inf_nan='null',
    validate_default=False,
    validate_return=False,
    protected_namespaces=('model_validate', 'model_dump'),
    hide_input_in_errors=False,
    json_encoders=None,
    defer_build=False,
    schema_generator=None,
    plugin_settings=None,
    json_schema_serialization_defaults_required=False,
    json_schema_mode_override=None,
    coerce_numbers_to_str=False,
    regex_engine='rust-regex',
    validation_error_cause=False,
    use_attribute_docstrings=False,
    cache_strings=True,
)


def prepare_config(config: ConfigDict | dict[str, Any] | type[Any] | None) -> ConfigDict:
    """Create a `ConfigDict` instance from an existing dict, a class (e.g. old class-based config) or None.

    Args:
        config: The input config.

    Returns:
        A ConfigDict object created from config.
    """
    if config is None:
        return ConfigDict()

    if not isinstance(config, dict):
        warnings.warn(DEPRECATION_MESSAGE, DeprecationWarning)
        config = {k: getattr(config, k) for k in dir(config) if not k.startswith('__')}

    config_dict = cast(ConfigDict, config)
    check_deprecated(config_dict)
    return config_dict


config_keys = set(ConfigDict.__annotations__.keys())


V2_REMOVED_KEYS = {
    'allow_mutation',
    'error_msg_templates',
    'fields',
    'getter_dict',
    'smart_union',
    'underscore_attrs_are_private',
    'json_loads',
    'json_dumps',
    'copy_on_model_validation',
    'post_init_call',
}
V2_RENAMED_KEYS = {
    'allow_population_by_field_name': 'populate_by_name',
    'anystr_lower': 'str_to_lower',
    'anystr_strip_whitespace': 'str_strip_whitespace',
    'anystr_upper': 'str_to_upper',
    'keep_untouched': 'ignored_types',
    'max_anystr_length': 'str_max_length',
    'min_anystr_length': 'str_min_length',
    'orm_mode': 'from_attributes',
    'schema_extra': 'json_schema_extra',
    'validate_all': 'validate_default',
}


def check_deprecated(config_dict: ConfigDict) -> None:
    """Check for deprecated config keys and warn the user.

    Args:
        config_dict: The input config.
    """
    deprecated_removed_keys = V2_REMOVED_KEYS & config_dict.keys()
    deprecated_renamed_keys = V2_RENAMED_KEYS.keys() & config_dict.keys()
    if deprecated_removed_keys or deprecated_renamed_keys:
        renamings = {k: V2_RENAMED_KEYS[k] for k in sorted(deprecated_renamed_keys)}
        renamed_bullets = [f'* {k!r} has been renamed to {v!r}' for k, v in renamings.items()]
        removed_bullets = [f'* {k!r} has been removed' for k in sorted(deprecated_removed_keys)]
        message = '\n'.join(['Valid config keys have changed in V2:'] + renamed_bullets + removed_bullets)
        warnings.warn(message, UserWarning)


pydantic-2.10.6/pydantic/_internal/_core_metadata.py
from __future__ import annotations as _annotations

from typing import TYPE_CHECKING, Any, TypedDict, cast
from warnings import warn

if TYPE_CHECKING:
    from ..config import JsonDict, JsonSchemaExtraCallable
    from ._schema_generation_shared import (
        GetJsonSchemaFunction,
    )


class CoreMetadata(TypedDict, total=False):
    """A `TypedDict` for holding the metadata dict of the schema.

    Attributes:
        pydantic_js_functions: List of JSON schema functions that resolve refs during application.
        pydantic_js_annotation_functions: List of JSON schema functions that don't resolve refs during application.
        pydantic_js_prefer_positional_arguments: Whether the JSON schema generator will
            prefer positional over keyword arguments for an 'arguments' schema.
        pydantic_js_updates: key / value pair updates to apply to the JSON schema for a type.
        pydantic_js_extra: WIP, either key/value pair updates to apply to the JSON schema, or a custom callable.

    TODO: Perhaps we should move this structure to pydantic-core. At the moment, though,
    it's easier to iterate on if we leave it in pydantic until we feel there is a semi-stable API.
    TODO: It's unfortunate how functionally oriented JSON schema generation is, especially that which occurs during
    the core schema generation process. It's inevitable that we need to store some json schema related information
    on core schemas, given that we generate JSON schemas directly from core schemas. That being said, debugging
    related issues is quite difficult when JSON schema information is disguised via dynamically defined functions.
    """

    pydantic_js_functions: list[GetJsonSchemaFunction]
    pydantic_js_annotation_functions: list[GetJsonSchemaFunction]
    pydantic_js_prefer_positional_arguments: bool
    pydantic_js_updates: JsonDict
    pydantic_js_extra: JsonDict | JsonSchemaExtraCallable


def update_core_metadata(
    core_metadata: Any,
    /,
    *,
    pydantic_js_functions: list[GetJsonSchemaFunction] | None = None,
    pydantic_js_annotation_functions: list[GetJsonSchemaFunction] | None = None,
    pydantic_js_updates: JsonDict | None = None,
    pydantic_js_extra: JsonDict | JsonSchemaExtraCallable | None = None,
) -> None:
    from ..json_schema import PydanticJsonSchemaWarning

    """Update CoreMetadata instance in place.

    When we make modifications in this function, they take effect on the `core_metadata` reference passed in
    as the first (and only) positional argument. First, cast to `CoreMetadata`, then finish with a cast to
    `dict[str, Any]` for core schema compatibility. We do this here, instead of before / after each call to this
    function so that this typing hack can be easily removed if/when we move `CoreMetadata` to `pydantic-core`.

    For parameter descriptions, see `CoreMetadata` above.
    """
    core_metadata = cast(CoreMetadata, core_metadata)

    if pydantic_js_functions:
        core_metadata.setdefault('pydantic_js_functions', []).extend(pydantic_js_functions)

    if pydantic_js_annotation_functions:
        core_metadata.setdefault('pydantic_js_annotation_functions', []).extend(pydantic_js_annotation_functions)

    if pydantic_js_updates:
        if (existing_updates := core_metadata.get('pydantic_js_updates')) is not None:
            core_metadata['pydantic_js_updates'] = {**existing_updates, **pydantic_js_updates}
        else:
            core_metadata['pydantic_js_updates'] = pydantic_js_updates

    if pydantic_js_extra is not None:
        existing_pydantic_js_extra = core_metadata.get('pydantic_js_extra')
        if existing_pydantic_js_extra is None:
            core_metadata['pydantic_js_extra'] = pydantic_js_extra
        if isinstance(existing_pydantic_js_extra, dict):
            if isinstance(pydantic_js_extra, dict):
                core_metadata['pydantic_js_extra'] = {**existing_pydantic_js_extra, **pydantic_js_extra}
            if callable(pydantic_js_extra):
                warn(
                    'Composing `dict` and `callable` type `json_schema_extra` is not supported.'
                    'The `callable` type is being ignored.'
                    "If you'd like support for this behavior, please open an issue on pydantic.",
                    PydanticJsonSchemaWarning,
                )
        if callable(existing_pydantic_js_extra):
            # if ever there's a case of a callable, we'll just keep the last json schema extra spec
            core_metadata['pydantic_js_extra'] = pydantic_js_extra


pydantic-2.10.6/pydantic/_internal/_core_utils.py
from __future__ import annotations

import os
from collections import defaultdict
from typing import Any, Callable, Hashable, TypeVar, Union

from pydantic_core import CoreSchema, core_schema
from pydantic_core import validate_core_schema as _validate_core_schema
from typing_extensions import TypeGuard, get_args, get_origin

from ..errors import PydanticUserError
from . import _repr
from ._core_metadata import CoreMetadata
from ._typing_extra import is_generic_alias, is_type_alias_type

AnyFunctionSchema = Union[
    core_schema.AfterValidatorFunctionSchema,
    core_schema.BeforeValidatorFunctionSchema,
    core_schema.WrapValidatorFunctionSchema,
    core_schema.PlainValidatorFunctionSchema,
]

FunctionSchemaWithInnerSchema = Union[
    core_schema.AfterValidatorFunctionSchema,
    core_schema.BeforeValidatorFunctionSchema,
    core_schema.WrapValidatorFunctionSchema,
]

CoreSchemaField = Union[
    core_schema.ModelField, core_schema.DataclassField, core_schema.TypedDictField, core_schema.ComputedField
]
CoreSchemaOrField = Union[core_schema.CoreSchema, CoreSchemaField]

_CORE_SCHEMA_FIELD_TYPES = {'typed-dict-field', 'dataclass-field', 'model-field', 'computed-field'}
_FUNCTION_WITH_INNER_SCHEMA_TYPES = {'function-before', 'function-after', 'function-wrap'}
_LIST_LIKE_SCHEMA_WITH_ITEMS_TYPES = {'list', 'set', 'frozenset'}

TAGGED_UNION_TAG_KEY = 'pydantic.internal.tagged_union_tag'
"""
Used in a `Tag` schema to specify the tag used for a discriminated union.
"""


def is_core_schema(
    schema: CoreSchemaOrField,
) -> TypeGuard[CoreSchema]:
    return schema['type'] not in _CORE_SCHEMA_FIELD_TYPES


def is_core_schema_field(
    schema: CoreSchemaOrField,
) -> TypeGuard[CoreSchemaField]:
    return schema['type'] in _CORE_SCHEMA_FIELD_TYPES


def is_function_with_inner_schema(
    schema: CoreSchemaOrField,
) -> TypeGuard[FunctionSchemaWithInnerSchema]:
    return schema['type'] in _FUNCTION_WITH_INNER_SCHEMA_TYPES


def is_list_like_schema_with_items_schema(
    schema: CoreSchema,
) -> TypeGuard[core_schema.ListSchema | core_schema.SetSchema | core_schema.FrozenSetSchema]:
    return schema['type'] in _LIST_LIKE_SCHEMA_WITH_ITEMS_TYPES


def get_type_ref(type_: type[Any], args_override: tuple[type[Any], ...] | None = None) -> str:
    """Produces the ref to be used for this type by pydantic_core's core schemas.

    This `args_override` argument was added for the purpose of creating valid recursive references
    when creating generic models without needing to create a concrete class.
    """
    origin = get_origin(type_) or type_

    args = get_args(type_) if is_generic_alias(type_) else (args_override or ())
    generic_metadata = getattr(type_, '__pydantic_generic_metadata__', None)
    if generic_metadata:
        origin = generic_metadata['origin'] or origin
        args = generic_metadata['args'] or args

    module_name = getattr(origin, '__module__', '<No __module__>')
    if is_type_alias_type(origin):
        type_ref = f'{module_name}.{origin.__name__}:{id(origin)}'
    else:
        try:
            qualname = getattr(origin, '__qualname__', f'<No __qualname__: {origin}>')
        except Exception:
            qualname = getattr(origin, '__qualname__', '<No __qualname__>')
        type_ref = f'{module_name}.{qualname}:{id(origin)}'

    arg_refs: list[str] = []
    for arg in args:
        if isinstance(arg, str):
            # Handle string literals as a special case; we may be able to remove this special handling if we
            # wrap them in a ForwardRef at some point.
            arg_ref = f'{arg}:str-{id(arg)}'
        else:
            arg_ref = f'{_repr.display_as_type(arg)}:{id(arg)}'
        arg_refs.append(arg_ref)
    if arg_refs:
        type_ref = f'{type_ref}[{",".join(arg_refs)}]'
    return type_ref


def get_ref(s: core_schema.CoreSchema) -> None | str:
    """Get the ref from the schema if it has one.
    This exists just for type checking to work correctly.
    """
    return s.get('ref', None)


def collect_definitions(schema: core_schema.CoreSchema) -> dict[str, core_schema.CoreSchema]:
    defs: dict[str, CoreSchema] = {}

    def _record_valid_refs(s: core_schema.CoreSchema, recurse: Recurse) -> core_schema.CoreSchema:
        ref = get_ref(s)
        if ref:
            defs[ref] = s
        return recurse(s, _record_valid_refs)

    walk_core_schema(schema, _record_valid_refs, copy=False)

    return defs


def define_expected_missing_refs(
    schema: core_schema.CoreSchema, allowed_missing_refs: set[str]
) -> core_schema.CoreSchema | None:
    if not allowed_missing_refs:
        # in this case, there are no missing refs to potentially substitute, so there's no need to walk the schema
        # this is a common case (will be hit for all non-generic models), so it's worth optimizing for
        return None

    refs = collect_definitions(schema).keys()

    expected_missing_refs = allowed_missing_refs.difference(refs)
    if expected_missing_refs:
        definitions: list[core_schema.CoreSchema] = [
            core_schema.invalid_schema(ref=ref) for ref in expected_missing_refs
        ]
        return core_schema.definitions_schema(schema, definitions)
    return None


def collect_invalid_schemas(schema: core_schema.CoreSchema) -> bool:
    invalid = False

    def _is_schema_valid(s: core_schema.CoreSchema, recurse: Recurse) -> core_schema.CoreSchema:
        nonlocal invalid
        if s['type'] == 'invalid':
            invalid = True
            return s
        return recurse(s, _is_schema_valid)

    walk_core_schema(schema, _is_schema_valid, copy=False)
    return invalid


T = TypeVar('T')

Recurse = Callable[[core_schema.CoreSchema, 'Walk'], core_schema.CoreSchema]
Walk = Callable[[core_schema.CoreSchema, Recurse], core_schema.CoreSchema]

# TODO: Should we move _WalkCoreSchema into pydantic_core proper?
# Issue: https://github.com/pydantic/pydantic-core/issues/615

CoreSchemaT = TypeVar('CoreSchemaT')


class _WalkCoreSchema:
    def __init__(self, *, copy: bool = True):
        self._schema_type_to_method = self._build_schema_type_to_method()
        self._copy = copy

    def _copy_schema(self, schema: CoreSchemaT) -> CoreSchemaT:
        return schema.copy() if self._copy else schema  # pyright: ignore[reportAttributeAccessIssue]

    def _build_schema_type_to_method(self) -> dict[core_schema.CoreSchemaType, Recurse]:
        mapping: dict[core_schema.CoreSchemaType, Recurse] = {}
        key: core_schema.CoreSchemaType
        for key in get_args(core_schema.CoreSchemaType):
            method_name = f"handle_{key.replace('-', '_')}_schema"
            mapping[key] = getattr(self, method_name, self._handle_other_schemas)
        return mapping

    def walk(self, schema: core_schema.CoreSchema, f: Walk) -> core_schema.CoreSchema:
        return f(schema, self._walk)

    def _walk(self, schema: core_schema.CoreSchema, f: Walk) -> core_schema.CoreSchema:
        schema = self._schema_type_to_method[schema['type']](self._copy_schema(schema), f)
        ser_schema: core_schema.SerSchema | None = schema.get('serialization')  # type: ignore
        if ser_schema:
            schema['serialization'] = self._handle_ser_schemas(ser_schema, f)
        return schema

    def _handle_other_schemas(self, schema: core_schema.CoreSchema, f: Walk) -> core_schema.CoreSchema:
        sub_schema = schema.get('schema', None)
        if sub_schema is not None:
            schema['schema'] = self.walk(sub_schema, f)  # type: ignore
        return schema

    def _handle_ser_schemas(self, ser_schema: core_schema.SerSchema, f: Walk) -> core_schema.SerSchema:
        schema: core_schema.CoreSchema | None = ser_schema.get('schema', None)
        return_schema: core_schema.CoreSchema | None = ser_schema.get('return_schema', None)
        if schema is not None or return_schema is not None:
            ser_schema = self._copy_schema(ser_schema)
            if schema is not None:
                ser_schema['schema'] = self.walk(schema, f)  # type: ignore
            if return_schema is not None:
                ser_schema['return_schema'] = self.walk(return_schema, f)  # type: ignore
        return ser_schema

    def handle_definitions_schema(self, schema: core_schema.DefinitionsSchema, f: Walk) -> core_schema.CoreSchema:
        new_definitions: list[core_schema.CoreSchema] = []
        for definition in schema['definitions']:
            if 'schema_ref' in definition and 'ref' in definition:
                # This indicates a purposely indirect reference
                # We want to keep such references around for implications related to JSON schema, etc.:
                new_definitions.append(definition)
                # However, we still need to walk the referenced definition:
                self.walk(definition, f)
                continue

            updated_definition = self.walk(definition, f)
            if 'ref' in updated_definition:
                # If the updated definition schema doesn't have a 'ref', it shouldn't go in the definitions
                # This is most likely to happen due to replacing something with a definition reference, in
                # which case it should certainly not go in the definitions list
                new_definitions.append(updated_definition)
        new_inner_schema = self.walk(schema['schema'], f)

        if not new_definitions and len(schema) == 3:
            # This means we'd be returning a "trivial" definitions schema that just wrapped the inner schema
            return new_inner_schema

        new_schema = self._copy_schema(schema)
        new_schema['schema'] = new_inner_schema
        new_schema['definitions'] = new_definitions
        return new_schema

    def handle_list_schema(self, schema: core_schema.ListSchema, f: Walk) -> core_schema.CoreSchema:
        items_schema = schema.get('items_schema')
        if items_schema is not None:
            schema['items_schema'] = self.walk(items_schema, f)
        return schema

    def handle_set_schema(self, schema: core_schema.SetSchema, f: Walk) -> core_schema.CoreSchema:
        items_schema = schema.get('items_schema')
        if items_schema is not None:
            schema['items_schema'] = self.walk(items_schema, f)
        return schema

    def handle_frozenset_schema(self, schema: core_schema.FrozenSetSchema, f: Walk) -> core_schema.CoreSchema:
        items_schema = schema.get('items_schema')
        if items_schema is not None:
            schema['items_schema'] = self.walk(items_schema, f)
        return schema

    def handle_generator_schema(self, schema: core_schema.GeneratorSchema, f: Walk) -> core_schema.CoreSchema:
        items_schema = schema.get('items_schema')
        if items_schema is not None:
            schema['items_schema'] = self.walk(items_schema, f)
        return schema

    def handle_tuple_schema(self, schema: core_schema.TupleSchema, f: Walk) -> core_schema.CoreSchema:
        schema['items_schema'] = [self.walk(v, f) for v in schema['items_schema']]
        return schema

    def handle_dict_schema(self, schema: core_schema.DictSchema, f: Walk) -> core_schema.CoreSchema:
        keys_schema = schema.get('keys_schema')
        if keys_schema is not None:
            schema['keys_schema'] = self.walk(keys_schema, f)
        values_schema = schema.get('values_schema')
        if values_schema:
            schema['values_schema'] = self.walk(values_schema, f)
        return schema

    def handle_function_after_schema(
        self, schema: core_schema.AfterValidatorFunctionSchema, f: Walk
    ) -> core_schema.CoreSchema:
        schema['schema'] = self.walk(schema['schema'], f)
        return schema

    def handle_function_before_schema(
        self, schema: core_schema.BeforeValidatorFunctionSchema, f: Walk
    ) -> core_schema.CoreSchema:
        schema['schema'] = self.walk(schema['schema'], f)
        if 'json_schema_input_schema' in schema:
            schema['json_schema_input_schema'] = self.walk(schema['json_schema_input_schema'], f)
        return schema

    # TODO duplicate schema types for serializers and validators, needs to be deduplicated:
    def handle_function_plain_schema(
        self, schema: core_schema.PlainValidatorFunctionSchema | core_schema.PlainSerializerFunctionSerSchema, f: Walk
    ) -> core_schema.CoreSchema:
        if 'json_schema_input_schema' in schema:
            schema['json_schema_input_schema'] = self.walk(schema['json_schema_input_schema'], f)
        return schema  # pyright: ignore[reportReturnType]

    # TODO duplicate schema types for serializers and validators, needs to be deduplicated:
    def handle_function_wrap_schema(
        self, schema: core_schema.WrapValidatorFunctionSchema | core_schema.WrapSerializerFunctionSerSchema, f: Walk
    ) -> core_schema.CoreSchema:
        if 'schema' in schema:
            schema['schema'] = self.walk(schema['schema'], f)
        if 'json_schema_input_schema' in schema:
            schema['json_schema_input_schema'] = self.walk(schema['json_schema_input_schema'], f)
        return schema  # pyright: ignore[reportReturnType]

    def handle_union_schema(self, schema: core_schema.UnionSchema, f: Walk) -> core_schema.CoreSchema:
        new_choices: list[CoreSchema | tuple[CoreSchema, str]] = []
        for v in schema['choices']:
            if isinstance(v, tuple):
                new_choices.append((self.walk(v[0], f), v[1]))
            else:
                new_choices.append(self.walk(v, f))
        schema['choices'] = new_choices
        return schema

    def handle_tagged_union_schema(self, schema: core_schema.TaggedUnionSchema, f: Walk) -> core_schema.CoreSchema:
        new_choices: dict[Hashable, core_schema.CoreSchema] = {}
        for k, v in schema['choices'].items():
            new_choices[k] = v if isinstance(v, (str, int)) else self.walk(v, f)
        schema['choices'] = new_choices
        return schema

    def handle_chain_schema(self, schema: core_schema.ChainSchema, f: Walk) -> core_schema.CoreSchema:
        schema['steps'] = [self.walk(v, f) for v in schema['steps']]
        return schema

    def handle_lax_or_strict_schema(self, schema: core_schema.LaxOrStrictSchema, f: Walk) -> core_schema.CoreSchema:
        schema['lax_schema'] = self.walk(schema['lax_schema'], f)
        schema['strict_schema'] = self.walk(schema['strict_schema'], f)
        return schema

    def handle_json_or_python_schema(self, schema: core_schema.JsonOrPythonSchema, f: Walk) -> core_schema.CoreSchema:
        schema['json_schema'] = self.walk(schema['json_schema'], f)
        schema['python_schema'] = self.walk(schema['python_schema'], f)
        return schema

    def handle_model_fields_schema(self, schema: core_schema.ModelFieldsSchema, f: Walk) -> core_schema.CoreSchema:
        extras_schema = schema.get('extras_schema')
        if extras_schema is not None:
            schema['extras_schema'] = self.walk(extras_schema, f)
        replaced_fields: dict[str, core_schema.ModelField] = {}
        replaced_computed_fields: list[core_schema.ComputedField] = []
        for computed_field in schema.get('computed_fields', ()):
            replaced_field = self._copy_schema(computed_field)
            replaced_field['return_schema'] = self.walk(computed_field['return_schema'], f)
            replaced_computed_fields.append(replaced_field)
        if replaced_computed_fields:
            schema['computed_fields'] = replaced_computed_fields
        for k, v in schema['fields'].items():
            replaced_field = self._copy_schema(v)
            replaced_field['schema'] = self.walk(v['schema'], f)
            replaced_fields[k] = replaced_field
        schema['fields'] = replaced_fields
        return schema

    def handle_typed_dict_schema(self, schema: core_schema.TypedDictSchema, f: Walk) -> core_schema.CoreSchema:
        extras_schema = schema.get('extras_schema')
        if extras_schema is not None:
            schema['extras_schema'] = self.walk(extras_schema, f)
        replaced_computed_fields: list[core_schema.ComputedField] = []
        for computed_field in schema.get('computed_fields', ()):
            replaced_field = self._copy_schema(computed_field)
            replaced_field['return_schema'] = self.walk(computed_field['return_schema'], f)
            replaced_computed_fields.append(replaced_field)
        if replaced_computed_fields:
            schema['computed_fields'] = replaced_computed_fields
        replaced_fields: dict[str, core_schema.TypedDictField] = {}
        for k, v in schema['fields'].items():
            replaced_field = self._copy_schema(v)
            replaced_field['schema'] = self.walk(v['schema'], f)
            replaced_fields[k] = replaced_field
        schema['fields'] = replaced_fields
        return schema

    def handle_dataclass_args_schema(self, schema: core_schema.DataclassArgsSchema, f: Walk) -> core_schema.CoreSchema:
        replaced_fields: list[core_schema.DataclassField] = []
        replaced_computed_fields: list[core_schema.ComputedField] = []
        for computed_field in schema.get('computed_fields', ()):
            replaced_field = self._copy_schema(computed_field)
            replaced_field['return_schema'] = self.walk(computed_field['return_schema'], f)
            replaced_computed_fields.append(replaced_field)
        if replaced_computed_fields:
            schema['computed_fields'] = replaced_computed_fields
        for field in schema['fields']:
            replaced_field = self._copy_schema(field)
            replaced_field['schema'] = self.walk(field['schema'], f)
            replaced_fields.append(replaced_field)
        schema['fields'] = replaced_fields
        return schema

    def handle_arguments_schema(self, schema: core_schema.ArgumentsSchema, f: Walk) -> core_schema.CoreSchema:
        replaced_arguments_schema: list[core_schema.ArgumentsParameter] = []
        for param in schema['arguments_schema']:
            replaced_param = self._copy_schema(param)
            replaced_param['schema'] = self.walk(param['schema'], f)
            replaced_arguments_schema.append(replaced_param)
        schema['arguments_schema'] = replaced_arguments_schema
        if 'var_args_schema' in schema:
            schema['var_args_schema'] = self.walk(schema['var_args_schema'], f)
        if 'var_kwargs_schema' in schema:
            schema['var_kwargs_schema'] = self.walk(schema['var_kwargs_schema'], f)
        return schema

    def handle_call_schema(self, schema: core_schema.CallSchema, f: Walk) -> core_schema.CoreSchema:
        schema['arguments_schema'] = self.walk(schema['arguments_schema'], f)
        if 'return_schema' in schema:
            schema['return_schema'] = self.walk(schema['return_schema'], f)
        return schema


_dispatch = _WalkCoreSchema().walk
_dispatch_no_copy = _WalkCoreSchema(copy=False).walk


def walk_core_schema(schema: core_schema.CoreSchema, f: Walk, *, copy: bool = True) -> core_schema.CoreSchema:
    """Recursively traverse a CoreSchema.

    Args:
        schema (core_schema.CoreSchema): The CoreSchema to process, it will not be modified.
        f (Walk): A function to apply. This function takes two arguments:
            1. The current CoreSchema that is being processed
                (not the same one you passed into this function, one level down).
            2. The "next" `f` to call. This lets you for example use
                `f=functools.partial(some_method, some_context)`
                to pass data down the recursive calls without using globals or other mutable state.
        copy: Whether schema should be recursively copied.

    Returns:
        core_schema.CoreSchema: A processed CoreSchema.
""" return f(schema.copy() if copy else schema, _dispatch if copy else _dispatch_no_copy) def simplify_schema_references(schema: core_schema.CoreSchema) -> core_schema.CoreSchema: # noqa: C901 definitions: dict[str, core_schema.CoreSchema] = {} ref_counts: dict[str, int] = defaultdict(int) involved_in_recursion: dict[str, bool] = {} current_recursion_ref_count: dict[str, int] = defaultdict(int) def collect_refs(s: core_schema.CoreSchema, recurse: Recurse) -> core_schema.CoreSchema: if s['type'] == 'definitions': for definition in s['definitions']: ref = get_ref(definition) assert ref is not None if ref not in definitions: definitions[ref] = definition recurse(definition, collect_refs) return recurse(s['schema'], collect_refs) else: ref = get_ref(s) if ref is not None: new = recurse(s, collect_refs) new_ref = get_ref(new) if new_ref: definitions[new_ref] = new return core_schema.definition_reference_schema(schema_ref=ref) else: return recurse(s, collect_refs) schema = walk_core_schema(schema, collect_refs) def count_refs(s: core_schema.CoreSchema, recurse: Recurse) -> core_schema.CoreSchema: if s['type'] != 'definition-ref': return recurse(s, count_refs) ref = s['schema_ref'] ref_counts[ref] += 1 if ref_counts[ref] >= 2: # If this model is involved in a recursion this should be detected # on its second encounter, we can safely stop the walk here. 
        if current_recursion_ref_count[ref] != 0:
            involved_in_recursion[ref] = True
            return s

        current_recursion_ref_count[ref] += 1
        if 'serialization' in s:
            # Even though this is a `'definition-ref'` schema, there might
            # be more references inside the serialization schema:
            recurse(s, count_refs)

        next_s = definitions[ref]
        visited: set[str] = set()
        while next_s['type'] == 'definition-ref':
            if next_s['schema_ref'] in visited:
                raise PydanticUserError(
                    f'{ref} contains a circular reference to itself.', code='circular-reference-schema'
                )

            visited.add(next_s['schema_ref'])
            ref_counts[next_s['schema_ref']] += 1
            next_s = definitions[next_s['schema_ref']]

        recurse(next_s, count_refs)
        current_recursion_ref_count[ref] -= 1
        return s

    schema = walk_core_schema(schema, count_refs, copy=False)

    assert all(c == 0 for c in current_recursion_ref_count.values()), 'this is a bug! please report it'

    def can_be_inlined(s: core_schema.DefinitionReferenceSchema, ref: str) -> bool:
        if ref_counts[ref] > 1:
            return False
        if involved_in_recursion.get(ref, False):
            return False
        if 'serialization' in s:
            return False
        if 'metadata' in s:
            metadata = s['metadata']
            for k in [
                *CoreMetadata.__annotations__.keys(),
                'pydantic.internal.union_discriminator',
                'pydantic.internal.tagged_union_tag',
            ]:
                if k in metadata:
                    # we need to keep this as a ref
                    return False
        return True

    def inline_refs(s: core_schema.CoreSchema, recurse: Recurse) -> core_schema.CoreSchema:
        # Assume there are no infinite loops, because we already checked for that in `count_refs`
        while s['type'] == 'definition-ref':
            ref = s['schema_ref']

            # Check if the reference is only used once, not involved in recursion and does not have
            # any extra keys (like 'serialization')
            if can_be_inlined(s, ref):
                # Inline the reference by replacing the reference with the actual schema
                new = definitions.pop(ref)
                ref_counts[ref] -= 1  # because we just replaced it!

                # put all other keys that were on the def-ref schema into the inlined version
                # in particular this is needed for `serialization`
                if 'serialization' in s:
                    new['serialization'] = s['serialization']

                s = new
            else:
                break

        return recurse(s, inline_refs)

    schema = walk_core_schema(schema, inline_refs, copy=False)

    def_values = [v for v in definitions.values() if ref_counts[v['ref']] > 0]  # type: ignore

    if def_values:
        schema = core_schema.definitions_schema(schema=schema, definitions=def_values)

    return schema


def _strip_metadata(schema: CoreSchema) -> CoreSchema:
    def strip_metadata(s: CoreSchema, recurse: Recurse) -> CoreSchema:
        s = s.copy()
        s.pop('metadata', None)
        if s['type'] == 'model-fields':
            s = s.copy()
            s['fields'] = {k: v.copy() for k, v in s['fields'].items()}
            for field_name, field_schema in s['fields'].items():
                field_schema.pop('metadata', None)
                s['fields'][field_name] = field_schema
            computed_fields = s.get('computed_fields', None)
            if computed_fields:
                s['computed_fields'] = [cf.copy() for cf in computed_fields]
                for cf in computed_fields:
                    cf.pop('metadata', None)
            else:
                s.pop('computed_fields', None)
        elif s['type'] == 'model':
            # remove some defaults
            if s.get('custom_init', True) is False:
                s.pop('custom_init')
            if s.get('root_model', True) is False:
                s.pop('root_model')
            if {'title'}.issuperset(s.get('config', {}).keys()):
                s.pop('config', None)

        return recurse(s, strip_metadata)

    return walk_core_schema(schema, strip_metadata)


def pretty_print_core_schema(
    schema: CoreSchema,
    include_metadata: bool = False,
) -> None:
    """Pretty print a CoreSchema using rich. This is intended for debugging purposes.

    Args:
        schema: The CoreSchema to print.
        include_metadata: Whether to include metadata in the output. Defaults to `False`.
    """
    from rich import print  # type: ignore  # install it manually in your dev env

    if not include_metadata:
        schema = _strip_metadata(schema)

    return print(schema)


def validate_core_schema(schema: CoreSchema) -> CoreSchema:
    if 'PYDANTIC_SKIP_VALIDATING_CORE_SCHEMAS' in os.environ:
        return schema
    return _validate_core_schema(schema)


# ---- pydantic-2.10.6/pydantic/_internal/_dataclasses.py ----

"""Private logic for creating pydantic dataclasses."""

from __future__ import annotations as _annotations

import dataclasses
import typing
import warnings
from functools import partial, wraps
from typing import Any, ClassVar

from pydantic_core import (
    ArgsKwargs,
    SchemaSerializer,
    SchemaValidator,
    core_schema,
)
from typing_extensions import TypeGuard

from ..errors import PydanticUndefinedAnnotation
from ..plugin._schema_validator import PluggableSchemaValidator, create_schema_validator
from ..warnings import PydanticDeprecatedSince20
from . import _config, _decorators
from ._fields import collect_dataclass_fields
from ._generate_schema import GenerateSchema
from ._generics import get_standard_typevars_map
from ._mock_val_ser import set_dataclass_mocks
from ._namespace_utils import NsResolver
from ._schema_generation_shared import CallbackGetCoreSchemaHandler
from ._signature import generate_pydantic_signature
from ._utils import LazyClassAttribute

if typing.TYPE_CHECKING:
    from _typeshed import DataclassInstance as StandardDataclass

    from ..config import ConfigDict
    from ..fields import FieldInfo

    class PydanticDataclass(StandardDataclass, typing.Protocol):
        """A protocol containing attributes only available once a class has been decorated as a Pydantic dataclass.

        Attributes:
            __pydantic_config__: Pydantic-specific configuration settings for the dataclass.
            __pydantic_complete__: Whether dataclass building is completed, or if there are still undefined fields.
            __pydantic_core_schema__: The pydantic-core schema used to build the SchemaValidator and SchemaSerializer.
            __pydantic_decorators__: Metadata containing the decorators defined on the dataclass.
            __pydantic_fields__: Metadata about the fields defined on the dataclass.
            __pydantic_serializer__: The pydantic-core SchemaSerializer used to dump instances of the dataclass.
            __pydantic_validator__: The pydantic-core SchemaValidator used to validate instances of the dataclass.
        """

        __pydantic_config__: ClassVar[ConfigDict]
        __pydantic_complete__: ClassVar[bool]
        __pydantic_core_schema__: ClassVar[core_schema.CoreSchema]
        __pydantic_decorators__: ClassVar[_decorators.DecoratorInfos]
        __pydantic_fields__: ClassVar[dict[str, FieldInfo]]
        __pydantic_serializer__: ClassVar[SchemaSerializer]
        __pydantic_validator__: ClassVar[SchemaValidator | PluggableSchemaValidator]

else:
    # See PyCharm issues https://youtrack.jetbrains.com/issue/PY-21915
    # and https://youtrack.jetbrains.com/issue/PY-51428
    DeprecationWarning = PydanticDeprecatedSince20


def set_dataclass_fields(
    cls: type[StandardDataclass],
    ns_resolver: NsResolver | None = None,
    config_wrapper: _config.ConfigWrapper | None = None,
) -> None:
    """Collect and set `cls.__pydantic_fields__`.

    Args:
        cls: The class.
        ns_resolver: Namespace resolver to use when getting dataclass annotations.
        config_wrapper: The config wrapper instance, defaults to `None`.
    """
    typevars_map = get_standard_typevars_map(cls)
    fields = collect_dataclass_fields(
        cls, ns_resolver=ns_resolver, typevars_map=typevars_map, config_wrapper=config_wrapper
    )

    cls.__pydantic_fields__ = fields  # type: ignore


def complete_dataclass(
    cls: type[Any],
    config_wrapper: _config.ConfigWrapper,
    *,
    raise_errors: bool = True,
    ns_resolver: NsResolver | None = None,
    _force_build: bool = False,
) -> bool:
    """Finish building a pydantic dataclass.

    This logic is called on a class which has already been wrapped in `dataclasses.dataclass()`.
    This is somewhat analogous to `pydantic._internal._model_construction.complete_model_class`.

    Args:
        cls: The class.
        config_wrapper: The config wrapper instance.
        raise_errors: Whether to raise errors, defaults to `True`.
        ns_resolver: The namespace resolver instance to use when collecting dataclass fields
            and during schema building.
        _force_build: Whether to force building the dataclass, no matter if
            [`defer_build`][pydantic.config.ConfigDict.defer_build] is set.

    Returns:
        `True` if building a pydantic dataclass is successfully completed, `False` otherwise.

    Raises:
        PydanticUndefinedAnnotation: If `raise_errors` is `True` and there are undefined annotations.
    """
    original_init = cls.__init__

    # dataclass.__init__ must be defined here so its `__qualname__` can be changed since functions can't be copied,
    # and so that the mock validator is used if building was deferred:
    def __init__(__dataclass_self__: PydanticDataclass, *args: Any, **kwargs: Any) -> None:
        __tracebackhide__ = True
        s = __dataclass_self__
        s.__pydantic_validator__.validate_python(ArgsKwargs(args, kwargs), self_instance=s)

    __init__.__qualname__ = f'{cls.__qualname__}.__init__'

    cls.__init__ = __init__  # type: ignore
    cls.__pydantic_config__ = config_wrapper.config_dict  # type: ignore

    set_dataclass_fields(cls, ns_resolver, config_wrapper=config_wrapper)

    if not _force_build and config_wrapper.defer_build:
        set_dataclass_mocks(cls, cls.__name__)
        return False

    if hasattr(cls, '__post_init_post_parse__'):
        warnings.warn(
            'Support for `__post_init_post_parse__` has been dropped, the method will not be called', DeprecationWarning
        )

    typevars_map = get_standard_typevars_map(cls)
    gen_schema = GenerateSchema(
        config_wrapper,
        ns_resolver=ns_resolver,
        typevars_map=typevars_map,
    )

    # set __signature__ attr only for the class, but not for its instances
    # (because instances can define `__call__`, and `inspect.signature` shouldn't
    # use the `__signature__` attribute and instead generate from `__call__`).
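`LazyClassAttribute` defers computing `__signature__` until it is first accessed on the class. The descriptor pattern behind it can be sketched with a hypothetical `LazyAttr` helper (not the pydantic implementation — a minimal sketch of the same idea):

```python
class LazyAttr:
    # A class-level descriptor that computes its value on first access
    # and then caches it by overwriting itself on the owner class.
    def __init__(self, name, fn):
        self.name, self.fn = name, fn

    def __get__(self, obj, owner=None):
        value = self.fn()
        setattr(owner, self.name, value)  # replace the descriptor with the value
        return value


class M:
    answer = LazyAttr('answer', lambda: 40 + 2)


print(M.answer)  # 42 -- computed lazily, then cached as a plain attribute
```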
    cls.__signature__ = LazyClassAttribute(
        '__signature__',
        partial(
            generate_pydantic_signature,
            # It's important that we reference the `original_init` here,
            # as it is the one synthesized by the stdlib `dataclass` module:
            init=original_init,
            fields=cls.__pydantic_fields__,  # type: ignore
            populate_by_name=config_wrapper.populate_by_name,
            extra=config_wrapper.extra,
            is_dataclass=True,
        ),
    )
    get_core_schema = getattr(cls, '__get_pydantic_core_schema__', None)
    try:
        if get_core_schema:
            schema = get_core_schema(
                cls,
                CallbackGetCoreSchemaHandler(
                    partial(gen_schema.generate_schema, from_dunder_get_core_schema=False),
                    gen_schema,
                    ref_mode='unpack',
                ),
            )
        else:
            schema = gen_schema.generate_schema(cls, from_dunder_get_core_schema=False)
    except PydanticUndefinedAnnotation as e:
        if raise_errors:
            raise
        set_dataclass_mocks(cls, cls.__name__, f'`{e.name}`')
        return False

    core_config = config_wrapper.core_config(title=cls.__name__)

    try:
        schema = gen_schema.clean_schema(schema)
    except gen_schema.CollectedInvalid:
        set_dataclass_mocks(cls, cls.__name__, 'all referenced types')
        return False

    # We are about to set all the remaining required properties expected for this cast;
    # __pydantic_decorators__ and __pydantic_fields__ should already be set
    cls = typing.cast('type[PydanticDataclass]', cls)
    # debug(schema)

    cls.__pydantic_core_schema__ = schema
    cls.__pydantic_validator__ = validator = create_schema_validator(
        schema, cls, cls.__module__, cls.__qualname__, 'dataclass', core_config, config_wrapper.plugin_settings
    )
    cls.__pydantic_serializer__ = SchemaSerializer(schema, core_config)

    if config_wrapper.validate_assignment:

        @wraps(cls.__setattr__)
        def validated_setattr(instance: Any, field: str, value: str, /) -> None:
            validator.validate_assignment(instance, field, value)

        cls.__setattr__ = validated_setattr.__get__(None, cls)  # type: ignore

    cls.__pydantic_complete__ = True
    return True


def is_builtin_dataclass(_cls: type[Any]) -> TypeGuard[type[StandardDataclass]]:
    """Returns True if a class
    is a stdlib dataclass and *not* a pydantic dataclass.

    We check that
    - `_cls` is a dataclass
    - `_cls` does not inherit from a processed pydantic dataclass (and thus have a `__pydantic_validator__`)
    - `_cls` does not have any annotations that are not dataclass fields
    e.g.
    ```python
    import dataclasses

    import pydantic.dataclasses

    @dataclasses.dataclass
    class A:
        x: int

    @pydantic.dataclasses.dataclass
    class B(A):
        y: int
    ```
    In this case, when we first check `B`, we make an extra check and look at the annotations ('y'),
    which won't be a superset of all the dataclass fields (only the stdlib fields i.e. 'x')

    Args:
        _cls: The class.

    Returns:
        `True` if the class is a stdlib dataclass, `False` otherwise.
    """
    return (
        dataclasses.is_dataclass(_cls)
        and not hasattr(_cls, '__pydantic_validator__')
        and set(_cls.__dataclass_fields__).issuperset(set(getattr(_cls, '__annotations__', {})))
    )


# ---- pydantic-2.10.6/pydantic/_internal/_decorators.py ----

"""Logic related to validators applied to models etc.
via the `@field_validator` and `@model_validator` decorators."""

from __future__ import annotations as _annotations

from collections import deque
from dataclasses import dataclass, field
from functools import cached_property, partial, partialmethod
from inspect import Parameter, Signature, isdatadescriptor, ismethoddescriptor, signature
from itertools import islice
from typing import TYPE_CHECKING, Any, Callable, ClassVar, Generic, Iterable, TypeVar, Union

from pydantic_core import PydanticUndefined, core_schema
from typing_extensions import Literal, TypeAlias, is_typeddict

from ..errors import PydanticUserError
from ._core_utils import get_type_ref
from ._internal_dataclass import slots_true
from ._namespace_utils import GlobalsNamespace, MappingNamespace
from ._typing_extra import get_function_type_hints
from ._utils import can_be_positional

if TYPE_CHECKING:
    from ..fields import ComputedFieldInfo
    from ..functional_validators import FieldValidatorModes


@dataclass(**slots_true)
class ValidatorDecoratorInfo:
    """A container for data from `@validator` so that we can access it
    while building the pydantic-core schema.

    Attributes:
        decorator_repr: A class variable representing the decorator string, '@validator'.
        fields: A tuple of field names the validator should be called on.
        mode: The proposed validator mode.
        each_item: For complex objects (sets, lists etc.) whether to validate
            individual elements rather than the whole object.
        always: Whether this method and other validators should be called even if the value is missing.
        check_fields: Whether to check that the fields actually exist on the model.
    """

    decorator_repr: ClassVar[str] = '@validator'

    fields: tuple[str, ...]
    mode: Literal['before', 'after']
    each_item: bool
    always: bool
    check_fields: bool | None


@dataclass(**slots_true)
class FieldValidatorDecoratorInfo:
    """A container for data from `@field_validator` so that we can access it
    while building the pydantic-core schema.
    Attributes:
        decorator_repr: A class variable representing the decorator string, '@field_validator'.
        fields: A tuple of field names the validator should be called on.
        mode: The proposed validator mode.
        check_fields: Whether to check that the fields actually exist on the model.
        json_schema_input_type: The input type of the function. This is only used to generate
            the appropriate JSON Schema (in validation mode) and can only be specified
            when `mode` is either `'before'`, `'plain'` or `'wrap'`.
    """

    decorator_repr: ClassVar[str] = '@field_validator'
    fields: tuple[str, ...]
    mode: FieldValidatorModes
    check_fields: bool | None
    json_schema_input_type: Any


@dataclass(**slots_true)
class RootValidatorDecoratorInfo:
    """A container for data from `@root_validator` so that we can access it
    while building the pydantic-core schema.

    Attributes:
        decorator_repr: A class variable representing the decorator string, '@root_validator'.
        mode: The proposed validator mode.
    """

    decorator_repr: ClassVar[str] = '@root_validator'
    mode: Literal['before', 'after']


@dataclass(**slots_true)
class FieldSerializerDecoratorInfo:
    """A container for data from `@field_serializer` so that we can access it
    while building the pydantic-core schema.

    Attributes:
        decorator_repr: A class variable representing the decorator string, '@field_serializer'.
        fields: A tuple of field names the serializer should be called on.
        mode: The proposed serializer mode.
        return_type: The type of the serializer's return value.
        when_used: The serialization condition. Accepts a string with values `'always'`,
            `'unless-none'`, `'json'`, and `'json-unless-none'`.
        check_fields: Whether to check that the fields actually exist on the model.
    """

    decorator_repr: ClassVar[str] = '@field_serializer'
    fields: tuple[str, ...]
    mode: Literal['plain', 'wrap']
    return_type: Any
    when_used: core_schema.WhenUsed
    check_fields: bool | None


@dataclass(**slots_true)
class ModelSerializerDecoratorInfo:
    """A container for data from `@model_serializer` so that we can access it
    while building the pydantic-core schema.

    Attributes:
        decorator_repr: A class variable representing the decorator string, '@model_serializer'.
        mode: The proposed serializer mode.
        return_type: The type of the serializer's return value.
        when_used: The serialization condition. Accepts a string with values `'always'`,
            `'unless-none'`, `'json'`, and `'json-unless-none'`.
    """

    decorator_repr: ClassVar[str] = '@model_serializer'
    mode: Literal['plain', 'wrap']
    return_type: Any
    when_used: core_schema.WhenUsed


@dataclass(**slots_true)
class ModelValidatorDecoratorInfo:
    """A container for data from `@model_validator` so that we can access it
    while building the pydantic-core schema.

    Attributes:
        decorator_repr: A class variable representing the decorator string, '@model_validator'.
        mode: The proposed validator mode.
    """

    decorator_repr: ClassVar[str] = '@model_validator'
    mode: Literal['wrap', 'before', 'after']


DecoratorInfo: TypeAlias = """Union[
    ValidatorDecoratorInfo,
    FieldValidatorDecoratorInfo,
    RootValidatorDecoratorInfo,
    FieldSerializerDecoratorInfo,
    ModelSerializerDecoratorInfo,
    ModelValidatorDecoratorInfo,
    ComputedFieldInfo,
]"""

ReturnType = TypeVar('ReturnType')
DecoratedType: TypeAlias = (
    'Union[classmethod[Any, Any, ReturnType], staticmethod[Any, ReturnType], Callable[..., ReturnType], property]'
)


@dataclass  # can't use slots here since we set attributes on `__post_init__`
class PydanticDescriptorProxy(Generic[ReturnType]):
    """Wrap a classmethod, staticmethod, property or unbound function
    and act as a descriptor that allows us to detect decorated items
    from the class' attributes.

    This class' __get__ returns the wrapped item's __get__ result,
    which makes it transparent for classmethods and staticmethods.
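That "transparency" — the proxy's `__get__` simply delegating to the wrapped item's `__get__` — can be sketched with a toy descriptor (`Proxy` and `M` below are illustrative names, not pydantic API):

```python
class Proxy:
    # Minimal sketch of a transparent descriptor wrapper: attribute access
    # on the class is forwarded to the wrapped descriptor's __get__.
    def __init__(self, wrapped):
        self.wrapped = wrapped

    def __get__(self, obj, obj_type=None):
        return self.wrapped.__get__(obj, obj_type)


class M:
    @staticmethod
    def f():
        return 'ok'


# Wrap the raw staticmethod object; access through the proxy still behaves
# exactly like accessing the staticmethod directly.
M.g = Proxy(M.__dict__['f'])
print(M.g())  # ok
```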
    Attributes:
        wrapped: The decorator that has to be wrapped.
        decorator_info: The decorator info.
        shim: A wrapper function to wrap V1 style function.
    """

    wrapped: DecoratedType[ReturnType]
    decorator_info: DecoratorInfo
    shim: Callable[[Callable[..., Any]], Callable[..., Any]] | None = None

    def __post_init__(self):
        for attr in 'setter', 'deleter':
            if hasattr(self.wrapped, attr):
                f = partial(self._call_wrapped_attr, name=attr)
                setattr(self, attr, f)

    def _call_wrapped_attr(self, func: Callable[[Any], None], *, name: str) -> PydanticDescriptorProxy[ReturnType]:
        self.wrapped = getattr(self.wrapped, name)(func)
        if isinstance(self.wrapped, property):
            # update ComputedFieldInfo.wrapped_property
            from ..fields import ComputedFieldInfo

            if isinstance(self.decorator_info, ComputedFieldInfo):
                self.decorator_info.wrapped_property = self.wrapped
        return self

    def __get__(self, obj: object | None, obj_type: type[object] | None = None) -> PydanticDescriptorProxy[ReturnType]:
        try:
            return self.wrapped.__get__(obj, obj_type)
        except AttributeError:
            # not a descriptor, e.g. a partial object
            return self.wrapped  # type: ignore[return-value]

    def __set_name__(self, instance: Any, name: str) -> None:
        if hasattr(self.wrapped, '__set_name__'):
            self.wrapped.__set_name__(instance, name)  # pyright: ignore[reportFunctionMemberAccess]

    def __getattr__(self, __name: str) -> Any:
        """Forward checks for __isabstractmethod__ and such."""
        return getattr(self.wrapped, __name)


DecoratorInfoType = TypeVar('DecoratorInfoType', bound=DecoratorInfo)


@dataclass(**slots_true)
class Decorator(Generic[DecoratorInfoType]):
    """A generic container class to join together the decorator metadata
    (metadata from decorator itself, which we have when the
    decorator is called but not when we are building the core-schema)
    and the bound function (which we have after the class itself is created).

    Attributes:
        cls_ref: The class ref.
        cls_var_name: The decorated function name.
        func: The decorated function.
        shim: A wrapper function to wrap V1 style function.
        info: The decorator info.
    """

    cls_ref: str
    cls_var_name: str
    func: Callable[..., Any]
    shim: Callable[[Any], Any] | None
    info: DecoratorInfoType

    @staticmethod
    def build(
        cls_: Any,
        *,
        cls_var_name: str,
        shim: Callable[[Any], Any] | None,
        info: DecoratorInfoType,
    ) -> Decorator[DecoratorInfoType]:
        """Build a new decorator.

        Args:
            cls_: The class.
            cls_var_name: The decorated function name.
            shim: A wrapper function to wrap V1 style function.
            info: The decorator info.

        Returns:
            The new decorator instance.
        """
        func = get_attribute_from_bases(cls_, cls_var_name)
        if shim is not None:
            func = shim(func)
        func = unwrap_wrapped_function(func, unwrap_partial=False)
        if not callable(func):
            # This branch will get hit for classmethod properties
            attribute = get_attribute_from_base_dicts(cls_, cls_var_name)  # prevents the binding call to `__get__`
            if isinstance(attribute, PydanticDescriptorProxy):
                func = unwrap_wrapped_function(attribute.wrapped)
        return Decorator(
            cls_ref=get_type_ref(cls_),
            cls_var_name=cls_var_name,
            func=func,
            shim=shim,
            info=info,
        )

    def bind_to_cls(self, cls: Any) -> Decorator[DecoratorInfoType]:
        """Bind the decorator to a class.

        Args:
            cls: the class.

        Returns:
            The new decorator instance.
        """
        return self.build(
            cls,
            cls_var_name=self.cls_var_name,
            shim=self.shim,
            info=self.info,
        )


def get_bases(tp: type[Any]) -> tuple[type[Any], ...]:
    """Get the base classes of a class or typeddict.

    Args:
        tp: The type or class to get the bases.

    Returns:
        The base classes.
    """
    if is_typeddict(tp):
        return tp.__orig_bases__  # type: ignore
    try:
        return tp.__bases__
    except AttributeError:
        return ()


def mro(tp: type[Any]) -> tuple[type[Any], ...]:
    """Calculate the Method Resolution Order of bases using the C3 algorithm.
    See https://www.python.org/download/releases/2.3/mro/
    """
    # try to use the existing mro, for performance mainly
    # but also because it helps verify the implementation below
    if not is_typeddict(tp):
        try:
            return tp.__mro__
        except AttributeError:
            # GenericAlias and some other cases
            pass
    bases = get_bases(tp)
    return (tp,) + mro_for_bases(bases)


def mro_for_bases(bases: tuple[type[Any], ...]) -> tuple[type[Any], ...]:
    def merge_seqs(seqs: list[deque[type[Any]]]) -> Iterable[type[Any]]:
        while True:
            non_empty = [seq for seq in seqs if seq]
            if not non_empty:
                # Nothing left to process, we're done.
                return
            candidate: type[Any] | None = None
            for seq in non_empty:  # Find merge candidates among seq heads.
                candidate = seq[0]
                not_head = [s for s in non_empty if candidate in islice(s, 1, None)]
                if not_head:
                    # Reject the candidate.
                    candidate = None
                else:
                    break
            if not candidate:
                raise TypeError('Inconsistent hierarchy, no C3 MRO is possible')
            yield candidate
            for seq in non_empty:
                # Remove candidate.
                if seq[0] == candidate:
                    seq.popleft()

    seqs = [deque(mro(base)) for base in bases] + [deque(bases)]
    return tuple(merge_seqs(seqs))


_sentinel = object()


def get_attribute_from_bases(tp: type[Any] | tuple[type[Any], ...], name: str) -> Any:
    """Get the attribute from the next class in the MRO that has it,
    aiming to simulate calling the method on the actual class.

    The reason for iterating over the mro instead of just getting
    the attribute (which would do that for us) is to support TypedDict,
    which lacks a real __mro__, but can have a virtual one constructed
    from its bases (as done here).

    Args:
        tp: The type or class to search for the attribute. If a tuple, this is treated as a set of base classes.
        name: The name of the attribute to retrieve.

    Returns:
        Any: The attribute value, if found.

    Raises:
        AttributeError: If the attribute is not found in any class in the MRO.
    """
    if isinstance(tp, tuple):
        for base in mro_for_bases(tp):
            attribute = base.__dict__.get(name, _sentinel)
            if attribute is not _sentinel:
                attribute_get = getattr(attribute, '__get__', None)
                if attribute_get is not None:
                    return attribute_get(None, tp)
                return attribute
        raise AttributeError(f'{name} not found in {tp}')
    else:
        try:
            return getattr(tp, name)
        except AttributeError:
            return get_attribute_from_bases(mro(tp), name)


def get_attribute_from_base_dicts(tp: type[Any], name: str) -> Any:
    """Get an attribute out of the `__dict__` following the MRO.
    This prevents the call to `__get__` on the descriptor, and allows
    us to get the original function for classmethod properties.

    Args:
        tp: The type or class to search for the attribute.
        name: The name of the attribute to retrieve.

    Returns:
        Any: The attribute value, if found.

    Raises:
        KeyError: If the attribute is not found in any class's `__dict__` in the MRO.
    """
    for base in reversed(mro(tp)):
        if name in base.__dict__:
            return base.__dict__[name]
    return tp.__dict__[name]  # raise the error


@dataclass(**slots_true)
class DecoratorInfos:
    """Mapping of name in the class namespace to decorator info.

    note that the name in the class namespace is the function or attribute name
    not the field name!
    """

    validators: dict[str, Decorator[ValidatorDecoratorInfo]] = field(default_factory=dict)
    field_validators: dict[str, Decorator[FieldValidatorDecoratorInfo]] = field(default_factory=dict)
    root_validators: dict[str, Decorator[RootValidatorDecoratorInfo]] = field(default_factory=dict)
    field_serializers: dict[str, Decorator[FieldSerializerDecoratorInfo]] = field(default_factory=dict)
    model_serializers: dict[str, Decorator[ModelSerializerDecoratorInfo]] = field(default_factory=dict)
    model_validators: dict[str, Decorator[ModelValidatorDecoratorInfo]] = field(default_factory=dict)
    computed_fields: dict[str, Decorator[ComputedFieldInfo]] = field(default_factory=dict)

    @staticmethod
    def build(model_dc: type[Any]) -> DecoratorInfos:  # noqa: C901 (ignore complexity)
        """We want to collect all DecFunc instances that exist as attributes
        in the namespace of the class (a BaseModel or dataclass) that called us.
        But we want to collect these in the order of the bases.
        So instead of getting them all from the leaf class (the class that called us),
        we traverse the bases from root (the oldest ancestor class) to leaf
        and collect all of the instances as we go, taking care to replace
        any duplicate ones with the last one we see to mimic how function overriding
        works with inheritance.
        If we do replace any functions we put the replacement into the position
        the replaced function was in; that is, we maintain the order.
        """
        # reminder: dicts are ordered and replacement does not alter the order
        res = DecoratorInfos()
        for base in reversed(mro(model_dc)[1:]):
            existing: DecoratorInfos | None = base.__dict__.get('__pydantic_decorators__')
            if existing is None:
                existing = DecoratorInfos.build(base)
            res.validators.update({k: v.bind_to_cls(model_dc) for k, v in existing.validators.items()})
            res.field_validators.update({k: v.bind_to_cls(model_dc) for k, v in existing.field_validators.items()})
            res.root_validators.update({k: v.bind_to_cls(model_dc) for k, v in existing.root_validators.items()})
            res.field_serializers.update({k: v.bind_to_cls(model_dc) for k, v in existing.field_serializers.items()})
            res.model_serializers.update({k: v.bind_to_cls(model_dc) for k, v in existing.model_serializers.items()})
            res.model_validators.update({k: v.bind_to_cls(model_dc) for k, v in existing.model_validators.items()})
            res.computed_fields.update({k: v.bind_to_cls(model_dc) for k, v in existing.computed_fields.items()})

        to_replace: list[tuple[str, Any]] = []

        for var_name, var_value in vars(model_dc).items():
            if isinstance(var_value, PydanticDescriptorProxy):
                info = var_value.decorator_info
                if isinstance(info, ValidatorDecoratorInfo):
                    res.validators[var_name] = Decorator.build(
                        model_dc, cls_var_name=var_name, shim=var_value.shim, info=info
                    )
                elif isinstance(info, FieldValidatorDecoratorInfo):
                    res.field_validators[var_name] = Decorator.build(
                        model_dc, cls_var_name=var_name, shim=var_value.shim, info=info
                    )
                elif isinstance(info, RootValidatorDecoratorInfo):
                    res.root_validators[var_name] = Decorator.build(
                        model_dc, cls_var_name=var_name, shim=var_value.shim, info=info
                    )
                elif isinstance(info, FieldSerializerDecoratorInfo):
                    # check whether a serializer function is already registered for fields
                    for field_serializer_decorator in res.field_serializers.values():
                        # check that each field has at most one serializer function.
                        # serializer functions for the same field in subclasses are allowed,
                        # and are treated as overrides
                        if field_serializer_decorator.cls_var_name == var_name:
                            continue
                        for f in info.fields:
                            if f in field_serializer_decorator.info.fields:
                                raise PydanticUserError(
                                    'Multiple field serializer functions were defined '
                                    f'for field {f!r}, this is not allowed.',
                                    code='multiple-field-serializers',
                                )
                    res.field_serializers[var_name] = Decorator.build(
                        model_dc, cls_var_name=var_name, shim=var_value.shim, info=info
                    )
                elif isinstance(info, ModelValidatorDecoratorInfo):
                    res.model_validators[var_name] = Decorator.build(
                        model_dc, cls_var_name=var_name, shim=var_value.shim, info=info
                    )
                elif isinstance(info, ModelSerializerDecoratorInfo):
                    res.model_serializers[var_name] = Decorator.build(
                        model_dc, cls_var_name=var_name, shim=var_value.shim, info=info
                    )
                else:
                    from ..fields import ComputedFieldInfo

                    assert isinstance(info, ComputedFieldInfo)
                    res.computed_fields[var_name] = Decorator.build(
                        model_dc, cls_var_name=var_name, shim=None, info=info
                    )
                to_replace.append((var_name, var_value.wrapped))
        if to_replace:
            # If we can save `__pydantic_decorators__` on the class we'll be able to check for it above
            # so then we don't need to re-process the type, which means we can discard our descriptor wrappers
            # and replace them with the thing they are wrapping (see the other setattr call below)
            # which allows validator class methods to also function as regular class methods
            model_dc.__pydantic_decorators__ = res
            for name, value in to_replace:
                setattr(model_dc, name, value)
        return res


def inspect_validator(validator: Callable[..., Any], mode: FieldValidatorModes) -> bool:
    """Look at a field or model validator function and determine whether it takes an info argument.

    An error is raised if the function has an invalid signature.

    Args:
        validator: The validator function to inspect.
        mode: The proposed validator mode.

    Returns:
        Whether the validator takes an info argument.
    """
    try:
        sig = signature(validator)
    except (ValueError, TypeError):
        # `inspect.signature` might not be able to infer a signature, e.g. with C objects.
        # In this case, we assume no info argument is present:
        return False
    n_positional = count_positional_required_params(sig)
    if mode == 'wrap':
        if n_positional == 3:
            return True
        elif n_positional == 2:
            return False
    else:
        assert mode in {'before', 'after', 'plain'}, f"invalid mode: {mode!r}, expected 'before', 'after' or 'plain'"
        if n_positional == 2:
            return True
        elif n_positional == 1:
            return False

    raise PydanticUserError(
        f'Unrecognized field_validator function signature for {validator} with `mode={mode}`:{sig}',
        code='validator-signature',
    )


def inspect_field_serializer(serializer: Callable[..., Any], mode: Literal['plain', 'wrap']) -> tuple[bool, bool]:
    """Look at a field serializer function and determine if it is a field serializer,
    and whether it takes an info argument.

    An error is raised if the function has an invalid signature.

    Args:
        serializer: The serializer function to inspect.
        mode: The serializer mode, either 'plain' or 'wrap'.

    Returns:
        Tuple of (is_field_serializer, info_arg).
    """
    try:
        sig = signature(serializer)
    except (ValueError, TypeError):
        # `inspect.signature` might not be able to infer a signature, e.g. with C objects.
        # In this case, we assume no info argument is present and this is not a method:
        return (False, False)
    first = next(iter(sig.parameters.values()), None)
    is_field_serializer = first is not None and first.name == 'self'

    n_positional = count_positional_required_params(sig)
    if is_field_serializer:
        # -1 to correct for self parameter
        info_arg = _serializer_info_arg(mode, n_positional - 1)
    else:
        info_arg = _serializer_info_arg(mode, n_positional)

    if info_arg is None:
        raise PydanticUserError(
            f'Unrecognized field_serializer function signature for {serializer} with `mode={mode}`:{sig}',
            code='field-serializer-signature',
        )

    return is_field_serializer, info_arg


def inspect_annotated_serializer(serializer: Callable[..., Any], mode: Literal['plain', 'wrap']) -> bool:
    """Look at a serializer function used via `Annotated` and determine whether it takes an info argument.

    An error is raised if the function has an invalid signature.

    Args:
        serializer: The serializer function to check.
        mode: The serializer mode, either 'plain' or 'wrap'.

    Returns:
        info_arg
    """
    try:
        sig = signature(serializer)
    except (ValueError, TypeError):
        # `inspect.signature` might not be able to infer a signature, e.g. with C objects.
        # In this case, we assume no info argument is present:
        return False
    info_arg = _serializer_info_arg(mode, count_positional_required_params(sig))
    if info_arg is None:
        raise PydanticUserError(
            f'Unrecognized field_serializer function signature for {serializer} with `mode={mode}`:{sig}',
            code='field-serializer-signature',
        )
    else:
        return info_arg


def inspect_model_serializer(serializer: Callable[..., Any], mode: Literal['plain', 'wrap']) -> bool:
    """Look at a model serializer function and determine whether it takes an info argument.

    An error is raised if the function has an invalid signature.

    Args:
        serializer: The serializer function to check.
        mode: The serializer mode, either 'plain' or 'wrap'.

    Returns:
        `info_arg` - whether the function expects an info argument.
    """
    if isinstance(serializer, (staticmethod, classmethod)) or not is_instance_method_from_sig(serializer):
        raise PydanticUserError(
            '`@model_serializer` must be applied to instance methods', code='model-serializer-instance-method'
        )

    sig = signature(serializer)
    info_arg = _serializer_info_arg(mode, count_positional_required_params(sig))
    if info_arg is None:
        raise PydanticUserError(
            f'Unrecognized model_serializer function signature for {serializer} with `mode={mode}`:{sig}',
            code='model-serializer-signature',
        )
    else:
        return info_arg


def _serializer_info_arg(mode: Literal['plain', 'wrap'], n_positional: int) -> bool | None:
    if mode == 'plain':
        if n_positional == 1:
            # (input_value: Any, /) -> Any
            return False
        elif n_positional == 2:
            # (model: Any, input_value: Any, /) -> Any
            return True
    else:
        assert mode == 'wrap', f"invalid mode: {mode!r}, expected 'plain' or 'wrap'"
        if n_positional == 2:
            # (input_value: Any, serializer: SerializerFunctionWrapHandler, /) -> Any
            return False
        elif n_positional == 3:
            # (input_value: Any, serializer: SerializerFunctionWrapHandler, info: SerializationInfo, /) -> Any
            return True

    return None


AnyDecoratorCallable: TypeAlias = (
    'Union[classmethod[Any, Any, Any], staticmethod[Any, Any], partialmethod[Any], Callable[..., Any]]'
)


def is_instance_method_from_sig(function: AnyDecoratorCallable) -> bool:
    """Whether the function is an instance method.

    It will consider a function as instance method if the first parameter of
    function is `self`.

    Args:
        function: The function to check.

    Returns:
        `True` if the function is an instance method, `False` otherwise.
    """
    sig = signature(unwrap_wrapped_function(function))
    first = next(iter(sig.parameters.values()), None)
    if first and first.name == 'self':
        return True
    return False


def ensure_classmethod_based_on_signature(function: AnyDecoratorCallable) -> Any:
    """Apply the `@classmethod` decorator on the function.

    Args:
        function: The function to apply the decorator on.
    Returns:
        The `@classmethod` decorator applied function.
    """
    if not isinstance(
        unwrap_wrapped_function(function, unwrap_class_static_method=False), classmethod
    ) and _is_classmethod_from_sig(function):
        return classmethod(function)  # type: ignore[arg-type]
    return function


def _is_classmethod_from_sig(function: AnyDecoratorCallable) -> bool:
    sig = signature(unwrap_wrapped_function(function))
    first = next(iter(sig.parameters.values()), None)
    if first and first.name == 'cls':
        return True
    return False


def unwrap_wrapped_function(
    func: Any,
    *,
    unwrap_partial: bool = True,
    unwrap_class_static_method: bool = True,
) -> Any:
    """Recursively unwraps a wrapped function until the underlying function is reached.
    This handles property, functools.partial, functools.partialmethod, staticmethod, and classmethod.

    Args:
        func: The function to unwrap.
        unwrap_partial: If True (default), unwrap partial and partialmethod decorators.
        unwrap_class_static_method: If True (default), also unwrap classmethod and
            staticmethod decorators. If False, only unwrap partial and partialmethod decorators.

    Returns:
        The underlying function of the wrapped function.
    """
    # Define the types we want to check against as a single tuple.
    unwrap_types = (
        (property, cached_property)
        + ((partial, partialmethod) if unwrap_partial else ())
        + ((staticmethod, classmethod) if unwrap_class_static_method else ())
    )

    while isinstance(func, unwrap_types):
        if unwrap_class_static_method and isinstance(func, (classmethod, staticmethod)):
            func = func.__func__
        elif isinstance(func, (partial, partialmethod)):
            func = func.func
        elif isinstance(func, property):
            func = func.fget  # arbitrary choice, convenient for computed fields
        else:
            # Make coverage happy as it can only get here in the last possible case
            assert isinstance(func, cached_property)
            func = func.func  # type: ignore

    return func


def get_function_return_type(
    func: Any,
    explicit_return_type: Any,
    globalns: GlobalsNamespace | None = None,
    localns: MappingNamespace | None = None,
) -> Any:
    """Get the function return type.

    It gets the return type from the type annotation if `explicit_return_type` is `PydanticUndefined`.
    Otherwise, it returns `explicit_return_type`.

    Args:
        func: The function to get its return type.
        explicit_return_type: The explicit return type.
        globalns: The globals namespace to use during type annotation evaluation.
        localns: The locals namespace to use during type annotation evaluation.

    Returns:
        The function return type.
    """
    if explicit_return_type is PydanticUndefined:
        # try to get it from the type annotation
        hints = get_function_type_hints(
            unwrap_wrapped_function(func),
            include_keys={'return'},
            globalns=globalns,
            localns=localns,
        )
        return hints.get('return', PydanticUndefined)
    else:
        return explicit_return_type


def count_positional_required_params(sig: Signature) -> int:
    """Get the number of positional (required) arguments of a signature.

    This function should only be used to inspect signatures of validation and serialization functions.
    The first argument (the value being serialized or validated) is counted as a required argument
    even if a default value exists.

    Returns:
        The number of positional arguments of a signature.
""" parameters = list(sig.parameters.values()) return sum( 1 for param in parameters if can_be_positional(param) # First argument is the value being validated/serialized, and can have a default value # (e.g. `float`, which has signature `(x=0, /)`). We assume other parameters (the info arg # for instance) should be required, and thus without any default value. and (param.default is Parameter.empty or param is parameters[0]) ) def ensure_property(f: Any) -> Any: """Ensure that a function is a `property` or `cached_property`, or is a valid descriptor. Args: f: The function to check. Returns: The function, or a `property` or `cached_property` instance wrapping the function. """ if ismethoddescriptor(f) or isdatadescriptor(f): return f else: return property(f) pydantic-2.10.6/pydantic/_internal/_decorators_v1.py000066400000000000000000000140661474456633400225140ustar00rootroot00000000000000"""Logic for V1 validators, e.g. `@validator` and `@root_validator`.""" from __future__ import annotations as _annotations from inspect import Parameter, signature from typing import Any, Dict, Tuple, Union, cast from pydantic_core import core_schema from typing_extensions import Protocol from ..errors import PydanticUserError from ._utils import can_be_positional class V1OnlyValueValidator(Protocol): """A simple validator, supported for V1 validators and V2 validators.""" def __call__(self, __value: Any) -> Any: ... class V1ValidatorWithValues(Protocol): """A validator with `values` argument, supported for V1 validators and V2 validators.""" def __call__(self, __value: Any, values: dict[str, Any]) -> Any: ... class V1ValidatorWithValuesKwOnly(Protocol): """A validator with keyword only `values` argument, supported for V1 validators and V2 validators.""" def __call__(self, __value: Any, *, values: dict[str, Any]) -> Any: ... 
class V1ValidatorWithKwargs(Protocol):
    """A validator with `kwargs` argument, supported for V1 validators and V2 validators."""

    def __call__(self, __value: Any, **kwargs: Any) -> Any: ...


class V1ValidatorWithValuesAndKwargs(Protocol):
    """A validator with `values` and `kwargs` arguments, supported for V1 validators and V2 validators."""

    def __call__(self, __value: Any, values: dict[str, Any], **kwargs: Any) -> Any: ...


V1Validator = Union[
    V1ValidatorWithValues, V1ValidatorWithValuesKwOnly, V1ValidatorWithKwargs, V1ValidatorWithValuesAndKwargs
]


def can_be_keyword(param: Parameter) -> bool:
    return param.kind in (Parameter.POSITIONAL_OR_KEYWORD, Parameter.KEYWORD_ONLY)


def make_generic_v1_field_validator(validator: V1Validator) -> core_schema.WithInfoValidatorFunction:
    """Wrap a V1 style field validator for V2 compatibility.

    Args:
        validator: The V1 style field validator.

    Returns:
        A wrapped V2 style field validator.

    Raises:
        PydanticUserError: If the signature is not supported or the parameters are not available in Pydantic V2.
    """
    sig = signature(validator)

    needs_values_kw = False

    for param_num, (param_name, parameter) in enumerate(sig.parameters.items()):
        if can_be_keyword(parameter) and param_name in ('field', 'config'):
            raise PydanticUserError(
                'The `field` and `config` parameters are not available in Pydantic V2, '
                'please use the `info` parameter instead.',
                code='validator-field-config-info',
            )
        if parameter.kind is Parameter.VAR_KEYWORD:
            needs_values_kw = True
        elif can_be_keyword(parameter) and param_name == 'values':
            needs_values_kw = True
        elif can_be_positional(parameter) and param_num == 0:
            # value
            continue
        elif parameter.default is Parameter.empty:  # ignore params with defaults e.g. bound by functools.partial
            raise PydanticUserError(
                f'Unsupported signature for V1 style validator {validator}: {sig} is not supported.',
                code='validator-v1-signature',
            )

    if needs_values_kw:
        # (v, **kwargs), (v, values, **kwargs), (v, *, values, **kwargs) or (v, *, values)
        val1 = cast(V1ValidatorWithValues, validator)

        def wrapper1(value: Any, info: core_schema.ValidationInfo) -> Any:
            return val1(value, values=info.data)

        return wrapper1
    else:
        val2 = cast(V1OnlyValueValidator, validator)

        def wrapper2(value: Any, _: core_schema.ValidationInfo) -> Any:
            return val2(value)

        return wrapper2


RootValidatorValues = Dict[str, Any]
# technically tuple[model_dict, model_extra, fields_set] | tuple[dataclass_dict, init_vars]
RootValidatorFieldsTuple = Tuple[Any, ...]


class V1RootValidatorFunction(Protocol):
    """A simple root validator, supported for V1 validators and V2 validators."""

    def __call__(self, __values: RootValidatorValues) -> RootValidatorValues: ...


class V2CoreBeforeRootValidator(Protocol):
    """V2 validator with mode='before'."""

    def __call__(self, __values: RootValidatorValues, __info: core_schema.ValidationInfo) -> RootValidatorValues: ...


class V2CoreAfterRootValidator(Protocol):
    """V2 validator with mode='after'."""

    def __call__(
        self, __fields_tuple: RootValidatorFieldsTuple, __info: core_schema.ValidationInfo
    ) -> RootValidatorFieldsTuple: ...


def make_v1_generic_root_validator(
    validator: V1RootValidatorFunction, pre: bool
) -> V2CoreBeforeRootValidator | V2CoreAfterRootValidator:
    """Wrap a V1 style root validator for V2 compatibility.

    Args:
        validator: The V1 style field validator.
        pre: Whether the validator is a pre validator.

    Returns:
        A wrapped V2 style validator.
""" if pre is True: # mode='before' for pydantic-core def _wrapper1(values: RootValidatorValues, _: core_schema.ValidationInfo) -> RootValidatorValues: return validator(values) return _wrapper1 # mode='after' for pydantic-core def _wrapper2(fields_tuple: RootValidatorFieldsTuple, _: core_schema.ValidationInfo) -> RootValidatorFieldsTuple: if len(fields_tuple) == 2: # dataclass, this is easy values, init_vars = fields_tuple values = validator(values) return values, init_vars else: # ugly hack: to match v1 behaviour, we merge values and model_extra, then split them up based on fields # afterwards model_dict, model_extra, fields_set = fields_tuple if model_extra: fields = set(model_dict.keys()) model_dict.update(model_extra) model_dict_new = validator(model_dict) for k in list(model_dict_new.keys()): if k not in fields: model_extra[k] = model_dict_new.pop(k) else: model_dict_new = validator(model_dict) return model_dict_new, model_extra, fields_set return _wrapper2 pydantic-2.10.6/pydantic/_internal/_discriminated_union.py000066400000000000000000000635161474456633400237740ustar00rootroot00000000000000from __future__ import annotations as _annotations from typing import TYPE_CHECKING, Any, Hashable, Sequence from pydantic_core import CoreSchema, core_schema from ..errors import PydanticUserError from . 
import _core_utils from ._core_utils import ( CoreSchemaField, collect_definitions, ) if TYPE_CHECKING: from ..types import Discriminator CORE_SCHEMA_METADATA_DISCRIMINATOR_PLACEHOLDER_KEY = 'pydantic.internal.union_discriminator' class MissingDefinitionForUnionRef(Exception): """Raised when applying a discriminated union discriminator to a schema requires a definition that is not yet defined """ def __init__(self, ref: str) -> None: self.ref = ref super().__init__(f'Missing definition for ref {self.ref!r}') def set_discriminator_in_metadata(schema: CoreSchema, discriminator: Any) -> None: schema.setdefault('metadata', {}) metadata = schema.get('metadata') assert metadata is not None metadata[CORE_SCHEMA_METADATA_DISCRIMINATOR_PLACEHOLDER_KEY] = discriminator def apply_discriminators(schema: core_schema.CoreSchema) -> core_schema.CoreSchema: # We recursively walk through the `schema` passed to `apply_discriminators`, applying discriminators # where necessary at each level. During this recursion, we allow references to be resolved from the definitions # that are originally present on the original, outermost `schema`. Before `apply_discriminators` is called, # `simplify_schema_references` is called on the schema (in the `clean_schema` function), # which often puts the definitions in the outermost schema. 
global_definitions: dict[str, CoreSchema] = collect_definitions(schema) def inner(s: core_schema.CoreSchema, recurse: _core_utils.Recurse) -> core_schema.CoreSchema: nonlocal global_definitions s = recurse(s, inner) if s['type'] == 'tagged-union': return s metadata = s.get('metadata', {}) discriminator = metadata.pop(CORE_SCHEMA_METADATA_DISCRIMINATOR_PLACEHOLDER_KEY, None) if discriminator is not None: s = apply_discriminator(s, discriminator, global_definitions) return s return _core_utils.walk_core_schema(schema, inner, copy=False) def apply_discriminator( schema: core_schema.CoreSchema, discriminator: str | Discriminator, definitions: dict[str, core_schema.CoreSchema] | None = None, ) -> core_schema.CoreSchema: """Applies the discriminator and returns a new core schema. Args: schema: The input schema. discriminator: The name of the field which will serve as the discriminator. definitions: A mapping of schema ref to schema. Returns: The new core schema. Raises: TypeError: - If `discriminator` is used with invalid union variant. - If `discriminator` is used with `Union` type with one variant. - If `discriminator` value mapped to multiple choices. MissingDefinitionForUnionRef: If the definition for ref is missing. PydanticUserError: - If a model in union doesn't have a discriminator field. - If discriminator field has a non-string alias. - If discriminator fields have different aliases. - If discriminator field not of type `Literal`. """ from ..types import Discriminator if isinstance(discriminator, Discriminator): if isinstance(discriminator.discriminator, str): discriminator = discriminator.discriminator else: return discriminator._convert_schema(schema) return _ApplyInferredDiscriminator(discriminator, definitions or {}).apply(schema) class _ApplyInferredDiscriminator: """This class is used to convert an input schema containing a union schema into one where that union is replaced with a tagged-union, with all the associated debugging and performance benefits. 
This is done by: * Validating that the input schema is compatible with the provided discriminator * Introspecting the schema to determine which discriminator values should map to which union choices * Handling various edge cases such as 'definitions', 'default', 'nullable' schemas, and more I have chosen to implement the conversion algorithm in this class, rather than a function, to make it easier to maintain state while recursively walking the provided CoreSchema. """ def __init__(self, discriminator: str, definitions: dict[str, core_schema.CoreSchema]): # `discriminator` should be the name of the field which will serve as the discriminator. # It must be the python name of the field, and *not* the field's alias. Note that as of now, # all members of a discriminated union _must_ use a field with the same name as the discriminator. # This may change if/when we expose a way to manually specify the TaggedUnionSchema's choices. self.discriminator = discriminator # `definitions` should contain a mapping of schema ref to schema for all schemas which might # be referenced by some choice self.definitions = definitions # `_discriminator_alias` will hold the value, if present, of the alias for the discriminator # # Note: following the v1 implementation, we currently disallow the use of different aliases # for different choices. This is not a limitation of pydantic_core, but if we try to handle # this, the inference logic gets complicated very quickly, and could result in confusing # debugging challenges for users making subtle mistakes. # # Rather than trying to do the most powerful inference possible, I think we should eventually # expose a way to more-manually control the way the TaggedUnionSchema is constructed through # the use of a new type which would be placed as an Annotation on the Union type. 
This would # provide the full flexibility/power of pydantic_core's TaggedUnionSchema where necessary for # more complex cases, without over-complicating the inference logic for the common cases. self._discriminator_alias: str | None = None # `_should_be_nullable` indicates whether the converted union has `None` as an allowed value. # If `None` is an acceptable value of the (possibly-wrapped) union, we ignore it while # constructing the TaggedUnionSchema, but set the `_should_be_nullable` attribute to True. # Once we have constructed the TaggedUnionSchema, if `_should_be_nullable` is True, we ensure # that the final schema gets wrapped as a NullableSchema. This has the same semantics on the # python side, but resolves the issue that `None` cannot correspond to any discriminator values. self._should_be_nullable = False # `_is_nullable` is used to track if the final produced schema will definitely be nullable; # we set it to True if the input schema is wrapped in a nullable schema that we know will be preserved # as an indication that, even if None is discovered as one of the union choices, we will not need to wrap # the final value in another nullable schema. # # This is more complicated than just checking for the final outermost schema having type 'nullable' thanks # to the possible presence of other wrapper schemas such as DefinitionsSchema, WithDefaultSchema, etc. self._is_nullable = False # `_choices_to_handle` serves as a stack of choices to add to the tagged union. Initially, choices # from the union in the wrapped schema will be appended to this list, and the recursive choice-handling # algorithm may add more choices to this stack as (nested) unions are encountered. 
self._choices_to_handle: list[core_schema.CoreSchema] = [] # `_tagged_union_choices` is built during the call to `apply`, and will hold the choices to be included # in the output TaggedUnionSchema that will replace the union from the input schema self._tagged_union_choices: dict[Hashable, core_schema.CoreSchema] = {} # `_used` is changed to True after applying the discriminator to prevent accidental reuse self._used = False def apply(self, schema: core_schema.CoreSchema) -> core_schema.CoreSchema: """Return a new CoreSchema based on `schema` that uses a tagged-union with the discriminator provided to this class. Args: schema: The input schema. Returns: The new core schema. Raises: TypeError: - If `discriminator` is used with invalid union variant. - If `discriminator` is used with `Union` type with one variant. - If `discriminator` value mapped to multiple choices. ValueError: If the definition for ref is missing. PydanticUserError: - If a model in union doesn't have a discriminator field. - If discriminator field has a non-string alias. - If discriminator fields have different aliases. - If discriminator field not of type `Literal`. """ assert not self._used schema = self._apply_to_root(schema) if self._should_be_nullable and not self._is_nullable: schema = core_schema.nullable_schema(schema) self._used = True return schema def _apply_to_root(self, schema: core_schema.CoreSchema) -> core_schema.CoreSchema: """This method handles the outer-most stage of recursion over the input schema: unwrapping nullable or definitions schemas, and calling the `_handle_choice` method iteratively on the choices extracted (recursively) from the possibly-wrapped union. 
""" if schema['type'] == 'nullable': self._is_nullable = True wrapped = self._apply_to_root(schema['schema']) nullable_wrapper = schema.copy() nullable_wrapper['schema'] = wrapped return nullable_wrapper if schema['type'] == 'definitions': wrapped = self._apply_to_root(schema['schema']) definitions_wrapper = schema.copy() definitions_wrapper['schema'] = wrapped return definitions_wrapper if schema['type'] != 'union': # If the schema is not a union, it probably means it just had a single member and # was flattened by pydantic_core. # However, it still may make sense to apply the discriminator to this schema, # as a way to get discriminated-union-style error messages, so we allow this here. schema = core_schema.union_schema([schema]) # Reverse the choices list before extending the stack so that they get handled in the order they occur choices_schemas = [v[0] if isinstance(v, tuple) else v for v in schema['choices'][::-1]] self._choices_to_handle.extend(choices_schemas) while self._choices_to_handle: choice = self._choices_to_handle.pop() self._handle_choice(choice) if self._discriminator_alias is not None and self._discriminator_alias != self.discriminator: # * We need to annotate `discriminator` as a union here to handle both branches of this conditional # * We need to annotate `discriminator` as list[list[str | int]] and not list[list[str]] due to the # invariance of list, and because list[list[str | int]] is the type of the discriminator argument # to tagged_union_schema below # * See the docstring of pydantic_core.core_schema.tagged_union_schema for more details about how to # interpret the value of the discriminator argument to tagged_union_schema. (The list[list[str]] here # is the appropriate way to provide a list of fallback attributes to check for a discriminator value.) 
discriminator: str | list[list[str | int]] = [[self.discriminator], [self._discriminator_alias]] else: discriminator = self.discriminator return core_schema.tagged_union_schema( choices=self._tagged_union_choices, discriminator=discriminator, custom_error_type=schema.get('custom_error_type'), custom_error_message=schema.get('custom_error_message'), custom_error_context=schema.get('custom_error_context'), strict=False, from_attributes=True, ref=schema.get('ref'), metadata=schema.get('metadata'), serialization=schema.get('serialization'), ) def _handle_choice(self, choice: core_schema.CoreSchema) -> None: """This method handles the "middle" stage of recursion over the input schema. Specifically, it is responsible for handling each choice of the outermost union (and any "coalesced" choices obtained from inner unions). Here, "handling" entails: * Coalescing nested unions and compatible tagged-unions * Tracking the presence of 'none' and 'nullable' schemas occurring as choices * Validating that each allowed discriminator value maps to a unique choice * Updating the _tagged_union_choices mapping that will ultimately be used to build the TaggedUnionSchema. 
""" if choice['type'] == 'definition-ref': if choice['schema_ref'] not in self.definitions: raise MissingDefinitionForUnionRef(choice['schema_ref']) if choice['type'] == 'none': self._should_be_nullable = True elif choice['type'] == 'definitions': self._handle_choice(choice['schema']) elif choice['type'] == 'nullable': self._should_be_nullable = True self._handle_choice(choice['schema']) # unwrap the nullable schema elif choice['type'] == 'union': # Reverse the choices list before extending the stack so that they get handled in the order they occur choices_schemas = [v[0] if isinstance(v, tuple) else v for v in choice['choices'][::-1]] self._choices_to_handle.extend(choices_schemas) elif choice['type'] not in { 'model', 'typed-dict', 'tagged-union', 'lax-or-strict', 'dataclass', 'dataclass-args', 'definition-ref', } and not _core_utils.is_function_with_inner_schema(choice): # We should eventually handle 'definition-ref' as well raise TypeError( f'{choice["type"]!r} is not a valid discriminated union variant;' ' should be a `BaseModel` or `dataclass`' ) else: if choice['type'] == 'tagged-union' and self._is_discriminator_shared(choice): # In this case, this inner tagged-union is compatible with the outer tagged-union, # and its choices can be coalesced into the outer TaggedUnionSchema. subchoices = [x for x in choice['choices'].values() if not isinstance(x, (str, int))] # Reverse the choices list before extending the stack so that they get handled in the order they occur self._choices_to_handle.extend(subchoices[::-1]) return inferred_discriminator_values = self._infer_discriminator_values_for_choice(choice, source_name=None) self._set_unique_choice_for_values(choice, inferred_discriminator_values) def _is_discriminator_shared(self, choice: core_schema.TaggedUnionSchema) -> bool: """This method returns a boolean indicating whether the discriminator for the `choice` is the same as that being used for the outermost tagged union. 
This is used to determine whether this TaggedUnionSchema choice should be "coalesced" into the top level, or whether it should be treated as a separate (nested) choice. """ inner_discriminator = choice['discriminator'] return inner_discriminator == self.discriminator or ( isinstance(inner_discriminator, list) and (self.discriminator in inner_discriminator or [self.discriminator] in inner_discriminator) ) def _infer_discriminator_values_for_choice( # noqa C901 self, choice: core_schema.CoreSchema, source_name: str | None ) -> list[str | int]: """This function recurses over `choice`, extracting all discriminator values that should map to this choice. `model_name` is accepted for the purpose of producing useful error messages. """ if choice['type'] == 'definitions': return self._infer_discriminator_values_for_choice(choice['schema'], source_name=source_name) elif choice['type'] == 'function-plain': raise TypeError( f'{choice["type"]!r} is not a valid discriminated union variant;' ' should be a `BaseModel` or `dataclass`' ) elif _core_utils.is_function_with_inner_schema(choice): return self._infer_discriminator_values_for_choice(choice['schema'], source_name=source_name) elif choice['type'] == 'lax-or-strict': return sorted( set( self._infer_discriminator_values_for_choice(choice['lax_schema'], source_name=None) + self._infer_discriminator_values_for_choice(choice['strict_schema'], source_name=None) ) ) elif choice['type'] == 'tagged-union': values: list[str | int] = [] # Ignore str/int "choices" since these are just references to other choices subchoices = [x for x in choice['choices'].values() if not isinstance(x, (str, int))] for subchoice in subchoices: subchoice_values = self._infer_discriminator_values_for_choice(subchoice, source_name=None) values.extend(subchoice_values) return values elif choice['type'] == 'union': values = [] for subchoice in choice['choices']: subchoice_schema = subchoice[0] if isinstance(subchoice, tuple) else subchoice subchoice_values = 
self._infer_discriminator_values_for_choice(subchoice_schema, source_name=None) values.extend(subchoice_values) return values elif choice['type'] == 'nullable': self._should_be_nullable = True return self._infer_discriminator_values_for_choice(choice['schema'], source_name=None) elif choice['type'] == 'model': return self._infer_discriminator_values_for_choice(choice['schema'], source_name=choice['cls'].__name__) elif choice['type'] == 'dataclass': return self._infer_discriminator_values_for_choice(choice['schema'], source_name=choice['cls'].__name__) elif choice['type'] == 'model-fields': return self._infer_discriminator_values_for_model_choice(choice, source_name=source_name) elif choice['type'] == 'dataclass-args': return self._infer_discriminator_values_for_dataclass_choice(choice, source_name=source_name) elif choice['type'] == 'typed-dict': return self._infer_discriminator_values_for_typed_dict_choice(choice, source_name=source_name) elif choice['type'] == 'definition-ref': schema_ref = choice['schema_ref'] if schema_ref not in self.definitions: raise MissingDefinitionForUnionRef(schema_ref) return self._infer_discriminator_values_for_choice(self.definitions[schema_ref], source_name=source_name) else: raise TypeError( f'{choice["type"]!r} is not a valid discriminated union variant;' ' should be a `BaseModel` or `dataclass`' ) def _infer_discriminator_values_for_typed_dict_choice( self, choice: core_schema.TypedDictSchema, source_name: str | None = None ) -> list[str | int]: """This method just extracts the _infer_discriminator_values_for_choice logic specific to TypedDictSchema for the sake of readability. 
""" source = 'TypedDict' if source_name is None else f'TypedDict {source_name!r}' field = choice['fields'].get(self.discriminator) if field is None: raise PydanticUserError( f'{source} needs a discriminator field for key {self.discriminator!r}', code='discriminator-no-field' ) return self._infer_discriminator_values_for_field(field, source) def _infer_discriminator_values_for_model_choice( self, choice: core_schema.ModelFieldsSchema, source_name: str | None = None ) -> list[str | int]: source = 'ModelFields' if source_name is None else f'Model {source_name!r}' field = choice['fields'].get(self.discriminator) if field is None: raise PydanticUserError( f'{source} needs a discriminator field for key {self.discriminator!r}', code='discriminator-no-field' ) return self._infer_discriminator_values_for_field(field, source) def _infer_discriminator_values_for_dataclass_choice( self, choice: core_schema.DataclassArgsSchema, source_name: str | None = None ) -> list[str | int]: source = 'DataclassArgs' if source_name is None else f'Dataclass {source_name!r}' for field in choice['fields']: if field['name'] == self.discriminator: break else: raise PydanticUserError( f'{source} needs a discriminator field for key {self.discriminator!r}', code='discriminator-no-field' ) return self._infer_discriminator_values_for_field(field, source) def _infer_discriminator_values_for_field(self, field: CoreSchemaField, source: str) -> list[str | int]: if field['type'] == 'computed-field': # This should never occur as a discriminator, as it is only relevant to serialization return [] alias = field.get('validation_alias', self.discriminator) if not isinstance(alias, str): raise PydanticUserError( f'Alias {alias!r} is not supported in a discriminated union', code='discriminator-alias-type' ) if self._discriminator_alias is None: self._discriminator_alias = alias elif self._discriminator_alias != alias: raise PydanticUserError( f'Aliases for discriminator {self.discriminator!r} must be the same ' 
f'(got {alias}, {self._discriminator_alias})', code='discriminator-alias', ) return self._infer_discriminator_values_for_inner_schema(field['schema'], source) def _infer_discriminator_values_for_inner_schema( self, schema: core_schema.CoreSchema, source: str ) -> list[str | int]: """When inferring discriminator values for a field, we typically extract the expected values from a literal schema. This function does that, but also handles nested unions and defaults. """ if schema['type'] == 'literal': return schema['expected'] elif schema['type'] == 'union': # Generally when multiple values are allowed they should be placed in a single `Literal`, but # we add this case to handle the situation where a field is annotated as a `Union` of `Literal`s. # For example, this lets us handle `Union[Literal['key'], Union[Literal['Key'], Literal['KEY']]]` values: list[Any] = [] for choice in schema['choices']: choice_schema = choice[0] if isinstance(choice, tuple) else choice choice_values = self._infer_discriminator_values_for_inner_schema(choice_schema, source) values.extend(choice_values) return values elif schema['type'] == 'default': # This will happen if the field has a default value; we ignore it while extracting the discriminator values return self._infer_discriminator_values_for_inner_schema(schema['schema'], source) elif schema['type'] == 'function-after': # After validators don't affect the discriminator values return self._infer_discriminator_values_for_inner_schema(schema['schema'], source) elif schema['type'] in {'function-before', 'function-wrap', 'function-plain'}: validator_type = repr(schema['type'].split('-')[1]) raise PydanticUserError( f'Cannot use a mode={validator_type} validator in the' f' discriminator field {self.discriminator!r} of {source}', code='discriminator-validator', ) else: raise PydanticUserError( f'{source} needs field {self.discriminator!r} to be of type `Literal`', code='discriminator-needs-literal', ) def _set_unique_choice_for_values(self, 
        choice: core_schema.CoreSchema, values: Sequence[str | int]) -> None:
        """This method updates `self.tagged_union_choices` so that all provided (discriminator) `values` map to
        the provided `choice`, validating that none of these values already map to another (different) choice.
        """
        for discriminator_value in values:
            if discriminator_value in self._tagged_union_choices:
                # It is okay if `value` is already in tagged_union_choices as long as it maps to the same value.
                # Because tagged_union_choices may map values to other values, we need to walk the choices dict
                # until we get to a "real" choice, and confirm that is equal to the one assigned.
                existing_choice = self._tagged_union_choices[discriminator_value]
                if existing_choice != choice:
                    raise TypeError(
                        f'Value {discriminator_value!r} for discriminator '
                        f'{self.discriminator!r} mapped to multiple choices'
                    )
            else:
                self._tagged_union_choices[discriminator_value] = choice


# ---- pydantic-2.10.6/pydantic/_internal/_docs_extraction.py ----

"""Utilities related to attribute docstring extraction."""

from __future__ import annotations

import ast
import inspect
import textwrap
from typing import Any


class DocstringVisitor(ast.NodeVisitor):
    def __init__(self) -> None:
        super().__init__()

        self.target: str | None = None
        self.attrs: dict[str, str] = {}
        self.previous_node_type: type[ast.AST] | None = None

    def visit(self, node: ast.AST) -> Any:
        node_result = super().visit(node)
        self.previous_node_type = type(node)
        return node_result

    def visit_AnnAssign(self, node: ast.AnnAssign) -> Any:
        if isinstance(node.target, ast.Name):
            self.target = node.target.id

    def visit_Expr(self, node: ast.Expr) -> Any:
        if (
            isinstance(node.value, ast.Constant)
            and isinstance(node.value.value, str)
            and self.previous_node_type is ast.AnnAssign
        ):
            docstring = inspect.cleandoc(node.value.value)
            if self.target:
                self.attrs[self.target] = docstring
            self.target = None


def _dedent_source_lines(source:
list[str]) -> str: # Required for nested class definitions, e.g. in a function block dedent_source = textwrap.dedent(''.join(source)) if dedent_source.startswith((' ', '\t')): # We are in the case where there's a dedented (usually multiline) string # at a lower indentation level than the class itself. We wrap our class # in a function as a workaround. dedent_source = f'def dedent_workaround():\n{dedent_source}' return dedent_source def _extract_source_from_frame(cls: type[Any]) -> list[str] | None: frame = inspect.currentframe() while frame: if inspect.getmodule(frame) is inspect.getmodule(cls): lnum = frame.f_lineno try: lines, _ = inspect.findsource(frame) except OSError: # Source can't be retrieved (maybe because running in an interactive terminal), # we don't want to error here. pass else: block_lines = inspect.getblock(lines[lnum - 1 :]) dedent_source = _dedent_source_lines(block_lines) try: block_tree = ast.parse(dedent_source) except SyntaxError: pass else: stmt = block_tree.body[0] if isinstance(stmt, ast.FunctionDef) and stmt.name == 'dedent_workaround': # `_dedent_source_lines` wrapped the class around the workaround function stmt = stmt.body[0] if isinstance(stmt, ast.ClassDef) and stmt.name == cls.__name__: return block_lines frame = frame.f_back def extract_docstrings_from_cls(cls: type[Any], use_inspect: bool = False) -> dict[str, str]: """Map model attributes and their corresponding docstring. Args: cls: The class of the Pydantic model to inspect. use_inspect: Whether to skip usage of frames to find the object and use the `inspect` module instead. Returns: A mapping containing attribute names and their corresponding docstring. """ if use_inspect: # Might not work as expected if two classes have the same name in the same source file. 
        try:
            source, _ = inspect.getsourcelines(cls)
        except OSError:
            return {}
    else:
        source = _extract_source_from_frame(cls)

    if not source:
        return {}

    dedent_source = _dedent_source_lines(source)

    visitor = DocstringVisitor()
    visitor.visit(ast.parse(dedent_source))
    return visitor.attrs

# pydantic-2.10.6/pydantic/_internal/_fields.py
"""Private logic related to fields (the `Field()` function and `FieldInfo` class), and arguments to `Annotated`."""

from __future__ import annotations as _annotations

import dataclasses
import warnings
from copy import copy
from functools import lru_cache
from inspect import Parameter, ismethoddescriptor, signature
from typing import TYPE_CHECKING, Any, Callable, Pattern

from pydantic_core import PydanticUndefined
from typing_extensions import TypeIs

from pydantic.errors import PydanticUserError

from . import _typing_extra
from ._config import ConfigWrapper
from ._docs_extraction import extract_docstrings_from_cls
from ._import_utils import import_cached_base_model, import_cached_field_info
from ._namespace_utils import NsResolver
from ._repr import Representation
from ._utils import can_be_positional

if TYPE_CHECKING:
    from annotated_types import BaseMetadata

    from ..fields import FieldInfo
    from ..main import BaseModel
    from ._dataclasses import StandardDataclass
    from ._decorators import DecoratorInfos


class PydanticMetadata(Representation):
    """Base class for annotation markers like `Strict`."""

    __slots__ = ()


def pydantic_general_metadata(**metadata: Any) -> BaseMetadata:
    """Create a new `_PydanticGeneralMetadata` class with the given metadata.

    Args:
        **metadata: The metadata to add.

    Returns:
        The new `_PydanticGeneralMetadata` class.
""" return _general_metadata_cls()(metadata) # type: ignore @lru_cache(maxsize=None) def _general_metadata_cls() -> type[BaseMetadata]: """Do it this way to avoid importing `annotated_types` at import time.""" from annotated_types import BaseMetadata class _PydanticGeneralMetadata(PydanticMetadata, BaseMetadata): """Pydantic general metadata like `max_digits`.""" def __init__(self, metadata: Any): self.__dict__ = metadata return _PydanticGeneralMetadata # type: ignore def _update_fields_from_docstrings(cls: type[Any], fields: dict[str, FieldInfo], config_wrapper: ConfigWrapper) -> None: if config_wrapper.use_attribute_docstrings: fields_docs = extract_docstrings_from_cls(cls) for ann_name, field_info in fields.items(): if field_info.description is None and ann_name in fields_docs: field_info.description = fields_docs[ann_name] def collect_model_fields( # noqa: C901 cls: type[BaseModel], bases: tuple[type[Any], ...], config_wrapper: ConfigWrapper, ns_resolver: NsResolver | None, *, typevars_map: dict[Any, Any] | None = None, ) -> tuple[dict[str, FieldInfo], set[str]]: """Collect the fields of a nascent pydantic model. Also collect the names of any ClassVars present in the type hints. The returned value is a tuple of two items: the fields dict, and the set of ClassVar names. Args: cls: BaseModel or dataclass. bases: Parents of the class, generally `cls.__bases__`. config_wrapper: The config wrapper instance. ns_resolver: Namespace resolver to use when getting model annotations. typevars_map: A dictionary mapping type variables to their concrete types. Returns: A tuple contains fields and class variables. Raises: NameError: - If there is a conflict between a field name and protected namespaces. - If there is a field other than `root` in `RootModel`. - If a field shadows an attribute in the parent model. 
""" BaseModel = import_cached_base_model() FieldInfo_ = import_cached_field_info() parent_fields_lookup: dict[str, FieldInfo] = {} for base in reversed(bases): if model_fields := getattr(base, '__pydantic_fields__', None): parent_fields_lookup.update(model_fields) type_hints = _typing_extra.get_model_type_hints(cls, ns_resolver=ns_resolver) # https://docs.python.org/3/howto/annotations.html#accessing-the-annotations-dict-of-an-object-in-python-3-9-and-older # annotations is only used for finding fields in parent classes annotations = cls.__dict__.get('__annotations__', {}) fields: dict[str, FieldInfo] = {} class_vars: set[str] = set() for ann_name, (ann_type, evaluated) in type_hints.items(): if ann_name == 'model_config': # We never want to treat `model_config` as a field # Note: we may need to change this logic if/when we introduce a `BareModel` class with no # protected namespaces (where `model_config` might be allowed as a field name) continue for protected_namespace in config_wrapper.protected_namespaces: ns_violation: bool = False if isinstance(protected_namespace, Pattern): ns_violation = protected_namespace.match(ann_name) is not None elif isinstance(protected_namespace, str): ns_violation = ann_name.startswith(protected_namespace) if ns_violation: for b in bases: if hasattr(b, ann_name): if not (issubclass(b, BaseModel) and ann_name in getattr(b, '__pydantic_fields__', {})): raise NameError( f'Field "{ann_name}" conflicts with member {getattr(b, ann_name)}' f' of protected namespace "{protected_namespace}".' ) else: valid_namespaces = () for pn in config_wrapper.protected_namespaces: if isinstance(pn, Pattern): if not pn.match(ann_name): valid_namespaces += (f're.compile({pn.pattern})',) else: if not ann_name.startswith(pn): valid_namespaces += (pn,) warnings.warn( f'Field "{ann_name}" in {cls.__name__} has conflict with protected namespace "{protected_namespace}".' 
'\n\nYou may be able to resolve this warning by setting' f" `model_config['protected_namespaces'] = {valid_namespaces}`.", UserWarning, ) if _typing_extra.is_classvar_annotation(ann_type): class_vars.add(ann_name) continue if _is_finalvar_with_default_val(ann_type, getattr(cls, ann_name, PydanticUndefined)): class_vars.add(ann_name) continue if not is_valid_field_name(ann_name): continue if cls.__pydantic_root_model__ and ann_name != 'root': raise NameError( f"Unexpected field with name {ann_name!r}; only 'root' is allowed as a field of a `RootModel`" ) # when building a generic model with `MyModel[int]`, the generic_origin check makes sure we don't get # "... shadows an attribute" warnings generic_origin = getattr(cls, '__pydantic_generic_metadata__', {}).get('origin') for base in bases: dataclass_fields = { field.name for field in (dataclasses.fields(base) if dataclasses.is_dataclass(base) else ()) } if hasattr(base, ann_name): if base is generic_origin: # Don't warn when "shadowing" of attributes in parametrized generics continue if ann_name in dataclass_fields: # Don't warn when inheriting stdlib dataclasses whose fields are "shadowed" by defaults being set # on the class instance. 
continue if ann_name not in annotations: # Don't warn when a field exists in a parent class but has not been defined in the current class continue warnings.warn( f'Field name "{ann_name}" in "{cls.__qualname__}" shadows an attribute in parent ' f'"{base.__qualname__}"', UserWarning, ) try: default = getattr(cls, ann_name, PydanticUndefined) if default is PydanticUndefined: raise AttributeError except AttributeError: if ann_name in annotations: field_info = FieldInfo_.from_annotation(ann_type) field_info.evaluated = evaluated else: # if field has no default value and is not in __annotations__ this means that it is # defined in a base class and we can take it from there if ann_name in parent_fields_lookup: # The field was present on one of the (possibly multiple) base classes # copy the field to make sure typevar substitutions don't cause issues with the base classes field_info = copy(parent_fields_lookup[ann_name]) else: # The field was not found on any base classes; this seems to be caused by fields not getting # generated thanks to models not being fully defined while initializing recursive models. # Nothing stops us from just creating a new FieldInfo for this type hint, so we do this. field_info = FieldInfo_.from_annotation(ann_type) field_info.evaluated = evaluated else: _warn_on_nested_alias_in_annotation(ann_type, ann_name) if isinstance(default, FieldInfo_) and ismethoddescriptor(default.default): # the `getattr` call above triggers a call to `__get__` for descriptors, so we do # the same if the `= field(default=...)` form is used. Note that we only do this # for method descriptors for now, we might want to extend this to any descriptor # in the future (by simply checking for `hasattr(default.default, '__get__')`). default.default = default.default.__get__(None, cls) field_info = FieldInfo_.from_annotated_attribute(ann_type, default) field_info.evaluated = evaluated # attributes which are fields are removed from the class namespace: # 1. 
To match the behaviour of annotation-only fields # 2. To avoid false positives in the NameError check above try: delattr(cls, ann_name) except AttributeError: pass # indicates the attribute was on a parent class # Use cls.__dict__['__pydantic_decorators__'] instead of cls.__pydantic_decorators__ # to make sure the decorators have already been built for this exact class decorators: DecoratorInfos = cls.__dict__['__pydantic_decorators__'] if ann_name in decorators.computed_fields: raise ValueError("you can't override a field with a computed field") fields[ann_name] = field_info if typevars_map: for field in fields.values(): field.apply_typevars_map(typevars_map) _update_fields_from_docstrings(cls, fields, config_wrapper) return fields, class_vars def _warn_on_nested_alias_in_annotation(ann_type: type[Any], ann_name: str) -> None: FieldInfo = import_cached_field_info() args = getattr(ann_type, '__args__', None) if args: for anno_arg in args: if _typing_extra.is_annotated(anno_arg): for anno_type_arg in _typing_extra.get_args(anno_arg): if isinstance(anno_type_arg, FieldInfo) and anno_type_arg.alias is not None: warnings.warn( f'`alias` specification on field "{ann_name}" must be set on outermost annotation to take effect.', UserWarning, ) return def _is_finalvar_with_default_val(type_: type[Any], val: Any) -> bool: FieldInfo = import_cached_field_info() if not _typing_extra.is_finalvar(type_): return False elif val is PydanticUndefined: return False elif isinstance(val, FieldInfo) and (val.default is PydanticUndefined and val.default_factory is None): return False else: return True def collect_dataclass_fields( cls: type[StandardDataclass], *, ns_resolver: NsResolver | None = None, typevars_map: dict[Any, Any] | None = None, config_wrapper: ConfigWrapper | None = None, ) -> dict[str, FieldInfo]: """Collect the fields of a dataclass. Args: cls: dataclass. ns_resolver: Namespace resolver to use when getting dataclass annotations. Defaults to an empty instance. 
        typevars_map: A dictionary mapping type variables to their concrete types.
        config_wrapper: The config wrapper instance.

    Returns:
        The dataclass fields.
    """
    FieldInfo_ = import_cached_field_info()

    fields: dict[str, FieldInfo] = {}

    ns_resolver = ns_resolver or NsResolver()
    dataclass_fields = cls.__dataclass_fields__

    # The logic here is similar to `_typing_extra.get_cls_type_hints`,
    # although we do it manually as stdlib dataclasses already have annotations
    # collected in each class:
    for base in reversed(cls.__mro__):
        if not dataclasses.is_dataclass(base):
            continue

        with ns_resolver.push(base):
            for ann_name, dataclass_field in dataclass_fields.items():
                if ann_name not in base.__dict__.get('__annotations__', {}):
                    # `__dataclass_fields__` contains every field, even the ones from base classes.
                    # Only collect the ones defined on `base`.
                    continue

                globalns, localns = ns_resolver.types_namespace
                ann_type, _ = _typing_extra.try_eval_type(dataclass_field.type, globalns, localns)

                if _typing_extra.is_classvar_annotation(ann_type):
                    continue

                if (
                    not dataclass_field.init
                    and dataclass_field.default is dataclasses.MISSING
                    and dataclass_field.default_factory is dataclasses.MISSING
                ):
                    # TODO: We should probably do something with this so that validate_assignment behaves properly
                    # Issue: https://github.com/pydantic/pydantic/issues/5470
                    continue

                if isinstance(dataclass_field.default, FieldInfo_):
                    if dataclass_field.default.init_var:
                        if dataclass_field.default.init is False:
                            raise PydanticUserError(
                                f'Dataclass field {ann_name} has init=False and init_var=True, but these are mutually exclusive.',
                                code='clashing-init-and-init-var',
                            )

                        # TODO: same note as above re validate_assignment
                        continue

                    field_info = FieldInfo_.from_annotated_attribute(ann_type, dataclass_field.default)
                else:
                    field_info = FieldInfo_.from_annotated_attribute(ann_type, dataclass_field)

                fields[ann_name] = field_info

                if field_info.default is not PydanticUndefined and isinstance(
                    getattr(cls, ann_name, field_info),
FieldInfo_
                ):
                    # We need this to fix the default when the "default" from __dataclass_fields__ is a pydantic.FieldInfo
                    setattr(cls, ann_name, field_info.default)

    if typevars_map:
        for field in fields.values():
            # We don't pass any ns, as `field.annotation`
            # was already evaluated. TODO: is this method relevant?
            # Can't we just use `_generics.replace_types`?
            field.apply_typevars_map(typevars_map)

    if config_wrapper is not None:
        _update_fields_from_docstrings(cls, fields, config_wrapper)

    return fields


def is_valid_field_name(name: str) -> bool:
    return not name.startswith('_')


def is_valid_privateattr_name(name: str) -> bool:
    return name.startswith('_') and not name.startswith('__')


def takes_validated_data_argument(
    default_factory: Callable[[], Any] | Callable[[dict[str, Any]], Any],
) -> TypeIs[Callable[[dict[str, Any]], Any]]:
    """Whether the provided default factory callable has a validated data parameter."""
    try:
        sig = signature(default_factory)
    except (ValueError, TypeError):
        # `inspect.signature` might not be able to infer a signature, e.g. with C objects.
        # In this case, we assume no data argument is present:
        return False

    parameters = list(sig.parameters.values())

    return len(parameters) == 1 and can_be_positional(parameters[0]) and parameters[0].default is Parameter.empty

# pydantic-2.10.6/pydantic/_internal/_forward_ref.py
from __future__ import annotations as _annotations

from dataclasses import dataclass
from typing import Union


@dataclass
class PydanticRecursiveRef:
    type_ref: str

    __name__ = 'PydanticRecursiveRef'
    __hash__ = object.__hash__

    def __call__(self) -> None:
        """Defining __call__ is necessary for the `typing` module to let you use an instance of
        this class as the result of resolving a standard ForwardRef.
""" def __or__(self, other): return Union[self, other] # type: ignore def __ror__(self, other): return Union[other, self] # type: ignore pydantic-2.10.6/pydantic/_internal/_generate_schema.py000066400000000000000000003367061474456633400230630ustar00rootroot00000000000000"""Convert python types to pydantic-core schema.""" from __future__ import annotations as _annotations import collections.abc import dataclasses import datetime import inspect import os import pathlib import re import sys import typing import warnings from contextlib import contextmanager from copy import copy, deepcopy from decimal import Decimal from enum import Enum from fractions import Fraction from functools import partial from inspect import Parameter, _ParameterKind, signature from ipaddress import IPv4Address, IPv4Interface, IPv4Network, IPv6Address, IPv6Interface, IPv6Network from itertools import chain from operator import attrgetter from types import FunctionType, LambdaType, MethodType from typing import ( TYPE_CHECKING, Any, Callable, Dict, Final, ForwardRef, Iterable, Iterator, Mapping, Type, TypeVar, Union, cast, overload, ) from uuid import UUID from warnings import warn import typing_extensions from pydantic_core import ( CoreSchema, MultiHostUrl, PydanticCustomError, PydanticSerializationUnexpectedValue, PydanticUndefined, Url, core_schema, to_jsonable_python, ) from typing_extensions import Literal, TypeAliasType, TypedDict, get_args, get_origin, is_typeddict from ..aliases import AliasChoices, AliasGenerator, AliasPath from ..annotated_handlers import GetCoreSchemaHandler, GetJsonSchemaHandler from ..config import ConfigDict, JsonDict, JsonEncoder, JsonSchemaExtraCallable from ..errors import PydanticSchemaGenerationError, PydanticUndefinedAnnotation, PydanticUserError from ..functional_validators import AfterValidator, BeforeValidator, FieldValidatorModes, PlainValidator, WrapValidator from ..json_schema import JsonSchemaValue from ..version import version_short from ..warnings 
import PydanticDeprecatedSince20
from . import _core_utils, _decorators, _discriminated_union, _known_annotated_metadata, _typing_extra
from ._config import ConfigWrapper, ConfigWrapperStack
from ._core_metadata import update_core_metadata
from ._core_utils import (
    collect_invalid_schemas,
    define_expected_missing_refs,
    get_ref,
    get_type_ref,
    is_function_with_inner_schema,
    is_list_like_schema_with_items_schema,
    simplify_schema_references,
    validate_core_schema,
)
from ._decorators import (
    Decorator,
    DecoratorInfos,
    FieldSerializerDecoratorInfo,
    FieldValidatorDecoratorInfo,
    ModelSerializerDecoratorInfo,
    ModelValidatorDecoratorInfo,
    RootValidatorDecoratorInfo,
    ValidatorDecoratorInfo,
    get_attribute_from_bases,
    inspect_field_serializer,
    inspect_model_serializer,
    inspect_validator,
)
from ._docs_extraction import extract_docstrings_from_cls
from ._fields import collect_dataclass_fields, takes_validated_data_argument
from ._forward_ref import PydanticRecursiveRef
from ._generics import get_standard_typevars_map, has_instance_in_type, recursively_defined_type_refs, replace_types
from ._import_utils import import_cached_base_model, import_cached_field_info
from ._mock_val_ser import MockCoreSchema
from ._namespace_utils import NamespacesTuple, NsResolver
from ._schema_generation_shared import CallbackGetCoreSchemaHandler
from ._utils import lenient_issubclass, smart_deepcopy

if TYPE_CHECKING:
    from ..fields import ComputedFieldInfo, FieldInfo
    from ..main import BaseModel
    from ..types import Discriminator
    from ._dataclasses import StandardDataclass
    from ._schema_generation_shared import GetJsonSchemaFunction

_SUPPORTS_TYPEDDICT = sys.version_info >= (3, 12)

FieldDecoratorInfo = Union[ValidatorDecoratorInfo, FieldValidatorDecoratorInfo, FieldSerializerDecoratorInfo]
FieldDecoratorInfoType = TypeVar('FieldDecoratorInfoType', bound=FieldDecoratorInfo)
AnyFieldDecorator = Union[
    Decorator[ValidatorDecoratorInfo],
    Decorator[FieldValidatorDecoratorInfo],
    Decorator[FieldSerializerDecoratorInfo],
]

ModifyCoreSchemaWrapHandler = GetCoreSchemaHandler
GetCoreSchemaFunction = Callable[[Any, ModifyCoreSchemaWrapHandler], core_schema.CoreSchema]

TUPLE_TYPES: list[type] = [tuple, typing.Tuple]
LIST_TYPES: list[type] = [list, typing.List, collections.abc.MutableSequence]
SET_TYPES: list[type] = [set, typing.Set, collections.abc.MutableSet]
FROZEN_SET_TYPES: list[type] = [frozenset, typing.FrozenSet, collections.abc.Set]
DICT_TYPES: list[type] = [dict, typing.Dict]
IP_TYPES: list[type] = [IPv4Address, IPv4Interface, IPv4Network, IPv6Address, IPv6Interface, IPv6Network]
SEQUENCE_TYPES: list[type] = [typing.Sequence, collections.abc.Sequence]
PATH_TYPES: list[type] = [
    os.PathLike,
    pathlib.Path,
    pathlib.PurePath,
    pathlib.PosixPath,
    pathlib.PurePosixPath,
    pathlib.PureWindowsPath,
]
MAPPING_TYPES = [
    typing.Mapping,
    typing.MutableMapping,
    collections.abc.Mapping,
    collections.abc.MutableMapping,
    collections.OrderedDict,
    typing_extensions.OrderedDict,
    typing.DefaultDict,
    collections.defaultdict,
    collections.Counter,
    typing.Counter,
]
DEQUE_TYPES: list[type] = [collections.deque, typing.Deque]

# Note: This does not play very well with type checkers. For example,
# `a: LambdaType = lambda x: x` will raise a type error by Pyright.
ValidateCallSupportedTypes = Union[
    LambdaType,
    FunctionType,
    MethodType,
    partial,
]

VALIDATE_CALL_SUPPORTED_TYPES = get_args(ValidateCallSupportedTypes)

_mode_to_validator: dict[
    FieldValidatorModes, type[BeforeValidator | AfterValidator | PlainValidator | WrapValidator]
] = {'before': BeforeValidator, 'after': AfterValidator, 'plain': PlainValidator, 'wrap': WrapValidator}


def check_validator_fields_against_field_name(
    info: FieldDecoratorInfo,
    field: str,
) -> bool:
    """Check if field name is in validator fields.

    Args:
        info: The field info.
        field: The field name to check.

    Returns:
        `True` if field name is in validator fields, `False` otherwise.
    """
    if '*' in info.fields:
        return True
    for v_field_name in info.fields:
        if v_field_name == field:
            return True
    return False


def check_decorator_fields_exist(decorators: Iterable[AnyFieldDecorator], fields: Iterable[str]) -> None:
    """Check if the defined fields in decorators exist in `fields` param.

    It ignores the check for a decorator if the decorator has `*` as field or `check_fields=False`.

    Args:
        decorators: An iterable of decorators.
        fields: An iterable of fields name.

    Raises:
        PydanticUserError: If one of the field names does not exist in `fields` param.
    """
    fields = set(fields)
    for dec in decorators:
        if '*' in dec.info.fields:
            continue
        if dec.info.check_fields is False:
            continue

        for field in dec.info.fields:
            if field not in fields:
                raise PydanticUserError(
                    f'Decorators defined with incorrect fields: {dec.cls_ref}.{dec.cls_var_name}'
                    " (use check_fields=False if you're inheriting from the model and intended this)",
                    code='decorator-missing-field',
                )


def filter_field_decorator_info_by_field(
    validator_functions: Iterable[Decorator[FieldDecoratorInfoType]], field: str
) -> list[Decorator[FieldDecoratorInfoType]]:
    return [dec for dec in validator_functions if check_validator_fields_against_field_name(dec.info, field)]


def apply_each_item_validators(
    schema: core_schema.CoreSchema,
    each_item_validators: list[Decorator[ValidatorDecoratorInfo]],
    field_name: str | None,
) -> core_schema.CoreSchema:
    # This V1 compatibility shim should eventually be removed

    # fail early if each_item_validators is empty
    if not each_item_validators:
        return schema

    # push down any `each_item=True` validators
    # note that this won't work for any Annotated types that get wrapped by a function validator
    # but that's okay because that didn't exist in V1
    if schema['type'] == 'nullable':
        schema['schema'] = apply_each_item_validators(schema['schema'], each_item_validators, field_name)
        return schema
    elif schema['type'] == 'tuple':
        if (variadic_item_index := schema.get('variadic_item_index')) is not \
None:
            schema['items_schema'][variadic_item_index] = apply_validators(
                schema['items_schema'][variadic_item_index],
                each_item_validators,
                field_name,
            )
    elif is_list_like_schema_with_items_schema(schema):
        inner_schema = schema.get('items_schema', core_schema.any_schema())
        schema['items_schema'] = apply_validators(inner_schema, each_item_validators, field_name)
    elif schema['type'] == 'dict':
        inner_schema = schema.get('values_schema', core_schema.any_schema())
        schema['values_schema'] = apply_validators(inner_schema, each_item_validators, field_name)
    else:
        raise TypeError(
            f"`@validator(..., each_item=True)` cannot be applied to fields with a schema of {schema['type']}"
        )
    return schema


def _extract_json_schema_info_from_field_info(
    info: FieldInfo | ComputedFieldInfo,
) -> tuple[JsonDict | None, JsonDict | JsonSchemaExtraCallable | None]:
    json_schema_updates = {
        'title': info.title,
        'description': info.description,
        'deprecated': bool(info.deprecated) or info.deprecated == '' or None,
        'examples': to_jsonable_python(info.examples),
    }
    json_schema_updates = {k: v for k, v in json_schema_updates.items() if v is not None}
    return (json_schema_updates or None, info.json_schema_extra)


JsonEncoders = Dict[Type[Any], JsonEncoder]


def _add_custom_serialization_from_json_encoders(
    json_encoders: JsonEncoders | None, tp: Any, schema: CoreSchema
) -> CoreSchema:
    """Iterate over the json_encoders and add the first matching encoder to the schema.

    Args:
        json_encoders: A dictionary of types and their encoder functions.
        tp: The type to check for a matching encoder.
        schema: The schema to add the encoder to.
    """
    if not json_encoders:
        return schema
    if 'serialization' in schema:
        return schema
    # Check the class type and its superclasses for a matching encoder
    # Decimal.__class__.__mro__ (and probably other cases) doesn't include Decimal itself
    # if the type is a GenericAlias (e.g. from list[int]) we need to use __class__ instead of .__mro__
    for base in (tp, *getattr(tp, '__mro__', tp.__class__.__mro__)[:-1]):
        encoder = json_encoders.get(base)
        if encoder is None:
            continue

        warnings.warn(
            f'`json_encoders` is deprecated. See https://docs.pydantic.dev/{version_short()}/concepts/serialization/#custom-serializers for alternatives',
            PydanticDeprecatedSince20,
        )

        # TODO: in theory we should check that the schema accepts a serialization key
        schema['serialization'] = core_schema.plain_serializer_function_ser_schema(encoder, when_used='json')
        return schema

    return schema


def _get_first_non_null(a: Any, b: Any) -> Any:
    """Return the first argument if it is not None, otherwise return the second argument.

    Use case: serialization_alias (argument a) and alias (argument b) are both defined, and serialization_alias is ''.
    This function will return serialization_alias, which is the first argument, even though it is an empty string.
    """
    return a if a is not None else b


class GenerateSchema:
    """Generate core schema for a Pydantic model, dataclass and types like `str`, `datetime`, ... ."""

    __slots__ = (
        '_config_wrapper_stack',
        '_ns_resolver',
        '_typevars_map',
        'field_name_stack',
        'model_type_stack',
        'defs',
    )

    def __init__(
        self,
        config_wrapper: ConfigWrapper,
        ns_resolver: NsResolver | None = None,
        typevars_map: dict[Any, Any] | None = None,
    ) -> None:
        # we need a stack for recursing into nested models
        self._config_wrapper_stack = ConfigWrapperStack(config_wrapper)
        self._ns_resolver = ns_resolver or NsResolver()
        self._typevars_map = typevars_map
        self.field_name_stack = _FieldNameStack()
        self.model_type_stack = _ModelTypeStack()
        self.defs = _Definitions()

    def __init_subclass__(cls) -> None:
        super().__init_subclass__()
        warnings.warn(
            'Subclassing `GenerateSchema` is not supported. The API is highly subject to change in minor versions.',
            UserWarning,
            stacklevel=2,
        )

    @property
    def _config_wrapper(self) -> ConfigWrapper:
        return self._config_wrapper_stack.tail

    @property
    def _types_namespace(self) -> NamespacesTuple:
        return self._ns_resolver.types_namespace

    @property
    def _arbitrary_types(self) -> bool:
        return self._config_wrapper.arbitrary_types_allowed

    # the following methods can be overridden but should be considered
    # unstable / private APIs
    def _list_schema(self, items_type: Any) -> CoreSchema:
        return core_schema.list_schema(self.generate_schema(items_type))

    def _dict_schema(self, keys_type: Any, values_type: Any) -> CoreSchema:
        return core_schema.dict_schema(self.generate_schema(keys_type), self.generate_schema(values_type))

    def _set_schema(self, items_type: Any) -> CoreSchema:
        return core_schema.set_schema(self.generate_schema(items_type))

    def _frozenset_schema(self, items_type: Any) -> CoreSchema:
        return core_schema.frozenset_schema(self.generate_schema(items_type))

    def _enum_schema(self, enum_type: type[Enum]) -> CoreSchema:
        cases: list[Any] = list(enum_type.__members__.values())

        enum_ref = get_type_ref(enum_type)
        description = None if not enum_type.__doc__ else inspect.cleandoc(enum_type.__doc__)
        if (
            description == 'An enumeration.'
        ):  # This is the default value provided by enum.EnumMeta.__new__; don't use it
            description = None
        js_updates = {'title': enum_type.__name__, 'description': description}
        js_updates = {k: v for k, v in js_updates.items() if v is not None}

        sub_type: Literal['str', 'int', 'float'] | None = None
        if issubclass(enum_type, int):
            sub_type = 'int'
            value_ser_type: core_schema.SerSchema = core_schema.simple_ser_schema('int')
        elif issubclass(enum_type, str):
            # this handles `StrEnum` (3.11 only), and also `Foobar(str, Enum)`
            sub_type = 'str'
            value_ser_type = core_schema.simple_ser_schema('str')
        elif issubclass(enum_type, float):
            sub_type = 'float'
            value_ser_type = core_schema.simple_ser_schema('float')
        else:
            # TODO this is an ugly hack, how do we trigger an Any schema for serialization?
            value_ser_type = core_schema.plain_serializer_function_ser_schema(lambda x: x)

        if cases:

            def get_json_schema(schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue:
                json_schema = handler(schema)
                original_schema = handler.resolve_ref_schema(json_schema)
                original_schema.update(js_updates)
                return json_schema

            # we don't want to add the missing to the schema if it's the default one
            default_missing = getattr(enum_type._missing_, '__func__', None) is Enum._missing_.__func__  # pyright: ignore[reportFunctionMemberAccess]
            enum_schema = core_schema.enum_schema(
                enum_type,
                cases,
                sub_type=sub_type,
                missing=None if default_missing else enum_type._missing_,
                ref=enum_ref,
                metadata={'pydantic_js_functions': [get_json_schema]},
            )

            if self._config_wrapper.use_enum_values:
                enum_schema = core_schema.no_info_after_validator_function(
                    attrgetter('value'), enum_schema, serialization=value_ser_type
                )

            return enum_schema

        else:

            def get_json_schema_no_cases(_, handler: GetJsonSchemaHandler) -> JsonSchemaValue:
                json_schema = handler(core_schema.enum_schema(enum_type, cases, sub_type=sub_type, ref=enum_ref))
                original_schema = handler.resolve_ref_schema(json_schema)
                original_schema.update(js_updates)
                return \
json_schema

            # Use an isinstance check for enums with no cases.
            # The most important use case for this is creating TypeVar bounds for generics that should
            # be restricted to enums. This is more consistent than it might seem at first, since you can only
            # subclass enum.Enum (or subclasses of enum.Enum) if all parent classes have no cases.
            # We use the get_json_schema function when an Enum subclass has been declared with no cases
            # so that we can still generate a valid json schema.
            return core_schema.is_instance_schema(
                enum_type,
                metadata={'pydantic_js_functions': [get_json_schema_no_cases]},
            )

    def _ip_schema(self, tp: Any) -> CoreSchema:
        from ._validators import IP_VALIDATOR_LOOKUP, IpType

        ip_type_json_schema_format: dict[type[IpType], str] = {
            IPv4Address: 'ipv4',
            IPv4Network: 'ipv4network',
            IPv4Interface: 'ipv4interface',
            IPv6Address: 'ipv6',
            IPv6Network: 'ipv6network',
            IPv6Interface: 'ipv6interface',
        }

        def ser_ip(ip: Any, info: core_schema.SerializationInfo) -> str | IpType:
            if not isinstance(ip, (tp, str)):
                raise PydanticSerializationUnexpectedValue(
                    f"Expected `{tp}` but got `{type(ip)}` with value `'{ip}'` - serialized value may not be as expected."
                )
            if info.mode == 'python':
                return ip
            return str(ip)

        return core_schema.lax_or_strict_schema(
            lax_schema=core_schema.no_info_plain_validator_function(IP_VALIDATOR_LOOKUP[tp]),
            strict_schema=core_schema.json_or_python_schema(
                json_schema=core_schema.no_info_after_validator_function(tp, core_schema.str_schema()),
                python_schema=core_schema.is_instance_schema(tp),
            ),
            serialization=core_schema.plain_serializer_function_ser_schema(ser_ip, info_arg=True, when_used='always'),
            metadata={
                'pydantic_js_functions': [lambda _1, _2: {'type': 'string', 'format': ip_type_json_schema_format[tp]}]
            },
        )

    def _fraction_schema(self) -> CoreSchema:
        """Support for [`fractions.Fraction`][fractions.Fraction]."""
        from ._validators import fraction_validator

        # TODO: note, this is a fairly common pattern, re lax / strict for attempted type coercion,
        # can we use a helper function to reduce boilerplate?
        return core_schema.lax_or_strict_schema(
            lax_schema=core_schema.no_info_plain_validator_function(fraction_validator),
            strict_schema=core_schema.json_or_python_schema(
                json_schema=core_schema.no_info_plain_validator_function(fraction_validator),
                python_schema=core_schema.is_instance_schema(Fraction),
            ),
            # use str serialization to guarantee round trip behavior
            serialization=core_schema.to_string_ser_schema(when_used='always'),
            metadata={'pydantic_js_functions': [lambda _1, _2: {'type': 'string', 'format': 'fraction'}]},
        )

    def _arbitrary_type_schema(self, tp: Any) -> CoreSchema:
        if not isinstance(tp, type):
            warn(
                f'{tp!r} is not a Python type (it may be an instance of an object),'
                ' Pydantic will allow any object with no validation since we cannot even'
                ' enforce that the input is an instance of the given type.'
                ' To get rid of this error wrap the type with `pydantic.SkipValidation`.',
                UserWarning,
            )
            return core_schema.any_schema()
        return core_schema.is_instance_schema(tp)

    def _unknown_type_schema(self, obj: Any) -> CoreSchema:
        raise PydanticSchemaGenerationError(
            f'Unable to generate pydantic-core schema for {obj!r}. '
            'Set `arbitrary_types_allowed=True` in the model_config to ignore this error'
            ' or implement `__get_pydantic_core_schema__` on your type to fully support it.'
            '\n\nIf you got this error by calling handler(<some type>) within'
            ' `__get_pydantic_core_schema__` then you likely need to call'
            ' `handler.generate_schema(<some type>)` since we do not call'
            ' `__get_pydantic_core_schema__` on `<some type>` otherwise to avoid infinite recursion.'
        )

    def _apply_discriminator_to_union(
        self, schema: CoreSchema, discriminator: str | Discriminator | None
    ) -> CoreSchema:
        if discriminator is None:
            return schema
        try:
            return _discriminated_union.apply_discriminator(
                schema,
                discriminator,
            )
        except _discriminated_union.MissingDefinitionForUnionRef:
            # defer until defs are resolved
            _discriminated_union.set_discriminator_in_metadata(
                schema,
                discriminator,
            )
            return schema

    class CollectedInvalid(Exception):
        pass

    def clean_schema(self, schema: CoreSchema) -> CoreSchema:
        schema = self.collect_definitions(schema)
        schema = simplify_schema_references(schema)
        if collect_invalid_schemas(schema):
            raise self.CollectedInvalid()
        schema = _discriminated_union.apply_discriminators(schema)
        schema = validate_core_schema(schema)
        return schema

    def collect_definitions(self, schema: CoreSchema) -> CoreSchema:
        ref = cast('str | None', schema.get('ref', None))
        if ref:
            self.defs.definitions[ref] = schema
        if 'ref' in schema:
            schema = core_schema.definition_reference_schema(schema['ref'])
        return core_schema.definitions_schema(
            schema,
            list(self.defs.definitions.values()),
        )

    def _add_js_function(self, metadata_schema: CoreSchema, js_function: Callable[..., Any]) -> None:
        metadata = metadata_schema.get('metadata', {})
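The `# use str serialization to guarantee round trip behavior` comment in `_fraction_schema` can be illustrated with plain stdlib code. This snippet is an illustration only, not part of the pydantic source:

```python
# Illustration (not part of this module): why a Fraction is serialized to a
# string rather than a float - the str form round-trips exactly.
from fractions import Fraction

f = Fraction(1, 3)
assert Fraction(str(f)) == f    # '1/3' parses back to the exact value
assert Fraction(float(f)) != f  # routing through float loses exactness
```

This is why the serializer uses `to_string_ser_schema` and the JSON schema advertises `{'type': 'string', 'format': 'fraction'}`.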
        pydantic_js_functions = metadata.setdefault('pydantic_js_functions', [])
        # because of how we generate core schemas for nested generic models
        # we can end up adding `BaseModel.__get_pydantic_json_schema__` multiple times
        # this check may fail to catch duplicates if the function is a `functools.partial`
        # or something like that, but if it does it'll fail by inserting the duplicate
        if js_function not in pydantic_js_functions:
            pydantic_js_functions.append(js_function)
        metadata_schema['metadata'] = metadata

    def generate_schema(
        self,
        obj: Any,
        from_dunder_get_core_schema: bool = True,
    ) -> core_schema.CoreSchema:
        """Generate core schema.

        Args:
            obj: The object to generate core schema for.
            from_dunder_get_core_schema: Whether to generate schema from either the
                `__get_pydantic_core_schema__` function or `__pydantic_core_schema__` property.

        Returns:
            The generated core schema.

        Raises:
            PydanticUndefinedAnnotation:
                If it is not possible to evaluate forward reference.
            PydanticSchemaGenerationError:
                If it is not possible to generate pydantic-core schema.
            TypeError:
                - If `alias_generator` returns a disallowed type (must be str, AliasPath or AliasChoices).
                - If V1 style validator with `each_item=True` applied on a wrong field.
            PydanticUserError:
                - If `typing.TypedDict` is used instead of `typing_extensions.TypedDict` on Python < 3.12.
                - If `__modify_schema__` method is used instead of `__get_pydantic_json_schema__`.
        """
        schema: CoreSchema | None = None
        if from_dunder_get_core_schema:
            from_property = self._generate_schema_from_property(obj, obj)
            if from_property is not None:
                schema = from_property

        if schema is None:
            schema = self._generate_schema_inner(obj)

        metadata_js_function = _extract_get_pydantic_json_schema(obj, schema)
        if metadata_js_function is not None:
            metadata_schema = resolve_original_schema(schema, self.defs.definitions)
            if metadata_schema:
                self._add_js_function(metadata_schema, metadata_js_function)

        schema = _add_custom_serialization_from_json_encoders(self._config_wrapper.json_encoders, obj, schema)

        return schema

    def _model_schema(self, cls: type[BaseModel]) -> core_schema.CoreSchema:
        """Generate schema for a Pydantic model."""
        with self.defs.get_schema_or_ref(cls) as (model_ref, maybe_schema):
            if maybe_schema is not None:
                return maybe_schema

            fields = getattr(cls, '__pydantic_fields__', {})
            decorators = cls.__pydantic_decorators__
            computed_fields = decorators.computed_fields
            check_decorator_fields_exist(
                chain(
                    decorators.field_validators.values(),
                    decorators.field_serializers.values(),
                    decorators.validators.values(),
                ),
                {*fields.keys(), *computed_fields.keys()},
            )
            config_wrapper = ConfigWrapper(cls.model_config, check=False)
            core_config = config_wrapper.core_config(title=cls.__name__)
            model_validators = decorators.model_validators.values()

            with self._config_wrapper_stack.push(config_wrapper), self._ns_resolver.push(cls):
                extras_schema = None
                if core_config.get('extra_fields_behavior') == 'allow':
                    assert cls.__mro__[0] is cls
                    assert cls.__mro__[-1] is object
                    for candidate_cls in cls.__mro__[:-1]:
                        extras_annotation = getattr(candidate_cls, '__annotations__', {}).get(
                            '__pydantic_extra__', None
                        )
                        if extras_annotation is not None:
                            if isinstance(extras_annotation, str):
                                extras_annotation = _typing_extra.eval_type_backport(
                                    _typing_extra._make_forward_ref(
                                        extras_annotation, is_argument=False, is_class=True
                                    ),
                                    *self._types_namespace,
                                )
                            tp = get_origin(extras_annotation)
                            if tp not in (Dict, dict):
                                raise PydanticSchemaGenerationError(
                                    'The type annotation for `__pydantic_extra__` must be `Dict[str, ...]`'
                                )
                            extra_items_type = self._get_args_resolving_forward_refs(
                                extras_annotation,
                                required=True,
                            )[1]
                            if not _typing_extra.is_any(extra_items_type):
                                extras_schema = self.generate_schema(extra_items_type)
                                break

                generic_origin: type[BaseModel] | None = getattr(cls, '__pydantic_generic_metadata__', {}).get('origin')

                if cls.__pydantic_root_model__:
                    root_field = self._common_field_schema('root', fields['root'], decorators)
                    inner_schema = root_field['schema']
                    inner_schema = apply_model_validators(inner_schema, model_validators, 'inner')
                    model_schema = core_schema.model_schema(
                        cls,
                        inner_schema,
                        generic_origin=generic_origin,
                        custom_init=getattr(cls, '__pydantic_custom_init__', None),
                        root_model=True,
                        post_init=getattr(cls, '__pydantic_post_init__', None),
                        config=core_config,
                        ref=model_ref,
                    )
                else:
                    fields_schema: core_schema.CoreSchema = core_schema.model_fields_schema(
                        {k: self._generate_md_field_schema(k, v, decorators) for k, v in fields.items()},
                        computed_fields=[
                            self._computed_field_schema(d, decorators.field_serializers)
                            for d in computed_fields.values()
                        ],
                        extras_schema=extras_schema,
                        model_name=cls.__name__,
                    )
                    inner_schema = apply_validators(fields_schema, decorators.root_validators.values(), None)
                    new_inner_schema = define_expected_missing_refs(inner_schema, recursively_defined_type_refs())
                    if new_inner_schema is not None:
                        inner_schema = new_inner_schema
                    inner_schema = apply_model_validators(inner_schema, model_validators, 'inner')

                    model_schema = core_schema.model_schema(
                        cls,
                        inner_schema,
                        generic_origin=generic_origin,
                        custom_init=getattr(cls, '__pydantic_custom_init__', None),
                        root_model=False,
                        post_init=getattr(cls, '__pydantic_post_init__', None),
                        config=core_config,
                        ref=model_ref,
                    )

                schema = self._apply_model_serializers(model_schema, decorators.model_serializers.values())
                schema = apply_model_validators(schema, model_validators, 'outer')
                self.defs.definitions[model_ref] = schema
                return core_schema.definition_reference_schema(model_ref)

    def _unpack_refs_defs(self, schema: CoreSchema) -> CoreSchema:
        """Unpack all 'definitions' schemas into `GenerateSchema.defs.definitions`
        and return the inner schema.
        """
        if schema['type'] == 'definitions':
            definitions = self.defs.definitions
            for s in schema['definitions']:
                definitions[s['ref']] = s  # type: ignore
            return schema['schema']
        return schema

    def _resolve_self_type(self, obj: Any) -> Any:
        obj = self.model_type_stack.get()
        if obj is None:
            raise PydanticUserError('`typing.Self` is invalid in this context', code='invalid-self-type')
        return obj

    def _generate_schema_from_property(self, obj: Any, source: Any) -> core_schema.CoreSchema | None:
        """Try to generate schema from either the `__get_pydantic_core_schema__` function
        or `__pydantic_core_schema__` property.

        Note: `__get_pydantic_core_schema__` takes priority so it can
        decide whether to use a `__pydantic_core_schema__` attribute, or generate a fresh schema.
        """
        # avoid calling `__get_pydantic_core_schema__` if we've already visited this object
        if _typing_extra.is_self(obj):
            obj = self._resolve_self_type(obj)
        with self.defs.get_schema_or_ref(obj) as (_, maybe_schema):
            if maybe_schema is not None:
                return maybe_schema
        if obj is source:
            ref_mode = 'unpack'
        else:
            ref_mode = 'to-def'

        schema: CoreSchema

        if (get_schema := getattr(obj, '__get_pydantic_core_schema__', None)) is not None:
            schema = get_schema(
                source, CallbackGetCoreSchemaHandler(self._generate_schema_inner, self, ref_mode=ref_mode)
            )
        elif (
            hasattr(obj, '__dict__')
            # In some cases (e.g. a stdlib dataclass subclassing a Pydantic dataclass),
            # doing an attribute access to get the schema will result in the parent schema
            # being fetched. Thus, only look for the current obj's dict:
            and (existing_schema := obj.__dict__.get('__pydantic_core_schema__')) is not None
            and not isinstance(existing_schema, MockCoreSchema)
        ):
            schema = existing_schema
        elif (validators := getattr(obj, '__get_validators__', None)) is not None:
            from pydantic.v1 import BaseModel as BaseModelV1

            if issubclass(obj, BaseModelV1):
                warn(
                    f'Mixing V1 models and V2 models (or constructs, like `TypeAdapter`) is not supported. Please upgrade `{obj.__name__}` to V2.',
                    UserWarning,
                )
            else:
                warn(
                    '`__get_validators__` is deprecated and will be removed, use `__get_pydantic_core_schema__` instead.',
                    PydanticDeprecatedSince20,
                )
            schema = core_schema.chain_schema([core_schema.with_info_plain_validator_function(v) for v in validators()])
        else:
            # we have no existing schema information on the property, exit early so that we can go generate a schema
            return None

        schema = self._unpack_refs_defs(schema)

        if is_function_with_inner_schema(schema):
            ref = schema['schema'].pop('ref', None)  # pyright: ignore[reportCallIssue, reportArgumentType]
            if ref:
                schema['ref'] = ref
        else:
            ref = get_ref(schema)

        if ref:
            self.defs.definitions[ref] = schema
            return core_schema.definition_reference_schema(ref)

        return schema

    def _resolve_forward_ref(self, obj: Any) -> Any:
        # we assume that types_namespace has the target of forward references in its scope,
        # but this could fail, for example, if calling Validator on an imported type which contains
        # forward references to other types only defined in the module from which it was imported
        # `Validator(SomeImportedTypeAliasWithAForwardReference)`
        # or the equivalent for BaseModel
        # class Model(BaseModel):
        #     x: SomeImportedTypeAliasWithAForwardReference
        try:
            obj = _typing_extra.eval_type_backport(obj, *self._types_namespace)
        except NameError as e:
            raise PydanticUndefinedAnnotation.from_name_error(e) from e

        # if obj is still a ForwardRef, it means we can't evaluate it, raise PydanticUndefinedAnnotation
        if isinstance(obj, ForwardRef):
            raise PydanticUndefinedAnnotation(
                obj.__forward_arg__, f'Unable to evaluate forward reference {obj}'
            )

        if self._typevars_map:
            obj = replace_types(obj, self._typevars_map)

        return obj

    @overload
    def _get_args_resolving_forward_refs(self, obj: Any, required: Literal[True]) -> tuple[Any, ...]: ...

    @overload
    def _get_args_resolving_forward_refs(self, obj: Any) -> tuple[Any, ...] | None: ...

    def _get_args_resolving_forward_refs(self, obj: Any, required: bool = False) -> tuple[Any, ...] | None:
        args = get_args(obj)
        if args:
            if sys.version_info >= (3, 9):
                from types import GenericAlias

                if isinstance(obj, GenericAlias):
                    # PEP 585 generic aliases don't convert args to ForwardRefs, unlike `typing.List/Dict` etc.
                    args = (_typing_extra._make_forward_ref(a) if isinstance(a, str) else a for a in args)
            args = tuple(self._resolve_forward_ref(a) if isinstance(a, ForwardRef) else a for a in args)
        elif required:  # pragma: no cover
            raise TypeError(f'Expected {obj} to have generic parameters but it had none')
        return args

    def _get_first_arg_or_any(self, obj: Any) -> Any:
        args = self._get_args_resolving_forward_refs(obj)
        if not args:
            return Any
        return args[0]

    def _get_first_two_args_or_any(self, obj: Any) -> tuple[Any, Any]:
        args = self._get_args_resolving_forward_refs(obj)
        if not args:
            return (Any, Any)
        if len(args) < 2:
            origin = get_origin(obj)
            raise TypeError(f'Expected two type arguments for {origin}, got 1')
        return args[0], args[1]

    def _generate_schema_inner(self, obj: Any) -> core_schema.CoreSchema:
        if _typing_extra.is_annotated(obj):
            return self._annotated_schema(obj)

        if isinstance(obj, dict):
            # we assume this is already a valid schema
            return obj  # type: ignore[return-value]

        if isinstance(obj, str):
            obj = ForwardRef(obj)

        if isinstance(obj, ForwardRef):
            return self.generate_schema(self._resolve_forward_ref(obj))

        BaseModel = import_cached_base_model()

        if lenient_issubclass(obj, BaseModel):
            with self.model_type_stack.push(obj):
                return self._model_schema(obj)

        if isinstance(obj, PydanticRecursiveRef):
            return core_schema.definition_reference_schema(schema_ref=obj.type_ref)

        return self.match_type(obj)

    def match_type(self, obj: Any) -> core_schema.CoreSchema:  # noqa: C901
        """Main mapping of types to schemas.

        The general structure is a series of if statements starting with the simple cases
        (non-generic primitive types) and then handling generics and other more complex cases.

        Each case either generates a schema directly, calls into a public user-overridable method
        (like `GenerateSchema.tuple_variable_schema`) or calls into a private method that handles some
        boilerplate before calling into the user-facing method (e.g. `GenerateSchema._tuple_schema`).

        The idea is that we'll evolve this into adding more and more user facing methods over time
        as they get requested and we figure out what the right API for them is.
        """
        if obj is str:
            return core_schema.str_schema()
        elif obj is bytes:
            return core_schema.bytes_schema()
        elif obj is int:
            return core_schema.int_schema()
        elif obj is float:
            return core_schema.float_schema()
        elif obj is bool:
            return core_schema.bool_schema()
        elif obj is complex:
            return core_schema.complex_schema()
        elif _typing_extra.is_any(obj) or obj is object:
            return core_schema.any_schema()
        elif obj is datetime.date:
            return core_schema.date_schema()
        elif obj is datetime.datetime:
            return core_schema.datetime_schema()
        elif obj is datetime.time:
            return core_schema.time_schema()
        elif obj is datetime.timedelta:
            return core_schema.timedelta_schema()
        elif obj is Decimal:
            return core_schema.decimal_schema()
        elif obj is UUID:
            return core_schema.uuid_schema()
        elif obj is Url:
            return core_schema.url_schema()
        elif obj is Fraction:
            return self._fraction_schema()
        elif obj is MultiHostUrl:
            return core_schema.multi_host_url_schema()
        elif obj is None or obj is _typing_extra.NoneType:
            return core_schema.none_schema()
        elif obj in IP_TYPES:
            return self._ip_schema(obj)
        elif obj in TUPLE_TYPES:
            return self._tuple_schema(obj)
        elif obj in LIST_TYPES:
            return self._list_schema(Any)
        elif obj in SET_TYPES:
            return self._set_schema(Any)
        elif obj in FROZEN_SET_TYPES:
            return self._frozenset_schema(Any)
        elif obj in SEQUENCE_TYPES:
            return self._sequence_schema(Any)
        elif obj in DICT_TYPES:
            return self._dict_schema(Any, Any)
        elif _typing_extra.is_type_alias_type(obj):
            return self._type_alias_type_schema(obj)
        elif obj is type:
            return self._type_schema()
        elif _typing_extra.is_callable(obj):
            return core_schema.callable_schema()
        elif _typing_extra.is_literal(obj):
            return self._literal_schema(obj)
        elif is_typeddict(obj):
            return self._typed_dict_schema(obj, None)
        elif _typing_extra.is_namedtuple(obj):
            return self._namedtuple_schema(obj, None)
        elif _typing_extra.is_new_type(obj):
            # NewType, can't use isinstance because it fails <3.10
            return self.generate_schema(obj.__supertype__)
        elif obj is re.Pattern:
            return self._pattern_schema(obj)
        elif _typing_extra.is_hashable(obj):
            return self._hashable_schema()
        elif isinstance(obj, typing.TypeVar):
            return self._unsubstituted_typevar_schema(obj)
        elif _typing_extra.is_finalvar(obj):
            if obj is Final:
                return core_schema.any_schema()
            return self.generate_schema(
                self._get_first_arg_or_any(obj),
            )
        elif isinstance(obj, VALIDATE_CALL_SUPPORTED_TYPES):
            return self._call_schema(obj)
        elif inspect.isclass(obj) and issubclass(obj, Enum):
            return self._enum_schema(obj)
        elif _typing_extra.is_zoneinfo_type(obj):
            return self._zoneinfo_schema()

        if dataclasses.is_dataclass(obj):
            return self._dataclass_schema(obj, None)

        origin = get_origin(obj)
        if origin is not None:
            return self._match_generic_type(obj, origin)

        res = self._get_prepare_pydantic_annotations_for_known_type(obj, ())
        if res is not None:
            source_type, annotations = res
            return self._apply_annotations(source_type, annotations)

        if self._arbitrary_types:
            return self._arbitrary_type_schema(obj)
        return self._unknown_type_schema(obj)

    def _match_generic_type(self, obj: Any, origin: Any) -> CoreSchema:  # noqa: C901
        # Need to handle generic dataclasses before looking for the schema properties because attribute accesses
        # on _GenericAlias delegate to the origin type, so lose the information about the concrete parametrization
        # As a result, currently, there is no way to cache the schema for generic dataclasses. This may be possible
        # to resolve by modifying the value returned by `Generic.__class_getitem__`, but that is a dangerous game.
        if dataclasses.is_dataclass(origin):
            return self._dataclass_schema(obj, origin)  # pyright: ignore[reportArgumentType]
        if _typing_extra.is_namedtuple(origin):
            return self._namedtuple_schema(obj, origin)

        from_property = self._generate_schema_from_property(origin, obj)
        if from_property is not None:
            return from_property

        if _typing_extra.is_type_alias_type(origin):
            return self._type_alias_type_schema(obj)
        elif _typing_extra.origin_is_union(origin):
            return self._union_schema(obj)
        elif origin in TUPLE_TYPES:
            return self._tuple_schema(obj)
        elif origin in LIST_TYPES:
            return self._list_schema(self._get_first_arg_or_any(obj))
        elif origin in SET_TYPES:
            return self._set_schema(self._get_first_arg_or_any(obj))
        elif origin in FROZEN_SET_TYPES:
            return self._frozenset_schema(self._get_first_arg_or_any(obj))
        elif origin in DICT_TYPES:
            return self._dict_schema(*self._get_first_two_args_or_any(obj))
        elif is_typeddict(origin):
            return self._typed_dict_schema(obj, origin)
        elif origin in (typing.Type, type):
            return self._subclass_schema(obj)
        elif origin in SEQUENCE_TYPES:
            return self._sequence_schema(self._get_first_arg_or_any(obj))
        elif origin in {typing.Iterable, collections.abc.Iterable, typing.Generator, collections.abc.Generator}:
            return self._iterable_schema(obj)
        elif origin in (re.Pattern, typing.Pattern):
            return self._pattern_schema(obj)

        res = self._get_prepare_pydantic_annotations_for_known_type(obj, ())
        if res is not None:
            source_type, annotations = res
            return self._apply_annotations(source_type, annotations)

        if self._arbitrary_types:
            return self._arbitrary_type_schema(origin)
        return self._unknown_type_schema(obj)

    def _generate_td_field_schema(
        self,
        name: str,
        field_info: FieldInfo,
        decorators: DecoratorInfos,
        *,
        required: bool = True,
    ) -> core_schema.TypedDictField:
        """Prepare a TypedDictField to represent a model or typeddict field."""
        common_field = self._common_field_schema(name, field_info, decorators)
        return core_schema.typed_dict_field(
            common_field['schema'],
            required=False if not field_info.is_required() else required,
            serialization_exclude=common_field['serialization_exclude'],
            validation_alias=common_field['validation_alias'],
            serialization_alias=common_field['serialization_alias'],
            metadata=common_field['metadata'],
        )

    def _generate_md_field_schema(
        self,
        name: str,
        field_info: FieldInfo,
        decorators: DecoratorInfos,
    ) -> core_schema.ModelField:
        """Prepare a ModelField to represent a model field."""
        common_field = self._common_field_schema(name, field_info, decorators)
        return core_schema.model_field(
            common_field['schema'],
            serialization_exclude=common_field['serialization_exclude'],
            validation_alias=common_field['validation_alias'],
            serialization_alias=common_field['serialization_alias'],
            frozen=common_field['frozen'],
            metadata=common_field['metadata'],
        )

    def _generate_dc_field_schema(
        self,
        name: str,
        field_info: FieldInfo,
        decorators: DecoratorInfos,
    ) -> core_schema.DataclassField:
        """Prepare a DataclassField to represent the parameter/field, of a dataclass."""
        common_field = self._common_field_schema(name, field_info, decorators)
        return core_schema.dataclass_field(
            name,
            common_field['schema'],
            init=field_info.init,
            init_only=field_info.init_var or None,
            kw_only=None if field_info.kw_only else False,
            serialization_exclude=common_field['serialization_exclude'],
            validation_alias=common_field['validation_alias'],
            serialization_alias=common_field['serialization_alias'],
            frozen=common_field['frozen'],
            metadata=common_field['metadata'],
        )

    @staticmethod
    def _apply_alias_generator_to_field_info(
        alias_generator: Callable[[str], str] | AliasGenerator, field_info: FieldInfo, field_name: str
    ) -> None:
        """Apply an alias_generator to aliases on a FieldInfo instance if appropriate.

        Args:
            alias_generator: A callable that takes a string and returns a string, or an AliasGenerator instance.
            field_info: The FieldInfo instance to which the alias_generator is (maybe) applied.
            field_name: The name of the field from which to generate the alias.
        """
        # Apply an alias_generator if
        # 1. An alias is not specified
        # 2. An alias is specified, but the priority is <= 1
        if (
            field_info.alias_priority is None
            or field_info.alias_priority <= 1
            or field_info.alias is None
            or field_info.validation_alias is None
            or field_info.serialization_alias is None
        ):
            alias, validation_alias, serialization_alias = None, None, None

            if isinstance(alias_generator, AliasGenerator):
                alias, validation_alias, serialization_alias = alias_generator.generate_aliases(field_name)
            elif isinstance(alias_generator, Callable):
                alias = alias_generator(field_name)
                if not isinstance(alias, str):
                    raise TypeError(f'alias_generator {alias_generator} must return str, not {alias.__class__}')

            # if priority is not set, we set to 1
            # which supports the case where the alias_generator from a child class is used
            # to generate an alias for a field in a parent class
            if field_info.alias_priority is None or field_info.alias_priority <= 1:
                field_info.alias_priority = 1

            # if the priority is 1, then we set the aliases to the generated alias
            if field_info.alias_priority == 1:
                field_info.serialization_alias = _get_first_non_null(serialization_alias, alias)
                field_info.validation_alias = _get_first_non_null(validation_alias, alias)
                field_info.alias = alias

            # if any of the aliases are not set, then we set them to the corresponding generated alias
            if field_info.alias is None:
                field_info.alias = alias
            if field_info.serialization_alias is None:
                field_info.serialization_alias = _get_first_non_null(serialization_alias, alias)
            if field_info.validation_alias is None:
                field_info.validation_alias = _get_first_non_null(validation_alias, alias)

    @staticmethod
    def _apply_alias_generator_to_computed_field_info(
        alias_generator: Callable[[str], str] | AliasGenerator,
        computed_field_info: ComputedFieldInfo,
        computed_field_name: str,
    ):
        """Apply an alias_generator to alias on a ComputedFieldInfo instance if appropriate.

        Args:
            alias_generator: A callable that takes a string and returns a string, or an AliasGenerator instance.
            computed_field_info: The ComputedFieldInfo instance to which the alias_generator is (maybe) applied.
            computed_field_name: The name of the computed field from which to generate the alias.
        """
        # Apply an alias_generator if
        # 1. An alias is not specified
        # 2. An alias is specified, but the priority is <= 1
        if (
            computed_field_info.alias_priority is None
            or computed_field_info.alias_priority <= 1
            or computed_field_info.alias is None
        ):
            alias, validation_alias, serialization_alias = None, None, None

            if isinstance(alias_generator, AliasGenerator):
                alias, validation_alias, serialization_alias = alias_generator.generate_aliases(computed_field_name)
            elif isinstance(alias_generator, Callable):
                alias = alias_generator(computed_field_name)
                if not isinstance(alias, str):
                    raise TypeError(f'alias_generator {alias_generator} must return str, not {alias.__class__}')

            # if priority is not set, we set to 1
            # which supports the case where the alias_generator from a child class is used
            # to generate an alias for a field in a parent class
            if computed_field_info.alias_priority is None or computed_field_info.alias_priority <= 1:
                computed_field_info.alias_priority = 1

            # if the priority is 1, then we set the aliases to the generated alias
            # note that we use the serialization_alias with priority over alias, as computed_field
            # aliases are used for serialization only (not validation)
            if computed_field_info.alias_priority == 1:
                computed_field_info.alias = _get_first_non_null(serialization_alias, alias)

    @staticmethod
    def _apply_field_title_generator_to_field_info(
        config_wrapper: ConfigWrapper, field_info: FieldInfo | ComputedFieldInfo, field_name: str
    ) -> None:
        """Apply a field_title_generator on a FieldInfo or ComputedFieldInfo instance if appropriate

        Args:
            config_wrapper: The config of the model
            field_info: The FieldInfo or ComputedField instance to which the title_generator is (maybe) applied.
            field_name: The name of the field from which to generate the title.
        """
        field_title_generator = field_info.field_title_generator or config_wrapper.field_title_generator
        if field_title_generator is None:
            return
        if field_info.title is None:
            title = field_title_generator(field_name, field_info)  # type: ignore
            if not isinstance(title, str):
                raise TypeError(f'field_title_generator {field_title_generator} must return str, not {title.__class__}')

            field_info.title = title

    def _common_field_schema(  # C901
        self, name: str, field_info: FieldInfo, decorators: DecoratorInfos
    ) -> _CommonField:
        # Update FieldInfo annotation if appropriate:
        FieldInfo = import_cached_field_info()
        if not field_info.evaluated:
            # TODO Can we use field_info.apply_typevars_map here?
            try:
                evaluated_type = _typing_extra.eval_type(field_info.annotation, *self._types_namespace)
            except NameError as e:
                raise PydanticUndefinedAnnotation.from_name_error(e) from e
            evaluated_type = replace_types(evaluated_type, self._typevars_map)
            field_info.evaluated = True
            if not has_instance_in_type(evaluated_type, PydanticRecursiveRef):
                new_field_info = FieldInfo.from_annotation(evaluated_type)
                field_info.annotation = new_field_info.annotation

                # Handle any field info attributes that may have been obtained from now-resolved annotations
                for k, v in new_field_info._attributes_set.items():
                    # If an attribute is already set, it means it was set by assigning to a call to Field (or just a
                    # default value), and that should take the highest priority. So don't overwrite existing attributes.
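The alias and title generator helpers above all enforce the same contract: a plain-callable generator receives the field name (plus the field info, for title generators) and must return `str`, otherwise a `TypeError` is raised. A minimal stdlib sketch of generators satisfying that contract (the function names here are hypothetical, not part of pydantic):

```python
# Illustration (not part of this module): the shape of generator callables
# the helpers above accept and validate.
def title_generator(field_name, field_info):
    # must return str; otherwise the title helper raises TypeError
    return field_name.replace('_', ' ').title()

def alias_generator(field_name):
    # plain-callable alias generators must also return str (camelCase here)
    first, *rest = field_name.split('_')
    return first + ''.join(part.title() for part in rest)

assert title_generator('created_at', None) == 'Created At'
assert alias_generator('created_at') == 'createdAt'
```

An `AliasGenerator` instance can instead supply separate `alias`, `validation_alias`, and `serialization_alias` values via `generate_aliases`, as the `isinstance(alias_generator, AliasGenerator)` branches above show.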
                    # We skip over "attributes" that are present in the metadata_lookup dict because these won't
                    # actually end up as attributes of the `FieldInfo` instance.
                    if k not in field_info._attributes_set and k not in field_info.metadata_lookup:
                        setattr(field_info, k, v)

                # Finally, ensure the field info also reflects all the `_attributes_set` that are actually metadata.
                field_info.metadata = [*new_field_info.metadata, *field_info.metadata]

        source_type, annotations = field_info.annotation, field_info.metadata

        def set_discriminator(schema: CoreSchema) -> CoreSchema:
            schema = self._apply_discriminator_to_union(schema, field_info.discriminator)
            return schema

        # Convert `@field_validator` decorators to `Before/After/Plain/WrapValidator` instances:
        validators_from_decorators = []
        for decorator in filter_field_decorator_info_by_field(decorators.field_validators.values(), name):
            validators_from_decorators.append(_mode_to_validator[decorator.info.mode]._from_decorator(decorator))

        with self.field_name_stack.push(name):
            if field_info.discriminator is not None:
                schema = self._apply_annotations(
                    source_type, annotations + validators_from_decorators, transform_inner_schema=set_discriminator
                )
            else:
                schema = self._apply_annotations(
                    source_type,
                    annotations + validators_from_decorators,
                )

        # This V1 compatibility shim should eventually be removed
        # push down any `each_item=True` validators
        # note that this won't work for any Annotated types that get wrapped by a function validator
        # but that's okay because that didn't exist in V1
        this_field_validators = filter_field_decorator_info_by_field(decorators.validators.values(), name)
        if _validators_require_validate_default(this_field_validators):
            field_info.validate_default = True
        each_item_validators = [v for v in this_field_validators if v.info.each_item is True]
        this_field_validators = [v for v in this_field_validators if v not in each_item_validators]
        schema = apply_each_item_validators(schema, each_item_validators, name)

        schema = apply_validators(schema, this_field_validators, name)

        # the default validator needs to go outside of any other validators
        # so that it is the topmost validator for the field validator
        # which uses it to check if the field has a default value or not
        if not field_info.is_required():
            schema = wrap_default(field_info, schema)

        schema = self._apply_field_serializers(
            schema, filter_field_decorator_info_by_field(decorators.field_serializers.values(), name)
        )
        self._apply_field_title_generator_to_field_info(self._config_wrapper, field_info, name)

        pydantic_js_updates, pydantic_js_extra = _extract_json_schema_info_from_field_info(field_info)
        core_metadata: dict[str, Any] = {}
        update_core_metadata(
            core_metadata, pydantic_js_updates=pydantic_js_updates, pydantic_js_extra=pydantic_js_extra
        )

        alias_generator = self._config_wrapper.alias_generator
        if alias_generator is not None:
            self._apply_alias_generator_to_field_info(alias_generator, field_info, name)

        if isinstance(field_info.validation_alias, (AliasChoices, AliasPath)):
            validation_alias = field_info.validation_alias.convert_to_aliases()
        else:
            validation_alias = field_info.validation_alias

        return _common_field(
            schema,
            serialization_exclude=True if field_info.exclude else None,
            validation_alias=validation_alias,
            serialization_alias=field_info.serialization_alias,
            frozen=field_info.frozen,
            metadata=core_metadata,
        )

    def _union_schema(self, union_type: Any) -> core_schema.CoreSchema:
        """Generate schema for a Union."""
        args = self._get_args_resolving_forward_refs(union_type, required=True)
        choices: list[CoreSchema] = []
        nullable = False
        for arg in args:
            if arg is None or arg is _typing_extra.NoneType:
                nullable = True
            else:
                choices.append(self.generate_schema(arg))

        if len(choices) == 1:
            s = choices[0]
        else:
            choices_with_tags: list[CoreSchema | tuple[CoreSchema, str]] = []
            for choice in choices:
                tag = choice.get('metadata', {}).get(_core_utils.TAGGED_UNION_TAG_KEY)
                if tag is not None:
                    choices_with_tags.append((choice, tag))
                else:
                    choices_with_tags.append(choice)

            s = core_schema.union_schema(choices_with_tags)

        if nullable:
            s = core_schema.nullable_schema(s)
        return s

    def _type_alias_type_schema(self, obj: TypeAliasType) -> CoreSchema:
        with self.defs.get_schema_or_ref(obj) as (ref, maybe_schema):
            if maybe_schema is not None:
                return maybe_schema

            origin: TypeAliasType = get_origin(obj) or obj
            typevars_map = get_standard_typevars_map(obj)

            with self._ns_resolver.push(origin):
                try:
                    annotation = _typing_extra.eval_type(origin.__value__, *self._types_namespace)
                except NameError as e:
                    raise PydanticUndefinedAnnotation.from_name_error(e) from e
                annotation = replace_types(annotation, typevars_map)
                schema = self.generate_schema(annotation)
                assert schema['type'] != 'definitions'
                schema['ref'] = ref  # type: ignore
            self.defs.definitions[ref] = schema
            return core_schema.definition_reference_schema(ref)

    def _literal_schema(self, literal_type: Any) -> CoreSchema:
        """Generate schema for a Literal."""
        expected = _typing_extra.literal_values(literal_type)
        assert expected, f'literal "expected" cannot be empty, obj={literal_type}'
        schema = core_schema.literal_schema(expected)

        if self._config_wrapper.use_enum_values and any(isinstance(v, Enum) for v in expected):
            schema = core_schema.no_info_after_validator_function(
                lambda v: v.value if isinstance(v, Enum) else v, schema
            )
        return schema

    def _typed_dict_schema(self, typed_dict_cls: Any, origin: Any) -> core_schema.CoreSchema:
        """Generate schema for a TypedDict.

        It is not possible to track required/optional keys in TypedDict without __required_keys__
        since TypedDict.__new__ erases the base classes (it replaces them with just `dict`)
        and thus we can track usage of total=True/False
        __required_keys__ was added in Python 3.9
        (https://github.com/miss-islington/cpython/blob/1e9939657dd1f8eb9f596f77c1084d2d351172fc/Doc/library/typing.rst?plain=1#L1546-L1548)
        however it is buggy
        (https://github.com/python/typing_extensions/blob/ac52ac5f2cb0e00e7988bae1e2a1b8257ac88d6d/src/typing_extensions.py#L657-L666).

        On 3.11 but < 3.12 TypedDict does not preserve inheritance information.

        Hence to avoid creating validators that do not do what users expect we only
        support typing.TypedDict on Python >= 3.12 or typing_extension.TypedDict on all versions
        """
        FieldInfo = import_cached_field_info()

        with self.model_type_stack.push(typed_dict_cls), self.defs.get_schema_or_ref(typed_dict_cls) as (
            typed_dict_ref,
            maybe_schema,
        ):
            if maybe_schema is not None:
                return maybe_schema

            typevars_map = get_standard_typevars_map(typed_dict_cls)
            if origin is not None:
                typed_dict_cls = origin

            if not _SUPPORTS_TYPEDDICT and type(typed_dict_cls).__module__ == 'typing':
                raise PydanticUserError(
                    'Please use `typing_extensions.TypedDict` instead of `typing.TypedDict` on Python < 3.12.',
                    code='typed-dict-version',
                )

            try:
                # if a typed dictionary class doesn't have config, we use the parent's config, hence a default of `None`
                # see https://github.com/pydantic/pydantic/issues/10917
                config: ConfigDict | None = get_attribute_from_bases(typed_dict_cls, '__pydantic_config__')
            except AttributeError:
                config = None

            with self._config_wrapper_stack.push(config):
                core_config = self._config_wrapper.core_config(title=typed_dict_cls.__name__)

                required_keys: frozenset[str] = typed_dict_cls.__required_keys__

                fields: dict[str, core_schema.TypedDictField] = {}

                decorators = DecoratorInfos.build(typed_dict_cls)

                if self._config_wrapper.use_attribute_docstrings:
                    field_docstrings = extract_docstrings_from_cls(typed_dict_cls, use_inspect=True)
                else:
                    field_docstrings = None

                try:
                    annotations = _typing_extra.get_cls_type_hints(typed_dict_cls, ns_resolver=self._ns_resolver)
                except NameError as e:
                    raise PydanticUndefinedAnnotation.from_name_error(e) from e

                for field_name, annotation in annotations.items():
                    annotation = replace_types(annotation, typevars_map)
                    required = field_name in required_keys

                    if _typing_extra.is_required(annotation):
                        required = True
                        annotation = self._get_args_resolving_forward_refs(
                            annotation,
                            required=True,
                        )[0]
                    elif _typing_extra.is_not_required(annotation):
                        required = False
                        annotation = self._get_args_resolving_forward_refs(
                            annotation,
                            required=True,
                        )[0]

                    field_info = FieldInfo.from_annotation(annotation)
                    if (
                        field_docstrings is not None
                        and field_info.description is None
                        and field_name in field_docstrings
                    ):
                        field_info.description = field_docstrings[field_name]
                    self._apply_field_title_generator_to_field_info(self._config_wrapper, field_info, field_name)
                    fields[field_name] = self._generate_td_field_schema(
                        field_name, field_info, decorators, required=required
                    )

                td_schema = core_schema.typed_dict_schema(
                    fields,
                    cls=typed_dict_cls,
                    computed_fields=[
                        self._computed_field_schema(d, decorators.field_serializers)
                        for d in decorators.computed_fields.values()
                    ],
                    ref=typed_dict_ref,
                    config=core_config,
                )

                schema = self._apply_model_serializers(td_schema, decorators.model_serializers.values())
                schema = apply_model_validators(schema, decorators.model_validators.values(), 'all')
                self.defs.definitions[typed_dict_ref] = schema
                return core_schema.definition_reference_schema(typed_dict_ref)

    def _namedtuple_schema(self, namedtuple_cls: Any, origin: Any) -> core_schema.CoreSchema:
        """Generate schema for a NamedTuple."""
        with self.model_type_stack.push(namedtuple_cls), self.defs.get_schema_or_ref(namedtuple_cls) as (
            namedtuple_ref,
            maybe_schema,
        ):
            if maybe_schema is not None:
                return maybe_schema
            typevars_map = 
get_standard_typevars_map(namedtuple_cls) if origin is not None: namedtuple_cls = origin try: annotations = _typing_extra.get_cls_type_hints(namedtuple_cls, ns_resolver=self._ns_resolver) except NameError as e: raise PydanticUndefinedAnnotation.from_name_error(e) from e if not annotations: # annotations is empty, happens if namedtuple_cls defined via collections.namedtuple(...) annotations: dict[str, Any] = {k: Any for k in namedtuple_cls._fields} if typevars_map: annotations = { field_name: replace_types(annotation, typevars_map) for field_name, annotation in annotations.items() } arguments_schema = core_schema.arguments_schema( [ self._generate_parameter_schema( field_name, annotation, default=namedtuple_cls._field_defaults.get(field_name, Parameter.empty), ) for field_name, annotation in annotations.items() ], metadata={'pydantic_js_prefer_positional_arguments': True}, ) return core_schema.call_schema(arguments_schema, namedtuple_cls, ref=namedtuple_ref) def _generate_parameter_schema( self, name: str, annotation: type[Any], default: Any = Parameter.empty, mode: Literal['positional_only', 'positional_or_keyword', 'keyword_only'] | None = None, ) -> core_schema.ArgumentsParameter: """Prepare a ArgumentsParameter to represent a field in a namedtuple or function signature.""" FieldInfo = import_cached_field_info() if default is Parameter.empty: field = FieldInfo.from_annotation(annotation) else: field = FieldInfo.from_annotated_attribute(annotation, default) assert field.annotation is not None, 'field.annotation should not be None when generating a schema' with self.field_name_stack.push(name): schema = self._apply_annotations(field.annotation, [field]) if not field.is_required(): schema = wrap_default(field, schema) parameter_schema = core_schema.arguments_parameter(name, schema) if mode is not None: parameter_schema['mode'] = mode if field.alias is not None: parameter_schema['alias'] = field.alias else: alias_generator = self._config_wrapper.alias_generator if 
isinstance(alias_generator, AliasGenerator) and alias_generator.alias is not None: parameter_schema['alias'] = alias_generator.alias(name) elif isinstance(alias_generator, Callable): parameter_schema['alias'] = alias_generator(name) return parameter_schema def _tuple_schema(self, tuple_type: Any) -> core_schema.CoreSchema: """Generate schema for a Tuple, e.g. `tuple[int, str]` or `tuple[int, ...]`.""" # TODO: do we really need to resolve type vars here? typevars_map = get_standard_typevars_map(tuple_type) params = self._get_args_resolving_forward_refs(tuple_type) if typevars_map and params: params = tuple(replace_types(param, typevars_map) for param in params) # NOTE: subtle difference: `tuple[()]` gives `params=()`, whereas `typing.Tuple[()]` gives `params=((),)` # This is only true for <3.11, on Python 3.11+ `typing.Tuple[()]` gives `params=()` if not params: if tuple_type in TUPLE_TYPES: return core_schema.tuple_schema([core_schema.any_schema()], variadic_item_index=0) else: # special case for `tuple[()]` which means `tuple[]` - an empty tuple return core_schema.tuple_schema([]) elif params[-1] is Ellipsis: if len(params) == 2: return core_schema.tuple_schema([self.generate_schema(params[0])], variadic_item_index=0) else: # TODO: something like https://github.com/pydantic/pydantic/issues/5952 raise ValueError('Variable tuples can only have one type') elif len(params) == 1 and params[0] == (): # special case for `Tuple[()]` which means `Tuple[]` - an empty tuple # NOTE: This conditional can be removed when we drop support for Python 3.10. 
            return core_schema.tuple_schema([])
        else:
            return core_schema.tuple_schema([self.generate_schema(param) for param in params])

    def _type_schema(self) -> core_schema.CoreSchema:
        return core_schema.custom_error_schema(
            core_schema.is_instance_schema(type),
            custom_error_type='is_type',
            custom_error_message='Input should be a type',
        )

    def _zoneinfo_schema(self) -> core_schema.CoreSchema:
        """Generate schema for a zone_info.ZoneInfo object"""
        # we're definitely on >= 3.9 if ZoneInfo was included in the input
        if sys.version_info < (3, 9):
            assert False, 'Unreachable'

        # import in this path is safe
        from zoneinfo import ZoneInfo, ZoneInfoNotFoundError

        def validate_str_is_valid_iana_tz(value: Any, /) -> ZoneInfo:
            if isinstance(value, ZoneInfo):
                return value
            try:
                return ZoneInfo(value)
            except (ZoneInfoNotFoundError, ValueError, TypeError):
                raise PydanticCustomError('zoneinfo_str', 'invalid timezone: {value}', {'value': value})

        metadata = {'pydantic_js_functions': [lambda _1, _2: {'type': 'string', 'format': 'zoneinfo'}]}
        return core_schema.no_info_plain_validator_function(
            validate_str_is_valid_iana_tz,
            serialization=core_schema.to_string_ser_schema(),
            metadata=metadata,
        )

    def _union_is_subclass_schema(self, union_type: Any) -> core_schema.CoreSchema:
        """Generate schema for `Type[Union[X, ...]]`."""
        args = self._get_args_resolving_forward_refs(union_type, required=True)
        return core_schema.union_schema([self.generate_schema(typing.Type[args]) for args in args])

    def _subclass_schema(self, type_: Any) -> core_schema.CoreSchema:
        """Generate schema for a Type, e.g. `Type[int]`."""
        type_param = self._get_first_arg_or_any(type_)

        # Assume `type[Annotated[<typ>, ...]]` is equivalent to `type[<typ>]`:
        type_param = _typing_extra.annotated_type(type_param) or type_param

        if _typing_extra.is_any(type_param):
            return self._type_schema()
        elif _typing_extra.is_type_alias_type(type_param):
            return self.generate_schema(typing.Type[type_param.__value__])
        elif isinstance(type_param, typing.TypeVar):
            if type_param.__bound__:
                if _typing_extra.origin_is_union(get_origin(type_param.__bound__)):
                    return self._union_is_subclass_schema(type_param.__bound__)
                return core_schema.is_subclass_schema(type_param.__bound__)
            elif type_param.__constraints__:
                return core_schema.union_schema(
                    [self.generate_schema(typing.Type[c]) for c in type_param.__constraints__]
                )
            else:
                return self._type_schema()
        elif _typing_extra.origin_is_union(get_origin(type_param)):
            return self._union_is_subclass_schema(type_param)
        else:
            if _typing_extra.is_self(type_param):
                type_param = self._resolve_self_type(type_param)
            if not inspect.isclass(type_param):
                raise TypeError(f'Expected a class, got {type_param!r}')
            return core_schema.is_subclass_schema(type_param)

    def _sequence_schema(self, items_type: Any) -> core_schema.CoreSchema:
        """Generate schema for a Sequence, e.g. `Sequence[int]`."""
        from ._serializers import serialize_sequence_via_list

        item_type_schema = self.generate_schema(items_type)
        list_schema = core_schema.list_schema(item_type_schema)

        json_schema = smart_deepcopy(list_schema)
        python_schema = core_schema.is_instance_schema(typing.Sequence, cls_repr='Sequence')
        if not _typing_extra.is_any(items_type):
            from ._validators import sequence_validator

            python_schema = core_schema.chain_schema(
                [python_schema, core_schema.no_info_wrap_validator_function(sequence_validator, list_schema)],
            )

        serialization = core_schema.wrap_serializer_function_ser_schema(
            serialize_sequence_via_list, schema=item_type_schema, info_arg=True
        )
        return core_schema.json_or_python_schema(
            json_schema=json_schema, python_schema=python_schema, serialization=serialization
        )

    def _iterable_schema(self, type_: Any) -> core_schema.GeneratorSchema:
        """Generate a schema for an `Iterable`."""
        item_type = self._get_first_arg_or_any(type_)

        return core_schema.generator_schema(self.generate_schema(item_type))

    def _pattern_schema(self, pattern_type: Any) -> core_schema.CoreSchema:
        from . import _validators

        metadata = {'pydantic_js_functions': [lambda _1, _2: {'type': 'string', 'format': 'regex'}]}
        ser = core_schema.plain_serializer_function_ser_schema(
            attrgetter('pattern'), when_used='json', return_schema=core_schema.str_schema()
        )
        if pattern_type is typing.Pattern or pattern_type is re.Pattern:
            # bare type
            return core_schema.no_info_plain_validator_function(
                _validators.pattern_either_validator, serialization=ser, metadata=metadata
            )

        param = self._get_args_resolving_forward_refs(
            pattern_type,
            required=True,
        )[0]
        if param is str:
            return core_schema.no_info_plain_validator_function(
                _validators.pattern_str_validator, serialization=ser, metadata=metadata
            )
        elif param is bytes:
            return core_schema.no_info_plain_validator_function(
                _validators.pattern_bytes_validator, serialization=ser, metadata=metadata
            )
        else:
            raise PydanticSchemaGenerationError(f'Unable to generate pydantic-core schema for {pattern_type!r}.')

    def _hashable_schema(self) -> core_schema.CoreSchema:
        return core_schema.custom_error_schema(
            schema=core_schema.json_or_python_schema(
                json_schema=core_schema.chain_schema(
                    [core_schema.any_schema(), core_schema.is_instance_schema(collections.abc.Hashable)]
                ),
                python_schema=core_schema.is_instance_schema(collections.abc.Hashable),
            ),
            custom_error_type='is_hashable',
            custom_error_message='Input should be hashable',
        )

    def _dataclass_schema(
        self, dataclass: type[StandardDataclass], origin: type[StandardDataclass] | None
    ) -> core_schema.CoreSchema:
        """Generate schema for a dataclass."""
        with self.model_type_stack.push(dataclass), self.defs.get_schema_or_ref(dataclass) as (
            dataclass_ref,
            maybe_schema,
        ):
            if maybe_schema is not None:
                return maybe_schema

            typevars_map = get_standard_typevars_map(dataclass)
            if origin is not None:
                dataclass = origin

            # if (plain) dataclass doesn't have config, we use the parent's config, hence a default of `None`
            # (Pydantic dataclasses have an empty dict config by default).
            # see https://github.com/pydantic/pydantic/issues/10917
            config = getattr(dataclass, '__pydantic_config__', None)

            from ..dataclasses import is_pydantic_dataclass

            with self._ns_resolver.push(dataclass), self._config_wrapper_stack.push(config):
                if is_pydantic_dataclass(dataclass):
                    fields = deepcopy(dataclass.__pydantic_fields__)
                    if typevars_map:
                        for field in fields.values():
                            field.apply_typevars_map(typevars_map, *self._types_namespace)
                else:
                    fields = collect_dataclass_fields(
                        dataclass,
                        typevars_map=typevars_map,
                    )

                if self._config_wrapper.extra == 'allow':
                    # disallow combination of init=False on a dataclass field and extra='allow' on a dataclass
                    for field_name, field in fields.items():
                        if field.init is False:
                            raise PydanticUserError(
                                f'Field {field_name} has `init=False` and dataclass has config setting `extra="allow"`. '
                                f'This combination is not allowed.',
                                code='dataclass-init-false-extra-allow',
                            )

                decorators = dataclass.__dict__.get('__pydantic_decorators__') or DecoratorInfos.build(dataclass)
                # Move kw_only=False args to the start of the list, as this is how vanilla dataclasses work.
                # Note that when kw_only is missing or None, it is treated as equivalent to kw_only=True
                args = sorted(
                    (self._generate_dc_field_schema(k, v, decorators) for k, v in fields.items()),
                    key=lambda a: a.get('kw_only') is not False,
                )

                has_post_init = hasattr(dataclass, '__post_init__')
                has_slots = hasattr(dataclass, '__slots__')

                args_schema = core_schema.dataclass_args_schema(
                    dataclass.__name__,
                    args,
                    computed_fields=[
                        self._computed_field_schema(d, decorators.field_serializers)
                        for d in decorators.computed_fields.values()
                    ],
                    collect_init_only=has_post_init,
                )

                inner_schema = apply_validators(args_schema, decorators.root_validators.values(), None)

                model_validators = decorators.model_validators.values()
                inner_schema = apply_model_validators(inner_schema, model_validators, 'inner')

                core_config = self._config_wrapper.core_config(title=dataclass.__name__)

                dc_schema = core_schema.dataclass_schema(
                    dataclass,
                    inner_schema,
                    generic_origin=origin,
                    post_init=has_post_init,
                    ref=dataclass_ref,
                    fields=[field.name for field in dataclasses.fields(dataclass)],
                    slots=has_slots,
                    config=core_config,
                    # we don't use a custom __setattr__ for dataclasses, so we must
                    # pass along the frozen config setting to the pydantic-core schema
                    frozen=self._config_wrapper_stack.tail.frozen,
                )
                schema = self._apply_model_serializers(dc_schema, decorators.model_serializers.values())
                schema = apply_model_validators(schema, model_validators, 'outer')
                self.defs.definitions[dataclass_ref] = schema
                return core_schema.definition_reference_schema(dataclass_ref)

    def _call_schema(self, function: ValidateCallSupportedTypes) -> core_schema.CallSchema:
        """Generate schema for a Callable.

        TODO support functional validators once we support them in Config
        """
        sig = signature(function)
        globalns, localns = self._types_namespace
        type_hints = _typing_extra.get_function_type_hints(function, globalns=globalns, localns=localns)

        mode_lookup: dict[_ParameterKind, Literal['positional_only', 'positional_or_keyword', 'keyword_only']] = {
            Parameter.POSITIONAL_ONLY: 'positional_only',
            Parameter.POSITIONAL_OR_KEYWORD: 'positional_or_keyword',
            Parameter.KEYWORD_ONLY: 'keyword_only',
        }

        arguments_list: list[core_schema.ArgumentsParameter] = []
        var_args_schema: core_schema.CoreSchema | None = None
        var_kwargs_schema: core_schema.CoreSchema | None = None
        var_kwargs_mode: core_schema.VarKwargsMode | None = None

        for name, p in sig.parameters.items():
            if p.annotation is sig.empty:
                annotation = typing.cast(Any, Any)
            else:
                annotation = type_hints[name]

            parameter_mode = mode_lookup.get(p.kind)
            if parameter_mode is not None:
                arg_schema = self._generate_parameter_schema(name, annotation, p.default, parameter_mode)
                arguments_list.append(arg_schema)
            elif p.kind == Parameter.VAR_POSITIONAL:
                var_args_schema = self.generate_schema(annotation)
            else:
                assert p.kind == Parameter.VAR_KEYWORD, p.kind

                unpack_type = _typing_extra.unpack_type(annotation)
                if unpack_type is not None:
                    if not is_typeddict(unpack_type):
                        raise PydanticUserError(
                            f'Expected a `TypedDict` class, got {unpack_type.__name__!r}', code='unpack-typed-dict'
                        )
                    non_pos_only_param_names = {
                        name for name, p in sig.parameters.items() if p.kind != Parameter.POSITIONAL_ONLY
                    }
                    overlapping_params = non_pos_only_param_names.intersection(unpack_type.__annotations__)
                    if overlapping_params:
                        raise PydanticUserError(
                            f'Typed dictionary {unpack_type.__name__!r} overlaps with parameter'
                            f"{'s' if len(overlapping_params) >= 2 else ''} "
                            f"{', '.join(repr(p) for p in sorted(overlapping_params))}",
                            code='overlapping-unpack-typed-dict',
                        )

                    var_kwargs_mode = 'unpacked-typed-dict'
                    var_kwargs_schema = self._typed_dict_schema(unpack_type, None)
                else:
                    var_kwargs_mode = 'uniform'
                    var_kwargs_schema = self.generate_schema(annotation)

        return_schema: core_schema.CoreSchema | None = None
        config_wrapper = self._config_wrapper
        if config_wrapper.validate_return:
            return_hint = sig.return_annotation
            if return_hint is not sig.empty:
                return_schema = self.generate_schema(return_hint)

        return core_schema.call_schema(
            core_schema.arguments_schema(
                arguments_list,
                var_args_schema=var_args_schema,
                var_kwargs_mode=var_kwargs_mode,
                var_kwargs_schema=var_kwargs_schema,
                populate_by_name=config_wrapper.populate_by_name,
            ),
            function,
            return_schema=return_schema,
        )

    def _unsubstituted_typevar_schema(self, typevar: typing.TypeVar) -> core_schema.CoreSchema:
        assert isinstance(typevar, typing.TypeVar)

        bound = typevar.__bound__
        constraints = typevar.__constraints__

        try:
            typevar_has_default = typevar.has_default()  # type: ignore
        except AttributeError:
            # could still have a default if it's an old version of typing_extensions.TypeVar
            typevar_has_default = getattr(typevar, '__default__', None) is not None

        if (bound is not None) + (len(constraints) != 0) + typevar_has_default > 1:
            raise NotImplementedError(
                'Pydantic does not support mixing more than one of TypeVar bounds, constraints and defaults'
            )

        if typevar_has_default:
            return self.generate_schema(typevar.__default__)  # type: ignore
        elif constraints:
            return self._union_schema(typing.Union[constraints])  # type: ignore
        elif bound:
            schema = self.generate_schema(bound)
            schema['serialization'] = core_schema.wrap_serializer_function_ser_schema(
                lambda x, h: h(x), schema=core_schema.any_schema()
            )
            return schema
        else:
            return core_schema.any_schema()

    def _computed_field_schema(
        self,
        d: Decorator[ComputedFieldInfo],
        field_serializers: dict[str, Decorator[FieldSerializerDecoratorInfo]],
    ) -> core_schema.ComputedField:
        try:
            # Do not pass in globals as the function could be defined in a different module.
            # Instead, let `get_function_return_type` infer the globals to use, but still pass
            # in locals that may contain a parent/rebuild namespace:
            return_type = _decorators.get_function_return_type(
                d.func, d.info.return_type, localns=self._types_namespace.locals
            )
        except NameError as e:
            raise PydanticUndefinedAnnotation.from_name_error(e) from e
        if return_type is PydanticUndefined:
            raise PydanticUserError(
                'Computed field is missing return type annotation or specifying `return_type`'
                ' to the `@computed_field` decorator (e.g. `@computed_field(return_type=int|str)`)',
                code='model-field-missing-annotation',
            )

        return_type = replace_types(return_type, self._typevars_map)
        # Create a new ComputedFieldInfo so that different type parametrizations of the same
        # generic model's computed field can have different return types.
        d.info = dataclasses.replace(d.info, return_type=return_type)
        return_type_schema = self.generate_schema(return_type)
        # Apply serializers to computed field if any exist
        return_type_schema = self._apply_field_serializers(
            return_type_schema,
            filter_field_decorator_info_by_field(field_serializers.values(), d.cls_var_name),
        )

        alias_generator = self._config_wrapper.alias_generator
        if alias_generator is not None:
            self._apply_alias_generator_to_computed_field_info(
                alias_generator=alias_generator, computed_field_info=d.info, computed_field_name=d.cls_var_name
            )
        self._apply_field_title_generator_to_field_info(self._config_wrapper, d.info, d.cls_var_name)

        pydantic_js_updates, pydantic_js_extra = _extract_json_schema_info_from_field_info(d.info)
        core_metadata: dict[str, Any] = {}
        update_core_metadata(
            core_metadata,
            pydantic_js_updates={'readOnly': True, **(pydantic_js_updates if pydantic_js_updates else {})},
            pydantic_js_extra=pydantic_js_extra,
        )
        return core_schema.computed_field(
            d.cls_var_name, return_schema=return_type_schema, alias=d.info.alias, metadata=core_metadata
        )

    def _annotated_schema(self, annotated_type: Any) -> core_schema.CoreSchema:
        """Generate schema for an Annotated type, e.g. `Annotated[int, Field(...)]` or `Annotated[int, Gt(0)]`."""
        FieldInfo = import_cached_field_info()
        source_type, *annotations = self._get_args_resolving_forward_refs(
            annotated_type,
            required=True,
        )
        schema = self._apply_annotations(source_type, annotations)
        # put the default validator last so that TypeAdapter.get_default_value() works
        # even if there are function validators involved
        for annotation in annotations:
            if isinstance(annotation, FieldInfo):
                schema = wrap_default(annotation, schema)
        return schema

    def _get_prepare_pydantic_annotations_for_known_type(
        self, obj: Any, annotations: tuple[Any, ...]
    ) -> tuple[Any, list[Any]] | None:
        from ._std_types_schema import (
            deque_schema_prepare_pydantic_annotations,
            mapping_like_prepare_pydantic_annotations,
            path_schema_prepare_pydantic_annotations,
        )

        # Check for hashability
        try:
            hash(obj)
        except TypeError:
            # obj is definitely not a known type if this fails
            return None

        # TODO: I'd rather we didn't handle the generic nature in the annotations prep, but the same way we do other
        # generic types like list[str] via _match_generic_type, but I'm not sure if we can do that because this is
        # not always called from match_type, but sometimes from _apply_annotations
        obj_origin = get_origin(obj) or obj

        if obj_origin in PATH_TYPES:
            return path_schema_prepare_pydantic_annotations(obj, annotations)
        elif obj_origin in DEQUE_TYPES:
            return deque_schema_prepare_pydantic_annotations(obj, annotations)
        elif obj_origin in MAPPING_TYPES:
            return mapping_like_prepare_pydantic_annotations(obj, annotations)
        else:
            return None

    def _apply_annotations(
        self,
        source_type: Any,
        annotations: list[Any],
        transform_inner_schema: Callable[[CoreSchema], CoreSchema] = lambda x: x,
    ) -> CoreSchema:
        """Apply arguments from `Annotated` or from `FieldInfo` to a schema.

        This gets called by `GenerateSchema._annotated_schema` but differs from it in that it does not expect
        `source_type` to be an `Annotated` object, it expects it to be the first argument of that
        (in other words, `GenerateSchema._annotated_schema` just unpacks `Annotated`, this processes it).
        """
        annotations = list(_known_annotated_metadata.expand_grouped_metadata(annotations))

        res = self._get_prepare_pydantic_annotations_for_known_type(source_type, tuple(annotations))
        if res is not None:
            source_type, annotations = res

        pydantic_js_annotation_functions: list[GetJsonSchemaFunction] = []

        def inner_handler(obj: Any) -> CoreSchema:
            from_property = self._generate_schema_from_property(obj, source_type)
            if from_property is None:
                schema = self._generate_schema_inner(obj)
            else:
                schema = from_property
            metadata_js_function = _extract_get_pydantic_json_schema(obj, schema)
            if metadata_js_function is not None:
                metadata_schema = resolve_original_schema(schema, self.defs.definitions)
                if metadata_schema is not None:
                    self._add_js_function(metadata_schema, metadata_js_function)
            return transform_inner_schema(schema)

        get_inner_schema = CallbackGetCoreSchemaHandler(inner_handler, self)

        for annotation in annotations:
            if annotation is None:
                continue
            get_inner_schema = self._get_wrapped_inner_schema(
                get_inner_schema, annotation, pydantic_js_annotation_functions
            )

        schema = get_inner_schema(source_type)
        if pydantic_js_annotation_functions:
            core_metadata = schema.setdefault('metadata', {})
            update_core_metadata(core_metadata, pydantic_js_annotation_functions=pydantic_js_annotation_functions)
        return _add_custom_serialization_from_json_encoders(self._config_wrapper.json_encoders, source_type, schema)

    def _apply_single_annotation(self, schema: core_schema.CoreSchema, metadata: Any) -> core_schema.CoreSchema:
        FieldInfo = import_cached_field_info()

        if isinstance(metadata, FieldInfo):
            for field_metadata in metadata.metadata:
                schema = self._apply_single_annotation(schema, field_metadata)

            if metadata.discriminator is not None:
                schema = self._apply_discriminator_to_union(schema, metadata.discriminator)
            return schema

        if schema['type'] == 'nullable':
            # for nullable schemas, metadata is automatically applied to the inner schema
            inner = schema.get('schema', core_schema.any_schema())
            inner = self._apply_single_annotation(inner, metadata)
            if inner:
                schema['schema'] = inner
            return schema

        original_schema = schema
        ref = schema.get('ref', None)
        if ref is not None:
            schema = schema.copy()
            new_ref = ref + f'_{repr(metadata)}'
            if new_ref in self.defs.definitions:
                return self.defs.definitions[new_ref]
            schema['ref'] = new_ref  # type: ignore
        elif schema['type'] == 'definition-ref':
            ref = schema['schema_ref']
            if ref in self.defs.definitions:
                schema = self.defs.definitions[ref].copy()
                new_ref = ref + f'_{repr(metadata)}'
                if new_ref in self.defs.definitions:
                    return self.defs.definitions[new_ref]
                schema['ref'] = new_ref  # type: ignore

        maybe_updated_schema = _known_annotated_metadata.apply_known_metadata(metadata, schema.copy())

        if maybe_updated_schema is not None:
            return maybe_updated_schema
        return original_schema

    def _apply_single_annotation_json_schema(
        self, schema: core_schema.CoreSchema, metadata: Any
    ) -> core_schema.CoreSchema:
        FieldInfo = import_cached_field_info()

        if isinstance(metadata, FieldInfo):
            for field_metadata in metadata.metadata:
                schema = self._apply_single_annotation_json_schema(schema, field_metadata)

            pydantic_js_updates, pydantic_js_extra = _extract_json_schema_info_from_field_info(metadata)
            core_metadata = schema.setdefault('metadata', {})
            update_core_metadata(
                core_metadata, pydantic_js_updates=pydantic_js_updates, pydantic_js_extra=pydantic_js_extra
            )
        return schema

    def _get_wrapped_inner_schema(
        self,
        get_inner_schema: GetCoreSchemaHandler,
        annotation: Any,
        pydantic_js_annotation_functions: list[GetJsonSchemaFunction],
    ) -> CallbackGetCoreSchemaHandler:
        metadata_get_schema: GetCoreSchemaFunction = getattr(annotation, '__get_pydantic_core_schema__', None) or (
            lambda source, handler: handler(source)
        )

        def new_handler(source: Any) -> core_schema.CoreSchema:
            schema = metadata_get_schema(source, get_inner_schema)
            schema = self._apply_single_annotation(schema, annotation)
            schema = self._apply_single_annotation_json_schema(schema, annotation)

            metadata_js_function = _extract_get_pydantic_json_schema(annotation, schema)
            if metadata_js_function is not None:
                pydantic_js_annotation_functions.append(metadata_js_function)
            return schema

        return CallbackGetCoreSchemaHandler(new_handler, self)

    def _apply_field_serializers(
        self,
        schema: core_schema.CoreSchema,
        serializers: list[Decorator[FieldSerializerDecoratorInfo]],
    ) -> core_schema.CoreSchema:
        """Apply field serializers to a schema."""
        if serializers:
            schema = copy(schema)
            if schema['type'] == 'definitions':
                inner_schema = schema['schema']
                schema['schema'] = self._apply_field_serializers(inner_schema, serializers)
                return schema
            else:
                ref = typing.cast('str|None', schema.get('ref', None))
                if ref is not None:
                    self.defs.definitions[ref] = schema
                    schema = core_schema.definition_reference_schema(ref)

            # use the last serializer to make it easy to override a serializer set on a parent model
            serializer = serializers[-1]
            is_field_serializer, info_arg = inspect_field_serializer(serializer.func, serializer.info.mode)

            try:
                # Do not pass in globals as the function could be defined in a different module.
                # Instead, let `get_function_return_type` infer the globals to use, but still pass
                # in locals that may contain a parent/rebuild namespace:
                return_type = _decorators.get_function_return_type(
                    serializer.func, serializer.info.return_type, localns=self._types_namespace.locals
                )
            except NameError as e:
                raise PydanticUndefinedAnnotation.from_name_error(e) from e

            if return_type is PydanticUndefined:
                return_schema = None
            else:
                return_schema = self.generate_schema(return_type)

            if serializer.info.mode == 'wrap':
                schema['serialization'] = core_schema.wrap_serializer_function_ser_schema(
                    serializer.func,
                    is_field_serializer=is_field_serializer,
                    info_arg=info_arg,
                    return_schema=return_schema,
                    when_used=serializer.info.when_used,
                )
            else:
                assert serializer.info.mode == 'plain'
                schema['serialization'] = core_schema.plain_serializer_function_ser_schema(
                    serializer.func,
                    is_field_serializer=is_field_serializer,
                    info_arg=info_arg,
                    return_schema=return_schema,
                    when_used=serializer.info.when_used,
                )
        return schema

    def _apply_model_serializers(
        self, schema: core_schema.CoreSchema, serializers: Iterable[Decorator[ModelSerializerDecoratorInfo]]
    ) -> core_schema.CoreSchema:
        """Apply model serializers to a schema."""
        ref: str | None = schema.pop('ref', None)  # type: ignore
        if serializers:
            serializer = list(serializers)[-1]
            info_arg = inspect_model_serializer(serializer.func, serializer.info.mode)

            try:
                # Do not pass in globals as the function could be defined in a different module.
                # Instead, let `get_function_return_type` infer the globals to use, but still pass
                # in locals that may contain a parent/rebuild namespace:
                return_type = _decorators.get_function_return_type(
                    serializer.func, serializer.info.return_type, localns=self._types_namespace.locals
                )
            except NameError as e:
                raise PydanticUndefinedAnnotation.from_name_error(e) from e
            if return_type is PydanticUndefined:
                return_schema = None
            else:
                return_schema = self.generate_schema(return_type)

            if serializer.info.mode == 'wrap':
                ser_schema: core_schema.SerSchema = core_schema.wrap_serializer_function_ser_schema(
                    serializer.func,
                    info_arg=info_arg,
                    return_schema=return_schema,
                    when_used=serializer.info.when_used,
                )
            else:  # plain
                ser_schema = core_schema.plain_serializer_function_ser_schema(
                    serializer.func,
                    info_arg=info_arg,
                    return_schema=return_schema,
                    when_used=serializer.info.when_used,
                )
            schema['serialization'] = ser_schema
        if ref:
            schema['ref'] = ref  # type: ignore
        return schema


_VALIDATOR_F_MATCH: Mapping[
    tuple[FieldValidatorModes, Literal['no-info', 'with-info']],
    Callable[[Callable[..., Any], core_schema.CoreSchema, str | None], core_schema.CoreSchema],
] = {
    ('before', 'no-info'): lambda f, schema, _: core_schema.no_info_before_validator_function(f, schema),
    ('after', 'no-info'): lambda f, schema, _: core_schema.no_info_after_validator_function(f, schema),
    ('plain', 'no-info'): lambda f, _1, _2: core_schema.no_info_plain_validator_function(f),
    ('wrap', 'no-info'): lambda f, schema, _: core_schema.no_info_wrap_validator_function(f, schema),
    ('before', 'with-info'): lambda f, schema, field_name: core_schema.with_info_before_validator_function(
        f, schema, field_name=field_name
    ),
    ('after', 'with-info'): lambda f, schema, field_name: core_schema.with_info_after_validator_function(
        f, schema, field_name=field_name
    ),
    ('plain', 'with-info'): lambda f, _, field_name: core_schema.with_info_plain_validator_function(
        f, field_name=field_name
    ),
    ('wrap', 'with-info'): lambda f, schema, field_name: core_schema.with_info_wrap_validator_function(
        f, schema, field_name=field_name
    ),
}


# TODO V3: this function is only used for deprecated decorators. It should
# be removed once we drop support for those.
def apply_validators(
    schema: core_schema.CoreSchema,
    validators: Iterable[Decorator[RootValidatorDecoratorInfo]]
    | Iterable[Decorator[ValidatorDecoratorInfo]]
    | Iterable[Decorator[FieldValidatorDecoratorInfo]],
    field_name: str | None,
) -> core_schema.CoreSchema:
    """Apply validators to a schema.

    Args:
        schema: The schema to apply validators on.
        validators: An iterable of validators.
        field_name: The name of the field if validators are being applied to a model field.

    Returns:
        The updated schema.
    """
    for validator in validators:
        info_arg = inspect_validator(validator.func, validator.info.mode)
        val_type = 'with-info' if info_arg else 'no-info'
        schema = _VALIDATOR_F_MATCH[(validator.info.mode, val_type)](validator.func, schema, field_name)
    return schema


def _validators_require_validate_default(validators: Iterable[Decorator[ValidatorDecoratorInfo]]) -> bool:
    """In v1, if any of the validators for a field had `always=True`, the default value would be validated.

    This serves as an auxiliary function for re-implementing that logic, by looping over a provided
    collection of (v1-style) ValidatorDecoratorInfo's and checking if any of them have `always=True`.

    We should be able to drop this function and the associated logic calling it once we drop support
    for v1-style validator decorators. (Or we can extend it and keep it if we add something equivalent
    to the v1-validator `always` kwarg to `field_validator`.)
    """
    for validator in validators:
        if validator.info.always:
            return True
    return False


def apply_model_validators(
    schema: core_schema.CoreSchema,
    validators: Iterable[Decorator[ModelValidatorDecoratorInfo]],
    mode: Literal['inner', 'outer', 'all'],
) -> core_schema.CoreSchema:
    """Apply model validators to a schema.
    If mode == 'inner', only "before" validators are applied
    If mode == 'outer', validators other than "before" are applied
    If mode == 'all', all validators are applied

    Args:
        schema: The schema to apply validators on.
        validators: An iterable of validators.
        mode: The validator mode.

    Returns:
        The updated schema.
    """
    ref: str | None = schema.pop('ref', None)  # type: ignore
    for validator in validators:
        if mode == 'inner' and validator.info.mode != 'before':
            continue
        if mode == 'outer' and validator.info.mode == 'before':
            continue
        info_arg = inspect_validator(validator.func, validator.info.mode)
        if validator.info.mode == 'wrap':
            if info_arg:
                schema = core_schema.with_info_wrap_validator_function(function=validator.func, schema=schema)
            else:
                schema = core_schema.no_info_wrap_validator_function(function=validator.func, schema=schema)
        elif validator.info.mode == 'before':
            if info_arg:
                schema = core_schema.with_info_before_validator_function(function=validator.func, schema=schema)
            else:
                schema = core_schema.no_info_before_validator_function(function=validator.func, schema=schema)
        else:
            assert validator.info.mode == 'after'
            if info_arg:
                schema = core_schema.with_info_after_validator_function(function=validator.func, schema=schema)
            else:
                schema = core_schema.no_info_after_validator_function(function=validator.func, schema=schema)
    if ref:
        schema['ref'] = ref  # type: ignore
    return schema


def wrap_default(field_info: FieldInfo, schema: core_schema.CoreSchema) -> core_schema.CoreSchema:
    """Wrap schema with default schema if default value or `default_factory` are available.

    Args:
        field_info: The field info object.
        schema: The schema to apply default on.

    Returns:
        Updated schema by default value or `default_factory`.
""" if field_info.default_factory: return core_schema.with_default_schema( schema, default_factory=field_info.default_factory, default_factory_takes_data=takes_validated_data_argument(field_info.default_factory), validate_default=field_info.validate_default, ) elif field_info.default is not PydanticUndefined: return core_schema.with_default_schema( schema, default=field_info.default, validate_default=field_info.validate_default ) else: return schema def _extract_get_pydantic_json_schema(tp: Any, schema: CoreSchema) -> GetJsonSchemaFunction | None: """Extract `__get_pydantic_json_schema__` from a type, handling the deprecated `__modify_schema__`.""" js_modify_function = getattr(tp, '__get_pydantic_json_schema__', None) if hasattr(tp, '__modify_schema__'): BaseModel = import_cached_base_model() has_custom_v2_modify_js_func = ( js_modify_function is not None and BaseModel.__get_pydantic_json_schema__.__func__ # type: ignore not in (js_modify_function, getattr(js_modify_function, '__func__', None)) ) if not has_custom_v2_modify_js_func: cls_name = getattr(tp, '__name__', None) raise PydanticUserError( f'The `__modify_schema__` method is not supported in Pydantic v2. 
' f'Use `__get_pydantic_json_schema__` instead{f" in class `{cls_name}`" if cls_name else ""}.', code='custom-json-schema', ) # handle GenericAlias' but ignore Annotated which "lies" about its origin (in this case it would be `int`) if hasattr(tp, '__origin__') and not _typing_extra.is_annotated(tp): return _extract_get_pydantic_json_schema(tp.__origin__, schema) if js_modify_function is None: return None return js_modify_function class _CommonField(TypedDict): schema: core_schema.CoreSchema validation_alias: str | list[str | int] | list[list[str | int]] | None serialization_alias: str | None serialization_exclude: bool | None frozen: bool | None metadata: dict[str, Any] def _common_field( schema: core_schema.CoreSchema, *, validation_alias: str | list[str | int] | list[list[str | int]] | None = None, serialization_alias: str | None = None, serialization_exclude: bool | None = None, frozen: bool | None = None, metadata: Any = None, ) -> _CommonField: return { 'schema': schema, 'validation_alias': validation_alias, 'serialization_alias': serialization_alias, 'serialization_exclude': serialization_exclude, 'frozen': frozen, 'metadata': metadata, } class _Definitions: """Keeps track of references and definitions.""" def __init__(self) -> None: self.seen: set[str] = set() self.definitions: dict[str, core_schema.CoreSchema] = {} @contextmanager def get_schema_or_ref(self, tp: Any) -> Iterator[tuple[str, None] | tuple[str, CoreSchema]]: """Get a definition for `tp` if one exists. If a definition exists, a tuple of `(ref_string, CoreSchema)` is returned. If no definition exists yet, a tuple of `(ref_string, None)` is returned. Note that the returned `CoreSchema` will always be a `DefinitionReferenceSchema`, not the actual definition itself. This should be called for any type that can be identified by reference. This includes any recursive types. 
        At present the following types can be named/recursive:

        - BaseModel
        - Dataclasses
        - TypedDict
        - TypeAliasType
        """
        ref = get_type_ref(tp)
        # return the reference if we're either (1) in a cycle or (2) it was already defined
        if ref in self.seen or ref in self.definitions:
            yield (ref, core_schema.definition_reference_schema(ref))
        else:
            self.seen.add(ref)
            try:
                yield (ref, None)
            finally:
                self.seen.discard(ref)


def resolve_original_schema(schema: CoreSchema, definitions: dict[str, CoreSchema]) -> CoreSchema | None:
    if schema['type'] == 'definition-ref':
        return definitions.get(schema['schema_ref'], None)
    elif schema['type'] == 'definitions':
        return schema['schema']
    else:
        return schema


class _FieldNameStack:
    __slots__ = ('_stack',)

    def __init__(self) -> None:
        self._stack: list[str] = []

    @contextmanager
    def push(self, field_name: str) -> Iterator[None]:
        self._stack.append(field_name)
        yield
        self._stack.pop()

    def get(self) -> str | None:
        if self._stack:
            return self._stack[-1]
        else:
            return None


class _ModelTypeStack:
    __slots__ = ('_stack',)

    def __init__(self) -> None:
        self._stack: list[type] = []

    @contextmanager
    def push(self, type_obj: type) -> Iterator[None]:
        self._stack.append(type_obj)
        yield
        self._stack.pop()

    def get(self) -> type | None:
        if self._stack:
            return self._stack[-1]
        else:
            return None


# ========== pydantic-2.10.6/pydantic/_internal/_generics.py ==========

from __future__ import annotations

import sys
import types
import typing
from collections import ChainMap
from contextlib import contextmanager
from contextvars import ContextVar
from types import prepare_class
from typing import TYPE_CHECKING, Any, Iterator, Mapping, MutableMapping, Tuple, TypeVar
from weakref import WeakValueDictionary

import typing_extensions

from . import _typing_extra
from ._core_utils import get_type_ref
from ._forward_ref import PydanticRecursiveRef
from ._utils import all_identical, is_model_class

if sys.version_info >= (3, 10):
    from typing import _UnionGenericAlias  # type: ignore[attr-defined]

if TYPE_CHECKING:
    from ..main import BaseModel

GenericTypesCacheKey = Tuple[Any, Any, Tuple[Any, ...]]

# Note: We want to remove LimitedDict, but to do this, we'd need to improve the handling of generics caching.
# Right now, to handle recursive generics, some types must remain cached for brief periods without references.
# By chaining the WeakValuesDict with a LimitedDict, we have a way to retain caching for all types with references,
# while also retaining a limited number of types even without references. This is generally enough to build
# specific recursive generic models without losing required items out of the cache.

KT = TypeVar('KT')
VT = TypeVar('VT')
_LIMITED_DICT_SIZE = 100

if TYPE_CHECKING:

    class LimitedDict(dict, MutableMapping[KT, VT]):
        def __init__(self, size_limit: int = _LIMITED_DICT_SIZE): ...

else:

    class LimitedDict(dict):
        """Limit the size/length of a dict used for caching to avoid unlimited increase in memory usage.

        Since the dict is ordered, and we always remove elements from the beginning, this is effectively a FIFO cache.
        """

        def __init__(self, size_limit: int = _LIMITED_DICT_SIZE):
            self.size_limit = size_limit
            super().__init__()

        def __setitem__(self, key: Any, value: Any, /) -> None:
            super().__setitem__(key, value)
            if len(self) > self.size_limit:
                excess = len(self) - self.size_limit + self.size_limit // 10
                to_remove = list(self.keys())[:excess]
                for k in to_remove:
                    del self[k]


# weak dictionaries allow the dynamically created parametrized versions of generic models to get collected
# once they are no longer referenced by the caller.
if sys.version_info >= (3, 9):
    # Typing for weak dictionaries available at 3.9
    GenericTypesCache = WeakValueDictionary[GenericTypesCacheKey, 'type[BaseModel]']
else:
    GenericTypesCache = WeakValueDictionary

if TYPE_CHECKING:

    class DeepChainMap(ChainMap[KT, VT]):  # type: ignore
        ...

else:

    class DeepChainMap(ChainMap):
        """Variant of ChainMap that allows direct updates to inner scopes.

        Taken from https://docs.python.org/3/library/collections.html#collections.ChainMap,
        with some light modifications for this use case.
        """

        def clear(self) -> None:
            for mapping in self.maps:
                mapping.clear()

        def __setitem__(self, key: KT, value: VT) -> None:
            for mapping in self.maps:
                mapping[key] = value

        def __delitem__(self, key: KT) -> None:
            hit = False
            for mapping in self.maps:
                if key in mapping:
                    del mapping[key]
                    hit = True
            if not hit:
                raise KeyError(key)


# Despite the fact that LimitedDict _seems_ no longer necessary, I'm very nervous to actually remove it
# and discover later on that we need to re-add all this infrastructure...
# _GENERIC_TYPES_CACHE = DeepChainMap(GenericTypesCache(), LimitedDict())

_GENERIC_TYPES_CACHE = GenericTypesCache()


class PydanticGenericMetadata(typing_extensions.TypedDict):
    origin: type[BaseModel] | None  # analogous to typing._GenericAlias.__origin__
    args: tuple[Any, ...]  # analogous to typing._GenericAlias.__args__
    parameters: tuple[TypeVar, ...]  # analogous to typing.Generic.__parameters__


def create_generic_submodel(
    model_name: str, origin: type[BaseModel], args: tuple[Any, ...], params: tuple[Any, ...]
) -> type[BaseModel]:
    """Dynamically create a submodel of a provided (generic) BaseModel.

    This is used when producing concrete parametrizations of generic models.

    This function only *creates* the new subclass; the schema/validators/serialization must be updated to
    reflect a concrete parametrization elsewhere.

    Args:
        model_name: The name of the newly created model.
        origin: The base class for the new model to inherit from.
        args: A tuple of generic metadata arguments.
        params: A tuple of generic metadata parameters.

    Returns:
        The created submodel.
    """
    namespace: dict[str, Any] = {'__module__': origin.__module__}
    bases = (origin,)
    meta, ns, kwds = prepare_class(model_name, bases)
    namespace.update(ns)
    created_model = meta(
        model_name,
        bases,
        namespace,
        __pydantic_generic_metadata__={
            'origin': origin,
            'args': args,
            'parameters': params,
        },
        __pydantic_reset_parent_namespace__=False,
        **kwds,
    )

    model_module, called_globally = _get_caller_frame_info(depth=3)
    if called_globally:  # create global reference and therefore allow pickling
        object_by_reference = None
        reference_name = model_name
        reference_module_globals = sys.modules[created_model.__module__].__dict__
        while object_by_reference is not created_model:
            object_by_reference = reference_module_globals.setdefault(reference_name, created_model)
            reference_name += '_'

    return created_model


def _get_caller_frame_info(depth: int = 2) -> tuple[str | None, bool]:
    """Used inside a function to check whether it was called globally.

    Args:
        depth: The depth to get the frame.

    Returns:
        A tuple containing `module_name` and `called_globally`.

    Raises:
        RuntimeError: If the function is not called inside a function.
    """
    try:
        previous_caller_frame = sys._getframe(depth)
    except ValueError as e:
        raise RuntimeError('This function must be used inside another function') from e
    except AttributeError:  # sys module does not have _getframe function, so there's nothing we can do about it
        return None, False
    frame_globals = previous_caller_frame.f_globals
    return frame_globals.get('__name__'), previous_caller_frame.f_locals is frame_globals


DictValues: type[Any] = {}.values().__class__


def iter_contained_typevars(v: Any) -> Iterator[TypeVar]:
    """Recursively iterate through all subtypes and type args of `v` and yield any typevars that are found.
    This is inspired as an alternative to directly accessing the `__parameters__` attribute of a GenericAlias,
    since __parameters__ of (nested) generic BaseModel subclasses won't show up in that list.
    """
    if isinstance(v, TypeVar):
        yield v
    elif is_model_class(v):
        yield from v.__pydantic_generic_metadata__['parameters']
    elif isinstance(v, (DictValues, list)):
        for var in v:
            yield from iter_contained_typevars(var)
    else:
        args = get_args(v)
        for arg in args:
            yield from iter_contained_typevars(arg)


def get_args(v: Any) -> Any:
    pydantic_generic_metadata: PydanticGenericMetadata | None = getattr(v, '__pydantic_generic_metadata__', None)
    if pydantic_generic_metadata:
        return pydantic_generic_metadata.get('args')
    return typing_extensions.get_args(v)


def get_origin(v: Any) -> Any:
    pydantic_generic_metadata: PydanticGenericMetadata | None = getattr(v, '__pydantic_generic_metadata__', None)
    if pydantic_generic_metadata:
        return pydantic_generic_metadata.get('origin')
    return typing_extensions.get_origin(v)


def get_standard_typevars_map(cls: Any) -> dict[TypeVar, Any] | None:
    """Package a generic type's typevars and parametrization (if present) into a dictionary compatible
    with the `replace_types` function.

    Specifically, this works with standard typing generics and typing._GenericAlias.
    """
    origin = get_origin(cls)
    if origin is None:
        return None
    if not hasattr(origin, '__parameters__'):
        return None

    # In this case, we know that cls is a _GenericAlias, and origin is the generic type
    # So it is safe to access cls.__args__ and origin.__parameters__
    args: tuple[Any, ...] = cls.__args__  # type: ignore
    parameters: tuple[TypeVar, ...] = origin.__parameters__
    return dict(zip(parameters, args))


def get_model_typevars_map(cls: type[BaseModel]) -> dict[TypeVar, Any] | None:
    """Package a generic BaseModel's typevars and concrete parametrization (if present) into a dictionary
    compatible with the `replace_types` function.
    Since BaseModel.__class_getitem__ does not produce a typing._GenericAlias, and the BaseModel generic
    info is stored in the __pydantic_generic_metadata__ attribute, we need special handling here.
    """
    # TODO: This could be unified with `get_standard_typevars_map` if we stored the generic metadata
    #   in the __origin__, __args__, and __parameters__ attributes of the model.
    generic_metadata = cls.__pydantic_generic_metadata__
    origin = generic_metadata['origin']
    args = generic_metadata['args']
    return dict(zip(iter_contained_typevars(origin), args))


def replace_types(type_: Any, type_map: Mapping[Any, Any] | None) -> Any:
    """Return type with all occurrences of `type_map` keys recursively replaced with their values.

    Args:
        type_: The class or generic alias.
        type_map: Mapping from `TypeVar` instance to concrete types.

    Returns:
        A new type representing the basic structure of `type_` with all
        `typevar_map` keys recursively replaced.

    Example:
        ```python
        from typing import List, Tuple, Union

        from pydantic._internal._generics import replace_types

        replace_types(Tuple[str, Union[List[str], float]], {str: int})
        #> Tuple[int, Union[List[int], float]]
        ```
    """
    if not type_map:
        return type_

    type_args = get_args(type_)

    if _typing_extra.is_annotated(type_):
        annotated_type, *annotations = type_args
        annotated = replace_types(annotated_type, type_map)
        for annotation in annotations:
            annotated = typing_extensions.Annotated[annotated, annotation]
        return annotated

    origin_type = get_origin(type_)

    # Having type args is a good indicator that this is a typing special form
    # instance or a generic alias of some sort.
    if type_args:
        resolved_type_args = tuple(replace_types(arg, type_map) for arg in type_args)
        if all_identical(type_args, resolved_type_args):
            # If all arguments are the same, there is no need to modify the
            # type or create a new object at all
            return type_

        if (
            origin_type is not None
            and isinstance(type_, _typing_extra.typing_base)
            and not isinstance(origin_type, _typing_extra.typing_base)
            and getattr(type_, '_name', None) is not None
        ):
            # In python < 3.9 generic aliases don't exist so any of these like `list`,
            # `type` or `collections.abc.Callable` need to be translated.
            # See: https://www.python.org/dev/peps/pep-0585
            origin_type = getattr(typing, type_._name)
        assert origin_type is not None

        if _typing_extra.origin_is_union(origin_type):
            if any(_typing_extra.is_any(arg) for arg in resolved_type_args):
                # `Any | T` ~ `Any`:
                resolved_type_args = (Any,)
            # `Never | T` ~ `T`:
            resolved_type_args = tuple(
                arg
                for arg in resolved_type_args
                if not (_typing_extra.is_no_return(arg) or _typing_extra.is_never(arg))
            )

        # PEP-604 syntax (Ex.: list | str) is represented with a types.UnionType object that does not have __getitem__.
        # We also cannot use isinstance() since we have to compare types.
        if sys.version_info >= (3, 10) and origin_type is types.UnionType:
            return _UnionGenericAlias(origin_type, resolved_type_args)
        # NotRequired[T] and Required[T] don't support tuple type resolved_type_args, hence the condition below
        return origin_type[resolved_type_args[0] if len(resolved_type_args) == 1 else resolved_type_args]

    # We handle pydantic generic models separately as they don't have the same
    # semantics as "typing" classes or generic aliases
    if not origin_type and is_model_class(type_):
        parameters = type_.__pydantic_generic_metadata__['parameters']
        if not parameters:
            return type_
        resolved_type_args = tuple(replace_types(t, type_map) for t in parameters)
        if all_identical(parameters, resolved_type_args):
            return type_
        return type_[resolved_type_args]

    # Handle special case for typehints that can have lists as arguments.
    # `typing.Callable[[int, str], int]` is an example for this.
    if isinstance(type_, list):
        resolved_list = [replace_types(element, type_map) for element in type_]
        if all_identical(type_, resolved_list):
            return type_
        return resolved_list

    # If all else fails, we try to resolve the type directly and otherwise just
    # return the input with no modifications.
    return type_map.get(type_, type_)


def has_instance_in_type(type_: Any, isinstance_target: Any) -> bool:
    """Checks if the type, or any of its arbitrary nested args, satisfy
    `isinstance(<type>, isinstance_target)`.
    """
    if isinstance(type_, isinstance_target):
        return True
    if _typing_extra.is_annotated(type_):
        return has_instance_in_type(type_.__origin__, isinstance_target)
    if _typing_extra.is_literal(type_):
        return False

    type_args = get_args(type_)

    # Having type args is a good indicator that this is a typing module
    # class instantiation or a generic alias of some sort.
    for arg in type_args:
        if has_instance_in_type(arg, isinstance_target):
            return True

    # Handle special case for typehints that can have lists as arguments.
    # `typing.Callable[[int, str], int]` is an example for this.
    if (
        isinstance(type_, list)
        # On Python < 3.10, typing_extensions implements `ParamSpec` as a subclass of `list`:
        and not isinstance(type_, typing_extensions.ParamSpec)
    ):
        for element in type_:
            if has_instance_in_type(element, isinstance_target):
                return True

    return False


def check_parameters_count(cls: type[BaseModel], parameters: tuple[Any, ...]) -> None:
    """Check the generic model parameters count is equal.

    Args:
        cls: The generic model.
        parameters: A tuple of passed parameters to the generic model.

    Raises:
        TypeError: If the passed parameters count is not equal to generic model parameters count.
    """
    actual = len(parameters)
    expected = len(cls.__pydantic_generic_metadata__['parameters'])
    if actual != expected:
        description = 'many' if actual > expected else 'few'
        raise TypeError(f'Too {description} parameters for {cls}; actual {actual}, expected {expected}')


_generic_recursion_cache: ContextVar[set[str] | None] = ContextVar('_generic_recursion_cache', default=None)


@contextmanager
def generic_recursion_self_type(
    origin: type[BaseModel], args: tuple[Any, ...]
) -> Iterator[PydanticRecursiveRef | None]:
    """This contextmanager should be placed around the recursive calls used to build a generic type,
    and accept as arguments the generic origin type and the type arguments being passed to it.

    If the same origin and arguments are observed twice, it implies that a self-reference placeholder
    can be used while building the core schema, and will produce a schema_ref that will be valid in the
    final parent schema.
""" previously_seen_type_refs = _generic_recursion_cache.get() if previously_seen_type_refs is None: previously_seen_type_refs = set() token = _generic_recursion_cache.set(previously_seen_type_refs) else: token = None try: type_ref = get_type_ref(origin, args_override=args) if type_ref in previously_seen_type_refs: self_type = PydanticRecursiveRef(type_ref=type_ref) yield self_type else: previously_seen_type_refs.add(type_ref) yield previously_seen_type_refs.remove(type_ref) finally: if token: _generic_recursion_cache.reset(token) def recursively_defined_type_refs() -> set[str]: visited = _generic_recursion_cache.get() if not visited: return set() # not in a generic recursion, so there are no types return visited.copy() # don't allow modifications def get_cached_generic_type_early(parent: type[BaseModel], typevar_values: Any) -> type[BaseModel] | None: """The use of a two-stage cache lookup approach was necessary to have the highest performance possible for repeated calls to `__class_getitem__` on generic types (which may happen in tighter loops during runtime), while still ensuring that certain alternative parametrizations ultimately resolve to the same type. As a concrete example, this approach was necessary to make Model[List[T]][int] equal to Model[List[int]]. The approach could be modified to not use two different cache keys at different points, but the _early_cache_key is optimized to be as quick to compute as possible (for repeated-access speed), and the _late_cache_key is optimized to be as "correct" as possible, so that two types that will ultimately be the same after resolving the type arguments will always produce cache hits. If we wanted to move to only using a single cache key per type, we would either need to always use the slower/more computationally intensive logic associated with _late_cache_key, or would need to accept that Model[List[T]][int] is a different type than Model[List[T]][int]. 
    Because we rely on subclass relationships during validation, I think it is worthwhile to ensure that types
    that are functionally equivalent are actually equal.
    """
    return _GENERIC_TYPES_CACHE.get(_early_cache_key(parent, typevar_values))


def get_cached_generic_type_late(
    parent: type[BaseModel], typevar_values: Any, origin: type[BaseModel], args: tuple[Any, ...]
) -> type[BaseModel] | None:
    """See the docstring of `get_cached_generic_type_early` for more information about the two-stage cache lookup."""
    cached = _GENERIC_TYPES_CACHE.get(_late_cache_key(origin, args, typevar_values))
    if cached is not None:
        set_cached_generic_type(parent, typevar_values, cached, origin, args)
    return cached


def set_cached_generic_type(
    parent: type[BaseModel],
    typevar_values: tuple[Any, ...],
    type_: type[BaseModel],
    origin: type[BaseModel] | None = None,
    args: tuple[Any, ...] | None = None,
) -> None:
    """See the docstring of `get_cached_generic_type_early` for more information about why items are cached with
    two different keys.
    """
    _GENERIC_TYPES_CACHE[_early_cache_key(parent, typevar_values)] = type_
    if len(typevar_values) == 1:
        _GENERIC_TYPES_CACHE[_early_cache_key(parent, typevar_values[0])] = type_
    if origin and args:
        _GENERIC_TYPES_CACHE[_late_cache_key(origin, args, typevar_values)] = type_


def _union_orderings_key(typevar_values: Any) -> Any:
    """This is intended to help differentiate between Union types with the same arguments in different order.

    Thanks to caching internal to the `typing` module, it is not possible to distinguish between
    List[Union[int, float]] and List[Union[float, int]] (and similarly for other "parent" origins besides List)
    because `typing` considers Union[int, float] to be equal to Union[float, int].

    However, you _can_ distinguish between (top-level) Union[int, float] vs. Union[float, int].
    Because we parse items as the first Union type that is successful, we get slightly more consistent behavior
    if we make an effort to distinguish the ordering of items in a union. It would be best if we could _always_
    get the exact-correct order of items in the union, but that would require a change to the `typing` module
    itself. (See https://github.com/python/cpython/issues/86483 for reference.)
    """
    if isinstance(typevar_values, tuple):
        args_data = []
        for value in typevar_values:
            args_data.append(_union_orderings_key(value))
        return tuple(args_data)
    elif _typing_extra.is_union(typevar_values):
        return get_args(typevar_values)
    else:
        return ()


def _early_cache_key(cls: type[BaseModel], typevar_values: Any) -> GenericTypesCacheKey:
    """This is intended for minimal computational overhead during lookups of cached types.

    Note that this is overly simplistic, and it's possible that two different cls/typevar_values
    inputs would ultimately result in the same type being created in BaseModel.__class_getitem__.
    To handle this, we have a fallback _late_cache_key that is checked later if the _early_cache_key
    lookup fails, and should result in a cache hit _precisely_ when the inputs to __class_getitem__
    would result in the same type.
    """
    return cls, typevar_values, _union_orderings_key(typevar_values)


def _late_cache_key(origin: type[BaseModel], args: tuple[Any, ...], typevar_values: Any) -> GenericTypesCacheKey:
    """This is intended for use later in the process of creating a new type, when we have more information
    about the exact args that will be passed.

    If it turns out that a different set of inputs to __class_getitem__ resulted in the same inputs to the
    generic type creation process, we can still return the cached type, and update the cache with the
    _early_cache_key as well.
""" # The _union_orderings_key is placed at the start here to ensure there cannot be a collision with an # _early_cache_key, as that function will always produce a BaseModel subclass as the first item in the key, # whereas this function will always produce a tuple as the first item in the key. return _union_orderings_key(typevar_values), origin, args pydantic-2.10.6/pydantic/_internal/_git.py000066400000000000000000000014201474456633400205120ustar00rootroot00000000000000"""Git utilities, adopted from mypy's git utilities (https://github.com/python/mypy/blob/master/mypy/git.py).""" from __future__ import annotations import os import subprocess def is_git_repo(dir: str) -> bool: """Is the given directory version-controlled with git?""" return os.path.exists(os.path.join(dir, '.git')) def have_git() -> bool: """Can we run the git executable?""" try: subprocess.check_output(['git', '--help']) return True except subprocess.CalledProcessError: return False except OSError: return False def git_revision(dir: str) -> str: """Get the SHA-1 of the HEAD of a git repository.""" return subprocess.check_output(['git', 'rev-parse', '--short', 'HEAD'], cwd=dir).decode('utf-8').strip() pydantic-2.10.6/pydantic/_internal/_import_utils.py000066400000000000000000000007001474456633400224610ustar00rootroot00000000000000from functools import lru_cache from typing import TYPE_CHECKING, Type if TYPE_CHECKING: from pydantic import BaseModel from pydantic.fields import FieldInfo @lru_cache(maxsize=None) def import_cached_base_model() -> Type['BaseModel']: from pydantic import BaseModel return BaseModel @lru_cache(maxsize=None) def import_cached_field_info() -> Type['FieldInfo']: from pydantic.fields import FieldInfo return FieldInfo pydantic-2.10.6/pydantic/_internal/_internal_dataclass.py000066400000000000000000000002201474456633400235570ustar00rootroot00000000000000import sys # `slots` is available on Python >= 3.10 if sys.version_info >= (3, 10): slots_true = {'slots': True} else: 
    slots_true = {}


# ========== pydantic-2.10.6/pydantic/_internal/_known_annotated_metadata.py ==========

from __future__ import annotations

from collections import defaultdict
from copy import copy
from functools import lru_cache, partial
from typing import TYPE_CHECKING, Any, Iterable

from pydantic_core import CoreSchema, PydanticCustomError, ValidationError, to_jsonable_python
from pydantic_core import core_schema as cs

from ._fields import PydanticMetadata
from ._import_utils import import_cached_field_info

if TYPE_CHECKING:
    pass

STRICT = {'strict'}
FAIL_FAST = {'fail_fast'}
LENGTH_CONSTRAINTS = {'min_length', 'max_length'}
INEQUALITY = {'le', 'ge', 'lt', 'gt'}
NUMERIC_CONSTRAINTS = {'multiple_of', *INEQUALITY}
ALLOW_INF_NAN = {'allow_inf_nan'}

STR_CONSTRAINTS = {
    *LENGTH_CONSTRAINTS,
    *STRICT,
    'strip_whitespace',
    'to_lower',
    'to_upper',
    'pattern',
    'coerce_numbers_to_str',
}
BYTES_CONSTRAINTS = {*LENGTH_CONSTRAINTS, *STRICT}

LIST_CONSTRAINTS = {*LENGTH_CONSTRAINTS, *STRICT, *FAIL_FAST}
TUPLE_CONSTRAINTS = {*LENGTH_CONSTRAINTS, *STRICT, *FAIL_FAST}
SET_CONSTRAINTS = {*LENGTH_CONSTRAINTS, *STRICT, *FAIL_FAST}
DICT_CONSTRAINTS = {*LENGTH_CONSTRAINTS, *STRICT}
GENERATOR_CONSTRAINTS = {*LENGTH_CONSTRAINTS, *STRICT}
SEQUENCE_CONSTRAINTS = {*LENGTH_CONSTRAINTS, *FAIL_FAST}

FLOAT_CONSTRAINTS = {*NUMERIC_CONSTRAINTS, *ALLOW_INF_NAN, *STRICT}
DECIMAL_CONSTRAINTS = {'max_digits', 'decimal_places', *FLOAT_CONSTRAINTS}
INT_CONSTRAINTS = {*NUMERIC_CONSTRAINTS, *ALLOW_INF_NAN, *STRICT}
BOOL_CONSTRAINTS = STRICT
UUID_CONSTRAINTS = STRICT

DATE_TIME_CONSTRAINTS = {*NUMERIC_CONSTRAINTS, *STRICT}
TIMEDELTA_CONSTRAINTS = {*NUMERIC_CONSTRAINTS, *STRICT}
TIME_CONSTRAINTS = {*NUMERIC_CONSTRAINTS, *STRICT}
LAX_OR_STRICT_CONSTRAINTS = STRICT
ENUM_CONSTRAINTS = STRICT
COMPLEX_CONSTRAINTS = STRICT

UNION_CONSTRAINTS = {'union_mode'}
URL_CONSTRAINTS = {
    'max_length',
    'allowed_schemes',
    'host_required',
    'default_host',
    'default_port',
    'default_path',
}

TEXT_SCHEMA_TYPES = ('str', 'bytes', 'url', 'multi-host-url')
SEQUENCE_SCHEMA_TYPES = ('list', 'tuple', 'set', 'frozenset', 'generator', *TEXT_SCHEMA_TYPES)
NUMERIC_SCHEMA_TYPES = ('float', 'int', 'date', 'time', 'timedelta', 'datetime')

CONSTRAINTS_TO_ALLOWED_SCHEMAS: dict[str, set[str]] = defaultdict(set)

constraint_schema_pairings: list[tuple[set[str], tuple[str, ...]]] = [
    (STR_CONSTRAINTS, TEXT_SCHEMA_TYPES),
    (BYTES_CONSTRAINTS, ('bytes',)),
    (LIST_CONSTRAINTS, ('list',)),
    (TUPLE_CONSTRAINTS, ('tuple',)),
    (SET_CONSTRAINTS, ('set', 'frozenset')),
    (DICT_CONSTRAINTS, ('dict',)),
    (GENERATOR_CONSTRAINTS, ('generator',)),
    (FLOAT_CONSTRAINTS, ('float',)),
    (INT_CONSTRAINTS, ('int',)),
    (DATE_TIME_CONSTRAINTS, ('date', 'time', 'datetime', 'timedelta')),
    # TODO: this is a bit redundant, we could probably avoid some of these
    (STRICT, (*TEXT_SCHEMA_TYPES, *SEQUENCE_SCHEMA_TYPES, *NUMERIC_SCHEMA_TYPES, 'typed-dict', 'model')),
    (UNION_CONSTRAINTS, ('union',)),
    (URL_CONSTRAINTS, ('url', 'multi-host-url')),
    (BOOL_CONSTRAINTS, ('bool',)),
    (UUID_CONSTRAINTS, ('uuid',)),
    (LAX_OR_STRICT_CONSTRAINTS, ('lax-or-strict',)),
    (ENUM_CONSTRAINTS, ('enum',)),
    (DECIMAL_CONSTRAINTS, ('decimal',)),
    (COMPLEX_CONSTRAINTS, ('complex',)),
]

for constraints, schemas in constraint_schema_pairings:
    for c in constraints:
        CONSTRAINTS_TO_ALLOWED_SCHEMAS[c].update(schemas)


def as_jsonable_value(v: Any) -> Any:
    if type(v) not in (int, str, float, bytes, bool, type(None)):
        return to_jsonable_python(v)
    return v


def expand_grouped_metadata(annotations: Iterable[Any]) -> Iterable[Any]:
    """Expand the annotations.

    Args:
        annotations: An iterable of annotations.

    Returns:
        An iterable of expanded annotations.
Example: ```python from annotated_types import Ge, Len from pydantic._internal._known_annotated_metadata import expand_grouped_metadata print(list(expand_grouped_metadata([Ge(4), Len(5)]))) #> [Ge(ge=4), MinLen(min_length=5)] ``` """ import annotated_types as at FieldInfo = import_cached_field_info() for annotation in annotations: if isinstance(annotation, at.GroupedMetadata): yield from annotation elif isinstance(annotation, FieldInfo): yield from annotation.metadata # this is a bit problematic in that it results in duplicate metadata # all of our "consumers" can handle it, but it is not ideal # we probably should split up FieldInfo into: # - annotated types metadata # - individual metadata known only to Pydantic annotation = copy(annotation) annotation.metadata = [] yield annotation else: yield annotation @lru_cache def _get_at_to_constraint_map() -> dict[type, str]: """Return a mapping of annotated types to constraints. Normally, we would define a mapping like this in the module scope, but we can't do that because we don't permit module level imports of `annotated_types`, in an attempt to speed up the import time of `pydantic`. We still only want to have this dictionary defined in one place, so we use this function to cache the result. """ import annotated_types as at return { at.Gt: 'gt', at.Ge: 'ge', at.Lt: 'lt', at.Le: 'le', at.MultipleOf: 'multiple_of', at.MinLen: 'min_length', at.MaxLen: 'max_length', } def apply_known_metadata(annotation: Any, schema: CoreSchema) -> CoreSchema | None: # noqa: C901 """Apply `annotation` to `schema` if it is an annotation we know about (Gt, Le, etc.). Otherwise return `None`. This does not handle all known annotations. If / when it does, it can always return a CoreSchema and return the unmodified schema if the annotation should be ignored. Assumes that GroupedMetadata has already been expanded via `expand_grouped_metadata`. Args: annotation: The annotation. schema: The schema. 
Returns: An updated schema with annotation if it is an annotation we know about, `None` otherwise. Raises: PydanticCustomError: If `Predicate` fails. """ import annotated_types as at from ._validators import NUMERIC_VALIDATOR_LOOKUP, forbid_inf_nan_check schema = schema.copy() schema_update, other_metadata = collect_known_metadata([annotation]) schema_type = schema['type'] chain_schema_constraints: set[str] = { 'pattern', 'strip_whitespace', 'to_lower', 'to_upper', 'coerce_numbers_to_str', } chain_schema_steps: list[CoreSchema] = [] for constraint, value in schema_update.items(): if constraint not in CONSTRAINTS_TO_ALLOWED_SCHEMAS: raise ValueError(f'Unknown constraint {constraint}') allowed_schemas = CONSTRAINTS_TO_ALLOWED_SCHEMAS[constraint] # if it becomes necessary to handle more than one constraint # in this recursive case with function-after or function-wrap, we should refactor # this is a bit challenging because we sometimes want to apply constraints to the inner schema, # whereas other times we want to wrap the existing schema with a new one that enforces a new constraint. 
if schema_type in {'function-before', 'function-wrap', 'function-after'} and constraint == 'strict': schema['schema'] = apply_known_metadata(annotation, schema['schema']) # type: ignore # schema is function schema return schema # if we're allowed to apply constraint directly to the schema, like le to int, do that if schema_type in allowed_schemas: if constraint == 'union_mode' and schema_type == 'union': schema['mode'] = value # type: ignore # schema is UnionSchema else: schema[constraint] = value continue # else, apply a function after validator to the schema to enforce the corresponding constraint if constraint in chain_schema_constraints: def _apply_constraint_with_incompatibility_info( value: Any, handler: cs.ValidatorFunctionWrapHandler ) -> Any: try: x = handler(value) except ValidationError as ve: # if the error is about the type, it's likely that the constraint is incompatible the type of the field # for example, the following invalid schema wouldn't be caught during schema build, but rather at this point # with a cryptic 'string_type' error coming from the string validator, # that we'd rather express as a constraint incompatibility error (TypeError) # Annotated[list[int], Field(pattern='abc')] if 'type' in ve.errors()[0]['type']: raise TypeError( f"Unable to apply constraint '{constraint}' to supplied value {value} for schema of type '{schema_type}'" # noqa: B023 ) raise ve return x chain_schema_steps.append( cs.no_info_wrap_validator_function( _apply_constraint_with_incompatibility_info, cs.str_schema(**{constraint: value}) ) ) elif constraint in NUMERIC_VALIDATOR_LOOKUP: if constraint in LENGTH_CONSTRAINTS: inner_schema = schema while inner_schema['type'] in {'function-before', 'function-wrap', 'function-after'}: inner_schema = inner_schema['schema'] # type: ignore inner_schema_type = inner_schema['type'] if inner_schema_type == 'list' or ( inner_schema_type == 'json-or-python' and inner_schema['json_schema']['type'] == 'list' # type: ignore ): 
js_constraint_key = 'minItems' if constraint == 'min_length' else 'maxItems' else: js_constraint_key = 'minLength' if constraint == 'min_length' else 'maxLength' else: js_constraint_key = constraint schema = cs.no_info_after_validator_function( partial(NUMERIC_VALIDATOR_LOOKUP[constraint], **{constraint: value}), schema ) metadata = schema.get('metadata', {}) if (existing_json_schema_updates := metadata.get('pydantic_js_updates')) is not None: metadata['pydantic_js_updates'] = { **existing_json_schema_updates, **{js_constraint_key: as_jsonable_value(value)}, } else: metadata['pydantic_js_updates'] = {js_constraint_key: as_jsonable_value(value)} schema['metadata'] = metadata elif constraint == 'allow_inf_nan' and value is False: schema = cs.no_info_after_validator_function( forbid_inf_nan_check, schema, ) else: # It's rare that we'd get here, but it's possible if we add a new constraint and forget to handle it # Most constraint errors are caught at runtime during attempted application raise RuntimeError(f"Unable to apply constraint '{constraint}' to schema of type '{schema_type}'") for annotation in other_metadata: if (annotation_type := type(annotation)) in (at_to_constraint_map := _get_at_to_constraint_map()): constraint = at_to_constraint_map[annotation_type] validator = NUMERIC_VALIDATOR_LOOKUP.get(constraint) if validator is None: raise ValueError(f'Unknown constraint {constraint}') schema = cs.no_info_after_validator_function( partial(validator, {constraint: getattr(annotation, constraint)}), schema ) continue elif isinstance(annotation, (at.Predicate, at.Not)): predicate_name = f'{annotation.func.__qualname__}' if hasattr(annotation.func, '__qualname__') else '' def val_func(v: Any) -> Any: predicate_satisfied = annotation.func(v) # noqa: B023 # annotation.func may also raise an exception, let it pass through if isinstance(annotation, at.Predicate): # noqa: B023 if not predicate_satisfied: raise PydanticCustomError( 'predicate_failed', f'Predicate 
{predicate_name} failed', # type: ignore # noqa: B023 ) else: if predicate_satisfied: raise PydanticCustomError( 'not_operation_failed', f'Not of {predicate_name} failed', # type: ignore # noqa: B023 ) return v schema = cs.no_info_after_validator_function(val_func, schema) else: # ignore any other unknown metadata return None if chain_schema_steps: chain_schema_steps = [schema] + chain_schema_steps return cs.chain_schema(chain_schema_steps) return schema def collect_known_metadata(annotations: Iterable[Any]) -> tuple[dict[str, Any], list[Any]]: """Split `annotations` into known metadata and unknown annotations. Args: annotations: An iterable of annotations. Returns: A tuple contains a dict of known metadata and a list of unknown annotations. Example: ```python from annotated_types import Gt, Len from pydantic._internal._known_annotated_metadata import collect_known_metadata print(collect_known_metadata([Gt(1), Len(42), ...])) #> ({'gt': 1, 'min_length': 42}, [Ellipsis]) ``` """ annotations = expand_grouped_metadata(annotations) res: dict[str, Any] = {} remaining: list[Any] = [] for annotation in annotations: # isinstance(annotation, PydanticMetadata) also covers ._fields:_PydanticGeneralMetadata if isinstance(annotation, PydanticMetadata): res.update(annotation.__dict__) # we don't use dataclasses.asdict because that recursively calls asdict on the field values elif (annotation_type := type(annotation)) in (at_to_constraint_map := _get_at_to_constraint_map()): constraint = at_to_constraint_map[annotation_type] res[constraint] = getattr(annotation, constraint) elif isinstance(annotation, type) and issubclass(annotation, PydanticMetadata): # also support PydanticMetadata classes being used without initialisation, # e.g. 
            # `Annotated[int, Strict]` as well as `Annotated[int, Strict()]`
            res.update({k: v for k, v in vars(annotation).items() if not k.startswith('_')})
        else:
            remaining.append(annotation)
    # Nones can sneak in but pydantic-core will reject them
    # it'd be nice to clean things up so we don't put in None (we probably don't _need_ to, it was just easier)
    # but this is simple enough to kick that can down the road
    res = {k: v for k, v in res.items() if v is not None}
    return res, remaining


def check_metadata(metadata: dict[str, Any], allowed: Iterable[str], source_type: Any) -> None:
    """A small utility function to validate that the given metadata can be applied to the target.

    More than saving lines of code, this gives us a consistent error message for all of our internal implementations.

    Args:
        metadata: A dict of metadata.
        allowed: An iterable of allowed metadata.
        source_type: The source type.

    Raises:
        TypeError: If there is metadata that can't be applied to the source type.
    """
    unknown = metadata.keys() - set(allowed)
    if unknown:
        raise TypeError(
            f'The following constraints cannot be applied to {source_type!r}: {", ".join([f"{k!r}" for k in unknown])}'
        )

# pydantic-2.10.6/pydantic/_internal/_mock_val_ser.py
from __future__ import annotations

from typing import TYPE_CHECKING, Any, Callable, Generic, Iterator, Mapping, TypeVar, Union

from pydantic_core import CoreSchema, SchemaSerializer, SchemaValidator
from typing_extensions import Literal

from ..errors import PydanticErrorCodes, PydanticUserError
from ..plugin._schema_validator import PluggableSchemaValidator

if TYPE_CHECKING:
    from ..dataclasses import PydanticDataclass
    from ..main import BaseModel
    from ..type_adapter import TypeAdapter

ValSer = TypeVar('ValSer', bound=Union[SchemaValidator, PluggableSchemaValidator, SchemaSerializer])
T = TypeVar('T')


class MockCoreSchema(Mapping[str, Any]):
    """Mocker for `pydantic_core.CoreSchema` which optionally attempts
to rebuild the thing it's mocking when one of its methods is accessed and raises an error if that fails. """ __slots__ = '_error_message', '_code', '_attempt_rebuild', '_built_memo' def __init__( self, error_message: str, *, code: PydanticErrorCodes, attempt_rebuild: Callable[[], CoreSchema | None] | None = None, ) -> None: self._error_message = error_message self._code: PydanticErrorCodes = code self._attempt_rebuild = attempt_rebuild self._built_memo: CoreSchema | None = None def __getitem__(self, key: str) -> Any: return self._get_built().__getitem__(key) def __len__(self) -> int: return self._get_built().__len__() def __iter__(self) -> Iterator[str]: return self._get_built().__iter__() def _get_built(self) -> CoreSchema: if self._built_memo is not None: return self._built_memo if self._attempt_rebuild: schema = self._attempt_rebuild() if schema is not None: self._built_memo = schema return schema raise PydanticUserError(self._error_message, code=self._code) def rebuild(self) -> CoreSchema | None: self._built_memo = None if self._attempt_rebuild: schema = self._attempt_rebuild() if schema is not None: return schema else: raise PydanticUserError(self._error_message, code=self._code) return None class MockValSer(Generic[ValSer]): """Mocker for `pydantic_core.SchemaValidator` or `pydantic_core.SchemaSerializer` which optionally attempts to rebuild the thing it's mocking when one of its methods is accessed and raises an error if that fails. 
""" __slots__ = '_error_message', '_code', '_val_or_ser', '_attempt_rebuild' def __init__( self, error_message: str, *, code: PydanticErrorCodes, val_or_ser: Literal['validator', 'serializer'], attempt_rebuild: Callable[[], ValSer | None] | None = None, ) -> None: self._error_message = error_message self._val_or_ser = SchemaValidator if val_or_ser == 'validator' else SchemaSerializer self._code: PydanticErrorCodes = code self._attempt_rebuild = attempt_rebuild def __getattr__(self, item: str) -> None: __tracebackhide__ = True if self._attempt_rebuild: val_ser = self._attempt_rebuild() if val_ser is not None: return getattr(val_ser, item) # raise an AttributeError if `item` doesn't exist getattr(self._val_or_ser, item) raise PydanticUserError(self._error_message, code=self._code) def rebuild(self) -> ValSer | None: if self._attempt_rebuild: val_ser = self._attempt_rebuild() if val_ser is not None: return val_ser else: raise PydanticUserError(self._error_message, code=self._code) return None def set_type_adapter_mocks(adapter: TypeAdapter, type_repr: str) -> None: """Set `core_schema`, `validator` and `serializer` to mock core types on a type adapter instance. Args: adapter: The type adapter instance to set the mocks on type_repr: Name of the type used in the adapter, used in error messages """ undefined_type_error_message = ( f'`TypeAdapter[{type_repr}]` is not fully defined; you should define `{type_repr}` and all referenced types,' f' then call `.rebuild()` on the instance.' 
) def attempt_rebuild_fn(attr_fn: Callable[[TypeAdapter], T]) -> Callable[[], T | None]: def handler() -> T | None: if adapter.rebuild(raise_errors=False, _parent_namespace_depth=5) is not False: return attr_fn(adapter) else: return None return handler adapter.core_schema = MockCoreSchema( # pyright: ignore[reportAttributeAccessIssue] undefined_type_error_message, code='class-not-fully-defined', attempt_rebuild=attempt_rebuild_fn(lambda ta: ta.core_schema), ) adapter.validator = MockValSer( # pyright: ignore[reportAttributeAccessIssue] undefined_type_error_message, code='class-not-fully-defined', val_or_ser='validator', attempt_rebuild=attempt_rebuild_fn(lambda ta: ta.validator), ) adapter.serializer = MockValSer( # pyright: ignore[reportAttributeAccessIssue] undefined_type_error_message, code='class-not-fully-defined', val_or_ser='serializer', attempt_rebuild=attempt_rebuild_fn(lambda ta: ta.serializer), ) def set_model_mocks(cls: type[BaseModel], cls_name: str, undefined_name: str = 'all referenced types') -> None: """Set `__pydantic_core_schema__`, `__pydantic_validator__` and `__pydantic_serializer__` to mock core types on a model. Args: cls: The model class to set the mocks on cls_name: Name of the model class, used in error messages undefined_name: Name of the undefined thing, used in error messages """ undefined_type_error_message = ( f'`{cls_name}` is not fully defined; you should define {undefined_name},' f' then call `{cls_name}.model_rebuild()`.' 
    )

    def attempt_rebuild_fn(attr_fn: Callable[[type[BaseModel]], T]) -> Callable[[], T | None]:
        def handler() -> T | None:
            if cls.model_rebuild(raise_errors=False, _parent_namespace_depth=5) is not False:
                return attr_fn(cls)
            else:
                return None

        return handler

    cls.__pydantic_core_schema__ = MockCoreSchema(  # pyright: ignore[reportAttributeAccessIssue]
        undefined_type_error_message,
        code='class-not-fully-defined',
        attempt_rebuild=attempt_rebuild_fn(lambda c: c.__pydantic_core_schema__),
    )
    cls.__pydantic_validator__ = MockValSer(  # pyright: ignore[reportAttributeAccessIssue]
        undefined_type_error_message,
        code='class-not-fully-defined',
        val_or_ser='validator',
        attempt_rebuild=attempt_rebuild_fn(lambda c: c.__pydantic_validator__),
    )
    cls.__pydantic_serializer__ = MockValSer(  # pyright: ignore[reportAttributeAccessIssue]
        undefined_type_error_message,
        code='class-not-fully-defined',
        val_or_ser='serializer',
        attempt_rebuild=attempt_rebuild_fn(lambda c: c.__pydantic_serializer__),
    )


def set_dataclass_mocks(
    cls: type[PydanticDataclass], cls_name: str, undefined_name: str = 'all referenced types'
) -> None:
    """Set `__pydantic_validator__` and `__pydantic_serializer__` to `MockValSer`s on a dataclass.

    Args:
        cls: The dataclass to set the mocks on
        cls_name: Name of the dataclass, used in error messages
        undefined_name: Name of the undefined thing, used in error messages
    """
    from ..dataclasses import rebuild_dataclass

    undefined_type_error_message = (
        f'`{cls_name}` is not fully defined; you should define {undefined_name},'
        f' then call `pydantic.dataclasses.rebuild_dataclass({cls_name})`.'
    )

    def attempt_rebuild_fn(attr_fn: Callable[[type[PydanticDataclass]], T]) -> Callable[[], T | None]:
        def handler() -> T | None:
            if rebuild_dataclass(cls, raise_errors=False, _parent_namespace_depth=5) is not False:
                return attr_fn(cls)
            else:
                return None

        return handler

    cls.__pydantic_core_schema__ = MockCoreSchema(  # pyright: ignore[reportAttributeAccessIssue]
        undefined_type_error_message,
        code='class-not-fully-defined',
        attempt_rebuild=attempt_rebuild_fn(lambda c: c.__pydantic_core_schema__),
    )
    cls.__pydantic_validator__ = MockValSer(  # pyright: ignore[reportAttributeAccessIssue]
        undefined_type_error_message,
        code='class-not-fully-defined',
        val_or_ser='validator',
        attempt_rebuild=attempt_rebuild_fn(lambda c: c.__pydantic_validator__),
    )
    cls.__pydantic_serializer__ = MockValSer(  # pyright: ignore[reportAttributeAccessIssue]
        undefined_type_error_message,
        code='class-not-fully-defined',
        val_or_ser='serializer',
        attempt_rebuild=attempt_rebuild_fn(lambda c: c.__pydantic_serializer__),
    )

# pydantic-2.10.6/pydantic/_internal/_model_construction.py
"""Private logic for creating models."""

from __future__ import annotations as _annotations

import builtins
import operator
import sys
import typing
import warnings
import weakref
from abc import ABCMeta
from functools import lru_cache, partial
from types import FunctionType
from typing import Any, Callable, Generic, Literal, NoReturn, cast

from pydantic_core import PydanticUndefined, SchemaSerializer
from typing_extensions import TypeAliasType, dataclass_transform, deprecated, get_args

from ..errors import PydanticUndefinedAnnotation, PydanticUserError
from ..plugin._schema_validator import create_schema_validator
from ..warnings import GenericBeforeBaseModelWarning, PydanticDeprecatedSince20
from ._config import ConfigWrapper
from ._decorators import DecoratorInfos, PydanticDescriptorProxy, get_attribute_from_bases, unwrap_wrapped_function
from
._fields import collect_model_fields, is_valid_field_name, is_valid_privateattr_name from ._generate_schema import GenerateSchema from ._generics import PydanticGenericMetadata, get_model_typevars_map from ._import_utils import import_cached_base_model, import_cached_field_info from ._mock_val_ser import set_model_mocks from ._namespace_utils import NsResolver from ._schema_generation_shared import CallbackGetCoreSchemaHandler from ._signature import generate_pydantic_signature from ._typing_extra import ( _make_forward_ref, eval_type_backport, is_annotated, is_classvar_annotation, parent_frame_namespace, ) from ._utils import LazyClassAttribute, SafeGetItemProxy if typing.TYPE_CHECKING: from ..fields import ComputedFieldInfo, FieldInfo, ModelPrivateAttr from ..fields import Field as PydanticModelField from ..fields import PrivateAttr as PydanticModelPrivateAttr from ..main import BaseModel else: # See PyCharm issues https://youtrack.jetbrains.com/issue/PY-21915 # and https://youtrack.jetbrains.com/issue/PY-51428 DeprecationWarning = PydanticDeprecatedSince20 PydanticModelField = object() PydanticModelPrivateAttr = object() object_setattr = object.__setattr__ class _ModelNamespaceDict(dict): """A dictionary subclass that intercepts attribute setting on model classes and warns about overriding of decorators. """ def __setitem__(self, k: str, v: object) -> None: existing: Any = self.get(k, None) if existing and v is not existing and isinstance(existing, PydanticDescriptorProxy): warnings.warn(f'`{k}` overrides an existing Pydantic `{existing.decorator_info.decorator_repr}` decorator') return super().__setitem__(k, v) def NoInitField( *, init: Literal[False] = False, ) -> Any: """Only for typing purposes. Used as default value of `__pydantic_fields_set__`, `__pydantic_extra__`, `__pydantic_private__`, so they could be ignored when synthesizing the `__init__` signature. 
""" @dataclass_transform(kw_only_default=True, field_specifiers=(PydanticModelField, PydanticModelPrivateAttr, NoInitField)) class ModelMetaclass(ABCMeta): def __new__( mcs, cls_name: str, bases: tuple[type[Any], ...], namespace: dict[str, Any], __pydantic_generic_metadata__: PydanticGenericMetadata | None = None, __pydantic_reset_parent_namespace__: bool = True, _create_model_module: str | None = None, **kwargs: Any, ) -> type: """Metaclass for creating Pydantic models. Args: cls_name: The name of the class to be created. bases: The base classes of the class to be created. namespace: The attribute dictionary of the class to be created. __pydantic_generic_metadata__: Metadata for generic models. __pydantic_reset_parent_namespace__: Reset parent namespace. _create_model_module: The module of the class to be created, if created by `create_model`. **kwargs: Catch-all for any other keyword arguments. Returns: The new class created by the metaclass. """ # Note `ModelMetaclass` refers to `BaseModel`, but is also used to *create* `BaseModel`, so we rely on the fact # that `BaseModel` itself won't have any bases, but any subclass of it will, to determine whether the `__new__` # call we're in the middle of is for the `BaseModel` class. if bases: base_field_names, class_vars, base_private_attributes = mcs._collect_bases_data(bases) config_wrapper = ConfigWrapper.for_model(bases, namespace, kwargs) namespace['model_config'] = config_wrapper.config_dict private_attributes = inspect_namespace( namespace, config_wrapper.ignored_types, class_vars, base_field_names ) if private_attributes or base_private_attributes: original_model_post_init = get_model_post_init(namespace, bases) if original_model_post_init is not None: # if there are private_attributes and a model_post_init function, we handle both def wrapped_model_post_init(self: BaseModel, context: Any, /) -> None: """We need to both initialize private attributes and call the user-defined model_post_init method. 
""" init_private_attributes(self, context) original_model_post_init(self, context) namespace['model_post_init'] = wrapped_model_post_init else: namespace['model_post_init'] = init_private_attributes namespace['__class_vars__'] = class_vars namespace['__private_attributes__'] = {**base_private_attributes, **private_attributes} cls = cast('type[BaseModel]', super().__new__(mcs, cls_name, bases, namespace, **kwargs)) BaseModel_ = import_cached_base_model() mro = cls.__mro__ if Generic in mro and mro.index(Generic) < mro.index(BaseModel_): warnings.warn( GenericBeforeBaseModelWarning( 'Classes should inherit from `BaseModel` before generic classes (e.g. `typing.Generic[T]`) ' 'for pydantic generics to work properly.' ), stacklevel=2, ) cls.__pydantic_custom_init__ = not getattr(cls.__init__, '__pydantic_base_init__', False) cls.__pydantic_post_init__ = ( None if cls.model_post_init is BaseModel_.model_post_init else 'model_post_init' ) cls.__pydantic_decorators__ = DecoratorInfos.build(cls) # Use the getattr below to grab the __parameters__ from the `typing.Generic` parent class if __pydantic_generic_metadata__: cls.__pydantic_generic_metadata__ = __pydantic_generic_metadata__ else: parent_parameters = getattr(cls, '__pydantic_generic_metadata__', {}).get('parameters', ()) parameters = getattr(cls, '__parameters__', None) or parent_parameters if parameters and parent_parameters and not all(x in parameters for x in parent_parameters): from ..root_model import RootModelRootType missing_parameters = tuple(x for x in parameters if x not in parent_parameters) if RootModelRootType in parent_parameters and RootModelRootType not in parameters: # This is a special case where the user has subclassed `RootModel`, but has not parametrized # RootModel with the generic type identifiers being used. 
Ex: # class MyModel(RootModel, Generic[T]): # root: T # Should instead just be: # class MyModel(RootModel[T]): # root: T parameters_str = ', '.join([x.__name__ for x in missing_parameters]) error_message = ( f'{cls.__name__} is a subclass of `RootModel`, but does not include the generic type identifier(s) ' f'{parameters_str} in its parameters. ' f'You should parametrize RootModel directly, e.g., `class {cls.__name__}(RootModel[{parameters_str}]): ...`.' ) else: combined_parameters = parent_parameters + missing_parameters parameters_str = ', '.join([str(x) for x in combined_parameters]) generic_type_label = f'typing.Generic[{parameters_str}]' error_message = ( f'All parameters must be present on typing.Generic;' f' you should inherit from {generic_type_label}.' ) if Generic not in bases: # pragma: no cover # We raise an error here not because it is desirable, but because some cases are mishandled. # It would be nice to remove this error and still have things behave as expected, it's just # challenging because we are using a custom `__class_getitem__` to parametrize generic models, # and not returning a typing._GenericAlias from it. bases_str = ', '.join([x.__name__ for x in bases] + [generic_type_label]) error_message += ( f' Note: `typing.Generic` must go last: `class {cls.__name__}({bases_str}): ...`)' ) raise TypeError(error_message) cls.__pydantic_generic_metadata__ = { 'origin': None, 'args': (), 'parameters': parameters, } cls.__pydantic_complete__ = False # Ensure this specific class gets completed # preserve `__set_name__` protocol defined in https://peps.python.org/pep-0487 # for attributes not in `new_namespace` (e.g. 
private attributes) for name, obj in private_attributes.items(): obj.__set_name__(cls, name) if __pydantic_reset_parent_namespace__: cls.__pydantic_parent_namespace__ = build_lenient_weakvaluedict(parent_frame_namespace()) parent_namespace: dict[str, Any] | None = getattr(cls, '__pydantic_parent_namespace__', None) if isinstance(parent_namespace, dict): parent_namespace = unpack_lenient_weakvaluedict(parent_namespace) ns_resolver = NsResolver(parent_namespace=parent_namespace) set_model_fields(cls, bases, config_wrapper, ns_resolver) if config_wrapper.frozen and '__hash__' not in namespace: set_default_hash_func(cls, bases) complete_model_class( cls, cls_name, config_wrapper, raise_errors=False, ns_resolver=ns_resolver, create_model_module=_create_model_module, ) # If this is placed before the complete_model_class call above, # the generic computed fields return type is set to PydanticUndefined cls.__pydantic_computed_fields__ = { k: v.info for k, v in cls.__pydantic_decorators__.computed_fields.items() } set_deprecated_descriptors(cls) # using super(cls, cls) on the next line ensures we only call the parent class's __pydantic_init_subclass__ # I believe the `type: ignore` is only necessary because mypy doesn't realize that this code branch is # only hit for _proper_ subclasses of BaseModel super(cls, cls).__pydantic_init_subclass__(**kwargs) # type: ignore[misc] return cls else: # These are instance variables, but have been assigned to `NoInitField` to trick the type checker. for instance_slot in '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__': namespace.pop( instance_slot, None, # In case the metaclass is used with a class other than `BaseModel`. 
) namespace.get('__annotations__', {}).clear() return super().__new__(mcs, cls_name, bases, namespace, **kwargs) if not typing.TYPE_CHECKING: # pragma: no branch # We put `__getattr__` in a non-TYPE_CHECKING block because otherwise, mypy allows arbitrary attribute access def __getattr__(self, item: str) -> Any: """This is necessary to keep attribute access working for class attribute access.""" private_attributes = self.__dict__.get('__private_attributes__') if private_attributes and item in private_attributes: return private_attributes[item] raise AttributeError(item) @classmethod def __prepare__(cls, *args: Any, **kwargs: Any) -> dict[str, object]: return _ModelNamespaceDict() def __instancecheck__(self, instance: Any) -> bool: """Avoid calling ABC _abc_subclasscheck unless we're pretty sure. See #3829 and python/cpython#92810 """ return hasattr(instance, '__pydantic_validator__') and super().__instancecheck__(instance) @staticmethod def _collect_bases_data(bases: tuple[type[Any], ...]) -> tuple[set[str], set[str], dict[str, ModelPrivateAttr]]: BaseModel = import_cached_base_model() field_names: set[str] = set() class_vars: set[str] = set() private_attributes: dict[str, ModelPrivateAttr] = {} for base in bases: if issubclass(base, BaseModel) and base is not BaseModel: # model_fields might not be defined yet in the case of generics, so we use getattr here: field_names.update(getattr(base, '__pydantic_fields__', {}).keys()) class_vars.update(base.__class_vars__) private_attributes.update(base.__private_attributes__) return field_names, class_vars, private_attributes @property @deprecated('The `__fields__` attribute is deprecated, use `model_fields` instead.', category=None) def __fields__(self) -> dict[str, FieldInfo]: warnings.warn( 'The `__fields__` attribute is deprecated, use `model_fields` instead.', PydanticDeprecatedSince20, stacklevel=2, ) return self.model_fields @property def model_fields(self) -> dict[str, FieldInfo]: """Get metadata about the fields 
defined on the model. Returns: A mapping of field names to [`FieldInfo`][pydantic.fields.FieldInfo] objects. """ return getattr(self, '__pydantic_fields__', {}) @property def model_computed_fields(self) -> dict[str, ComputedFieldInfo]: """Get metadata about the computed fields defined on the model. Returns: A mapping of computed field names to [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] objects. """ return getattr(self, '__pydantic_computed_fields__', {}) def __dir__(self) -> list[str]: attributes = list(super().__dir__()) if '__fields__' in attributes: attributes.remove('__fields__') return attributes def init_private_attributes(self: BaseModel, context: Any, /) -> None: """This function is meant to behave like a BaseModel method to initialise private attributes. It takes context as an argument since that's what pydantic-core passes when calling it. Args: self: The BaseModel instance. context: The context. """ if getattr(self, '__pydantic_private__', None) is None: pydantic_private = {} for name, private_attr in self.__private_attributes__.items(): default = private_attr.get_default() if default is not PydanticUndefined: pydantic_private[name] = default object_setattr(self, '__pydantic_private__', pydantic_private) def get_model_post_init(namespace: dict[str, Any], bases: tuple[type[Any], ...]) -> Callable[..., Any] | None: """Get the `model_post_init` method from the namespace or the class bases, or `None` if not defined.""" if 'model_post_init' in namespace: return namespace['model_post_init'] BaseModel = import_cached_base_model() model_post_init = get_attribute_from_bases(bases, 'model_post_init') if model_post_init is not BaseModel.model_post_init: return model_post_init def inspect_namespace( # noqa C901 namespace: dict[str, Any], ignored_types: tuple[type[Any], ...], base_class_vars: set[str], base_class_fields: set[str], ) -> dict[str, ModelPrivateAttr]: """Iterate over the namespace and: * gather private attributes * check for items which 
look like fields but are not (e.g. have no annotation) and warn. Args: namespace: The attribute dictionary of the class to be created. ignored_types: A tuple of ignore types. base_class_vars: A set of base class class variables. base_class_fields: A set of base class fields. Returns: A dict contains private attributes info. Raises: TypeError: If there is a `__root__` field in model. NameError: If private attribute name is invalid. PydanticUserError: - If a field does not have a type annotation. - If a field on base class was overridden by a non-annotated attribute. """ from ..fields import ModelPrivateAttr, PrivateAttr FieldInfo = import_cached_field_info() all_ignored_types = ignored_types + default_ignored_types() private_attributes: dict[str, ModelPrivateAttr] = {} raw_annotations = namespace.get('__annotations__', {}) if '__root__' in raw_annotations or '__root__' in namespace: raise TypeError("To define root models, use `pydantic.RootModel` rather than a field called '__root__'") ignored_names: set[str] = set() for var_name, value in list(namespace.items()): if var_name == 'model_config' or var_name == '__pydantic_extra__': continue elif ( isinstance(value, type) and value.__module__ == namespace['__module__'] and '__qualname__' in namespace and value.__qualname__.startswith(namespace['__qualname__']) ): # `value` is a nested type defined in this namespace; don't error continue elif isinstance(value, all_ignored_types) or value.__class__.__module__ == 'functools': ignored_names.add(var_name) continue elif isinstance(value, ModelPrivateAttr): if var_name.startswith('__'): raise NameError( 'Private attributes must not use dunder names;' f' use a single underscore prefix instead of {var_name!r}.' ) elif is_valid_field_name(var_name): raise NameError( 'Private attributes must not use valid field names;' f' use sunder names, e.g. {"_" + var_name!r} instead of {var_name!r}.' 
) private_attributes[var_name] = value del namespace[var_name] elif isinstance(value, FieldInfo) and not is_valid_field_name(var_name): suggested_name = var_name.lstrip('_') or 'my_field' # don't suggest '' for all-underscore name raise NameError( f'Fields must not use names with leading underscores;' f' e.g., use {suggested_name!r} instead of {var_name!r}.' ) elif var_name.startswith('__'): continue elif is_valid_privateattr_name(var_name): if var_name not in raw_annotations or not is_classvar_annotation(raw_annotations[var_name]): private_attributes[var_name] = cast(ModelPrivateAttr, PrivateAttr(default=value)) del namespace[var_name] elif var_name in base_class_vars: continue elif var_name not in raw_annotations: if var_name in base_class_fields: raise PydanticUserError( f'Field {var_name!r} defined on a base class was overridden by a non-annotated attribute. ' f'All field definitions, including overrides, require a type annotation.', code='model-field-overridden', ) elif isinstance(value, FieldInfo): raise PydanticUserError( f'Field {var_name!r} requires a type annotation', code='model-field-missing-annotation' ) else: raise PydanticUserError( f'A non-annotated attribute was detected: `{var_name} = {value!r}`. 
All model fields require a ' f'type annotation; if `{var_name}` is not meant to be a field, you may be able to resolve this ' f"error by annotating it as a `ClassVar` or updating `model_config['ignored_types']`.", code='model-field-missing-annotation', ) for ann_name, ann_type in raw_annotations.items(): if ( is_valid_privateattr_name(ann_name) and ann_name not in private_attributes and ann_name not in ignored_names # This condition can be a false negative when `ann_type` is stringified, # but it is handled in most cases in `set_model_fields`: and not is_classvar_annotation(ann_type) and ann_type not in all_ignored_types and getattr(ann_type, '__module__', None) != 'functools' ): if isinstance(ann_type, str): # Walking up the frames to get the module namespace where the model is defined # (as the model class wasn't created yet, we unfortunately can't use `cls.__module__`): frame = sys._getframe(2) if frame is not None: try: ann_type = eval_type_backport( _make_forward_ref(ann_type, is_argument=False, is_class=True), globalns=frame.f_globals, localns=frame.f_locals, ) except (NameError, TypeError): pass if is_annotated(ann_type): _, *metadata = get_args(ann_type) private_attr = next((v for v in metadata if isinstance(v, ModelPrivateAttr)), None) if private_attr is not None: private_attributes[ann_name] = private_attr continue private_attributes[ann_name] = PrivateAttr() return private_attributes def set_default_hash_func(cls: type[BaseModel], bases: tuple[type[Any], ...]) -> None: base_hash_func = get_attribute_from_bases(bases, '__hash__') new_hash_func = make_hash_func(cls) if base_hash_func in {None, object.__hash__} or getattr(base_hash_func, '__code__', None) == new_hash_func.__code__: # If `__hash__` is some default, we generate a hash function. # It will be `None` if not overridden from BaseModel. # It may be `object.__hash__` if there is another # parent class earlier in the bases which doesn't override `__hash__` (e.g. `typing.Generic`). 
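The generated hash function (see `make_hash_func` below) hashes the tuple of field values pulled from the instance `__dict__` via `operator.itemgetter`. A minimal standalone sketch of that technique (the `Point` class here is a hypothetical stand-in, not part of pydantic):

```python
import operator

class Point:
    # Hypothetical stand-in for a frozen model with two fields.
    def __init__(self, x, y):
        self.__dict__.update(x=x, y=y)

field_names = ('x', 'y')
# itemgetter over multiple keys returns the tuple of values in order.
getter = operator.itemgetter(*field_names)

def hash_func(self):
    # Hash the tuple of field values pulled straight from the instance dict.
    return hash(getter(self.__dict__))

Point.__hash__ = hash_func

p, q = Point(1, 2), Point(1, 2)
assert hash(p) == hash(q)  # equal field values -> equal hashes
assert hash(p) == hash((1, 2))
```

This is why the real implementation falls back to `SafeGetItemProxy` on `KeyError`: `itemgetter` assumes every field key is present in `__dict__`.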
# It may be a value set by `set_default_hash_func` if `cls` is a subclass of another frozen model. # In the last case we still need a new hash function to account for new `model_fields`. cls.__hash__ = new_hash_func def make_hash_func(cls: type[BaseModel]) -> Any: getter = operator.itemgetter(*cls.__pydantic_fields__.keys()) if cls.__pydantic_fields__ else lambda _: 0 def hash_func(self: Any) -> int: try: return hash(getter(self.__dict__)) except KeyError: # In rare cases (such as when using the deprecated copy method), the __dict__ may not contain # all model fields, which is how we can get here. # getter(self.__dict__) is much faster than any 'safe' method that accounts for missing keys, # and wrapping it in a `try` doesn't slow things down much in the common case. return hash(getter(SafeGetItemProxy(self.__dict__))) return hash_func def set_model_fields( cls: type[BaseModel], bases: tuple[type[Any], ...], config_wrapper: ConfigWrapper, ns_resolver: NsResolver | None, ) -> None: """Collect and set `cls.__pydantic_fields__` and `cls.__class_vars__`. Args: cls: BaseModel or dataclass. bases: Parents of the class, generally `cls.__bases__`. config_wrapper: The config wrapper instance. ns_resolver: Namespace resolver to use when getting model annotations. """ typevars_map = get_model_typevars_map(cls) fields, class_vars = collect_model_fields(cls, bases, config_wrapper, ns_resolver, typevars_map=typevars_map) cls.__pydantic_fields__ = fields cls.__class_vars__.update(class_vars) for k in class_vars: # Class vars should not be private attributes # We remove them _here_ and not earlier because we rely on inspecting the class to determine its classvars, # but private attributes are determined by inspecting the namespace _prior_ to class creation. 
        # In the case that a classvar with a leading '_' is defined via a ForwardRef (e.g., when using
        # `__future__.annotations`), we want to remove the private attribute which was detected _before_ we knew it
        # evaluated to a classvar

        value = cls.__private_attributes__.pop(k, None)
        if value is not None and value.default is not PydanticUndefined:
            setattr(cls, k, value.default)


def complete_model_class(
    cls: type[BaseModel],
    cls_name: str,
    config_wrapper: ConfigWrapper,
    *,
    raise_errors: bool = True,
    ns_resolver: NsResolver | None = None,
    create_model_module: str | None = None,
) -> bool:
    """Finish building a model class.

    This logic must be called after the class has been created since validation functions must be bound
    and `get_type_hints` requires a class object.

    Args:
        cls: BaseModel or dataclass.
        cls_name: The model or dataclass name.
        config_wrapper: The config wrapper instance.
        raise_errors: Whether to raise errors.
        ns_resolver: The namespace resolver instance to use during schema building.
        create_model_module: The module of the class to be created, if created by `create_model`.

    Returns:
        `True` if the model is successfully completed, else `False`.

    Raises:
        PydanticUndefinedAnnotation: If `PydanticUndefinedAnnotation` occurs in `__get_pydantic_core_schema__`
            and `raise_errors=True`.
""" if config_wrapper.defer_build: set_model_mocks(cls, cls_name) return False typevars_map = get_model_typevars_map(cls) gen_schema = GenerateSchema( config_wrapper, ns_resolver, typevars_map, ) handler = CallbackGetCoreSchemaHandler( partial(gen_schema.generate_schema, from_dunder_get_core_schema=False), gen_schema, ref_mode='unpack', ) try: schema = cls.__get_pydantic_core_schema__(cls, handler) except PydanticUndefinedAnnotation as e: if raise_errors: raise set_model_mocks(cls, cls_name, f'`{e.name}`') return False core_config = config_wrapper.core_config(title=cls.__name__) try: schema = gen_schema.clean_schema(schema) except gen_schema.CollectedInvalid: set_model_mocks(cls, cls_name) return False # debug(schema) cls.__pydantic_core_schema__ = schema cls.__pydantic_validator__ = create_schema_validator( schema, cls, create_model_module or cls.__module__, cls.__qualname__, 'create_model' if create_model_module else 'BaseModel', core_config, config_wrapper.plugin_settings, ) cls.__pydantic_serializer__ = SchemaSerializer(schema, core_config) cls.__pydantic_complete__ = True # set __signature__ attr only for model class, but not for its instances # (because instances can define `__call__`, and `inspect.signature` shouldn't # use the `__signature__` attribute and instead generate from `__call__`). 
cls.__signature__ = LazyClassAttribute( '__signature__', partial( generate_pydantic_signature, init=cls.__init__, fields=cls.__pydantic_fields__, populate_by_name=config_wrapper.populate_by_name, extra=config_wrapper.extra, ), ) return True def set_deprecated_descriptors(cls: type[BaseModel]) -> None: """Set data descriptors on the class for deprecated fields.""" for field, field_info in cls.__pydantic_fields__.items(): if (msg := field_info.deprecation_message) is not None: desc = _DeprecatedFieldDescriptor(msg) desc.__set_name__(cls, field) setattr(cls, field, desc) for field, computed_field_info in cls.__pydantic_computed_fields__.items(): if ( (msg := computed_field_info.deprecation_message) is not None # Avoid having two warnings emitted: and not hasattr(unwrap_wrapped_function(computed_field_info.wrapped_property), '__deprecated__') ): desc = _DeprecatedFieldDescriptor(msg, computed_field_info.wrapped_property) desc.__set_name__(cls, field) setattr(cls, field, desc) class _DeprecatedFieldDescriptor: """Read-only data descriptor used to emit a runtime deprecation warning before accessing a deprecated field. Attributes: msg: The deprecation message to be emitted. wrapped_property: The property instance if the deprecated field is a computed field, or `None`. field_name: The name of the field being deprecated. 
""" field_name: str def __init__(self, msg: str, wrapped_property: property | None = None) -> None: self.msg = msg self.wrapped_property = wrapped_property def __set_name__(self, cls: type[BaseModel], name: str) -> None: self.field_name = name def __get__(self, obj: BaseModel | None, obj_type: type[BaseModel] | None = None) -> Any: if obj is None: if self.wrapped_property is not None: return self.wrapped_property.__get__(None, obj_type) raise AttributeError(self.field_name) warnings.warn(self.msg, builtins.DeprecationWarning, stacklevel=2) if self.wrapped_property is not None: return self.wrapped_property.__get__(obj, obj_type) return obj.__dict__[self.field_name] # Defined to make it a data descriptor and take precedence over the instance's dictionary. # Note that it will not be called when setting a value on a model instance # as `BaseModel.__setattr__` is defined and takes priority. def __set__(self, obj: Any, value: Any) -> NoReturn: raise AttributeError(self.field_name) class _PydanticWeakRef: """Wrapper for `weakref.ref` that enables `pickle` serialization. Cloudpickle fails to serialize `weakref.ref` objects due to an arcane error related to abstract base classes (`abc.ABC`). This class works around the issue by wrapping `weakref.ref` instead of subclassing it. See https://github.com/pydantic/pydantic/issues/6763 for context. Semantics: - If not pickled, behaves the same as a `weakref.ref`. - If pickled along with the referenced object, the same `weakref.ref` behavior will be maintained between them after unpickling. - If pickled without the referenced object, after unpickling the underlying reference will be cleared (`__call__` will always return `None`). """ def __init__(self, obj: Any): if obj is None: # The object will be `None` upon deserialization if the serialized weakref # had lost its underlying object. 
self._wr = None else: self._wr = weakref.ref(obj) def __call__(self) -> Any: if self._wr is None: return None else: return self._wr() def __reduce__(self) -> tuple[Callable, tuple[weakref.ReferenceType | None]]: return _PydanticWeakRef, (self(),) def build_lenient_weakvaluedict(d: dict[str, Any] | None) -> dict[str, Any] | None: """Takes an input dictionary, and produces a new value that (invertibly) replaces the values with weakrefs. We can't just use a WeakValueDictionary because many types (including int, str, etc.) can't be stored as values in a WeakValueDictionary. The `unpack_lenient_weakvaluedict` function can be used to reverse this operation. """ if d is None: return None result = {} for k, v in d.items(): try: proxy = _PydanticWeakRef(v) except TypeError: proxy = v result[k] = proxy return result def unpack_lenient_weakvaluedict(d: dict[str, Any] | None) -> dict[str, Any] | None: """Inverts the transform performed by `build_lenient_weakvaluedict`.""" if d is None: return None result = {} for k, v in d.items(): if isinstance(v, _PydanticWeakRef): v = v() if v is not None: result[k] = v else: result[k] = v return result @lru_cache(maxsize=None) def default_ignored_types() -> tuple[type[Any], ...]: from ..fields import ComputedFieldInfo ignored_types = [ FunctionType, property, classmethod, staticmethod, PydanticDescriptorProxy, ComputedFieldInfo, TypeAliasType, # from `typing_extensions` ] if sys.version_info >= (3, 12): ignored_types.append(typing.TypeAliasType) return tuple(ignored_types) pydantic-2.10.6/pydantic/_internal/_namespace_utils.py000066400000000000000000000273701474456633400231170ustar00rootroot00000000000000from __future__ import annotations import sys from collections.abc import Generator from contextlib import contextmanager from functools import cached_property from typing import Any, Callable, Iterator, Mapping, NamedTuple, TypeVar from typing_extensions import ParamSpec, TypeAlias, TypeAliasType, TypeVarTuple GlobalsNamespace: TypeAlias 
= 'dict[str, Any]' """A global namespace. In most cases, this is a reference to the `__dict__` attribute of a module. This namespace type is expected as the `globals` argument during annotations evaluation. """ MappingNamespace: TypeAlias = Mapping[str, Any] """Any kind of namespace. In most cases, this is a local namespace (e.g. the `__dict__` attribute of a class, the [`f_locals`][frame.f_locals] attribute of a frame object, when dealing with types defined inside functions). This namespace type is expected as the `locals` argument during annotations evaluation. """ _TypeVarLike: TypeAlias = 'TypeVar | ParamSpec | TypeVarTuple' class NamespacesTuple(NamedTuple): """A tuple of globals and locals to be used during annotations evaluation. This datastructure is defined as a named tuple so that it can easily be unpacked: ```python {lint="skip" test="skip"} def eval_type(typ: type[Any], ns: NamespacesTuple) -> None: return eval(typ, *ns) ``` """ globals: GlobalsNamespace """The namespace to be used as the `globals` argument during annotations evaluation.""" locals: MappingNamespace """The namespace to be used as the `locals` argument during annotations evaluation.""" def get_module_ns_of(obj: Any) -> dict[str, Any]: """Get the namespace of the module where the object is defined. Caution: this function does not return a copy of the module namespace, so the result should not be mutated. The burden of enforcing this is on the caller. """ module_name = getattr(obj, '__module__', None) if module_name: try: return sys.modules[module_name].__dict__ except KeyError: # happens occasionally, see https://github.com/pydantic/pydantic/issues/2363 return {} return {} # Note that this class is almost identical to `collections.ChainMap`, but need to enforce # immutable mappings here: class LazyLocalNamespace(Mapping[str, Any]): """A lazily evaluated mapping, to be used as the `locals` argument during annotations evaluation. 
While the [`eval`][eval] function expects a mapping as the `locals` argument, it only performs `__getitem__` calls. The [`Mapping`][collections.abc.Mapping] abstract base class is fully implemented only for type checking purposes. Args: *namespaces: The namespaces to consider, in ascending order of priority. Example: ```python {lint="skip" test="skip"} ns = LazyLocalNamespace({'a': 1, 'b': 2}, {'a': 3}) ns['a'] #> 3 ns['b'] #> 2 ``` """ def __init__(self, *namespaces: MappingNamespace) -> None: self._namespaces = namespaces @cached_property def data(self) -> dict[str, Any]: return {k: v for ns in self._namespaces for k, v in ns.items()} def __len__(self) -> int: return len(self.data) def __getitem__(self, key: str) -> Any: return self.data[key] def __contains__(self, key: object) -> bool: return key in self.data def __iter__(self) -> Iterator[str]: return iter(self.data) def ns_for_function(obj: Callable[..., Any], parent_namespace: MappingNamespace | None = None) -> NamespacesTuple: """Return the global and local namespaces to be used when evaluating annotations for the provided function. The global namespace will be the `__dict__` attribute of the module the function was defined in. The local namespace will contain the `__type_params__` introduced by PEP 695. Args: obj: The object to use when building namespaces. parent_namespace: Optional namespace to be added with the lowest priority in the local namespace. If the passed function is a method, the `parent_namespace` will be the namespace of the class the method is defined in. Thus, we also fetch type `__type_params__` from there (i.e. the class-scoped type variables). """ locals_list: list[MappingNamespace] = [] if parent_namespace is not None: locals_list.append(parent_namespace) # Get the `__type_params__` attribute introduced by PEP 695. # Note that the `typing._eval_type` function expects type params to be # passed as a separate argument. 
However, internally, `_eval_type` calls
    # `ForwardRef._evaluate`, which will merge type params with the localns,
    # essentially mimicking what we do here.
    type_params: tuple[_TypeVarLike, ...]
    if hasattr(obj, '__type_params__'):
        type_params = obj.__type_params__
    else:
        type_params = ()
    if parent_namespace is not None:
        # We also fetch type params from the parent namespace. If present, it probably
        # means the function was defined in a class. This is to support the following:
        # https://github.com/python/cpython/issues/124089.
        type_params += parent_namespace.get('__type_params__', ())

    locals_list.append({t.__name__: t for t in type_params})

    # What about short-circuiting to `obj.__globals__`?
    globalns = get_module_ns_of(obj)

    return NamespacesTuple(globalns, LazyLocalNamespace(*locals_list))


class NsResolver:
    """A class responsible for the namespace resolution logic used during annotations evaluation.

    This class handles the namespace logic when evaluating annotations, mainly for class objects.

    It holds a stack of classes that are being inspected during the core schema building,
    and the `types_namespace` property exposes the globals and locals to be used for
    type annotation evaluation. Additionally -- if no class is present in the stack -- a
    fallback globals and locals can be provided using the `namespaces_tuple` argument
    (this is useful when generating a schema for a simple annotation, e.g. when using
    `TypeAdapter`).

    The namespace creation logic is unfortunately flawed in some cases, for backwards
    compatibility reasons and to better support valid edge cases. See the description
    for the `parent_namespace` argument and the example for more details.

    Args:
        namespaces_tuple: The default globals and locals to use if no class is present
            on the stack. This can be useful when using the `GenerateSchema` class
            with `TypeAdapter`, where the "type" being analyzed is a simple annotation.
        parent_namespace: An optional parent namespace that will be added to the locals
            with the lowest priority.
For a given class defined in a function, the locals of this function are usually used
            as the parent namespace:

            ```python {lint="skip" test="skip"}
            from pydantic import BaseModel

            def func() -> None:
                SomeType = int

                class Model(BaseModel):
                    f: 'SomeType'

                # when collecting fields, a namespace resolver instance will be created
                # this way:
                # ns_resolver = NsResolver(parent_namespace={'SomeType': SomeType})
            ```

            For backwards compatibility reasons and to support valid edge cases, this parent
            namespace will be used for *every* type being pushed to the stack. In the future,
            we might want to be smarter by only doing so when the type being pushed is defined
            in the same module as the parent namespace.

    Example:
        ```python {lint="skip" test="skip"}
        ns_resolver = NsResolver(
            parent_namespace={'fallback': 1},
        )

        class Sub:
            m: 'Model'

        class Model:
            some_local = 1
            sub: Sub

        ns_resolver = NsResolver()

        # This is roughly what happens when we build a core schema for `Model`:
        with ns_resolver.push(Model):
            ns_resolver.types_namespace
            #> NamespacesTuple({'Sub': Sub}, {'Model': Model, 'some_local': 1})
            # First thing to notice here: the model being pushed is added to the locals.
            # Because `NsResolver` is being used during the model definition, it is not
            # yet added to the globals. This is useful when resolving self-referencing annotations.

            with ns_resolver.push(Sub):
                ns_resolver.types_namespace
                #> NamespacesTuple({'Sub': Sub}, {'Sub': Sub, 'Model': Model})
                # Second thing to notice: `Sub` is present in both the globals and locals.
                # This is not an issue, just that as described above, the model being pushed
                # is added to the locals, but it happens to be present in the globals as well
                # because it is already defined.
                # Third thing to notice: `Model` is also added in locals. This is a backwards
                # compatibility workaround that allows for `Sub` to be able to resolve `'Model'`
                # correctly (as otherwise models would have to be rebuilt even though this
                # doesn't look necessary).
``` """ def __init__( self, namespaces_tuple: NamespacesTuple | None = None, parent_namespace: MappingNamespace | None = None, ) -> None: self._base_ns_tuple = namespaces_tuple or NamespacesTuple({}, {}) self._parent_ns = parent_namespace self._types_stack: list[type[Any] | TypeAliasType] = [] @cached_property def types_namespace(self) -> NamespacesTuple: """The current global and local namespaces to be used for annotations evaluation.""" if not self._types_stack: # TODO: should we merge the parent namespace here? # This is relevant for TypeAdapter, where there are no types on the stack, and we might # need access to the parent_ns. Right now, we sidestep this in `type_adapter.py` by passing # locals to both parent_ns and the base_ns_tuple, but this is a bit hacky. # we might consider something like: # if self._parent_ns is not None: # # Hacky workarounds, see class docstring: # # An optional parent namespace that will be added to the locals with the lowest priority # locals_list: list[MappingNamespace] = [self._parent_ns, self._base_ns_tuple.locals] # return NamespacesTuple(self._base_ns_tuple.globals, LazyLocalNamespace(*locals_list)) return self._base_ns_tuple typ = self._types_stack[-1] globalns = get_module_ns_of(typ) locals_list: list[MappingNamespace] = [] # Hacky workarounds, see class docstring: # An optional parent namespace that will be added to the locals with the lowest priority if self._parent_ns is not None: locals_list.append(self._parent_ns) if len(self._types_stack) > 1: first_type = self._types_stack[0] locals_list.append({first_type.__name__: first_type}) if hasattr(typ, '__dict__'): # TypeAliasType is the exception. 
locals_list.append(vars(typ))

        # The len check above prevents this from being added twice:
        locals_list.append({typ.__name__: typ})

        return NamespacesTuple(globalns, LazyLocalNamespace(*locals_list))

    @contextmanager
    def push(self, typ: type[Any] | TypeAliasType, /) -> Generator[None]:
        """Push a type to the stack."""
        self._types_stack.append(typ)
        # Reset the cached property:
        self.__dict__.pop('types_namespace', None)
        try:
            yield
        finally:
            self._types_stack.pop()
            self.__dict__.pop('types_namespace', None)

pydantic-2.10.6/pydantic/_internal/_repr.py

"""Tools to provide pretty/human-readable display of objects."""

from __future__ import annotations as _annotations

import types
import typing
from typing import Any

import typing_extensions

from . import _typing_extra

if typing.TYPE_CHECKING:
    ReprArgs: typing_extensions.TypeAlias = 'typing.Iterable[tuple[str | None, Any]]'
    RichReprResult: typing_extensions.TypeAlias = (
        'typing.Iterable[Any | tuple[Any] | tuple[str, Any] | tuple[str, Any, Any]]'
    )


class PlainRepr(str):
    """String class where repr doesn't include quotes. Useful with Representation when you want to return a string
    representation of something that is valid (or pseudo-valid) python.
    """

    def __repr__(self) -> str:
        return str(self)


class Representation:
    # Mixin to provide `__str__`, `__repr__`, `__pretty__` and `__rich_repr__` methods.
    # `__pretty__` is used by [devtools](https://python-devtools.helpmanual.io/).
    # `__rich_repr__` is used by [rich](https://rich.readthedocs.io/en/stable/pretty.html).
    # (this is not a docstring to avoid adding a docstring to classes which inherit from Representation)

    # we don't want to use a type annotation here as it can break get_type_hints
    __slots__ = ()  # type: typing.Collection[str]

    def __repr_args__(self) -> ReprArgs:
        """Returns the attributes to show in __str__, __repr__, and __pretty__; this is generally overridden.
Can either return:

        * name - value pairs, e.g.: `[('foo_name', 'foo'), ('bar_name', ['b', 'a', 'r'])]`
        * or, just values, e.g.: `[(None, 'foo'), (None, ['b', 'a', 'r'])]`
        """
        attrs_names = self.__slots__
        if not attrs_names and hasattr(self, '__dict__'):
            attrs_names = self.__dict__.keys()
        attrs = ((s, getattr(self, s)) for s in attrs_names)
        return [(a, v if v is not self else self.__repr_recursion__(v)) for a, v in attrs if v is not None]

    def __repr_name__(self) -> str:
        """Name of the instance's class, used in __repr__."""
        return self.__class__.__name__

    def __repr_recursion__(self, object: Any) -> str:
        """Returns the string representation of a recursive object."""
        # This is copied over from the stdlib `pprint` module:
        return f'<Recursion on {type(object).__name__} with id={id(object)}>'

    def __repr_str__(self, join_str: str) -> str:
        return join_str.join(repr(v) if a is None else f'{a}={v!r}' for a, v in self.__repr_args__())

    def __pretty__(self, fmt: typing.Callable[[Any], Any], **kwargs: Any) -> typing.Generator[Any, None, None]:
        """Used by devtools (https://python-devtools.helpmanual.io/) to pretty print objects."""
        yield self.__repr_name__() + '('
        yield 1
        for name, value in self.__repr_args__():
            if name is not None:
                yield name + '='
            yield fmt(value)
            yield ','
        yield 0
        yield -1
        yield ')'

    def __rich_repr__(self) -> RichReprResult:
        """Used by Rich (https://rich.readthedocs.io/en/stable/pretty.html) to pretty print objects."""
        for name, field_repr in self.__repr_args__():
            if name is None:
                yield field_repr
            else:
                yield name, field_repr

    def __str__(self) -> str:
        return self.__repr_str__(' ')

    def __repr__(self) -> str:
        return f'{self.__repr_name__()}({self.__repr_str__(", ")})'


def display_as_type(obj: Any) -> str:
    """Pretty representation of a type, should be as close as possible to the original type definition string.

    Takes some logic from `typing._type_repr`.
    """
    if isinstance(obj, (types.FunctionType, types.BuiltinFunctionType)):
        return obj.__name__
    elif obj is ...:
        return '...'
elif isinstance(obj, Representation): return repr(obj) elif isinstance(obj, typing.ForwardRef) or _typing_extra.is_type_alias_type(obj): return str(obj) if not isinstance(obj, (_typing_extra.typing_base, _typing_extra.WithArgsTypes, type)): obj = obj.__class__ if _typing_extra.origin_is_union(typing_extensions.get_origin(obj)): args = ', '.join(map(display_as_type, typing_extensions.get_args(obj))) return f'Union[{args}]' elif isinstance(obj, _typing_extra.WithArgsTypes): if _typing_extra.is_literal(obj): args = ', '.join(map(repr, typing_extensions.get_args(obj))) else: args = ', '.join(map(display_as_type, typing_extensions.get_args(obj))) try: return f'{obj.__qualname__}[{args}]' except AttributeError: return str(obj).replace('typing.', '').replace('typing_extensions.', '') # handles TypeAliasType in 3.12 elif isinstance(obj, type): return obj.__qualname__ else: return repr(obj).replace('typing.', '').replace('typing_extensions.', '') pydantic-2.10.6/pydantic/_internal/_schema_generation_shared.py000066400000000000000000000114411474456633400247340ustar00rootroot00000000000000"""Types and utility functions used by various other internal tools.""" from __future__ import annotations from typing import TYPE_CHECKING, Any, Callable from pydantic_core import core_schema from typing_extensions import Literal from ..annotated_handlers import GetCoreSchemaHandler, GetJsonSchemaHandler if TYPE_CHECKING: from ..json_schema import GenerateJsonSchema, JsonSchemaValue from ._core_utils import CoreSchemaOrField from ._generate_schema import GenerateSchema from ._namespace_utils import NamespacesTuple GetJsonSchemaFunction = Callable[[CoreSchemaOrField, GetJsonSchemaHandler], JsonSchemaValue] HandlerOverride = Callable[[CoreSchemaOrField], JsonSchemaValue] class GenerateJsonSchemaHandler(GetJsonSchemaHandler): """JsonSchemaHandler implementation that doesn't do ref unwrapping by default. 
This is used for any Annotated metadata so that we don't end up with conflicting
    modifications to the definition schema.

    Used internally by Pydantic, please do not rely on this implementation.
    See `GetJsonSchemaHandler` for the handler API.
    """

    def __init__(self, generate_json_schema: GenerateJsonSchema, handler_override: HandlerOverride | None) -> None:
        self.generate_json_schema = generate_json_schema
        self.handler = handler_override or generate_json_schema.generate_inner
        self.mode = generate_json_schema.mode

    def __call__(self, core_schema: CoreSchemaOrField, /) -> JsonSchemaValue:
        return self.handler(core_schema)

    def resolve_ref_schema(self, maybe_ref_json_schema: JsonSchemaValue) -> JsonSchemaValue:
        """Resolves `$ref` in the json schema.

        This returns the input json schema if there is no `$ref` in the json schema.

        Args:
            maybe_ref_json_schema: The input json schema that may contain `$ref`.

        Returns:
            Resolved json schema.

        Raises:
            LookupError: If it can't find the definition for `$ref`.
        """
        if '$ref' not in maybe_ref_json_schema:
            return maybe_ref_json_schema
        ref = maybe_ref_json_schema['$ref']
        json_schema = self.generate_json_schema.get_schema_from_definitions(ref)
        if json_schema is None:
            raise LookupError(
                f'Could not find a ref for {ref}.'
                ' Maybe you tried to call resolve_ref_schema from within a recursive model?'
            )
        return json_schema


class CallbackGetCoreSchemaHandler(GetCoreSchemaHandler):
    """Wrapper to use an arbitrary function as a `GetCoreSchemaHandler`.

    Used internally by Pydantic, please do not rely on this implementation.
    See `GetCoreSchemaHandler` for the handler API.
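The `$ref` resolution pattern used by `resolve_ref_schema` above can be sketched with plain dicts. The `definitions` table and ref key below are illustrative, not pydantic's actual internal storage:

```python
# A definitions table mapping refs to schemas; the resolver returns the
# input unchanged when no '$ref' key is present, otherwise looks it up.
definitions = {'#/defs/User': {'type': 'object'}}

def resolve_ref_schema(maybe_ref: dict) -> dict:
    if '$ref' not in maybe_ref:
        return maybe_ref
    ref = maybe_ref['$ref']
    resolved = definitions.get(ref)
    if resolved is None:
        # Mirrors the LookupError raised by the real handler.
        raise LookupError(f'Could not find a ref for {ref}.')
    return resolved

assert resolve_ref_schema({'$ref': '#/defs/User'}) == {'type': 'object'}
assert resolve_ref_schema({'type': 'string'}) == {'type': 'string'}
```

Note this sketch resolves only one level; the real core-schema variant also recurses through `'definitions'` wrapper schemas.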
""" def __init__( self, handler: Callable[[Any], core_schema.CoreSchema], generate_schema: GenerateSchema, ref_mode: Literal['to-def', 'unpack'] = 'to-def', ) -> None: self._handler = handler self._generate_schema = generate_schema self._ref_mode = ref_mode def __call__(self, source_type: Any, /) -> core_schema.CoreSchema: schema = self._handler(source_type) ref = schema.get('ref') if self._ref_mode == 'to-def': if ref is not None: self._generate_schema.defs.definitions[ref] = schema return core_schema.definition_reference_schema(ref) return schema else: # ref_mode = 'unpack return self.resolve_ref_schema(schema) def _get_types_namespace(self) -> NamespacesTuple: return self._generate_schema._types_namespace def generate_schema(self, source_type: Any, /) -> core_schema.CoreSchema: return self._generate_schema.generate_schema(source_type) @property def field_name(self) -> str | None: return self._generate_schema.field_name_stack.get() def resolve_ref_schema(self, maybe_ref_schema: core_schema.CoreSchema) -> core_schema.CoreSchema: """Resolves reference in the core schema. Args: maybe_ref_schema: The input core schema that may contains reference. Returns: Resolved core schema. Raises: LookupError: If it can't find the definition for reference. """ if maybe_ref_schema['type'] == 'definition-ref': ref = maybe_ref_schema['schema_ref'] if ref not in self._generate_schema.defs.definitions: raise LookupError( f'Could not find a ref for {ref}.' ' Maybe you tried to call resolve_ref_schema from within a recursive model?' 
) return self._generate_schema.defs.definitions[ref] elif maybe_ref_schema['type'] == 'definitions': return self.resolve_ref_schema(maybe_ref_schema['schema']) return maybe_ref_schema pydantic-2.10.6/pydantic/_internal/_serializers.py000066400000000000000000000025141474456633400222700ustar00rootroot00000000000000from __future__ import annotations import collections import collections.abc import typing from typing import Any from pydantic_core import PydanticOmit, core_schema SEQUENCE_ORIGIN_MAP: dict[Any, Any] = { typing.Deque: collections.deque, collections.deque: collections.deque, list: list, typing.List: list, set: set, typing.AbstractSet: set, typing.Set: set, frozenset: frozenset, typing.FrozenSet: frozenset, typing.Sequence: list, typing.MutableSequence: list, typing.MutableSet: set, # this doesn't handle subclasses of these # parametrized typing.Set creates one of these collections.abc.MutableSet: set, collections.abc.Set: frozenset, } def serialize_sequence_via_list( v: Any, handler: core_schema.SerializerFunctionWrapHandler, info: core_schema.SerializationInfo ) -> Any: items: list[Any] = [] mapped_origin = SEQUENCE_ORIGIN_MAP.get(type(v), None) if mapped_origin is None: # we shouldn't hit this branch, should probably add a serialization error or something return v for index, item in enumerate(v): try: v = handler(item, index) except PydanticOmit: pass else: items.append(v) if info.mode_is_json(): return items else: return mapped_origin(items) pydantic-2.10.6/pydantic/_internal/_signature.py000066400000000000000000000151731474456633400217420ustar00rootroot00000000000000from __future__ import annotations import dataclasses from inspect import Parameter, Signature, signature from typing import TYPE_CHECKING, Any, Callable from pydantic_core import PydanticUndefined from ._utils import is_valid_identifier if TYPE_CHECKING: from ..config import ExtraValues from ..fields import FieldInfo # Copied over from stdlib dataclasses class _HAS_DEFAULT_FACTORY_CLASS: 
def __repr__(self): return '<factory>' _HAS_DEFAULT_FACTORY = _HAS_DEFAULT_FACTORY_CLASS() def _field_name_for_signature(field_name: str, field_info: FieldInfo) -> str: """Extract the correct name to use for the field when generating a signature. Assuming the field has a valid alias, this will return the alias. Otherwise, it will return the field name. First priority is given to the alias, then the validation_alias, then the field name. Args: field_name: The name of the field. field_info: The corresponding FieldInfo object. Returns: The correct name to use when generating a signature. """ if isinstance(field_info.alias, str) and is_valid_identifier(field_info.alias): return field_info.alias if isinstance(field_info.validation_alias, str) and is_valid_identifier(field_info.validation_alias): return field_info.validation_alias return field_name def _process_param_defaults(param: Parameter) -> Parameter: """Modify the signature for a parameter in a dataclass where the default value is a FieldInfo instance.
Args: param (Parameter): The parameter Returns: Parameter: The custom processed parameter """ from ..fields import FieldInfo param_default = param.default if isinstance(param_default, FieldInfo): annotation = param.annotation # Replace the annotation if appropriate # inspect does "clever" things to show annotations as strings because we have # `from __future__ import annotations` in main, we don't want that if annotation == 'Any': annotation = Any # Replace the field default default = param_default.default if default is PydanticUndefined: if param_default.default_factory is PydanticUndefined: default = Signature.empty else: # this is used by dataclasses to indicate a factory exists: default = dataclasses._HAS_DEFAULT_FACTORY # type: ignore return param.replace( annotation=annotation, name=_field_name_for_signature(param.name, param_default), default=default ) return param def _generate_signature_parameters( # noqa: C901 (ignore complexity, could use a refactor) init: Callable[..., None], fields: dict[str, FieldInfo], populate_by_name: bool, extra: ExtraValues | None, ) -> dict[str, Parameter]: """Generate a mapping of parameter names to Parameter objects for a pydantic BaseModel or dataclass.""" from itertools import islice present_params = signature(init).parameters.values() merged_params: dict[str, Parameter] = {} var_kw = None use_var_kw = False for param in islice(present_params, 1, None): # skip self arg # inspect does "clever" things to show annotations as strings because we have # `from __future__ import annotations` in main, we don't want that if fields.get(param.name): # exclude params with init=False if getattr(fields[param.name], 'init', True) is False: continue param = param.replace(name=_field_name_for_signature(param.name, fields[param.name])) if param.annotation == 'Any': param = param.replace(annotation=Any) if param.kind is param.VAR_KEYWORD: var_kw = param continue merged_params[param.name] = param if var_kw: # if custom init has no var_kw, fields 
which are not declared in it cannot be passed through allow_names = populate_by_name for field_name, field in fields.items(): # when alias is a str it should be used for signature generation param_name = _field_name_for_signature(field_name, field) if field_name in merged_params or param_name in merged_params: continue if not is_valid_identifier(param_name): if allow_names: param_name = field_name else: use_var_kw = True continue if field.is_required(): default = Parameter.empty elif field.default_factory is not None: # Mimics stdlib dataclasses: default = _HAS_DEFAULT_FACTORY else: default = field.default merged_params[param_name] = Parameter( param_name, Parameter.KEYWORD_ONLY, annotation=field.rebuild_annotation(), default=default, ) if extra == 'allow': use_var_kw = True if var_kw and use_var_kw: # Make sure the parameter for extra kwargs # does not have the same name as a field default_model_signature = [ ('self', Parameter.POSITIONAL_ONLY), ('data', Parameter.VAR_KEYWORD), ] if [(p.name, p.kind) for p in present_params] == default_model_signature: # if this is the standard model signature, use extra_data as the extra args name var_kw_name = 'extra_data' else: # else start from var_kw var_kw_name = var_kw.name # generate a name that's definitely unique while var_kw_name in fields: var_kw_name += '_' merged_params[var_kw_name] = var_kw.replace(name=var_kw_name) return merged_params def generate_pydantic_signature( init: Callable[..., None], fields: dict[str, FieldInfo], populate_by_name: bool, extra: ExtraValues | None, is_dataclass: bool = False, ) -> Signature: """Generate signature for a pydantic BaseModel or dataclass. Args: init: The class init. fields: The model fields. populate_by_name: The `populate_by_name` value of the config. extra: The `extra` value of the config. is_dataclass: Whether the model is a dataclass. Returns: The dataclass/BaseModel subclass signature. 
""" merged_params = _generate_signature_parameters(init, fields, populate_by_name, extra) if is_dataclass: merged_params = {k: _process_param_defaults(v) for k, v in merged_params.items()} return Signature(parameters=list(merged_params.values()), return_annotation=None) pydantic-2.10.6/pydantic/_internal/_std_types_schema.py000066400000000000000000000374431474456633400233030ustar00rootroot00000000000000"""Logic for generating pydantic-core schemas for standard library types. Import of this module is deferred since it contains imports of many standard library modules. """ # TODO: eventually, we'd like to move all of the types handled here to have pydantic-core validators # so that we can avoid this annotation injection and just use the standard pydantic-core schema generation from __future__ import annotations as _annotations import collections import collections.abc import dataclasses import os import typing from functools import partial from typing import Any, Callable, Iterable, Tuple, TypeVar, cast import typing_extensions from pydantic_core import ( CoreSchema, PydanticCustomError, core_schema, ) from typing_extensions import get_args, get_origin from pydantic._internal._serializers import serialize_sequence_via_list from pydantic.errors import PydanticSchemaGenerationError from pydantic.types import Strict from ..json_schema import JsonSchemaValue from . 
import _known_annotated_metadata, _typing_extra from ._import_utils import import_cached_field_info from ._internal_dataclass import slots_true from ._schema_generation_shared import GetCoreSchemaHandler, GetJsonSchemaHandler FieldInfo = import_cached_field_info() if typing.TYPE_CHECKING: from ._generate_schema import GenerateSchema StdSchemaFunction = Callable[[GenerateSchema, type[Any]], core_schema.CoreSchema] @dataclasses.dataclass(**slots_true) class InnerSchemaValidator: """Use a fixed CoreSchema, avoiding interference from outward annotations.""" core_schema: CoreSchema js_schema: JsonSchemaValue | None = None js_core_schema: CoreSchema | None = None js_schema_update: JsonSchemaValue | None = None def __get_pydantic_json_schema__(self, _schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue: if self.js_schema is not None: return self.js_schema js_schema = handler(self.js_core_schema or self.core_schema) if self.js_schema_update is not None: js_schema.update(self.js_schema_update) return js_schema def __get_pydantic_core_schema__(self, _source_type: Any, _handler: GetCoreSchemaHandler) -> CoreSchema: return self.core_schema def path_schema_prepare_pydantic_annotations( source_type: Any, annotations: Iterable[Any] ) -> tuple[Any, list[Any]] | None: import pathlib orig_source_type: Any = get_origin(source_type) or source_type if ( (source_type_args := get_args(source_type)) and orig_source_type is os.PathLike and source_type_args[0] not in {str, bytes, Any} ): return None if orig_source_type not in { os.PathLike, pathlib.Path, pathlib.PurePath, pathlib.PosixPath, pathlib.PurePosixPath, pathlib.PureWindowsPath, }: return None metadata, remaining_annotations = _known_annotated_metadata.collect_known_metadata(annotations) _known_annotated_metadata.check_metadata(metadata, _known_annotated_metadata.STR_CONSTRAINTS, orig_source_type) is_first_arg_byte = source_type_args and source_type_args[0] is bytes construct_path = pathlib.PurePath if 
orig_source_type is os.PathLike else orig_source_type constrained_schema = ( core_schema.bytes_schema(**metadata) if is_first_arg_byte else core_schema.str_schema(**metadata) ) def path_validator(input_value: str | bytes) -> os.PathLike[Any]: # type: ignore try: if is_first_arg_byte: if isinstance(input_value, bytes): try: input_value = input_value.decode() except UnicodeDecodeError as e: raise PydanticCustomError('bytes_type', 'Input must be valid bytes') from e else: raise PydanticCustomError('bytes_type', 'Input must be bytes') elif not isinstance(input_value, str): raise PydanticCustomError('path_type', 'Input is not a valid path') return construct_path(input_value) except TypeError as e: raise PydanticCustomError('path_type', 'Input is not a valid path') from e instance_schema = core_schema.json_or_python_schema( json_schema=core_schema.no_info_after_validator_function(path_validator, constrained_schema), python_schema=core_schema.is_instance_schema(orig_source_type), ) strict: bool | None = None for annotation in annotations: if isinstance(annotation, Strict): strict = annotation.strict schema = core_schema.lax_or_strict_schema( lax_schema=core_schema.union_schema( [ instance_schema, core_schema.no_info_after_validator_function(path_validator, constrained_schema), ], custom_error_type='path_type', custom_error_message=f'Input is not a valid path for {orig_source_type}', strict=True, ), strict_schema=instance_schema, serialization=core_schema.to_string_ser_schema(), strict=strict, ) return ( orig_source_type, [ InnerSchemaValidator(schema, js_core_schema=constrained_schema, js_schema_update={'format': 'path'}), *remaining_annotations, ], ) def deque_validator( input_value: Any, handler: core_schema.ValidatorFunctionWrapHandler, maxlen: None | int ) -> collections.deque[Any]: if isinstance(input_value, collections.deque): maxlens = [v for v in (input_value.maxlen, maxlen) if v is not None] if maxlens: maxlen = min(maxlens) return 
collections.deque(handler(input_value), maxlen=maxlen) else: return collections.deque(handler(input_value), maxlen=maxlen) @dataclasses.dataclass(**slots_true) class DequeValidator: item_source_type: type[Any] metadata: dict[str, Any] def __get_pydantic_core_schema__(self, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: if _typing_extra.is_any(self.item_source_type): items_schema = None else: items_schema = handler.generate_schema(self.item_source_type) # if we have a MaxLen annotation might as well set that as the default maxlen on the deque # this lets us reuse existing metadata annotations to let users set the maxlen on a dequeue # that e.g. comes from JSON coerce_instance_wrap = partial( core_schema.no_info_wrap_validator_function, partial(deque_validator, maxlen=self.metadata.get('max_length', None)), ) # we have to use a lax list schema here, because we need to validate the deque's # items via a list schema, but it's ok if the deque itself is not a list metadata_with_strict_override = {**self.metadata, 'strict': False} constrained_schema = core_schema.list_schema(items_schema, **metadata_with_strict_override) check_instance = core_schema.json_or_python_schema( json_schema=core_schema.list_schema(), python_schema=core_schema.is_instance_schema(collections.deque), ) serialization = core_schema.wrap_serializer_function_ser_schema( serialize_sequence_via_list, schema=items_schema or core_schema.any_schema(), info_arg=True ) strict = core_schema.chain_schema([check_instance, coerce_instance_wrap(constrained_schema)]) if self.metadata.get('strict', False): schema = strict else: lax = coerce_instance_wrap(constrained_schema) schema = core_schema.lax_or_strict_schema(lax_schema=lax, strict_schema=strict) schema['serialization'] = serialization return schema def deque_schema_prepare_pydantic_annotations( source_type: Any, annotations: Iterable[Any] ) -> tuple[Any, list[Any]] | None: args = get_args(source_type) if not args: args = 
typing.cast(Tuple[Any], (Any,)) elif len(args) != 1: raise ValueError('Expected deque to have exactly 1 generic parameter') item_source_type = args[0] metadata, remaining_annotations = _known_annotated_metadata.collect_known_metadata(annotations) _known_annotated_metadata.check_metadata(metadata, _known_annotated_metadata.SEQUENCE_CONSTRAINTS, source_type) return (source_type, [DequeValidator(item_source_type, metadata), *remaining_annotations]) MAPPING_ORIGIN_MAP: dict[Any, Any] = { typing.DefaultDict: collections.defaultdict, collections.defaultdict: collections.defaultdict, collections.OrderedDict: collections.OrderedDict, typing_extensions.OrderedDict: collections.OrderedDict, dict: dict, typing.Dict: dict, collections.Counter: collections.Counter, typing.Counter: collections.Counter, # this doesn't handle subclasses of these typing.Mapping: dict, typing.MutableMapping: dict, # parametrized typing.{Mutable}Mapping creates one of these collections.abc.MutableMapping: dict, collections.abc.Mapping: dict, } def defaultdict_validator( input_value: Any, handler: core_schema.ValidatorFunctionWrapHandler, default_default_factory: Callable[[], Any] ) -> collections.defaultdict[Any, Any]: if isinstance(input_value, collections.defaultdict): default_factory = input_value.default_factory return collections.defaultdict(default_factory, handler(input_value)) else: return collections.defaultdict(default_default_factory, handler(input_value)) def get_defaultdict_default_default_factory(values_source_type: Any) -> Callable[[], Any]: def infer_default() -> Callable[[], Any]: allowed_default_types: dict[Any, Any] = { typing.Tuple: tuple, tuple: tuple, collections.abc.Sequence: tuple, collections.abc.MutableSequence: list, typing.List: list, list: list, typing.Sequence: list, typing.Set: set, set: set, typing.MutableSet: set, collections.abc.MutableSet: set, collections.abc.Set: frozenset, typing.MutableMapping: dict, typing.Mapping: dict, collections.abc.Mapping: dict, 
collections.abc.MutableMapping: dict, float: float, int: int, str: str, bool: bool, } values_type_origin = get_origin(values_source_type) or values_source_type instructions = 'set using `DefaultDict[..., Annotated[..., Field(default_factory=...)]]`' if isinstance(values_type_origin, TypeVar): def type_var_default_factory() -> None: raise RuntimeError( 'Generic defaultdict cannot be used without a concrete value type or an' ' explicit default factory, ' + instructions ) return type_var_default_factory elif values_type_origin not in allowed_default_types: # a somewhat subjective set of types that have reasonable default values allowed_msg = ', '.join([t.__name__ for t in set(allowed_default_types.values())]) raise PydanticSchemaGenerationError( f'Unable to infer a default factory for keys of type {values_source_type}.' f' Only {allowed_msg} are supported, other types require an explicit default factory' ' ' + instructions ) return allowed_default_types[values_type_origin] # Assume Annotated[..., Field(...)] if _typing_extra.is_annotated(values_source_type): field_info = next((v for v in get_args(values_source_type) if isinstance(v, FieldInfo)), None) else: field_info = None if field_info and field_info.default_factory: # Assume the default factory does not take any argument: default_default_factory = cast(Callable[[], Any], field_info.default_factory) else: default_default_factory = infer_default() return default_default_factory @dataclasses.dataclass(**slots_true) class MappingValidator: mapped_origin: type[Any] keys_source_type: type[Any] values_source_type: type[Any] min_length: int | None = None max_length: int | None = None strict: bool = False def serialize_mapping_via_dict(self, v: Any, handler: core_schema.SerializerFunctionWrapHandler) -> Any: return handler(v) def __get_pydantic_core_schema__(self, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: if _typing_extra.is_any(self.keys_source_type): keys_schema = None else: keys_schema = 
handler.generate_schema(self.keys_source_type) if _typing_extra.is_any(self.values_source_type): values_schema = None else: values_schema = handler.generate_schema(self.values_source_type) metadata = {'min_length': self.min_length, 'max_length': self.max_length, 'strict': self.strict} if self.mapped_origin is dict: schema = core_schema.dict_schema(keys_schema, values_schema, **metadata) else: constrained_schema = core_schema.dict_schema(keys_schema, values_schema, **metadata) check_instance = core_schema.json_or_python_schema( json_schema=core_schema.dict_schema(), python_schema=core_schema.is_instance_schema(self.mapped_origin), ) if self.mapped_origin is collections.defaultdict: default_default_factory = get_defaultdict_default_default_factory(self.values_source_type) coerce_instance_wrap = partial( core_schema.no_info_wrap_validator_function, partial(defaultdict_validator, default_default_factory=default_default_factory), ) else: coerce_instance_wrap = partial(core_schema.no_info_after_validator_function, self.mapped_origin) serialization = core_schema.wrap_serializer_function_ser_schema( self.serialize_mapping_via_dict, schema=core_schema.dict_schema( keys_schema or core_schema.any_schema(), values_schema or core_schema.any_schema() ), info_arg=False, ) strict = core_schema.chain_schema([check_instance, coerce_instance_wrap(constrained_schema)]) if metadata.get('strict', False): schema = strict else: lax = coerce_instance_wrap(constrained_schema) schema = core_schema.lax_or_strict_schema(lax_schema=lax, strict_schema=strict) schema['serialization'] = serialization return schema def mapping_like_prepare_pydantic_annotations( source_type: Any, annotations: Iterable[Any] ) -> tuple[Any, list[Any]] | None: origin: Any = get_origin(source_type) mapped_origin = MAPPING_ORIGIN_MAP.get(origin, None) if origin else MAPPING_ORIGIN_MAP.get(source_type, None) if mapped_origin is None: return None args = get_args(source_type) if not args: args = typing.cast(Tuple[Any, Any], 
(Any, Any)) elif mapped_origin is collections.Counter: # a single generic if len(args) != 1: raise ValueError('Expected Counter to have exactly 1 generic parameter') args = (args[0], int) # values are always an int elif len(args) != 2: raise ValueError('Expected mapping to have exactly 2 generic parameters') keys_source_type, values_source_type = args metadata, remaining_annotations = _known_annotated_metadata.collect_known_metadata(annotations) _known_annotated_metadata.check_metadata(metadata, _known_annotated_metadata.SEQUENCE_CONSTRAINTS, source_type) return ( source_type, [ MappingValidator(mapped_origin, keys_source_type, values_source_type, **metadata), *remaining_annotations, ], )
pydantic-2.10.6/pydantic/_internal/_typing_extra.py
"""Logic for interacting with type annotations, mostly extensions, shims and hacks to wrap Python's typing module.""" from __future__ import annotations import collections.abc import re import sys import types import typing import warnings from functools import lru_cache, partial from typing import TYPE_CHECKING, Any, Callable import typing_extensions from typing_extensions import TypeIs, deprecated, get_args, get_origin from ._namespace_utils import GlobalsNamespace, MappingNamespace, NsResolver, get_module_ns_of if sys.version_info < (3, 10): NoneType = type(None) EllipsisType = type(Ellipsis) else: from types import EllipsisType as EllipsisType from types import NoneType as NoneType if TYPE_CHECKING: from pydantic import BaseModel # See https://typing-extensions.readthedocs.io/en/latest/#runtime-use-of-types: @lru_cache(maxsize=None) def _get_typing_objects_by_name_of(name: str) -> tuple[Any, ...]: """Get the member named `name` from both `typing` and `typing-extensions` (if it exists).""" result = tuple(getattr(module, name) for module in (typing, typing_extensions) if hasattr(module, name)) if not result: raise ValueError(f'Neither `typing` nor
`typing_extensions` has an object called {name!r}') return result # As suggested by the `typing-extensions` documentation, we could apply caching to this method, # but it doesn't seem to improve performance. This also requires `obj` to be hashable, which # might not be always the case: def _is_typing_name(obj: object, name: str) -> bool: """Return whether `obj` is the member of the typing modules (includes the `typing-extensions` one) named `name`.""" # Using `any()` is slower: for thing in _get_typing_objects_by_name_of(name): if obj is thing: return True return False def is_any(tp: Any, /) -> bool: """Return whether the provided argument is the `Any` special form. ```python {test="skip" lint="skip"} is_any(Any) #> True ``` """ return _is_typing_name(tp, name='Any') def is_union(tp: Any, /) -> bool: """Return whether the provided argument is a `Union` special form. ```python {test="skip" lint="skip"} is_union(Union[int, str]) #> True is_union(int | str) #> False ``` """ return _is_typing_name(get_origin(tp), name='Union') def is_literal(tp: Any, /) -> bool: """Return whether the provided argument is a `Literal` special form. ```python {test="skip" lint="skip"} is_literal(Literal[42]) #> True ``` """ return _is_typing_name(get_origin(tp), name='Literal') # TODO remove and replace with `get_args` when we drop support for Python 3.8 # (see https://docs.python.org/3/whatsnew/3.9.html#id4). def literal_values(tp: Any, /) -> list[Any]: """Return the values contained in the provided `Literal` special form.""" if not is_literal(tp): return [tp] values = get_args(tp) return [x for value in values for x in literal_values(value)] def is_annotated(tp: Any, /) -> bool: """Return whether the provided argument is a `Annotated` special form. 
```python {test="skip" lint="skip"} is_annotated(Annotated[int, ...]) #> True ``` """ return _is_typing_name(get_origin(tp), name='Annotated') def annotated_type(tp: Any, /) -> Any | None: """Return the type of the `Annotated` special form, or `None`.""" return get_args(tp)[0] if is_annotated(tp) else None def is_unpack(tp: Any, /) -> bool: """Return whether the provided argument is a `Unpack` special form. ```python {test="skip" lint="skip"} is_unpack(Unpack[Ts]) #> True ``` """ return _is_typing_name(get_origin(tp), name='Unpack') def unpack_type(tp: Any, /) -> Any | None: """Return the type wrapped by the `Unpack` special form, or `None`.""" return get_args(tp)[0] if is_unpack(tp) else None def is_self(tp: Any, /) -> bool: """Return whether the provided argument is the `Self` special form. ```python {test="skip" lint="skip"} is_self(Self) #> True ``` """ return _is_typing_name(tp, name='Self') def is_new_type(tp: Any, /) -> bool: """Return whether the provided argument is a `NewType`. ```python {test="skip" lint="skip"} is_new_type(NewType('MyInt', int)) #> True ``` """ if sys.version_info < (3, 10): # On Python < 3.10, `typing.NewType` is a function return hasattr(tp, '__supertype__') else: return _is_typing_name(type(tp), name='NewType') def is_hashable(tp: Any, /) -> bool: """Return whether the provided argument is the `Hashable` class. ```python {test="skip" lint="skip"} is_hashable(Hashable) #> True ``` """ # `get_origin` is documented as normalizing any typing-module aliases to `collections` classes, # hence the second check: return tp is collections.abc.Hashable or get_origin(tp) is collections.abc.Hashable def is_callable(tp: Any, /) -> bool: """Return whether the provided argument is a `Callable`, parametrized or not. 
```python {test="skip" lint="skip"} is_callable(Callable[[int], str]) #> True is_callable(typing.Callable) #> True is_callable(collections.abc.Callable) #> True ``` """ # `get_origin` is documented as normalizing any typing-module aliases to `collections` classes, # hence the second check: return tp is collections.abc.Callable or get_origin(tp) is collections.abc.Callable _PARAMSPEC_TYPES: tuple[type[typing_extensions.ParamSpec], ...] = (typing_extensions.ParamSpec,) if sys.version_info >= (3, 10): _PARAMSPEC_TYPES = (*_PARAMSPEC_TYPES, typing.ParamSpec) # pyright: ignore[reportAssignmentType] def is_paramspec(tp: Any, /) -> bool: """Return whether the provided argument is a `ParamSpec`. ```python {test="skip" lint="skip"} P = ParamSpec('P') is_paramspec(P) #> True ``` """ return isinstance(tp, _PARAMSPEC_TYPES) _TYPE_ALIAS_TYPES: tuple[type[typing_extensions.TypeAliasType], ...] = (typing_extensions.TypeAliasType,) if sys.version_info >= (3, 12): _TYPE_ALIAS_TYPES = (*_TYPE_ALIAS_TYPES, typing.TypeAliasType) def is_type_alias_type(tp: Any, /) -> TypeIs[typing_extensions.TypeAliasType]: """Return whether the provided argument is an instance of `TypeAliasType`. ```python {test="skip" lint="skip"} type Int = int is_type_alias_type(Int) #> True Str = TypeAliasType('Str', str) is_type_alias_type(Str) #> True ``` """ return isinstance(tp, _TYPE_ALIAS_TYPES) def is_classvar(tp: Any, /) -> bool: """Return whether the provided argument is a `ClassVar` special form, parametrized or not. Note that in most cases, you will want to use the `is_classvar_annotation` function, which is used to check if an annotation (in the context of a Pydantic model or dataclass) should be treated as being a class variable. 
```python {test="skip" lint="skip"} is_classvar(ClassVar[int]) #> True is_classvar(ClassVar) #> True """ # ClassVar is not necessarily parametrized: return _is_typing_name(tp, name='ClassVar') or _is_typing_name(get_origin(tp), name='ClassVar') _classvar_re = re.compile(r'((\w+\.)?Annotated\[)?(\w+\.)?ClassVar\[') def is_classvar_annotation(tp: Any, /) -> bool: """Return whether the provided argument represents a class variable annotation. Although not explicitly stated by the typing specification, `ClassVar` can be used inside `Annotated` and as such, this function checks for this specific scenario. Because this function is used to detect class variables before evaluating forward references (or because evaluation failed), we also implement a naive regex match implementation. This is required because class variables are inspected before fields are collected, so we try to be as accurate as possible. """ if is_classvar(tp) or (anntp := annotated_type(tp)) is not None and is_classvar(anntp): return True str_ann: str | None = None if isinstance(tp, typing.ForwardRef): str_ann = tp.__forward_arg__ if isinstance(tp, str): str_ann = tp if str_ann is not None and _classvar_re.match(str_ann): # stdlib dataclasses do something similar, although a bit more advanced # (see `dataclass._is_type`). return True return False # TODO implement `is_finalvar_annotation` as Final can be wrapped with other special forms: def is_finalvar(tp: Any, /) -> bool: """Return whether the provided argument is a `Final` special form, parametrized or not. ```python {test="skip" lint="skip"} is_finalvar(Final[int]) #> True is_finalvar(Final) #> True """ # Final is not necessarily parametrized: return _is_typing_name(tp, name='Final') or _is_typing_name(get_origin(tp), name='Final') def is_required(tp: Any, /) -> bool: """Return whether the provided argument is a `Required` special form. 
```python {test="skip" lint="skip"} is_required(Required[int]) #> True """ return _is_typing_name(get_origin(tp), name='Required') def is_not_required(tp: Any, /) -> bool: """Return whether the provided argument is a `NotRequired` special form. ```python {test="skip" lint="skip"} is_required(Required[int]) #> True """ return _is_typing_name(get_origin(tp), name='NotRequired') def is_no_return(tp: Any, /) -> bool: """Return whether the provided argument is the `NoReturn` special form. ```python {test="skip" lint="skip"} is_no_return(NoReturn) #> True ``` """ return _is_typing_name(tp, name='NoReturn') def is_never(tp: Any, /) -> bool: """Return whether the provided argument is the `Never` special form. ```python {test="skip" lint="skip"} is_never(Never) #> True ``` """ return _is_typing_name(tp, name='Never') _DEPRECATED_TYPES: tuple[type[typing_extensions.deprecated], ...] = (typing_extensions.deprecated,) if hasattr(warnings, 'deprecated'): _DEPRECATED_TYPES = (*_DEPRECATED_TYPES, warnings.deprecated) # pyright: ignore[reportAttributeAccessIssue] def is_deprecated_instance(obj: Any, /) -> TypeIs[deprecated]: """Return whether the argument is an instance of the `warnings.deprecated` class or the `typing_extensions` backport.""" return isinstance(obj, _DEPRECATED_TYPES) _NONE_TYPES: tuple[Any, ...] = (None, NoneType, typing.Literal[None], typing_extensions.Literal[None]) def is_none_type(tp: Any, /) -> bool: """Return whether the argument represents the `None` type as part of an annotation. ```python {test="skip" lint="skip"} is_none_type(None) #> True is_none_type(NoneType) #> True is_none_type(Literal[None]) #> True is_none_type(type[None]) #> False """ return tp in _NONE_TYPES def is_namedtuple(tp: Any, /) -> bool: """Return whether the provided argument is a named tuple class. The class can be created using `typing.NamedTuple` or `collections.namedtuple`. Parametrized generic classes are *not* assumed to be named tuples. 
""" from ._utils import lenient_issubclass # circ. import return lenient_issubclass(tp, tuple) and hasattr(tp, '_fields') if sys.version_info < (3, 9): def is_zoneinfo_type(tp: Any, /) -> bool: """Return whether the provided argument is the `zoneinfo.ZoneInfo` type.""" return False else: from zoneinfo import ZoneInfo def is_zoneinfo_type(tp: Any, /) -> TypeIs[type[ZoneInfo]]: """Return whether the provided argument is the `zoneinfo.ZoneInfo` type.""" return tp is ZoneInfo if sys.version_info < (3, 10): def origin_is_union(tp: Any, /) -> bool: """Return whether the provided argument is the `Union` special form.""" return _is_typing_name(tp, name='Union') def is_generic_alias(type_: type[Any]) -> bool: return isinstance(type_, typing._GenericAlias) # pyright: ignore[reportAttributeAccessIssue] else: def origin_is_union(tp: Any, /) -> bool: """Return whether the provided argument is the `Union` special form or the `UnionType`.""" return _is_typing_name(tp, name='Union') or tp is types.UnionType def is_generic_alias(tp: Any, /) -> bool: return isinstance(tp, (types.GenericAlias, typing._GenericAlias)) # pyright: ignore[reportAttributeAccessIssue] # TODO: Ideally, we should avoid relying on the private `typing` constructs: if sys.version_info < (3, 9): WithArgsTypes: tuple[Any, ...] = (typing._GenericAlias,) # pyright: ignore[reportAttributeAccessIssue] elif sys.version_info < (3, 10): WithArgsTypes: tuple[Any, ...] = (typing._GenericAlias, types.GenericAlias) # pyright: ignore[reportAttributeAccessIssue] else: WithArgsTypes: tuple[Any, ...] 
= (typing._GenericAlias, types.GenericAlias, types.UnionType) # pyright: ignore[reportAttributeAccessIssue] # Similarly, we shouldn't rely on this `_Final` class, which is even more private than `_GenericAlias`: typing_base: Any = typing._Final # pyright: ignore[reportAttributeAccessIssue] ### Annotation evaluations functions: def parent_frame_namespace(*, parent_depth: int = 2, force: bool = False) -> dict[str, Any] | None: """We allow use of items in parent namespace to get around the issue with `get_type_hints` only looking in the global module namespace. See https://github.com/pydantic/pydantic/issues/2678#issuecomment-1008139014 -> Scope and suggestion at the end of the next comment by @gvanrossum. WARNING 1: it matters exactly where this is called. By default, this function will build a namespace from the parent of where it is called. WARNING 2: this only looks in the parent namespace, not other parents since (AFAIK) there's no way to collect a dict of exactly what's in scope. Using `f_back` would work sometimes but would be very wrong and confusing in many other cases. See https://discuss.python.org/t/is-there-a-way-to-access-parent-nested-namespaces/20659. There are some cases where we want to force fetching the parent namespace, ex: during a `model_rebuild` call. In this case, we want both the namespace of the class' module, if applicable, and the parent namespace of the module where the rebuild is called. In other cases, like during initial schema build, if a class is defined at the top module level, we don't need to fetch that module's namespace, because the class' __module__ attribute can be used to access the parent namespace. This is done in `_namespace_utils.get_module_ns_of`. Thus, there's no need to cache the parent frame namespace in this case. 
""" frame = sys._getframe(parent_depth) # note, we don't copy frame.f_locals here (or during the last return call), because we don't expect the namespace to be modified down the line # if this becomes a problem, we could implement some sort of frozen mapping structure to enforce this if force: return frame.f_locals # if either of the following conditions are true, the class is defined at the top module level # to better understand why we need both of these checks, see # https://github.com/pydantic/pydantic/pull/10113#discussion_r1714981531 if frame.f_back is None or frame.f_code.co_name == '': return None return frame.f_locals def _type_convert(arg: Any) -> Any: """Convert `None` to `NoneType` and strings to `ForwardRef` instances. This is a backport of the private `typing._type_convert` function. When evaluating a type, `ForwardRef._evaluate` ends up being called, and is responsible for making this conversion. However, we still have to apply it for the first argument passed to our type evaluation functions, similarly to the `typing.get_type_hints` function. """ if arg is None: return NoneType if isinstance(arg, str): # Like `typing.get_type_hints`, assume the arg can be in any context, # hence the proper `is_argument` and `is_class` args: return _make_forward_ref(arg, is_argument=False, is_class=True) return arg def get_model_type_hints( obj: type[BaseModel], *, ns_resolver: NsResolver | None = None, ) -> dict[str, tuple[Any, bool]]: """Collect annotations from a Pydantic model class, including those from parent classes. Args: obj: The Pydantic model to inspect. ns_resolver: A namespace resolver instance to use. Defaults to an empty instance. Returns: A dictionary mapping annotation names to a two-tuple: the first element is the evaluated type or the original annotation if a `NameError` occurred, the second element is a boolean indicating if whether the evaluation succeeded. 
""" hints: dict[str, Any] | dict[str, tuple[Any, bool]] = {} ns_resolver = ns_resolver or NsResolver() for base in reversed(obj.__mro__): ann: dict[str, Any] | None = base.__dict__.get('__annotations__') if not ann or isinstance(ann, types.GetSetDescriptorType): continue with ns_resolver.push(base): globalns, localns = ns_resolver.types_namespace for name, value in ann.items(): if name.startswith('_'): # For private attributes, we only need the annotation to detect the `ClassVar` special form. # For this reason, we still try to evaluate it, but we also catch any possible exception (on # top of the `NameError`s caught in `try_eval_type`) that could happen so that users are free # to use any kind of forward annotation for private fields (e.g. circular imports, new typing # syntax, etc). try: hints[name] = try_eval_type(value, globalns, localns) except Exception: hints[name] = (value, False) else: hints[name] = try_eval_type(value, globalns, localns) return hints def get_cls_type_hints( obj: type[Any], *, ns_resolver: NsResolver | None = None, ) -> dict[str, Any]: """Collect annotations from a class, including those from parent classes. Args: obj: The class to inspect. ns_resolver: A namespace resolver instance to use. Defaults to an empty instance. """ hints: dict[str, Any] | dict[str, tuple[Any, bool]] = {} ns_resolver = ns_resolver or NsResolver() for base in reversed(obj.__mro__): ann: dict[str, Any] | None = base.__dict__.get('__annotations__') if not ann or isinstance(ann, types.GetSetDescriptorType): continue with ns_resolver.push(base): globalns, localns = ns_resolver.types_namespace for name, value in ann.items(): hints[name] = eval_type(value, globalns, localns) return hints def try_eval_type( value: Any, globalns: GlobalsNamespace | None = None, localns: MappingNamespace | None = None, ) -> tuple[Any, bool]: """Try evaluating the annotation using the provided namespaces. Args: value: The value to evaluate. If `None`, it will be replaced by `type[None]`. 
If an instance of `str`, it will be converted to a `ForwardRef`. localns: The global namespace to use during annotation evaluation. globalns: The local namespace to use during annotation evaluation. Returns: A two-tuple containing the possibly evaluated type and a boolean indicating whether the evaluation succeeded or not. """ value = _type_convert(value) try: return eval_type_backport(value, globalns, localns), True except NameError: return value, False def eval_type( value: Any, globalns: GlobalsNamespace | None = None, localns: MappingNamespace | None = None, ) -> Any: """Evaluate the annotation using the provided namespaces. Args: value: The value to evaluate. If `None`, it will be replaced by `type[None]`. If an instance of `str`, it will be converted to a `ForwardRef`. localns: The global namespace to use during annotation evaluation. globalns: The local namespace to use during annotation evaluation. """ value = _type_convert(value) return eval_type_backport(value, globalns, localns) @deprecated( '`eval_type_lenient` is deprecated, use `try_eval_type` instead.', category=None, ) def eval_type_lenient( value: Any, globalns: GlobalsNamespace | None = None, localns: MappingNamespace | None = None, ) -> Any: ev, _ = try_eval_type(value, globalns, localns) return ev def eval_type_backport( value: Any, globalns: GlobalsNamespace | None = None, localns: MappingNamespace | None = None, type_params: tuple[Any, ...] | None = None, ) -> Any: """An enhanced version of `typing._eval_type` which will fall back to using the `eval_type_backport` package if it's installed to let older Python versions use newer typing constructs. Specifically, this transforms `X | Y` into `typing.Union[X, Y]` and `list[X]` into `typing.List[X]` (as well as all the types made generic in PEP 585) if the original syntax is not supported in the current Python version. This function will also display a helpful error if the value passed fails to evaluate. 
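
    The graceful-degradation idea behind `try_eval_type` above can be shown with plain
    `typing.get_type_hints` (the helper and function names below are illustrative only):

    ```python
    import typing

    def try_hints(func):
        # evaluate a function's string annotations, falling back to the raw
        # __annotations__ mapping when a name cannot be resolved yet -- the same
        # "return (value, succeeded)" shape `try_eval_type` uses
        try:
            return typing.get_type_hints(func), True
        except NameError:
            return dict(func.__annotations__), False

    def resolvable(x: 'int') -> 'str': ...
    def unresolvable(x: 'NotDefinedYet') -> None: ...

    assert try_hints(resolvable) == ({'x': int, 'return': str}, True)
    hints, ok = try_hints(unresolvable)
    assert not ok and hints['x'] == 'NotDefinedYet'
    ```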
""" try: return _eval_type_backport(value, globalns, localns, type_params) except TypeError as e: if 'Unable to evaluate type annotation' in str(e): raise # If it is a `TypeError` and value isn't a `ForwardRef`, it would have failed during annotation definition. # Thus we assert here for type checking purposes: assert isinstance(value, typing.ForwardRef) message = f'Unable to evaluate type annotation {value.__forward_arg__!r}.' if sys.version_info >= (3, 11): e.add_note(message) raise else: raise TypeError(message) from e def _eval_type_backport( value: Any, globalns: GlobalsNamespace | None = None, localns: MappingNamespace | None = None, type_params: tuple[Any, ...] | None = None, ) -> Any: try: return _eval_type(value, globalns, localns, type_params) except TypeError as e: if not (isinstance(value, typing.ForwardRef) and is_backport_fixable_error(e)): raise try: from eval_type_backport import eval_type_backport except ImportError: raise TypeError( f'Unable to evaluate type annotation {value.__forward_arg__!r}. If you are making use ' 'of the new typing syntax (unions using `|` since Python 3.10 or builtins subscripting ' 'since Python 3.9), you should either replace the use of new syntax with the existing ' '`typing` constructs or install the `eval_type_backport` package.' ) from e return eval_type_backport( value, globalns, localns, # pyright: ignore[reportArgumentType], waiting on a new `eval_type_backport` release. try_default=False, ) def _eval_type( value: Any, globalns: GlobalsNamespace | None = None, localns: MappingNamespace | None = None, type_params: tuple[Any, ...] 
| None = None, ) -> Any: if sys.version_info >= (3, 13): return typing._eval_type( # type: ignore value, globalns, localns, type_params=type_params ) else: return typing._eval_type( # type: ignore value, globalns, localns ) def is_backport_fixable_error(e: TypeError) -> bool: msg = str(e) return ( sys.version_info < (3, 10) and msg.startswith('unsupported operand type(s) for |: ') or sys.version_info < (3, 9) and "' object is not subscriptable" in msg ) def get_function_type_hints( function: Callable[..., Any], *, include_keys: set[str] | None = None, globalns: GlobalsNamespace | None = None, localns: MappingNamespace | None = None, ) -> dict[str, Any]: """Return type hints for a function. This is similar to the `typing.get_type_hints` function, with a few differences: - Support `functools.partial` by using the underlying `func` attribute. - If `function` happens to be a built-in type (e.g. `int`), assume it doesn't have annotations but specify the `return` key as being the actual type. - Do not wrap type annotation of a parameter with `Optional` if it has a default value of `None` (related bug: https://github.com/python/cpython/issues/90353, only fixed in 3.11+). """ try: if isinstance(function, partial): annotations = function.func.__annotations__ else: annotations = function.__annotations__ except AttributeError: type_hints = get_type_hints(function) if isinstance(function, type): # `type[...]` is a callable, which returns an instance of itself. # At some point, we might even look into the return type of `__new__` # if it returns something else. type_hints.setdefault('return', function) return type_hints if globalns is None: globalns = get_module_ns_of(function) type_params: tuple[Any, ...] | None = None if localns is None: # If localns was specified, it is assumed to already contain type params. This is because # Pydantic has more advanced logic to do so (see `_namespace_utils.ns_for_function`). 
        type_params = getattr(function, '__type_params__', ())

    type_hints = {}
    for name, value in annotations.items():
        if include_keys is not None and name not in include_keys:
            continue
        if value is None:
            value = NoneType
        elif isinstance(value, str):
            value = _make_forward_ref(value)

        type_hints[name] = eval_type_backport(value, globalns, localns, type_params)

    return type_hints


if sys.version_info < (3, 9, 8) or (3, 10) <= sys.version_info < (3, 10, 1):

    def _make_forward_ref(
        arg: Any,
        is_argument: bool = True,
        *,
        is_class: bool = False,
    ) -> typing.ForwardRef:
        """Wrapper for ForwardRef that accounts for the `is_class` argument missing in older versions.

        The `module` argument is omitted as it breaks <3.9.8, =3.10.0 and isn't used in the calls below.

        See https://github.com/python/cpython/pull/28560 for some background.
        The backport happened on 3.9.8, see:
        https://github.com/pydantic/pydantic/discussions/6244#discussioncomment-6275458,
        and on 3.10.1 for the 3.10 branch, see:
        https://github.com/pydantic/pydantic/issues/6912

        Implemented as EAFP with memory.
        """
        return typing.ForwardRef(arg, is_argument)

else:
    _make_forward_ref = typing.ForwardRef


if sys.version_info >= (3, 10):
    get_type_hints = typing.get_type_hints

else:
    """
    For older versions of python, we have a custom implementation of `get_type_hints` which
    is as close as possible to the implementation in CPython 3.10.8.
    """

    @typing.no_type_check
    def get_type_hints(  # noqa: C901
        obj: Any,
        globalns: dict[str, Any] | None = None,
        localns: dict[str, Any] | None = None,
        include_extras: bool = False,
    ) -> dict[str, Any]:  # pragma: no cover
        """Taken verbatim from python 3.10.8 unchanged, except:
        * type annotations of the function definition above.
        * prefixing `typing.` where appropriate
        * Use `_make_forward_ref` instead of `typing.ForwardRef` to handle the `is_class` argument.

        https://github.com/python/cpython/blob/aaaf5174241496afca7ce4d4584570190ff972fe/Lib/typing.py#L1773-L1875

        DO NOT CHANGE THIS METHOD UNLESS ABSOLUTELY NECESSARY.
        ======================================================

        Return type hints for an object.

        This is often the same as obj.__annotations__, but it handles
        forward references encoded as string literals, adds Optional[t] if a
        default value equal to None is set and recursively replaces all
        'Annotated[T, ...]' with 'T' (unless 'include_extras=True').

        The argument may be a module, class, method, or function. The annotations
        are returned as a dictionary. For classes, annotations include also
        inherited members.

        TypeError is raised if the argument is not of a type that can contain
        annotations, and an empty dictionary is returned if no annotations are
        present.

        BEWARE -- the behavior of globalns and localns is counterintuitive
        (unless you are familiar with how eval() and exec() work).  The
        search order is locals first, then globals.

        - If no dict arguments are passed, an attempt is made
          to use the globals from obj (or the respective module's globals for
          classes), and these are also used as the locals.  If the object does
          not appear to have globals, an empty dictionary is used.  For classes,
          the search order is globals first then locals.

        - If one dict argument is passed, it is used for both globals and
          locals.

        - If two dict arguments are passed, they specify globals and
          locals, respectively.
        """
        if getattr(obj, '__no_type_check__', None):
            return {}
        # Classes require a special treatment.
        if isinstance(obj, type):
            hints = {}
            for base in reversed(obj.__mro__):
                if globalns is None:
                    base_globals = getattr(sys.modules.get(base.__module__, None), '__dict__', {})
                else:
                    base_globals = globalns
                ann = base.__dict__.get('__annotations__', {})
                if isinstance(ann, types.GetSetDescriptorType):
                    ann = {}
                base_locals = dict(vars(base)) if localns is None else localns
                if localns is None and globalns is None:
                    # This is surprising, but required.  Before Python 3.10,
                    # get_type_hints only evaluated the globalns of
                    # a class. To maintain backwards compatibility, we reverse
                    # the globalns and localns order so that eval() looks into
                    # *base_globals* first rather than *base_locals*.
                    # This only affects ForwardRefs.
                    base_globals, base_locals = base_locals, base_globals
                for name, value in ann.items():
                    if value is None:
                        value = type(None)
                    if isinstance(value, str):
                        value = _make_forward_ref(value, is_argument=False, is_class=True)

                    value = eval_type_backport(value, base_globals, base_locals)
                    hints[name] = value

            if not include_extras and hasattr(typing, '_strip_annotations'):
                return {
                    k: typing._strip_annotations(t)  # type: ignore
                    for k, t in hints.items()
                }
            else:
                return hints

        if globalns is None:
            if isinstance(obj, types.ModuleType):
                globalns = obj.__dict__
            else:
                nsobj = obj
                # Find globalns for the unwrapped object.
                while hasattr(nsobj, '__wrapped__'):
                    nsobj = nsobj.__wrapped__
                globalns = getattr(nsobj, '__globals__', {})
            if localns is None:
                localns = globalns
        elif localns is None:
            localns = globalns
        hints = getattr(obj, '__annotations__', None)
        if hints is None:
            # Return empty annotations for something that _could_ have them.
            if isinstance(obj, typing._allowed_types):  # type: ignore
                return {}
            else:
                raise TypeError(f'{obj!r} is not a module, class, method, ' 'or function.')
        defaults = typing._get_defaults(obj)  # type: ignore
        hints = dict(hints)
        for name, value in hints.items():
            if value is None:
                value = type(None)
            if isinstance(value, str):
                # class-level forward refs were handled above, this must be either
                # a module-level annotation or a function argument annotation
                value = _make_forward_ref(
                    value,
                    is_argument=not isinstance(obj, types.ModuleType),
                    is_class=False,
                )
            value = eval_type_backport(value, globalns, localns)
            if name in defaults and defaults[name] is None:
                value = typing.Optional[value]
            hints[name] = value
        return hints if include_extras else {k: typing._strip_annotations(t) for k, t in hints.items()}  # type: ignore


# ---- pydantic/_internal/_utils.py ----

"""Bucket of reusable internal utilities.

This should be reduced as much as possible with functions only used in one place, moved to that place.
"""

from __future__ import annotations as _annotations

import dataclasses
import keyword
import typing
import weakref
from collections import OrderedDict, defaultdict, deque
from copy import deepcopy
from functools import cached_property
from inspect import Parameter
from itertools import zip_longest
from types import BuiltinFunctionType, CodeType, FunctionType, GeneratorType, LambdaType, ModuleType
from typing import Any, Callable, Mapping, TypeVar

from typing_extensions import TypeAlias, TypeGuard

from . import _repr, _typing_extra
from ._import_utils import import_cached_base_model

if typing.TYPE_CHECKING:
    MappingIntStrAny: TypeAlias = 'typing.Mapping[int, Any] | typing.Mapping[str, Any]'
    AbstractSetIntStr: TypeAlias = 'typing.AbstractSet[int] | typing.AbstractSet[str]'
    from ..main import BaseModel

# these are types that are returned unchanged by deepcopy
IMMUTABLE_NON_COLLECTIONS_TYPES: set[type[Any]] = {
    int,
    float,
    complex,
    str,
    bool,
    bytes,
    type,
    _typing_extra.NoneType,
    FunctionType,
    BuiltinFunctionType,
    LambdaType,
    weakref.ref,
    CodeType,
    # note: including ModuleType will differ from behaviour of deepcopy by not producing error.
    # It might be not a good idea in general, but considering that this function used only internally
    # against default values of fields, this will allow to actually have a field with module as default value
    ModuleType,
    NotImplemented.__class__,
    Ellipsis.__class__,
}

# these are types that if empty, might be copied with simple copy() instead of deepcopy()
BUILTIN_COLLECTIONS: set[type[Any]] = {
    list,
    set,
    tuple,
    frozenset,
    dict,
    OrderedDict,
    defaultdict,
    deque,
}


def can_be_positional(param: Parameter) -> bool:
    """Return whether the parameter accepts a positional argument.

    ```python {test="skip" lint="skip"}
    def func(a, /, b, *, c):
        pass

    params = inspect.signature(func).parameters
    can_be_positional(params['a'])
    #> True
    can_be_positional(params['b'])
    #> True
    can_be_positional(params['c'])
    #> False
    ```
    """
    return param.kind in (Parameter.POSITIONAL_ONLY, Parameter.POSITIONAL_OR_KEYWORD)


def sequence_like(v: Any) -> bool:
    return isinstance(v, (list, tuple, set, frozenset, GeneratorType, deque))


def lenient_isinstance(o: Any, class_or_tuple: type[Any] | tuple[type[Any], ...] | None) -> bool:  # pragma: no cover
    try:
        return isinstance(o, class_or_tuple)  # type: ignore[arg-type]
    except TypeError:
        return False


def lenient_issubclass(cls: Any, class_or_tuple: Any) -> bool:  # pragma: no cover
    try:
        return isinstance(cls, type) and issubclass(cls, class_or_tuple)
    except TypeError:
        if isinstance(cls, _typing_extra.WithArgsTypes):
            return False
        raise  # pragma: no cover


def is_model_class(cls: Any) -> TypeGuard[type[BaseModel]]:
    """Returns true if cls is a _proper_ subclass of BaseModel, and provides proper type-checking,
    unlike raw calls to lenient_issubclass.
    """
    BaseModel = import_cached_base_model()

    return lenient_issubclass(cls, BaseModel) and cls is not BaseModel


def is_valid_identifier(identifier: str) -> bool:
    """Checks that a string is a valid identifier and not a Python keyword.

    :param identifier: The identifier to test.
    :return: True if the identifier is valid.
    """
    return identifier.isidentifier() and not keyword.iskeyword(identifier)


KeyType = TypeVar('KeyType')


def deep_update(mapping: dict[KeyType, Any], *updating_mappings: dict[KeyType, Any]) -> dict[KeyType, Any]:
    updated_mapping = mapping.copy()
    for updating_mapping in updating_mappings:
        for k, v in updating_mapping.items():
            if k in updated_mapping and isinstance(updated_mapping[k], dict) and isinstance(v, dict):
                updated_mapping[k] = deep_update(updated_mapping[k], v)
            else:
                updated_mapping[k] = v
    return updated_mapping


def update_not_none(mapping: dict[Any, Any], **update: Any) -> None:
    mapping.update({k: v for k, v in update.items() if v is not None})


T = TypeVar('T')


def unique_list(
    input_list: list[T] | tuple[T, ...],
    *,
    name_factory: typing.Callable[[T], str] = str,
) -> list[T]:
    """Make a list unique while maintaining order.
    We update the list if another one with the same name is set
    (e.g. model validator overridden in subclass).
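
    For example, an illustrative re-implementation of this de-duplication logic
    (the names below are not part of the module):

    ```python
    def unique_by_name(items, *, name_factory=str):
        # keep first-seen order, but let a later item with the same name
        # replace the earlier one in place
        result, names = [], []
        for v in items:
            name = name_factory(v)
            if name not in names:
                names.append(name)
                result.append(v)
            else:
                result[names.index(name)] = v
        return result

    validators = [('check_a', 'base'), ('check_b', 'base'), ('check_a', 'subclass')]
    # the subclass override replaces the base entry, position preserved
    assert unique_by_name(validators, name_factory=lambda v: v[0]) == [
        ('check_a', 'subclass'),
        ('check_b', 'base'),
    ]
    ```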
""" result: list[T] = [] result_names: list[str] = [] for v in input_list: v_name = name_factory(v) if v_name not in result_names: result_names.append(v_name) result.append(v) else: result[result_names.index(v_name)] = v return result class ValueItems(_repr.Representation): """Class for more convenient calculation of excluded or included fields on values.""" __slots__ = ('_items', '_type') def __init__(self, value: Any, items: AbstractSetIntStr | MappingIntStrAny) -> None: items = self._coerce_items(items) if isinstance(value, (list, tuple)): items = self._normalize_indexes(items, len(value)) # type: ignore self._items: MappingIntStrAny = items # type: ignore def is_excluded(self, item: Any) -> bool: """Check if item is fully excluded. :param item: key or index of a value """ return self.is_true(self._items.get(item)) def is_included(self, item: Any) -> bool: """Check if value is contained in self._items. :param item: key or index of value """ return item in self._items def for_element(self, e: int | str) -> AbstractSetIntStr | MappingIntStrAny | None: """:param e: key or index of element on value :return: raw values for element if self._items is dict and contain needed element """ item = self._items.get(e) # type: ignore return item if not self.is_true(item) else None def _normalize_indexes(self, items: MappingIntStrAny, v_length: int) -> dict[int | str, Any]: """:param items: dict or set of indexes which will be normalized :param v_length: length of sequence indexes of which will be >>> self._normalize_indexes({0: True, -2: True, -1: True}, 4) {0: True, 2: True, 3: True} >>> self._normalize_indexes({'__all__': True}, 4) {0: True, 1: True, 2: True, 3: True} """ normalized_items: dict[int | str, Any] = {} all_items = None for i, v in items.items(): if not (isinstance(v, typing.Mapping) or isinstance(v, typing.AbstractSet) or self.is_true(v)): raise TypeError(f'Unexpected type of exclude value for index "{i}" {v.__class__}') if i == '__all__': all_items = 
self._coerce_value(v) continue if not isinstance(i, int): raise TypeError( 'Excluding fields from a sequence of sub-models or dicts must be performed index-wise: ' 'expected integer keys or keyword "__all__"' ) normalized_i = v_length + i if i < 0 else i normalized_items[normalized_i] = self.merge(v, normalized_items.get(normalized_i)) if not all_items: return normalized_items if self.is_true(all_items): for i in range(v_length): normalized_items.setdefault(i, ...) return normalized_items for i in range(v_length): normalized_item = normalized_items.setdefault(i, {}) if not self.is_true(normalized_item): normalized_items[i] = self.merge(all_items, normalized_item) return normalized_items @classmethod def merge(cls, base: Any, override: Any, intersect: bool = False) -> Any: """Merge a `base` item with an `override` item. Both `base` and `override` are converted to dictionaries if possible. Sets are converted to dictionaries with the sets entries as keys and Ellipsis as values. Each key-value pair existing in `base` is merged with `override`, while the rest of the key-value pairs are updated recursively with this function. Merging takes place based on the "union" of keys if `intersect` is set to `False` (default) and on the intersection of keys if `intersect` is set to `True`. 
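
        An illustrative sketch of the key-selection step just described (not the
        full merge, which also recurses into values):

        ```python
        def merge_keys(base: dict, override: dict, intersect: bool = False) -> list:
            # ordering-preserving union (default) or intersection of the two key sets
            if intersect:
                return [k for k in base if k in override] + [k for k in override if k in base]
            return list(base) + [k for k in override if k not in base]

        assert merge_keys({'a': 1, 'b': 2}, {'b': 3, 'c': 4}) == ['a', 'b', 'c']
        # duplicates from the intersection branch are harmless: the results feed a dict
        assert set(merge_keys({'a': 1, 'b': 2}, {'b': 3, 'c': 4}, intersect=True)) == {'b'}
        ```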
""" override = cls._coerce_value(override) base = cls._coerce_value(base) if override is None: return base if cls.is_true(base) or base is None: return override if cls.is_true(override): return base if intersect else override # intersection or union of keys while preserving ordering: if intersect: merge_keys = [k for k in base if k in override] + [k for k in override if k in base] else: merge_keys = list(base) + [k for k in override if k not in base] merged: dict[int | str, Any] = {} for k in merge_keys: merged_item = cls.merge(base.get(k), override.get(k), intersect=intersect) if merged_item is not None: merged[k] = merged_item return merged @staticmethod def _coerce_items(items: AbstractSetIntStr | MappingIntStrAny) -> MappingIntStrAny: if isinstance(items, typing.Mapping): pass elif isinstance(items, typing.AbstractSet): items = dict.fromkeys(items, ...) # type: ignore else: class_name = getattr(items, '__class__', '???') raise TypeError(f'Unexpected type of exclude value {class_name}') return items # type: ignore @classmethod def _coerce_value(cls, value: Any) -> Any: if value is None or cls.is_true(value): return value return cls._coerce_items(value) @staticmethod def is_true(v: Any) -> bool: return v is True or v is ... def __repr_args__(self) -> _repr.ReprArgs: return [(None, self._items)] if typing.TYPE_CHECKING: def LazyClassAttribute(name: str, get_value: Callable[[], T]) -> T: ... else: class LazyClassAttribute: """A descriptor exposing an attribute only accessible on a class (hidden from instances). The attribute is lazily computed and cached during the first access. 
""" def __init__(self, name: str, get_value: Callable[[], Any]) -> None: self.name = name self.get_value = get_value @cached_property def value(self) -> Any: return self.get_value() def __get__(self, instance: Any, owner: type[Any]) -> None: if instance is None: return self.value raise AttributeError(f'{self.name!r} attribute of {owner.__name__!r} is class-only') Obj = TypeVar('Obj') def smart_deepcopy(obj: Obj) -> Obj: """Return type as is for immutable built-in types Use obj.copy() for built-in empty collections Use copy.deepcopy() for non-empty collections and unknown objects. """ obj_type = obj.__class__ if obj_type in IMMUTABLE_NON_COLLECTIONS_TYPES: return obj # fastest case: obj is immutable and not collection therefore will not be copied anyway try: if not obj and obj_type in BUILTIN_COLLECTIONS: # faster way for empty collections, no need to copy its members return obj if obj_type is tuple else obj.copy() # tuple doesn't have copy method # type: ignore except (TypeError, ValueError, RuntimeError): # do we really dare to catch ALL errors? Seems a bit risky pass return deepcopy(obj) # slowest way when we actually might need a deepcopy _SENTINEL = object() def all_identical(left: typing.Iterable[Any], right: typing.Iterable[Any]) -> bool: """Check that the items of `left` are the same objects as those in `right`. 
>>> a, b = object(), object() >>> all_identical([a, b, a], [a, b, a]) True >>> all_identical([a, b, [a]], [a, b, [a]]) # new list object, while "equal" is not "identical" False """ for left_item, right_item in zip_longest(left, right, fillvalue=_SENTINEL): if left_item is not right_item: return False return True @dataclasses.dataclass(frozen=True) class SafeGetItemProxy: """Wrapper redirecting `__getitem__` to `get` with a sentinel value as default This makes is safe to use in `operator.itemgetter` when some keys may be missing """ # Define __slots__manually for performances # @dataclasses.dataclass() only support slots=True in python>=3.10 __slots__ = ('wrapped',) wrapped: Mapping[str, Any] def __getitem__(self, key: str, /) -> Any: return self.wrapped.get(key, _SENTINEL) # required to pass the object to operator.itemgetter() instances due to a quirk of typeshed # https://github.com/python/mypy/issues/13713 # https://github.com/python/typeshed/pull/8785 # Since this is typing-only, hide it in a typing.TYPE_CHECKING block if typing.TYPE_CHECKING: def __contains__(self, key: str, /) -> bool: return self.wrapped.__contains__(key) pydantic-2.10.6/pydantic/_internal/_validate_call.py000066400000000000000000000106701474456633400225220ustar00rootroot00000000000000from __future__ import annotations as _annotations import functools import inspect from functools import partial from typing import Any, Awaitable, Callable import pydantic_core from ..config import ConfigDict from ..plugin._schema_validator import create_schema_validator from ._config import ConfigWrapper from ._generate_schema import GenerateSchema, ValidateCallSupportedTypes from ._namespace_utils import MappingNamespace, NsResolver, ns_for_function def extract_function_name(func: ValidateCallSupportedTypes) -> str: """Extract the name of a `ValidateCallSupportedTypes` object.""" return f'partial({func.func.__name__})' if isinstance(func, functools.partial) else func.__name__ def 
extract_function_qualname(func: ValidateCallSupportedTypes) -> str: """Extract the qualname of a `ValidateCallSupportedTypes` object.""" return f'partial({func.func.__qualname__})' if isinstance(func, functools.partial) else func.__qualname__ def update_wrapper_attributes(wrapped: ValidateCallSupportedTypes, wrapper: Callable[..., Any]): """Update the `wrapper` function with the attributes of the `wrapped` function. Return the updated function.""" if inspect.iscoroutinefunction(wrapped): @functools.wraps(wrapped) async def wrapper_function(*args, **kwargs): # type: ignore return await wrapper(*args, **kwargs) else: @functools.wraps(wrapped) def wrapper_function(*args, **kwargs): return wrapper(*args, **kwargs) # We need to manually update this because `partial` object has no `__name__` and `__qualname__`. wrapper_function.__name__ = extract_function_name(wrapped) wrapper_function.__qualname__ = extract_function_qualname(wrapped) wrapper_function.raw_function = wrapped # type: ignore return wrapper_function class ValidateCallWrapper: """This is a wrapper around a function that validates the arguments passed to it, and optionally the return value.""" __slots__ = ('__pydantic_validator__', '__return_pydantic_validator__') def __init__( self, function: ValidateCallSupportedTypes, config: ConfigDict | None, validate_return: bool, parent_namespace: MappingNamespace | None, ) -> None: if isinstance(function, partial): schema_type = function.func module = function.func.__module__ else: schema_type = function module = function.__module__ qualname = extract_function_qualname(function) ns_resolver = NsResolver(namespaces_tuple=ns_for_function(schema_type, parent_namespace=parent_namespace)) config_wrapper = ConfigWrapper(config) gen_schema = GenerateSchema(config_wrapper, ns_resolver) schema = gen_schema.clean_schema(gen_schema.generate_schema(function)) core_config = config_wrapper.core_config(title=qualname) self.__pydantic_validator__ = create_schema_validator( schema, 
            schema_type,
            module,
            qualname,
            'validate_call',
            core_config,
            config_wrapper.plugin_settings,
        )

        if validate_return:
            signature = inspect.signature(function)
            return_type = signature.return_annotation if signature.return_annotation is not signature.empty else Any
            gen_schema = GenerateSchema(config_wrapper, ns_resolver)
            schema = gen_schema.clean_schema(gen_schema.generate_schema(return_type))
            validator = create_schema_validator(
                schema,
                schema_type,
                module,
                qualname,
                'validate_call',
                core_config,
                config_wrapper.plugin_settings,
            )
            if inspect.iscoroutinefunction(function):

                async def return_val_wrapper(aw: Awaitable[Any]) -> None:
                    return validator.validate_python(await aw)

                self.__return_pydantic_validator__ = return_val_wrapper
            else:
                self.__return_pydantic_validator__ = validator.validate_python
        else:
            self.__return_pydantic_validator__ = None

    def __call__(self, *args: Any, **kwargs: Any) -> Any:
        res = self.__pydantic_validator__.validate_python(pydantic_core.ArgsKwargs(args, kwargs))
        if self.__return_pydantic_validator__:
            return self.__return_pydantic_validator__(res)
        else:
            return res


# ---- pydantic/_internal/_validators.py ----

"""Validator functions for standard library types.

Import of this module is deferred since it contains imports of many standard library modules.
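
As a taste of what the validators below do, the type re-creation step of `sequence_validator`
can be sketched in isolation (the helper name is illustrative, not part of this module):

```python
def rebuild_sequence(original, validated: list):
    # re-create the original sequence type from a validated list
    t = type(original)
    if t is list:
        return validated
    if issubclass(t, range):
        # a range cannot be re-created from arbitrary items, so keep the list
        return validated
    if t is tuple:
        return tuple(validated)
    # best guess: assume the type accepts an iterable, as the real validator does
    return t(validated)

assert rebuild_sequence((1, 2), [1, 2]) == (1, 2)
assert rebuild_sequence(range(3), [0, 1, 2]) == [0, 1, 2]
```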
""" from __future__ import annotations as _annotations import math import re import typing from decimal import Decimal from fractions import Fraction from ipaddress import IPv4Address, IPv4Interface, IPv4Network, IPv6Address, IPv6Interface, IPv6Network from typing import Any, Callable, Union from pydantic_core import PydanticCustomError, core_schema from pydantic_core._pydantic_core import PydanticKnownError def sequence_validator( input_value: typing.Sequence[Any], /, validator: core_schema.ValidatorFunctionWrapHandler, ) -> typing.Sequence[Any]: """Validator for `Sequence` types, isinstance(v, Sequence) has already been called.""" value_type = type(input_value) # We don't accept any plain string as a sequence # Relevant issue: https://github.com/pydantic/pydantic/issues/5595 if issubclass(value_type, (str, bytes)): raise PydanticCustomError( 'sequence_str', "'{type_name}' instances are not allowed as a Sequence value", {'type_name': value_type.__name__}, ) # TODO: refactor sequence validation to validate with either a list or a tuple # schema, depending on the type of the value. # Additionally, we should be able to remove one of either this validator or the # SequenceValidator in _std_types_schema.py (preferably this one, while porting over some logic). # Effectively, a refactor for sequence validation is needed. 
if value_type is tuple: input_value = list(input_value) v_list = validator(input_value) # the rest of the logic is just re-creating the original type from `v_list` if value_type is list: return v_list elif issubclass(value_type, range): # return the list as we probably can't re-create the range return v_list elif value_type is tuple: return tuple(v_list) else: # best guess at how to re-create the original type, more custom construction logic might be required return value_type(v_list) # type: ignore[call-arg] def import_string(value: Any) -> Any: if isinstance(value, str): try: return _import_string_logic(value) except ImportError as e: raise PydanticCustomError('import_error', 'Invalid python path: {error}', {'error': str(e)}) from e else: # otherwise we just return the value and let the next validator do the rest of the work return value def _import_string_logic(dotted_path: str) -> Any: """Inspired by uvicorn — dotted paths should include a colon before the final item if that item is not a module. (This is necessary to distinguish between a submodule and an attribute when there is a conflict.). If the dotted path does not include a colon and the final item is not a valid module, importing as an attribute rather than a submodule will be attempted automatically. 
So, for example, the following values of `dotted_path` result in the following returned values: * 'collections': * 'collections.abc': * 'collections.abc:Mapping': * `collections.abc.Mapping`: (though this is a bit slower than the previous line) An error will be raised under any of the following scenarios: * `dotted_path` contains more than one colon (e.g., 'collections:abc:Mapping') * the substring of `dotted_path` before the colon is not a valid module in the environment (e.g., '123:Mapping') * the substring of `dotted_path` after the colon is not an attribute of the module (e.g., 'collections:abc123') """ from importlib import import_module components = dotted_path.strip().split(':') if len(components) > 2: raise ImportError(f"Import strings should have at most one ':'; received {dotted_path!r}") module_path = components[0] if not module_path: raise ImportError(f'Import strings should have a nonempty module name; received {dotted_path!r}') try: module = import_module(module_path) except ModuleNotFoundError as e: if '.' 
in module_path: # Check if it would be valid if the final item was separated from its module with a `:` maybe_module_path, maybe_attribute = dotted_path.strip().rsplit('.', 1) try: return _import_string_logic(f'{maybe_module_path}:{maybe_attribute}') except ImportError: pass raise ImportError(f'No module named {module_path!r}') from e raise e if len(components) > 1: attribute = components[1] try: return getattr(module, attribute) except AttributeError as e: raise ImportError(f'cannot import name {attribute!r} from {module_path!r}') from e else: return module def pattern_either_validator(input_value: Any, /) -> typing.Pattern[Any]: if isinstance(input_value, typing.Pattern): return input_value elif isinstance(input_value, (str, bytes)): # todo strict mode return compile_pattern(input_value) # type: ignore else: raise PydanticCustomError('pattern_type', 'Input should be a valid pattern') def pattern_str_validator(input_value: Any, /) -> typing.Pattern[str]: if isinstance(input_value, typing.Pattern): if isinstance(input_value.pattern, str): return input_value else: raise PydanticCustomError('pattern_str_type', 'Input should be a string pattern') elif isinstance(input_value, str): return compile_pattern(input_value) elif isinstance(input_value, bytes): raise PydanticCustomError('pattern_str_type', 'Input should be a string pattern') else: raise PydanticCustomError('pattern_type', 'Input should be a valid pattern') def pattern_bytes_validator(input_value: Any, /) -> typing.Pattern[bytes]: if isinstance(input_value, typing.Pattern): if isinstance(input_value.pattern, bytes): return input_value else: raise PydanticCustomError('pattern_bytes_type', 'Input should be a bytes pattern') elif isinstance(input_value, bytes): return compile_pattern(input_value) elif isinstance(input_value, str): raise PydanticCustomError('pattern_bytes_type', 'Input should be a bytes pattern') else: raise PydanticCustomError('pattern_type', 'Input should be a valid pattern') PatternType = 
typing.TypeVar('PatternType', str, bytes) def compile_pattern(pattern: PatternType) -> typing.Pattern[PatternType]: try: return re.compile(pattern) except re.error: raise PydanticCustomError('pattern_regex', 'Input should be a valid regular expression') def ip_v4_address_validator(input_value: Any, /) -> IPv4Address: if isinstance(input_value, IPv4Address): return input_value try: return IPv4Address(input_value) except ValueError: raise PydanticCustomError('ip_v4_address', 'Input is not a valid IPv4 address') def ip_v6_address_validator(input_value: Any, /) -> IPv6Address: if isinstance(input_value, IPv6Address): return input_value try: return IPv6Address(input_value) except ValueError: raise PydanticCustomError('ip_v6_address', 'Input is not a valid IPv6 address') def ip_v4_network_validator(input_value: Any, /) -> IPv4Network: """Assume IPv4Network initialised with a default `strict` argument. See more: https://docs.python.org/library/ipaddress.html#ipaddress.IPv4Network """ if isinstance(input_value, IPv4Network): return input_value try: return IPv4Network(input_value) except ValueError: raise PydanticCustomError('ip_v4_network', 'Input is not a valid IPv4 network') def ip_v6_network_validator(input_value: Any, /) -> IPv6Network: """Assume IPv6Network initialised with a default `strict` argument. 
See more: https://docs.python.org/library/ipaddress.html#ipaddress.IPv6Network """ if isinstance(input_value, IPv6Network): return input_value try: return IPv6Network(input_value) except ValueError: raise PydanticCustomError('ip_v6_network', 'Input is not a valid IPv6 network') def ip_v4_interface_validator(input_value: Any, /) -> IPv4Interface: if isinstance(input_value, IPv4Interface): return input_value try: return IPv4Interface(input_value) except ValueError: raise PydanticCustomError('ip_v4_interface', 'Input is not a valid IPv4 interface') def ip_v6_interface_validator(input_value: Any, /) -> IPv6Interface: if isinstance(input_value, IPv6Interface): return input_value try: return IPv6Interface(input_value) except ValueError: raise PydanticCustomError('ip_v6_interface', 'Input is not a valid IPv6 interface') def fraction_validator(input_value: Any, /) -> Fraction: if isinstance(input_value, Fraction): return input_value try: return Fraction(input_value) except ValueError: raise PydanticCustomError('fraction_parsing', 'Input is not a valid fraction') def forbid_inf_nan_check(x: Any) -> Any: if not math.isfinite(x): raise PydanticKnownError('finite_number') return x def _safe_repr(v: Any) -> int | float | str: """The context argument for `PydanticKnownError` requires a number or str type, so we do a simple repr() coercion for types like timedelta. See tests/test_types.py::test_annotated_metadata_any_order for some context. 
""" if isinstance(v, (int, float, str)): return v return repr(v) def greater_than_validator(x: Any, gt: Any) -> Any: try: if not (x > gt): raise PydanticKnownError('greater_than', {'gt': _safe_repr(gt)}) return x except TypeError: raise TypeError(f"Unable to apply constraint 'gt' to supplied value {x}") def greater_than_or_equal_validator(x: Any, ge: Any) -> Any: try: if not (x >= ge): raise PydanticKnownError('greater_than_equal', {'ge': _safe_repr(ge)}) return x except TypeError: raise TypeError(f"Unable to apply constraint 'ge' to supplied value {x}") def less_than_validator(x: Any, lt: Any) -> Any: try: if not (x < lt): raise PydanticKnownError('less_than', {'lt': _safe_repr(lt)}) return x except TypeError: raise TypeError(f"Unable to apply constraint 'lt' to supplied value {x}") def less_than_or_equal_validator(x: Any, le: Any) -> Any: try: if not (x <= le): raise PydanticKnownError('less_than_equal', {'le': _safe_repr(le)}) return x except TypeError: raise TypeError(f"Unable to apply constraint 'le' to supplied value {x}") def multiple_of_validator(x: Any, multiple_of: Any) -> Any: try: if x % multiple_of: raise PydanticKnownError('multiple_of', {'multiple_of': _safe_repr(multiple_of)}) return x except TypeError: raise TypeError(f"Unable to apply constraint 'multiple_of' to supplied value {x}") def min_length_validator(x: Any, min_length: Any) -> Any: try: if not (len(x) >= min_length): raise PydanticKnownError( 'too_short', {'field_type': 'Value', 'min_length': min_length, 'actual_length': len(x)} ) return x except TypeError: raise TypeError(f"Unable to apply constraint 'min_length' to supplied value {x}") def max_length_validator(x: Any, max_length: Any) -> Any: try: if len(x) > max_length: raise PydanticKnownError( 'too_long', {'field_type': 'Value', 'max_length': max_length, 'actual_length': len(x)}, ) return x except TypeError: raise TypeError(f"Unable to apply constraint 'max_length' to supplied value {x}") def _extract_decimal_digits_info(decimal: 
Decimal) -> tuple[int, int]: """Compute the total number of digits and decimal places for a given [`Decimal`][decimal.Decimal] instance. This function handles both normalized and non-normalized Decimal instances. Example: Decimal('1.230') -> 4 digits, 3 decimal places Args: decimal (Decimal): The decimal number to analyze. Returns: tuple[int, int]: A tuple containing the number of decimal places and total digits. Though this could be divided into two separate functions, the logic is easier to follow if we couple the computation of the number of decimals and digits together. """ decimal_tuple = decimal.as_tuple() if not isinstance(decimal_tuple.exponent, int): raise TypeError(f'Unable to extract decimal digits info from supplied value {decimal}') exponent = decimal_tuple.exponent num_digits = len(decimal_tuple.digits) if exponent >= 0: # A positive exponent adds that many trailing zeros # Ex: digit_tuple=(1, 2, 3), exponent=2 -> 12300 -> 0 decimal places, 5 digits num_digits += exponent decimal_places = 0 else: # If the absolute value of the negative exponent is larger than the # number of digits, then it's the same as the number of digits, # because it'll consume all the digits in digit_tuple and then # add abs(exponent) - len(digit_tuple) leading zeros after the decimal point. 
# Ex: digit_tuple=(1, 2, 3), exponent=-2 -> 1.23 -> 2 decimal places, 3 digits # Ex: digit_tuple=(1, 2, 3), exponent=-4 -> 0.0123 -> 4 decimal places, 4 digits decimal_places = abs(exponent) num_digits = max(num_digits, decimal_places) return decimal_places, num_digits def max_digits_validator(x: Any, max_digits: Any) -> Any: _, num_digits = _extract_decimal_digits_info(x) _, normalized_num_digits = _extract_decimal_digits_info(x.normalize()) try: if (num_digits > max_digits) and (normalized_num_digits > max_digits): raise PydanticKnownError( 'decimal_max_digits', {'max_digits': max_digits}, ) return x except TypeError: raise TypeError(f"Unable to apply constraint 'max_digits' to supplied value {x}") def decimal_places_validator(x: Any, decimal_places: Any) -> Any: decimal_places_, _ = _extract_decimal_digits_info(x) normalized_decimal_places, _ = _extract_decimal_digits_info(x.normalize()) try: if (decimal_places_ > decimal_places) and (normalized_decimal_places > decimal_places): raise PydanticKnownError( 'decimal_max_places', {'decimal_places': decimal_places}, ) return x except TypeError: raise TypeError(f"Unable to apply constraint 'decimal_places' to supplied value {x}") NUMERIC_VALIDATOR_LOOKUP: dict[str, Callable] = { 'gt': greater_than_validator, 'ge': greater_than_or_equal_validator, 'lt': less_than_validator, 'le': less_than_or_equal_validator, 'multiple_of': multiple_of_validator, 'min_length': min_length_validator, 'max_length': max_length_validator, 'max_digits': max_digits_validator, 'decimal_places': decimal_places_validator, } IpType = Union[IPv4Address, IPv6Address, IPv4Network, IPv6Network, IPv4Interface, IPv6Interface] IP_VALIDATOR_LOOKUP: dict[type[IpType], Callable] = { IPv4Address: ip_v4_address_validator, IPv6Address: ip_v6_address_validator, IPv4Network: ip_v4_network_validator, IPv6Network: ip_v6_network_validator, IPv4Interface: ip_v4_interface_validator, IPv6Interface: ip_v6_interface_validator, } 
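The `_extract_decimal_digits_info` helper above couples the decimal-place count and the total digit count because both derive from the same `as_tuple()` data. As a standalone illustration of that arithmetic, the following sketch (the helper name `decimal_digits_info` is hypothetical, not from the source) reproduces the same computation using only the standard library:

```python
from decimal import Decimal


def decimal_digits_info(value: Decimal) -> tuple[int, int]:
    """Return (decimal_places, total_digits), mirroring the arithmetic used
    by the max_digits/decimal_places constraint validators above."""
    sign, digits, exponent = value.as_tuple()
    if not isinstance(exponent, int):  # exponent is 'n'/'N'/'F' for NaN/Infinity
        raise TypeError(f'unable to extract digit info from {value}')
    num_digits = len(digits)
    if exponent >= 0:
        # A positive exponent appends that many trailing zeros:
        # digits (1, 2, 3) with exponent 2 -> 12300 -> 0 places, 5 digits
        return 0, num_digits + exponent
    decimal_places = abs(exponent)
    # Leading zeros after the point may exceed the stored digits:
    # digits (1, 2, 3) with exponent -4 -> 0.0123 -> 4 places, 4 digits
    return decimal_places, max(num_digits, decimal_places)


print(decimal_digits_info(Decimal('1.230')))   # (3, 4)
print(decimal_digits_info(Decimal('123E+2')))  # (0, 5)
print(decimal_digits_info(Decimal('0.0123')))  # (4, 4)
```

Note how the non-normalized `Decimal('1.230')` keeps its trailing zero as a stored digit, which is exactly why `max_digits_validator` above also checks the `normalize()`d form before rejecting a value.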
pydantic-2.10.6/pydantic/_migration.py000066400000000000000000000272111474456633400177530ustar00rootroot00000000000000import sys from typing import Any, Callable, Dict from .version import version_short MOVED_IN_V2 = { 'pydantic.utils:version_info': 'pydantic.version:version_info', 'pydantic.error_wrappers:ValidationError': 'pydantic:ValidationError', 'pydantic.utils:to_camel': 'pydantic.alias_generators:to_pascal', 'pydantic.utils:to_lower_camel': 'pydantic.alias_generators:to_camel', 'pydantic:PyObject': 'pydantic.types:ImportString', 'pydantic.types:PyObject': 'pydantic.types:ImportString', 'pydantic.generics:GenericModel': 'pydantic.BaseModel', } DEPRECATED_MOVED_IN_V2 = { 'pydantic.tools:schema_of': 'pydantic.deprecated.tools:schema_of', 'pydantic.tools:parse_obj_as': 'pydantic.deprecated.tools:parse_obj_as', 'pydantic.tools:schema_json_of': 'pydantic.deprecated.tools:schema_json_of', 'pydantic.json:pydantic_encoder': 'pydantic.deprecated.json:pydantic_encoder', 'pydantic:validate_arguments': 'pydantic.deprecated.decorator:validate_arguments', 'pydantic.json:custom_pydantic_encoder': 'pydantic.deprecated.json:custom_pydantic_encoder', 'pydantic.json:timedelta_isoformat': 'pydantic.deprecated.json:timedelta_isoformat', 'pydantic.decorator:validate_arguments': 'pydantic.deprecated.decorator:validate_arguments', 'pydantic.class_validators:validator': 'pydantic.deprecated.class_validators:validator', 'pydantic.class_validators:root_validator': 'pydantic.deprecated.class_validators:root_validator', 'pydantic.config:BaseConfig': 'pydantic.deprecated.config:BaseConfig', 'pydantic.config:Extra': 'pydantic.deprecated.config:Extra', } REDIRECT_TO_V1 = { f'pydantic.utils:{obj}': f'pydantic.v1.utils:{obj}' for obj in ( 'deep_update', 'GetterDict', 'lenient_issubclass', 'lenient_isinstance', 'is_valid_field', 'update_not_none', 'import_string', 'Representation', 'ROOT_KEY', 'smart_deepcopy', 'sequence_like', ) } REMOVED_IN_V2 = { 'pydantic:ConstrainedBytes', 
'pydantic:ConstrainedDate', 'pydantic:ConstrainedDecimal', 'pydantic:ConstrainedFloat', 'pydantic:ConstrainedFrozenSet', 'pydantic:ConstrainedInt', 'pydantic:ConstrainedList', 'pydantic:ConstrainedSet', 'pydantic:ConstrainedStr', 'pydantic:JsonWrapper', 'pydantic:NoneBytes', 'pydantic:NoneStr', 'pydantic:NoneStrBytes', 'pydantic:Protocol', 'pydantic:Required', 'pydantic:StrBytes', 'pydantic:compiled', 'pydantic.config:get_config', 'pydantic.config:inherit_config', 'pydantic.config:prepare_config', 'pydantic:create_model_from_namedtuple', 'pydantic:create_model_from_typeddict', 'pydantic.dataclasses:create_pydantic_model_from_dataclass', 'pydantic.dataclasses:make_dataclass_validator', 'pydantic.dataclasses:set_validation', 'pydantic.datetime_parse:parse_date', 'pydantic.datetime_parse:parse_time', 'pydantic.datetime_parse:parse_datetime', 'pydantic.datetime_parse:parse_duration', 'pydantic.error_wrappers:ErrorWrapper', 'pydantic.errors:AnyStrMaxLengthError', 'pydantic.errors:AnyStrMinLengthError', 'pydantic.errors:ArbitraryTypeError', 'pydantic.errors:BoolError', 'pydantic.errors:BytesError', 'pydantic.errors:CallableError', 'pydantic.errors:ClassError', 'pydantic.errors:ColorError', 'pydantic.errors:ConfigError', 'pydantic.errors:DataclassTypeError', 'pydantic.errors:DateError', 'pydantic.errors:DateNotInTheFutureError', 'pydantic.errors:DateNotInThePastError', 'pydantic.errors:DateTimeError', 'pydantic.errors:DecimalError', 'pydantic.errors:DecimalIsNotFiniteError', 'pydantic.errors:DecimalMaxDigitsError', 'pydantic.errors:DecimalMaxPlacesError', 'pydantic.errors:DecimalWholeDigitsError', 'pydantic.errors:DictError', 'pydantic.errors:DurationError', 'pydantic.errors:EmailError', 'pydantic.errors:EnumError', 'pydantic.errors:EnumMemberError', 'pydantic.errors:ExtraError', 'pydantic.errors:FloatError', 'pydantic.errors:FrozenSetError', 'pydantic.errors:FrozenSetMaxLengthError', 'pydantic.errors:FrozenSetMinLengthError', 'pydantic.errors:HashableError', 
'pydantic.errors:IPv4AddressError', 'pydantic.errors:IPv4InterfaceError', 'pydantic.errors:IPv4NetworkError', 'pydantic.errors:IPv6AddressError', 'pydantic.errors:IPv6InterfaceError', 'pydantic.errors:IPv6NetworkError', 'pydantic.errors:IPvAnyAddressError', 'pydantic.errors:IPvAnyInterfaceError', 'pydantic.errors:IPvAnyNetworkError', 'pydantic.errors:IntEnumError', 'pydantic.errors:IntegerError', 'pydantic.errors:InvalidByteSize', 'pydantic.errors:InvalidByteSizeUnit', 'pydantic.errors:InvalidDiscriminator', 'pydantic.errors:InvalidLengthForBrand', 'pydantic.errors:JsonError', 'pydantic.errors:JsonTypeError', 'pydantic.errors:ListError', 'pydantic.errors:ListMaxLengthError', 'pydantic.errors:ListMinLengthError', 'pydantic.errors:ListUniqueItemsError', 'pydantic.errors:LuhnValidationError', 'pydantic.errors:MissingDiscriminator', 'pydantic.errors:MissingError', 'pydantic.errors:NoneIsAllowedError', 'pydantic.errors:NoneIsNotAllowedError', 'pydantic.errors:NotDigitError', 'pydantic.errors:NotNoneError', 'pydantic.errors:NumberNotGeError', 'pydantic.errors:NumberNotGtError', 'pydantic.errors:NumberNotLeError', 'pydantic.errors:NumberNotLtError', 'pydantic.errors:NumberNotMultipleError', 'pydantic.errors:PathError', 'pydantic.errors:PathNotADirectoryError', 'pydantic.errors:PathNotAFileError', 'pydantic.errors:PathNotExistsError', 'pydantic.errors:PatternError', 'pydantic.errors:PyObjectError', 'pydantic.errors:PydanticTypeError', 'pydantic.errors:PydanticValueError', 'pydantic.errors:SequenceError', 'pydantic.errors:SetError', 'pydantic.errors:SetMaxLengthError', 'pydantic.errors:SetMinLengthError', 'pydantic.errors:StrError', 'pydantic.errors:StrRegexError', 'pydantic.errors:StrictBoolError', 'pydantic.errors:SubclassError', 'pydantic.errors:TimeError', 'pydantic.errors:TupleError', 'pydantic.errors:TupleLengthError', 'pydantic.errors:UUIDError', 'pydantic.errors:UUIDVersionError', 'pydantic.errors:UrlError', 'pydantic.errors:UrlExtraError', 
'pydantic.errors:UrlHostError', 'pydantic.errors:UrlHostTldError', 'pydantic.errors:UrlPortError', 'pydantic.errors:UrlSchemeError', 'pydantic.errors:UrlSchemePermittedError', 'pydantic.errors:UrlUserInfoError', 'pydantic.errors:WrongConstantError', 'pydantic.main:validate_model', 'pydantic.networks:stricturl', 'pydantic:parse_file_as', 'pydantic:parse_raw_as', 'pydantic:stricturl', 'pydantic.tools:parse_file_as', 'pydantic.tools:parse_raw_as', 'pydantic.types:ConstrainedBytes', 'pydantic.types:ConstrainedDate', 'pydantic.types:ConstrainedDecimal', 'pydantic.types:ConstrainedFloat', 'pydantic.types:ConstrainedFrozenSet', 'pydantic.types:ConstrainedInt', 'pydantic.types:ConstrainedList', 'pydantic.types:ConstrainedSet', 'pydantic.types:ConstrainedStr', 'pydantic.types:JsonWrapper', 'pydantic.types:NoneBytes', 'pydantic.types:NoneStr', 'pydantic.types:NoneStrBytes', 'pydantic.types:StrBytes', 'pydantic.typing:evaluate_forwardref', 'pydantic.typing:AbstractSetIntStr', 'pydantic.typing:AnyCallable', 'pydantic.typing:AnyClassMethod', 'pydantic.typing:CallableGenerator', 'pydantic.typing:DictAny', 'pydantic.typing:DictIntStrAny', 'pydantic.typing:DictStrAny', 'pydantic.typing:IntStr', 'pydantic.typing:ListStr', 'pydantic.typing:MappingIntStrAny', 'pydantic.typing:NoArgAnyCallable', 'pydantic.typing:NoneType', 'pydantic.typing:ReprArgs', 'pydantic.typing:SetStr', 'pydantic.typing:StrPath', 'pydantic.typing:TupleGenerator', 'pydantic.typing:WithArgsTypes', 'pydantic.typing:all_literal_values', 'pydantic.typing:display_as_type', 'pydantic.typing:get_all_type_hints', 'pydantic.typing:get_args', 'pydantic.typing:get_origin', 'pydantic.typing:get_sub_types', 'pydantic.typing:is_callable_type', 'pydantic.typing:is_classvar', 'pydantic.typing:is_finalvar', 'pydantic.typing:is_literal_type', 'pydantic.typing:is_namedtuple', 'pydantic.typing:is_new_type', 'pydantic.typing:is_none_type', 'pydantic.typing:is_typeddict', 'pydantic.typing:is_typeddict_special', 
'pydantic.typing:is_union', 'pydantic.typing:new_type_supertype', 'pydantic.typing:resolve_annotations', 'pydantic.typing:typing_base', 'pydantic.typing:update_field_forward_refs', 'pydantic.typing:update_model_forward_refs', 'pydantic.utils:ClassAttribute', 'pydantic.utils:DUNDER_ATTRIBUTES', 'pydantic.utils:PyObjectStr', 'pydantic.utils:ValueItems', 'pydantic.utils:almost_equal_floats', 'pydantic.utils:get_discriminator_alias_and_values', 'pydantic.utils:get_model', 'pydantic.utils:get_unique_discriminator_alias', 'pydantic.utils:in_ipython', 'pydantic.utils:is_valid_identifier', 'pydantic.utils:path_type', 'pydantic.utils:validate_field_name', 'pydantic:validate_model', } def getattr_migration(module: str) -> Callable[[str], Any]: """Implement PEP 562 for objects that were either moved or removed on the migration to V2. Args: module: The module name. Returns: A callable that will raise an error if the object is not found. """ # This avoids circular import with errors.py. from .errors import PydanticImportError def wrapper(name: str) -> object: """Raise an error if the object is not found, or warn if it was moved. In case it was moved, it still returns the object. Args: name: The object name. Returns: The object. """ if name == '__path__': raise AttributeError(f'module {module!r} has no attribute {name!r}') import warnings from ._internal._validators import import_string import_path = f'{module}:{name}' if import_path in MOVED_IN_V2.keys(): new_location = MOVED_IN_V2[import_path] warnings.warn(f'`{import_path}` has been moved to `{new_location}`.') return import_string(MOVED_IN_V2[import_path]) if import_path in DEPRECATED_MOVED_IN_V2: # skip the warning here because a deprecation warning will be raised elsewhere return import_string(DEPRECATED_MOVED_IN_V2[import_path]) if import_path in REDIRECT_TO_V1: new_location = REDIRECT_TO_V1[import_path] warnings.warn( f'`{import_path}` has been removed. We are importing from `{new_location}` instead.' 
                'See the migration guide for more details: https://docs.pydantic.dev/latest/migration/'
            )
            return import_string(REDIRECT_TO_V1[import_path])
        if import_path == 'pydantic:BaseSettings':
            raise PydanticImportError(
                '`BaseSettings` has been moved to the `pydantic-settings` package. '
                f'See https://docs.pydantic.dev/{version_short()}/migration/#basesettings-has-moved-to-pydantic-settings '
                'for more details.'
            )
        if import_path in REMOVED_IN_V2:
            raise PydanticImportError(f'`{import_path}` has been removed in V2.')
        globals: Dict[str, Any] = sys.modules[module].__dict__
        if name in globals:
            return globals[name]
        raise AttributeError(f'module {module!r} has no attribute {name!r}')

    return wrapper

pydantic-2.10.6/pydantic/alias_generators.py
"""Alias generators for converting between different capitalization conventions."""

import re

__all__ = ('to_pascal', 'to_camel', 'to_snake')

# TODO: in V3, change the argument names to be more descriptive
# Generally, don't only convert from snake_case, or name the functions
# more specifically like snake_to_camel.


def to_pascal(snake: str) -> str:
    """Convert a snake_case string to PascalCase.

    Args:
        snake: The string to convert.

    Returns:
        The PascalCase string.
    """
    camel = snake.title()
    return re.sub('([0-9A-Za-z])_(?=[0-9A-Z])', lambda m: m.group(1), camel)


def to_camel(snake: str) -> str:
    """Convert a snake_case string to camelCase.

    Args:
        snake: The string to convert.

    Returns:
        The converted camelCase string.
    """
    # If the string is already in camelCase and does not contain a digit followed
    # by a lowercase letter, return it as it is
    if re.match('^[a-z]+[A-Za-z0-9]*$', snake) and not re.search(r'\d[a-z]', snake):
        return snake

    camel = to_pascal(snake)
    return re.sub('(^_*[A-Z])', lambda m: m.group(1).lower(), camel)


def to_snake(camel: str) -> str:
    """Convert a PascalCase, camelCase, or kebab-case string to snake_case.

    Args:
        camel: The string to convert.
    Returns:
        The converted string in snake_case.
    """
    # Handle the sequence of uppercase letters followed by a lowercase letter
    snake = re.sub(r'([A-Z]+)([A-Z][a-z])', lambda m: f'{m.group(1)}_{m.group(2)}', camel)
    # Insert an underscore between a lowercase letter and an uppercase letter
    snake = re.sub(r'([a-z])([A-Z])', lambda m: f'{m.group(1)}_{m.group(2)}', snake)
    # Insert an underscore between a digit and an uppercase letter
    snake = re.sub(r'([0-9])([A-Z])', lambda m: f'{m.group(1)}_{m.group(2)}', snake)
    # Insert an underscore between a lowercase letter and a digit
    snake = re.sub(r'([a-z])([0-9])', lambda m: f'{m.group(1)}_{m.group(2)}', snake)
    # Replace hyphens with underscores to handle kebab-case
    snake = snake.replace('-', '_')
    return snake.lower()

pydantic-2.10.6/pydantic/aliases.py
"""Support for alias configurations."""

from __future__ import annotations

import dataclasses
from typing import Any, Callable, Literal

from pydantic_core import PydanticUndefined

from ._internal import _internal_dataclass

__all__ = ('AliasGenerator', 'AliasPath', 'AliasChoices')


@dataclasses.dataclass(**_internal_dataclass.slots_true)
class AliasPath:
    """Usage docs: https://docs.pydantic.dev/2.10/concepts/alias#aliaspath-and-aliaschoices

    A data class used by `validation_alias` as a convenience to create aliases.

    Attributes:
        path: A list of string or integer aliases.
    """

    path: list[int | str]

    def __init__(self, first_arg: str, *args: str | int) -> None:
        self.path = [first_arg] + list(args)

    def convert_to_aliases(self) -> list[str | int]:
        """Converts arguments to a list of string or integer aliases.

        Returns:
            The list of aliases.
        """
        return self.path

    def search_dict_for_path(self, d: dict) -> Any:
        """Searches a dictionary for the path specified by the alias.

        Returns:
            The value at the specified path, or `PydanticUndefined` if the path is not found.
""" v = d for k in self.path: if isinstance(v, str): # disallow indexing into a str, like for AliasPath('x', 0) and x='abc' return PydanticUndefined try: v = v[k] except (KeyError, IndexError, TypeError): return PydanticUndefined return v @dataclasses.dataclass(**_internal_dataclass.slots_true) class AliasChoices: """Usage docs: https://docs.pydantic.dev/2.10/concepts/alias#aliaspath-and-aliaschoices A data class used by `validation_alias` as a convenience to create aliases. Attributes: choices: A list containing a string or `AliasPath`. """ choices: list[str | AliasPath] def __init__(self, first_choice: str | AliasPath, *choices: str | AliasPath) -> None: self.choices = [first_choice] + list(choices) def convert_to_aliases(self) -> list[list[str | int]]: """Converts arguments to a list of lists containing string or integer aliases. Returns: The list of aliases. """ aliases: list[list[str | int]] = [] for c in self.choices: if isinstance(c, AliasPath): aliases.append(c.convert_to_aliases()) else: aliases.append([c]) return aliases @dataclasses.dataclass(**_internal_dataclass.slots_true) class AliasGenerator: """Usage docs: https://docs.pydantic.dev/2.10/concepts/alias#using-an-aliasgenerator A data class used by `alias_generator` as a convenience to create various aliases. Attributes: alias: A callable that takes a field name and returns an alias for it. validation_alias: A callable that takes a field name and returns a validation alias for it. serialization_alias: A callable that takes a field name and returns a serialization alias for it. 
""" alias: Callable[[str], str] | None = None validation_alias: Callable[[str], str | AliasPath | AliasChoices] | None = None serialization_alias: Callable[[str], str] | None = None def _generate_alias( self, alias_kind: Literal['alias', 'validation_alias', 'serialization_alias'], allowed_types: tuple[type[str] | type[AliasPath] | type[AliasChoices], ...], field_name: str, ) -> str | AliasPath | AliasChoices | None: """Generate an alias of the specified kind. Returns None if the alias generator is None. Raises: TypeError: If the alias generator produces an invalid type. """ alias = None if alias_generator := getattr(self, alias_kind): alias = alias_generator(field_name) if alias and not isinstance(alias, allowed_types): raise TypeError( f'Invalid `{alias_kind}` type. `{alias_kind}` generator must produce one of `{allowed_types}`' ) return alias def generate_aliases(self, field_name: str) -> tuple[str | None, str | AliasPath | AliasChoices | None, str | None]: """Generate `alias`, `validation_alias`, and `serialization_alias` for a field. Returns: A tuple of three aliases - validation, alias, and serialization. 
""" alias = self._generate_alias('alias', (str,), field_name) validation_alias = self._generate_alias('validation_alias', (str, AliasChoices, AliasPath), field_name) serialization_alias = self._generate_alias('serialization_alias', (str,), field_name) return alias, validation_alias, serialization_alias # type: ignore pydantic-2.10.6/pydantic/annotated_handlers.py000066400000000000000000000104671474456633400214650ustar00rootroot00000000000000"""Type annotations to use with `__get_pydantic_core_schema__` and `__get_pydantic_json_schema__`.""" from __future__ import annotations as _annotations from typing import TYPE_CHECKING, Any, Union from pydantic_core import core_schema if TYPE_CHECKING: from ._internal._namespace_utils import NamespacesTuple from .json_schema import JsonSchemaMode, JsonSchemaValue CoreSchemaOrField = Union[ core_schema.CoreSchema, core_schema.ModelField, core_schema.DataclassField, core_schema.TypedDictField, core_schema.ComputedField, ] __all__ = 'GetJsonSchemaHandler', 'GetCoreSchemaHandler' class GetJsonSchemaHandler: """Handler to call into the next JSON schema generation function. Attributes: mode: Json schema mode, can be `validation` or `serialization`. """ mode: JsonSchemaMode def __call__(self, core_schema: CoreSchemaOrField, /) -> JsonSchemaValue: """Call the inner handler and get the JsonSchemaValue it returns. This will call the next JSON schema modifying function up until it calls into `pydantic.json_schema.GenerateJsonSchema`, which will raise a `pydantic.errors.PydanticInvalidForJsonSchema` error if it cannot generate a JSON schema. Args: core_schema: A `pydantic_core.core_schema.CoreSchema`. Returns: JsonSchemaValue: The JSON schema generated by the inner JSON schema modify functions. """ raise NotImplementedError def resolve_ref_schema(self, maybe_ref_json_schema: JsonSchemaValue, /) -> JsonSchemaValue: """Get the real schema for a `{"$ref": ...}` schema. If the schema given is not a `$ref` schema, it will be returned as is. 
This means you don't have to check before calling this function. Args: maybe_ref_json_schema: A JsonSchemaValue which may be a `$ref` schema. Raises: LookupError: If the ref is not found. Returns: JsonSchemaValue: A JsonSchemaValue that has no `$ref`. """ raise NotImplementedError class GetCoreSchemaHandler: """Handler to call into the next CoreSchema schema generation function.""" def __call__(self, source_type: Any, /) -> core_schema.CoreSchema: """Call the inner handler and get the CoreSchema it returns. This will call the next CoreSchema modifying function up until it calls into Pydantic's internal schema generation machinery, which will raise a `pydantic.errors.PydanticSchemaGenerationError` error if it cannot generate a CoreSchema for the given source type. Args: source_type: The input type. Returns: CoreSchema: The `pydantic-core` CoreSchema generated. """ raise NotImplementedError def generate_schema(self, source_type: Any, /) -> core_schema.CoreSchema: """Generate a schema unrelated to the current context. Use this function if e.g. you are handling schema generation for a sequence and want to generate a schema for its items. Otherwise, you may end up doing something like applying a `min_length` constraint that was intended for the sequence itself to its items! Args: source_type: The input type. Returns: CoreSchema: The `pydantic-core` CoreSchema generated. """ raise NotImplementedError def resolve_ref_schema(self, maybe_ref_schema: core_schema.CoreSchema, /) -> core_schema.CoreSchema: """Get the real schema for a `definition-ref` schema. If the schema given is not a `definition-ref` schema, it will be returned as is. This means you don't have to check before calling this function. Args: maybe_ref_schema: A `CoreSchema`, `ref`-based or not. Raises: LookupError: If the `ref` is not found. Returns: A concrete `CoreSchema`. 
""" raise NotImplementedError @property def field_name(self) -> str | None: """Get the name of the closest field to this validator.""" raise NotImplementedError def _get_types_namespace(self) -> NamespacesTuple: """Internal method used during type resolution for serializer annotations.""" raise NotImplementedError pydantic-2.10.6/pydantic/class_validators.py000066400000000000000000000002241474456633400211530ustar00rootroot00000000000000"""`class_validators` module is a backport module from V1.""" from ._migration import getattr_migration __getattr__ = getattr_migration(__name__) pydantic-2.10.6/pydantic/color.py000066400000000000000000000517661474456633400167550ustar00rootroot00000000000000"""Color definitions are used as per the CSS3 [CSS Color Module Level 3](http://www.w3.org/TR/css3-color/#svg-color) specification. A few colors have multiple names referring to the sames colors, eg. `grey` and `gray` or `aqua` and `cyan`. In these cases the _last_ color when sorted alphabetically takes preferences, eg. `Color((0, 255, 255)).as_named() == 'cyan'` because "cyan" comes after "aqua". Warning: Deprecated The `Color` class is deprecated, use `pydantic_extra_types` instead. See [`pydantic-extra-types.Color`](../usage/types/extra_types/color_types.md) for more information. 
""" import math import re from colorsys import hls_to_rgb, rgb_to_hls from typing import Any, Callable, Optional, Tuple, Type, Union, cast from pydantic_core import CoreSchema, PydanticCustomError, core_schema from typing_extensions import deprecated from ._internal import _repr from ._internal._schema_generation_shared import GetJsonSchemaHandler as _GetJsonSchemaHandler from .json_schema import JsonSchemaValue from .warnings import PydanticDeprecatedSince20 ColorTuple = Union[Tuple[int, int, int], Tuple[int, int, int, float]] ColorType = Union[ColorTuple, str] HslColorTuple = Union[Tuple[float, float, float], Tuple[float, float, float, float]] class RGBA: """Internal use only as a representation of a color.""" __slots__ = 'r', 'g', 'b', 'alpha', '_tuple' def __init__(self, r: float, g: float, b: float, alpha: Optional[float]): self.r = r self.g = g self.b = b self.alpha = alpha self._tuple: Tuple[float, float, float, Optional[float]] = (r, g, b, alpha) def __getitem__(self, item: Any) -> Any: return self._tuple[item] # these are not compiled here to avoid import slowdown, they'll be compiled the first time they're used, then cached _r_255 = r'(\d{1,3}(?:\.\d+)?)' _r_comma = r'\s*,\s*' _r_alpha = r'(\d(?:\.\d+)?|\.\d+|\d{1,2}%)' _r_h = r'(-?\d+(?:\.\d+)?|-?\.\d+)(deg|rad|turn)?' 
_r_sl = r'(\d{1,3}(?:\.\d+)?)%' r_hex_short = r'\s*(?:#|0x)?([0-9a-f])([0-9a-f])([0-9a-f])([0-9a-f])?\s*' r_hex_long = r'\s*(?:#|0x)?([0-9a-f]{2})([0-9a-f]{2})([0-9a-f]{2})([0-9a-f]{2})?\s*' # CSS3 RGB examples: rgb(0, 0, 0), rgba(0, 0, 0, 0.5), rgba(0, 0, 0, 50%) r_rgb = rf'\s*rgba?\(\s*{_r_255}{_r_comma}{_r_255}{_r_comma}{_r_255}(?:{_r_comma}{_r_alpha})?\s*\)\s*' # CSS3 HSL examples: hsl(270, 60%, 50%), hsla(270, 60%, 50%, 0.5), hsla(270, 60%, 50%, 50%) r_hsl = rf'\s*hsla?\(\s*{_r_h}{_r_comma}{_r_sl}{_r_comma}{_r_sl}(?:{_r_comma}{_r_alpha})?\s*\)\s*' # CSS4 RGB examples: rgb(0 0 0), rgb(0 0 0 / 0.5), rgb(0 0 0 / 50%), rgba(0 0 0 / 50%) r_rgb_v4_style = rf'\s*rgba?\(\s*{_r_255}\s+{_r_255}\s+{_r_255}(?:\s*/\s*{_r_alpha})?\s*\)\s*' # CSS4 HSL examples: hsl(270 60% 50%), hsl(270 60% 50% / 0.5), hsl(270 60% 50% / 50%), hsla(270 60% 50% / 50%) r_hsl_v4_style = rf'\s*hsla?\(\s*{_r_h}\s+{_r_sl}\s+{_r_sl}(?:\s*/\s*{_r_alpha})?\s*\)\s*' # colors where the two hex characters are the same, if all colors match this the short version of hex colors can be used repeat_colors = {int(c * 2, 16) for c in '0123456789abcdef'} rads = 2 * math.pi @deprecated( 'The `Color` class is deprecated, use `pydantic_extra_types` instead. 
' 'See https://docs.pydantic.dev/latest/api/pydantic_extra_types_color/.', category=PydanticDeprecatedSince20, ) class Color(_repr.Representation): """Represents a color.""" __slots__ = '_original', '_rgba' def __init__(self, value: ColorType) -> None: self._rgba: RGBA self._original: ColorType if isinstance(value, (tuple, list)): self._rgba = parse_tuple(value) elif isinstance(value, str): self._rgba = parse_str(value) elif isinstance(value, Color): self._rgba = value._rgba value = value._original else: raise PydanticCustomError( 'color_error', 'value is not a valid color: value must be a tuple, list or string' ) # if we've got here value must be a valid color self._original = value @classmethod def __get_pydantic_json_schema__( cls, core_schema: core_schema.CoreSchema, handler: _GetJsonSchemaHandler ) -> JsonSchemaValue: field_schema = {} field_schema.update(type='string', format='color') return field_schema def original(self) -> ColorType: """Original value passed to `Color`.""" return self._original def as_named(self, *, fallback: bool = False) -> str: """Returns the name of the color if it can be found in `COLORS_BY_VALUE` dictionary, otherwise returns the hexadecimal representation of the color or raises `ValueError`. Args: fallback: If True, falls back to returning the hexadecimal representation of the color instead of raising a ValueError when no named color is found. Returns: The name of the color, or the hexadecimal representation of the color. Raises: ValueError: When no named color is found and fallback is `False`. """ if self._rgba.alpha is None: rgb = cast(Tuple[int, int, int], self.as_rgb_tuple()) try: return COLORS_BY_VALUE[rgb] except KeyError as e: if fallback: return self.as_hex() else: raise ValueError('no named color found, use fallback=True, as_hex() or as_rgb()') from e else: return self.as_hex() def as_hex(self) -> str: """Returns the hexadecimal representation of the color. 
Hex string representing the color can be 3, 4, 6, or 8 characters depending on whether a "short" representation of the color is possible and whether there's an alpha channel. Returns: The hexadecimal representation of the color. """ values = [float_to_255(c) for c in self._rgba[:3]] if self._rgba.alpha is not None: values.append(float_to_255(self._rgba.alpha)) as_hex = ''.join(f'{v:02x}' for v in values) if all(c in repeat_colors for c in values): as_hex = ''.join(as_hex[c] for c in range(0, len(as_hex), 2)) return '#' + as_hex def as_rgb(self) -> str: """Color as an `rgb(<r>, <g>, <b>)` or `rgba(<r>, <g>, <b>, <a>)` string.""" if self._rgba.alpha is None: return f'rgb({float_to_255(self._rgba.r)}, {float_to_255(self._rgba.g)}, {float_to_255(self._rgba.b)})' else: return ( f'rgba({float_to_255(self._rgba.r)}, {float_to_255(self._rgba.g)}, {float_to_255(self._rgba.b)}, ' f'{round(self._alpha_float(), 2)})' ) def as_rgb_tuple(self, *, alpha: Optional[bool] = None) -> ColorTuple: """Returns the color as an RGB or RGBA tuple. Args: alpha: Whether to include the alpha channel. There are three options for this input: - `None` (default): Include alpha only if it's set (e.g. not `None`). - `True`: Always include alpha. - `False`: Always omit alpha. Returns: A tuple that contains the values of the red, green, and blue channels in the range 0 to 255. If alpha is included, it is in the range 0 to 1.
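A minimal illustrative sketch of the alpha handling described above (uses the deprecated `Color` class, which emits a `DeprecationWarning` on instantiation):

```python
from pydantic.color import Color

# No alpha channel in the input, so a 3-tuple is returned by default
c = Color('ff0000')
assert c.as_rgb_tuple() == (255, 0, 0)

# Forcing alpha: an unset alpha is treated as fully opaque
assert c.as_rgb_tuple(alpha=True) == (255, 0, 0, 1)

# An explicit alpha is reported in the 0 to 1 range
assert Color('#ff000080').as_rgb_tuple() == (255, 0, 0, 128 / 255)
```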
""" r, g, b = (float_to_255(c) for c in self._rgba[:3]) if alpha is None: if self._rgba.alpha is None: return r, g, b else: return r, g, b, self._alpha_float() elif alpha: return r, g, b, self._alpha_float() else: # alpha is False return r, g, b def as_hsl(self) -> str: """Color as an `hsl(, , )` or `hsl(, , , )` string.""" if self._rgba.alpha is None: h, s, li = self.as_hsl_tuple(alpha=False) # type: ignore return f'hsl({h * 360:0.0f}, {s:0.0%}, {li:0.0%})' else: h, s, li, a = self.as_hsl_tuple(alpha=True) # type: ignore return f'hsl({h * 360:0.0f}, {s:0.0%}, {li:0.0%}, {round(a, 2)})' def as_hsl_tuple(self, *, alpha: Optional[bool] = None) -> HslColorTuple: """Returns the color as an HSL or HSLA tuple. Args: alpha: Whether to include the alpha channel. - `None` (default): Include the alpha channel only if it's set (e.g. not `None`). - `True`: Always include alpha. - `False`: Always omit alpha. Returns: The color as a tuple of hue, saturation, lightness, and alpha (if included). All elements are in the range 0 to 1. Note: This is HSL as used in HTML and most other places, not HLS as used in Python's `colorsys`. 
""" h, l, s = rgb_to_hls(self._rgba.r, self._rgba.g, self._rgba.b) # noqa: E741 if alpha is None: if self._rgba.alpha is None: return h, s, l else: return h, s, l, self._alpha_float() if alpha: return h, s, l, self._alpha_float() else: # alpha is False return h, s, l def _alpha_float(self) -> float: return 1 if self._rgba.alpha is None else self._rgba.alpha @classmethod def __get_pydantic_core_schema__( cls, source: Type[Any], handler: Callable[[Any], CoreSchema] ) -> core_schema.CoreSchema: return core_schema.with_info_plain_validator_function( cls._validate, serialization=core_schema.to_string_ser_schema() ) @classmethod def _validate(cls, __input_value: Any, _: Any) -> 'Color': return cls(__input_value) def __str__(self) -> str: return self.as_named(fallback=True) def __repr_args__(self) -> '_repr.ReprArgs': return [(None, self.as_named(fallback=True))] + [('rgb', self.as_rgb_tuple())] def __eq__(self, other: Any) -> bool: return isinstance(other, Color) and self.as_rgb_tuple() == other.as_rgb_tuple() def __hash__(self) -> int: return hash(self.as_rgb_tuple()) def parse_tuple(value: Tuple[Any, ...]) -> RGBA: """Parse a tuple or list to get RGBA values. Args: value: A tuple or list. Returns: An `RGBA` tuple parsed from the input tuple. Raises: PydanticCustomError: If tuple is not valid. """ if len(value) == 3: r, g, b = (parse_color_value(v) for v in value) return RGBA(r, g, b, None) elif len(value) == 4: r, g, b = (parse_color_value(v) for v in value[:3]) return RGBA(r, g, b, parse_float_alpha(value[3])) else: raise PydanticCustomError('color_error', 'value is not a valid color: tuples must have length 3 or 4') def parse_str(value: str) -> RGBA: """Parse a string representing a color to an RGBA tuple. Possible formats for the input string include: * named color, see `COLORS_BY_NAME` * hex short eg. `fff` (prefix can be `#`, `0x` or nothing) * hex long eg. 
`ffffff` (prefix can be `#`, `0x` or nothing) * `rgb(<r>, <g>, <b>)` * `rgba(<r>, <g>, <b>, <a>)` Args: value: A string representing a color. Returns: An `RGBA` tuple parsed from the input string. Raises: ValueError: If the input string cannot be parsed to an RGBA tuple. """ value_lower = value.lower() try: r, g, b = COLORS_BY_NAME[value_lower] except KeyError: pass else: return ints_to_rgba(r, g, b, None) m = re.fullmatch(r_hex_short, value_lower) if m: *rgb, a = m.groups() r, g, b = (int(v * 2, 16) for v in rgb) if a: alpha: Optional[float] = int(a * 2, 16) / 255 else: alpha = None return ints_to_rgba(r, g, b, alpha) m = re.fullmatch(r_hex_long, value_lower) if m: *rgb, a = m.groups() r, g, b = (int(v, 16) for v in rgb) if a: alpha = int(a, 16) / 255 else: alpha = None return ints_to_rgba(r, g, b, alpha) m = re.fullmatch(r_rgb, value_lower) or re.fullmatch(r_rgb_v4_style, value_lower) if m: return ints_to_rgba(*m.groups()) # type: ignore m = re.fullmatch(r_hsl, value_lower) or re.fullmatch(r_hsl_v4_style, value_lower) if m: return parse_hsl(*m.groups()) # type: ignore raise PydanticCustomError('color_error', 'value is not a valid color: string not recognised as a valid color') def ints_to_rgba(r: Union[int, str], g: Union[int, str], b: Union[int, str], alpha: Optional[float] = None) -> RGBA: """Converts integer or string values for RGB color and an optional alpha value to an `RGBA` object. Args: r: An integer or string representing the red color value. g: An integer or string representing the green color value. b: An integer or string representing the blue color value. alpha: A float representing the alpha value. Defaults to None. Returns: An instance of the `RGBA` class with the corresponding color and alpha values. """ return RGBA(parse_color_value(r), parse_color_value(g), parse_color_value(b), parse_float_alpha(alpha)) def parse_color_value(value: Union[int, str], max_val: int = 255) -> float: """Parse the color value provided and return a number between 0 and 1.
Args: value: An integer or string color value. max_val: Maximum range value. Defaults to 255. Raises: PydanticCustomError: If the value is not a valid color. Returns: A number between 0 and 1. """ try: color = float(value) except ValueError: raise PydanticCustomError('color_error', 'value is not a valid color: color values must be a valid number') if 0 <= color <= max_val: return color / max_val else: raise PydanticCustomError( 'color_error', 'value is not a valid color: color values must be in the range 0 to {max_val}', {'max_val': max_val}, ) def parse_float_alpha(value: Union[None, str, float, int]) -> Optional[float]: """Parse an alpha value checking it's a valid float in the range 0 to 1. Args: value: The input value to parse. Returns: The parsed value as a float, or `None` if the value was None or equal 1. Raises: PydanticCustomError: If the input value cannot be successfully parsed as a float in the expected range. """ if value is None: return None try: if isinstance(value, str) and value.endswith('%'): alpha = float(value[:-1]) / 100 else: alpha = float(value) except ValueError: raise PydanticCustomError('color_error', 'value is not a valid color: alpha values must be a valid float') if math.isclose(alpha, 1): return None elif 0 <= alpha <= 1: return alpha else: raise PydanticCustomError('color_error', 'value is not a valid color: alpha values must be in the range 0 to 1') def parse_hsl(h: str, h_units: str, sat: str, light: str, alpha: Optional[float] = None) -> RGBA: """Parse raw hue, saturation, lightness, and alpha values and convert to RGBA. Args: h: The hue value. h_units: The unit for hue value. sat: The saturation value. light: The lightness value. alpha: Alpha value. Returns: An instance of `RGBA`. 
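For instance, hue may be given in degrees and saturation/lightness as percentage strings, matching the regular expressions defined above:

```python
from pydantic.color import parse_hsl

# hsl(0deg, 100%, 50%) is pure red; no alpha value was supplied
rgba = parse_hsl('0', 'deg', '100', '50')
assert (rgba.r, rgba.g, rgba.b, rgba.alpha) == (1.0, 0.0, 0.0, None)

# 360deg wraps around to the same hue
assert parse_hsl('360', 'deg', '100', '50')[:3] == (1.0, 0.0, 0.0)
```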
""" s_value, l_value = parse_color_value(sat, 100), parse_color_value(light, 100) h_value = float(h) if h_units in {None, 'deg'}: h_value = h_value % 360 / 360 elif h_units == 'rad': h_value = h_value % rads / rads else: # turns h_value = h_value % 1 r, g, b = hls_to_rgb(h_value, l_value, s_value) return RGBA(r, g, b, parse_float_alpha(alpha)) def float_to_255(c: float) -> int: """Converts a float value between 0 and 1 (inclusive) to an integer between 0 and 255 (inclusive). Args: c: The float value to be converted. Must be between 0 and 1 (inclusive). Returns: The integer equivalent of the given float value rounded to the nearest whole number. Raises: ValueError: If the given float value is outside the acceptable range of 0 to 1 (inclusive). """ return int(round(c * 255)) COLORS_BY_NAME = { 'aliceblue': (240, 248, 255), 'antiquewhite': (250, 235, 215), 'aqua': (0, 255, 255), 'aquamarine': (127, 255, 212), 'azure': (240, 255, 255), 'beige': (245, 245, 220), 'bisque': (255, 228, 196), 'black': (0, 0, 0), 'blanchedalmond': (255, 235, 205), 'blue': (0, 0, 255), 'blueviolet': (138, 43, 226), 'brown': (165, 42, 42), 'burlywood': (222, 184, 135), 'cadetblue': (95, 158, 160), 'chartreuse': (127, 255, 0), 'chocolate': (210, 105, 30), 'coral': (255, 127, 80), 'cornflowerblue': (100, 149, 237), 'cornsilk': (255, 248, 220), 'crimson': (220, 20, 60), 'cyan': (0, 255, 255), 'darkblue': (0, 0, 139), 'darkcyan': (0, 139, 139), 'darkgoldenrod': (184, 134, 11), 'darkgray': (169, 169, 169), 'darkgreen': (0, 100, 0), 'darkgrey': (169, 169, 169), 'darkkhaki': (189, 183, 107), 'darkmagenta': (139, 0, 139), 'darkolivegreen': (85, 107, 47), 'darkorange': (255, 140, 0), 'darkorchid': (153, 50, 204), 'darkred': (139, 0, 0), 'darksalmon': (233, 150, 122), 'darkseagreen': (143, 188, 143), 'darkslateblue': (72, 61, 139), 'darkslategray': (47, 79, 79), 'darkslategrey': (47, 79, 79), 'darkturquoise': (0, 206, 209), 'darkviolet': (148, 0, 211), 'deeppink': (255, 20, 147), 'deepskyblue': (0, 191, 
255), 'dimgray': (105, 105, 105), 'dimgrey': (105, 105, 105), 'dodgerblue': (30, 144, 255), 'firebrick': (178, 34, 34), 'floralwhite': (255, 250, 240), 'forestgreen': (34, 139, 34), 'fuchsia': (255, 0, 255), 'gainsboro': (220, 220, 220), 'ghostwhite': (248, 248, 255), 'gold': (255, 215, 0), 'goldenrod': (218, 165, 32), 'gray': (128, 128, 128), 'green': (0, 128, 0), 'greenyellow': (173, 255, 47), 'grey': (128, 128, 128), 'honeydew': (240, 255, 240), 'hotpink': (255, 105, 180), 'indianred': (205, 92, 92), 'indigo': (75, 0, 130), 'ivory': (255, 255, 240), 'khaki': (240, 230, 140), 'lavender': (230, 230, 250), 'lavenderblush': (255, 240, 245), 'lawngreen': (124, 252, 0), 'lemonchiffon': (255, 250, 205), 'lightblue': (173, 216, 230), 'lightcoral': (240, 128, 128), 'lightcyan': (224, 255, 255), 'lightgoldenrodyellow': (250, 250, 210), 'lightgray': (211, 211, 211), 'lightgreen': (144, 238, 144), 'lightgrey': (211, 211, 211), 'lightpink': (255, 182, 193), 'lightsalmon': (255, 160, 122), 'lightseagreen': (32, 178, 170), 'lightskyblue': (135, 206, 250), 'lightslategray': (119, 136, 153), 'lightslategrey': (119, 136, 153), 'lightsteelblue': (176, 196, 222), 'lightyellow': (255, 255, 224), 'lime': (0, 255, 0), 'limegreen': (50, 205, 50), 'linen': (250, 240, 230), 'magenta': (255, 0, 255), 'maroon': (128, 0, 0), 'mediumaquamarine': (102, 205, 170), 'mediumblue': (0, 0, 205), 'mediumorchid': (186, 85, 211), 'mediumpurple': (147, 112, 219), 'mediumseagreen': (60, 179, 113), 'mediumslateblue': (123, 104, 238), 'mediumspringgreen': (0, 250, 154), 'mediumturquoise': (72, 209, 204), 'mediumvioletred': (199, 21, 133), 'midnightblue': (25, 25, 112), 'mintcream': (245, 255, 250), 'mistyrose': (255, 228, 225), 'moccasin': (255, 228, 181), 'navajowhite': (255, 222, 173), 'navy': (0, 0, 128), 'oldlace': (253, 245, 230), 'olive': (128, 128, 0), 'olivedrab': (107, 142, 35), 'orange': (255, 165, 0), 'orangered': (255, 69, 0), 'orchid': (218, 112, 214), 'palegoldenrod': (238, 232, 170), 
'palegreen': (152, 251, 152), 'paleturquoise': (175, 238, 238), 'palevioletred': (219, 112, 147), 'papayawhip': (255, 239, 213), 'peachpuff': (255, 218, 185), 'peru': (205, 133, 63), 'pink': (255, 192, 203), 'plum': (221, 160, 221), 'powderblue': (176, 224, 230), 'purple': (128, 0, 128), 'red': (255, 0, 0), 'rosybrown': (188, 143, 143), 'royalblue': (65, 105, 225), 'saddlebrown': (139, 69, 19), 'salmon': (250, 128, 114), 'sandybrown': (244, 164, 96), 'seagreen': (46, 139, 87), 'seashell': (255, 245, 238), 'sienna': (160, 82, 45), 'silver': (192, 192, 192), 'skyblue': (135, 206, 235), 'slateblue': (106, 90, 205), 'slategray': (112, 128, 144), 'slategrey': (112, 128, 144), 'snow': (255, 250, 250), 'springgreen': (0, 255, 127), 'steelblue': (70, 130, 180), 'tan': (210, 180, 140), 'teal': (0, 128, 128), 'thistle': (216, 191, 216), 'tomato': (255, 99, 71), 'turquoise': (64, 224, 208), 'violet': (238, 130, 238), 'wheat': (245, 222, 179), 'white': (255, 255, 255), 'whitesmoke': (245, 245, 245), 'yellow': (255, 255, 0), 'yellowgreen': (154, 205, 50), } COLORS_BY_VALUE = {v: k for k, v in COLORS_BY_NAME.items()} pydantic-2.10.6/pydantic/config.py000066400000000000000000001054331474456633400170730ustar00rootroot00000000000000"""Configuration for Pydantic models.""" from __future__ import annotations as _annotations from re import Pattern from typing import TYPE_CHECKING, Any, Callable, Dict, List, Type, TypeVar, Union from typing_extensions import Literal, TypeAlias, TypedDict from ._migration import getattr_migration from .aliases import AliasGenerator from .errors import PydanticUserError if TYPE_CHECKING: from ._internal._generate_schema import GenerateSchema as _GenerateSchema from .fields import ComputedFieldInfo, FieldInfo __all__ = ('ConfigDict', 'with_config') JsonValue: TypeAlias = Union[int, float, str, bool, None, List['JsonValue'], 'JsonDict'] JsonDict: TypeAlias = Dict[str, JsonValue] JsonEncoder = Callable[[Any], Any] JsonSchemaExtraCallable: TypeAlias = Union[ 
Callable[[JsonDict], None], Callable[[JsonDict, Type[Any]], None], ] ExtraValues = Literal['allow', 'ignore', 'forbid'] class ConfigDict(TypedDict, total=False): """A TypedDict for configuring Pydantic behaviour.""" title: str | None """The title for the generated JSON schema, defaults to the model's name""" model_title_generator: Callable[[type], str] | None """A callable that takes a model class and returns the title for it. Defaults to `None`.""" field_title_generator: Callable[[str, FieldInfo | ComputedFieldInfo], str] | None """A callable that takes a field's name and info and returns title for it. Defaults to `None`.""" str_to_lower: bool """Whether to convert all characters to lowercase for str types. Defaults to `False`.""" str_to_upper: bool """Whether to convert all characters to uppercase for str types. Defaults to `False`.""" str_strip_whitespace: bool """Whether to strip leading and trailing whitespace for str types.""" str_min_length: int """The minimum length for str types. Defaults to `None`.""" str_max_length: int | None """The maximum length for str types. Defaults to `None`.""" extra: ExtraValues | None """ Whether to ignore, allow, or forbid extra attributes during model initialization. Defaults to `'ignore'`. You can configure how pydantic handles the attributes that are not defined in the model: * `allow` - Allow any extra attributes. * `forbid` - Forbid any extra attributes. * `ignore` - Ignore any extra attributes. ```python from pydantic import BaseModel, ConfigDict class User(BaseModel): model_config = ConfigDict(extra='ignore') # (1)! name: str user = User(name='John Doe', age=20) # (2)! print(user) #> name='John Doe' ``` 1. This is the default behaviour. 2. The `age` argument is ignored. Instead, with `extra='allow'`, the `age` argument is included: ```python from pydantic import BaseModel, ConfigDict class User(BaseModel): model_config = ConfigDict(extra='allow') name: str user = User(name='John Doe', age=20) # (1)! 
print(user) #> name='John Doe' age=20 ``` 1. The `age` argument is included. With `extra='forbid'`, an error is raised: ```python from pydantic import BaseModel, ConfigDict, ValidationError class User(BaseModel): model_config = ConfigDict(extra='forbid') name: str try: User(name='John Doe', age=20) except ValidationError as e: print(e) ''' 1 validation error for User age Extra inputs are not permitted [type=extra_forbidden, input_value=20, input_type=int] ''' ``` """ frozen: bool """ Whether models are faux-immutable, i.e. whether `__setattr__` is allowed, and also generates a `__hash__()` method for the model. This makes instances of the model potentially hashable if all the attributes are hashable. Defaults to `False`. Note: On V1, the inverse of this setting was called `allow_mutation`, and was `True` by default. """ populate_by_name: bool """ Whether an aliased field may be populated by its name as given by the model attribute, as well as the alias. Defaults to `False`. Note: The name of this configuration setting was changed in **v2.0** from `allow_population_by_field_name` to `populate_by_name`. ```python from pydantic import BaseModel, ConfigDict, Field class User(BaseModel): model_config = ConfigDict(populate_by_name=True) name: str = Field(alias='full_name') # (1)! age: int user = User(full_name='John Doe', age=20) # (2)! print(user) #> name='John Doe' age=20 user = User(name='John Doe', age=20) # (3)! print(user) #> name='John Doe' age=20 ``` 1. The field `'name'` has an alias `'full_name'`. 2. The model is populated by the alias `'full_name'`. 3. The model is populated by the field name `'name'`. """ use_enum_values: bool """ Whether to populate models with the `value` property of enums, rather than the raw enum. This may be useful if you want to serialize `model.model_dump()` later. Defaults to `False`. !!! 
note If you have an `Optional[Enum]` value that you set a default for, you need to use `validate_default=True` for said Field to ensure that the `use_enum_values` flag takes effect on the default, as extracting an enum's value occurs during validation, not serialization. ```python from enum import Enum from typing import Optional from pydantic import BaseModel, ConfigDict, Field class SomeEnum(Enum): FOO = 'foo' BAR = 'bar' BAZ = 'baz' class SomeModel(BaseModel): model_config = ConfigDict(use_enum_values=True) some_enum: SomeEnum another_enum: Optional[SomeEnum] = Field( default=SomeEnum.FOO, validate_default=True ) model1 = SomeModel(some_enum=SomeEnum.BAR) print(model1.model_dump()) #> {'some_enum': 'bar', 'another_enum': 'foo'} model2 = SomeModel(some_enum=SomeEnum.BAR, another_enum=SomeEnum.BAZ) print(model2.model_dump()) #> {'some_enum': 'bar', 'another_enum': 'baz'} ``` """ validate_assignment: bool """ Whether to validate the data when the model is changed. Defaults to `False`. The default behavior of Pydantic is to validate the data when the model is created. In case the user changes the data after the model is created, the model is _not_ revalidated. ```python from pydantic import BaseModel class User(BaseModel): name: str user = User(name='John Doe') # (1)! print(user) #> name='John Doe' user.name = 123 # (2)! print(user) #> name=123 ``` 1. The validation happens only when the model is created. 2. The validation does not happen when the data is changed. In case you want to revalidate the model when the data is changed, you can use `validate_assignment=True`: ```python from pydantic import BaseModel, ValidationError class User(BaseModel, validate_assignment=True): # (1)! name: str user = User(name='John Doe') # (2)! print(user) #> name='John Doe' try: user.name = 123 # (3)! except ValidationError as e: print(e) ''' 1 validation error for User name Input should be a valid string [type=string_type, input_value=123, input_type=int] ''' ``` 1.
You can either use class keyword arguments, or `model_config` to set `validate_assignment=True`. 2. The validation happens when the model is created. 3. The validation _also_ happens when the data is changed. """ arbitrary_types_allowed: bool """ Whether arbitrary types are allowed for field types. Defaults to `False`. ```python from pydantic import BaseModel, ConfigDict, ValidationError # This is not a pydantic model, it's an arbitrary class class Pet: def __init__(self, name: str): self.name = name class Model(BaseModel): model_config = ConfigDict(arbitrary_types_allowed=True) pet: Pet owner: str pet = Pet(name='Hedwig') # A simple check of instance type is used to validate the data model = Model(owner='Harry', pet=pet) print(model) #> pet=<__main__.Pet object at 0x0123456789ab> owner='Harry' print(model.pet) #> <__main__.Pet object at 0x0123456789ab> print(model.pet.name) #> Hedwig print(type(model.pet)) #> <class '__main__.Pet'> try: # If the value is not an instance of the type, it's invalid Model(owner='Harry', pet='Hedwig') except ValidationError as e: print(e) ''' 1 validation error for Model pet Input should be an instance of Pet [type=is_instance_of, input_value='Hedwig', input_type=str] ''' # Nothing in the instance of the arbitrary type is checked # Here name probably should have been a str, but it's not validated pet2 = Pet(name=42) model2 = Model(owner='Harry', pet=pet2) print(model2) #> pet=<__main__.Pet object at 0x0123456789ab> owner='Harry' print(model2.pet) #> <__main__.Pet object at 0x0123456789ab> print(model2.pet.name) #> 42 print(type(model2.pet)) #> <class '__main__.Pet'> ``` """ from_attributes: bool """ Whether to build models and look up discriminators of tagged unions using python object attributes. """ loc_by_alias: bool """Whether to use the actual key provided in the data (e.g. alias) for error `loc`s rather than the field's name.
Defaults to `True`.""" alias_generator: Callable[[str], str] | AliasGenerator | None """ A callable that takes a field name and returns an alias for it or an instance of [`AliasGenerator`][pydantic.aliases.AliasGenerator]. Defaults to `None`. When using a callable, the alias generator is used for both validation and serialization. If you want to use different alias generators for validation and serialization, you can use [`AliasGenerator`][pydantic.aliases.AliasGenerator] instead. If data source field names do not match your code style (e. g. CamelCase fields), you can automatically generate aliases using `alias_generator`. Here's an example with a basic callable: ```python from pydantic import BaseModel, ConfigDict from pydantic.alias_generators import to_pascal class Voice(BaseModel): model_config = ConfigDict(alias_generator=to_pascal) name: str language_code: str voice = Voice(Name='Filiz', LanguageCode='tr-TR') print(voice.language_code) #> tr-TR print(voice.model_dump(by_alias=True)) #> {'Name': 'Filiz', 'LanguageCode': 'tr-TR'} ``` If you want to use different alias generators for validation and serialization, you can use [`AliasGenerator`][pydantic.aliases.AliasGenerator]. ```python from pydantic import AliasGenerator, BaseModel, ConfigDict from pydantic.alias_generators import to_camel, to_pascal class Athlete(BaseModel): first_name: str last_name: str sport: str model_config = ConfigDict( alias_generator=AliasGenerator( validation_alias=to_camel, serialization_alias=to_pascal, ) ) athlete = Athlete(firstName='John', lastName='Doe', sport='track') print(athlete.model_dump(by_alias=True)) #> {'FirstName': 'John', 'LastName': 'Doe', 'Sport': 'track'} ``` Note: Pydantic offers three built-in alias generators: [`to_pascal`][pydantic.alias_generators.to_pascal], [`to_camel`][pydantic.alias_generators.to_camel], and [`to_snake`][pydantic.alias_generators.to_snake]. """ ignored_types: tuple[type, ...] 
"""A tuple of types that may occur as values of class attributes without annotations. This is typically used for custom descriptors (classes that behave like `property`). If an attribute is set on a class without an annotation and has a type that is not in this tuple (or otherwise recognized by _pydantic_), an error will be raised. Defaults to `()`. """ allow_inf_nan: bool """Whether to allow infinity (`+inf` an `-inf`) and NaN values to float and decimal fields. Defaults to `True`.""" json_schema_extra: JsonDict | JsonSchemaExtraCallable | None """A dict or callable to provide extra JSON schema properties. Defaults to `None`.""" json_encoders: dict[type[object], JsonEncoder] | None """ A `dict` of custom JSON encoders for specific types. Defaults to `None`. !!! warning "Deprecated" This config option is a carryover from v1. We originally planned to remove it in v2 but didn't have a 1:1 replacement so we are keeping it for now. It is still deprecated and will likely be removed in the future. """ # new in V2 strict: bool """ _(new in V2)_ If `True`, strict validation is applied to all fields on the model. By default, Pydantic attempts to coerce values to the correct type, when possible. There are situations in which you may want to disable this behavior, and instead raise an error if a value's type does not match the field's type annotation. To configure strict mode for all fields on a model, you can set `strict=True` on the model. ```python from pydantic import BaseModel, ConfigDict class Model(BaseModel): model_config = ConfigDict(strict=True) name: str age: int ``` See [Strict Mode](../concepts/strict_mode.md) for more details. See the [Conversion Table](../concepts/conversion_table.md) for more details on how Pydantic converts data in both strict and lax modes. 
""" # whether instances of models and dataclasses (including subclass instances) should re-validate, default 'never' revalidate_instances: Literal['always', 'never', 'subclass-instances'] """ When and how to revalidate models and dataclasses during validation. Accepts the string values of `'never'`, `'always'` and `'subclass-instances'`. Defaults to `'never'`. - `'never'` will not revalidate models and dataclasses during validation - `'always'` will revalidate models and dataclasses during validation - `'subclass-instances'` will revalidate models and dataclasses during validation if the instance is a subclass of the model or dataclass By default, model and dataclass instances are not revalidated during validation. ```python from typing import List from pydantic import BaseModel class User(BaseModel, revalidate_instances='never'): # (1)! hobbies: List[str] class SubUser(User): sins: List[str] class Transaction(BaseModel): user: User my_user = User(hobbies=['reading']) t = Transaction(user=my_user) print(t) #> user=User(hobbies=['reading']) my_user.hobbies = [1] # (2)! t = Transaction(user=my_user) # (3)! print(t) #> user=User(hobbies=[1]) my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying']) t = Transaction(user=my_sub_user) print(t) #> user=SubUser(hobbies=['scuba diving'], sins=['lying']) ``` 1. `revalidate_instances` is set to `'never'` by **default. 2. The assignment is not validated, unless you set `validate_assignment` to `True` in the model's config. 3. Since `revalidate_instances` is set to `never`, this is not revalidated. If you want to revalidate instances during validation, you can set `revalidate_instances` to `'always'` in the model's config. ```python from typing import List from pydantic import BaseModel, ValidationError class User(BaseModel, revalidate_instances='always'): # (1)! 
    hobbies: List[str]

class SubUser(User):
    sins: List[str]

class Transaction(BaseModel):
    user: User

my_user = User(hobbies=['reading'])
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=['reading'])

my_user.hobbies = [1]
try:
    t = Transaction(user=my_user)  # (2)!
except ValidationError as e:
    print(e)
    '''
    1 validation error for Transaction
    user.hobbies.0
      Input should be a valid string [type=string_type, input_value=1, input_type=int]
    '''

my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying'])
t = Transaction(user=my_sub_user)
print(t)  # (3)!
#> user=User(hobbies=['scuba diving'])
```

1. `revalidate_instances` is set to `'always'`.
2. The model is revalidated, since `revalidate_instances` is set to `'always'`.
3. Using `'never'` we would have gotten `user=SubUser(hobbies=['scuba diving'], sins=['lying'])`.

It's also possible to set `revalidate_instances` to `'subclass-instances'` to only revalidate instances
of subclasses of the model.

```python
from typing import List

from pydantic import BaseModel

class User(BaseModel, revalidate_instances='subclass-instances'):  # (1)!
    hobbies: List[str]

class SubUser(User):
    sins: List[str]

class Transaction(BaseModel):
    user: User

my_user = User(hobbies=['reading'])
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=['reading'])

my_user.hobbies = [1]
t = Transaction(user=my_user)  # (2)!
print(t)
#> user=User(hobbies=[1])

my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying'])
t = Transaction(user=my_sub_user)
print(t)  # (3)!
#> user=User(hobbies=['scuba diving'])
```

1. `revalidate_instances` is set to `'subclass-instances'`.
2. This is not revalidated, since `my_user` is an instance of `User`, not of a subclass of `User`.
3. Using `'never'` we would have gotten `user=SubUser(hobbies=['scuba diving'], sins=['lying'])`.
"""

ser_json_timedelta: Literal['iso8601', 'float']
"""
The format of JSON serialized timedeltas. Accepts the string values of `'iso8601'` and
`'float'`. Defaults to `'iso8601'`.
- `'iso8601'` will serialize timedeltas to ISO 8601 durations. - `'float'` will serialize timedeltas to the total number of seconds. """ ser_json_bytes: Literal['utf8', 'base64', 'hex'] """ The encoding of JSON serialized bytes. Defaults to `'utf8'`. Set equal to `val_json_bytes` to get back an equal value after serialization round trip. - `'utf8'` will serialize bytes to UTF-8 strings. - `'base64'` will serialize bytes to URL safe base64 strings. - `'hex'` will serialize bytes to hexadecimal strings. """ val_json_bytes: Literal['utf8', 'base64', 'hex'] """ The encoding of JSON serialized bytes to decode. Defaults to `'utf8'`. Set equal to `ser_json_bytes` to get back an equal value after serialization round trip. - `'utf8'` will deserialize UTF-8 strings to bytes. - `'base64'` will deserialize URL safe base64 strings to bytes. - `'hex'` will deserialize hexadecimal strings to bytes. """ ser_json_inf_nan: Literal['null', 'constants', 'strings'] """ The encoding of JSON serialized infinity and NaN float values. Defaults to `'null'`. - `'null'` will serialize infinity and NaN values as `null`. - `'constants'` will serialize infinity and NaN values as `Infinity` and `NaN`. - `'strings'` will serialize infinity as string `"Infinity"` and NaN as string `"NaN"`. """ # whether to validate default values during validation, default False validate_default: bool """Whether to validate default values during validation. Defaults to `False`.""" validate_return: bool """Whether to validate the return value from call validators. Defaults to `False`.""" protected_namespaces: tuple[str | Pattern[str], ...] """ A `tuple` of strings and/or patterns that prevent models from having fields with names that conflict with them. For strings, we match on a prefix basis. Ex, if 'dog' is in the protected namespace, 'dog_name' will be protected. For patterns, we match on the entire field name. 
Ex, if `re.compile(r'^dog$')` is in the protected namespace, 'dog' will be protected, but 'dog_name' will not be. Defaults to `('model_validate', 'model_dump',)`. The reason we've selected these is to prevent collisions with other validation / dumping formats in the future - ex, `model_validate_{some_newly_supported_format}`. Before v2.10, Pydantic used `('model_',)` as the default value for this setting to prevent collisions between model attributes and `BaseModel`'s own methods. This was changed in v2.10 given feedback that this restriction was limiting in AI and data science contexts, where it is common to have fields with names like `model_id`, `model_input`, `model_output`, etc. For more details, see https://github.com/pydantic/pydantic/issues/10315. ```python import warnings from pydantic import BaseModel warnings.filterwarnings('error') # Raise warnings as errors try: class Model(BaseModel): model_dump_something: str except UserWarning as e: print(e) ''' Field "model_dump_something" in Model has conflict with protected namespace "model_dump". You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('model_validate',)`. ''' ``` You can customize this behavior using the `protected_namespaces` setting: ```python {test="skip"} import re import warnings from pydantic import BaseModel, ConfigDict with warnings.catch_warnings(record=True) as caught_warnings: warnings.simplefilter('always') # Catch all warnings class Model(BaseModel): safe_field: str also_protect_field: str protect_this: str model_config = ConfigDict( protected_namespaces=( 'protect_me_', 'also_protect_', re.compile('^protect_this$'), ) ) for warning in caught_warnings: print(f'{warning.message}') ''' Field "also_protect_field" in Model has conflict with protected namespace "also_protect_". You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('protect_me_', re.compile('^protect_this$'))`. 
Field "protect_this" in Model has conflict with protected namespace "re.compile('^protect_this$')". You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('protect_me_', 'also_protect_')`. ''' ``` While Pydantic will only emit a warning when an item is in a protected namespace but does not actually have a collision, an error _is_ raised if there is an actual collision with an existing attribute: ```python from pydantic import BaseModel, ConfigDict try: class Model(BaseModel): model_validate: str model_config = ConfigDict(protected_namespaces=('model_',)) except NameError as e: print(e) ''' Field "model_validate" conflicts with member > of protected namespace "model_". ''' ``` """ hide_input_in_errors: bool """ Whether to hide inputs when printing errors. Defaults to `False`. Pydantic shows the input value and type when it raises `ValidationError` during the validation. ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): a: str try: Model(a=123) except ValidationError as e: print(e) ''' 1 validation error for Model a Input should be a valid string [type=string_type, input_value=123, input_type=int] ''' ``` You can hide the input value and type by setting the `hide_input_in_errors` config to `True`. ```python from pydantic import BaseModel, ConfigDict, ValidationError class Model(BaseModel): a: str model_config = ConfigDict(hide_input_in_errors=True) try: Model(a=123) except ValidationError as e: print(e) ''' 1 validation error for Model a Input should be a valid string [type=string_type] ''' ``` """ defer_build: bool """ Whether to defer model validator and serializer construction until the first model validation. Defaults to False. This can be useful to avoid the overhead of building models which are only used nested within other models, or when you want to manually define type namespace via [`Model.model_rebuild(_types_namespace=...)`][pydantic.BaseModel.model_rebuild]. 
Since v2.10, this setting also applies to pydantic dataclasses and TypeAdapter instances. """ plugin_settings: dict[str, object] | None """A `dict` of settings for plugins. Defaults to `None`.""" schema_generator: type[_GenerateSchema] | None """ !!! warning `schema_generator` is deprecated in v2.10. Prior to v2.10, this setting was advertised as highly subject to change. It's possible that this interface may once again become public once the internal core schema generation API is more stable, but that will likely come after significant performance improvements have been made. """ json_schema_serialization_defaults_required: bool """ Whether fields with default values should be marked as required in the serialization schema. Defaults to `False`. This ensures that the serialization schema will reflect the fact a field with a default will always be present when serializing the model, even though it is not required for validation. However, there are scenarios where this may be undesirable — in particular, if you want to share the schema between validation and serialization, and don't mind fields with defaults being marked as not required during serialization. See [#7209](https://github.com/pydantic/pydantic/issues/7209) for more details. ```python from pydantic import BaseModel, ConfigDict class Model(BaseModel): a: str = 'a' model_config = ConfigDict(json_schema_serialization_defaults_required=True) print(Model.model_json_schema(mode='validation')) ''' { 'properties': {'a': {'default': 'a', 'title': 'A', 'type': 'string'}}, 'title': 'Model', 'type': 'object', } ''' print(Model.model_json_schema(mode='serialization')) ''' { 'properties': {'a': {'default': 'a', 'title': 'A', 'type': 'string'}}, 'required': ['a'], 'title': 'Model', 'type': 'object', } ''' ``` """ json_schema_mode_override: Literal['validation', 'serialization', None] """ If not `None`, the specified mode will be used to generate the JSON schema regardless of what `mode` was passed to the function call. 
Defaults to `None`. This provides a way to force the JSON schema generation to reflect a specific mode, e.g., to always use the validation schema. It can be useful when using frameworks (such as FastAPI) that may generate different schemas for validation and serialization that must both be referenced from the same schema; when this happens, we automatically append `-Input` to the definition reference for the validation schema and `-Output` to the definition reference for the serialization schema. By specifying a `json_schema_mode_override` though, this prevents the conflict between the validation and serialization schemas (since both will use the specified schema), and so prevents the suffixes from being added to the definition references. ```python from pydantic import BaseModel, ConfigDict, Json class Model(BaseModel): a: Json[int] # requires a string to validate, but will dump an int print(Model.model_json_schema(mode='serialization')) ''' { 'properties': {'a': {'title': 'A', 'type': 'integer'}}, 'required': ['a'], 'title': 'Model', 'type': 'object', } ''' class ForceInputModel(Model): # the following ensures that even with mode='serialization', we # will get the schema that would be generated for validation. model_config = ConfigDict(json_schema_mode_override='validation') print(ForceInputModel.model_json_schema(mode='serialization')) ''' { 'properties': { 'a': { 'contentMediaType': 'application/json', 'contentSchema': {'type': 'integer'}, 'title': 'A', 'type': 'string', } }, 'required': ['a'], 'title': 'ForceInputModel', 'type': 'object', } ''' ``` """ coerce_numbers_to_str: bool """ If `True`, enables automatic coercion of any `Number` type to `str` in "lax" (non-strict) mode. Defaults to `False`. Pydantic doesn't allow number types (`int`, `float`, `Decimal`) to be coerced as type `str` by default. 
```python
from decimal import Decimal

from pydantic import BaseModel, ConfigDict, ValidationError

class Model(BaseModel):
    value: str

try:
    print(Model(value=42))
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    value
      Input should be a valid string [type=string_type, input_value=42, input_type=int]
    '''

class Model(BaseModel):
    model_config = ConfigDict(coerce_numbers_to_str=True)

    value: str

repr(Model(value=42).value)
#> "42"
repr(Model(value=42.13).value)
#> "42.13"
repr(Model(value=Decimal('42.13')).value)
#> "42.13"
```
"""

regex_engine: Literal['rust-regex', 'python-re']
"""
The regex engine to be used for pattern validation.
Defaults to `'rust-regex'`.

- `rust-regex` uses the [`regex`](https://docs.rs/regex) Rust crate,
  which is non-backtracking and therefore more DDoS resistant, but does not support all regex features.
- `python-re` uses the [`re`](https://docs.python.org/3/library/re.html) module,
  which supports all regex features, but may be slower.

!!! note
    If you use a compiled regex pattern, the python-re engine will be used regardless of this setting.
    This is so that flags such as `re.IGNORECASE` are respected.

```python
from pydantic import BaseModel, ConfigDict, Field, ValidationError

class Model(BaseModel):
    model_config = ConfigDict(regex_engine='python-re')

    value: str = Field(pattern=r'^abc(?=def)')

print(Model(value='abcdef').value)
#> abcdef

try:
    print(Model(value='abxyzcdef'))
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    value
      String should match pattern '^abc(?=def)' [type=string_pattern_mismatch, input_value='abxyzcdef', input_type=str]
    '''
```
"""

validation_error_cause: bool
"""
If `True`, Python exceptions that were part of a validation failure will be shown as an exception group as a cause. Can be useful for debugging. Defaults to `False`.

Note: Python 3.10 and older don't support exception groups natively. On Python <=3.10, the backport must be installed: `pip install exceptiongroup`.
Note: The structure of validation errors is likely to change in future Pydantic versions. Pydantic offers no guarantees about their structure. Should be used for visual traceback debugging only.
"""

use_attribute_docstrings: bool
'''
Whether docstrings of attributes (bare string literals immediately following the attribute declaration)
should be used for field descriptions. Defaults to `False`.

Available in Pydantic v2.7+.

```python
from pydantic import BaseModel, ConfigDict, Field


class Model(BaseModel):
    model_config = ConfigDict(use_attribute_docstrings=True)

    x: str
    """
    Example of an attribute docstring
    """

    y: int = Field(description="Description in Field")
    """
    Description in Field overrides attribute docstring
    """


print(Model.model_fields["x"].description)
# > Example of an attribute docstring
print(Model.model_fields["y"].description)
# > Description in Field
```

This requires the source code of the class to be available at runtime.

!!! warning "Usage with `TypedDict`"
    Due to current limitations, attribute docstring detection may not work as expected when using `TypedDict`
    (in particular when multiple `TypedDict` classes have the same name in the same source file). The behavior
    can be different depending on the Python version used.
'''

cache_strings: bool | Literal['all', 'keys', 'none']
"""
Whether to cache strings to avoid constructing new Python objects. Defaults to `True`.

Enabling this setting should significantly improve validation performance while increasing memory usage slightly.

- `True` or `'all'` (the default): cache all strings
- `'keys'`: cache only dictionary keys
- `False` or `'none'`: no caching

!!! note
    `True` or `'all'` is required to cache strings during general validation because
    validators don't know if they're in a key or a value.

!!! tip
    If repeated strings are rare, it's recommended to use `'keys'` or `'none'` to reduce memory usage,
    as the performance difference is minimal in that case.
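Because the caching is internal to pydantic-core, the setting is only observable through memory and performance characteristics; validation results are unchanged. A minimal sketch showing where the setting lives (the model and field names are illustrative):

```python
from pydantic import BaseModel, ConfigDict

class Event(BaseModel):
    # Cache only dictionary keys: a good fit when keys repeat across many
    # payloads but values are mostly unique.
    model_config = ConfigDict(cache_strings='keys')

    tags: dict[str, str]

# Behavior is identical to the default; only the internal string caching differs.
event = Event(tags={'env': 'prod', 'region': 'eu'})
assert event.tags == {'env': 'prod', 'region': 'eu'}
```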
""" _TypeT = TypeVar('_TypeT', bound=type) def with_config(config: ConfigDict) -> Callable[[_TypeT], _TypeT]: """Usage docs: https://docs.pydantic.dev/2.10/concepts/config/#configuration-with-dataclass-from-the-standard-library-or-typeddict A convenience decorator to set a [Pydantic configuration](config.md) on a `TypedDict` or a `dataclass` from the standard library. Although the configuration can be set using the `__pydantic_config__` attribute, it does not play well with type checkers, especially with `TypedDict`. !!! example "Usage" ```python from typing_extensions import TypedDict from pydantic import ConfigDict, TypeAdapter, with_config @with_config(ConfigDict(str_to_lower=True)) class Model(TypedDict): x: str ta = TypeAdapter(Model) print(ta.validate_python({'x': 'ABC'})) #> {'x': 'abc'} ``` """ def inner(class_: _TypeT, /) -> _TypeT: # Ideally, we would check for `class_` to either be a `TypedDict` or a stdlib dataclass. # However, the `@with_config` decorator can be applied *after* `@dataclass`. To avoid # common mistakes, we at least check for `class_` to not be a Pydantic model. 
from ._internal._utils import is_model_class if is_model_class(class_): raise PydanticUserError( f'Cannot use `with_config` on {class_.__name__} as it is a Pydantic model', code='with-config-on-model', ) class_.__pydantic_config__ = config return class_ return inner __getattr__ = getattr_migration(__name__) pydantic-2.10.6/pydantic/dataclasses.py000066400000000000000000000370341474456633400201160ustar00rootroot00000000000000"""Provide an enhanced dataclass that performs validation.""" from __future__ import annotations as _annotations import dataclasses import sys import types from typing import TYPE_CHECKING, Any, Callable, Generic, NoReturn, TypeVar, overload from warnings import warn from typing_extensions import Literal, TypeGuard, dataclass_transform from ._internal import _config, _decorators, _namespace_utils, _typing_extra from ._internal import _dataclasses as _pydantic_dataclasses from ._migration import getattr_migration from .config import ConfigDict from .errors import PydanticUserError from .fields import Field, FieldInfo, PrivateAttr if TYPE_CHECKING: from ._internal._dataclasses import PydanticDataclass from ._internal._namespace_utils import MappingNamespace __all__ = 'dataclass', 'rebuild_dataclass' _T = TypeVar('_T') if sys.version_info >= (3, 10): @dataclass_transform(field_specifiers=(dataclasses.field, Field, PrivateAttr)) @overload def dataclass( *, init: Literal[False] = False, repr: bool = True, eq: bool = True, order: bool = False, unsafe_hash: bool = False, frozen: bool = False, config: ConfigDict | type[object] | None = None, validate_on_init: bool | None = None, kw_only: bool = ..., slots: bool = ..., ) -> Callable[[type[_T]], type[PydanticDataclass]]: # type: ignore ... 
@dataclass_transform(field_specifiers=(dataclasses.field, Field, PrivateAttr)) @overload def dataclass( _cls: type[_T], # type: ignore *, init: Literal[False] = False, repr: bool = True, eq: bool = True, order: bool = False, unsafe_hash: bool = False, frozen: bool | None = None, config: ConfigDict | type[object] | None = None, validate_on_init: bool | None = None, kw_only: bool = ..., slots: bool = ..., ) -> type[PydanticDataclass]: ... else: @dataclass_transform(field_specifiers=(dataclasses.field, Field, PrivateAttr)) @overload def dataclass( *, init: Literal[False] = False, repr: bool = True, eq: bool = True, order: bool = False, unsafe_hash: bool = False, frozen: bool | None = None, config: ConfigDict | type[object] | None = None, validate_on_init: bool | None = None, ) -> Callable[[type[_T]], type[PydanticDataclass]]: # type: ignore ... @dataclass_transform(field_specifiers=(dataclasses.field, Field, PrivateAttr)) @overload def dataclass( _cls: type[_T], # type: ignore *, init: Literal[False] = False, repr: bool = True, eq: bool = True, order: bool = False, unsafe_hash: bool = False, frozen: bool | None = None, config: ConfigDict | type[object] | None = None, validate_on_init: bool | None = None, ) -> type[PydanticDataclass]: ... @dataclass_transform(field_specifiers=(dataclasses.field, Field, PrivateAttr)) def dataclass( _cls: type[_T] | None = None, *, init: Literal[False] = False, repr: bool = True, eq: bool = True, order: bool = False, unsafe_hash: bool = False, frozen: bool | None = None, config: ConfigDict | type[object] | None = None, validate_on_init: bool | None = None, kw_only: bool = False, slots: bool = False, ) -> Callable[[type[_T]], type[PydanticDataclass]] | type[PydanticDataclass]: """Usage docs: https://docs.pydantic.dev/2.10/concepts/dataclasses/ A decorator used to create a Pydantic-enhanced dataclass, similar to the standard Python `dataclass`, but with added validation. This function should be used similarly to `dataclasses.dataclass`. 
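A minimal usage sketch (the class and field names are illustrative); unlike the standard library decorator, field values are validated and coerced at init time:

```python
from pydantic import ValidationError
from pydantic.dataclasses import dataclass

@dataclass
class User:
    id: int
    name: str = 'John Doe'

# Lax-mode coercion applies: the numeric string '42' becomes the int 42.
user = User(id='42')
assert user.id == 42

# Invalid input raises a ValidationError, just like a BaseModel.
try:
    User(id='not a number')
except ValidationError as exc:
    assert exc.errors()[0]['loc'] == ('id',)
```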
Args:
    _cls: The target `dataclass`.
    init: Included for signature compatibility with `dataclasses.dataclass`, and is passed through to
        `dataclasses.dataclass` when appropriate. If specified, must be set to `False`, as pydantic
        inserts its own `__init__` function.
    repr: Determines if a `__repr__` method should be generated for the class, as in `dataclasses.dataclass`.
    eq: Determines if a `__eq__` method should be generated for the class.
    order: Determines if comparison magic methods should be generated, such as `__lt__`, but not `__eq__`.
    unsafe_hash: Determines if a `__hash__` method should be included in the class, as in `dataclasses.dataclass`.
    frozen: Determines if the generated class should be a 'frozen' `dataclass`, which does not allow its
        attributes to be modified after it has been initialized. If not set, the value from the provided
        `config` argument will be used (and will default to `False` otherwise).
    config: The Pydantic config to use for the `dataclass`.
    validate_on_init: A deprecated parameter included for backwards compatibility; in V2, all Pydantic dataclasses
        are validated on init.
    kw_only: Determines if `__init__` method parameters must be specified by keyword only. Defaults to `False`.
    slots: Determines if the generated class should be a 'slots' `dataclass`, which does not allow the addition of
        new attributes after instantiation.

Returns:
    A decorator that accepts a class as its argument and returns a Pydantic `dataclass`.

Raises:
    AssertionError: Raised if `init` is not `False` or `validate_on_init` is `False`.
""" assert init is False, 'pydantic.dataclasses.dataclass only supports init=False' assert validate_on_init is not False, 'validate_on_init=False is no longer supported' if sys.version_info >= (3, 10): kwargs = {'kw_only': kw_only, 'slots': slots} else: kwargs = {} def make_pydantic_fields_compatible(cls: type[Any]) -> None: """Make sure that stdlib `dataclasses` understands `Field` kwargs like `kw_only` To do that, we simply change `x: int = pydantic.Field(..., kw_only=True)` into `x: int = dataclasses.field(default=pydantic.Field(..., kw_only=True), kw_only=True)` """ for annotation_cls in cls.__mro__: # In Python < 3.9, `__annotations__` might not be present if there are no fields. # we therefore need to use `getattr` to avoid an `AttributeError`. annotations = getattr(annotation_cls, '__annotations__', []) for field_name in annotations: field_value = getattr(cls, field_name, None) # Process only if this is an instance of `FieldInfo`. if not isinstance(field_value, FieldInfo): continue # Initialize arguments for the standard `dataclasses.field`. field_args: dict = {'default': field_value} # Handle `kw_only` for Python 3.10+ if sys.version_info >= (3, 10) and field_value.kw_only: field_args['kw_only'] = True # Set `repr` attribute if it's explicitly specified to be not `True`. if field_value.repr is not True: field_args['repr'] = field_value.repr setattr(cls, field_name, dataclasses.field(**field_args)) # In Python 3.8, dataclasses checks cls.__dict__['__annotations__'] for annotations, # so we must make sure it's initialized before we add to it. if cls.__dict__.get('__annotations__') is None: cls.__annotations__ = {} cls.__annotations__[field_name] = annotations[field_name] def create_dataclass(cls: type[Any]) -> type[PydanticDataclass]: """Create a Pydantic dataclass from a regular dataclass. Args: cls: The class to create the Pydantic dataclass from. Returns: A Pydantic dataclass. 
""" from ._internal._utils import is_model_class if is_model_class(cls): raise PydanticUserError( f'Cannot create a Pydantic dataclass from {cls.__name__} as it is already a Pydantic model', code='dataclass-on-model', ) original_cls = cls # we warn on conflicting config specifications, but only if the class doesn't have a dataclass base # because a dataclass base might provide a __pydantic_config__ attribute that we don't want to warn about has_dataclass_base = any(dataclasses.is_dataclass(base) for base in cls.__bases__) if not has_dataclass_base and config is not None and hasattr(cls, '__pydantic_config__'): warn( f'`config` is set via both the `dataclass` decorator and `__pydantic_config__` for dataclass {cls.__name__}. ' f'The `config` specification from `dataclass` decorator will take priority.', category=UserWarning, stacklevel=2, ) # if config is not explicitly provided, try to read it from the type config_dict = config if config is not None else getattr(cls, '__pydantic_config__', None) config_wrapper = _config.ConfigWrapper(config_dict) decorators = _decorators.DecoratorInfos.build(cls) # Keep track of the original __doc__ so that we can restore it after applying the dataclasses decorator # Otherwise, classes with no __doc__ will have their signature added into the JSON schema description, # since dataclasses.dataclass will set this as the __doc__ original_doc = cls.__doc__ if _pydantic_dataclasses.is_builtin_dataclass(cls): # Don't preserve the docstring for vanilla dataclasses, as it may include the signature # This matches v1 behavior, and there was an explicit test for it original_doc = None # We don't want to add validation to the existing std lib dataclass, so we will subclass it # If the class is generic, we need to make sure the subclass also inherits from Generic # with all the same parameters. 
bases = (cls,) if issubclass(cls, Generic): generic_base = Generic[cls.__parameters__] # type: ignore bases = bases + (generic_base,) cls = types.new_class(cls.__name__, bases) make_pydantic_fields_compatible(cls) # Respect frozen setting from dataclass constructor and fallback to config setting if not provided if frozen is not None: frozen_ = frozen if config_wrapper.frozen: # It's not recommended to define both, as the setting from the dataclass decorator will take priority. warn( f'`frozen` is set via both the `dataclass` decorator and `config` for dataclass {cls.__name__!r}.' 'This is not recommended. The `frozen` specification on `dataclass` will take priority.', category=UserWarning, stacklevel=2, ) else: frozen_ = config_wrapper.frozen or False cls = dataclasses.dataclass( # type: ignore[call-overload] cls, # the value of init here doesn't affect anything except that it makes it easier to generate a signature init=True, repr=repr, eq=eq, order=order, unsafe_hash=unsafe_hash, frozen=frozen_, **kwargs, ) cls.__pydantic_decorators__ = decorators # type: ignore cls.__doc__ = original_doc cls.__module__ = original_cls.__module__ cls.__qualname__ = original_cls.__qualname__ cls.__pydantic_complete__ = False # `complete_dataclass` will set it to `True` if successful. # TODO `parent_namespace` is currently None, but we could do the same thing as Pydantic models: # fetch the parent ns using `parent_frame_namespace` (if the dataclass was defined in a function), # and possibly cache it (see the `__pydantic_parent_namespace__` logic for models). 
_pydantic_dataclasses.complete_dataclass(cls, config_wrapper, raise_errors=False) return cls return create_dataclass if _cls is None else create_dataclass(_cls) __getattr__ = getattr_migration(__name__) if (3, 8) <= sys.version_info < (3, 11): # Monkeypatch dataclasses.InitVar so that typing doesn't error if it occurs as a type when evaluating type hints # Starting in 3.11, typing.get_type_hints will not raise an error if the retrieved type hints are not callable. def _call_initvar(*args: Any, **kwargs: Any) -> NoReturn: """This function does nothing but raise an error that is as similar as possible to what you'd get if you were to try calling `InitVar[int]()` without this monkeypatch. The whole purpose is just to ensure typing._type_check does not error if the type hint evaluates to `InitVar[]`. """ raise TypeError("'InitVar' object is not callable") dataclasses.InitVar.__call__ = _call_initvar def rebuild_dataclass( cls: type[PydanticDataclass], *, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None, ) -> bool | None: """Try to rebuild the pydantic-core schema for the dataclass. This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails. This is analogous to `BaseModel.model_rebuild`. Args: cls: The class to rebuild the pydantic-core schema for. force: Whether to force the rebuilding of the schema, defaults to `False`. raise_errors: Whether to raise errors, defaults to `True`. _parent_namespace_depth: The depth level of the parent namespace, defaults to 2. _types_namespace: The types namespace, defaults to `None`. Returns: Returns `None` if the schema is already "complete" and rebuilding was not required. If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`. 
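A minimal sketch of the typical flow, under the assumption that the forward reference is resolvable from the calling namespace by the time of the rebuild (the names are illustrative):

```python
from pydantic.dataclasses import dataclass, rebuild_dataclass

@dataclass
class Outer:
    inner: 'Inner'  # not defined yet, so the schema cannot be built here

@dataclass
class Inner:
    value: int

# Now that `Inner` exists, rebuilding succeeds and returns True.
assert rebuild_dataclass(Outer) is True
assert Outer(inner=Inner(value=1)).inner.value == 1

# A second call is a no-op and returns None, since the schema is already complete.
assert rebuild_dataclass(Outer) is None
```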
""" if not force and cls.__pydantic_complete__: return None if '__pydantic_core_schema__' in cls.__dict__: delattr(cls, '__pydantic_core_schema__') # delete cached value to ensure full rebuild happens if _types_namespace is not None: rebuild_ns = _types_namespace elif _parent_namespace_depth > 0: rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {} else: rebuild_ns = {} ns_resolver = _namespace_utils.NsResolver( parent_namespace=rebuild_ns, ) return _pydantic_dataclasses.complete_dataclass( cls, _config.ConfigWrapper(cls.__pydantic_config__, check=False), raise_errors=raise_errors, ns_resolver=ns_resolver, # We could provide a different config instead (with `'defer_build'` set to `True`) # of this explicit `_force_build` argument, but because config can come from the # decorator parameter or the `__pydantic_config__` attribute, `complete_dataclass` # will overwrite `__pydantic_config__` with the provided config above: _force_build=True, ) def is_pydantic_dataclass(class_: type[Any], /) -> TypeGuard[type[PydanticDataclass]]: """Whether a class is a pydantic dataclass. Args: class_: The class. Returns: `True` if the class is a pydantic dataclass, `False` otherwise. 
""" try: return '__pydantic_validator__' in class_.__dict__ and dataclasses.is_dataclass(class_) except AttributeError: return False pydantic-2.10.6/pydantic/datetime_parse.py000066400000000000000000000002261474456633400206060ustar00rootroot00000000000000"""The `datetime_parse` module is a backport module from V1.""" from ._migration import getattr_migration __getattr__ = getattr_migration(__name__) pydantic-2.10.6/pydantic/decorator.py000066400000000000000000000002211474456633400175750ustar00rootroot00000000000000"""The `decorator` module is a backport module from V1.""" from ._migration import getattr_migration __getattr__ = getattr_migration(__name__) pydantic-2.10.6/pydantic/deprecated/000077500000000000000000000000001474456633400173465ustar00rootroot00000000000000pydantic-2.10.6/pydantic/deprecated/__init__.py000066400000000000000000000000001474456633400214450ustar00rootroot00000000000000pydantic-2.10.6/pydantic/deprecated/class_validators.py000066400000000000000000000240051474456633400232560ustar00rootroot00000000000000"""Old `@validator` and `@root_validator` function validators from V1.""" from __future__ import annotations as _annotations from functools import partial, partialmethod from types import FunctionType from typing import TYPE_CHECKING, Any, Callable, TypeVar, Union, overload from warnings import warn from typing_extensions import Literal, Protocol, TypeAlias, deprecated from .._internal import _decorators, _decorators_v1 from ..errors import PydanticUserError from ..warnings import PydanticDeprecatedSince20 _ALLOW_REUSE_WARNING_MESSAGE = '`allow_reuse` is deprecated and will be ignored; it should no longer be necessary' if TYPE_CHECKING: class _OnlyValueValidatorClsMethod(Protocol): def __call__(self, __cls: Any, __value: Any) -> Any: ... class _V1ValidatorWithValuesClsMethod(Protocol): def __call__(self, __cls: Any, __value: Any, values: dict[str, Any]) -> Any: ... 
class _V1ValidatorWithValuesKwOnlyClsMethod(Protocol): def __call__(self, __cls: Any, __value: Any, *, values: dict[str, Any]) -> Any: ... class _V1ValidatorWithKwargsClsMethod(Protocol): def __call__(self, __cls: Any, **kwargs: Any) -> Any: ... class _V1ValidatorWithValuesAndKwargsClsMethod(Protocol): def __call__(self, __cls: Any, values: dict[str, Any], **kwargs: Any) -> Any: ... class _V1RootValidatorClsMethod(Protocol): def __call__( self, __cls: Any, __values: _decorators_v1.RootValidatorValues ) -> _decorators_v1.RootValidatorValues: ... V1Validator = Union[ _OnlyValueValidatorClsMethod, _V1ValidatorWithValuesClsMethod, _V1ValidatorWithValuesKwOnlyClsMethod, _V1ValidatorWithKwargsClsMethod, _V1ValidatorWithValuesAndKwargsClsMethod, _decorators_v1.V1ValidatorWithValues, _decorators_v1.V1ValidatorWithValuesKwOnly, _decorators_v1.V1ValidatorWithKwargs, _decorators_v1.V1ValidatorWithValuesAndKwargs, ] V1RootValidator = Union[ _V1RootValidatorClsMethod, _decorators_v1.V1RootValidatorFunction, ] _PartialClsOrStaticMethod: TypeAlias = Union[classmethod[Any, Any, Any], staticmethod[Any, Any], partialmethod[Any]] # Allow both a V1 (assumed pre=False) or V2 (assumed mode='after') validator # We lie to type checkers and say we return the same thing we get # but in reality we return a proxy object that _mostly_ behaves like the wrapped thing _V1ValidatorType = TypeVar('_V1ValidatorType', V1Validator, _PartialClsOrStaticMethod) _V1RootValidatorFunctionType = TypeVar( '_V1RootValidatorFunctionType', _decorators_v1.V1RootValidatorFunction, _V1RootValidatorClsMethod, _PartialClsOrStaticMethod, ) else: # See PyCharm issues https://youtrack.jetbrains.com/issue/PY-21915 # and https://youtrack.jetbrains.com/issue/PY-51428 DeprecationWarning = PydanticDeprecatedSince20 @deprecated( 'Pydantic V1 style `@validator` validators are deprecated.' 
' You should migrate to Pydantic V2 style `@field_validator` validators,' ' see the migration guide for more details', category=None, ) def validator( __field: str, *fields: str, pre: bool = False, each_item: bool = False, always: bool = False, check_fields: bool | None = None, allow_reuse: bool = False, ) -> Callable[[_V1ValidatorType], _V1ValidatorType]: """Decorate methods on the class indicating that they should be used to validate fields. Args: __field (str): The first field the validator should be called on; this is separate from `fields` to ensure an error is raised if you don't pass at least one. *fields (str): Additional field(s) the validator should be called on. pre (bool, optional): Whether this validator should be called before the standard validators (else after). Defaults to False. each_item (bool, optional): For complex objects (sets, lists etc.) whether to validate individual elements rather than the whole object. Defaults to False. always (bool, optional): Whether this method and other validators should be called even if the value is missing. Defaults to False. check_fields (bool | None, optional): Whether to check that the fields actually exist on the model. Defaults to None. allow_reuse (bool, optional): Whether to track and raise an error if another validator refers to the decorated function. Defaults to False. Returns: Callable: A decorator that can be used to decorate a function to be used as a validator. """ warn( 'Pydantic V1 style `@validator` validators are deprecated.' ' You should migrate to Pydantic V2 style `@field_validator` validators,' ' see the migration guide for more details', DeprecationWarning, stacklevel=2, ) if allow_reuse is True: # pragma: no cover warn(_ALLOW_REUSE_WARNING_MESSAGE, DeprecationWarning) fields = __field, *fields if isinstance(fields[0], FunctionType): raise PydanticUserError( '`@validator` should be used with fields and keyword arguments, not bare. ' "E.g. 
usage should be `@validator('', ...)`", code='validator-no-fields', ) elif not all(isinstance(field, str) for field in fields): raise PydanticUserError( '`@validator` fields should be passed as separate string args. ' "E.g. usage should be `@validator('', '', ...)`", code='validator-invalid-fields', ) mode: Literal['before', 'after'] = 'before' if pre is True else 'after' def dec(f: Any) -> _decorators.PydanticDescriptorProxy[Any]: if _decorators.is_instance_method_from_sig(f): raise PydanticUserError( '`@validator` cannot be applied to instance methods', code='validator-instance-method' ) # auto apply the @classmethod decorator f = _decorators.ensure_classmethod_based_on_signature(f) wrap = _decorators_v1.make_generic_v1_field_validator validator_wrapper_info = _decorators.ValidatorDecoratorInfo( fields=fields, mode=mode, each_item=each_item, always=always, check_fields=check_fields, ) return _decorators.PydanticDescriptorProxy(f, validator_wrapper_info, shim=wrap) return dec # type: ignore[return-value] @overload def root_validator( *, # if you don't specify `pre` the default is `pre=False` # which means you need to specify `skip_on_failure=True` skip_on_failure: Literal[True], allow_reuse: bool = ..., ) -> Callable[ [_V1RootValidatorFunctionType], _V1RootValidatorFunctionType, ]: ... @overload def root_validator( *, # if you specify `pre=True` then you don't need to specify # `skip_on_failure`, in fact it is not allowed as an argument! pre: Literal[True], allow_reuse: bool = ..., ) -> Callable[ [_V1RootValidatorFunctionType], _V1RootValidatorFunctionType, ]: ... @overload def root_validator( *, # if you explicitly specify `pre=False` then you # MUST specify `skip_on_failure=True` pre: Literal[False], skip_on_failure: Literal[True], allow_reuse: bool = ..., ) -> Callable[ [_V1RootValidatorFunctionType], _V1RootValidatorFunctionType, ]: ... @deprecated( 'Pydantic V1 style `@root_validator` validators are deprecated.' 
' You should migrate to Pydantic V2 style `@model_validator` validators,' ' see the migration guide for more details', category=None, ) def root_validator( *__args, pre: bool = False, skip_on_failure: bool = False, allow_reuse: bool = False, ) -> Any: """Decorate methods on a model indicating that they should be used to validate (and perhaps modify) data either before or after standard model parsing/validation is performed. Args: pre (bool, optional): Whether this validator should be called before the standard validators (else after). Defaults to False. skip_on_failure (bool, optional): Whether to stop validation and return as soon as a failure is encountered. Defaults to False. allow_reuse (bool, optional): Whether to track and raise an error if another validator refers to the decorated function. Defaults to False. Returns: Any: A decorator that can be used to decorate a function to be used as a root_validator. """ warn( 'Pydantic V1 style `@root_validator` validators are deprecated.' ' You should migrate to Pydantic V2 style `@model_validator` validators,' ' see the migration guide for more details', DeprecationWarning, stacklevel=2, ) if __args: # Ensure a nice error is raised if someone attempts to use the bare decorator return root_validator()(*__args) # type: ignore if allow_reuse is True: # pragma: no cover warn(_ALLOW_REUSE_WARNING_MESSAGE, DeprecationWarning) mode: Literal['before', 'after'] = 'before' if pre is True else 'after' if pre is False and skip_on_failure is not True: raise PydanticUserError( 'If you use `@root_validator` with pre=False (the default) you MUST specify `skip_on_failure=True`.' 
' Note that `@root_validator` is deprecated and should be replaced with `@model_validator`.', code='root-validator-pre-skip', ) wrap = partial(_decorators_v1.make_v1_generic_root_validator, pre=pre) def dec(f: Callable[..., Any] | classmethod[Any, Any, Any] | staticmethod[Any, Any]) -> Any: if _decorators.is_instance_method_from_sig(f): raise TypeError('`@root_validator` cannot be applied to instance methods') # auto apply the @classmethod decorator res = _decorators.ensure_classmethod_based_on_signature(f) dec_info = _decorators.RootValidatorDecoratorInfo(mode=mode) return _decorators.PydanticDescriptorProxy(res, dec_info, shim=wrap) return dec pydantic-2.10.6/pydantic/deprecated/config.py000066400000000000000000000051471474456633400211740ustar00rootroot00000000000000from __future__ import annotations as _annotations import warnings from typing import TYPE_CHECKING, Any from typing_extensions import Literal, deprecated from .._internal import _config from ..warnings import PydanticDeprecatedSince20 if not TYPE_CHECKING: # See PyCharm issues https://youtrack.jetbrains.com/issue/PY-21915 # and https://youtrack.jetbrains.com/issue/PY-51428 DeprecationWarning = PydanticDeprecatedSince20 __all__ = 'BaseConfig', 'Extra' class _ConfigMetaclass(type): def __getattr__(self, item: str) -> Any: try: obj = _config.config_defaults[item] warnings.warn(_config.DEPRECATION_MESSAGE, DeprecationWarning) return obj except KeyError as exc: raise AttributeError(f"type object '{self.__name__}' has no attribute {exc}") from exc @deprecated('BaseConfig is deprecated. Use the `pydantic.ConfigDict` instead.', category=PydanticDeprecatedSince20) class BaseConfig(metaclass=_ConfigMetaclass): """This class is only retained for backwards compatibility. !!! Warning "Deprecated" BaseConfig is deprecated. Use the [`pydantic.ConfigDict`][pydantic.ConfigDict] instead. 
""" def __getattr__(self, item: str) -> Any: try: obj = super().__getattribute__(item) warnings.warn(_config.DEPRECATION_MESSAGE, DeprecationWarning) return obj except AttributeError as exc: try: return getattr(type(self), item) except AttributeError: # re-raising changes the displayed text to reflect that `self` is not a type raise AttributeError(str(exc)) from exc def __init_subclass__(cls, **kwargs: Any) -> None: warnings.warn(_config.DEPRECATION_MESSAGE, DeprecationWarning) return super().__init_subclass__(**kwargs) class _ExtraMeta(type): def __getattribute__(self, __name: str) -> Any: # The @deprecated decorator accesses other attributes, so we only emit a warning for the expected ones if __name in {'allow', 'ignore', 'forbid'}: warnings.warn( "`pydantic.config.Extra` is deprecated, use literal values instead (e.g. `extra='allow'`)", DeprecationWarning, stacklevel=2, ) return super().__getattribute__(__name) @deprecated( "Extra is deprecated. Use literal values instead (e.g. `extra='allow'`)", category=PydanticDeprecatedSince20 ) class Extra(metaclass=_ExtraMeta): allow: Literal['allow'] = 'allow' ignore: Literal['ignore'] = 'ignore' forbid: Literal['forbid'] = 'forbid' pydantic-2.10.6/pydantic/deprecated/copy_internals.py000066400000000000000000000167161474456633400227640ustar00rootroot00000000000000from __future__ import annotations as _annotations import typing from copy import deepcopy from enum import Enum from typing import Any, Tuple import typing_extensions from .._internal import ( _model_construction, _typing_extra, _utils, ) if typing.TYPE_CHECKING: from .. 
import BaseModel from .._internal._utils import AbstractSetIntStr, MappingIntStrAny AnyClassMethod = classmethod[Any, Any, Any] TupleGenerator = typing.Generator[Tuple[str, Any], None, None] Model = typing.TypeVar('Model', bound='BaseModel') # should be `set[int] | set[str] | dict[int, IncEx] | dict[str, IncEx] | None`, but mypy can't cope IncEx: typing_extensions.TypeAlias = 'set[int] | set[str] | dict[int, Any] | dict[str, Any] | None' _object_setattr = _model_construction.object_setattr def _iter( self: BaseModel, to_dict: bool = False, by_alias: bool = False, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, ) -> TupleGenerator: # Merge field set excludes with explicit exclude parameter with explicit overriding field set options. # The extra "is not None" guards are not logically necessary but optimizes performance for the simple case. 
if exclude is not None: exclude = _utils.ValueItems.merge( {k: v.exclude for k, v in self.__pydantic_fields__.items() if v.exclude is not None}, exclude ) if include is not None: include = _utils.ValueItems.merge({k: True for k in self.__pydantic_fields__}, include, intersect=True) allowed_keys = _calculate_keys(self, include=include, exclude=exclude, exclude_unset=exclude_unset) # type: ignore if allowed_keys is None and not (to_dict or by_alias or exclude_unset or exclude_defaults or exclude_none): # huge boost for plain _iter() yield from self.__dict__.items() if self.__pydantic_extra__: yield from self.__pydantic_extra__.items() return value_exclude = _utils.ValueItems(self, exclude) if exclude is not None else None value_include = _utils.ValueItems(self, include) if include is not None else None if self.__pydantic_extra__ is None: items = self.__dict__.items() else: items = list(self.__dict__.items()) + list(self.__pydantic_extra__.items()) for field_key, v in items: if (allowed_keys is not None and field_key not in allowed_keys) or (exclude_none and v is None): continue if exclude_defaults: try: field = self.__pydantic_fields__[field_key] except KeyError: pass else: if not field.is_required() and field.default == v: continue if by_alias and field_key in self.__pydantic_fields__: dict_key = self.__pydantic_fields__[field_key].alias or field_key else: dict_key = field_key if to_dict or value_include or value_exclude: v = _get_value( type(self), v, to_dict=to_dict, by_alias=by_alias, include=value_include and value_include.for_element(field_key), exclude=value_exclude and value_exclude.for_element(field_key), exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, ) yield dict_key, v def _copy_and_set_values( self: Model, values: dict[str, Any], fields_set: set[str], extra: dict[str, Any] | None = None, private: dict[str, Any] | None = None, *, deep: bool, # UP006 ) -> Model: if deep: # chances of having empty dict here are 
quite low for using smart_deepcopy values = deepcopy(values) extra = deepcopy(extra) private = deepcopy(private) cls = self.__class__ m = cls.__new__(cls) _object_setattr(m, '__dict__', values) _object_setattr(m, '__pydantic_extra__', extra) _object_setattr(m, '__pydantic_fields_set__', fields_set) _object_setattr(m, '__pydantic_private__', private) return m @typing.no_type_check def _get_value( cls: type[BaseModel], v: Any, to_dict: bool, by_alias: bool, include: AbstractSetIntStr | MappingIntStrAny | None, exclude: AbstractSetIntStr | MappingIntStrAny | None, exclude_unset: bool, exclude_defaults: bool, exclude_none: bool, ) -> Any: from .. import BaseModel if isinstance(v, BaseModel): if to_dict: return v.model_dump( by_alias=by_alias, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, include=include, # type: ignore exclude=exclude, # type: ignore exclude_none=exclude_none, ) else: return v.copy(include=include, exclude=exclude) value_exclude = _utils.ValueItems(v, exclude) if exclude else None value_include = _utils.ValueItems(v, include) if include else None if isinstance(v, dict): return { k_: _get_value( cls, v_, to_dict=to_dict, by_alias=by_alias, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, include=value_include and value_include.for_element(k_), exclude=value_exclude and value_exclude.for_element(k_), exclude_none=exclude_none, ) for k_, v_ in v.items() if (not value_exclude or not value_exclude.is_excluded(k_)) and (not value_include or value_include.is_included(k_)) } elif _utils.sequence_like(v): seq_args = ( _get_value( cls, v_, to_dict=to_dict, by_alias=by_alias, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, include=value_include and value_include.for_element(i), exclude=value_exclude and value_exclude.for_element(i), exclude_none=exclude_none, ) for i, v_ in enumerate(v) if (not value_exclude or not value_exclude.is_excluded(i)) and (not value_include or value_include.is_included(i)) ) return 
v.__class__(*seq_args) if _typing_extra.is_namedtuple(v.__class__) else v.__class__(seq_args) elif isinstance(v, Enum) and getattr(cls.model_config, 'use_enum_values', False): return v.value else: return v def _calculate_keys( self: BaseModel, include: MappingIntStrAny | None, exclude: MappingIntStrAny | None, exclude_unset: bool, update: typing.Dict[str, Any] | None = None, # noqa UP006 ) -> typing.AbstractSet[str] | None: if include is None and exclude is None and exclude_unset is False: return None keys: typing.AbstractSet[str] if exclude_unset: keys = self.__pydantic_fields_set__.copy() else: keys = set(self.__dict__.keys()) keys = keys | (self.__pydantic_extra__ or {}).keys() if include is not None: keys &= include.keys() if update: keys -= update.keys() if exclude: keys -= {k for k, v in exclude.items() if _utils.ValueItems.is_true(v)} return keys pydantic-2.10.6/pydantic/deprecated/decorator.py000066400000000000000000000251331474456633400217060ustar00rootroot00000000000000import warnings from functools import wraps from typing import TYPE_CHECKING, Any, Callable, Dict, List, Mapping, Optional, Tuple, Type, TypeVar, Union, overload from typing_extensions import deprecated from .._internal import _config, _typing_extra from ..alias_generators import to_pascal from ..errors import PydanticUserError from ..functional_validators import field_validator from ..main import BaseModel, create_model from ..warnings import PydanticDeprecatedSince20 if not TYPE_CHECKING: # See PyCharm issues https://youtrack.jetbrains.com/issue/PY-21915 # and https://youtrack.jetbrains.com/issue/PY-51428 DeprecationWarning = PydanticDeprecatedSince20 __all__ = ('validate_arguments',) if TYPE_CHECKING: AnyCallable = Callable[..., Any] AnyCallableT = TypeVar('AnyCallableT', bound=AnyCallable) ConfigType = Union[None, Type[Any], Dict[str, Any]] @overload def validate_arguments( func: None = None, *, config: 'ConfigType' = None ) -> Callable[['AnyCallableT'], 'AnyCallableT']: ... 
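The `validate_arguments` decorator defined in `deprecated/decorator.py` is deprecated in favour of `validate_call`. A minimal sketch of the V2 replacement (the `repeat` function is an illustrative example, not part of the pydantic source):

```python
from pydantic import ValidationError, validate_call


# `repeat` is an illustrative example, not part of the pydantic source.
@validate_call
def repeat(text: str, count: int) -> str:
    # Arguments are validated (and coerced) against the annotations.
    return text * count


print(repeat('ab', '3'))  # the string '3' is coerced to int 3 -> 'ababab'

try:
    repeat('ab', 'not-a-number')
except ValidationError as exc:
    print(exc.error_count())  # 1
```

Unlike `validate_arguments`, `validate_call` is built directly on `pydantic-core` validation and raises a regular `ValidationError` rather than `TypeError` for bad arguments.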
@overload def validate_arguments(func: 'AnyCallableT') -> 'AnyCallableT': ... @deprecated( 'The `validate_arguments` method is deprecated; use `validate_call` instead.', category=None, ) def validate_arguments(func: Optional['AnyCallableT'] = None, *, config: 'ConfigType' = None) -> Any: """Decorator to validate the arguments passed to a function.""" warnings.warn( 'The `validate_arguments` method is deprecated; use `validate_call` instead.', PydanticDeprecatedSince20, stacklevel=2, ) def validate(_func: 'AnyCallable') -> 'AnyCallable': vd = ValidatedFunction(_func, config) @wraps(_func) def wrapper_function(*args: Any, **kwargs: Any) -> Any: return vd.call(*args, **kwargs) wrapper_function.vd = vd # type: ignore wrapper_function.validate = vd.init_model_instance # type: ignore wrapper_function.raw_function = vd.raw_function # type: ignore wrapper_function.model = vd.model # type: ignore return wrapper_function if func: return validate(func) else: return validate ALT_V_ARGS = 'v__args' ALT_V_KWARGS = 'v__kwargs' V_POSITIONAL_ONLY_NAME = 'v__positional_only' V_DUPLICATE_KWARGS = 'v__duplicate_kwargs' class ValidatedFunction: def __init__(self, function: 'AnyCallable', config: 'ConfigType'): from inspect import Parameter, signature parameters: Mapping[str, Parameter] = signature(function).parameters if parameters.keys() & {ALT_V_ARGS, ALT_V_KWARGS, V_POSITIONAL_ONLY_NAME, V_DUPLICATE_KWARGS}: raise PydanticUserError( f'"{ALT_V_ARGS}", "{ALT_V_KWARGS}", "{V_POSITIONAL_ONLY_NAME}" and "{V_DUPLICATE_KWARGS}" ' f'are not permitted as argument names when using the "{validate_arguments.__name__}" decorator', code=None, ) self.raw_function = function self.arg_mapping: Dict[int, str] = {} self.positional_only_args: set[str] = set() self.v_args_name = 'args' self.v_kwargs_name = 'kwargs' type_hints = _typing_extra.get_type_hints(function, include_extras=True) takes_args = False takes_kwargs = False fields: Dict[str, Tuple[Any, Any]] = {} for i, (name, p) in 
enumerate(parameters.items()): if p.annotation is p.empty: annotation = Any else: annotation = type_hints[name] default = ... if p.default is p.empty else p.default if p.kind == Parameter.POSITIONAL_ONLY: self.arg_mapping[i] = name fields[name] = annotation, default fields[V_POSITIONAL_ONLY_NAME] = List[str], None self.positional_only_args.add(name) elif p.kind == Parameter.POSITIONAL_OR_KEYWORD: self.arg_mapping[i] = name fields[name] = annotation, default fields[V_DUPLICATE_KWARGS] = List[str], None elif p.kind == Parameter.KEYWORD_ONLY: fields[name] = annotation, default elif p.kind == Parameter.VAR_POSITIONAL: self.v_args_name = name fields[name] = Tuple[annotation, ...], None takes_args = True else: assert p.kind == Parameter.VAR_KEYWORD, p.kind self.v_kwargs_name = name fields[name] = Dict[str, annotation], None takes_kwargs = True # these checks avoid a clash between "args" and a field with that name if not takes_args and self.v_args_name in fields: self.v_args_name = ALT_V_ARGS # same with "kwargs" if not takes_kwargs and self.v_kwargs_name in fields: self.v_kwargs_name = ALT_V_KWARGS if not takes_args: # we add the field so validation below can raise the correct exception fields[self.v_args_name] = List[Any], None if not takes_kwargs: # same with kwargs fields[self.v_kwargs_name] = Dict[Any, Any], None self.create_model(fields, takes_args, takes_kwargs, config) def init_model_instance(self, *args: Any, **kwargs: Any) -> BaseModel: values = self.build_values(args, kwargs) return self.model(**values) def call(self, *args: Any, **kwargs: Any) -> Any: m = self.init_model_instance(*args, **kwargs) return self.execute(m) def build_values(self, args: Tuple[Any, ...], kwargs: Dict[str, Any]) -> Dict[str, Any]: values: Dict[str, Any] = {} if args: arg_iter = enumerate(args) while True: try: i, a = next(arg_iter) except StopIteration: break arg_name = self.arg_mapping.get(i) if arg_name is not None: values[arg_name] = a else: values[self.v_args_name] = [a] + [a for 
_, a in arg_iter] break var_kwargs: Dict[str, Any] = {} wrong_positional_args = [] duplicate_kwargs = [] fields_alias = [ field.alias for name, field in self.model.__pydantic_fields__.items() if name not in (self.v_args_name, self.v_kwargs_name) ] non_var_fields = set(self.model.__pydantic_fields__) - {self.v_args_name, self.v_kwargs_name} for k, v in kwargs.items(): if k in non_var_fields or k in fields_alias: if k in self.positional_only_args: wrong_positional_args.append(k) if k in values: duplicate_kwargs.append(k) values[k] = v else: var_kwargs[k] = v if var_kwargs: values[self.v_kwargs_name] = var_kwargs if wrong_positional_args: values[V_POSITIONAL_ONLY_NAME] = wrong_positional_args if duplicate_kwargs: values[V_DUPLICATE_KWARGS] = duplicate_kwargs return values def execute(self, m: BaseModel) -> Any: d = { k: v for k, v in m.__dict__.items() if k in m.__pydantic_fields_set__ or m.__pydantic_fields__[k].default_factory } var_kwargs = d.pop(self.v_kwargs_name, {}) if self.v_args_name in d: args_: List[Any] = [] in_kwargs = False kwargs = {} for name, value in d.items(): if in_kwargs: kwargs[name] = value elif name == self.v_args_name: args_ += value in_kwargs = True else: args_.append(value) return self.raw_function(*args_, **kwargs, **var_kwargs) elif self.positional_only_args: args_ = [] kwargs = {} for name, value in d.items(): if name in self.positional_only_args: args_.append(value) else: kwargs[name] = value return self.raw_function(*args_, **kwargs, **var_kwargs) else: return self.raw_function(**d, **var_kwargs) def create_model(self, fields: Dict[str, Any], takes_args: bool, takes_kwargs: bool, config: 'ConfigType') -> None: pos_args = len(self.arg_mapping) config_wrapper = _config.ConfigWrapper(config) if config_wrapper.alias_generator: raise PydanticUserError( 'Setting the "alias_generator" property on custom Config for ' '@validate_arguments is not yet supported, please remove.', code=None, ) if config_wrapper.extra is None: 
config_wrapper.config_dict['extra'] = 'forbid' class DecoratorBaseModel(BaseModel): @field_validator(self.v_args_name, check_fields=False) @classmethod def check_args(cls, v: Optional[List[Any]]) -> Optional[List[Any]]: if takes_args or v is None: return v raise TypeError(f'{pos_args} positional arguments expected but {pos_args + len(v)} given') @field_validator(self.v_kwargs_name, check_fields=False) @classmethod def check_kwargs(cls, v: Optional[Dict[str, Any]]) -> Optional[Dict[str, Any]]: if takes_kwargs or v is None: return v plural = '' if len(v) == 1 else 's' keys = ', '.join(map(repr, v.keys())) raise TypeError(f'unexpected keyword argument{plural}: {keys}') @field_validator(V_POSITIONAL_ONLY_NAME, check_fields=False) @classmethod def check_positional_only(cls, v: Optional[List[str]]) -> None: if v is None: return plural = '' if len(v) == 1 else 's' keys = ', '.join(map(repr, v)) raise TypeError(f'positional-only argument{plural} passed as keyword argument{plural}: {keys}') @field_validator(V_DUPLICATE_KWARGS, check_fields=False) @classmethod def check_duplicate_kwargs(cls, v: Optional[List[str]]) -> None: if v is None: return plural = '' if len(v) == 1 else 's' keys = ', '.join(map(repr, v)) raise TypeError(f'multiple values for argument{plural}: {keys}') model_config = config_wrapper.config_dict self.model = create_model(to_pascal(self.raw_function.__name__), __base__=DecoratorBaseModel, **fields) pydantic-2.10.6/pydantic/deprecated/json.py000066400000000000000000000110751474456633400206750ustar00rootroot00000000000000import datetime import warnings from collections import deque from decimal import Decimal from enum import Enum from ipaddress import IPv4Address, IPv4Interface, IPv4Network, IPv6Address, IPv6Interface, IPv6Network from pathlib import Path from re import Pattern from types import GeneratorType from typing import TYPE_CHECKING, Any, Callable, Dict, Type, Union from uuid import UUID from typing_extensions import deprecated from 
.._internal._import_utils import import_cached_base_model
from ..color import Color
from ..networks import NameEmail
from ..types import SecretBytes, SecretStr
from ..warnings import PydanticDeprecatedSince20

if not TYPE_CHECKING:
    # See PyCharm issues https://youtrack.jetbrains.com/issue/PY-21915
    # and https://youtrack.jetbrains.com/issue/PY-51428
    DeprecationWarning = PydanticDeprecatedSince20

__all__ = 'pydantic_encoder', 'custom_pydantic_encoder', 'timedelta_isoformat'


def isoformat(o: Union[datetime.date, datetime.time]) -> str:
    return o.isoformat()


def decimal_encoder(dec_value: Decimal) -> Union[int, float]:
    """Encodes a Decimal as int if there's no exponent, otherwise float.

    This is useful when we use ConstrainedDecimal to represent Numeric(x,0)
    where an integer (but not int typed) is used. Encoding this as a float
    results in failed round-tripping between encode and parse.
    Our Id type is a prime example of this.

    >>> decimal_encoder(Decimal("1.0"))
    1.0

    >>> decimal_encoder(Decimal("1"))
    1
    """
    exponent = dec_value.as_tuple().exponent
    if isinstance(exponent, int) and exponent >= 0:
        return int(dec_value)
    else:
        return float(dec_value)


ENCODERS_BY_TYPE: Dict[Type[Any], Callable[[Any], Any]] = {
    bytes: lambda o: o.decode(),
    Color: str,
    datetime.date: isoformat,
    datetime.datetime: isoformat,
    datetime.time: isoformat,
    datetime.timedelta: lambda td: td.total_seconds(),
    Decimal: decimal_encoder,
    Enum: lambda o: o.value,
    frozenset: list,
    deque: list,
    GeneratorType: list,
    IPv4Address: str,
    IPv4Interface: str,
    IPv4Network: str,
    IPv6Address: str,
    IPv6Interface: str,
    IPv6Network: str,
    NameEmail: str,
    Path: str,
    Pattern: lambda o: o.pattern,
    SecretBytes: str,
    SecretStr: str,
    set: list,
    UUID: str,
}


@deprecated(
    '`pydantic_encoder` is deprecated, use `pydantic_core.to_jsonable_python` instead.',
    category=None,
)
def pydantic_encoder(obj: Any) -> Any:
    warnings.warn(
        '`pydantic_encoder` is deprecated, use `pydantic_core.to_jsonable_python` instead.',
category=PydanticDeprecatedSince20, stacklevel=2, ) from dataclasses import asdict, is_dataclass BaseModel = import_cached_base_model() if isinstance(obj, BaseModel): return obj.model_dump() elif is_dataclass(obj): return asdict(obj) # type: ignore # Check the class type and its superclasses for a matching encoder for base in obj.__class__.__mro__[:-1]: try: encoder = ENCODERS_BY_TYPE[base] except KeyError: continue return encoder(obj) else: # We have exited the for loop without finding a suitable encoder raise TypeError(f"Object of type '{obj.__class__.__name__}' is not JSON serializable") # TODO: Add a suggested migration path once there is a way to use custom encoders @deprecated( '`custom_pydantic_encoder` is deprecated, use `BaseModel.model_dump` instead.', category=None, ) def custom_pydantic_encoder(type_encoders: Dict[Any, Callable[[Type[Any]], Any]], obj: Any) -> Any: warnings.warn( '`custom_pydantic_encoder` is deprecated, use `BaseModel.model_dump` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) # Check the class type and its superclasses for a matching encoder for base in obj.__class__.__mro__[:-1]: try: encoder = type_encoders[base] except KeyError: continue return encoder(obj) else: # We have exited the for loop without finding a suitable encoder return pydantic_encoder(obj) @deprecated('`timedelta_isoformat` is deprecated.', category=None) def timedelta_isoformat(td: datetime.timedelta) -> str: """ISO 8601 encoding for Python timedelta object.""" warnings.warn('`timedelta_isoformat` is deprecated.', category=PydanticDeprecatedSince20, stacklevel=2) minutes, seconds = divmod(td.seconds, 60) hours, minutes = divmod(minutes, 60) return f'{"-" if td.days < 0 else ""}P{abs(td.days)}DT{hours:d}H{minutes:d}M{seconds:d}.{td.microseconds:06d}S' pydantic-2.10.6/pydantic/deprecated/parse.py000066400000000000000000000047171474456633400210430ustar00rootroot00000000000000from __future__ import annotations import json import pickle import warnings 
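The deprecated `pydantic_encoder` in `deprecated/json.py` walks the object's MRO looking up `ENCODERS_BY_TYPE`; its suggested V2 replacement, `pydantic_core.to_jsonable_python`, covers the same standard-library types. A short sketch (the sample data is illustrative, not from the source):

```python
import datetime
from decimal import Decimal
from uuid import UUID

from pydantic_core import to_jsonable_python

# Illustrative sample data, not from the pydantic source.
data = {
    'when': datetime.datetime(2024, 1, 2, 3, 4, 5),
    'id': UUID('12345678-1234-5678-1234-567812345678'),
    'price': Decimal('1.50'),
    'tags': frozenset({'a'}),
}

out = to_jsonable_python(data)
print(out['when'])  # '2024-01-02T03:04:05'
print(out['id'])    # '12345678-1234-5678-1234-567812345678'
print(out['tags'])  # frozensets become lists, like ENCODERS_BY_TYPE does
```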
from enum import Enum from pathlib import Path from typing import TYPE_CHECKING, Any, Callable from typing_extensions import deprecated from ..warnings import PydanticDeprecatedSince20 if not TYPE_CHECKING: # See PyCharm issues https://youtrack.jetbrains.com/issue/PY-21915 # and https://youtrack.jetbrains.com/issue/PY-51428 DeprecationWarning = PydanticDeprecatedSince20 class Protocol(str, Enum): json = 'json' pickle = 'pickle' @deprecated('`load_str_bytes` is deprecated.', category=None) def load_str_bytes( b: str | bytes, *, content_type: str | None = None, encoding: str = 'utf8', proto: Protocol | None = None, allow_pickle: bool = False, json_loads: Callable[[str], Any] = json.loads, ) -> Any: warnings.warn('`load_str_bytes` is deprecated.', category=PydanticDeprecatedSince20, stacklevel=2) if proto is None and content_type: if content_type.endswith(('json', 'javascript')): pass elif allow_pickle and content_type.endswith('pickle'): proto = Protocol.pickle else: raise TypeError(f'Unknown content-type: {content_type}') proto = proto or Protocol.json if proto == Protocol.json: if isinstance(b, bytes): b = b.decode(encoding) return json_loads(b) # type: ignore elif proto == Protocol.pickle: if not allow_pickle: raise RuntimeError('Trying to decode with pickle with allow_pickle=False') bb = b if isinstance(b, bytes) else b.encode() # type: ignore return pickle.loads(bb) else: raise TypeError(f'Unknown protocol: {proto}') @deprecated('`load_file` is deprecated.', category=None) def load_file( path: str | Path, *, content_type: str | None = None, encoding: str = 'utf8', proto: Protocol | None = None, allow_pickle: bool = False, json_loads: Callable[[str], Any] = json.loads, ) -> Any: warnings.warn('`load_file` is deprecated.', category=PydanticDeprecatedSince20, stacklevel=2) path = Path(path) b = path.read_bytes() if content_type is None: if path.suffix in ('.js', '.json'): proto = Protocol.json elif path.suffix == '.pkl': proto = Protocol.pickle return 
load_str_bytes( b, proto=proto, content_type=content_type, encoding=encoding, allow_pickle=allow_pickle, json_loads=json_loads ) pydantic-2.10.6/pydantic/deprecated/tools.py000066400000000000000000000064101474456633400210610ustar00rootroot00000000000000from __future__ import annotations import json import warnings from typing import TYPE_CHECKING, Any, Callable, Type, TypeVar, Union from typing_extensions import deprecated from ..json_schema import DEFAULT_REF_TEMPLATE, GenerateJsonSchema from ..type_adapter import TypeAdapter from ..warnings import PydanticDeprecatedSince20 if not TYPE_CHECKING: # See PyCharm issues https://youtrack.jetbrains.com/issue/PY-21915 # and https://youtrack.jetbrains.com/issue/PY-51428 DeprecationWarning = PydanticDeprecatedSince20 __all__ = 'parse_obj_as', 'schema_of', 'schema_json_of' NameFactory = Union[str, Callable[[Type[Any]], str]] T = TypeVar('T') @deprecated( '`parse_obj_as` is deprecated. Use `pydantic.TypeAdapter.validate_python` instead.', category=None, ) def parse_obj_as(type_: type[T], obj: Any, type_name: NameFactory | None = None) -> T: warnings.warn( '`parse_obj_as` is deprecated. Use `pydantic.TypeAdapter.validate_python` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) if type_name is not None: # pragma: no cover warnings.warn( 'The type_name parameter is deprecated. parse_obj_as no longer creates temporary models', DeprecationWarning, stacklevel=2, ) return TypeAdapter(type_).validate_python(obj) @deprecated( '`schema_of` is deprecated. Use `pydantic.TypeAdapter.json_schema` instead.', category=None, ) def schema_of( type_: Any, *, title: NameFactory | None = None, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, ) -> dict[str, Any]: """Generate a JSON schema (as dict) for the passed model or dynamically generated one.""" warnings.warn( '`schema_of` is deprecated. 
Use `pydantic.TypeAdapter.json_schema` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) res = TypeAdapter(type_).json_schema( by_alias=by_alias, schema_generator=schema_generator, ref_template=ref_template, ) if title is not None: if isinstance(title, str): res['title'] = title else: warnings.warn( 'Passing a callable for the `title` parameter is deprecated and no longer supported', DeprecationWarning, stacklevel=2, ) res['title'] = title(type_) return res @deprecated( '`schema_json_of` is deprecated. Use `pydantic.TypeAdapter.json_schema` instead.', category=None, ) def schema_json_of( type_: Any, *, title: NameFactory | None = None, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, **dumps_kwargs: Any, ) -> str: """Generate a JSON schema (as JSON) for the passed model or dynamically generated one.""" warnings.warn( '`schema_json_of` is deprecated. Use `pydantic.TypeAdapter.json_schema` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) return json.dumps( schema_of(type_, title=title, by_alias=by_alias, ref_template=ref_template, schema_generator=schema_generator), **dumps_kwargs, ) pydantic-2.10.6/pydantic/env_settings.py000066400000000000000000000002241474456633400203260ustar00rootroot00000000000000"""The `env_settings` module is a backport module from V1.""" from ._migration import getattr_migration __getattr__ = getattr_migration(__name__) pydantic-2.10.6/pydantic/error_wrappers.py000066400000000000000000000002261474456633400206740ustar00rootroot00000000000000"""The `error_wrappers` module is a backport module from V1.""" from ._migration import getattr_migration __getattr__ = getattr_migration(__name__) pydantic-2.10.6/pydantic/errors.py000066400000000000000000000116211474456633400171350ustar00rootroot00000000000000"""Pydantic-specific errors.""" from __future__ import annotations as _annotations import re from typing_extensions import Literal, Self from 
._migration import getattr_migration from .version import version_short __all__ = ( 'PydanticUserError', 'PydanticUndefinedAnnotation', 'PydanticImportError', 'PydanticSchemaGenerationError', 'PydanticInvalidForJsonSchema', 'PydanticErrorCodes', ) # We use this URL to allow for future flexibility about how we host the docs, while allowing for Pydantic # code in the wild with "old" URLs to still work. # 'u' refers to "user errors" - e.g. errors caused by developers using pydantic, as opposed to validation errors. DEV_ERROR_DOCS_URL = f'https://errors.pydantic.dev/{version_short()}/u/' PydanticErrorCodes = Literal[ 'class-not-fully-defined', 'custom-json-schema', 'decorator-missing-field', 'discriminator-no-field', 'discriminator-alias-type', 'discriminator-needs-literal', 'discriminator-alias', 'discriminator-validator', 'callable-discriminator-no-tag', 'typed-dict-version', 'model-field-overridden', 'model-field-missing-annotation', 'config-both', 'removed-kwargs', 'circular-reference-schema', 'invalid-for-json-schema', 'json-schema-already-used', 'base-model-instantiated', 'undefined-annotation', 'schema-for-unknown-type', 'import-error', 'create-model-field-definitions', 'create-model-config-base', 'validator-no-fields', 'validator-invalid-fields', 'validator-instance-method', 'validator-input-type', 'root-validator-pre-skip', 'model-serializer-instance-method', 'validator-field-config-info', 'validator-v1-signature', 'validator-signature', 'field-serializer-signature', 'model-serializer-signature', 'multiple-field-serializers', 'invalid-annotated-type', 'type-adapter-config-unused', 'root-model-extra', 'unevaluable-type-annotation', 'dataclass-init-false-extra-allow', 'clashing-init-and-init-var', 'model-config-invalid-field-name', 'with-config-on-model', 'dataclass-on-model', 'validate-call-type', 'unpack-typed-dict', 'overlapping-unpack-typed-dict', 'invalid-self-type', ] class PydanticErrorMixin: """A mixin class for common functionality shared by all
Pydantic-specific errors. Attributes: message: A message describing the error. code: An optional error code from PydanticErrorCodes enum. """ def __init__(self, message: str, *, code: PydanticErrorCodes | None) -> None: self.message = message self.code = code def __str__(self) -> str: if self.code is None: return self.message else: return f'{self.message}\n\nFor further information visit {DEV_ERROR_DOCS_URL}{self.code}' class PydanticUserError(PydanticErrorMixin, TypeError): """An error raised due to incorrect use of Pydantic.""" class PydanticUndefinedAnnotation(PydanticErrorMixin, NameError): """A subclass of `NameError` raised when handling undefined annotations during `CoreSchema` generation. Attributes: name: Name of the error. message: Description of the error. """ def __init__(self, name: str, message: str) -> None: self.name = name super().__init__(message=message, code='undefined-annotation') @classmethod def from_name_error(cls, name_error: NameError) -> Self: """Convert a `NameError` to a `PydanticUndefinedAnnotation` error. Args: name_error: `NameError` to be converted. Returns: Converted `PydanticUndefinedAnnotation` error. """ try: name = name_error.name # type: ignore # python > 3.10 except AttributeError: name = re.search(r".*'(.+?)'", str(name_error)).group(1) # type: ignore[union-attr] return cls(name=name, message=str(name_error)) class PydanticImportError(PydanticErrorMixin, ImportError): """An error raised when an import fails due to module changes between V1 and V2. Attributes: message: Description of the error. """ def __init__(self, message: str) -> None: super().__init__(message, code='import-error') class PydanticSchemaGenerationError(PydanticUserError): """An error raised during failures to generate a `CoreSchema` for some type. Attributes: message: Description of the error. 
""" def __init__(self, message: str) -> None: super().__init__(message, code='schema-for-unknown-type') class PydanticInvalidForJsonSchema(PydanticUserError): """An error raised during failures to generate a JSON schema for some `CoreSchema`. Attributes: message: Description of the error. """ def __init__(self, message: str) -> None: super().__init__(message, code='invalid-for-json-schema') __getattr__ = getattr_migration(__name__) pydantic-2.10.6/pydantic/experimental/000077500000000000000000000000001474456633400177435ustar00rootroot00000000000000pydantic-2.10.6/pydantic/experimental/__init__.py000066400000000000000000000005101474456633400220500ustar00rootroot00000000000000"""The "experimental" module of pydantic contains potential new features that are subject to change.""" import warnings from pydantic.warnings import PydanticExperimentalWarning warnings.warn( 'This module is experimental, its contents are subject to change and deprecation.', category=PydanticExperimentalWarning, ) pydantic-2.10.6/pydantic/experimental/pipeline.py000066400000000000000000000566531474456633400221410ustar00rootroot00000000000000"""Experimental pipeline API functionality. 
Be careful with this API, it's subject to change.""" from __future__ import annotations import datetime import operator import re import sys from collections import deque from collections.abc import Container from dataclasses import dataclass from decimal import Decimal from functools import cached_property, partial from typing import TYPE_CHECKING, Any, Callable, Generic, Pattern, Protocol, TypeVar, Union, overload import annotated_types from typing_extensions import Annotated if TYPE_CHECKING: from pydantic_core import core_schema as cs from pydantic import GetCoreSchemaHandler from pydantic._internal._internal_dataclass import slots_true as _slots_true if sys.version_info < (3, 10): EllipsisType = type(Ellipsis) else: from types import EllipsisType __all__ = ['validate_as', 'validate_as_deferred', 'transform'] _slots_frozen = {**_slots_true, 'frozen': True} @dataclass(**_slots_frozen) class _ValidateAs: tp: type[Any] strict: bool = False @dataclass class _ValidateAsDefer: func: Callable[[], type[Any]] @cached_property def tp(self) -> type[Any]: return self.func() @dataclass(**_slots_frozen) class _Transform: func: Callable[[Any], Any] @dataclass(**_slots_frozen) class _PipelineOr: left: _Pipeline[Any, Any] right: _Pipeline[Any, Any] @dataclass(**_slots_frozen) class _PipelineAnd: left: _Pipeline[Any, Any] right: _Pipeline[Any, Any] @dataclass(**_slots_frozen) class _Eq: value: Any @dataclass(**_slots_frozen) class _NotEq: value: Any @dataclass(**_slots_frozen) class _In: values: Container[Any] @dataclass(**_slots_frozen) class _NotIn: values: Container[Any] _ConstraintAnnotation = Union[ annotated_types.Le, annotated_types.Ge, annotated_types.Lt, annotated_types.Gt, annotated_types.Len, annotated_types.MultipleOf, annotated_types.Timezone, annotated_types.Interval, annotated_types.Predicate, # common predicates not included in annotated_types _Eq, _NotEq, _In, _NotIn, # regular expressions Pattern[str], ] @dataclass(**_slots_frozen) class _Constraint: 
constraint: _ConstraintAnnotation _Step = Union[_ValidateAs, _ValidateAsDefer, _Transform, _PipelineOr, _PipelineAnd, _Constraint] _InT = TypeVar('_InT') _OutT = TypeVar('_OutT') _NewOutT = TypeVar('_NewOutT') class _FieldTypeMarker: pass # TODO: ultimately, make this public, see https://github.com/pydantic/pydantic/pull/9459#discussion_r1628197626 # Also, make this frozen eventually, but that doesn't work right now because of the generic base # Which attempts to modify __orig_base__ and such. # We could go with a manual freeze, but that seems overkill for now. @dataclass(**_slots_true) class _Pipeline(Generic[_InT, _OutT]): """Abstract representation of a chain of validation, transformation, and parsing steps.""" _steps: tuple[_Step, ...] def transform( self, func: Callable[[_OutT], _NewOutT], ) -> _Pipeline[_InT, _NewOutT]: """Transform the output of the previous step. If used as the first step in a pipeline, the type of the field is used. That is, the transformation is applied to after the value is parsed to the field's type. """ return _Pipeline[_InT, _NewOutT](self._steps + (_Transform(func),)) @overload def validate_as(self, tp: type[_NewOutT], *, strict: bool = ...) -> _Pipeline[_InT, _NewOutT]: ... @overload def validate_as(self, tp: EllipsisType, *, strict: bool = ...) -> _Pipeline[_InT, Any]: # type: ignore ... def validate_as(self, tp: type[_NewOutT] | EllipsisType, *, strict: bool = False) -> _Pipeline[_InT, Any]: # type: ignore """Validate / parse the input into a new type. If no type is provided, the type of the field is used. Types are parsed in Pydantic's `lax` mode by default, but you can enable `strict` mode by passing `strict=True`. 
""" if isinstance(tp, EllipsisType): return _Pipeline[_InT, Any](self._steps + (_ValidateAs(_FieldTypeMarker, strict=strict),)) return _Pipeline[_InT, _NewOutT](self._steps + (_ValidateAs(tp, strict=strict),)) def validate_as_deferred(self, func: Callable[[], type[_NewOutT]]) -> _Pipeline[_InT, _NewOutT]: """Parse the input into a new type, deferring resolution of the type until the current class is fully defined. This is useful when you need to reference the class in it's own type annotations. """ return _Pipeline[_InT, _NewOutT](self._steps + (_ValidateAsDefer(func),)) # constraints @overload def constrain(self: _Pipeline[_InT, _NewOutGe], constraint: annotated_types.Ge) -> _Pipeline[_InT, _NewOutGe]: ... @overload def constrain(self: _Pipeline[_InT, _NewOutGt], constraint: annotated_types.Gt) -> _Pipeline[_InT, _NewOutGt]: ... @overload def constrain(self: _Pipeline[_InT, _NewOutLe], constraint: annotated_types.Le) -> _Pipeline[_InT, _NewOutLe]: ... @overload def constrain(self: _Pipeline[_InT, _NewOutLt], constraint: annotated_types.Lt) -> _Pipeline[_InT, _NewOutLt]: ... @overload def constrain( self: _Pipeline[_InT, _NewOutLen], constraint: annotated_types.Len ) -> _Pipeline[_InT, _NewOutLen]: ... @overload def constrain( self: _Pipeline[_InT, _NewOutT], constraint: annotated_types.MultipleOf ) -> _Pipeline[_InT, _NewOutT]: ... @overload def constrain( self: _Pipeline[_InT, _NewOutDatetime], constraint: annotated_types.Timezone ) -> _Pipeline[_InT, _NewOutDatetime]: ... @overload def constrain(self: _Pipeline[_InT, _OutT], constraint: annotated_types.Predicate) -> _Pipeline[_InT, _OutT]: ... @overload def constrain( self: _Pipeline[_InT, _NewOutInterval], constraint: annotated_types.Interval ) -> _Pipeline[_InT, _NewOutInterval]: ... @overload def constrain(self: _Pipeline[_InT, _OutT], constraint: _Eq) -> _Pipeline[_InT, _OutT]: ... @overload def constrain(self: _Pipeline[_InT, _OutT], constraint: _NotEq) -> _Pipeline[_InT, _OutT]: ... 
@overload def constrain(self: _Pipeline[_InT, _OutT], constraint: _In) -> _Pipeline[_InT, _OutT]: ... @overload def constrain(self: _Pipeline[_InT, _OutT], constraint: _NotIn) -> _Pipeline[_InT, _OutT]: ... @overload def constrain(self: _Pipeline[_InT, _NewOutT], constraint: Pattern[str]) -> _Pipeline[_InT, _NewOutT]: ... def constrain(self, constraint: _ConstraintAnnotation) -> Any: """Constrain a value to meet a certain condition. We support most conditions from `annotated_types`, as well as regular expressions. Most of the time you'll be calling a shortcut method like `gt`, `lt`, `len`, etc so you don't need to call this directly. """ return _Pipeline[_InT, _OutT](self._steps + (_Constraint(constraint),)) def predicate(self: _Pipeline[_InT, _NewOutT], func: Callable[[_NewOutT], bool]) -> _Pipeline[_InT, _NewOutT]: """Constrain a value to meet a certain predicate.""" return self.constrain(annotated_types.Predicate(func)) def gt(self: _Pipeline[_InT, _NewOutGt], gt: _NewOutGt) -> _Pipeline[_InT, _NewOutGt]: """Constrain a value to be greater than a certain value.""" return self.constrain(annotated_types.Gt(gt)) def lt(self: _Pipeline[_InT, _NewOutLt], lt: _NewOutLt) -> _Pipeline[_InT, _NewOutLt]: """Constrain a value to be less than a certain value.""" return self.constrain(annotated_types.Lt(lt)) def ge(self: _Pipeline[_InT, _NewOutGe], ge: _NewOutGe) -> _Pipeline[_InT, _NewOutGe]: """Constrain a value to be greater than or equal to a certain value.""" return self.constrain(annotated_types.Ge(ge)) def le(self: _Pipeline[_InT, _NewOutLe], le: _NewOutLe) -> _Pipeline[_InT, _NewOutLe]: """Constrain a value to be less than or equal to a certain value.""" return self.constrain(annotated_types.Le(le)) def len(self: _Pipeline[_InT, _NewOutLen], min_len: int, max_len: int | None = None) -> _Pipeline[_InT, _NewOutLen]: """Constrain a value to have a certain length.""" return self.constrain(annotated_types.Len(min_len, max_len)) @overload def multiple_of(self: 
_Pipeline[_InT, _NewOutDiv], multiple_of: _NewOutDiv) -> _Pipeline[_InT, _NewOutDiv]: ... @overload def multiple_of(self: _Pipeline[_InT, _NewOutMod], multiple_of: _NewOutMod) -> _Pipeline[_InT, _NewOutMod]: ... def multiple_of(self: _Pipeline[_InT, Any], multiple_of: Any) -> _Pipeline[_InT, Any]: """Constrain a value to be a multiple of a certain number.""" return self.constrain(annotated_types.MultipleOf(multiple_of)) def eq(self: _Pipeline[_InT, _OutT], value: _OutT) -> _Pipeline[_InT, _OutT]: """Constrain a value to be equal to a certain value.""" return self.constrain(_Eq(value)) def not_eq(self: _Pipeline[_InT, _OutT], value: _OutT) -> _Pipeline[_InT, _OutT]: """Constrain a value to not be equal to a certain value.""" return self.constrain(_NotEq(value)) def in_(self: _Pipeline[_InT, _OutT], values: Container[_OutT]) -> _Pipeline[_InT, _OutT]: """Constrain a value to be in a certain set.""" return self.constrain(_In(values)) def not_in(self: _Pipeline[_InT, _OutT], values: Container[_OutT]) -> _Pipeline[_InT, _OutT]: """Constrain a value to not be in a certain set.""" return self.constrain(_NotIn(values)) # timezone methods def datetime_tz_naive(self: _Pipeline[_InT, datetime.datetime]) -> _Pipeline[_InT, datetime.datetime]: return self.constrain(annotated_types.Timezone(None)) def datetime_tz_aware(self: _Pipeline[_InT, datetime.datetime]) -> _Pipeline[_InT, datetime.datetime]: return self.constrain(annotated_types.Timezone(...)) def datetime_tz( self: _Pipeline[_InT, datetime.datetime], tz: datetime.tzinfo ) -> _Pipeline[_InT, datetime.datetime]: return self.constrain(annotated_types.Timezone(tz)) # type: ignore def datetime_with_tz( self: _Pipeline[_InT, datetime.datetime], tz: datetime.tzinfo | None ) -> _Pipeline[_InT, datetime.datetime]: return self.transform(partial(datetime.datetime.replace, tzinfo=tz)) # string methods def str_lower(self: _Pipeline[_InT, str]) -> _Pipeline[_InT, str]: return self.transform(str.lower) def str_upper(self: 
_Pipeline[_InT, str]) -> _Pipeline[_InT, str]: return self.transform(str.upper) def str_title(self: _Pipeline[_InT, str]) -> _Pipeline[_InT, str]: return self.transform(str.title) def str_strip(self: _Pipeline[_InT, str]) -> _Pipeline[_InT, str]: return self.transform(str.strip) def str_pattern(self: _Pipeline[_InT, str], pattern: str) -> _Pipeline[_InT, str]: return self.constrain(re.compile(pattern)) def str_contains(self: _Pipeline[_InT, str], substring: str) -> _Pipeline[_InT, str]: return self.predicate(lambda v: substring in v) def str_starts_with(self: _Pipeline[_InT, str], prefix: str) -> _Pipeline[_InT, str]: return self.predicate(lambda v: v.startswith(prefix)) def str_ends_with(self: _Pipeline[_InT, str], suffix: str) -> _Pipeline[_InT, str]: return self.predicate(lambda v: v.endswith(suffix)) # operators def otherwise(self, other: _Pipeline[_OtherIn, _OtherOut]) -> _Pipeline[_InT | _OtherIn, _OutT | _OtherOut]: """Combine two validation chains, returning the result of the first chain if it succeeds, and the second chain if it fails.""" return _Pipeline((_PipelineOr(self, other),)) __or__ = otherwise def then(self, other: _Pipeline[_OutT, _OtherOut]) -> _Pipeline[_InT, _OtherOut]: """Pipe the result of one validation chain into another.""" return _Pipeline((_PipelineAnd(self, other),)) __and__ = then def __get_pydantic_core_schema__(self, source_type: Any, handler: GetCoreSchemaHandler) -> cs.CoreSchema: from pydantic_core import core_schema as cs queue = deque(self._steps) s = None while queue: step = queue.popleft() s = _apply_step(step, s, handler, source_type) s = s or cs.any_schema() return s def __supports_type__(self, _: _OutT) -> bool: raise NotImplementedError validate_as = _Pipeline[Any, Any](()).validate_as validate_as_deferred = _Pipeline[Any, Any](()).validate_as_deferred transform = _Pipeline[Any, Any]((_ValidateAs(_FieldTypeMarker),)).transform def _check_func( func: Callable[[Any], bool], predicate_err: str | Callable[[], str], s: 
cs.CoreSchema | None ) -> cs.CoreSchema: from pydantic_core import core_schema as cs def handler(v: Any) -> Any: if func(v): return v raise ValueError(f'Expected {predicate_err if isinstance(predicate_err, str) else predicate_err()}') if s is None: return cs.no_info_plain_validator_function(handler) else: return cs.no_info_after_validator_function(handler, s) def _apply_step(step: _Step, s: cs.CoreSchema | None, handler: GetCoreSchemaHandler, source_type: Any) -> cs.CoreSchema: from pydantic_core import core_schema as cs if isinstance(step, _ValidateAs): s = _apply_parse(s, step.tp, step.strict, handler, source_type) elif isinstance(step, _ValidateAsDefer): s = _apply_parse(s, step.tp, False, handler, source_type) elif isinstance(step, _Transform): s = _apply_transform(s, step.func, handler) elif isinstance(step, _Constraint): s = _apply_constraint(s, step.constraint) elif isinstance(step, _PipelineOr): s = cs.union_schema([handler(step.left), handler(step.right)]) else: assert isinstance(step, _PipelineAnd) s = cs.chain_schema([handler(step.left), handler(step.right)]) return s def _apply_parse( s: cs.CoreSchema | None, tp: type[Any], strict: bool, handler: GetCoreSchemaHandler, source_type: Any, ) -> cs.CoreSchema: from pydantic_core import core_schema as cs from pydantic import Strict if tp is _FieldTypeMarker: return handler(source_type) if strict: tp = Annotated[tp, Strict()] # type: ignore if s and s['type'] == 'any': return handler(tp) else: return cs.chain_schema([s, handler(tp)]) if s else handler(tp) def _apply_transform( s: cs.CoreSchema | None, func: Callable[[Any], Any], handler: GetCoreSchemaHandler ) -> cs.CoreSchema: from pydantic_core import core_schema as cs if s is None: return cs.no_info_plain_validator_function(func) if s['type'] == 'str': if func is str.strip: s = s.copy() s['strip_whitespace'] = True return s elif func is str.lower: s = s.copy() s['to_lower'] = True return s elif func is str.upper: s = s.copy() s['to_upper'] = True return s 
return cs.no_info_after_validator_function(func, s) def _apply_constraint( # noqa: C901 s: cs.CoreSchema | None, constraint: _ConstraintAnnotation ) -> cs.CoreSchema: """Apply a single constraint to a schema.""" if isinstance(constraint, annotated_types.Gt): gt = constraint.gt if s and s['type'] in {'int', 'float', 'decimal'}: s = s.copy() if s['type'] == 'int' and isinstance(gt, int): s['gt'] = gt elif s['type'] == 'float' and isinstance(gt, float): s['gt'] = gt elif s['type'] == 'decimal' and isinstance(gt, Decimal): s['gt'] = gt else: def check_gt(v: Any) -> bool: return v > gt s = _check_func(check_gt, f'> {gt}', s) elif isinstance(constraint, annotated_types.Ge): ge = constraint.ge if s and s['type'] in {'int', 'float', 'decimal'}: s = s.copy() if s['type'] == 'int' and isinstance(ge, int): s['ge'] = ge elif s['type'] == 'float' and isinstance(ge, float): s['ge'] = ge elif s['type'] == 'decimal' and isinstance(ge, Decimal): s['ge'] = ge def check_ge(v: Any) -> bool: return v >= ge s = _check_func(check_ge, f'>= {ge}', s) elif isinstance(constraint, annotated_types.Lt): lt = constraint.lt if s and s['type'] in {'int', 'float', 'decimal'}: s = s.copy() if s['type'] == 'int' and isinstance(lt, int): s['lt'] = lt elif s['type'] == 'float' and isinstance(lt, float): s['lt'] = lt elif s['type'] == 'decimal' and isinstance(lt, Decimal): s['lt'] = lt def check_lt(v: Any) -> bool: return v < lt s = _check_func(check_lt, f'< {lt}', s) elif isinstance(constraint, annotated_types.Le): le = constraint.le if s and s['type'] in {'int', 'float', 'decimal'}: s = s.copy() if s['type'] == 'int' and isinstance(le, int): s['le'] = le elif s['type'] == 'float' and isinstance(le, float): s['le'] = le elif s['type'] == 'decimal' and isinstance(le, Decimal): s['le'] = le def check_le(v: Any) -> bool: return v <= le s = _check_func(check_le, f'<= {le}', s) elif isinstance(constraint, annotated_types.Len): min_len = constraint.min_length max_len = constraint.max_length if s and 
s['type'] in {'str', 'list', 'tuple', 'set', 'frozenset', 'dict'}: assert ( s['type'] == 'str' or s['type'] == 'list' or s['type'] == 'tuple' or s['type'] == 'set' or s['type'] == 'dict' or s['type'] == 'frozenset' ) s = s.copy() if min_len != 0: s['min_length'] = min_len if max_len is not None: s['max_length'] = max_len def check_len(v: Any) -> bool: if max_len is not None: return (min_len <= len(v)) and (len(v) <= max_len) return min_len <= len(v) s = _check_func(check_len, f'length >= {min_len} and length <= {max_len}', s) elif isinstance(constraint, annotated_types.MultipleOf): multiple_of = constraint.multiple_of if s and s['type'] in {'int', 'float', 'decimal'}: s = s.copy() if s['type'] == 'int' and isinstance(multiple_of, int): s['multiple_of'] = multiple_of elif s['type'] == 'float' and isinstance(multiple_of, float): s['multiple_of'] = multiple_of elif s['type'] == 'decimal' and isinstance(multiple_of, Decimal): s['multiple_of'] = multiple_of def check_multiple_of(v: Any) -> bool: return v % multiple_of == 0 s = _check_func(check_multiple_of, f'% {multiple_of} == 0', s) elif isinstance(constraint, annotated_types.Timezone): tz = constraint.tz if tz is ...: if s and s['type'] == 'datetime': s = s.copy() s['tz_constraint'] = 'aware' else: def check_tz_aware(v: object) -> bool: assert isinstance(v, datetime.datetime) return v.tzinfo is not None s = _check_func(check_tz_aware, 'timezone aware', s) elif tz is None: if s and s['type'] == 'datetime': s = s.copy() s['tz_constraint'] = 'naive' else: def check_tz_naive(v: object) -> bool: assert isinstance(v, datetime.datetime) return v.tzinfo is None s = _check_func(check_tz_naive, 'timezone naive', s) else: raise NotImplementedError('Constraining to a specific timezone is not yet supported') elif isinstance(constraint, annotated_types.Interval): if constraint.ge: s = _apply_constraint(s, annotated_types.Ge(constraint.ge)) if constraint.gt: s = _apply_constraint(s, annotated_types.Gt(constraint.gt)) if
constraint.le: s = _apply_constraint(s, annotated_types.Le(constraint.le)) if constraint.lt: s = _apply_constraint(s, annotated_types.Lt(constraint.lt)) assert s is not None elif isinstance(constraint, annotated_types.Predicate): func = constraint.func if func.__name__ == '<lambda>': # attempt to extract the source code for a lambda function # to use as the function name in error messages # TODO: is there a better way? should we just not do this? import inspect try: # remove ')' suffix, can use removesuffix once we drop 3.8 source = inspect.getsource(func).strip() if source.endswith(')'): source = source[:-1] lambda_source_code = '`' + ''.join(''.join(source.split('lambda ')[1:]).split(':')[1:]).strip() + '`' except OSError: # stringified annotations lambda_source_code = 'lambda' s = _check_func(func, lambda_source_code, s) else: s = _check_func(func, func.__name__, s) elif isinstance(constraint, _NotEq): value = constraint.value def check_not_eq(v: Any) -> bool: return operator.__ne__(v, value) s = _check_func(check_not_eq, f'!= {value}', s) elif isinstance(constraint, _Eq): value = constraint.value def check_eq(v: Any) -> bool: return operator.__eq__(v, value) s = _check_func(check_eq, f'== {value}', s) elif isinstance(constraint, _In): values = constraint.values def check_in(v: Any) -> bool: return operator.__contains__(values, v) s = _check_func(check_in, f'in {values}', s) elif isinstance(constraint, _NotIn): values = constraint.values def check_not_in(v: Any) -> bool: return operator.__not__(operator.__contains__(values, v)) s = _check_func(check_not_in, f'not in {values}', s) else: assert isinstance(constraint, Pattern) if s and s['type'] == 'str': s = s.copy() s['pattern'] = constraint.pattern else: def check_pattern(v: object) -> bool: assert isinstance(v, str) return constraint.match(v) is not None s = _check_func(check_pattern, f'~ {constraint.pattern}', s) return s class _SupportsRange(annotated_types.SupportsLe, annotated_types.SupportsGe, Protocol): pass
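A brief illustrative sketch of how the pipeline API above is used from model code (not part of the pydantic source tree; it assumes pydantic >= 2.8 is installed, where `pydantic.experimental.pipeline` ships the `validate_as` entry point defined in this file). String transforms such as `str_strip`/`str_lower` are folded into the `str` core schema by `_apply_transform`, and `gt` attaches a constraint via `_apply_constraint`:

```python
from typing_extensions import Annotated

from pydantic import BaseModel
from pydantic.experimental.pipeline import validate_as  # emits PydanticExperimentalWarning


class User(BaseModel):
    # parse as str, then strip whitespace and lowercase (folded into the str schema)
    name: Annotated[str, validate_as(str).str_strip().str_lower()]
    # parse as int, then require a value greater than zero
    age: Annotated[int, validate_as(int).gt(0)]


user = User(name='  Alice ', age=30)
print(user.name, user.age)  # alice 30
```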
class _SupportsLen(Protocol): def __len__(self) -> int: ... _NewOutGt = TypeVar('_NewOutGt', bound=annotated_types.SupportsGt) _NewOutGe = TypeVar('_NewOutGe', bound=annotated_types.SupportsGe) _NewOutLt = TypeVar('_NewOutLt', bound=annotated_types.SupportsLt) _NewOutLe = TypeVar('_NewOutLe', bound=annotated_types.SupportsLe) _NewOutLen = TypeVar('_NewOutLen', bound=_SupportsLen) _NewOutDiv = TypeVar('_NewOutDiv', bound=annotated_types.SupportsDiv) _NewOutMod = TypeVar('_NewOutMod', bound=annotated_types.SupportsMod) _NewOutDatetime = TypeVar('_NewOutDatetime', bound=datetime.datetime) _NewOutInterval = TypeVar('_NewOutInterval', bound=_SupportsRange) _OtherIn = TypeVar('_OtherIn') _OtherOut = TypeVar('_OtherOut') pydantic-2.10.6/pydantic/fields.py000066400000000000000000001722471474456633400171030ustar00rootroot00000000000000"""Defining fields on models.""" from __future__ import annotations as _annotations import dataclasses import inspect import sys import typing from copy import copy from dataclasses import Field as DataclassField from functools import cached_property from typing import Any, Callable, ClassVar, TypeVar, cast, overload from warnings import warn import annotated_types import typing_extensions from pydantic_core import PydanticUndefined from typing_extensions import Literal, TypeAlias, Unpack, deprecated from . 
import types from ._internal import _decorators, _fields, _generics, _internal_dataclass, _repr, _typing_extra, _utils from ._internal._namespace_utils import GlobalsNamespace, MappingNamespace from .aliases import AliasChoices, AliasPath from .config import JsonDict from .errors import PydanticUserError from .json_schema import PydanticJsonSchemaWarning from .warnings import PydanticDeprecatedSince20 if typing.TYPE_CHECKING: from ._internal._repr import ReprArgs else: # See PyCharm issues https://youtrack.jetbrains.com/issue/PY-21915 # and https://youtrack.jetbrains.com/issue/PY-51428 DeprecationWarning = PydanticDeprecatedSince20 __all__ = 'Field', 'PrivateAttr', 'computed_field' _Unset: Any = PydanticUndefined if sys.version_info >= (3, 13): import warnings Deprecated: TypeAlias = warnings.deprecated | deprecated else: Deprecated: TypeAlias = deprecated class _FromFieldInfoInputs(typing_extensions.TypedDict, total=False): """This class exists solely to add type checking for the `**kwargs` in `FieldInfo.from_field`.""" annotation: type[Any] | None default_factory: Callable[[], Any] | Callable[[dict[str, Any]], Any] | None alias: str | None alias_priority: int | None validation_alias: str | AliasPath | AliasChoices | None serialization_alias: str | None title: str | None field_title_generator: Callable[[str, FieldInfo], str] | None description: str | None examples: list[Any] | None exclude: bool | None gt: annotated_types.SupportsGt | None ge: annotated_types.SupportsGe | None lt: annotated_types.SupportsLt | None le: annotated_types.SupportsLe | None multiple_of: float | None strict: bool | None min_length: int | None max_length: int | None pattern: str | typing.Pattern[str] | None allow_inf_nan: bool | None max_digits: int | None decimal_places: int | None union_mode: Literal['smart', 'left_to_right'] | None discriminator: str | types.Discriminator | None deprecated: Deprecated | str | bool | None json_schema_extra: JsonDict | Callable[[JsonDict], None] | None 
frozen: bool | None validate_default: bool | None repr: bool init: bool | None init_var: bool | None kw_only: bool | None coerce_numbers_to_str: bool | None fail_fast: bool | None class _FieldInfoInputs(_FromFieldInfoInputs, total=False): """This class exists solely to add type checking for the `**kwargs` in `FieldInfo.__init__`.""" default: Any class FieldInfo(_repr.Representation): """This class holds information about a field. `FieldInfo` is used for any field definition regardless of whether the [`Field()`][pydantic.fields.Field] function is explicitly used. !!! warning You generally shouldn't be creating `FieldInfo` directly, you'll only need to use it when accessing [`BaseModel`][pydantic.main.BaseModel] `.model_fields` internals. Attributes: annotation: The type annotation of the field. default: The default value of the field. default_factory: A callable to generate the default value. The callable can either take 0 arguments (in which case it is called as is) or a single argument containing the already validated data. alias: The alias name of the field. alias_priority: The priority of the field's alias. validation_alias: The validation alias of the field. serialization_alias: The serialization alias of the field. title: The title of the field. field_title_generator: A callable that takes a field name and returns title for it. description: The description of the field. examples: List of examples of the field. exclude: Whether to exclude the field from the model serialization. discriminator: Field name or Discriminator for discriminating the type in a tagged union. deprecated: A deprecation message, an instance of `warnings.deprecated` or the `typing_extensions.deprecated` backport, or a boolean. If `True`, a default deprecation message will be emitted when accessing the field. json_schema_extra: A dict or callable to provide extra JSON schema properties. frozen: Whether the field is frozen. validate_default: Whether to validate the default value of the field. 
repr: Whether to include the field in representation of the model. init: Whether the field should be included in the constructor of the dataclass. init_var: Whether the field should _only_ be included in the constructor of the dataclass, and not stored. kw_only: Whether the field should be a keyword-only argument in the constructor of the dataclass. metadata: List of metadata constraints. """ annotation: type[Any] | None default: Any default_factory: Callable[[], Any] | Callable[[dict[str, Any]], Any] | None alias: str | None alias_priority: int | None validation_alias: str | AliasPath | AliasChoices | None serialization_alias: str | None title: str | None field_title_generator: Callable[[str, FieldInfo], str] | None description: str | None examples: list[Any] | None exclude: bool | None discriminator: str | types.Discriminator | None deprecated: Deprecated | str | bool | None json_schema_extra: JsonDict | Callable[[JsonDict], None] | None frozen: bool | None validate_default: bool | None repr: bool init: bool | None init_var: bool | None kw_only: bool | None metadata: list[Any] __slots__ = ( 'annotation', 'evaluated', 'default', 'default_factory', 'alias', 'alias_priority', 'validation_alias', 'serialization_alias', 'title', 'field_title_generator', 'description', 'examples', 'exclude', 'discriminator', 'deprecated', 'json_schema_extra', 'frozen', 'validate_default', 'repr', 'init', 'init_var', 'kw_only', 'metadata', '_attributes_set', ) # used to convert kwargs to metadata/constraints, # None has a special meaning - these items are collected into a `PydanticGeneralMetadata` metadata_lookup: ClassVar[dict[str, typing.Callable[[Any], Any] | None]] = { 'strict': types.Strict, 'gt': annotated_types.Gt, 'ge': annotated_types.Ge, 'lt': annotated_types.Lt, 'le': annotated_types.Le, 'multiple_of': annotated_types.MultipleOf, 'min_length': annotated_types.MinLen, 'max_length': annotated_types.MaxLen, 'pattern': None, 'allow_inf_nan': None, 'max_digits': None, 
'decimal_places': None, 'union_mode': None, 'coerce_numbers_to_str': None, 'fail_fast': types.FailFast, } def __init__(self, **kwargs: Unpack[_FieldInfoInputs]) -> None: """This class should generally not be initialized directly; instead, use the `pydantic.fields.Field` function or one of the constructor classmethods. See the signature of `pydantic.fields.Field` for more details about the expected arguments. """ self._attributes_set = {k: v for k, v in kwargs.items() if v is not _Unset} kwargs = {k: _DefaultValues.get(k) if v is _Unset else v for k, v in kwargs.items()} # type: ignore self.annotation, annotation_metadata = self._extract_metadata(kwargs.get('annotation')) self.evaluated = False default = kwargs.pop('default', PydanticUndefined) if default is Ellipsis: self.default = PydanticUndefined # Also remove it from the attributes set, otherwise # `GenerateSchema._common_field_schema` mistakenly # uses it: self._attributes_set.pop('default', None) else: self.default = default self.default_factory = kwargs.pop('default_factory', None) if self.default is not PydanticUndefined and self.default_factory is not None: raise TypeError('cannot specify both default and default_factory') self.alias = kwargs.pop('alias', None) self.validation_alias = kwargs.pop('validation_alias', None) self.serialization_alias = kwargs.pop('serialization_alias', None) alias_is_set = any(alias is not None for alias in (self.alias, self.validation_alias, self.serialization_alias)) self.alias_priority = kwargs.pop('alias_priority', None) or 2 if alias_is_set else None self.title = kwargs.pop('title', None) self.field_title_generator = kwargs.pop('field_title_generator', None) self.description = kwargs.pop('description', None) self.examples = kwargs.pop('examples', None) self.exclude = kwargs.pop('exclude', None) self.discriminator = kwargs.pop('discriminator', None) # For compatibility with FastAPI<=0.110.0, we preserve the existing value if it is not overridden self.deprecated = 
kwargs.pop('deprecated', getattr(self, 'deprecated', None)) self.repr = kwargs.pop('repr', True) self.json_schema_extra = kwargs.pop('json_schema_extra', None) self.validate_default = kwargs.pop('validate_default', None) self.frozen = kwargs.pop('frozen', None) # currently only used on dataclasses self.init = kwargs.pop('init', None) self.init_var = kwargs.pop('init_var', None) self.kw_only = kwargs.pop('kw_only', None) self.metadata = self._collect_metadata(kwargs) + annotation_metadata # type: ignore @staticmethod def from_field(default: Any = PydanticUndefined, **kwargs: Unpack[_FromFieldInfoInputs]) -> FieldInfo: """Create a new `FieldInfo` object with the `Field` function. Args: default: The default value for the field. Defaults to Undefined. **kwargs: Additional arguments dictionary. Raises: TypeError: If 'annotation' is passed as a keyword argument. Returns: A new FieldInfo object with the given parameters. Example: This is how you can create a field with default value like this: ```python import pydantic class MyModel(pydantic.BaseModel): foo: int = pydantic.Field(4) ``` """ if 'annotation' in kwargs: raise TypeError('"annotation" is not permitted as a Field keyword argument') return FieldInfo(default=default, **kwargs) @staticmethod def from_annotation(annotation: type[Any]) -> FieldInfo: """Creates a `FieldInfo` instance from a bare annotation. This function is used internally to create a `FieldInfo` from a bare annotation like this: ```python import pydantic class MyModel(pydantic.BaseModel): foo: int # <-- like this ``` We also account for the case where the annotation can be an instance of `Annotated` and where one of the (not first) arguments in `Annotated` is an instance of `FieldInfo`, e.g.: ```python import annotated_types from typing_extensions import Annotated import pydantic class MyModel(pydantic.BaseModel): foo: Annotated[int, annotated_types.Gt(42)] bar: Annotated[int, pydantic.Field(gt=42)] ``` Args: annotation: An annotation object. 
Returns: An instance of the field metadata. """ final = False if _typing_extra.is_finalvar(annotation): final = True if annotation is not typing_extensions.Final: annotation = typing_extensions.get_args(annotation)[0] if _typing_extra.is_annotated(annotation): first_arg, *extra_args = typing_extensions.get_args(annotation) if _typing_extra.is_finalvar(first_arg): final = True field_info_annotations = [a for a in extra_args if isinstance(a, FieldInfo)] field_info = FieldInfo.merge_field_infos(*field_info_annotations, annotation=first_arg) if field_info: new_field_info = copy(field_info) new_field_info.annotation = first_arg new_field_info.frozen = final or field_info.frozen metadata: list[Any] = [] for a in extra_args: if _typing_extra.is_deprecated_instance(a): new_field_info.deprecated = a.message elif not isinstance(a, FieldInfo): metadata.append(a) else: metadata.extend(a.metadata) new_field_info.metadata = metadata return new_field_info return FieldInfo(annotation=annotation, frozen=final or None) # pyright: ignore[reportArgumentType] @staticmethod def from_annotated_attribute(annotation: type[Any], default: Any) -> FieldInfo: """Create `FieldInfo` from an annotation with a default value. This is used in cases like the following: ```python import annotated_types from typing_extensions import Annotated import pydantic class MyModel(pydantic.BaseModel): foo: int = 4 # <-- like this bar: Annotated[int, annotated_types.Gt(4)] = 4 # <-- or this spam: Annotated[int, pydantic.Field(gt=4)] = 4 # <-- or this ``` Args: annotation: The type annotation of the field. default: The default value of the field. Returns: A field object with the passed values. """ if annotation is default: raise PydanticUserError( 'Error when building FieldInfo from annotated attribute. 
' "Make sure you don't have any field name clashing with a type annotation ", code='unevaluable-type-annotation', ) final = _typing_extra.is_finalvar(annotation) if final and annotation is not typing_extensions.Final: annotation = typing_extensions.get_args(annotation)[0] if isinstance(default, FieldInfo): default.annotation, annotation_metadata = FieldInfo._extract_metadata(annotation) # pyright: ignore[reportArgumentType] default.metadata += annotation_metadata default = default.merge_field_infos( *[x for x in annotation_metadata if isinstance(x, FieldInfo)], default, annotation=default.annotation ) default.frozen = final or default.frozen return default if isinstance(default, dataclasses.Field): init_var = False if annotation is dataclasses.InitVar: init_var = True annotation = typing.cast(Any, Any) elif isinstance(annotation, dataclasses.InitVar): init_var = True annotation = annotation.type pydantic_field = FieldInfo._from_dataclass_field(default) pydantic_field.annotation, annotation_metadata = FieldInfo._extract_metadata(annotation) # pyright: ignore[reportArgumentType] pydantic_field.metadata += annotation_metadata pydantic_field = pydantic_field.merge_field_infos( *[x for x in annotation_metadata if isinstance(x, FieldInfo)], pydantic_field, annotation=pydantic_field.annotation, ) pydantic_field.frozen = final or pydantic_field.frozen pydantic_field.init_var = init_var pydantic_field.init = getattr(default, 'init', None) pydantic_field.kw_only = getattr(default, 'kw_only', None) return pydantic_field if _typing_extra.is_annotated(annotation): first_arg, *extra_args = typing_extensions.get_args(annotation) field_infos = [a for a in extra_args if isinstance(a, FieldInfo)] field_info = FieldInfo.merge_field_infos(*field_infos, annotation=first_arg, default=default) metadata: list[Any] = [] for a in extra_args: if _typing_extra.is_deprecated_instance(a): field_info.deprecated = a.message elif not isinstance(a, FieldInfo): metadata.append(a) else: 
                    metadata.extend(a.metadata)
            field_info.metadata = metadata
            return field_info

        return FieldInfo(annotation=annotation, default=default, frozen=final or None)  # pyright: ignore[reportArgumentType]

    @staticmethod
    def merge_field_infos(*field_infos: FieldInfo, **overrides: Any) -> FieldInfo:
        """Merge `FieldInfo` instances keeping only explicitly set attributes.

        Later `FieldInfo` instances override earlier ones.

        Returns:
            FieldInfo: A merged FieldInfo instance.
        """
        if len(field_infos) == 1:
            # No merging necessary, but we still need to make a copy and apply the overrides
            field_info = copy(field_infos[0])
            field_info._attributes_set.update(overrides)

            default_override = overrides.pop('default', PydanticUndefined)
            if default_override is Ellipsis:
                default_override = PydanticUndefined
            if default_override is not PydanticUndefined:
                field_info.default = default_override

            for k, v in overrides.items():
                setattr(field_info, k, v)
            return field_info  # type: ignore

        merged_field_info_kwargs: dict[str, Any] = {}
        metadata = {}
        for field_info in field_infos:
            attributes_set = field_info._attributes_set.copy()
            try:
                json_schema_extra = attributes_set.pop('json_schema_extra')
                existing_json_schema_extra = merged_field_info_kwargs.get('json_schema_extra')

                if existing_json_schema_extra is None:
                    merged_field_info_kwargs['json_schema_extra'] = json_schema_extra
                if isinstance(existing_json_schema_extra, dict):
                    if isinstance(json_schema_extra, dict):
                        merged_field_info_kwargs['json_schema_extra'] = {
                            **existing_json_schema_extra,
                            **json_schema_extra,
                        }
                    if callable(json_schema_extra):
                        warn(
                            'Composing `dict` and `callable` type `json_schema_extra` is not supported. '
                            'The `callable` type is being ignored. '
                            "If you'd like support for this behavior, please open an issue on pydantic.",
                            PydanticJsonSchemaWarning,
                        )
                elif callable(json_schema_extra):
                    # if ever there's a case of a callable, we'll just keep the last json schema extra spec
                    merged_field_info_kwargs['json_schema_extra'] = json_schema_extra
            except KeyError:
                pass

            # later FieldInfo instances override everything except json_schema_extra from earlier FieldInfo instances
            merged_field_info_kwargs.update(attributes_set)

            for x in field_info.metadata:
                if not isinstance(x, FieldInfo):
                    metadata[type(x)] = x

        merged_field_info_kwargs.update(overrides)
        field_info = FieldInfo(**merged_field_info_kwargs)
        field_info.metadata = list(metadata.values())
        return field_info

    @staticmethod
    def _from_dataclass_field(dc_field: DataclassField[Any]) -> FieldInfo:
        """Return a new `FieldInfo` instance from a `dataclasses.Field` instance.

        Args:
            dc_field: The `dataclasses.Field` instance to convert.

        Returns:
            The corresponding `FieldInfo` instance.

        Raises:
            TypeError: If any of the `FieldInfo` kwargs does not match the `dataclass.Field` kwargs.
        """
        default = dc_field.default
        if default is dataclasses.MISSING:
            default = _Unset

        if dc_field.default_factory is dataclasses.MISSING:
            default_factory = _Unset
        else:
            default_factory = dc_field.default_factory

        # use the `Field` function so incorrect kwargs raise the correct `TypeError`
        dc_field_metadata = {k: v for k, v in dc_field.metadata.items() if k in _FIELD_ARG_NAMES}
        return Field(default=default, default_factory=default_factory, repr=dc_field.repr, **dc_field_metadata)  # pyright: ignore[reportCallIssue]

    @staticmethod
    def _extract_metadata(annotation: type[Any] | None) -> tuple[type[Any] | None, list[Any]]:
        """Tries to extract metadata/constraints from an annotation if it uses `Annotated`.

        Args:
            annotation: The type hint annotation for which metadata has to be extracted.

        Returns:
            A tuple containing the extracted metadata type and the list of extra arguments.
""" if annotation is not None: if _typing_extra.is_annotated(annotation): first_arg, *extra_args = typing_extensions.get_args(annotation) return first_arg, list(extra_args) return annotation, [] @staticmethod def _collect_metadata(kwargs: dict[str, Any]) -> list[Any]: """Collect annotations from kwargs. Args: kwargs: Keyword arguments passed to the function. Returns: A list of metadata objects - a combination of `annotated_types.BaseMetadata` and `PydanticMetadata`. """ metadata: list[Any] = [] general_metadata = {} for key, value in list(kwargs.items()): try: marker = FieldInfo.metadata_lookup[key] except KeyError: continue del kwargs[key] if value is not None: if marker is None: general_metadata[key] = value else: metadata.append(marker(value)) if general_metadata: metadata.append(_fields.pydantic_general_metadata(**general_metadata)) return metadata @property def deprecation_message(self) -> str | None: """The deprecation message to be emitted, or `None` if not set.""" if self.deprecated is None: return None if isinstance(self.deprecated, bool): return 'deprecated' if self.deprecated else None return self.deprecated if isinstance(self.deprecated, str) else self.deprecated.message @property def default_factory_takes_validated_data(self) -> bool | None: """Whether the provided default factory callable has a validated data parameter. Returns `None` if no default factory is set. """ if self.default_factory is not None: return _fields.takes_validated_data_argument(self.default_factory) @overload def get_default( self, *, call_default_factory: Literal[True], validated_data: dict[str, Any] | None = None ) -> Any: ... @overload def get_default(self, *, call_default_factory: Literal[False] = ...) -> Any: ... def get_default(self, *, call_default_factory: bool = False, validated_data: dict[str, Any] | None = None) -> Any: """Get the default value. 
We expose an option for whether to call the default_factory (if present), as calling it may result in side effects that we want to avoid. However, there are times when it really should be called (namely, when instantiating a model via `model_construct`). Args: call_default_factory: Whether to call the default factory or not. validated_data: The already validated data to be passed to the default factory. Returns: The default value, calling the default factory if requested or `None` if not set. """ if self.default_factory is None: return _utils.smart_deepcopy(self.default) elif call_default_factory: if self.default_factory_takes_validated_data: fac = cast('Callable[[dict[str, Any]], Any]', self.default_factory) if validated_data is None: raise ValueError( "The default factory requires the 'validated_data' argument, which was not provided when calling 'get_default'." ) return fac(validated_data) else: fac = cast('Callable[[], Any]', self.default_factory) return fac() else: return None def is_required(self) -> bool: """Check if the field is required (i.e., does not have a default value or factory). Returns: `True` if the field is required, `False` otherwise. """ return self.default is PydanticUndefined and self.default_factory is None def rebuild_annotation(self) -> Any: """Attempts to rebuild the original annotation for use in function signatures. If metadata is present, it adds it to the original annotation using `Annotated`. Otherwise, it returns the original annotation as-is. Note that because the metadata has been flattened, the original annotation may not be reconstructed exactly as originally provided, e.g. if the original type had unrecognized annotations, or was annotated with a call to `pydantic.Field`. Returns: The rebuilt annotation. 
""" if not self.metadata: return self.annotation else: # Annotated arguments must be a tuple return typing_extensions.Annotated[(self.annotation, *self.metadata)] # type: ignore def apply_typevars_map( self, typevars_map: dict[Any, Any] | None, globalns: GlobalsNamespace | None = None, localns: MappingNamespace | None = None, ) -> None: """Apply a `typevars_map` to the annotation. This method is used when analyzing parametrized generic types to replace typevars with their concrete types. This method applies the `typevars_map` to the annotation in place. Args: typevars_map: A dictionary mapping type variables to their concrete types. globalns: The globals namespace to use during type annotation evaluation. localns: The locals namespace to use during type annotation evaluation. See Also: pydantic._internal._generics.replace_types is used for replacing the typevars with their concrete types. """ annotation, _ = _typing_extra.try_eval_type(self.annotation, globalns, localns) self.annotation = _generics.replace_types(annotation, typevars_map) def __repr_args__(self) -> ReprArgs: yield 'annotation', _repr.PlainRepr(_repr.display_as_type(self.annotation)) yield 'required', self.is_required() for s in self.__slots__: # TODO: properly make use of the protocol (https://rich.readthedocs.io/en/stable/pretty.html#rich-repr-protocol) # By yielding a three-tuple: if s in ('_attributes_set', 'annotation', 'evaluated'): continue elif s == 'metadata' and not self.metadata: continue elif s == 'repr' and self.repr is True: continue if s == 'frozen' and self.frozen is False: continue if s == 'validation_alias' and self.validation_alias == self.alias: continue if s == 'serialization_alias' and self.serialization_alias == self.alias: continue if s == 'default' and self.default is not PydanticUndefined: yield 'default', self.default elif s == 'default_factory' and self.default_factory is not None: yield 'default_factory', _repr.PlainRepr(_repr.display_as_type(self.default_factory)) else: 
                value = getattr(self, s)
                if value is not None and value is not PydanticUndefined:
                    yield s, value


class _EmptyKwargs(typing_extensions.TypedDict):
    """This class exists solely to ensure that type checking warns about passing `**extra` in `Field`."""


_DefaultValues = {
    'default': ...,
    'default_factory': None,
    'alias': None,
    'alias_priority': None,
    'validation_alias': None,
    'serialization_alias': None,
    'title': None,
    'description': None,
    'examples': None,
    'exclude': None,
    'discriminator': None,
    'json_schema_extra': None,
    'frozen': None,
    'validate_default': None,
    'repr': True,
    'init': None,
    'init_var': None,
    'kw_only': None,
    'pattern': None,
    'strict': None,
    'gt': None,
    'ge': None,
    'lt': None,
    'le': None,
    'multiple_of': None,
    'allow_inf_nan': None,
    'max_digits': None,
    'decimal_places': None,
    'min_length': None,
    'max_length': None,
    'coerce_numbers_to_str': None,
}


_T = TypeVar('_T')


# NOTE: Actual return type is 'FieldInfo', but we want to help type checkers
# to understand the magic that happens at runtime with the following overloads:
@overload  # type hint the return value as `Any` to avoid type checking regressions when using `...`.
def Field( default: ellipsis, # noqa: F821 # TODO: use `_typing_extra.EllipsisType` when we drop Py3.9 *, alias: str | None = _Unset, alias_priority: int | None = _Unset, validation_alias: str | AliasPath | AliasChoices | None = _Unset, serialization_alias: str | None = _Unset, title: str | None = _Unset, field_title_generator: Callable[[str, FieldInfo], str] | None = _Unset, description: str | None = _Unset, examples: list[Any] | None = _Unset, exclude: bool | None = _Unset, discriminator: str | types.Discriminator | None = _Unset, deprecated: Deprecated | str | bool | None = _Unset, json_schema_extra: JsonDict | Callable[[JsonDict], None] | None = _Unset, frozen: bool | None = _Unset, validate_default: bool | None = _Unset, repr: bool = _Unset, init: bool | None = _Unset, init_var: bool | None = _Unset, kw_only: bool | None = _Unset, pattern: str | typing.Pattern[str] | None = _Unset, strict: bool | None = _Unset, coerce_numbers_to_str: bool | None = _Unset, gt: annotated_types.SupportsGt | None = _Unset, ge: annotated_types.SupportsGe | None = _Unset, lt: annotated_types.SupportsLt | None = _Unset, le: annotated_types.SupportsLe | None = _Unset, multiple_of: float | None = _Unset, allow_inf_nan: bool | None = _Unset, max_digits: int | None = _Unset, decimal_places: int | None = _Unset, min_length: int | None = _Unset, max_length: int | None = _Unset, union_mode: Literal['smart', 'left_to_right'] = _Unset, fail_fast: bool | None = _Unset, **extra: Unpack[_EmptyKwargs], ) -> Any: ... 
@overload # `default` argument set def Field( default: _T, *, alias: str | None = _Unset, alias_priority: int | None = _Unset, validation_alias: str | AliasPath | AliasChoices | None = _Unset, serialization_alias: str | None = _Unset, title: str | None = _Unset, field_title_generator: Callable[[str, FieldInfo], str] | None = _Unset, description: str | None = _Unset, examples: list[Any] | None = _Unset, exclude: bool | None = _Unset, discriminator: str | types.Discriminator | None = _Unset, deprecated: Deprecated | str | bool | None = _Unset, json_schema_extra: JsonDict | Callable[[JsonDict], None] | None = _Unset, frozen: bool | None = _Unset, validate_default: bool | None = _Unset, repr: bool = _Unset, init: bool | None = _Unset, init_var: bool | None = _Unset, kw_only: bool | None = _Unset, pattern: str | typing.Pattern[str] | None = _Unset, strict: bool | None = _Unset, coerce_numbers_to_str: bool | None = _Unset, gt: annotated_types.SupportsGt | None = _Unset, ge: annotated_types.SupportsGe | None = _Unset, lt: annotated_types.SupportsLt | None = _Unset, le: annotated_types.SupportsLe | None = _Unset, multiple_of: float | None = _Unset, allow_inf_nan: bool | None = _Unset, max_digits: int | None = _Unset, decimal_places: int | None = _Unset, min_length: int | None = _Unset, max_length: int | None = _Unset, union_mode: Literal['smart', 'left_to_right'] = _Unset, fail_fast: bool | None = _Unset, **extra: Unpack[_EmptyKwargs], ) -> _T: ... 
@overload # `default_factory` argument set def Field( *, default_factory: Callable[[], _T] | Callable[[dict[str, Any]], _T], alias: str | None = _Unset, alias_priority: int | None = _Unset, validation_alias: str | AliasPath | AliasChoices | None = _Unset, serialization_alias: str | None = _Unset, title: str | None = _Unset, field_title_generator: Callable[[str, FieldInfo], str] | None = _Unset, description: str | None = _Unset, examples: list[Any] | None = _Unset, exclude: bool | None = _Unset, discriminator: str | types.Discriminator | None = _Unset, deprecated: Deprecated | str | bool | None = _Unset, json_schema_extra: JsonDict | Callable[[JsonDict], None] | None = _Unset, frozen: bool | None = _Unset, validate_default: bool | None = _Unset, repr: bool = _Unset, init: bool | None = _Unset, init_var: bool | None = _Unset, kw_only: bool | None = _Unset, pattern: str | typing.Pattern[str] | None = _Unset, strict: bool | None = _Unset, coerce_numbers_to_str: bool | None = _Unset, gt: annotated_types.SupportsGt | None = _Unset, ge: annotated_types.SupportsGe | None = _Unset, lt: annotated_types.SupportsLt | None = _Unset, le: annotated_types.SupportsLe | None = _Unset, multiple_of: float | None = _Unset, allow_inf_nan: bool | None = _Unset, max_digits: int | None = _Unset, decimal_places: int | None = _Unset, min_length: int | None = _Unset, max_length: int | None = _Unset, union_mode: Literal['smart', 'left_to_right'] = _Unset, fail_fast: bool | None = _Unset, **extra: Unpack[_EmptyKwargs], ) -> _T: ... 
@overload def Field( # No default set *, alias: str | None = _Unset, alias_priority: int | None = _Unset, validation_alias: str | AliasPath | AliasChoices | None = _Unset, serialization_alias: str | None = _Unset, title: str | None = _Unset, field_title_generator: Callable[[str, FieldInfo], str] | None = _Unset, description: str | None = _Unset, examples: list[Any] | None = _Unset, exclude: bool | None = _Unset, discriminator: str | types.Discriminator | None = _Unset, deprecated: Deprecated | str | bool | None = _Unset, json_schema_extra: JsonDict | Callable[[JsonDict], None] | None = _Unset, frozen: bool | None = _Unset, validate_default: bool | None = _Unset, repr: bool = _Unset, init: bool | None = _Unset, init_var: bool | None = _Unset, kw_only: bool | None = _Unset, pattern: str | typing.Pattern[str] | None = _Unset, strict: bool | None = _Unset, coerce_numbers_to_str: bool | None = _Unset, gt: annotated_types.SupportsGt | None = _Unset, ge: annotated_types.SupportsGe | None = _Unset, lt: annotated_types.SupportsLt | None = _Unset, le: annotated_types.SupportsLe | None = _Unset, multiple_of: float | None = _Unset, allow_inf_nan: bool | None = _Unset, max_digits: int | None = _Unset, decimal_places: int | None = _Unset, min_length: int | None = _Unset, max_length: int | None = _Unset, union_mode: Literal['smart', 'left_to_right'] = _Unset, fail_fast: bool | None = _Unset, **extra: Unpack[_EmptyKwargs], ) -> Any: ... 
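The overload set above exists purely for type checkers: the `ellipsis` overload returns `Any` (so `x: int = Field(...)` does not trip strict checking), the `default`/`default_factory` overloads preserve `_T`, and the bare form returns `Any`. A minimal usage sketch of the three common shapes, assuming `pydantic` v2 is installed (`User` is a hypothetical model, not part of this module):

```python
from typing import List

from pydantic import BaseModel, Field, ValidationError


class User(BaseModel):
    id: int = Field(...)  # required: handled by the `ellipsis` overload
    name: str = Field('anon', min_length=1)  # `default` overload, with a length constraint
    tags: List[str] = Field(default_factory=list)  # `default_factory` overload


u = User(id=1)
print(u.name, u.tags)  # defaults applied without validation errors

try:
    User()  # missing the required `id`
except ValidationError as exc:
    print(exc.error_count())
```

The constraint kwargs (`min_length` here) are converted into `annotated_types` metadata by `FieldInfo._collect_metadata` via the `metadata_lookup` table above.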
def Field( # noqa: C901 default: Any = PydanticUndefined, *, default_factory: Callable[[], Any] | Callable[[dict[str, Any]], Any] | None = _Unset, alias: str | None = _Unset, alias_priority: int | None = _Unset, validation_alias: str | AliasPath | AliasChoices | None = _Unset, serialization_alias: str | None = _Unset, title: str | None = _Unset, field_title_generator: Callable[[str, FieldInfo], str] | None = _Unset, description: str | None = _Unset, examples: list[Any] | None = _Unset, exclude: bool | None = _Unset, discriminator: str | types.Discriminator | None = _Unset, deprecated: Deprecated | str | bool | None = _Unset, json_schema_extra: JsonDict | Callable[[JsonDict], None] | None = _Unset, frozen: bool | None = _Unset, validate_default: bool | None = _Unset, repr: bool = _Unset, init: bool | None = _Unset, init_var: bool | None = _Unset, kw_only: bool | None = _Unset, pattern: str | typing.Pattern[str] | None = _Unset, strict: bool | None = _Unset, coerce_numbers_to_str: bool | None = _Unset, gt: annotated_types.SupportsGt | None = _Unset, ge: annotated_types.SupportsGe | None = _Unset, lt: annotated_types.SupportsLt | None = _Unset, le: annotated_types.SupportsLe | None = _Unset, multiple_of: float | None = _Unset, allow_inf_nan: bool | None = _Unset, max_digits: int | None = _Unset, decimal_places: int | None = _Unset, min_length: int | None = _Unset, max_length: int | None = _Unset, union_mode: Literal['smart', 'left_to_right'] = _Unset, fail_fast: bool | None = _Unset, **extra: Unpack[_EmptyKwargs], ) -> Any: """Usage docs: https://docs.pydantic.dev/2.10/concepts/fields Create a field for objects that can be configured. Used to provide extra information about a field, either for the model schema or complex validation. Some arguments apply only to number fields (`int`, `float`, `Decimal`) and some apply only to `str`. Note: - Any `_Unset` objects will be replaced by the corresponding value defined in the `_DefaultValues` dictionary. 
            If a key for the `_Unset` object is not found in the `_DefaultValues` dictionary, it will default to `None`.

    Args:
        default: Default value if the field is not set.
        default_factory: A callable to generate the default value. The callable can either take 0 arguments
            (in which case it is called as is) or a single argument containing the already validated data.
        alias: The name to use for the attribute when validating or serializing by alias.
            This is often used for things like converting between snake and camel case.
        alias_priority: Priority of the alias. This affects whether an alias generator is used.
        validation_alias: Like `alias`, but only affects validation, not serialization.
        serialization_alias: Like `alias`, but only affects serialization, not validation.
        title: Human-readable title.
        field_title_generator: A callable that takes a field name and returns a title for it.
        description: Human-readable description.
        examples: Example values for this field.
        exclude: Whether to exclude the field from the model serialization.
        discriminator: Field name or Discriminator for discriminating the type in a tagged union.
        deprecated: A deprecation message, an instance of `warnings.deprecated` or the `typing_extensions.deprecated` backport,
            or a boolean. If `True`, a default deprecation message will be emitted when accessing the field.
        json_schema_extra: A dict or callable to provide extra JSON schema properties.
        frozen: Whether the field is frozen. If true, attempts to change the value on an instance will raise an error.
        validate_default: If `True`, apply validation to the default value every time you create an instance.
            Otherwise, for performance reasons, the default value of the field is trusted and not validated.
        repr: A boolean indicating whether to include the field in the `__repr__` output.
        init: Whether the field should be included in the constructor of the dataclass.
            (Only applies to dataclasses.)
        init_var: Whether the field should _only_ be included in the constructor of the dataclass.
            (Only applies to dataclasses.)
        kw_only: Whether the field should be a keyword-only argument in the constructor of the dataclass.
            (Only applies to dataclasses.)
        coerce_numbers_to_str: Whether to enable coercion of any `Number` type to `str` (not applicable in `strict` mode).
        strict: If `True`, strict validation is applied to the field.
            See [Strict Mode](../concepts/strict_mode.md) for details.
        gt: Greater than. If set, value must be greater than this. Only applicable to numbers.
        ge: Greater than or equal. If set, value must be greater than or equal to this. Only applicable to numbers.
        lt: Less than. If set, value must be less than this. Only applicable to numbers.
        le: Less than or equal. If set, value must be less than or equal to this. Only applicable to numbers.
        multiple_of: Value must be a multiple of this. Only applicable to numbers.
        min_length: Minimum length for iterables.
        max_length: Maximum length for iterables.
        pattern: Pattern for strings (a regular expression).
        allow_inf_nan: Allow `inf`, `-inf`, `nan`. Only applicable to numbers.
        max_digits: Maximum number of allowed digits for numbers.
        decimal_places: Maximum number of decimal places allowed for numbers.
        union_mode: The strategy to apply when validating a union. Can be `smart` (the default), or `left_to_right`.
            See [Union Mode](../concepts/unions.md#union-modes) for details.
        fail_fast: If `True`, validation will stop on the first error. If `False`, all validation errors will be collected.
            This option can be applied only to iterable types (list, tuple, set, and frozenset).
        extra: (Deprecated) Extra fields that will be included in the JSON schema.

            !!! warning Deprecated
                The `extra` kwargs are deprecated. Use `json_schema_extra` instead.

    Returns:
        A new [`FieldInfo`][pydantic.fields.FieldInfo]. The return annotation is `Any` so `Field` can be used on
            type-annotated fields without causing a type error.
""" # Check deprecated and removed params from V1. This logic should eventually be removed. const = extra.pop('const', None) # type: ignore if const is not None: raise PydanticUserError('`const` is removed, use `Literal` instead', code='removed-kwargs') min_items = extra.pop('min_items', None) # type: ignore if min_items is not None: warn('`min_items` is deprecated and will be removed, use `min_length` instead', DeprecationWarning) if min_length in (None, _Unset): min_length = min_items # type: ignore max_items = extra.pop('max_items', None) # type: ignore if max_items is not None: warn('`max_items` is deprecated and will be removed, use `max_length` instead', DeprecationWarning) if max_length in (None, _Unset): max_length = max_items # type: ignore unique_items = extra.pop('unique_items', None) # type: ignore if unique_items is not None: raise PydanticUserError( ( '`unique_items` is removed, use `Set` instead' '(this feature is discussed in https://github.com/pydantic/pydantic-core/issues/296)' ), code='removed-kwargs', ) allow_mutation = extra.pop('allow_mutation', None) # type: ignore if allow_mutation is not None: warn('`allow_mutation` is deprecated and will be removed. use `frozen` instead', DeprecationWarning) if allow_mutation is False: frozen = True regex = extra.pop('regex', None) # type: ignore if regex is not None: raise PydanticUserError('`regex` is removed. use `pattern` instead', code='removed-kwargs') if extra: warn( 'Using extra keyword arguments on `Field` is deprecated and will be removed.' ' Use `json_schema_extra` instead.' f' (Extra keys: {", ".join(k.__repr__() for k in extra.keys())})', DeprecationWarning, ) if not json_schema_extra or json_schema_extra is _Unset: json_schema_extra = extra # type: ignore if ( validation_alias and validation_alias is not _Unset and not isinstance(validation_alias, (str, AliasChoices, AliasPath)) ): raise TypeError('Invalid `validation_alias` type. 
it should be `str`, `AliasChoices`, or `AliasPath`') if serialization_alias in (_Unset, None) and isinstance(alias, str): serialization_alias = alias if validation_alias in (_Unset, None): validation_alias = alias include = extra.pop('include', None) # type: ignore if include is not None: warn('`include` is deprecated and does nothing. It will be removed, use `exclude` instead', DeprecationWarning) return FieldInfo.from_field( default, default_factory=default_factory, alias=alias, alias_priority=alias_priority, validation_alias=validation_alias, serialization_alias=serialization_alias, title=title, field_title_generator=field_title_generator, description=description, examples=examples, exclude=exclude, discriminator=discriminator, deprecated=deprecated, json_schema_extra=json_schema_extra, frozen=frozen, pattern=pattern, validate_default=validate_default, repr=repr, init=init, init_var=init_var, kw_only=kw_only, coerce_numbers_to_str=coerce_numbers_to_str, strict=strict, gt=gt, ge=ge, lt=lt, le=le, multiple_of=multiple_of, min_length=min_length, max_length=max_length, allow_inf_nan=allow_inf_nan, max_digits=max_digits, decimal_places=decimal_places, union_mode=union_mode, fail_fast=fail_fast, ) _FIELD_ARG_NAMES = set(inspect.signature(Field).parameters) _FIELD_ARG_NAMES.remove('extra') # do not include the varkwargs parameter class ModelPrivateAttr(_repr.Representation): """A descriptor for private attributes in class models. !!! warning You generally shouldn't be creating `ModelPrivateAttr` instances directly, instead use `pydantic.fields.PrivateAttr`. (This is similar to `FieldInfo` vs. `Field`.) Attributes: default: The default value of the attribute if not provided. default_factory: A callable function that generates the default value of the attribute if not provided. 
""" __slots__ = ('default', 'default_factory') def __init__( self, default: Any = PydanticUndefined, *, default_factory: typing.Callable[[], Any] | None = None ) -> None: if default is Ellipsis: self.default = PydanticUndefined else: self.default = default self.default_factory = default_factory if not typing.TYPE_CHECKING: # We put `__getattr__` in a non-TYPE_CHECKING block because otherwise, mypy allows arbitrary attribute access def __getattr__(self, item: str) -> Any: """This function improves compatibility with custom descriptors by ensuring delegation happens as expected when the default value of a private attribute is a descriptor. """ if item in {'__get__', '__set__', '__delete__'}: if hasattr(self.default, item): return getattr(self.default, item) raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') def __set_name__(self, cls: type[Any], name: str) -> None: """Preserve `__set_name__` protocol defined in https://peps.python.org/pep-0487.""" default = self.default if default is PydanticUndefined: return set_name = getattr(default, '__set_name__', None) if callable(set_name): set_name(cls, name) def get_default(self) -> Any: """Retrieve the default value of the object. If `self.default_factory` is `None`, the method will return a deep copy of the `self.default` object. If `self.default_factory` is not `None`, it will call `self.default_factory` and return the value returned. Returns: The default value of the object. """ return _utils.smart_deepcopy(self.default) if self.default_factory is None else self.default_factory() def __eq__(self, other: Any) -> bool: return isinstance(other, self.__class__) and (self.default, self.default_factory) == ( other.default, other.default_factory, ) # NOTE: Actual return type is 'ModelPrivateAttr', but we want to help type checkers # to understand the magic that happens at runtime. @overload # `default` argument set def PrivateAttr( default: _T, *, init: Literal[False] = False, ) -> _T: ... 
@overload # `default_factory` argument set def PrivateAttr( *, default_factory: Callable[[], _T], init: Literal[False] = False, ) -> _T: ... @overload # No default set def PrivateAttr( *, init: Literal[False] = False, ) -> Any: ... def PrivateAttr( default: Any = PydanticUndefined, *, default_factory: Callable[[], Any] | None = None, init: Literal[False] = False, ) -> Any: """Usage docs: https://docs.pydantic.dev/2.10/concepts/models/#private-model-attributes Indicates that an attribute is intended for private use and not handled during normal validation/serialization. Private attributes are not validated by Pydantic, so it's up to you to ensure they are used in a type-safe manner. Private attributes are stored in `__private_attributes__` on the model. Args: default: The attribute's default value. Defaults to Undefined. default_factory: Callable that will be called when a default value is needed for this attribute. If both `default` and `default_factory` are set, an error will be raised. init: Whether the attribute should be included in the constructor of the dataclass. Always `False`. Returns: An instance of [`ModelPrivateAttr`][pydantic.fields.ModelPrivateAttr] class. Raises: ValueError: If both `default` and `default_factory` are set. """ if default is not PydanticUndefined and default_factory is not None: raise TypeError('cannot specify both default and default_factory') return ModelPrivateAttr( default, default_factory=default_factory, ) @dataclasses.dataclass(**_internal_dataclass.slots_true) class ComputedFieldInfo: """A container for data from `@computed_field` so that we can access it while building the pydantic-core schema. Attributes: decorator_repr: A class variable representing the decorator string, '@computed_field'. wrapped_property: The wrapped computed field property. return_type: The type of the computed field property's return value. alias: The alias of the property to be used during serialization. alias_priority: The priority of the alias. 
            This affects whether an alias generator is used.
        title: Title of the computed field to include in the serialization JSON schema.
        field_title_generator: A callable that takes a field name and returns title for it.
        description: Description of the computed field to include in the serialization JSON schema.
        deprecated: A deprecation message, an instance of `warnings.deprecated` or the `typing_extensions.deprecated`
            backport, or a boolean. If `True`, a default deprecation message will be emitted when accessing the field.
        examples: Example values of the computed field to include in the serialization JSON schema.
        json_schema_extra: A dict or callable to provide extra JSON schema properties.
        repr: A boolean indicating whether to include the field in the __repr__ output.
    """

    decorator_repr: ClassVar[str] = '@computed_field'

    wrapped_property: property
    return_type: Any
    alias: str | None
    alias_priority: int | None
    title: str | None
    field_title_generator: typing.Callable[[str, ComputedFieldInfo], str] | None
    description: str | None
    deprecated: Deprecated | str | bool | None
    examples: list[Any] | None
    json_schema_extra: JsonDict | typing.Callable[[JsonDict], None] | None
    repr: bool

    @property
    def deprecation_message(self) -> str | None:
        """The deprecation message to be emitted, or `None` if not set."""
        if self.deprecated is None:
            return None
        if isinstance(self.deprecated, bool):
            return 'deprecated' if self.deprecated else None
        return self.deprecated if isinstance(self.deprecated, str) else self.deprecated.message


def _wrapped_property_is_private(property_: cached_property | property) -> bool:  # type: ignore
    """Return `True` if the provided property is private, `False` otherwise."""
    wrapped_name: str = ''

    if isinstance(property_, property):
        wrapped_name = getattr(property_.fget, '__name__', '')
    elif isinstance(property_, cached_property):  # type: ignore
        wrapped_name = getattr(property_.func, '__name__', '')  # type: ignore

    return wrapped_name.startswith('_') and not wrapped_name.startswith('__')


# this should really be `property[T], cached_property[T]` but property is not generic unlike cached_property
# See https://github.com/python/typing/issues/985 and linked issues
PropertyT = typing.TypeVar('PropertyT')


@typing.overload
def computed_field(
    *,
    alias: str | None = None,
    alias_priority: int | None = None,
    title: str | None = None,
    field_title_generator: typing.Callable[[str, ComputedFieldInfo], str] | None = None,
    description: str | None = None,
    deprecated: Deprecated | str | bool | None = None,
    examples: list[Any] | None = None,
    json_schema_extra: JsonDict | typing.Callable[[JsonDict], None] | None = None,
    repr: bool = True,
    return_type: Any = PydanticUndefined,
) -> typing.Callable[[PropertyT], PropertyT]: ...


@typing.overload
def computed_field(__func: PropertyT) -> PropertyT: ...


def computed_field(
    func: PropertyT | None = None,
    /,
    *,
    alias: str | None = None,
    alias_priority: int | None = None,
    title: str | None = None,
    field_title_generator: typing.Callable[[str, ComputedFieldInfo], str] | None = None,
    description: str | None = None,
    deprecated: Deprecated | str | bool | None = None,
    examples: list[Any] | None = None,
    json_schema_extra: JsonDict | typing.Callable[[JsonDict], None] | None = None,
    repr: bool | None = None,
    return_type: Any = PydanticUndefined,
) -> PropertyT | typing.Callable[[PropertyT], PropertyT]:
    """Usage docs: https://docs.pydantic.dev/2.10/concepts/fields#the-computed_field-decorator

    Decorator to include `property` and `cached_property` when serializing models or dataclasses.

    This is useful for fields that are computed from other fields, or for fields that are expensive to compute
    and should be cached.
```python from pydantic import BaseModel, computed_field class Rectangle(BaseModel): width: int length: int @computed_field @property def area(self) -> int: return self.width * self.length print(Rectangle(width=3, length=2).model_dump()) #> {'width': 3, 'length': 2, 'area': 6} ``` If applied to functions not yet decorated with `@property` or `@cached_property`, the function is automatically wrapped with `property`. Although this is more concise, you will lose IntelliSense in your IDE, and confuse static type checkers, thus explicit use of `@property` is recommended. !!! warning "Mypy Warning" Even with the `@property` or `@cached_property` applied to your function before `@computed_field`, mypy may throw a `Decorated property not supported` error. See [mypy issue #1362](https://github.com/python/mypy/issues/1362), for more information. To avoid this error message, add `# type: ignore[misc]` to the `@computed_field` line. [pyright](https://github.com/microsoft/pyright) supports `@computed_field` without error. ```python import random from pydantic import BaseModel, computed_field class Square(BaseModel): width: float @computed_field def area(self) -> float: # converted to a `property` by `computed_field` return round(self.width**2, 2) @area.setter def area(self, new_area: float) -> None: self.width = new_area**0.5 @computed_field(alias='the magic number', repr=False) def random_number(self) -> int: return random.randint(0, 1_000) square = Square(width=1.3) # `random_number` does not appear in representation print(repr(square)) #> Square(width=1.3, area=1.69) print(square.random_number) #> 3 square.area = 4 print(square.model_dump_json(by_alias=True)) #> {"width":2.0,"area":4.0,"the magic number":3} ``` !!! warning "Overriding with `computed_field`" You can't override a field from a parent class with a `computed_field` in the child class. `mypy` complains about this behavior if allowed, and `dataclasses` doesn't allow this pattern either. 
See the example below: ```python from pydantic import BaseModel, computed_field class Parent(BaseModel): a: str try: class Child(Parent): @computed_field @property def a(self) -> str: return 'new a' except ValueError as e: print(repr(e)) #> ValueError("you can't override a field with a computed field") ``` Private properties decorated with `@computed_field` have `repr=False` by default. ```python from functools import cached_property from pydantic import BaseModel, computed_field class Model(BaseModel): foo: int @computed_field @cached_property def _private_cached_property(self) -> int: return -self.foo @computed_field @property def _private_property(self) -> int: return -self.foo m = Model(foo=1) print(repr(m)) #> Model(foo=1) ``` Args: func: the function to wrap. alias: alias to use when serializing this computed field, only used when `by_alias=True` alias_priority: priority of the alias. This affects whether an alias generator is used title: Title to use when including this computed field in JSON Schema field_title_generator: A callable that takes a field name and returns title for it. description: Description to use when including this computed field in JSON Schema, defaults to the function's docstring deprecated: A deprecation message (or an instance of `warnings.deprecated` or the `typing_extensions.deprecated` backport). to be emitted when accessing the field. Or a boolean. This will automatically be set if the property is decorated with the `deprecated` decorator. examples: Example values to use when including this computed field in JSON Schema json_schema_extra: A dict or callable to provide extra JSON schema properties. repr: whether to include this computed field in model repr. Default is `False` for private properties and `True` for public properties. return_type: optional return for serialization logic to expect when serializing to JSON, if included this must be correct, otherwise a `TypeError` is raised. 
            If you don't include a return type Any is used, which does runtime introspection to handle arbitrary
            objects.

    Returns:
        A proxy wrapper for the property.
    """

    def dec(f: Any) -> Any:
        nonlocal description, deprecated, return_type, alias_priority

        unwrapped = _decorators.unwrap_wrapped_function(f)

        if description is None and unwrapped.__doc__:
            description = inspect.cleandoc(unwrapped.__doc__)

        if deprecated is None and hasattr(unwrapped, '__deprecated__'):
            deprecated = unwrapped.__deprecated__

        # if the function isn't already decorated with `@property` (or another descriptor), then we wrap it now
        f = _decorators.ensure_property(f)
        alias_priority = (alias_priority or 2) if alias is not None else None

        if repr is None:
            repr_: bool = not _wrapped_property_is_private(property_=f)
        else:
            repr_ = repr

        dec_info = ComputedFieldInfo(
            f,
            return_type,
            alias,
            alias_priority,
            title,
            field_title_generator,
            description,
            deprecated,
            examples,
            json_schema_extra,
            repr_,
        )
        return _decorators.PydanticDescriptorProxy(f, dec_info)

    if func is None:
        return dec
    else:
        return dec(func)


# pydantic-2.10.6/pydantic/functional_serializers.py

"""This module contains related classes and functions for serialization."""

from __future__ import annotations

import dataclasses
from functools import partial, partialmethod
from typing import TYPE_CHECKING, Any, Callable, TypeVar, overload

from pydantic_core import PydanticUndefined, core_schema
from pydantic_core.core_schema import SerializationInfo, SerializerFunctionWrapHandler, WhenUsed
from typing_extensions import Annotated, Literal, TypeAlias

from . import PydanticUndefinedAnnotation
from ._internal import _decorators, _internal_dataclass
from .annotated_handlers import GetCoreSchemaHandler


@dataclasses.dataclass(**_internal_dataclass.slots_true, frozen=True)
class PlainSerializer:
    """Plain serializers use a function to modify the output of serialization.
This is particularly helpful when you want to customize the serialization for annotated types. Consider an input of `list`, which will be serialized into a space-delimited string. ```python from typing import List from typing_extensions import Annotated from pydantic import BaseModel, PlainSerializer CustomStr = Annotated[ List, PlainSerializer(lambda x: ' '.join(x), return_type=str) ] class StudentModel(BaseModel): courses: CustomStr student = StudentModel(courses=['Math', 'Chemistry', 'English']) print(student.model_dump()) #> {'courses': 'Math Chemistry English'} ``` Attributes: func: The serializer function. return_type: The return type for the function. If omitted it will be inferred from the type annotation. when_used: Determines when this serializer should be used. Accepts a string with values `'always'`, `'unless-none'`, `'json'`, and `'json-unless-none'`. Defaults to 'always'. """ func: core_schema.SerializerFunction return_type: Any = PydanticUndefined when_used: WhenUsed = 'always' def __get_pydantic_core_schema__(self, source_type: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: """Gets the Pydantic core schema. Args: source_type: The source type. handler: The `GetCoreSchemaHandler` instance. Returns: The Pydantic core schema. """ schema = handler(source_type) try: # Do not pass in globals as the function could be defined in a different module. 
# Instead, let `get_function_return_type` infer the globals to use, but still pass # in locals that may contain a parent/rebuild namespace: return_type = _decorators.get_function_return_type( self.func, self.return_type, localns=handler._get_types_namespace().locals, ) except NameError as e: raise PydanticUndefinedAnnotation.from_name_error(e) from e return_schema = None if return_type is PydanticUndefined else handler.generate_schema(return_type) schema['serialization'] = core_schema.plain_serializer_function_ser_schema( function=self.func, info_arg=_decorators.inspect_annotated_serializer(self.func, 'plain'), return_schema=return_schema, when_used=self.when_used, ) return schema @dataclasses.dataclass(**_internal_dataclass.slots_true, frozen=True) class WrapSerializer: """Wrap serializers receive the raw inputs along with a handler function that applies the standard serialization logic, and can modify the resulting value before returning it as the final output of serialization. For example, here's a scenario in which a wrap serializer transforms timezones to UTC **and** utilizes the existing `datetime` serialization logic. ```python from datetime import datetime, timezone from typing import Any, Dict from typing_extensions import Annotated from pydantic import BaseModel, WrapSerializer class EventDatetime(BaseModel): start: datetime end: datetime def convert_to_utc(value: Any, handler, info) -> Dict[str, datetime]: # Note that `handler` can actually help serialize the `value` for # further custom serialization in case it's a subclass. 
partial_result = handler(value, info) if info.mode == 'json': return { k: datetime.fromisoformat(v).astimezone(timezone.utc) for k, v in partial_result.items() } return {k: v.astimezone(timezone.utc) for k, v in partial_result.items()} UTCEventDatetime = Annotated[EventDatetime, WrapSerializer(convert_to_utc)] class EventModel(BaseModel): event_datetime: UTCEventDatetime dt = EventDatetime( start='2024-01-01T07:00:00-08:00', end='2024-01-03T20:00:00+06:00' ) event = EventModel(event_datetime=dt) print(event.model_dump()) ''' { 'event_datetime': { 'start': datetime.datetime( 2024, 1, 1, 15, 0, tzinfo=datetime.timezone.utc ), 'end': datetime.datetime( 2024, 1, 3, 14, 0, tzinfo=datetime.timezone.utc ), } } ''' print(event.model_dump_json()) ''' {"event_datetime":{"start":"2024-01-01T15:00:00Z","end":"2024-01-03T14:00:00Z"}} ''' ``` Attributes: func: The serializer function to be wrapped. return_type: The return type for the function. If omitted it will be inferred from the type annotation. when_used: Determines when this serializer should be used. Accepts a string with values `'always'`, `'unless-none'`, `'json'`, and `'json-unless-none'`. Defaults to 'always'. """ func: core_schema.WrapSerializerFunction return_type: Any = PydanticUndefined when_used: WhenUsed = 'always' def __get_pydantic_core_schema__(self, source_type: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: """This method is used to get the Pydantic core schema of the class. Args: source_type: Source type. handler: Core schema handler. Returns: The generated core schema of the class. """ schema = handler(source_type) globalns, localns = handler._get_types_namespace() try: # Do not pass in globals as the function could be defined in a different module. 
# Instead, let `get_function_return_type` infer the globals to use, but still pass # in locals that may contain a parent/rebuild namespace: return_type = _decorators.get_function_return_type( self.func, self.return_type, localns=handler._get_types_namespace().locals, ) except NameError as e: raise PydanticUndefinedAnnotation.from_name_error(e) from e return_schema = None if return_type is PydanticUndefined else handler.generate_schema(return_type) schema['serialization'] = core_schema.wrap_serializer_function_ser_schema( function=self.func, info_arg=_decorators.inspect_annotated_serializer(self.func, 'wrap'), return_schema=return_schema, when_used=self.when_used, ) return schema if TYPE_CHECKING: _Partial: TypeAlias = 'partial[Any] | partialmethod[Any]' FieldPlainSerializer: TypeAlias = 'core_schema.SerializerFunction | _Partial' """A field serializer method or function in `plain` mode.""" FieldWrapSerializer: TypeAlias = 'core_schema.WrapSerializerFunction | _Partial' """A field serializer method or function in `wrap` mode.""" FieldSerializer: TypeAlias = 'FieldPlainSerializer | FieldWrapSerializer' """A field serializer method or function.""" _FieldPlainSerializerT = TypeVar('_FieldPlainSerializerT', bound=FieldPlainSerializer) _FieldWrapSerializerT = TypeVar('_FieldWrapSerializerT', bound=FieldWrapSerializer) @overload def field_serializer( field: str, /, *fields: str, mode: Literal['wrap'], return_type: Any = ..., when_used: WhenUsed = ..., check_fields: bool | None = ..., ) -> Callable[[_FieldWrapSerializerT], _FieldWrapSerializerT]: ... @overload def field_serializer( field: str, /, *fields: str, mode: Literal['plain'] = ..., return_type: Any = ..., when_used: WhenUsed = ..., check_fields: bool | None = ..., ) -> Callable[[_FieldPlainSerializerT], _FieldPlainSerializerT]: ... 
def field_serializer(
    *fields: str,
    mode: Literal['plain', 'wrap'] = 'plain',
    return_type: Any = PydanticUndefined,
    when_used: WhenUsed = 'always',
    check_fields: bool | None = None,
) -> (
    Callable[[_FieldWrapSerializerT], _FieldWrapSerializerT]
    | Callable[[_FieldPlainSerializerT], _FieldPlainSerializerT]
):
    """Decorator that enables custom field serialization.

    In the below example, a field of type `set` is used to mitigate duplication. A `field_serializer` is used to
    serialize the data as a sorted list.

    ```python
    from typing import Set

    from pydantic import BaseModel, field_serializer

    class StudentModel(BaseModel):
        name: str = 'Jane'
        courses: Set[str]

        @field_serializer('courses', when_used='json')
        def serialize_courses_in_order(self, courses: Set[str]):
            return sorted(courses)

    student = StudentModel(courses={'Math', 'Chemistry', 'English'})
    print(student.model_dump_json())
    #> {"name":"Jane","courses":["Chemistry","English","Math"]}
    ```

    See [Custom serializers](../concepts/serialization.md#custom-serializers) for more information.

    Four signatures are supported:

    - `(self, value: Any, info: FieldSerializationInfo)`
    - `(self, value: Any, nxt: SerializerFunctionWrapHandler, info: FieldSerializationInfo)`
    - `(value: Any, info: SerializationInfo)`
    - `(value: Any, nxt: SerializerFunctionWrapHandler, info: SerializationInfo)`

    Args:
        fields: Which field(s) the method should be called on.
        mode: The serialization mode.

            - `plain` means the function will be called instead of the default serialization logic,
            - `wrap` means the function will be called with an argument to optionally call the default
                serialization logic.
        return_type: Optional return type for the function, if omitted it will be inferred from the type annotation.
        when_used: Determines when this serializer should be used.
        check_fields: Whether to check that the fields actually exist on the model.

    Returns:
        The decorator function.
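    A serializer with `mode='wrap'` receives a handler that runs the default serialization logic, which the
    function can post-process. A minimal sketch (the `Invoice` model and `total` field are illustrative only):

    ```python
    from pydantic import BaseModel, field_serializer

    class Invoice(BaseModel):
        total: float

        @field_serializer('total', mode='wrap')
        def round_total(self, value, nxt, info):
            # run the default float serialization, then round the result
            return round(nxt(value), 2)

    print(Invoice(total=3.14159).model_dump())
    #> {'total': 3.14}
    ```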
""" def dec(f: FieldSerializer) -> _decorators.PydanticDescriptorProxy[Any]: dec_info = _decorators.FieldSerializerDecoratorInfo( fields=fields, mode=mode, return_type=return_type, when_used=when_used, check_fields=check_fields, ) return _decorators.PydanticDescriptorProxy(f, dec_info) # pyright: ignore[reportArgumentType] return dec # pyright: ignore[reportReturnType] if TYPE_CHECKING: # The first argument in the following callables represent the `self` type: ModelPlainSerializerWithInfo: TypeAlias = Callable[[Any, SerializationInfo], Any] """A model serializer method with the `info` argument, in `plain` mode.""" ModelPlainSerializerWithoutInfo: TypeAlias = Callable[[Any], Any] """A model serializer method without the `info` argument, in `plain` mode.""" ModelPlainSerializer: TypeAlias = 'ModelPlainSerializerWithInfo | ModelPlainSerializerWithoutInfo' """A model serializer method in `plain` mode.""" ModelWrapSerializerWithInfo: TypeAlias = Callable[[Any, SerializerFunctionWrapHandler, SerializationInfo], Any] """A model serializer method with the `info` argument, in `wrap` mode.""" ModelWrapSerializerWithoutInfo: TypeAlias = Callable[[Any, SerializerFunctionWrapHandler], Any] """A model serializer method without the `info` argument, in `wrap` mode.""" ModelWrapSerializer: TypeAlias = 'ModelWrapSerializerWithInfo | ModelWrapSerializerWithoutInfo' """A model serializer method in `wrap` mode.""" ModelSerializer: TypeAlias = 'ModelPlainSerializer | ModelWrapSerializer' _ModelPlainSerializerT = TypeVar('_ModelPlainSerializerT', bound=ModelPlainSerializer) _ModelWrapSerializerT = TypeVar('_ModelWrapSerializerT', bound=ModelWrapSerializer) @overload def model_serializer(f: _ModelPlainSerializerT, /) -> _ModelPlainSerializerT: ... @overload def model_serializer( *, mode: Literal['wrap'], when_used: WhenUsed = 'always', return_type: Any = ... ) -> Callable[[_ModelWrapSerializerT], _ModelWrapSerializerT]: ... 
@overload
def model_serializer(
    *,
    mode: Literal['plain'] = ...,
    when_used: WhenUsed = 'always',
    return_type: Any = ...,
) -> Callable[[_ModelPlainSerializerT], _ModelPlainSerializerT]: ...


def model_serializer(
    f: _ModelPlainSerializerT | _ModelWrapSerializerT | None = None,
    /,
    *,
    mode: Literal['plain', 'wrap'] = 'plain',
    when_used: WhenUsed = 'always',
    return_type: Any = PydanticUndefined,
) -> (
    _ModelPlainSerializerT
    | Callable[[_ModelWrapSerializerT], _ModelWrapSerializerT]
    | Callable[[_ModelPlainSerializerT], _ModelPlainSerializerT]
):
    """Decorator that enables custom model serialization.

    This is useful when a model needs to be serialized in a customized manner, allowing for flexibility beyond
    just specific fields.

    An example would be to serialize temperature to the same temperature scale, such as degrees Celsius.

    ```python
    from typing import Literal

    from pydantic import BaseModel, model_serializer

    class TemperatureModel(BaseModel):
        unit: Literal['C', 'F']
        value: int

        @model_serializer()
        def serialize_model(self):
            if self.unit == 'F':
                return {'unit': 'C', 'value': int((self.value - 32) / 1.8)}
            return {'unit': self.unit, 'value': self.value}

    temperature = TemperatureModel(unit='F', value=212)
    print(temperature.model_dump())
    #> {'unit': 'C', 'value': 100}
    ```

    Two signatures are supported for `mode='plain'`, which is the default:

    - `(self)`
    - `(self, info: SerializationInfo)`

    And two other signatures for `mode='wrap'`:

    - `(self, nxt: SerializerFunctionWrapHandler)`
    - `(self, nxt: SerializerFunctionWrapHandler, info: SerializationInfo)`

    See [Custom serializers](../concepts/serialization.md#custom-serializers) for more information.

    Args:
        f: The function to be decorated.
        mode: The serialization mode.

            - `'plain'` means the function will be called instead of the default serialization logic
            - `'wrap'` means the function will be called with an argument to optionally call the default
                serialization logic.
        when_used: Determines when this serializer should be used.
return_type: The return type for the function. If omitted it will be inferred from the type annotation. Returns: The decorator function. """ def dec(f: ModelSerializer) -> _decorators.PydanticDescriptorProxy[Any]: dec_info = _decorators.ModelSerializerDecoratorInfo(mode=mode, return_type=return_type, when_used=when_used) return _decorators.PydanticDescriptorProxy(f, dec_info) if f is None: return dec # pyright: ignore[reportReturnType] else: return dec(f) # pyright: ignore[reportReturnType] AnyType = TypeVar('AnyType') if TYPE_CHECKING: SerializeAsAny = Annotated[AnyType, ...] # SerializeAsAny[list[str]] will be treated by type checkers as list[str] """Force serialization to ignore whatever is defined in the schema and instead ask the object itself how it should be serialized. In particular, this means that when model subclasses are serialized, fields present in the subclass but not in the original schema will be included. """ else: @dataclasses.dataclass(**_internal_dataclass.slots_true) class SerializeAsAny: # noqa: D101 def __class_getitem__(cls, item: Any) -> Any: return Annotated[item, SerializeAsAny()] def __get_pydantic_core_schema__( self, source_type: Any, handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: schema = handler(source_type) schema_to_update = schema while schema_to_update['type'] == 'definitions': schema_to_update = schema_to_update.copy() schema_to_update = schema_to_update['schema'] schema_to_update['serialization'] = core_schema.wrap_serializer_function_ser_schema( lambda x, h: h(x), schema=core_schema.any_schema() ) return schema __hash__ = object.__hash__ pydantic-2.10.6/pydantic/functional_validators.py000066400000000000000000000713661474456633400222270ustar00rootroot00000000000000"""This module contains related classes and functions for validation.""" from __future__ import annotations as _annotations import dataclasses import sys from functools import partialmethod from types import FunctionType from typing import TYPE_CHECKING, 
Any, Callable, TypeVar, Union, cast, overload from pydantic_core import PydanticUndefined, core_schema from pydantic_core import core_schema as _core_schema from typing_extensions import Annotated, Literal, Self, TypeAlias from ._internal import _decorators, _generics, _internal_dataclass from .annotated_handlers import GetCoreSchemaHandler from .errors import PydanticUserError if sys.version_info < (3, 11): from typing_extensions import Protocol else: from typing import Protocol _inspect_validator = _decorators.inspect_validator @dataclasses.dataclass(frozen=True, **_internal_dataclass.slots_true) class AfterValidator: """Usage docs: https://docs.pydantic.dev/2.10/concepts/validators/#field-validators A metadata class that indicates that a validation should be applied **after** the inner validation logic. Attributes: func: The validator function. Example: ```python from typing_extensions import Annotated from pydantic import AfterValidator, BaseModel, ValidationError MyInt = Annotated[int, AfterValidator(lambda v: v + 1)] class Model(BaseModel): a: MyInt print(Model(a=1).a) #> 2 try: Model(a='a') except ValidationError as e: print(e.json(indent=2)) ''' [ { "type": "int_parsing", "loc": [ "a" ], "msg": "Input should be a valid integer, unable to parse string as an integer", "input": "a", "url": "https://errors.pydantic.dev/2/v/int_parsing" } ] ''' ``` """ func: core_schema.NoInfoValidatorFunction | core_schema.WithInfoValidatorFunction def __get_pydantic_core_schema__(self, source_type: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: schema = handler(source_type) info_arg = _inspect_validator(self.func, 'after') if info_arg: func = cast(core_schema.WithInfoValidatorFunction, self.func) return core_schema.with_info_after_validator_function(func, schema=schema, field_name=handler.field_name) else: func = cast(core_schema.NoInfoValidatorFunction, self.func) return core_schema.no_info_after_validator_function(func, schema=schema) @classmethod def 
_from_decorator(cls, decorator: _decorators.Decorator[_decorators.FieldValidatorDecoratorInfo]) -> Self: return cls(func=decorator.func) @dataclasses.dataclass(frozen=True, **_internal_dataclass.slots_true) class BeforeValidator: """Usage docs: https://docs.pydantic.dev/2.10/concepts/validators/#field-validators A metadata class that indicates that a validation should be applied **before** the inner validation logic. Attributes: func: The validator function. json_schema_input_type: The input type of the function. This is only used to generate the appropriate JSON Schema (in validation mode). Example: ```python from typing_extensions import Annotated from pydantic import BaseModel, BeforeValidator MyInt = Annotated[int, BeforeValidator(lambda v: v + 1)] class Model(BaseModel): a: MyInt print(Model(a=1).a) #> 2 try: Model(a='a') except TypeError as e: print(e) #> can only concatenate str (not "int") to str ``` """ func: core_schema.NoInfoValidatorFunction | core_schema.WithInfoValidatorFunction json_schema_input_type: Any = PydanticUndefined def __get_pydantic_core_schema__(self, source_type: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: schema = handler(source_type) input_schema = ( None if self.json_schema_input_type is PydanticUndefined else handler.generate_schema(self.json_schema_input_type) ) info_arg = _inspect_validator(self.func, 'before') if info_arg: func = cast(core_schema.WithInfoValidatorFunction, self.func) return core_schema.with_info_before_validator_function( func, schema=schema, field_name=handler.field_name, json_schema_input_schema=input_schema, ) else: func = cast(core_schema.NoInfoValidatorFunction, self.func) return core_schema.no_info_before_validator_function( func, schema=schema, json_schema_input_schema=input_schema ) @classmethod def _from_decorator(cls, decorator: _decorators.Decorator[_decorators.FieldValidatorDecoratorInfo]) -> Self: return cls( func=decorator.func, 
json_schema_input_type=decorator.info.json_schema_input_type, ) @dataclasses.dataclass(frozen=True, **_internal_dataclass.slots_true) class PlainValidator: """Usage docs: https://docs.pydantic.dev/2.10/concepts/validators/#field-validators A metadata class that indicates that a validation should be applied **instead** of the inner validation logic. !!! note Before v2.9, `PlainValidator` wasn't always compatible with JSON Schema generation for `mode='validation'`. You can now use the `json_schema_input_type` argument to specify the input type of the function to be used in the JSON schema when `mode='validation'` (the default). See the example below for more details. Attributes: func: The validator function. json_schema_input_type: The input type of the function. This is only used to generate the appropriate JSON Schema (in validation mode). If not provided, will default to `Any`. Example: ```python from typing import Union from typing_extensions import Annotated from pydantic import BaseModel, PlainValidator MyInt = Annotated[ int, PlainValidator( lambda v: int(v) + 1, json_schema_input_type=Union[str, int] # (1)! ), ] class Model(BaseModel): a: MyInt print(Model(a='1').a) #> 2 print(Model(a=1).a) #> 2 ``` 1. In this example, we've specified the `json_schema_input_type` as `Union[str, int]` which indicates to the JSON schema generator that in validation mode, the input type for the `a` field can be either a `str` or an `int`. """ func: core_schema.NoInfoValidatorFunction | core_schema.WithInfoValidatorFunction json_schema_input_type: Any = Any def __get_pydantic_core_schema__(self, source_type: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: # Note that for some valid uses of PlainValidator, it is not possible to generate a core schema for the # source_type, so calling `handler(source_type)` will error, which prevents us from generating a proper # serialization schema. 
To work around this for use cases that will not involve serialization, we simply # catch any PydanticSchemaGenerationError that may be raised while attempting to build the serialization schema # and abort any attempts to handle special serialization. from pydantic import PydanticSchemaGenerationError try: schema = handler(source_type) # TODO if `schema['serialization']` is one of `'include-exclude-dict/sequence', # schema validation will fail. That's why we use 'type ignore' comments below. serialization = schema.get( 'serialization', core_schema.wrap_serializer_function_ser_schema( function=lambda v, h: h(v), schema=schema, return_schema=handler.generate_schema(source_type), ), ) except PydanticSchemaGenerationError: serialization = None input_schema = handler.generate_schema(self.json_schema_input_type) info_arg = _inspect_validator(self.func, 'plain') if info_arg: func = cast(core_schema.WithInfoValidatorFunction, self.func) return core_schema.with_info_plain_validator_function( func, field_name=handler.field_name, serialization=serialization, # pyright: ignore[reportArgumentType] json_schema_input_schema=input_schema, ) else: func = cast(core_schema.NoInfoValidatorFunction, self.func) return core_schema.no_info_plain_validator_function( func, serialization=serialization, # pyright: ignore[reportArgumentType] json_schema_input_schema=input_schema, ) @classmethod def _from_decorator(cls, decorator: _decorators.Decorator[_decorators.FieldValidatorDecoratorInfo]) -> Self: return cls( func=decorator.func, json_schema_input_type=decorator.info.json_schema_input_type, ) @dataclasses.dataclass(frozen=True, **_internal_dataclass.slots_true) class WrapValidator: """Usage docs: https://docs.pydantic.dev/2.10/concepts/validators/#field-validators A metadata class that indicates that a validation should be applied **around** the inner validation logic. Attributes: func: The validator function. json_schema_input_type: The input type of the function. 
This is only used to generate the appropriate JSON Schema (in validation mode). ```python from datetime import datetime from typing_extensions import Annotated from pydantic import BaseModel, ValidationError, WrapValidator def validate_timestamp(v, handler): if v == 'now': # we don't want to bother with further validation, just return the new value return datetime.now() try: return handler(v) except ValidationError: # validation failed, in this case we want to return a default value return datetime(2000, 1, 1) MyTimestamp = Annotated[datetime, WrapValidator(validate_timestamp)] class Model(BaseModel): a: MyTimestamp print(Model(a='now').a) #> 2032-01-02 03:04:05.000006 print(Model(a='invalid').a) #> 2000-01-01 00:00:00 ``` """ func: core_schema.NoInfoWrapValidatorFunction | core_schema.WithInfoWrapValidatorFunction json_schema_input_type: Any = PydanticUndefined def __get_pydantic_core_schema__(self, source_type: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: schema = handler(source_type) input_schema = ( None if self.json_schema_input_type is PydanticUndefined else handler.generate_schema(self.json_schema_input_type) ) info_arg = _inspect_validator(self.func, 'wrap') if info_arg: func = cast(core_schema.WithInfoWrapValidatorFunction, self.func) return core_schema.with_info_wrap_validator_function( func, schema=schema, field_name=handler.field_name, json_schema_input_schema=input_schema, ) else: func = cast(core_schema.NoInfoWrapValidatorFunction, self.func) return core_schema.no_info_wrap_validator_function( func, schema=schema, json_schema_input_schema=input_schema, ) @classmethod def _from_decorator(cls, decorator: _decorators.Decorator[_decorators.FieldValidatorDecoratorInfo]) -> Self: return cls( func=decorator.func, json_schema_input_type=decorator.info.json_schema_input_type, ) if TYPE_CHECKING: class _OnlyValueValidatorClsMethod(Protocol): def __call__(self, cls: Any, value: Any, /) -> Any: ... 
class _V2ValidatorClsMethod(Protocol): def __call__(self, cls: Any, value: Any, info: _core_schema.ValidationInfo, /) -> Any: ... class _OnlyValueWrapValidatorClsMethod(Protocol): def __call__(self, cls: Any, value: Any, handler: _core_schema.ValidatorFunctionWrapHandler, /) -> Any: ... class _V2WrapValidatorClsMethod(Protocol): def __call__( self, cls: Any, value: Any, handler: _core_schema.ValidatorFunctionWrapHandler, info: _core_schema.ValidationInfo, /, ) -> Any: ... _V2Validator = Union[ _V2ValidatorClsMethod, _core_schema.WithInfoValidatorFunction, _OnlyValueValidatorClsMethod, _core_schema.NoInfoValidatorFunction, ] _V2WrapValidator = Union[ _V2WrapValidatorClsMethod, _core_schema.WithInfoWrapValidatorFunction, _OnlyValueWrapValidatorClsMethod, _core_schema.NoInfoWrapValidatorFunction, ] _PartialClsOrStaticMethod: TypeAlias = Union[classmethod[Any, Any, Any], staticmethod[Any, Any], partialmethod[Any]] _V2BeforeAfterOrPlainValidatorType = TypeVar( '_V2BeforeAfterOrPlainValidatorType', bound=Union[_V2Validator, _PartialClsOrStaticMethod], ) _V2WrapValidatorType = TypeVar('_V2WrapValidatorType', bound=Union[_V2WrapValidator, _PartialClsOrStaticMethod]) FieldValidatorModes: TypeAlias = Literal['before', 'after', 'wrap', 'plain'] @overload def field_validator( field: str, /, *fields: str, mode: Literal['wrap'], check_fields: bool | None = ..., json_schema_input_type: Any = ..., ) -> Callable[[_V2WrapValidatorType], _V2WrapValidatorType]: ... @overload def field_validator( field: str, /, *fields: str, mode: Literal['before', 'plain'], check_fields: bool | None = ..., json_schema_input_type: Any = ..., ) -> Callable[[_V2BeforeAfterOrPlainValidatorType], _V2BeforeAfterOrPlainValidatorType]: ... @overload def field_validator( field: str, /, *fields: str, mode: Literal['after'] = ..., check_fields: bool | None = ..., ) -> Callable[[_V2BeforeAfterOrPlainValidatorType], _V2BeforeAfterOrPlainValidatorType]: ... 
def field_validator( field: str, /, *fields: str, mode: FieldValidatorModes = 'after', check_fields: bool | None = None, json_schema_input_type: Any = PydanticUndefined, ) -> Callable[[Any], Any]: """Usage docs: https://docs.pydantic.dev/2.10/concepts/validators/#field-validators Decorate methods on the class indicating that they should be used to validate fields. Example usage: ```python from typing import Any from pydantic import ( BaseModel, ValidationError, field_validator, ) class Model(BaseModel): a: str @field_validator('a') @classmethod def ensure_foobar(cls, v: Any): if 'foobar' not in v: raise ValueError('"foobar" not found in a') return v print(repr(Model(a='this is foobar good'))) #> Model(a='this is foobar good') try: Model(a='snap') except ValidationError as exc_info: print(exc_info) ''' 1 validation error for Model a Value error, "foobar" not found in a [type=value_error, input_value='snap', input_type=str] ''' ``` For more in depth examples, see [Field Validators](../concepts/validators.md#field-validators). Args: field: The first field the `field_validator` should be called on; this is separate from `fields` to ensure an error is raised if you don't pass at least one. *fields: Additional field(s) the `field_validator` should be called on. mode: Specifies whether to validate the fields before or after validation. check_fields: Whether to check that the fields actually exist on the model. json_schema_input_type: The input type of the function. This is only used to generate the appropriate JSON Schema (in validation mode) and can only be specified when `mode` is either `'before'`, `'plain'` or `'wrap'`. Returns: A decorator that can be used to decorate a function to be used as a field_validator. Raises: PydanticUserError: - If `@field_validator` is used bare (with no fields). - If the args passed to `@field_validator` as fields are not strings. - If `@field_validator` is applied to instance methods.
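Note that `json_schema_input_type` only affects the generated JSON Schema, not runtime validation. A minimal sketch of its use with `mode='before'` (model and field names here are illustrative, not part of the public API):

```python
from typing import Any, Union

from pydantic import BaseModel, field_validator

class Model(BaseModel):
    a: int

    @field_validator('a', mode='before', json_schema_input_type=Union[int, str])
    @classmethod
    def coerce(cls, v: Any) -> Any:
        # accept numeric strings in addition to ints; the declared
        # input type is only reflected in the validation JSON schema
        return int(v) if isinstance(v, str) else v

print(Model(a='5').a)
#> 5
```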
""" if isinstance(field, FunctionType): raise PydanticUserError( '`@field_validator` should be used with fields and keyword arguments, not bare. ' "E.g. usage should be `@field_validator('<field_name>', ...)`", code='validator-no-fields', ) if mode not in ('before', 'plain', 'wrap') and json_schema_input_type is not PydanticUndefined: raise PydanticUserError( f"`json_schema_input_type` can't be used when mode is set to {mode!r}", code='validator-input-type', ) if json_schema_input_type is PydanticUndefined and mode == 'plain': json_schema_input_type = Any fields = field, *fields if not all(isinstance(field, str) for field in fields): raise PydanticUserError( '`@field_validator` fields should be passed as separate string args. ' "E.g. usage should be `@field_validator('<field_name_1>', '<field_name_2>', ...)`", code='validator-invalid-fields', ) def dec( f: Callable[..., Any] | staticmethod[Any, Any] | classmethod[Any, Any, Any], ) -> _decorators.PydanticDescriptorProxy[Any]: if _decorators.is_instance_method_from_sig(f): raise PydanticUserError( '`@field_validator` cannot be applied to instance methods', code='validator-instance-method' ) # auto apply the @classmethod decorator f = _decorators.ensure_classmethod_based_on_signature(f) dec_info = _decorators.FieldValidatorDecoratorInfo( fields=fields, mode=mode, check_fields=check_fields, json_schema_input_type=json_schema_input_type ) return _decorators.PydanticDescriptorProxy(f, dec_info) return dec _ModelType = TypeVar('_ModelType') _ModelTypeCo = TypeVar('_ModelTypeCo', covariant=True) class ModelWrapValidatorHandler(_core_schema.ValidatorFunctionWrapHandler, Protocol[_ModelTypeCo]): """`@model_validator` decorated function handler argument type. This is used when `mode='wrap'`.""" def __call__( # noqa: D102 self, value: Any, outer_location: str | int | None = None, /, ) -> _ModelTypeCo: # pragma: no cover ... class ModelWrapValidatorWithoutInfo(Protocol[_ModelType]): """A `@model_validator` decorated function signature.
This is used when `mode='wrap'` and the function does not have info argument. """ def __call__( # noqa: D102 self, cls: type[_ModelType], # this can be a dict, a model instance # or anything else that gets passed to validate_python # thus validators _must_ handle all cases value: Any, handler: ModelWrapValidatorHandler[_ModelType], /, ) -> _ModelType: ... class ModelWrapValidator(Protocol[_ModelType]): """A `@model_validator` decorated function signature. This is used when `mode='wrap'`.""" def __call__( # noqa: D102 self, cls: type[_ModelType], # this can be a dict, a model instance # or anything else that gets passed to validate_python # thus validators _must_ handle all cases value: Any, handler: ModelWrapValidatorHandler[_ModelType], info: _core_schema.ValidationInfo, /, ) -> _ModelType: ... class FreeModelBeforeValidatorWithoutInfo(Protocol): """A `@model_validator` decorated function signature. This is used when `mode='before'` and the function does not have info argument. """ def __call__( # noqa: D102 self, # this can be a dict, a model instance # or anything else that gets passed to validate_python # thus validators _must_ handle all cases value: Any, /, ) -> Any: ... class ModelBeforeValidatorWithoutInfo(Protocol): """A `@model_validator` decorated function signature. This is used when `mode='before'` and the function does not have info argument. """ def __call__( # noqa: D102 self, cls: Any, # this can be a dict, a model instance # or anything else that gets passed to validate_python # thus validators _must_ handle all cases value: Any, /, ) -> Any: ... class FreeModelBeforeValidator(Protocol): """A `@model_validator` decorated function signature. This is used when `mode='before'`.""" def __call__( # noqa: D102 self, # this can be a dict, a model instance # or anything else that gets passed to validate_python # thus validators _must_ handle all cases value: Any, info: _core_schema.ValidationInfo, /, ) -> Any: ... 
class ModelBeforeValidator(Protocol): """A `@model_validator` decorated function signature. This is used when `mode='before'`.""" def __call__( # noqa: D102 self, cls: Any, # this can be a dict, a model instance # or anything else that gets passed to validate_python # thus validators _must_ handle all cases value: Any, info: _core_schema.ValidationInfo, /, ) -> Any: ... ModelAfterValidatorWithoutInfo = Callable[[_ModelType], _ModelType] """A `@model_validator` decorated function signature. This is used when `mode='after'` and the function does not have info argument. """ ModelAfterValidator = Callable[[_ModelType, _core_schema.ValidationInfo], _ModelType] """A `@model_validator` decorated function signature. This is used when `mode='after'`.""" _AnyModelWrapValidator = Union[ModelWrapValidator[_ModelType], ModelWrapValidatorWithoutInfo[_ModelType]] _AnyModelBeforeValidator = Union[ FreeModelBeforeValidator, ModelBeforeValidator, FreeModelBeforeValidatorWithoutInfo, ModelBeforeValidatorWithoutInfo ] _AnyModelAfterValidator = Union[ModelAfterValidator[_ModelType], ModelAfterValidatorWithoutInfo[_ModelType]] @overload def model_validator( *, mode: Literal['wrap'], ) -> Callable[ [_AnyModelWrapValidator[_ModelType]], _decorators.PydanticDescriptorProxy[_decorators.ModelValidatorDecoratorInfo] ]: ... @overload def model_validator( *, mode: Literal['before'], ) -> Callable[ [_AnyModelBeforeValidator], _decorators.PydanticDescriptorProxy[_decorators.ModelValidatorDecoratorInfo] ]: ... @overload def model_validator( *, mode: Literal['after'], ) -> Callable[ [_AnyModelAfterValidator[_ModelType]], _decorators.PydanticDescriptorProxy[_decorators.ModelValidatorDecoratorInfo] ]: ... def model_validator( *, mode: Literal['wrap', 'before', 'after'], ) -> Any: """Usage docs: https://docs.pydantic.dev/2.10/concepts/validators/#model-validators Decorate model methods for validation purposes. 
Example usage: ```python from typing_extensions import Self from pydantic import BaseModel, ValidationError, model_validator class Square(BaseModel): width: float height: float @model_validator(mode='after') def verify_square(self) -> Self: if self.width != self.height: raise ValueError('width and height do not match') return self s = Square(width=1, height=1) print(repr(s)) #> Square(width=1.0, height=1.0) try: Square(width=1, height=2) except ValidationError as e: print(e) ''' 1 validation error for Square Value error, width and height do not match [type=value_error, input_value={'width': 1, 'height': 2}, input_type=dict] ''' ``` For more in depth examples, see [Model Validators](../concepts/validators.md#model-validators). Args: mode: A required string literal that specifies the validation mode. It can be one of the following: 'wrap', 'before', or 'after'. Returns: A decorator that can be used to decorate a function to be used as a model validator. """ def dec(f: Any) -> _decorators.PydanticDescriptorProxy[Any]: # auto apply the @classmethod decorator f = _decorators.ensure_classmethod_based_on_signature(f) dec_info = _decorators.ModelValidatorDecoratorInfo(mode=mode) return _decorators.PydanticDescriptorProxy(f, dec_info) return dec AnyType = TypeVar('AnyType') if TYPE_CHECKING: # If we add configurable attributes to IsInstance, we'd probably need to stop hiding it from type checkers like this InstanceOf = Annotated[AnyType, ...] # `IsInstance[Sequence]` will be recognized by type checkers as `Sequence` else: @dataclasses.dataclass(**_internal_dataclass.slots_true) class InstanceOf: '''Generic type for annotating a type that is an instance of a given class. Example: ```python from pydantic import BaseModel, InstanceOf class Foo: ... 
class Bar(BaseModel): foo: InstanceOf[Foo] Bar(foo=Foo()) try: Bar(foo=42) except ValidationError as e: print(e) """ [ │ { │ │ 'type': 'is_instance_of', │ │ 'loc': ('foo',), │ │ 'msg': 'Input should be an instance of Foo', │ │ 'input': 42, │ │ 'ctx': {'class': 'Foo'}, │ │ 'url': 'https://errors.pydantic.dev/0.38.0/v/is_instance_of' │ } ] """ ``` ''' @classmethod def __class_getitem__(cls, item: AnyType) -> AnyType: return Annotated[item, cls()] @classmethod def __get_pydantic_core_schema__(cls, source: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: from pydantic import PydanticSchemaGenerationError # use the generic _origin_ as the second argument to isinstance when appropriate instance_of_schema = core_schema.is_instance_schema(_generics.get_origin(source) or source) try: # Try to generate the "standard" schema, which will be used when loading from JSON original_schema = handler(source) except PydanticSchemaGenerationError: # If that fails, just produce a schema that can validate from python return instance_of_schema else: # Use the "original" approach to serialization instance_of_schema['serialization'] = core_schema.wrap_serializer_function_ser_schema( function=lambda v, h: h(v), schema=original_schema ) return core_schema.json_or_python_schema(python_schema=instance_of_schema, json_schema=original_schema) __hash__ = object.__hash__ if TYPE_CHECKING: SkipValidation = Annotated[AnyType, ...] # SkipValidation[list[str]] will be treated by type checkers as list[str] else: @dataclasses.dataclass(**_internal_dataclass.slots_true) class SkipValidation: """If this is applied as an annotation (e.g., via `x: Annotated[int, SkipValidation]`), validation will be skipped. You can also use `SkipValidation[int]` as a shorthand for `Annotated[int, SkipValidation]`. This can be useful if you want to use a type annotation for documentation/IDE/type-checking purposes, and know that it is safe to skip validation for one or more of the fields. 
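A minimal sketch of the effect (illustrative names; the annotated type is recorded for tooling but not enforced at runtime):

```python
from typing import List

from pydantic import BaseModel, SkipValidation

class Model(BaseModel):
    numbers: SkipValidation[List[int]]

# the declared item type is not validated, so mismatched items pass through
m = Model(numbers=['a', 'b'])
print(m.numbers)
#> ['a', 'b']
```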
Because this converts the validation schema to `any_schema`, subsequent annotation-applied transformations may not have the expected effects. Therefore, when used, this annotation should generally be the final annotation applied to a type. """ def __class_getitem__(cls, item: Any) -> Any: return Annotated[item, SkipValidation()] @classmethod def __get_pydantic_core_schema__(cls, source: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: original_schema = handler(source) metadata = {'pydantic_js_annotation_functions': [lambda _c, h: h(original_schema)]} return core_schema.any_schema( metadata=metadata, serialization=core_schema.wrap_serializer_function_ser_schema( function=lambda v, h: h(v), schema=original_schema ), ) __hash__ = object.__hash__
# pydantic-2.10.6/pydantic/generics.py
"""The `generics` module is a backport module from V1.""" from ._migration import getattr_migration __getattr__ = getattr_migration(__name__)
# pydantic-2.10.6/pydantic/json.py
"""The `json` module is a backport module from V1.""" from ._migration import getattr_migration __getattr__ = getattr_migration(__name__)
# pydantic-2.10.6/pydantic/json_schema.py
""" Usage docs: https://docs.pydantic.dev/2.5/concepts/json_schema/ The `json_schema` module contains classes and functions to allow the way [JSON Schema](https://json-schema.org/) is generated to be customized. In general you shouldn't need to use this module directly; instead, you can use [`BaseModel.model_json_schema`][pydantic.BaseModel.model_json_schema] and [`TypeAdapter.json_schema`][pydantic.TypeAdapter.json_schema].
""" from __future__ import annotations as _annotations import dataclasses import inspect import math import os import re import warnings from collections import defaultdict from copy import deepcopy from enum import Enum from typing import ( TYPE_CHECKING, Any, Callable, Counter, Dict, Hashable, Iterable, NewType, Pattern, Sequence, Tuple, TypeVar, Union, cast, overload, ) import pydantic_core from pydantic_core import CoreSchema, PydanticOmit, core_schema, to_jsonable_python from pydantic_core.core_schema import ComputedField from typing_extensions import Annotated, Literal, TypeAlias, assert_never, deprecated, final from pydantic.warnings import PydanticDeprecatedSince26, PydanticDeprecatedSince29 from ._internal import ( _config, _core_metadata, _core_utils, _decorators, _internal_dataclass, _mock_val_ser, _schema_generation_shared, _typing_extra, ) from .annotated_handlers import GetJsonSchemaHandler from .config import JsonDict, JsonValue from .errors import PydanticInvalidForJsonSchema, PydanticSchemaGenerationError, PydanticUserError if TYPE_CHECKING: from . import ConfigDict from ._internal._core_utils import CoreSchemaField, CoreSchemaOrField from ._internal._dataclasses import PydanticDataclass from ._internal._schema_generation_shared import GetJsonSchemaFunction from .main import BaseModel CoreSchemaOrFieldType = Literal[core_schema.CoreSchemaType, core_schema.CoreSchemaFieldType] """ A type alias for defined schema types that represents a union of `core_schema.CoreSchemaType` and `core_schema.CoreSchemaFieldType`. """ JsonSchemaValue = Dict[str, Any] """ A type alias for a JSON schema value. This is a dictionary of string keys to arbitrary JSON values. """ JsonSchemaMode = Literal['validation', 'serialization'] """ A type alias that represents the mode of a JSON schema; either 'validation' or 'serialization'. For some types, the inputs to validation differ from the outputs of serialization. 
For example, computed fields will only be present when serializing, and should not be provided when validating. This flag provides a way to indicate whether you want the JSON schema required for validation inputs, or that will be matched by serialization outputs. """ _MODE_TITLE_MAPPING: dict[JsonSchemaMode, str] = {'validation': 'Input', 'serialization': 'Output'} JsonSchemaWarningKind = Literal['skipped-choice', 'non-serializable-default', 'skipped-discriminator'] """ A type alias representing the kinds of warnings that can be emitted during JSON schema generation. See [`GenerateJsonSchema.render_warning_message`][pydantic.json_schema.GenerateJsonSchema.render_warning_message] for more details. """ class PydanticJsonSchemaWarning(UserWarning): """This class is used to emit warnings produced during JSON schema generation. See the [`GenerateJsonSchema.emit_warning`][pydantic.json_schema.GenerateJsonSchema.emit_warning] and [`GenerateJsonSchema.render_warning_message`][pydantic.json_schema.GenerateJsonSchema.render_warning_message] methods for more details; these can be overridden to control warning behavior. """ # ##### JSON Schema Generation ##### DEFAULT_REF_TEMPLATE = '#/$defs/{model}' """The default format string used to generate reference names.""" # There are three types of references relevant to building JSON schemas: # 1. core_schema "ref" values; these are not exposed as part of the JSON schema # * these might look like the fully qualified path of a model, its id, or something similar CoreRef = NewType('CoreRef', str) # 2. keys of the "definitions" object that will eventually go into the JSON schema # * by default, these look like "MyModel", though may change in the presence of collisions # * eventually, we may want to make it easier to modify the way these names are generated DefsRef = NewType('DefsRef', str) # 3. 
the values corresponding to the "$ref" key in the schema # * By default, these look like "#/$defs/MyModel", as in {"$ref": "#/$defs/MyModel"} JsonRef = NewType('JsonRef', str) CoreModeRef = Tuple[CoreRef, JsonSchemaMode] JsonSchemaKeyT = TypeVar('JsonSchemaKeyT', bound=Hashable) @dataclasses.dataclass(**_internal_dataclass.slots_true) class _DefinitionsRemapping: defs_remapping: dict[DefsRef, DefsRef] json_remapping: dict[JsonRef, JsonRef] @staticmethod def from_prioritized_choices( prioritized_choices: dict[DefsRef, list[DefsRef]], defs_to_json: dict[DefsRef, JsonRef], definitions: dict[DefsRef, JsonSchemaValue], ) -> _DefinitionsRemapping: """ This function should produce a remapping that replaces complex DefsRef with the simpler ones from the prioritized_choices such that applying the name remapping would result in an equivalent JSON schema. """ # We need to iteratively simplify the definitions until we reach a fixed point. # The reason for this is that outer definitions may reference inner definitions that get simplified # into an equivalent reference, and the outer definitions won't be equivalent until we've simplified # the inner definitions. copied_definitions = deepcopy(definitions) definitions_schema = {'$defs': copied_definitions} for _iter in range(100): # prevent an infinite loop in the case of a bug, 100 iterations should be enough # For every possible remapped DefsRef, collect all schemas that that DefsRef might be used for: schemas_for_alternatives: dict[DefsRef, list[JsonSchemaValue]] = defaultdict(list) for defs_ref in copied_definitions: alternatives = prioritized_choices[defs_ref] for alternative in alternatives: schemas_for_alternatives[alternative].append(copied_definitions[defs_ref]) # Deduplicate the schemas for each alternative; the idea is that we only want to remap to a new DefsRef # if it introduces no ambiguity, i.e., there is only one distinct schema for that DefsRef. 
for defs_ref in schemas_for_alternatives: schemas_for_alternatives[defs_ref] = _deduplicate_schemas(schemas_for_alternatives[defs_ref]) # Build the remapping defs_remapping: dict[DefsRef, DefsRef] = {} json_remapping: dict[JsonRef, JsonRef] = {} for original_defs_ref in definitions: alternatives = prioritized_choices[original_defs_ref] # Pick the first alternative that has only one schema, since that means there is no collision remapped_defs_ref = next(x for x in alternatives if len(schemas_for_alternatives[x]) == 1) defs_remapping[original_defs_ref] = remapped_defs_ref json_remapping[defs_to_json[original_defs_ref]] = defs_to_json[remapped_defs_ref] remapping = _DefinitionsRemapping(defs_remapping, json_remapping) new_definitions_schema = remapping.remap_json_schema({'$defs': copied_definitions}) if definitions_schema == new_definitions_schema: # We've reached the fixed point return remapping definitions_schema = new_definitions_schema raise PydanticInvalidForJsonSchema('Failed to simplify the JSON schema definitions') def remap_defs_ref(self, ref: DefsRef) -> DefsRef: return self.defs_remapping.get(ref, ref) def remap_json_ref(self, ref: JsonRef) -> JsonRef: return self.json_remapping.get(ref, ref) def remap_json_schema(self, schema: Any) -> Any: """ Recursively update the JSON schema replacing all $refs """ if isinstance(schema, str): # Note: this may not really be a JsonRef; we rely on having no collisions between JsonRefs and other strings return self.remap_json_ref(JsonRef(schema)) elif isinstance(schema, list): return [self.remap_json_schema(item) for item in schema] elif isinstance(schema, dict): for key, value in schema.items(): if key == '$ref' and isinstance(value, str): schema['$ref'] = self.remap_json_ref(JsonRef(value)) elif key == '$defs': schema['$defs'] = { self.remap_defs_ref(DefsRef(key)): self.remap_json_schema(value) for key, value in schema['$defs'].items() } else: schema[key] = self.remap_json_schema(value) return schema class 
GenerateJsonSchema: """Usage docs: https://docs.pydantic.dev/2.10/concepts/json_schema/#customizing-the-json-schema-generation-process A class for generating JSON schemas. This class generates JSON schemas based on configured parameters. The default schema dialect is [https://json-schema.org/draft/2020-12/schema](https://json-schema.org/draft/2020-12/schema). The class uses `by_alias` to configure how fields with multiple names are handled and `ref_template` to format reference names. Attributes: schema_dialect: The JSON schema dialect used to generate the schema. See [Declaring a Dialect](https://json-schema.org/understanding-json-schema/reference/schema.html#id4) in the JSON Schema documentation for more information about dialects. ignored_warning_kinds: Warnings to ignore when generating the schema. `self.render_warning_message` will do nothing if its argument `kind` is in `ignored_warning_kinds`; this value can be modified on subclasses to easily control which warnings are emitted. by_alias: Whether to use field aliases when generating the schema. ref_template: The format string used when generating reference names. core_to_json_refs: A mapping of core refs to JSON refs. core_to_defs_refs: A mapping of core refs to definition refs. defs_to_core_refs: A mapping of definition refs to core refs. json_to_defs_refs: A mapping of JSON refs to definition refs. definitions: Definitions in the schema. Args: by_alias: Whether to use field aliases in the generated schemas. ref_template: The format string to use when generating reference names. Raises: JsonSchemaError: If the instance of the class is inadvertently reused after generating a schema. 
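A small sketch of how this class can be subclassed and passed to a model (illustrative names; this mirrors the documented pattern of overriding `generate` to set the `$schema` key):

```python
from pydantic import BaseModel
from pydantic.json_schema import GenerateJsonSchema

class MyGenerateJsonSchema(GenerateJsonSchema):
    def generate(self, schema, mode='validation'):
        json_schema = super().generate(schema, mode=mode)
        # advertise the dialect this generator targets
        json_schema['$schema'] = self.schema_dialect
        return json_schema

class Model(BaseModel):
    a: int

print(Model.model_json_schema(schema_generator=MyGenerateJsonSchema)['$schema'])
#> https://json-schema.org/draft/2020-12/schema
```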
""" schema_dialect = 'https://json-schema.org/draft/2020-12/schema' # `self.render_warning_message` will do nothing if its argument `kind` is in `ignored_warning_kinds`; # this value can be modified on subclasses to easily control which warnings are emitted ignored_warning_kinds: set[JsonSchemaWarningKind] = {'skipped-choice'} def __init__(self, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE): self.by_alias = by_alias self.ref_template = ref_template self.core_to_json_refs: dict[CoreModeRef, JsonRef] = {} self.core_to_defs_refs: dict[CoreModeRef, DefsRef] = {} self.defs_to_core_refs: dict[DefsRef, CoreModeRef] = {} self.json_to_defs_refs: dict[JsonRef, DefsRef] = {} self.definitions: dict[DefsRef, JsonSchemaValue] = {} self._config_wrapper_stack = _config.ConfigWrapperStack(_config.ConfigWrapper({})) self._mode: JsonSchemaMode = 'validation' # The following includes a mapping of a fully-unique defs ref choice to a list of preferred # alternatives, which are generally simpler, such as only including the class name. # At the end of schema generation, we use these to produce a JSON schema with more human-readable # definitions, which would also work better in a generated OpenAPI client, etc. self._prioritized_defsref_choices: dict[DefsRef, list[DefsRef]] = {} self._collision_counter: dict[str, int] = defaultdict(int) self._collision_index: dict[str, int] = {} self._schema_type_to_method = self.build_schema_type_to_method() # When we encounter definitions we need to try to build them immediately # so that they are available schemas that reference them # But it's possible that CoreSchema was never going to be used # (e.g. 
because the CoreSchema that references short circuits is JSON schema generation without needing # the reference) so instead of failing altogether if we can't build a definition we # store the error raised and re-throw it if we end up needing that def self._core_defs_invalid_for_json_schema: dict[DefsRef, PydanticInvalidForJsonSchema] = {} # This changes to True after generating a schema, to prevent issues caused by accidental reuse # of a single instance of a schema generator self._used = False @property def _config(self) -> _config.ConfigWrapper: return self._config_wrapper_stack.tail @property def mode(self) -> JsonSchemaMode: if self._config.json_schema_mode_override is not None: return self._config.json_schema_mode_override else: return self._mode def build_schema_type_to_method( self, ) -> dict[CoreSchemaOrFieldType, Callable[[CoreSchemaOrField], JsonSchemaValue]]: """Builds a dictionary mapping fields to methods for generating JSON schemas. Returns: A dictionary containing the mapping of `CoreSchemaOrFieldType` to a handler method. Raises: TypeError: If no method has been defined for generating a JSON schema for a given pydantic core schema type. 
""" mapping: dict[CoreSchemaOrFieldType, Callable[[CoreSchemaOrField], JsonSchemaValue]] = {} core_schema_types: list[CoreSchemaOrFieldType] = _typing_extra.literal_values( CoreSchemaOrFieldType # type: ignore ) for key in core_schema_types: method_name = f"{key.replace('-', '_')}_schema" try: mapping[key] = getattr(self, method_name) except AttributeError as e: # pragma: no cover if os.getenv('PYDANTIC_PRIVATE_ALLOW_UNHANDLED_SCHEMA_TYPES'): continue raise TypeError( f'No method for generating JsonSchema for core_schema.type={key!r} ' f'(expected: {type(self).__name__}.{method_name})' ) from e return mapping def generate_definitions( self, inputs: Sequence[tuple[JsonSchemaKeyT, JsonSchemaMode, core_schema.CoreSchema]] ) -> tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], dict[DefsRef, JsonSchemaValue]]: """Generates JSON schema definitions from a list of core schemas, pairing the generated definitions with a mapping that links the input keys to the definition references. Args: inputs: A sequence of tuples, where: - The first element is a JSON schema key type. - The second element is the JSON mode: either 'validation' or 'serialization'. - The third element is a core schema. Returns: A tuple where: - The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.) - The second element is a dictionary whose keys are definition references for the JSON schemas from the first returned element, and whose values are the actual JSON schema definitions. Raises: PydanticUserError: Raised if the JSON schema generator has already been used to generate a JSON schema.
                f'You must create a new instance of {type(self).__name__} to generate a new JSON schema.',
                code='json-schema-already-used',
            )

        for _, mode, schema in inputs:
            self._mode = mode
            self.generate_inner(schema)

        definitions_remapping = self._build_definitions_remapping()

        json_schemas_map: dict[tuple[JsonSchemaKeyT, JsonSchemaMode], DefsRef] = {}
        for key, mode, schema in inputs:
            self._mode = mode
            json_schema = self.generate_inner(schema)
            json_schemas_map[(key, mode)] = definitions_remapping.remap_json_schema(json_schema)

        json_schema = {'$defs': self.definitions}
        json_schema = definitions_remapping.remap_json_schema(json_schema)
        self._used = True
        return json_schemas_map, self.sort(json_schema['$defs'])  # type: ignore

    def generate(self, schema: CoreSchema, mode: JsonSchemaMode = 'validation') -> JsonSchemaValue:
        """Generates a JSON schema for a specified schema in a specified mode.

        Args:
            schema: A Pydantic model.
            mode: The mode in which to generate the schema. Defaults to 'validation'.

        Returns:
            A JSON schema representing the specified schema.

        Raises:
            PydanticUserError: If the JSON schema generator has already been used to generate a JSON schema.
        """
        self._mode = mode
        if self._used:
            raise PydanticUserError(
                'This JSON schema generator has already been used to generate a JSON schema. '
                f'You must create a new instance of {type(self).__name__} to generate a new JSON schema.',
                code='json-schema-already-used',
            )

        json_schema: JsonSchemaValue = self.generate_inner(schema)
        json_ref_counts = self.get_json_ref_counts(json_schema)

        ref = cast(JsonRef, json_schema.get('$ref'))
        while ref is not None:  # may need to unpack multiple levels
            ref_json_schema = self.get_schema_from_definitions(ref)
            if json_ref_counts[ref] == 1 and ref_json_schema is not None and len(json_schema) == 1:
                # "Unpack" the ref since this is the only reference and there are no sibling keys
                json_schema = ref_json_schema.copy()  # copy to prevent recursive dict reference
                json_ref_counts[ref] -= 1
                ref = cast(JsonRef, json_schema.get('$ref'))
            ref = None

        self._garbage_collect_definitions(json_schema)
        definitions_remapping = self._build_definitions_remapping()

        if self.definitions:
            json_schema['$defs'] = self.definitions

        json_schema = definitions_remapping.remap_json_schema(json_schema)

        # For now, we will not set the $schema key. However, if desired, this can be easily added by overriding
        # this method and adding the following line after a call to super().generate(schema):
        # json_schema['$schema'] = self.schema_dialect

        self._used = True
        return self.sort(json_schema)

    def generate_inner(self, schema: CoreSchemaOrField) -> JsonSchemaValue:  # noqa: C901
        """Generates a JSON schema for a given core schema.

        Args:
            schema: The given core schema.

        Returns:
            The generated JSON schema.

        TODO: the nested function definitions here seem like bad practice, I'd like to unpack these
        in a future PR. It'd be great if we could shorten the call stack a bit for JSON schema generation,
        and I think there's potential for that here.
""" # If a schema with the same CoreRef has been handled, just return a reference to it # Note that this assumes that it will _never_ be the case that the same CoreRef is used # on types that should have different JSON schemas if 'ref' in schema: core_ref = CoreRef(schema['ref']) # type: ignore[typeddict-item] core_mode_ref = (core_ref, self.mode) if core_mode_ref in self.core_to_defs_refs and self.core_to_defs_refs[core_mode_ref] in self.definitions: return {'$ref': self.core_to_json_refs[core_mode_ref]} def populate_defs(core_schema: CoreSchema, json_schema: JsonSchemaValue) -> JsonSchemaValue: if 'ref' in core_schema: core_ref = CoreRef(core_schema['ref']) # type: ignore[typeddict-item] defs_ref, ref_json_schema = self.get_cache_defs_ref_schema(core_ref) json_ref = JsonRef(ref_json_schema['$ref']) # Replace the schema if it's not a reference to itself # What we want to avoid is having the def be just a ref to itself # which is what would happen if we blindly assigned any if json_schema.get('$ref', None) != json_ref: self.definitions[defs_ref] = json_schema self._core_defs_invalid_for_json_schema.pop(defs_ref, None) json_schema = ref_json_schema return json_schema def handler_func(schema_or_field: CoreSchemaOrField) -> JsonSchemaValue: """Generate a JSON schema based on the input schema. Args: schema_or_field: The core schema to generate a JSON schema from. Returns: The generated JSON schema. Raises: TypeError: If an unexpected schema type is encountered. """ # Generate the core-schema-type-specific bits of the schema generation: json_schema: JsonSchemaValue | None = None if self.mode == 'serialization' and 'serialization' in schema_or_field: # In this case, we skip the JSON Schema generation of the schema # and use the `'serialization'` schema instead (canonical example: # `Annotated[int, PlainSerializer(str)]`). 
                ser_schema = schema_or_field['serialization']  # type: ignore
                json_schema = self.ser_schema(ser_schema)

                # It might be that the 'serialization' schema is skipped depending on `when_used`.
                # This is only relevant for `nullable` schemas though, so we special case here.
                if (
                    json_schema is not None
                    and ser_schema.get('when_used') in ('unless-none', 'json-unless-none')
                    and schema_or_field['type'] == 'nullable'
                ):
                    json_schema = self.get_flattened_anyof([{'type': 'null'}, json_schema])
            if json_schema is None:
                if _core_utils.is_core_schema(schema_or_field) or _core_utils.is_core_schema_field(schema_or_field):
                    generate_for_schema_type = self._schema_type_to_method[schema_or_field['type']]
                    json_schema = generate_for_schema_type(schema_or_field)
                else:
                    raise TypeError(f'Unexpected schema type: schema={schema_or_field}')
            if _core_utils.is_core_schema(schema_or_field):
                json_schema = populate_defs(schema_or_field, json_schema)
            return json_schema

        current_handler = _schema_generation_shared.GenerateJsonSchemaHandler(self, handler_func)

        metadata = cast(_core_metadata.CoreMetadata, schema.get('metadata', {}))

        # TODO: I dislike that we have to wrap these basic dict updates in callables, is there any way around this?
        if js_updates := metadata.get('pydantic_js_updates'):

            def js_updates_handler_func(
                schema_or_field: CoreSchemaOrField,
                current_handler: GetJsonSchemaHandler = current_handler,
            ) -> JsonSchemaValue:
                json_schema = {**current_handler(schema_or_field), **js_updates}
                return json_schema

            current_handler = _schema_generation_shared.GenerateJsonSchemaHandler(self, js_updates_handler_func)

        if js_extra := metadata.get('pydantic_js_extra'):

            def js_extra_handler_func(
                schema_or_field: CoreSchemaOrField,
                current_handler: GetJsonSchemaHandler = current_handler,
            ) -> JsonSchemaValue:
                json_schema = current_handler(schema_or_field)
                if isinstance(js_extra, dict):
                    json_schema.update(to_jsonable_python(js_extra))
                elif callable(js_extra):
                    # similar to typing issue in _update_class_schema when we're working with callable js extra
                    js_extra(json_schema)  # type: ignore
                return json_schema

            current_handler = _schema_generation_shared.GenerateJsonSchemaHandler(self, js_extra_handler_func)

        for js_modify_function in metadata.get('pydantic_js_functions', ()):

            def new_handler_func(
                schema_or_field: CoreSchemaOrField,
                current_handler: GetJsonSchemaHandler = current_handler,
                js_modify_function: GetJsonSchemaFunction = js_modify_function,
            ) -> JsonSchemaValue:
                json_schema = js_modify_function(schema_or_field, current_handler)
                if _core_utils.is_core_schema(schema_or_field):
                    json_schema = populate_defs(schema_or_field, json_schema)
                original_schema = current_handler.resolve_ref_schema(json_schema)
                ref = json_schema.pop('$ref', None)
                if ref and json_schema:
                    original_schema.update(json_schema)
                return original_schema

            current_handler = _schema_generation_shared.GenerateJsonSchemaHandler(self, new_handler_func)

        for js_modify_function in metadata.get('pydantic_js_annotation_functions', ()):

            def new_handler_func(
                schema_or_field: CoreSchemaOrField,
                current_handler: GetJsonSchemaHandler = current_handler,
                js_modify_function: GetJsonSchemaFunction = js_modify_function,
            ) -> JsonSchemaValue:
                json_schema = js_modify_function(schema_or_field, current_handler)
                if _core_utils.is_core_schema(schema_or_field):
                    json_schema = populate_defs(schema_or_field, json_schema)
                return json_schema

            current_handler = _schema_generation_shared.GenerateJsonSchemaHandler(self, new_handler_func)

        json_schema = current_handler(schema)
        if _core_utils.is_core_schema(schema):
            json_schema = populate_defs(schema, json_schema)
        return json_schema

    def sort(self, value: JsonSchemaValue, parent_key: str | None = None) -> JsonSchemaValue:
        """Override this method to customize the sorting of the JSON schema (e.g., don't sort at all,
        sort all keys unconditionally, etc.)

        By default, alphabetically sort the keys in the JSON schema, skipping the 'properties' and 'default' keys
        to preserve field definition order. This sort is recursive, so it will sort all nested dictionaries as well.
        """
        sorted_dict: dict[str, JsonSchemaValue] = {}
        keys = value.keys()
        if parent_key not in ('properties', 'default'):
            keys = sorted(keys)
        for key in keys:
            sorted_dict[key] = self._sort_recursive(value[key], parent_key=key)
        return sorted_dict

    def _sort_recursive(self, value: Any, parent_key: str | None = None) -> Any:
        """Recursively sort a JSON schema value."""
        if isinstance(value, dict):
            sorted_dict: dict[str, JsonSchemaValue] = {}
            keys = value.keys()
            if parent_key not in ('properties', 'default'):
                keys = sorted(keys)
            for key in keys:
                sorted_dict[key] = self._sort_recursive(value[key], parent_key=key)
            return sorted_dict
        elif isinstance(value, list):
            sorted_list: list[JsonSchemaValue] = []
            for item in value:
                sorted_list.append(self._sort_recursive(item, parent_key))
            return sorted_list
        else:
            return value

    # ### Schema generation methods

    def invalid_schema(self, schema: core_schema.InvalidSchema) -> JsonSchemaValue:
        """Placeholder - should never be called."""
        raise RuntimeError('Cannot generate schema for invalid_schema. This is a bug! Please report it.')
    def any_schema(self, schema: core_schema.AnySchema) -> JsonSchemaValue:
        """Generates a JSON schema that matches any value.

        Args:
            schema: The core schema.

        Returns:
            The generated JSON schema.
        """
        return {}

    def none_schema(self, schema: core_schema.NoneSchema) -> JsonSchemaValue:
        """Generates a JSON schema that matches `None`.

        Args:
            schema: The core schema.

        Returns:
            The generated JSON schema.
        """
        return {'type': 'null'}

    def bool_schema(self, schema: core_schema.BoolSchema) -> JsonSchemaValue:
        """Generates a JSON schema that matches a bool value.

        Args:
            schema: The core schema.

        Returns:
            The generated JSON schema.
        """
        return {'type': 'boolean'}

    def int_schema(self, schema: core_schema.IntSchema) -> JsonSchemaValue:
        """Generates a JSON schema that matches an int value.

        Args:
            schema: The core schema.

        Returns:
            The generated JSON schema.
        """
        json_schema: dict[str, Any] = {'type': 'integer'}
        self.update_with_validations(json_schema, schema, self.ValidationsMapping.numeric)
        json_schema = {k: v for k, v in json_schema.items() if v not in {math.inf, -math.inf}}
        return json_schema

    def float_schema(self, schema: core_schema.FloatSchema) -> JsonSchemaValue:
        """Generates a JSON schema that matches a float value.

        Args:
            schema: The core schema.

        Returns:
            The generated JSON schema.
        """
        json_schema: dict[str, Any] = {'type': 'number'}
        self.update_with_validations(json_schema, schema, self.ValidationsMapping.numeric)
        json_schema = {k: v for k, v in json_schema.items() if v not in {math.inf, -math.inf}}
        return json_schema

    def decimal_schema(self, schema: core_schema.DecimalSchema) -> JsonSchemaValue:
        """Generates a JSON schema that matches a decimal value.

        Args:
            schema: The core schema.

        Returns:
            The generated JSON schema.
""" json_schema = self.str_schema(core_schema.str_schema()) if self.mode == 'validation': multiple_of = schema.get('multiple_of') le = schema.get('le') ge = schema.get('ge') lt = schema.get('lt') gt = schema.get('gt') json_schema = { 'anyOf': [ self.float_schema( core_schema.float_schema( allow_inf_nan=schema.get('allow_inf_nan'), multiple_of=None if multiple_of is None else float(multiple_of), le=None if le is None else float(le), ge=None if ge is None else float(ge), lt=None if lt is None else float(lt), gt=None if gt is None else float(gt), ) ), json_schema, ], } return json_schema def str_schema(self, schema: core_schema.StringSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a string value. Args: schema: The core schema. Returns: The generated JSON schema. """ json_schema = {'type': 'string'} self.update_with_validations(json_schema, schema, self.ValidationsMapping.string) if isinstance(json_schema.get('pattern'), Pattern): # TODO: should we add regex flags to the pattern? json_schema['pattern'] = json_schema.get('pattern').pattern # type: ignore return json_schema def bytes_schema(self, schema: core_schema.BytesSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a bytes value. Args: schema: The core schema. Returns: The generated JSON schema. """ json_schema = {'type': 'string', 'format': 'base64url' if self._config.ser_json_bytes == 'base64' else 'binary'} self.update_with_validations(json_schema, schema, self.ValidationsMapping.bytes) return json_schema def date_schema(self, schema: core_schema.DateSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a date value. Args: schema: The core schema. Returns: The generated JSON schema. """ return {'type': 'string', 'format': 'date'} def time_schema(self, schema: core_schema.TimeSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a time value. Args: schema: The core schema. Returns: The generated JSON schema. 
""" return {'type': 'string', 'format': 'time'} def datetime_schema(self, schema: core_schema.DatetimeSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a datetime value. Args: schema: The core schema. Returns: The generated JSON schema. """ return {'type': 'string', 'format': 'date-time'} def timedelta_schema(self, schema: core_schema.TimedeltaSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a timedelta value. Args: schema: The core schema. Returns: The generated JSON schema. """ if self._config.ser_json_timedelta == 'float': return {'type': 'number'} return {'type': 'string', 'format': 'duration'} def literal_schema(self, schema: core_schema.LiteralSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a literal value. Args: schema: The core schema. Returns: The generated JSON schema. """ expected = [v.value if isinstance(v, Enum) else v for v in schema['expected']] # jsonify the expected values expected = [to_jsonable_python(v) for v in expected] result: dict[str, Any] = {} if len(expected) == 1: result['const'] = expected[0] else: result['enum'] = expected types = {type(e) for e in expected} if types == {str}: result['type'] = 'string' elif types == {int}: result['type'] = 'integer' elif types == {float}: result['type'] = 'number' elif types == {bool}: result['type'] = 'boolean' elif types == {list}: result['type'] = 'array' elif types == {type(None)}: result['type'] = 'null' return result def enum_schema(self, schema: core_schema.EnumSchema) -> JsonSchemaValue: """Generates a JSON schema that matches an Enum value. Args: schema: The core schema. Returns: The generated JSON schema. """ enum_type = schema['cls'] description = None if not enum_type.__doc__ else inspect.cleandoc(enum_type.__doc__) if ( description == 'An enumeration.' 
        ):  # This is the default value provided by enum.EnumMeta.__new__; don't use it
            description = None
        result: dict[str, Any] = {'title': enum_type.__name__, 'description': description}
        result = {k: v for k, v in result.items() if v is not None}

        expected = [to_jsonable_python(v.value) for v in schema['members']]

        result['enum'] = expected

        types = {type(e) for e in expected}
        # Use issubclass, not isinstance: enum_type is a class, so an isinstance check
        # against str/int/float would always be False and miss str/int/float-mixin enums.
        if issubclass(enum_type, str) or types == {str}:
            result['type'] = 'string'
        elif issubclass(enum_type, int) or types == {int}:
            result['type'] = 'integer'
        elif issubclass(enum_type, float) or types == {float}:
            result['type'] = 'number'
        elif types == {bool}:
            result['type'] = 'boolean'
        elif types == {list}:
            result['type'] = 'array'

        return result

    def is_instance_schema(self, schema: core_schema.IsInstanceSchema) -> JsonSchemaValue:
        """Handles JSON schema generation for a core schema that checks if a value is an instance of a class.

        Unless overridden in a subclass, this raises an error.

        Args:
            schema: The core schema.

        Returns:
            The generated JSON schema.
        """
        return self.handle_invalid_for_json_schema(schema, f'core_schema.IsInstanceSchema ({schema["cls"]})')

    def is_subclass_schema(self, schema: core_schema.IsSubclassSchema) -> JsonSchemaValue:
        """Handles JSON schema generation for a core schema that checks if a value is a subclass of a class.

        For backwards compatibility with v1, this does not raise an error, but can be overridden to change this.

        Args:
            schema: The core schema.

        Returns:
            The generated JSON schema.
        """
        # Note: This is for compatibility with V1; you can override if you want different behavior.
        return {}

    def callable_schema(self, schema: core_schema.CallableSchema) -> JsonSchemaValue:
        """Generates a JSON schema that matches a callable value.

        Unless overridden in a subclass, this raises an error.

        Args:
            schema: The core schema.

        Returns:
            The generated JSON schema.
""" return self.handle_invalid_for_json_schema(schema, 'core_schema.CallableSchema') def list_schema(self, schema: core_schema.ListSchema) -> JsonSchemaValue: """Returns a schema that matches a list schema. Args: schema: The core schema. Returns: The generated JSON schema. """ items_schema = {} if 'items_schema' not in schema else self.generate_inner(schema['items_schema']) json_schema = {'type': 'array', 'items': items_schema} self.update_with_validations(json_schema, schema, self.ValidationsMapping.array) return json_schema @deprecated('`tuple_positional_schema` is deprecated. Use `tuple_schema` instead.', category=None) @final def tuple_positional_schema(self, schema: core_schema.TupleSchema) -> JsonSchemaValue: """Replaced by `tuple_schema`.""" warnings.warn( '`tuple_positional_schema` is deprecated. Use `tuple_schema` instead.', PydanticDeprecatedSince26, stacklevel=2, ) return self.tuple_schema(schema) @deprecated('`tuple_variable_schema` is deprecated. Use `tuple_schema` instead.', category=None) @final def tuple_variable_schema(self, schema: core_schema.TupleSchema) -> JsonSchemaValue: """Replaced by `tuple_schema`.""" warnings.warn( '`tuple_variable_schema` is deprecated. Use `tuple_schema` instead.', PydanticDeprecatedSince26, stacklevel=2, ) return self.tuple_schema(schema) def tuple_schema(self, schema: core_schema.TupleSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a tuple schema e.g. `Tuple[int, str, bool]` or `Tuple[int, ...]`. Args: schema: The core schema. Returns: The generated JSON schema. 
""" json_schema: JsonSchemaValue = {'type': 'array'} if 'variadic_item_index' in schema: variadic_item_index = schema['variadic_item_index'] if variadic_item_index > 0: json_schema['minItems'] = variadic_item_index json_schema['prefixItems'] = [ self.generate_inner(item) for item in schema['items_schema'][:variadic_item_index] ] if variadic_item_index + 1 == len(schema['items_schema']): # if the variadic item is the last item, then represent it faithfully json_schema['items'] = self.generate_inner(schema['items_schema'][variadic_item_index]) else: # otherwise, 'items' represents the schema for the variadic # item plus the suffix, so just allow anything for simplicity # for now json_schema['items'] = True else: prefixItems = [self.generate_inner(item) for item in schema['items_schema']] if prefixItems: json_schema['prefixItems'] = prefixItems json_schema['minItems'] = len(prefixItems) json_schema['maxItems'] = len(prefixItems) self.update_with_validations(json_schema, schema, self.ValidationsMapping.array) return json_schema def set_schema(self, schema: core_schema.SetSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a set schema. Args: schema: The core schema. Returns: The generated JSON schema. """ return self._common_set_schema(schema) def frozenset_schema(self, schema: core_schema.FrozenSetSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a frozenset schema. Args: schema: The core schema. Returns: The generated JSON schema. 
""" return self._common_set_schema(schema) def _common_set_schema(self, schema: core_schema.SetSchema | core_schema.FrozenSetSchema) -> JsonSchemaValue: items_schema = {} if 'items_schema' not in schema else self.generate_inner(schema['items_schema']) json_schema = {'type': 'array', 'uniqueItems': True, 'items': items_schema} self.update_with_validations(json_schema, schema, self.ValidationsMapping.array) return json_schema def generator_schema(self, schema: core_schema.GeneratorSchema) -> JsonSchemaValue: """Returns a JSON schema that represents the provided GeneratorSchema. Args: schema: The schema. Returns: The generated JSON schema. """ items_schema = {} if 'items_schema' not in schema else self.generate_inner(schema['items_schema']) json_schema = {'type': 'array', 'items': items_schema} self.update_with_validations(json_schema, schema, self.ValidationsMapping.array) return json_schema def dict_schema(self, schema: core_schema.DictSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a dict schema. Args: schema: The core schema. Returns: The generated JSON schema. """ json_schema: JsonSchemaValue = {'type': 'object'} keys_schema = self.generate_inner(schema['keys_schema']).copy() if 'keys_schema' in schema else {} if '$ref' not in keys_schema: keys_pattern = keys_schema.pop('pattern', None) # Don't give a title to patternProperties/propertyNames: keys_schema.pop('title', None) else: # Here, we assume that if the keys schema is a definition reference, # it can't be a simple string core schema (and thus no pattern can exist). # However, this is only in practice (in theory, a definition reference core # schema could be generated for a simple string schema). # Note that we avoid calling `self.resolve_ref_schema`, as it might not exist yet. 
            keys_pattern = None

        values_schema = self.generate_inner(schema['values_schema']).copy() if 'values_schema' in schema else {}
        # don't give a title to additionalProperties:
        values_schema.pop('title', None)

        if values_schema or keys_pattern is not None:  # don't add additionalProperties if it's empty
            if keys_pattern is None:
                json_schema['additionalProperties'] = values_schema
            else:
                json_schema['patternProperties'] = {keys_pattern: values_schema}

        if (
            # The len check indicates that constraints are probably present:
            (keys_schema.get('type') == 'string' and len(keys_schema) > 1)
            # If this is a definition reference schema, it most likely has constraints:
            or '$ref' in keys_schema
        ):
            keys_schema.pop('type', None)
            json_schema['propertyNames'] = keys_schema

        self.update_with_validations(json_schema, schema, self.ValidationsMapping.object)
        return json_schema

    def function_before_schema(self, schema: core_schema.BeforeValidatorFunctionSchema) -> JsonSchemaValue:
        """Generates a JSON schema that matches a function-before schema.

        Args:
            schema: The core schema.

        Returns:
            The generated JSON schema.
        """
        if self._mode == 'validation' and (input_schema := schema.get('json_schema_input_schema')):
            return self.generate_inner(input_schema)

        return self.generate_inner(schema['schema'])

    def function_after_schema(self, schema: core_schema.AfterValidatorFunctionSchema) -> JsonSchemaValue:
        """Generates a JSON schema that matches a function-after schema.

        Args:
            schema: The core schema.

        Returns:
            The generated JSON schema.
        """
        return self.generate_inner(schema['schema'])

    def function_plain_schema(self, schema: core_schema.PlainValidatorFunctionSchema) -> JsonSchemaValue:
        """Generates a JSON schema that matches a function-plain schema.

        Args:
            schema: The core schema.

        Returns:
            The generated JSON schema.
""" if self._mode == 'validation' and (input_schema := schema.get('json_schema_input_schema')): return self.generate_inner(input_schema) return self.handle_invalid_for_json_schema( schema, f'core_schema.PlainValidatorFunctionSchema ({schema["function"]})' ) def function_wrap_schema(self, schema: core_schema.WrapValidatorFunctionSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a function-wrap schema. Args: schema: The core schema. Returns: The generated JSON schema. """ if self._mode == 'validation' and (input_schema := schema.get('json_schema_input_schema')): return self.generate_inner(input_schema) return self.generate_inner(schema['schema']) def default_schema(self, schema: core_schema.WithDefaultSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema with a default value. Args: schema: The core schema. Returns: The generated JSON schema. """ json_schema = self.generate_inner(schema['schema']) if 'default' not in schema: return json_schema default = schema['default'] # Note: if you want to include the value returned by the default_factory, # override this method and replace the code above with: # if 'default' in schema: # default = schema['default'] # elif 'default_factory' in schema: # default = schema['default_factory']() # else: # return json_schema # we reflect the application of custom plain, no-info serializers to defaults for # JSON Schemas viewed in serialization mode: # TODO: improvements along with https://github.com/pydantic/pydantic/issues/8208 if ( self.mode == 'serialization' and (ser_schema := schema['schema'].get('serialization')) and (ser_func := ser_schema.get('function')) and ser_schema.get('type') == 'function-plain' and not ser_schema.get('info_arg') and not (default is None and ser_schema.get('when_used') in ('unless-none', 'json-unless-none')) ): try: default = ser_func(default) # type: ignore except Exception: # It might be that the provided default needs to be validated (read: parsed) first # 
                # (assuming `validate_default` is enabled). However, we can't perform
                # such validation during JSON Schema generation so we don't support
                # this pattern for now.
                # (One example is when using `foo: ByteSize = '1MB'`, which validates and
                # serializes as an int. In this case, `ser_func` is `int` and `int('1MB')` fails).
                self.emit_warning(
                    'non-serializable-default',
                    f'Unable to serialize value {default!r} with the plain serializer; excluding default from JSON schema',
                )
                return json_schema

        try:
            encoded_default = self.encode_default(default)
        except pydantic_core.PydanticSerializationError:
            self.emit_warning(
                'non-serializable-default',
                f'Default value {default} is not JSON serializable; excluding default from JSON schema',
            )
            # Return the inner schema, as though there was no default
            return json_schema

        json_schema['default'] = encoded_default
        return json_schema

    def nullable_schema(self, schema: core_schema.NullableSchema) -> JsonSchemaValue:
        """Generates a JSON schema that matches a schema that allows null values.

        Args:
            schema: The core schema.

        Returns:
            The generated JSON schema.
        """
        null_schema = {'type': 'null'}
        inner_json_schema = self.generate_inner(schema['schema'])

        if inner_json_schema == null_schema:
            return null_schema
        else:
            # Thanks to the equality check against `null_schema` above, I think 'oneOf' would also be valid here;
            # I'll use 'anyOf' for now, but it could be changed if it would work better with some external tooling
            return self.get_flattened_anyof([inner_json_schema, null_schema])

    def union_schema(self, schema: core_schema.UnionSchema) -> JsonSchemaValue:
        """Generates a JSON schema that matches a schema that allows values matching any of the given schemas.

        Args:
            schema: The core schema.

        Returns:
            The generated JSON schema.
""" generated: list[JsonSchemaValue] = [] choices = schema['choices'] for choice in choices: # choice will be a tuple if an explicit label was provided choice_schema = choice[0] if isinstance(choice, tuple) else choice try: generated.append(self.generate_inner(choice_schema)) except PydanticOmit: continue except PydanticInvalidForJsonSchema as exc: self.emit_warning('skipped-choice', exc.message) if len(generated) == 1: return generated[0] return self.get_flattened_anyof(generated) def tagged_union_schema(self, schema: core_schema.TaggedUnionSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that allows values matching any of the given schemas, where the schemas are tagged with a discriminator field that indicates which schema should be used to validate the value. Args: schema: The core schema. Returns: The generated JSON schema. """ generated: dict[str, JsonSchemaValue] = {} for k, v in schema['choices'].items(): if isinstance(k, Enum): k = k.value try: # Use str(k) since keys must be strings for json; while not technically correct, # it's the closest that can be represented in valid JSON generated[str(k)] = self.generate_inner(v).copy() except PydanticOmit: continue except PydanticInvalidForJsonSchema as exc: self.emit_warning('skipped-choice', exc.message) one_of_choices = _deduplicate_schemas(generated.values()) json_schema: JsonSchemaValue = {'oneOf': one_of_choices} # This reflects the v1 behavior; TODO: we should make it possible to exclude OpenAPI stuff from the JSON schema openapi_discriminator = self._extract_discriminator(schema, one_of_choices) if openapi_discriminator is not None: json_schema['discriminator'] = { 'propertyName': openapi_discriminator, 'mapping': {k: v.get('$ref', v) for k, v in generated.items()}, } return json_schema def _extract_discriminator( self, schema: core_schema.TaggedUnionSchema, one_of_choices: list[JsonDict] ) -> str | None: """Extract a compatible OpenAPI discriminator from the schema and one_of 
        openapi_discriminator: str | None = None

        if isinstance(schema['discriminator'], str):
            return schema['discriminator']
        if isinstance(schema['discriminator'], list):
            # If the discriminator is a single item list containing a string, that is equivalent to the string case
            if len(schema['discriminator']) == 1 and isinstance(schema['discriminator'][0], str):
                return schema['discriminator'][0]
            # When an alias is used that is different from the field name, the discriminator will be a list of single
            # str lists, one for the attribute and one for the actual alias. The logic here will work even if there is
            # more than one possible attribute, and looks for whether a single alias choice is present as a documented
            # property on all choices. If so, that property will be used as the OpenAPI discriminator.
            for alias_path in schema['discriminator']:
                if not isinstance(alias_path, list):
                    break  # this means that the discriminator is not a list of alias paths
                if len(alias_path) != 1:
                    continue  # this means that the "alias" does not represent a single field
                alias = alias_path[0]
                if not isinstance(alias, str):
                    continue  # this means that the "alias" does not represent a field
                alias_is_present_on_all_choices = True
                for choice in one_of_choices:
                    try:
                        choice = self.resolve_ref_schema(choice)
                    except RuntimeError as exc:
                        # TODO: fixme - this is a workaround for the fact that we can't always resolve refs
                        # for tagged union choices at this point in the schema gen process, we might need to do
                        # another pass at the end like we do for core schemas
                        self.emit_warning('skipped-discriminator', str(exc))
                        choice = {}
                    properties = choice.get('properties', {})
                    if not isinstance(properties, dict) or alias not in properties:
                        alias_is_present_on_all_choices = False
                        break
                if alias_is_present_on_all_choices:
                    openapi_discriminator = alias
                    break
        return openapi_discriminator

    def chain_schema(self, schema: core_schema.ChainSchema) -> JsonSchemaValue:
        """Generates a JSON schema that matches a core_schema.ChainSchema.
        When generating a schema for validation, we return the validation JSON schema for the first step in the chain.
        For serialization, we return the serialization JSON schema for the last step in the chain.

        Args:
            schema: The core schema.

        Returns:
            The generated JSON schema.
        """
        step_index = 0 if self.mode == 'validation' else -1  # use first step for validation, last for serialization
        return self.generate_inner(schema['steps'][step_index])

    def lax_or_strict_schema(self, schema: core_schema.LaxOrStrictSchema) -> JsonSchemaValue:
        """Generates a JSON schema that matches a schema that allows values matching either the lax schema or the
        strict schema.

        Args:
            schema: The core schema.

        Returns:
            The generated JSON schema.
        """
        # TODO: Need to read the default value off of model config or whatever
        use_strict = schema.get('strict', False)  # TODO: replace this default False
        # If your JSON schema fails to generate it is probably
        # because one of the following two branches failed.
        if use_strict:
            return self.generate_inner(schema['strict_schema'])
        else:
            return self.generate_inner(schema['lax_schema'])

    def json_or_python_schema(self, schema: core_schema.JsonOrPythonSchema) -> JsonSchemaValue:
        """Generates a JSON schema that matches a schema that allows values matching either the JSON schema or the
        Python schema.

        The JSON schema is used instead of the Python schema. If you want to use the Python schema, you
        should override this method.

        Args:
            schema: The core schema.

        Returns:
            The generated JSON schema.
        """
        return self.generate_inner(schema['json_schema'])

    def typed_dict_schema(self, schema: core_schema.TypedDictSchema) -> JsonSchemaValue:
        """Generates a JSON schema that matches a schema that defines a typed dict.

        Args:
            schema: The core schema.

        Returns:
            The generated JSON schema.
""" total = schema.get('total', True) named_required_fields: list[tuple[str, bool, CoreSchemaField]] = [ (name, self.field_is_required(field, total), field) for name, field in schema['fields'].items() if self.field_is_present(field) ] if self.mode == 'serialization': named_required_fields.extend(self._name_required_computed_fields(schema.get('computed_fields', []))) cls = schema.get('cls') config = _get_typed_dict_config(cls) with self._config_wrapper_stack.push(config): json_schema = self._named_required_fields_schema(named_required_fields) if cls is not None: self._update_class_schema(json_schema, cls, config) else: extra = config.get('extra') if extra == 'forbid': json_schema['additionalProperties'] = False elif extra == 'allow': json_schema['additionalProperties'] = True return json_schema @staticmethod def _name_required_computed_fields( computed_fields: list[ComputedField], ) -> list[tuple[str, bool, core_schema.ComputedField]]: return [(field['property_name'], True, field) for field in computed_fields] def _named_required_fields_schema( self, named_required_fields: Sequence[tuple[str, bool, CoreSchemaField]] ) -> JsonSchemaValue: properties: dict[str, JsonSchemaValue] = {} required_fields: list[str] = [] for name, required, field in named_required_fields: if self.by_alias: name = self._get_alias_name(field, name) try: field_json_schema = self.generate_inner(field).copy() except PydanticOmit: continue if 'title' not in field_json_schema and self.field_title_should_be_set(field): title = self.get_title_from_name(name) field_json_schema['title'] = title field_json_schema = self.handle_ref_overrides(field_json_schema) properties[name] = field_json_schema if required: required_fields.append(name) json_schema = {'type': 'object', 'properties': properties} if required_fields: json_schema['required'] = required_fields return json_schema def _get_alias_name(self, field: CoreSchemaField, name: str) -> str: if field['type'] == 'computed-field': alias: Any = 
field.get('alias', name) elif self.mode == 'validation': alias = field.get('validation_alias', name) else: alias = field.get('serialization_alias', name) if isinstance(alias, str): name = alias elif isinstance(alias, list): alias = cast('list[str] | str', alias) for path in alias: if isinstance(path, list) and len(path) == 1 and isinstance(path[0], str): # Use the first valid single-item string path; the code that constructs the alias array # should ensure the first such item is what belongs in the JSON schema name = path[0] break else: assert_never(alias) return name def typed_dict_field_schema(self, schema: core_schema.TypedDictField) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a typed dict field. Args: schema: The core schema. Returns: The generated JSON schema. """ return self.generate_inner(schema['schema']) def dataclass_field_schema(self, schema: core_schema.DataclassField) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a dataclass field. Args: schema: The core schema. Returns: The generated JSON schema. """ return self.generate_inner(schema['schema']) def model_field_schema(self, schema: core_schema.ModelField) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a model field. Args: schema: The core schema. Returns: The generated JSON schema. """ return self.generate_inner(schema['schema']) def computed_field_schema(self, schema: core_schema.ComputedField) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a computed field. Args: schema: The core schema. Returns: The generated JSON schema. """ return self.generate_inner(schema['return_schema']) def model_schema(self, schema: core_schema.ModelSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a model. Args: schema: The core schema. Returns: The generated JSON schema. 
""" # We do not use schema['model'].model_json_schema() here # because it could lead to inconsistent refs handling, etc. cls = cast('type[BaseModel]', schema['cls']) config = cls.model_config with self._config_wrapper_stack.push(config): json_schema = self.generate_inner(schema['schema']) self._update_class_schema(json_schema, cls, config) return json_schema def _update_class_schema(self, json_schema: JsonSchemaValue, cls: type[Any], config: ConfigDict) -> None: """Update json_schema with the following, extracted from `config` and `cls`: * title * description * additional properties * json_schema_extra * deprecated Done in place, hence there's no return value as the original json_schema is mutated. No ref resolving is involved here, as that's not appropriate for simple updates. """ from .main import BaseModel from .root_model import RootModel if (config_title := config.get('title')) is not None: json_schema.setdefault('title', config_title) elif model_title_generator := config.get('model_title_generator'): title = model_title_generator(cls) if not isinstance(title, str): raise TypeError(f'model_title_generator {model_title_generator} must return str, not {title.__class__}') json_schema.setdefault('title', title) if 'title' not in json_schema: json_schema['title'] = cls.__name__ # BaseModel and dataclasses; don't use cls.__doc__ as it will contain the verbose class signature by default docstring = None if cls is BaseModel or dataclasses.is_dataclass(cls) else cls.__doc__ if docstring: json_schema.setdefault('description', inspect.cleandoc(docstring)) elif issubclass(cls, RootModel) and (root_description := cls.__pydantic_fields__['root'].description): json_schema.setdefault('description', root_description) extra = config.get('extra') if 'additionalProperties' not in json_schema: if extra == 'allow': json_schema['additionalProperties'] = True elif extra == 'forbid': json_schema['additionalProperties'] = False json_schema_extra = config.get('json_schema_extra') if 
issubclass(cls, BaseModel) and cls.__pydantic_root_model__: root_json_schema_extra = cls.model_fields['root'].json_schema_extra if json_schema_extra and root_json_schema_extra: raise ValueError( '"model_config[\'json_schema_extra\']" and "Field.json_schema_extra" on "RootModel.root"' ' field must not be set simultaneously' ) if root_json_schema_extra: json_schema_extra = root_json_schema_extra if isinstance(json_schema_extra, (staticmethod, classmethod)): # In older versions of python, this is necessary to ensure staticmethod/classmethods are callable json_schema_extra = json_schema_extra.__get__(cls) if isinstance(json_schema_extra, dict): json_schema.update(json_schema_extra) elif callable(json_schema_extra): # FIXME: why are there type ignores here? We support two signatures for json_schema_extra callables... if len(inspect.signature(json_schema_extra).parameters) > 1: json_schema_extra(json_schema, cls) # type: ignore else: json_schema_extra(json_schema) # type: ignore elif json_schema_extra is not None: raise ValueError( f"model_config['json_schema_extra']={json_schema_extra} should be a dict, callable, or None" ) if hasattr(cls, '__deprecated__'): json_schema['deprecated'] = True def resolve_ref_schema(self, json_schema: JsonSchemaValue) -> JsonSchemaValue: """Resolve a JsonSchemaValue to the non-ref schema if it is a $ref schema. Args: json_schema: The schema to resolve. Returns: The resolved schema. Raises: RuntimeError: If the schema reference can't be found in definitions. """ if '$ref' not in json_schema: return json_schema ref = json_schema['$ref'] schema_to_update = self.get_schema_from_definitions(JsonRef(ref)) if schema_to_update is None: raise RuntimeError(f'Cannot update undefined schema for $ref={ref}') return self.resolve_ref_schema(schema_to_update) def model_fields_schema(self, schema: core_schema.ModelFieldsSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a model's fields. Args: schema: The core schema. 
Returns: The generated JSON schema. """ named_required_fields: list[tuple[str, bool, CoreSchemaField]] = [ (name, self.field_is_required(field, total=True), field) for name, field in schema['fields'].items() if self.field_is_present(field) ] if self.mode == 'serialization': named_required_fields.extend(self._name_required_computed_fields(schema.get('computed_fields', []))) json_schema = self._named_required_fields_schema(named_required_fields) extras_schema = schema.get('extras_schema', None) if extras_schema is not None: schema_to_update = self.resolve_ref_schema(json_schema) schema_to_update['additionalProperties'] = self.generate_inner(extras_schema) return json_schema def field_is_present(self, field: CoreSchemaField) -> bool: """Whether the field should be included in the generated JSON schema. Args: field: The schema for the field itself. Returns: `True` if the field should be included in the generated JSON schema, `False` otherwise. """ if self.mode == 'serialization': # If you still want to include the field in the generated JSON schema, # override this method and return True return not field.get('serialization_exclude') elif self.mode == 'validation': return True else: assert_never(self.mode) def field_is_required( self, field: core_schema.ModelField | core_schema.DataclassField | core_schema.TypedDictField, total: bool, ) -> bool: """Whether the field should be marked as required in the generated JSON schema. (Note that this is irrelevant if the field is not present in the JSON schema.). Args: field: The schema for the field itself. total: Only applies to `TypedDictField`s. Indicates if the `TypedDict` this field belongs to is total, in which case any fields that don't explicitly specify `required=False` are required. Returns: `True` if the field should be marked as required in the generated JSON schema, `False` otherwise. 
""" if self.mode == 'serialization' and self._config.json_schema_serialization_defaults_required: return not field.get('serialization_exclude') else: if field['type'] == 'typed-dict-field': return field.get('required', total) else: return field['schema']['type'] != 'default' def dataclass_args_schema(self, schema: core_schema.DataclassArgsSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a dataclass's constructor arguments. Args: schema: The core schema. Returns: The generated JSON schema. """ named_required_fields: list[tuple[str, bool, CoreSchemaField]] = [ (field['name'], self.field_is_required(field, total=True), field) for field in schema['fields'] if self.field_is_present(field) ] if self.mode == 'serialization': named_required_fields.extend(self._name_required_computed_fields(schema.get('computed_fields', []))) return self._named_required_fields_schema(named_required_fields) def dataclass_schema(self, schema: core_schema.DataclassSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a dataclass. Args: schema: The core schema. Returns: The generated JSON schema. """ from ._internal._dataclasses import is_builtin_dataclass cls = schema['cls'] config: ConfigDict = getattr(cls, '__pydantic_config__', cast('ConfigDict', {})) with self._config_wrapper_stack.push(config): json_schema = self.generate_inner(schema['schema']).copy() self._update_class_schema(json_schema, cls, config) # Dataclass-specific handling of description if is_builtin_dataclass(cls): # vanilla dataclass; don't use cls.__doc__ as it will contain the class signature by default description = None else: description = None if cls.__doc__ is None else inspect.cleandoc(cls.__doc__) if description: json_schema['description'] = description return json_schema def arguments_schema(self, schema: core_schema.ArgumentsSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a function's arguments. 
Args: schema: The core schema. Returns: The generated JSON schema. """ prefer_positional = schema.get('metadata', {}).get('pydantic_js_prefer_positional_arguments') arguments = schema['arguments_schema'] kw_only_arguments = [a for a in arguments if a.get('mode') == 'keyword_only'] kw_or_p_arguments = [a for a in arguments if a.get('mode') in {'positional_or_keyword', None}] p_only_arguments = [a for a in arguments if a.get('mode') == 'positional_only'] var_args_schema = schema.get('var_args_schema') var_kwargs_schema = schema.get('var_kwargs_schema') if prefer_positional: positional_possible = not kw_only_arguments and not var_kwargs_schema if positional_possible: return self.p_arguments_schema(p_only_arguments + kw_or_p_arguments, var_args_schema) keyword_possible = not p_only_arguments and not var_args_schema if keyword_possible: return self.kw_arguments_schema(kw_or_p_arguments + kw_only_arguments, var_kwargs_schema) if not prefer_positional: positional_possible = not kw_only_arguments and not var_kwargs_schema if positional_possible: return self.p_arguments_schema(p_only_arguments + kw_or_p_arguments, var_args_schema) raise PydanticInvalidForJsonSchema( 'Unable to generate JSON schema for arguments validator with positional-only and keyword-only arguments' ) def kw_arguments_schema( self, arguments: list[core_schema.ArgumentsParameter], var_kwargs_schema: CoreSchema | None ) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a function's keyword arguments. Args: arguments: The core schema. Returns: The generated JSON schema. 
""" properties: dict[str, JsonSchemaValue] = {} required: list[str] = [] for argument in arguments: name = self.get_argument_name(argument) argument_schema = self.generate_inner(argument['schema']).copy() argument_schema['title'] = self.get_title_from_name(name) properties[name] = argument_schema if argument['schema']['type'] != 'default': # This assumes that if the argument has a default value, # the inner schema must be of type WithDefaultSchema. # I believe this is true, but I am not 100% sure required.append(name) json_schema: JsonSchemaValue = {'type': 'object', 'properties': properties} if required: json_schema['required'] = required if var_kwargs_schema: additional_properties_schema = self.generate_inner(var_kwargs_schema) if additional_properties_schema: json_schema['additionalProperties'] = additional_properties_schema else: json_schema['additionalProperties'] = False return json_schema def p_arguments_schema( self, arguments: list[core_schema.ArgumentsParameter], var_args_schema: CoreSchema | None ) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a function's positional arguments. Args: arguments: The core schema. Returns: The generated JSON schema. """ prefix_items: list[JsonSchemaValue] = [] min_items = 0 for argument in arguments: name = self.get_argument_name(argument) argument_schema = self.generate_inner(argument['schema']).copy() argument_schema['title'] = self.get_title_from_name(name) prefix_items.append(argument_schema) if argument['schema']['type'] != 'default': # This assumes that if the argument has a default value, # the inner schema must be of type WithDefaultSchema. 
# I believe this is true, but I am not 100% sure min_items += 1 json_schema: JsonSchemaValue = {'type': 'array'} if prefix_items: json_schema['prefixItems'] = prefix_items if min_items: json_schema['minItems'] = min_items if var_args_schema: items_schema = self.generate_inner(var_args_schema) if items_schema: json_schema['items'] = items_schema else: json_schema['maxItems'] = len(prefix_items) return json_schema def get_argument_name(self, argument: core_schema.ArgumentsParameter) -> str: """Retrieves the name of an argument. Args: argument: The core schema. Returns: The name of the argument. """ name = argument['name'] if self.by_alias: alias = argument.get('alias') if isinstance(alias, str): name = alias else: pass # might want to do something else? return name def call_schema(self, schema: core_schema.CallSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a function call. Args: schema: The core schema. Returns: The generated JSON schema. """ return self.generate_inner(schema['arguments_schema']) def custom_error_schema(self, schema: core_schema.CustomErrorSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a custom error. Args: schema: The core schema. Returns: The generated JSON schema. """ return self.generate_inner(schema['schema']) def json_schema(self, schema: core_schema.JsonSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a JSON object. Args: schema: The core schema. Returns: The generated JSON schema. 
""" content_core_schema = schema.get('schema') or core_schema.any_schema() content_json_schema = self.generate_inner(content_core_schema) if self.mode == 'validation': return {'type': 'string', 'contentMediaType': 'application/json', 'contentSchema': content_json_schema} else: # self.mode == 'serialization' return content_json_schema def url_schema(self, schema: core_schema.UrlSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a URL. Args: schema: The core schema. Returns: The generated JSON schema. """ json_schema = {'type': 'string', 'format': 'uri', 'minLength': 1} self.update_with_validations(json_schema, schema, self.ValidationsMapping.string) return json_schema def multi_host_url_schema(self, schema: core_schema.MultiHostUrlSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a URL that can be used with multiple hosts. Args: schema: The core schema. Returns: The generated JSON schema. """ # Note: 'multi-host-uri' is a custom/pydantic-specific format, not part of the JSON Schema spec json_schema = {'type': 'string', 'format': 'multi-host-uri', 'minLength': 1} self.update_with_validations(json_schema, schema, self.ValidationsMapping.string) return json_schema def uuid_schema(self, schema: core_schema.UuidSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a UUID. Args: schema: The core schema. Returns: The generated JSON schema. """ return {'type': 'string', 'format': 'uuid'} def definitions_schema(self, schema: core_schema.DefinitionsSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a JSON object with definitions. Args: schema: The core schema. Returns: The generated JSON schema. 
""" for definition in schema['definitions']: try: self.generate_inner(definition) except PydanticInvalidForJsonSchema as e: core_ref: CoreRef = CoreRef(definition['ref']) # type: ignore self._core_defs_invalid_for_json_schema[self.get_defs_ref((core_ref, self.mode))] = e continue return self.generate_inner(schema['schema']) def definition_ref_schema(self, schema: core_schema.DefinitionReferenceSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that references a definition. Args: schema: The core schema. Returns: The generated JSON schema. """ core_ref = CoreRef(schema['schema_ref']) _, ref_json_schema = self.get_cache_defs_ref_schema(core_ref) return ref_json_schema def ser_schema( self, schema: core_schema.SerSchema | core_schema.IncExSeqSerSchema | core_schema.IncExDictSerSchema ) -> JsonSchemaValue | None: """Generates a JSON schema that matches a schema that defines a serialized object. Args: schema: The core schema. Returns: The generated JSON schema. """ schema_type = schema['type'] if schema_type == 'function-plain' or schema_type == 'function-wrap': # PlainSerializerFunctionSerSchema or WrapSerializerFunctionSerSchema return_schema = schema.get('return_schema') if return_schema is not None: return self.generate_inner(return_schema) elif schema_type == 'format' or schema_type == 'to-string': # FormatSerSchema or ToStringSerSchema return self.str_schema(core_schema.str_schema()) elif schema['type'] == 'model': # ModelSerSchema return self.generate_inner(schema['schema']) return None def complex_schema(self, schema: core_schema.ComplexSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a complex number. JSON has no standard way to represent complex numbers. Complex number is not a numeric type. Here we represent complex number as strings following the rule defined by Python. For instance, '1+2j' is an accepted complex string. Details can be found in [Python's `complex` documentation][complex]. 
Args: schema: The core schema. Returns: The generated JSON schema. """ return {'type': 'string'} # ### Utility methods def get_title_from_name(self, name: str) -> str: """Retrieves a title from a name. Args: name: The name to retrieve a title from. Returns: The title. """ return name.title().replace('_', ' ').strip() def field_title_should_be_set(self, schema: CoreSchemaOrField) -> bool: """Returns true if a field with the given schema should have a title set based on the field name. Intuitively, we want this to return true for schemas that wouldn't otherwise provide their own title (e.g., int, float, str), and false for those that would (e.g., BaseModel subclasses). Args: schema: The schema to check. Returns: `True` if the field should have a title set, `False` otherwise. """ if _core_utils.is_core_schema_field(schema): if schema['type'] == 'computed-field': field_schema = schema['return_schema'] else: field_schema = schema['schema'] return self.field_title_should_be_set(field_schema) elif _core_utils.is_core_schema(schema): if schema.get('ref'): # things with refs, such as models and enums, should not have titles set return False if schema['type'] in {'default', 'nullable', 'definitions'}: return self.field_title_should_be_set(schema['schema']) # type: ignore[typeddict-item] if _core_utils.is_function_with_inner_schema(schema): return self.field_title_should_be_set(schema['schema']) if schema['type'] == 'definition-ref': # Referenced schemas should not have titles set for the same reason # schemas with refs should not return False return True # anything else should have title set else: raise PydanticInvalidForJsonSchema(f'Unexpected schema type: schema={schema}') # pragma: no cover def normalize_name(self, name: str) -> str: """Normalizes a name to be used as a key in a dictionary. Args: name: The name to normalize. Returns: The normalized name. 
""" return re.sub(r'[^a-zA-Z0-9.\-_]', '_', name).replace('.', '__') def get_defs_ref(self, core_mode_ref: CoreModeRef) -> DefsRef: """Override this method to change the way that definitions keys are generated from a core reference. Args: core_mode_ref: The core reference. Returns: The definitions key. """ # Split the core ref into "components"; generic origins and arguments are each separate components core_ref, mode = core_mode_ref components = re.split(r'([\][,])', core_ref) # Remove IDs from each component components = [x.rsplit(':', 1)[0] for x in components] core_ref_no_id = ''.join(components) # Remove everything before the last period from each "component" components = [re.sub(r'(?:[^.[\]]+\.)+((?:[^.[\]]+))', r'\1', x) for x in components] short_ref = ''.join(components) mode_title = _MODE_TITLE_MAPPING[mode] # It is important that the generated defs_ref values be such that at least one choice will not # be generated for any other core_ref. Currently, this should be the case because we include # the id of the source type in the core_ref name = DefsRef(self.normalize_name(short_ref)) name_mode = DefsRef(self.normalize_name(short_ref) + f'-{mode_title}') module_qualname = DefsRef(self.normalize_name(core_ref_no_id)) module_qualname_mode = DefsRef(f'{module_qualname}-{mode_title}') module_qualname_id = DefsRef(self.normalize_name(core_ref)) occurrence_index = self._collision_index.get(module_qualname_id) if occurrence_index is None: self._collision_counter[module_qualname] += 1 occurrence_index = self._collision_index[module_qualname_id] = self._collision_counter[module_qualname] module_qualname_occurrence = DefsRef(f'{module_qualname}__{occurrence_index}') module_qualname_occurrence_mode = DefsRef(f'{module_qualname_mode}__{occurrence_index}') self._prioritized_defsref_choices[module_qualname_occurrence_mode] = [ name, name_mode, module_qualname, module_qualname_mode, module_qualname_occurrence, module_qualname_occurrence_mode, ] return 
module_qualname_occurrence_mode def get_cache_defs_ref_schema(self, core_ref: CoreRef) -> tuple[DefsRef, JsonSchemaValue]: """This method wraps the get_defs_ref method with some cache-lookup/population logic, and returns both the produced defs_ref and the JSON schema that will refer to the right definition. Args: core_ref: The core reference to get the definitions reference for. Returns: A tuple of the definitions reference and the JSON schema that will refer to it. """ core_mode_ref = (core_ref, self.mode) maybe_defs_ref = self.core_to_defs_refs.get(core_mode_ref) if maybe_defs_ref is not None: json_ref = self.core_to_json_refs[core_mode_ref] return maybe_defs_ref, {'$ref': json_ref} defs_ref = self.get_defs_ref(core_mode_ref) # populate the ref translation mappings self.core_to_defs_refs[core_mode_ref] = defs_ref self.defs_to_core_refs[defs_ref] = core_mode_ref json_ref = JsonRef(self.ref_template.format(model=defs_ref)) self.core_to_json_refs[core_mode_ref] = json_ref self.json_to_defs_refs[json_ref] = defs_ref ref_json_schema = {'$ref': json_ref} return defs_ref, ref_json_schema def handle_ref_overrides(self, json_schema: JsonSchemaValue) -> JsonSchemaValue: """Remove any sibling keys that are redundant with the referenced schema. Args: json_schema: The schema to remove redundant sibling keys from. Returns: The schema with redundant sibling keys removed. """ if '$ref' in json_schema: # prevent modifications to the input; this copy may be safe to drop if there is significant overhead json_schema = json_schema.copy() referenced_json_schema = self.get_schema_from_definitions(JsonRef(json_schema['$ref'])) if referenced_json_schema is None: # This can happen when building schemas for models with not-yet-defined references. # It may be a good idea to do a recursive pass at the end of the generation to remove # any redundant override keys. 
return json_schema for k, v in list(json_schema.items()): if k == '$ref': continue if k in referenced_json_schema and referenced_json_schema[k] == v: del json_schema[k] # redundant key return json_schema def get_schema_from_definitions(self, json_ref: JsonRef) -> JsonSchemaValue | None: try: def_ref = self.json_to_defs_refs[json_ref] if def_ref in self._core_defs_invalid_for_json_schema: raise self._core_defs_invalid_for_json_schema[def_ref] return self.definitions.get(def_ref, None) except KeyError: if json_ref.startswith(('http://', 'https://')): return None raise def encode_default(self, dft: Any) -> Any: """Encode a default value to a JSON-serializable value. This is used to encode default values for fields in the generated JSON schema. Args: dft: The default value to encode. Returns: The encoded default value. """ from .type_adapter import TypeAdapter, _type_has_config config = self._config try: default = ( dft if _type_has_config(type(dft)) else TypeAdapter(type(dft), config=config.config_dict).dump_python(dft, mode='json') ) except PydanticSchemaGenerationError: raise pydantic_core.PydanticSerializationError(f'Unable to encode default value {dft}') return pydantic_core.to_jsonable_python( default, timedelta_mode=config.ser_json_timedelta, bytes_mode=config.ser_json_bytes, ) def update_with_validations( self, json_schema: JsonSchemaValue, core_schema: CoreSchema, mapping: dict[str, str] ) -> None: """Update the json_schema with the corresponding validations specified in the core_schema, using the provided mapping to translate keys in core_schema to the appropriate keys for a JSON schema. Args: json_schema: The JSON schema to update. core_schema: The core schema to get the validations from. mapping: A mapping from core_schema attribute names to the corresponding JSON schema attribute names. 
""" for core_key, json_schema_key in mapping.items(): if core_key in core_schema: json_schema[json_schema_key] = core_schema[core_key] class ValidationsMapping: """This class just contains mappings from core_schema attribute names to the corresponding JSON schema attribute names. While I suspect it is unlikely to be necessary, you can in principle override this class in a subclass of GenerateJsonSchema (by inheriting from GenerateJsonSchema.ValidationsMapping) to change these mappings. """ numeric = { 'multiple_of': 'multipleOf', 'le': 'maximum', 'ge': 'minimum', 'lt': 'exclusiveMaximum', 'gt': 'exclusiveMinimum', } bytes = { 'min_length': 'minLength', 'max_length': 'maxLength', } string = { 'min_length': 'minLength', 'max_length': 'maxLength', 'pattern': 'pattern', } array = { 'min_length': 'minItems', 'max_length': 'maxItems', } object = { 'min_length': 'minProperties', 'max_length': 'maxProperties', } def get_flattened_anyof(self, schemas: list[JsonSchemaValue]) -> JsonSchemaValue: members = [] for schema in schemas: if len(schema) == 1 and 'anyOf' in schema: members.extend(schema['anyOf']) else: members.append(schema) members = _deduplicate_schemas(members) if len(members) == 1: return members[0] return {'anyOf': members} def get_json_ref_counts(self, json_schema: JsonSchemaValue) -> dict[JsonRef, int]: """Get all values corresponding to the key '$ref' anywhere in the json_schema.""" json_refs: dict[JsonRef, int] = Counter() def _add_json_refs(schema: Any) -> None: if isinstance(schema, dict): if '$ref' in schema: json_ref = JsonRef(schema['$ref']) if not isinstance(json_ref, str): return # in this case, '$ref' might have been the name of a property already_visited = json_ref in json_refs json_refs[json_ref] += 1 if already_visited: return # prevent recursion on a definition that was already visited try: defs_ref = self.json_to_defs_refs[json_ref] if defs_ref in self._core_defs_invalid_for_json_schema: raise self._core_defs_invalid_for_json_schema[defs_ref] 
_add_json_refs(self.definitions[defs_ref]) except KeyError: if not json_ref.startswith(('http://', 'https://')): raise for k, v in schema.items(): if k == 'examples': continue # skip refs processing for examples, allow arbitrary values / refs _add_json_refs(v) elif isinstance(schema, list): for v in schema: _add_json_refs(v) _add_json_refs(json_schema) return json_refs def handle_invalid_for_json_schema(self, schema: CoreSchemaOrField, error_info: str) -> JsonSchemaValue: raise PydanticInvalidForJsonSchema(f'Cannot generate a JsonSchema for {error_info}') def emit_warning(self, kind: JsonSchemaWarningKind, detail: str) -> None: """This method simply emits PydanticJsonSchemaWarnings based on handling in the `warning_message` method.""" message = self.render_warning_message(kind, detail) if message is not None: warnings.warn(message, PydanticJsonSchemaWarning) def render_warning_message(self, kind: JsonSchemaWarningKind, detail: str) -> str | None: """This method is responsible for ignoring warnings as desired, and for formatting the warning messages. You can override the value of `ignored_warning_kinds` in a subclass of GenerateJsonSchema to modify what warnings are generated. If you want more control, you can override this method; just return None in situations where you don't want warnings to be emitted. Args: kind: The kind of warning to render. It can be one of the following: - 'skipped-choice': A choice field was skipped because it had no valid choices. - 'non-serializable-default': A default value was skipped because it was not JSON-serializable. detail: A string with additional details about the warning. Returns: The formatted warning message, or `None` if no warning should be emitted. 
""" if kind in self.ignored_warning_kinds: return None return f'{detail} [{kind}]' def _build_definitions_remapping(self) -> _DefinitionsRemapping: defs_to_json: dict[DefsRef, JsonRef] = {} for defs_refs in self._prioritized_defsref_choices.values(): for defs_ref in defs_refs: json_ref = JsonRef(self.ref_template.format(model=defs_ref)) defs_to_json[defs_ref] = json_ref return _DefinitionsRemapping.from_prioritized_choices( self._prioritized_defsref_choices, defs_to_json, self.definitions ) def _garbage_collect_definitions(self, schema: JsonSchemaValue) -> None: visited_defs_refs: set[DefsRef] = set() unvisited_json_refs = _get_all_json_refs(schema) while unvisited_json_refs: next_json_ref = unvisited_json_refs.pop() try: next_defs_ref = self.json_to_defs_refs[next_json_ref] if next_defs_ref in visited_defs_refs: continue visited_defs_refs.add(next_defs_ref) unvisited_json_refs.update(_get_all_json_refs(self.definitions[next_defs_ref])) except KeyError: if not next_json_ref.startswith(('http://', 'https://')): raise self.definitions = {k: v for k, v in self.definitions.items() if k in visited_defs_refs} # ##### Start JSON Schema Generation Functions ##### def model_json_schema( cls: type[BaseModel] | type[PydanticDataclass], by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', ) -> dict[str, Any]: """Utility function to generate a JSON Schema for a model. Args: cls: The model class to generate a JSON Schema for. by_alias: If `True` (the default), fields will be serialized according to their alias. If `False`, fields will be serialized according to their attribute name. ref_template: The template to use for generating JSON Schema references. schema_generator: The class to use for generating the JSON Schema. mode: The mode to use for generating the JSON Schema. It can be one of the following: - 'validation': Generate a JSON Schema for validating data. 
- 'serialization': Generate a JSON Schema for serializing data. Returns: The generated JSON Schema. """ from .main import BaseModel schema_generator_instance = schema_generator(by_alias=by_alias, ref_template=ref_template) if isinstance(cls.__pydantic_core_schema__, _mock_val_ser.MockCoreSchema): cls.__pydantic_core_schema__.rebuild() if cls is BaseModel: raise AttributeError('model_json_schema() must be called on a subclass of BaseModel, not BaseModel itself.') assert not isinstance(cls.__pydantic_core_schema__, _mock_val_ser.MockCoreSchema), 'this is a bug! please report it' return schema_generator_instance.generate(cls.__pydantic_core_schema__, mode=mode) def models_json_schema( models: Sequence[tuple[type[BaseModel] | type[PydanticDataclass], JsonSchemaMode]], *, by_alias: bool = True, title: str | None = None, description: str | None = None, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, ) -> tuple[dict[tuple[type[BaseModel] | type[PydanticDataclass], JsonSchemaMode], JsonSchemaValue], JsonSchemaValue]: """Utility function to generate a JSON Schema for multiple models. Args: models: A sequence of tuples of the form (model, mode). by_alias: Whether field aliases should be used as keys in the generated JSON Schema. title: The title of the generated JSON Schema. description: The description of the generated JSON Schema. ref_template: The reference template to use for generating JSON Schema references. schema_generator: The schema generator to use for generating the JSON Schema. Returns: A tuple where: - The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.) 
- The second element is a JSON schema containing all definitions referenced in the first returned element, along with the optional title and description keys. """ for cls, _ in models: if isinstance(cls.__pydantic_core_schema__, _mock_val_ser.MockCoreSchema): cls.__pydantic_core_schema__.rebuild() instance = schema_generator(by_alias=by_alias, ref_template=ref_template) inputs: list[tuple[type[BaseModel] | type[PydanticDataclass], JsonSchemaMode, CoreSchema]] = [ (m, mode, m.__pydantic_core_schema__) for m, mode in models ] json_schemas_map, definitions = instance.generate_definitions(inputs) json_schema: dict[str, Any] = {} if definitions: json_schema['$defs'] = definitions if title: json_schema['title'] = title if description: json_schema['description'] = description return json_schemas_map, json_schema # ##### End JSON Schema Generation Functions ##### _HashableJsonValue: TypeAlias = Union[ int, float, str, bool, None, Tuple['_HashableJsonValue', ...], Tuple[Tuple[str, '_HashableJsonValue'], ...] ] def _deduplicate_schemas(schemas: Iterable[JsonDict]) -> list[JsonDict]: return list({_make_json_hashable(schema): schema for schema in schemas}.values()) def _make_json_hashable(value: JsonValue) -> _HashableJsonValue: if isinstance(value, dict): return tuple(sorted((k, _make_json_hashable(v)) for k, v in value.items())) elif isinstance(value, list): return tuple(_make_json_hashable(v) for v in value) else: return value @dataclasses.dataclass(**_internal_dataclass.slots_true) class WithJsonSchema: """Usage docs: https://docs.pydantic.dev/2.10/concepts/json_schema/#withjsonschema-annotation Add this as an annotation on a field to override the (base) JSON schema that would be generated for that field. 
This provides a way to set a JSON schema for types that would otherwise raise errors when producing a JSON schema, such as Callable, or types that have an is-instance core schema, without needing to go so far as creating a custom subclass of pydantic.json_schema.GenerateJsonSchema. Note that any _modifications_ to the schema that would normally be made (such as setting the title for model fields) will still be performed. If `mode` is set this will only apply to that schema generation mode, allowing you to set different json schemas for validation and serialization. """ json_schema: JsonSchemaValue | None mode: Literal['validation', 'serialization'] | None = None def __get_pydantic_json_schema__( self, core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: mode = self.mode or handler.mode if mode != handler.mode: return handler(core_schema) if self.json_schema is None: # This exception is handled in pydantic.json_schema.GenerateJsonSchema._named_required_fields_schema raise PydanticOmit else: return self.json_schema def __hash__(self) -> int: return hash(type(self.mode)) class Examples: """Add examples to a JSON schema. If the JSON Schema already contains examples, the provided examples will be appended. If `mode` is set this will only apply to that schema generation mode, allowing you to add different examples for validation and serialization. """ @overload @deprecated('Using a dict for `examples` is deprecated since v2.9 and will be removed in v3.0. Use a list instead.') def __init__( self, examples: dict[str, Any], mode: Literal['validation', 'serialization'] | None = None ) -> None: ... @overload def __init__(self, examples: list[Any], mode: Literal['validation', 'serialization'] | None = None) -> None: ... 
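# Usage sketch for the `Examples` annotation above (not part of the library source):
# assuming Pydantic v2.9+ is installed (the list form of `examples` is the
# non-deprecated one) and a hypothetical `User` model, attaching `Examples` as
# `Annotated` metadata hooks `__get_pydantic_json_schema__` into schema generation:

```python
from typing import Annotated

from pydantic import BaseModel
from pydantic.json_schema import Examples


class User(BaseModel):
    # The Examples metadata is picked up when the field's JSON schema is built;
    # since the base string schema has no 'examples' key, the list is set directly.
    name: Annotated[str, Examples(['alice', 'bob'])]


schema = User.model_json_schema()
print(schema['properties']['name']['examples'])  # ['alice', 'bob']
```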
def __init__( self, examples: dict[str, Any] | list[Any], mode: Literal['validation', 'serialization'] | None = None ) -> None: if isinstance(examples, dict): warnings.warn( 'Using a dict for `examples` is deprecated, use a list instead.', PydanticDeprecatedSince29, stacklevel=2, ) self.examples = examples self.mode = mode def __get_pydantic_json_schema__( self, core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: mode = self.mode or handler.mode json_schema = handler(core_schema) if mode != handler.mode: return json_schema examples = json_schema.get('examples') if examples is None: json_schema['examples'] = to_jsonable_python(self.examples) if isinstance(examples, dict): if isinstance(self.examples, list): warnings.warn( 'Updating existing JSON Schema examples of type dict with examples of type list. ' 'Only the existing examples values will be retained. Note that dict support for ' 'examples is deprecated and will be removed in v3.0.', UserWarning, ) json_schema['examples'] = to_jsonable_python( [ex for value in examples.values() for ex in value] + self.examples ) else: json_schema['examples'] = to_jsonable_python({**examples, **self.examples}) if isinstance(examples, list): if isinstance(self.examples, list): json_schema['examples'] = to_jsonable_python(examples + self.examples) elif isinstance(self.examples, dict): warnings.warn( 'Updating existing JSON Schema examples of type list with examples of type dict. ' 'Only the examples values will be retained. 
Note that dict support for ' 'examples is deprecated and will be removed in v3.0.', UserWarning, ) json_schema['examples'] = to_jsonable_python( examples + [ex for value in self.examples.values() for ex in value] ) return json_schema def __hash__(self) -> int: return hash(type(self.mode)) def _get_all_json_refs(item: Any) -> set[JsonRef]: """Get all the definitions references from a JSON schema.""" refs: set[JsonRef] = set() stack = [item] while stack: current = stack.pop() if isinstance(current, dict): for key, value in current.items(): if key == 'examples' and isinstance(value, list): # Skip examples that may contain arbitrary values and references # (e.g. `{"examples": [{"$ref": "..."}]}`). Note: checking for value # of type list is necessary to avoid skipping valid portions of the schema, # for instance when "examples" is used as a property key. A more robust solution # could be found, but would require more advanced JSON Schema parsing logic. continue if key == '$ref' and isinstance(value, str): refs.add(JsonRef(value)) elif isinstance(value, dict): stack.append(value) elif isinstance(value, list): stack.extend(value) elif isinstance(current, list): stack.extend(current) return refs AnyType = TypeVar('AnyType') if TYPE_CHECKING: SkipJsonSchema = Annotated[AnyType, ...] else: @dataclasses.dataclass(**_internal_dataclass.slots_true) class SkipJsonSchema: """Usage docs: https://docs.pydantic.dev/2.10/concepts/json_schema/#skipjsonschema-annotation Add this as an annotation on a field to skip generating a JSON schema for that field. Example: ```python from typing import Union from pydantic import BaseModel from pydantic.json_schema import SkipJsonSchema from pprint import pprint class Model(BaseModel): a: Union[int, None] = None # (1)! b: Union[int, SkipJsonSchema[None]] = None # (2)! c: SkipJsonSchema[Union[int, None]] = None # (3)! 
pprint(Model.model_json_schema()) ''' { 'properties': { 'a': { 'anyOf': [ {'type': 'integer'}, {'type': 'null'} ], 'default': None, 'title': 'A' }, 'b': { 'default': None, 'title': 'B', 'type': 'integer' } }, 'title': 'Model', 'type': 'object' } ''' ``` 1. The integer and null types are both included in the schema for `a`. 2. The integer type is the only type included in the schema for `b`. 3. The entirety of the `c` field is omitted from the schema. """ def __class_getitem__(cls, item: AnyType) -> AnyType: return Annotated[item, cls()] def __get_pydantic_json_schema__( self, core_schema: CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: raise PydanticOmit def __hash__(self) -> int: return hash(type(self)) def _get_typed_dict_config(cls: type[Any] | None) -> ConfigDict: if cls is not None: try: return _decorators.get_attribute_from_bases(cls, '__pydantic_config__') except AttributeError: pass return {}

# pydantic-2.10.6/pydantic/main.py

"""Logic for creating models.""" from __future__ import annotations as _annotations import operator import sys import types import typing import warnings from copy import copy, deepcopy from functools import cached_property from typing import ( TYPE_CHECKING, Any, Callable, ClassVar, Dict, Generator, Literal, Mapping, Set, Tuple, TypeVar, Union, cast, overload, ) import pydantic_core import typing_extensions from pydantic_core import PydanticUndefined from typing_extensions import Self, TypeAlias, Unpack from ._internal import ( _config, _decorators, _fields, _forward_ref, _generics, _import_utils, _mock_val_ser, _model_construction, _namespace_utils, _repr, _typing_extra, _utils, ) from ._migration import getattr_migration from .aliases import AliasChoices, AliasPath from .annotated_handlers import GetCoreSchemaHandler, GetJsonSchemaHandler from .config import ConfigDict from .errors import PydanticUndefinedAnnotation, PydanticUserError from
.json_schema import DEFAULT_REF_TEMPLATE, GenerateJsonSchema, JsonSchemaMode, JsonSchemaValue, model_json_schema from .plugin._schema_validator import PluggableSchemaValidator from .warnings import PydanticDeprecatedSince20 if TYPE_CHECKING: from inspect import Signature from pathlib import Path from pydantic_core import CoreSchema, SchemaSerializer, SchemaValidator from ._internal._namespace_utils import MappingNamespace from ._internal._utils import AbstractSetIntStr, MappingIntStrAny from .deprecated.parse import Protocol as DeprecatedParseProtocol from .fields import ComputedFieldInfo, FieldInfo, ModelPrivateAttr else: # See PyCharm issues https://youtrack.jetbrains.com/issue/PY-21915 # and https://youtrack.jetbrains.com/issue/PY-51428 DeprecationWarning = PydanticDeprecatedSince20 __all__ = 'BaseModel', 'create_model' # Keep these type aliases available at runtime: TupleGenerator: TypeAlias = Generator[Tuple[str, Any], None, None] # NOTE: In reality, `bool` should be replaced by `Literal[True]` but mypy fails to correctly apply bidirectional # type inference (e.g. when using `{'a': {'b': True}}`): # NOTE: Keep this type alias in sync with the stub definition in `pydantic-core`: IncEx: TypeAlias = Union[Set[int], Set[str], Mapping[int, Union['IncEx', bool]], Mapping[str, Union['IncEx', bool]]] _object_setattr = _model_construction.object_setattr class BaseModel(metaclass=_model_construction.ModelMetaclass): """Usage docs: https://docs.pydantic.dev/2.10/concepts/models/ A base class for creating Pydantic models. Attributes: __class_vars__: The names of the class variables defined on the model. __private_attributes__: Metadata about the private attributes of the model. __signature__: The synthesized `__init__` [`Signature`][inspect.Signature] of the model. __pydantic_complete__: Whether model building is completed, or if there are still undefined fields. __pydantic_core_schema__: The core schema of the model. 
__pydantic_custom_init__: Whether the model has a custom `__init__` function. __pydantic_decorators__: Metadata containing the decorators defined on the model. This replaces `Model.__validators__` and `Model.__root_validators__` from Pydantic V1. __pydantic_generic_metadata__: Metadata for generic models; contains data used for a similar purpose to __args__, __origin__, __parameters__ in typing-module generics. May eventually be replaced by these. __pydantic_parent_namespace__: Parent namespace of the model, used for automatic rebuilding of models. __pydantic_post_init__: The name of the post-init method for the model, if defined. __pydantic_root_model__: Whether the model is a [`RootModel`][pydantic.root_model.RootModel]. __pydantic_serializer__: The `pydantic-core` `SchemaSerializer` used to dump instances of the model. __pydantic_validator__: The `pydantic-core` `SchemaValidator` used to validate instances of the model. __pydantic_fields__: A dictionary of field names and their corresponding [`FieldInfo`][pydantic.fields.FieldInfo] objects. __pydantic_computed_fields__: A dictionary of computed field names and their corresponding [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] objects. __pydantic_extra__: A dictionary containing extra values, if [`extra`][pydantic.config.ConfigDict.extra] is set to `'allow'`. __pydantic_fields_set__: The names of fields explicitly set during instantiation. __pydantic_private__: Values of private attributes set on the model instance. """ # Note: Many of the below class vars are defined in the metaclass, but we define them here for type checking purposes. model_config: ClassVar[ConfigDict] = ConfigDict() """ Configuration for the model, should be a dictionary conforming to [`ConfigDict`][pydantic.config.ConfigDict]. """ # Because `dict` is in the local namespace of the `BaseModel` class, we use `Dict` for annotations. # TODO v3 fallback to `dict` when the deprecated `dict` method gets removed. 
__class_vars__: ClassVar[set[str]] """The names of the class variables defined on the model.""" __private_attributes__: ClassVar[Dict[str, ModelPrivateAttr]] # noqa: UP006 """Metadata about the private attributes of the model.""" __signature__: ClassVar[Signature] """The synthesized `__init__` [`Signature`][inspect.Signature] of the model.""" __pydantic_complete__: ClassVar[bool] = False """Whether model building is completed, or if there are still undefined fields.""" __pydantic_core_schema__: ClassVar[CoreSchema] """The core schema of the model.""" __pydantic_custom_init__: ClassVar[bool] """Whether the model has a custom `__init__` method.""" # Must be set for `GenerateSchema.model_schema` to work for a plain `BaseModel` annotation. __pydantic_decorators__: ClassVar[_decorators.DecoratorInfos] = _decorators.DecoratorInfos() """Metadata containing the decorators defined on the model. This replaces `Model.__validators__` and `Model.__root_validators__` from Pydantic V1.""" __pydantic_generic_metadata__: ClassVar[_generics.PydanticGenericMetadata] """Metadata for generic models; contains data used for a similar purpose to __args__, __origin__, __parameters__ in typing-module generics. 
May eventually be replaced by these.""" __pydantic_parent_namespace__: ClassVar[Dict[str, Any] | None] = None # noqa: UP006 """Parent namespace of the model, used for automatic rebuilding of models.""" __pydantic_post_init__: ClassVar[None | Literal['model_post_init']] """The name of the post-init method for the model, if defined.""" __pydantic_root_model__: ClassVar[bool] = False """Whether the model is a [`RootModel`][pydantic.root_model.RootModel].""" __pydantic_serializer__: ClassVar[SchemaSerializer] """The `pydantic-core` `SchemaSerializer` used to dump instances of the model.""" __pydantic_validator__: ClassVar[SchemaValidator | PluggableSchemaValidator] """The `pydantic-core` `SchemaValidator` used to validate instances of the model.""" __pydantic_fields__: ClassVar[Dict[str, FieldInfo]] # noqa: UP006 """A dictionary of field names and their corresponding [`FieldInfo`][pydantic.fields.FieldInfo] objects. This replaces `Model.__fields__` from Pydantic V1. """ __pydantic_computed_fields__: ClassVar[Dict[str, ComputedFieldInfo]] # noqa: UP006 """A dictionary of computed field names and their corresponding [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] objects.""" __pydantic_extra__: dict[str, Any] | None = _model_construction.NoInitField(init=False) """A dictionary containing extra values, if [`extra`][pydantic.config.ConfigDict.extra] is set to `'allow'`.""" __pydantic_fields_set__: set[str] = _model_construction.NoInitField(init=False) """The names of fields explicitly set during instantiation.""" __pydantic_private__: dict[str, Any] | None = _model_construction.NoInitField(init=False) """Values of private attributes set on the model instance.""" if not TYPE_CHECKING: # Prevent `BaseModel` from being instantiated directly # (defined in an `if not TYPE_CHECKING` block for clarity and to avoid type checking errors): __pydantic_core_schema__ = _mock_val_ser.MockCoreSchema( 'Pydantic models should inherit from BaseModel, BaseModel cannot be 
instantiated directly', code='base-model-instantiated', ) __pydantic_validator__ = _mock_val_ser.MockValSer( 'Pydantic models should inherit from BaseModel, BaseModel cannot be instantiated directly', val_or_ser='validator', code='base-model-instantiated', ) __pydantic_serializer__ = _mock_val_ser.MockValSer( 'Pydantic models should inherit from BaseModel, BaseModel cannot be instantiated directly', val_or_ser='serializer', code='base-model-instantiated', ) __slots__ = '__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__' def __init__(self, /, **data: Any) -> None: """Create a new model by parsing and validating input data from keyword arguments. Raises [`ValidationError`][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model. `self` is explicitly positional-only to allow `self` as a field name. """ # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks __tracebackhide__ = True validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self) if self is not validated_self: warnings.warn( 'A custom validator is returning a value other than `self`.\n' "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n" 'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.', stacklevel=2, ) # The following line sets a flag that we use to determine when `__init__` gets overridden by the user __init__.__pydantic_base_init__ = True # pyright: ignore[reportFunctionMemberAccess] if TYPE_CHECKING: model_fields: ClassVar[dict[str, FieldInfo]] model_computed_fields: ClassVar[dict[str, ComputedFieldInfo]] else: # TODO: V3 - remove `model_fields` and `model_computed_fields` properties from the `BaseModel` class - they should only # be accessible on the model class, not on instances. 
We have these purely for backwards compatibility with Pydantic <v2.10 beta releases. @property def model_fields(self) -> dict[str, FieldInfo]: """Get metadata about the fields defined on the model. Deprecation warning: you should be getting this information from the model class, not from an instance. In V3, this property will be removed from the `BaseModel` class. Returns: A mapping of field names to [`FieldInfo`][pydantic.fields.FieldInfo] objects. """ # Must be set for `GenerateSchema.model_schema` to work for a plain `BaseModel` annotation, hence the default here. return getattr(self, '__pydantic_fields__', {}) @property def model_computed_fields(self) -> dict[str, ComputedFieldInfo]: """Get metadata about the computed fields defined on the model. Deprecation warning: you should be getting this information from the model class, not from an instance. In V3, this property will be removed from the `BaseModel` class. Returns: A mapping of computed field names to [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] objects. """ # Must be set for `GenerateSchema.model_schema` to work for a plain `BaseModel` annotation, hence the default here. return getattr(self, '__pydantic_computed_fields__', {}) @property def model_extra(self) -> dict[str, Any] | None: """Get extra fields set during validation. Returns: A dictionary of extra fields, or `None` if `config.extra` is not set to `"allow"`. """ return self.__pydantic_extra__ @property def model_fields_set(self) -> set[str]: """Returns the set of fields that have been explicitly set on this model instance. Returns: A set of strings representing the fields that have been set, i.e. that were not filled from defaults. """ return self.__pydantic_fields_set__ @classmethod def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self: # noqa: C901 """Creates a new instance of the `Model` class with validated data. Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data.
Default values are respected, but no other validation is performed. !!! note `model_construct()` generally respects the `model_config.extra` setting on the provided model. That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__` and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored. Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in an error if extra values are passed, but they will be ignored. Args: _fields_set: A set of field names that were originally explicitly set during instantiation. If provided, this is directly used for the [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute. Otherwise, the field names from the `values` argument will be used. values: Trusted or pre-validated data dictionary. Returns: A new instance of the `Model` class with validated data. 
""" m = cls.__new__(cls) fields_values: dict[str, Any] = {} fields_set = set() for name, field in cls.__pydantic_fields__.items(): if field.alias is not None and field.alias in values: fields_values[name] = values.pop(field.alias) fields_set.add(name) if (name not in fields_set) and (field.validation_alias is not None): validation_aliases: list[str | AliasPath] = ( field.validation_alias.choices if isinstance(field.validation_alias, AliasChoices) else [field.validation_alias] ) for alias in validation_aliases: if isinstance(alias, str) and alias in values: fields_values[name] = values.pop(alias) fields_set.add(name) break elif isinstance(alias, AliasPath): value = alias.search_dict_for_path(values) if value is not PydanticUndefined: fields_values[name] = value fields_set.add(name) break if name not in fields_set: if name in values: fields_values[name] = values.pop(name) fields_set.add(name) elif not field.is_required(): fields_values[name] = field.get_default(call_default_factory=True, validated_data=fields_values) if _fields_set is None: _fields_set = fields_set _extra: dict[str, Any] | None = values if cls.model_config.get('extra') == 'allow' else None _object_setattr(m, '__dict__', fields_values) _object_setattr(m, '__pydantic_fields_set__', _fields_set) if not cls.__pydantic_root_model__: _object_setattr(m, '__pydantic_extra__', _extra) if cls.__pydantic_post_init__: m.model_post_init(None) # update private attributes with values set if hasattr(m, '__pydantic_private__') and m.__pydantic_private__ is not None: for k, v in values.items(): if k in m.__private_attributes__: m.__pydantic_private__[k] = v elif not cls.__pydantic_root_model__: # Note: if there are any private attributes, cls.__pydantic_post_init__ would exist # Since it doesn't, that means that `__pydantic_private__` should be set to None _object_setattr(m, '__pydantic_private__', None) return m def model_copy(self, *, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self: """Usage 
docs: https://docs.pydantic.dev/2.10/concepts/serialization/#model_copy Returns a copy of the model. Args: update: Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data. deep: Set to `True` to make a deep copy of the model. Returns: New model instance. """ copied = self.__deepcopy__() if deep else self.__copy__() if update: if self.model_config.get('extra') == 'allow': for k, v in update.items(): if k in self.__pydantic_fields__: copied.__dict__[k] = v else: if copied.__pydantic_extra__ is None: copied.__pydantic_extra__ = {} copied.__pydantic_extra__[k] = v else: copied.__dict__.update(update) copied.__pydantic_fields_set__.update(update.keys()) return copied def model_dump( self, *, mode: Literal['json', 'python'] | str = 'python', include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, serialize_as_any: bool = False, ) -> dict[str, Any]: """Usage docs: https://docs.pydantic.dev/2.10/concepts/serialization/#modelmodel_dump Generate a dictionary representation of the model, optionally specifying which fields to include or exclude. Args: mode: The mode in which `to_python` should run. If mode is 'json', the output will only contain JSON serializable types. If mode is 'python', the output may contain non-JSON-serializable Python objects. include: A set of fields to include in the output. exclude: A set of fields to exclude from the output. context: Additional context to pass to the serializer. by_alias: Whether to use the field's alias in the dictionary key if defined. exclude_unset: Whether to exclude fields that have not been explicitly set. exclude_defaults: Whether to exclude fields that are set to their default value. 
exclude_none: Whether to exclude fields that have a value of `None`. round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T]. warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError]. serialize_as_any: Whether to serialize fields with duck-typing serialization behavior. Returns: A dictionary representation of the model. """ return self.__pydantic_serializer__.to_python( self, mode=mode, by_alias=by_alias, include=include, exclude=exclude, context=context, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, round_trip=round_trip, warnings=warnings, serialize_as_any=serialize_as_any, ) def model_dump_json( self, *, indent: int | None = None, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, serialize_as_any: bool = False, ) -> str: """Usage docs: https://docs.pydantic.dev/2.10/concepts/serialization/#modelmodel_dump_json Generates a JSON representation of the model using Pydantic's `to_json` method. Args: indent: Indentation to use in the JSON output. If None is passed, the output will be compact. include: Field(s) to include in the JSON output. exclude: Field(s) to exclude from the JSON output. context: Additional context to pass to the serializer. by_alias: Whether to serialize using field aliases. exclude_unset: Whether to exclude fields that have not been explicitly set. exclude_defaults: Whether to exclude fields that are set to their default value. exclude_none: Whether to exclude fields that have a value of `None`. round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T]. 
warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError]. serialize_as_any: Whether to serialize fields with duck-typing serialization behavior. Returns: A JSON string representation of the model. """ return self.__pydantic_serializer__.to_json( self, indent=indent, include=include, exclude=exclude, context=context, by_alias=by_alias, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, round_trip=round_trip, warnings=warnings, serialize_as_any=serialize_as_any, ).decode() @classmethod def model_json_schema( cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', ) -> dict[str, Any]: """Generates a JSON schema for a model class. Args: by_alias: Whether to use attribute aliases or not. ref_template: The reference template. schema_generator: To override the logic used to generate the JSON schema, as a subclass of `GenerateJsonSchema` with your desired modifications mode: The mode in which to generate the schema. Returns: The JSON schema for the given model class. """ return model_json_schema( cls, by_alias=by_alias, ref_template=ref_template, schema_generator=schema_generator, mode=mode ) @classmethod def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str: """Compute the class name for parametrizations of generic classes. This method can be overridden to achieve a custom naming scheme for generic BaseModels. Args: params: Tuple of types of the class. Given a generic class `Model` with 2 type variables and a concrete model `Model[str, int]`, the value `(str, int)` would be passed to `params`. Returns: String representing the new class where `params` are passed to `cls` as type variables. 
Raises: TypeError: Raised when trying to generate concrete names for non-generic models. """ if not issubclass(cls, typing.Generic): raise TypeError('Concrete names should only be generated for generic models.') # Any strings received should represent forward references, so we handle them specially below. # If we eventually move toward wrapping them in a ForwardRef in __class_getitem__ in the future, # we may be able to remove this special case. param_names = [param if isinstance(param, str) else _repr.display_as_type(param) for param in params] params_component = ', '.join(param_names) return f'{cls.__name__}[{params_component}]' def model_post_init(self, __context: Any) -> None: """Override this method to perform additional initialization after `__init__` and `model_construct`. This is useful if you want to do some validation that requires the entire model to be initialized. """ pass @classmethod def model_rebuild( cls, *, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None, ) -> bool | None: """Try to rebuild the pydantic-core schema for the model. This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails. Args: force: Whether to force the rebuilding of the model schema, defaults to `False`. raise_errors: Whether to raise errors, defaults to `True`. _parent_namespace_depth: The depth level of the parent namespace, defaults to 2. _types_namespace: The types namespace, defaults to `None`. Returns: Returns `None` if the schema is already "complete" and rebuilding was not required. If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`. 
""" if not force and cls.__pydantic_complete__: return None if '__pydantic_core_schema__' in cls.__dict__: delattr(cls, '__pydantic_core_schema__') # delete cached value to ensure full rebuild happens if _types_namespace is not None: rebuild_ns = _types_namespace elif _parent_namespace_depth > 0: rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {} else: rebuild_ns = {} parent_ns = _model_construction.unpack_lenient_weakvaluedict(cls.__pydantic_parent_namespace__) or {} ns_resolver = _namespace_utils.NsResolver( parent_namespace={**rebuild_ns, **parent_ns}, ) # manually override defer_build so complete_model_class doesn't skip building the model again config = {**cls.model_config, 'defer_build': False} return _model_construction.complete_model_class( cls, cls.__name__, _config.ConfigWrapper(config, check=False), raise_errors=raise_errors, ns_resolver=ns_resolver, ) @classmethod def model_validate( cls, obj: Any, *, strict: bool | None = None, from_attributes: bool | None = None, context: Any | None = None, ) -> Self: """Validate a pydantic model instance. Args: obj: The object to validate. strict: Whether to enforce types strictly. from_attributes: Whether to extract data from object attributes. context: Additional context to pass to the validator. Raises: ValidationError: If the object could not be validated. Returns: The validated model instance. """ # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks __tracebackhide__ = True return cls.__pydantic_validator__.validate_python( obj, strict=strict, from_attributes=from_attributes, context=context ) @classmethod def model_validate_json( cls, json_data: str | bytes | bytearray, *, strict: bool | None = None, context: Any | None = None, ) -> Self: """Usage docs: https://docs.pydantic.dev/2.10/concepts/json/#json-parsing Validate the given JSON data against the Pydantic model. Args: json_data: The JSON data to validate. 
            strict: Whether to enforce types strictly.
            context: Extra variables to pass to the validator.

        Returns:
            The validated Pydantic model.

        Raises:
            ValidationError: If `json_data` is not a JSON string or the object could not be validated.
        """
        # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
        __tracebackhide__ = True
        return cls.__pydantic_validator__.validate_json(json_data, strict=strict, context=context)

    @classmethod
    def model_validate_strings(
        cls,
        obj: Any,
        *,
        strict: bool | None = None,
        context: Any | None = None,
    ) -> Self:
        """Validate the given object with string data against the Pydantic model.

        Args:
            obj: The object containing string data to validate.
            strict: Whether to enforce types strictly.
            context: Extra variables to pass to the validator.

        Returns:
            The validated Pydantic model.
        """
        # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
        __tracebackhide__ = True
        return cls.__pydantic_validator__.validate_strings(obj, strict=strict, context=context)

    @classmethod
    def __get_pydantic_core_schema__(cls, source: type[BaseModel], handler: GetCoreSchemaHandler, /) -> CoreSchema:
        """Hook into generating the model's CoreSchema.

        Args:
            source: The class we are generating a schema for.
                This will generally be the same as the `cls` argument if this is a classmethod.
            handler: A callable that calls into Pydantic's internal CoreSchema generation logic.

        Returns:
            A `pydantic-core` `CoreSchema`.
        """
        # Only use the cached value from this _exact_ class; we don't want one from a parent class
        # This is why we check `cls.__dict__` and don't use `cls.__pydantic_core_schema__` or similar.
        schema = cls.__dict__.get('__pydantic_core_schema__')
        if schema is not None and not isinstance(schema, _mock_val_ser.MockCoreSchema):
            # Due to the way generic classes are built, it's possible that an invalid schema may be temporarily
            # set on generic classes.
            # I think we could resolve this to ensure that we get proper schema caching
            # for generics, but for simplicity for now, we just always rebuild if the class has a generic origin.
            if not cls.__pydantic_generic_metadata__['origin']:
                return cls.__pydantic_core_schema__

        return handler(source)

    @classmethod
    def __get_pydantic_json_schema__(
        cls,
        core_schema: CoreSchema,
        handler: GetJsonSchemaHandler,
        /,
    ) -> JsonSchemaValue:
        """Hook into generating the model's JSON schema.

        Args:
            core_schema: A `pydantic-core` CoreSchema.
                You can ignore this argument and call the handler with a new CoreSchema,
                wrap this CoreSchema (`{'type': 'nullable', 'schema': current_schema}`),
                or just call the handler with the original schema.
            handler: Call into Pydantic's internal JSON schema generation.
                This will raise a `pydantic.errors.PydanticInvalidForJsonSchema` if JSON schema
                generation fails.
                Since this gets called by `BaseModel.model_json_schema` you can override the
                `schema_generator` argument to that function to change JSON schema generation globally
                for a type.

        Returns:
            A JSON schema, as a Python object.
        """
        return handler(core_schema)

    @classmethod
    def __pydantic_init_subclass__(cls, **kwargs: Any) -> None:
        """This is intended to behave just like `__init_subclass__`, but is called by `ModelMetaclass`
        only after the class is actually fully initialized. In particular, attributes like `model_fields` will
        be present when this is called.

        This is necessary because `__init_subclass__` will always be called by `type.__new__`,
        and it would require a prohibitively large refactor to the `ModelMetaclass` to ensure that
        `type.__new__` was called in such a manner that the class would already be sufficiently initialized.

        This will receive the same `kwargs` that would be passed to the standard `__init_subclass__`, namely,
        any kwargs passed to the class definition that aren't used internally by pydantic.
        Args:
            **kwargs: Any keyword arguments passed to the class definition that aren't used internally
                by pydantic.
        """
        pass

    def __class_getitem__(
        cls, typevar_values: type[Any] | tuple[type[Any], ...]
    ) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef:
        cached = _generics.get_cached_generic_type_early(cls, typevar_values)
        if cached is not None:
            return cached

        if cls is BaseModel:
            raise TypeError('Type parameters should be placed on typing.Generic, not BaseModel')
        if not hasattr(cls, '__parameters__'):
            raise TypeError(f'{cls} cannot be parametrized because it does not inherit from typing.Generic')
        if not cls.__pydantic_generic_metadata__['parameters'] and typing.Generic not in cls.__bases__:
            raise TypeError(f'{cls} is not a generic class')

        if not isinstance(typevar_values, tuple):
            typevar_values = (typevar_values,)
        _generics.check_parameters_count(cls, typevar_values)

        # Build map from generic typevars to passed params
        typevars_map: dict[TypeVar, type[Any]] = dict(
            zip(cls.__pydantic_generic_metadata__['parameters'], typevar_values)
        )

        if _utils.all_identical(typevars_map.keys(), typevars_map.values()) and typevars_map:
            submodel = cls  # if arguments are equal to parameters it's the same object
            _generics.set_cached_generic_type(cls, typevar_values, submodel)
        else:
            parent_args = cls.__pydantic_generic_metadata__['args']
            if not parent_args:
                args = typevar_values
            else:
                args = tuple(_generics.replace_types(arg, typevars_map) for arg in parent_args)

            origin = cls.__pydantic_generic_metadata__['origin'] or cls
            model_name = origin.model_parametrized_name(args)
            params = tuple(
                {param: None for param in _generics.iter_contained_typevars(typevars_map.values())}
            )  # use dict as ordered set
            with _generics.generic_recursion_self_type(origin, args) as maybe_self_type:
                cached = _generics.get_cached_generic_type_late(cls, typevar_values, origin, args)
                if cached is not None:
                    return cached
                if maybe_self_type is not None:
                    return maybe_self_type

                # Attempt to rebuild the origin in case new
                # types have been defined
                try:
                    # depth 2 gets you above this __class_getitem__ call.
                    # Note that we explicitly provide the parent ns, otherwise
                    # `model_rebuild` will use the parent ns no matter if it is the ns of a module.
                    # We don't want this here, as this has unexpected effects when a model
                    # is being parametrized during a forward annotation evaluation.
                    parent_ns = _typing_extra.parent_frame_namespace(parent_depth=2) or {}
                    origin.model_rebuild(_types_namespace=parent_ns)
                except PydanticUndefinedAnnotation:
                    # It's okay if it fails, it just means there are still undefined types
                    # that could be evaluated later.
                    pass

                submodel = _generics.create_generic_submodel(model_name, origin, args, params)

                # Update cache
                _generics.set_cached_generic_type(cls, typevar_values, submodel, origin, args)

        return submodel

    def __copy__(self) -> Self:
        """Returns a shallow copy of the model."""
        cls = type(self)
        m = cls.__new__(cls)
        _object_setattr(m, '__dict__', copy(self.__dict__))
        _object_setattr(m, '__pydantic_extra__', copy(self.__pydantic_extra__))
        _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

        if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
            _object_setattr(m, '__pydantic_private__', None)
        else:
            _object_setattr(
                m,
                '__pydantic_private__',
                {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined},
            )

        return m

    def __deepcopy__(self, memo: dict[int, Any] | None = None) -> Self:
        """Returns a deep copy of the model."""
        cls = type(self)
        m = cls.__new__(cls)
        _object_setattr(m, '__dict__', deepcopy(self.__dict__, memo=memo))
        _object_setattr(m, '__pydantic_extra__', deepcopy(self.__pydantic_extra__, memo=memo))
        # This next line doesn't need a deepcopy because __pydantic_fields_set__ is a set[str],
        # and attempting a deepcopy would be marginally slower.
        _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__))

        if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None:
            _object_setattr(m, '__pydantic_private__', None)
        else:
            _object_setattr(
                m,
                '__pydantic_private__',
                deepcopy({k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}, memo=memo),
            )

        return m

    if not TYPE_CHECKING:
        # We put `__getattr__` in a non-TYPE_CHECKING block because otherwise, mypy allows arbitrary attribute access
        # The same goes for __setattr__ and __delattr__, see: https://github.com/pydantic/pydantic/issues/8643

        def __getattr__(self, item: str) -> Any:
            private_attributes = object.__getattribute__(self, '__private_attributes__')
            if item in private_attributes:
                attribute = private_attributes[item]
                if hasattr(attribute, '__get__'):
                    return attribute.__get__(self, type(self))  # type: ignore

                try:
                    # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
                    return self.__pydantic_private__[item]  # type: ignore
                except KeyError as exc:
                    raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc
            else:
                # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized.
                # See `BaseModel.__repr_args__` for more details
                try:
                    pydantic_extra = object.__getattribute__(self, '__pydantic_extra__')
                except AttributeError:
                    pydantic_extra = None

                if pydantic_extra:
                    try:
                        return pydantic_extra[item]
                    except KeyError as exc:
                        raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc
                else:
                    if hasattr(self.__class__, item):
                        return super().__getattribute__(item)  # Raises AttributeError if appropriate
                    else:
                        # this is the current error
                        raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

        def __setattr__(self, name: str, value: Any) -> None:
            if name in self.__class_vars__:
                raise AttributeError(
                    f'{name!r} is a ClassVar of `{self.__class__.__name__}` and cannot be set on an instance. '
                    f'If you want to set a value on the class, use `{self.__class__.__name__}.{name} = value`.'
                )
            elif not _fields.is_valid_field_name(name):
                if self.__pydantic_private__ is None or name not in self.__private_attributes__:
                    _object_setattr(self, name, value)
                else:
                    attribute = self.__private_attributes__[name]
                    if hasattr(attribute, '__set__'):
                        attribute.__set__(self, value)  # type: ignore
                    else:
                        self.__pydantic_private__[name] = value
                return

            self._check_frozen(name, value)

            attr = getattr(self.__class__, name, None)
            # NOTE: We currently special case properties and `cached_property`, but we might need
            # to generalize this to all data/non-data descriptors at some point. For non-data descriptors
            # (such as `cached_property`), it isn't obvious though. `cached_property` caches the value
            # to the instance's `__dict__`, but other non-data descriptors might do things differently.
            if isinstance(attr, property):
                attr.__set__(self, value)
            elif isinstance(attr, cached_property):
                self.__dict__[name] = value
            elif self.model_config.get('validate_assignment', None):
                self.__pydantic_validator__.validate_assignment(self, name, value)
            elif self.model_config.get('extra') != 'allow' and name not in self.__pydantic_fields__:
                # TODO - matching error
                raise ValueError(f'"{self.__class__.__name__}" object has no field "{name}"')
            elif self.model_config.get('extra') == 'allow' and name not in self.__pydantic_fields__:
                if self.model_extra and name in self.model_extra:
                    self.__pydantic_extra__[name] = value  # type: ignore
                else:
                    try:
                        getattr(self, name)
                    except AttributeError:
                        # attribute does not already exist on instance, so put it in extra
                        self.__pydantic_extra__[name] = value  # type: ignore
                    else:
                        # attribute _does_ already exist on instance, and was not in extra, so update it
                        _object_setattr(self, name, value)
            else:
                self.__dict__[name] = value
                self.__pydantic_fields_set__.add(name)

        def __delattr__(self, item: str) -> Any:
            if item in self.__private_attributes__:
                attribute = self.__private_attributes__[item]
                if hasattr(attribute, '__delete__'):
                    attribute.__delete__(self)  # type: ignore
                    return

                try:
                    # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items
                    del self.__pydantic_private__[item]  # type: ignore
                    return
                except KeyError as exc:
                    raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc

            self._check_frozen(item, None)

            if item in self.__pydantic_fields__:
                object.__delattr__(self, item)
            elif self.__pydantic_extra__ is not None and item in self.__pydantic_extra__:
                del self.__pydantic_extra__[item]
            else:
                try:
                    object.__delattr__(self, item)
                except AttributeError:
                    raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

        # Because we make use of `@dataclass_transform()`, `__replace__` is already synthesized by
        # type checkers, so we define the implementation in this `if not
        # TYPE_CHECKING:` block:

        def __replace__(self, **changes: Any) -> Self:
            return self.model_copy(update=changes)

    def _check_frozen(self, name: str, value: Any) -> None:
        if self.model_config.get('frozen', None):
            typ = 'frozen_instance'
        elif getattr(self.__pydantic_fields__.get(name), 'frozen', False):
            typ = 'frozen_field'
        else:
            return
        error: pydantic_core.InitErrorDetails = {
            'type': typ,
            'loc': (name,),
            'input': value,
        }
        raise pydantic_core.ValidationError.from_exception_data(self.__class__.__name__, [error])

    def __getstate__(self) -> dict[Any, Any]:
        private = self.__pydantic_private__
        if private:
            private = {k: v for k, v in private.items() if v is not PydanticUndefined}
        return {
            '__dict__': self.__dict__,
            '__pydantic_extra__': self.__pydantic_extra__,
            '__pydantic_fields_set__': self.__pydantic_fields_set__,
            '__pydantic_private__': private,
        }

    def __setstate__(self, state: dict[Any, Any]) -> None:
        _object_setattr(self, '__pydantic_fields_set__', state.get('__pydantic_fields_set__', {}))
        _object_setattr(self, '__pydantic_extra__', state.get('__pydantic_extra__', {}))
        _object_setattr(self, '__pydantic_private__', state.get('__pydantic_private__', {}))
        _object_setattr(self, '__dict__', state.get('__dict__', {}))

    if not TYPE_CHECKING:

        def __eq__(self, other: Any) -> bool:
            if isinstance(other, BaseModel):
                # When comparing instances of generic types for equality, as long as all field values are equal,
                # only require their generic origin types to be equal, rather than exact type equality.
                # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1).
                self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__
                other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__

                # Perform common checks first
                if not (
                    self_type == other_type
                    and getattr(self, '__pydantic_private__', None) == getattr(other, '__pydantic_private__', None)
                    and self.__pydantic_extra__ == other.__pydantic_extra__
                ):
                    return False

                # We only want to compare pydantic fields but ignoring fields is costly.
                # We'll perform a fast check first, and fallback only when needed
                # See GH-7444 and GH-7825 for rationale and a performance benchmark

                # First, do the fast (and sometimes faulty) __dict__ comparison
                if self.__dict__ == other.__dict__:
                    # If the check above passes, then pydantic fields are equal, we can return early
                    return True

                # We don't want to trigger unnecessary costly filtering of __dict__ on all unequal objects, so we return
                # early if there are no keys to ignore (we would just return False later on anyway)
                model_fields = type(self).__pydantic_fields__.keys()
                if self.__dict__.keys() <= model_fields and other.__dict__.keys() <= model_fields:
                    return False

                # If we reach here, there are non-pydantic-fields keys, mapped to unequal values, that we need to ignore
                # Resort to costly filtering of the __dict__ objects
                # We use operator.itemgetter because it is much faster than dict comprehensions
                # NOTE: Contrary to standard python class and instances, when the Model class has a default value for an
                # attribute and the model instance doesn't have a corresponding attribute, accessing the missing
                # attribute raises an error in BaseModel.__getattr__ instead of returning the class attribute
                # So we can use operator.itemgetter() instead of operator.attrgetter()
                getter = operator.itemgetter(*model_fields) if model_fields else lambda _: _utils._SENTINEL
                try:
                    return getter(self.__dict__) == getter(other.__dict__)
                except KeyError:
                    # In rare cases (such as when using the deprecated BaseModel.copy() method),
                    # the __dict__ may not
                    # contain all model fields, which is how we can get here.
                    # getter(self.__dict__) is much faster than any 'safe' method that accounts
                    # for missing keys, and wrapping it in a `try` doesn't slow things down much
                    # in the common case.
                    self_fields_proxy = _utils.SafeGetItemProxy(self.__dict__)
                    other_fields_proxy = _utils.SafeGetItemProxy(other.__dict__)
                    return getter(self_fields_proxy) == getter(other_fields_proxy)

            # other instance is not a BaseModel
            else:
                return NotImplemented  # delegate to the other item in the comparison

    if TYPE_CHECKING:
        # We put `__init_subclass__` in a TYPE_CHECKING block because, even though we want the type-checking benefits
        # described in the signature of `__init_subclass__` below, we don't want to modify the default behavior of
        # subclass initialization.

        def __init_subclass__(cls, **kwargs: Unpack[ConfigDict]):
            """This signature is included purely to help type-checkers check arguments to class declaration, which
            provides a way to conveniently set model_config key/value pairs.

            ```python
            from pydantic import BaseModel

            class MyModel(BaseModel, extra='allow'): ...
            ```

            However, this may be deceiving, since the _actual_ calls to `__init_subclass__` will not receive any
            of the config arguments, and will only receive any keyword arguments passed during class initialization
            that are _not_ expected keys in ConfigDict. (This is due to the way `ModelMetaclass.__new__` works.)

            Args:
                **kwargs: Keyword arguments passed to the class definition, which set model_config

            Note:
                You may want to override `__pydantic_init_subclass__` instead, which behaves similarly but is called
                *after* the class is fully initialized.
""" def __iter__(self) -> TupleGenerator: """So `dict(model)` works.""" yield from [(k, v) for (k, v) in self.__dict__.items() if not k.startswith('_')] extra = self.__pydantic_extra__ if extra: yield from extra.items() def __repr__(self) -> str: return f'{self.__repr_name__()}({self.__repr_str__(", ")})' def __repr_args__(self) -> _repr.ReprArgs: for k, v in self.__dict__.items(): field = self.__pydantic_fields__.get(k) if field and field.repr: if v is not self: yield k, v else: yield k, self.__repr_recursion__(v) # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized. # This can happen if a `ValidationError` is raised during initialization and the instance's # repr is generated as part of the exception handling. Therefore, we use `getattr` here # with a fallback, even though the type hints indicate the attribute will always be present. try: pydantic_extra = object.__getattribute__(self, '__pydantic_extra__') except AttributeError: pydantic_extra = None if pydantic_extra is not None: yield from ((k, v) for k, v in pydantic_extra.items()) yield from ((k, getattr(self, k)) for k, v in self.__pydantic_computed_fields__.items() if v.repr) # take logic from `_repr.Representation` without the side effects of inheritance, see #5740 __repr_name__ = _repr.Representation.__repr_name__ __repr_recursion__ = _repr.Representation.__repr_recursion__ __repr_str__ = _repr.Representation.__repr_str__ __pretty__ = _repr.Representation.__pretty__ __rich_repr__ = _repr.Representation.__rich_repr__ def __str__(self) -> str: return self.__repr_str__(' ') # ##### Deprecated methods from v1 ##### @property @typing_extensions.deprecated( 'The `__fields__` attribute is deprecated, use `model_fields` instead.', category=None ) def __fields__(self) -> dict[str, FieldInfo]: warnings.warn( 'The `__fields__` attribute is deprecated, use `model_fields` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) return self.model_fields @property 
    @typing_extensions.deprecated(
        'The `__fields_set__` attribute is deprecated, use `model_fields_set` instead.',
        category=None,
    )
    def __fields_set__(self) -> set[str]:
        warnings.warn(
            'The `__fields_set__` attribute is deprecated, use `model_fields_set` instead.',
            category=PydanticDeprecatedSince20,
            stacklevel=2,
        )
        return self.__pydantic_fields_set__

    @typing_extensions.deprecated('The `dict` method is deprecated; use `model_dump` instead.', category=None)
    def dict(  # noqa: D102
        self,
        *,
        include: IncEx | None = None,
        exclude: IncEx | None = None,
        by_alias: bool = False,
        exclude_unset: bool = False,
        exclude_defaults: bool = False,
        exclude_none: bool = False,
    ) -> Dict[str, Any]:  # noqa UP006
        warnings.warn(
            'The `dict` method is deprecated; use `model_dump` instead.',
            category=PydanticDeprecatedSince20,
            stacklevel=2,
        )
        return self.model_dump(
            include=include,
            exclude=exclude,
            by_alias=by_alias,
            exclude_unset=exclude_unset,
            exclude_defaults=exclude_defaults,
            exclude_none=exclude_none,
        )

    @typing_extensions.deprecated('The `json` method is deprecated; use `model_dump_json` instead.', category=None)
    def json(  # noqa: D102
        self,
        *,
        include: IncEx | None = None,
        exclude: IncEx | None = None,
        by_alias: bool = False,
        exclude_unset: bool = False,
        exclude_defaults: bool = False,
        exclude_none: bool = False,
        encoder: Callable[[Any], Any] | None = PydanticUndefined,  # type: ignore[assignment]
        models_as_dict: bool = PydanticUndefined,  # type: ignore[assignment]
        **dumps_kwargs: Any,
    ) -> str:
        warnings.warn(
            'The `json` method is deprecated; use `model_dump_json` instead.',
            category=PydanticDeprecatedSince20,
            stacklevel=2,
        )
        if encoder is not PydanticUndefined:
            raise TypeError('The `encoder` argument is no longer supported; use field serializers instead.')
        if models_as_dict is not PydanticUndefined:
            raise TypeError('The `models_as_dict` argument is no longer supported; use a model serializer instead.')
        if dumps_kwargs:
            raise TypeError('`dumps_kwargs` keyword arguments are no longer supported.')
        return self.model_dump_json(
            include=include,
            exclude=exclude,
            by_alias=by_alias,
            exclude_unset=exclude_unset,
            exclude_defaults=exclude_defaults,
            exclude_none=exclude_none,
        )

    @classmethod
    @typing_extensions.deprecated('The `parse_obj` method is deprecated; use `model_validate` instead.', category=None)
    def parse_obj(cls, obj: Any) -> Self:  # noqa: D102
        warnings.warn(
            'The `parse_obj` method is deprecated; use `model_validate` instead.',
            category=PydanticDeprecatedSince20,
            stacklevel=2,
        )
        return cls.model_validate(obj)

    @classmethod
    @typing_extensions.deprecated(
        'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
        'otherwise load the data then use `model_validate` instead.',
        category=None,
    )
    def parse_raw(  # noqa: D102
        cls,
        b: str | bytes,
        *,
        content_type: str | None = None,
        encoding: str = 'utf8',
        proto: DeprecatedParseProtocol | None = None,
        allow_pickle: bool = False,
    ) -> Self:  # pragma: no cover
        warnings.warn(
            'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, '
            'otherwise load the data then use `model_validate` instead.',
            category=PydanticDeprecatedSince20,
            stacklevel=2,
        )
        from .deprecated import parse

        try:
            obj = parse.load_str_bytes(
                b,
                proto=proto,
                content_type=content_type,
                encoding=encoding,
                allow_pickle=allow_pickle,
            )
        except (ValueError, TypeError) as exc:
            import json

            # try to match V1
            if isinstance(exc, UnicodeDecodeError):
                type_str = 'value_error.unicodedecode'
            elif isinstance(exc, json.JSONDecodeError):
                type_str = 'value_error.jsondecode'
            elif isinstance(exc, ValueError):
                type_str = 'value_error'
            else:
                type_str = 'type_error'

            # ctx is missing here, but since we've added `input` to the error, we're not pretending it's the same
            error: pydantic_core.InitErrorDetails = {
                # The type: ignore on the next line is to ignore the requirement of LiteralString
                'type': pydantic_core.PydanticCustomError(type_str, str(exc)),  # type: ignore
                'loc': ('__root__',),
                'input': b,
            }
            raise pydantic_core.ValidationError.from_exception_data(cls.__name__, [error])
        return cls.model_validate(obj)

    @classmethod
    @typing_extensions.deprecated(
        'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
        'use `model_validate_json`, otherwise `model_validate` instead.',
        category=None,
    )
    def parse_file(  # noqa: D102
        cls,
        path: str | Path,
        *,
        content_type: str | None = None,
        encoding: str = 'utf8',
        proto: DeprecatedParseProtocol | None = None,
        allow_pickle: bool = False,
    ) -> Self:
        warnings.warn(
            'The `parse_file` method is deprecated; load the data from file, then if your data is JSON '
            'use `model_validate_json`, otherwise `model_validate` instead.',
            category=PydanticDeprecatedSince20,
            stacklevel=2,
        )
        from .deprecated import parse

        obj = parse.load_file(
            path,
            proto=proto,
            content_type=content_type,
            encoding=encoding,
            allow_pickle=allow_pickle,
        )
        return cls.parse_obj(obj)

    @classmethod
    @typing_extensions.deprecated(
        'The `from_orm` method is deprecated; set '
        "`model_config['from_attributes']=True` and use `model_validate` instead.",
        category=None,
    )
    def from_orm(cls, obj: Any) -> Self:  # noqa: D102
        warnings.warn(
            'The `from_orm` method is deprecated; set '
            "`model_config['from_attributes']=True` and use `model_validate` instead.",
            category=PydanticDeprecatedSince20,
            stacklevel=2,
        )
        if not cls.model_config.get('from_attributes', None):
            raise PydanticUserError(
                'You must set the config attribute `from_attributes=True` to use from_orm', code=None
            )
        return cls.model_validate(obj)

    @classmethod
    @typing_extensions.deprecated('The `construct` method is deprecated; use `model_construct` instead.', category=None)
    def construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self:  # noqa: D102
        warnings.warn(
            'The `construct` method is deprecated; use `model_construct` instead.',
            category=PydanticDeprecatedSince20,
            stacklevel=2,
        )
        return cls.model_construct(_fields_set=_fields_set, **values)

    @typing_extensions.deprecated(
        'The `copy` method is deprecated; use `model_copy` instead. '
        'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
        category=None,
    )
    def copy(
        self,
        *,
        include: AbstractSetIntStr | MappingIntStrAny | None = None,
        exclude: AbstractSetIntStr | MappingIntStrAny | None = None,
        update: Dict[str, Any] | None = None,  # noqa UP006
        deep: bool = False,
    ) -> Self:  # pragma: no cover
        """Returns a copy of the model.

        !!! warning "Deprecated"
            This method is now deprecated; use `model_copy` instead.

        If you need `include` or `exclude`, use:

        ```python {test="skip" lint="skip"}
        data = self.model_dump(include=include, exclude=exclude, round_trip=True)
        data = {**data, **(update or {})}
        copied = self.model_validate(data)
        ```

        Args:
            include: Optional set or mapping specifying which fields to include in the copied model.
            exclude: Optional set or mapping specifying which fields to exclude in the copied model.
            update: Optional dictionary of field-value pairs to override field values in the copied model.
            deep: If True, the values of fields that are Pydantic models will be deep-copied.

        Returns:
            A copy of the model with included, excluded and updated fields as specified.
        """
        warnings.warn(
            'The `copy` method is deprecated; use `model_copy` instead. '
            'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.',
            category=PydanticDeprecatedSince20,
            stacklevel=2,
        )
        from .deprecated import copy_internals

        values = dict(
            copy_internals._iter(
                self, to_dict=False, by_alias=False, include=include, exclude=exclude, exclude_unset=False
            ),
            **(update or {}),
        )
        if self.__pydantic_private__ is None:
            private = None
        else:
            private = {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}

        if self.__pydantic_extra__ is None:
            extra: dict[str, Any] | None = None
        else:
            extra = self.__pydantic_extra__.copy()
            for k in list(self.__pydantic_extra__):
                if k not in values:  # k was in the exclude
                    extra.pop(k)
            for k in list(values):
                if k in self.__pydantic_extra__:  # k must have come from extra
                    extra[k] = values.pop(k)

        # new `__pydantic_fields_set__` can have unset optional fields with a set value in `update` kwarg
        if update:
            fields_set = self.__pydantic_fields_set__ | update.keys()
        else:
            fields_set = set(self.__pydantic_fields_set__)

        # removing excluded fields from `__pydantic_fields_set__`
        if exclude:
            fields_set -= set(exclude)

        return copy_internals._copy_and_set_values(self, values, fields_set, extra, private, deep=deep)

    @classmethod
    @typing_extensions.deprecated('The `schema` method is deprecated; use `model_json_schema` instead.', category=None)
    def schema(  # noqa: D102
        cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE
    ) -> Dict[str, Any]:  # noqa UP006
        warnings.warn(
            'The `schema` method is deprecated; use `model_json_schema` instead.',
            category=PydanticDeprecatedSince20,
            stacklevel=2,
        )
        return cls.model_json_schema(by_alias=by_alias, ref_template=ref_template)

    @classmethod
    @typing_extensions.deprecated(
        'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
        category=None,
    )
    def schema_json(  # noqa: D102
        cls, *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any
    ) -> str:  # pragma: no cover
        warnings.warn(
            'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.',
            category=PydanticDeprecatedSince20,
            stacklevel=2,
        )
        import json

        from .deprecated.json import pydantic_encoder

        return json.dumps(
            cls.model_json_schema(by_alias=by_alias, ref_template=ref_template),
            default=pydantic_encoder,
            **dumps_kwargs,
        )

    @classmethod
    @typing_extensions.deprecated('The `validate` method is deprecated; use `model_validate` instead.', category=None)
    def validate(cls, value: Any) -> Self:  # noqa: D102
        warnings.warn(
            'The `validate` method is deprecated; use `model_validate` instead.',
            category=PydanticDeprecatedSince20,
            stacklevel=2,
        )
        return cls.model_validate(value)

    @classmethod
    @typing_extensions.deprecated(
        'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
        category=None,
    )
    def update_forward_refs(cls, **localns: Any) -> None:  # noqa: D102
        warnings.warn(
            'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.',
            category=PydanticDeprecatedSince20,
            stacklevel=2,
        )
        if localns:  # pragma: no cover
            raise TypeError('`localns` arguments are no longer accepted.')
        cls.model_rebuild(force=True)

    @typing_extensions.deprecated(
        'The private method `_iter` will be removed and should no longer be used.', category=None
    )
    def _iter(self, *args: Any, **kwargs: Any) -> Any:
        warnings.warn(
            'The private method `_iter` will be removed and should no longer be used.',
            category=PydanticDeprecatedSince20,
            stacklevel=2,
        )
        from .deprecated import copy_internals

        return copy_internals._iter(self, *args, **kwargs)

    @typing_extensions.deprecated(
        'The private method `_copy_and_set_values` will be removed and should no longer be used.',
        category=None,
    )
    def _copy_and_set_values(self, *args: Any, **kwargs: Any) -> Any:
        warnings.warn(
            'The private method `_copy_and_set_values` will be removed and should no longer be used.',
            category=PydanticDeprecatedSince20,
            stacklevel=2,
        )
        from .deprecated import copy_internals

        return copy_internals._copy_and_set_values(self, *args, **kwargs)

    @classmethod
    @typing_extensions.deprecated(
        'The private method `_get_value` will be removed and should no longer be used.',
        category=None,
    )
    def _get_value(cls, *args: Any, **kwargs: Any) -> Any:
        warnings.warn(
            'The private method `_get_value` will be removed and should no longer be used.',
            category=PydanticDeprecatedSince20,
            stacklevel=2,
        )
        from .deprecated import copy_internals

        return copy_internals._get_value(cls, *args, **kwargs)

    @typing_extensions.deprecated(
        'The private method `_calculate_keys` will be removed and should no longer be used.',
        category=None,
    )
    def _calculate_keys(self, *args: Any, **kwargs: Any) -> Any:
        warnings.warn(
            'The private method `_calculate_keys` will be removed and should no longer be used.',
            category=PydanticDeprecatedSince20,
            stacklevel=2,
        )
        from .deprecated import copy_internals

        return copy_internals._calculate_keys(self, *args, **kwargs)


ModelT = TypeVar('ModelT', bound=BaseModel)


@overload
def create_model(
    model_name: str,
    /,
    *,
    __config__: ConfigDict | None = None,
    __doc__: str | None = None,
    __base__: None = None,
    __module__: str = __name__,
    __validators__: dict[str, Callable[..., Any]] | None = None,
    __cls_kwargs__: dict[str, Any] | None = None,
    **field_definitions: Any,
) -> type[BaseModel]: ...


@overload
def create_model(
    model_name: str,
    /,
    *,
    __config__: ConfigDict | None = None,
    __doc__: str | None = None,
    __base__: type[ModelT] | tuple[type[ModelT], ...],
    __module__: str = __name__,
    __validators__: dict[str, Callable[..., Any]] | None = None,
    __cls_kwargs__: dict[str, Any] | None = None,
    **field_definitions: Any,
) -> type[ModelT]: ...


def create_model(  # noqa: C901
    model_name: str,
    /,
    *,
    __config__: ConfigDict | None = None,
    __doc__: str | None = None,
    __base__: type[ModelT] | tuple[type[ModelT], ...]
| None = None, __module__: str | None = None, __validators__: dict[str, Callable[..., Any]] | None = None, __cls_kwargs__: dict[str, Any] | None = None, __slots__: tuple[str, ...] | None = None, **field_definitions: Any, ) -> type[ModelT]: """Usage docs: https://docs.pydantic.dev/2.10/concepts/models/#dynamic-model-creation Dynamically creates and returns a new Pydantic model, in other words, `create_model` dynamically creates a subclass of [`BaseModel`][pydantic.BaseModel]. Args: model_name: The name of the newly created model. __config__: The configuration of the new model. __doc__: The docstring of the new model. __base__: The base class or classes for the new model. __module__: The name of the module that the model belongs to; if `None`, the value is taken from `sys._getframe(1)` __validators__: A dictionary of methods that validate fields. The keys are the names of the validation methods to be added to the model, and the values are the validation methods themselves. You can read more about functional validators [here](https://docs.pydantic.dev/2.9/concepts/validators/#field-validators). __cls_kwargs__: A dictionary of keyword arguments for class creation, such as `metaclass`. __slots__: Deprecated. Should not be passed to `create_model`. **field_definitions: Attributes of the new model. They should be passed in the format: `=(, )`, `=(, )`, or `typing.Annotated[, ]`. Any additional metadata in `typing.Annotated[, , ...]` will be ignored. Note, `FieldInfo` instances should be created via `pydantic.Field(...)`. Initializing `FieldInfo` instances directly is not supported. Returns: The new [model][pydantic.BaseModel]. Raises: PydanticUserError: If `__base__` and `__config__` are both passed. 
""" if __slots__ is not None: # __slots__ will be ignored from here on warnings.warn('__slots__ should not be passed to create_model', RuntimeWarning) if __base__ is not None: if __config__ is not None: raise PydanticUserError( 'to avoid confusion `__config__` and `__base__` cannot be used together', code='create-model-config-base', ) if not isinstance(__base__, tuple): __base__ = (__base__,) else: __base__ = (cast('type[ModelT]', BaseModel),) __cls_kwargs__ = __cls_kwargs__ or {} fields = {} annotations = {} for f_name, f_def in field_definitions.items(): if not _fields.is_valid_field_name(f_name): warnings.warn(f'fields may not start with an underscore, ignoring "{f_name}"', RuntimeWarning) if isinstance(f_def, tuple): f_def = cast('tuple[str, Any]', f_def) try: f_annotation, f_value = f_def except ValueError as e: raise PydanticUserError( 'Field definitions should be a `(, )`.', code='create-model-field-definitions', ) from e elif _typing_extra.is_annotated(f_def): (f_annotation, f_value, *_) = typing_extensions.get_args( f_def ) # first two input are expected from Annotated, refer to https://docs.python.org/3/library/typing.html#typing.Annotated FieldInfo = _import_utils.import_cached_field_info() if not isinstance(f_value, FieldInfo): raise PydanticUserError( 'Field definitions should be a Annotated[, ]', code='create-model-field-definitions', ) else: f_annotation, f_value = None, f_def if f_annotation: annotations[f_name] = f_annotation fields[f_name] = f_value if __module__ is None: f = sys._getframe(1) __module__ = f.f_globals['__name__'] namespace: dict[str, Any] = {'__annotations__': annotations, '__module__': __module__} if __doc__: namespace.update({'__doc__': __doc__}) if __validators__: namespace.update(__validators__) namespace.update(fields) if __config__: namespace['model_config'] = _config.ConfigWrapper(__config__).config_dict resolved_bases = types.resolve_bases(__base__) meta, ns, kwds = types.prepare_class(model_name, resolved_bases, 
kwds=__cls_kwargs__) if resolved_bases is not __base__: ns['__orig_bases__'] = __base__ namespace.update(ns) return meta( model_name, resolved_bases, namespace, __pydantic_reset_parent_namespace__=False, _create_model_module=__module__, **kwds, ) __getattr__ = getattr_migration(__name__) pydantic-2.10.6/pydantic/mypy.py000066400000000000000000001560561474456633400166330ustar00rootroot00000000000000"""This module includes classes and functions designed specifically for use with the mypy plugin.""" from __future__ import annotations import sys from configparser import ConfigParser from typing import Any, Callable, Iterator from mypy.errorcodes import ErrorCode from mypy.expandtype import expand_type, expand_type_by_instance from mypy.nodes import ( ARG_NAMED, ARG_NAMED_OPT, ARG_OPT, ARG_POS, ARG_STAR2, INVARIANT, MDEF, Argument, AssignmentStmt, Block, CallExpr, ClassDef, Context, Decorator, DictExpr, EllipsisExpr, Expression, FuncDef, IfStmt, JsonDict, MemberExpr, NameExpr, PassStmt, PlaceholderNode, RefExpr, Statement, StrExpr, SymbolTableNode, TempNode, TypeAlias, TypeInfo, Var, ) from mypy.options import Options from mypy.plugin import ( CheckerPluginInterface, ClassDefContext, MethodContext, Plugin, ReportConfigContext, SemanticAnalyzerPluginInterface, ) from mypy.plugins.common import ( deserialize_and_fixup_type, ) from mypy.semanal import set_callable_name from mypy.server.trigger import make_wildcard_trigger from mypy.state import state from mypy.typeops import map_type_from_supertype from mypy.types import ( AnyType, CallableType, Instance, NoneType, Type, TypeOfAny, TypeType, TypeVarType, UnionType, get_proper_type, ) from mypy.typevars import fill_typevars from mypy.util import get_unique_redefinition_name from mypy.version import __version__ as mypy_version from pydantic._internal import _fields from pydantic.version import parse_mypy_version CONFIGFILE_KEY = 'pydantic-mypy' METADATA_KEY = 'pydantic-mypy-metadata' BASEMODEL_FULLNAME = 
'pydantic.main.BaseModel' BASESETTINGS_FULLNAME = 'pydantic_settings.main.BaseSettings' ROOT_MODEL_FULLNAME = 'pydantic.root_model.RootModel' MODEL_METACLASS_FULLNAME = 'pydantic._internal._model_construction.ModelMetaclass' FIELD_FULLNAME = 'pydantic.fields.Field' DATACLASS_FULLNAME = 'pydantic.dataclasses.dataclass' MODEL_VALIDATOR_FULLNAME = 'pydantic.functional_validators.model_validator' DECORATOR_FULLNAMES = { 'pydantic.functional_validators.field_validator', 'pydantic.functional_validators.model_validator', 'pydantic.functional_serializers.serializer', 'pydantic.functional_serializers.model_serializer', 'pydantic.deprecated.class_validators.validator', 'pydantic.deprecated.class_validators.root_validator', } MYPY_VERSION_TUPLE = parse_mypy_version(mypy_version) BUILTINS_NAME = 'builtins' # Increment version if plugin changes and mypy caches should be invalidated __version__ = 2 def plugin(version: str) -> type[Plugin]: """`version` is the mypy version string. We might want to use this to print a warning if the mypy version being used is newer, or especially older, than we expect (or need). Args: version: The mypy version string. Return: The Pydantic mypy plugin type. 
""" return PydanticPlugin class PydanticPlugin(Plugin): """The Pydantic mypy plugin.""" def __init__(self, options: Options) -> None: self.plugin_config = PydanticPluginConfig(options) self._plugin_data = self.plugin_config.to_data() super().__init__(options) def get_base_class_hook(self, fullname: str) -> Callable[[ClassDefContext], None] | None: """Update Pydantic model class.""" sym = self.lookup_fully_qualified(fullname) if sym and isinstance(sym.node, TypeInfo): # pragma: no branch # No branching may occur if the mypy cache has not been cleared if any(base.fullname == BASEMODEL_FULLNAME for base in sym.node.mro): return self._pydantic_model_class_maker_callback return None def get_metaclass_hook(self, fullname: str) -> Callable[[ClassDefContext], None] | None: """Update Pydantic `ModelMetaclass` definition.""" if fullname == MODEL_METACLASS_FULLNAME: return self._pydantic_model_metaclass_marker_callback return None def get_method_hook(self, fullname: str) -> Callable[[MethodContext], Type] | None: """Adjust return type of `from_orm` method call.""" if fullname.endswith('.from_orm'): return from_attributes_callback return None def report_config_data(self, ctx: ReportConfigContext) -> dict[str, Any]: """Return all plugin config data. Used by mypy to determine if cache needs to be discarded. """ return self._plugin_data def _pydantic_model_class_maker_callback(self, ctx: ClassDefContext) -> None: transformer = PydanticModelTransformer(ctx.cls, ctx.reason, ctx.api, self.plugin_config) transformer.transform() def _pydantic_model_metaclass_marker_callback(self, ctx: ClassDefContext) -> None: """Reset dataclass_transform_spec attribute of ModelMetaclass. Let the plugin handle it. This behavior can be disabled if 'debug_dataclass_transform' is set to True', for testing purposes. 
""" if self.plugin_config.debug_dataclass_transform: return info_metaclass = ctx.cls.info.declared_metaclass assert info_metaclass, "callback not passed from 'get_metaclass_hook'" if getattr(info_metaclass.type, 'dataclass_transform_spec', None): info_metaclass.type.dataclass_transform_spec = None class PydanticPluginConfig: """A Pydantic mypy plugin config holder. Attributes: init_forbid_extra: Whether to add a `**kwargs` at the end of the generated `__init__` signature. init_typed: Whether to annotate fields in the generated `__init__`. warn_required_dynamic_aliases: Whether to raise required dynamic aliases error. debug_dataclass_transform: Whether to not reset `dataclass_transform_spec` attribute of `ModelMetaclass` for testing purposes. """ __slots__ = ( 'init_forbid_extra', 'init_typed', 'warn_required_dynamic_aliases', 'debug_dataclass_transform', ) init_forbid_extra: bool init_typed: bool warn_required_dynamic_aliases: bool debug_dataclass_transform: bool # undocumented def __init__(self, options: Options) -> None: if options.config_file is None: # pragma: no cover return toml_config = parse_toml(options.config_file) if toml_config is not None: config = toml_config.get('tool', {}).get('pydantic-mypy', {}) for key in self.__slots__: setting = config.get(key, False) if not isinstance(setting, bool): raise ValueError(f'Configuration value must be a boolean for key: {key}') setattr(self, key, setting) else: plugin_config = ConfigParser() plugin_config.read(options.config_file) for key in self.__slots__: setting = plugin_config.getboolean(CONFIGFILE_KEY, key, fallback=False) setattr(self, key, setting) def to_data(self) -> dict[str, Any]: """Returns a dict of config names to their values.""" return {key: getattr(self, key) for key in self.__slots__} def from_attributes_callback(ctx: MethodContext) -> Type: """Raise an error if from_attributes is not enabled.""" model_type: Instance ctx_type = ctx.type if isinstance(ctx_type, TypeType): ctx_type = ctx_type.item 
if isinstance(ctx_type, CallableType) and isinstance(ctx_type.ret_type, Instance): model_type = ctx_type.ret_type # called on the class elif isinstance(ctx_type, Instance): model_type = ctx_type # called on an instance (unusual, but still valid) else: # pragma: no cover detail = f'ctx.type: {ctx_type} (of type {ctx_type.__class__.__name__})' error_unexpected_behavior(detail, ctx.api, ctx.context) return ctx.default_return_type pydantic_metadata = model_type.type.metadata.get(METADATA_KEY) if pydantic_metadata is None: return ctx.default_return_type if not any(base.fullname == BASEMODEL_FULLNAME for base in model_type.type.mro): # not a Pydantic v2 model return ctx.default_return_type from_attributes = pydantic_metadata.get('config', {}).get('from_attributes') if from_attributes is not True: error_from_attributes(model_type.type.name, ctx.api, ctx.context) return ctx.default_return_type class PydanticModelField: """Based on mypy.plugins.dataclasses.DataclassAttribute.""" def __init__( self, name: str, alias: str | None, is_frozen: bool, has_dynamic_alias: bool, has_default: bool, strict: bool | None, line: int, column: int, type: Type | None, info: TypeInfo, ): self.name = name self.alias = alias self.is_frozen = is_frozen self.has_dynamic_alias = has_dynamic_alias self.has_default = has_default self.strict = strict self.line = line self.column = column self.type = type self.info = info def to_argument( self, current_info: TypeInfo, typed: bool, model_strict: bool, force_optional: bool, use_alias: bool, api: SemanticAnalyzerPluginInterface, force_typevars_invariant: bool, is_root_model_root: bool, ) -> Argument: """Based on mypy.plugins.dataclasses.DataclassAttribute.to_argument.""" variable = self.to_var(current_info, api, use_alias, force_typevars_invariant) strict = model_strict if self.strict is None else self.strict if typed or strict: type_annotation = self.expand_type(current_info, api) else: type_annotation = AnyType(TypeOfAny.explicit) return Argument( 
variable=variable, type_annotation=type_annotation, initializer=None, kind=ARG_OPT if is_root_model_root else (ARG_NAMED_OPT if force_optional or self.has_default else ARG_NAMED), ) def expand_type( self, current_info: TypeInfo, api: SemanticAnalyzerPluginInterface, force_typevars_invariant: bool = False ) -> Type | None: """Based on mypy.plugins.dataclasses.DataclassAttribute.expand_type.""" if force_typevars_invariant: # In some cases, mypy will emit an error "Cannot use a covariant type variable as a parameter" # To prevent that, we add an option to replace typevars with invariant ones while building certain # method signatures (in particular, `__init__`). There may be a better way to do this, if this causes # us problems in the future, we should look into why the dataclasses plugin doesn't have this issue. if isinstance(self.type, TypeVarType): modified_type = self.type.copy_modified() modified_type.variance = INVARIANT self.type = modified_type if self.type is not None and self.info.self_type is not None: # In general, it is not safe to call `expand_type()` during semantic analysis, # however this plugin is called very late, so all types should be fully ready. # Also, it is tricky to avoid eager expansion of Self types here (e.g. because # we serialize attributes). 
with state.strict_optional_set(api.options.strict_optional): filled_with_typevars = fill_typevars(current_info) # Cannot be TupleType as current_info represents a Pydantic model: assert isinstance(filled_with_typevars, Instance) if force_typevars_invariant: for arg in filled_with_typevars.args: if isinstance(arg, TypeVarType): arg.variance = INVARIANT return expand_type(self.type, {self.info.self_type.id: filled_with_typevars}) return self.type def to_var( self, current_info: TypeInfo, api: SemanticAnalyzerPluginInterface, use_alias: bool, force_typevars_invariant: bool = False, ) -> Var: """Based on mypy.plugins.dataclasses.DataclassAttribute.to_var.""" if use_alias and self.alias is not None: name = self.alias else: name = self.name return Var(name, self.expand_type(current_info, api, force_typevars_invariant)) def serialize(self) -> JsonDict: """Based on mypy.plugins.dataclasses.DataclassAttribute.serialize.""" assert self.type return { 'name': self.name, 'alias': self.alias, 'is_frozen': self.is_frozen, 'has_dynamic_alias': self.has_dynamic_alias, 'has_default': self.has_default, 'strict': self.strict, 'line': self.line, 'column': self.column, 'type': self.type.serialize(), } @classmethod def deserialize(cls, info: TypeInfo, data: JsonDict, api: SemanticAnalyzerPluginInterface) -> PydanticModelField: """Based on mypy.plugins.dataclasses.DataclassAttribute.deserialize.""" data = data.copy() typ = deserialize_and_fixup_type(data.pop('type'), api) return cls(type=typ, info=info, **data) def expand_typevar_from_subtype(self, sub_type: TypeInfo, api: SemanticAnalyzerPluginInterface) -> None: """Expands type vars in the context of a subtype when an attribute is inherited from a generic super type. """ if self.type is not None: with state.strict_optional_set(api.options.strict_optional): self.type = map_type_from_supertype(self.type, sub_type, self.info) class PydanticModelClassVar: """Based on mypy.plugins.dataclasses.DataclassAttribute. 
ClassVars are ignored by subclasses. Attributes: name: the ClassVar name """ def __init__(self, name): self.name = name @classmethod def deserialize(cls, data: JsonDict) -> PydanticModelClassVar: """Based on mypy.plugins.dataclasses.DataclassAttribute.deserialize.""" data = data.copy() return cls(**data) def serialize(self) -> JsonDict: """Based on mypy.plugins.dataclasses.DataclassAttribute.serialize.""" return { 'name': self.name, } class PydanticModelTransformer: """Transform the BaseModel subclass according to the plugin settings. Attributes: tracked_config_fields: A set of field configs that the plugin has to track their value. """ tracked_config_fields: set[str] = { 'extra', 'frozen', 'from_attributes', 'populate_by_name', 'alias_generator', 'strict', } def __init__( self, cls: ClassDef, reason: Expression | Statement, api: SemanticAnalyzerPluginInterface, plugin_config: PydanticPluginConfig, ) -> None: self._cls = cls self._reason = reason self._api = api self.plugin_config = plugin_config def transform(self) -> bool: """Configures the BaseModel subclass according to the plugin settings. In particular: * determines the model config and fields, * adds a fields-aware signature for the initializer and construct methods * freezes the class if frozen = True * stores the fields, config, and if the class is settings in the mypy metadata for access by subclasses """ info = self._cls.info is_root_model = any(ROOT_MODEL_FULLNAME in base.fullname for base in info.mro[:-1]) config = self.collect_config() fields, class_vars = self.collect_fields_and_class_vars(config, is_root_model) if fields is None or class_vars is None: # Some definitions are not ready. We need another pass. 
return False for field in fields: if field.type is None: return False is_settings = any(base.fullname == BASESETTINGS_FULLNAME for base in info.mro[:-1]) self.add_initializer(fields, config, is_settings, is_root_model) self.add_model_construct_method(fields, config, is_settings, is_root_model) self.set_frozen(fields, self._api, frozen=config.frozen is True) self.adjust_decorator_signatures() info.metadata[METADATA_KEY] = { 'fields': {field.name: field.serialize() for field in fields}, 'class_vars': {class_var.name: class_var.serialize() for class_var in class_vars}, 'config': config.get_values_dict(), } return True def adjust_decorator_signatures(self) -> None: """When we decorate a function `f` with `pydantic.validator(...)`, `pydantic.field_validator` or `pydantic.serializer(...)`, mypy sees `f` as a regular method taking a `self` instance, even though pydantic internally wraps `f` with `classmethod` if necessary. Teach mypy this by marking any function whose outermost decorator is a `validator()`, `field_validator()` or `serializer()` call as a `classmethod`. 
""" for sym in self._cls.info.names.values(): if isinstance(sym.node, Decorator): first_dec = sym.node.original_decorators[0] if ( isinstance(first_dec, CallExpr) and isinstance(first_dec.callee, NameExpr) and first_dec.callee.fullname in DECORATOR_FULLNAMES # @model_validator(mode="after") is an exception, it expects a regular method and not ( first_dec.callee.fullname == MODEL_VALIDATOR_FULLNAME and any( first_dec.arg_names[i] == 'mode' and isinstance(arg, StrExpr) and arg.value == 'after' for i, arg in enumerate(first_dec.args) ) ) ): # TODO: Only do this if the first argument of the decorated function is `cls` sym.node.func.is_class = True def collect_config(self) -> ModelConfigData: # noqa: C901 (ignore complexity) """Collects the values of the config attributes that are used by the plugin, accounting for parent classes.""" cls = self._cls config = ModelConfigData() has_config_kwargs = False has_config_from_namespace = False # Handle `class MyModel(BaseModel, =, ...):` for name, expr in cls.keywords.items(): config_data = self.get_config_update(name, expr) if config_data: has_config_kwargs = True config.update(config_data) # Handle `model_config` stmt: Statement | None = None for stmt in cls.defs.body: if not isinstance(stmt, (AssignmentStmt, ClassDef)): continue if isinstance(stmt, AssignmentStmt): lhs = stmt.lvalues[0] if not isinstance(lhs, NameExpr) or lhs.name != 'model_config': continue if isinstance(stmt.rvalue, CallExpr): # calls to `dict` or `ConfigDict` for arg_name, arg in zip(stmt.rvalue.arg_names, stmt.rvalue.args): if arg_name is None: continue config.update(self.get_config_update(arg_name, arg, lax_extra=True)) elif isinstance(stmt.rvalue, DictExpr): # dict literals for key_expr, value_expr in stmt.rvalue.items: if not isinstance(key_expr, StrExpr): continue config.update(self.get_config_update(key_expr.value, value_expr)) elif isinstance(stmt, ClassDef): if stmt.name != 'Config': # 'deprecated' Config-class continue for substmt in 
stmt.defs.body: if not isinstance(substmt, AssignmentStmt): continue lhs = substmt.lvalues[0] if not isinstance(lhs, NameExpr): continue config.update(self.get_config_update(lhs.name, substmt.rvalue)) if has_config_kwargs: self._api.fail( 'Specifying config in two places is ambiguous, use either Config attribute or class kwargs', cls, ) break has_config_from_namespace = True if has_config_kwargs or has_config_from_namespace: if ( stmt and config.has_alias_generator and not config.populate_by_name and self.plugin_config.warn_required_dynamic_aliases ): error_required_dynamic_aliases(self._api, stmt) for info in cls.info.mro[1:]: # 0 is the current class if METADATA_KEY not in info.metadata: continue # Each class depends on the set of fields in its ancestors self._api.add_plugin_dependency(make_wildcard_trigger(info.fullname)) for name, value in info.metadata[METADATA_KEY]['config'].items(): config.setdefault(name, value) return config def collect_fields_and_class_vars( self, model_config: ModelConfigData, is_root_model: bool ) -> tuple[list[PydanticModelField] | None, list[PydanticModelClassVar] | None]: """Collects the fields for the model, accounting for parent classes.""" cls = self._cls # First, collect fields and ClassVars belonging to any class in the MRO, ignoring duplicates. # # We iterate through the MRO in reverse because attrs defined in the parent must appear # earlier in the attributes list than attrs defined in the child. See: # https://docs.python.org/3/library/dataclasses.html#inheritance # # However, we also want fields defined in the subtype to override ones defined # in the parent. We can implement this via a dict without disrupting the attr order # because dicts preserve insertion order in Python 3.7+. 
found_fields: dict[str, PydanticModelField] = {} found_class_vars: dict[str, PydanticModelClassVar] = {} for info in reversed(cls.info.mro[1:-1]): # 0 is the current class, -2 is BaseModel, -1 is object # if BASEMODEL_METADATA_TAG_KEY in info.metadata and BASEMODEL_METADATA_KEY not in info.metadata: # # We haven't processed the base class yet. Need another pass. # return None, None if METADATA_KEY not in info.metadata: continue # Each class depends on the set of attributes in its dataclass ancestors. self._api.add_plugin_dependency(make_wildcard_trigger(info.fullname)) for name, data in info.metadata[METADATA_KEY]['fields'].items(): field = PydanticModelField.deserialize(info, data, self._api) # (The following comment comes directly from the dataclasses plugin) # TODO: We shouldn't be performing type operations during the main # semantic analysis pass, since some TypeInfo attributes might # still be in flux. This should be performed in a later phase. field.expand_typevar_from_subtype(cls.info, self._api) found_fields[name] = field sym_node = cls.info.names.get(name) if sym_node and sym_node.node and not isinstance(sym_node.node, Var): self._api.fail( 'BaseModel field may only be overridden by another field', sym_node.node, ) # Collect ClassVars for name, data in info.metadata[METADATA_KEY]['class_vars'].items(): found_class_vars[name] = PydanticModelClassVar.deserialize(data) # Second, collect fields and ClassVars belonging to the current class. 
current_field_names: set[str] = set() current_class_vars_names: set[str] = set() for stmt in self._get_assignment_statements_from_block(cls.defs): maybe_field = self.collect_field_or_class_var_from_stmt(stmt, model_config, found_class_vars) if maybe_field is None: continue lhs = stmt.lvalues[0] assert isinstance(lhs, NameExpr) # collect_field_or_class_var_from_stmt guarantees this if isinstance(maybe_field, PydanticModelField): if is_root_model and lhs.name != 'root': error_extra_fields_on_root_model(self._api, stmt) else: current_field_names.add(lhs.name) found_fields[lhs.name] = maybe_field elif isinstance(maybe_field, PydanticModelClassVar): current_class_vars_names.add(lhs.name) found_class_vars[lhs.name] = maybe_field return list(found_fields.values()), list(found_class_vars.values()) def _get_assignment_statements_from_if_statement(self, stmt: IfStmt) -> Iterator[AssignmentStmt]: for body in stmt.body: if not body.is_unreachable: yield from self._get_assignment_statements_from_block(body) if stmt.else_body is not None and not stmt.else_body.is_unreachable: yield from self._get_assignment_statements_from_block(stmt.else_body) def _get_assignment_statements_from_block(self, block: Block) -> Iterator[AssignmentStmt]: for stmt in block.body: if isinstance(stmt, AssignmentStmt): yield stmt elif isinstance(stmt, IfStmt): yield from self._get_assignment_statements_from_if_statement(stmt) def collect_field_or_class_var_from_stmt( # noqa C901 self, stmt: AssignmentStmt, model_config: ModelConfigData, class_vars: dict[str, PydanticModelClassVar] ) -> PydanticModelField | PydanticModelClassVar | None: """Get pydantic model field from statement. Args: stmt: The statement. model_config: Configuration settings for the model. class_vars: ClassVars already known to be defined on the model. Returns: A pydantic model field if it could find the field in statement. Otherwise, `None`. 
""" cls = self._cls lhs = stmt.lvalues[0] if not isinstance(lhs, NameExpr) or not _fields.is_valid_field_name(lhs.name) or lhs.name == 'model_config': return None if not stmt.new_syntax: if ( isinstance(stmt.rvalue, CallExpr) and isinstance(stmt.rvalue.callee, CallExpr) and isinstance(stmt.rvalue.callee.callee, NameExpr) and stmt.rvalue.callee.callee.fullname in DECORATOR_FULLNAMES ): # This is a (possibly-reused) validator or serializer, not a field # In particular, it looks something like: my_validator = validator('my_field')(f) # Eventually, we may want to attempt to respect model_config['ignored_types'] return None if lhs.name in class_vars: # Class vars are not fields and are not required to be annotated return None # The assignment does not have an annotation, and it's not anything else we recognize error_untyped_fields(self._api, stmt) return None lhs = stmt.lvalues[0] if not isinstance(lhs, NameExpr): return None if not _fields.is_valid_field_name(lhs.name) or lhs.name == 'model_config': return None sym = cls.info.names.get(lhs.name) if sym is None: # pragma: no cover # This is likely due to a star import (see the dataclasses plugin for a more detailed explanation) # This is the same logic used in the dataclasses plugin return None node = sym.node if isinstance(node, PlaceholderNode): # pragma: no cover # See the PlaceholderNode docstring for more detail about how this can occur # Basically, it is an edge case when dealing with complex import logic # The dataclasses plugin now asserts this cannot happen, but I'd rather not error if it does.. return None if isinstance(node, TypeAlias): self._api.fail( 'Type aliases inside BaseModel definitions are not supported at runtime', node, ) # Skip processing this node. This doesn't match the runtime behaviour, # but the only alternative would be to modify the SymbolTable, # and it's a little hairy to do that in a plugin. 
return None if not isinstance(node, Var): # pragma: no cover # Don't know if this edge case still happens with the `is_valid_field` check above # but better safe than sorry # The dataclasses plugin now asserts this cannot happen, but I'd rather not error if it does.. return None # x: ClassVar[int] is not a field if node.is_classvar: return PydanticModelClassVar(lhs.name) # x: InitVar[int] is not supported in BaseModel node_type = get_proper_type(node.type) if isinstance(node_type, Instance) and node_type.type.fullname == 'dataclasses.InitVar': self._api.fail( 'InitVar is not supported in BaseModel', node, ) has_default = self.get_has_default(stmt) strict = self.get_strict(stmt) if sym.type is None and node.is_final and node.is_inferred: # This follows the logic from the dataclasses plugin. The following comment is taken verbatim: # # This is a special case, assignment like x: Final = 42 is classified # annotated above, but mypy strips the `Final` turning it into x = 42. # We do not support inferred types in dataclasses, so we can try inferring # type for simple literals, and otherwise require an explicit type # argument for Final[...]. typ = self._api.analyze_simple_literal_type(stmt.rvalue, is_final=True) if typ: node.type = typ else: self._api.fail( 'Need type argument for Final[...] 
with non-literal default in BaseModel', stmt, ) node.type = AnyType(TypeOfAny.from_error) alias, has_dynamic_alias = self.get_alias_info(stmt) if has_dynamic_alias and not model_config.populate_by_name and self.plugin_config.warn_required_dynamic_aliases: error_required_dynamic_aliases(self._api, stmt) is_frozen = self.is_field_frozen(stmt) init_type = self._infer_dataclass_attr_init_type(sym, lhs.name, stmt) return PydanticModelField( name=lhs.name, has_dynamic_alias=has_dynamic_alias, has_default=has_default, strict=strict, alias=alias, is_frozen=is_frozen, line=stmt.line, column=stmt.column, type=init_type, info=cls.info, ) def _infer_dataclass_attr_init_type(self, sym: SymbolTableNode, name: str, context: Context) -> Type | None: """Infer __init__ argument type for an attribute. In particular, possibly use the signature of __set__. """ default = sym.type if sym.implicit: return default t = get_proper_type(sym.type) # Perform a simple-minded inference from the signature of __set__, if present. # We can't use mypy.checkmember here, since this plugin runs before type checking. # We only support some basic scanerios here, which is hopefully sufficient for # the vast majority of use cases. 
if not isinstance(t, Instance): return default setter = t.type.get('__set__') if setter: if isinstance(setter.node, FuncDef): super_info = t.type.get_containing_type_info('__set__') assert super_info if setter.type: setter_type = get_proper_type(map_type_from_supertype(setter.type, t.type, super_info)) else: return AnyType(TypeOfAny.unannotated) if isinstance(setter_type, CallableType) and setter_type.arg_kinds == [ ARG_POS, ARG_POS, ARG_POS, ]: return expand_type_by_instance(setter_type.arg_types[2], t) else: self._api.fail(f'Unsupported signature for "__set__" in "{t.type.name}"', context) else: self._api.fail(f'Unsupported "__set__" in "{t.type.name}"', context) return default def add_initializer( self, fields: list[PydanticModelField], config: ModelConfigData, is_settings: bool, is_root_model: bool ) -> None: """Adds a fields-aware `__init__` method to the class. The added `__init__` will be annotated with types vs. all `Any` depending on the plugin settings. """ if '__init__' in self._cls.info.names and not self._cls.info.names['__init__'].plugin_generated: return # Don't generate an __init__ if one already exists typed = self.plugin_config.init_typed model_strict = bool(config.strict) use_alias = config.populate_by_name is not True requires_dynamic_aliases = bool(config.has_alias_generator and not config.populate_by_name) args = self.get_field_arguments( fields, typed=typed, model_strict=model_strict, requires_dynamic_aliases=requires_dynamic_aliases, use_alias=use_alias, is_settings=is_settings, is_root_model=is_root_model, force_typevars_invariant=True, ) if is_settings: base_settings_node = self._api.lookup_fully_qualified(BASESETTINGS_FULLNAME).node assert isinstance(base_settings_node, TypeInfo) if '__init__' in base_settings_node.names: base_settings_init_node = base_settings_node.names['__init__'].node assert isinstance(base_settings_init_node, FuncDef) if base_settings_init_node is not None and base_settings_init_node.type is not None: func_type = 
base_settings_init_node.type assert isinstance(func_type, CallableType) for arg_idx, arg_name in enumerate(func_type.arg_names): if arg_name is None or arg_name.startswith('__') or not arg_name.startswith('_'): continue analyzed_variable_type = self._api.anal_type(func_type.arg_types[arg_idx]) variable = Var(arg_name, analyzed_variable_type) args.append(Argument(variable, analyzed_variable_type, None, ARG_OPT)) if not self.should_init_forbid_extra(fields, config): var = Var('kwargs') args.append(Argument(var, AnyType(TypeOfAny.explicit), None, ARG_STAR2)) add_method(self._api, self._cls, '__init__', args=args, return_type=NoneType()) def add_model_construct_method( self, fields: list[PydanticModelField], config: ModelConfigData, is_settings: bool, is_root_model: bool, ) -> None: """Adds a fully typed `model_construct` classmethod to the class. Similar to the fields-aware __init__ method, but always uses the field names (not aliases), and does not treat settings fields as optional. """ set_str = self._api.named_type(f'{BUILTINS_NAME}.set', [self._api.named_type(f'{BUILTINS_NAME}.str')]) optional_set_str = UnionType([set_str, NoneType()]) fields_set_argument = Argument(Var('_fields_set', optional_set_str), optional_set_str, None, ARG_OPT) with state.strict_optional_set(self._api.options.strict_optional): args = self.get_field_arguments( fields, typed=True, model_strict=bool(config.strict), requires_dynamic_aliases=False, use_alias=False, is_settings=is_settings, is_root_model=is_root_model, ) if not self.should_init_forbid_extra(fields, config): var = Var('kwargs') args.append(Argument(var, AnyType(TypeOfAny.explicit), None, ARG_STAR2)) args = args + [fields_set_argument] if is_root_model else [fields_set_argument] + args add_method( self._api, self._cls, 'model_construct', args=args, return_type=fill_typevars(self._cls.info), is_classmethod=True, ) def set_frozen(self, fields: list[PydanticModelField], api: SemanticAnalyzerPluginInterface, frozen: bool) -> None: 
"""Marks all fields as properties so that attempts to set them trigger mypy errors. This is the same approach used by the attrs and dataclasses plugins. """ info = self._cls.info for field in fields: sym_node = info.names.get(field.name) if sym_node is not None: var = sym_node.node if isinstance(var, Var): var.is_property = frozen or field.is_frozen elif isinstance(var, PlaceholderNode) and not self._api.final_iteration: # See https://github.com/pydantic/pydantic/issues/5191 to hit this branch for test coverage self._api.defer() else: # pragma: no cover # I don't know whether it's possible to hit this branch, but I've added it for safety try: var_str = str(var) except TypeError: # This happens for PlaceholderNode; perhaps it will happen for other types in the future.. var_str = repr(var) detail = f'sym_node.node: {var_str} (of type {var.__class__})' error_unexpected_behavior(detail, self._api, self._cls) else: var = field.to_var(info, api, use_alias=False) var.info = info var.is_property = frozen var._fullname = info.fullname + '.' + var.name info.names[var.name] = SymbolTableNode(MDEF, var) def get_config_update(self, name: str, arg: Expression, lax_extra: bool = False) -> ModelConfigData | None: """Determines the config update due to a single kwarg in the ConfigDict definition. Warns if a tracked config attribute is set to a value the plugin doesn't know how to interpret (e.g., an int) """ if name not in self.tracked_config_fields: return None if name == 'extra': if isinstance(arg, StrExpr): forbid_extra = arg.value == 'forbid' elif isinstance(arg, MemberExpr): forbid_extra = arg.name == 'forbid' else: if not lax_extra: # Only emit an error for other types of `arg` (e.g., `NameExpr`, `ConditionalExpr`, etc.) when # reading from a config class, etc. If a ConfigDict is used, then we don't want to emit an error # because you'll get type checking from the ConfigDict itself. 
                    #
                    # It would be nice if we could introspect the types better otherwise, but I don't know what the API
                    # is to evaluate an expr into its type and then check if that type is compatible with the expected
                    # type. Note that you can still get proper type checking via: `model_config = ConfigDict(...)`, just
                    # if you don't use an explicit string, the plugin won't be able to infer whether extra is forbidden.
                    error_invalid_config_value(name, self._api, arg)
                return None
            return ModelConfigData(forbid_extra=forbid_extra)
        if name == 'alias_generator':
            has_alias_generator = True
            if isinstance(arg, NameExpr) and arg.fullname == 'builtins.None':
                has_alias_generator = False
            return ModelConfigData(has_alias_generator=has_alias_generator)
        if isinstance(arg, NameExpr) and arg.fullname in ('builtins.True', 'builtins.False'):
            return ModelConfigData(**{name: arg.fullname == 'builtins.True'})
        error_invalid_config_value(name, self._api, arg)
        return None

    @staticmethod
    def get_has_default(stmt: AssignmentStmt) -> bool:
        """Returns a boolean indicating whether the field defined in `stmt` has a default (i.e., is not required)."""
        expr = stmt.rvalue
        if isinstance(expr, TempNode):
            # TempNode means annotation-only, so has no default
            return False
        if isinstance(expr, CallExpr) and isinstance(expr.callee, RefExpr) and expr.callee.fullname == FIELD_FULLNAME:
            # The "default value" is a call to `Field`; at this point, the field has a default if and only if:
            # * there is a positional argument that is not `...`
            # * there is a keyword argument named "default" that is not `...`
            # * there is a "default_factory" that is not `None`
            for arg, name in zip(expr.args, expr.arg_names):
                # If name is None, then this arg is the default because it is the only positional argument.
                if name is None or name == 'default':
                    return arg.__class__ is not EllipsisExpr
                if name == 'default_factory':
                    return not (isinstance(arg, NameExpr) and arg.fullname == 'builtins.None')
            return False
        # Has no default if the "default value" is Ellipsis (i.e., `field_name: Annotation = ...`)
        return not isinstance(expr, EllipsisExpr)

    @staticmethod
    def get_strict(stmt: AssignmentStmt) -> bool | None:
        """Returns the `strict` value of a field if defined, otherwise `None`."""
        expr = stmt.rvalue
        if isinstance(expr, CallExpr) and isinstance(expr.callee, RefExpr) and expr.callee.fullname == FIELD_FULLNAME:
            for arg, name in zip(expr.args, expr.arg_names):
                if name != 'strict':
                    continue
                if isinstance(arg, NameExpr):
                    if arg.fullname == 'builtins.True':
                        return True
                    elif arg.fullname == 'builtins.False':
                        return False
                return None
        return None

    @staticmethod
    def get_alias_info(stmt: AssignmentStmt) -> tuple[str | None, bool]:
        """Returns a pair (alias, has_dynamic_alias), extracted from the declaration of the field defined in `stmt`.

        `has_dynamic_alias` is True if and only if an alias is provided, but not as a string literal.
        If `has_dynamic_alias` is True, `alias` will be None.
        """
        expr = stmt.rvalue
        if isinstance(expr, TempNode):
            # TempNode means annotation-only
            return None, False
        if not (
            isinstance(expr, CallExpr) and isinstance(expr.callee, RefExpr) and expr.callee.fullname == FIELD_FULLNAME
        ):
            # Assigned value is not a call to pydantic.fields.Field
            return None, False

        for i, arg_name in enumerate(expr.arg_names):
            if arg_name != 'alias':
                continue
            arg = expr.args[i]
            if isinstance(arg, StrExpr):
                return arg.value, False
            else:
                return None, True
        return None, False

    @staticmethod
    def is_field_frozen(stmt: AssignmentStmt) -> bool:
        """Returns whether the field is frozen, extracted from the declaration of the field defined in `stmt`.
Note that this is only whether the field was declared to be frozen in a ` = Field(frozen=True)` sense; this does not determine whether the field is frozen because the entire model is frozen; that is handled separately. """ expr = stmt.rvalue if isinstance(expr, TempNode): # TempNode means annotation-only return False if not ( isinstance(expr, CallExpr) and isinstance(expr.callee, RefExpr) and expr.callee.fullname == FIELD_FULLNAME ): # Assigned value is not a call to pydantic.fields.Field return False for i, arg_name in enumerate(expr.arg_names): if arg_name == 'frozen': arg = expr.args[i] return isinstance(arg, NameExpr) and arg.fullname == 'builtins.True' return False def get_field_arguments( self, fields: list[PydanticModelField], typed: bool, model_strict: bool, use_alias: bool, requires_dynamic_aliases: bool, is_settings: bool, is_root_model: bool, force_typevars_invariant: bool = False, ) -> list[Argument]: """Helper function used during the construction of the `__init__` and `model_construct` method signatures. Returns a list of mypy Argument instances for use in the generated signatures. """ info = self._cls.info arguments = [ field.to_argument( info, typed=typed, model_strict=model_strict, force_optional=requires_dynamic_aliases or is_settings, use_alias=use_alias, api=self._api, force_typevars_invariant=force_typevars_invariant, is_root_model_root=is_root_model and field.name == 'root', ) for field in fields if not (use_alias and field.has_dynamic_alias) ] return arguments def should_init_forbid_extra(self, fields: list[PydanticModelField], config: ModelConfigData) -> bool: """Indicates whether the generated `__init__` should get a `**kwargs` at the end of its signature. We disallow arbitrary kwargs if the extra config setting is "forbid", or if the plugin config says to, *unless* a required dynamic alias is present (since then we can't determine a valid signature). 
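        The `init_forbid_extra` knob mentioned above is one of the plugin's settings. As a sketch (setting names
        per pydantic's documented mypy-plugin configuration), the plugin is typically enabled and configured in
        `pyproject.toml` like this:

        ```toml
        [tool.mypy]
        plugins = ["pydantic.mypy"]

        [tool.pydantic-mypy]
        init_forbid_extra = true
        init_typed = true
        warn_required_dynamic_aliases = true
        ```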
""" if not config.populate_by_name: if self.is_dynamic_alias_present(fields, bool(config.has_alias_generator)): return False if config.forbid_extra: return True return self.plugin_config.init_forbid_extra @staticmethod def is_dynamic_alias_present(fields: list[PydanticModelField], has_alias_generator: bool) -> bool: """Returns whether any fields on the model have a "dynamic alias", i.e., an alias that cannot be determined during static analysis. """ for field in fields: if field.has_dynamic_alias: return True if has_alias_generator: for field in fields: if field.alias is None: return True return False class ModelConfigData: """Pydantic mypy plugin model config class.""" def __init__( self, forbid_extra: bool | None = None, frozen: bool | None = None, from_attributes: bool | None = None, populate_by_name: bool | None = None, has_alias_generator: bool | None = None, strict: bool | None = None, ): self.forbid_extra = forbid_extra self.frozen = frozen self.from_attributes = from_attributes self.populate_by_name = populate_by_name self.has_alias_generator = has_alias_generator self.strict = strict def get_values_dict(self) -> dict[str, Any]: """Returns a dict of Pydantic model config names to their values. It includes the config if config value is not `None`. 
""" return {k: v for k, v in self.__dict__.items() if v is not None} def update(self, config: ModelConfigData | None) -> None: """Update Pydantic model config values.""" if config is None: return for k, v in config.get_values_dict().items(): setattr(self, k, v) def setdefault(self, key: str, value: Any) -> None: """Set default value for Pydantic model config if config value is `None`.""" if getattr(self, key) is None: setattr(self, key, value) ERROR_ORM = ErrorCode('pydantic-orm', 'Invalid from_attributes call', 'Pydantic') ERROR_CONFIG = ErrorCode('pydantic-config', 'Invalid config value', 'Pydantic') ERROR_ALIAS = ErrorCode('pydantic-alias', 'Dynamic alias disallowed', 'Pydantic') ERROR_UNEXPECTED = ErrorCode('pydantic-unexpected', 'Unexpected behavior', 'Pydantic') ERROR_UNTYPED = ErrorCode('pydantic-field', 'Untyped field disallowed', 'Pydantic') ERROR_FIELD_DEFAULTS = ErrorCode('pydantic-field', 'Invalid Field defaults', 'Pydantic') ERROR_EXTRA_FIELD_ROOT_MODEL = ErrorCode('pydantic-field', 'Extra field on RootModel subclass', 'Pydantic') def error_from_attributes(model_name: str, api: CheckerPluginInterface, context: Context) -> None: """Emits an error when the model does not have `from_attributes=True`.""" api.fail(f'"{model_name}" does not have from_attributes=True', context, code=ERROR_ORM) def error_invalid_config_value(name: str, api: SemanticAnalyzerPluginInterface, context: Context) -> None: """Emits an error when the config value is invalid.""" api.fail(f'Invalid value for "Config.{name}"', context, code=ERROR_CONFIG) def error_required_dynamic_aliases(api: SemanticAnalyzerPluginInterface, context: Context) -> None: """Emits required dynamic aliases error. This will be called when `warn_required_dynamic_aliases=True`. 
""" api.fail('Required dynamic aliases disallowed', context, code=ERROR_ALIAS) def error_unexpected_behavior( detail: str, api: CheckerPluginInterface | SemanticAnalyzerPluginInterface, context: Context ) -> None: # pragma: no cover """Emits unexpected behavior error.""" # Can't think of a good way to test this, but I confirmed it renders as desired by adding to a non-error path link = 'https://github.com/pydantic/pydantic/issues/new/choose' full_message = f'The pydantic mypy plugin ran into unexpected behavior: {detail}\n' full_message += f'Please consider reporting this bug at {link} so we can try to fix it!' api.fail(full_message, context, code=ERROR_UNEXPECTED) def error_untyped_fields(api: SemanticAnalyzerPluginInterface, context: Context) -> None: """Emits an error when there is an untyped field in the model.""" api.fail('Untyped fields disallowed', context, code=ERROR_UNTYPED) def error_extra_fields_on_root_model(api: CheckerPluginInterface, context: Context) -> None: """Emits an error when there is more than just a root field defined for a subclass of RootModel.""" api.fail('Only `root` is allowed as a field of a `RootModel`', context, code=ERROR_EXTRA_FIELD_ROOT_MODEL) def add_method( api: SemanticAnalyzerPluginInterface | CheckerPluginInterface, cls: ClassDef, name: str, args: list[Argument], return_type: Type, self_type: Type | None = None, tvar_def: TypeVarType | None = None, is_classmethod: bool = False, ) -> None: """Very closely related to `mypy.plugins.common.add_method_to_class`, with a few pydantic-specific changes.""" info = cls.info # First remove any previously generated methods with the same name # to avoid clashes and problems in the semantic analyzer. 
if name in info.names: sym = info.names[name] if sym.plugin_generated and isinstance(sym.node, FuncDef): cls.defs.body.remove(sym.node) # pragma: no cover if isinstance(api, SemanticAnalyzerPluginInterface): function_type = api.named_type('builtins.function') else: function_type = api.named_generic_type('builtins.function', []) if is_classmethod: self_type = self_type or TypeType(fill_typevars(info)) first = [Argument(Var('_cls'), self_type, None, ARG_POS, True)] else: self_type = self_type or fill_typevars(info) # `self` is positional *ONLY* here, but this can't be expressed # fully in the mypy internal API. ARG_POS is the closest we can get. # Using ARG_POS will, however, give mypy errors if a `self` field # is present on a model: # # Name "self" already defined (possibly by an import) [no-redef] # # As a workaround, we give this argument a name that will # never conflict. By its positional nature, this name will not # be used or exposed to users. first = [Argument(Var('__pydantic_self__'), self_type, None, ARG_POS)] args = first + args arg_types, arg_names, arg_kinds = [], [], [] for arg in args: assert arg.type_annotation, 'All arguments must be fully typed.' arg_types.append(arg.type_annotation) arg_names.append(arg.variable.name) arg_kinds.append(arg.kind) signature = CallableType(arg_types, arg_kinds, arg_names, return_type, function_type) if tvar_def: signature.variables = [tvar_def] func = FuncDef(name, args, Block([PassStmt()])) func.info = info func.type = set_callable_name(signature, func) func.is_class = is_classmethod func._fullname = info.fullname + '.' + name func.line = info.line # NOTE: we would like the plugin generated node to dominate, but we still # need to keep any existing definitions so they get semantically analyzed. if name in info.names: # Get a nice unique name instead. 
        r_name = get_unique_redefinition_name(name, info.names)
        info.names[r_name] = info.names[name]

    # Add decorator for is_classmethod
    # The dataclasses plugin claims this is unnecessary for classmethods, but not including it results in a
    # signature incompatible with the superclass, which causes mypy errors to occur for every subclass of BaseModel.
    if is_classmethod:
        func.is_decorated = True
        v = Var(name, func.type)
        v.info = info
        v._fullname = func._fullname
        v.is_classmethod = True
        dec = Decorator(func, [NameExpr('classmethod')], v)
        dec.line = info.line
        sym = SymbolTableNode(MDEF, dec)
    else:
        sym = SymbolTableNode(MDEF, func)
    sym.plugin_generated = True

    info.names[name] = sym
    info.defn.defs.body.append(func)


def parse_toml(config_file: str) -> dict[str, Any] | None:
    """Returns a dict of config keys to values.

    It reads configs from a TOML file and returns `None` if the file is not a TOML file.
    """
    if not config_file.endswith('.toml'):
        return None

    if sys.version_info >= (3, 11):
        import tomllib as toml_
    else:
        try:
            import tomli as toml_
        except ImportError:  # pragma: no cover
            import warnings

            warnings.warn('No TOML parser installed, cannot read configuration from `pyproject.toml`.')
            return None

    with open(config_file, 'rb') as rf:
        return toml_.load(rf)


# === pydantic-2.10.6/pydantic/networks.py ===
"""The networks module contains types for common network-related fields."""

from __future__ import annotations as _annotations

import dataclasses as _dataclasses
import re
from dataclasses import fields
from functools import lru_cache
from importlib.metadata import version
from ipaddress import IPv4Address, IPv4Interface, IPv4Network, IPv6Address, IPv6Interface, IPv6Network
from typing import TYPE_CHECKING, Any, ClassVar

from pydantic_core import (
    MultiHostHost,
    PydanticCustomError,
    PydanticSerializationUnexpectedValue,
    SchemaSerializer,
    core_schema,
)
from pydantic_core import MultiHostUrl as _CoreMultiHostUrl
from
pydantic_core import Url as _CoreUrl from typing_extensions import Annotated, Self, TypeAlias from pydantic.errors import PydanticUserError from ._internal import _repr, _schema_generation_shared from ._migration import getattr_migration from .annotated_handlers import GetCoreSchemaHandler from .json_schema import JsonSchemaValue from .type_adapter import TypeAdapter if TYPE_CHECKING: import email_validator NetworkType: TypeAlias = 'str | bytes | int | tuple[str | bytes | int, str | int]' else: email_validator = None __all__ = [ 'AnyUrl', 'AnyHttpUrl', 'FileUrl', 'FtpUrl', 'HttpUrl', 'WebsocketUrl', 'AnyWebsocketUrl', 'UrlConstraints', 'EmailStr', 'NameEmail', 'IPvAnyAddress', 'IPvAnyInterface', 'IPvAnyNetwork', 'PostgresDsn', 'CockroachDsn', 'AmqpDsn', 'RedisDsn', 'MongoDsn', 'KafkaDsn', 'NatsDsn', 'validate_email', 'MySQLDsn', 'MariaDBDsn', 'ClickHouseDsn', 'SnowflakeDsn', ] @_dataclasses.dataclass class UrlConstraints: """Url constraints. Attributes: max_length: The maximum length of the url. Defaults to `None`. allowed_schemes: The allowed schemes. Defaults to `None`. host_required: Whether the host is required. Defaults to `None`. default_host: The default host. Defaults to `None`. default_port: The default port. Defaults to `None`. default_path: The default path. Defaults to `None`. """ max_length: int | None = None allowed_schemes: list[str] | None = None host_required: bool | None = None default_host: str | None = None default_port: int | None = None default_path: str | None = None def __hash__(self) -> int: return hash( ( self.max_length, tuple(self.allowed_schemes) if self.allowed_schemes is not None else None, self.host_required, self.default_host, self.default_port, self.default_path, ) ) @property def defined_constraints(self) -> dict[str, Any]: """Fetch a key / value mapping of constraints to values that are not None. 
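        As a small sketch of the property defined below: only constraints that were explicitly set
        (i.e., are not `None`) appear in the mapping.

        ```python
        from pydantic import UrlConstraints

        uc = UrlConstraints(max_length=2083, allowed_schemes=['http', 'https'])
        # Unset constraints (host_required, default_host, ...) are omitted:
        print(uc.defined_constraints)
        #> {'max_length': 2083, 'allowed_schemes': ['http', 'https']}
        ```
        
        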
        Used for core schema updates.
        """
        return {field.name: value for field in fields(self) if (value := getattr(self, field.name)) is not None}

    def __get_pydantic_core_schema__(self, source: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema:
        schema = handler(source)

        # for function-wrap schemas, url constraints are applied to the inner schema
        # because when we generate schemas for urls, we wrap a core_schema.url_schema() with a function-wrap schema
        # that helps with validation on initialization, see _BaseUrl and _BaseMultiHostUrl below.
        schema_to_mutate = schema['schema'] if schema['type'] == 'function-wrap' else schema
        # Note the parentheses around the assignment expression: without them, the walrus operator would bind
        # the *boolean* result of the `not in` test to `annotated_type` rather than the schema type.
        if (annotated_type := schema_to_mutate['type']) not in ('url', 'multi-host-url'):
            raise PydanticUserError(
                f"'UrlConstraints' cannot annotate '{annotated_type}'.", code='invalid-annotated-type'
            )
        for constraint_key, constraint_value in self.defined_constraints.items():
            schema_to_mutate[constraint_key] = constraint_value

        return schema


class _BaseUrl:
    _constraints: ClassVar[UrlConstraints] = UrlConstraints()
    _url: _CoreUrl

    def __init__(self, url: str | _CoreUrl | _BaseUrl) -> None:
        self._url = _build_type_adapter(self.__class__).validate_python(url)._url

    @property
    def scheme(self) -> str:
        """The scheme part of the URL.

        e.g. `https` in `https://user:pass@host:port/path?query#fragment`
        """
        return self._url.scheme

    @property
    def username(self) -> str | None:
        """The username part of the URL, or `None`.

        e.g. `user` in `https://user:pass@host:port/path?query#fragment`
        """
        return self._url.username

    @property
    def password(self) -> str | None:
        """The password part of the URL, or `None`.

        e.g. `pass` in `https://user:pass@host:port/path?query#fragment`
        """
        return self._url.password

    @property
    def host(self) -> str | None:
        """The host part of the URL, or `None`.
If the URL must be punycode encoded, this is the encoded host, e.g if the input URL is `https://£££.com`, `host` will be `xn--9aaa.com` """ return self._url.host def unicode_host(self) -> str | None: """The host part of the URL as a unicode string, or `None`. e.g. `host` in `https://user:pass@host:port/path?query#fragment` If the URL must be punycode encoded, this is the decoded host, e.g if the input URL is `https://£££.com`, `unicode_host()` will be `£££.com` """ return self._url.unicode_host() @property def port(self) -> int | None: """The port part of the URL, or `None`. e.g. `port` in `https://user:pass@host:port/path?query#fragment` """ return self._url.port @property def path(self) -> str | None: """The path part of the URL, or `None`. e.g. `/path` in `https://user:pass@host:port/path?query#fragment` """ return self._url.path @property def query(self) -> str | None: """The query part of the URL, or `None`. e.g. `query` in `https://user:pass@host:port/path?query#fragment` """ return self._url.query def query_params(self) -> list[tuple[str, str]]: """The query part of the URL as a list of key-value pairs. e.g. `[('foo', 'bar')]` in `https://user:pass@host:port/path?foo=bar#fragment` """ return self._url.query_params() @property def fragment(self) -> str | None: """The fragment part of the URL, or `None`. e.g. `fragment` in `https://user:pass@host:port/path?query#fragment` """ return self._url.fragment def unicode_string(self) -> str: """The URL as a unicode string, unlike `__str__()` this will not punycode encode the host. 
If the URL must be punycode encoded, this is the decoded string, e.g if the input URL is `https://£££.com`, `unicode_string()` will be `https://£££.com` """ return self._url.unicode_string() def __str__(self) -> str: """The URL as a string, this will punycode encode the host if required.""" return str(self._url) def __repr__(self) -> str: return f'{self.__class__.__name__}({str(self._url)!r})' def __deepcopy__(self, memo: dict) -> Self: return self.__class__(self._url) def __eq__(self, other: Any) -> bool: return self.__class__ is other.__class__ and self._url == other._url def __lt__(self, other: Any) -> bool: return self.__class__ is other.__class__ and self._url < other._url def __gt__(self, other: Any) -> bool: return self.__class__ is other.__class__ and self._url > other._url def __le__(self, other: Any) -> bool: return self.__class__ is other.__class__ and self._url <= other._url def __ge__(self, other: Any) -> bool: return self.__class__ is other.__class__ and self._url >= other._url def __hash__(self) -> int: return hash(self._url) def __len__(self) -> int: return len(str(self._url)) @classmethod def build( cls, *, scheme: str, username: str | None = None, password: str | None = None, host: str, port: int | None = None, path: str | None = None, query: str | None = None, fragment: str | None = None, ) -> Self: """Build a new `Url` instance from its component parts. Args: scheme: The scheme part of the URL. username: The username part of the URL, or omit for no username. password: The password part of the URL, or omit for no password. host: The host part of the URL. port: The port part of the URL, or omit for no port. path: The path part of the URL, or omit for no path. query: The query part of the URL, or omit for no query. fragment: The fragment part of the URL, or omit for no fragment. 
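        A short usage sketch (the component values here are arbitrary examples):

        ```python
        from pydantic import AnyUrl

        url = AnyUrl.build(scheme='https', username='samuel', host='example.com', port=8000)
        assert url.scheme == 'https'
        assert url.host == 'example.com'
        assert url.port == 8000
        ```
        
        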
Returns: An instance of URL """ return cls( _CoreUrl.build( scheme=scheme, username=username, password=password, host=host, port=port, path=path, query=query, fragment=fragment, ) ) @classmethod def serialize_url(cls, url: Any, info: core_schema.SerializationInfo) -> str | Self: if not isinstance(url, cls): raise PydanticSerializationUnexpectedValue( f"Expected `{cls}` but got `{type(url)}` with value `'{url}'` - serialized value may not be as expected." ) if info.mode == 'json': return str(url) return url @classmethod def __get_pydantic_core_schema__( cls, source: type[_BaseUrl], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: def wrap_val(v, h): if isinstance(v, source): return v if isinstance(v, _BaseUrl): v = str(v) core_url = h(v) instance = source.__new__(source) instance._url = core_url return instance return core_schema.no_info_wrap_validator_function( wrap_val, schema=core_schema.url_schema(**cls._constraints.defined_constraints), serialization=core_schema.plain_serializer_function_ser_schema( cls.serialize_url, info_arg=True, when_used='always' ), ) @classmethod def __get_pydantic_json_schema__( cls, core_schema: core_schema.CoreSchema, handler: _schema_generation_shared.GetJsonSchemaHandler ) -> JsonSchemaValue: # we use the url schema for json schema generation, but we might have to extract it from # the function-wrap schema we use as a tool for validation on initialization inner_schema = core_schema['schema'] if core_schema['type'] == 'function-wrap' else core_schema return handler(inner_schema) __pydantic_serializer__ = SchemaSerializer(core_schema.any_schema(serialization=core_schema.to_string_ser_schema())) class _BaseMultiHostUrl: _constraints: ClassVar[UrlConstraints] = UrlConstraints() _url: _CoreMultiHostUrl def __init__(self, url: str | _CoreMultiHostUrl | _BaseMultiHostUrl) -> None: self._url = _build_type_adapter(self.__class__).validate_python(url)._url @property def scheme(self) -> str: """The scheme part of the URL. e.g. 
`https` in `https://foo.com,bar.com/path?query#fragment` """ return self._url.scheme @property def path(self) -> str | None: """The path part of the URL, or `None`. e.g. `/path` in `https://foo.com,bar.com/path?query#fragment` """ return self._url.path @property def query(self) -> str | None: """The query part of the URL, or `None`. e.g. `query` in `https://foo.com,bar.com/path?query#fragment` """ return self._url.query def query_params(self) -> list[tuple[str, str]]: """The query part of the URL as a list of key-value pairs. e.g. `[('foo', 'bar')]` in `https://foo.com,bar.com/path?query#fragment` """ return self._url.query_params() @property def fragment(self) -> str | None: """The fragment part of the URL, or `None`. e.g. `fragment` in `https://foo.com,bar.com/path?query#fragment` """ return self._url.fragment def hosts(self) -> list[MultiHostHost]: '''The hosts of the `MultiHostUrl` as [`MultiHostHost`][pydantic_core.MultiHostHost] typed dicts. ```python from pydantic_core import MultiHostUrl mhu = MultiHostUrl('https://foo.com:123,foo:bar@bar.com/path') print(mhu.hosts()) """ [ {'username': None, 'password': None, 'host': 'foo.com', 'port': 123}, {'username': 'foo', 'password': 'bar', 'host': 'bar.com', 'port': 443} ] ``` Returns: A list of dicts, each representing a host. 
''' return self._url.hosts() def unicode_string(self) -> str: """The URL as a unicode string, unlike `__str__()` this will not punycode encode the hosts.""" return self._url.unicode_string() def __str__(self) -> str: """The URL as a string, this will punycode encode the host if required.""" return str(self._url) def __repr__(self) -> str: return f'{self.__class__.__name__}({str(self._url)!r})' def __deepcopy__(self, memo: dict) -> Self: return self.__class__(self._url) def __eq__(self, other: Any) -> bool: return self.__class__ is other.__class__ and self._url == other._url def __hash__(self) -> int: return hash(self._url) def __len__(self) -> int: return len(str(self._url)) @classmethod def build( cls, *, scheme: str, hosts: list[MultiHostHost] | None = None, username: str | None = None, password: str | None = None, host: str | None = None, port: int | None = None, path: str | None = None, query: str | None = None, fragment: str | None = None, ) -> Self: """Build a new `MultiHostUrl` instance from its component parts. This method takes either `hosts` - a list of `MultiHostHost` typed dicts, or the individual components `username`, `password`, `host` and `port`. Args: scheme: The scheme part of the URL. hosts: Multiple hosts to build the URL from. username: The username part of the URL. password: The password part of the URL. host: The host part of the URL. port: The port part of the URL. path: The path part of the URL. query: The query part of the URL, or omit for no query. fragment: The fragment part of the URL, or omit for no fragment. 
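        A brief sketch of the `hosts` form, using `PostgresDsn` (a multi-host URL type exported from this
        module, see `__all__` above); the host names and credentials are placeholders:

        ```python
        from pydantic import PostgresDsn

        dsn = PostgresDsn.build(
            scheme='postgresql',
            hosts=[
                {'username': 'user', 'password': 'pass', 'host': 'host1.db', 'port': 5432},
                {'username': None, 'password': None, 'host': 'host2.db', 'port': 5433},
            ],
            path='mydb',
        )
        assert dsn.scheme == 'postgresql'
        assert [h['host'] for h in dsn.hosts()] == ['host1.db', 'host2.db']
        ```
        
        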
Returns: An instance of `MultiHostUrl` """ return cls( _CoreMultiHostUrl.build( scheme=scheme, hosts=hosts, username=username, password=password, host=host, port=port, path=path, query=query, fragment=fragment, ) ) @classmethod def serialize_url(cls, url: Any, info: core_schema.SerializationInfo) -> str | Self: if not isinstance(url, cls): raise PydanticSerializationUnexpectedValue( f"Expected `{cls}` but got `{type(url)}` with value `'{url}'` - serialized value may not be as expected." ) if info.mode == 'json': return str(url) return url @classmethod def __get_pydantic_core_schema__( cls, source: type[_BaseMultiHostUrl], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: def wrap_val(v, h): if isinstance(v, source): return v if isinstance(v, _BaseMultiHostUrl): v = str(v) core_url = h(v) instance = source.__new__(source) instance._url = core_url return instance return core_schema.no_info_wrap_validator_function( wrap_val, schema=core_schema.multi_host_url_schema(**cls._constraints.defined_constraints), serialization=core_schema.plain_serializer_function_ser_schema( cls.serialize_url, info_arg=True, when_used='always' ), ) @classmethod def __get_pydantic_json_schema__( cls, core_schema: core_schema.CoreSchema, handler: _schema_generation_shared.GetJsonSchemaHandler ) -> JsonSchemaValue: # we use the url schema for json schema generation, but we might have to extract it from # the function-wrap schema we use as a tool for validation on initialization inner_schema = core_schema['schema'] if core_schema['type'] == 'function-wrap' else core_schema return handler(inner_schema) __pydantic_serializer__ = SchemaSerializer(core_schema.any_schema(serialization=core_schema.to_string_ser_schema())) @lru_cache def _build_type_adapter(cls: type[_BaseUrl | _BaseMultiHostUrl]) -> TypeAdapter: return TypeAdapter(cls) class AnyUrl(_BaseUrl): """Base type for all URLs. 
* Any scheme allowed * Top-level domain (TLD) not required * Host not required Assuming an input URL of `http://samuel:pass@example.com:8000/the/path/?query=here#fragment=is;this=bit`, the types export the following properties: - `scheme`: the URL scheme (`http`), always set. - `host`: the URL host (`example.com`). - `username`: optional username if included (`samuel`). - `password`: optional password if included (`pass`). - `port`: optional port (`8000`). - `path`: optional path (`/the/path/`). - `query`: optional URL query (for example, `GET` arguments or "search string", such as `query=here`). - `fragment`: optional fragment (`fragment=is;this=bit`). """ # Note: all single host urls inherit from `AnyUrl` to preserve compatibility with pre-v2.10 code # Where urls were annotated variants of `AnyUrl`, which was an alias to `pydantic_core.Url` class AnyHttpUrl(AnyUrl): """A type that will accept any http or https URL. * TLD not required * Host not required """ _constraints = UrlConstraints(allowed_schemes=['http', 'https']) class HttpUrl(AnyUrl): """A type that will accept any http or https URL. * TLD not required * Host not required * Max length 2083 ```python from pydantic import BaseModel, HttpUrl, ValidationError class MyModel(BaseModel): url: HttpUrl m = MyModel(url='http://www.example.com') # (1)! print(m.url) #> http://www.example.com/ try: MyModel(url='ftp://invalid.url') except ValidationError as e: print(e) ''' 1 validation error for MyModel url URL scheme should be 'http' or 'https' [type=url_scheme, input_value='ftp://invalid.url', input_type=str] ''' try: MyModel(url='not a url') except ValidationError as e: print(e) ''' 1 validation error for MyModel url Input should be a valid URL, relative URL without a base [type=url_parsing, input_value='not a url', input_type=str] ''' ``` 1. Note: mypy would prefer `m = MyModel(url=HttpUrl('http://www.example.com'))`, but Pydantic will convert the string to an HttpUrl instance anyway. "International domains" (e.g. 
a URL where the host or TLD includes non-ascii characters) will be encoded via [punycode](https://en.wikipedia.org/wiki/Punycode) (see [this article](https://www.xudongz.com/blog/2017/idn-phishing/) for a good description of why this is important): ```python from pydantic import BaseModel, HttpUrl class MyModel(BaseModel): url: HttpUrl m1 = MyModel(url='http://puny£code.com') print(m1.url) #> http://xn--punycode-eja.com/ m2 = MyModel(url='https://www.аррӏе.com/') print(m2.url) #> https://www.xn--80ak6aa92e.com/ m3 = MyModel(url='https://www.example.珠宝/') print(m3.url) #> https://www.example.xn--pbt977c/ ``` !!! warning "Underscores in Hostnames" In Pydantic, underscores are allowed in all parts of a domain except the TLD. Technically this might be wrong - in theory the hostname cannot have underscores, but subdomains can. To explain this; consider the following two cases: - `exam_ple.co.uk`: the hostname is `exam_ple`, which should not be allowed since it contains an underscore. - `foo_bar.example.com` the hostname is `example`, which should be allowed since the underscore is in the subdomain. Without having an exhaustive list of TLDs, it would be impossible to differentiate between these two. Therefore underscores are allowed, but you can always do further validation in a validator if desired. Also, Chrome, Firefox, and Safari all currently accept `http://exam_ple.com` as a URL, so we're in good (or at least big) company. """ _constraints = UrlConstraints(max_length=2083, allowed_schemes=['http', 'https']) class AnyWebsocketUrl(AnyUrl): """A type that will accept any ws or wss URL. * TLD not required * Host not required """ _constraints = UrlConstraints(allowed_schemes=['ws', 'wss']) class WebsocketUrl(AnyUrl): """A type that will accept any ws or wss URL. * TLD not required * Host not required * Max length 2083 """ _constraints = UrlConstraints(max_length=2083, allowed_schemes=['ws', 'wss']) class FileUrl(AnyUrl): """A type that will accept any file URL. 
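The punycode encoding of international domains shown in the `HttpUrl` docstring above can be reproduced with the standard library's codecs. Note this is only for illustration: Python's `idna` codec implements the older IDNA 2003 rules, while pydantic's URL handling is done in `pydantic_core` (Rust), which may differ on some edge cases.

```python
# Stdlib-only illustration of the IDN ("punycode") host encoding described above.
host = 'bücher'  # classic internationalized-domain example

print(host.encode('idna'))      # full IDNA ToASCII, adds the 'xn--' prefix
#> b'xn--bcher-kva'
print(host.encode('punycode'))  # raw punycode transform, no prefix
#> b'bcher-kva'
print(b'xn--bcher-kva'.decode('idna'))  # round-trips back to unicode
#> bücher
```

This is the same transform that makes `MyModel(url='http://puny£code.com')` serialize with an `xn--` host, and why `unicode_string()` exists to recover the readable form.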
* Host not required """ _constraints = UrlConstraints(allowed_schemes=['file']) class FtpUrl(AnyUrl): """A type that will accept ftp URL. * TLD not required * Host not required """ _constraints = UrlConstraints(allowed_schemes=['ftp']) class PostgresDsn(_BaseMultiHostUrl): """A type that will accept any Postgres DSN. * User info required * TLD not required * Host required * Supports multiple hosts If further validation is required, these properties can be used by validators to enforce specific behaviour: ```python from pydantic import ( BaseModel, HttpUrl, PostgresDsn, ValidationError, field_validator, ) class MyModel(BaseModel): url: HttpUrl m = MyModel(url='http://www.example.com') # the repr() method for a url will display all properties of the url print(repr(m.url)) #> HttpUrl('http://www.example.com/') print(m.url.scheme) #> http print(m.url.host) #> www.example.com print(m.url.port) #> 80 class MyDatabaseModel(BaseModel): db: PostgresDsn @field_validator('db') def check_db_name(cls, v): assert v.path and len(v.path) > 1, 'database must be provided' return v m = MyDatabaseModel(db='postgres://user:pass@localhost:5432/foobar') print(m.db) #> postgres://user:pass@localhost:5432/foobar try: MyDatabaseModel(db='postgres://user:pass@localhost:5432') except ValidationError as e: print(e) ''' 1 validation error for MyDatabaseModel db Assertion failed, database must be provided assert (None) + where None = PostgresDsn('postgres://user:pass@localhost:5432').path [type=assertion_error, input_value='postgres://user:pass@localhost:5432', input_type=str] ''' ``` """ _constraints = UrlConstraints( host_required=True, allowed_schemes=[ 'postgres', 'postgresql', 'postgresql+asyncpg', 'postgresql+pg8000', 'postgresql+psycopg', 'postgresql+psycopg2', 'postgresql+psycopg2cffi', 'postgresql+py-postgresql', 'postgresql+pygresql', ], ) @property def host(self) -> str: """The required URL host.""" return self._url.host # pyright: ignore[reportAttributeAccessIssue] class 
CockroachDsn(AnyUrl): """A type that will accept any Cockroach DSN. * User info required * TLD not required * Host required """ _constraints = UrlConstraints( host_required=True, allowed_schemes=[ 'cockroachdb', 'cockroachdb+psycopg2', 'cockroachdb+asyncpg', ], ) @property def host(self) -> str: """The required URL host.""" return self._url.host # pyright: ignore[reportReturnType] class AmqpDsn(AnyUrl): """A type that will accept any AMQP DSN. * User info required * TLD not required * Host not required """ _constraints = UrlConstraints(allowed_schemes=['amqp', 'amqps']) class RedisDsn(AnyUrl): """A type that will accept any Redis DSN. * User info required * TLD not required * Host required (e.g., `rediss://:pass@localhost`) """ _constraints = UrlConstraints( allowed_schemes=['redis', 'rediss'], default_host='localhost', default_port=6379, default_path='/0', host_required=True, ) @property def host(self) -> str: """The required URL host.""" return self._url.host # pyright: ignore[reportReturnType] class MongoDsn(_BaseMultiHostUrl): """A type that will accept any MongoDB DSN. * User info not required * Database name not required * Port not required * User info may be passed without user part (e.g., `mongodb://mongodb0.example.com:27017`). """ _constraints = UrlConstraints(allowed_schemes=['mongodb', 'mongodb+srv'], default_port=27017) class KafkaDsn(AnyUrl): """A type that will accept any Kafka DSN. * User info required * TLD not required * Host not required """ _constraints = UrlConstraints(allowed_schemes=['kafka'], default_host='localhost', default_port=9092) class NatsDsn(_BaseMultiHostUrl): """A type that will accept any NATS DSN. NATS is a connective technology built for the ever increasingly hyper-connected world. It is a single technology that enables applications to securely communicate across any combination of cloud vendors, on-premise, edge, web and mobile, and devices. 
More: https://nats.io """ _constraints = UrlConstraints( allowed_schemes=['nats', 'tls', 'ws', 'wss'], default_host='localhost', default_port=4222 ) class MySQLDsn(AnyUrl): """A type that will accept any MySQL DSN. * User info required * TLD not required * Host not required """ _constraints = UrlConstraints( allowed_schemes=[ 'mysql', 'mysql+mysqlconnector', 'mysql+aiomysql', 'mysql+asyncmy', 'mysql+mysqldb', 'mysql+pymysql', 'mysql+cymysql', 'mysql+pyodbc', ], default_port=3306, host_required=True, ) class MariaDBDsn(AnyUrl): """A type that will accept any MariaDB DSN. * User info required * TLD not required * Host not required """ _constraints = UrlConstraints( allowed_schemes=['mariadb', 'mariadb+mariadbconnector', 'mariadb+pymysql'], default_port=3306, ) class ClickHouseDsn(AnyUrl): """A type that will accept any ClickHouse DSN. * User info required * TLD not required * Host not required """ _constraints = UrlConstraints( allowed_schemes=['clickhouse+native', 'clickhouse+asynch'], default_host='localhost', default_port=9000, ) class SnowflakeDsn(AnyUrl): """A type that will accept any Snowflake DSN. * User info required * TLD not required * Host required """ _constraints = UrlConstraints( allowed_schemes=['snowflake'], host_required=True, ) @property def host(self) -> str: """The required URL host.""" return self._url.host # pyright: ignore[reportReturnType] def import_email_validator() -> None: global email_validator try: import email_validator except ImportError as e: raise ImportError('email-validator is not installed, run `pip install pydantic[email]`') from e if not version('email-validator').partition('.')[0] == '2': raise ImportError('email-validator version >= 2.0 required, run pip install -U email-validator') if TYPE_CHECKING: EmailStr = Annotated[str, ...] 
else: class EmailStr: """ Info: To use this type, you need to install the optional [`email-validator`](https://github.com/JoshData/python-email-validator) package: ```bash pip install email-validator ``` Validate email addresses. ```python from pydantic import BaseModel, EmailStr class Model(BaseModel): email: EmailStr print(Model(email='contact@mail.com')) #> email='contact@mail.com' ``` """ # noqa: D212 @classmethod def __get_pydantic_core_schema__( cls, _source: type[Any], _handler: GetCoreSchemaHandler, ) -> core_schema.CoreSchema: import_email_validator() return core_schema.no_info_after_validator_function(cls._validate, core_schema.str_schema()) @classmethod def __get_pydantic_json_schema__( cls, core_schema: core_schema.CoreSchema, handler: _schema_generation_shared.GetJsonSchemaHandler ) -> JsonSchemaValue: field_schema = handler(core_schema) field_schema.update(type='string', format='email') return field_schema @classmethod def _validate(cls, input_value: str, /) -> str: return validate_email(input_value)[1] class NameEmail(_repr.Representation): """ Info: To use this type, you need to install the optional [`email-validator`](https://github.com/JoshData/python-email-validator) package: ```bash pip install email-validator ``` Validate a name and email address combination, as specified by [RFC 5322](https://datatracker.ietf.org/doc/html/rfc5322#section-3.4). The `NameEmail` has two properties: `name` and `email`. In case the `name` is not provided, it's inferred from the email address. 
    ```python
    from pydantic import BaseModel, NameEmail

    class User(BaseModel):
        email: NameEmail

    user = User(email='Fred Bloggs <fred.bloggs@example.com>')
    print(user.email)
    #> Fred Bloggs <fred.bloggs@example.com>
    print(user.email.name)
    #> Fred Bloggs

    user = User(email='fred.bloggs@example.com')
    print(user.email)
    #> fred.bloggs <fred.bloggs@example.com>
    print(user.email.name)
    #> fred.bloggs
    ```
    """  # noqa: D212

    __slots__ = 'name', 'email'

    def __init__(self, name: str, email: str):
        self.name = name
        self.email = email

    def __eq__(self, other: Any) -> bool:
        return isinstance(other, NameEmail) and (self.name, self.email) == (other.name, other.email)

    @classmethod
    def __get_pydantic_json_schema__(
        cls, core_schema: core_schema.CoreSchema, handler: _schema_generation_shared.GetJsonSchemaHandler
    ) -> JsonSchemaValue:
        field_schema = handler(core_schema)
        field_schema.update(type='string', format='name-email')
        return field_schema

    @classmethod
    def __get_pydantic_core_schema__(
        cls,
        _source: type[Any],
        _handler: GetCoreSchemaHandler,
    ) -> core_schema.CoreSchema:
        import_email_validator()

        return core_schema.no_info_after_validator_function(
            cls._validate,
            core_schema.json_or_python_schema(
                json_schema=core_schema.str_schema(),
                python_schema=core_schema.union_schema(
                    [core_schema.is_instance_schema(cls), core_schema.str_schema()],
                    custom_error_type='name_email_type',
                    custom_error_message='Input is not a valid NameEmail',
                ),
                serialization=core_schema.to_string_ser_schema(),
            ),
        )

    @classmethod
    def _validate(cls, input_value: Self | str, /) -> Self:
        if isinstance(input_value, str):
            name, email = validate_email(input_value)
            return cls(name, email)
        else:
            return input_value

    def __str__(self) -> str:
        if '@' in self.name:
            return f'"{self.name}" <{self.email}>'

        return f'{self.name} <{self.email}>'


IPvAnyAddressType: TypeAlias = 'IPv4Address | IPv6Address'
IPvAnyInterfaceType: TypeAlias = 'IPv4Interface | IPv6Interface'
IPvAnyNetworkType: TypeAlias = 'IPv4Network | IPv6Network'

if TYPE_CHECKING:
    IPvAnyAddress = IPvAnyAddressType
    IPvAnyInterface = IPvAnyInterfaceType
IPvAnyNetwork = IPvAnyNetworkType else: class IPvAnyAddress: """Validate an IPv4 or IPv6 address. ```python from pydantic import BaseModel from pydantic.networks import IPvAnyAddress class IpModel(BaseModel): ip: IPvAnyAddress print(IpModel(ip='127.0.0.1')) #> ip=IPv4Address('127.0.0.1') try: IpModel(ip='http://www.example.com') except ValueError as e: print(e.errors()) ''' [ { 'type': 'ip_any_address', 'loc': ('ip',), 'msg': 'value is not a valid IPv4 or IPv6 address', 'input': 'http://www.example.com', } ] ''' ``` """ __slots__ = () def __new__(cls, value: Any) -> IPvAnyAddressType: """Validate an IPv4 or IPv6 address.""" try: return IPv4Address(value) except ValueError: pass try: return IPv6Address(value) except ValueError: raise PydanticCustomError('ip_any_address', 'value is not a valid IPv4 or IPv6 address') @classmethod def __get_pydantic_json_schema__( cls, core_schema: core_schema.CoreSchema, handler: _schema_generation_shared.GetJsonSchemaHandler ) -> JsonSchemaValue: field_schema = {} field_schema.update(type='string', format='ipvanyaddress') return field_schema @classmethod def __get_pydantic_core_schema__( cls, _source: type[Any], _handler: GetCoreSchemaHandler, ) -> core_schema.CoreSchema: return core_schema.no_info_plain_validator_function( cls._validate, serialization=core_schema.to_string_ser_schema() ) @classmethod def _validate(cls, input_value: Any, /) -> IPvAnyAddressType: return cls(input_value) # type: ignore[return-value] class IPvAnyInterface: """Validate an IPv4 or IPv6 interface.""" __slots__ = () def __new__(cls, value: NetworkType) -> IPvAnyInterfaceType: """Validate an IPv4 or IPv6 interface.""" try: return IPv4Interface(value) except ValueError: pass try: return IPv6Interface(value) except ValueError: raise PydanticCustomError('ip_any_interface', 'value is not a valid IPv4 or IPv6 interface') @classmethod def __get_pydantic_json_schema__( cls, core_schema: core_schema.CoreSchema, handler: _schema_generation_shared.GetJsonSchemaHandler 
) -> JsonSchemaValue: field_schema = {} field_schema.update(type='string', format='ipvanyinterface') return field_schema @classmethod def __get_pydantic_core_schema__( cls, _source: type[Any], _handler: GetCoreSchemaHandler, ) -> core_schema.CoreSchema: return core_schema.no_info_plain_validator_function( cls._validate, serialization=core_schema.to_string_ser_schema() ) @classmethod def _validate(cls, input_value: NetworkType, /) -> IPvAnyInterfaceType: return cls(input_value) # type: ignore[return-value] class IPvAnyNetwork: """Validate an IPv4 or IPv6 network.""" __slots__ = () def __new__(cls, value: NetworkType) -> IPvAnyNetworkType: """Validate an IPv4 or IPv6 network.""" # Assume IP Network is defined with a default value for `strict` argument. # Define your own class if you want to specify network address check strictness. try: return IPv4Network(value) except ValueError: pass try: return IPv6Network(value) except ValueError: raise PydanticCustomError('ip_any_network', 'value is not a valid IPv4 or IPv6 network') @classmethod def __get_pydantic_json_schema__( cls, core_schema: core_schema.CoreSchema, handler: _schema_generation_shared.GetJsonSchemaHandler ) -> JsonSchemaValue: field_schema = {} field_schema.update(type='string', format='ipvanynetwork') return field_schema @classmethod def __get_pydantic_core_schema__( cls, _source: type[Any], _handler: GetCoreSchemaHandler, ) -> core_schema.CoreSchema: return core_schema.no_info_plain_validator_function( cls._validate, serialization=core_schema.to_string_ser_schema() ) @classmethod def _validate(cls, input_value: NetworkType, /) -> IPvAnyNetworkType: return cls(input_value) # type: ignore[return-value] def _build_pretty_email_regex() -> re.Pattern[str]: name_chars = r'[\w!#$%&\'*+\-/=?^_`{|}~]' unquoted_name_group = rf'((?:{name_chars}+\s+)*{name_chars}+)' quoted_name_group = r'"((?:[^"]|\")+)"' email_group = r'<(.+)>' return 
    re.compile(rf'\s*(?:{unquoted_name_group}|{quoted_name_group})?\s*{email_group}\s*')


pretty_email_regex = _build_pretty_email_regex()

MAX_EMAIL_LENGTH = 2048
"""Maximum length for an email.

A somewhat arbitrary but very generous number compared to what is allowed by most implementations.
"""


def validate_email(value: str) -> tuple[str, str]:
    """Email address validation using [email-validator](https://pypi.org/project/email-validator/).

    Returns:
        A tuple containing the local part of the email (or the name for "pretty" email addresses)
        and the normalized email.

    Raises:
        PydanticCustomError: If the email is invalid.

    Note:
        Note that:

        * Raw IP address (literal) domain parts are not allowed.
        * `"John Doe <local_part@domain.com>"` style "pretty" email addresses are processed.
        * Spaces are stripped from the beginning and end of addresses, but no error is raised.
    """
    if email_validator is None:
        import_email_validator()

    if len(value) > MAX_EMAIL_LENGTH:
        raise PydanticCustomError(
            'value_error',
            'value is not a valid email address: {reason}',
            {'reason': f'Length must not exceed {MAX_EMAIL_LENGTH} characters'},
        )

    m = pretty_email_regex.fullmatch(value)
    name: str | None = None
    if m:
        unquoted_name, quoted_name, value = m.groups()
        name = unquoted_name or quoted_name

    email = value.strip()

    try:
        parts = email_validator.validate_email(email, check_deliverability=False)
    except email_validator.EmailNotValidError as e:
        raise PydanticCustomError(
            'value_error', 'value is not a valid email address: {reason}', {'reason': str(e.args[0])}
        ) from e

    email = parts.normalized
    assert email is not None
    name = name or parts.local_part
    return name, email


__getattr__ = getattr_migration(__name__)

pydantic-2.10.6/pydantic/parse.py
"""The `parse` module is a backport module from V1."""

from ._migration import getattr_migration

__getattr__ = getattr_migration(__name__)
pydantic-2.10.6/pydantic/plugin/000077500000000000000000000000001474456633400165445ustar00rootroot00000000000000pydantic-2.10.6/pydantic/plugin/__init__.py000066400000000000000000000137451474456633400206670ustar00rootroot00000000000000"""Usage docs: https://docs.pydantic.dev/2.10/concepts/plugins#build-a-plugin Plugin interface for Pydantic plugins, and related types. """ from __future__ import annotations from typing import Any, Callable, NamedTuple from pydantic_core import CoreConfig, CoreSchema, ValidationError from typing_extensions import Literal, Protocol, TypeAlias __all__ = ( 'PydanticPluginProtocol', 'BaseValidateHandlerProtocol', 'ValidatePythonHandlerProtocol', 'ValidateJsonHandlerProtocol', 'ValidateStringsHandlerProtocol', 'NewSchemaReturns', 'SchemaTypePath', 'SchemaKind', ) NewSchemaReturns: TypeAlias = 'tuple[ValidatePythonHandlerProtocol | None, ValidateJsonHandlerProtocol | None, ValidateStringsHandlerProtocol | None]' class SchemaTypePath(NamedTuple): """Path defining where `schema_type` was defined, or where `TypeAdapter` was called.""" module: str name: str SchemaKind: TypeAlias = Literal['BaseModel', 'TypeAdapter', 'dataclass', 'create_model', 'validate_call'] class PydanticPluginProtocol(Protocol): """Protocol defining the interface for Pydantic plugins.""" def new_schema_validator( self, schema: CoreSchema, schema_type: Any, schema_type_path: SchemaTypePath, schema_kind: SchemaKind, config: CoreConfig | None, plugin_settings: dict[str, object], ) -> tuple[ ValidatePythonHandlerProtocol | None, ValidateJsonHandlerProtocol | None, ValidateStringsHandlerProtocol | None ]: """This method is called for each plugin every time a new [`SchemaValidator`][pydantic_core.SchemaValidator] is created. It should return an event handler for each of the three validation methods, or `None` if the plugin does not implement that method. Args: schema: The schema to validate against. schema_type: The original type which the schema was created from, e.g. 
                the model class.
            schema_type_path: Path defining where `schema_type` was defined, or where `TypeAdapter` was called.
            schema_kind: The kind of schema to validate against.
            config: The config to use for validation.
            plugin_settings: Any plugin settings.

        Returns:
            A tuple of optional event handlers for each of the three validation methods -
            `validate_python`, `validate_json`, `validate_strings`.
        """
        raise NotImplementedError('Pydantic plugins should implement `new_schema_validator`.')


class BaseValidateHandlerProtocol(Protocol):
    """Base class for plugin callback protocols.

    You shouldn't implement this protocol directly, instead use one of the subclasses, which add the
    correctly typed `on_enter` method.
    """

    on_enter: Callable[..., None]
    """`on_enter` is changed to be more specific on all subclasses"""

    def on_success(self, result: Any) -> None:
        """Callback to be notified of successful validation.

        Args:
            result: The result of the validation.
        """
        return

    def on_error(self, error: ValidationError) -> None:
        """Callback to be notified of validation errors.

        Args:
            error: The validation error.
        """
        return

    def on_exception(self, exception: Exception) -> None:
        """Callback to be notified of validation exceptions.

        Args:
            exception: The exception raised during validation.
        """
        return


class ValidatePythonHandlerProtocol(BaseValidateHandlerProtocol, Protocol):
    """Event handler for `SchemaValidator.validate_python`."""

    def on_enter(
        self,
        input: Any,
        *,
        strict: bool | None = None,
        from_attributes: bool | None = None,
        context: dict[str, Any] | None = None,
        self_instance: Any | None = None,
    ) -> None:
        """Callback to be notified of validation start, and create an instance of the event handler.

        Args:
            input: The input to be validated.
            strict: Whether to validate the object in strict mode.
            from_attributes: Whether to validate objects as inputs by extracting attributes.
            context: The context to use for validation, this is passed to functional validators.
self_instance: An instance of a model to set attributes on from validation, this is used when running validation from the `__init__` method of a model. """ pass class ValidateJsonHandlerProtocol(BaseValidateHandlerProtocol, Protocol): """Event handler for `SchemaValidator.validate_json`.""" def on_enter( self, input: str | bytes | bytearray, *, strict: bool | None = None, context: dict[str, Any] | None = None, self_instance: Any | None = None, ) -> None: """Callback to be notified of validation start, and create an instance of the event handler. Args: input: The JSON data to be validated. strict: Whether to validate the object in strict mode. context: The context to use for validation, this is passed to functional validators. self_instance: An instance of a model to set attributes on from validation, this is used when running validation from the `__init__` method of a model. """ pass StringInput: TypeAlias = 'dict[str, StringInput]' class ValidateStringsHandlerProtocol(BaseValidateHandlerProtocol, Protocol): """Event handler for `SchemaValidator.validate_strings`.""" def on_enter( self, input: StringInput, *, strict: bool | None = None, context: dict[str, Any] | None = None ) -> None: """Callback to be notified of validation start, and create an instance of the event handler. Args: input: The string data to be validated. strict: Whether to validate the object in strict mode. context: The context to use for validation, this is passed to functional validators. """ pass pydantic-2.10.6/pydantic/plugin/_loader.py000066400000000000000000000041341474456633400205250ustar00rootroot00000000000000from __future__ import annotations import importlib.metadata as importlib_metadata import os import warnings from typing import TYPE_CHECKING, Final, Iterable if TYPE_CHECKING: from . 
import PydanticPluginProtocol PYDANTIC_ENTRY_POINT_GROUP: Final[str] = 'pydantic' # cache of plugins _plugins: dict[str, PydanticPluginProtocol] | None = None # return no plugins while loading plugins, to avoid recursion and errors while importing plugins; # this means that plugins which themselves use pydantic will not trigger plugin loading during import _loading_plugins: bool = False def get_plugins() -> Iterable[PydanticPluginProtocol]: """Load plugins for Pydantic. Inspired by: https://github.com/pytest-dev/pluggy/blob/1.3.0/src/pluggy/_manager.py#L376-L402 """ disabled_plugins = os.getenv('PYDANTIC_DISABLE_PLUGINS') global _plugins, _loading_plugins if _loading_plugins: # this happens when plugins themselves use pydantic, we return no plugins return () elif disabled_plugins in ('__all__', '1', 'true'): return () elif _plugins is None: _plugins = {} # set _loading_plugins so any plugins that use pydantic don't themselves use plugins _loading_plugins = True try: for dist in importlib_metadata.distributions(): for entry_point in dist.entry_points: if entry_point.group != PYDANTIC_ENTRY_POINT_GROUP: continue if entry_point.value in _plugins: continue if disabled_plugins is not None and entry_point.name in disabled_plugins.split(','): continue try: _plugins[entry_point.value] = entry_point.load() except (ImportError, AttributeError) as e: warnings.warn( f'{e.__class__.__name__} while loading the `{entry_point.name}` Pydantic plugin, ' f'this plugin will not be installed.\n\n{e!r}' ) finally: _loading_plugins = False return _plugins.values() pydantic-2.10.6/pydantic/plugin/_schema_validator.py """Pluggable schema validator for pydantic.""" from __future__ import annotations import functools from typing import TYPE_CHECKING, Any, Callable, Iterable, TypeVar from pydantic_core import CoreConfig, CoreSchema, SchemaValidator, ValidationError from typing_extensions import Literal, ParamSpec if TYPE_CHECKING: from .
import BaseValidateHandlerProtocol, PydanticPluginProtocol, SchemaKind, SchemaTypePath P = ParamSpec('P') R = TypeVar('R') Event = Literal['on_validate_python', 'on_validate_json', 'on_validate_strings'] events: list[Event] = list(Event.__args__) # type: ignore def create_schema_validator( schema: CoreSchema, schema_type: Any, schema_type_module: str, schema_type_name: str, schema_kind: SchemaKind, config: CoreConfig | None = None, plugin_settings: dict[str, Any] | None = None, ) -> SchemaValidator | PluggableSchemaValidator: """Create a `SchemaValidator` or `PluggableSchemaValidator` if plugins are installed. Returns: If plugins are installed then return `PluggableSchemaValidator`, otherwise return `SchemaValidator`. """ from . import SchemaTypePath from ._loader import get_plugins plugins = get_plugins() if plugins: return PluggableSchemaValidator( schema, schema_type, SchemaTypePath(schema_type_module, schema_type_name), schema_kind, config, plugins, plugin_settings or {}, ) else: return SchemaValidator(schema, config) class PluggableSchemaValidator: """Pluggable schema validator.""" __slots__ = '_schema_validator', 'validate_json', 'validate_python', 'validate_strings' def __init__( self, schema: CoreSchema, schema_type: Any, schema_type_path: SchemaTypePath, schema_kind: SchemaKind, config: CoreConfig | None, plugins: Iterable[PydanticPluginProtocol], plugin_settings: dict[str, Any], ) -> None: self._schema_validator = SchemaValidator(schema, config) python_event_handlers: list[BaseValidateHandlerProtocol] = [] json_event_handlers: list[BaseValidateHandlerProtocol] = [] strings_event_handlers: list[BaseValidateHandlerProtocol] = [] for plugin in plugins: try: p, j, s = plugin.new_schema_validator( schema, schema_type, schema_type_path, schema_kind, config, plugin_settings ) except TypeError as e: # pragma: no cover raise TypeError(f'Error using plugin `{plugin.__module__}:{plugin.__class__.__name__}`: {e}') from e if p is not None: 
python_event_handlers.append(p) if j is not None: json_event_handlers.append(j) if s is not None: strings_event_handlers.append(s) self.validate_python = build_wrapper(self._schema_validator.validate_python, python_event_handlers) self.validate_json = build_wrapper(self._schema_validator.validate_json, json_event_handlers) self.validate_strings = build_wrapper(self._schema_validator.validate_strings, strings_event_handlers) def __getattr__(self, name: str) -> Any: return getattr(self._schema_validator, name) def build_wrapper(func: Callable[P, R], event_handlers: list[BaseValidateHandlerProtocol]) -> Callable[P, R]: if not event_handlers: return func else: on_enters = tuple(h.on_enter for h in event_handlers if filter_handlers(h, 'on_enter')) on_successes = tuple(h.on_success for h in event_handlers if filter_handlers(h, 'on_success')) on_errors = tuple(h.on_error for h in event_handlers if filter_handlers(h, 'on_error')) on_exceptions = tuple(h.on_exception for h in event_handlers if filter_handlers(h, 'on_exception')) @functools.wraps(func) def wrapper(*args: P.args, **kwargs: P.kwargs) -> R: for on_enter_handler in on_enters: on_enter_handler(*args, **kwargs) try: result = func(*args, **kwargs) except ValidationError as error: for on_error_handler in on_errors: on_error_handler(error) raise except Exception as exception: for on_exception_handler in on_exceptions: on_exception_handler(exception) raise else: for on_success_handler in on_successes: on_success_handler(result) return result return wrapper def filter_handlers(handler_cls: BaseValidateHandlerProtocol, method_name: str) -> bool: """Filter out handler methods which are not implemented by the plugin directly - e.g. are missing or are inherited from the protocol. 
""" handler = getattr(handler_cls, method_name, None) if handler is None: return False elif handler.__module__ == 'pydantic.plugin': # this is the original handler, from the protocol due to runtime inheritance # we don't want to call it return False else: return True pydantic-2.10.6/pydantic/py.typed000066400000000000000000000000001474456633400167330ustar00rootroot00000000000000pydantic-2.10.6/pydantic/root_model.py000066400000000000000000000141271474456633400177700ustar00rootroot00000000000000"""RootModel class and type definitions.""" from __future__ import annotations as _annotations import typing from copy import copy, deepcopy from pydantic_core import PydanticUndefined from . import PydanticUserError from ._internal import _model_construction, _repr from .main import BaseModel, _object_setattr if typing.TYPE_CHECKING: from typing import Any from typing_extensions import Literal, Self, dataclass_transform from .fields import Field as PydanticModelField from .fields import PrivateAttr as PydanticModelPrivateAttr # dataclass_transform could be applied to RootModel directly, but `ModelMetaclass`'s dataclass_transform # takes priority (at least with pyright). We trick type checkers into thinking we apply dataclass_transform # on a new metaclass. @dataclass_transform(kw_only_default=False, field_specifiers=(PydanticModelField, PydanticModelPrivateAttr)) class _RootModelMetaclass(_model_construction.ModelMetaclass): ... else: _RootModelMetaclass = _model_construction.ModelMetaclass __all__ = ('RootModel',) RootModelRootType = typing.TypeVar('RootModelRootType') class RootModel(BaseModel, typing.Generic[RootModelRootType], metaclass=_RootModelMetaclass): """Usage docs: https://docs.pydantic.dev/2.10/concepts/models/#rootmodel-and-custom-root-types A Pydantic `BaseModel` for the root object of the model. Attributes: root: The root object of the model. __pydantic_root_model__: Whether the model is a RootModel. __pydantic_private__: Private fields in the model. 
__pydantic_extra__: Extra fields in the model. """ __pydantic_root_model__ = True __pydantic_private__ = None __pydantic_extra__ = None root: RootModelRootType def __init_subclass__(cls, **kwargs): extra = cls.model_config.get('extra') if extra is not None: raise PydanticUserError( "`RootModel` does not support setting `model_config['extra']`", code='root-model-extra' ) super().__init_subclass__(**kwargs) def __init__(self, /, root: RootModelRootType = PydanticUndefined, **data) -> None: # type: ignore __tracebackhide__ = True if data: if root is not PydanticUndefined: raise ValueError( '"RootModel.__init__" accepts either a single positional argument or arbitrary keyword arguments' ) root = data # type: ignore self.__pydantic_validator__.validate_python(root, self_instance=self) __init__.__pydantic_base_init__ = True # pyright: ignore[reportFunctionMemberAccess] @classmethod def model_construct(cls, root: RootModelRootType, _fields_set: set[str] | None = None) -> Self: # type: ignore """Create a new model using the provided root object and update fields set. Args: root: The root object of the model. _fields_set: The set of fields to be updated. Returns: The new model. Raises: NotImplemented: If the model is not a subclass of `RootModel`. 
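A short usage sketch of the `RootModel` class defined above (assumes pydantic v2 is importable; `Pets` is an illustrative name, not part of the library):

```python
from typing import List

from pydantic import RootModel

# A RootModel wraps a single "root" value, here a bare list of strings.
Pets = RootModel[List[str]]

pets = Pets(['dog', 'cat'])
assert pets.root == ['dog', 'cat']

# model_dump returns the root value itself, not a dict with a 'root' key.
assert pets.model_dump() == ['dog', 'cat']
```

This matches the `__init__` contract above: a single positional `root` argument, validated through `__pydantic_validator__`.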
""" return super().model_construct(root=root, _fields_set=_fields_set) def __getstate__(self) -> dict[Any, Any]: return { '__dict__': self.__dict__, '__pydantic_fields_set__': self.__pydantic_fields_set__, } def __setstate__(self, state: dict[Any, Any]) -> None: _object_setattr(self, '__pydantic_fields_set__', state['__pydantic_fields_set__']) _object_setattr(self, '__dict__', state['__dict__']) def __copy__(self) -> Self: """Returns a shallow copy of the model.""" cls = type(self) m = cls.__new__(cls) _object_setattr(m, '__dict__', copy(self.__dict__)) _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__)) return m def __deepcopy__(self, memo: dict[int, Any] | None = None) -> Self: """Returns a deep copy of the model.""" cls = type(self) m = cls.__new__(cls) _object_setattr(m, '__dict__', deepcopy(self.__dict__, memo=memo)) # This next line doesn't need a deepcopy because __pydantic_fields_set__ is a set[str], # and attempting a deepcopy would be marginally slower. _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__)) return m if typing.TYPE_CHECKING: def model_dump( # type: ignore self, *, mode: Literal['json', 'python'] | str = 'python', include: Any = None, exclude: Any = None, context: dict[str, Any] | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, serialize_as_any: bool = False, ) -> Any: """This method is included just to get a more accurate return type for type checkers. It is included in this `if TYPE_CHECKING:` block since no override is actually necessary. See the documentation of `BaseModel.model_dump` for more details about the arguments. Generally, this method will have a return type of `RootModelRootType`, assuming that `RootModelRootType` is not a `BaseModel` subclass. 
If `RootModelRootType` is a `BaseModel` subclass, then the return type will likely be `dict[str, Any]`, as `model_dump` calls are recursive. The return type could even be something different, in the case of a custom serializer. Thus, `Any` is used here to catch all of these cases. """ ... def __eq__(self, other: Any) -> bool: if not isinstance(other, RootModel): return NotImplemented return self.__pydantic_fields__['root'].annotation == other.__pydantic_fields__[ 'root' ].annotation and super().__eq__(other) def __repr_args__(self) -> _repr.ReprArgs: yield 'root', self.root pydantic-2.10.6/pydantic/schema.py000066400000000000000000000002161474456633400170570ustar00rootroot00000000000000"""The `schema` module is a backport module from V1.""" from ._migration import getattr_migration __getattr__ = getattr_migration(__name__) pydantic-2.10.6/pydantic/tools.py000066400000000000000000000002151474456633400167560ustar00rootroot00000000000000"""The `tools` module is a backport module from V1.""" from ._migration import getattr_migration __getattr__ = getattr_migration(__name__) pydantic-2.10.6/pydantic/type_adapter.py000066400000000000000000000700721474456633400203070ustar00rootroot00000000000000"""Type adapter specification.""" from __future__ import annotations as _annotations import sys from dataclasses import is_dataclass from types import FrameType from typing import ( Any, Generic, Iterable, Literal, TypeVar, cast, final, overload, ) from pydantic_core import CoreSchema, SchemaSerializer, SchemaValidator, Some from typing_extensions import ParamSpec, is_typeddict from pydantic.errors import PydanticUserError from pydantic.main import BaseModel, IncEx from ._internal import _config, _generate_schema, _mock_val_ser, _namespace_utils, _repr, _typing_extra, _utils from .config import ConfigDict from .errors import PydanticUndefinedAnnotation from .json_schema import ( DEFAULT_REF_TEMPLATE, GenerateJsonSchema, JsonSchemaKeyT, JsonSchemaMode, JsonSchemaValue, ) from 
.plugin._schema_validator import PluggableSchemaValidator, create_schema_validator T = TypeVar('T') R = TypeVar('R') P = ParamSpec('P') TypeAdapterT = TypeVar('TypeAdapterT', bound='TypeAdapter') def _getattr_no_parents(obj: Any, attribute: str) -> Any: """Returns the attribute value without attempting to look up attributes from parent types.""" if hasattr(obj, '__dict__'): try: return obj.__dict__[attribute] except KeyError: pass slots = getattr(obj, '__slots__', None) if slots is not None and attribute in slots: return getattr(obj, attribute) else: raise AttributeError(attribute) def _type_has_config(type_: Any) -> bool: """Returns whether the type has config.""" type_ = _typing_extra.annotated_type(type_) or type_ try: return issubclass(type_, BaseModel) or is_dataclass(type_) or is_typeddict(type_) except TypeError: # type is not a class return False @final class TypeAdapter(Generic[T]): """Usage docs: https://docs.pydantic.dev/2.10/concepts/type_adapter/ Type adapters provide a flexible way to perform validation and serialization based on a Python type. A `TypeAdapter` instance exposes some of the functionality from `BaseModel` instance methods for types that do not have such methods (such as dataclasses, primitive types, and more). **Note:** `TypeAdapter` instances are not types, and cannot be used as type annotations for fields. Args: type: The type associated with the `TypeAdapter`. config: Configuration for the `TypeAdapter`, should be a dictionary conforming to [`ConfigDict`][pydantic.config.ConfigDict]. !!! note You cannot provide a configuration when instantiating a `TypeAdapter` if the type you're using has its own config that cannot be overridden (ex: `BaseModel`, `TypedDict`, and `dataclass`). A [`type-adapter-config-unused`](../errors/usage_errors.md#type-adapter-config-unused) error will be raised in this case. _parent_depth: Depth at which to search for the [parent frame][frame-objects]. 
This frame is used when resolving forward annotations during schema building, by looking for the globals and locals of this frame. Defaults to 2, which will result in the frame where the `TypeAdapter` was instantiated. !!! note This parameter is named with an underscore to suggest its private nature and discourage use. It may be deprecated in a minor version, so we only recommend using it if you're comfortable with potential change in behavior/support. It's default value is 2 because internally, the `TypeAdapter` class makes another call to fetch the frame. module: The module that passes to plugin if provided. Attributes: core_schema: The core schema for the type. validator: The schema validator for the type. serializer: The schema serializer for the type. pydantic_complete: Whether the core schema for the type is successfully built. ??? tip "Compatibility with `mypy`" Depending on the type used, `mypy` might raise an error when instantiating a `TypeAdapter`. As a workaround, you can explicitly annotate your variable: ```py from typing import Union from pydantic import TypeAdapter ta: TypeAdapter[Union[str, int]] = TypeAdapter(Union[str, int]) # type: ignore[arg-type] ``` ??? info "Namespace management nuances and implementation details" Here, we collect some notes on namespace management, and subtle differences from `BaseModel`: `BaseModel` uses its own `__module__` to find out where it was defined and then looks for symbols to resolve forward references in those globals. On the other hand, `TypeAdapter` can be initialized with arbitrary objects, which may not be types and thus do not have a `__module__` available. So instead we look at the globals in our parent stack frame. It is expected that the `ns_resolver` passed to this function will have the correct namespace for the type we're adapting. See the source code for `TypeAdapter.__init__` and `TypeAdapter.rebuild` for various ways to construct this namespace. 
This works for the case where this function is called in a module that has the target of forward references in its scope, but does not always work for more complex cases. For example, take the following: ```python {title="a.py"} from typing import Dict, List IntList = List[int] OuterDict = Dict[str, 'IntList'] ``` ```python {test="skip" title="b.py"} from a import OuterDict from pydantic import TypeAdapter IntList = int # replaces the symbol the forward reference is looking for v = TypeAdapter(OuterDict) v({'x': 1}) # should fail but doesn't ``` If `OuterDict` were a `BaseModel`, this would work because it would resolve the forward reference within the `a.py` namespace. But `TypeAdapter(OuterDict)` can't determine what module `OuterDict` came from. In other words, the assumption that _all_ forward references exist in the module we are being called from is not technically always true. Although most of the time it is and it works fine for recursive models and such, `BaseModel`'s behavior isn't perfect either and _can_ break in similar ways, so there is no right or wrong between the two. But at the very least this behavior is _subtly_ different from `BaseModel`'s. """ core_schema: CoreSchema validator: SchemaValidator | PluggableSchemaValidator serializer: SchemaSerializer pydantic_complete: bool @overload def __init__( self, type: type[T], *, config: ConfigDict | None = ..., _parent_depth: int = ..., module: str | None = ..., ) -> None: ... # This second overload is for unsupported special forms (such as Annotated, Union, etc.) # Currently there is no way to type this correctly # See https://github.com/python/typing/pull/1618 @overload def __init__( self, type: Any, *, config: ConfigDict | None = ..., _parent_depth: int = ..., module: str | None = ..., ) -> None: ... 
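A minimal construction-and-validation sketch for `TypeAdapter` (assumes pydantic v2 is importable; the explicit variable annotation mirrors the mypy workaround mentioned in the docstring above):

```python
from typing import List

from pydantic import TypeAdapter

# TypeAdapter accepts arbitrary types, including special forms like List[int].
ta: TypeAdapter[List[int]] = TypeAdapter(List[int])

# In lax mode, numeric strings are coerced to int.
assert ta.validate_python(['1', 2]) == [1, 2]
```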
def __init__( self, type: Any, *, config: ConfigDict | None = None, _parent_depth: int = 2, module: str | None = None, ) -> None: if _type_has_config(type) and config is not None: raise PydanticUserError( 'Cannot use `config` when the type is a BaseModel, dataclass or TypedDict.' ' These types can have their own config and setting the config via the `config`' ' parameter to TypeAdapter will not override it, thus the `config` you passed to' ' TypeAdapter becomes meaningless, which is probably not what you want.', code='type-adapter-config-unused', ) self._type = type self._config = config self._parent_depth = _parent_depth self.pydantic_complete = False parent_frame = self._fetch_parent_frame() if parent_frame is not None: globalns = parent_frame.f_globals # Do not provide a local ns if the type adapter happens to be instantiated at the module level: localns = parent_frame.f_locals if parent_frame.f_locals is not globalns else {} else: globalns = {} localns = {} self._module_name = module or cast(str, globalns.get('__name__', '')) self._init_core_attrs( ns_resolver=_namespace_utils.NsResolver( namespaces_tuple=_namespace_utils.NamespacesTuple(locals=localns, globals=globalns), parent_namespace=localns, ), force=False, ) def _fetch_parent_frame(self) -> FrameType | None: frame = sys._getframe(self._parent_depth) if frame.f_globals.get('__name__') == 'typing': # Because `TypeAdapter` is generic, explicitly parametrizing the class results # in a `typing._GenericAlias` instance, which proxies instantiation calls to the # "real" `TypeAdapter` class and thus adding an extra frame to the call. To avoid # pulling anything from the `typing` module, use the correct frame (the one before): return frame.f_back return frame def _init_core_attrs( self, ns_resolver: _namespace_utils.NsResolver, force: bool, raise_errors: bool = False ) -> bool: """Initialize the core schema, validator, and serializer for the type. 
Args: ns_resolver: The namespace resolver to use when building the core schema for the adapted type. force: Whether to force the construction of the core schema, validator, and serializer. If `force` is set to `False` and `_defer_build` is `True`, the core schema, validator, and serializer will be set to mocks. raise_errors: Whether to raise errors if initializing any of the core attrs fails. Returns: `True` if the core schema, validator, and serializer were successfully initialized, otherwise `False`. Raises: PydanticUndefinedAnnotation: If `PydanticUndefinedAnnotation` occurs in`__get_pydantic_core_schema__` and `raise_errors=True`. """ if not force and self._defer_build: _mock_val_ser.set_type_adapter_mocks(self, str(self._type)) self.pydantic_complete = False return False try: self.core_schema = _getattr_no_parents(self._type, '__pydantic_core_schema__') self.validator = _getattr_no_parents(self._type, '__pydantic_validator__') self.serializer = _getattr_no_parents(self._type, '__pydantic_serializer__') # TODO: we don't go through the rebuild logic here directly because we don't want # to repeat all of the namespace fetching logic that we've already done # so we simply skip to the block below that does the actual schema generation if ( isinstance(self.core_schema, _mock_val_ser.MockCoreSchema) or isinstance(self.validator, _mock_val_ser.MockValSer) or isinstance(self.serializer, _mock_val_ser.MockValSer) ): raise AttributeError() except AttributeError: config_wrapper = _config.ConfigWrapper(self._config) schema_generator = _generate_schema.GenerateSchema(config_wrapper, ns_resolver=ns_resolver) try: core_schema = schema_generator.generate_schema(self._type) except PydanticUndefinedAnnotation: if raise_errors: raise _mock_val_ser.set_type_adapter_mocks(self, str(self._type)) return False try: self.core_schema = schema_generator.clean_schema(core_schema) except schema_generator.CollectedInvalid: _mock_val_ser.set_type_adapter_mocks(self, str(self._type)) return 
False core_config = config_wrapper.core_config(None) self.validator = create_schema_validator( schema=self.core_schema, schema_type=self._type, schema_type_module=self._module_name, schema_type_name=str(self._type), schema_kind='TypeAdapter', config=core_config, plugin_settings=config_wrapper.plugin_settings, ) self.serializer = SchemaSerializer(self.core_schema, core_config) self.pydantic_complete = True return True @property def _defer_build(self) -> bool: config = self._config if self._config is not None else self._model_config if config: return config.get('defer_build') is True return False @property def _model_config(self) -> ConfigDict | None: type_: Any = _typing_extra.annotated_type(self._type) or self._type # Eg FastAPI heavily uses Annotated if _utils.lenient_issubclass(type_, BaseModel): return type_.model_config return getattr(type_, '__pydantic_config__', None) def __repr__(self) -> str: return f'TypeAdapter({_repr.display_as_type(self._type)})' def rebuild( self, *, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: _namespace_utils.MappingNamespace | None = None, ) -> bool | None: """Try to rebuild the pydantic-core schema for the adapter's type. This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails. Args: force: Whether to force the rebuilding of the type adapter's schema, defaults to `False`. raise_errors: Whether to raise errors, defaults to `True`. _parent_namespace_depth: Depth at which to search for the [parent frame][frame-objects]. This frame is used when resolving forward annotations during schema rebuilding, by looking for the locals of this frame. Defaults to 2, which will result in the frame where the method was called. _types_namespace: An explicit types namespace to use, instead of using the local namespace from the parent frame. Defaults to `None`. 
Returns: Returns `None` if the schema is already "complete" and rebuilding was not required. If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`. """ if not force and self.pydantic_complete: return None if _types_namespace is not None: rebuild_ns = _types_namespace elif _parent_namespace_depth > 0: rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {} else: rebuild_ns = {} # we have to manually fetch globals here because there's no type on the stack of the NsResolver # and so we skip the globalns = get_module_ns_of(typ) call that would normally happen globalns = sys._getframe(max(_parent_namespace_depth - 1, 1)).f_globals ns_resolver = _namespace_utils.NsResolver( namespaces_tuple=_namespace_utils.NamespacesTuple(locals=rebuild_ns, globals=globalns), parent_namespace=rebuild_ns, ) return self._init_core_attrs(ns_resolver=ns_resolver, force=True, raise_errors=raise_errors) def validate_python( self, object: Any, /, *, strict: bool | None = None, from_attributes: bool | None = None, context: dict[str, Any] | None = None, experimental_allow_partial: bool | Literal['off', 'on', 'trailing-strings'] = False, ) -> T: """Validate a Python object against the model. Args: object: The Python object to validate against the model. strict: Whether to strictly check types. from_attributes: Whether to extract data from object attributes. context: Additional context to pass to the validator. experimental_allow_partial: **Experimental** whether to enable [partial validation](../concepts/experimental.md#partial-validation), e.g. to process streams. * False / 'off': Default behavior, no partial validation. * True / 'on': Enable partial validation. * 'trailing-strings': Enable partial validation and allow trailing strings in the input. !!! note When using `TypeAdapter` with a Pydantic `dataclass`, the use of the `from_attributes` argument is not supported. Returns: The validated object. 
""" return self.validator.validate_python( object, strict=strict, from_attributes=from_attributes, context=context, allow_partial=experimental_allow_partial, ) def validate_json( self, data: str | bytes | bytearray, /, *, strict: bool | None = None, context: dict[str, Any] | None = None, experimental_allow_partial: bool | Literal['off', 'on', 'trailing-strings'] = False, ) -> T: """Usage docs: https://docs.pydantic.dev/2.10/concepts/json/#json-parsing Validate a JSON string or bytes against the model. Args: data: The JSON data to validate against the model. strict: Whether to strictly check types. context: Additional context to use during validation. experimental_allow_partial: **Experimental** whether to enable [partial validation](../concepts/experimental.md#partial-validation), e.g. to process streams. * False / 'off': Default behavior, no partial validation. * True / 'on': Enable partial validation. * 'trailing-strings': Enable partial validation and allow trailing strings in the input. Returns: The validated object. """ return self.validator.validate_json( data, strict=strict, context=context, allow_partial=experimental_allow_partial ) def validate_strings( self, obj: Any, /, *, strict: bool | None = None, context: dict[str, Any] | None = None, experimental_allow_partial: bool | Literal['off', 'on', 'trailing-strings'] = False, ) -> T: """Validate object contains string data against the model. Args: obj: The object contains string data to validate. strict: Whether to strictly check types. context: Additional context to use during validation. experimental_allow_partial: **Experimental** whether to enable [partial validation](../concepts/experimental.md#partial-validation), e.g. to process streams. * False / 'off': Default behavior, no partial validation. * True / 'on': Enable partial validation. * 'trailing-strings': Enable partial validation and allow trailing strings in the input. Returns: The validated object. 
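The three validation entry points documented above can be compared in one short sketch (assumes pydantic v2 is importable; `ta` is an illustrative name):

```python
from typing import Dict

from pydantic import TypeAdapter

ta = TypeAdapter(Dict[str, int])

# validate_python: arbitrary Python input.
assert ta.validate_python({'a': 1}) == {'a': 1}

# validate_json: JSON text (str/bytes) input.
assert ta.validate_json('{"a": 1}') == {'a': 1}

# validate_strings: an object whose leaf values are strings.
assert ta.validate_strings({'a': '1'}) == {'a': 1}
```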
""" return self.validator.validate_strings( obj, strict=strict, context=context, allow_partial=experimental_allow_partial ) def get_default_value(self, *, strict: bool | None = None, context: dict[str, Any] | None = None) -> Some[T] | None: """Get the default value for the wrapped type. Args: strict: Whether to strictly check types. context: Additional context to pass to the validator. Returns: The default value wrapped in a `Some` if there is one or None if not. """ return self.validator.get_default_value(strict=strict, context=context) def dump_python( self, instance: T, /, *, mode: Literal['json', 'python'] = 'python', include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, serialize_as_any: bool = False, context: dict[str, Any] | None = None, ) -> Any: """Dump an instance of the adapted type to a Python object. Args: instance: The Python object to serialize. mode: The output format. include: Fields to include in the output. exclude: Fields to exclude from the output. by_alias: Whether to use alias names for field names. exclude_unset: Whether to exclude unset fields. exclude_defaults: Whether to exclude fields with default values. exclude_none: Whether to exclude fields with None values. round_trip: Whether to output the serialized data in a way that is compatible with deserialization. warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError]. serialize_as_any: Whether to serialize fields with duck-typing serialization behavior. context: Additional context to pass to the serializer. Returns: The serialized object. 
""" return self.serializer.to_python( instance, mode=mode, by_alias=by_alias, include=include, exclude=exclude, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, round_trip=round_trip, warnings=warnings, serialize_as_any=serialize_as_any, context=context, ) def dump_json( self, instance: T, /, *, indent: int | None = None, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, serialize_as_any: bool = False, context: dict[str, Any] | None = None, ) -> bytes: """Usage docs: https://docs.pydantic.dev/2.10/concepts/json/#json-serialization Serialize an instance of the adapted type to JSON. Args: instance: The instance to be serialized. indent: Number of spaces for JSON indentation. include: Fields to include. exclude: Fields to exclude. by_alias: Whether to use alias names for field names. exclude_unset: Whether to exclude unset fields. exclude_defaults: Whether to exclude fields with default values. exclude_none: Whether to exclude fields with a value of `None`. round_trip: Whether to serialize and deserialize the instance to ensure round-tripping. warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError]. serialize_as_any: Whether to serialize fields with duck-typing serialization behavior. context: Additional context to pass to the serializer. Returns: The JSON representation of the given instance as bytes. 
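A brief sketch contrasting `dump_python` and `dump_json` for a type with a non-trivial JSON representation (assumes pydantic v2 is importable):

```python
from datetime import date

from pydantic import TypeAdapter

ta = TypeAdapter(date)
d = ta.validate_python('2024-01-02')

# mode='json' produces JSON-compatible Python objects (here, a str).
assert ta.dump_python(d, mode='json') == '2024-01-02'

# dump_json produces JSON bytes directly.
assert ta.dump_json(d) == b'"2024-01-02"'
```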
""" return self.serializer.to_json( instance, indent=indent, include=include, exclude=exclude, by_alias=by_alias, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, round_trip=round_trip, warnings=warnings, serialize_as_any=serialize_as_any, context=context, ) def json_schema( self, *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', ) -> dict[str, Any]: """Generate a JSON schema for the adapted type. Args: by_alias: Whether to use alias names for field names. ref_template: The format string used for generating $ref strings. schema_generator: The generator class used for creating the schema. mode: The mode to use for schema generation. Returns: The JSON schema for the model as a dictionary. """ schema_generator_instance = schema_generator(by_alias=by_alias, ref_template=ref_template) if isinstance(self.core_schema, _mock_val_ser.MockCoreSchema): self.core_schema.rebuild() assert not isinstance(self.core_schema, _mock_val_ser.MockCoreSchema), 'this is a bug! please report it' return schema_generator_instance.generate(self.core_schema, mode=mode) @staticmethod def json_schemas( inputs: Iterable[tuple[JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any]]], /, *, by_alias: bool = True, title: str | None = None, description: str | None = None, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, ) -> tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], JsonSchemaValue]: """Generate a JSON schema including definitions from multiple type adapters. Args: inputs: Inputs to schema generation. The first two items will form the keys of the (first) output mapping; the type adapters will provide the core schemas that get converted into definitions in the output JSON schema. by_alias: Whether to use alias names. title: The title for the schema. 
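`json_schema` can be demonstrated minimally (assumes pydantic v2 is importable):

```python
from pydantic import TypeAdapter

schema = TypeAdapter(int).json_schema()
assert schema == {'type': 'integer'}
```

For types that reference shared definitions, the static `json_schemas` method described below aggregates several adapters into one schema with a common `$defs` section.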
description: The description for the schema. ref_template: The format string used for generating $ref strings. schema_generator: The generator class used for creating the schema. Returns: A tuple where: - The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.) - The second element is a JSON schema containing all definitions referenced in the first returned element, along with the optional title and description keys. """ schema_generator_instance = schema_generator(by_alias=by_alias, ref_template=ref_template) inputs_ = [] for key, mode, adapter in inputs: # This is the same pattern we follow for model json schemas - we attempt a core schema rebuild if we detect a mock if isinstance(adapter.core_schema, _mock_val_ser.MockCoreSchema): adapter.core_schema.rebuild() assert not isinstance( adapter.core_schema, _mock_val_ser.MockCoreSchema ), 'this is a bug! 
please report it' inputs_.append((key, mode, adapter.core_schema)) json_schemas_map, definitions = schema_generator_instance.generate_definitions(inputs_) json_schema: dict[str, Any] = {} if definitions: json_schema['$defs'] = definitions if title: json_schema['title'] = title if description: json_schema['description'] = description return json_schemas_map, json_schema pydantic-2.10.6/pydantic/types.py000066400000000000000000003130261474456633400167710ustar00rootroot00000000000000"""The types module contains custom types used by pydantic.""" from __future__ import annotations as _annotations import base64 import dataclasses as _dataclasses import re from datetime import date, datetime from decimal import Decimal from enum import Enum from pathlib import Path from types import ModuleType from typing import ( TYPE_CHECKING, Any, Callable, ClassVar, Dict, FrozenSet, Generic, Hashable, Iterator, List, Pattern, Set, TypeVar, Union, cast, get_args, get_origin, ) from uuid import UUID import annotated_types from annotated_types import BaseMetadata, MaxLen, MinLen from pydantic_core import CoreSchema, PydanticCustomError, SchemaSerializer, core_schema from typing_extensions import Annotated, Literal, Protocol, TypeAlias, TypeAliasType, deprecated from ._internal import _core_utils, _fields, _internal_dataclass, _typing_extra, _utils, _validators from ._migration import getattr_migration from .annotated_handlers import GetCoreSchemaHandler, GetJsonSchemaHandler from .errors import PydanticUserError from .json_schema import JsonSchemaValue from .warnings import PydanticDeprecatedSince20 __all__ = ( 'Strict', 'StrictStr', 'SocketPath', 'conbytes', 'conlist', 'conset', 'confrozenset', 'constr', 'ImportString', 'conint', 'PositiveInt', 'NegativeInt', 'NonNegativeInt', 'NonPositiveInt', 'confloat', 'PositiveFloat', 'NegativeFloat', 'NonNegativeFloat', 'NonPositiveFloat', 'FiniteFloat', 'condecimal', 'UUID1', 'UUID3', 'UUID4', 'UUID5', 'FilePath', 'DirectoryPath', 'NewPath', 
'Json', 'Secret', 'SecretStr', 'SecretBytes', 'StrictBool', 'StrictBytes', 'StrictInt', 'StrictFloat', 'PaymentCardNumber', 'ByteSize', 'PastDate', 'FutureDate', 'PastDatetime', 'FutureDatetime', 'condate', 'AwareDatetime', 'NaiveDatetime', 'AllowInfNan', 'EncoderProtocol', 'EncodedBytes', 'EncodedStr', 'Base64Encoder', 'Base64Bytes', 'Base64Str', 'Base64UrlBytes', 'Base64UrlStr', 'GetPydanticSchema', 'StringConstraints', 'Tag', 'Discriminator', 'JsonValue', 'OnErrorOmit', 'FailFast', ) T = TypeVar('T') @_dataclasses.dataclass class Strict(_fields.PydanticMetadata, BaseMetadata): """Usage docs: https://docs.pydantic.dev/2.10/concepts/strict_mode/#strict-mode-with-annotated-strict A field metadata class to indicate that a field should be validated in strict mode. Use this class as an annotation via [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated), as seen below. Attributes: strict: Whether to validate the field in strict mode. Example: ```python from typing_extensions import Annotated from pydantic.types import Strict StrictBool = Annotated[bool, Strict()] ``` """ strict: bool = True def __hash__(self) -> int: return hash(self.strict) # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ BOOLEAN TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ StrictBool = Annotated[bool, Strict()] """A boolean that must be either ``True`` or ``False``.""" # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ INTEGER TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ def conint( *, strict: bool | None = None, gt: int | None = None, ge: int | None = None, lt: int | None = None, le: int | None = None, multiple_of: int | None = None, ) -> type[int]: """ !!! warning "Discouraged" This function is **discouraged** in favor of using [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated) with [`Field`][pydantic.fields.Field] instead. This function will be **deprecated** in Pydantic 3.0. The reason is that `conint` returns a type, which doesn't play well with static analysis tools. 
=== ":x: Don't do this" ```python from pydantic import BaseModel, conint class Foo(BaseModel): bar: conint(strict=True, gt=0) ``` === ":white_check_mark: Do this" ```python from typing_extensions import Annotated from pydantic import BaseModel, Field class Foo(BaseModel): bar: Annotated[int, Field(strict=True, gt=0)] ``` A wrapper around `int` that allows for additional constraints. Args: strict: Whether to validate the integer in strict mode. Defaults to `None`. gt: The value must be greater than this. ge: The value must be greater than or equal to this. lt: The value must be less than this. le: The value must be less than or equal to this. multiple_of: The value must be a multiple of this. Returns: The wrapped integer type. ```python from pydantic import BaseModel, ValidationError, conint class ConstrainedExample(BaseModel): constrained_int: conint(gt=1) m = ConstrainedExample(constrained_int=2) print(repr(m)) #> ConstrainedExample(constrained_int=2) try: ConstrainedExample(constrained_int=0) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'greater_than', 'loc': ('constrained_int',), 'msg': 'Input should be greater than 1', 'input': 0, 'ctx': {'gt': 1}, 'url': 'https://errors.pydantic.dev/2/v/greater_than', } ] ''' ``` """ # noqa: D212 return Annotated[ # pyright: ignore[reportReturnType] int, Strict(strict) if strict is not None else None, annotated_types.Interval(gt=gt, ge=ge, lt=lt, le=le), annotated_types.MultipleOf(multiple_of) if multiple_of is not None else None, ] PositiveInt = Annotated[int, annotated_types.Gt(0)] """An integer that must be greater than zero. 
```python from pydantic import BaseModel, PositiveInt, ValidationError class Model(BaseModel): positive_int: PositiveInt m = Model(positive_int=1) print(repr(m)) #> Model(positive_int=1) try: Model(positive_int=-1) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'greater_than', 'loc': ('positive_int',), 'msg': 'Input should be greater than 0', 'input': -1, 'ctx': {'gt': 0}, 'url': 'https://errors.pydantic.dev/2/v/greater_than', } ] ''' ``` """ NegativeInt = Annotated[int, annotated_types.Lt(0)] """An integer that must be less than zero. ```python from pydantic import BaseModel, NegativeInt, ValidationError class Model(BaseModel): negative_int: NegativeInt m = Model(negative_int=-1) print(repr(m)) #> Model(negative_int=-1) try: Model(negative_int=1) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'less_than', 'loc': ('negative_int',), 'msg': 'Input should be less than 0', 'input': 1, 'ctx': {'lt': 0}, 'url': 'https://errors.pydantic.dev/2/v/less_than', } ] ''' ``` """ NonPositiveInt = Annotated[int, annotated_types.Le(0)] """An integer that must be less than or equal to zero. ```python from pydantic import BaseModel, NonPositiveInt, ValidationError class Model(BaseModel): non_positive_int: NonPositiveInt m = Model(non_positive_int=0) print(repr(m)) #> Model(non_positive_int=0) try: Model(non_positive_int=1) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'less_than_equal', 'loc': ('non_positive_int',), 'msg': 'Input should be less than or equal to 0', 'input': 1, 'ctx': {'le': 0}, 'url': 'https://errors.pydantic.dev/2/v/less_than_equal', } ] ''' ``` """ NonNegativeInt = Annotated[int, annotated_types.Ge(0)] """An integer that must be greater than or equal to zero. 
```python from pydantic import BaseModel, NonNegativeInt, ValidationError class Model(BaseModel): non_negative_int: NonNegativeInt m = Model(non_negative_int=0) print(repr(m)) #> Model(non_negative_int=0) try: Model(non_negative_int=-1) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'greater_than_equal', 'loc': ('non_negative_int',), 'msg': 'Input should be greater than or equal to 0', 'input': -1, 'ctx': {'ge': 0}, 'url': 'https://errors.pydantic.dev/2/v/greater_than_equal', } ] ''' ``` """ StrictInt = Annotated[int, Strict()] """An integer that must be validated in strict mode. ```python from pydantic import BaseModel, StrictInt, ValidationError class StrictIntModel(BaseModel): strict_int: StrictInt try: StrictIntModel(strict_int=3.14159) except ValidationError as e: print(e) ''' 1 validation error for StrictIntModel strict_int Input should be a valid integer [type=int_type, input_value=3.14159, input_type=float] ''' ``` """ # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ FLOAT TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @_dataclasses.dataclass class AllowInfNan(_fields.PydanticMetadata): """A field metadata class to indicate that a field should allow `-inf`, `inf`, and `nan`. Use this class as an annotation via [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated), as seen below. Attributes: allow_inf_nan: Whether to allow `-inf`, `inf`, and `nan`. Defaults to `True`. Example: ```python from typing_extensions import Annotated from pydantic.types import AllowInfNan LaxFloat = Annotated[float, AllowInfNan()] ``` """ allow_inf_nan: bool = True def __hash__(self) -> int: return hash(self.allow_inf_nan) def confloat( *, strict: bool | None = None, gt: float | None = None, ge: float | None = None, lt: float | None = None, le: float | None = None, multiple_of: float | None = None, allow_inf_nan: bool | None = None, ) -> type[float]: """ !!!
warning "Discouraged" This function is **discouraged** in favor of using [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated) with [`Field`][pydantic.fields.Field] instead. This function will be **deprecated** in Pydantic 3.0. The reason is that `confloat` returns a type, which doesn't play well with static analysis tools. === ":x: Don't do this" ```python from pydantic import BaseModel, confloat class Foo(BaseModel): bar: confloat(strict=True, gt=0) ``` === ":white_check_mark: Do this" ```python from typing_extensions import Annotated from pydantic import BaseModel, Field class Foo(BaseModel): bar: Annotated[float, Field(strict=True, gt=0)] ``` A wrapper around `float` that allows for additional constraints. Args: strict: Whether to validate the float in strict mode. gt: The value must be greater than this. ge: The value must be greater than or equal to this. lt: The value must be less than this. le: The value must be less than or equal to this. multiple_of: The value must be a multiple of this. allow_inf_nan: Whether to allow `-inf`, `inf`, and `nan`. Returns: The wrapped float type. 
```python from pydantic import BaseModel, ValidationError, confloat class ConstrainedExample(BaseModel): constrained_float: confloat(gt=1.0) m = ConstrainedExample(constrained_float=1.1) print(repr(m)) #> ConstrainedExample(constrained_float=1.1) try: ConstrainedExample(constrained_float=0.9) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'greater_than', 'loc': ('constrained_float',), 'msg': 'Input should be greater than 1', 'input': 0.9, 'ctx': {'gt': 1.0}, 'url': 'https://errors.pydantic.dev/2/v/greater_than', } ] ''' ``` """ # noqa: D212 return Annotated[ # pyright: ignore[reportReturnType] float, Strict(strict) if strict is not None else None, annotated_types.Interval(gt=gt, ge=ge, lt=lt, le=le), annotated_types.MultipleOf(multiple_of) if multiple_of is not None else None, AllowInfNan(allow_inf_nan) if allow_inf_nan is not None else None, ] PositiveFloat = Annotated[float, annotated_types.Gt(0)] """A float that must be greater than zero. ```python from pydantic import BaseModel, PositiveFloat, ValidationError class Model(BaseModel): positive_float: PositiveFloat m = Model(positive_float=1.0) print(repr(m)) #> Model(positive_float=1.0) try: Model(positive_float=-1.0) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'greater_than', 'loc': ('positive_float',), 'msg': 'Input should be greater than 0', 'input': -1.0, 'ctx': {'gt': 0.0}, 'url': 'https://errors.pydantic.dev/2/v/greater_than', } ] ''' ``` """ NegativeFloat = Annotated[float, annotated_types.Lt(0)] """A float that must be less than zero. 
```python from pydantic import BaseModel, NegativeFloat, ValidationError class Model(BaseModel): negative_float: NegativeFloat m = Model(negative_float=-1.0) print(repr(m)) #> Model(negative_float=-1.0) try: Model(negative_float=1.0) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'less_than', 'loc': ('negative_float',), 'msg': 'Input should be less than 0', 'input': 1.0, 'ctx': {'lt': 0.0}, 'url': 'https://errors.pydantic.dev/2/v/less_than', } ] ''' ``` """ NonPositiveFloat = Annotated[float, annotated_types.Le(0)] """A float that must be less than or equal to zero. ```python from pydantic import BaseModel, NonPositiveFloat, ValidationError class Model(BaseModel): non_positive_float: NonPositiveFloat m = Model(non_positive_float=0.0) print(repr(m)) #> Model(non_positive_float=0.0) try: Model(non_positive_float=1.0) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'less_than_equal', 'loc': ('non_positive_float',), 'msg': 'Input should be less than or equal to 0', 'input': 1.0, 'ctx': {'le': 0.0}, 'url': 'https://errors.pydantic.dev/2/v/less_than_equal', } ] ''' ``` """ NonNegativeFloat = Annotated[float, annotated_types.Ge(0)] """A float that must be greater than or equal to zero. ```python from pydantic import BaseModel, NonNegativeFloat, ValidationError class Model(BaseModel): non_negative_float: NonNegativeFloat m = Model(non_negative_float=0.0) print(repr(m)) #> Model(non_negative_float=0.0) try: Model(non_negative_float=-1.0) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'greater_than_equal', 'loc': ('non_negative_float',), 'msg': 'Input should be greater than or equal to 0', 'input': -1.0, 'ctx': {'ge': 0.0}, 'url': 'https://errors.pydantic.dev/2/v/greater_than_equal', } ] ''' ``` """ StrictFloat = Annotated[float, Strict(True)] """A float that must be validated in strict mode. 
```python from pydantic import BaseModel, StrictFloat, ValidationError class StrictFloatModel(BaseModel): strict_float: StrictFloat try: StrictFloatModel(strict_float='1.0') except ValidationError as e: print(e) ''' 1 validation error for StrictFloatModel strict_float Input should be a valid number [type=float_type, input_value='1.0', input_type=str] ''' ``` """ FiniteFloat = Annotated[float, AllowInfNan(False)] """A float that must be finite (not ``-inf``, ``inf``, or ``nan``). ```python from pydantic import BaseModel, FiniteFloat class Model(BaseModel): finite: FiniteFloat m = Model(finite=1.0) print(m) #> finite=1.0 ``` """ # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ BYTES TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ def conbytes( *, min_length: int | None = None, max_length: int | None = None, strict: bool | None = None, ) -> type[bytes]: """A wrapper around `bytes` that allows for additional constraints. Args: min_length: The minimum length of the bytes. max_length: The maximum length of the bytes. strict: Whether to validate the bytes in strict mode. Returns: The wrapped bytes type. """ return Annotated[ # pyright: ignore[reportReturnType] bytes, Strict(strict) if strict is not None else None, annotated_types.Len(min_length or 0, max_length), ] StrictBytes = Annotated[bytes, Strict()] """A bytes that must be validated in strict mode.""" # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ STRING TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @_dataclasses.dataclass(frozen=True) class StringConstraints(annotated_types.GroupedMetadata): """Usage docs: https://docs.pydantic.dev/2.10/concepts/fields/#string-constraints A field metadata class to apply constraints to `str` types. Use this class as an annotation via [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated), as seen below. Attributes: strip_whitespace: Whether to remove leading and trailing whitespace. to_upper: Whether to convert the string to uppercase. to_lower: Whether to convert the string to lowercase. 
strict: Whether to validate the string in strict mode. min_length: The minimum length of the string. max_length: The maximum length of the string. pattern: A regex pattern that the string must match. Example: ```python from typing_extensions import Annotated from pydantic.types import StringConstraints ConstrainedStr = Annotated[str, StringConstraints(min_length=1, max_length=10)] ``` """ strip_whitespace: bool | None = None to_upper: bool | None = None to_lower: bool | None = None strict: bool | None = None min_length: int | None = None max_length: int | None = None pattern: str | Pattern[str] | None = None def __iter__(self) -> Iterator[BaseMetadata]: if self.min_length is not None: yield MinLen(self.min_length) if self.max_length is not None: yield MaxLen(self.max_length) if self.strict is not None: yield Strict(self.strict) if ( self.strip_whitespace is not None or self.pattern is not None or self.to_lower is not None or self.to_upper is not None ): yield _fields.pydantic_general_metadata( strip_whitespace=self.strip_whitespace, to_upper=self.to_upper, to_lower=self.to_lower, pattern=self.pattern, ) def constr( *, strip_whitespace: bool | None = None, to_upper: bool | None = None, to_lower: bool | None = None, strict: bool | None = None, min_length: int | None = None, max_length: int | None = None, pattern: str | Pattern[str] | None = None, ) -> type[str]: """ !!! warning "Discouraged" This function is **discouraged** in favor of using [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated) with [`StringConstraints`][pydantic.types.StringConstraints] instead. This function will be **deprecated** in Pydantic 3.0. The reason is that `constr` returns a type, which doesn't play well with static analysis tools. 
=== ":x: Don't do this" ```python from pydantic import BaseModel, constr class Foo(BaseModel): bar: constr(strip_whitespace=True, to_upper=True, pattern=r'^[A-Z]+$') ``` === ":white_check_mark: Do this" ```python from typing_extensions import Annotated from pydantic import BaseModel, StringConstraints class Foo(BaseModel): bar: Annotated[ str, StringConstraints( strip_whitespace=True, to_upper=True, pattern=r'^[A-Z]+$' ), ] ``` A wrapper around `str` that allows for additional constraints. ```python from pydantic import BaseModel, constr class Foo(BaseModel): bar: constr(strip_whitespace=True, to_upper=True) foo = Foo(bar=' hello ') print(foo) #> bar='HELLO' ``` Args: strip_whitespace: Whether to remove leading and trailing whitespace. to_upper: Whether to turn all characters to uppercase. to_lower: Whether to turn all characters to lowercase. strict: Whether to validate the string in strict mode. min_length: The minimum length of the string. max_length: The maximum length of the string. pattern: A regex pattern to validate the string against. Returns: The wrapped string type. """ # noqa: D212 return Annotated[ # pyright: ignore[reportReturnType] str, StringConstraints( strip_whitespace=strip_whitespace, to_upper=to_upper, to_lower=to_lower, strict=strict, min_length=min_length, max_length=max_length, pattern=pattern, ), ] StrictStr = Annotated[str, Strict()] """A string that must be validated in strict mode.""" # ~~~~~~~~~~~~~~~~~~~~~~~~~~~ COLLECTION TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ HashableItemType = TypeVar('HashableItemType', bound=Hashable) def conset( item_type: type[HashableItemType], *, min_length: int | None = None, max_length: int | None = None ) -> type[set[HashableItemType]]: """A wrapper around `typing.Set` that allows for additional constraints. Args: item_type: The type of the items in the set. min_length: The minimum length of the set. max_length: The maximum length of the set. Returns: The wrapped set type. 
""" return Annotated[Set[item_type], annotated_types.Len(min_length or 0, max_length)] # pyright: ignore[reportReturnType] def confrozenset( item_type: type[HashableItemType], *, min_length: int | None = None, max_length: int | None = None ) -> type[frozenset[HashableItemType]]: """A wrapper around `typing.FrozenSet` that allows for additional constraints. Args: item_type: The type of the items in the frozenset. min_length: The minimum length of the frozenset. max_length: The maximum length of the frozenset. Returns: The wrapped frozenset type. """ return Annotated[FrozenSet[item_type], annotated_types.Len(min_length or 0, max_length)] # pyright: ignore[reportReturnType] AnyItemType = TypeVar('AnyItemType') def conlist( item_type: type[AnyItemType], *, min_length: int | None = None, max_length: int | None = None, unique_items: bool | None = None, ) -> type[list[AnyItemType]]: """A wrapper around typing.List that adds validation. Args: item_type: The type of the items in the list. min_length: The minimum length of the list. Defaults to None. max_length: The maximum length of the list. Defaults to None. unique_items: Whether the items in the list must be unique. Defaults to None. !!! warning Deprecated The `unique_items` parameter is deprecated, use `Set` instead. See [this issue](https://github.com/pydantic/pydantic-core/issues/296) for more details. Returns: The wrapped list type. """ if unique_items is not None: raise PydanticUserError( ( '`unique_items` is removed, use `Set` instead' '(this feature is discussed in https://github.com/pydantic/pydantic-core/issues/296)' ), code='removed-kwargs', ) return Annotated[List[item_type], annotated_types.Len(min_length or 0, max_length)] # pyright: ignore[reportReturnType] # ~~~~~~~~~~~~~~~~~~~~~~~~~~ IMPORT STRING TYPE ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ AnyType = TypeVar('AnyType') if TYPE_CHECKING: ImportString = Annotated[AnyType, ...] 
else: class ImportString: """A type that can be used to import a Python object from a string. `ImportString` expects a string and loads the Python object importable at that dotted path. Attributes of modules may be separated from the module by `:` or `.`, e.g. if `'math:cos'` is provided, the resulting field value would be the function `cos`. If a `.` is used and both an attribute and submodule are present at the same path, the module will be preferred. On model instantiation, pointers will be evaluated and imported. There is some nuance to this behavior, demonstrated in the examples below. ```python import math from pydantic import BaseModel, Field, ImportString, ValidationError class ImportThings(BaseModel): obj: ImportString # A string value will cause an automatic import my_cos = ImportThings(obj='math.cos') # You can use the imported function as you would expect cos_of_0 = my_cos.obj(0) assert cos_of_0 == 1 # A string whose value cannot be imported will raise an error try: ImportThings(obj='foo.bar') except ValidationError as e: print(e) ''' 1 validation error for ImportThings obj Invalid python path: No module named 'foo.bar' [type=import_error, input_value='foo.bar', input_type=str] ''' # Actual python objects can be assigned as well my_cos = ImportThings(obj=math.cos) my_cos_2 = ImportThings(obj='math.cos') my_cos_3 = ImportThings(obj='math:cos') assert my_cos == my_cos_2 == my_cos_3 # You can set default field value either as Python object: class ImportThingsDefaultPyObj(BaseModel): obj: ImportString = math.cos # or as a string value (but only if used with `validate_default=True`) class ImportThingsDefaultString(BaseModel): obj: ImportString = Field(default='math.cos', validate_default=True) my_cos_default1 = ImportThingsDefaultPyObj() my_cos_default2 = ImportThingsDefaultString() assert my_cos_default1.obj == my_cos_default2.obj == math.cos # note: this will not work! 
class ImportThingsMissingValidateDefault(BaseModel): obj: ImportString = 'math.cos' my_cos_default3 = ImportThingsMissingValidateDefault() assert my_cos_default3.obj == 'math.cos' # just string, not evaluated ``` Serializing an `ImportString` type to json is also possible. ```python from pydantic import BaseModel, ImportString class ImportThings(BaseModel): obj: ImportString # Create an instance m = ImportThings(obj='math.cos') print(m) #> obj=<built-in function cos> print(m.model_dump_json()) #> {"obj":"math.cos"} ``` """ @classmethod def __class_getitem__(cls, item: AnyType) -> AnyType: return Annotated[item, cls()] @classmethod def __get_pydantic_core_schema__( cls, source: type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: serializer = core_schema.plain_serializer_function_ser_schema(cls._serialize, when_used='json') if cls is source: # Treat bare usage of ImportString (`schema is None`) as the same as ImportString[Any] return core_schema.no_info_plain_validator_function( function=_validators.import_string, serialization=serializer ) else: return core_schema.no_info_before_validator_function( function=_validators.import_string, schema=handler(source), serialization=serializer ) @classmethod def __get_pydantic_json_schema__(cls, cs: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue: return handler(core_schema.str_schema()) @staticmethod def _serialize(v: Any) -> str: if isinstance(v, ModuleType): return v.__name__ elif hasattr(v, '__module__') and hasattr(v, '__name__'): return f'{v.__module__}.{v.__name__}' # Handle special cases for sys.XXX streams # if we see more of these, we should consider a more general solution elif hasattr(v, 'name'): if v.name == '<stdout>': return 'sys.stdout' elif v.name == '<stdin>': return 'sys.stdin' elif v.name == '<stderr>': return 'sys.stderr' else: return v def __repr__(self) -> str: return 'ImportString' # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ DECIMAL TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ def condecimal( *, strict: bool | None = None, gt: int |

Decimal | None = None, ge: int | Decimal | None = None, lt: int | Decimal | None = None, le: int | Decimal | None = None, multiple_of: int | Decimal | None = None, max_digits: int | None = None, decimal_places: int | None = None, allow_inf_nan: bool | None = None, ) -> type[Decimal]: """ !!! warning "Discouraged" This function is **discouraged** in favor of using [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated) with [`Field`][pydantic.fields.Field] instead. This function will be **deprecated** in Pydantic 3.0. The reason is that `condecimal` returns a type, which doesn't play well with static analysis tools. === ":x: Don't do this" ```python from pydantic import BaseModel, condecimal class Foo(BaseModel): bar: condecimal(strict=True, allow_inf_nan=True) ``` === ":white_check_mark: Do this" ```python from decimal import Decimal from typing_extensions import Annotated from pydantic import BaseModel, Field class Foo(BaseModel): bar: Annotated[Decimal, Field(strict=True, allow_inf_nan=True)] ``` A wrapper around Decimal that adds validation. Args: strict: Whether to validate the value in strict mode. Defaults to `None`. gt: The value must be greater than this. Defaults to `None`. ge: The value must be greater than or equal to this. Defaults to `None`. lt: The value must be less than this. Defaults to `None`. le: The value must be less than or equal to this. Defaults to `None`. multiple_of: The value must be a multiple of this. Defaults to `None`. max_digits: The maximum number of digits. Defaults to `None`. decimal_places: The number of decimal places. Defaults to `None`. allow_inf_nan: Whether to allow infinity and NaN. Defaults to `None`. 
Returns: The wrapped Decimal type.
```python from decimal import Decimal from pydantic import BaseModel, ValidationError, condecimal class ConstrainedExample(BaseModel): constrained_decimal: condecimal(gt=Decimal('1.0')) m = ConstrainedExample(constrained_decimal=Decimal('1.1')) print(repr(m)) #> ConstrainedExample(constrained_decimal=Decimal('1.1')) try: ConstrainedExample(constrained_decimal=Decimal('0.9')) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'greater_than', 'loc': ('constrained_decimal',), 'msg': 'Input should be greater than 1.0', 'input': Decimal('0.9'), 'ctx': {'gt': Decimal('1.0')}, 'url': 'https://errors.pydantic.dev/2/v/greater_than', } ] ''' ``` """ # noqa: D212 return Annotated[ # pyright: ignore[reportReturnType] Decimal, Strict(strict) if strict is not None else None, annotated_types.Interval(gt=gt, ge=ge, lt=lt, le=le), annotated_types.MultipleOf(multiple_of) if multiple_of is not None else None, _fields.pydantic_general_metadata(max_digits=max_digits, decimal_places=decimal_places), AllowInfNan(allow_inf_nan) if allow_inf_nan is not None else None, ] # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ UUID TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @_dataclasses.dataclass(**_internal_dataclass.slots_true) class UuidVersion: """A field metadata class to indicate a [UUID](https://docs.python.org/3/library/uuid.html) version. Use this class as an annotation via [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated), as seen below. Attributes: uuid_version: The version of the UUID. Must be one of 1, 3, 4, or 5. 
Example: ```python from uuid import UUID from typing_extensions import Annotated from pydantic.types import UuidVersion UUID1 = Annotated[UUID, UuidVersion(1)] ``` """ uuid_version: Literal[1, 3, 4, 5] def __get_pydantic_json_schema__( self, core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: field_schema = handler(core_schema) field_schema.pop('anyOf', None) # remove the bytes/str union field_schema.update(type='string', format=f'uuid{self.uuid_version}') return field_schema def __get_pydantic_core_schema__(self, source: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: if isinstance(self, source): # used directly as a type return core_schema.uuid_schema(version=self.uuid_version) else: # update existing schema with self.uuid_version schema = handler(source) _check_annotated_type(schema['type'], 'uuid', self.__class__.__name__) schema['version'] = self.uuid_version # type: ignore return schema def __hash__(self) -> int: return hash(type(self.uuid_version)) UUID1 = Annotated[UUID, UuidVersion(1)] """A [UUID](https://docs.python.org/3/library/uuid.html) that must be version 1. ```python import uuid from pydantic import UUID1, BaseModel class Model(BaseModel): uuid1: UUID1 Model(uuid1=uuid.uuid1()) ``` """ UUID3 = Annotated[UUID, UuidVersion(3)] """A [UUID](https://docs.python.org/3/library/uuid.html) that must be version 3. ```python import uuid from pydantic import UUID3, BaseModel class Model(BaseModel): uuid3: UUID3 Model(uuid3=uuid.uuid3(uuid.NAMESPACE_DNS, 'pydantic.org')) ``` """ UUID4 = Annotated[UUID, UuidVersion(4)] """A [UUID](https://docs.python.org/3/library/uuid.html) that must be version 4. ```python import uuid from pydantic import UUID4, BaseModel class Model(BaseModel): uuid4: UUID4 Model(uuid4=uuid.uuid4()) ``` """ UUID5 = Annotated[UUID, UuidVersion(5)] """A [UUID](https://docs.python.org/3/library/uuid.html) that must be version 5. 
```python import uuid from pydantic import UUID5, BaseModel class Model(BaseModel): uuid5: UUID5 Model(uuid5=uuid.uuid5(uuid.NAMESPACE_DNS, 'pydantic.org')) ``` """ # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ PATH TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @_dataclasses.dataclass class PathType: path_type: Literal['file', 'dir', 'new', 'socket'] def __get_pydantic_json_schema__( self, core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: field_schema = handler(core_schema) format_conversion = {'file': 'file-path', 'dir': 'directory-path'} field_schema.update(format=format_conversion.get(self.path_type, 'path'), type='string') return field_schema def __get_pydantic_core_schema__(self, source: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: function_lookup = { 'file': cast(core_schema.WithInfoValidatorFunction, self.validate_file), 'dir': cast(core_schema.WithInfoValidatorFunction, self.validate_directory), 'new': cast(core_schema.WithInfoValidatorFunction, self.validate_new), 'socket': cast(core_schema.WithInfoValidatorFunction, self.validate_socket), } return core_schema.with_info_after_validator_function( function_lookup[self.path_type], handler(source), ) @staticmethod def validate_file(path: Path, _: core_schema.ValidationInfo) -> Path: if path.is_file(): return path else: raise PydanticCustomError('path_not_file', 'Path does not point to a file') @staticmethod def validate_socket(path: Path, _: core_schema.ValidationInfo) -> Path: if path.is_socket(): return path else: raise PydanticCustomError('path_not_socket', 'Path does not point to a socket') @staticmethod def validate_directory(path: Path, _: core_schema.ValidationInfo) -> Path: if path.is_dir(): return path else: raise PydanticCustomError('path_not_directory', 'Path does not point to a directory') @staticmethod def validate_new(path: Path, _: core_schema.ValidationInfo) -> Path: if path.exists(): raise PydanticCustomError('path_exists', 'Path already exists') elif 
not path.parent.exists(): raise PydanticCustomError('parent_does_not_exist', 'Parent directory does not exist') else: return path def __hash__(self) -> int: return hash(type(self.path_type)) FilePath = Annotated[Path, PathType('file')] """A path that must point to a file. ```python from pathlib import Path from pydantic import BaseModel, FilePath, ValidationError class Model(BaseModel): f: FilePath path = Path('text.txt') path.touch() m = Model(f='text.txt') print(m.model_dump()) #> {'f': PosixPath('text.txt')} path.unlink() path = Path('directory') path.mkdir(exist_ok=True) try: Model(f='directory') # directory except ValidationError as e: print(e) ''' 1 validation error for Model f Path does not point to a file [type=path_not_file, input_value='directory', input_type=str] ''' path.rmdir() try: Model(f='not-exists-file') except ValidationError as e: print(e) ''' 1 validation error for Model f Path does not point to a file [type=path_not_file, input_value='not-exists-file', input_type=str] ''' ``` """ DirectoryPath = Annotated[Path, PathType('dir')] """A path that must point to a directory. ```python from pathlib import Path from pydantic import BaseModel, DirectoryPath, ValidationError class Model(BaseModel): f: DirectoryPath path = Path('directory/') path.mkdir() m = Model(f='directory/') print(m.model_dump()) #> {'f': PosixPath('directory')} path.rmdir() path = Path('file.txt') path.touch() try: Model(f='file.txt') # file except ValidationError as e: print(e) ''' 1 validation error for Model f Path does not point to a directory [type=path_not_directory, input_value='file.txt', input_type=str] ''' path.unlink() try: Model(f='not-exists-directory') except ValidationError as e: print(e) ''' 1 validation error for Model f Path does not point to a directory [type=path_not_directory, input_value='not-exists-directory', input_type=str] ''' ``` """ NewPath = Annotated[Path, PathType('new')] """A path for a new file or directory that must not already exist. 
The parent directory must already exist.""" SocketPath = Annotated[Path, PathType('socket')] """A path to an existing socket file""" # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ JSON TYPE ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ if TYPE_CHECKING: # Json[list[str]] will be recognized by type checkers as list[str] Json = Annotated[AnyType, ...] else: class Json: """A special type wrapper which loads JSON before parsing. You can use the `Json` data type to make Pydantic first load a raw JSON string before validating the loaded data into the parametrized type: ```python from typing import Any, List from pydantic import BaseModel, Json, ValidationError class AnyJsonModel(BaseModel): json_obj: Json[Any] class ConstrainedJsonModel(BaseModel): json_obj: Json[List[int]] print(AnyJsonModel(json_obj='{"b": 1}')) #> json_obj={'b': 1} print(ConstrainedJsonModel(json_obj='[1, 2, 3]')) #> json_obj=[1, 2, 3] try: ConstrainedJsonModel(json_obj=12) except ValidationError as e: print(e) ''' 1 validation error for ConstrainedJsonModel json_obj JSON input should be string, bytes or bytearray [type=json_type, input_value=12, input_type=int] ''' try: ConstrainedJsonModel(json_obj='[a, b]') except ValidationError as e: print(e) ''' 1 validation error for ConstrainedJsonModel json_obj Invalid JSON: expected value at line 1 column 2 [type=json_invalid, input_value='[a, b]', input_type=str] ''' try: ConstrainedJsonModel(json_obj='["a", "b"]') except ValidationError as e: print(e) ''' 2 validation errors for ConstrainedJsonModel json_obj.0 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str] json_obj.1 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='b', input_type=str] ''' ``` When you dump the model using `model_dump` or `model_dump_json`, the dumped value will be the result of validation, not the original JSON string. 
However, you can use the argument `round_trip=True` to get the original JSON string back: ```python from typing import List from pydantic import BaseModel, Json class ConstrainedJsonModel(BaseModel): json_obj: Json[List[int]] print(ConstrainedJsonModel(json_obj='[1, 2, 3]').model_dump_json()) #> {"json_obj":[1,2,3]} print( ConstrainedJsonModel(json_obj='[1, 2, 3]').model_dump_json(round_trip=True) ) #> {"json_obj":"[1,2,3]"} ``` """ @classmethod def __class_getitem__(cls, item: AnyType) -> AnyType: return Annotated[item, cls()] @classmethod def __get_pydantic_core_schema__(cls, source: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: if cls is source: return core_schema.json_schema(None) else: return core_schema.json_schema(handler(source)) def __repr__(self) -> str: return 'Json' def __hash__(self) -> int: return hash(type(self)) def __eq__(self, other: Any) -> bool: return type(other) is type(self) # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ SECRET TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ SecretType = TypeVar('SecretType') class _SecretBase(Generic[SecretType]): def __init__(self, secret_value: SecretType) -> None: self._secret_value: SecretType = secret_value def get_secret_value(self) -> SecretType: """Get the secret value. Returns: The secret value. 
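Example (an illustrative sketch using the `SecretStr` type defined later in this module):

```python
from pydantic import BaseModel, SecretStr


class Model(BaseModel):
    password: SecretStr


m = Model(password='hunter2')
# the string representation masks the value
print(m.password)
#> **********
# the real value is only available via get_secret_value()
print(m.password.get_secret_value())
#> hunter2
```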
""" return self._secret_value def __eq__(self, other: Any) -> bool: return isinstance(other, self.__class__) and self.get_secret_value() == other.get_secret_value() def __hash__(self) -> int: return hash(self.get_secret_value()) def __str__(self) -> str: return str(self._display()) def __repr__(self) -> str: return f'{self.__class__.__name__}({self._display()!r})' def _display(self) -> str | bytes: raise NotImplementedError def _serialize_secret(value: Secret[SecretType], info: core_schema.SerializationInfo) -> str | Secret[SecretType]: if info.mode == 'json': return str(value) else: return value class Secret(_SecretBase[SecretType]): """A generic base class used for defining a field with sensitive information that you do not want to be visible in logging or tracebacks. You may either directly parametrize `Secret` with a type, or subclass from `Secret` with a parametrized type. The benefit of subclassing is that you can define a custom `_display` method, which will be used for `repr()` and `str()` methods. The examples below demonstrate both ways of using `Secret` to create a new secret type. 1. Directly parametrizing `Secret` with a type: ```python from pydantic import BaseModel, Secret SecretBool = Secret[bool] class Model(BaseModel): secret_bool: SecretBool m = Model(secret_bool=True) print(m.model_dump()) #> {'secret_bool': Secret('**********')} print(m.model_dump_json()) #> {"secret_bool":"**********"} print(m.secret_bool.get_secret_value()) #> True ``` 2. 
Subclassing from parametrized `Secret`: ```python from datetime import date from pydantic import BaseModel, Secret class SecretDate(Secret[date]): def _display(self) -> str: return '****/**/**' class Model(BaseModel): secret_date: SecretDate m = Model(secret_date=date(2022, 1, 1)) print(m.model_dump()) #> {'secret_date': SecretDate('****/**/**')} print(m.model_dump_json()) #> {"secret_date":"****/**/**"} print(m.secret_date.get_secret_value()) #> 2022-01-01 ``` The value returned by the `_display` method will be used for `repr()` and `str()`. You can enforce constraints on the underlying type through annotations: For example: ```python from typing_extensions import Annotated from pydantic import BaseModel, Field, Secret, ValidationError SecretPosInt = Secret[Annotated[int, Field(gt=0, strict=True)]] class Model(BaseModel): sensitive_int: SecretPosInt m = Model(sensitive_int=42) print(m.model_dump()) #> {'sensitive_int': Secret('**********')} try: m = Model(sensitive_int=-42) # (1)! except ValidationError as exc_info: print(exc_info.errors(include_url=False, include_input=False)) ''' [ { 'type': 'greater_than', 'loc': ('sensitive_int',), 'msg': 'Input should be greater than 0', 'ctx': {'gt': 0}, } ] ''' try: m = Model(sensitive_int='42') # (2)! except ValidationError as exc_info: print(exc_info.errors(include_url=False, include_input=False)) ''' [ { 'type': 'int_type', 'loc': ('sensitive_int',), 'msg': 'Input should be a valid integer', } ] ''' ``` 1. The input value is not greater than 0, so it raises a validation error. 2. The input value is not an integer, so it raises a validation error because the `SecretPosInt` type has strict mode enabled. 
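Secrets of the same class compare equal when they wrap equal values, and they hash by the wrapped value, so they can be deduplicated in a set or used as dictionary keys. A small illustrative sketch (the secret values are made up):

```python
from pydantic import Secret

first = Secret[str]('api-key-123')
second = Secret[str]('api-key-123')

# equality and hashing delegate to the underlying secret value
print(first == second)
#> True
print(len({first, second}))
#> 1
# the value itself is still masked in string output
print(first)
#> **********
```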
""" def _display(self) -> str | bytes: return '**********' if self.get_secret_value() else '' @classmethod def __get_pydantic_core_schema__(cls, source: type[Any], handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: inner_type = None # if origin_type is Secret, then cls is a GenericAlias, and we can extract the inner type directly origin_type = get_origin(source) if origin_type is not None: inner_type = get_args(source)[0] # otherwise, we need to get the inner type from the base class else: bases = getattr(cls, '__orig_bases__', getattr(cls, '__bases__', [])) for base in bases: if get_origin(base) is Secret: inner_type = get_args(base)[0] if bases == [] or inner_type is None: raise TypeError( f"Can't get secret type from {cls.__name__}. " 'Please use Secret[], or subclass from Secret[] instead.' ) inner_schema = handler.generate_schema(inner_type) # type: ignore def validate_secret_value(value, handler) -> Secret[SecretType]: if isinstance(value, Secret): value = value.get_secret_value() validated_inner = handler(value) return cls(validated_inner) return core_schema.json_or_python_schema( python_schema=core_schema.no_info_wrap_validator_function( validate_secret_value, inner_schema, ), json_schema=core_schema.no_info_after_validator_function(lambda x: cls(x), inner_schema), serialization=core_schema.plain_serializer_function_ser_schema( _serialize_secret, info_arg=True, when_used='always', ), ) __pydantic_serializer__ = SchemaSerializer( core_schema.any_schema( serialization=core_schema.plain_serializer_function_ser_schema( _serialize_secret, info_arg=True, when_used='always', ) ) ) def _secret_display(value: SecretType) -> str: # type: ignore return '**********' if value else '' def _serialize_secret_field( value: _SecretField[SecretType], info: core_schema.SerializationInfo ) -> str | _SecretField[SecretType]: if info.mode == 'json': # we want the output to always be string without the `b'` prefix for bytes, # hence we just use `secret_display` return 
_secret_display(value.get_secret_value()) else: return value class _SecretField(_SecretBase[SecretType]): _inner_schema: ClassVar[CoreSchema] _error_kind: ClassVar[str] @classmethod def __get_pydantic_core_schema__(cls, source: type[Any], handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: def get_json_schema(_core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue: json_schema = handler(cls._inner_schema) _utils.update_not_none( json_schema, type='string', writeOnly=True, format='password', ) return json_schema json_schema = core_schema.no_info_after_validator_function( source, # construct the type cls._inner_schema, ) def get_secret_schema(strict: bool) -> CoreSchema: return core_schema.json_or_python_schema( python_schema=core_schema.union_schema( [ core_schema.is_instance_schema(source), json_schema, ], custom_error_type=cls._error_kind, strict=strict, ), json_schema=json_schema, serialization=core_schema.plain_serializer_function_ser_schema( _serialize_secret_field, info_arg=True, when_used='always', ), ) return core_schema.lax_or_strict_schema( lax_schema=get_secret_schema(strict=False), strict_schema=get_secret_schema(strict=True), metadata={'pydantic_js_functions': [get_json_schema]}, ) __pydantic_serializer__ = SchemaSerializer( core_schema.any_schema( serialization=core_schema.plain_serializer_function_ser_schema( _serialize_secret_field, info_arg=True, when_used='always', ) ) ) class SecretStr(_SecretField[str]): """A string used for storing sensitive information that you do not want to be visible in logging or tracebacks. When the secret value is nonempty, it is displayed as `'**********'` instead of the underlying value in calls to `repr()` and `str()`. If the value _is_ empty, it is displayed as `''`. 
```python from pydantic import BaseModel, SecretStr class User(BaseModel): username: str password: SecretStr user = User(username='scolvin', password='password1') print(user) #> username='scolvin' password=SecretStr('**********') print(user.password.get_secret_value()) #> password1 print((SecretStr('password'), SecretStr(''))) #> (SecretStr('**********'), SecretStr('')) ``` As seen above, by default, [`SecretStr`][pydantic.types.SecretStr] (and [`SecretBytes`][pydantic.types.SecretBytes]) will be serialized as `**********` when serializing to json. You can use the [`field_serializer`][pydantic.functional_serializers.field_serializer] to dump the secret as plain-text when serializing to json. ```python from pydantic import BaseModel, SecretBytes, SecretStr, field_serializer class Model(BaseModel): password: SecretStr password_bytes: SecretBytes @field_serializer('password', 'password_bytes', when_used='json') def dump_secret(self, v): return v.get_secret_value() model = Model(password='IAmSensitive', password_bytes=b'IAmSensitiveBytes') print(model) #> password=SecretStr('**********') password_bytes=SecretBytes(b'**********') print(model.password) #> ********** print(model.model_dump()) ''' { 'password': SecretStr('**********'), 'password_bytes': SecretBytes(b'**********'), } ''' print(model.model_dump_json()) #> {"password":"IAmSensitive","password_bytes":"IAmSensitiveBytes"} ``` """ _inner_schema: ClassVar[CoreSchema] = core_schema.str_schema() _error_kind: ClassVar[str] = 'string_type' def __len__(self) -> int: return len(self._secret_value) def _display(self) -> str: return _secret_display(self._secret_value) class SecretBytes(_SecretField[bytes]): """A bytes used for storing sensitive information that you do not want to be visible in logging or tracebacks. It displays `b'**********'` instead of the string value on `repr()` and `str()` calls. 
If the value _is_ empty, it is displayed as `b''`. ```python from pydantic import BaseModel, SecretBytes class User(BaseModel): username: str password: SecretBytes user = User(username='scolvin', password=b'password1') print(user) #> username='scolvin' password=SecretBytes(b'**********') print(user.password.get_secret_value()) #> b'password1' print((SecretBytes(b'password'), SecretBytes(b''))) #> (SecretBytes(b'**********'), SecretBytes(b'')) ``` """ _inner_schema: ClassVar[CoreSchema] = core_schema.bytes_schema() _error_kind: ClassVar[str] = 'bytes_type' def __len__(self) -> int: return len(self._secret_value) def _display(self) -> bytes: return _secret_display(self._secret_value).encode() # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ PAYMENT CARD TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ class PaymentCardBrand(str, Enum): amex = 'American Express' mastercard = 'Mastercard' visa = 'Visa' other = 'other' def __str__(self) -> str: return self.value @deprecated( 'The `PaymentCardNumber` class is deprecated, use `pydantic_extra_types` instead. 
' 'See https://docs.pydantic.dev/latest/api/pydantic_extra_types_payment/#pydantic_extra_types.payment.PaymentCardNumber.', category=PydanticDeprecatedSince20, ) class PaymentCardNumber(str): """Based on: https://en.wikipedia.org/wiki/Payment_card_number.""" strip_whitespace: ClassVar[bool] = True min_length: ClassVar[int] = 12 max_length: ClassVar[int] = 19 bin: str last4: str brand: PaymentCardBrand def __init__(self, card_number: str): self.validate_digits(card_number) card_number = self.validate_luhn_check_digit(card_number) self.bin = card_number[:6] self.last4 = card_number[-4:] self.brand = self.validate_brand(card_number) @classmethod def __get_pydantic_core_schema__(cls, source: type[Any], handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: return core_schema.with_info_after_validator_function( cls.validate, core_schema.str_schema( min_length=cls.min_length, max_length=cls.max_length, strip_whitespace=cls.strip_whitespace ), ) @classmethod def validate(cls, input_value: str, /, _: core_schema.ValidationInfo) -> PaymentCardNumber: """Validate the card number and return a `PaymentCardNumber` instance.""" return cls(input_value) @property def masked(self) -> str: """Mask all but the last 4 digits of the card number. Returns: A masked card number string. 
""" num_masked = len(self) - 10 # len(bin) + len(last4) == 10 return f'{self.bin}{"*" * num_masked}{self.last4}' @classmethod def validate_digits(cls, card_number: str) -> None: """Validate that the card number is all digits.""" if not card_number.isdigit(): raise PydanticCustomError('payment_card_number_digits', 'Card number is not all digits') @classmethod def validate_luhn_check_digit(cls, card_number: str) -> str: """Based on: https://en.wikipedia.org/wiki/Luhn_algorithm.""" sum_ = int(card_number[-1]) length = len(card_number) parity = length % 2 for i in range(length - 1): digit = int(card_number[i]) if i % 2 == parity: digit *= 2 if digit > 9: digit -= 9 sum_ += digit valid = sum_ % 10 == 0 if not valid: raise PydanticCustomError('payment_card_number_luhn', 'Card number is not luhn valid') return card_number @staticmethod def validate_brand(card_number: str) -> PaymentCardBrand: """Validate length based on BIN for major brands: https://en.wikipedia.org/wiki/Payment_card_number#Issuer_identification_number_(IIN). 
""" if card_number[0] == '4': brand = PaymentCardBrand.visa elif 51 <= int(card_number[:2]) <= 55: brand = PaymentCardBrand.mastercard elif card_number[:2] in {'34', '37'}: brand = PaymentCardBrand.amex else: brand = PaymentCardBrand.other required_length: None | int | str = None if brand in PaymentCardBrand.mastercard: required_length = 16 valid = len(card_number) == required_length elif brand == PaymentCardBrand.visa: required_length = '13, 16 or 19' valid = len(card_number) in {13, 16, 19} elif brand == PaymentCardBrand.amex: required_length = 15 valid = len(card_number) == required_length else: valid = True if not valid: raise PydanticCustomError( 'payment_card_number_brand', 'Length for a {brand} card must be {required_length}', {'brand': brand, 'required_length': required_length}, ) return brand # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ BYTE SIZE TYPE ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ class ByteSize(int): """Converts a string representing a number of bytes with units (such as `'1KB'` or `'11.5MiB'`) into an integer. You can use the `ByteSize` data type to (case-insensitively) convert a string representation of a number of bytes into an integer, and also to print out human-readable strings representing a number of bytes. In conformance with [IEC 80000-13 Standard](https://en.wikipedia.org/wiki/ISO/IEC_80000) we interpret `'1KB'` to mean 1000 bytes, and `'1KiB'` to mean 1024 bytes. In general, including a middle `'i'` will cause the unit to be interpreted as a power of 2, rather than a power of 10 (so, for example, `'1 MB'` is treated as `1_000_000` bytes, whereas `'1 MiB'` is treated as `1_048_576` bytes). !!! info Note that `1b` will be parsed as "1 byte" and not "1 bit". 
```python from pydantic import BaseModel, ByteSize class MyModel(BaseModel): size: ByteSize print(MyModel(size=52000).size) #> 52000 print(MyModel(size='3000 KiB').size) #> 3072000 m = MyModel(size='50 PB') print(m.size.human_readable()) #> 44.4PiB print(m.size.human_readable(decimal=True)) #> 50.0PB print(m.size.human_readable(separator=' ')) #> 44.4 PiB print(m.size.to('TiB')) #> 45474.73508864641 ``` """ byte_sizes = { 'b': 1, 'kb': 10**3, 'mb': 10**6, 'gb': 10**9, 'tb': 10**12, 'pb': 10**15, 'eb': 10**18, 'kib': 2**10, 'mib': 2**20, 'gib': 2**30, 'tib': 2**40, 'pib': 2**50, 'eib': 2**60, 'bit': 1 / 8, 'kbit': 10**3 / 8, 'mbit': 10**6 / 8, 'gbit': 10**9 / 8, 'tbit': 10**12 / 8, 'pbit': 10**15 / 8, 'ebit': 10**18 / 8, 'kibit': 2**10 / 8, 'mibit': 2**20 / 8, 'gibit': 2**30 / 8, 'tibit': 2**40 / 8, 'pibit': 2**50 / 8, 'eibit': 2**60 / 8, } byte_sizes.update({k.lower()[0]: v for k, v in byte_sizes.items() if 'i' not in k}) byte_string_pattern = r'^\s*(\d*\.?\d+)\s*(\w+)?' byte_string_re = re.compile(byte_string_pattern, re.IGNORECASE) @classmethod def __get_pydantic_core_schema__(cls, source: type[Any], handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: return core_schema.with_info_after_validator_function( function=cls._validate, schema=core_schema.union_schema( [ core_schema.str_schema(pattern=cls.byte_string_pattern), core_schema.int_schema(ge=0), ], custom_error_type='byte_size', custom_error_message='could not parse value and unit from byte string', ), serialization=core_schema.plain_serializer_function_ser_schema( int, return_schema=core_schema.int_schema(ge=0) ), ) @classmethod def _validate(cls, input_value: Any, /, _: core_schema.ValidationInfo) -> ByteSize: try: return cls(int(input_value)) except ValueError: pass str_match = cls.byte_string_re.match(str(input_value)) if str_match is None: raise PydanticCustomError('byte_size', 'could not parse value and unit from byte string') scalar, unit = str_match.groups() if unit is None: unit = 'b' try: 
unit_mult = cls.byte_sizes[unit.lower()] except KeyError: raise PydanticCustomError('byte_size_unit', 'could not interpret byte unit: {unit}', {'unit': unit}) return cls(int(float(scalar) * unit_mult)) def human_readable(self, decimal: bool = False, separator: str = '') -> str: """Converts a byte size to a human readable string. Args: decimal: If True, use decimal units (e.g. 1000 bytes per KB). If False, use binary units (e.g. 1024 bytes per KiB). separator: A string used to split the value and unit. Defaults to an empty string (''). Returns: A human readable string representation of the byte size. """ if decimal: divisor = 1000 units = 'B', 'KB', 'MB', 'GB', 'TB', 'PB' final_unit = 'EB' else: divisor = 1024 units = 'B', 'KiB', 'MiB', 'GiB', 'TiB', 'PiB' final_unit = 'EiB' num = float(self) for unit in units: if abs(num) < divisor: if unit == 'B': return f'{num:0.0f}{separator}{unit}' else: return f'{num:0.1f}{separator}{unit}' num /= divisor return f'{num:0.1f}{separator}{final_unit}' def to(self, unit: str) -> float: """Converts a byte size to another unit, including both byte and bit units. Args: unit: The unit to convert to. Must be one of the following: B, KB, MB, GB, TB, PB, EB, KiB, MiB, GiB, TiB, PiB, EiB (byte units) and bit, kbit, mbit, gbit, tbit, pbit, ebit, kibit, mibit, gibit, tibit, pibit, eibit (bit units). Returns: The byte size in the new unit. """ try: unit_div = self.byte_sizes[unit.lower()] except KeyError: raise PydanticCustomError('byte_size_unit', 'Could not interpret byte unit: {unit}', {'unit': unit}) return self / unit_div # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ DATE TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ def _check_annotated_type(annotated_type: str, expected_type: str, annotation: str) -> None: if annotated_type != expected_type: raise PydanticUserError(f"'{annotation}' cannot annotate '{annotated_type}'.", code='invalid-annotated-type') if TYPE_CHECKING: PastDate = Annotated[date, ...] FutureDate = Annotated[date, ...] 
else: class PastDate: """A date in the past.""" @classmethod def __get_pydantic_core_schema__( cls, source: type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: if cls is source: # used directly as a type return core_schema.date_schema(now_op='past') else: schema = handler(source) _check_annotated_type(schema['type'], 'date', cls.__name__) schema['now_op'] = 'past' return schema def __repr__(self) -> str: return 'PastDate' class FutureDate: """A date in the future.""" @classmethod def __get_pydantic_core_schema__( cls, source: type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: if cls is source: # used directly as a type return core_schema.date_schema(now_op='future') else: schema = handler(source) _check_annotated_type(schema['type'], 'date', cls.__name__) schema['now_op'] = 'future' return schema def __repr__(self) -> str: return 'FutureDate' def condate( *, strict: bool | None = None, gt: date | None = None, ge: date | None = None, lt: date | None = None, le: date | None = None, ) -> type[date]: """A wrapper for date that adds constraints. Args: strict: Whether to validate the date value in strict mode. Defaults to `None`. gt: The value must be greater than this. Defaults to `None`. ge: The value must be greater than or equal to this. Defaults to `None`. lt: The value must be less than this. Defaults to `None`. le: The value must be less than or equal to this. Defaults to `None`. Returns: A date type with the specified constraints. """ return Annotated[ # pyright: ignore[reportReturnType] date, Strict(strict) if strict is not None else None, annotated_types.Interval(gt=gt, ge=ge, lt=lt, le=le), ] # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ DATETIME TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ if TYPE_CHECKING: AwareDatetime = Annotated[datetime, ...] NaiveDatetime = Annotated[datetime, ...] PastDatetime = Annotated[datetime, ...] FutureDatetime = Annotated[datetime, ...] 
else: class AwareDatetime: """A datetime that requires timezone info.""" @classmethod def __get_pydantic_core_schema__( cls, source: type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: if cls is source: # used directly as a type return core_schema.datetime_schema(tz_constraint='aware') else: schema = handler(source) _check_annotated_type(schema['type'], 'datetime', cls.__name__) schema['tz_constraint'] = 'aware' return schema def __repr__(self) -> str: return 'AwareDatetime' class NaiveDatetime: """A datetime that doesn't require timezone info.""" @classmethod def __get_pydantic_core_schema__( cls, source: type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: if cls is source: # used directly as a type return core_schema.datetime_schema(tz_constraint='naive') else: schema = handler(source) _check_annotated_type(schema['type'], 'datetime', cls.__name__) schema['tz_constraint'] = 'naive' return schema def __repr__(self) -> str: return 'NaiveDatetime' class PastDatetime: """A datetime that must be in the past.""" @classmethod def __get_pydantic_core_schema__( cls, source: type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: if cls is source: # used directly as a type return core_schema.datetime_schema(now_op='past') else: schema = handler(source) _check_annotated_type(schema['type'], 'datetime', cls.__name__) schema['now_op'] = 'past' return schema def __repr__(self) -> str: return 'PastDatetime' class FutureDatetime: """A datetime that must be in the future.""" @classmethod def __get_pydantic_core_schema__( cls, source: type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: if cls is source: # used directly as a type return core_schema.datetime_schema(now_op='future') else: schema = handler(source) _check_annotated_type(schema['type'], 'datetime', cls.__name__) schema['now_op'] = 'future' return schema def __repr__(self) -> str: return 'FutureDatetime' # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Encoded TYPES 
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ class EncoderProtocol(Protocol): """Protocol for encoding and decoding data to and from bytes.""" @classmethod def decode(cls, data: bytes) -> bytes: """Decode the data using the encoder. Args: data: The data to decode. Returns: The decoded data. """ ... @classmethod def encode(cls, value: bytes) -> bytes: """Encode the data using the encoder. Args: value: The data to encode. Returns: The encoded data. """ ... @classmethod def get_json_format(cls) -> str: """Get the JSON format for the encoded data. Returns: The JSON format for the encoded data. """ ... class Base64Encoder(EncoderProtocol): """Standard (non-URL-safe) Base64 encoder.""" @classmethod def decode(cls, data: bytes) -> bytes: """Decode the data from base64 encoded bytes to original bytes data. Args: data: The data to decode. Returns: The decoded data. """ try: return base64.b64decode(data) except ValueError as e: raise PydanticCustomError('base64_decode', "Base64 decoding error: '{error}'", {'error': str(e)}) @classmethod def encode(cls, value: bytes) -> bytes: """Encode the data from bytes to a base64 encoded bytes. Args: value: The data to encode. Returns: The encoded data. """ return base64.b64encode(value) @classmethod def get_json_format(cls) -> Literal['base64']: """Get the JSON format for the encoded data. Returns: The JSON format for the encoded data. """ return 'base64' class Base64UrlEncoder(EncoderProtocol): """URL-safe Base64 encoder.""" @classmethod def decode(cls, data: bytes) -> bytes: """Decode the data from base64 encoded bytes to original bytes data. Args: data: The data to decode. Returns: The decoded data. """ try: return base64.urlsafe_b64decode(data) except ValueError as e: raise PydanticCustomError('base64_decode', "Base64 decoding error: '{error}'", {'error': str(e)}) @classmethod def encode(cls, value: bytes) -> bytes: """Encode the data from bytes to a base64 encoded bytes. Args: value: The data to encode. Returns: The encoded data. 
""" return base64.urlsafe_b64encode(value) @classmethod def get_json_format(cls) -> Literal['base64url']: """Get the JSON format for the encoded data. Returns: The JSON format for the encoded data. """ return 'base64url' @_dataclasses.dataclass(**_internal_dataclass.slots_true) class EncodedBytes: """A bytes type that is encoded and decoded using the specified encoder. `EncodedBytes` needs an encoder that implements `EncoderProtocol` to operate. ```python from typing_extensions import Annotated from pydantic import BaseModel, EncodedBytes, EncoderProtocol, ValidationError class MyEncoder(EncoderProtocol): @classmethod def decode(cls, data: bytes) -> bytes: if data == b'**undecodable**': raise ValueError('Cannot decode data') return data[13:] @classmethod def encode(cls, value: bytes) -> bytes: return b'**encoded**: ' + value @classmethod def get_json_format(cls) -> str: return 'my-encoder' MyEncodedBytes = Annotated[bytes, EncodedBytes(encoder=MyEncoder)] class Model(BaseModel): my_encoded_bytes: MyEncodedBytes # Initialize the model with encoded data m = Model(my_encoded_bytes=b'**encoded**: some bytes') # Access decoded value print(m.my_encoded_bytes) #> b'some bytes' # Serialize into the encoded form print(m.model_dump()) #> {'my_encoded_bytes': b'**encoded**: some bytes'} # Validate encoded data try: Model(my_encoded_bytes=b'**undecodable**') except ValidationError as e: print(e) ''' 1 validation error for Model my_encoded_bytes Value error, Cannot decode data [type=value_error, input_value=b'**undecodable**', input_type=bytes] ''' ``` """ encoder: type[EncoderProtocol] def __get_pydantic_json_schema__( self, core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: field_schema = handler(core_schema) field_schema.update(type='string', format=self.encoder.get_json_format()) return field_schema def __get_pydantic_core_schema__(self, source: type[Any], handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: schema = 
handler(source) _check_annotated_type(schema['type'], 'bytes', self.__class__.__name__) return core_schema.with_info_after_validator_function( function=self.decode, schema=schema, serialization=core_schema.plain_serializer_function_ser_schema(function=self.encode), ) def decode(self, data: bytes, _: core_schema.ValidationInfo) -> bytes: """Decode the data using the specified encoder. Args: data: The data to decode. Returns: The decoded data. """ return self.encoder.decode(data) def encode(self, value: bytes) -> bytes: """Encode the data using the specified encoder. Args: value: The data to encode. Returns: The encoded data. """ return self.encoder.encode(value) def __hash__(self) -> int: return hash(self.encoder) @_dataclasses.dataclass(**_internal_dataclass.slots_true) class EncodedStr: """A str type that is encoded and decoded using the specified encoder. `EncodedStr` needs an encoder that implements `EncoderProtocol` to operate. ```python from typing_extensions import Annotated from pydantic import BaseModel, EncodedStr, EncoderProtocol, ValidationError class MyEncoder(EncoderProtocol): @classmethod def decode(cls, data: bytes) -> bytes: if data == b'**undecodable**': raise ValueError('Cannot decode data') return data[13:] @classmethod def encode(cls, value: bytes) -> bytes: return b'**encoded**: ' + value @classmethod def get_json_format(cls) -> str: return 'my-encoder' MyEncodedStr = Annotated[str, EncodedStr(encoder=MyEncoder)] class Model(BaseModel): my_encoded_str: MyEncodedStr # Initialize the model with encoded data m = Model(my_encoded_str='**encoded**: some str') # Access decoded value print(m.my_encoded_str) #> some str # Serialize into the encoded form print(m.model_dump()) #> {'my_encoded_str': '**encoded**: some str'} # Validate encoded data try: Model(my_encoded_str='**undecodable**') except ValidationError as e: print(e) ''' 1 validation error for Model my_encoded_str Value error, Cannot decode data [type=value_error, 
input_value='**undecodable**', input_type=str] ''' ``` """ encoder: type[EncoderProtocol] def __get_pydantic_json_schema__( self, core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: field_schema = handler(core_schema) field_schema.update(type='string', format=self.encoder.get_json_format()) return field_schema def __get_pydantic_core_schema__(self, source: type[Any], handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: schema = handler(source) _check_annotated_type(schema['type'], 'str', self.__class__.__name__) return core_schema.with_info_after_validator_function( function=self.decode_str, schema=schema, serialization=core_schema.plain_serializer_function_ser_schema(function=self.encode_str), ) def decode_str(self, data: str, _: core_schema.ValidationInfo) -> str: """Decode the data using the specified encoder. Args: data: The data to decode. Returns: The decoded data. """ return self.encoder.decode(data.encode()).decode() def encode_str(self, value: str) -> str: """Encode the data using the specified encoder. Args: value: The data to encode. Returns: The encoded data. """ return self.encoder.encode(value.encode()).decode() # noqa: UP008 def __hash__(self) -> int: return hash(self.encoder) Base64Bytes = Annotated[bytes, EncodedBytes(encoder=Base64Encoder)] """A bytes type that is encoded and decoded using the standard (non-URL-safe) base64 encoder. Note: Under the hood, `Base64Bytes` uses the standard library `base64.b64encode` and `base64.b64decode` functions. As a result, attempting to decode url-safe base64 data using the `Base64Bytes` type may fail or produce an incorrect decoding. Warning: In versions of Pydantic prior to v2.10, `Base64Bytes` used [`base64.encodebytes`][base64.encodebytes] and [`base64.decodebytes`][base64.decodebytes] functions. 
According to the [base64 documentation](https://docs.python.org/3/library/base64.html), these methods are considered legacy implementation, and thus, Pydantic v2.10+ now uses the modern [`base64.b64encode`][base64.b64encode] and [`base64.b64decode`][base64.b64decode] functions. If you'd still like to use these legacy encoders / decoders, you can achieve this by creating a custom annotated type, like follows: ```python import base64 from typing import Literal from pydantic_core import PydanticCustomError from typing_extensions import Annotated from pydantic import EncodedBytes, EncoderProtocol class LegacyBase64Encoder(EncoderProtocol): @classmethod def decode(cls, data: bytes) -> bytes: try: return base64.decodebytes(data) except ValueError as e: raise PydanticCustomError( 'base64_decode', "Base64 decoding error: '{error}'", {'error': str(e)}, ) @classmethod def encode(cls, value: bytes) -> bytes: return base64.encodebytes(value) @classmethod def get_json_format(cls) -> Literal['base64']: return 'base64' LegacyBase64Bytes = Annotated[bytes, EncodedBytes(encoder=LegacyBase64Encoder)] ``` ```python from pydantic import Base64Bytes, BaseModel, ValidationError class Model(BaseModel): base64_bytes: Base64Bytes # Initialize the model with base64 data m = Model(base64_bytes=b'VGhpcyBpcyB0aGUgd2F5') # Access decoded value print(m.base64_bytes) #> b'This is the way' # Serialize into the base64 form print(m.model_dump()) #> {'base64_bytes': b'VGhpcyBpcyB0aGUgd2F5'} # Validate base64 data try: print(Model(base64_bytes=b'undecodable').base64_bytes) except ValidationError as e: print(e) ''' 1 validation error for Model base64_bytes Base64 decoding error: 'Incorrect padding' [type=base64_decode, input_value=b'undecodable', input_type=bytes] ''' ``` """ Base64Str = Annotated[str, EncodedStr(encoder=Base64Encoder)] """A str type that is encoded and decoded using the standard (non-URL-safe) base64 encoder. 
Note: Under the hood, `Base64Str` uses the standard library `base64.b64encode` and `base64.b64decode` functions. As a result, attempting to decode url-safe base64 data using the `Base64Str` type may fail or produce an incorrect decoding. Warning: In versions of Pydantic prior to v2.10, `Base64Str` used [`base64.encodebytes`][base64.encodebytes] and [`base64.decodebytes`][base64.decodebytes] functions. According to the [base64 documentation](https://docs.python.org/3/library/base64.html), these methods are considered legacy implementation, and thus, Pydantic v2.10+ now uses the modern [`base64.b64encode`][base64.b64encode] and [`base64.b64decode`][base64.b64decode] functions. See the [`Base64Bytes`][pydantic.types.Base64Bytes] type for more information on how to replicate the old behavior with the legacy encoders / decoders. ```python from pydantic import Base64Str, BaseModel, ValidationError class Model(BaseModel): base64_str: Base64Str # Initialize the model with base64 data m = Model(base64_str='VGhlc2UgYXJlbid0IHRoZSBkcm9pZHMgeW91J3JlIGxvb2tpbmcgZm9y') # Access decoded value print(m.base64_str) #> These aren't the droids you're looking for # Serialize into the base64 form print(m.model_dump()) #> {'base64_str': 'VGhlc2UgYXJlbid0IHRoZSBkcm9pZHMgeW91J3JlIGxvb2tpbmcgZm9y'} # Validate base64 data try: print(Model(base64_str='undecodable').base64_str) except ValidationError as e: print(e) ''' 1 validation error for Model base64_str Base64 decoding error: 'Incorrect padding' [type=base64_decode, input_value='undecodable', input_type=str] ''' ``` """ Base64UrlBytes = Annotated[bytes, EncodedBytes(encoder=Base64UrlEncoder)] """A bytes type that is encoded and decoded using the URL-safe base64 encoder. Note: Under the hood, `Base64UrlBytes` use standard library `base64.urlsafe_b64encode` and `base64.urlsafe_b64decode` functions. As a result, the `Base64UrlBytes` type can be used to faithfully decode "vanilla" base64 data (using `'+'` and `'/'`). 
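Because `base64.urlsafe_b64decode` leaves `'+'` and `'/'` untouched (it only translates `'-'` and `'_'`), input encoded with the vanilla alphabet also validates; note that serialization always re-encodes with the URL-safe alphabet. A small illustrative sketch:

```python
import base64

from pydantic import Base64UrlBytes, BaseModel

class Model(BaseModel):
    data: Base64UrlBytes

# base64.b64encode(b'\xfb\xff\xbf') == b'+/+/', i.e. the vanilla alphabet
m = Model(data=base64.b64encode(b'\xfb\xff\xbf'))
print(m.data)
#> b'\xfb\xff\xbf'
# serialization uses urlsafe_b64encode, so '+' and '/' become '-' and '_'
print(m.model_dump())
#> {'data': b'-_-_'}
```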
```python
from pydantic import Base64UrlBytes, BaseModel


class Model(BaseModel):
    base64url_bytes: Base64UrlBytes


# Initialize the model with base64 data
m = Model(base64url_bytes=b'SHc_dHc-TXc==')
print(m)
#> base64url_bytes=b'Hw?tw>Mw'
```
"""

Base64UrlStr = Annotated[str, EncodedStr(encoder=Base64UrlEncoder)]
"""A str type that is encoded and decoded using the URL-safe base64 encoder.

Note:
    Under the hood, `Base64UrlStr` uses the standard library `base64.urlsafe_b64encode` and
    `base64.urlsafe_b64decode` functions.

    As a result, the `Base64UrlStr` type can be used to faithfully decode "vanilla" base64 data
    (using `'+'` and `'/'`).

```python
from pydantic import Base64UrlStr, BaseModel


class Model(BaseModel):
    base64url_str: Base64UrlStr


# Initialize the model with base64 data
m = Model(base64url_str='SHc_dHc-TXc==')
print(m)
#> base64url_str='Hw?tw>Mw'
```
"""

__getattr__ = getattr_migration(__name__)


@_dataclasses.dataclass(**_internal_dataclass.slots_true)
class GetPydanticSchema:
    """Usage docs: https://docs.pydantic.dev/2.10/concepts/types/#using-getpydanticschema-to-reduce-boilerplate

    A convenience class for creating an annotation that provides pydantic custom type hooks.

    This class is intended to eliminate the need to create a custom "marker" which defines the
    `__get_pydantic_core_schema__` and `__get_pydantic_json_schema__` custom hook methods.
For example, to have a field treated by type checkers as `int`, but by pydantic as `Any`, you can do: ```python from typing import Any from typing_extensions import Annotated from pydantic import BaseModel, GetPydanticSchema HandleAsAny = GetPydanticSchema(lambda _s, h: h(Any)) class Model(BaseModel): x: Annotated[int, HandleAsAny] # pydantic sees `x: Any` print(repr(Model(x='abc').x)) #> 'abc' ``` """ get_pydantic_core_schema: Callable[[Any, GetCoreSchemaHandler], CoreSchema] | None = None get_pydantic_json_schema: Callable[[Any, GetJsonSchemaHandler], JsonSchemaValue] | None = None # Note: we may want to consider adding a convenience staticmethod `def for_type(type_: Any) -> GetPydanticSchema:` # which returns `GetPydanticSchema(lambda _s, h: h(type_))` if not TYPE_CHECKING: # We put `__getattr__` in a non-TYPE_CHECKING block because otherwise, mypy allows arbitrary attribute access def __getattr__(self, item: str) -> Any: """Use this rather than defining `__get_pydantic_core_schema__` etc. to reduce the number of nested calls.""" if item == '__get_pydantic_core_schema__' and self.get_pydantic_core_schema: return self.get_pydantic_core_schema elif item == '__get_pydantic_json_schema__' and self.get_pydantic_json_schema: return self.get_pydantic_json_schema else: return object.__getattribute__(self, item) __hash__ = object.__hash__ @_dataclasses.dataclass(**_internal_dataclass.slots_true, frozen=True) class Tag: """Provides a way to specify the expected tag to use for a case of a (callable) discriminated union. Also provides a way to label a union case in error messages. When using a callable `Discriminator`, attach a `Tag` to each case in the `Union` to specify the tag that should be used to identify that case. For example, in the below example, the `Tag` is used to specify that if `get_discriminator_value` returns `'apple'`, the input should be validated as an `ApplePie`, and if it returns `'pumpkin'`, the input should be validated as a `PumpkinPie`. 
The primary role of the `Tag` here is to map the return value from the callable `Discriminator` function to the appropriate member of the `Union` in question. ```python from typing import Any, Union from typing_extensions import Annotated, Literal from pydantic import BaseModel, Discriminator, Tag class Pie(BaseModel): time_to_cook: int num_ingredients: int class ApplePie(Pie): fruit: Literal['apple'] = 'apple' class PumpkinPie(Pie): filling: Literal['pumpkin'] = 'pumpkin' def get_discriminator_value(v: Any) -> str: if isinstance(v, dict): return v.get('fruit', v.get('filling')) return getattr(v, 'fruit', getattr(v, 'filling', None)) class ThanksgivingDinner(BaseModel): dessert: Annotated[ Union[ Annotated[ApplePie, Tag('apple')], Annotated[PumpkinPie, Tag('pumpkin')], ], Discriminator(get_discriminator_value), ] apple_variation = ThanksgivingDinner.model_validate( {'dessert': {'fruit': 'apple', 'time_to_cook': 60, 'num_ingredients': 8}} ) print(repr(apple_variation)) ''' ThanksgivingDinner(dessert=ApplePie(time_to_cook=60, num_ingredients=8, fruit='apple')) ''' pumpkin_variation = ThanksgivingDinner.model_validate( { 'dessert': { 'filling': 'pumpkin', 'time_to_cook': 40, 'num_ingredients': 6, } } ) print(repr(pumpkin_variation)) ''' ThanksgivingDinner(dessert=PumpkinPie(time_to_cook=40, num_ingredients=6, filling='pumpkin')) ''' ``` !!! note You must specify a `Tag` for every case in a `Tag` that is associated with a callable `Discriminator`. Failing to do so will result in a `PydanticUserError` with code [`callable-discriminator-no-tag`](../errors/usage_errors.md#callable-discriminator-no-tag). See the [Discriminated Unions] concepts docs for more details on how to use `Tag`s. 
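The note above can be illustrated with a minimal, hypothetical sketch (the `BadDinner` model and the lambda discriminator are invented for illustration; the point is the stable error code raised at class-creation time):

```python
from typing import Union

from typing_extensions import Annotated

from pydantic import BaseModel, Discriminator, PydanticUserError

try:

    class BadDinner(BaseModel):
        # Neither union case carries a `Tag`, so schema building fails
        dessert: Annotated[
            Union[int, str],
            Discriminator(lambda v: 'int' if isinstance(v, int) else 'str'),
        ]

except PydanticUserError as exc:
    print(exc.code)
    #> callable-discriminator-no-tag
```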
[Discriminated Unions]: ../concepts/unions.md#discriminated-unions """ tag: str def __get_pydantic_core_schema__(self, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: schema = handler(source_type) metadata = schema.setdefault('metadata', {}) assert isinstance(metadata, dict) metadata[_core_utils.TAGGED_UNION_TAG_KEY] = self.tag return schema @_dataclasses.dataclass(**_internal_dataclass.slots_true, frozen=True) class Discriminator: """Usage docs: https://docs.pydantic.dev/2.10/concepts/unions/#discriminated-unions-with-callable-discriminator Provides a way to use a custom callable as the way to extract the value of a union discriminator. This allows you to get validation behavior like you'd get from `Field(discriminator=)`, but without needing to have a single shared field across all the union choices. This also makes it possible to handle unions of models and primitive types with discriminated-union-style validation errors. Finally, this allows you to use a custom callable as the way to identify which member of a union a value belongs to, while still seeing all the performance benefits of a discriminated union. Consider this example, which is much more performant with the use of `Discriminator` and thus a `TaggedUnion` than it would be as a normal `Union`. 
```python from typing import Any, Union from typing_extensions import Annotated, Literal from pydantic import BaseModel, Discriminator, Tag class Pie(BaseModel): time_to_cook: int num_ingredients: int class ApplePie(Pie): fruit: Literal['apple'] = 'apple' class PumpkinPie(Pie): filling: Literal['pumpkin'] = 'pumpkin' def get_discriminator_value(v: Any) -> str: if isinstance(v, dict): return v.get('fruit', v.get('filling')) return getattr(v, 'fruit', getattr(v, 'filling', None)) class ThanksgivingDinner(BaseModel): dessert: Annotated[ Union[ Annotated[ApplePie, Tag('apple')], Annotated[PumpkinPie, Tag('pumpkin')], ], Discriminator(get_discriminator_value), ] apple_variation = ThanksgivingDinner.model_validate( {'dessert': {'fruit': 'apple', 'time_to_cook': 60, 'num_ingredients': 8}} ) print(repr(apple_variation)) ''' ThanksgivingDinner(dessert=ApplePie(time_to_cook=60, num_ingredients=8, fruit='apple')) ''' pumpkin_variation = ThanksgivingDinner.model_validate( { 'dessert': { 'filling': 'pumpkin', 'time_to_cook': 40, 'num_ingredients': 6, } } ) print(repr(pumpkin_variation)) ''' ThanksgivingDinner(dessert=PumpkinPie(time_to_cook=40, num_ingredients=6, filling='pumpkin')) ''' ``` See the [Discriminated Unions] concepts docs for more details on how to use `Discriminator`s. [Discriminated Unions]: ../concepts/unions.md#discriminated-unions """ discriminator: str | Callable[[Any], Hashable] """The callable or field name for discriminating the type in a tagged union. A `Callable` discriminator must extract the value of the discriminator from the input. A `str` discriminator must be the name of a field to discriminate against. """ custom_error_type: str | None = None """Type to use in [custom errors](../errors/errors.md#custom-errors) replacing the standard discriminated union validation errors. 
""" custom_error_message: str | None = None """Message to use in custom errors.""" custom_error_context: dict[str, int | str | float] | None = None """Context to use in custom errors.""" def __get_pydantic_core_schema__(self, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: origin = _typing_extra.get_origin(source_type) if not origin or not _typing_extra.origin_is_union(origin): raise TypeError(f'{type(self).__name__} must be used with a Union type, not {source_type}') if isinstance(self.discriminator, str): from pydantic import Field return handler(Annotated[source_type, Field(discriminator=self.discriminator)]) else: original_schema = handler(source_type) return self._convert_schema(original_schema) def _convert_schema(self, original_schema: core_schema.CoreSchema) -> core_schema.TaggedUnionSchema: if original_schema['type'] != 'union': # This likely indicates that the schema was a single-item union that was simplified. # In this case, we do the same thing we do in # `pydantic._internal._discriminated_union._ApplyInferredDiscriminator._apply_to_root`, namely, # package the generated schema back into a single-item union. 
original_schema = core_schema.union_schema([original_schema]) tagged_union_choices = {} for choice in original_schema['choices']: tag = None if isinstance(choice, tuple): choice, tag = choice metadata = choice.get('metadata') if metadata is not None: metadata_tag = metadata.get(_core_utils.TAGGED_UNION_TAG_KEY) if metadata_tag is not None: tag = metadata_tag if tag is None: raise PydanticUserError( f'`Tag` not provided for choice {choice} used with `Discriminator`', code='callable-discriminator-no-tag', ) tagged_union_choices[tag] = choice # Have to do these verbose checks to ensure falsy values ('' and {}) don't get ignored custom_error_type = self.custom_error_type if custom_error_type is None: custom_error_type = original_schema.get('custom_error_type') custom_error_message = self.custom_error_message if custom_error_message is None: custom_error_message = original_schema.get('custom_error_message') custom_error_context = self.custom_error_context if custom_error_context is None: custom_error_context = original_schema.get('custom_error_context') custom_error_type = original_schema.get('custom_error_type') if custom_error_type is None else custom_error_type return core_schema.tagged_union_schema( tagged_union_choices, self.discriminator, custom_error_type=custom_error_type, custom_error_message=custom_error_message, custom_error_context=custom_error_context, strict=original_schema.get('strict'), ref=original_schema.get('ref'), metadata=original_schema.get('metadata'), serialization=original_schema.get('serialization'), ) _JSON_TYPES = {int, float, str, bool, list, dict, type(None)} def _get_type_name(x: Any) -> str: type_ = type(x) if type_ in _JSON_TYPES: return type_.__name__ # Handle proper subclasses; note we don't need to handle None or bool here if isinstance(x, int): return 'int' if isinstance(x, float): return 'float' if isinstance(x, str): return 'str' if isinstance(x, list): return 'list' if isinstance(x, dict): return 'dict' # Fail by returning the 
type's actual name return getattr(type_, '__name__', '') class _AllowAnyJson: @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: python_schema = handler(source_type) return core_schema.json_or_python_schema(json_schema=core_schema.any_schema(), python_schema=python_schema) if TYPE_CHECKING: # This seems to only be necessary for mypy JsonValue: TypeAlias = Union[ List['JsonValue'], Dict[str, 'JsonValue'], str, bool, int, float, None, ] """A `JsonValue` is used to represent a value that can be serialized to JSON. It may be one of: * `List['JsonValue']` * `Dict[str, 'JsonValue']` * `str` * `bool` * `int` * `float` * `None` The following example demonstrates how to use `JsonValue` to validate JSON data, and what kind of errors to expect when input data is not json serializable. ```python import json from pydantic import BaseModel, JsonValue, ValidationError class Model(BaseModel): j: JsonValue valid_json_data = {'j': {'a': {'b': {'c': 1, 'd': [2, None]}}}} invalid_json_data = {'j': {'a': {'b': ...}}} print(repr(Model.model_validate(valid_json_data))) #> Model(j={'a': {'b': {'c': 1, 'd': [2, None]}}}) print(repr(Model.model_validate_json(json.dumps(valid_json_data)))) #> Model(j={'a': {'b': {'c': 1, 'd': [2, None]}}}) try: Model.model_validate(invalid_json_data) except ValidationError as e: print(e) ''' 1 validation error for Model j.dict.a.dict.b input was not a valid JSON value [type=invalid-json-value, input_value=Ellipsis, input_type=ellipsis] ''' ``` """ else: JsonValue = TypeAliasType( 'JsonValue', Annotated[ Union[ Annotated[List['JsonValue'], Tag('list')], Annotated[Dict[str, 'JsonValue'], Tag('dict')], Annotated[str, Tag('str')], Annotated[bool, Tag('bool')], Annotated[int, Tag('int')], Annotated[float, Tag('float')], Annotated[None, Tag('NoneType')], ], Discriminator( _get_type_name, custom_error_type='invalid-json-value', custom_error_message='input was not a valid JSON value', ), _AllowAnyJson, 
], ) class _OnErrorOmit: @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: # there is no actual default value here but we use with_default_schema since it already has the on_error # behavior implemented and it would be no more efficient to implement it on every other validator # or as a standalone validator return core_schema.with_default_schema(schema=handler(source_type), on_error='omit') OnErrorOmit = Annotated[T, _OnErrorOmit] """ When used as an item in a list, the key type in a dict, optional values of a TypedDict, etc. this annotation omits the item from the iteration if there is any error validating it. That is, instead of a [`ValidationError`][pydantic_core.ValidationError] being propagated up and the entire iterable being discarded any invalid items are discarded and the valid ones are returned. """ @_dataclasses.dataclass class FailFast(_fields.PydanticMetadata, BaseMetadata): """A `FailFast` annotation can be used to specify that validation should stop at the first error. This can be useful when you want to validate a large amount of data and you only need to know if it's valid or not. You might want to enable this setting if you want to validate your data faster (basically, if you use this, validation will be more performant with the caveat that you get less information). 
```python from typing import List from typing_extensions import Annotated from pydantic import BaseModel, FailFast, ValidationError class Model(BaseModel): x: Annotated[List[int], FailFast()] # This will raise a single error for the first invalid value and stop validation try: obj = Model(x=[1, 2, 'a', 4, 5, 'b', 7, 8, 9, 'c']) except ValidationError as e: print(e) ''' 1 validation error for Model x.2 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str] ''' ``` """ fail_fast: bool = True pydantic-2.10.6/pydantic/typing.py000066400000000000000000000002121474456633400171250ustar00rootroot00000000000000"""`typing` module is a backport module from V1.""" from ._migration import getattr_migration __getattr__ = getattr_migration(__name__) pydantic-2.10.6/pydantic/utils.py000066400000000000000000000002151474456633400167560ustar00rootroot00000000000000"""The `utils` module is a backport module from V1.""" from ._migration import getattr_migration __getattr__ = getattr_migration(__name__) pydantic-2.10.6/pydantic/v1/000077500000000000000000000000001474456633400155745ustar00rootroot00000000000000pydantic-2.10.6/pydantic/v1/__init__.py000066400000000000000000000056021474456633400177100ustar00rootroot00000000000000# flake8: noqa from pydantic.v1 import dataclasses from pydantic.v1.annotated_types import create_model_from_namedtuple, create_model_from_typeddict from pydantic.v1.class_validators import root_validator, validator from pydantic.v1.config import BaseConfig, ConfigDict, Extra from pydantic.v1.decorator import validate_arguments from pydantic.v1.env_settings import BaseSettings from pydantic.v1.error_wrappers import ValidationError from pydantic.v1.errors import * from pydantic.v1.fields import Field, PrivateAttr, Required from pydantic.v1.main import * from pydantic.v1.networks import * from pydantic.v1.parse import Protocol from pydantic.v1.tools import * from pydantic.v1.types import * from 
pydantic.v1.version import VERSION, compiled __version__ = VERSION # WARNING __all__ from pydantic.errors is not included here, it will be removed as an export here in v2 # please use "from pydantic.v1.errors import ..." instead __all__ = [ # annotated types utils 'create_model_from_namedtuple', 'create_model_from_typeddict', # dataclasses 'dataclasses', # class_validators 'root_validator', 'validator', # config 'BaseConfig', 'ConfigDict', 'Extra', # decorator 'validate_arguments', # env_settings 'BaseSettings', # error_wrappers 'ValidationError', # fields 'Field', 'Required', # main 'BaseModel', 'create_model', 'validate_model', # network 'AnyUrl', 'AnyHttpUrl', 'FileUrl', 'HttpUrl', 'stricturl', 'EmailStr', 'NameEmail', 'IPvAnyAddress', 'IPvAnyInterface', 'IPvAnyNetwork', 'PostgresDsn', 'CockroachDsn', 'AmqpDsn', 'RedisDsn', 'MongoDsn', 'KafkaDsn', 'validate_email', # parse 'Protocol', # tools 'parse_file_as', 'parse_obj_as', 'parse_raw_as', 'schema_of', 'schema_json_of', # types 'NoneStr', 'NoneBytes', 'StrBytes', 'NoneStrBytes', 'StrictStr', 'ConstrainedBytes', 'conbytes', 'ConstrainedList', 'conlist', 'ConstrainedSet', 'conset', 'ConstrainedFrozenSet', 'confrozenset', 'ConstrainedStr', 'constr', 'PyObject', 'ConstrainedInt', 'conint', 'PositiveInt', 'NegativeInt', 'NonNegativeInt', 'NonPositiveInt', 'ConstrainedFloat', 'confloat', 'PositiveFloat', 'NegativeFloat', 'NonNegativeFloat', 'NonPositiveFloat', 'FiniteFloat', 'ConstrainedDecimal', 'condecimal', 'ConstrainedDate', 'condate', 'UUID1', 'UUID3', 'UUID4', 'UUID5', 'FilePath', 'DirectoryPath', 'Json', 'JsonWrapper', 'SecretField', 'SecretStr', 'SecretBytes', 'StrictBool', 'StrictBytes', 'StrictInt', 'StrictFloat', 'PaymentCardNumber', 'PrivateAttr', 'ByteSize', 'PastDate', 'FutureDate', # version 'compiled', 'VERSION', ] pydantic-2.10.6/pydantic/v1/_hypothesis_plugin.py000066400000000000000000000347771474456633400221040ustar00rootroot00000000000000""" Register Hypothesis strategies for Pydantic custom 
types. This enables fully-automatic generation of test data for most Pydantic classes. Note that this module has *no* runtime impact on Pydantic itself; instead it is registered as a setuptools entry point and Hypothesis will import it if Pydantic is installed. See also: https://hypothesis.readthedocs.io/en/latest/strategies.html#registering-strategies-via-setuptools-entry-points https://hypothesis.readthedocs.io/en/latest/data.html#hypothesis.strategies.register_type_strategy https://hypothesis.readthedocs.io/en/latest/strategies.html#interaction-with-pytest-cov https://docs.pydantic.dev/usage/types/#pydantic-types Note that because our motivation is to *improve user experience*, the strategies are always sound (never generate invalid data) but sacrifice completeness for maintainability (ie may be unable to generate some tricky but valid data). Finally, this module makes liberal use of `# type: ignore[]` pragmas. This is because Hypothesis annotates `register_type_strategy()` with `(T, SearchStrategy[T])`, but in most cases we register e.g. `ConstrainedInt` to generate instances of the builtin `int` type which match the constraints. """ import contextlib import datetime import ipaddress import json import math from fractions import Fraction from typing import Callable, Dict, Type, Union, cast, overload import hypothesis.strategies as st import pydantic import pydantic.color import pydantic.types from pydantic.v1.utils import lenient_issubclass # FilePath and DirectoryPath are explicitly unsupported, as we'd have to create # them on-disk, and that's unsafe in general without being told *where* to do so. # # URLs are unsupported because it's easy for users to define their own strategy for # "normal" URLs, and hard for us to define a general strategy which includes "weird" # URLs but doesn't also have unpredictable performance problems. 
# # conlist() and conset() are unsupported for now, because the workarounds for # Cython and Hypothesis to handle parametrized generic types are incompatible. # We are rethinking Hypothesis compatibility in Pydantic v2. # Emails try: import email_validator except ImportError: # pragma: no cover pass else: def is_valid_email(s: str) -> bool: # Hypothesis' st.emails() occasionally generates emails like 0@A0--0.ac # that are invalid according to email-validator, so we filter those out. try: email_validator.validate_email(s, check_deliverability=False) return True except email_validator.EmailNotValidError: # pragma: no cover return False # Note that these strategies deliberately stay away from any tricky Unicode # or other encoding issues; we're just trying to generate *something* valid. st.register_type_strategy(pydantic.EmailStr, st.emails().filter(is_valid_email)) # type: ignore[arg-type] st.register_type_strategy( pydantic.NameEmail, st.builds( '{} <{}>'.format, # type: ignore[arg-type] st.from_regex('[A-Za-z0-9_]+( [A-Za-z0-9_]+){0,5}', fullmatch=True), st.emails().filter(is_valid_email), ), ) # PyObject - dotted names, in this case taken from the math module. 
st.register_type_strategy( pydantic.PyObject, # type: ignore[arg-type] st.sampled_from( [cast(pydantic.PyObject, f'math.{name}') for name in sorted(vars(math)) if not name.startswith('_')] ), ) # CSS3 Colors; as name, hex, rgb(a) tuples or strings, or hsl strings _color_regexes = ( '|'.join( ( pydantic.color.r_hex_short, pydantic.color.r_hex_long, pydantic.color.r_rgb, pydantic.color.r_rgba, pydantic.color.r_hsl, pydantic.color.r_hsla, ) ) # Use more precise regex patterns to avoid value-out-of-range errors .replace(pydantic.color._r_sl, r'(?:(\d\d?(?:\.\d+)?|100(?:\.0+)?)%)') .replace(pydantic.color._r_alpha, r'(?:(0(?:\.\d+)?|1(?:\.0+)?|\.\d+|\d{1,2}%))') .replace(pydantic.color._r_255, r'(?:((?:\d|\d\d|[01]\d\d|2[0-4]\d|25[0-4])(?:\.\d+)?|255(?:\.0+)?))') ) st.register_type_strategy( pydantic.color.Color, st.one_of( st.sampled_from(sorted(pydantic.color.COLORS_BY_NAME)), st.tuples( st.integers(0, 255), st.integers(0, 255), st.integers(0, 255), st.none() | st.floats(0, 1) | st.floats(0, 100).map('{}%'.format), ), st.from_regex(_color_regexes, fullmatch=True), ), ) # Card numbers, valid according to the Luhn algorithm def add_luhn_digit(card_number: str) -> str: # See https://en.wikipedia.org/wiki/Luhn_algorithm for digit in '0123456789': with contextlib.suppress(Exception): pydantic.PaymentCardNumber.validate_luhn_check_digit(card_number + digit) return card_number + digit raise AssertionError('Unreachable') # pragma: no cover card_patterns = ( # Note that these patterns omit the Luhn check digit; that's added by the function above '4[0-9]{14}', # Visa '5[12345][0-9]{13}', # Mastercard '3[47][0-9]{12}', # American Express '[0-26-9][0-9]{10,17}', # other (incomplete to avoid overlap) ) st.register_type_strategy( pydantic.PaymentCardNumber, st.from_regex('|'.join(card_patterns), fullmatch=True).map(add_luhn_digit), # type: ignore[arg-type] ) # UUIDs st.register_type_strategy(pydantic.UUID1, st.uuids(version=1)) st.register_type_strategy(pydantic.UUID3, 
st.uuids(version=3)) st.register_type_strategy(pydantic.UUID4, st.uuids(version=4)) st.register_type_strategy(pydantic.UUID5, st.uuids(version=5)) # Secrets st.register_type_strategy(pydantic.SecretBytes, st.binary().map(pydantic.SecretBytes)) st.register_type_strategy(pydantic.SecretStr, st.text().map(pydantic.SecretStr)) # IP addresses, networks, and interfaces st.register_type_strategy(pydantic.IPvAnyAddress, st.ip_addresses()) # type: ignore[arg-type] st.register_type_strategy( pydantic.IPvAnyInterface, st.from_type(ipaddress.IPv4Interface) | st.from_type(ipaddress.IPv6Interface), # type: ignore[arg-type] ) st.register_type_strategy( pydantic.IPvAnyNetwork, st.from_type(ipaddress.IPv4Network) | st.from_type(ipaddress.IPv6Network), # type: ignore[arg-type] ) # We hook into the con***() functions and the ConstrainedNumberMeta metaclass, # so here we only have to register subclasses for other constrained types which # don't go via those mechanisms. Then there are the registration hooks below. st.register_type_strategy(pydantic.StrictBool, st.booleans()) st.register_type_strategy(pydantic.StrictStr, st.text()) # FutureDate, PastDate st.register_type_strategy(pydantic.FutureDate, st.dates(min_value=datetime.date.today() + datetime.timedelta(days=1))) st.register_type_strategy(pydantic.PastDate, st.dates(max_value=datetime.date.today() - datetime.timedelta(days=1))) # Constrained-type resolver functions # # For these ones, we actually want to inspect the type in order to work out a # satisfying strategy. 
First up, the machinery for tracking resolver functions: RESOLVERS: Dict[type, Callable[[type], st.SearchStrategy]] = {} # type: ignore[type-arg] @overload def _registered(typ: Type[pydantic.types.T]) -> Type[pydantic.types.T]: pass @overload def _registered(typ: pydantic.types.ConstrainedNumberMeta) -> pydantic.types.ConstrainedNumberMeta: pass def _registered( typ: Union[Type[pydantic.types.T], pydantic.types.ConstrainedNumberMeta] ) -> Union[Type[pydantic.types.T], pydantic.types.ConstrainedNumberMeta]: # This function replaces the version in `pydantic.types`, in order to # effect the registration of new constrained types so that Hypothesis # can generate valid examples. pydantic.types._DEFINED_TYPES.add(typ) for supertype, resolver in RESOLVERS.items(): if issubclass(typ, supertype): st.register_type_strategy(typ, resolver(typ)) # type: ignore return typ raise NotImplementedError(f'Unknown type {typ!r} has no resolver to register') # pragma: no cover def resolves( typ: Union[type, pydantic.types.ConstrainedNumberMeta] ) -> Callable[[Callable[..., st.SearchStrategy]], Callable[..., st.SearchStrategy]]: # type: ignore[type-arg] def inner(f): # type: ignore assert f not in RESOLVERS RESOLVERS[typ] = f return f return inner # Type-to-strategy resolver functions @resolves(pydantic.JsonWrapper) def resolve_json(cls): # type: ignore[no-untyped-def] try: inner = st.none() if cls.inner_type is None else st.from_type(cls.inner_type) except Exception: # pragma: no cover finite = st.floats(allow_infinity=False, allow_nan=False) inner = st.recursive( base=st.one_of(st.none(), st.booleans(), st.integers(), finite, st.text()), extend=lambda x: st.lists(x) | st.dictionaries(st.text(), x), # type: ignore ) inner_type = getattr(cls, 'inner_type', None) return st.builds( cls.inner_type.json if lenient_issubclass(inner_type, pydantic.BaseModel) else json.dumps, inner, ensure_ascii=st.booleans(), indent=st.none() | st.integers(0, 16), sort_keys=st.booleans(), ) 
@resolves(pydantic.ConstrainedBytes) def resolve_conbytes(cls): # type: ignore[no-untyped-def] # pragma: no cover min_size = cls.min_length or 0 max_size = cls.max_length if not cls.strip_whitespace: return st.binary(min_size=min_size, max_size=max_size) # Fun with regex to ensure we neither start nor end with whitespace repeats = '{{{},{}}}'.format( min_size - 2 if min_size > 2 else 0, max_size - 2 if (max_size or 0) > 2 else '', ) if min_size >= 2: pattern = rf'\W.{repeats}\W' elif min_size == 1: pattern = rf'\W(.{repeats}\W)?' else: assert min_size == 0 pattern = rf'(\W(.{repeats}\W)?)?' return st.from_regex(pattern.encode(), fullmatch=True) @resolves(pydantic.ConstrainedDecimal) def resolve_condecimal(cls): # type: ignore[no-untyped-def] min_value = cls.ge max_value = cls.le if cls.gt is not None: assert min_value is None, 'Set `gt` or `ge`, but not both' min_value = cls.gt if cls.lt is not None: assert max_value is None, 'Set `lt` or `le`, but not both' max_value = cls.lt s = st.decimals(min_value, max_value, allow_nan=False, places=cls.decimal_places) if cls.lt is not None: s = s.filter(lambda d: d < cls.lt) if cls.gt is not None: s = s.filter(lambda d: cls.gt < d) return s @resolves(pydantic.ConstrainedFloat) def resolve_confloat(cls): # type: ignore[no-untyped-def] min_value = cls.ge max_value = cls.le exclude_min = False exclude_max = False if cls.gt is not None: assert min_value is None, 'Set `gt` or `ge`, but not both' min_value = cls.gt exclude_min = True if cls.lt is not None: assert max_value is None, 'Set `lt` or `le`, but not both' max_value = cls.lt exclude_max = True if cls.multiple_of is None: return st.floats(min_value, max_value, exclude_min=exclude_min, exclude_max=exclude_max, allow_nan=False) if min_value is not None: min_value = math.ceil(min_value / cls.multiple_of) if exclude_min: min_value = min_value + 1 if max_value is not None: assert max_value >= cls.multiple_of, 'Cannot build model with max value smaller than multiple of' max_value 
= math.floor(max_value / cls.multiple_of) if exclude_max: max_value = max_value - 1 return st.integers(min_value, max_value).map(lambda x: x * cls.multiple_of) @resolves(pydantic.ConstrainedInt) def resolve_conint(cls): # type: ignore[no-untyped-def] min_value = cls.ge max_value = cls.le if cls.gt is not None: assert min_value is None, 'Set `gt` or `ge`, but not both' min_value = cls.gt + 1 if cls.lt is not None: assert max_value is None, 'Set `lt` or `le`, but not both' max_value = cls.lt - 1 if cls.multiple_of is None or cls.multiple_of == 1: return st.integers(min_value, max_value) # These adjustments and the .map handle integer-valued multiples, while the # .filter handles trickier cases as for confloat. if min_value is not None: min_value = math.ceil(Fraction(min_value) / Fraction(cls.multiple_of)) if max_value is not None: max_value = math.floor(Fraction(max_value) / Fraction(cls.multiple_of)) return st.integers(min_value, max_value).map(lambda x: x * cls.multiple_of) @resolves(pydantic.ConstrainedDate) def resolve_condate(cls): # type: ignore[no-untyped-def] if cls.ge is not None: assert cls.gt is None, 'Set `gt` or `ge`, but not both' min_value = cls.ge elif cls.gt is not None: min_value = cls.gt + datetime.timedelta(days=1) else: min_value = datetime.date.min if cls.le is not None: assert cls.lt is None, 'Set `lt` or `le`, but not both' max_value = cls.le elif cls.lt is not None: max_value = cls.lt - datetime.timedelta(days=1) else: max_value = datetime.date.max return st.dates(min_value, max_value) @resolves(pydantic.ConstrainedStr) def resolve_constr(cls): # type: ignore[no-untyped-def] # pragma: no cover min_size = cls.min_length or 0 max_size = cls.max_length if cls.regex is None and not cls.strip_whitespace: return st.text(min_size=min_size, max_size=max_size) if cls.regex is not None: strategy = st.from_regex(cls.regex) if cls.strip_whitespace: strategy = strategy.filter(lambda s: s == s.strip()) elif cls.strip_whitespace: repeats = 
'{{{},{}}}'.format( min_size - 2 if min_size > 2 else 0, max_size - 2 if (max_size or 0) > 2 else '', ) if min_size >= 2: strategy = st.from_regex(rf'\W.{repeats}\W') elif min_size == 1: strategy = st.from_regex(rf'\W(.{repeats}\W)?') else: assert min_size == 0 strategy = st.from_regex(rf'(\W(.{repeats}\W)?)?') if min_size == 0 and max_size is None: return strategy elif max_size is None: return strategy.filter(lambda s: min_size <= len(s)) return strategy.filter(lambda s: min_size <= len(s) <= max_size) # Finally, register all previously-defined types, and patch in our new function for typ in list(pydantic.types._DEFINED_TYPES): _registered(typ) pydantic.types._registered = _registered st.register_type_strategy(pydantic.Json, resolve_json) pydantic-2.10.6/pydantic/v1/annotated_types.py000066400000000000000000000061251474456633400213530ustar00rootroot00000000000000import sys from typing import TYPE_CHECKING, Any, Dict, FrozenSet, NamedTuple, Type from pydantic.v1.fields import Required from pydantic.v1.main import BaseModel, create_model from pydantic.v1.typing import is_typeddict, is_typeddict_special if TYPE_CHECKING: from typing_extensions import TypedDict if sys.version_info < (3, 11): def is_legacy_typeddict(typeddict_cls: Type['TypedDict']) -> bool: # type: ignore[valid-type] return is_typeddict(typeddict_cls) and type(typeddict_cls).__module__ == 'typing' else: def is_legacy_typeddict(_: Any) -> Any: return False def create_model_from_typeddict( # Mypy bug: `Type[TypedDict]` is resolved as `Any` https://github.com/python/mypy/issues/11030 typeddict_cls: Type['TypedDict'], # type: ignore[valid-type] **kwargs: Any, ) -> Type['BaseModel']: """ Create a `BaseModel` based on the fields of a `TypedDict`. Since `typing.TypedDict` in Python 3.8 does not store runtime information about optional keys, we raise an error if this happens (see https://bugs.python.org/issue38834). 
""" field_definitions: Dict[str, Any] # Best case scenario: with python 3.9+ or when `TypedDict` is imported from `typing_extensions` if not hasattr(typeddict_cls, '__required_keys__'): raise TypeError( 'You should use `typing_extensions.TypedDict` instead of `typing.TypedDict` with Python < 3.9.2. ' 'Without it, there is no way to differentiate required and optional fields when subclassed.' ) if is_legacy_typeddict(typeddict_cls) and any( is_typeddict_special(t) for t in typeddict_cls.__annotations__.values() ): raise TypeError( 'You should use `typing_extensions.TypedDict` instead of `typing.TypedDict` with Python < 3.11. ' 'Without it, there is no way to reflect Required/NotRequired keys.' ) required_keys: FrozenSet[str] = typeddict_cls.__required_keys__ # type: ignore[attr-defined] field_definitions = { field_name: (field_type, Required if field_name in required_keys else None) for field_name, field_type in typeddict_cls.__annotations__.items() } return create_model(typeddict_cls.__name__, **kwargs, **field_definitions) def create_model_from_namedtuple(namedtuple_cls: Type['NamedTuple'], **kwargs: Any) -> Type['BaseModel']: """ Create a `BaseModel` based on the fields of a named tuple. A named tuple can be created with `typing.NamedTuple` and declared annotations but also with `collections.namedtuple`, in this case we consider all fields to have type `Any`. """ # With python 3.10+, `__annotations__` always exists but can be empty hence the `getattr... 
or...` logic namedtuple_annotations: Dict[str, Type[Any]] = getattr(namedtuple_cls, '__annotations__', None) or { k: Any for k in namedtuple_cls._fields } field_definitions: Dict[str, Any] = { field_name: (field_type, Required) for field_name, field_type in namedtuple_annotations.items() } return create_model(namedtuple_cls.__name__, **kwargs, **field_definitions) pydantic-2.10.6/pydantic/v1/class_validators.py000066400000000000000000000345201474456633400215070ustar00rootroot00000000000000import warnings from collections import ChainMap from functools import partial, partialmethod, wraps from itertools import chain from types import FunctionType from typing import TYPE_CHECKING, Any, Callable, Dict, Iterable, List, Optional, Set, Tuple, Type, Union, overload from pydantic.v1.errors import ConfigError from pydantic.v1.typing import AnyCallable from pydantic.v1.utils import ROOT_KEY, in_ipython if TYPE_CHECKING: from pydantic.v1.typing import AnyClassMethod class Validator: __slots__ = 'func', 'pre', 'each_item', 'always', 'check_fields', 'skip_on_failure' def __init__( self, func: AnyCallable, pre: bool = False, each_item: bool = False, always: bool = False, check_fields: bool = False, skip_on_failure: bool = False, ): self.func = func self.pre = pre self.each_item = each_item self.always = always self.check_fields = check_fields self.skip_on_failure = skip_on_failure if TYPE_CHECKING: from inspect import Signature from pydantic.v1.config import BaseConfig from pydantic.v1.fields import ModelField from pydantic.v1.types import ModelOrDc ValidatorCallable = Callable[[Optional[ModelOrDc], Any, Dict[str, Any], ModelField, Type[BaseConfig]], Any] ValidatorsList = List[ValidatorCallable] ValidatorListDict = Dict[str, List[Validator]] _FUNCS: Set[str] = set() VALIDATOR_CONFIG_KEY = '__validator_config__' ROOT_VALIDATOR_CONFIG_KEY = '__root_validator_config__' def validator( *fields: str, pre: bool = False, each_item: bool = False, always: bool = False, check_fields: bool 
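`create_model_from_namedtuple` above relies on the `getattr ... or ...` fallback because a bare `collections.namedtuple` class, unlike `typing.NamedTuple`, carries no useful annotations. A minimal sketch of just that lookup:

```python
from collections import namedtuple
from typing import Any, NamedTuple

def namedtuple_field_types(namedtuple_cls):
    # Annotations may be missing or empty, hence `or` rather than a plain default.
    return getattr(namedtuple_cls, '__annotations__', None) or {
        k: Any for k in namedtuple_cls._fields
    }

class Point(NamedTuple):
    x: int
    y: int

Legacy = namedtuple('Legacy', ['a', 'b'])
```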
= True, whole: Optional[bool] = None, allow_reuse: bool = False, ) -> Callable[[AnyCallable], 'AnyClassMethod']: """ Decorate methods on the class indicating that they should be used to validate fields :param fields: which field(s) the method should be called on :param pre: whether or not this validator should be called before the standard validators (else after) :param each_item: for complex objects (sets, lists etc.) whether to validate individual elements rather than the whole object :param always: whether this method and other validators should be called even if the value is missing :param check_fields: whether to check that the fields actually exist on the model :param allow_reuse: whether to track and raise an error if another validator refers to the decorated function """ if not fields: raise ConfigError('validator with no fields specified') elif isinstance(fields[0], FunctionType): raise ConfigError( "validators should be used with fields and keyword arguments, not bare. " # noqa: Q000 "E.g. usage should be `@validator('', ...)`" ) elif not all(isinstance(field, str) for field in fields): raise ConfigError( "validator fields should be passed as separate string args. " # noqa: Q000 "E.g. usage should be `@validator('', '', ...)`" ) if whole is not None: warnings.warn( 'The "whole" keyword argument is deprecated, use "each_item" (inverse meaning, default False) instead', DeprecationWarning, ) assert each_item is False, '"each_item" and "whole" conflict, remove "whole"' each_item = not whole def dec(f: AnyCallable) -> 'AnyClassMethod': f_cls = _prepare_validator(f, allow_reuse) setattr( f_cls, VALIDATOR_CONFIG_KEY, ( fields, Validator(func=f_cls.__func__, pre=pre, each_item=each_item, always=always, check_fields=check_fields), ), ) return f_cls return dec @overload def root_validator(_func: AnyCallable) -> 'AnyClassMethod': ... 
@overload def root_validator( *, pre: bool = False, allow_reuse: bool = False, skip_on_failure: bool = False ) -> Callable[[AnyCallable], 'AnyClassMethod']: ... def root_validator( _func: Optional[AnyCallable] = None, *, pre: bool = False, allow_reuse: bool = False, skip_on_failure: bool = False ) -> Union['AnyClassMethod', Callable[[AnyCallable], 'AnyClassMethod']]: """ Decorate methods on a model indicating that they should be used to validate (and perhaps modify) data either before or after standard model parsing/validation is performed. """ if _func: f_cls = _prepare_validator(_func, allow_reuse) setattr( f_cls, ROOT_VALIDATOR_CONFIG_KEY, Validator(func=f_cls.__func__, pre=pre, skip_on_failure=skip_on_failure) ) return f_cls def dec(f: AnyCallable) -> 'AnyClassMethod': f_cls = _prepare_validator(f, allow_reuse) setattr( f_cls, ROOT_VALIDATOR_CONFIG_KEY, Validator(func=f_cls.__func__, pre=pre, skip_on_failure=skip_on_failure) ) return f_cls return dec def _prepare_validator(function: AnyCallable, allow_reuse: bool) -> 'AnyClassMethod': """ Avoid validators with duplicated names since without this, validators can be overwritten silently which generally isn't the intended behaviour, don't run in ipython (see #312) or if allow_reuse is False. """ f_cls = function if isinstance(function, classmethod) else classmethod(function) if not in_ipython() and not allow_reuse: ref = ( getattr(f_cls.__func__, '__module__', '') + '.' 
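`root_validator` above supports both the bare form `@root_validator` and the called form `@root_validator(pre=True)`, which is why `_func` is optional and the overloads exist. The dual-form decorator pattern in isolation (names here are illustrative):

```python
from typing import Callable, Optional

def my_validator(_func: Optional[Callable] = None, *, pre: bool = False):
    def attach(f: Callable) -> Callable:
        f.__validator_config__ = {'pre': pre}  # stash the config on the function
        return f
    if _func is not None:   # bare use: @my_validator
        return attach(_func)
    return attach           # called use: @my_validator(pre=True)

@my_validator
def check_a(values):
    return values

@my_validator(pre=True)
def check_b(values):
    return values
```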
+ getattr(f_cls.__func__, '__qualname__', f'') ) if ref in _FUNCS: raise ConfigError(f'duplicate validator function "{ref}"; if this is intended, set `allow_reuse=True`') _FUNCS.add(ref) return f_cls class ValidatorGroup: def __init__(self, validators: 'ValidatorListDict') -> None: self.validators = validators self.used_validators = {'*'} def get_validators(self, name: str) -> Optional[Dict[str, Validator]]: self.used_validators.add(name) validators = self.validators.get(name, []) if name != ROOT_KEY: validators += self.validators.get('*', []) if validators: return {getattr(v.func, '__name__', f''): v for v in validators} else: return None def check_for_unused(self) -> None: unused_validators = set( chain.from_iterable( ( getattr(v.func, '__name__', f'') for v in self.validators[f] if v.check_fields ) for f in (self.validators.keys() - self.used_validators) ) ) if unused_validators: fn = ', '.join(unused_validators) raise ConfigError( f"Validators defined with incorrect fields: {fn} " # noqa: Q000 f"(use check_fields=False if you're inheriting from the model and intended this)" ) def extract_validators(namespace: Dict[str, Any]) -> Dict[str, List[Validator]]: validators: Dict[str, List[Validator]] = {} for var_name, value in namespace.items(): validator_config = getattr(value, VALIDATOR_CONFIG_KEY, None) if validator_config: fields, v = validator_config for field in fields: if field in validators: validators[field].append(v) else: validators[field] = [v] return validators def extract_root_validators(namespace: Dict[str, Any]) -> Tuple[List[AnyCallable], List[Tuple[bool, AnyCallable]]]: from inspect import signature pre_validators: List[AnyCallable] = [] post_validators: List[Tuple[bool, AnyCallable]] = [] for name, value in namespace.items(): validator_config: Optional[Validator] = getattr(value, ROOT_VALIDATOR_CONFIG_KEY, None) if validator_config: sig = signature(validator_config.func) args = list(sig.parameters.keys()) if args[0] == 'self': raise ConfigError( 
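`_prepare_validator` above guards against one validator silently overwriting another with the same module-qualified name (a common copy-paste accident). The registry idea reduced to stdlib code (hypothetical names):

```python
_SEEN = set()

def register_once(func, allow_reuse=False):
    # Identify the function by module + qualname, as the `_FUNCS` set does.
    ref = getattr(func, '__module__', '') + '.' + getattr(func, '__qualname__', '')
    if not allow_reuse:
        if ref in _SEEN:
            raise ValueError(f'duplicate validator function "{ref}"')
        _SEEN.add(ref)
    return func

def check_x(v):
    return v

register_once(check_x)
```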
f'Invalid signature for root validator {name}: {sig}, "self" not permitted as first argument, ' f'should be: (cls, values).' ) if len(args) != 2: raise ConfigError(f'Invalid signature for root validator {name}: {sig}, should be: (cls, values).') # check function signature if validator_config.pre: pre_validators.append(validator_config.func) else: post_validators.append((validator_config.skip_on_failure, validator_config.func)) return pre_validators, post_validators def inherit_validators(base_validators: 'ValidatorListDict', validators: 'ValidatorListDict') -> 'ValidatorListDict': for field, field_validators in base_validators.items(): if field not in validators: validators[field] = [] validators[field] += field_validators return validators def make_generic_validator(validator: AnyCallable) -> 'ValidatorCallable': """ Make a generic function which calls a validator with the right arguments. Unfortunately other approaches (eg. return a partial of a function that builds the arguments) is slow, hence this laborious way of doing things. It's done like this so validators don't all need **kwargs in their signature, eg. any combination of the arguments "values", "fields" and/or "config" are permitted. """ from inspect import signature if not isinstance(validator, (partial, partialmethod)): # This should be the default case, so overhead is reduced sig = signature(validator) args = list(sig.parameters.keys()) else: # Fix the generated argument lists of partial methods sig = signature(validator.func) args = [ k for k in signature(validator.func).parameters.keys() if k not in validator.args | validator.keywords.keys() ] first_arg = args.pop(0) if first_arg == 'self': raise ConfigError( f'Invalid signature for validator {validator}: {sig}, "self" not permitted as first argument, ' f'should be: (cls, value, values, config, field), "values", "config" and "field" are all optional.' 
) elif first_arg == 'cls': # assume the second argument is value return wraps(validator)(_generic_validator_cls(validator, sig, set(args[1:]))) else: # assume the first argument was value which has already been removed return wraps(validator)(_generic_validator_basic(validator, sig, set(args))) def prep_validators(v_funcs: Iterable[AnyCallable]) -> 'ValidatorsList': return [make_generic_validator(f) for f in v_funcs if f] all_kwargs = {'values', 'field', 'config'} def _generic_validator_cls(validator: AnyCallable, sig: 'Signature', args: Set[str]) -> 'ValidatorCallable': # assume the first argument is value has_kwargs = False if 'kwargs' in args: has_kwargs = True args -= {'kwargs'} if not args.issubset(all_kwargs): raise ConfigError( f'Invalid signature for validator {validator}: {sig}, should be: ' f'(cls, value, values, config, field), "values", "config" and "field" are all optional.' ) if has_kwargs: return lambda cls, v, values, field, config: validator(cls, v, values=values, field=field, config=config) elif args == set(): return lambda cls, v, values, field, config: validator(cls, v) elif args == {'values'}: return lambda cls, v, values, field, config: validator(cls, v, values=values) elif args == {'field'}: return lambda cls, v, values, field, config: validator(cls, v, field=field) elif args == {'config'}: return lambda cls, v, values, field, config: validator(cls, v, config=config) elif args == {'values', 'field'}: return lambda cls, v, values, field, config: validator(cls, v, values=values, field=field) elif args == {'values', 'config'}: return lambda cls, v, values, field, config: validator(cls, v, values=values, config=config) elif args == {'field', 'config'}: return lambda cls, v, values, field, config: validator(cls, v, field=field, config=config) else: # args == {'values', 'field', 'config'} return lambda cls, v, values, field, config: validator(cls, v, values=values, field=field, config=config) def _generic_validator_basic(validator: AnyCallable, 
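`make_generic_validator` above inspects the validator's signature once and selects a lambda with exactly the right call shape, so user validators never need `**kwargs`. The same idea can be sketched more compactly with a filtering wrapper (simplified, and slower than the explicit dispatch table above):

```python
from inspect import signature

def make_generic(fn):
    # Which of the optional keyword arguments does `fn` actually accept?
    wanted = set(list(signature(fn).parameters)[1:]) & {'values', 'field', 'config'}
    def wrapper(value, **kwargs):
        return fn(value, **{k: v for k, v in kwargs.items() if k in wanted})
    return wrapper

def upper_if_strict(value, values):
    return value.upper() if values.get('strict') else value

generic = make_generic(upper_if_strict)
```

pydantic enumerates every argument combination explicitly because a dict-filtering wrapper like this adds per-call overhead on a hot path.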
sig: 'Signature', args: Set[str]) -> 'ValidatorCallable': has_kwargs = False if 'kwargs' in args: has_kwargs = True args -= {'kwargs'} if not args.issubset(all_kwargs): raise ConfigError( f'Invalid signature for validator {validator}: {sig}, should be: ' f'(value, values, config, field), "values", "config" and "field" are all optional.' ) if has_kwargs: return lambda cls, v, values, field, config: validator(v, values=values, field=field, config=config) elif args == set(): return lambda cls, v, values, field, config: validator(v) elif args == {'values'}: return lambda cls, v, values, field, config: validator(v, values=values) elif args == {'field'}: return lambda cls, v, values, field, config: validator(v, field=field) elif args == {'config'}: return lambda cls, v, values, field, config: validator(v, config=config) elif args == {'values', 'field'}: return lambda cls, v, values, field, config: validator(v, values=values, field=field) elif args == {'values', 'config'}: return lambda cls, v, values, field, config: validator(v, values=values, config=config) elif args == {'field', 'config'}: return lambda cls, v, values, field, config: validator(v, field=field, config=config) else: # args == {'values', 'field', 'config'} return lambda cls, v, values, field, config: validator(v, values=values, field=field, config=config) def gather_all_validators(type_: 'ModelOrDc') -> Dict[str, 'AnyClassMethod']: all_attributes = ChainMap(*[cls.__dict__ for cls in type_.__mro__]) # type: ignore[arg-type,var-annotated] return { k: v for k, v in all_attributes.items() if hasattr(v, VALIDATOR_CONFIG_KEY) or hasattr(v, ROOT_VALIDATOR_CONFIG_KEY) } pydantic-2.10.6/pydantic/v1/color.py000066400000000000000000000407141474456633400172720ustar00rootroot00000000000000""" Color definitions are used as per CSS3 specification: http://www.w3.org/TR/css3-color/#svg-color A few colors have multiple names referring to the sames colors, eg. `grey` and `gray` or `aqua` and `cyan`. 
In these cases the LAST color when sorted alphabetically takes precedence, eg. Color((0, 255, 255)).as_named() == 'cyan' because "cyan" comes after "aqua". """ import math import re from colorsys import hls_to_rgb, rgb_to_hls from typing import TYPE_CHECKING, Any, Dict, Optional, Tuple, Union, cast from pydantic.v1.errors import ColorError from pydantic.v1.utils import Representation, almost_equal_floats if TYPE_CHECKING: from pydantic.v1.typing import CallableGenerator, ReprArgs ColorTuple = Union[Tuple[int, int, int], Tuple[int, int, int, float]] ColorType = Union[ColorTuple, str] HslColorTuple = Union[Tuple[float, float, float], Tuple[float, float, float, float]] class RGBA: """ Internal use only as a representation of a color. """ __slots__ = 'r', 'g', 'b', 'alpha', '_tuple' def __init__(self, r: float, g: float, b: float, alpha: Optional[float]): self.r = r self.g = g self.b = b self.alpha = alpha self._tuple: Tuple[float, float, float, Optional[float]] = (r, g, b, alpha) def __getitem__(self, item: Any) -> Any: return self._tuple[item] # these are not compiled here to avoid import slowdown, they'll be compiled the first time they're used, then cached r_hex_short = r'\s*(?:#|0x)?([0-9a-f])([0-9a-f])([0-9a-f])([0-9a-f])?\s*' r_hex_long = r'\s*(?:#|0x)?([0-9a-f]{2})([0-9a-f]{2})([0-9a-f]{2})([0-9a-f]{2})?\s*' _r_255 = r'(\d{1,3}(?:\.\d+)?)' _r_comma = r'\s*,\s*' r_rgb = fr'\s*rgb\(\s*{_r_255}{_r_comma}{_r_255}{_r_comma}{_r_255}\)\s*' _r_alpha = r'(\d(?:\.\d+)?|\.\d+|\d{1,2}%)' r_rgba = fr'\s*rgba\(\s*{_r_255}{_r_comma}{_r_255}{_r_comma}{_r_255}{_r_comma}{_r_alpha}\s*\)\s*' _r_h = r'(-?\d+(?:\.\d+)?|-?\.\d+)(deg|rad|turn)?'
_r_sl = r'(\d{1,3}(?:\.\d+)?)%' r_hsl = fr'\s*hsl\(\s*{_r_h}{_r_comma}{_r_sl}{_r_comma}{_r_sl}\s*\)\s*' r_hsla = fr'\s*hsl\(\s*{_r_h}{_r_comma}{_r_sl}{_r_comma}{_r_sl}{_r_comma}{_r_alpha}\s*\)\s*' # colors where the two hex characters are the same, if all colors match this the short version of hex colors can be used repeat_colors = {int(c * 2, 16) for c in '0123456789abcdef'} rads = 2 * math.pi class Color(Representation): __slots__ = '_original', '_rgba' def __init__(self, value: ColorType) -> None: self._rgba: RGBA self._original: ColorType if isinstance(value, (tuple, list)): self._rgba = parse_tuple(value) elif isinstance(value, str): self._rgba = parse_str(value) elif isinstance(value, Color): self._rgba = value._rgba value = value._original else: raise ColorError(reason='value must be a tuple, list or string') # if we've got here value must be a valid color self._original = value @classmethod def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None: field_schema.update(type='string', format='color') def original(self) -> ColorType: """ Original value passed to Color """ return self._original def as_named(self, *, fallback: bool = False) -> str: if self._rgba.alpha is None: rgb = cast(Tuple[int, int, int], self.as_rgb_tuple()) try: return COLORS_BY_VALUE[rgb] except KeyError as e: if fallback: return self.as_hex() else: raise ValueError('no named color found, use fallback=True, as_hex() or as_rgb()') from e else: return self.as_hex() def as_hex(self) -> str: """ Hex string representing the color can be 3, 4, 6 or 8 characters depending on whether the string a "short" representation of the color is possible and whether there's an alpha channel. 
""" values = [float_to_255(c) for c in self._rgba[:3]] if self._rgba.alpha is not None: values.append(float_to_255(self._rgba.alpha)) as_hex = ''.join(f'{v:02x}' for v in values) if all(c in repeat_colors for c in values): as_hex = ''.join(as_hex[c] for c in range(0, len(as_hex), 2)) return '#' + as_hex def as_rgb(self) -> str: """ Color as an rgb(, , ) or rgba(, , , ) string. """ if self._rgba.alpha is None: return f'rgb({float_to_255(self._rgba.r)}, {float_to_255(self._rgba.g)}, {float_to_255(self._rgba.b)})' else: return ( f'rgba({float_to_255(self._rgba.r)}, {float_to_255(self._rgba.g)}, {float_to_255(self._rgba.b)}, ' f'{round(self._alpha_float(), 2)})' ) def as_rgb_tuple(self, *, alpha: Optional[bool] = None) -> ColorTuple: """ Color as an RGB or RGBA tuple; red, green and blue are in the range 0 to 255, alpha if included is in the range 0 to 1. :param alpha: whether to include the alpha channel, options are None - (default) include alpha only if it's set (e.g. not None) True - always include alpha, False - always omit alpha, """ r, g, b = (float_to_255(c) for c in self._rgba[:3]) if alpha is None: if self._rgba.alpha is None: return r, g, b else: return r, g, b, self._alpha_float() elif alpha: return r, g, b, self._alpha_float() else: # alpha is False return r, g, b def as_hsl(self) -> str: """ Color as an hsl(, , ) or hsl(, , , ) string. """ if self._rgba.alpha is None: h, s, li = self.as_hsl_tuple(alpha=False) # type: ignore return f'hsl({h * 360:0.0f}, {s:0.0%}, {li:0.0%})' else: h, s, li, a = self.as_hsl_tuple(alpha=True) # type: ignore return f'hsl({h * 360:0.0f}, {s:0.0%}, {li:0.0%}, {round(a, 2)})' def as_hsl_tuple(self, *, alpha: Optional[bool] = None) -> HslColorTuple: """ Color as an HSL or HSLA tuple, e.g. hue, saturation, lightness and optionally alpha; all elements are in the range 0 to 1. NOTE: this is HSL as used in HTML and most other places, not HLS as used in python's colorsys. 
:param alpha: whether to include the alpha channel, options are None - (default) include alpha only if it's set (e.g. not None) True - always include alpha, False - always omit alpha, """ h, l, s = rgb_to_hls(self._rgba.r, self._rgba.g, self._rgba.b) if alpha is None: if self._rgba.alpha is None: return h, s, l else: return h, s, l, self._alpha_float() if alpha: return h, s, l, self._alpha_float() else: # alpha is False return h, s, l def _alpha_float(self) -> float: return 1 if self._rgba.alpha is None else self._rgba.alpha @classmethod def __get_validators__(cls) -> 'CallableGenerator': yield cls def __str__(self) -> str: return self.as_named(fallback=True) def __repr_args__(self) -> 'ReprArgs': return [(None, self.as_named(fallback=True))] + [('rgb', self.as_rgb_tuple())] # type: ignore def __eq__(self, other: Any) -> bool: return isinstance(other, Color) and self.as_rgb_tuple() == other.as_rgb_tuple() def __hash__(self) -> int: return hash(self.as_rgb_tuple()) def parse_tuple(value: Tuple[Any, ...]) -> RGBA: """ Parse a tuple or list as a color. """ if len(value) == 3: r, g, b = (parse_color_value(v) for v in value) return RGBA(r, g, b, None) elif len(value) == 4: r, g, b = (parse_color_value(v) for v in value[:3]) return RGBA(r, g, b, parse_float_alpha(value[3])) else: raise ColorError(reason='tuples must have length 3 or 4') def parse_str(value: str) -> RGBA: """ Parse a string to an RGBA tuple, trying the following formats (in this order): * named color, see COLORS_BY_NAME below * hex short eg. `fff` (prefix can be `#`, `0x` or nothing) * hex long eg. 
`ffffff` (prefix can be `#`, `0x` or nothing) * `rgb(, , ) ` * `rgba(, , , )` """ value_lower = value.lower() try: r, g, b = COLORS_BY_NAME[value_lower] except KeyError: pass else: return ints_to_rgba(r, g, b, None) m = re.fullmatch(r_hex_short, value_lower) if m: *rgb, a = m.groups() r, g, b = (int(v * 2, 16) for v in rgb) if a: alpha: Optional[float] = int(a * 2, 16) / 255 else: alpha = None return ints_to_rgba(r, g, b, alpha) m = re.fullmatch(r_hex_long, value_lower) if m: *rgb, a = m.groups() r, g, b = (int(v, 16) for v in rgb) if a: alpha = int(a, 16) / 255 else: alpha = None return ints_to_rgba(r, g, b, alpha) m = re.fullmatch(r_rgb, value_lower) if m: return ints_to_rgba(*m.groups(), None) # type: ignore m = re.fullmatch(r_rgba, value_lower) if m: return ints_to_rgba(*m.groups()) # type: ignore m = re.fullmatch(r_hsl, value_lower) if m: h, h_units, s, l_ = m.groups() return parse_hsl(h, h_units, s, l_) m = re.fullmatch(r_hsla, value_lower) if m: h, h_units, s, l_, a = m.groups() return parse_hsl(h, h_units, s, l_, parse_float_alpha(a)) raise ColorError(reason='string not recognised as a valid color') def ints_to_rgba(r: Union[int, str], g: Union[int, str], b: Union[int, str], alpha: Optional[float]) -> RGBA: return RGBA(parse_color_value(r), parse_color_value(g), parse_color_value(b), parse_float_alpha(alpha)) def parse_color_value(value: Union[int, str], max_val: int = 255) -> float: """ Parse a value checking it's a valid int in the range 0 to max_val and divide by max_val to give a number in the range 0 to 1 """ try: color = float(value) except ValueError: raise ColorError(reason='color values must be a valid number') if 0 <= color <= max_val: return color / max_val else: raise ColorError(reason=f'color values must be in the range 0 to {max_val}') def parse_float_alpha(value: Union[None, str, float, int]) -> Optional[float]: """ Parse a value checking it's a valid float in the range 0 to 1 """ if value is None: return None try: if isinstance(value, str) 
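For the short hex form, `parse_str` above doubles each digit before converting, so `'f'` becomes `0xff`. A self-contained version of just that branch (alpha handling simplified; the real code also normalises an alpha of 1.0 to `None` via `parse_float_alpha`):

```python
import re

r_hex_short = r'\s*(?:#|0x)?([0-9a-f])([0-9a-f])([0-9a-f])([0-9a-f])?\s*'

def parse_short_hex(value):
    m = re.fullmatch(r_hex_short, value.lower())
    if m is None:
        return None
    *rgb, a = m.groups()
    r, g, b = (int(v * 2, 16) for v in rgb)  # 'f' -> 'ff' -> 255
    alpha = int(a * 2, 16) / 255 if a else None
    return r, g, b, alpha

print(parse_short_hex('#0f8'))  # (0, 255, 136, None)
```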
and value.endswith('%'): alpha = float(value[:-1]) / 100 else: alpha = float(value) except ValueError: raise ColorError(reason='alpha values must be a valid float') if almost_equal_floats(alpha, 1): return None elif 0 <= alpha <= 1: return alpha else: raise ColorError(reason='alpha values must be in the range 0 to 1') def parse_hsl(h: str, h_units: str, sat: str, light: str, alpha: Optional[float] = None) -> RGBA: """ Parse raw hue, saturation, lightness and alpha values and convert to RGBA. """ s_value, l_value = parse_color_value(sat, 100), parse_color_value(light, 100) h_value = float(h) if h_units in {None, 'deg'}: h_value = h_value % 360 / 360 elif h_units == 'rad': h_value = h_value % rads / rads else: # turns h_value = h_value % 1 r, g, b = hls_to_rgb(h_value, l_value, s_value) return RGBA(r, g, b, alpha) def float_to_255(c: float) -> int: return int(round(c * 255)) COLORS_BY_NAME = { 'aliceblue': (240, 248, 255), 'antiquewhite': (250, 235, 215), 'aqua': (0, 255, 255), 'aquamarine': (127, 255, 212), 'azure': (240, 255, 255), 'beige': (245, 245, 220), 'bisque': (255, 228, 196), 'black': (0, 0, 0), 'blanchedalmond': (255, 235, 205), 'blue': (0, 0, 255), 'blueviolet': (138, 43, 226), 'brown': (165, 42, 42), 'burlywood': (222, 184, 135), 'cadetblue': (95, 158, 160), 'chartreuse': (127, 255, 0), 'chocolate': (210, 105, 30), 'coral': (255, 127, 80), 'cornflowerblue': (100, 149, 237), 'cornsilk': (255, 248, 220), 'crimson': (220, 20, 60), 'cyan': (0, 255, 255), 'darkblue': (0, 0, 139), 'darkcyan': (0, 139, 139), 'darkgoldenrod': (184, 134, 11), 'darkgray': (169, 169, 169), 'darkgreen': (0, 100, 0), 'darkgrey': (169, 169, 169), 'darkkhaki': (189, 183, 107), 'darkmagenta': (139, 0, 139), 'darkolivegreen': (85, 107, 47), 'darkorange': (255, 140, 0), 'darkorchid': (153, 50, 204), 'darkred': (139, 0, 0), 'darksalmon': (233, 150, 122), 'darkseagreen': (143, 188, 143), 'darkslateblue': (72, 61, 139), 'darkslategray': (47, 79, 79), 'darkslategrey': (47, 79, 79), 
'darkturquoise': (0, 206, 209), 'darkviolet': (148, 0, 211), 'deeppink': (255, 20, 147), 'deepskyblue': (0, 191, 255), 'dimgray': (105, 105, 105), 'dimgrey': (105, 105, 105), 'dodgerblue': (30, 144, 255), 'firebrick': (178, 34, 34), 'floralwhite': (255, 250, 240), 'forestgreen': (34, 139, 34), 'fuchsia': (255, 0, 255), 'gainsboro': (220, 220, 220), 'ghostwhite': (248, 248, 255), 'gold': (255, 215, 0), 'goldenrod': (218, 165, 32), 'gray': (128, 128, 128), 'green': (0, 128, 0), 'greenyellow': (173, 255, 47), 'grey': (128, 128, 128), 'honeydew': (240, 255, 240), 'hotpink': (255, 105, 180), 'indianred': (205, 92, 92), 'indigo': (75, 0, 130), 'ivory': (255, 255, 240), 'khaki': (240, 230, 140), 'lavender': (230, 230, 250), 'lavenderblush': (255, 240, 245), 'lawngreen': (124, 252, 0), 'lemonchiffon': (255, 250, 205), 'lightblue': (173, 216, 230), 'lightcoral': (240, 128, 128), 'lightcyan': (224, 255, 255), 'lightgoldenrodyellow': (250, 250, 210), 'lightgray': (211, 211, 211), 'lightgreen': (144, 238, 144), 'lightgrey': (211, 211, 211), 'lightpink': (255, 182, 193), 'lightsalmon': (255, 160, 122), 'lightseagreen': (32, 178, 170), 'lightskyblue': (135, 206, 250), 'lightslategray': (119, 136, 153), 'lightslategrey': (119, 136, 153), 'lightsteelblue': (176, 196, 222), 'lightyellow': (255, 255, 224), 'lime': (0, 255, 0), 'limegreen': (50, 205, 50), 'linen': (250, 240, 230), 'magenta': (255, 0, 255), 'maroon': (128, 0, 0), 'mediumaquamarine': (102, 205, 170), 'mediumblue': (0, 0, 205), 'mediumorchid': (186, 85, 211), 'mediumpurple': (147, 112, 219), 'mediumseagreen': (60, 179, 113), 'mediumslateblue': (123, 104, 238), 'mediumspringgreen': (0, 250, 154), 'mediumturquoise': (72, 209, 204), 'mediumvioletred': (199, 21, 133), 'midnightblue': (25, 25, 112), 'mintcream': (245, 255, 250), 'mistyrose': (255, 228, 225), 'moccasin': (255, 228, 181), 'navajowhite': (255, 222, 173), 'navy': (0, 0, 128), 'oldlace': (253, 245, 230), 'olive': (128, 128, 0), 'olivedrab': (107, 142, 35), 
'orange': (255, 165, 0), 'orangered': (255, 69, 0), 'orchid': (218, 112, 214), 'palegoldenrod': (238, 232, 170), 'palegreen': (152, 251, 152), 'paleturquoise': (175, 238, 238), 'palevioletred': (219, 112, 147), 'papayawhip': (255, 239, 213), 'peachpuff': (255, 218, 185), 'peru': (205, 133, 63), 'pink': (255, 192, 203), 'plum': (221, 160, 221), 'powderblue': (176, 224, 230), 'purple': (128, 0, 128), 'red': (255, 0, 0), 'rosybrown': (188, 143, 143), 'royalblue': (65, 105, 225), 'saddlebrown': (139, 69, 19), 'salmon': (250, 128, 114), 'sandybrown': (244, 164, 96), 'seagreen': (46, 139, 87), 'seashell': (255, 245, 238), 'sienna': (160, 82, 45), 'silver': (192, 192, 192), 'skyblue': (135, 206, 235), 'slateblue': (106, 90, 205), 'slategray': (112, 128, 144), 'slategrey': (112, 128, 144), 'snow': (255, 250, 250), 'springgreen': (0, 255, 127), 'steelblue': (70, 130, 180), 'tan': (210, 180, 140), 'teal': (0, 128, 128), 'thistle': (216, 191, 216), 'tomato': (255, 99, 71), 'turquoise': (64, 224, 208), 'violet': (238, 130, 238), 'wheat': (245, 222, 179), 'white': (255, 255, 255), 'whitesmoke': (245, 245, 245), 'yellow': (255, 255, 0), 'yellowgreen': (154, 205, 50), } COLORS_BY_VALUE = {v: k for k, v in COLORS_BY_NAME.items()} pydantic-2.10.6/pydantic/v1/config.py000066400000000000000000000146041474456633400174200ustar00rootroot00000000000000import json from enum import Enum from typing import TYPE_CHECKING, Any, Callable, Dict, ForwardRef, Optional, Tuple, Type, Union from typing_extensions import Literal, Protocol from pydantic.v1.typing import AnyArgTCallable, AnyCallable from pydantic.v1.utils import GetterDict from pydantic.v1.version import compiled if TYPE_CHECKING: from typing import overload from pydantic.v1.fields import ModelField from pydantic.v1.main import BaseModel ConfigType = Type['BaseConfig'] class SchemaExtraCallable(Protocol): @overload def __call__(self, schema: Dict[str, Any]) -> None: pass @overload def __call__(self, schema: Dict[str, Any], model_class: 
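`COLORS_BY_VALUE` above inverts `COLORS_BY_NAME` with a dict comprehension, so for duplicate RGB values (e.g. `aqua`/`cyan`) the alphabetically-last name wins, because later entries overwrite earlier ones. A tiny demonstration:

```python
colors_by_name = {'aqua': (0, 255, 255), 'cyan': (0, 255, 255), 'red': (255, 0, 0)}
# later entries overwrite earlier ones for a shared value
colors_by_value = {v: k for k, v in colors_by_name.items()}
print(colors_by_value[(0, 255, 255)])  # cyan
```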
Type[BaseModel]) -> None: pass else: SchemaExtraCallable = Callable[..., None] __all__ = 'BaseConfig', 'ConfigDict', 'get_config', 'Extra', 'inherit_config', 'prepare_config' class Extra(str, Enum): allow = 'allow' ignore = 'ignore' forbid = 'forbid' # https://github.com/cython/cython/issues/4003 # Fixed in Cython 3 and Pydantic v1 won't support Cython 3. # Pydantic v2 doesn't depend on Cython at all. if not compiled: from typing_extensions import TypedDict class ConfigDict(TypedDict, total=False): title: Optional[str] anystr_lower: bool anystr_strip_whitespace: bool min_anystr_length: int max_anystr_length: Optional[int] validate_all: bool extra: Extra allow_mutation: bool frozen: bool allow_population_by_field_name: bool use_enum_values: bool fields: Dict[str, Union[str, Dict[str, str]]] validate_assignment: bool error_msg_templates: Dict[str, str] arbitrary_types_allowed: bool orm_mode: bool getter_dict: Type[GetterDict] alias_generator: Optional[Callable[[str], str]] keep_untouched: Tuple[type, ...] 
schema_extra: Union[Dict[str, object], 'SchemaExtraCallable'] json_loads: Callable[[str], object] json_dumps: AnyArgTCallable[str] json_encoders: Dict[Type[object], AnyCallable] underscore_attrs_are_private: bool allow_inf_nan: bool copy_on_model_validation: Literal['none', 'deep', 'shallow'] # whether dataclass `__post_init__` should be run after validation post_init_call: Literal['before_validation', 'after_validation'] else: ConfigDict = dict # type: ignore class BaseConfig: title: Optional[str] = None anystr_lower: bool = False anystr_upper: bool = False anystr_strip_whitespace: bool = False min_anystr_length: int = 0 max_anystr_length: Optional[int] = None validate_all: bool = False extra: Extra = Extra.ignore allow_mutation: bool = True frozen: bool = False allow_population_by_field_name: bool = False use_enum_values: bool = False fields: Dict[str, Union[str, Dict[str, str]]] = {} validate_assignment: bool = False error_msg_templates: Dict[str, str] = {} arbitrary_types_allowed: bool = False orm_mode: bool = False getter_dict: Type[GetterDict] = GetterDict alias_generator: Optional[Callable[[str], str]] = None keep_untouched: Tuple[type, ...] 
= () schema_extra: Union[Dict[str, Any], 'SchemaExtraCallable'] = {} json_loads: Callable[[str], Any] = json.loads json_dumps: Callable[..., str] = json.dumps json_encoders: Dict[Union[Type[Any], str, ForwardRef], AnyCallable] = {} underscore_attrs_are_private: bool = False allow_inf_nan: bool = True # whether inherited models as fields should be reconstructed as base model, # and whether such a copy should be shallow or deep copy_on_model_validation: Literal['none', 'deep', 'shallow'] = 'shallow' # whether `Union` should check all allowed types before even trying to coerce smart_union: bool = False # whether dataclass `__post_init__` should be run before or after validation post_init_call: Literal['before_validation', 'after_validation'] = 'before_validation' @classmethod def get_field_info(cls, name: str) -> Dict[str, Any]: """ Get properties of FieldInfo from the `fields` property of the config class. """ fields_value = cls.fields.get(name) if isinstance(fields_value, str): field_info: Dict[str, Any] = {'alias': fields_value} elif isinstance(fields_value, dict): field_info = fields_value else: field_info = {} if 'alias' in field_info: field_info.setdefault('alias_priority', 2) if field_info.get('alias_priority', 0) <= 1 and cls.alias_generator: alias = cls.alias_generator(name) if not isinstance(alias, str): raise TypeError(f'Config.alias_generator must return str, not {alias.__class__}') field_info.update(alias=alias, alias_priority=1) return field_info @classmethod def prepare_field(cls, field: 'ModelField') -> None: """ Optional hook to check or modify fields during model creation. """ pass def get_config(config: Union[ConfigDict, Type[object], None]) -> Type[BaseConfig]: if config is None: return BaseConfig else: config_dict = ( config if isinstance(config, dict) else {k: getattr(config, k) for k in dir(config) if not k.startswith('__')} ) class Config(BaseConfig): ... 
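`BaseConfig.get_field_info` above gives an explicit alias from `Config.fields` priority 2 and a generated alias priority 1, so `alias_generator` never overrides an explicit entry. The precedence logic in isolation (helper and names are hypothetical):

```python
def resolve_alias(fields_cfg, name, alias_generator=None):
    value = fields_cfg.get(name)
    info = {'alias': value} if isinstance(value, str) else dict(value or {})
    if 'alias' in info:
        info.setdefault('alias_priority', 2)  # explicit entries outrank generated ones
    if info.get('alias_priority', 0) <= 1 and alias_generator:
        info.update(alias=alias_generator(name), alias_priority=1)
    return info.get('alias')

def to_camel(s):
    return s.title().replace('_', '')

print(resolve_alias({}, 'user_id', to_camel))                  # UserId
print(resolve_alias({'user_id': 'uid'}, 'user_id', to_camel))  # uid
```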
    for k, v in config_dict.items():
        setattr(Config, k, v)
    return Config


def inherit_config(self_config: 'ConfigType', parent_config: 'ConfigType', **namespace: Any) -> 'ConfigType':
    if not self_config:
        base_classes: Tuple['ConfigType', ...] = (parent_config,)
    elif self_config == parent_config:
        base_classes = (self_config,)
    else:
        base_classes = self_config, parent_config

    namespace['json_encoders'] = {
        **getattr(parent_config, 'json_encoders', {}),
        **getattr(self_config, 'json_encoders', {}),
        **namespace.get('json_encoders', {}),
    }

    return type('Config', base_classes, namespace)


def prepare_config(config: Type[BaseConfig], cls_name: str) -> None:
    if not isinstance(config.extra, Extra):
        try:
            config.extra = Extra(config.extra)
        except ValueError:
            raise ValueError(f'"{cls_name}": {config.extra} is not a valid value for "extra"')

pydantic-2.10.6/pydantic/v1/dataclasses.py
"""
The main purpose is to enhance stdlib dataclasses by adding validation.
A pydantic dataclass can be generated from scratch or from a stdlib one.

Behind the scenes, a pydantic dataclass is just like a regular one on which we attach
a `BaseModel` and magic methods to trigger the validation of the data.
`__init__` and `__post_init__` are hence overridden and have extra logic to be
able to validate input data.

When a pydantic dataclass is generated from scratch, it's just a plain dataclass
with validation triggered at initialization.

The tricky part is for stdlib dataclasses that are converted into pydantic ones afterwards, e.g.

```py
@dataclasses.dataclass
class M:
    x: int

ValidatedM = pydantic.dataclasses.dataclass(M)
```

We indeed still want to support equality, hashing, repr, ... as if it was the stdlib one!
```py assert isinstance(ValidatedM(x=1), M) assert ValidatedM(x=1) == M(x=1) ``` This means we **don't want to create a new dataclass that inherits from it** The trick is to create a wrapper around `M` that will act as a proxy to trigger validation without altering default `M` behaviour. """ import copy import dataclasses import sys from contextlib import contextmanager from functools import wraps try: from functools import cached_property except ImportError: # cached_property available only for python3.8+ pass from typing import TYPE_CHECKING, Any, Callable, ClassVar, Dict, Generator, Optional, Type, TypeVar, Union, overload from typing_extensions import dataclass_transform from pydantic.v1.class_validators import gather_all_validators from pydantic.v1.config import BaseConfig, ConfigDict, Extra, get_config from pydantic.v1.error_wrappers import ValidationError from pydantic.v1.errors import DataclassTypeError from pydantic.v1.fields import Field, FieldInfo, Required, Undefined from pydantic.v1.main import create_model, validate_model from pydantic.v1.utils import ClassAttribute if TYPE_CHECKING: from pydantic.v1.main import BaseModel from pydantic.v1.typing import CallableGenerator, NoArgAnyCallable DataclassT = TypeVar('DataclassT', bound='Dataclass') DataclassClassOrWrapper = Union[Type['Dataclass'], 'DataclassProxy'] class Dataclass: # stdlib attributes __dataclass_fields__: ClassVar[Dict[str, Any]] __dataclass_params__: ClassVar[Any] # in reality `dataclasses._DataclassParams` __post_init__: ClassVar[Callable[..., None]] # Added by pydantic __pydantic_run_validation__: ClassVar[bool] __post_init_post_parse__: ClassVar[Callable[..., None]] __pydantic_initialised__: ClassVar[bool] __pydantic_model__: ClassVar[Type[BaseModel]] __pydantic_validate_values__: ClassVar[Callable[['Dataclass'], None]] __pydantic_has_field_info_default__: ClassVar[bool] # whether a `pydantic.Field` is used as default value def __init__(self, *args: object, **kwargs: object) -> None: 
pass @classmethod def __get_validators__(cls: Type['Dataclass']) -> 'CallableGenerator': pass @classmethod def __validate__(cls: Type['DataclassT'], v: Any) -> 'DataclassT': pass __all__ = [ 'dataclass', 'set_validation', 'create_pydantic_model_from_dataclass', 'is_builtin_dataclass', 'make_dataclass_validator', ] _T = TypeVar('_T') if sys.version_info >= (3, 10): @dataclass_transform(field_specifiers=(dataclasses.field, Field)) @overload def dataclass( *, init: bool = True, repr: bool = True, eq: bool = True, order: bool = False, unsafe_hash: bool = False, frozen: bool = False, config: Union[ConfigDict, Type[object], None] = None, validate_on_init: Optional[bool] = None, use_proxy: Optional[bool] = None, kw_only: bool = ..., ) -> Callable[[Type[_T]], 'DataclassClassOrWrapper']: ... @dataclass_transform(field_specifiers=(dataclasses.field, Field)) @overload def dataclass( _cls: Type[_T], *, init: bool = True, repr: bool = True, eq: bool = True, order: bool = False, unsafe_hash: bool = False, frozen: bool = False, config: Union[ConfigDict, Type[object], None] = None, validate_on_init: Optional[bool] = None, use_proxy: Optional[bool] = None, kw_only: bool = ..., ) -> 'DataclassClassOrWrapper': ... else: @dataclass_transform(field_specifiers=(dataclasses.field, Field)) @overload def dataclass( *, init: bool = True, repr: bool = True, eq: bool = True, order: bool = False, unsafe_hash: bool = False, frozen: bool = False, config: Union[ConfigDict, Type[object], None] = None, validate_on_init: Optional[bool] = None, use_proxy: Optional[bool] = None, ) -> Callable[[Type[_T]], 'DataclassClassOrWrapper']: ... 
@dataclass_transform(field_specifiers=(dataclasses.field, Field)) @overload def dataclass( _cls: Type[_T], *, init: bool = True, repr: bool = True, eq: bool = True, order: bool = False, unsafe_hash: bool = False, frozen: bool = False, config: Union[ConfigDict, Type[object], None] = None, validate_on_init: Optional[bool] = None, use_proxy: Optional[bool] = None, ) -> 'DataclassClassOrWrapper': ... @dataclass_transform(field_specifiers=(dataclasses.field, Field)) def dataclass( _cls: Optional[Type[_T]] = None, *, init: bool = True, repr: bool = True, eq: bool = True, order: bool = False, unsafe_hash: bool = False, frozen: bool = False, config: Union[ConfigDict, Type[object], None] = None, validate_on_init: Optional[bool] = None, use_proxy: Optional[bool] = None, kw_only: bool = False, ) -> Union[Callable[[Type[_T]], 'DataclassClassOrWrapper'], 'DataclassClassOrWrapper']: """ Like the python standard lib dataclasses but with type validation. The result is either a pydantic dataclass that will validate input data or a wrapper that will trigger validation around a stdlib dataclass to avoid modifying it directly """ the_config = get_config(config) def wrap(cls: Type[Any]) -> 'DataclassClassOrWrapper': should_use_proxy = ( use_proxy if use_proxy is not None else ( is_builtin_dataclass(cls) and (cls.__bases__[0] is object or set(dir(cls)) == set(dir(cls.__bases__[0]))) ) ) if should_use_proxy: dc_cls_doc = '' dc_cls = DataclassProxy(cls) default_validate_on_init = False else: dc_cls_doc = cls.__doc__ or '' # needs to be done before generating dataclass if sys.version_info >= (3, 10): dc_cls = dataclasses.dataclass( cls, init=init, repr=repr, eq=eq, order=order, unsafe_hash=unsafe_hash, frozen=frozen, kw_only=kw_only, ) else: dc_cls = dataclasses.dataclass( # type: ignore cls, init=init, repr=repr, eq=eq, order=order, unsafe_hash=unsafe_hash, frozen=frozen ) default_validate_on_init = True should_validate_on_init = default_validate_on_init if validate_on_init is None else 
validate_on_init _add_pydantic_validation_attributes(cls, the_config, should_validate_on_init, dc_cls_doc) dc_cls.__pydantic_model__.__try_update_forward_refs__(**{cls.__name__: cls}) return dc_cls if _cls is None: return wrap return wrap(_cls) @contextmanager def set_validation(cls: Type['DataclassT'], value: bool) -> Generator[Type['DataclassT'], None, None]: original_run_validation = cls.__pydantic_run_validation__ try: cls.__pydantic_run_validation__ = value yield cls finally: cls.__pydantic_run_validation__ = original_run_validation class DataclassProxy: __slots__ = '__dataclass__' def __init__(self, dc_cls: Type['Dataclass']) -> None: object.__setattr__(self, '__dataclass__', dc_cls) def __call__(self, *args: Any, **kwargs: Any) -> Any: with set_validation(self.__dataclass__, True): return self.__dataclass__(*args, **kwargs) def __getattr__(self, name: str) -> Any: return getattr(self.__dataclass__, name) def __setattr__(self, __name: str, __value: Any) -> None: return setattr(self.__dataclass__, __name, __value) def __instancecheck__(self, instance: Any) -> bool: return isinstance(instance, self.__dataclass__) def __copy__(self) -> 'DataclassProxy': return DataclassProxy(copy.copy(self.__dataclass__)) def __deepcopy__(self, memo: Any) -> 'DataclassProxy': return DataclassProxy(copy.deepcopy(self.__dataclass__, memo)) def _add_pydantic_validation_attributes( # noqa: C901 (ignore complexity) dc_cls: Type['Dataclass'], config: Type[BaseConfig], validate_on_init: bool, dc_cls_doc: str, ) -> None: """ We need to replace the right method. 
If no `__post_init__` has been set in the stdlib dataclass it won't even exist (code is generated on the fly by `dataclasses`) By default, we run validation after `__init__` or `__post_init__` if defined """ init = dc_cls.__init__ @wraps(init) def handle_extra_init(self: 'Dataclass', *args: Any, **kwargs: Any) -> None: if config.extra == Extra.ignore: init(self, *args, **{k: v for k, v in kwargs.items() if k in self.__dataclass_fields__}) elif config.extra == Extra.allow: for k, v in kwargs.items(): self.__dict__.setdefault(k, v) init(self, *args, **{k: v for k, v in kwargs.items() if k in self.__dataclass_fields__}) else: init(self, *args, **kwargs) if hasattr(dc_cls, '__post_init__'): try: post_init = dc_cls.__post_init__.__wrapped__ # type: ignore[attr-defined] except AttributeError: post_init = dc_cls.__post_init__ @wraps(post_init) def new_post_init(self: 'Dataclass', *args: Any, **kwargs: Any) -> None: if config.post_init_call == 'before_validation': post_init(self, *args, **kwargs) if self.__class__.__pydantic_run_validation__: self.__pydantic_validate_values__() if hasattr(self, '__post_init_post_parse__'): self.__post_init_post_parse__(*args, **kwargs) if config.post_init_call == 'after_validation': post_init(self, *args, **kwargs) setattr(dc_cls, '__init__', handle_extra_init) setattr(dc_cls, '__post_init__', new_post_init) else: @wraps(init) def new_init(self: 'Dataclass', *args: Any, **kwargs: Any) -> None: handle_extra_init(self, *args, **kwargs) if self.__class__.__pydantic_run_validation__: self.__pydantic_validate_values__() if hasattr(self, '__post_init_post_parse__'): # We need to find again the initvars. 
To do that we use `__dataclass_fields__` instead of # public method `dataclasses.fields` # get all initvars and their default values initvars_and_values: Dict[str, Any] = {} for i, f in enumerate(self.__class__.__dataclass_fields__.values()): if f._field_type is dataclasses._FIELD_INITVAR: # type: ignore[attr-defined] try: # set arg value by default initvars_and_values[f.name] = args[i] except IndexError: initvars_and_values[f.name] = kwargs.get(f.name, f.default) self.__post_init_post_parse__(**initvars_and_values) setattr(dc_cls, '__init__', new_init) setattr(dc_cls, '__pydantic_run_validation__', ClassAttribute('__pydantic_run_validation__', validate_on_init)) setattr(dc_cls, '__pydantic_initialised__', False) setattr(dc_cls, '__pydantic_model__', create_pydantic_model_from_dataclass(dc_cls, config, dc_cls_doc)) setattr(dc_cls, '__pydantic_validate_values__', _dataclass_validate_values) setattr(dc_cls, '__validate__', classmethod(_validate_dataclass)) setattr(dc_cls, '__get_validators__', classmethod(_get_validators)) if dc_cls.__pydantic_model__.__config__.validate_assignment and not dc_cls.__dataclass_params__.frozen: setattr(dc_cls, '__setattr__', _dataclass_validate_assignment_setattr) def _get_validators(cls: 'DataclassClassOrWrapper') -> 'CallableGenerator': yield cls.__validate__ def _validate_dataclass(cls: Type['DataclassT'], v: Any) -> 'DataclassT': with set_validation(cls, True): if isinstance(v, cls): v.__pydantic_validate_values__() return v elif isinstance(v, (list, tuple)): return cls(*v) elif isinstance(v, dict): return cls(**v) else: raise DataclassTypeError(class_name=cls.__name__) def create_pydantic_model_from_dataclass( dc_cls: Type['Dataclass'], config: Type[Any] = BaseConfig, dc_cls_doc: Optional[str] = None, ) -> Type['BaseModel']: field_definitions: Dict[str, Any] = {} for field in dataclasses.fields(dc_cls): default: Any = Undefined default_factory: Optional['NoArgAnyCallable'] = None field_info: FieldInfo if field.default is not 
dataclasses.MISSING: default = field.default elif field.default_factory is not dataclasses.MISSING: default_factory = field.default_factory else: default = Required if isinstance(default, FieldInfo): field_info = default dc_cls.__pydantic_has_field_info_default__ = True else: field_info = Field(default=default, default_factory=default_factory, **field.metadata) field_definitions[field.name] = (field.type, field_info) validators = gather_all_validators(dc_cls) model: Type['BaseModel'] = create_model( dc_cls.__name__, __config__=config, __module__=dc_cls.__module__, __validators__=validators, __cls_kwargs__={'__resolve_forward_refs__': False}, **field_definitions, ) model.__doc__ = dc_cls_doc if dc_cls_doc is not None else dc_cls.__doc__ or '' return model if sys.version_info >= (3, 8): def _is_field_cached_property(obj: 'Dataclass', k: str) -> bool: return isinstance(getattr(type(obj), k, None), cached_property) else: def _is_field_cached_property(obj: 'Dataclass', k: str) -> bool: return False def _dataclass_validate_values(self: 'Dataclass') -> None: # validation errors can occur if this function is called twice on an already initialised dataclass. # for example if Extra.forbid is enabled, it would consider __pydantic_initialised__ an invalid extra property if getattr(self, '__pydantic_initialised__'): return if getattr(self, '__pydantic_has_field_info_default__', False): # We need to remove `FieldInfo` values since they are not valid as input # It's ok to do that because they are obviously the default values! 
        input_data = {
            k: v
            for k, v in self.__dict__.items()
            if not (isinstance(v, FieldInfo) or _is_field_cached_property(self, k))
        }
    else:
        input_data = {k: v for k, v in self.__dict__.items() if not _is_field_cached_property(self, k)}

    d, _, validation_error = validate_model(self.__pydantic_model__, input_data, cls=self.__class__)
    if validation_error:
        raise validation_error

    self.__dict__.update(d)
    object.__setattr__(self, '__pydantic_initialised__', True)


def _dataclass_validate_assignment_setattr(self: 'Dataclass', name: str, value: Any) -> None:
    if self.__pydantic_initialised__:
        d = dict(self.__dict__)
        d.pop(name, None)
        known_field = self.__pydantic_model__.__fields__.get(name, None)
        if known_field:
            value, error_ = known_field.validate(value, d, loc=name, cls=self.__class__)
            if error_:
                raise ValidationError([error_], self.__class__)

    object.__setattr__(self, name, value)


def is_builtin_dataclass(_cls: Type[Any]) -> bool:
    """
    Whether a class is a stdlib dataclass
    (useful to discriminate a pydantic dataclass that is actually a wrapper around a stdlib dataclass)

    we check that
    - `_cls` is a dataclass
    - `_cls` is not a processed pydantic dataclass (with a basemodel attached)
    - `_cls` is not a pydantic dataclass inheriting directly from a stdlib dataclass
    e.g.
    ```
    @dataclasses.dataclass
    class A:
        x: int

    @pydantic.dataclasses.dataclass
    class B(A):
        y: int
    ```
    In this case, when we first check `B`, we make an extra check and look at the annotations ('y'),
    which won't be a superset of all the dataclass fields (only the stdlib fields i.e.
    'x')
    """
    return (
        dataclasses.is_dataclass(_cls)
        and not hasattr(_cls, '__pydantic_model__')
        and set(_cls.__dataclass_fields__).issuperset(set(getattr(_cls, '__annotations__', {})))
    )


def make_dataclass_validator(dc_cls: Type['Dataclass'], config: Type[BaseConfig]) -> 'CallableGenerator':
    """
    Create a pydantic.dataclass from a builtin dataclass to add type validation
    and yield the validators
    It retrieves the parameters of the dataclass and forwards them to the newly created dataclass
    """
    yield from _get_validators(dataclass(dc_cls, config=config, use_proxy=True))

pydantic-2.10.6/pydantic/v1/datetime_parse.py
"""
Functions to parse datetime objects.

We're using regular expressions rather than time.strptime because:
- They provide both validation and parsing.
- They're more flexible for datetimes.
- The date/datetime/time constructors produce friendlier error messages.

Stolen from https://raw.githubusercontent.com/django/django/main/django/utils/dateparse.py at
9718fa2e8abe430c3526a9278dd976443d4ae3c6

Changed to:
* use standard python datetime types not django.utils.timezone
* raise ValueError when regex doesn't match rather than returning None
* support parsing unix timestamps for dates and datetimes
"""
import re
from datetime import date, datetime, time, timedelta, timezone
from typing import Dict, Optional, Type, Union

from pydantic.v1 import errors

date_expr = r'(?P<year>\d{4})-(?P<month>\d{1,2})-(?P<day>\d{1,2})'
time_expr = (
    r'(?P<hour>\d{1,2}):(?P<minute>\d{1,2})'
    r'(?::(?P<second>\d{1,2})(?:\.(?P<microsecond>\d{1,6})\d{0,6})?)?'
    r'(?P<tzinfo>Z|[+-]\d{2}(?::?\d{2})?)?$'
)

date_re = re.compile(f'{date_expr}$')
time_re = re.compile(time_expr)
datetime_re = re.compile(f'{date_expr}[T ]{time_expr}')

standard_duration_re = re.compile(
    r'^'
    r'(?:(?P<days>-?\d+) (days?, )?)?'
    r'((?:(?P<hours>-?\d+):)(?=\d+:\d+))?'
    r'(?:(?P<minutes>-?\d+):)?'
    r'(?P<seconds>-?\d+)'
    r'(?:\.(?P<microseconds>\d{1,6})\d{0,6})?'
    r'$'
)

# Support the sections of ISO 8601 date representation that are accepted by timedelta
iso8601_duration_re = re.compile(
    r'^(?P<sign>[-+]?)'
    r'P'
    r'(?:(?P<days>\d+(.\d+)?)D)?'
    r'(?:T'
    r'(?:(?P<hours>\d+(.\d+)?)H)?'
    r'(?:(?P<minutes>\d+(.\d+)?)M)?'
    r'(?:(?P<seconds>\d+(.\d+)?)S)?'
    r')?'
    r'$'
)

EPOCH = datetime(1970, 1, 1)
# if greater than this, the number is in ms, if less than or equal it's in seconds
# (in seconds this is 11th October 2603, in ms it's 20th August 1970)
MS_WATERSHED = int(2e10)
# slightly more than datetime.max in ns - (datetime.max - EPOCH).total_seconds() * 1e9
MAX_NUMBER = int(3e20)
StrBytesIntFloat = Union[str, bytes, int, float]


def get_numeric(value: StrBytesIntFloat, native_expected_type: str) -> Union[None, int, float]:
    if isinstance(value, (int, float)):
        return value
    try:
        return float(value)
    except ValueError:
        return None
    except TypeError:
        raise TypeError(f'invalid type; expected {native_expected_type}, string, bytes, int or float')


def from_unix_seconds(seconds: Union[int, float]) -> datetime:
    if seconds > MAX_NUMBER:
        return datetime.max
    elif seconds < -MAX_NUMBER:
        return datetime.min

    while abs(seconds) > MS_WATERSHED:
        seconds /= 1000
    dt = EPOCH + timedelta(seconds=seconds)
    return dt.replace(tzinfo=timezone.utc)


def _parse_timezone(value: Optional[str], error: Type[Exception]) -> Union[None, int, timezone]:
    if value == 'Z':
        return timezone.utc
    elif value is not None:
        offset_mins = int(value[-2:]) if len(value) > 3 else 0
        offset = 60 * int(value[1:3]) + offset_mins
        if value[0] == '-':
            offset = -offset
        try:
            return timezone(timedelta(minutes=offset))
        except ValueError:
            raise error()
    else:
        return None


def parse_date(value: Union[date, StrBytesIntFloat]) -> date:
    """
    Parse a date/int/float/string and return a datetime.date.

    Raise ValueError if the input is well formatted but not a valid date.
    Raise ValueError if the input isn't well formatted.
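    A minimal, self-contained sketch of the watershed heuristic used by `from_unix_seconds`
    above: numeric timestamps larger than `MS_WATERSHED` are assumed to be in milliseconds
    and scaled down (`from_unix` is an illustrative name, not part of this module):

    ```python
    # Illustrative sketch (not this module's implementation): timestamps above
    # MS_WATERSHED are treated as milliseconds and divided down to seconds.
    from datetime import datetime, timedelta, timezone

    EPOCH = datetime(1970, 1, 1)
    MS_WATERSHED = int(2e10)

    def from_unix(seconds: float) -> datetime:
        while abs(seconds) > MS_WATERSHED:
            seconds /= 1000
        return (EPOCH + timedelta(seconds=seconds)).replace(tzinfo=timezone.utc)

    # the same instant, given once in seconds and once in milliseconds
    assert from_unix(1_483_228_800) == from_unix(1_483_228_800_000)
    ```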
""" if isinstance(value, date): if isinstance(value, datetime): return value.date() else: return value number = get_numeric(value, 'date') if number is not None: return from_unix_seconds(number).date() if isinstance(value, bytes): value = value.decode() match = date_re.match(value) # type: ignore if match is None: raise errors.DateError() kw = {k: int(v) for k, v in match.groupdict().items()} try: return date(**kw) except ValueError: raise errors.DateError() def parse_time(value: Union[time, StrBytesIntFloat]) -> time: """ Parse a time/string and return a datetime.time. Raise ValueError if the input is well formatted but not a valid time. Raise ValueError if the input isn't well formatted, in particular if it contains an offset. """ if isinstance(value, time): return value number = get_numeric(value, 'time') if number is not None: if number >= 86400: # doesn't make sense since the time time loop back around to 0 raise errors.TimeError() return (datetime.min + timedelta(seconds=number)).time() if isinstance(value, bytes): value = value.decode() match = time_re.match(value) # type: ignore if match is None: raise errors.TimeError() kw = match.groupdict() if kw['microsecond']: kw['microsecond'] = kw['microsecond'].ljust(6, '0') tzinfo = _parse_timezone(kw.pop('tzinfo'), errors.TimeError) kw_: Dict[str, Union[None, int, timezone]] = {k: int(v) for k, v in kw.items() if v is not None} kw_['tzinfo'] = tzinfo try: return time(**kw_) # type: ignore except ValueError: raise errors.TimeError() def parse_datetime(value: Union[datetime, StrBytesIntFloat]) -> datetime: """ Parse a datetime/int/float/string and return a datetime.datetime. This function supports time zone offsets. When the input contains one, the output uses a timezone with a fixed offset from UTC. Raise ValueError if the input is well formatted but not a valid datetime. Raise ValueError if the input isn't well formatted. 
""" if isinstance(value, datetime): return value number = get_numeric(value, 'datetime') if number is not None: return from_unix_seconds(number) if isinstance(value, bytes): value = value.decode() match = datetime_re.match(value) # type: ignore if match is None: raise errors.DateTimeError() kw = match.groupdict() if kw['microsecond']: kw['microsecond'] = kw['microsecond'].ljust(6, '0') tzinfo = _parse_timezone(kw.pop('tzinfo'), errors.DateTimeError) kw_: Dict[str, Union[None, int, timezone]] = {k: int(v) for k, v in kw.items() if v is not None} kw_['tzinfo'] = tzinfo try: return datetime(**kw_) # type: ignore except ValueError: raise errors.DateTimeError() def parse_duration(value: StrBytesIntFloat) -> timedelta: """ Parse a duration int/float/string and return a datetime.timedelta. The preferred format for durations in Django is '%d %H:%M:%S.%f'. Also supports ISO 8601 representation. """ if isinstance(value, timedelta): return value if isinstance(value, (int, float)): # below code requires a string value = f'{value:f}' elif isinstance(value, bytes): value = value.decode() try: match = standard_duration_re.match(value) or iso8601_duration_re.match(value) except TypeError: raise TypeError('invalid type; expected timedelta, string, bytes, int or float') if not match: raise errors.DurationError() kw = match.groupdict() sign = -1 if kw.pop('sign', '+') == '-' else 1 if kw.get('microseconds'): kw['microseconds'] = kw['microseconds'].ljust(6, '0') if kw.get('seconds') and kw.get('microseconds') and kw['seconds'].startswith('-'): kw['microseconds'] = '-' + kw['microseconds'] kw_ = {k: float(v) for k, v in kw.items() if v is not None} return sign * timedelta(**kw_) pydantic-2.10.6/pydantic/v1/decorator.py000066400000000000000000000241431474456633400201340ustar00rootroot00000000000000from functools import wraps from typing import TYPE_CHECKING, Any, Callable, Dict, List, Mapping, Optional, Tuple, Type, TypeVar, Union, overload from pydantic.v1 import validator from 
pydantic.v1.config import Extra from pydantic.v1.errors import ConfigError from pydantic.v1.main import BaseModel, create_model from pydantic.v1.typing import get_all_type_hints from pydantic.v1.utils import to_camel __all__ = ('validate_arguments',) if TYPE_CHECKING: from pydantic.v1.typing import AnyCallable AnyCallableT = TypeVar('AnyCallableT', bound=AnyCallable) ConfigType = Union[None, Type[Any], Dict[str, Any]] @overload def validate_arguments(func: None = None, *, config: 'ConfigType' = None) -> Callable[['AnyCallableT'], 'AnyCallableT']: ... @overload def validate_arguments(func: 'AnyCallableT') -> 'AnyCallableT': ... def validate_arguments(func: Optional['AnyCallableT'] = None, *, config: 'ConfigType' = None) -> Any: """ Decorator to validate the arguments passed to a function. """ def validate(_func: 'AnyCallable') -> 'AnyCallable': vd = ValidatedFunction(_func, config) @wraps(_func) def wrapper_function(*args: Any, **kwargs: Any) -> Any: return vd.call(*args, **kwargs) wrapper_function.vd = vd # type: ignore wrapper_function.validate = vd.init_model_instance # type: ignore wrapper_function.raw_function = vd.raw_function # type: ignore wrapper_function.model = vd.model # type: ignore return wrapper_function if func: return validate(func) else: return validate ALT_V_ARGS = 'v__args' ALT_V_KWARGS = 'v__kwargs' V_POSITIONAL_ONLY_NAME = 'v__positional_only' V_DUPLICATE_KWARGS = 'v__duplicate_kwargs' class ValidatedFunction: def __init__(self, function: 'AnyCallableT', config: 'ConfigType'): # noqa C901 from inspect import Parameter, signature parameters: Mapping[str, Parameter] = signature(function).parameters if parameters.keys() & {ALT_V_ARGS, ALT_V_KWARGS, V_POSITIONAL_ONLY_NAME, V_DUPLICATE_KWARGS}: raise ConfigError( f'"{ALT_V_ARGS}", "{ALT_V_KWARGS}", "{V_POSITIONAL_ONLY_NAME}" and "{V_DUPLICATE_KWARGS}" ' f'are not permitted as argument names when using the "{validate_arguments.__name__}" decorator' ) self.raw_function = function self.arg_mapping: 
Dict[int, str] = {} self.positional_only_args = set() self.v_args_name = 'args' self.v_kwargs_name = 'kwargs' type_hints = get_all_type_hints(function) takes_args = False takes_kwargs = False fields: Dict[str, Tuple[Any, Any]] = {} for i, (name, p) in enumerate(parameters.items()): if p.annotation is p.empty: annotation = Any else: annotation = type_hints[name] default = ... if p.default is p.empty else p.default if p.kind == Parameter.POSITIONAL_ONLY: self.arg_mapping[i] = name fields[name] = annotation, default fields[V_POSITIONAL_ONLY_NAME] = List[str], None self.positional_only_args.add(name) elif p.kind == Parameter.POSITIONAL_OR_KEYWORD: self.arg_mapping[i] = name fields[name] = annotation, default fields[V_DUPLICATE_KWARGS] = List[str], None elif p.kind == Parameter.KEYWORD_ONLY: fields[name] = annotation, default elif p.kind == Parameter.VAR_POSITIONAL: self.v_args_name = name fields[name] = Tuple[annotation, ...], None takes_args = True else: assert p.kind == Parameter.VAR_KEYWORD, p.kind self.v_kwargs_name = name fields[name] = Dict[str, annotation], None # type: ignore takes_kwargs = True # these checks avoid a clash between "args" and a field with that name if not takes_args and self.v_args_name in fields: self.v_args_name = ALT_V_ARGS # same with "kwargs" if not takes_kwargs and self.v_kwargs_name in fields: self.v_kwargs_name = ALT_V_KWARGS if not takes_args: # we add the field so validation below can raise the correct exception fields[self.v_args_name] = List[Any], None if not takes_kwargs: # same with kwargs fields[self.v_kwargs_name] = Dict[Any, Any], None self.create_model(fields, takes_args, takes_kwargs, config) def init_model_instance(self, *args: Any, **kwargs: Any) -> BaseModel: values = self.build_values(args, kwargs) return self.model(**values) def call(self, *args: Any, **kwargs: Any) -> Any: m = self.init_model_instance(*args, **kwargs) return self.execute(m) def build_values(self, args: Tuple[Any, ...], kwargs: Dict[str, Any]) -> 
Dict[str, Any]: values: Dict[str, Any] = {} if args: arg_iter = enumerate(args) while True: try: i, a = next(arg_iter) except StopIteration: break arg_name = self.arg_mapping.get(i) if arg_name is not None: values[arg_name] = a else: values[self.v_args_name] = [a] + [a for _, a in arg_iter] break var_kwargs: Dict[str, Any] = {} wrong_positional_args = [] duplicate_kwargs = [] fields_alias = [ field.alias for name, field in self.model.__fields__.items() if name not in (self.v_args_name, self.v_kwargs_name) ] non_var_fields = set(self.model.__fields__) - {self.v_args_name, self.v_kwargs_name} for k, v in kwargs.items(): if k in non_var_fields or k in fields_alias: if k in self.positional_only_args: wrong_positional_args.append(k) if k in values: duplicate_kwargs.append(k) values[k] = v else: var_kwargs[k] = v if var_kwargs: values[self.v_kwargs_name] = var_kwargs if wrong_positional_args: values[V_POSITIONAL_ONLY_NAME] = wrong_positional_args if duplicate_kwargs: values[V_DUPLICATE_KWARGS] = duplicate_kwargs return values def execute(self, m: BaseModel) -> Any: d = {k: v for k, v in m._iter() if k in m.__fields_set__ or m.__fields__[k].default_factory} var_kwargs = d.pop(self.v_kwargs_name, {}) if self.v_args_name in d: args_: List[Any] = [] in_kwargs = False kwargs = {} for name, value in d.items(): if in_kwargs: kwargs[name] = value elif name == self.v_args_name: args_ += value in_kwargs = True else: args_.append(value) return self.raw_function(*args_, **kwargs, **var_kwargs) elif self.positional_only_args: args_ = [] kwargs = {} for name, value in d.items(): if name in self.positional_only_args: args_.append(value) else: kwargs[name] = value return self.raw_function(*args_, **kwargs, **var_kwargs) else: return self.raw_function(**d, **var_kwargs) def create_model(self, fields: Dict[str, Any], takes_args: bool, takes_kwargs: bool, config: 'ConfigType') -> None: pos_args = len(self.arg_mapping) class CustomConfig: pass if not TYPE_CHECKING: # pragma: no branch if 
isinstance(config, dict): CustomConfig = type('Config', (), config) # noqa: F811 elif config is not None: CustomConfig = config # noqa: F811 if hasattr(CustomConfig, 'fields') or hasattr(CustomConfig, 'alias_generator'): raise ConfigError( 'Setting the "fields" and "alias_generator" property on custom Config for ' '@validate_arguments is not yet supported, please remove.' ) class DecoratorBaseModel(BaseModel): @validator(self.v_args_name, check_fields=False, allow_reuse=True) def check_args(cls, v: Optional[List[Any]]) -> Optional[List[Any]]: if takes_args or v is None: return v raise TypeError(f'{pos_args} positional arguments expected but {pos_args + len(v)} given') @validator(self.v_kwargs_name, check_fields=False, allow_reuse=True) def check_kwargs(cls, v: Optional[Dict[str, Any]]) -> Optional[Dict[str, Any]]: if takes_kwargs or v is None: return v plural = '' if len(v) == 1 else 's' keys = ', '.join(map(repr, v.keys())) raise TypeError(f'unexpected keyword argument{plural}: {keys}') @validator(V_POSITIONAL_ONLY_NAME, check_fields=False, allow_reuse=True) def check_positional_only(cls, v: Optional[List[str]]) -> None: if v is None: return plural = '' if len(v) == 1 else 's' keys = ', '.join(map(repr, v)) raise TypeError(f'positional-only argument{plural} passed as keyword argument{plural}: {keys}') @validator(V_DUPLICATE_KWARGS, check_fields=False, allow_reuse=True) def check_duplicate_kwargs(cls, v: Optional[List[str]]) -> None: if v is None: return plural = '' if len(v) == 1 else 's' keys = ', '.join(map(repr, v)) raise TypeError(f'multiple values for argument{plural}: {keys}') class Config(CustomConfig): extra = getattr(CustomConfig, 'extra', Extra.forbid) self.model = create_model(to_camel(self.raw_function.__name__), __base__=DecoratorBaseModel, **fields) pydantic-2.10.6/pydantic/v1/env_settings.py000066400000000000000000000334311474456633400206620ustar00rootroot00000000000000import os import warnings from pathlib import Path from typing import 
AbstractSet, Any, Callable, ClassVar, Dict, List, Mapping, Optional, Tuple, Type, Union from pydantic.v1.config import BaseConfig, Extra from pydantic.v1.fields import ModelField from pydantic.v1.main import BaseModel from pydantic.v1.types import JsonWrapper from pydantic.v1.typing import StrPath, display_as_type, get_origin, is_union from pydantic.v1.utils import deep_update, lenient_issubclass, path_type, sequence_like env_file_sentinel = str(object()) SettingsSourceCallable = Callable[['BaseSettings'], Dict[str, Any]] DotenvType = Union[StrPath, List[StrPath], Tuple[StrPath, ...]] class SettingsError(ValueError): pass class BaseSettings(BaseModel): """ Base class for settings, allowing values to be overridden by environment variables. This is useful in production for secrets you do not wish to save in code, it plays nicely with docker(-compose), Heroku and any 12 factor app design. """ def __init__( __pydantic_self__, _env_file: Optional[DotenvType] = env_file_sentinel, _env_file_encoding: Optional[str] = None, _env_nested_delimiter: Optional[str] = None, _secrets_dir: Optional[StrPath] = None, **values: Any, ) -> None: # Uses something other than `self` the first arg to allow "self" as a settable attribute super().__init__( **__pydantic_self__._build_values( values, _env_file=_env_file, _env_file_encoding=_env_file_encoding, _env_nested_delimiter=_env_nested_delimiter, _secrets_dir=_secrets_dir, ) ) def _build_values( self, init_kwargs: Dict[str, Any], _env_file: Optional[DotenvType] = None, _env_file_encoding: Optional[str] = None, _env_nested_delimiter: Optional[str] = None, _secrets_dir: Optional[StrPath] = None, ) -> Dict[str, Any]: # Configure built-in sources init_settings = InitSettingsSource(init_kwargs=init_kwargs) env_settings = EnvSettingsSource( env_file=(_env_file if _env_file != env_file_sentinel else self.__config__.env_file), env_file_encoding=( _env_file_encoding if _env_file_encoding is not None else self.__config__.env_file_encoding ), 
env_nested_delimiter=( _env_nested_delimiter if _env_nested_delimiter is not None else self.__config__.env_nested_delimiter ), env_prefix_len=len(self.__config__.env_prefix), ) file_secret_settings = SecretsSettingsSource(secrets_dir=_secrets_dir or self.__config__.secrets_dir) # Provide a hook to set built-in sources priority and add / remove sources sources = self.__config__.customise_sources( init_settings=init_settings, env_settings=env_settings, file_secret_settings=file_secret_settings ) if sources: return deep_update(*reversed([source(self) for source in sources])) else: # no one should mean to do this, but I think returning an empty dict is marginally preferable # to an informative error and much better than a confusing error return {} class Config(BaseConfig): env_prefix: str = '' env_file: Optional[DotenvType] = None env_file_encoding: Optional[str] = None env_nested_delimiter: Optional[str] = None secrets_dir: Optional[StrPath] = None validate_all: bool = True extra: Extra = Extra.forbid arbitrary_types_allowed: bool = True case_sensitive: bool = False @classmethod def prepare_field(cls, field: ModelField) -> None: env_names: Union[List[str], AbstractSet[str]] field_info_from_config = cls.get_field_info(field.name) env = field_info_from_config.get('env') or field.field_info.extra.get('env') if env is None: if field.has_alias: warnings.warn( 'aliases are no longer used by BaseSettings to define which environment variables to read. ' 'Instead use the "env" field setting. 
' 'See https://pydantic-docs.helpmanual.io/usage/settings/#environment-variable-names', FutureWarning, ) env_names = {cls.env_prefix + field.name} elif isinstance(env, str): env_names = {env} elif isinstance(env, (set, frozenset)): env_names = env elif sequence_like(env): env_names = list(env) else: raise TypeError(f'invalid field env: {env!r} ({display_as_type(env)}); should be string, list or set') if not cls.case_sensitive: env_names = env_names.__class__(n.lower() for n in env_names) field.field_info.extra['env_names'] = env_names @classmethod def customise_sources( cls, init_settings: SettingsSourceCallable, env_settings: SettingsSourceCallable, file_secret_settings: SettingsSourceCallable, ) -> Tuple[SettingsSourceCallable, ...]: return init_settings, env_settings, file_secret_settings @classmethod def parse_env_var(cls, field_name: str, raw_val: str) -> Any: return cls.json_loads(raw_val) # populated by the metaclass using the Config class defined above, annotated here to help IDEs only __config__: ClassVar[Type[Config]] class InitSettingsSource: __slots__ = ('init_kwargs',) def __init__(self, init_kwargs: Dict[str, Any]): self.init_kwargs = init_kwargs def __call__(self, settings: BaseSettings) -> Dict[str, Any]: return self.init_kwargs def __repr__(self) -> str: return f'InitSettingsSource(init_kwargs={self.init_kwargs!r})' class EnvSettingsSource: __slots__ = ('env_file', 'env_file_encoding', 'env_nested_delimiter', 'env_prefix_len') def __init__( self, env_file: Optional[DotenvType], env_file_encoding: Optional[str], env_nested_delimiter: Optional[str] = None, env_prefix_len: int = 0, ): self.env_file: Optional[DotenvType] = env_file self.env_file_encoding: Optional[str] = env_file_encoding self.env_nested_delimiter: Optional[str] = env_nested_delimiter self.env_prefix_len: int = env_prefix_len def __call__(self, settings: BaseSettings) -> Dict[str, Any]: # noqa C901 """ Build environment variables suitable for passing to the Model. 
""" d: Dict[str, Any] = {} if settings.__config__.case_sensitive: env_vars: Mapping[str, Optional[str]] = os.environ else: env_vars = {k.lower(): v for k, v in os.environ.items()} dotenv_vars = self._read_env_files(settings.__config__.case_sensitive) if dotenv_vars: env_vars = {**dotenv_vars, **env_vars} for field in settings.__fields__.values(): env_val: Optional[str] = None for env_name in field.field_info.extra['env_names']: env_val = env_vars.get(env_name) if env_val is not None: break is_complex, allow_parse_failure = self.field_is_complex(field) if is_complex: if env_val is None: # field is complex but no value found so far, try explode_env_vars env_val_built = self.explode_env_vars(field, env_vars) if env_val_built: d[field.alias] = env_val_built else: # field is complex and there's a value, decode that as JSON, then add explode_env_vars try: env_val = settings.__config__.parse_env_var(field.name, env_val) except ValueError as e: if not allow_parse_failure: raise SettingsError(f'error parsing env var "{env_name}"') from e if isinstance(env_val, dict): d[field.alias] = deep_update(env_val, self.explode_env_vars(field, env_vars)) else: d[field.alias] = env_val elif env_val is not None: # simplest case, field is not complex, we only need to add the value if it was found d[field.alias] = env_val return d def _read_env_files(self, case_sensitive: bool) -> Dict[str, Optional[str]]: env_files = self.env_file if env_files is None: return {} if isinstance(env_files, (str, os.PathLike)): env_files = [env_files] dotenv_vars = {} for env_file in env_files: env_path = Path(env_file).expanduser() if env_path.is_file(): dotenv_vars.update( read_env_file(env_path, encoding=self.env_file_encoding, case_sensitive=case_sensitive) ) return dotenv_vars def field_is_complex(self, field: ModelField) -> Tuple[bool, bool]: """ Find out if a field is complex, and if so whether JSON errors should be ignored """ if lenient_issubclass(field.annotation, JsonWrapper): return False, False 
if field.is_complex(): allow_parse_failure = False elif is_union(get_origin(field.type_)) and field.sub_fields and any(f.is_complex() for f in field.sub_fields): allow_parse_failure = True else: return False, False return True, allow_parse_failure def explode_env_vars(self, field: ModelField, env_vars: Mapping[str, Optional[str]]) -> Dict[str, Any]: """ Process env_vars and extract the values of keys containing env_nested_delimiter into nested dictionaries. This is applied to a single field, hence filtering by env_var prefix. """ prefixes = [f'{env_name}{self.env_nested_delimiter}' for env_name in field.field_info.extra['env_names']] result: Dict[str, Any] = {} for env_name, env_val in env_vars.items(): if not any(env_name.startswith(prefix) for prefix in prefixes): continue # we remove the prefix before splitting in case the prefix has characters in common with the delimiter env_name_without_prefix = env_name[self.env_prefix_len :] _, *keys, last_key = env_name_without_prefix.split(self.env_nested_delimiter) env_var = result for key in keys: env_var = env_var.setdefault(key, {}) env_var[last_key] = env_val return result def __repr__(self) -> str: return ( f'EnvSettingsSource(env_file={self.env_file!r}, env_file_encoding={self.env_file_encoding!r}, ' f'env_nested_delimiter={self.env_nested_delimiter!r})' ) class SecretsSettingsSource: __slots__ = ('secrets_dir',) def __init__(self, secrets_dir: Optional[StrPath]): self.secrets_dir: Optional[StrPath] = secrets_dir def __call__(self, settings: BaseSettings) -> Dict[str, Any]: """ Build fields from "secrets" files. 
""" secrets: Dict[str, Optional[str]] = {} if self.secrets_dir is None: return secrets secrets_path = Path(self.secrets_dir).expanduser() if not secrets_path.exists(): warnings.warn(f'directory "{secrets_path}" does not exist') return secrets if not secrets_path.is_dir(): raise SettingsError(f'secrets_dir must reference a directory, not a {path_type(secrets_path)}') for field in settings.__fields__.values(): for env_name in field.field_info.extra['env_names']: path = find_case_path(secrets_path, env_name, settings.__config__.case_sensitive) if not path: # path does not exist, we currently don't return a warning for this continue if path.is_file(): secret_value = path.read_text().strip() if field.is_complex(): try: secret_value = settings.__config__.parse_env_var(field.name, secret_value) except ValueError as e: raise SettingsError(f'error parsing env var "{env_name}"') from e secrets[field.alias] = secret_value else: warnings.warn( f'attempted to load secret file "{path}" but found a {path_type(path)} instead.', stacklevel=4, ) return secrets def __repr__(self) -> str: return f'SecretsSettingsSource(secrets_dir={self.secrets_dir!r})' def read_env_file( file_path: StrPath, *, encoding: str = None, case_sensitive: bool = False ) -> Dict[str, Optional[str]]: try: from dotenv import dotenv_values except ImportError as e: raise ImportError('python-dotenv is not installed, run `pip install pydantic[dotenv]`') from e file_vars: Dict[str, Optional[str]] = dotenv_values(file_path, encoding=encoding or 'utf8') if not case_sensitive: return {k.lower(): v for k, v in file_vars.items()} else: return file_vars def find_case_path(dir_path: Path, file_name: str, case_sensitive: bool) -> Optional[Path]: """ Find a file within path's directory matching filename, optionally ignoring case. 
""" for f in dir_path.iterdir(): if f.name == file_name: return f elif not case_sensitive and f.name.lower() == file_name.lower(): return f return None pydantic-2.10.6/pydantic/v1/error_wrappers.py000066400000000000000000000121141474456633400212210ustar00rootroot00000000000000import json from typing import TYPE_CHECKING, Any, Dict, Generator, List, Optional, Sequence, Tuple, Type, Union from pydantic.v1.json import pydantic_encoder from pydantic.v1.utils import Representation if TYPE_CHECKING: from typing_extensions import TypedDict from pydantic.v1.config import BaseConfig from pydantic.v1.types import ModelOrDc from pydantic.v1.typing import ReprArgs Loc = Tuple[Union[int, str], ...] class _ErrorDictRequired(TypedDict): loc: Loc msg: str type: str class ErrorDict(_ErrorDictRequired, total=False): ctx: Dict[str, Any] __all__ = 'ErrorWrapper', 'ValidationError' class ErrorWrapper(Representation): __slots__ = 'exc', '_loc' def __init__(self, exc: Exception, loc: Union[str, 'Loc']) -> None: self.exc = exc self._loc = loc def loc_tuple(self) -> 'Loc': if isinstance(self._loc, tuple): return self._loc else: return (self._loc,) def __repr_args__(self) -> 'ReprArgs': return [('exc', self.exc), ('loc', self.loc_tuple())] # ErrorList is something like Union[List[Union[List[ErrorWrapper], ErrorWrapper]], ErrorWrapper] # but recursive, therefore just use: ErrorList = Union[Sequence[Any], ErrorWrapper] class ValidationError(Representation, ValueError): __slots__ = 'raw_errors', 'model', '_error_cache' def __init__(self, errors: Sequence[ErrorList], model: 'ModelOrDc') -> None: self.raw_errors = errors self.model = model self._error_cache: Optional[List['ErrorDict']] = None def errors(self) -> List['ErrorDict']: if self._error_cache is None: try: config = self.model.__config__ # type: ignore except AttributeError: config = self.model.__pydantic_model__.__config__ # type: ignore self._error_cache = list(flatten_errors(self.raw_errors, config)) return self._error_cache def 
json(self, *, indent: Union[None, int, str] = 2) -> str: return json.dumps(self.errors(), indent=indent, default=pydantic_encoder) def __str__(self) -> str: errors = self.errors() no_errors = len(errors) return ( f'{no_errors} validation error{"" if no_errors == 1 else "s"} for {self.model.__name__}\n' f'{display_errors(errors)}' ) def __repr_args__(self) -> 'ReprArgs': return [('model', self.model.__name__), ('errors', self.errors())] def display_errors(errors: List['ErrorDict']) -> str: return '\n'.join(f'{_display_error_loc(e)}\n {e["msg"]} ({_display_error_type_and_ctx(e)})' for e in errors) def _display_error_loc(error: 'ErrorDict') -> str: return ' -> '.join(str(e) for e in error['loc']) def _display_error_type_and_ctx(error: 'ErrorDict') -> str: t = 'type=' + error['type'] ctx = error.get('ctx') if ctx: return t + ''.join(f'; {k}={v}' for k, v in ctx.items()) else: return t def flatten_errors( errors: Sequence[Any], config: Type['BaseConfig'], loc: Optional['Loc'] = None ) -> Generator['ErrorDict', None, None]: for error in errors: if isinstance(error, ErrorWrapper): if loc: error_loc = loc + error.loc_tuple() else: error_loc = error.loc_tuple() if isinstance(error.exc, ValidationError): yield from flatten_errors(error.exc.raw_errors, config, error_loc) else: yield error_dict(error.exc, config, error_loc) elif isinstance(error, list): yield from flatten_errors(error, config, loc=loc) else: raise RuntimeError(f'Unknown error object: {error}') def error_dict(exc: Exception, config: Type['BaseConfig'], loc: 'Loc') -> 'ErrorDict': type_ = get_exc_type(exc.__class__) msg_template = config.error_msg_templates.get(type_) or getattr(exc, 'msg_template', None) ctx = exc.__dict__ if msg_template: msg = msg_template.format(**ctx) else: msg = str(exc) d: 'ErrorDict' = {'loc': loc, 'msg': msg, 'type': type_} if ctx: d['ctx'] = ctx return d _EXC_TYPE_CACHE: Dict[Type[Exception], str] = {} def get_exc_type(cls: Type[Exception]) -> str: # slightly more efficient than using 
lru_cache since we don't need to worry about the cache filling up try: return _EXC_TYPE_CACHE[cls] except KeyError: r = _get_exc_type(cls) _EXC_TYPE_CACHE[cls] = r return r def _get_exc_type(cls: Type[Exception]) -> str: if issubclass(cls, AssertionError): return 'assertion_error' base_name = 'type_error' if issubclass(cls, TypeError) else 'value_error' if cls in (TypeError, ValueError): # just TypeError or ValueError, no extra code return base_name # if it's not a TypeError or ValueError, we just take the lowercase of the exception name # no chaining or snake case logic, use "code" for more complex error types. code = getattr(cls, 'code', None) or cls.__name__.replace('Error', '').lower() return base_name + '.' + code pydantic-2.10.6/pydantic/v1/errors.py000066400000000000000000000424761474456633400174770ustar00rootroot00000000000000from decimal import Decimal from pathlib import Path from typing import TYPE_CHECKING, Any, Callable, Sequence, Set, Tuple, Type, Union from pydantic.v1.typing import display_as_type if TYPE_CHECKING: from pydantic.v1.typing import DictStrAny # explicitly state exports to avoid "from pydantic.v1.errors import *" also importing Decimal, Path etc. 
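The `_get_exc_type` logic shown above maps an exception class to an error-code string (`type_error`/`value_error` base plus a lowercased suffix). A standalone re-statement of that mapping, using hypothetical exception classes for illustration:

```python
# Re-statement of pydantic v1's _get_exc_type logic as a free function.
def exc_type_code(cls):
    if issubclass(cls, AssertionError):
        return 'assertion_error'
    base_name = 'type_error' if issubclass(cls, TypeError) else 'value_error'
    if cls in (TypeError, ValueError):
        # just TypeError or ValueError, no extra code
        return base_name
    # prefer an explicit `code` attribute; otherwise derive it from the class name
    code = getattr(cls, 'code', None) or cls.__name__.replace('Error', '').lower()
    return base_name + '.' + code


# Illustrative stand-ins for pydantic's error classes:
class FloatError(TypeError):
    pass


class UrlSchemeError(ValueError):
    code = 'url.scheme'


assert exc_type_code(FloatError) == 'type_error.float'
assert exc_type_code(UrlSchemeError) == 'value_error.url.scheme'
assert exc_type_code(TypeError) == 'type_error'
```

This is why the errors defined in `errors.py` below only need to set `code` when the name-derived default (strip `Error`, lowercase) would be wrong or insufficiently specific.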
__all__ = ( 'PydanticTypeError', 'PydanticValueError', 'ConfigError', 'MissingError', 'ExtraError', 'NoneIsNotAllowedError', 'NoneIsAllowedError', 'WrongConstantError', 'NotNoneError', 'BoolError', 'BytesError', 'DictError', 'EmailError', 'UrlError', 'UrlSchemeError', 'UrlSchemePermittedError', 'UrlUserInfoError', 'UrlHostError', 'UrlHostTldError', 'UrlPortError', 'UrlExtraError', 'EnumError', 'IntEnumError', 'EnumMemberError', 'IntegerError', 'FloatError', 'PathError', 'PathNotExistsError', 'PathNotAFileError', 'PathNotADirectoryError', 'PyObjectError', 'SequenceError', 'ListError', 'SetError', 'FrozenSetError', 'TupleError', 'TupleLengthError', 'ListMinLengthError', 'ListMaxLengthError', 'ListUniqueItemsError', 'SetMinLengthError', 'SetMaxLengthError', 'FrozenSetMinLengthError', 'FrozenSetMaxLengthError', 'AnyStrMinLengthError', 'AnyStrMaxLengthError', 'StrError', 'StrRegexError', 'NumberNotGtError', 'NumberNotGeError', 'NumberNotLtError', 'NumberNotLeError', 'NumberNotMultipleError', 'DecimalError', 'DecimalIsNotFiniteError', 'DecimalMaxDigitsError', 'DecimalMaxPlacesError', 'DecimalWholeDigitsError', 'DateTimeError', 'DateError', 'DateNotInThePastError', 'DateNotInTheFutureError', 'TimeError', 'DurationError', 'HashableError', 'UUIDError', 'UUIDVersionError', 'ArbitraryTypeError', 'ClassError', 'SubclassError', 'JsonError', 'JsonTypeError', 'PatternError', 'DataclassTypeError', 'CallableError', 'IPvAnyAddressError', 'IPvAnyInterfaceError', 'IPvAnyNetworkError', 'IPv4AddressError', 'IPv6AddressError', 'IPv4NetworkError', 'IPv6NetworkError', 'IPv4InterfaceError', 'IPv6InterfaceError', 'ColorError', 'StrictBoolError', 'NotDigitError', 'LuhnValidationError', 'InvalidLengthForBrand', 'InvalidByteSize', 'InvalidByteSizeUnit', 'MissingDiscriminator', 'InvalidDiscriminator', ) def cls_kwargs(cls: Type['PydanticErrorMixin'], ctx: 'DictStrAny') -> 'PydanticErrorMixin': """ For built-in exceptions like ValueError or TypeError, we need to implement __reduce__ to override 
the default behaviour (instead of __getstate__/__setstate__) By default pickle protocol 2 calls `cls.__new__(cls, *args)`. Since we only use kwargs, we need a little constructor to change that. Note: the callable can't be a lambda as pickle looks in the namespace to find it """ return cls(**ctx) class PydanticErrorMixin: code: str msg_template: str def __init__(self, **ctx: Any) -> None: self.__dict__ = ctx def __str__(self) -> str: return self.msg_template.format(**self.__dict__) def __reduce__(self) -> Tuple[Callable[..., 'PydanticErrorMixin'], Tuple[Type['PydanticErrorMixin'], 'DictStrAny']]: return cls_kwargs, (self.__class__, self.__dict__) class PydanticTypeError(PydanticErrorMixin, TypeError): pass class PydanticValueError(PydanticErrorMixin, ValueError): pass class ConfigError(RuntimeError): pass class MissingError(PydanticValueError): msg_template = 'field required' class ExtraError(PydanticValueError): msg_template = 'extra fields not permitted' class NoneIsNotAllowedError(PydanticTypeError): code = 'none.not_allowed' msg_template = 'none is not an allowed value' class NoneIsAllowedError(PydanticTypeError): code = 'none.allowed' msg_template = 'value is not none' class WrongConstantError(PydanticValueError): code = 'const' def __str__(self) -> str: permitted = ', '.join(repr(v) for v in self.permitted) # type: ignore return f'unexpected value; permitted: {permitted}' class NotNoneError(PydanticTypeError): code = 'not_none' msg_template = 'value is not None' class BoolError(PydanticTypeError): msg_template = 'value could not be parsed to a boolean' class BytesError(PydanticTypeError): msg_template = 'byte type expected' class DictError(PydanticTypeError): msg_template = 'value is not a valid dict' class EmailError(PydanticValueError): msg_template = 'value is not a valid email address' class UrlError(PydanticValueError): code = 'url' class UrlSchemeError(UrlError): code = 'url.scheme' msg_template = 'invalid or missing URL scheme' class 
UrlSchemePermittedError(UrlError): code = 'url.scheme' msg_template = 'URL scheme not permitted' def __init__(self, allowed_schemes: Set[str]): super().__init__(allowed_schemes=allowed_schemes) class UrlUserInfoError(UrlError): code = 'url.userinfo' msg_template = 'userinfo required in URL but missing' class UrlHostError(UrlError): code = 'url.host' msg_template = 'URL host invalid' class UrlHostTldError(UrlError): code = 'url.host' msg_template = 'URL host invalid, top level domain required' class UrlPortError(UrlError): code = 'url.port' msg_template = 'URL port invalid, port cannot exceed 65535' class UrlExtraError(UrlError): code = 'url.extra' msg_template = 'URL invalid, extra characters found after valid URL: {extra!r}' class EnumMemberError(PydanticTypeError): code = 'enum' def __str__(self) -> str: permitted = ', '.join(repr(v.value) for v in self.enum_values) # type: ignore return f'value is not a valid enumeration member; permitted: {permitted}' class IntegerError(PydanticTypeError): msg_template = 'value is not a valid integer' class FloatError(PydanticTypeError): msg_template = 'value is not a valid float' class PathError(PydanticTypeError): msg_template = 'value is not a valid path' class _PathValueError(PydanticValueError): def __init__(self, *, path: Path) -> None: super().__init__(path=str(path)) class PathNotExistsError(_PathValueError): code = 'path.not_exists' msg_template = 'file or directory at path "{path}" does not exist' class PathNotAFileError(_PathValueError): code = 'path.not_a_file' msg_template = 'path "{path}" does not point to a file' class PathNotADirectoryError(_PathValueError): code = 'path.not_a_directory' msg_template = 'path "{path}" does not point to a directory' class PyObjectError(PydanticTypeError): msg_template = 'ensure this value contains valid import path or valid callable: {error_message}' class SequenceError(PydanticTypeError): msg_template = 'value is not a valid sequence' class IterableError(PydanticTypeError): 
msg_template = 'value is not a valid iterable' class ListError(PydanticTypeError): msg_template = 'value is not a valid list' class SetError(PydanticTypeError): msg_template = 'value is not a valid set' class FrozenSetError(PydanticTypeError): msg_template = 'value is not a valid frozenset' class DequeError(PydanticTypeError): msg_template = 'value is not a valid deque' class TupleError(PydanticTypeError): msg_template = 'value is not a valid tuple' class TupleLengthError(PydanticValueError): code = 'tuple.length' msg_template = 'wrong tuple length {actual_length}, expected {expected_length}' def __init__(self, *, actual_length: int, expected_length: int) -> None: super().__init__(actual_length=actual_length, expected_length=expected_length) class ListMinLengthError(PydanticValueError): code = 'list.min_items' msg_template = 'ensure this value has at least {limit_value} items' def __init__(self, *, limit_value: int) -> None: super().__init__(limit_value=limit_value) class ListMaxLengthError(PydanticValueError): code = 'list.max_items' msg_template = 'ensure this value has at most {limit_value} items' def __init__(self, *, limit_value: int) -> None: super().__init__(limit_value=limit_value) class ListUniqueItemsError(PydanticValueError): code = 'list.unique_items' msg_template = 'the list has duplicated items' class SetMinLengthError(PydanticValueError): code = 'set.min_items' msg_template = 'ensure this value has at least {limit_value} items' def __init__(self, *, limit_value: int) -> None: super().__init__(limit_value=limit_value) class SetMaxLengthError(PydanticValueError): code = 'set.max_items' msg_template = 'ensure this value has at most {limit_value} items' def __init__(self, *, limit_value: int) -> None: super().__init__(limit_value=limit_value) class FrozenSetMinLengthError(PydanticValueError): code = 'frozenset.min_items' msg_template = 'ensure this value has at least {limit_value} items' def __init__(self, *, limit_value: int) -> None: 
super().__init__(limit_value=limit_value) class FrozenSetMaxLengthError(PydanticValueError): code = 'frozenset.max_items' msg_template = 'ensure this value has at most {limit_value} items' def __init__(self, *, limit_value: int) -> None: super().__init__(limit_value=limit_value) class AnyStrMinLengthError(PydanticValueError): code = 'any_str.min_length' msg_template = 'ensure this value has at least {limit_value} characters' def __init__(self, *, limit_value: int) -> None: super().__init__(limit_value=limit_value) class AnyStrMaxLengthError(PydanticValueError): code = 'any_str.max_length' msg_template = 'ensure this value has at most {limit_value} characters' def __init__(self, *, limit_value: int) -> None: super().__init__(limit_value=limit_value) class StrError(PydanticTypeError): msg_template = 'str type expected' class StrRegexError(PydanticValueError): code = 'str.regex' msg_template = 'string does not match regex "{pattern}"' def __init__(self, *, pattern: str) -> None: super().__init__(pattern=pattern) class _NumberBoundError(PydanticValueError): def __init__(self, *, limit_value: Union[int, float, Decimal]) -> None: super().__init__(limit_value=limit_value) class NumberNotGtError(_NumberBoundError): code = 'number.not_gt' msg_template = 'ensure this value is greater than {limit_value}' class NumberNotGeError(_NumberBoundError): code = 'number.not_ge' msg_template = 'ensure this value is greater than or equal to {limit_value}' class NumberNotLtError(_NumberBoundError): code = 'number.not_lt' msg_template = 'ensure this value is less than {limit_value}' class NumberNotLeError(_NumberBoundError): code = 'number.not_le' msg_template = 'ensure this value is less than or equal to {limit_value}' class NumberNotFiniteError(PydanticValueError): code = 'number.not_finite_number' msg_template = 'ensure this value is a finite number' class NumberNotMultipleError(PydanticValueError): code = 'number.not_multiple' msg_template = 'ensure this value is a multiple of 
{multiple_of}' def __init__(self, *, multiple_of: Union[int, float, Decimal]) -> None: super().__init__(multiple_of=multiple_of) class DecimalError(PydanticTypeError): msg_template = 'value is not a valid decimal' class DecimalIsNotFiniteError(PydanticValueError): code = 'decimal.not_finite' msg_template = 'value is not a valid decimal' class DecimalMaxDigitsError(PydanticValueError): code = 'decimal.max_digits' msg_template = 'ensure that there are no more than {max_digits} digits in total' def __init__(self, *, max_digits: int) -> None: super().__init__(max_digits=max_digits) class DecimalMaxPlacesError(PydanticValueError): code = 'decimal.max_places' msg_template = 'ensure that there are no more than {decimal_places} decimal places' def __init__(self, *, decimal_places: int) -> None: super().__init__(decimal_places=decimal_places) class DecimalWholeDigitsError(PydanticValueError): code = 'decimal.whole_digits' msg_template = 'ensure that there are no more than {whole_digits} digits before the decimal point' def __init__(self, *, whole_digits: int) -> None: super().__init__(whole_digits=whole_digits) class DateTimeError(PydanticValueError): msg_template = 'invalid datetime format' class DateError(PydanticValueError): msg_template = 'invalid date format' class DateNotInThePastError(PydanticValueError): code = 'date.not_in_the_past' msg_template = 'date is not in the past' class DateNotInTheFutureError(PydanticValueError): code = 'date.not_in_the_future' msg_template = 'date is not in the future' class TimeError(PydanticValueError): msg_template = 'invalid time format' class DurationError(PydanticValueError): msg_template = 'invalid duration format' class HashableError(PydanticTypeError): msg_template = 'value is not a valid hashable' class UUIDError(PydanticTypeError): msg_template = 'value is not a valid uuid' class UUIDVersionError(PydanticValueError): code = 'uuid.version' msg_template = 'uuid version {required_version} expected' def __init__(self, *, 
required_version: int) -> None: super().__init__(required_version=required_version) class ArbitraryTypeError(PydanticTypeError): code = 'arbitrary_type' msg_template = 'instance of {expected_arbitrary_type} expected' def __init__(self, *, expected_arbitrary_type: Type[Any]) -> None: super().__init__(expected_arbitrary_type=display_as_type(expected_arbitrary_type)) class ClassError(PydanticTypeError): code = 'class' msg_template = 'a class is expected' class SubclassError(PydanticTypeError): code = 'subclass' msg_template = 'subclass of {expected_class} expected' def __init__(self, *, expected_class: Type[Any]) -> None: super().__init__(expected_class=display_as_type(expected_class)) class JsonError(PydanticValueError): msg_template = 'Invalid JSON' class JsonTypeError(PydanticTypeError): code = 'json' msg_template = 'JSON object must be str, bytes or bytearray' class PatternError(PydanticValueError): code = 'regex_pattern' msg_template = 'Invalid regular expression' class DataclassTypeError(PydanticTypeError): code = 'dataclass' msg_template = 'instance of {class_name}, tuple or dict expected' class CallableError(PydanticTypeError): msg_template = '{value} is not callable' class EnumError(PydanticTypeError): code = 'enum_instance' msg_template = '{value} is not a valid Enum instance' class IntEnumError(PydanticTypeError): code = 'int_enum_instance' msg_template = '{value} is not a valid IntEnum instance' class IPvAnyAddressError(PydanticValueError): msg_template = 'value is not a valid IPv4 or IPv6 address' class IPvAnyInterfaceError(PydanticValueError): msg_template = 'value is not a valid IPv4 or IPv6 interface' class IPvAnyNetworkError(PydanticValueError): msg_template = 'value is not a valid IPv4 or IPv6 network' class IPv4AddressError(PydanticValueError): msg_template = 'value is not a valid IPv4 address' class IPv6AddressError(PydanticValueError): msg_template = 'value is not a valid IPv6 address' class IPv4NetworkError(PydanticValueError): msg_template = 
'value is not a valid IPv4 network' class IPv6NetworkError(PydanticValueError): msg_template = 'value is not a valid IPv6 network' class IPv4InterfaceError(PydanticValueError): msg_template = 'value is not a valid IPv4 interface' class IPv6InterfaceError(PydanticValueError): msg_template = 'value is not a valid IPv6 interface' class ColorError(PydanticValueError): msg_template = 'value is not a valid color: {reason}' class StrictBoolError(PydanticValueError): msg_template = 'value is not a valid boolean' class NotDigitError(PydanticValueError): code = 'payment_card_number.digits' msg_template = 'card number is not all digits' class LuhnValidationError(PydanticValueError): code = 'payment_card_number.luhn_check' msg_template = 'card number is not luhn valid' class InvalidLengthForBrand(PydanticValueError): code = 'payment_card_number.invalid_length_for_brand' msg_template = 'Length for a {brand} card must be {required_length}' class InvalidByteSize(PydanticValueError): msg_template = 'could not parse value and unit from byte string' class InvalidByteSizeUnit(PydanticValueError): msg_template = 'could not interpret byte unit: {unit}' class MissingDiscriminator(PydanticValueError): code = 'discriminated_union.missing_discriminator' msg_template = 'Discriminator {discriminator_key!r} is missing in value' class InvalidDiscriminator(PydanticValueError): code = 'discriminated_union.invalid_discriminator' msg_template = ( 'No match for discriminator {discriminator_key!r} and value {discriminator_value!r} ' '(allowed values: {allowed_values})' ) def __init__(self, *, discriminator_key: str, discriminator_value: Any, allowed_values: Sequence[Any]) -> None: super().__init__( discriminator_key=discriminator_key, discriminator_value=discriminator_value, allowed_values=', '.join(map(repr, allowed_values)), ) pydantic-2.10.6/pydantic/v1/fields.py000066400000000000000000001427311474456633400174240ustar00rootroot00000000000000import copy import re from collections import Counter as 
CollectionCounter, defaultdict, deque from collections.abc import Callable, Hashable as CollectionsHashable, Iterable as CollectionsIterable from typing import ( TYPE_CHECKING, Any, Counter, DefaultDict, Deque, Dict, ForwardRef, FrozenSet, Generator, Iterable, Iterator, List, Mapping, Optional, Pattern, Sequence, Set, Tuple, Type, TypeVar, Union, ) from typing_extensions import Annotated, Final from pydantic.v1 import errors as errors_ from pydantic.v1.class_validators import Validator, make_generic_validator, prep_validators from pydantic.v1.error_wrappers import ErrorWrapper from pydantic.v1.errors import ConfigError, InvalidDiscriminator, MissingDiscriminator, NoneIsNotAllowedError from pydantic.v1.types import Json, JsonWrapper from pydantic.v1.typing import ( NoArgAnyCallable, convert_generics, display_as_type, get_args, get_origin, is_finalvar, is_literal_type, is_new_type, is_none_type, is_typeddict, is_typeddict_special, is_union, new_type_supertype, ) from pydantic.v1.utils import ( PyObjectStr, Representation, ValueItems, get_discriminator_alias_and_values, get_unique_discriminator_alias, lenient_isinstance, lenient_issubclass, sequence_like, smart_deepcopy, ) from pydantic.v1.validators import constant_validator, dict_validator, find_validators, validate_json Required: Any = Ellipsis T = TypeVar('T') class UndefinedType: def __repr__(self) -> str: return 'PydanticUndefined' def __copy__(self: T) -> T: return self def __reduce__(self) -> str: return 'Undefined' def __deepcopy__(self: T, _: Any) -> T: return self Undefined = UndefinedType() if TYPE_CHECKING: from pydantic.v1.class_validators import ValidatorsList from pydantic.v1.config import BaseConfig from pydantic.v1.error_wrappers import ErrorList from pydantic.v1.types import ModelOrDc from pydantic.v1.typing import AbstractSetIntStr, MappingIntStrAny, ReprArgs ValidateReturn = Tuple[Optional[Any], Optional[ErrorList]] LocStr = Union[Tuple[Union[int, str], ...], str] BoolUndefined = Union[bool, 
UndefinedType] class FieldInfo(Representation): """ Captures extra information about a field. """ __slots__ = ( 'default', 'default_factory', 'alias', 'alias_priority', 'title', 'description', 'exclude', 'include', 'const', 'gt', 'ge', 'lt', 'le', 'multiple_of', 'allow_inf_nan', 'max_digits', 'decimal_places', 'min_items', 'max_items', 'unique_items', 'min_length', 'max_length', 'allow_mutation', 'repr', 'regex', 'discriminator', 'extra', ) # field constraints with the default value, it's also used in update_from_config below __field_constraints__ = { 'min_length': None, 'max_length': None, 'regex': None, 'gt': None, 'lt': None, 'ge': None, 'le': None, 'multiple_of': None, 'allow_inf_nan': None, 'max_digits': None, 'decimal_places': None, 'min_items': None, 'max_items': None, 'unique_items': None, 'allow_mutation': True, } def __init__(self, default: Any = Undefined, **kwargs: Any) -> None: self.default = default self.default_factory = kwargs.pop('default_factory', None) self.alias = kwargs.pop('alias', None) self.alias_priority = kwargs.pop('alias_priority', 2 if self.alias is not None else None) self.title = kwargs.pop('title', None) self.description = kwargs.pop('description', None) self.exclude = kwargs.pop('exclude', None) self.include = kwargs.pop('include', None) self.const = kwargs.pop('const', None) self.gt = kwargs.pop('gt', None) self.ge = kwargs.pop('ge', None) self.lt = kwargs.pop('lt', None) self.le = kwargs.pop('le', None) self.multiple_of = kwargs.pop('multiple_of', None) self.allow_inf_nan = kwargs.pop('allow_inf_nan', None) self.max_digits = kwargs.pop('max_digits', None) self.decimal_places = kwargs.pop('decimal_places', None) self.min_items = kwargs.pop('min_items', None) self.max_items = kwargs.pop('max_items', None) self.unique_items = kwargs.pop('unique_items', None) self.min_length = kwargs.pop('min_length', None) self.max_length = kwargs.pop('max_length', None) self.allow_mutation = kwargs.pop('allow_mutation', True) self.regex = 
kwargs.pop('regex', None)
        self.discriminator = kwargs.pop('discriminator', None)
        self.repr = kwargs.pop('repr', True)
        self.extra = kwargs

    def __repr_args__(self) -> 'ReprArgs':
        field_defaults_to_hide: Dict[str, Any] = {
            'repr': True,
            **self.__field_constraints__,
        }

        attrs = ((s, getattr(self, s)) for s in self.__slots__)
        return [(a, v) for a, v in attrs if v != field_defaults_to_hide.get(a, None)]

    def get_constraints(self) -> Set[str]:
        """
        Gets the constraints set on the field by comparing the constraint value with its default value
        :return: the constraints set on field_info
        """
        return {attr for attr, default in self.__field_constraints__.items() if getattr(self, attr) != default}

    def update_from_config(self, from_config: Dict[str, Any]) -> None:
        """
        Update this FieldInfo based on a dict from get_field_info, only fields which have not been set are updated.
        """
        for attr_name, value in from_config.items():
            try:
                current_value = getattr(self, attr_name)
            except AttributeError:
                # attr_name is not an attribute of FieldInfo, it should therefore be added to extra
                # (except if extra already has this value!)
self.extra.setdefault(attr_name, value) else: if current_value is self.__field_constraints__.get(attr_name, None): setattr(self, attr_name, value) elif attr_name == 'exclude': self.exclude = ValueItems.merge(value, current_value) elif attr_name == 'include': self.include = ValueItems.merge(value, current_value, intersect=True) def _validate(self) -> None: if self.default is not Undefined and self.default_factory is not None: raise ValueError('cannot specify both default and default_factory') def Field( default: Any = Undefined, *, default_factory: Optional[NoArgAnyCallable] = None, alias: Optional[str] = None, title: Optional[str] = None, description: Optional[str] = None, exclude: Optional[Union['AbstractSetIntStr', 'MappingIntStrAny', Any]] = None, include: Optional[Union['AbstractSetIntStr', 'MappingIntStrAny', Any]] = None, const: Optional[bool] = None, gt: Optional[float] = None, ge: Optional[float] = None, lt: Optional[float] = None, le: Optional[float] = None, multiple_of: Optional[float] = None, allow_inf_nan: Optional[bool] = None, max_digits: Optional[int] = None, decimal_places: Optional[int] = None, min_items: Optional[int] = None, max_items: Optional[int] = None, unique_items: Optional[bool] = None, min_length: Optional[int] = None, max_length: Optional[int] = None, allow_mutation: bool = True, regex: Optional[str] = None, discriminator: Optional[str] = None, repr: bool = True, **extra: Any, ) -> Any: """ Used to provide extra information about a field, either for the model schema or complex validation. Some arguments apply only to number fields (``int``, ``float``, ``Decimal``) and some apply only to ``str``. :param default: since this is replacing the field’s default, its first argument is used to set the default, use ellipsis (``...``) to indicate the field is required :param default_factory: callable that will be called when a default value is needed for this field If both `default` and `default_factory` are set, an error is raised. 
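The rule stated above (setting both `default` and `default_factory` raises an error) can be sketched stand-alone; the `_UNDEFINED` sentinel and `resolve_default` name below are illustrative stand-ins, not part of this module:

```python
_UNDEFINED = object()  # stand-in for this module's `Undefined` sentinel


def resolve_default(default=_UNDEFINED, default_factory=None):
    # mirrors FieldInfo._validate: the two ways of supplying a default are exclusive
    if default is not _UNDEFINED and default_factory is not None:
        raise ValueError('cannot specify both default and default_factory')
    if default_factory is not None:
        return default_factory()
    return None if default is _UNDEFINED else default


assert resolve_default(default=3) == 3
assert resolve_default(default_factory=list) == []
```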
:param alias: the public name of the field :param title: can be any string, used in the schema :param description: can be any string, used in the schema :param exclude: exclude this field while dumping. Takes same values as the ``include`` and ``exclude`` arguments on the ``.dict`` method. :param include: include this field while dumping. Takes same values as the ``include`` and ``exclude`` arguments on the ``.dict`` method. :param const: this field is required and *must* take it's default value :param gt: only applies to numbers, requires the field to be "greater than". The schema will have an ``exclusiveMinimum`` validation keyword :param ge: only applies to numbers, requires the field to be "greater than or equal to". The schema will have a ``minimum`` validation keyword :param lt: only applies to numbers, requires the field to be "less than". The schema will have an ``exclusiveMaximum`` validation keyword :param le: only applies to numbers, requires the field to be "less than or equal to". The schema will have a ``maximum`` validation keyword :param multiple_of: only applies to numbers, requires the field to be "a multiple of". The schema will have a ``multipleOf`` validation keyword :param allow_inf_nan: only applies to numbers, allows the field to be NaN or infinity (+inf or -inf), which is a valid Python float. Default True, set to False for compatibility with JSON. :param max_digits: only applies to Decimals, requires the field to have a maximum number of digits within the decimal. It does not include a zero before the decimal point or trailing decimal zeroes. :param decimal_places: only applies to Decimals, requires the field to have at most a number of decimal places allowed. It does not include trailing decimal zeroes. :param min_items: only applies to lists, requires the field to have a minimum number of elements. 
The schema will have a ``minItems`` validation keyword :param max_items: only applies to lists, requires the field to have a maximum number of elements. The schema will have a ``maxItems`` validation keyword :param unique_items: only applies to lists, requires the field not to have duplicated elements. The schema will have a ``uniqueItems`` validation keyword :param min_length: only applies to strings, requires the field to have a minimum length. The schema will have a ``minLength`` validation keyword :param max_length: only applies to strings, requires the field to have a maximum length. The schema will have a ``maxLength`` validation keyword :param allow_mutation: a boolean which defaults to True. When False, the field raises a TypeError if the field is assigned on an instance. The BaseModel Config must set validate_assignment to True :param regex: only applies to strings, requires the field match against a regular expression pattern string. The schema will have a ``pattern`` validation keyword :param discriminator: only useful with a (discriminated a.k.a. tagged) `Union` of sub models with a common field. 
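What `discriminator` buys can be shown with a minimal dispatch sketch (not this module's code; the mapping values and `pet_type` key are hypothetical): the tag value selects one `Union` member directly instead of trying each member in turn.

```python
# hypothetical mapping of discriminator values to sub-models,
# analogous to ModelField.sub_fields_mapping built later in this file
sub_fields_mapping = {'cat': 'CatModel', 'dog': 'DogModel'}


def pick_sub_field(value, discriminator_key='pet_type'):
    try:
        tag = value[discriminator_key]
    except KeyError:
        raise ValueError(f'Discriminator {discriminator_key!r} is missing in value')
    try:
        return sub_fields_mapping[tag]
    except KeyError:
        raise ValueError(f'No match for discriminator value {tag!r}')


assert pick_sub_field({'pet_type': 'dog'}) == 'DogModel'
```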
The `discriminator` is the name of this common field to shorten validation and improve generated schema :param repr: show this field in the representation :param **extra: any additional keyword arguments will be added as is to the schema """ field_info = FieldInfo( default, default_factory=default_factory, alias=alias, title=title, description=description, exclude=exclude, include=include, const=const, gt=gt, ge=ge, lt=lt, le=le, multiple_of=multiple_of, allow_inf_nan=allow_inf_nan, max_digits=max_digits, decimal_places=decimal_places, min_items=min_items, max_items=max_items, unique_items=unique_items, min_length=min_length, max_length=max_length, allow_mutation=allow_mutation, regex=regex, discriminator=discriminator, repr=repr, **extra, ) field_info._validate() return field_info # used to be an enum but changed to int's for small performance improvement as less access overhead SHAPE_SINGLETON = 1 SHAPE_LIST = 2 SHAPE_SET = 3 SHAPE_MAPPING = 4 SHAPE_TUPLE = 5 SHAPE_TUPLE_ELLIPSIS = 6 SHAPE_SEQUENCE = 7 SHAPE_FROZENSET = 8 SHAPE_ITERABLE = 9 SHAPE_GENERIC = 10 SHAPE_DEQUE = 11 SHAPE_DICT = 12 SHAPE_DEFAULTDICT = 13 SHAPE_COUNTER = 14 SHAPE_NAME_LOOKUP = { SHAPE_LIST: 'List[{}]', SHAPE_SET: 'Set[{}]', SHAPE_TUPLE_ELLIPSIS: 'Tuple[{}, ...]', SHAPE_SEQUENCE: 'Sequence[{}]', SHAPE_FROZENSET: 'FrozenSet[{}]', SHAPE_ITERABLE: 'Iterable[{}]', SHAPE_DEQUE: 'Deque[{}]', SHAPE_DICT: 'Dict[{}]', SHAPE_DEFAULTDICT: 'DefaultDict[{}]', SHAPE_COUNTER: 'Counter[{}]', } MAPPING_LIKE_SHAPES: Set[int] = {SHAPE_DEFAULTDICT, SHAPE_DICT, SHAPE_MAPPING, SHAPE_COUNTER} class ModelField(Representation): __slots__ = ( 'type_', 'outer_type_', 'annotation', 'sub_fields', 'sub_fields_mapping', 'key_field', 'validators', 'pre_validators', 'post_validators', 'default', 'default_factory', 'required', 'final', 'model_config', 'name', 'alias', 'has_alias', 'field_info', 'discriminator_key', 'discriminator_alias', 'validate_always', 'allow_none', 'shape', 'class_validators', 'parse_json', ) def 
__init__( self, *, name: str, type_: Type[Any], class_validators: Optional[Dict[str, Validator]], model_config: Type['BaseConfig'], default: Any = None, default_factory: Optional[NoArgAnyCallable] = None, required: 'BoolUndefined' = Undefined, final: bool = False, alias: Optional[str] = None, field_info: Optional[FieldInfo] = None, ) -> None: self.name: str = name self.has_alias: bool = alias is not None self.alias: str = alias if alias is not None else name self.annotation = type_ self.type_: Any = convert_generics(type_) self.outer_type_: Any = type_ self.class_validators = class_validators or {} self.default: Any = default self.default_factory: Optional[NoArgAnyCallable] = default_factory self.required: 'BoolUndefined' = required self.final: bool = final self.model_config = model_config self.field_info: FieldInfo = field_info or FieldInfo(default) self.discriminator_key: Optional[str] = self.field_info.discriminator self.discriminator_alias: Optional[str] = self.discriminator_key self.allow_none: bool = False self.validate_always: bool = False self.sub_fields: Optional[List[ModelField]] = None self.sub_fields_mapping: Optional[Dict[str, 'ModelField']] = None # used for discriminated union self.key_field: Optional[ModelField] = None self.validators: 'ValidatorsList' = [] self.pre_validators: Optional['ValidatorsList'] = None self.post_validators: Optional['ValidatorsList'] = None self.parse_json: bool = False self.shape: int = SHAPE_SINGLETON self.model_config.prepare_field(self) self.prepare() def get_default(self) -> Any: return smart_deepcopy(self.default) if self.default_factory is None else self.default_factory() @staticmethod def _get_field_info( field_name: str, annotation: Any, value: Any, config: Type['BaseConfig'] ) -> Tuple[FieldInfo, Any]: """ Get a FieldInfo from a root typing.Annotated annotation, value, or config default. The FieldInfo may be set in typing.Annotated or the value, but not both. 
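The `Annotated` extraction described in the docstring above can be sketched independently; `Marker` and `extract_marker` are hypothetical stand-ins for `FieldInfo` and the relevant part of `_get_field_info`, using only the stdlib `typing` helpers:

```python
from typing import Annotated, get_args, get_origin


class Marker:
    # hypothetical stand-in for FieldInfo
    def __init__(self, **constraints):
        self.constraints = constraints


def extract_marker(annotation):
    # pull at most one Marker out of Annotated[X, ...] metadata,
    # mirroring the "multiple `Annotated` `Field`s" check above
    if get_origin(annotation) is Annotated:
        markers = [a for a in get_args(annotation)[1:] if isinstance(a, Marker)]
        if len(markers) > 1:
            raise ValueError('cannot specify multiple `Annotated` markers')
        return markers[0] if markers else None
    return None


m = extract_marker(Annotated[str, Marker(min_length=5)])
assert m is not None and m.constraints == {'min_length': 5}
```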
If neither contain a FieldInfo, a new one will be created using the config. :param field_name: name of the field for use in error messages :param annotation: a type hint such as `str` or `Annotated[str, Field(..., min_length=5)]` :param value: the field's assigned value :param config: the model's config object :return: the FieldInfo contained in the `annotation`, the value, or a new one from the config. """ field_info_from_config = config.get_field_info(field_name) field_info = None if get_origin(annotation) is Annotated: field_infos = [arg for arg in get_args(annotation)[1:] if isinstance(arg, FieldInfo)] if len(field_infos) > 1: raise ValueError(f'cannot specify multiple `Annotated` `Field`s for {field_name!r}') field_info = next(iter(field_infos), None) if field_info is not None: field_info = copy.copy(field_info) field_info.update_from_config(field_info_from_config) if field_info.default not in (Undefined, Required): raise ValueError(f'`Field` default cannot be set in `Annotated` for {field_name!r}') if value is not Undefined and value is not Required: # check also `Required` because of `validate_arguments` that sets `...` as default value field_info.default = value if isinstance(value, FieldInfo): if field_info is not None: raise ValueError(f'cannot specify `Annotated` and value `Field`s together for {field_name!r}') field_info = value field_info.update_from_config(field_info_from_config) elif field_info is None: field_info = FieldInfo(value, **field_info_from_config) value = None if field_info.default_factory is not None else field_info.default field_info._validate() return field_info, value @classmethod def infer( cls, *, name: str, value: Any, annotation: Any, class_validators: Optional[Dict[str, Validator]], config: Type['BaseConfig'], ) -> 'ModelField': from pydantic.v1.schema import get_annotation_from_field_info field_info, value = cls._get_field_info(name, annotation, value, config) required: 'BoolUndefined' = Undefined if value is Required: required = 
True
            value = None
        elif value is not Undefined:
            required = False
        annotation = get_annotation_from_field_info(annotation, field_info, name, config.validate_assignment)

        return cls(
            name=name,
            type_=annotation,
            alias=field_info.alias,
            class_validators=class_validators,
            default=value,
            default_factory=field_info.default_factory,
            required=required,
            model_config=config,
            field_info=field_info,
        )

    def set_config(self, config: Type['BaseConfig']) -> None:
        self.model_config = config
        info_from_config = config.get_field_info(self.name)

        config.prepare_field(self)
        new_alias = info_from_config.get('alias')
        new_alias_priority = info_from_config.get('alias_priority') or 0
        if new_alias and new_alias_priority >= (self.field_info.alias_priority or 0):
            self.field_info.alias = new_alias
            self.field_info.alias_priority = new_alias_priority
            self.alias = new_alias

        new_exclude = info_from_config.get('exclude')
        if new_exclude is not None:
            self.field_info.exclude = ValueItems.merge(self.field_info.exclude, new_exclude)
        new_include = info_from_config.get('include')
        if new_include is not None:
            self.field_info.include = ValueItems.merge(self.field_info.include, new_include, intersect=True)

    @property
    def alt_alias(self) -> bool:
        return self.name != self.alias

    def prepare(self) -> None:
        """
        Prepare the field by inspecting self.default, self.type_ etc.

        Note: this method is **not** idempotent (because _type_analysis is not idempotent),
        e.g. calling it multiple times may modify the field and configure it incorrectly.
""" self._set_default_and_type() if self.type_.__class__ is ForwardRef or self.type_.__class__ is DeferredType: # self.type_ is currently a ForwardRef and there's nothing we can do now, # user will need to call model.update_forward_refs() return self._type_analysis() if self.required is Undefined: self.required = True if self.default is Undefined and self.default_factory is None: self.default = None self.populate_validators() def _set_default_and_type(self) -> None: """ Set the default value, infer the type if needed and check if `None` value is valid. """ if self.default_factory is not None: if self.type_ is Undefined: raise errors_.ConfigError( f'you need to set the type of field {self.name!r} when using `default_factory`' ) return default_value = self.get_default() if default_value is not None and self.type_ is Undefined: self.type_ = default_value.__class__ self.outer_type_ = self.type_ self.annotation = self.type_ if self.type_ is Undefined: raise errors_.ConfigError(f'unable to infer type for attribute "{self.name}"') if self.required is False and default_value is None: self.allow_none = True def _type_analysis(self) -> None: # noqa: C901 (ignore complexity) # typing interface is horrible, we have to do some ugly checks if lenient_issubclass(self.type_, JsonWrapper): self.type_ = self.type_.inner_type self.parse_json = True elif lenient_issubclass(self.type_, Json): self.type_ = Any self.parse_json = True elif isinstance(self.type_, TypeVar): if self.type_.__bound__: self.type_ = self.type_.__bound__ elif self.type_.__constraints__: self.type_ = Union[self.type_.__constraints__] else: self.type_ = Any elif is_new_type(self.type_): self.type_ = new_type_supertype(self.type_) if self.type_ is Any or self.type_ is object: if self.required is Undefined: self.required = False self.allow_none = True return elif self.type_ is Pattern or self.type_ is re.Pattern: # python 3.7 only, Pattern is a typing object but without sub fields return elif 
is_literal_type(self.type_): return elif is_typeddict(self.type_): return if is_finalvar(self.type_): self.final = True if self.type_ is Final: self.type_ = Any else: self.type_ = get_args(self.type_)[0] self._type_analysis() return origin = get_origin(self.type_) if origin is Annotated or is_typeddict_special(origin): self.type_ = get_args(self.type_)[0] self._type_analysis() return if self.discriminator_key is not None and not is_union(origin): raise TypeError('`discriminator` can only be used with `Union` type with more than one variant') # add extra check for `collections.abc.Hashable` for python 3.10+ where origin is not `None` if origin is None or origin is CollectionsHashable: # field is not "typing" object eg. Union, Dict, List etc. # allow None for virtual superclasses of NoneType, e.g. Hashable if isinstance(self.type_, type) and isinstance(None, self.type_): self.allow_none = True return elif origin is Callable: return elif is_union(origin): types_ = [] for type_ in get_args(self.type_): if is_none_type(type_) or type_ is Any or type_ is object: if self.required is Undefined: self.required = False self.allow_none = True if is_none_type(type_): continue types_.append(type_) if len(types_) == 1: # Optional[] self.type_ = types_[0] # this is the one case where the "outer type" isn't just the original type self.outer_type_ = self.type_ # re-run to correctly interpret the new self.type_ self._type_analysis() else: self.sub_fields = [self._create_sub_type(t, f'{self.name}_{display_as_type(t)}') for t in types_] if self.discriminator_key is not None: self.prepare_discriminated_union_sub_fields() return elif issubclass(origin, Tuple): # type: ignore # origin == Tuple without item type args = get_args(self.type_) if not args: # plain tuple self.type_ = Any self.shape = SHAPE_TUPLE_ELLIPSIS elif len(args) == 2 and args[1] is Ellipsis: # e.g. Tuple[int, ...] 
self.type_ = args[0] self.shape = SHAPE_TUPLE_ELLIPSIS self.sub_fields = [self._create_sub_type(args[0], f'{self.name}_0')] elif args == ((),): # Tuple[()] means empty tuple self.shape = SHAPE_TUPLE self.type_ = Any self.sub_fields = [] else: self.shape = SHAPE_TUPLE self.sub_fields = [self._create_sub_type(t, f'{self.name}_{i}') for i, t in enumerate(args)] return elif issubclass(origin, List): # Create self validators get_validators = getattr(self.type_, '__get_validators__', None) if get_validators: self.class_validators.update( {f'list_{i}': Validator(validator, pre=True) for i, validator in enumerate(get_validators())} ) self.type_ = get_args(self.type_)[0] self.shape = SHAPE_LIST elif issubclass(origin, Set): # Create self validators get_validators = getattr(self.type_, '__get_validators__', None) if get_validators: self.class_validators.update( {f'set_{i}': Validator(validator, pre=True) for i, validator in enumerate(get_validators())} ) self.type_ = get_args(self.type_)[0] self.shape = SHAPE_SET elif issubclass(origin, FrozenSet): # Create self validators get_validators = getattr(self.type_, '__get_validators__', None) if get_validators: self.class_validators.update( {f'frozenset_{i}': Validator(validator, pre=True) for i, validator in enumerate(get_validators())} ) self.type_ = get_args(self.type_)[0] self.shape = SHAPE_FROZENSET elif issubclass(origin, Deque): self.type_ = get_args(self.type_)[0] self.shape = SHAPE_DEQUE elif issubclass(origin, Sequence): self.type_ = get_args(self.type_)[0] self.shape = SHAPE_SEQUENCE # priority to most common mapping: dict elif origin is dict or origin is Dict: self.key_field = self._create_sub_type(get_args(self.type_)[0], 'key_' + self.name, for_keys=True) self.type_ = get_args(self.type_)[1] self.shape = SHAPE_DICT elif issubclass(origin, DefaultDict): self.key_field = self._create_sub_type(get_args(self.type_)[0], 'key_' + self.name, for_keys=True) self.type_ = get_args(self.type_)[1] self.shape = SHAPE_DEFAULTDICT 
        elif issubclass(origin, Counter):
            self.key_field = self._create_sub_type(get_args(self.type_)[0], 'key_' + self.name, for_keys=True)
            self.type_ = int
            self.shape = SHAPE_COUNTER
        elif issubclass(origin, Mapping):
            self.key_field = self._create_sub_type(get_args(self.type_)[0], 'key_' + self.name, for_keys=True)
            self.type_ = get_args(self.type_)[1]
            self.shape = SHAPE_MAPPING
        # Equality check as almost everything inherits from Iterable, including str
        # check for Iterable and CollectionsIterable, as it could receive one even when declared with the other
        elif origin in {Iterable, CollectionsIterable}:
            self.type_ = get_args(self.type_)[0]
            self.shape = SHAPE_ITERABLE
            self.sub_fields = [self._create_sub_type(self.type_, f'{self.name}_type')]
        elif issubclass(origin, Type):  # type: ignore
            return
        elif hasattr(origin, '__get_validators__') or self.model_config.arbitrary_types_allowed:
            # Is a Pydantic-compatible generic that handles itself
            # or we have arbitrary_types_allowed = True
            self.shape = SHAPE_GENERIC
            self.sub_fields = [self._create_sub_type(t, f'{self.name}_{i}') for i, t in enumerate(get_args(self.type_))]
            self.type_ = origin
            return
        else:
            raise TypeError(f'Fields of type "{origin}" are not supported.')

        # type_ has been refined eg.
as the type of a List and sub_fields needs to be populated
        self.sub_fields = [self._create_sub_type(self.type_, '_' + self.name)]

    def prepare_discriminated_union_sub_fields(self) -> None:
        """
        Prepare the mapping <discriminator key> -> <ModelField> and update `sub_fields`
        Note that this process can be aborted if a `ForwardRef` is encountered
        """
        assert self.discriminator_key is not None

        if self.type_.__class__ is DeferredType:
            return

        assert self.sub_fields is not None
        sub_fields_mapping: Dict[str, 'ModelField'] = {}
        all_aliases: Set[str] = set()

        for sub_field in self.sub_fields:
            t = sub_field.type_
            if t.__class__ is ForwardRef:
                # Stopping everything...will need to call `update_forward_refs`
                return

            alias, discriminator_values = get_discriminator_alias_and_values(t, self.discriminator_key)
            all_aliases.add(alias)
            for discriminator_value in discriminator_values:
                sub_fields_mapping[discriminator_value] = sub_field

        self.sub_fields_mapping = sub_fields_mapping
        self.discriminator_alias = get_unique_discriminator_alias(all_aliases, self.discriminator_key)

    def _create_sub_type(self, type_: Type[Any], name: str, *, for_keys: bool = False) -> 'ModelField':
        if for_keys:
            class_validators = None
        else:
            # validators for sub items should not have `each_item` as we want to check only the first sublevel
            class_validators = {
                k: Validator(
                    func=v.func,
                    pre=v.pre,
                    each_item=False,
                    always=v.always,
                    check_fields=v.check_fields,
                    skip_on_failure=v.skip_on_failure,
                )
                for k, v in self.class_validators.items()
                if v.each_item
            }

        field_info, _ = self._get_field_info(name, type_, None, self.model_config)

        return self.__class__(
            type_=type_,
            name=name,
            class_validators=class_validators,
            model_config=self.model_config,
            field_info=field_info,
        )

    def populate_validators(self) -> None:
        """
        Prepare self.pre_validators, self.validators, and self.post_validators
        based on self.type_'s __get_validators__ and class validators.

        This method should be idempotent, e.g. it should be safe to call multiple times
        without mis-configuring the field.
""" self.validate_always = getattr(self.type_, 'validate_always', False) or any( v.always for v in self.class_validators.values() ) class_validators_ = self.class_validators.values() if not self.sub_fields or self.shape == SHAPE_GENERIC: get_validators = getattr(self.type_, '__get_validators__', None) v_funcs = ( *[v.func for v in class_validators_ if v.each_item and v.pre], *(get_validators() if get_validators else list(find_validators(self.type_, self.model_config))), *[v.func for v in class_validators_ if v.each_item and not v.pre], ) self.validators = prep_validators(v_funcs) self.pre_validators = [] self.post_validators = [] if self.field_info and self.field_info.const: self.post_validators.append(make_generic_validator(constant_validator)) if class_validators_: self.pre_validators += prep_validators(v.func for v in class_validators_ if not v.each_item and v.pre) self.post_validators += prep_validators(v.func for v in class_validators_ if not v.each_item and not v.pre) if self.parse_json: self.pre_validators.append(make_generic_validator(validate_json)) self.pre_validators = self.pre_validators or None self.post_validators = self.post_validators or None def validate( self, v: Any, values: Dict[str, Any], *, loc: 'LocStr', cls: Optional['ModelOrDc'] = None ) -> 'ValidateReturn': assert self.type_.__class__ is not DeferredType if self.type_.__class__ is ForwardRef: assert cls is not None raise ConfigError( f'field "{self.name}" not yet prepared so type is still a ForwardRef, ' f'you might need to call {cls.__name__}.update_forward_refs().' 
) errors: Optional['ErrorList'] if self.pre_validators: v, errors = self._apply_validators(v, values, loc, cls, self.pre_validators) if errors: return v, errors if v is None: if is_none_type(self.type_): # keep validating pass elif self.allow_none: if self.post_validators: return self._apply_validators(v, values, loc, cls, self.post_validators) else: return None, None else: return v, ErrorWrapper(NoneIsNotAllowedError(), loc) if self.shape == SHAPE_SINGLETON: v, errors = self._validate_singleton(v, values, loc, cls) elif self.shape in MAPPING_LIKE_SHAPES: v, errors = self._validate_mapping_like(v, values, loc, cls) elif self.shape == SHAPE_TUPLE: v, errors = self._validate_tuple(v, values, loc, cls) elif self.shape == SHAPE_ITERABLE: v, errors = self._validate_iterable(v, values, loc, cls) elif self.shape == SHAPE_GENERIC: v, errors = self._apply_validators(v, values, loc, cls, self.validators) else: # sequence, list, set, generator, tuple with ellipsis, frozen set v, errors = self._validate_sequence_like(v, values, loc, cls) if not errors and self.post_validators: v, errors = self._apply_validators(v, values, loc, cls, self.post_validators) return v, errors def _validate_sequence_like( # noqa: C901 (ignore complexity) self, v: Any, values: Dict[str, Any], loc: 'LocStr', cls: Optional['ModelOrDc'] ) -> 'ValidateReturn': """ Validate sequence-like containers: lists, tuples, sets and generators Note that large if-else blocks are necessary to enable Cython optimization, which is why we disable the complexity check above. 
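The "convert back to the declared container" step at the end of `_validate_sequence_like` can be sketched stand-alone (illustrative `rebuild` helper, not this module's code): validated items come back as a list and are restored to the original shape, preserving a deque's `maxlen`.

```python
from collections import deque


def rebuild(result, original):
    # restore the validated list to the shape of the original container
    if isinstance(original, set):
        return set(result)
    if isinstance(original, frozenset):
        return frozenset(result)
    if isinstance(original, deque):
        return deque(result, maxlen=getattr(original, 'maxlen', None))
    if isinstance(original, tuple):
        return tuple(result)
    return result


d = rebuild([1, 2, 3], deque([0], maxlen=2))
assert d.maxlen == 2  # maxlen survives, trimming to the newest items
```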
""" if not sequence_like(v): e: errors_.PydanticTypeError if self.shape == SHAPE_LIST: e = errors_.ListError() elif self.shape in (SHAPE_TUPLE, SHAPE_TUPLE_ELLIPSIS): e = errors_.TupleError() elif self.shape == SHAPE_SET: e = errors_.SetError() elif self.shape == SHAPE_FROZENSET: e = errors_.FrozenSetError() else: e = errors_.SequenceError() return v, ErrorWrapper(e, loc) loc = loc if isinstance(loc, tuple) else (loc,) result = [] errors: List[ErrorList] = [] for i, v_ in enumerate(v): v_loc = *loc, i r, ee = self._validate_singleton(v_, values, v_loc, cls) if ee: errors.append(ee) else: result.append(r) if errors: return v, errors converted: Union[List[Any], Set[Any], FrozenSet[Any], Tuple[Any, ...], Iterator[Any], Deque[Any]] = result if self.shape == SHAPE_SET: converted = set(result) elif self.shape == SHAPE_FROZENSET: converted = frozenset(result) elif self.shape == SHAPE_TUPLE_ELLIPSIS: converted = tuple(result) elif self.shape == SHAPE_DEQUE: converted = deque(result, maxlen=getattr(v, 'maxlen', None)) elif self.shape == SHAPE_SEQUENCE: if isinstance(v, tuple): converted = tuple(result) elif isinstance(v, set): converted = set(result) elif isinstance(v, Generator): converted = iter(result) elif isinstance(v, deque): converted = deque(result, maxlen=getattr(v, 'maxlen', None)) return converted, None def _validate_iterable( self, v: Any, values: Dict[str, Any], loc: 'LocStr', cls: Optional['ModelOrDc'] ) -> 'ValidateReturn': """ Validate Iterables. This intentionally doesn't validate values to allow infinite generators. 
""" try: iterable = iter(v) except TypeError: return v, ErrorWrapper(errors_.IterableError(), loc) return iterable, None def _validate_tuple( self, v: Any, values: Dict[str, Any], loc: 'LocStr', cls: Optional['ModelOrDc'] ) -> 'ValidateReturn': e: Optional[Exception] = None if not sequence_like(v): e = errors_.TupleError() else: actual_length, expected_length = len(v), len(self.sub_fields) # type: ignore if actual_length != expected_length: e = errors_.TupleLengthError(actual_length=actual_length, expected_length=expected_length) if e: return v, ErrorWrapper(e, loc) loc = loc if isinstance(loc, tuple) else (loc,) result = [] errors: List[ErrorList] = [] for i, (v_, field) in enumerate(zip(v, self.sub_fields)): # type: ignore v_loc = *loc, i r, ee = field.validate(v_, values, loc=v_loc, cls=cls) if ee: errors.append(ee) else: result.append(r) if errors: return v, errors else: return tuple(result), None def _validate_mapping_like( self, v: Any, values: Dict[str, Any], loc: 'LocStr', cls: Optional['ModelOrDc'] ) -> 'ValidateReturn': try: v_iter = dict_validator(v) except TypeError as exc: return v, ErrorWrapper(exc, loc) loc = loc if isinstance(loc, tuple) else (loc,) result, errors = {}, [] for k, v_ in v_iter.items(): v_loc = *loc, '__key__' key_result, key_errors = self.key_field.validate(k, values, loc=v_loc, cls=cls) # type: ignore if key_errors: errors.append(key_errors) continue v_loc = *loc, k value_result, value_errors = self._validate_singleton(v_, values, v_loc, cls) if value_errors: errors.append(value_errors) continue result[key_result] = value_result if errors: return v, errors elif self.shape == SHAPE_DICT: return result, None elif self.shape == SHAPE_DEFAULTDICT: return defaultdict(self.type_, result), None elif self.shape == SHAPE_COUNTER: return CollectionCounter(result), None else: return self._get_mapping_value(v, result), None def _get_mapping_value(self, original: T, converted: Dict[Any, Any]) -> Union[T, Dict[Any, Any]]: """ When type is 
`Mapping[KT, KV]` (or another unsupported mapping), we try to avoid coercing to `dict` unwillingly. """ original_cls = original.__class__ if original_cls == dict or original_cls == Dict: return converted elif original_cls in {defaultdict, DefaultDict}: return defaultdict(self.type_, converted) else: try: # Counter, OrderedDict, UserDict, ... return original_cls(converted) # type: ignore except TypeError: raise RuntimeError(f'Could not convert dictionary to {original_cls.__name__!r}') from None def _validate_singleton( self, v: Any, values: Dict[str, Any], loc: 'LocStr', cls: Optional['ModelOrDc'] ) -> 'ValidateReturn': if self.sub_fields: if self.discriminator_key is not None: return self._validate_discriminated_union(v, values, loc, cls) errors = [] if self.model_config.smart_union and is_union(get_origin(self.type_)): # 1st pass: check if the value is an exact instance of one of the Union types # (e.g. to avoid coercing a bool into an int) for field in self.sub_fields: if v.__class__ is field.outer_type_: return v, None # 2nd pass: check if the value is an instance of any subclass of the Union types for field in self.sub_fields: # This whole logic will be improved later on to support more complex `isinstance` checks # It will probably be done once a strict mode is added and be something like: # ``` # value, error = field.validate(v, values, strict=True) # if error is None: # return value, None # ``` try: if isinstance(v, field.outer_type_): return v, None except TypeError: # compound type if lenient_isinstance(v, get_origin(field.outer_type_)): value, error = field.validate(v, values, loc=loc, cls=cls) if not error: return value, None # 1st pass by default or 3rd pass with `smart_union` enabled: # check if the value can be coerced into one of the Union types for field in self.sub_fields: value, error = field.validate(v, values, loc=loc, cls=cls) if error: errors.append(error) else: return value, None return v, errors else: return self._apply_validators(v, values, 
loc, cls, self.validators) def _validate_discriminated_union( self, v: Any, values: Dict[str, Any], loc: 'LocStr', cls: Optional['ModelOrDc'] ) -> 'ValidateReturn': assert self.discriminator_key is not None assert self.discriminator_alias is not None try: try: discriminator_value = v[self.discriminator_alias] except KeyError: if self.model_config.allow_population_by_field_name: discriminator_value = v[self.discriminator_key] else: raise except KeyError: return v, ErrorWrapper(MissingDiscriminator(discriminator_key=self.discriminator_key), loc) except TypeError: try: # BaseModel or dataclass discriminator_value = getattr(v, self.discriminator_key) except (AttributeError, TypeError): return v, ErrorWrapper(MissingDiscriminator(discriminator_key=self.discriminator_key), loc) if self.sub_fields_mapping is None: assert cls is not None raise ConfigError( f'field "{self.name}" not yet prepared so type is still a ForwardRef, ' f'you might need to call {cls.__name__}.update_forward_refs().' ) try: sub_field = self.sub_fields_mapping[discriminator_value] except (KeyError, TypeError): # KeyError: `discriminator_value` is not in the dictionary. # TypeError: `discriminator_value` is unhashable. assert self.sub_fields_mapping is not None return v, ErrorWrapper( InvalidDiscriminator( discriminator_key=self.discriminator_key, discriminator_value=discriminator_value, allowed_values=list(self.sub_fields_mapping), ), loc, ) else: if not isinstance(loc, tuple): loc = (loc,) return sub_field.validate(v, values, loc=(*loc, display_as_type(sub_field.type_)), cls=cls) def _apply_validators( self, v: Any, values: Dict[str, Any], loc: 'LocStr', cls: Optional['ModelOrDc'], validators: 'ValidatorsList' ) -> 'ValidateReturn': for validator in validators: try: v = validator(cls, v, values, self, self.model_config) except (ValueError, TypeError, AssertionError) as exc: return v, ErrorWrapper(exc, loc) return v, None def is_complex(self) -> bool: """ Whether the field is "complex" eg. 
env variables should be parsed as JSON.
        """
        from pydantic.v1.main import BaseModel

        return (
            self.shape != SHAPE_SINGLETON
            or hasattr(self.type_, '__pydantic_model__')
            or lenient_issubclass(self.type_, (BaseModel, list, set, frozenset, dict))
        )

    def _type_display(self) -> PyObjectStr:
        t = display_as_type(self.type_)

        if self.shape in MAPPING_LIKE_SHAPES:
            t = f'Mapping[{display_as_type(self.key_field.type_)}, {t}]'  # type: ignore
        elif self.shape == SHAPE_TUPLE:
            t = 'Tuple[{}]'.format(', '.join(display_as_type(f.type_) for f in self.sub_fields))  # type: ignore
        elif self.shape == SHAPE_GENERIC:
            assert self.sub_fields
            t = '{}[{}]'.format(
                display_as_type(self.type_), ', '.join(display_as_type(f.type_) for f in self.sub_fields)
            )
        elif self.shape != SHAPE_SINGLETON:
            t = SHAPE_NAME_LOOKUP[self.shape].format(t)

        if self.allow_none and (self.shape != SHAPE_SINGLETON or not self.sub_fields):
            t = f'Optional[{t}]'
        return PyObjectStr(t)

    def __repr_args__(self) -> 'ReprArgs':
        args = [('name', self.name), ('type', self._type_display()), ('required', self.required)]

        if not self.required:
            if self.default_factory is not None:
                args.append(('default_factory', f'<function {self.default_factory.__name__}>'))
            else:
                args.append(('default', self.default))

        if self.alt_alias:
            args.append(('alias', self.alias))
        return args


class ModelPrivateAttr(Representation):
    __slots__ = ('default', 'default_factory')

    def __init__(self, default: Any = Undefined, *, default_factory: Optional[NoArgAnyCallable] = None) -> None:
        self.default = default
        self.default_factory = default_factory

    def get_default(self) -> Any:
        return smart_deepcopy(self.default) if self.default_factory is None else self.default_factory()

    def __eq__(self, other: Any) -> bool:
        return isinstance(other, self.__class__) and (self.default, self.default_factory) == (
            other.default,
            other.default_factory,
        )


def PrivateAttr(
    default: Any = Undefined,
    *,
    default_factory: Optional[NoArgAnyCallable] = None,
) -> Any:
    """
    Indicates that attribute is only used internally and never mixed with
regular fields. Types or values of private attrs are not checked by pydantic and it's up to you to keep them relevant. Private attrs are stored in model __slots__. :param default: the attribute’s default value :param default_factory: callable that will be called when a default value is needed for this attribute If both `default` and `default_factory` are set, an error is raised. """ if default is not Undefined and default_factory is not None: raise ValueError('cannot specify both default and default_factory') return ModelPrivateAttr( default, default_factory=default_factory, ) class DeferredType: """ Used to postpone field preparation, while creating recursive generic models. """ def is_finalvar_with_default_val(type_: Type[Any], val: Any) -> bool: return is_finalvar(type_) and val is not Undefined and not isinstance(val, FieldInfo) pydantic-2.10.6/pydantic/v1/generics.py000066400000000000000000000427171474456633400177600ustar00rootroot00000000000000import sys import types import typing from typing import ( TYPE_CHECKING, Any, ClassVar, Dict, ForwardRef, Generic, Iterator, List, Mapping, Optional, Tuple, Type, TypeVar, Union, cast, ) from weakref import WeakKeyDictionary, WeakValueDictionary from typing_extensions import Annotated, Literal as ExtLiteral from pydantic.v1.class_validators import gather_all_validators from pydantic.v1.fields import DeferredType from pydantic.v1.main import BaseModel, create_model from pydantic.v1.types import JsonWrapper from pydantic.v1.typing import display_as_type, get_all_type_hints, get_args, get_origin, typing_base from pydantic.v1.utils import all_identical, lenient_issubclass if sys.version_info >= (3, 10): from typing import _UnionGenericAlias if sys.version_info >= (3, 8): from typing import Literal GenericModelT = TypeVar('GenericModelT', bound='GenericModel') TypeVarType = Any # since mypy doesn't allow the use of TypeVar as a type CacheKey = Tuple[Type[Any], Any, Tuple[Any, ...]] Parametrization = Mapping[TypeVarType, 
Type[Any]] # weak dictionaries allow the dynamically created parametrized versions of generic models to get collected # once they are no longer referenced by the caller. if sys.version_info >= (3, 9): # Typing for weak dictionaries available at 3.9 GenericTypesCache = WeakValueDictionary[CacheKey, Type[BaseModel]] AssignedParameters = WeakKeyDictionary[Type[BaseModel], Parametrization] else: GenericTypesCache = WeakValueDictionary AssignedParameters = WeakKeyDictionary # _generic_types_cache is a Mapping from __class_getitem__ arguments to the parametrized version of generic models. # This ensures multiple calls of e.g. A[B] return always the same class. _generic_types_cache = GenericTypesCache() # _assigned_parameters is a Mapping from parametrized version of generic models to assigned types of parametrizations # as captured during construction of the class (not instances). # E.g., for generic model `Model[A, B]`, when parametrized model `Model[int, str]` is created, # `Model[int, str]`: {A: int, B: str}` will be stored in `_assigned_parameters`. # (This information is only otherwise available after creation from the class name string). _assigned_parameters = AssignedParameters() class GenericModel(BaseModel): __slots__ = () __concrete__: ClassVar[bool] = False if TYPE_CHECKING: # Putting this in a TYPE_CHECKING block allows us to replace `if Generic not in cls.__bases__` with # `not hasattr(cls, "__parameters__")`. This means we don't need to force non-concrete subclasses of # `GenericModel` to also inherit from `Generic`, which would require changes to the use of `create_model` below. __parameters__: ClassVar[Tuple[TypeVarType, ...]] # Setting the return type as Type[Any] instead of Type[BaseModel] prevents PyCharm warnings def __class_getitem__(cls: Type[GenericModelT], params: Union[Type[Any], Tuple[Type[Any], ...]]) -> Type[Any]: """Instantiates a new class from a generic class `cls` and type variables `params`. :param params: Tuple of types the class . 
Given a generic class `Model` with 2 type variables and a concrete model `Model[str, int]`, the value `(str, int)` would be passed to `params`. :return: New model class inheriting from `cls` with instantiated types described by `params`. If no parameters are given, `cls` is returned as is. """ def _cache_key(_params: Any) -> CacheKey: args = get_args(_params) # python returns a list for Callables, which is not hashable if len(args) == 2 and isinstance(args[0], list): args = (tuple(args[0]), args[1]) return cls, _params, args cached = _generic_types_cache.get(_cache_key(params)) if cached is not None: return cached if cls.__concrete__ and Generic not in cls.__bases__: raise TypeError('Cannot parameterize a concrete instantiation of a generic model') if not isinstance(params, tuple): params = (params,) if cls is GenericModel and any(isinstance(param, TypeVar) for param in params): raise TypeError('Type parameters should be placed on typing.Generic, not GenericModel') if not hasattr(cls, '__parameters__'): raise TypeError(f'Type {cls.__name__} must inherit from typing.Generic before being parameterized') check_parameters_count(cls, params) # Build map from generic typevars to passed params typevars_map: Dict[TypeVarType, Type[Any]] = dict(zip(cls.__parameters__, params)) if all_identical(typevars_map.keys(), typevars_map.values()) and typevars_map: return cls # if arguments are equal to parameters it's the same object # Create new model with original model as parent inserting fields with DeferredType. 
model_name = cls.__concrete_name__(params) validators = gather_all_validators(cls) type_hints = get_all_type_hints(cls).items() instance_type_hints = {k: v for k, v in type_hints if get_origin(v) is not ClassVar} fields = {k: (DeferredType(), cls.__fields__[k].field_info) for k in instance_type_hints if k in cls.__fields__} model_module, called_globally = get_caller_frame_info() created_model = cast( Type[GenericModel], # casting ensures mypy is aware of the __concrete__ and __parameters__ attributes create_model( model_name, __module__=model_module or cls.__module__, __base__=(cls,) + tuple(cls.__parameterized_bases__(typevars_map)), __config__=None, __validators__=validators, __cls_kwargs__=None, **fields, ), ) _assigned_parameters[created_model] = typevars_map if called_globally: # create global reference and therefore allow pickling object_by_reference = None reference_name = model_name reference_module_globals = sys.modules[created_model.__module__].__dict__ while object_by_reference is not created_model: object_by_reference = reference_module_globals.setdefault(reference_name, created_model) reference_name += '_' created_model.Config = cls.Config # Find any typevars that are still present in the model. # If none are left, the model is fully "concrete", otherwise the new # class is a generic class as well taking the found typevars as # parameters. new_params = tuple( {param: None for param in iter_contained_typevars(typevars_map.values())} ) # use dict as ordered set created_model.__concrete__ = not new_params if new_params: created_model.__parameters__ = new_params # Save created model in cache so we don't end up creating duplicate # models that should be identical. _generic_types_cache[_cache_key(params)] = created_model if len(params) == 1: _generic_types_cache[_cache_key(params[0])] = created_model # Recursively walk class type hints and replace generic typevars # with concrete types that were passed. 
        _prepare_model_fields(created_model, fields, instance_type_hints, typevars_map)

        return created_model

    @classmethod
    def __concrete_name__(cls: Type[Any], params: Tuple[Type[Any], ...]) -> str:
        """Compute class name for child classes.

        :param params: Tuple of types the class is parameterized with.
            Given a generic class `Model` with 2 type variables and a concrete model `Model[str, int]`,
            the value `(str, int)` would be passed to `params`.
        :return: String representing the new class where `params` are passed to `cls` as type variables.

        This method can be overridden to achieve a custom naming scheme for GenericModels.
        """
        param_names = [display_as_type(param) for param in params]
        params_component = ', '.join(param_names)
        return f'{cls.__name__}[{params_component}]'

    @classmethod
    def __parameterized_bases__(cls, typevars_map: Parametrization) -> Iterator[Type[Any]]:
        """
        Returns unbound bases of cls parameterised to given type variables

        :param typevars_map: Dictionary of type applications for binding subclasses.
            Given a generic class `Model` with 2 type variables [S, T]
            and a concrete model `Model[str, int]`,
            the value `{S: str, T: int}` would be passed to `typevars_map`.

        :return: an iterator of generic sub classes, parameterised by `typevars_map`
            and other assigned parameters of `cls`

        e.g.:
        ```
        class A(GenericModel, Generic[T]):
            ...

        class B(A[V], Generic[V]):
            ...
assert A[int] in B.__parameterized_bases__({V: int}) ``` """ def build_base_model( base_model: Type[GenericModel], mapped_types: Parametrization ) -> Iterator[Type[GenericModel]]: base_parameters = tuple(mapped_types[param] for param in base_model.__parameters__) parameterized_base = base_model.__class_getitem__(base_parameters) if parameterized_base is base_model or parameterized_base is cls: # Avoid duplication in MRO return yield parameterized_base for base_model in cls.__bases__: if not issubclass(base_model, GenericModel): # not a class that can be meaningfully parameterized continue elif not getattr(base_model, '__parameters__', None): # base_model is "GenericModel" (and has no __parameters__) # or # base_model is already concrete, and will be included transitively via cls. continue elif cls in _assigned_parameters: if base_model in _assigned_parameters: # cls is partially parameterised but not from base_model # e.g. cls = B[S], base_model = A[S] # B[S][int] should subclass A[int], (and will be transitively via B[int]) # but it's not viable to consistently subclass types with arbitrary construction # So don't attempt to include A[S][int] continue else: # base_model not in _assigned_parameters: # cls is partially parameterized, base_model is original generic # e.g. cls = B[str, T], base_model = B[S, T] # Need to determine the mapping for the base_model parameters mapped_types: Parametrization = { key: typevars_map.get(value, value) for key, value in _assigned_parameters[cls].items() } yield from build_base_model(base_model, mapped_types) else: # cls is base generic, so base_class has a distinct base # can construct the Parameterised base model using typevars_map directly yield from build_base_model(base_model, typevars_map) def replace_types(type_: Any, type_map: Mapping[Any, Any]) -> Any: """Return type with all occurrences of `type_map` keys recursively replaced with their values. 
:param type_: Any type, class or generic alias :param type_map: Mapping from `TypeVar` instance to concrete types. :return: New type representing the basic structure of `type_` with all `typevar_map` keys recursively replaced. >>> replace_types(Tuple[str, Union[List[str], float]], {str: int}) Tuple[int, Union[List[int], float]] """ if not type_map: return type_ type_args = get_args(type_) origin_type = get_origin(type_) if origin_type is Annotated: annotated_type, *annotations = type_args return Annotated[replace_types(annotated_type, type_map), tuple(annotations)] if (origin_type is ExtLiteral) or (sys.version_info >= (3, 8) and origin_type is Literal): return type_map.get(type_, type_) # Having type args is a good indicator that this is a typing module # class instantiation or a generic alias of some sort. if type_args: resolved_type_args = tuple(replace_types(arg, type_map) for arg in type_args) if all_identical(type_args, resolved_type_args): # If all arguments are the same, there is no need to modify the # type or create a new object at all return type_ if ( origin_type is not None and isinstance(type_, typing_base) and not isinstance(origin_type, typing_base) and getattr(type_, '_name', None) is not None ): # In python < 3.9 generic aliases don't exist so any of these like `list`, # `type` or `collections.abc.Callable` need to be translated. # See: https://www.python.org/dev/peps/pep-0585 origin_type = getattr(typing, type_._name) assert origin_type is not None # PEP-604 syntax (Ex.: list | str) is represented with a types.UnionType object that does not have __getitem__. # We also cannot use isinstance() since we have to compare types. 
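The recursive substitution that `replace_types` performs can be illustrated with a standalone, stdlib-only sketch. `substitute` is an invented name (not pydantic's API), and Python 3.9+ is assumed so builtin origins like `list` and `tuple` are subscriptable:

```python
# Standalone sketch of the recursion above: walk a generic alias with
# get_args()/get_origin() and rebuild it only when some argument changed.
# Assumes Python 3.9+; `substitute` is an invented name, not pydantic's.
from typing import List, Mapping, Tuple, Union, get_args, get_origin


def substitute(tp, type_map: Mapping) -> object:
    """Recursively replace `type_map` keys occurring inside `tp`."""
    args = get_args(tp)
    if args:
        new_args = tuple(substitute(a, type_map) for a in args)
        if new_args == args:
            return tp  # nothing changed: keep the original alias object
        origin = get_origin(tp)
        if origin is Union:
            return Union[new_args]  # Union must be rebuilt via typing.Union
        return origin[new_args]  # e.g. list[int], tuple[int, str]
    return type_map.get(tp, tp)  # leaf: swap a mapped type/TypeVar, else keep


# Mirrors the module's doctest: Tuple[str, Union[List[str], float]] with {str: int}
resolved = substitute(Tuple[str, Union[List[str], float]], {str: int})
```

Like the real implementation, returning the original object when nothing changed preserves identity, which is what the `all_identical` short-circuits in this module rely on.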
if sys.version_info >= (3, 10) and origin_type is types.UnionType: # noqa: E721 return _UnionGenericAlias(origin_type, resolved_type_args) return origin_type[resolved_type_args] # We handle pydantic generic models separately as they don't have the same # semantics as "typing" classes or generic aliases if not origin_type and lenient_issubclass(type_, GenericModel) and not type_.__concrete__: type_args = type_.__parameters__ resolved_type_args = tuple(replace_types(t, type_map) for t in type_args) if all_identical(type_args, resolved_type_args): return type_ return type_[resolved_type_args] # Handle special case for typehints that can have lists as arguments. # `typing.Callable[[int, str], int]` is an example for this. if isinstance(type_, (List, list)): resolved_list = list(replace_types(element, type_map) for element in type_) if all_identical(type_, resolved_list): return type_ return resolved_list # For JsonWrapperValue, need to handle its inner type to allow correct parsing # of generic Json arguments like Json[T] if not origin_type and lenient_issubclass(type_, JsonWrapper): type_.inner_type = replace_types(type_.inner_type, type_map) return type_ # If all else fails, we try to resolve the type directly and otherwise just # return the input with no modifications. 
new_type = type_map.get(type_, type_) # Convert string to ForwardRef if isinstance(new_type, str): return ForwardRef(new_type) else: return new_type def check_parameters_count(cls: Type[GenericModel], parameters: Tuple[Any, ...]) -> None: actual = len(parameters) expected = len(cls.__parameters__) if actual != expected: description = 'many' if actual > expected else 'few' raise TypeError(f'Too {description} parameters for {cls.__name__}; actual {actual}, expected {expected}') DictValues: Type[Any] = {}.values().__class__ def iter_contained_typevars(v: Any) -> Iterator[TypeVarType]: """Recursively iterate through all subtypes and type args of `v` and yield any typevars that are found.""" if isinstance(v, TypeVar): yield v elif hasattr(v, '__parameters__') and not get_origin(v) and lenient_issubclass(v, GenericModel): yield from v.__parameters__ elif isinstance(v, (DictValues, list)): for var in v: yield from iter_contained_typevars(var) else: args = get_args(v) for arg in args: yield from iter_contained_typevars(arg) def get_caller_frame_info() -> Tuple[Optional[str], bool]: """ Used inside a function to check whether it was called globally Will only work against non-compiled code, therefore used only in pydantic.generics :returns Tuple[module_name, called_globally] """ try: previous_caller_frame = sys._getframe(2) except ValueError as e: raise RuntimeError('This function must be used inside another function') from e except AttributeError: # sys module does not have _getframe function, so there's nothing we can do about it return None, False frame_globals = previous_caller_frame.f_globals return frame_globals.get('__name__'), previous_caller_frame.f_locals is frame_globals def _prepare_model_fields( created_model: Type[GenericModel], fields: Mapping[str, Any], instance_type_hints: Mapping[str, type], typevars_map: Mapping[Any, type], ) -> None: """ Replace DeferredType fields with concrete type hints and prepare them. 
""" for key, field in created_model.__fields__.items(): if key not in fields: assert field.type_.__class__ is not DeferredType # https://github.com/nedbat/coveragepy/issues/198 continue # pragma: no cover assert field.type_.__class__ is DeferredType, field.type_.__class__ field_type_hint = instance_type_hints[key] concrete_type = replace_types(field_type_hint, typevars_map) field.type_ = concrete_type field.outer_type_ = concrete_type field.prepare() created_model.__annotations__[key] = concrete_type pydantic-2.10.6/pydantic/v1/json.py000066400000000000000000000064761474456633400171340ustar00rootroot00000000000000import datetime from collections import deque from decimal import Decimal from enum import Enum from ipaddress import IPv4Address, IPv4Interface, IPv4Network, IPv6Address, IPv6Interface, IPv6Network from pathlib import Path from re import Pattern from types import GeneratorType from typing import Any, Callable, Dict, Type, Union from uuid import UUID from pydantic.v1.color import Color from pydantic.v1.networks import NameEmail from pydantic.v1.types import SecretBytes, SecretStr __all__ = 'pydantic_encoder', 'custom_pydantic_encoder', 'timedelta_isoformat' def isoformat(o: Union[datetime.date, datetime.time]) -> str: return o.isoformat() def decimal_encoder(dec_value: Decimal) -> Union[int, float]: """ Encodes a Decimal as int of there's no exponent, otherwise float This is useful when we use ConstrainedDecimal to represent Numeric(x,0) where a integer (but not int typed) is used. Encoding this as a float results in failed round-tripping between encode and parse. Our Id type is a prime example of this. 
>>> decimal_encoder(Decimal("1.0")) 1.0 >>> decimal_encoder(Decimal("1")) 1 """ if dec_value.as_tuple().exponent >= 0: return int(dec_value) else: return float(dec_value) ENCODERS_BY_TYPE: Dict[Type[Any], Callable[[Any], Any]] = { bytes: lambda o: o.decode(), Color: str, datetime.date: isoformat, datetime.datetime: isoformat, datetime.time: isoformat, datetime.timedelta: lambda td: td.total_seconds(), Decimal: decimal_encoder, Enum: lambda o: o.value, frozenset: list, deque: list, GeneratorType: list, IPv4Address: str, IPv4Interface: str, IPv4Network: str, IPv6Address: str, IPv6Interface: str, IPv6Network: str, NameEmail: str, Path: str, Pattern: lambda o: o.pattern, SecretBytes: str, SecretStr: str, set: list, UUID: str, } def pydantic_encoder(obj: Any) -> Any: from dataclasses import asdict, is_dataclass from pydantic.v1.main import BaseModel if isinstance(obj, BaseModel): return obj.dict() elif is_dataclass(obj): return asdict(obj) # Check the class type and its superclasses for a matching encoder for base in obj.__class__.__mro__[:-1]: try: encoder = ENCODERS_BY_TYPE[base] except KeyError: continue return encoder(obj) else: # We have exited the for loop without finding a suitable encoder raise TypeError(f"Object of type '{obj.__class__.__name__}' is not JSON serializable") def custom_pydantic_encoder(type_encoders: Dict[Any, Callable[[Type[Any]], Any]], obj: Any) -> Any: # Check the class type and its superclasses for a matching encoder for base in obj.__class__.__mro__[:-1]: try: encoder = type_encoders[base] except KeyError: continue return encoder(obj) else: # We have exited the for loop without finding a suitable encoder return pydantic_encoder(obj) def timedelta_isoformat(td: datetime.timedelta) -> str: """ ISO 8601 encoding for Python timedelta object. 
""" minutes, seconds = divmod(td.seconds, 60) hours, minutes = divmod(minutes, 60) return f'{"-" if td.days < 0 else ""}P{abs(td.days)}DT{hours:d}H{minutes:d}M{seconds:d}.{td.microseconds:06d}S' pydantic-2.10.6/pydantic/v1/main.py000066400000000000000000001270131474456633400170760ustar00rootroot00000000000000import warnings from abc import ABCMeta from copy import deepcopy from enum import Enum from functools import partial from pathlib import Path from types import FunctionType, prepare_class, resolve_bases from typing import ( TYPE_CHECKING, AbstractSet, Any, Callable, ClassVar, Dict, List, Mapping, Optional, Tuple, Type, TypeVar, Union, cast, no_type_check, overload, ) from typing_extensions import dataclass_transform from pydantic.v1.class_validators import ValidatorGroup, extract_root_validators, extract_validators, inherit_validators from pydantic.v1.config import BaseConfig, Extra, inherit_config, prepare_config from pydantic.v1.error_wrappers import ErrorWrapper, ValidationError from pydantic.v1.errors import ConfigError, DictError, ExtraError, MissingError from pydantic.v1.fields import ( MAPPING_LIKE_SHAPES, Field, ModelField, ModelPrivateAttr, PrivateAttr, Undefined, is_finalvar_with_default_val, ) from pydantic.v1.json import custom_pydantic_encoder, pydantic_encoder from pydantic.v1.parse import Protocol, load_file, load_str_bytes from pydantic.v1.schema import default_ref_template, model_schema from pydantic.v1.types import PyObject, StrBytes from pydantic.v1.typing import ( AnyCallable, get_args, get_origin, is_classvar, is_namedtuple, is_union, resolve_annotations, update_model_forward_refs, ) from pydantic.v1.utils import ( DUNDER_ATTRIBUTES, ROOT_KEY, ClassAttribute, GetterDict, Representation, ValueItems, generate_model_signature, is_valid_field, is_valid_private_name, lenient_issubclass, sequence_like, smart_deepcopy, unique_list, validate_field_name, ) if TYPE_CHECKING: from inspect import Signature from pydantic.v1.class_validators import 
ValidatorListDict
    from pydantic.v1.types import ModelOrDc
    from pydantic.v1.typing import (
        AbstractSetIntStr,
        AnyClassMethod,
        CallableGenerator,
        DictAny,
        DictStrAny,
        MappingIntStrAny,
        ReprArgs,
        SetStr,
        TupleGenerator,
    )

    Model = TypeVar('Model', bound='BaseModel')

__all__ = 'BaseModel', 'create_model', 'validate_model'

_T = TypeVar('_T')


def validate_custom_root_type(fields: Dict[str, ModelField]) -> None:
    if len(fields) > 1:
        raise ValueError(f'{ROOT_KEY} cannot be mixed with other fields')


def generate_hash_function(frozen: bool) -> Optional[Callable[[Any], int]]:
    def hash_function(self_: Any) -> int:
        return hash(self_.__class__) + hash(tuple(self_.__dict__.values()))

    return hash_function if frozen else None


# If a field is of type `Callable`, its default value should be a function and cannot be ignored.
ANNOTATED_FIELD_UNTOUCHED_TYPES: Tuple[Any, ...] = (property, type, classmethod, staticmethod)
# When creating a `BaseModel` instance, we bypass all the methods, properties... added to the model
UNTOUCHED_TYPES: Tuple[Any, ...] = (FunctionType,) + ANNOTATED_FIELD_UNTOUCHED_TYPES

# Note `ModelMetaclass` refers to `BaseModel`, but is also used to *create* `BaseModel`, so we need to add this extra
# (somewhat hacky) boolean to keep track of whether we've created the `BaseModel` class yet, and therefore whether it's
# safe to refer to it. If it *hasn't* been created, we assume that the `__new__` call we're in the middle of is for
# the `BaseModel` class, since that's defined immediately after the metaclass.
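The bootstrap problem the comment above describes (the metaclass is itself used to create `BaseModel`, so it must not inspect bases until the root class exists) can be shown with a minimal stdlib sketch; `CollectingMeta`, `Root`, and `__own_fields__` are invented names, not pydantic's:

```python
# Minimal sketch (invented names, not pydantic's code) of the bootstrap-flag
# pattern: the metaclass also creates the root class itself, so collecting
# inherited fields from bases must be skipped until the root class exists.
_root_defined = False


class CollectingMeta(type):
    def __new__(mcs, name, bases, namespace):
        fields = {}
        for base in reversed(bases):
            # Only safe to inspect once the root class has been created.
            if _root_defined and isinstance(base, CollectingMeta):
                fields.update(base.__own_fields__)
        fields.update(namespace.get('__annotations__', {}))
        namespace['__own_fields__'] = fields
        return super().__new__(mcs, name, bases, namespace)


class Root(metaclass=CollectingMeta):  # created while the flag is still False
    pass


_root_defined = True  # set immediately after the root class, as in pydantic


class Child(Root):
    x: int


class GrandChild(Child):
    y: str  # inherits 'x' from Child via the metaclass
```

`_is_base_model_class_defined` plays exactly this role for `ModelMetaclass` and `BaseModel` below.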
_is_base_model_class_defined = False @dataclass_transform(kw_only_default=True, field_specifiers=(Field,)) class ModelMetaclass(ABCMeta): @no_type_check # noqa C901 def __new__(mcs, name, bases, namespace, **kwargs): # noqa C901 fields: Dict[str, ModelField] = {} config = BaseConfig validators: 'ValidatorListDict' = {} pre_root_validators, post_root_validators = [], [] private_attributes: Dict[str, ModelPrivateAttr] = {} base_private_attributes: Dict[str, ModelPrivateAttr] = {} slots: SetStr = namespace.get('__slots__', ()) slots = {slots} if isinstance(slots, str) else set(slots) class_vars: SetStr = set() hash_func: Optional[Callable[[Any], int]] = None for base in reversed(bases): if _is_base_model_class_defined and issubclass(base, BaseModel) and base != BaseModel: fields.update(smart_deepcopy(base.__fields__)) config = inherit_config(base.__config__, config) validators = inherit_validators(base.__validators__, validators) pre_root_validators += base.__pre_root_validators__ post_root_validators += base.__post_root_validators__ base_private_attributes.update(base.__private_attributes__) class_vars.update(base.__class_vars__) hash_func = base.__hash__ resolve_forward_refs = kwargs.pop('__resolve_forward_refs__', True) allowed_config_kwargs: SetStr = { key for key in dir(config) if not (key.startswith('__') and key.endswith('__')) # skip dunder methods and attributes } config_kwargs = {key: kwargs.pop(key) for key in kwargs.keys() & allowed_config_kwargs} config_from_namespace = namespace.get('Config') if config_kwargs and config_from_namespace: raise TypeError('Specifying config in two places is ambiguous, use either Config attribute or class kwargs') config = inherit_config(config_from_namespace, config, **config_kwargs) validators = inherit_validators(extract_validators(namespace), validators) vg = ValidatorGroup(validators) for f in fields.values(): f.set_config(config) extra_validators = vg.get_validators(f.name) if extra_validators: 
f.class_validators.update(extra_validators) # re-run prepare to add extra validators f.populate_validators() prepare_config(config, name) untouched_types = ANNOTATED_FIELD_UNTOUCHED_TYPES def is_untouched(v: Any) -> bool: return isinstance(v, untouched_types) or v.__class__.__name__ == 'cython_function_or_method' if (namespace.get('__module__'), namespace.get('__qualname__')) != ('pydantic.main', 'BaseModel'): annotations = resolve_annotations(namespace.get('__annotations__', {}), namespace.get('__module__', None)) # annotation only fields need to come first in fields for ann_name, ann_type in annotations.items(): if is_classvar(ann_type): class_vars.add(ann_name) elif is_finalvar_with_default_val(ann_type, namespace.get(ann_name, Undefined)): class_vars.add(ann_name) elif is_valid_field(ann_name): validate_field_name(bases, ann_name) value = namespace.get(ann_name, Undefined) allowed_types = get_args(ann_type) if is_union(get_origin(ann_type)) else (ann_type,) if ( is_untouched(value) and ann_type != PyObject and not any( lenient_issubclass(get_origin(allowed_type), Type) for allowed_type in allowed_types ) ): continue fields[ann_name] = ModelField.infer( name=ann_name, value=value, annotation=ann_type, class_validators=vg.get_validators(ann_name), config=config, ) elif ann_name not in namespace and config.underscore_attrs_are_private: private_attributes[ann_name] = PrivateAttr() untouched_types = UNTOUCHED_TYPES + config.keep_untouched for var_name, value in namespace.items(): can_be_changed = var_name not in class_vars and not is_untouched(value) if isinstance(value, ModelPrivateAttr): if not is_valid_private_name(var_name): raise NameError( f'Private attributes "{var_name}" must not be a valid field name; ' f'Use sunder or dunder names, e. g. 
"_{var_name}" or "__{var_name}__"' ) private_attributes[var_name] = value elif config.underscore_attrs_are_private and is_valid_private_name(var_name) and can_be_changed: private_attributes[var_name] = PrivateAttr(default=value) elif is_valid_field(var_name) and var_name not in annotations and can_be_changed: validate_field_name(bases, var_name) inferred = ModelField.infer( name=var_name, value=value, annotation=annotations.get(var_name, Undefined), class_validators=vg.get_validators(var_name), config=config, ) if var_name in fields: if lenient_issubclass(inferred.type_, fields[var_name].type_): inferred.type_ = fields[var_name].type_ else: raise TypeError( f'The type of {name}.{var_name} differs from the new default value; ' f'if you wish to change the type of this field, please use a type annotation' ) fields[var_name] = inferred _custom_root_type = ROOT_KEY in fields if _custom_root_type: validate_custom_root_type(fields) vg.check_for_unused() if config.json_encoders: json_encoder = partial(custom_pydantic_encoder, config.json_encoders) else: json_encoder = pydantic_encoder pre_rv_new, post_rv_new = extract_root_validators(namespace) if hash_func is None: hash_func = generate_hash_function(config.frozen) exclude_from_namespace = fields | private_attributes.keys() | {'__slots__'} new_namespace = { '__config__': config, '__fields__': fields, '__exclude_fields__': { name: field.field_info.exclude for name, field in fields.items() if field.field_info.exclude is not None } or None, '__include_fields__': { name: field.field_info.include for name, field in fields.items() if field.field_info.include is not None } or None, '__validators__': vg.validators, '__pre_root_validators__': unique_list( pre_root_validators + pre_rv_new, name_factory=lambda v: v.__name__, ), '__post_root_validators__': unique_list( post_root_validators + post_rv_new, name_factory=lambda skip_on_failure_and_v: skip_on_failure_and_v[1].__name__, ), '__schema_cache__': {}, '__json_encoder__': 
staticmethod(json_encoder), '__custom_root_type__': _custom_root_type, '__private_attributes__': {**base_private_attributes, **private_attributes}, '__slots__': slots | private_attributes.keys(), '__hash__': hash_func, '__class_vars__': class_vars, **{n: v for n, v in namespace.items() if n not in exclude_from_namespace}, } cls = super().__new__(mcs, name, bases, new_namespace, **kwargs) # set __signature__ attr only for model class, but not for its instances cls.__signature__ = ClassAttribute('__signature__', generate_model_signature(cls.__init__, fields, config)) if resolve_forward_refs: cls.__try_update_forward_refs__() # preserve `__set_name__` protocol defined in https://peps.python.org/pep-0487 # for attributes not in `new_namespace` (e.g. private attributes) for name, obj in namespace.items(): if name not in new_namespace: set_name = getattr(obj, '__set_name__', None) if callable(set_name): set_name(cls, name) return cls def __instancecheck__(self, instance: Any) -> bool: """ Avoid calling ABC _abc_subclasscheck unless we're pretty sure. 
See #3829 and python/cpython#92810 """ return hasattr(instance, '__post_root_validators__') and super().__instancecheck__(instance) object_setattr = object.__setattr__ class BaseModel(Representation, metaclass=ModelMetaclass): if TYPE_CHECKING: # populated by the metaclass, defined here to help IDEs only __fields__: ClassVar[Dict[str, ModelField]] = {} __include_fields__: ClassVar[Optional[Mapping[str, Any]]] = None __exclude_fields__: ClassVar[Optional[Mapping[str, Any]]] = None __validators__: ClassVar[Dict[str, AnyCallable]] = {} __pre_root_validators__: ClassVar[List[AnyCallable]] __post_root_validators__: ClassVar[List[Tuple[bool, AnyCallable]]] __config__: ClassVar[Type[BaseConfig]] = BaseConfig __json_encoder__: ClassVar[Callable[[Any], Any]] = lambda x: x __schema_cache__: ClassVar['DictAny'] = {} __custom_root_type__: ClassVar[bool] = False __signature__: ClassVar['Signature'] __private_attributes__: ClassVar[Dict[str, ModelPrivateAttr]] __class_vars__: ClassVar[SetStr] __fields_set__: ClassVar[SetStr] = set() Config = BaseConfig __slots__ = ('__dict__', '__fields_set__') __doc__ = '' # Null out the Representation docstring def __init__(__pydantic_self__, **data: Any) -> None: """ Create a new model by parsing and validating input data from keyword arguments. Raises ValidationError if the input data cannot be parsed to form a valid model. 
""" # Uses something other than `self` the first arg to allow "self" as a settable attribute values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data) if validation_error: raise validation_error try: object_setattr(__pydantic_self__, '__dict__', values) except TypeError as e: raise TypeError( 'Model values must be a dict; you may not have returned a dictionary from a root validator' ) from e object_setattr(__pydantic_self__, '__fields_set__', fields_set) __pydantic_self__._init_private_attributes() @no_type_check def __setattr__(self, name, value): # noqa: C901 (ignore complexity) if name in self.__private_attributes__ or name in DUNDER_ATTRIBUTES: return object_setattr(self, name, value) if self.__config__.extra is not Extra.allow and name not in self.__fields__: raise ValueError(f'"{self.__class__.__name__}" object has no field "{name}"') elif not self.__config__.allow_mutation or self.__config__.frozen: raise TypeError(f'"{self.__class__.__name__}" is immutable and does not support item assignment') elif name in self.__fields__ and self.__fields__[name].final: raise TypeError( f'"{self.__class__.__name__}" object "{name}" field is final and does not support reassignment' ) elif self.__config__.validate_assignment: new_values = {**self.__dict__, name: value} for validator in self.__pre_root_validators__: try: new_values = validator(self.__class__, new_values) except (ValueError, TypeError, AssertionError) as exc: raise ValidationError([ErrorWrapper(exc, loc=ROOT_KEY)], self.__class__) known_field = self.__fields__.get(name, None) if known_field: # We want to # - make sure validators are called without the current value for this field inside `values` # - keep other values (e.g. 
submodels) untouched (using `BaseModel.dict()` will change them into dicts) # - keep the order of the fields if not known_field.field_info.allow_mutation: raise TypeError(f'"{known_field.name}" has allow_mutation set to False and cannot be assigned') dict_without_original_value = {k: v for k, v in self.__dict__.items() if k != name} value, error_ = known_field.validate(value, dict_without_original_value, loc=name, cls=self.__class__) if error_: raise ValidationError([error_], self.__class__) else: new_values[name] = value errors = [] for skip_on_failure, validator in self.__post_root_validators__: if skip_on_failure and errors: continue try: new_values = validator(self.__class__, new_values) except (ValueError, TypeError, AssertionError) as exc: errors.append(ErrorWrapper(exc, loc=ROOT_KEY)) if errors: raise ValidationError(errors, self.__class__) # update the whole __dict__ as other values than just `value` # may be changed (e.g. with `root_validator`) object_setattr(self, '__dict__', new_values) else: self.__dict__[name] = value self.__fields_set__.add(name) def __getstate__(self) -> 'DictAny': private_attrs = ((k, getattr(self, k, Undefined)) for k in self.__private_attributes__) return { '__dict__': self.__dict__, '__fields_set__': self.__fields_set__, '__private_attribute_values__': {k: v for k, v in private_attrs if v is not Undefined}, } def __setstate__(self, state: 'DictAny') -> None: object_setattr(self, '__dict__', state['__dict__']) object_setattr(self, '__fields_set__', state['__fields_set__']) for name, value in state.get('__private_attribute_values__', {}).items(): object_setattr(self, name, value) def _init_private_attributes(self) -> None: for name, private_attr in self.__private_attributes__.items(): default = private_attr.get_default() if default is not Undefined: object_setattr(self, name, default) def dict( self, *, include: Optional[Union['AbstractSetIntStr', 'MappingIntStrAny']] = None, exclude: Optional[Union['AbstractSetIntStr', 
'MappingIntStrAny']] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, ) -> 'DictStrAny': """ Generate a dictionary representation of the model, optionally specifying which fields to include or exclude. """ if skip_defaults is not None: warnings.warn( f'{self.__class__.__name__}.dict(): "skip_defaults" is deprecated and replaced by "exclude_unset"', DeprecationWarning, ) exclude_unset = skip_defaults return dict( self._iter( to_dict=True, by_alias=by_alias, include=include, exclude=exclude, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, ) ) def json( self, *, include: Optional[Union['AbstractSetIntStr', 'MappingIntStrAny']] = None, exclude: Optional[Union['AbstractSetIntStr', 'MappingIntStrAny']] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any, ) -> str: """ Generate a JSON representation of the model, `include` and `exclude` arguments as per `dict()`. `encoder` is an optional function to supply as `default` to json.dumps(), other arguments as per `json.dumps()`. """ if skip_defaults is not None: warnings.warn( f'{self.__class__.__name__}.json(): "skip_defaults" is deprecated and replaced by "exclude_unset"', DeprecationWarning, ) exclude_unset = skip_defaults encoder = cast(Callable[[Any], Any], encoder or self.__json_encoder__) # We don't directly call `self.dict()`, which does exactly this with `to_dict=True` # because we want to be able to keep raw `BaseModel` instances and not as `dict`. # This allows users to write custom JSON encoders for given `BaseModel` classes. 
data = dict( self._iter( to_dict=models_as_dict, by_alias=by_alias, include=include, exclude=exclude, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, ) ) if self.__custom_root_type__: data = data[ROOT_KEY] return self.__config__.json_dumps(data, default=encoder, **dumps_kwargs) @classmethod def _enforce_dict_if_root(cls, obj: Any) -> Any: if cls.__custom_root_type__ and ( not (isinstance(obj, dict) and obj.keys() == {ROOT_KEY}) and not (isinstance(obj, BaseModel) and obj.__fields__.keys() == {ROOT_KEY}) or cls.__fields__[ROOT_KEY].shape in MAPPING_LIKE_SHAPES ): return {ROOT_KEY: obj} else: return obj @classmethod def parse_obj(cls: Type['Model'], obj: Any) -> 'Model': obj = cls._enforce_dict_if_root(obj) if not isinstance(obj, dict): try: obj = dict(obj) except (TypeError, ValueError) as e: exc = TypeError(f'{cls.__name__} expected dict not {obj.__class__.__name__}') raise ValidationError([ErrorWrapper(exc, loc=ROOT_KEY)], cls) from e return cls(**obj) @classmethod def parse_raw( cls: Type['Model'], b: StrBytes, *, content_type: str = None, encoding: str = 'utf8', proto: Protocol = None, allow_pickle: bool = False, ) -> 'Model': try: obj = load_str_bytes( b, proto=proto, content_type=content_type, encoding=encoding, allow_pickle=allow_pickle, json_loads=cls.__config__.json_loads, ) except (ValueError, TypeError, UnicodeDecodeError) as e: raise ValidationError([ErrorWrapper(e, loc=ROOT_KEY)], cls) return cls.parse_obj(obj) @classmethod def parse_file( cls: Type['Model'], path: Union[str, Path], *, content_type: str = None, encoding: str = 'utf8', proto: Protocol = None, allow_pickle: bool = False, ) -> 'Model': obj = load_file( path, proto=proto, content_type=content_type, encoding=encoding, allow_pickle=allow_pickle, json_loads=cls.__config__.json_loads, ) return cls.parse_obj(obj) @classmethod def from_orm(cls: Type['Model'], obj: Any) -> 'Model': if not cls.__config__.orm_mode: raise ConfigError('You must have the 
config attribute orm_mode=True to use from_orm') obj = {ROOT_KEY: obj} if cls.__custom_root_type__ else cls._decompose_class(obj) m = cls.__new__(cls) values, fields_set, validation_error = validate_model(cls, obj) if validation_error: raise validation_error object_setattr(m, '__dict__', values) object_setattr(m, '__fields_set__', fields_set) m._init_private_attributes() return m @classmethod def construct(cls: Type['Model'], _fields_set: Optional['SetStr'] = None, **values: Any) -> 'Model': """ Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if `Config.extra = 'allow'` was set since it adds all passed values """ m = cls.__new__(cls) fields_values: Dict[str, Any] = {} for name, field in cls.__fields__.items(): if field.alt_alias and field.alias in values: fields_values[name] = values[field.alias] elif name in values: fields_values[name] = values[name] elif not field.required: fields_values[name] = field.get_default() fields_values.update(values) object_setattr(m, '__dict__', fields_values) if _fields_set is None: _fields_set = set(values.keys()) object_setattr(m, '__fields_set__', _fields_set) m._init_private_attributes() return m def _copy_and_set_values(self: 'Model', values: 'DictStrAny', fields_set: 'SetStr', *, deep: bool) -> 'Model': if deep: # chances of having empty dict here are quite low for using smart_deepcopy values = deepcopy(values) cls = self.__class__ m = cls.__new__(cls) object_setattr(m, '__dict__', values) object_setattr(m, '__fields_set__', fields_set) for name in self.__private_attributes__: value = getattr(self, name, Undefined) if value is not Undefined: if deep: value = deepcopy(value) object_setattr(m, name, value) return m def copy( self: 'Model', *, include: Optional[Union['AbstractSetIntStr', 'MappingIntStrAny']] = None, exclude: Optional[Union['AbstractSetIntStr', 'MappingIntStrAny']] = None, update: 
Optional['DictStrAny'] = None, deep: bool = False, ) -> 'Model': """ Duplicate a model, optionally choose which fields to include, exclude and change. :param include: fields to include in new model :param exclude: fields to exclude from new model, as with values this takes precedence over include :param update: values to change/add in the new model. Note: the data is not validated before creating the new model: you should trust this data :param deep: set to `True` to make a deep copy of the model :return: new model instance """ values = dict( self._iter(to_dict=False, by_alias=False, include=include, exclude=exclude, exclude_unset=False), **(update or {}), ) # new `__fields_set__` can have unset optional fields with a set value in `update` kwarg if update: fields_set = self.__fields_set__ | update.keys() else: fields_set = set(self.__fields_set__) return self._copy_and_set_values(values, fields_set, deep=deep) @classmethod def schema(cls, by_alias: bool = True, ref_template: str = default_ref_template) -> 'DictStrAny': cached = cls.__schema_cache__.get((by_alias, ref_template)) if cached is not None: return cached s = model_schema(cls, by_alias=by_alias, ref_template=ref_template) cls.__schema_cache__[(by_alias, ref_template)] = s return s @classmethod def schema_json( cls, *, by_alias: bool = True, ref_template: str = default_ref_template, **dumps_kwargs: Any ) -> str: from pydantic.v1.json import pydantic_encoder return cls.__config__.json_dumps( cls.schema(by_alias=by_alias, ref_template=ref_template), default=pydantic_encoder, **dumps_kwargs ) @classmethod def __get_validators__(cls) -> 'CallableGenerator': yield cls.validate @classmethod def validate(cls: Type['Model'], value: Any) -> 'Model': if isinstance(value, cls): copy_on_model_validation = cls.__config__.copy_on_model_validation # whether to deep or shallow copy the model on validation, None means do not copy deep_copy: Optional[bool] = None if copy_on_model_validation not in {'deep', 'shallow', 
'none'}: # Warn about deprecated behavior warnings.warn( "`copy_on_model_validation` should be a string: 'deep', 'shallow' or 'none'", DeprecationWarning ) if copy_on_model_validation: deep_copy = False if copy_on_model_validation == 'shallow': # shallow copy deep_copy = False elif copy_on_model_validation == 'deep': # deep copy deep_copy = True if deep_copy is None: return value else: return value._copy_and_set_values(value.__dict__, value.__fields_set__, deep=deep_copy) value = cls._enforce_dict_if_root(value) if isinstance(value, dict): return cls(**value) elif cls.__config__.orm_mode: return cls.from_orm(value) else: try: value_as_dict = dict(value) except (TypeError, ValueError) as e: raise DictError() from e return cls(**value_as_dict) @classmethod def _decompose_class(cls: Type['Model'], obj: Any) -> GetterDict: if isinstance(obj, GetterDict): return obj return cls.__config__.getter_dict(obj) @classmethod @no_type_check def _get_value( cls, v: Any, to_dict: bool, by_alias: bool, include: Optional[Union['AbstractSetIntStr', 'MappingIntStrAny']], exclude: Optional[Union['AbstractSetIntStr', 'MappingIntStrAny']], exclude_unset: bool, exclude_defaults: bool, exclude_none: bool, ) -> Any: if isinstance(v, BaseModel): if to_dict: v_dict = v.dict( by_alias=by_alias, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, include=include, exclude=exclude, exclude_none=exclude_none, ) if ROOT_KEY in v_dict: return v_dict[ROOT_KEY] return v_dict else: return v.copy(include=include, exclude=exclude) value_exclude = ValueItems(v, exclude) if exclude else None value_include = ValueItems(v, include) if include else None if isinstance(v, dict): return { k_: cls._get_value( v_, to_dict=to_dict, by_alias=by_alias, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, include=value_include and value_include.for_element(k_), exclude=value_exclude and value_exclude.for_element(k_), exclude_none=exclude_none, ) for k_, v_ in v.items() if (not value_exclude or 
not value_exclude.is_excluded(k_)) and (not value_include or value_include.is_included(k_)) } elif sequence_like(v): seq_args = ( cls._get_value( v_, to_dict=to_dict, by_alias=by_alias, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, include=value_include and value_include.for_element(i), exclude=value_exclude and value_exclude.for_element(i), exclude_none=exclude_none, ) for i, v_ in enumerate(v) if (not value_exclude or not value_exclude.is_excluded(i)) and (not value_include or value_include.is_included(i)) ) return v.__class__(*seq_args) if is_namedtuple(v.__class__) else v.__class__(seq_args) elif isinstance(v, Enum) and getattr(cls.Config, 'use_enum_values', False): return v.value else: return v @classmethod def __try_update_forward_refs__(cls, **localns: Any) -> None: """ Same as update_forward_refs but will not raise exception when forward references are not defined. """ update_model_forward_refs(cls, cls.__fields__.values(), cls.__config__.json_encoders, localns, (NameError,)) @classmethod def update_forward_refs(cls, **localns: Any) -> None: """ Try to update ForwardRefs on fields based on this Model, globalns and localns. """ update_model_forward_refs(cls, cls.__fields__.values(), cls.__config__.json_encoders, localns) def __iter__(self) -> 'TupleGenerator': """ so `dict(model)` works """ yield from self.__dict__.items() def _iter( self, to_dict: bool = False, by_alias: bool = False, include: Optional[Union['AbstractSetIntStr', 'MappingIntStrAny']] = None, exclude: Optional[Union['AbstractSetIntStr', 'MappingIntStrAny']] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, ) -> 'TupleGenerator': # Merge field set excludes with explicit exclude parameter with explicit overriding field set options. # The extra "is not None" guards are not logically necessary but optimizes performance for the simple case. 
if exclude is not None or self.__exclude_fields__ is not None: exclude = ValueItems.merge(self.__exclude_fields__, exclude) if include is not None or self.__include_fields__ is not None: include = ValueItems.merge(self.__include_fields__, include, intersect=True) allowed_keys = self._calculate_keys( include=include, exclude=exclude, exclude_unset=exclude_unset # type: ignore ) if allowed_keys is None and not (to_dict or by_alias or exclude_unset or exclude_defaults or exclude_none): # huge boost for plain _iter() yield from self.__dict__.items() return value_exclude = ValueItems(self, exclude) if exclude is not None else None value_include = ValueItems(self, include) if include is not None else None for field_key, v in self.__dict__.items(): if (allowed_keys is not None and field_key not in allowed_keys) or (exclude_none and v is None): continue if exclude_defaults: model_field = self.__fields__.get(field_key) if not getattr(model_field, 'required', True) and getattr(model_field, 'default', _missing) == v: continue if by_alias and field_key in self.__fields__: dict_key = self.__fields__[field_key].alias else: dict_key = field_key if to_dict or value_include or value_exclude: v = self._get_value( v, to_dict=to_dict, by_alias=by_alias, include=value_include and value_include.for_element(field_key), exclude=value_exclude and value_exclude.for_element(field_key), exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, ) yield dict_key, v def _calculate_keys( self, include: Optional['MappingIntStrAny'], exclude: Optional['MappingIntStrAny'], exclude_unset: bool, update: Optional['DictStrAny'] = None, ) -> Optional[AbstractSet[str]]: if include is None and exclude is None and exclude_unset is False: return None keys: AbstractSet[str] if exclude_unset: keys = self.__fields_set__.copy() else: keys = self.__dict__.keys() if include is not None: keys &= include.keys() if update: keys -= update.keys() if exclude: keys -= {k for k, v in 
exclude.items() if ValueItems.is_true(v)} return keys def __eq__(self, other: Any) -> bool: if isinstance(other, BaseModel): return self.dict() == other.dict() else: return self.dict() == other def __repr_args__(self) -> 'ReprArgs': return [ (k, v) for k, v in self.__dict__.items() if k not in DUNDER_ATTRIBUTES and (k not in self.__fields__ or self.__fields__[k].field_info.repr) ] _is_base_model_class_defined = True @overload def create_model( __model_name: str, *, __config__: Optional[Type[BaseConfig]] = None, __base__: None = None, __module__: str = __name__, __validators__: Dict[str, 'AnyClassMethod'] = None, __cls_kwargs__: Dict[str, Any] = None, **field_definitions: Any, ) -> Type['BaseModel']: ... @overload def create_model( __model_name: str, *, __config__: Optional[Type[BaseConfig]] = None, __base__: Union[Type['Model'], Tuple[Type['Model'], ...]], __module__: str = __name__, __validators__: Dict[str, 'AnyClassMethod'] = None, __cls_kwargs__: Dict[str, Any] = None, **field_definitions: Any, ) -> Type['Model']: ... def create_model( __model_name: str, *, __config__: Optional[Type[BaseConfig]] = None, __base__: Union[None, Type['Model'], Tuple[Type['Model'], ...]] = None, __module__: str = __name__, __validators__: Dict[str, 'AnyClassMethod'] = None, __cls_kwargs__: Dict[str, Any] = None, __slots__: Optional[Tuple[str, ...]] = None, **field_definitions: Any, ) -> Type['Model']: """ Dynamically create a model. :param __model_name: name of the created model :param __config__: config class to use for the new model :param __base__: base class for the new model to inherit from :param __module__: module of the created model :param __validators__: a dict of method names and @validator class methods :param __cls_kwargs__: a dict for class creation :param __slots__: Deprecated, `__slots__` should not be passed to `create_model` :param field_definitions: fields of the model (or extra fields if a base is supplied) in the format `<name>=(<type>, <default value>)` or `<name>=<default value>`, e.g.
`foobar=(str, ...)` or `foobar=123`, or, for complex use-cases, in the format `<name>=<Field>` or `<name>=(<type>, <FieldInfo>)`, e.g. `foo=Field(datetime, default_factory=datetime.utcnow, alias='bar')` or `foo=(str, FieldInfo(title='Foo'))` """ if __slots__ is not None: # __slots__ will be ignored from here on warnings.warn('__slots__ should not be passed to create_model', RuntimeWarning) if __base__ is not None: if __config__ is not None: raise ConfigError('to avoid confusion __config__ and __base__ cannot be used together') if not isinstance(__base__, tuple): __base__ = (__base__,) else: __base__ = (cast(Type['Model'], BaseModel),) __cls_kwargs__ = __cls_kwargs__ or {} fields = {} annotations = {} for f_name, f_def in field_definitions.items(): if not is_valid_field(f_name): warnings.warn(f'fields may not start with an underscore, ignoring "{f_name}"', RuntimeWarning) if isinstance(f_def, tuple): try: f_annotation, f_value = f_def except ValueError as e: raise ConfigError( 'field definitions should either be a tuple of (<type>, <default>) or just a ' 'default value, unfortunately this means tuples as ' 'default values are not allowed' ) from e else: f_annotation, f_value = None, f_def if f_annotation: annotations[f_name] = f_annotation fields[f_name] = f_value namespace: 'DictStrAny' = {'__annotations__': annotations, '__module__': __module__} if __validators__: namespace.update(__validators__) namespace.update(fields) if __config__: namespace['Config'] = inherit_config(__config__, BaseConfig) resolved_bases = resolve_bases(__base__) meta, ns, kwds = prepare_class(__model_name, resolved_bases, kwds=__cls_kwargs__) if resolved_bases is not __base__: ns['__orig_bases__'] = __base__ namespace.update(ns) return meta(__model_name, resolved_bases, namespace, **kwds) _missing = object() def validate_model( # noqa: C901 (ignore complexity) model: Type[BaseModel], input_data: 'DictStrAny', cls: 'ModelOrDc' = None ) -> Tuple['DictStrAny', 'SetStr', Optional[ValidationError]]: """ validate data against a model.
""" values = {} errors = [] # input_data names, possibly alias names_used = set() # field names, never aliases fields_set = set() config = model.__config__ check_extra = config.extra is not Extra.ignore cls_ = cls or model for validator in model.__pre_root_validators__: try: input_data = validator(cls_, input_data) except (ValueError, TypeError, AssertionError) as exc: return {}, set(), ValidationError([ErrorWrapper(exc, loc=ROOT_KEY)], cls_) for name, field in model.__fields__.items(): value = input_data.get(field.alias, _missing) using_name = False if value is _missing and config.allow_population_by_field_name and field.alt_alias: value = input_data.get(field.name, _missing) using_name = True if value is _missing: if field.required: errors.append(ErrorWrapper(MissingError(), loc=field.alias)) continue value = field.get_default() if not config.validate_all and not field.validate_always: values[name] = value continue else: fields_set.add(name) if check_extra: names_used.add(field.name if using_name else field.alias) v_, errors_ = field.validate(value, values, loc=field.alias, cls=cls_) if isinstance(errors_, ErrorWrapper): errors.append(errors_) elif isinstance(errors_, list): errors.extend(errors_) else: values[name] = v_ if check_extra: if isinstance(input_data, GetterDict): extra = input_data.extra_keys() - names_used else: extra = input_data.keys() - names_used if extra: fields_set |= extra if config.extra is Extra.allow: for f in extra: values[f] = input_data[f] else: for f in sorted(extra): errors.append(ErrorWrapper(ExtraError(), loc=f)) for skip_on_failure, validator in model.__post_root_validators__: if skip_on_failure and errors: continue try: values = validator(cls_, values) except (ValueError, TypeError, AssertionError) as exc: errors.append(ErrorWrapper(exc, loc=ROOT_KEY)) if errors: return values, fields_set, ValidationError(errors, cls_) else: return values, fields_set, None 
# pydantic-2.10.6/pydantic/v1/mypy.py
import sys
from configparser import ConfigParser
from typing import Any, Callable, Dict, List, Optional, Set, Tuple, Type as TypingType, Union

from mypy.errorcodes import ErrorCode
from mypy.nodes import (
    ARG_NAMED, ARG_NAMED_OPT, ARG_OPT, ARG_POS, ARG_STAR2, MDEF, Argument, AssignmentStmt, Block, CallExpr,
    ClassDef, Context, Decorator, EllipsisExpr, FuncBase, FuncDef, JsonDict, MemberExpr, NameExpr, PassStmt,
    PlaceholderNode, RefExpr, StrExpr, SymbolNode, SymbolTableNode, TempNode, TypeInfo, TypeVarExpr, Var,
)
from mypy.options import Options
from mypy.plugin import (
    CheckerPluginInterface, ClassDefContext, FunctionContext, MethodContext, Plugin, ReportConfigContext,
    SemanticAnalyzerPluginInterface,
)
from mypy.plugins import dataclasses
from mypy.semanal import set_callable_name  # type: ignore
from mypy.server.trigger import make_wildcard_trigger
from mypy.types import (
    AnyType, CallableType, Instance, NoneType, Overloaded, ProperType, Type, TypeOfAny, TypeType, TypeVarId,
    TypeVarType, UnionType, get_proper_type,
)
from mypy.typevars import fill_typevars
from mypy.util import get_unique_redefinition_name
from mypy.version import __version__ as mypy_version

from pydantic.v1.utils import is_valid_field

try:
    from mypy.types import TypeVarDef  # type: ignore[attr-defined]
except ImportError:  # pragma: no cover
    # Backward-compatible with TypeVarDef from Mypy 0.910.
from mypy.types import TypeVarType as TypeVarDef CONFIGFILE_KEY = 'pydantic-mypy' METADATA_KEY = 'pydantic-mypy-metadata' _NAMESPACE = __name__[:-5] # 'pydantic' in 1.10.X, 'pydantic.v1' in v2.X BASEMODEL_FULLNAME = f'{_NAMESPACE}.main.BaseModel' BASESETTINGS_FULLNAME = f'{_NAMESPACE}.env_settings.BaseSettings' MODEL_METACLASS_FULLNAME = f'{_NAMESPACE}.main.ModelMetaclass' FIELD_FULLNAME = f'{_NAMESPACE}.fields.Field' DATACLASS_FULLNAME = f'{_NAMESPACE}.dataclasses.dataclass' def parse_mypy_version(version: str) -> Tuple[int, ...]: return tuple(map(int, version.partition('+')[0].split('.'))) MYPY_VERSION_TUPLE = parse_mypy_version(mypy_version) BUILTINS_NAME = 'builtins' if MYPY_VERSION_TUPLE >= (0, 930) else '__builtins__' # Increment version if plugin changes and mypy caches should be invalidated __version__ = 2 def plugin(version: str) -> 'TypingType[Plugin]': """ `version` is the mypy version string We might want to use this to print a warning if the mypy version being used is newer, or especially older, than we expect (or need). 
""" return PydanticPlugin class PydanticPlugin(Plugin): def __init__(self, options: Options) -> None: self.plugin_config = PydanticPluginConfig(options) self._plugin_data = self.plugin_config.to_data() super().__init__(options) def get_base_class_hook(self, fullname: str) -> 'Optional[Callable[[ClassDefContext], None]]': sym = self.lookup_fully_qualified(fullname) if sym and isinstance(sym.node, TypeInfo): # pragma: no branch # No branching may occur if the mypy cache has not been cleared if any(get_fullname(base) == BASEMODEL_FULLNAME for base in sym.node.mro): return self._pydantic_model_class_maker_callback return None def get_metaclass_hook(self, fullname: str) -> Optional[Callable[[ClassDefContext], None]]: if fullname == MODEL_METACLASS_FULLNAME: return self._pydantic_model_metaclass_marker_callback return None def get_function_hook(self, fullname: str) -> 'Optional[Callable[[FunctionContext], Type]]': sym = self.lookup_fully_qualified(fullname) if sym and sym.fullname == FIELD_FULLNAME: return self._pydantic_field_callback return None def get_method_hook(self, fullname: str) -> Optional[Callable[[MethodContext], Type]]: if fullname.endswith('.from_orm'): return from_orm_callback return None def get_class_decorator_hook(self, fullname: str) -> Optional[Callable[[ClassDefContext], None]]: """Mark pydantic.dataclasses as dataclass. Mypy version 1.1.1 added support for `@dataclass_transform` decorator. """ if fullname == DATACLASS_FULLNAME and MYPY_VERSION_TUPLE < (1, 1): return dataclasses.dataclass_class_maker_callback # type: ignore[return-value] return None def report_config_data(self, ctx: ReportConfigContext) -> Dict[str, Any]: """Return all plugin config data. Used by mypy to determine if cache needs to be discarded. 
""" return self._plugin_data def _pydantic_model_class_maker_callback(self, ctx: ClassDefContext) -> None: transformer = PydanticModelTransformer(ctx, self.plugin_config) transformer.transform() def _pydantic_model_metaclass_marker_callback(self, ctx: ClassDefContext) -> None: """Reset dataclass_transform_spec attribute of ModelMetaclass. Let the plugin handle it. This behavior can be disabled if 'debug_dataclass_transform' is set to True', for testing purposes. """ if self.plugin_config.debug_dataclass_transform: return info_metaclass = ctx.cls.info.declared_metaclass assert info_metaclass, "callback not passed from 'get_metaclass_hook'" if getattr(info_metaclass.type, 'dataclass_transform_spec', None): info_metaclass.type.dataclass_transform_spec = None # type: ignore[attr-defined] def _pydantic_field_callback(self, ctx: FunctionContext) -> 'Type': """ Extract the type of the `default` argument from the Field function, and use it as the return type. In particular: * Check whether the default and default_factory argument is specified. * Output an error if both are specified. * Retrieve the type of the argument which is specified, and use it as return type for the function. 
""" default_any_type = ctx.default_return_type assert ctx.callee_arg_names[0] == 'default', '"default" is no longer first argument in Field()' assert ctx.callee_arg_names[1] == 'default_factory', '"default_factory" is no longer second argument in Field()' default_args = ctx.args[0] default_factory_args = ctx.args[1] if default_args and default_factory_args: error_default_and_default_factory_specified(ctx.api, ctx.context) return default_any_type if default_args: default_type = ctx.arg_types[0][0] default_arg = default_args[0] # Fallback to default Any type if the field is required if not isinstance(default_arg, EllipsisExpr): return default_type elif default_factory_args: default_factory_type = ctx.arg_types[1][0] # Functions which use `ParamSpec` can be overloaded, exposing the callable's types as a parameter # Pydantic calls the default factory without any argument, so we retrieve the first item if isinstance(default_factory_type, Overloaded): if MYPY_VERSION_TUPLE > (0, 910): default_factory_type = default_factory_type.items[0] else: # Mypy0.910 exposes the items of overloaded types in a function default_factory_type = default_factory_type.items()[0] # type: ignore[operator] if isinstance(default_factory_type, CallableType): ret_type = default_factory_type.ret_type # mypy doesn't think `ret_type` has `args`, you'd think mypy should know, # add this check in case it varies by version args = getattr(ret_type, 'args', None) if args: if all(isinstance(arg, TypeVarType) for arg in args): # Looks like the default factory is a type like `list` or `dict`, replace all args with `Any` ret_type.args = tuple(default_any_type for _ in args) # type: ignore[attr-defined] return ret_type return default_any_type class PydanticPluginConfig: __slots__ = ( 'init_forbid_extra', 'init_typed', 'warn_required_dynamic_aliases', 'warn_untyped_fields', 'debug_dataclass_transform', ) init_forbid_extra: bool init_typed: bool warn_required_dynamic_aliases: bool warn_untyped_fields: bool 
debug_dataclass_transform: bool # undocumented def __init__(self, options: Options) -> None: if options.config_file is None: # pragma: no cover return toml_config = parse_toml(options.config_file) if toml_config is not None: config = toml_config.get('tool', {}).get('pydantic-mypy', {}) for key in self.__slots__: setting = config.get(key, False) if not isinstance(setting, bool): raise ValueError(f'Configuration value must be a boolean for key: {key}') setattr(self, key, setting) else: plugin_config = ConfigParser() plugin_config.read(options.config_file) for key in self.__slots__: setting = plugin_config.getboolean(CONFIGFILE_KEY, key, fallback=False) setattr(self, key, setting) def to_data(self) -> Dict[str, Any]: return {key: getattr(self, key) for key in self.__slots__} def from_orm_callback(ctx: MethodContext) -> Type: """ Raise an error if orm_mode is not enabled """ model_type: Instance ctx_type = ctx.type if isinstance(ctx_type, TypeType): ctx_type = ctx_type.item if isinstance(ctx_type, CallableType) and isinstance(ctx_type.ret_type, Instance): model_type = ctx_type.ret_type # called on the class elif isinstance(ctx_type, Instance): model_type = ctx_type # called on an instance (unusual, but still valid) else: # pragma: no cover detail = f'ctx.type: {ctx_type} (of type {ctx_type.__class__.__name__})' error_unexpected_behavior(detail, ctx.api, ctx.context) return ctx.default_return_type pydantic_metadata = model_type.type.metadata.get(METADATA_KEY) if pydantic_metadata is None: return ctx.default_return_type orm_mode = pydantic_metadata.get('config', {}).get('orm_mode') if orm_mode is not True: error_from_orm(get_name(model_type.type), ctx.api, ctx.context) return ctx.default_return_type class PydanticModelTransformer: tracked_config_fields: Set[str] = { 'extra', 'allow_mutation', 'frozen', 'orm_mode', 'allow_population_by_field_name', 'alias_generator', } def __init__(self, ctx: ClassDefContext, plugin_config: PydanticPluginConfig) -> None: self._ctx = ctx 
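When the mypy config file is not TOML, `PydanticPluginConfig` above falls back to `ConfigParser.getboolean(..., fallback=False)` over a `[pydantic-mypy]` section (the value of `CONFIGFILE_KEY`). A minimal stdlib sketch of that ini branch — the ini content and helper name here are hypothetical, for illustration only:

```python
from configparser import ConfigParser
import os
import tempfile

# Hypothetical mypy.ini content; the section name matches CONFIGFILE_KEY ('pydantic-mypy').
INI = """\
[pydantic-mypy]
init_typed = True
warn_untyped_fields = True
"""

def read_plugin_settings(path: str) -> dict:
    # Mirrors the non-TOML branch: any key missing from the file falls back to False.
    parser = ConfigParser()
    parser.read(path)
    keys = (
        'init_forbid_extra',
        'init_typed',
        'warn_required_dynamic_aliases',
        'warn_untyped_fields',
    )
    return {k: parser.getboolean('pydantic-mypy', k, fallback=False) for k in keys}

with tempfile.NamedTemporaryFile('w', suffix='.ini', delete=False) as f:
    f.write(INI)
    path = f.name
settings = read_plugin_settings(path)
os.unlink(path)
```

Keys set in the file come back as booleans; unset keys default to `False`, matching the plugin's behavior.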
self.plugin_config = plugin_config def transform(self) -> None: """ Configures the BaseModel subclass according to the plugin settings. In particular: * determines the model config and fields, * adds a fields-aware signature for the initializer and construct methods * freezes the class if allow_mutation = False or frozen = True * stores the fields, config, and if the class is settings in the mypy metadata for access by subclasses """ ctx = self._ctx info = ctx.cls.info self.adjust_validator_signatures() config = self.collect_config() fields = self.collect_fields(config) is_settings = any(get_fullname(base) == BASESETTINGS_FULLNAME for base in info.mro[:-1]) self.add_initializer(fields, config, is_settings) self.add_construct_method(fields) self.set_frozen(fields, frozen=config.allow_mutation is False or config.frozen is True) info.metadata[METADATA_KEY] = { 'fields': {field.name: field.serialize() for field in fields}, 'config': config.set_values_dict(), } def adjust_validator_signatures(self) -> None: """When we decorate a function `f` with `pydantic.validator(...), mypy sees `f` as a regular method taking a `self` instance, even though pydantic internally wraps `f` with `classmethod` if necessary. Teach mypy this by marking any function whose outermost decorator is a `validator()` call as a classmethod. """ for name, sym in self._ctx.cls.info.names.items(): if isinstance(sym.node, Decorator): first_dec = sym.node.original_decorators[0] if ( isinstance(first_dec, CallExpr) and isinstance(first_dec.callee, NameExpr) and first_dec.callee.fullname == f'{_NAMESPACE}.class_validators.validator' ): sym.node.func.is_class = True def collect_config(self) -> 'ModelConfigData': """ Collects the values of the config attributes that are used by the plugin, accounting for parent classes. 
""" ctx = self._ctx cls = ctx.cls config = ModelConfigData() for stmt in cls.defs.body: if not isinstance(stmt, ClassDef): continue if stmt.name == 'Config': for substmt in stmt.defs.body: if not isinstance(substmt, AssignmentStmt): continue config.update(self.get_config_update(substmt)) if ( config.has_alias_generator and not config.allow_population_by_field_name and self.plugin_config.warn_required_dynamic_aliases ): error_required_dynamic_aliases(ctx.api, stmt) for info in cls.info.mro[1:]: # 0 is the current class if METADATA_KEY not in info.metadata: continue # Each class depends on the set of fields in its ancestors ctx.api.add_plugin_dependency(make_wildcard_trigger(get_fullname(info))) for name, value in info.metadata[METADATA_KEY]['config'].items(): config.setdefault(name, value) return config def collect_fields(self, model_config: 'ModelConfigData') -> List['PydanticModelField']: """ Collects the fields for the model, accounting for parent classes """ # First, collect fields belonging to the current class. 
ctx = self._ctx cls = self._ctx.cls fields = [] # type: List[PydanticModelField] known_fields = set() # type: Set[str] for stmt in cls.defs.body: if not isinstance(stmt, AssignmentStmt): # `and stmt.new_syntax` to require annotation continue lhs = stmt.lvalues[0] if not isinstance(lhs, NameExpr) or not is_valid_field(lhs.name): continue if not stmt.new_syntax and self.plugin_config.warn_untyped_fields: error_untyped_fields(ctx.api, stmt) # if lhs.name == '__config__': # BaseConfig not well handled; I'm not sure why yet # continue sym = cls.info.names.get(lhs.name) if sym is None: # pragma: no cover # This is likely due to a star import (see the dataclasses plugin for a more detailed explanation) # This is the same logic used in the dataclasses plugin continue node = sym.node if isinstance(node, PlaceholderNode): # pragma: no cover # See the PlaceholderNode docstring for more detail about how this can occur # Basically, it is an edge case when dealing with complex import logic # This is the same logic used in the dataclasses plugin continue if not isinstance(node, Var): # pragma: no cover # Don't know if this edge case still happens with the `is_valid_field` check above # but better safe than sorry continue # x: ClassVar[int] is ignored by dataclasses. 
if node.is_classvar: continue is_required = self.get_is_required(cls, stmt, lhs) alias, has_dynamic_alias = self.get_alias_info(stmt) if ( has_dynamic_alias and not model_config.allow_population_by_field_name and self.plugin_config.warn_required_dynamic_aliases ): error_required_dynamic_aliases(ctx.api, stmt) fields.append( PydanticModelField( name=lhs.name, is_required=is_required, alias=alias, has_dynamic_alias=has_dynamic_alias, line=stmt.line, column=stmt.column, ) ) known_fields.add(lhs.name) all_fields = fields.copy() for info in cls.info.mro[1:]: # 0 is the current class, -2 is BaseModel, -1 is object if METADATA_KEY not in info.metadata: continue superclass_fields = [] # Each class depends on the set of fields in its ancestors ctx.api.add_plugin_dependency(make_wildcard_trigger(get_fullname(info))) for name, data in info.metadata[METADATA_KEY]['fields'].items(): if name not in known_fields: field = PydanticModelField.deserialize(info, data) known_fields.add(name) superclass_fields.append(field) else: (field,) = (a for a in all_fields if a.name == name) all_fields.remove(field) superclass_fields.append(field) all_fields = superclass_fields + all_fields return all_fields def add_initializer(self, fields: List['PydanticModelField'], config: 'ModelConfigData', is_settings: bool) -> None: """ Adds a fields-aware `__init__` method to the class. The added `__init__` will be annotated with types vs. all `Any` depending on the plugin settings. 
""" ctx = self._ctx typed = self.plugin_config.init_typed use_alias = config.allow_population_by_field_name is not True force_all_optional = is_settings or bool( config.has_alias_generator and not config.allow_population_by_field_name ) init_arguments = self.get_field_arguments( fields, typed=typed, force_all_optional=force_all_optional, use_alias=use_alias ) if not self.should_init_forbid_extra(fields, config): var = Var('kwargs') init_arguments.append(Argument(var, AnyType(TypeOfAny.explicit), None, ARG_STAR2)) if '__init__' not in ctx.cls.info.names: add_method(ctx, '__init__', init_arguments, NoneType()) def add_construct_method(self, fields: List['PydanticModelField']) -> None: """ Adds a fully typed `construct` classmethod to the class. Similar to the fields-aware __init__ method, but always uses the field names (not aliases), and does not treat settings fields as optional. """ ctx = self._ctx set_str = ctx.api.named_type(f'{BUILTINS_NAME}.set', [ctx.api.named_type(f'{BUILTINS_NAME}.str')]) optional_set_str = UnionType([set_str, NoneType()]) fields_set_argument = Argument(Var('_fields_set', optional_set_str), optional_set_str, None, ARG_OPT) construct_arguments = self.get_field_arguments(fields, typed=True, force_all_optional=False, use_alias=False) construct_arguments = [fields_set_argument] + construct_arguments obj_type = ctx.api.named_type(f'{BUILTINS_NAME}.object') self_tvar_name = '_PydanticBaseModel' # Make sure it does not conflict with other names in the class tvar_fullname = ctx.cls.fullname + '.' 
+ self_tvar_name if MYPY_VERSION_TUPLE >= (1, 4): tvd = TypeVarType( self_tvar_name, tvar_fullname, ( TypeVarId(-1, namespace=ctx.cls.fullname + '.construct') if MYPY_VERSION_TUPLE >= (1, 11) else TypeVarId(-1) ), [], obj_type, AnyType(TypeOfAny.from_omitted_generics), # type: ignore[arg-type] ) self_tvar_expr = TypeVarExpr( self_tvar_name, tvar_fullname, [], obj_type, AnyType(TypeOfAny.from_omitted_generics), # type: ignore[arg-type] ) else: tvd = TypeVarDef(self_tvar_name, tvar_fullname, -1, [], obj_type) self_tvar_expr = TypeVarExpr(self_tvar_name, tvar_fullname, [], obj_type) ctx.cls.info.names[self_tvar_name] = SymbolTableNode(MDEF, self_tvar_expr) # Backward-compatible with TypeVarDef from Mypy 0.910. if isinstance(tvd, TypeVarType): self_type = tvd else: self_type = TypeVarType(tvd) add_method( ctx, 'construct', construct_arguments, return_type=self_type, self_type=self_type, tvar_def=tvd, is_classmethod=True, ) def set_frozen(self, fields: List['PydanticModelField'], frozen: bool) -> None: """ Marks all fields as properties so that attempts to set them trigger mypy errors. This is the same approach used by the attrs and dataclasses plugins. """ ctx = self._ctx info = ctx.cls.info for field in fields: sym_node = info.names.get(field.name) if sym_node is not None: var = sym_node.node if isinstance(var, Var): var.is_property = frozen elif isinstance(var, PlaceholderNode) and not ctx.api.final_iteration: # See https://github.com/pydantic/pydantic/issues/5191 to hit this branch for test coverage ctx.api.defer() else: # pragma: no cover # I don't know whether it's possible to hit this branch, but I've added it for safety try: var_str = str(var) except TypeError: # This happens for PlaceholderNode; perhaps it will happen for other types in the future.. 
var_str = repr(var) detail = f'sym_node.node: {var_str} (of type {var.__class__})' error_unexpected_behavior(detail, ctx.api, ctx.cls) else: var = field.to_var(info, use_alias=False) var.info = info var.is_property = frozen var._fullname = get_fullname(info) + '.' + get_name(var) info.names[get_name(var)] = SymbolTableNode(MDEF, var) def get_config_update(self, substmt: AssignmentStmt) -> Optional['ModelConfigData']: """ Determines the config update due to a single statement in the Config class definition. Warns if a tracked config attribute is set to a value the plugin doesn't know how to interpret (e.g., an int) """ lhs = substmt.lvalues[0] if not (isinstance(lhs, NameExpr) and lhs.name in self.tracked_config_fields): return None if lhs.name == 'extra': if isinstance(substmt.rvalue, StrExpr): forbid_extra = substmt.rvalue.value == 'forbid' elif isinstance(substmt.rvalue, MemberExpr): forbid_extra = substmt.rvalue.name == 'forbid' else: error_invalid_config_value(lhs.name, self._ctx.api, substmt) return None return ModelConfigData(forbid_extra=forbid_extra) if lhs.name == 'alias_generator': has_alias_generator = True if isinstance(substmt.rvalue, NameExpr) and substmt.rvalue.fullname == 'builtins.None': has_alias_generator = False return ModelConfigData(has_alias_generator=has_alias_generator) if isinstance(substmt.rvalue, NameExpr) and substmt.rvalue.fullname in ('builtins.True', 'builtins.False'): return ModelConfigData(**{lhs.name: substmt.rvalue.fullname == 'builtins.True'}) error_invalid_config_value(lhs.name, self._ctx.api, substmt) return None @staticmethod def get_is_required(cls: ClassDef, stmt: AssignmentStmt, lhs: NameExpr) -> bool: """ Returns a boolean indicating whether the field defined in `stmt` is a required field. 
""" expr = stmt.rvalue if isinstance(expr, TempNode): # TempNode means annotation-only, so only non-required if Optional value_type = get_proper_type(cls.info[lhs.name].type) return not PydanticModelTransformer.type_has_implicit_default(value_type) if isinstance(expr, CallExpr) and isinstance(expr.callee, RefExpr) and expr.callee.fullname == FIELD_FULLNAME: # The "default value" is a call to `Field`; at this point, the field is # only required if default is Ellipsis (i.e., `field_name: Annotation = Field(...)`) or if default_factory # is specified. for arg, name in zip(expr.args, expr.arg_names): # If name is None, then this arg is the default because it is the only positional argument. if name is None or name == 'default': return arg.__class__ is EllipsisExpr if name == 'default_factory': return False # In this case, default and default_factory are not specified, so we need to look at the annotation value_type = get_proper_type(cls.info[lhs.name].type) return not PydanticModelTransformer.type_has_implicit_default(value_type) # Only required if the "default value" is Ellipsis (i.e., `field_name: Annotation = ...`) return isinstance(expr, EllipsisExpr) @staticmethod def type_has_implicit_default(type_: Optional[ProperType]) -> bool: """ Returns True if the passed type will be given an implicit default value. In pydantic v1, this is the case for Optional types and Any (with default value None). """ if isinstance(type_, AnyType): # Annotated as Any return True if isinstance(type_, UnionType) and any( isinstance(item, NoneType) or isinstance(item, AnyType) for item in type_.items ): # Annotated as Optional, or otherwise having NoneType or AnyType in the union return True return False @staticmethod def get_alias_info(stmt: AssignmentStmt) -> Tuple[Optional[str], bool]: """ Returns a pair (alias, has_dynamic_alias), extracted from the declaration of the field defined in `stmt`. `has_dynamic_alias` is True if and only if an alias is provided, but not as a string literal. 
If `has_dynamic_alias` is True, `alias` will be None. """ expr = stmt.rvalue if isinstance(expr, TempNode): # TempNode means annotation-only return None, False if not ( isinstance(expr, CallExpr) and isinstance(expr.callee, RefExpr) and expr.callee.fullname == FIELD_FULLNAME ): # Assigned value is not a call to pydantic.fields.Field return None, False for i, arg_name in enumerate(expr.arg_names): if arg_name != 'alias': continue arg = expr.args[i] if isinstance(arg, StrExpr): return arg.value, False else: return None, True return None, False def get_field_arguments( self, fields: List['PydanticModelField'], typed: bool, force_all_optional: bool, use_alias: bool ) -> List[Argument]: """ Helper function used during the construction of the `__init__` and `construct` method signatures. Returns a list of mypy Argument instances for use in the generated signatures. """ info = self._ctx.cls.info arguments = [ field.to_argument(info, typed=typed, force_optional=force_all_optional, use_alias=use_alias) for field in fields if not (use_alias and field.has_dynamic_alias) ] return arguments def should_init_forbid_extra(self, fields: List['PydanticModelField'], config: 'ModelConfigData') -> bool: """ Indicates whether the generated `__init__` should get a `**kwargs` at the end of its signature We disallow arbitrary kwargs if the extra config setting is "forbid", or if the plugin config says to, *unless* a required dynamic alias is present (since then we can't determine a valid signature). """ if not config.allow_population_by_field_name: if self.is_dynamic_alias_present(fields, bool(config.has_alias_generator)): return False if config.forbid_extra: return True return self.plugin_config.init_forbid_extra @staticmethod def is_dynamic_alias_present(fields: List['PydanticModelField'], has_alias_generator: bool) -> bool: """ Returns whether any fields on the model have a "dynamic alias", i.e., an alias that cannot be determined during static analysis. 
""" for field in fields: if field.has_dynamic_alias: return True if has_alias_generator: for field in fields: if field.alias is None: return True return False class PydanticModelField: def __init__( self, name: str, is_required: bool, alias: Optional[str], has_dynamic_alias: bool, line: int, column: int ): self.name = name self.is_required = is_required self.alias = alias self.has_dynamic_alias = has_dynamic_alias self.line = line self.column = column def to_var(self, info: TypeInfo, use_alias: bool) -> Var: name = self.name if use_alias and self.alias is not None: name = self.alias return Var(name, info[self.name].type) def to_argument(self, info: TypeInfo, typed: bool, force_optional: bool, use_alias: bool) -> Argument: if typed and info[self.name].type is not None: type_annotation = info[self.name].type else: type_annotation = AnyType(TypeOfAny.explicit) return Argument( variable=self.to_var(info, use_alias), type_annotation=type_annotation, initializer=None, kind=ARG_NAMED_OPT if force_optional or not self.is_required else ARG_NAMED, ) def serialize(self) -> JsonDict: return self.__dict__ @classmethod def deserialize(cls, info: TypeInfo, data: JsonDict) -> 'PydanticModelField': return cls(**data) class ModelConfigData: def __init__( self, forbid_extra: Optional[bool] = None, allow_mutation: Optional[bool] = None, frozen: Optional[bool] = None, orm_mode: Optional[bool] = None, allow_population_by_field_name: Optional[bool] = None, has_alias_generator: Optional[bool] = None, ): self.forbid_extra = forbid_extra self.allow_mutation = allow_mutation self.frozen = frozen self.orm_mode = orm_mode self.allow_population_by_field_name = allow_population_by_field_name self.has_alias_generator = has_alias_generator def set_values_dict(self) -> Dict[str, Any]: return {k: v for k, v in self.__dict__.items() if v is not None} def update(self, config: Optional['ModelConfigData']) -> None: if config is None: return for k, v in config.set_values_dict().items(): setattr(self, k, 
v) def setdefault(self, key: str, value: Any) -> None: if getattr(self, key) is None: setattr(self, key, value) ERROR_ORM = ErrorCode('pydantic-orm', 'Invalid from_orm call', 'Pydantic') ERROR_CONFIG = ErrorCode('pydantic-config', 'Invalid config value', 'Pydantic') ERROR_ALIAS = ErrorCode('pydantic-alias', 'Dynamic alias disallowed', 'Pydantic') ERROR_UNEXPECTED = ErrorCode('pydantic-unexpected', 'Unexpected behavior', 'Pydantic') ERROR_UNTYPED = ErrorCode('pydantic-field', 'Untyped field disallowed', 'Pydantic') ERROR_FIELD_DEFAULTS = ErrorCode('pydantic-field', 'Invalid Field defaults', 'Pydantic') def error_from_orm(model_name: str, api: CheckerPluginInterface, context: Context) -> None: api.fail(f'"{model_name}" does not have orm_mode=True', context, code=ERROR_ORM) def error_invalid_config_value(name: str, api: SemanticAnalyzerPluginInterface, context: Context) -> None: api.fail(f'Invalid value for "Config.{name}"', context, code=ERROR_CONFIG) def error_required_dynamic_aliases(api: SemanticAnalyzerPluginInterface, context: Context) -> None: api.fail('Required dynamic aliases disallowed', context, code=ERROR_ALIAS) def error_unexpected_behavior( detail: str, api: Union[CheckerPluginInterface, SemanticAnalyzerPluginInterface], context: Context ) -> None: # pragma: no cover # Can't think of a good way to test this, but I confirmed it renders as desired by adding to a non-error path link = 'https://github.com/pydantic/pydantic/issues/new/choose' full_message = f'The pydantic mypy plugin ran into unexpected behavior: {detail}\n' full_message += f'Please consider reporting this bug at {link} so we can try to fix it!' 
api.fail(full_message, context, code=ERROR_UNEXPECTED) def error_untyped_fields(api: SemanticAnalyzerPluginInterface, context: Context) -> None: api.fail('Untyped fields disallowed', context, code=ERROR_UNTYPED) def error_default_and_default_factory_specified(api: CheckerPluginInterface, context: Context) -> None: api.fail('Field default and default_factory cannot be specified together', context, code=ERROR_FIELD_DEFAULTS) def add_method( ctx: ClassDefContext, name: str, args: List[Argument], return_type: Type, self_type: Optional[Type] = None, tvar_def: Optional[TypeVarDef] = None, is_classmethod: bool = False, is_new: bool = False, # is_staticmethod: bool = False, ) -> None: """ Adds a new method to a class. This can be dropped if/when https://github.com/python/mypy/issues/7301 is merged """ info = ctx.cls.info # First remove any previously generated methods with the same name # to avoid clashes and problems in the semantic analyzer. if name in info.names: sym = info.names[name] if sym.plugin_generated and isinstance(sym.node, FuncDef): ctx.cls.defs.body.remove(sym.node) # pragma: no cover self_type = self_type or fill_typevars(info) if is_classmethod or is_new: first = [Argument(Var('_cls'), TypeType.make_normalized(self_type), None, ARG_POS)] # elif is_staticmethod: # first = [] else: self_type = self_type or fill_typevars(info) first = [Argument(Var('__pydantic_self__'), self_type, None, ARG_POS)] args = first + args arg_types, arg_names, arg_kinds = [], [], [] for arg in args: assert arg.type_annotation, 'All arguments must be fully typed.' 
arg_types.append(arg.type_annotation) arg_names.append(get_name(arg.variable)) arg_kinds.append(arg.kind) function_type = ctx.api.named_type(f'{BUILTINS_NAME}.function') signature = CallableType(arg_types, arg_kinds, arg_names, return_type, function_type) if tvar_def: signature.variables = [tvar_def] func = FuncDef(name, args, Block([PassStmt()])) func.info = info func.type = set_callable_name(signature, func) func.is_class = is_classmethod # func.is_static = is_staticmethod func._fullname = get_fullname(info) + '.' + name func.line = info.line # NOTE: we would like the plugin generated node to dominate, but we still # need to keep any existing definitions so they get semantically analyzed. if name in info.names: # Get a nice unique name instead. r_name = get_unique_redefinition_name(name, info.names) info.names[r_name] = info.names[name] if is_classmethod: # or is_staticmethod: func.is_decorated = True v = Var(name, func.type) v.info = info v._fullname = func._fullname # if is_classmethod: v.is_classmethod = True dec = Decorator(func, [NameExpr('classmethod')], v) # else: # v.is_staticmethod = True # dec = Decorator(func, [NameExpr('staticmethod')], v) dec.line = info.line sym = SymbolTableNode(MDEF, dec) else: sym = SymbolTableNode(MDEF, func) sym.plugin_generated = True info.names[name] = sym info.defn.defs.body.append(func) def get_fullname(x: Union[FuncBase, SymbolNode]) -> str: """ Used for compatibility with mypy 0.740; can be dropped once support for 0.740 is dropped. """ fn = x.fullname if callable(fn): # pragma: no cover return fn() return fn def get_name(x: Union[FuncBase, SymbolNode]) -> str: """ Used for compatibility with mypy 0.740; can be dropped once support for 0.740 is dropped. 
""" fn = x.name if callable(fn): # pragma: no cover return fn() return fn def parse_toml(config_file: str) -> Optional[Dict[str, Any]]: if not config_file.endswith('.toml'): return None read_mode = 'rb' if sys.version_info >= (3, 11): import tomllib as toml_ else: try: import tomli as toml_ except ImportError: # older versions of mypy have toml as a dependency, not tomli read_mode = 'r' try: import toml as toml_ # type: ignore[no-redef] except ImportError: # pragma: no cover import warnings warnings.warn('No TOML parser installed, cannot read configuration from `pyproject.toml`.') return None with open(config_file, read_mode) as rf: return toml_.load(rf) # type: ignore[arg-type] pydantic-2.10.6/pydantic/v1/networks.py000066400000000000000000000531541474456633400200320ustar00rootroot00000000000000import re from ipaddress import ( IPv4Address, IPv4Interface, IPv4Network, IPv6Address, IPv6Interface, IPv6Network, _BaseAddress, _BaseNetwork, ) from typing import ( TYPE_CHECKING, Any, Collection, Dict, Generator, List, Match, Optional, Pattern, Set, Tuple, Type, Union, cast, no_type_check, ) from pydantic.v1 import errors from pydantic.v1.utils import Representation, update_not_none from pydantic.v1.validators import constr_length_validator, str_validator if TYPE_CHECKING: import email_validator from typing_extensions import TypedDict from pydantic.v1.config import BaseConfig from pydantic.v1.fields import ModelField from pydantic.v1.typing import AnyCallable CallableGenerator = Generator[AnyCallable, None, None] class Parts(TypedDict, total=False): scheme: str user: Optional[str] password: Optional[str] ipv4: Optional[str] ipv6: Optional[str] domain: Optional[str] port: Optional[str] path: Optional[str] query: Optional[str] fragment: Optional[str] class HostParts(TypedDict, total=False): host: str tld: Optional[str] host_type: Optional[str] port: Optional[str] rebuild: bool else: email_validator = None class Parts(dict): pass NetworkType = Union[str, bytes, int, 
Tuple[Union[str, bytes, int], Union[str, int]]]

__all__ = [
    'AnyUrl',
    'AnyHttpUrl',
    'FileUrl',
    'HttpUrl',
    'stricturl',
    'EmailStr',
    'NameEmail',
    'IPvAnyAddress',
    'IPvAnyInterface',
    'IPvAnyNetwork',
    'PostgresDsn',
    'CockroachDsn',
    'AmqpDsn',
    'RedisDsn',
    'MongoDsn',
    'KafkaDsn',
    'validate_email',
]

_url_regex_cache = None
_multi_host_url_regex_cache = None
_ascii_domain_regex_cache = None
_int_domain_regex_cache = None
_host_regex_cache = None

_host_regex = (
    r'(?:'
    r'(?P<ipv4>(?:\d{1,3}\.){3}\d{1,3})(?=$|[/:#?])|'  # ipv4
    r'(?P<ipv6>\[[A-F0-9]*:[A-F0-9:]+\])(?=$|[/:#?])|'  # ipv6
    r'(?P<domain>[^\s/:?#]+)'  # domain, validation occurs later
    r')?'
    r'(?::(?P<port>\d+))?'  # port
)
_scheme_regex = r'(?:(?P<scheme>[a-z][a-z0-9+\-.]+)://)?'  # scheme https://tools.ietf.org/html/rfc3986#appendix-A
_user_info_regex = r'(?:(?P<user>[^\s:/]*)(?::(?P<password>[^\s/]*))?@)?'
_path_regex = r'(?P<path>/[^\s?#]*)?'
_query_regex = r'(?:\?(?P<query>[^\s#]*))?'
_fragment_regex = r'(?:#(?P<fragment>[^\s#]*))?'


def url_regex() -> Pattern[str]:
    global _url_regex_cache
    if _url_regex_cache is None:
        _url_regex_cache = re.compile(
            rf'{_scheme_regex}{_user_info_regex}{_host_regex}{_path_regex}{_query_regex}{_fragment_regex}',
            re.IGNORECASE,
        )
    return _url_regex_cache


def multi_host_url_regex() -> Pattern[str]:
    """
    Compiled multi host url regex.

    Additionally to `url_regex` it allows to match multiple hosts.
    E.g. host1.db.net,host2.db.net
    """
    global _multi_host_url_regex_cache
    if _multi_host_url_regex_cache is None:
        _multi_host_url_regex_cache = re.compile(
            rf'{_scheme_regex}{_user_info_regex}'
            r'(?P<hosts>([^/]*))'  # validation occurs later
            rf'{_path_regex}{_query_regex}{_fragment_regex}',
            re.IGNORECASE,
        )
    return _multi_host_url_regex_cache


def ascii_domain_regex() -> Pattern[str]:
    global _ascii_domain_regex_cache
    if _ascii_domain_regex_cache is None:
        ascii_chunk = r'[_0-9a-z](?:[-_0-9a-z]{0,61}[_0-9a-z])?'
        ascii_domain_ending = r'(?P<tld>\.[a-z]{2,63})?\.?'
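A trimmed, self-contained sketch of how fragments like the ones above compose into a single URL pattern (named groups written out explicitly; this is an illustration, not the exact pydantic regex):

```python
import re

# Simplified fragments modeled on the pieces assembled in networks.py.
SCHEME = r'(?:(?P<scheme>[a-z][a-z0-9+\-.]+)://)?'
USERINFO = r'(?:(?P<user>[^\s:/]*)(?::(?P<password>[^\s/]*))?@)?'
HOST = (
    r'(?:(?P<ipv4>(?:\d{1,3}\.){3}\d{1,3})(?=$|[/:#?])|'
    r'(?P<ipv6>\[[A-F0-9]*:[A-F0-9:]+\])(?=$|[/:#?])|'
    r'(?P<domain>[^\s/:?#]+))?'
    r'(?::(?P<port>\d+))?'
)
PATH = r'(?P<path>/[^\s?#]*)?'
QUERY = r'(?:\?(?P<query>[^\s#]*))?'
FRAGMENT = r'(?:#(?P<fragment>[^\s#]*))?'

URL_RE = re.compile(SCHEME + USERINFO + HOST + PATH + QUERY + FRAGMENT, re.IGNORECASE)

# Each named group lands in the match's groupdict(), which is what
# feeds the `Parts` mapping during validation.
m = URL_RE.match('https://user:pw@example.com:8080/a/b?x=1#top')
parts = m.groupdict()
```

Every component is optional in the pattern itself; pydantic enforces requirements (scheme present, port in range, etc.) afterwards in `validate_parts`.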
        _ascii_domain_regex_cache = re.compile(
            fr'(?:{ascii_chunk}\.)*?{ascii_chunk}{ascii_domain_ending}', re.IGNORECASE
        )
    return _ascii_domain_regex_cache


def int_domain_regex() -> Pattern[str]:
    global _int_domain_regex_cache
    if _int_domain_regex_cache is None:
        int_chunk = r'[_0-9a-\U00040000](?:[-_0-9a-\U00040000]{0,61}[_0-9a-\U00040000])?'
        int_domain_ending = r'(?P<tld>(\.[^\W\d_]{2,63})|(\.(?:xn--)[_0-9a-z-]{2,63}))?\.?'
        _int_domain_regex_cache = re.compile(fr'(?:{int_chunk}\.)*?{int_chunk}{int_domain_ending}', re.IGNORECASE)
    return _int_domain_regex_cache


def host_regex() -> Pattern[str]:
    global _host_regex_cache
    if _host_regex_cache is None:
        _host_regex_cache = re.compile(
            _host_regex,
            re.IGNORECASE,
        )
    return _host_regex_cache


class AnyUrl(str):
    strip_whitespace = True
    min_length = 1
    max_length = 2**16
    allowed_schemes: Optional[Collection[str]] = None
    tld_required: bool = False
    user_required: bool = False
    host_required: bool = True
    hidden_parts: Set[str] = set()

    __slots__ = ('scheme', 'user', 'password', 'host', 'tld', 'host_type', 'port', 'path', 'query', 'fragment')

    @no_type_check
    def __new__(cls, url: Optional[str], **kwargs) -> object:
        return str.__new__(cls, cls.build(**kwargs) if url is None else url)

    def __init__(
        self,
        url: str,
        *,
        scheme: str,
        user: Optional[str] = None,
        password: Optional[str] = None,
        host: Optional[str] = None,
        tld: Optional[str] = None,
        host_type: str = 'domain',
        port: Optional[str] = None,
        path: Optional[str] = None,
        query: Optional[str] = None,
        fragment: Optional[str] = None,
    ) -> None:
        str.__init__(url)
        self.scheme = scheme
        self.user = user
        self.password = password
        self.host = host
        self.tld = tld
        self.host_type = host_type
        self.port = port
        self.path = path
        self.query = query
        self.fragment = fragment

    @classmethod
    def build(
        cls,
        *,
        scheme: str,
        user: Optional[str] = None,
        password: Optional[str] = None,
        host: str,
        port: Optional[str] = None,
        path: Optional[str] = None,
        query: Optional[str] = None,
        fragment: Optional[str] = None,
        **_kwargs:
str, ) -> str: parts = Parts( scheme=scheme, user=user, password=password, host=host, port=port, path=path, query=query, fragment=fragment, **_kwargs, # type: ignore[misc] ) url = scheme + '://' if user: url += user if password: url += ':' + password if user or password: url += '@' url += host if port and ('port' not in cls.hidden_parts or cls.get_default_parts(parts).get('port') != port): url += ':' + port if path: url += path if query: url += '?' + query if fragment: url += '#' + fragment return url @classmethod def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None: update_not_none(field_schema, minLength=cls.min_length, maxLength=cls.max_length, format='uri') @classmethod def __get_validators__(cls) -> 'CallableGenerator': yield cls.validate @classmethod def validate(cls, value: Any, field: 'ModelField', config: 'BaseConfig') -> 'AnyUrl': if value.__class__ == cls: return value value = str_validator(value) if cls.strip_whitespace: value = value.strip() url: str = cast(str, constr_length_validator(value, field, config)) m = cls._match_url(url) # the regex should always match, if it doesn't please report with details of the URL tried assert m, 'URL regex failed unexpectedly' original_parts = cast('Parts', m.groupdict()) parts = cls.apply_default_parts(original_parts) parts = cls.validate_parts(parts) if m.end() != len(url): raise errors.UrlExtraError(extra=url[m.end() :]) return cls._build_url(m, url, parts) @classmethod def _build_url(cls, m: Match[str], url: str, parts: 'Parts') -> 'AnyUrl': """ Validate hosts and build the AnyUrl object. Split from `validate` so this method can be altered in `MultiHostDsn`. 
""" host, tld, host_type, rebuild = cls.validate_host(parts) return cls( None if rebuild else url, scheme=parts['scheme'], user=parts['user'], password=parts['password'], host=host, tld=tld, host_type=host_type, port=parts['port'], path=parts['path'], query=parts['query'], fragment=parts['fragment'], ) @staticmethod def _match_url(url: str) -> Optional[Match[str]]: return url_regex().match(url) @staticmethod def _validate_port(port: Optional[str]) -> None: if port is not None and int(port) > 65_535: raise errors.UrlPortError() @classmethod def validate_parts(cls, parts: 'Parts', validate_port: bool = True) -> 'Parts': """ A method used to validate parts of a URL. Could be overridden to set default values for parts if missing """ scheme = parts['scheme'] if scheme is None: raise errors.UrlSchemeError() if cls.allowed_schemes and scheme.lower() not in cls.allowed_schemes: raise errors.UrlSchemePermittedError(set(cls.allowed_schemes)) if validate_port: cls._validate_port(parts['port']) user = parts['user'] if cls.user_required and user is None: raise errors.UrlUserInfoError() return parts @classmethod def validate_host(cls, parts: 'Parts') -> Tuple[str, Optional[str], str, bool]: tld, host_type, rebuild = None, None, False for f in ('domain', 'ipv4', 'ipv6'): host = parts[f] # type: ignore[literal-required] if host: host_type = f break if host is None: if cls.host_required: raise errors.UrlHostError() elif host_type == 'domain': is_international = False d = ascii_domain_regex().fullmatch(host) if d is None: d = int_domain_regex().fullmatch(host) if d is None: raise errors.UrlHostError() is_international = True tld = d.group('tld') if tld is None and not is_international: d = int_domain_regex().fullmatch(host) assert d is not None tld = d.group('tld') is_international = True if tld is not None: tld = tld[1:] elif cls.tld_required: raise errors.UrlHostTldError() if is_international: host_type = 'int_domain' rebuild = True host = host.encode('idna').decode('ascii') if 
tld is not None: tld = tld.encode('idna').decode('ascii') return host, tld, host_type, rebuild # type: ignore @staticmethod def get_default_parts(parts: 'Parts') -> 'Parts': return {} @classmethod def apply_default_parts(cls, parts: 'Parts') -> 'Parts': for key, value in cls.get_default_parts(parts).items(): if not parts[key]: # type: ignore[literal-required] parts[key] = value # type: ignore[literal-required] return parts def __repr__(self) -> str: extra = ', '.join(f'{n}={getattr(self, n)!r}' for n in self.__slots__ if getattr(self, n) is not None) return f'{self.__class__.__name__}({super().__repr__()}, {extra})' class AnyHttpUrl(AnyUrl): allowed_schemes = {'http', 'https'} __slots__ = () class HttpUrl(AnyHttpUrl): tld_required = True # https://stackoverflow.com/questions/417142/what-is-the-maximum-length-of-a-url-in-different-browsers max_length = 2083 hidden_parts = {'port'} @staticmethod def get_default_parts(parts: 'Parts') -> 'Parts': return {'port': '80' if parts['scheme'] == 'http' else '443'} class FileUrl(AnyUrl): allowed_schemes = {'file'} host_required = False __slots__ = () class MultiHostDsn(AnyUrl): __slots__ = AnyUrl.__slots__ + ('hosts',) def __init__(self, *args: Any, hosts: Optional[List['HostParts']] = None, **kwargs: Any): super().__init__(*args, **kwargs) self.hosts = hosts @staticmethod def _match_url(url: str) -> Optional[Match[str]]: return multi_host_url_regex().match(url) @classmethod def validate_parts(cls, parts: 'Parts', validate_port: bool = True) -> 'Parts': return super().validate_parts(parts, validate_port=False) @classmethod def _build_url(cls, m: Match[str], url: str, parts: 'Parts') -> 'MultiHostDsn': hosts_parts: List['HostParts'] = [] host_re = host_regex() for host in m.groupdict()['hosts'].split(','): d: Parts = host_re.match(host).groupdict() # type: ignore host, tld, host_type, rebuild = cls.validate_host(d) port = d.get('port') cls._validate_port(port) hosts_parts.append( { 'host': host, 'host_type': host_type, 'tld': 
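`HttpUrl` above declares `hidden_parts = {'port'}` together with scheme-dependent default ports ('80' for http, '443' for https), so a default port is omitted when the URL string is rebuilt. A minimal sketch of that rule (stdlib only; `DEFAULT_PORTS` and `render_host_port` are illustrative names, not pydantic API):

```python
from typing import Optional

# Illustrative sketch of the hidden-default-port rule: only render the port
# when it is present and differs from the scheme's default.
DEFAULT_PORTS = {'http': '80', 'https': '443'}

def render_host_port(scheme: str, host: str, port: Optional[str]) -> str:
    if port and DEFAULT_PORTS.get(scheme) != port:
        return f'{scheme}://{host}:{port}'
    return f'{scheme}://{host}'
```

This mirrors the `'port' not in cls.hidden_parts or cls.get_default_parts(parts).get('port') != port` check in `_build_url` above.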
tld, 'rebuild': rebuild, 'port': port, } ) if len(hosts_parts) > 1: return cls( None if any([hp['rebuild'] for hp in hosts_parts]) else url, scheme=parts['scheme'], user=parts['user'], password=parts['password'], path=parts['path'], query=parts['query'], fragment=parts['fragment'], host_type=None, hosts=hosts_parts, ) else: # backwards compatibility with single host host_part = hosts_parts[0] return cls( None if host_part['rebuild'] else url, scheme=parts['scheme'], user=parts['user'], password=parts['password'], host=host_part['host'], tld=host_part['tld'], host_type=host_part['host_type'], port=host_part.get('port'), path=parts['path'], query=parts['query'], fragment=parts['fragment'], ) class PostgresDsn(MultiHostDsn): allowed_schemes = { 'postgres', 'postgresql', 'postgresql+asyncpg', 'postgresql+pg8000', 'postgresql+psycopg', 'postgresql+psycopg2', 'postgresql+psycopg2cffi', 'postgresql+py-postgresql', 'postgresql+pygresql', } user_required = True __slots__ = () class CockroachDsn(AnyUrl): allowed_schemes = { 'cockroachdb', 'cockroachdb+psycopg2', 'cockroachdb+asyncpg', } user_required = True class AmqpDsn(AnyUrl): allowed_schemes = {'amqp', 'amqps'} host_required = False class RedisDsn(AnyUrl): __slots__ = () allowed_schemes = {'redis', 'rediss'} host_required = False @staticmethod def get_default_parts(parts: 'Parts') -> 'Parts': return { 'domain': 'localhost' if not (parts['ipv4'] or parts['ipv6']) else '', 'port': '6379', 'path': '/0', } class MongoDsn(AnyUrl): allowed_schemes = {'mongodb'} # TODO: Needed to generic "Parts" for "Replica Set", "Sharded Cluster", and other mongodb deployment modes @staticmethod def get_default_parts(parts: 'Parts') -> 'Parts': return { 'port': '27017', } class KafkaDsn(AnyUrl): allowed_schemes = {'kafka'} @staticmethod def get_default_parts(parts: 'Parts') -> 'Parts': return { 'domain': 'localhost', 'port': '9092', } def stricturl( *, strip_whitespace: bool = True, min_length: int = 1, max_length: int = 2**16, tld_required: 
bool = True, host_required: bool = True, allowed_schemes: Optional[Collection[str]] = None, ) -> Type[AnyUrl]: # use kwargs then define conf in a dict to aid with IDE type hinting namespace = dict( strip_whitespace=strip_whitespace, min_length=min_length, max_length=max_length, tld_required=tld_required, host_required=host_required, allowed_schemes=allowed_schemes, ) return type('UrlValue', (AnyUrl,), namespace) def import_email_validator() -> None: global email_validator try: import email_validator except ImportError as e: raise ImportError('email-validator is not installed, run `pip install pydantic[email]`') from e class EmailStr(str): @classmethod def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None: field_schema.update(type='string', format='email') @classmethod def __get_validators__(cls) -> 'CallableGenerator': # included here and below so the error happens straight away import_email_validator() yield str_validator yield cls.validate @classmethod def validate(cls, value: Union[str]) -> str: return validate_email(value)[1] class NameEmail(Representation): __slots__ = 'name', 'email' def __init__(self, name: str, email: str): self.name = name self.email = email def __eq__(self, other: Any) -> bool: return isinstance(other, NameEmail) and (self.name, self.email) == (other.name, other.email) @classmethod def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None: field_schema.update(type='string', format='name-email') @classmethod def __get_validators__(cls) -> 'CallableGenerator': import_email_validator() yield cls.validate @classmethod def validate(cls, value: Any) -> 'NameEmail': if value.__class__ == cls: return value value = str_validator(value) return cls(*validate_email(value)) def __str__(self) -> str: return f'{self.name} <{self.email}>' class IPvAnyAddress(_BaseAddress): __slots__ = () @classmethod def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None: field_schema.update(type='string', format='ipvanyaddress') @classmethod 
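`stricturl()` above builds its constrained URL type at runtime with `type(name, bases, namespace)`, collecting the keyword arguments into a namespace dict first. The same pattern in isolation (the class names here are illustrative, not pydantic's):

```python
# Illustrative base class standing in for AnyUrl; only the class-attribute
# pattern matters for this sketch.
class BaseValue:
    max_length = 2 ** 16
    allowed_schemes = None

def make_strict_type(*, max_length: int = 2 ** 16, allowed_schemes=None):
    # use kwargs then define conf in a dict, as stricturl() does, so IDEs
    # can type-hint the keyword arguments
    namespace = dict(max_length=max_length, allowed_schemes=allowed_schemes)
    return type('StrictValue', (BaseValue,), namespace)
```

Each call returns a fresh subclass whose class attributes override the base defaults, leaving the base class untouched.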
def __get_validators__(cls) -> 'CallableGenerator': yield cls.validate @classmethod def validate(cls, value: Union[str, bytes, int]) -> Union[IPv4Address, IPv6Address]: try: return IPv4Address(value) except ValueError: pass try: return IPv6Address(value) except ValueError: raise errors.IPvAnyAddressError() class IPvAnyInterface(_BaseAddress): __slots__ = () @classmethod def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None: field_schema.update(type='string', format='ipvanyinterface') @classmethod def __get_validators__(cls) -> 'CallableGenerator': yield cls.validate @classmethod def validate(cls, value: NetworkType) -> Union[IPv4Interface, IPv6Interface]: try: return IPv4Interface(value) except ValueError: pass try: return IPv6Interface(value) except ValueError: raise errors.IPvAnyInterfaceError() class IPvAnyNetwork(_BaseNetwork): # type: ignore @classmethod def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None: field_schema.update(type='string', format='ipvanynetwork') @classmethod def __get_validators__(cls) -> 'CallableGenerator': yield cls.validate @classmethod def validate(cls, value: NetworkType) -> Union[IPv4Network, IPv6Network]: # Assume IP Network is defined with a default value for ``strict`` argument. # Define your own class if you want to specify network address check strictness. try: return IPv4Network(value) except ValueError: pass try: return IPv6Network(value) except ValueError: raise errors.IPvAnyNetworkError() pretty_email_regex = re.compile(r'([\w ]*?) *<(.*)> *') MAX_EMAIL_LENGTH = 2048 """Maximum length for an email. A somewhat arbitrary but very generous number compared to what is allowed by most implementations. """ def validate_email(value: Union[str]) -> Tuple[str, str]: """ Email address validation using https://pypi.org/project/email-validator/ Notes: * raw ip address (literal) domain parts are not allowed. 
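`IPvAnyAddress`, `IPvAnyInterface`, and `IPvAnyNetwork` above all use the same try-IPv4-then-IPv6 fallback. A self-contained sketch with the stdlib `ipaddress` types, raising a plain `ValueError` where pydantic raises its own error classes:

```python
from ipaddress import IPv4Address, IPv6Address
from typing import Union

# Stdlib-only sketch of the IPvAnyAddress fallback: try the IPv4 constructor
# first, then IPv6, and fail only if neither accepts the value.
def any_ip(value: Union[str, int, bytes]) -> Union[IPv4Address, IPv6Address]:
    try:
        return IPv4Address(value)
    except ValueError:
        pass
    try:
        return IPv6Address(value)
    except ValueError:
        raise ValueError(f'{value!r} is not a valid IPv4 or IPv6 address')
```

The order matters: every value that parses as IPv4 is returned as an `IPv4Address`, so IPv6 is only ever a fallback.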
* "John Doe " style "pretty" email addresses are processed * spaces are striped from the beginning and end of addresses but no error is raised """ if email_validator is None: import_email_validator() if len(value) > MAX_EMAIL_LENGTH: raise errors.EmailError() m = pretty_email_regex.fullmatch(value) name: Union[str, None] = None if m: name, value = m.groups() email = value.strip() try: parts = email_validator.validate_email(email, check_deliverability=False) except email_validator.EmailNotValidError as e: raise errors.EmailError from e if hasattr(parts, 'normalized'): # email-validator >= 2 email = parts.normalized assert email is not None name = name or parts.local_part return name, email else: # email-validator >1, <2 at_index = email.index('@') local_part = email[:at_index] # RFC 5321, local part must be case-sensitive. global_part = email[at_index:].lower() return name or local_part, local_part + global_part pydantic-2.10.6/pydantic/v1/parse.py000066400000000000000000000034351474456633400172650ustar00rootroot00000000000000import json import pickle from enum import Enum from pathlib import Path from typing import Any, Callable, Union from pydantic.v1.types import StrBytes class Protocol(str, Enum): json = 'json' pickle = 'pickle' def load_str_bytes( b: StrBytes, *, content_type: str = None, encoding: str = 'utf8', proto: Protocol = None, allow_pickle: bool = False, json_loads: Callable[[str], Any] = json.loads, ) -> Any: if proto is None and content_type: if content_type.endswith(('json', 'javascript')): pass elif allow_pickle and content_type.endswith('pickle'): proto = Protocol.pickle else: raise TypeError(f'Unknown content-type: {content_type}') proto = proto or Protocol.json if proto == Protocol.json: if isinstance(b, bytes): b = b.decode(encoding) return json_loads(b) elif proto == Protocol.pickle: if not allow_pickle: raise RuntimeError('Trying to decode with pickle with allow_pickle=False') bb = b if isinstance(b, bytes) else b.encode() return 
pickle.loads(bb) else: raise TypeError(f'Unknown protocol: {proto}') def load_file( path: Union[str, Path], *, content_type: str = None, encoding: str = 'utf8', proto: Protocol = None, allow_pickle: bool = False, json_loads: Callable[[str], Any] = json.loads, ) -> Any: path = Path(path) b = path.read_bytes() if content_type is None: if path.suffix in ('.js', '.json'): proto = Protocol.json elif path.suffix == '.pkl': proto = Protocol.pickle return load_str_bytes( b, proto=proto, content_type=content_type, encoding=encoding, allow_pickle=allow_pickle, json_loads=json_loads )
# ---- pydantic-2.10.6/pydantic/v1/py.typed (empty marker file) ----
# ---- pydantic-2.10.6/pydantic/v1/schema.py ----
import re import warnings from collections import defaultdict from dataclasses import is_dataclass from datetime import date, datetime, time, timedelta from decimal import Decimal from enum import Enum from ipaddress import IPv4Address, IPv4Interface, IPv4Network, IPv6Address, IPv6Interface, IPv6Network from pathlib import Path from typing import ( TYPE_CHECKING, Any, Callable, Dict, ForwardRef, FrozenSet, Generic, Iterable, List, Optional, Pattern, Sequence, Set, Tuple, Type, TypeVar, Union, cast, ) from uuid import UUID from typing_extensions import Annotated, Literal from pydantic.v1.fields import ( MAPPING_LIKE_SHAPES, SHAPE_DEQUE, SHAPE_FROZENSET, SHAPE_GENERIC, SHAPE_ITERABLE, SHAPE_LIST, SHAPE_SEQUENCE, SHAPE_SET, SHAPE_SINGLETON, SHAPE_TUPLE, SHAPE_TUPLE_ELLIPSIS, FieldInfo, ModelField, ) from pydantic.v1.json import pydantic_encoder from pydantic.v1.networks import AnyUrl, EmailStr from pydantic.v1.types import ( ConstrainedDecimal, ConstrainedFloat, ConstrainedFrozenSet, ConstrainedInt, ConstrainedList, ConstrainedSet, ConstrainedStr, SecretBytes, SecretStr, StrictBytes, StrictStr, conbytes, condecimal, confloat, confrozenset, conint, conlist, conset, 
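`load_str_bytes` in parse.py infers the protocol from the content type before decoding, refusing pickle unless the caller opts in. The dispatch on its own, stdlib only (`infer_protocol` is an illustrative helper, not part of the pydantic API):

```python
import json
from enum import Enum

# Minimal sketch of the parse.py protocol dispatch: infer the protocol from
# a content type, defaulting to JSON; pickle is refused unless opted in.
class Protocol(str, Enum):
    json = 'json'
    pickle = 'pickle'

def infer_protocol(content_type, allow_pickle=False):
    if content_type:
        if content_type.endswith(('json', 'javascript')):
            return Protocol.json
        if allow_pickle and content_type.endswith('pickle'):
            return Protocol.pickle
        raise TypeError(f'Unknown content-type: {content_type}')
    return Protocol.json
```

As in the original, an unrecognized content type is an error, while a missing one silently falls back to JSON.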
constr, ) from pydantic.v1.typing import ( all_literal_values, get_args, get_origin, get_sub_types, is_callable_type, is_literal_type, is_namedtuple, is_none_type, is_union, ) from pydantic.v1.utils import ROOT_KEY, get_model, lenient_issubclass if TYPE_CHECKING: from pydantic.v1.dataclasses import Dataclass from pydantic.v1.main import BaseModel default_prefix = '#/definitions/' default_ref_template = '#/definitions/{model}' TypeModelOrEnum = Union[Type['BaseModel'], Type[Enum]] TypeModelSet = Set[TypeModelOrEnum] def _apply_modify_schema( modify_schema: Callable[..., None], field: Optional[ModelField], field_schema: Dict[str, Any] ) -> None: from inspect import signature sig = signature(modify_schema) args = set(sig.parameters.keys()) if 'field' in args or 'kwargs' in args: modify_schema(field_schema, field=field) else: modify_schema(field_schema) def schema( models: Sequence[Union[Type['BaseModel'], Type['Dataclass']]], *, by_alias: bool = True, title: Optional[str] = None, description: Optional[str] = None, ref_prefix: Optional[str] = None, ref_template: str = default_ref_template, ) -> Dict[str, Any]: """ Process a list of models and generate a single JSON Schema with all of them defined in the ``definitions`` top-level JSON key, including their sub-models. :param models: a list of models to include in the generated JSON Schema :param by_alias: generate the schemas using the aliases defined, if any :param title: title for the generated schema that includes the definitions :param description: description for the generated schema :param ref_prefix: the JSON Pointer prefix for schema references with ``$ref``, if None, will be set to the default of ``#/definitions/``. Update it if you want the schemas to reference the definitions somewhere else, e.g. for OpenAPI use ``#/components/schemas/``. The resulting generated schemas will still be at the top-level key ``definitions``, so you can extract them from there. But all the references will have the set prefix. 
:param ref_template: Use a ``string.format()`` template for ``$ref`` instead of a prefix. This can be useful for references that cannot be represented by ``ref_prefix`` such as a definition stored in another file. For a sibling json file in a ``/schemas`` directory use ``"/schemas/${model}.json#"``. :return: dict with the JSON Schema with a ``definitions`` top-level key including the schema definitions for the models and sub-models passed in ``models``. """ clean_models = [get_model(model) for model in models] flat_models = get_flat_models_from_models(clean_models) model_name_map = get_model_name_map(flat_models) definitions = {} output_schema: Dict[str, Any] = {} if title: output_schema['title'] = title if description: output_schema['description'] = description for model in clean_models: m_schema, m_definitions, m_nested_models = model_process_schema( model, by_alias=by_alias, model_name_map=model_name_map, ref_prefix=ref_prefix, ref_template=ref_template, ) definitions.update(m_definitions) model_name = model_name_map[model] definitions[model_name] = m_schema if definitions: output_schema['definitions'] = definitions return output_schema def model_schema( model: Union[Type['BaseModel'], Type['Dataclass']], by_alias: bool = True, ref_prefix: Optional[str] = None, ref_template: str = default_ref_template, ) -> Dict[str, Any]: """ Generate a JSON Schema for one model. With all the sub-models defined in the ``definitions`` top-level JSON key. :param model: a Pydantic model (a class that inherits from BaseModel) :param by_alias: generate the schemas using the aliases defined, if any :param ref_prefix: the JSON Pointer prefix for schema references with ``$ref``, if None, will be set to the default of ``#/definitions/``. Update it if you want the schemas to reference the definitions somewhere else, e.g. for OpenAPI use ``#/components/schemas/``. The resulting generated schemas will still be at the top-level key ``definitions``, so you can extract them from there. 
But all the references will have the set prefix. :param ref_template: Use a ``string.format()`` template for ``$ref`` instead of a prefix. This can be useful for references that cannot be represented by ``ref_prefix`` such as a definition stored in another file. For a sibling json file in a ``/schemas`` directory use ``"/schemas/${model}.json#"``. :return: dict with the JSON Schema for the passed ``model`` """ model = get_model(model) flat_models = get_flat_models_from_model(model) model_name_map = get_model_name_map(flat_models) model_name = model_name_map[model] m_schema, m_definitions, nested_models = model_process_schema( model, by_alias=by_alias, model_name_map=model_name_map, ref_prefix=ref_prefix, ref_template=ref_template ) if model_name in nested_models: # model_name is in Nested models, it has circular references m_definitions[model_name] = m_schema m_schema = get_schema_ref(model_name, ref_prefix, ref_template, False) if m_definitions: m_schema.update({'definitions': m_definitions}) return m_schema def get_field_info_schema(field: ModelField, schema_overrides: bool = False) -> Tuple[Dict[str, Any], bool]: # If no title is explicitly set, we don't set title in the schema for enums. # The behaviour is the same as `BaseModel` reference, where the default title # is in the definitions part of the schema. 
schema_: Dict[str, Any] = {} if field.field_info.title or not lenient_issubclass(field.type_, Enum): schema_['title'] = field.field_info.title or field.alias.title().replace('_', ' ') if field.field_info.title: schema_overrides = True if field.field_info.description: schema_['description'] = field.field_info.description schema_overrides = True if not field.required and field.default is not None and not is_callable_type(field.outer_type_): schema_['default'] = encode_default(field.default) schema_overrides = True return schema_, schema_overrides def field_schema( field: ModelField, *, by_alias: bool = True, model_name_map: Dict[TypeModelOrEnum, str], ref_prefix: Optional[str] = None, ref_template: str = default_ref_template, known_models: Optional[TypeModelSet] = None, ) -> Tuple[Dict[str, Any], Dict[str, Any], Set[str]]: """ Process a Pydantic field and return a tuple with a JSON Schema for it as the first item. Also return a dictionary of definitions with models as keys and their schemas as values. If the passed field is a model and has sub-models, and those sub-models don't have overrides (as ``title``, ``default``, etc), they will be included in the definitions and referenced in the schema instead of included recursively. :param field: a Pydantic ``ModelField`` :param by_alias: use the defined alias (if any) in the returned schema :param model_name_map: used to generate the JSON Schema references to other models included in the definitions :param ref_prefix: the JSON Pointer prefix to use for references to other schemas, if None, the default of #/definitions/ will be used :param ref_template: Use a ``string.format()`` template for ``$ref`` instead of a prefix. This can be useful for references that cannot be represented by ``ref_prefix`` such as a definition stored in another file. For a sibling json file in a ``/schemas`` directory use ``"/schemas/${model}.json#"``. 
:param known_models: used to solve circular references :return: tuple of the schema for this field and additional definitions """ s, schema_overrides = get_field_info_schema(field) validation_schema = get_field_schema_validations(field) if validation_schema: s.update(validation_schema) schema_overrides = True f_schema, f_definitions, f_nested_models = field_type_schema( field, by_alias=by_alias, model_name_map=model_name_map, schema_overrides=schema_overrides, ref_prefix=ref_prefix, ref_template=ref_template, known_models=known_models or set(), ) # $ref will only be returned when there are no schema_overrides if '$ref' in f_schema: return f_schema, f_definitions, f_nested_models else: s.update(f_schema) return s, f_definitions, f_nested_models numeric_types = (int, float, Decimal) _str_types_attrs: Tuple[Tuple[str, Union[type, Tuple[type, ...]], str], ...] = ( ('max_length', numeric_types, 'maxLength'), ('min_length', numeric_types, 'minLength'), ('regex', str, 'pattern'), ) _numeric_types_attrs: Tuple[Tuple[str, Union[type, Tuple[type, ...]], str], ...] = ( ('gt', numeric_types, 'exclusiveMinimum'), ('lt', numeric_types, 'exclusiveMaximum'), ('ge', numeric_types, 'minimum'), ('le', numeric_types, 'maximum'), ('multiple_of', numeric_types, 'multipleOf'), ) def get_field_schema_validations(field: ModelField) -> Dict[str, Any]: """ Get the JSON Schema validation keywords for a ``field`` with an annotation of a Pydantic ``FieldInfo`` with validation arguments. 
""" f_schema: Dict[str, Any] = {} if lenient_issubclass(field.type_, Enum): # schema is already updated by `enum_process_schema`; just update with field extra if field.field_info.extra: f_schema.update(field.field_info.extra) return f_schema if lenient_issubclass(field.type_, (str, bytes)): for attr_name, t, keyword in _str_types_attrs: attr = getattr(field.field_info, attr_name, None) if isinstance(attr, t): f_schema[keyword] = attr if lenient_issubclass(field.type_, numeric_types) and not issubclass(field.type_, bool): for attr_name, t, keyword in _numeric_types_attrs: attr = getattr(field.field_info, attr_name, None) if isinstance(attr, t): f_schema[keyword] = attr if field.field_info is not None and field.field_info.const: f_schema['const'] = field.default if field.field_info.extra: f_schema.update(field.field_info.extra) modify_schema = getattr(field.outer_type_, '__modify_schema__', None) if modify_schema: _apply_modify_schema(modify_schema, field, f_schema) return f_schema def get_model_name_map(unique_models: TypeModelSet) -> Dict[TypeModelOrEnum, str]: """ Process a set of models and generate unique names for them to be used as keys in the JSON Schema definitions. By default the names are the same as the class name. But if two models in different Python modules have the same name (e.g. "users.Model" and "items.Model"), the generated names will be based on the Python module path for those conflicting models to prevent name collisions. 
:param unique_models: a Python set of models :return: dict mapping models to names """ name_model_map = {} conflicting_names: Set[str] = set() for model in unique_models: model_name = normalize_name(model.__name__) if model_name in conflicting_names: model_name = get_long_model_name(model) name_model_map[model_name] = model elif model_name in name_model_map: conflicting_names.add(model_name) conflicting_model = name_model_map.pop(model_name) name_model_map[get_long_model_name(conflicting_model)] = conflicting_model name_model_map[get_long_model_name(model)] = model else: name_model_map[model_name] = model return {v: k for k, v in name_model_map.items()} def get_flat_models_from_model(model: Type['BaseModel'], known_models: Optional[TypeModelSet] = None) -> TypeModelSet: """ Take a single ``model`` and generate a set with itself and all the sub-models in the tree. I.e. if you pass model ``Foo`` (subclass of Pydantic ``BaseModel``) as ``model``, and it has a field of type ``Bar`` (also subclass of ``BaseModel``) and that model ``Bar`` has a field of type ``Baz`` (also subclass of ``BaseModel``), the return value will be ``set([Foo, Bar, Baz])``. :param model: a Pydantic ``BaseModel`` subclass :param known_models: used to solve circular references :return: a set with the initial model and all its sub-models """ known_models = known_models or set() flat_models: TypeModelSet = set() flat_models.add(model) known_models |= flat_models fields = cast(Sequence[ModelField], model.__fields__.values()) flat_models |= get_flat_models_from_fields(fields, known_models=known_models) return flat_models def get_flat_models_from_field(field: ModelField, known_models: TypeModelSet) -> TypeModelSet: """ Take a single Pydantic ``ModelField`` (from a model) that could have been declared as a subclass of BaseModel (so, it could be a submodel), and generate a set with its model and all the sub-models in the tree. I.e. 
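`get_model_name_map` above resolves `__name__` collisions by moving every conflicting class to a module-qualified long name, including the one that originally held the short name. A simplified standalone version of that bookkeeping (`name_map` is an illustrative reimplementation that skips the `normalize_name` step):

```python
# Sketch of the name-collision rule: the first class keeps its short name
# until a second class with the same __name__ appears, at which point both
# switch to module-qualified long names.
def long_name(cls) -> str:
    return f'{cls.__module__}__{cls.__qualname__}'.replace('.', '__')

def name_map(classes):
    name_to_cls, conflicts = {}, set()
    for cls in classes:
        name = cls.__name__
        if name in conflicts:
            name_to_cls[long_name(cls)] = cls
        elif name in name_to_cls:
            conflicts.add(name)
            other = name_to_cls.pop(name)
            name_to_cls[long_name(other)] = other
            name_to_cls[long_name(cls)] = cls
        else:
            name_to_cls[name] = cls
    return {cls: name for name, cls in name_to_cls.items()}
```

Note the inverted return value: like the original, the working dict is keyed by name to detect collisions, then flipped so callers can look up a schema name by model.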
if you pass a field that was declared to be of type ``Foo`` (subclass of BaseModel) as ``field``, and that model ``Foo`` has a field of type ``Bar`` (also subclass of ``BaseModel``) and that model ``Bar`` has a field of type ``Baz`` (also subclass of ``BaseModel``), the return value will be ``set([Foo, Bar, Baz])``. :param field: a Pydantic ``ModelField`` :param known_models: used to solve circular references :return: a set with the model used in the declaration for this field, if any, and all its sub-models """ from pydantic.v1.main import BaseModel flat_models: TypeModelSet = set() field_type = field.type_ if lenient_issubclass(getattr(field_type, '__pydantic_model__', None), BaseModel): field_type = field_type.__pydantic_model__ if field.sub_fields and not lenient_issubclass(field_type, BaseModel): flat_models |= get_flat_models_from_fields(field.sub_fields, known_models=known_models) elif lenient_issubclass(field_type, BaseModel) and field_type not in known_models: flat_models |= get_flat_models_from_model(field_type, known_models=known_models) elif lenient_issubclass(field_type, Enum): flat_models.add(field_type) return flat_models def get_flat_models_from_fields(fields: Sequence[ModelField], known_models: TypeModelSet) -> TypeModelSet: """ Take a list of Pydantic ``ModelField``s (from a model) that could have been declared as subclasses of ``BaseModel`` (so, any of them could be a submodel), and generate a set with their models and all the sub-models in the tree. I.e. if you pass the fields of a model ``Foo`` (subclass of ``BaseModel``) as ``fields``, and one of them has a field of type ``Bar`` (also subclass of ``BaseModel``) and that model ``Bar`` has a field of type ``Baz`` (also subclass of ``BaseModel``), the return value will be ``set([Foo, Bar, Baz])``.
:param fields: a list of Pydantic ``ModelField``s :param known_models: used to solve circular references :return: a set with any model declared in the fields, and all their sub-models """ flat_models: TypeModelSet = set() for field in fields: flat_models |= get_flat_models_from_field(field, known_models=known_models) return flat_models def get_flat_models_from_models(models: Sequence[Type['BaseModel']]) -> TypeModelSet: """ Take a list of ``models`` and generate a set with them and all their sub-models in their trees. I.e. if you pass a list of two models, ``Foo`` and ``Bar``, both subclasses of Pydantic ``BaseModel`` as models, and ``Bar`` has a field of type ``Baz`` (also subclass of ``BaseModel``), the return value will be ``set([Foo, Bar, Baz])``. """ flat_models: TypeModelSet = set() for model in models: flat_models |= get_flat_models_from_model(model) return flat_models def get_long_model_name(model: TypeModelOrEnum) -> str: return f'{model.__module__}__{model.__qualname__}'.replace('.', '__') def field_type_schema( field: ModelField, *, by_alias: bool, model_name_map: Dict[TypeModelOrEnum, str], ref_template: str, schema_overrides: bool = False, ref_prefix: Optional[str] = None, known_models: TypeModelSet, ) -> Tuple[Dict[str, Any], Dict[str, Any], Set[str]]: """ Used by ``field_schema()``, you probably should be using that function. Take a single ``field`` and generate the schema for its type only, not including additional information as title, etc. Also return additional schema definitions, from sub-models. 
""" from pydantic.v1.main import BaseModel # noqa: F811 definitions = {} nested_models: Set[str] = set() f_schema: Dict[str, Any] if field.shape in { SHAPE_LIST, SHAPE_TUPLE_ELLIPSIS, SHAPE_SEQUENCE, SHAPE_SET, SHAPE_FROZENSET, SHAPE_ITERABLE, SHAPE_DEQUE, }: items_schema, f_definitions, f_nested_models = field_singleton_schema( field, by_alias=by_alias, model_name_map=model_name_map, ref_prefix=ref_prefix, ref_template=ref_template, known_models=known_models, ) definitions.update(f_definitions) nested_models.update(f_nested_models) f_schema = {'type': 'array', 'items': items_schema} if field.shape in {SHAPE_SET, SHAPE_FROZENSET}: f_schema['uniqueItems'] = True elif field.shape in MAPPING_LIKE_SHAPES: f_schema = {'type': 'object'} key_field = cast(ModelField, field.key_field) regex = getattr(key_field.type_, 'regex', None) items_schema, f_definitions, f_nested_models = field_singleton_schema( field, by_alias=by_alias, model_name_map=model_name_map, ref_prefix=ref_prefix, ref_template=ref_template, known_models=known_models, ) definitions.update(f_definitions) nested_models.update(f_nested_models) if regex: # Dict keys have a regex pattern # items_schema might be a schema or empty dict, add it either way f_schema['patternProperties'] = {ConstrainedStr._get_pattern(regex): items_schema} if items_schema: # The dict values are not simply Any, so they need a schema f_schema['additionalProperties'] = items_schema elif field.shape == SHAPE_TUPLE or (field.shape == SHAPE_GENERIC and not issubclass(field.type_, BaseModel)): sub_schema = [] sub_fields = cast(List[ModelField], field.sub_fields) for sf in sub_fields: sf_schema, sf_definitions, sf_nested_models = field_type_schema( sf, by_alias=by_alias, model_name_map=model_name_map, ref_prefix=ref_prefix, ref_template=ref_template, known_models=known_models, ) definitions.update(sf_definitions) nested_models.update(sf_nested_models) sub_schema.append(sf_schema) sub_fields_len = len(sub_fields) if field.shape == SHAPE_GENERIC: 
all_of_schemas = sub_schema[0] if sub_fields_len == 1 else {'type': 'array', 'items': sub_schema} f_schema = {'allOf': [all_of_schemas]} else: f_schema = { 'type': 'array', 'minItems': sub_fields_len, 'maxItems': sub_fields_len, } if sub_fields_len >= 1: f_schema['items'] = sub_schema else: assert field.shape in {SHAPE_SINGLETON, SHAPE_GENERIC}, field.shape f_schema, f_definitions, f_nested_models = field_singleton_schema( field, by_alias=by_alias, model_name_map=model_name_map, schema_overrides=schema_overrides, ref_prefix=ref_prefix, ref_template=ref_template, known_models=known_models, ) definitions.update(f_definitions) nested_models.update(f_nested_models) # check field type to avoid repeated calls to the same __modify_schema__ method if field.type_ != field.outer_type_: if field.shape == SHAPE_GENERIC: field_type = field.type_ else: field_type = field.outer_type_ modify_schema = getattr(field_type, '__modify_schema__', None) if modify_schema: _apply_modify_schema(modify_schema, field, f_schema) return f_schema, definitions, nested_models def model_process_schema( model: TypeModelOrEnum, *, by_alias: bool = True, model_name_map: Dict[TypeModelOrEnum, str], ref_prefix: Optional[str] = None, ref_template: str = default_ref_template, known_models: Optional[TypeModelSet] = None, field: Optional[ModelField] = None, ) -> Tuple[Dict[str, Any], Dict[str, Any], Set[str]]: """ Used by ``model_schema()``, you probably should be using that function. Take a single ``model`` and generate its schema. Also return additional schema definitions, from sub-models. The sub-models of the returned schema will be referenced, but their definitions will not be included in the schema. All the definitions are returned as the second value. 
""" from inspect import getdoc, signature known_models = known_models or set() if lenient_issubclass(model, Enum): model = cast(Type[Enum], model) s = enum_process_schema(model, field=field) return s, {}, set() model = cast(Type['BaseModel'], model) s = {'title': model.__config__.title or model.__name__} doc = getdoc(model) if doc: s['description'] = doc known_models.add(model) m_schema, m_definitions, nested_models = model_type_schema( model, by_alias=by_alias, model_name_map=model_name_map, ref_prefix=ref_prefix, ref_template=ref_template, known_models=known_models, ) s.update(m_schema) schema_extra = model.__config__.schema_extra if callable(schema_extra): if len(signature(schema_extra).parameters) == 1: schema_extra(s) else: schema_extra(s, model) else: s.update(schema_extra) return s, m_definitions, nested_models def model_type_schema( model: Type['BaseModel'], *, by_alias: bool, model_name_map: Dict[TypeModelOrEnum, str], ref_template: str, ref_prefix: Optional[str] = None, known_models: TypeModelSet, ) -> Tuple[Dict[str, Any], Dict[str, Any], Set[str]]: """ You probably should be using ``model_schema()``, this function is indirectly used by that function. Take a single ``model`` and generate the schema for its type only, not including additional information as title, etc. Also return additional schema definitions, from sub-models. 
    """
    properties = {}
    required = []
    definitions: Dict[str, Any] = {}
    nested_models: Set[str] = set()
    for k, f in model.__fields__.items():
        try:
            f_schema, f_definitions, f_nested_models = field_schema(
                f,
                by_alias=by_alias,
                model_name_map=model_name_map,
                ref_prefix=ref_prefix,
                ref_template=ref_template,
                known_models=known_models,
            )
        except SkipField as skip:
            warnings.warn(skip.message, UserWarning)
            continue
        definitions.update(f_definitions)
        nested_models.update(f_nested_models)
        if by_alias:
            properties[f.alias] = f_schema
            if f.required:
                required.append(f.alias)
        else:
            properties[k] = f_schema
            if f.required:
                required.append(k)
    if ROOT_KEY in properties:
        out_schema = properties[ROOT_KEY]
        out_schema['title'] = model.__config__.title or model.__name__
    else:
        out_schema = {'type': 'object', 'properties': properties}
        if required:
            out_schema['required'] = required
    if model.__config__.extra == 'forbid':
        out_schema['additionalProperties'] = False
    return out_schema, definitions, nested_models


def enum_process_schema(enum: Type[Enum], *, field: Optional[ModelField] = None) -> Dict[str, Any]:
    """
    Take a single `enum` and generate its schema.

    This is similar to the `model_process_schema` function, but applies to ``Enum`` objects.
    """
    import inspect

    schema_: Dict[str, Any] = {
        'title': enum.__name__,
        # Python assigns all enums a default docstring value of 'An enumeration', so
        # all enums will have a description field even if not explicitly provided.
        'description': inspect.cleandoc(enum.__doc__ or 'An enumeration.'),
        # Add enum values and the enum field type to the schema.
        'enum': [item.value for item in cast(Iterable[Enum], enum)],
    }
    add_field_type_to_schema(enum, schema_)

    modify_schema = getattr(enum, '__modify_schema__', None)
    if modify_schema:
        _apply_modify_schema(modify_schema, field, schema_)
    return schema_


def field_singleton_sub_fields_schema(
    field: ModelField,
    *,
    by_alias: bool,
    model_name_map: Dict[TypeModelOrEnum, str],
    ref_template: str,
    schema_overrides: bool = False,
    ref_prefix: Optional[str] = None,
    known_models: TypeModelSet,
) -> Tuple[Dict[str, Any], Dict[str, Any], Set[str]]:
    """
    This function is indirectly used by ``field_schema()``, you probably should be using that function.

    Take a list of Pydantic ``ModelField`` from the declaration of a type with parameters, and generate their
    schema. I.e., fields used as "type parameters", like ``str`` and ``int`` in ``Tuple[str, int]``.
    """
    sub_fields = cast(List[ModelField], field.sub_fields)
    definitions = {}
    nested_models: Set[str] = set()

    if len(sub_fields) == 1:
        return field_type_schema(
            sub_fields[0],
            by_alias=by_alias,
            model_name_map=model_name_map,
            schema_overrides=schema_overrides,
            ref_prefix=ref_prefix,
            ref_template=ref_template,
            known_models=known_models,
        )
    else:
        s: Dict[str, Any] = {}
        # https://github.com/OAI/OpenAPI-Specification/blob/master/versions/3.0.2.md#discriminator-object
        field_has_discriminator: bool = field.discriminator_key is not None
        if field_has_discriminator:
            assert field.sub_fields_mapping is not None

            discriminator_models_refs: Dict[str, Union[str, Dict[str, Any]]] = {}

            for discriminator_value, sub_field in field.sub_fields_mapping.items():
                if isinstance(discriminator_value, Enum):
                    discriminator_value = str(discriminator_value.value)
                # sub_field is either a `BaseModel` or directly an `Annotated` `Union` of many
                if is_union(get_origin(sub_field.type_)):
                    sub_models = get_sub_types(sub_field.type_)
                    discriminator_models_refs[discriminator_value] = {
                        model_name_map[sub_model]: get_schema_ref(
                            model_name_map[sub_model], ref_prefix, ref_template, False
                        )
                        for sub_model in sub_models
                    }
                else:
                    sub_field_type = sub_field.type_
                    if hasattr(sub_field_type, '__pydantic_model__'):
                        sub_field_type = sub_field_type.__pydantic_model__

                    discriminator_model_name = model_name_map[sub_field_type]
                    discriminator_model_ref = get_schema_ref(discriminator_model_name, ref_prefix, ref_template, False)
                    discriminator_models_refs[discriminator_value] = discriminator_model_ref['$ref']

            s['discriminator'] = {
                'propertyName': field.discriminator_alias if by_alias else field.discriminator_key,
                'mapping': discriminator_models_refs,
            }

        sub_field_schemas = []
        for sf in sub_fields:
            sub_schema, sub_definitions, sub_nested_models = field_type_schema(
                sf,
                by_alias=by_alias,
                model_name_map=model_name_map,
                schema_overrides=schema_overrides,
                ref_prefix=ref_prefix,
                ref_template=ref_template,
                known_models=known_models,
            )
            definitions.update(sub_definitions)
            if schema_overrides and 'allOf' in sub_schema:
                # if the sub_field is a referenced schema we only need the referenced
                # object. Otherwise we will end up with several allOf inside anyOf/oneOf.
                # See https://github.com/pydantic/pydantic/issues/1209
                sub_schema = sub_schema['allOf'][0]

            if sub_schema.keys() == {'discriminator', 'oneOf'}:
                # we don't want discriminator information inside oneOf choices, this is dealt with elsewhere
                sub_schema.pop('discriminator')
            sub_field_schemas.append(sub_schema)
            nested_models.update(sub_nested_models)
        s['oneOf' if field_has_discriminator else 'anyOf'] = sub_field_schemas
        return s, definitions, nested_models


# Order is important, e.g. subclasses of str must go before str
# this is used only for standard library types, custom types should use __modify_schema__ instead
field_class_to_schema: Tuple[Tuple[Any, Dict[str, Any]], ...] = (
    (Path, {'type': 'string', 'format': 'path'}),
    (datetime, {'type': 'string', 'format': 'date-time'}),
    (date, {'type': 'string', 'format': 'date'}),
    (time, {'type': 'string', 'format': 'time'}),
    (timedelta, {'type': 'number', 'format': 'time-delta'}),
    (IPv4Network, {'type': 'string', 'format': 'ipv4network'}),
    (IPv6Network, {'type': 'string', 'format': 'ipv6network'}),
    (IPv4Interface, {'type': 'string', 'format': 'ipv4interface'}),
    (IPv6Interface, {'type': 'string', 'format': 'ipv6interface'}),
    (IPv4Address, {'type': 'string', 'format': 'ipv4'}),
    (IPv6Address, {'type': 'string', 'format': 'ipv6'}),
    (Pattern, {'type': 'string', 'format': 'regex'}),
    (str, {'type': 'string'}),
    (bytes, {'type': 'string', 'format': 'binary'}),
    (bool, {'type': 'boolean'}),
    (int, {'type': 'integer'}),
    (float, {'type': 'number'}),
    (Decimal, {'type': 'number'}),
    (UUID, {'type': 'string', 'format': 'uuid'}),
    (dict, {'type': 'object'}),
    (list, {'type': 'array', 'items': {}}),
    (tuple, {'type': 'array', 'items': {}}),
    (set, {'type': 'array', 'items': {}, 'uniqueItems': True}),
    (frozenset, {'type': 'array', 'items': {}, 'uniqueItems': True}),
)

json_scheme = {'type': 'string', 'format': 'json-string'}


def add_field_type_to_schema(field_type: Any, schema_: Dict[str, Any]) -> None:
    """
    Update the given `schema` with the type-specific metadata for the given `field_type`.

    This function looks through `field_class_to_schema` for a class that matches the given `field_type`,
    and then modifies the given `schema` with the information from that type.
    """
    for type_, t_schema in field_class_to_schema:
        # Fallback for `typing.Pattern` and `re.Pattern` as they are not a valid class
        if lenient_issubclass(field_type, type_) or field_type is type_ is Pattern:
            schema_.update(t_schema)
            break


def get_schema_ref(name: str, ref_prefix: Optional[str], ref_template: str, schema_overrides: bool) -> Dict[str, Any]:
    if ref_prefix:
        schema_ref = {'$ref': ref_prefix + name}
    else:
        schema_ref = {'$ref': ref_template.format(model=name)}
    return {'allOf': [schema_ref]} if schema_overrides else schema_ref


def field_singleton_schema(  # noqa: C901 (ignore complexity)
    field: ModelField,
    *,
    by_alias: bool,
    model_name_map: Dict[TypeModelOrEnum, str],
    ref_template: str,
    schema_overrides: bool = False,
    ref_prefix: Optional[str] = None,
    known_models: TypeModelSet,
) -> Tuple[Dict[str, Any], Dict[str, Any], Set[str]]:
    """
    This function is indirectly used by ``field_schema()``, you should probably be using that function.

    Take a single Pydantic ``ModelField``, and return its schema and any additional definitions from sub-models.
    """
    from pydantic.v1.main import BaseModel

    definitions: Dict[str, Any] = {}
    nested_models: Set[str] = set()
    field_type = field.type_

    # Recurse into this field if it contains sub_fields and is NOT a
    # BaseModel OR that BaseModel is a const
    if field.sub_fields and (
        (field.field_info and field.field_info.const) or not lenient_issubclass(field_type, BaseModel)
    ):
        return field_singleton_sub_fields_schema(
            field,
            by_alias=by_alias,
            model_name_map=model_name_map,
            schema_overrides=schema_overrides,
            ref_prefix=ref_prefix,
            ref_template=ref_template,
            known_models=known_models,
        )
    if field_type is Any or field_type is object or field_type.__class__ == TypeVar or get_origin(field_type) is type:
        return {}, definitions, nested_models  # no restrictions
    if is_none_type(field_type):
        return {'type': 'null'}, definitions, nested_models
    if is_callable_type(field_type):
        raise SkipField(f'Callable {field.name} was excluded from schema since JSON schema has no equivalent type.')
    f_schema: Dict[str, Any] = {}
    if field.field_info is not None and field.field_info.const:
        f_schema['const'] = field.default

    if is_literal_type(field_type):
        values = tuple(x.value if isinstance(x, Enum) else x for x in all_literal_values(field_type))

        if len({v.__class__ for v in values}) > 1:
            return field_schema(
                multitypes_literal_field_for_schema(values, field),
                by_alias=by_alias,
                model_name_map=model_name_map,
                ref_prefix=ref_prefix,
                ref_template=ref_template,
                known_models=known_models,
            )

        # All values have the same type
        field_type = values[0].__class__
        f_schema['enum'] = list(values)
        add_field_type_to_schema(field_type, f_schema)
    elif lenient_issubclass(field_type, Enum):
        enum_name = model_name_map[field_type]
        f_schema, schema_overrides = get_field_info_schema(field, schema_overrides)
        f_schema.update(get_schema_ref(enum_name, ref_prefix, ref_template, schema_overrides))
        definitions[enum_name] = enum_process_schema(field_type, field=field)
    elif is_namedtuple(field_type):
        sub_schema, *_ = model_process_schema(
            field_type.__pydantic_model__,
            by_alias=by_alias,
            model_name_map=model_name_map,
            ref_prefix=ref_prefix,
            ref_template=ref_template,
            known_models=known_models,
            field=field,
        )
        items_schemas = list(sub_schema['properties'].values())
        f_schema.update(
            {
                'type': 'array',
                'items': items_schemas,
                'minItems': len(items_schemas),
                'maxItems': len(items_schemas),
            }
        )
    elif not hasattr(field_type, '__pydantic_model__'):
        add_field_type_to_schema(field_type, f_schema)

        modify_schema = getattr(field_type, '__modify_schema__', None)
        if modify_schema:
            _apply_modify_schema(modify_schema, field, f_schema)

    if f_schema:
        return f_schema, definitions, nested_models

    # Handle dataclass-based models
    if lenient_issubclass(getattr(field_type, '__pydantic_model__', None), BaseModel):
        field_type = field_type.__pydantic_model__

    if issubclass(field_type, BaseModel):
        model_name = model_name_map[field_type]
        if field_type not in known_models:
            sub_schema, sub_definitions, sub_nested_models = model_process_schema(
                field_type,
                by_alias=by_alias,
                model_name_map=model_name_map,
                ref_prefix=ref_prefix,
                ref_template=ref_template,
                known_models=known_models,
                field=field,
            )
            definitions.update(sub_definitions)
            definitions[model_name] = sub_schema
            nested_models.update(sub_nested_models)
        else:
            nested_models.add(model_name)
        schema_ref = get_schema_ref(model_name, ref_prefix, ref_template, schema_overrides)
        return schema_ref, definitions, nested_models

    # For generics with no args
    args = get_args(field_type)
    if args is not None and not args and Generic in field_type.__bases__:
        return f_schema, definitions, nested_models

    raise ValueError(f'Value not declarable with JSON Schema, field: {field}')


def multitypes_literal_field_for_schema(values: Tuple[Any, ...], field: ModelField) -> ModelField:
    """
    To support `Literal` with values of different types, we split it into multiple `Literal` with same type
    e.g.
    `Literal['qwe', 'asd', 1, 2]` becomes `Union[Literal['qwe', 'asd'], Literal[1, 2]]`
    """
    literal_distinct_types = defaultdict(list)
    for v in values:
        literal_distinct_types[v.__class__].append(v)
    distinct_literals = (Literal[tuple(same_type_values)] for same_type_values in literal_distinct_types.values())

    return ModelField(
        name=field.name,
        type_=Union[tuple(distinct_literals)],  # type: ignore
        class_validators=field.class_validators,
        model_config=field.model_config,
        default=field.default,
        required=field.required,
        alias=field.alias,
        field_info=field.field_info,
    )


def encode_default(dft: Any) -> Any:
    from pydantic.v1.main import BaseModel

    if isinstance(dft, BaseModel) or is_dataclass(dft):
        dft = cast('dict[str, Any]', pydantic_encoder(dft))

    if isinstance(dft, dict):
        return {encode_default(k): encode_default(v) for k, v in dft.items()}
    elif isinstance(dft, Enum):
        return dft.value
    elif isinstance(dft, (int, float, str)):
        return dft
    elif isinstance(dft, (list, tuple)):
        t = dft.__class__
        seq_args = (encode_default(v) for v in dft)
        return t(*seq_args) if is_namedtuple(t) else t(seq_args)
    elif dft is None:
        return None
    else:
        return pydantic_encoder(dft)


_map_types_constraint: Dict[Any, Callable[..., type]] = {int: conint, float: confloat, Decimal: condecimal}


def get_annotation_from_field_info(
    annotation: Any, field_info: FieldInfo, field_name: str, validate_assignment: bool = False
) -> Type[Any]:
    """
    Get an annotation with validation implemented for numbers and strings based on the field_info.
    :param annotation: an annotation from a field specification, as ``str``, ``ConstrainedStr``
    :param field_info: an instance of FieldInfo, possibly with declarations for validations and JSON Schema
    :param field_name: name of the field for use in error messages
    :param validate_assignment: default False, flag for BaseModel Config value of validate_assignment
    :return: the same ``annotation`` if unmodified or a new annotation with validation in place
    """
    constraints = field_info.get_constraints()
    used_constraints: Set[str] = set()
    if constraints:
        annotation, used_constraints = get_annotation_with_constraints(annotation, field_info)
    if validate_assignment:
        used_constraints.add('allow_mutation')

    unused_constraints = constraints - used_constraints
    if unused_constraints:
        raise ValueError(
            f'On field "{field_name}" the following field constraints are set but not enforced: '
            f'{", ".join(unused_constraints)}. '
            f'\nFor more details see https://docs.pydantic.dev/usage/schema/#unenforced-field-constraints'
        )

    return annotation


def get_annotation_with_constraints(annotation: Any, field_info: FieldInfo) -> Tuple[Type[Any], Set[str]]:  # noqa: C901
    """
    Get an annotation with used constraints implemented for numbers and strings based on the field_info.

    :param annotation: an annotation from a field specification, as ``str``, ``ConstrainedStr``
    :param field_info: an instance of FieldInfo, possibly with declarations for validations and JSON Schema
    :return: the same ``annotation`` if unmodified or a new annotation along with the used constraints.
    """
    used_constraints: Set[str] = set()

    def go(type_: Any) -> Type[Any]:
        if (
            is_literal_type(type_)
            or isinstance(type_, ForwardRef)
            or lenient_issubclass(type_, (ConstrainedList, ConstrainedSet, ConstrainedFrozenSet))
        ):
            return type_

        origin = get_origin(type_)

        if origin is not None:
            args: Tuple[Any, ...] = get_args(type_)

            if any(isinstance(a, ForwardRef) for a in args):
                # forward refs cause infinite recursion below
                return type_

            if origin is Annotated:
                return go(args[0])
            if is_union(origin):
                return Union[tuple(go(a) for a in args)]  # type: ignore

            if issubclass(origin, List) and (
                field_info.min_items is not None
                or field_info.max_items is not None
                or field_info.unique_items is not None
            ):
                used_constraints.update({'min_items', 'max_items', 'unique_items'})
                return conlist(
                    go(args[0]),
                    min_items=field_info.min_items,
                    max_items=field_info.max_items,
                    unique_items=field_info.unique_items,
                )

            if issubclass(origin, Set) and (field_info.min_items is not None or field_info.max_items is not None):
                used_constraints.update({'min_items', 'max_items'})
                return conset(go(args[0]), min_items=field_info.min_items, max_items=field_info.max_items)

            if issubclass(origin, FrozenSet) and (field_info.min_items is not None or field_info.max_items is not None):
                used_constraints.update({'min_items', 'max_items'})
                return confrozenset(go(args[0]), min_items=field_info.min_items, max_items=field_info.max_items)

            for t in (Tuple, List, Set, FrozenSet, Sequence):
                if issubclass(origin, t):  # type: ignore
                    return t[tuple(go(a) for a in args)]  # type: ignore

            if issubclass(origin, Dict):
                return Dict[args[0], go(args[1])]  # type: ignore

        attrs: Optional[Tuple[str, ...]] = None
        constraint_func: Optional[Callable[..., type]] = None
        if isinstance(type_, type):
            if issubclass(type_, (SecretStr, SecretBytes)):
                attrs = ('max_length', 'min_length')

                def constraint_func(**kw: Any) -> Type[Any]:  # noqa: F811
                    return type(type_.__name__, (type_,), kw)

            elif issubclass(type_, str) and not issubclass(type_, (EmailStr, AnyUrl)):
                attrs = ('max_length', 'min_length', 'regex')
                if issubclass(type_, StrictStr):

                    def constraint_func(**kw: Any) -> Type[Any]:
                        return type(type_.__name__, (type_,), kw)

                else:
                    constraint_func = constr
            elif issubclass(type_, bytes):
                attrs = ('max_length', 'min_length', 'regex')
                if issubclass(type_,
                              StrictBytes):

                    def constraint_func(**kw: Any) -> Type[Any]:
                        return type(type_.__name__, (type_,), kw)

                else:
                    constraint_func = conbytes
            elif issubclass(type_, numeric_types) and not issubclass(
                type_,
                (
                    ConstrainedInt,
                    ConstrainedFloat,
                    ConstrainedDecimal,
                    ConstrainedList,
                    ConstrainedSet,
                    ConstrainedFrozenSet,
                    bool,
                ),
            ):
                # Is numeric type
                attrs = ('gt', 'lt', 'ge', 'le', 'multiple_of')
                if issubclass(type_, float):
                    attrs += ('allow_inf_nan',)
                if issubclass(type_, Decimal):
                    attrs += ('max_digits', 'decimal_places')
                numeric_type = next(t for t in numeric_types if issubclass(type_, t))  # pragma: no branch
                constraint_func = _map_types_constraint[numeric_type]

        if attrs:
            used_constraints.update(set(attrs))
            kwargs = {
                attr_name: attr
                for attr_name, attr in ((attr_name, getattr(field_info, attr_name)) for attr_name in attrs)
                if attr is not None
            }
            if kwargs:
                constraint_func = cast(Callable[..., type], constraint_func)
                return constraint_func(**kwargs)
        return type_

    return go(annotation), used_constraints


def normalize_name(name: str) -> str:
    """
    Normalizes the given name. This can be applied to either a model *or* enum.
    """
    return re.sub(r'[^a-zA-Z0-9.\-_]', '_', name)


class SkipField(Exception):
    """
    Utility exception used to exclude fields from schema.
    """

    def __init__(self, message: str) -> None:
        self.message = message

pydantic-2.10.6/pydantic/v1/tools.py

import json
from functools import lru_cache
from pathlib import Path
from typing import TYPE_CHECKING, Any, Callable, Optional, Type, TypeVar, Union

from pydantic.v1.parse import Protocol, load_file, load_str_bytes
from pydantic.v1.types import StrBytes
from pydantic.v1.typing import display_as_type

__all__ = ('parse_file_as', 'parse_obj_as', 'parse_raw_as', 'schema_of', 'schema_json_of')

NameFactory = Union[str, Callable[[Type[Any]], str]]

if TYPE_CHECKING:
    from pydantic.v1.typing import DictStrAny


def _generate_parsing_type_name(type_: Any) -> str:
    return f'ParsingModel[{display_as_type(type_)}]'


@lru_cache(maxsize=2048)
def _get_parsing_type(type_: Any, *, type_name: Optional[NameFactory] = None) -> Any:
    from pydantic.v1.main import create_model

    if type_name is None:
        type_name = _generate_parsing_type_name
    if not isinstance(type_name, str):
        type_name = type_name(type_)
    return create_model(type_name, __root__=(type_, ...))


T = TypeVar('T')


def parse_obj_as(type_: Type[T], obj: Any, *, type_name: Optional[NameFactory] = None) -> T:
    model_type = _get_parsing_type(type_, type_name=type_name)  # type: ignore[arg-type]
    return model_type(__root__=obj).__root__


def parse_file_as(
    type_: Type[T],
    path: Union[str, Path],
    *,
    content_type: str = None,
    encoding: str = 'utf8',
    proto: Protocol = None,
    allow_pickle: bool = False,
    json_loads: Callable[[str], Any] = json.loads,
    type_name: Optional[NameFactory] = None,
) -> T:
    obj = load_file(
        path,
        proto=proto,
        content_type=content_type,
        encoding=encoding,
        allow_pickle=allow_pickle,
        json_loads=json_loads,
    )
    return parse_obj_as(type_, obj, type_name=type_name)


def parse_raw_as(
    type_: Type[T],
    b: StrBytes,
    *,
    content_type: str = None,
    encoding: str = 'utf8',
    proto: Protocol = None,
    allow_pickle: bool = False,
    json_loads: Callable[[str], Any] = json.loads,
    type_name: Optional[NameFactory] = None,
) -> T:
    obj = load_str_bytes(
        b,
        proto=proto,
        content_type=content_type,
        encoding=encoding,
        allow_pickle=allow_pickle,
        json_loads=json_loads,
    )
    return parse_obj_as(type_, obj, type_name=type_name)


def schema_of(type_: Any, *, title: Optional[NameFactory] = None, **schema_kwargs: Any) -> 'DictStrAny':
    """Generate a JSON schema (as dict) for the passed model or dynamically generated one"""
    return _get_parsing_type(type_, type_name=title).schema(**schema_kwargs)


def schema_json_of(type_: Any, *, title: Optional[NameFactory] = None, **schema_json_kwargs: Any) -> str:
    """Generate a JSON schema (as JSON) for the passed model or dynamically generated one"""
    return _get_parsing_type(type_, type_name=title).schema_json(**schema_json_kwargs)

pydantic-2.10.6/pydantic/v1/types.py

import abc
import math
import re
import warnings
from datetime import date
from decimal import Decimal, InvalidOperation
from enum import Enum
from pathlib import Path
from types import new_class
from typing import (
    TYPE_CHECKING,
    Any,
    Callable,
    ClassVar,
    Dict,
    FrozenSet,
    List,
    Optional,
    Pattern,
    Set,
    Tuple,
    Type,
    TypeVar,
    Union,
    cast,
    overload,
)
from uuid import UUID
from weakref import WeakSet

from pydantic.v1 import errors
from pydantic.v1.datetime_parse import parse_date
from pydantic.v1.utils import import_string, update_not_none
from pydantic.v1.validators import (
    bytes_validator,
    constr_length_validator,
    constr_lower,
    constr_strip_whitespace,
    constr_upper,
    decimal_validator,
    float_finite_validator,
    float_validator,
    frozenset_validator,
    int_validator,
    list_validator,
    number_multiple_validator,
    number_size_validator,
    path_exists_validator,
    path_validator,
    set_validator,
    str_validator,
    strict_bytes_validator,
    strict_float_validator,
    strict_int_validator,
    strict_str_validator,
)

__all__ = [
    'NoneStr',
    'NoneBytes',
    'StrBytes',
    'NoneStrBytes',
    'StrictStr',
    'ConstrainedBytes',
    'conbytes',
    'ConstrainedList',
    'conlist',
    'ConstrainedSet',
    'conset',
    'ConstrainedFrozenSet',
    'confrozenset',
    'ConstrainedStr',
    'constr',
    'PyObject',
    'ConstrainedInt',
    'conint',
    'PositiveInt',
    'NegativeInt',
    'NonNegativeInt',
    'NonPositiveInt',
    'ConstrainedFloat',
    'confloat',
    'PositiveFloat',
    'NegativeFloat',
    'NonNegativeFloat',
    'NonPositiveFloat',
    'FiniteFloat',
    'ConstrainedDecimal',
    'condecimal',
    'UUID1',
    'UUID3',
    'UUID4',
    'UUID5',
    'FilePath',
    'DirectoryPath',
    'Json',
    'JsonWrapper',
    'SecretField',
    'SecretStr',
    'SecretBytes',
    'StrictBool',
    'StrictBytes',
    'StrictInt',
    'StrictFloat',
    'PaymentCardNumber',
    'ByteSize',
    'PastDate',
    'FutureDate',
    'ConstrainedDate',
    'condate',
]

NoneStr = Optional[str]
NoneBytes = Optional[bytes]
StrBytes = Union[str, bytes]
NoneStrBytes = Optional[StrBytes]
OptionalInt = Optional[int]
OptionalIntFloat = Union[OptionalInt, float]
OptionalIntFloatDecimal = Union[OptionalIntFloat, Decimal]
OptionalDate = Optional[date]
StrIntFloat = Union[str, int, float]

if TYPE_CHECKING:
    from typing_extensions import Annotated

    from pydantic.v1.dataclasses import Dataclass
    from pydantic.v1.main import BaseModel
    from pydantic.v1.typing import CallableGenerator

    ModelOrDc = Type[Union[BaseModel, Dataclass]]

T = TypeVar('T')
_DEFINED_TYPES: 'WeakSet[type]' = WeakSet()


@overload
def _registered(typ: Type[T]) -> Type[T]:
    pass


@overload
def _registered(typ: 'ConstrainedNumberMeta') -> 'ConstrainedNumberMeta':
    pass


def _registered(typ: Union[Type[T], 'ConstrainedNumberMeta']) -> Union[Type[T], 'ConstrainedNumberMeta']:
    # In order to generate valid examples of constrained types, Hypothesis needs
    # to inspect the type object - so we keep a weakref to each contype object
    # until it can be registered. When (or if) our Hypothesis plugin is loaded,
    # it monkeypatches this function.
    # If Hypothesis is never used, the total effect is to keep a weak reference
    # which has minimal memory usage and doesn't even affect garbage collection.
    _DEFINED_TYPES.add(typ)
    return typ


class ConstrainedNumberMeta(type):
    def __new__(cls, name: str, bases: Any, dct: Dict[str, Any]) -> 'ConstrainedInt':  # type: ignore
        new_cls = cast('ConstrainedInt', type.__new__(cls, name, bases, dct))

        if new_cls.gt is not None and new_cls.ge is not None:
            raise errors.ConfigError('bounds gt and ge cannot be specified at the same time')
        if new_cls.lt is not None and new_cls.le is not None:
            raise errors.ConfigError('bounds lt and le cannot be specified at the same time')

        return _registered(new_cls)  # type: ignore


# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ BOOLEAN TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

if TYPE_CHECKING:
    StrictBool = bool
else:

    class StrictBool(int):
        """
        StrictBool to allow for bools which are not type-coerced.
        """

        @classmethod
        def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None:
            field_schema.update(type='boolean')

        @classmethod
        def __get_validators__(cls) -> 'CallableGenerator':
            yield cls.validate

        @classmethod
        def validate(cls, value: Any) -> bool:
            """
            Ensure that we only allow bools.
            """
            if isinstance(value, bool):
                return value

            raise errors.StrictBoolError()


# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ INTEGER TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


class ConstrainedInt(int, metaclass=ConstrainedNumberMeta):
    strict: bool = False
    gt: OptionalInt = None
    ge: OptionalInt = None
    lt: OptionalInt = None
    le: OptionalInt = None
    multiple_of: OptionalInt = None

    @classmethod
    def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None:
        update_not_none(
            field_schema,
            exclusiveMinimum=cls.gt,
            exclusiveMaximum=cls.lt,
            minimum=cls.ge,
            maximum=cls.le,
            multipleOf=cls.multiple_of,
        )

    @classmethod
    def __get_validators__(cls) -> 'CallableGenerator':
        yield strict_int_validator if cls.strict else int_validator
        yield number_size_validator
        yield number_multiple_validator


def conint(
    *,
    strict: bool = False,
    gt: Optional[int] = None,
    ge: Optional[int] = None,
    lt: Optional[int] = None,
    le: Optional[int] = None,
    multiple_of: Optional[int] = None,
) -> Type[int]:
    # use kwargs then define conf in a dict to aid with IDE type hinting
    namespace = dict(strict=strict, gt=gt, ge=ge, lt=lt, le=le, multiple_of=multiple_of)
    return type('ConstrainedIntValue', (ConstrainedInt,), namespace)


if TYPE_CHECKING:
    PositiveInt = int
    NegativeInt = int
    NonPositiveInt = int
    NonNegativeInt = int
    StrictInt = int
else:

    class PositiveInt(ConstrainedInt):
        gt = 0

    class NegativeInt(ConstrainedInt):
        lt = 0

    class NonPositiveInt(ConstrainedInt):
        le = 0

    class NonNegativeInt(ConstrainedInt):
        ge = 0

    class StrictInt(ConstrainedInt):
        strict = True


# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ FLOAT TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


class ConstrainedFloat(float, metaclass=ConstrainedNumberMeta):
    strict: bool = False
    gt: OptionalIntFloat = None
    ge: OptionalIntFloat = None
    lt: OptionalIntFloat = None
    le: OptionalIntFloat = None
    multiple_of: OptionalIntFloat = None
    allow_inf_nan: Optional[bool] = None

    @classmethod
    def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None:
        update_not_none(
            field_schema,
            exclusiveMinimum=cls.gt,
            exclusiveMaximum=cls.lt,
            minimum=cls.ge,
            maximum=cls.le,
            multipleOf=cls.multiple_of,
        )
        # Modify constraints to account for differences between IEEE floats and JSON
        if field_schema.get('exclusiveMinimum') == -math.inf:
            del field_schema['exclusiveMinimum']
        if field_schema.get('minimum') == -math.inf:
            del field_schema['minimum']
        if field_schema.get('exclusiveMaximum') == math.inf:
            del field_schema['exclusiveMaximum']
        if field_schema.get('maximum') == math.inf:
            del field_schema['maximum']

    @classmethod
    def __get_validators__(cls) -> 'CallableGenerator':
        yield strict_float_validator if cls.strict else float_validator
        yield number_size_validator
        yield number_multiple_validator
        yield float_finite_validator


def confloat(
    *,
    strict: bool = False,
    gt: float = None,
    ge: float = None,
    lt: float = None,
    le: float = None,
    multiple_of: float = None,
    allow_inf_nan: Optional[bool] = None,
) -> Type[float]:
    # use kwargs then define conf in a dict to aid with IDE type hinting
    namespace = dict(strict=strict, gt=gt, ge=ge, lt=lt, le=le, multiple_of=multiple_of, allow_inf_nan=allow_inf_nan)
    return type('ConstrainedFloatValue', (ConstrainedFloat,), namespace)


if TYPE_CHECKING:
    PositiveFloat = float
    NegativeFloat = float
    NonPositiveFloat = float
    NonNegativeFloat = float
    StrictFloat = float
    FiniteFloat = float
else:

    class PositiveFloat(ConstrainedFloat):
        gt = 0

    class NegativeFloat(ConstrainedFloat):
        lt = 0

    class NonPositiveFloat(ConstrainedFloat):
        le = 0

    class NonNegativeFloat(ConstrainedFloat):
        ge = 0

    class StrictFloat(ConstrainedFloat):
        strict = True

    class FiniteFloat(ConstrainedFloat):
        allow_inf_nan = False


# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ BYTES TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


class ConstrainedBytes(bytes):
    strip_whitespace = False
    to_upper = False
    to_lower = False
    min_length: OptionalInt = None
    max_length: OptionalInt = None
    strict: bool = False

    @classmethod
    def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None:
        update_not_none(field_schema,
                        minLength=cls.min_length, maxLength=cls.max_length)

    @classmethod
    def __get_validators__(cls) -> 'CallableGenerator':
        yield strict_bytes_validator if cls.strict else bytes_validator
        yield constr_strip_whitespace
        yield constr_upper
        yield constr_lower
        yield constr_length_validator


def conbytes(
    *,
    strip_whitespace: bool = False,
    to_upper: bool = False,
    to_lower: bool = False,
    min_length: Optional[int] = None,
    max_length: Optional[int] = None,
    strict: bool = False,
) -> Type[bytes]:
    # use kwargs then define conf in a dict to aid with IDE type hinting
    namespace = dict(
        strip_whitespace=strip_whitespace,
        to_upper=to_upper,
        to_lower=to_lower,
        min_length=min_length,
        max_length=max_length,
        strict=strict,
    )
    return _registered(type('ConstrainedBytesValue', (ConstrainedBytes,), namespace))


if TYPE_CHECKING:
    StrictBytes = bytes
else:

    class StrictBytes(ConstrainedBytes):
        strict = True


# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ STRING TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


class ConstrainedStr(str):
    strip_whitespace = False
    to_upper = False
    to_lower = False
    min_length: OptionalInt = None
    max_length: OptionalInt = None
    curtail_length: OptionalInt = None
    regex: Optional[Union[str, Pattern[str]]] = None
    strict = False

    @classmethod
    def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None:
        update_not_none(
            field_schema,
            minLength=cls.min_length,
            maxLength=cls.max_length,
            pattern=cls.regex and cls._get_pattern(cls.regex),
        )

    @classmethod
    def __get_validators__(cls) -> 'CallableGenerator':
        yield strict_str_validator if cls.strict else str_validator
        yield constr_strip_whitespace
        yield constr_upper
        yield constr_lower
        yield constr_length_validator
        yield cls.validate

    @classmethod
    def validate(cls, value: Union[str]) -> Union[str]:
        if cls.curtail_length and len(value) > cls.curtail_length:
            value = value[: cls.curtail_length]

        if cls.regex:
            if not re.match(cls.regex, value):
                raise errors.StrRegexError(pattern=cls._get_pattern(cls.regex))

        return value

    @staticmethod
    def _get_pattern(regex:
                     Union[str, Pattern[str]]) -> str:
        return regex if isinstance(regex, str) else regex.pattern


def constr(
    *,
    strip_whitespace: bool = False,
    to_upper: bool = False,
    to_lower: bool = False,
    strict: bool = False,
    min_length: Optional[int] = None,
    max_length: Optional[int] = None,
    curtail_length: Optional[int] = None,
    regex: Optional[str] = None,
) -> Type[str]:
    # use kwargs then define conf in a dict to aid with IDE type hinting
    namespace = dict(
        strip_whitespace=strip_whitespace,
        to_upper=to_upper,
        to_lower=to_lower,
        strict=strict,
        min_length=min_length,
        max_length=max_length,
        curtail_length=curtail_length,
        regex=regex and re.compile(regex),
    )
    return _registered(type('ConstrainedStrValue', (ConstrainedStr,), namespace))


if TYPE_CHECKING:
    StrictStr = str
else:

    class StrictStr(ConstrainedStr):
        strict = True


# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ SET TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


# This types superclass should be Set[T], but cython chokes on that...
class ConstrainedSet(set):  # type: ignore
    # Needed for pydantic to detect that this is a set
    __origin__ = set
    __args__: Set[Type[T]]  # type: ignore

    min_items: Optional[int] = None
    max_items: Optional[int] = None
    item_type: Type[T]  # type: ignore

    @classmethod
    def __get_validators__(cls) -> 'CallableGenerator':
        yield cls.set_length_validator

    @classmethod
    def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None:
        update_not_none(field_schema, minItems=cls.min_items, maxItems=cls.max_items)

    @classmethod
    def set_length_validator(cls, v: 'Optional[Set[T]]') -> 'Optional[Set[T]]':
        if v is None:
            return None

        v = set_validator(v)
        v_len = len(v)

        if cls.min_items is not None and v_len < cls.min_items:
            raise errors.SetMinLengthError(limit_value=cls.min_items)

        if cls.max_items is not None and v_len > cls.max_items:
            raise errors.SetMaxLengthError(limit_value=cls.max_items)

        return v


def conset(item_type: Type[T], *, min_items: Optional[int] = None, max_items: Optional[int] = None) -> Type[Set[T]]:
    # __args__ is needed to conform to typing generics api
    namespace = {'min_items': min_items, 'max_items': max_items, 'item_type': item_type, '__args__': [item_type]}
    # We use new_class to be able to deal with Generic types
    return new_class('ConstrainedSetValue', (ConstrainedSet,), {}, lambda ns: ns.update(namespace))


# This types superclass should be FrozenSet[T], but cython chokes on that...
class ConstrainedFrozenSet(frozenset):  # type: ignore
    # Needed for pydantic to detect that this is a set
    __origin__ = frozenset
    __args__: FrozenSet[Type[T]]  # type: ignore

    min_items: Optional[int] = None
    max_items: Optional[int] = None
    item_type: Type[T]  # type: ignore

    @classmethod
    def __get_validators__(cls) -> 'CallableGenerator':
        yield cls.frozenset_length_validator

    @classmethod
    def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None:
        update_not_none(field_schema, minItems=cls.min_items, maxItems=cls.max_items)

    @classmethod
    def frozenset_length_validator(cls, v: 'Optional[FrozenSet[T]]') -> 'Optional[FrozenSet[T]]':
        if v is None:
            return None

        v = frozenset_validator(v)
        v_len = len(v)

        if cls.min_items is not None and v_len < cls.min_items:
            raise errors.FrozenSetMinLengthError(limit_value=cls.min_items)

        if cls.max_items is not None and v_len > cls.max_items:
            raise errors.FrozenSetMaxLengthError(limit_value=cls.max_items)

        return v


def confrozenset(
    item_type: Type[T], *, min_items: Optional[int] = None, max_items: Optional[int] = None
) -> Type[FrozenSet[T]]:
    # __args__ is needed to conform to typing generics api
    namespace = {'min_items': min_items, 'max_items': max_items, 'item_type': item_type, '__args__': [item_type]}
    # We use new_class to be able to deal with Generic types
    return new_class('ConstrainedFrozenSetValue', (ConstrainedFrozenSet,), {}, lambda ns: ns.update(namespace))


# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ LIST TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


# This types superclass should be List[T], but cython chokes on that...
class ConstrainedList(list): # type: ignore # Needed for pydantic to detect that this is a list __origin__ = list __args__: Tuple[Type[T], ...] # type: ignore min_items: Optional[int] = None max_items: Optional[int] = None unique_items: Optional[bool] = None item_type: Type[T] # type: ignore @classmethod def __get_validators__(cls) -> 'CallableGenerator': yield cls.list_length_validator if cls.unique_items: yield cls.unique_items_validator @classmethod def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None: update_not_none(field_schema, minItems=cls.min_items, maxItems=cls.max_items, uniqueItems=cls.unique_items) @classmethod def list_length_validator(cls, v: 'Optional[List[T]]') -> 'Optional[List[T]]': if v is None: return None v = list_validator(v) v_len = len(v) if cls.min_items is not None and v_len < cls.min_items: raise errors.ListMinLengthError(limit_value=cls.min_items) if cls.max_items is not None and v_len > cls.max_items: raise errors.ListMaxLengthError(limit_value=cls.max_items) return v @classmethod def unique_items_validator(cls, v: 'Optional[List[T]]') -> 'Optional[List[T]]': if v is None: return None for i, value in enumerate(v, start=1): if value in v[i:]: raise errors.ListUniqueItemsError() return v def conlist( item_type: Type[T], *, min_items: Optional[int] = None, max_items: Optional[int] = None, unique_items: bool = None ) -> Type[List[T]]: # __args__ is needed to conform to typing generics api namespace = dict( min_items=min_items, max_items=max_items, unique_items=unique_items, item_type=item_type, __args__=(item_type,) ) # We use new_class to be able to deal with Generic types return new_class('ConstrainedListValue', (ConstrainedList,), {}, lambda ns: ns.update(namespace)) # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ PYOBJECT TYPE ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ if TYPE_CHECKING: PyObject = Callable[..., Any] else: class PyObject: validate_always = True @classmethod def __get_validators__(cls) -> 'CallableGenerator': yield cls.validate 
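# The `unique_items_validator` above deliberately scans `v[i:]` with `in` rather than
# building a set, so that unhashable items (lists, dicts, ...) can still be checked.
# A minimal standalone sketch of that duplicate check (`has_duplicates` is our name
# for illustration, not part of pydantic):

```python
from typing import Any, List

def has_duplicates(v: List[Any]) -> bool:
    # For each element, look for another occurrence later in the list.
    # O(n^2), but works for unhashable items where set() would raise TypeError.
    for i, value in enumerate(v, start=1):
        if value in v[i:]:
            return True
    return False
```

# With hashable items a set-based check would be O(n); the quadratic scan is the
# price paid for supporting arbitrary item types.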
@classmethod def validate(cls, value: Any) -> Any: if isinstance(value, Callable): return value try: value = str_validator(value) except errors.StrError: raise errors.PyObjectError(error_message='value is neither a valid import path nor a valid callable') try: return import_string(value) except ImportError as e: raise errors.PyObjectError(error_message=str(e)) # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ DECIMAL TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ class ConstrainedDecimal(Decimal, metaclass=ConstrainedNumberMeta): gt: OptionalIntFloatDecimal = None ge: OptionalIntFloatDecimal = None lt: OptionalIntFloatDecimal = None le: OptionalIntFloatDecimal = None max_digits: OptionalInt = None decimal_places: OptionalInt = None multiple_of: OptionalIntFloatDecimal = None @classmethod def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None: update_not_none( field_schema, exclusiveMinimum=cls.gt, exclusiveMaximum=cls.lt, minimum=cls.ge, maximum=cls.le, multipleOf=cls.multiple_of, ) @classmethod def __get_validators__(cls) -> 'CallableGenerator': yield decimal_validator yield number_size_validator yield number_multiple_validator yield cls.validate @classmethod def validate(cls, value: Decimal) -> Decimal: try: normalized_value = value.normalize() except InvalidOperation: normalized_value = value digit_tuple, exponent = normalized_value.as_tuple()[1:] if exponent in {'F', 'n', 'N'}: raise errors.DecimalIsNotFiniteError() if exponent >= 0: # A positive exponent adds that many trailing zeros. digits = len(digit_tuple) + exponent decimals = 0 else: # If the absolute value of the negative exponent is larger than the # number of digits, then it's the same as the number of digits, # because it'll consume all of the digits in digit_tuple and then # add abs(exponent) - len(digit_tuple) leading zeros after the # decimal point.
if abs(exponent) > len(digit_tuple): digits = decimals = abs(exponent) else: digits = len(digit_tuple) decimals = abs(exponent) whole_digits = digits - decimals if cls.max_digits is not None and digits > cls.max_digits: raise errors.DecimalMaxDigitsError(max_digits=cls.max_digits) if cls.decimal_places is not None and decimals > cls.decimal_places: raise errors.DecimalMaxPlacesError(decimal_places=cls.decimal_places) if cls.max_digits is not None and cls.decimal_places is not None: expected = cls.max_digits - cls.decimal_places if whole_digits > expected: raise errors.DecimalWholeDigitsError(whole_digits=expected) return value def condecimal( *, gt: Decimal = None, ge: Decimal = None, lt: Decimal = None, le: Decimal = None, max_digits: Optional[int] = None, decimal_places: Optional[int] = None, multiple_of: Decimal = None, ) -> Type[Decimal]: # use kwargs then define conf in a dict to aid with IDE type hinting namespace = dict( gt=gt, ge=ge, lt=lt, le=le, max_digits=max_digits, decimal_places=decimal_places, multiple_of=multiple_of ) return type('ConstrainedDecimalValue', (ConstrainedDecimal,), namespace) # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ UUID TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ if TYPE_CHECKING: UUID1 = UUID UUID3 = UUID UUID4 = UUID UUID5 = UUID else: class UUID1(UUID): _required_version = 1 @classmethod def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None: field_schema.update(type='string', format=f'uuid{cls._required_version}') class UUID3(UUID1): _required_version = 3 class UUID4(UUID1): _required_version = 4 class UUID5(UUID1): _required_version = 5 # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ PATH TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ if TYPE_CHECKING: FilePath = Path DirectoryPath = Path else: class FilePath(Path): @classmethod def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None: field_schema.update(format='file-path') @classmethod def __get_validators__(cls) -> 'CallableGenerator': yield path_validator yield path_exists_validator 
yield cls.validate @classmethod def validate(cls, value: Path) -> Path: if not value.is_file(): raise errors.PathNotAFileError(path=value) return value class DirectoryPath(Path): @classmethod def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None: field_schema.update(format='directory-path') @classmethod def __get_validators__(cls) -> 'CallableGenerator': yield path_validator yield path_exists_validator yield cls.validate @classmethod def validate(cls, value: Path) -> Path: if not value.is_dir(): raise errors.PathNotADirectoryError(path=value) return value # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ JSON TYPE ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ class JsonWrapper: pass class JsonMeta(type): def __getitem__(self, t: Type[Any]) -> Type[JsonWrapper]: if t is Any: return Json # allow Json[Any] to replicate plain Json return _registered(type('JsonWrapperValue', (JsonWrapper,), {'inner_type': t})) if TYPE_CHECKING: Json = Annotated[T, ...] # Json[list[str]] will be recognized by type checkers as list[str] else: class Json(metaclass=JsonMeta): @classmethod def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None: field_schema.update(type='string', format='json-string') # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ SECRET TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ class SecretField(abc.ABC): """ Note: this should be implemented as a generic like `SecretField(ABC, Generic[T])`, the `__init__()` should be part of the abstract class and the `get_secret_value()` method should use the generic `T` type. However, Cython doesn't support generics very well at the moment and the generated code fails to be imported (see https://github.com/cython/cython/issues/2753).
""" def __eq__(self, other: Any) -> bool: return isinstance(other, self.__class__) and self.get_secret_value() == other.get_secret_value() def __str__(self) -> str: return '**********' if self.get_secret_value() else '' def __hash__(self) -> int: return hash(self.get_secret_value()) @abc.abstractmethod def get_secret_value(self) -> Any: # pragma: no cover ... class SecretStr(SecretField): min_length: OptionalInt = None max_length: OptionalInt = None @classmethod def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None: update_not_none( field_schema, type='string', writeOnly=True, format='password', minLength=cls.min_length, maxLength=cls.max_length, ) @classmethod def __get_validators__(cls) -> 'CallableGenerator': yield cls.validate yield constr_length_validator @classmethod def validate(cls, value: Any) -> 'SecretStr': if isinstance(value, cls): return value value = str_validator(value) return cls(value) def __init__(self, value: str): self._secret_value = value def __repr__(self) -> str: return f"SecretStr('{self}')" def __len__(self) -> int: return len(self._secret_value) def display(self) -> str: warnings.warn('`secret_str.display()` is deprecated, use `str(secret_str)` instead', DeprecationWarning) return str(self) def get_secret_value(self) -> str: return self._secret_value class SecretBytes(SecretField): min_length: OptionalInt = None max_length: OptionalInt = None @classmethod def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None: update_not_none( field_schema, type='string', writeOnly=True, format='password', minLength=cls.min_length, maxLength=cls.max_length, ) @classmethod def __get_validators__(cls) -> 'CallableGenerator': yield cls.validate yield constr_length_validator @classmethod def validate(cls, value: Any) -> 'SecretBytes': if isinstance(value, cls): return value value = bytes_validator(value) return cls(value) def __init__(self, value: bytes): self._secret_value = value def __repr__(self) -> str: return 
f"SecretBytes(b'{self}')" def __len__(self) -> int: return len(self._secret_value) def display(self) -> str: warnings.warn('`secret_bytes.display()` is deprecated, use `str(secret_bytes)` instead', DeprecationWarning) return str(self) def get_secret_value(self) -> bytes: return self._secret_value # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ PAYMENT CARD TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ class PaymentCardBrand(str, Enum): # If you add another card type, please also add it to the # Hypothesis strategy in `pydantic._hypothesis_plugin`. amex = 'American Express' mastercard = 'Mastercard' visa = 'Visa' other = 'other' def __str__(self) -> str: return self.value class PaymentCardNumber(str): """ Based on: https://en.wikipedia.org/wiki/Payment_card_number """ strip_whitespace: ClassVar[bool] = True min_length: ClassVar[int] = 12 max_length: ClassVar[int] = 19 bin: str last4: str brand: PaymentCardBrand def __init__(self, card_number: str): self.bin = card_number[:6] self.last4 = card_number[-4:] self.brand = self._get_brand(card_number) @classmethod def __get_validators__(cls) -> 'CallableGenerator': yield str_validator yield constr_strip_whitespace yield constr_length_validator yield cls.validate_digits yield cls.validate_luhn_check_digit yield cls yield cls.validate_length_for_brand @property def masked(self) -> str: num_masked = len(self) - 10 # len(bin) + len(last4) == 10 return f'{self.bin}{"*" * num_masked}{self.last4}' @classmethod def validate_digits(cls, card_number: str) -> str: if not card_number.isdigit(): raise errors.NotDigitError return card_number @classmethod def validate_luhn_check_digit(cls, card_number: str) -> str: """ Based on: https://en.wikipedia.org/wiki/Luhn_algorithm """ sum_ = int(card_number[-1]) length = len(card_number) parity = length % 2 for i in range(length - 1): digit = int(card_number[i]) if i % 2 == parity: digit *= 2 if digit > 9: digit -= 9 sum_ += digit valid = sum_ % 10 == 0 if not valid: raise errors.LuhnValidationError return card_number 
@classmethod def validate_length_for_brand(cls, card_number: 'PaymentCardNumber') -> 'PaymentCardNumber': """ Validate length based on BIN for major brands: https://en.wikipedia.org/wiki/Payment_card_number#Issuer_identification_number_(IIN) """ required_length: Union[None, int, str] = None if card_number.brand in PaymentCardBrand.mastercard: required_length = 16 valid = len(card_number) == required_length elif card_number.brand == PaymentCardBrand.visa: required_length = '13, 16 or 19' valid = len(card_number) in {13, 16, 19} elif card_number.brand == PaymentCardBrand.amex: required_length = 15 valid = len(card_number) == required_length else: valid = True if not valid: raise errors.InvalidLengthForBrand(brand=card_number.brand, required_length=required_length) return card_number @staticmethod def _get_brand(card_number: str) -> PaymentCardBrand: if card_number[0] == '4': brand = PaymentCardBrand.visa elif 51 <= int(card_number[:2]) <= 55: brand = PaymentCardBrand.mastercard elif card_number[:2] in {'34', '37'}: brand = PaymentCardBrand.amex else: brand = PaymentCardBrand.other return brand # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ BYTE SIZE TYPE ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ BYTE_SIZES = { 'b': 1, 'kb': 10**3, 'mb': 10**6, 'gb': 10**9, 'tb': 10**12, 'pb': 10**15, 'eb': 10**18, 'kib': 2**10, 'mib': 2**20, 'gib': 2**30, 'tib': 2**40, 'pib': 2**50, 'eib': 2**60, } BYTE_SIZES.update({k.lower()[0]: v for k, v in BYTE_SIZES.items() if 'i' not in k}) byte_string_re = re.compile(r'^\s*(\d*\.?\d+)\s*(\w+)?', re.IGNORECASE) class ByteSize(int): @classmethod def __get_validators__(cls) -> 'CallableGenerator': yield cls.validate @classmethod def validate(cls, v: StrIntFloat) -> 'ByteSize': try: return cls(int(v)) except ValueError: pass str_match = byte_string_re.match(str(v)) if str_match is None: raise errors.InvalidByteSize() scalar, unit = str_match.groups() if unit is None: unit = 'b' try: unit_mult = BYTE_SIZES[unit.lower()] except KeyError: raise 
errors.InvalidByteSizeUnit(unit=unit) return cls(int(float(scalar) * unit_mult)) def human_readable(self, decimal: bool = False) -> str: if decimal: divisor = 1000 units = ['B', 'KB', 'MB', 'GB', 'TB', 'PB'] final_unit = 'EB' else: divisor = 1024 units = ['B', 'KiB', 'MiB', 'GiB', 'TiB', 'PiB'] final_unit = 'EiB' num = float(self) for unit in units: if abs(num) < divisor: return f'{num:0.1f}{unit}' num /= divisor return f'{num:0.1f}{final_unit}' def to(self, unit: str) -> float: try: unit_div = BYTE_SIZES[unit.lower()] except KeyError: raise errors.InvalidByteSizeUnit(unit=unit) return self / unit_div # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ DATE TYPES ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ if TYPE_CHECKING: PastDate = date FutureDate = date else: class PastDate(date): @classmethod def __get_validators__(cls) -> 'CallableGenerator': yield parse_date yield cls.validate @classmethod def validate(cls, value: date) -> date: if value >= date.today(): raise errors.DateNotInThePastError() return value class FutureDate(date): @classmethod def __get_validators__(cls) -> 'CallableGenerator': yield parse_date yield cls.validate @classmethod def validate(cls, value: date) -> date: if value <= date.today(): raise errors.DateNotInTheFutureError() return value class ConstrainedDate(date, metaclass=ConstrainedNumberMeta): gt: OptionalDate = None ge: OptionalDate = None lt: OptionalDate = None le: OptionalDate = None @classmethod def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None: update_not_none(field_schema, exclusiveMinimum=cls.gt, exclusiveMaximum=cls.lt, minimum=cls.ge, maximum=cls.le) @classmethod def __get_validators__(cls) -> 'CallableGenerator': yield parse_date yield number_size_validator def condate( *, gt: date = None, ge: date = None, lt: date = None, le: date = None, ) -> Type[date]: # use kwargs then define conf in a dict to aid with IDE type hinting namespace = dict(gt=gt, ge=ge, lt=lt, le=le) return type('ConstrainedDateValue', (ConstrainedDate,), namespace) 
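# `ByteSize.validate` above falls back from a plain `int()` cast to a regex split into
# scalar and unit, then multiplies by the unit table. A standalone sketch of that
# parsing path (abridged unit table; `parse_byte_size` is our name for illustration):

```python
import re

# abridged copy of the BYTE_SIZES table above
BYTE_SIZES = {'b': 1, 'kb': 10**3, 'mb': 10**6, 'gb': 10**9, 'kib': 2**10, 'mib': 2**20}
byte_string_re = re.compile(r'^\s*(\d*\.?\d+)\s*(\w+)?', re.IGNORECASE)

def parse_byte_size(v: str) -> int:
    match = byte_string_re.match(str(v))
    if match is None:
        raise ValueError(f'could not parse byte size {v!r}')
    scalar, unit = match.groups()
    # a bare number is taken as bytes, mirroring the `unit = 'b'` default above
    return int(float(scalar) * BYTE_SIZES[(unit or 'b').lower()])
```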
# pydantic-2.10.6/pydantic/v1/typing.py
import sys import typing from collections.abc import Callable from os import PathLike from typing import ( # type: ignore TYPE_CHECKING, AbstractSet, Any, Callable as TypingCallable, ClassVar, Dict, ForwardRef, Generator, Iterable, List, Mapping, NewType, Optional, Sequence, Set, Tuple, Type, TypeVar, Union, _eval_type, cast, get_type_hints, ) from typing_extensions import ( Annotated, Final, Literal, NotRequired as TypedDictNotRequired, Required as TypedDictRequired, ) try: from typing import _TypingBase as typing_base # type: ignore except ImportError: from typing import _Final as typing_base # type: ignore try: from typing import GenericAlias as TypingGenericAlias # type: ignore except ImportError: # python < 3.9 does not have GenericAlias (list[int], tuple[str, ...] and so on) TypingGenericAlias = () try: from types import UnionType as TypesUnionType # type: ignore except ImportError: # python < 3.10 does not have UnionType (str | int, byte | bool and so on) TypesUnionType = () if sys.version_info < (3, 9): def evaluate_forwardref(type_: ForwardRef, globalns: Any, localns: Any) -> Any: return type_._evaluate(globalns, localns) else: def evaluate_forwardref(type_: ForwardRef, globalns: Any, localns: Any) -> Any: # Even though it is the right signature for python 3.9, mypy complains with # `error: Too many arguments for "_evaluate" of "ForwardRef"` hence the cast... # Python 3.13/3.12.4+ made `recursive_guard` a kwarg, so name it explicitly to avoid: # TypeError: ForwardRef._evaluate() missing 1 required keyword-only argument: 'recursive_guard' return cast(Any, type_)._evaluate(globalns, localns, recursive_guard=set()) if sys.version_info < (3, 9): # Ensure we always get the whole `Annotated` hint, not just the annotated type.
# For 3.7 to 3.8, `get_type_hints` doesn't recognize `typing_extensions.Annotated`, # so it already returns the full annotation get_all_type_hints = get_type_hints else: def get_all_type_hints(obj: Any, globalns: Any = None, localns: Any = None) -> Any: return get_type_hints(obj, globalns, localns, include_extras=True) _T = TypeVar('_T') AnyCallable = TypingCallable[..., Any] NoArgAnyCallable = TypingCallable[[], Any] # workaround for https://github.com/python/mypy/issues/9496 AnyArgTCallable = TypingCallable[..., _T] # Annotated[...] is implemented by returning an instance of one of these classes, depending on # python/typing_extensions version. AnnotatedTypeNames = {'AnnotatedMeta', '_AnnotatedAlias'} LITERAL_TYPES: Set[Any] = {Literal} if hasattr(typing, 'Literal'): LITERAL_TYPES.add(typing.Literal) if sys.version_info < (3, 8): def get_origin(t: Type[Any]) -> Optional[Type[Any]]: if type(t).__name__ in AnnotatedTypeNames: # weirdly this is a runtime requirement, as well as for mypy return cast(Type[Any], Annotated) return getattr(t, '__origin__', None) else: from typing import get_origin as _typing_get_origin def get_origin(tp: Type[Any]) -> Optional[Type[Any]]: """ We can't directly use `typing.get_origin` since we need a fallback to support custom generic classes like `ConstrainedList` It should be useless once https://github.com/cython/cython/issues/3537 is solved and https://github.com/pydantic/pydantic/pull/1753 is merged. """ if type(tp).__name__ in AnnotatedTypeNames: return cast(Type[Any], Annotated) # mypy complains about _SpecialForm return _typing_get_origin(tp) or getattr(tp, '__origin__', None) if sys.version_info < (3, 8): from typing import _GenericAlias def get_args(t: Type[Any]) -> Tuple[Any, ...]: """Compatibility version of get_args for python 3.7. Mostly compatible with the python 3.8 `typing` module version and able to handle almost all use cases. 
""" if type(t).__name__ in AnnotatedTypeNames: return t.__args__ + t.__metadata__ if isinstance(t, _GenericAlias): res = t.__args__ if t.__origin__ is Callable and res and res[0] is not Ellipsis: res = (list(res[:-1]), res[-1]) return res return getattr(t, '__args__', ()) else: from typing import get_args as _typing_get_args def _generic_get_args(tp: Type[Any]) -> Tuple[Any, ...]: """ In python 3.9, `typing.Dict`, `typing.List`, ... do have an empty `__args__` by default (instead of the generic ~T for example). In order to still support `Dict` for example and consider it as `Dict[Any, Any]`, we retrieve the `_nparams` value that tells us how many parameters it needs. """ if hasattr(tp, '_nparams'): return (Any,) * tp._nparams # Special case for `tuple[()]`, which used to return ((),) with `typing.Tuple` # in python 3.10- but now returns () for `tuple` and `Tuple`. # This will probably be clarified in pydantic v2 try: if tp == Tuple[()] or sys.version_info >= (3, 9) and tp == tuple[()]: # type: ignore[misc] return ((),) # there is a TypeError when compiled with cython except TypeError: # pragma: no cover pass return () def get_args(tp: Type[Any]) -> Tuple[Any, ...]: """Get type arguments with all substitutions performed. For unions, basic simplifications used by Union constructor are performed. Examples:: get_args(Dict[str, int]) == (str, int) get_args(int) == () get_args(Union[int, Union[T, int], str][int]) == (int, str) get_args(Union[int, Tuple[T, int]][str]) == (int, Tuple[str, int]) get_args(Callable[[], T][int]) == ([], int) """ if type(tp).__name__ in AnnotatedTypeNames: return tp.__args__ + tp.__metadata__ # the fallback is needed for the same reasons as `get_origin` (see above) return _typing_get_args(tp) or getattr(tp, '__args__', ()) or _generic_get_args(tp) if sys.version_info < (3, 9): def convert_generics(tp: Type[Any]) -> Type[Any]: """Python 3.9 and older only supports generics from `typing` module. They convert strings to ForwardRef automatically. 
Examples:: typing.List['Hero'] == typing.List[ForwardRef('Hero')] """ return tp else: from typing import _UnionGenericAlias # type: ignore from typing_extensions import _AnnotatedAlias def convert_generics(tp: Type[Any]) -> Type[Any]: """ Recursively searches for `str` type hints and replaces them with ForwardRef. Examples:: convert_generics(list['Hero']) == list[ForwardRef('Hero')] convert_generics(dict['Hero', 'Team']) == dict[ForwardRef('Hero'), ForwardRef('Team')] convert_generics(typing.Dict['Hero', 'Team']) == typing.Dict[ForwardRef('Hero'), ForwardRef('Team')] convert_generics(list[str | 'Hero'] | int) == list[str | ForwardRef('Hero')] | int """ origin = get_origin(tp) if not origin or not hasattr(tp, '__args__'): return tp args = get_args(tp) # typing.Annotated needs special treatment if origin is Annotated: return _AnnotatedAlias(convert_generics(args[0]), args[1:]) # recursively replace `str` instances inside of `GenericAlias` with `ForwardRef(arg)` converted = tuple( ForwardRef(arg) if isinstance(arg, str) and isinstance(tp, TypingGenericAlias) else convert_generics(arg) for arg in args ) if converted == args: return tp elif isinstance(tp, TypingGenericAlias): return TypingGenericAlias(origin, converted) elif isinstance(tp, TypesUnionType): # recreate types.UnionType (PEP604, Python >= 3.10) return _UnionGenericAlias(origin, converted) else: try: setattr(tp, '__args__', converted) except AttributeError: pass return tp if sys.version_info < (3, 10): def is_union(tp: Optional[Type[Any]]) -> bool: return tp is Union WithArgsTypes = (TypingGenericAlias,) else: import types import typing def is_union(tp: Optional[Type[Any]]) -> bool: return tp is Union or tp is types.UnionType # noqa: E721 WithArgsTypes = (typing._GenericAlias, types.GenericAlias, types.UnionType) StrPath = Union[str, PathLike] if TYPE_CHECKING: from pydantic.v1.fields import ModelField TupleGenerator = Generator[Tuple[str, Any], None, None] DictStrAny = Dict[str, Any] DictAny = Dict[Any, 
Any] SetStr = Set[str] ListStr = List[str] IntStr = Union[int, str] AbstractSetIntStr = AbstractSet[IntStr] DictIntStrAny = Dict[IntStr, Any] MappingIntStrAny = Mapping[IntStr, Any] CallableGenerator = Generator[AnyCallable, None, None] ReprArgs = Sequence[Tuple[Optional[str], Any]] MYPY = False if MYPY: AnyClassMethod = classmethod[Any] else: # classmethod[TargetType, CallableParamSpecType, CallableReturnType] AnyClassMethod = classmethod[Any, Any, Any] __all__ = ( 'AnyCallable', 'NoArgAnyCallable', 'NoneType', 'is_none_type', 'display_as_type', 'resolve_annotations', 'is_callable_type', 'is_literal_type', 'all_literal_values', 'is_namedtuple', 'is_typeddict', 'is_typeddict_special', 'is_new_type', 'new_type_supertype', 'is_classvar', 'is_finalvar', 'update_field_forward_refs', 'update_model_forward_refs', 'TupleGenerator', 'DictStrAny', 'DictAny', 'SetStr', 'ListStr', 'IntStr', 'AbstractSetIntStr', 'DictIntStrAny', 'CallableGenerator', 'ReprArgs', 'AnyClassMethod', 'WithArgsTypes', 'get_args', 'get_origin', 'get_sub_types', 'typing_base', 'get_all_type_hints', 'is_union', 'StrPath', 'MappingIntStrAny', ) NoneType = None.__class__ NONE_TYPES: Tuple[Any, Any, Any] = (None, NoneType, Literal[None]) if sys.version_info < (3, 8): # Even though this implementation is slower, we need it for python 3.7: # In python 3.7 "Literal" is not a builtin type and uses a different # mechanism. # for this reason `Literal[None] is Literal[None]` evaluates to `False`, # breaking the faster implementation used for the other python versions. def is_none_type(type_: Any) -> bool: return type_ in NONE_TYPES elif sys.version_info[:2] == (3, 8): def is_none_type(type_: Any) -> bool: for none_type in NONE_TYPES: if type_ is none_type: return True # With python 3.8, specifically 3.8.10, Literal "is" checks are very flaky # and can change on very subtle changes like the use of types in other modules; # hopefully this check avoids that issue.
if is_literal_type(type_): # pragma: no cover return all_literal_values(type_) == (None,) return False else: def is_none_type(type_: Any) -> bool: return type_ in NONE_TYPES def display_as_type(v: Type[Any]) -> str: if not isinstance(v, typing_base) and not isinstance(v, WithArgsTypes) and not isinstance(v, type): v = v.__class__ if is_union(get_origin(v)): return f'Union[{", ".join(map(display_as_type, get_args(v)))}]' if isinstance(v, WithArgsTypes): # Generic alias are constructs like `list[int]` return str(v).replace('typing.', '') try: return v.__name__ except AttributeError: # happens with typing objects return str(v).replace('typing.', '') def resolve_annotations(raw_annotations: Dict[str, Type[Any]], module_name: Optional[str]) -> Dict[str, Type[Any]]: """ Partially taken from typing.get_type_hints. Resolve string or ForwardRef annotations into type objects if possible. """ base_globals: Optional[Dict[str, Any]] = None if module_name: try: module = sys.modules[module_name] except KeyError: # happens occasionally, see https://github.com/pydantic/pydantic/issues/2363 pass else: base_globals = module.__dict__ annotations = {} for name, value in raw_annotations.items(): if isinstance(value, str): if (3, 10) > sys.version_info >= (3, 9, 8) or sys.version_info >= (3, 10, 1): value = ForwardRef(value, is_argument=False, is_class=True) else: value = ForwardRef(value, is_argument=False) try: if sys.version_info >= (3, 13): value = _eval_type(value, base_globals, None, type_params=()) else: value = _eval_type(value, base_globals, None) except NameError: # this is ok, it can be fixed with update_forward_refs pass annotations[name] = value return annotations def is_callable_type(type_: Type[Any]) -> bool: return type_ is Callable or get_origin(type_) is Callable def is_literal_type(type_: Type[Any]) -> bool: return Literal is not None and get_origin(type_) in LITERAL_TYPES def literal_values(type_: Type[Any]) -> Tuple[Any, ...]: return get_args(type_) def 
all_literal_values(type_: Type[Any]) -> Tuple[Any, ...]: """ This method is used to retrieve all Literal values as Literal can be used recursively (see https://www.python.org/dev/peps/pep-0586) e.g. `Literal[Literal[Literal[1, 2, 3], "foo"], 5, None]` """ if not is_literal_type(type_): return (type_,) values = literal_values(type_) return tuple(x for value in values for x in all_literal_values(value)) def is_namedtuple(type_: Type[Any]) -> bool: """ Check if a given class is a named tuple. It can be either a `typing.NamedTuple` or `collections.namedtuple` """ from pydantic.v1.utils import lenient_issubclass return lenient_issubclass(type_, tuple) and hasattr(type_, '_fields') def is_typeddict(type_: Type[Any]) -> bool: """ Check if a given class is a typed dict (from `typing` or `typing_extensions`) In 3.10, there will be a public method (https://docs.python.org/3.10/library/typing.html#typing.is_typeddict) """ from pydantic.v1.utils import lenient_issubclass return lenient_issubclass(type_, dict) and hasattr(type_, '__total__') def _check_typeddict_special(type_: Any) -> bool: return type_ is TypedDictRequired or type_ is TypedDictNotRequired def is_typeddict_special(type_: Any) -> bool: """ Check if type is a TypedDict special form (Required or NotRequired). 
""" return _check_typeddict_special(type_) or _check_typeddict_special(get_origin(type_)) test_type = NewType('test_type', str) def is_new_type(type_: Type[Any]) -> bool: """ Check whether type_ was created using typing.NewType """ return isinstance(type_, test_type.__class__) and hasattr(type_, '__supertype__') # type: ignore def new_type_supertype(type_: Type[Any]) -> Type[Any]: while hasattr(type_, '__supertype__'): type_ = type_.__supertype__ return type_ def _check_classvar(v: Optional[Type[Any]]) -> bool: if v is None: return False return v.__class__ == ClassVar.__class__ and getattr(v, '_name', None) == 'ClassVar' def _check_finalvar(v: Optional[Type[Any]]) -> bool: """ Check if a given type is a `typing.Final` type. """ if v is None: return False return v.__class__ == Final.__class__ and (sys.version_info < (3, 8) or getattr(v, '_name', None) == 'Final') def is_classvar(ann_type: Type[Any]) -> bool: if _check_classvar(ann_type) or _check_classvar(get_origin(ann_type)): return True # this is an ugly workaround for class vars that contain forward references and are therefore themselves # forward references, see #3679 if ann_type.__class__ == ForwardRef and ann_type.__forward_arg__.startswith('ClassVar['): return True return False def is_finalvar(ann_type: Type[Any]) -> bool: return _check_finalvar(ann_type) or _check_finalvar(get_origin(ann_type)) def update_field_forward_refs(field: 'ModelField', globalns: Any, localns: Any) -> None: """ Try to update ForwardRefs on fields based on this ModelField, globalns and localns. 
""" prepare = False if field.type_.__class__ == ForwardRef: prepare = True field.type_ = evaluate_forwardref(field.type_, globalns, localns or None) if field.outer_type_.__class__ == ForwardRef: prepare = True field.outer_type_ = evaluate_forwardref(field.outer_type_, globalns, localns or None) if prepare: field.prepare() if field.sub_fields: for sub_f in field.sub_fields: update_field_forward_refs(sub_f, globalns=globalns, localns=localns) if field.discriminator_key is not None: field.prepare_discriminated_union_sub_fields() def update_model_forward_refs( model: Type[Any], fields: Iterable['ModelField'], json_encoders: Dict[Union[Type[Any], str, ForwardRef], AnyCallable], localns: 'DictStrAny', exc_to_suppress: Tuple[Type[BaseException], ...] = (), ) -> None: """ Try to update model fields ForwardRefs based on model and localns. """ if model.__module__ in sys.modules: globalns = sys.modules[model.__module__].__dict__.copy() else: globalns = {} globalns.setdefault(model.__name__, model) for f in fields: try: update_field_forward_refs(f, globalns=globalns, localns=localns) except exc_to_suppress: pass for key in set(json_encoders.keys()): if isinstance(key, str): fr: ForwardRef = ForwardRef(key) elif isinstance(key, ForwardRef): fr = key else: continue try: new_key = evaluate_forwardref(fr, globalns, localns or None) except exc_to_suppress: # pragma: no cover continue json_encoders[new_key] = json_encoders.pop(key) def get_class(type_: Type[Any]) -> Union[None, bool, Type[Any]]: """ Tries to get the class of a Type[T] annotation. Returns True if Type is used without brackets. Otherwise returns None. 
""" if type_ is type: return True if get_origin(type_) is None: return None args = get_args(type_) if not args or not isinstance(args[0], type): return True else: return args[0] def get_sub_types(tp: Any) -> List[Any]: """ Return all the types that are allowed by type `tp` `tp` can be a `Union` of allowed types or an `Annotated` type """ origin = get_origin(tp) if origin is Annotated: return get_sub_types(get_args(tp)[0]) elif is_union(origin): return [x for t in get_args(tp) for x in get_sub_types(t)] else: return [tp] pydantic-2.10.6/pydantic/v1/utils.py000066400000000000000000000625251474456633400173200ustar00rootroot00000000000000import keyword import warnings import weakref from collections import OrderedDict, defaultdict, deque from copy import deepcopy from itertools import islice, zip_longest from types import BuiltinFunctionType, CodeType, FunctionType, GeneratorType, LambdaType, ModuleType from typing import ( TYPE_CHECKING, AbstractSet, Any, Callable, Collection, Dict, Generator, Iterable, Iterator, List, Mapping, NoReturn, Optional, Set, Tuple, Type, TypeVar, Union, ) from typing_extensions import Annotated from pydantic.v1.errors import ConfigError from pydantic.v1.typing import ( NoneType, WithArgsTypes, all_literal_values, display_as_type, get_args, get_origin, is_literal_type, is_union, ) from pydantic.v1.version import version_info if TYPE_CHECKING: from inspect import Signature from pathlib import Path from pydantic.v1.config import BaseConfig from pydantic.v1.dataclasses import Dataclass from pydantic.v1.fields import ModelField from pydantic.v1.main import BaseModel from pydantic.v1.typing import AbstractSetIntStr, DictIntStrAny, IntStr, MappingIntStrAny, ReprArgs RichReprResult = Iterable[Union[Any, Tuple[Any], Tuple[str, Any], Tuple[str, Any, Any]]] __all__ = ( 'import_string', 'sequence_like', 'validate_field_name', 'lenient_isinstance', 'lenient_issubclass', 'in_ipython', 'is_valid_identifier', 'deep_update', 'update_not_none', 
'almost_equal_floats', 'get_model', 'to_camel', 'to_lower_camel', 'is_valid_field', 'smart_deepcopy', 'PyObjectStr', 'Representation', 'GetterDict', 'ValueItems', 'version_info', # required here to match behaviour in v1.3 'ClassAttribute', 'path_type', 'ROOT_KEY', 'get_unique_discriminator_alias', 'get_discriminator_alias_and_values', 'DUNDER_ATTRIBUTES', ) ROOT_KEY = '__root__' # these are types that are returned unchanged by deepcopy IMMUTABLE_NON_COLLECTIONS_TYPES: Set[Type[Any]] = { int, float, complex, str, bool, bytes, type, NoneType, FunctionType, BuiltinFunctionType, LambdaType, weakref.ref, CodeType, # note: including ModuleType will differ from behaviour of deepcopy by not producing error. # It might be not a good idea in general, but considering that this function used only internally # against default values of fields, this will allow to actually have a field with module as default value ModuleType, NotImplemented.__class__, Ellipsis.__class__, } # these are types that if empty, might be copied with simple copy() instead of deepcopy() BUILTIN_COLLECTIONS: Set[Type[Any]] = { list, set, tuple, frozenset, dict, OrderedDict, defaultdict, deque, } def import_string(dotted_path: str) -> Any: """ Stolen approximately from django. Import a dotted module path and return the attribute/class designated by the last name in the path. Raise ImportError if the import fails. 
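The dotted-path lookup described above can be sketched standalone with `importlib` (`import_string_demo` is a hypothetical name mirroring the documented behaviour rather than calling pydantic itself):

```python
from importlib import import_module

def import_string_demo(dotted_path: str):
    """Import `pkg.mod.attr` and return the attribute; raise ImportError on failure."""
    try:
        module_path, attr_name = dotted_path.strip(' ').rsplit('.', 1)
    except ValueError as e:
        raise ImportError(f'"{dotted_path}" doesn\'t look like a module path') from e
    module = import_module(module_path)
    try:
        return getattr(module, attr_name)
    except AttributeError as e:
        raise ImportError(f'Module "{module_path}" has no attribute "{attr_name}"') from e
```

For example, `import_string_demo('math.pi')` imports `math` and returns the `pi` attribute.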
""" from importlib import import_module try: module_path, class_name = dotted_path.strip(' ').rsplit('.', 1) except ValueError as e: raise ImportError(f'"{dotted_path}" doesn\'t look like a module path') from e module = import_module(module_path) try: return getattr(module, class_name) except AttributeError as e: raise ImportError(f'Module "{module_path}" does not define a "{class_name}" attribute') from e def truncate(v: Union[str], *, max_len: int = 80) -> str: """ Truncate a value and add a unicode ellipsis (three dots) to the end if it was too long """ warnings.warn('`truncate` is no-longer used by pydantic and is deprecated', DeprecationWarning) if isinstance(v, str) and len(v) > (max_len - 2): # -3 so quote + string + … + quote has correct length return (v[: (max_len - 3)] + '…').__repr__() try: v = v.__repr__() except TypeError: v = v.__class__.__repr__(v) # in case v is a type if len(v) > max_len: v = v[: max_len - 1] + '…' return v def sequence_like(v: Any) -> bool: return isinstance(v, (list, tuple, set, frozenset, GeneratorType, deque)) def validate_field_name(bases: List[Type['BaseModel']], field_name: str) -> None: """ Ensure that the field's name does not shadow an existing attribute of the model. """ for base in bases: if getattr(base, field_name, None): raise NameError( f'Field name "{field_name}" shadows a BaseModel attribute; ' f'use a different field name with "alias=\'{field_name}\'".' 
) def lenient_isinstance(o: Any, class_or_tuple: Union[Type[Any], Tuple[Type[Any], ...], None]) -> bool: try: return isinstance(o, class_or_tuple) # type: ignore[arg-type] except TypeError: return False def lenient_issubclass(cls: Any, class_or_tuple: Union[Type[Any], Tuple[Type[Any], ...], None]) -> bool: try: return isinstance(cls, type) and issubclass(cls, class_or_tuple) # type: ignore[arg-type] except TypeError: if isinstance(cls, WithArgsTypes): return False raise # pragma: no cover def in_ipython() -> bool: """ Check whether we're in an ipython environment, including jupyter notebooks. """ try: eval('__IPYTHON__') except NameError: return False else: # pragma: no cover return True def is_valid_identifier(identifier: str) -> bool: """ Checks that a string is a valid identifier and not a Python keyword. :param identifier: The identifier to test. :return: True if the identifier is valid. """ return identifier.isidentifier() and not keyword.iskeyword(identifier) KeyType = TypeVar('KeyType') def deep_update(mapping: Dict[KeyType, Any], *updating_mappings: Dict[KeyType, Any]) -> Dict[KeyType, Any]: updated_mapping = mapping.copy() for updating_mapping in updating_mappings: for k, v in updating_mapping.items(): if k in updated_mapping and isinstance(updated_mapping[k], dict) and isinstance(v, dict): updated_mapping[k] = deep_update(updated_mapping[k], v) else: updated_mapping[k] = v return updated_mapping def update_not_none(mapping: Dict[Any, Any], **update: Any) -> None: mapping.update({k: v for k, v in update.items() if v is not None}) def almost_equal_floats(value_1: float, value_2: float, *, delta: float = 1e-8) -> bool: """ Return True if two floats are almost equal """ return abs(value_1 - value_2) <= delta def generate_model_signature( init: Callable[..., None], fields: Dict[str, 'ModelField'], config: Type['BaseConfig'] ) -> 'Signature': """ Generate signature for model based on its fields """ from inspect import Parameter, Signature, signature from 
pydantic.v1.config import Extra present_params = signature(init).parameters.values() merged_params: Dict[str, Parameter] = {} var_kw = None use_var_kw = False for param in islice(present_params, 1, None): # skip self arg if param.kind is param.VAR_KEYWORD: var_kw = param continue merged_params[param.name] = param if var_kw: # if custom init has no var_kw, fields which are not declared in it cannot be passed through allow_names = config.allow_population_by_field_name for field_name, field in fields.items(): param_name = field.alias if field_name in merged_params or param_name in merged_params: continue elif not is_valid_identifier(param_name): if allow_names and is_valid_identifier(field_name): param_name = field_name else: use_var_kw = True continue # TODO: replace annotation with actual expected types once #1055 solved kwargs = {'default': field.default} if not field.required else {} merged_params[param_name] = Parameter( param_name, Parameter.KEYWORD_ONLY, annotation=field.annotation, **kwargs ) if config.extra is Extra.allow: use_var_kw = True if var_kw and use_var_kw: # Make sure the parameter for extra kwargs # does not have the same name as a field default_model_signature = [ ('__pydantic_self__', Parameter.POSITIONAL_OR_KEYWORD), ('data', Parameter.VAR_KEYWORD), ] if [(p.name, p.kind) for p in present_params] == default_model_signature: # if this is the standard model signature, use extra_data as the extra args name var_kw_name = 'extra_data' else: # else start from var_kw var_kw_name = var_kw.name # generate a name that's definitely unique while var_kw_name in fields: var_kw_name += '_' merged_params[var_kw_name] = var_kw.replace(name=var_kw_name) return Signature(parameters=list(merged_params.values()), return_annotation=None) def get_model(obj: Union[Type['BaseModel'], Type['Dataclass']]) -> Type['BaseModel']: from pydantic.v1.main import BaseModel try: model_cls = obj.__pydantic_model__ # type: ignore except AttributeError: model_cls = obj if not 
issubclass(model_cls, BaseModel): raise TypeError('Unsupported type, must be either BaseModel or dataclass') return model_cls def to_camel(string: str) -> str: return ''.join(word.capitalize() for word in string.split('_')) def to_lower_camel(string: str) -> str: if len(string) >= 1: pascal_string = to_camel(string) return pascal_string[0].lower() + pascal_string[1:] return string.lower() T = TypeVar('T') def unique_list( input_list: Union[List[T], Tuple[T, ...]], *, name_factory: Callable[[T], str] = str, ) -> List[T]: """ Make a list unique while maintaining order. We update the list if another one with the same name is set (e.g. root validator overridden in subclass) """ result: List[T] = [] result_names: List[str] = [] for v in input_list: v_name = name_factory(v) if v_name not in result_names: result_names.append(v_name) result.append(v) else: result[result_names.index(v_name)] = v return result class PyObjectStr(str): """ String class where repr doesn't include quotes. Useful with Representation when you want to return a string representation of something that valid (or pseudo-valid) python. """ def __repr__(self) -> str: return str(self) class Representation: """ Mixin to provide __str__, __repr__, and __pretty__ methods. See #884 for more details. __pretty__ is used by [devtools](https://python-devtools.helpmanual.io/) to provide human readable representations of objects. """ __slots__: Tuple[str, ...] = tuple() def __repr_args__(self) -> 'ReprArgs': """ Returns the attributes to show in __str__, __repr__, and __pretty__ this is generally overridden. Can either return: * name - value pairs, e.g.: `[('foo_name', 'foo'), ('bar_name', ['b', 'a', 'r'])]` * or, just values, e.g.: `[(None, 'foo'), (None, ['b', 'a', 'r'])]` """ attrs = ((s, getattr(self, s)) for s in self.__slots__) return [(a, v) for a, v in attrs if v is not None] def __repr_name__(self) -> str: """ Name of the instance's class, used in __repr__. 
""" return self.__class__.__name__ def __repr_str__(self, join_str: str) -> str: return join_str.join(repr(v) if a is None else f'{a}={v!r}' for a, v in self.__repr_args__()) def __pretty__(self, fmt: Callable[[Any], Any], **kwargs: Any) -> Generator[Any, None, None]: """ Used by devtools (https://python-devtools.helpmanual.io/) to provide a human readable representations of objects """ yield self.__repr_name__() + '(' yield 1 for name, value in self.__repr_args__(): if name is not None: yield name + '=' yield fmt(value) yield ',' yield 0 yield -1 yield ')' def __str__(self) -> str: return self.__repr_str__(' ') def __repr__(self) -> str: return f'{self.__repr_name__()}({self.__repr_str__(", ")})' def __rich_repr__(self) -> 'RichReprResult': """Get fields for Rich library""" for name, field_repr in self.__repr_args__(): if name is None: yield field_repr else: yield name, field_repr class GetterDict(Representation): """ Hack to make object's smell just enough like dicts for validate_model. We can't inherit from Mapping[str, Any] because it upsets cython so we have to implement all methods ourselves. """ __slots__ = ('_obj',) def __init__(self, obj: Any): self._obj = obj def __getitem__(self, key: str) -> Any: try: return getattr(self._obj, key) except AttributeError as e: raise KeyError(key) from e def get(self, key: Any, default: Any = None) -> Any: return getattr(self._obj, key, default) def extra_keys(self) -> Set[Any]: """ We don't want to get any other attributes of obj if the model didn't explicitly ask for them """ return set() def keys(self) -> List[Any]: """ Keys of the pseudo dictionary, uses a list not set so order information can be maintained like python dictionaries. 
""" return list(self) def values(self) -> List[Any]: return [self[k] for k in self] def items(self) -> Iterator[Tuple[str, Any]]: for k in self: yield k, self.get(k) def __iter__(self) -> Iterator[str]: for name in dir(self._obj): if not name.startswith('_'): yield name def __len__(self) -> int: return sum(1 for _ in self) def __contains__(self, item: Any) -> bool: return item in self.keys() def __eq__(self, other: Any) -> bool: return dict(self) == dict(other.items()) def __repr_args__(self) -> 'ReprArgs': return [(None, dict(self))] def __repr_name__(self) -> str: return f'GetterDict[{display_as_type(self._obj)}]' class ValueItems(Representation): """ Class for more convenient calculation of excluded or included fields on values. """ __slots__ = ('_items', '_type') def __init__(self, value: Any, items: Union['AbstractSetIntStr', 'MappingIntStrAny']) -> None: items = self._coerce_items(items) if isinstance(value, (list, tuple)): items = self._normalize_indexes(items, len(value)) self._items: 'MappingIntStrAny' = items def is_excluded(self, item: Any) -> bool: """ Check if item is fully excluded. 
:param item: key or index of a value """ return self.is_true(self._items.get(item)) def is_included(self, item: Any) -> bool: """ Check if value is contained in self._items :param item: key or index of value """ return item in self._items def for_element(self, e: 'IntStr') -> Optional[Union['AbstractSetIntStr', 'MappingIntStrAny']]: """ :param e: key or index of element on value :return: raw values for element if self._items is dict and contain needed element """ item = self._items.get(e) return item if not self.is_true(item) else None def _normalize_indexes(self, items: 'MappingIntStrAny', v_length: int) -> 'DictIntStrAny': """ :param items: dict or set of indexes which will be normalized :param v_length: length of sequence indexes of which will be >>> self._normalize_indexes({0: True, -2: True, -1: True}, 4) {0: True, 2: True, 3: True} >>> self._normalize_indexes({'__all__': True}, 4) {0: True, 1: True, 2: True, 3: True} """ normalized_items: 'DictIntStrAny' = {} all_items = None for i, v in items.items(): if not (isinstance(v, Mapping) or isinstance(v, AbstractSet) or self.is_true(v)): raise TypeError(f'Unexpected type of exclude value for index "{i}" {v.__class__}') if i == '__all__': all_items = self._coerce_value(v) continue if not isinstance(i, int): raise TypeError( 'Excluding fields from a sequence of sub-models or dicts must be performed index-wise: ' 'expected integer keys or keyword "__all__"' ) normalized_i = v_length + i if i < 0 else i normalized_items[normalized_i] = self.merge(v, normalized_items.get(normalized_i)) if not all_items: return normalized_items if self.is_true(all_items): for i in range(v_length): normalized_items.setdefault(i, ...) 
return normalized_items for i in range(v_length): normalized_item = normalized_items.setdefault(i, {}) if not self.is_true(normalized_item): normalized_items[i] = self.merge(all_items, normalized_item) return normalized_items @classmethod def merge(cls, base: Any, override: Any, intersect: bool = False) -> Any: """ Merge a ``base`` item with an ``override`` item. Both ``base`` and ``override`` are converted to dictionaries if possible. Sets are converted to dictionaries with the sets entries as keys and Ellipsis as values. Each key-value pair existing in ``base`` is merged with ``override``, while the rest of the key-value pairs are updated recursively with this function. Merging takes place based on the "union" of keys if ``intersect`` is set to ``False`` (default) and on the intersection of keys if ``intersect`` is set to ``True``. """ override = cls._coerce_value(override) base = cls._coerce_value(base) if override is None: return base if cls.is_true(base) or base is None: return override if cls.is_true(override): return base if intersect else override # intersection or union of keys while preserving ordering: if intersect: merge_keys = [k for k in base if k in override] + [k for k in override if k in base] else: merge_keys = list(base) + [k for k in override if k not in base] merged: 'DictIntStrAny' = {} for k in merge_keys: merged_item = cls.merge(base.get(k), override.get(k), intersect=intersect) if merged_item is not None: merged[k] = merged_item return merged @staticmethod def _coerce_items(items: Union['AbstractSetIntStr', 'MappingIntStrAny']) -> 'MappingIntStrAny': if isinstance(items, Mapping): pass elif isinstance(items, AbstractSet): items = dict.fromkeys(items, ...) 
else: class_name = getattr(items, '__class__', '???') assert_never( items, f'Unexpected type of exclude value {class_name}', ) return items @classmethod def _coerce_value(cls, value: Any) -> Any: if value is None or cls.is_true(value): return value return cls._coerce_items(value) @staticmethod def is_true(v: Any) -> bool: return v is True or v is ... def __repr_args__(self) -> 'ReprArgs': return [(None, self._items)] class ClassAttribute: """ Hide class attribute from its instances """ __slots__ = ( 'name', 'value', ) def __init__(self, name: str, value: Any) -> None: self.name = name self.value = value def __get__(self, instance: Any, owner: Type[Any]) -> None: if instance is None: return self.value raise AttributeError(f'{self.name!r} attribute of {owner.__name__!r} is class-only') path_types = { 'is_dir': 'directory', 'is_file': 'file', 'is_mount': 'mount point', 'is_symlink': 'symlink', 'is_block_device': 'block device', 'is_char_device': 'char device', 'is_fifo': 'FIFO', 'is_socket': 'socket', } def path_type(p: 'Path') -> str: """ Find out what sort of thing a path is. """ assert p.exists(), 'path does not exist' for method, name in path_types.items(): if getattr(p, method)(): return name return 'unknown' Obj = TypeVar('Obj') def smart_deepcopy(obj: Obj) -> Obj: """ Return type as is for immutable built-in types Use obj.copy() for built-in empty collections Use copy.deepcopy() for non-empty collections and unknown objects """ obj_type = obj.__class__ if obj_type in IMMUTABLE_NON_COLLECTIONS_TYPES: return obj # fastest case: obj is immutable and not collection therefore will not be copied anyway try: if not obj and obj_type in BUILTIN_COLLECTIONS: # faster way for empty collections, no need to copy its members return obj if obj_type is tuple else obj.copy() # type: ignore # tuple doesn't have copy method except (TypeError, ValueError, RuntimeError): # do we really dare to catch ALL errors? 
Seems a bit risky pass return deepcopy(obj) # slowest way when we actually might need a deepcopy def is_valid_field(name: str) -> bool: if not name.startswith('_'): return True return ROOT_KEY == name DUNDER_ATTRIBUTES = { '__annotations__', '__classcell__', '__doc__', '__module__', '__orig_bases__', '__orig_class__', '__qualname__', } def is_valid_private_name(name: str) -> bool: return not is_valid_field(name) and name not in DUNDER_ATTRIBUTES _EMPTY = object() def all_identical(left: Iterable[Any], right: Iterable[Any]) -> bool: """ Check that the items of `left` are the same objects as those in `right`. >>> a, b = object(), object() >>> all_identical([a, b, a], [a, b, a]) True >>> all_identical([a, b, [a]], [a, b, [a]]) # new list object, while "equal" is not "identical" False """ for left_item, right_item in zip_longest(left, right, fillvalue=_EMPTY): if left_item is not right_item: return False return True def assert_never(obj: NoReturn, msg: str) -> NoReturn: """ Helper to make sure that we have covered all possible types. This is mostly useful for ``mypy``, docs: https://mypy.readthedocs.io/en/latest/literal_types.html#exhaustive-checks """ raise TypeError(msg) def get_unique_discriminator_alias(all_aliases: Collection[str], discriminator_key: str) -> str: """Validate that all aliases are the same and if that's the case return the alias""" unique_aliases = set(all_aliases) if len(unique_aliases) > 1: raise ConfigError( f'Aliases for discriminator {discriminator_key!r} must be the same (got {", ".join(sorted(all_aliases))})' ) return unique_aliases.pop() def get_discriminator_alias_and_values(tp: Any, discriminator_key: str) -> Tuple[str, Tuple[str, ...]]: """ Get alias and all valid values in the `Literal` type of the discriminator field `tp` can be a `BaseModel` class or directly an `Annotated` `Union` of many. 
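The alias-consistency rule enforced by `get_unique_discriminator_alias` above can be sketched standalone (`unique_alias_demo` is a made-up name; the real function raises pydantic's `ConfigError` rather than `ValueError`):

```python
from typing import Collection

def unique_alias_demo(all_aliases: Collection[str], discriminator_key: str) -> str:
    """All submodels of a discriminated union must agree on the field alias."""
    unique_aliases = set(all_aliases)
    if len(unique_aliases) > 1:
        raise ValueError(
            f'Aliases for discriminator {discriminator_key!r} must be the same '
            f'(got {", ".join(sorted(all_aliases))})'
        )
    return unique_aliases.pop()
```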
""" is_root_model = getattr(tp, '__custom_root_type__', False) if get_origin(tp) is Annotated: tp = get_args(tp)[0] if hasattr(tp, '__pydantic_model__'): tp = tp.__pydantic_model__ if is_union(get_origin(tp)): alias, all_values = _get_union_alias_and_all_values(tp, discriminator_key) return alias, tuple(v for values in all_values for v in values) elif is_root_model: union_type = tp.__fields__[ROOT_KEY].type_ alias, all_values = _get_union_alias_and_all_values(union_type, discriminator_key) if len(set(all_values)) > 1: raise ConfigError( f'Field {discriminator_key!r} is not the same for all submodels of {display_as_type(tp)!r}' ) return alias, all_values[0] else: try: t_discriminator_type = tp.__fields__[discriminator_key].type_ except AttributeError as e: raise TypeError(f'Type {tp.__name__!r} is not a valid `BaseModel` or `dataclass`') from e except KeyError as e: raise ConfigError(f'Model {tp.__name__!r} needs a discriminator field for key {discriminator_key!r}') from e if not is_literal_type(t_discriminator_type): raise ConfigError(f'Field {discriminator_key!r} of model {tp.__name__!r} needs to be a `Literal`') return tp.__fields__[discriminator_key].alias, all_literal_values(t_discriminator_type) def _get_union_alias_and_all_values( union_type: Type[Any], discriminator_key: str ) -> Tuple[str, Tuple[Tuple[str, ...], ...]]: zipped_aliases_values = [get_discriminator_alias_and_values(t, discriminator_key) for t in get_args(union_type)] # unzip: [('alias_a',('v1', 'v2)), ('alias_b', ('v3',))] => [('alias_a', 'alias_b'), (('v1', 'v2'), ('v3',))] all_aliases, all_values = zip(*zipped_aliases_values) return get_unique_discriminator_alias(all_aliases, discriminator_key), all_values pydantic-2.10.6/pydantic/v1/validators.py000066400000000000000000000532531474456633400203260ustar00rootroot00000000000000import math import re from collections import OrderedDict, deque from collections.abc import Hashable as CollectionsHashable from datetime import date, datetime, time, 
timedelta from decimal import Decimal, DecimalException from enum import Enum, IntEnum from ipaddress import IPv4Address, IPv4Interface, IPv4Network, IPv6Address, IPv6Interface, IPv6Network from pathlib import Path from typing import ( TYPE_CHECKING, Any, Callable, Deque, Dict, ForwardRef, FrozenSet, Generator, Hashable, List, NamedTuple, Pattern, Set, Tuple, Type, TypeVar, Union, ) from uuid import UUID from warnings import warn from pydantic.v1 import errors from pydantic.v1.datetime_parse import parse_date, parse_datetime, parse_duration, parse_time from pydantic.v1.typing import ( AnyCallable, all_literal_values, display_as_type, get_class, is_callable_type, is_literal_type, is_namedtuple, is_none_type, is_typeddict, ) from pydantic.v1.utils import almost_equal_floats, lenient_issubclass, sequence_like if TYPE_CHECKING: from typing_extensions import Literal, TypedDict from pydantic.v1.config import BaseConfig from pydantic.v1.fields import ModelField from pydantic.v1.types import ConstrainedDecimal, ConstrainedFloat, ConstrainedInt ConstrainedNumber = Union[ConstrainedDecimal, ConstrainedFloat, ConstrainedInt] AnyOrderedDict = OrderedDict[Any, Any] Number = Union[int, float, Decimal] StrBytes = Union[str, bytes] def str_validator(v: Any) -> Union[str]: if isinstance(v, str): if isinstance(v, Enum): return v.value else: return v elif isinstance(v, (float, int, Decimal)): # is there anything else we want to add here? If you think so, create an issue. 
return str(v) elif isinstance(v, (bytes, bytearray)): return v.decode() else: raise errors.StrError() def strict_str_validator(v: Any) -> Union[str]: if isinstance(v, str) and not isinstance(v, Enum): return v raise errors.StrError() def bytes_validator(v: Any) -> Union[bytes]: if isinstance(v, bytes): return v elif isinstance(v, bytearray): return bytes(v) elif isinstance(v, str): return v.encode() elif isinstance(v, (float, int, Decimal)): return str(v).encode() else: raise errors.BytesError() def strict_bytes_validator(v: Any) -> Union[bytes]: if isinstance(v, bytes): return v elif isinstance(v, bytearray): return bytes(v) else: raise errors.BytesError() BOOL_FALSE = {0, '0', 'off', 'f', 'false', 'n', 'no'} BOOL_TRUE = {1, '1', 'on', 't', 'true', 'y', 'yes'} def bool_validator(v: Any) -> bool: if v is True or v is False: return v if isinstance(v, bytes): v = v.decode() if isinstance(v, str): v = v.lower() try: if v in BOOL_TRUE: return True if v in BOOL_FALSE: return False except TypeError: raise errors.BoolError() raise errors.BoolError() # matches the default limit cpython, see https://github.com/python/cpython/pull/96500 max_str_int = 4_300 def int_validator(v: Any) -> int: if isinstance(v, int) and not (v is True or v is False): return v # see https://github.com/pydantic/pydantic/issues/1477 and in turn, https://github.com/python/cpython/issues/95778 # this check should be unnecessary once patch releases are out for 3.7, 3.8, 3.9 and 3.10 # but better to check here until then. # NOTICE: this does not fully protect user from the DOS risk since the standard library JSON implementation # (and other std lib modules like xml) use `int()` and are likely called before this, the best workaround is to # 1. update to the latest patch release of python once released, 2. 
use a different JSON library like ujson if isinstance(v, (str, bytes, bytearray)) and len(v) > max_str_int: raise errors.IntegerError() try: return int(v) except (TypeError, ValueError, OverflowError): raise errors.IntegerError() def strict_int_validator(v: Any) -> int: if isinstance(v, int) and not (v is True or v is False): return v raise errors.IntegerError() def float_validator(v: Any) -> float: if isinstance(v, float): return v try: return float(v) except (TypeError, ValueError): raise errors.FloatError() def strict_float_validator(v: Any) -> float: if isinstance(v, float): return v raise errors.FloatError() def float_finite_validator(v: 'Number', field: 'ModelField', config: 'BaseConfig') -> 'Number': allow_inf_nan = getattr(field.type_, 'allow_inf_nan', None) if allow_inf_nan is None: allow_inf_nan = config.allow_inf_nan if allow_inf_nan is False and (math.isnan(v) or math.isinf(v)): raise errors.NumberNotFiniteError() return v def number_multiple_validator(v: 'Number', field: 'ModelField') -> 'Number': field_type: ConstrainedNumber = field.type_ if field_type.multiple_of is not None: mod = float(v) / float(field_type.multiple_of) % 1 if not almost_equal_floats(mod, 0.0) and not almost_equal_floats(mod, 1.0): raise errors.NumberNotMultipleError(multiple_of=field_type.multiple_of) return v def number_size_validator(v: 'Number', field: 'ModelField') -> 'Number': field_type: ConstrainedNumber = field.type_ if field_type.gt is not None and not v > field_type.gt: raise errors.NumberNotGtError(limit_value=field_type.gt) elif field_type.ge is not None and not v >= field_type.ge: raise errors.NumberNotGeError(limit_value=field_type.ge) if field_type.lt is not None and not v < field_type.lt: raise errors.NumberNotLtError(limit_value=field_type.lt) if field_type.le is not None and not v <= field_type.le: raise errors.NumberNotLeError(limit_value=field_type.le) return v def constant_validator(v: 'Any', field: 'ModelField') -> 'Any': """Validate ``const`` fields. 
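The `multiple_of` check above guards against float rounding with `almost_equal_floats`; a standalone sketch of that logic (made-up names, stdlib only):

```python
def almost_equal_floats_demo(a: float, b: float, *, delta: float = 1e-8) -> bool:
    return abs(a - b) <= delta

def is_multiple_demo(v: float, multiple_of: float) -> bool:
    # v is a multiple of multiple_of if v / multiple_of is (almost) an integer;
    # the fractional part may land near 0.0 or near 1.0 due to rounding
    mod = float(v) / float(multiple_of) % 1
    return almost_equal_floats_demo(mod, 0.0) or almost_equal_floats_demo(mod, 1.0)
```

Note that `0.3 / 0.1` is `2.9999999999999996` in IEEE-754 doubles, so a naive `v % multiple_of == 0` check would wrongly reject `0.3` as a multiple of `0.1`.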
The value provided for a ``const`` field must be equal to the default value of the field. This is to support the keyword of the same name in JSON Schema. """ if v != field.default: raise errors.WrongConstantError(given=v, permitted=[field.default]) return v def anystr_length_validator(v: 'StrBytes', config: 'BaseConfig') -> 'StrBytes': v_len = len(v) min_length = config.min_anystr_length if v_len < min_length: raise errors.AnyStrMinLengthError(limit_value=min_length) max_length = config.max_anystr_length if max_length is not None and v_len > max_length: raise errors.AnyStrMaxLengthError(limit_value=max_length) return v def anystr_strip_whitespace(v: 'StrBytes') -> 'StrBytes': return v.strip() def anystr_upper(v: 'StrBytes') -> 'StrBytes': return v.upper() def anystr_lower(v: 'StrBytes') -> 'StrBytes': return v.lower() def ordered_dict_validator(v: Any) -> 'AnyOrderedDict': if isinstance(v, OrderedDict): return v try: return OrderedDict(v) except (TypeError, ValueError): raise errors.DictError() def dict_validator(v: Any) -> Dict[Any, Any]: if isinstance(v, dict): return v try: return dict(v) except (TypeError, ValueError): raise errors.DictError() def list_validator(v: Any) -> List[Any]: if isinstance(v, list): return v elif sequence_like(v): return list(v) else: raise errors.ListError() def tuple_validator(v: Any) -> Tuple[Any, ...]: if isinstance(v, tuple): return v elif sequence_like(v): return tuple(v) else: raise errors.TupleError() def set_validator(v: Any) -> Set[Any]: if isinstance(v, set): return v elif sequence_like(v): return set(v) else: raise errors.SetError() def frozenset_validator(v: Any) -> FrozenSet[Any]: if isinstance(v, frozenset): return v elif sequence_like(v): return frozenset(v) else: raise errors.FrozenSetError() def deque_validator(v: Any) -> Deque[Any]: if isinstance(v, deque): return v elif sequence_like(v): return deque(v) else: raise errors.DequeError() def enum_member_validator(v: Any, field: 'ModelField', config: 'BaseConfig') -> 
Enum: try: enum_v = field.type_(v) except ValueError: # field.type_ should be an enum, so will be iterable raise errors.EnumMemberError(enum_values=list(field.type_)) return enum_v.value if config.use_enum_values else enum_v def uuid_validator(v: Any, field: 'ModelField') -> UUID: try: if isinstance(v, str): v = UUID(v) elif isinstance(v, (bytes, bytearray)): try: v = UUID(v.decode()) except ValueError: # 16 bytes in big-endian order as the bytes argument fail # the above check v = UUID(bytes=v) except ValueError: raise errors.UUIDError() if not isinstance(v, UUID): raise errors.UUIDError() required_version = getattr(field.type_, '_required_version', None) if required_version and v.version != required_version: raise errors.UUIDVersionError(required_version=required_version) return v def decimal_validator(v: Any) -> Decimal: if isinstance(v, Decimal): return v elif isinstance(v, (bytes, bytearray)): v = v.decode() v = str(v).strip() try: v = Decimal(v) except DecimalException: raise errors.DecimalError() if not v.is_finite(): raise errors.DecimalIsNotFiniteError() return v def hashable_validator(v: Any) -> Hashable: if isinstance(v, Hashable): return v raise errors.HashableError() def ip_v4_address_validator(v: Any) -> IPv4Address: if isinstance(v, IPv4Address): return v try: return IPv4Address(v) except ValueError: raise errors.IPv4AddressError() def ip_v6_address_validator(v: Any) -> IPv6Address: if isinstance(v, IPv6Address): return v try: return IPv6Address(v) except ValueError: raise errors.IPv6AddressError() def ip_v4_network_validator(v: Any) -> IPv4Network: """ Assume IPv4Network initialised with a default ``strict`` argument See more: https://docs.python.org/library/ipaddress.html#ipaddress.IPv4Network """ if isinstance(v, IPv4Network): return v try: return IPv4Network(v) except ValueError: raise errors.IPv4NetworkError() def ip_v6_network_validator(v: Any) -> IPv6Network: """ Assume IPv6Network initialised with a default ``strict`` argument See more: 
https://docs.python.org/library/ipaddress.html#ipaddress.IPv6Network """ if isinstance(v, IPv6Network): return v try: return IPv6Network(v) except ValueError: raise errors.IPv6NetworkError() def ip_v4_interface_validator(v: Any) -> IPv4Interface: if isinstance(v, IPv4Interface): return v try: return IPv4Interface(v) except ValueError: raise errors.IPv4InterfaceError() def ip_v6_interface_validator(v: Any) -> IPv6Interface: if isinstance(v, IPv6Interface): return v try: return IPv6Interface(v) except ValueError: raise errors.IPv6InterfaceError() def path_validator(v: Any) -> Path: if isinstance(v, Path): return v try: return Path(v) except TypeError: raise errors.PathError() def path_exists_validator(v: Any) -> Path: if not v.exists(): raise errors.PathNotExistsError(path=v) return v def callable_validator(v: Any) -> AnyCallable: """ Perform a simple check if the value is callable. Note: complete matching of argument type hints and return types is not performed """ if callable(v): return v raise errors.CallableError(value=v) def enum_validator(v: Any) -> Enum: if isinstance(v, Enum): return v raise errors.EnumError(value=v) def int_enum_validator(v: Any) -> IntEnum: if isinstance(v, IntEnum): return v raise errors.IntEnumError(value=v) def make_literal_validator(type_: Any) -> Callable[[Any], Any]: permitted_choices = all_literal_values(type_) # To have a O(1) complexity and still return one of the values set inside the `Literal`, # we create a dict with the set values (a set causes some problems with the way intersection works). 
# In some cases the set value and checked value can indeed be different (see `test_literal_validator_str_enum`) allowed_choices = {v: v for v in permitted_choices} def literal_validator(v: Any) -> Any: try: return allowed_choices[v] except (KeyError, TypeError): raise errors.WrongConstantError(given=v, permitted=permitted_choices) return literal_validator def constr_length_validator(v: 'StrBytes', field: 'ModelField', config: 'BaseConfig') -> 'StrBytes': v_len = len(v) min_length = field.type_.min_length if field.type_.min_length is not None else config.min_anystr_length if v_len < min_length: raise errors.AnyStrMinLengthError(limit_value=min_length) max_length = field.type_.max_length if field.type_.max_length is not None else config.max_anystr_length if max_length is not None and v_len > max_length: raise errors.AnyStrMaxLengthError(limit_value=max_length) return v def constr_strip_whitespace(v: 'StrBytes', field: 'ModelField', config: 'BaseConfig') -> 'StrBytes': strip_whitespace = field.type_.strip_whitespace or config.anystr_strip_whitespace if strip_whitespace: v = v.strip() return v def constr_upper(v: 'StrBytes', field: 'ModelField', config: 'BaseConfig') -> 'StrBytes': upper = field.type_.to_upper or config.anystr_upper if upper: v = v.upper() return v def constr_lower(v: 'StrBytes', field: 'ModelField', config: 'BaseConfig') -> 'StrBytes': lower = field.type_.to_lower or config.anystr_lower if lower: v = v.lower() return v def validate_json(v: Any, config: 'BaseConfig') -> Any: if v is None: # pass None through to other validators return v try: return config.json_loads(v) # type: ignore except ValueError: raise errors.JsonError() except TypeError: raise errors.JsonTypeError() T = TypeVar('T') def make_arbitrary_type_validator(type_: Type[T]) -> Callable[[T], T]: def arbitrary_type_validator(v: Any) -> T: if isinstance(v, type_): return v raise errors.ArbitraryTypeError(expected_arbitrary_type=type_) return arbitrary_type_validator def 
make_class_validator(type_: Type[T]) -> Callable[[Any], Type[T]]: def class_validator(v: Any) -> Type[T]: if lenient_issubclass(v, type_): return v raise errors.SubclassError(expected_class=type_) return class_validator def any_class_validator(v: Any) -> Type[T]: if isinstance(v, type): return v raise errors.ClassError() def none_validator(v: Any) -> 'Literal[None]': if v is None: return v raise errors.NotNoneError() def pattern_validator(v: Any) -> Pattern[str]: if isinstance(v, Pattern): return v str_value = str_validator(v) try: return re.compile(str_value) except re.error: raise errors.PatternError() NamedTupleT = TypeVar('NamedTupleT', bound=NamedTuple) def make_namedtuple_validator( namedtuple_cls: Type[NamedTupleT], config: Type['BaseConfig'] ) -> Callable[[Tuple[Any, ...]], NamedTupleT]: from pydantic.v1.annotated_types import create_model_from_namedtuple NamedTupleModel = create_model_from_namedtuple( namedtuple_cls, __config__=config, __module__=namedtuple_cls.__module__, ) namedtuple_cls.__pydantic_model__ = NamedTupleModel # type: ignore[attr-defined] def namedtuple_validator(values: Tuple[Any, ...]) -> NamedTupleT: annotations = NamedTupleModel.__annotations__ if len(values) > len(annotations): raise errors.ListMaxLengthError(limit_value=len(annotations)) dict_values: Dict[str, Any] = dict(zip(annotations, values)) validated_dict_values: Dict[str, Any] = dict(NamedTupleModel(**dict_values)) return namedtuple_cls(**validated_dict_values) return namedtuple_validator def make_typeddict_validator( typeddict_cls: Type['TypedDict'], config: Type['BaseConfig'] # type: ignore[valid-type] ) -> Callable[[Any], Dict[str, Any]]: from pydantic.v1.annotated_types import create_model_from_typeddict TypedDictModel = create_model_from_typeddict( typeddict_cls, __config__=config, __module__=typeddict_cls.__module__, ) typeddict_cls.__pydantic_model__ = TypedDictModel # type: ignore[attr-defined] def typeddict_validator(values: 'TypedDict') -> Dict[str, Any]: # type: 
ignore[valid-type] return TypedDictModel.parse_obj(values).dict(exclude_unset=True) return typeddict_validator class IfConfig: def __init__(self, validator: AnyCallable, *config_attr_names: str, ignored_value: Any = False) -> None: self.validator = validator self.config_attr_names = config_attr_names self.ignored_value = ignored_value def check(self, config: Type['BaseConfig']) -> bool: return any(getattr(config, name) not in {None, self.ignored_value} for name in self.config_attr_names) # order is important here, for example: bool is a subclass of int so has to come first, datetime before date same, # IPv4Interface before IPv4Address, etc _VALIDATORS: List[Tuple[Type[Any], List[Any]]] = [ (IntEnum, [int_validator, enum_member_validator]), (Enum, [enum_member_validator]), ( str, [ str_validator, IfConfig(anystr_strip_whitespace, 'anystr_strip_whitespace'), IfConfig(anystr_upper, 'anystr_upper'), IfConfig(anystr_lower, 'anystr_lower'), IfConfig(anystr_length_validator, 'min_anystr_length', 'max_anystr_length'), ], ), ( bytes, [ bytes_validator, IfConfig(anystr_strip_whitespace, 'anystr_strip_whitespace'), IfConfig(anystr_upper, 'anystr_upper'), IfConfig(anystr_lower, 'anystr_lower'), IfConfig(anystr_length_validator, 'min_anystr_length', 'max_anystr_length'), ], ), (bool, [bool_validator]), (int, [int_validator]), (float, [float_validator, IfConfig(float_finite_validator, 'allow_inf_nan', ignored_value=True)]), (Path, [path_validator]), (datetime, [parse_datetime]), (date, [parse_date]), (time, [parse_time]), (timedelta, [parse_duration]), (OrderedDict, [ordered_dict_validator]), (dict, [dict_validator]), (list, [list_validator]), (tuple, [tuple_validator]), (set, [set_validator]), (frozenset, [frozenset_validator]), (deque, [deque_validator]), (UUID, [uuid_validator]), (Decimal, [decimal_validator]), (IPv4Interface, [ip_v4_interface_validator]), (IPv6Interface, [ip_v6_interface_validator]), (IPv4Address, [ip_v4_address_validator]), (IPv6Address, 
[ip_v6_address_validator]), (IPv4Network, [ip_v4_network_validator]), (IPv6Network, [ip_v6_network_validator]), ] def find_validators( # noqa: C901 (ignore complexity) type_: Type[Any], config: Type['BaseConfig'] ) -> Generator[AnyCallable, None, None]: from pydantic.v1.dataclasses import is_builtin_dataclass, make_dataclass_validator if type_ is Any or type_ is object: return type_type = type_.__class__ if type_type == ForwardRef or type_type == TypeVar: return if is_none_type(type_): yield none_validator return if type_ is Pattern or type_ is re.Pattern: yield pattern_validator return if type_ is Hashable or type_ is CollectionsHashable: yield hashable_validator return if is_callable_type(type_): yield callable_validator return if is_literal_type(type_): yield make_literal_validator(type_) return if is_builtin_dataclass(type_): yield from make_dataclass_validator(type_, config) return if type_ is Enum: yield enum_validator return if type_ is IntEnum: yield int_enum_validator return if is_namedtuple(type_): yield tuple_validator yield make_namedtuple_validator(type_, config) return if is_typeddict(type_): yield make_typeddict_validator(type_, config) return class_ = get_class(type_) if class_ is not None: if class_ is not Any and isinstance(class_, type): yield make_class_validator(class_) else: yield any_class_validator return for val_type, validators in _VALIDATORS: try: if issubclass(type_, val_type): for v in validators: if isinstance(v, IfConfig): if v.check(config): yield v.validator else: yield v return except TypeError: raise RuntimeError(f'error checking inheritance of {type_!r} (type: {display_as_type(type_)})') if config.arbitrary_types_allowed: yield make_arbitrary_type_validator(type_) else: if hasattr(type_, '__pydantic_core_schema__'): warn(f'Mixing V1 and V2 models is not supported. 
`{type_.__name__}` is a V2 model.', UserWarning) raise RuntimeError(f'no validator found for {type_}, see `arbitrary_types_allowed` in Config') pydantic-2.10.6/pydantic/v1/version.py000066400000000000000000000020171474456633400176330ustar00rootroot00000000000000__all__ = 'compiled', 'VERSION', 'version_info' VERSION = '1.10.19' try: import cython # type: ignore except ImportError: compiled: bool = False else: # pragma: no cover try: compiled = cython.compiled except AttributeError: compiled = False def version_info() -> str: import platform import sys from importlib import import_module from pathlib import Path optional_deps = [] for p in ('devtools', 'dotenv', 'email-validator', 'typing-extensions'): try: import_module(p.replace('-', '_')) except ImportError: continue optional_deps.append(p) info = { 'pydantic version': VERSION, 'pydantic compiled': compiled, 'install path': Path(__file__).resolve().parent, 'python version': sys.version, 'platform': platform.platform(), 'optional deps. 
installed': optional_deps, } return '\n'.join('{:>30} {}'.format(k + ':', str(v).replace('\n', ' ')) for k, v in info.items())

pydantic-2.10.6/pydantic/validate_call_decorator.py

"""Decorator for validating function calls."""

from __future__ import annotations as _annotations

import inspect
from functools import partial
from types import BuiltinFunctionType
from typing import TYPE_CHECKING, Any, Callable, TypeVar, cast, overload

from ._internal import _generate_schema, _typing_extra, _validate_call
from .errors import PydanticUserError

__all__ = ('validate_call',)

if TYPE_CHECKING:
    from .config import ConfigDict

AnyCallableT = TypeVar('AnyCallableT', bound=Callable[..., Any])

_INVALID_TYPE_ERROR_CODE = 'validate-call-type'


def _check_function_type(function: object) -> None:
    """Check if the input function is a supported type for `validate_call`."""
    if isinstance(function, _generate_schema.VALIDATE_CALL_SUPPORTED_TYPES):
        try:
            inspect.signature(cast(_generate_schema.ValidateCallSupportedTypes, function))
        except ValueError:
            raise PydanticUserError(
                f"Input function `{function}` doesn't have a valid signature", code=_INVALID_TYPE_ERROR_CODE
            )

        if isinstance(function, partial):
            try:
                # check `function.func`, not `partial.func`: the latter is the class attribute,
                # so the assertion would always pass vacuously
                assert not isinstance(function.func, partial), 'Partial of partial'
                _check_function_type(function.func)
            except PydanticUserError as e:
                raise PydanticUserError(
                    f'Partial of `{function.func}` is invalid because the type of `{function.func}` is not supported by `validate_call`',
                    code=_INVALID_TYPE_ERROR_CODE,
                ) from e

        return

    if isinstance(function, BuiltinFunctionType):
        raise PydanticUserError(f'Input built-in function `{function}` is not supported', code=_INVALID_TYPE_ERROR_CODE)
    if isinstance(function, (classmethod, staticmethod, property)):
        name = type(function).__name__
        raise PydanticUserError(
            f'The `@{name}` decorator should be applied after `@validate_call` (put `@{name}` on top)',
            code=_INVALID_TYPE_ERROR_CODE,
) if inspect.isclass(function): raise PydanticUserError( f'Unable to validate {function}: `validate_call` should be applied to functions, not classes (put `@validate_call` on top of `__init__` or `__new__` instead)', code=_INVALID_TYPE_ERROR_CODE, ) if callable(function): raise PydanticUserError( f'Unable to validate {function}: `validate_call` should be applied to functions, not instances or other callables. Use `validate_call` explicitly on `__call__` instead.', code=_INVALID_TYPE_ERROR_CODE, ) raise PydanticUserError( f'Unable to validate {function}: `validate_call` should be applied to one of the following: function, method, partial, or lambda', code=_INVALID_TYPE_ERROR_CODE, ) @overload def validate_call( *, config: ConfigDict | None = None, validate_return: bool = False ) -> Callable[[AnyCallableT], AnyCallableT]: ... @overload def validate_call(func: AnyCallableT, /) -> AnyCallableT: ... def validate_call( func: AnyCallableT | None = None, /, *, config: ConfigDict | None = None, validate_return: bool = False, ) -> AnyCallableT | Callable[[AnyCallableT], AnyCallableT]: """Usage docs: https://docs.pydantic.dev/2.10/concepts/validation_decorator/ Returns a decorated wrapper around the function that validates the arguments and, optionally, the return value. Usage may be either as a plain decorator `@validate_call` or with arguments `@validate_call(...)`. Args: func: The function to be decorated. config: The configuration dictionary. validate_return: Whether to validate the return value. Returns: The decorated function. 
""" parent_namespace = _typing_extra.parent_frame_namespace() def validate(function: AnyCallableT) -> AnyCallableT: _check_function_type(function) validate_call_wrapper = _validate_call.ValidateCallWrapper( cast(_generate_schema.ValidateCallSupportedTypes, function), config, validate_return, parent_namespace ) return _validate_call.update_wrapper_attributes(function, validate_call_wrapper.__call__) # type: ignore if func is not None: return validate(func) else: return validate pydantic-2.10.6/pydantic/validators.py000066400000000000000000000002221474456633400177640ustar00rootroot00000000000000"""The `validators` module is a backport module from V1.""" from ._migration import getattr_migration __getattr__ = getattr_migration(__name__) pydantic-2.10.6/pydantic/version.py000066400000000000000000000047021474456633400173100ustar00rootroot00000000000000"""The `version` module holds the version information for Pydantic.""" from __future__ import annotations as _annotations __all__ = 'VERSION', 'version_info' VERSION = '2.10.6' """The version of Pydantic.""" def version_short() -> str: """Return the `major.minor` part of Pydantic version. It returns '2.1' if Pydantic version is '2.1.1'. 
""" return '.'.join(VERSION.split('.')[:2]) def version_info() -> str: """Return complete version information for Pydantic and its dependencies.""" import importlib.metadata as importlib_metadata import os import platform import sys from pathlib import Path import pydantic_core._pydantic_core as pdc from ._internal import _git as git # get data about packages that are closely related to pydantic, use pydantic or often conflict with pydantic package_names = { 'email-validator', 'fastapi', 'mypy', 'pydantic-extra-types', 'pydantic-settings', 'pyright', 'typing_extensions', } related_packages = [] for dist in importlib_metadata.distributions(): name = dist.metadata['Name'] if name in package_names: related_packages.append(f'{name}-{dist.version}') pydantic_dir = os.path.abspath(os.path.dirname(os.path.dirname(__file__))) most_recent_commit = ( git.git_revision(pydantic_dir) if git.is_git_repo(pydantic_dir) and git.have_git() else 'unknown' ) info = { 'pydantic version': VERSION, 'pydantic-core version': pdc.__version__, 'pydantic-core build': getattr(pdc, 'build_info', None) or pdc.build_profile, 'install path': Path(__file__).resolve().parent, 'python version': sys.version, 'platform': platform.platform(), 'related packages': ' '.join(related_packages), 'commit': most_recent_commit, } return '\n'.join('{:>30} {}'.format(k + ':', str(v).replace('\n', ' ')) for k, v in info.items()) def parse_mypy_version(version: str) -> tuple[int, int, int]: """Parse `mypy` string version to a 3-tuple of ints. It parses normal version like `1.11.0` and extra info followed by a `+` sign like `1.11.0+dev.d6d9d8cd4f27c52edac1f537e236ec48a01e54cb.dirty`. Args: version: The mypy version string. Returns: A triple of ints, e.g. `(1, 11, 0)`. 
""" return tuple(map(int, version.partition('+')[0].split('.'))) # pyright: ignore[reportReturnType] pydantic-2.10.6/pydantic/warnings.py000066400000000000000000000064261474456633400174600ustar00rootroot00000000000000"""Pydantic-specific warnings.""" from __future__ import annotations as _annotations from .version import version_short __all__ = ( 'PydanticDeprecatedSince20', 'PydanticDeprecationWarning', 'PydanticDeprecatedSince26', 'PydanticExperimentalWarning', ) class PydanticDeprecationWarning(DeprecationWarning): """A Pydantic specific deprecation warning. This warning is raised when using deprecated functionality in Pydantic. It provides information on when the deprecation was introduced and the expected version in which the corresponding functionality will be removed. Attributes: message: Description of the warning. since: Pydantic version in what the deprecation was introduced. expected_removal: Pydantic version in what the corresponding functionality expected to be removed. """ message: str since: tuple[int, int] expected_removal: tuple[int, int] def __init__( self, message: str, *args: object, since: tuple[int, int], expected_removal: tuple[int, int] | None = None ) -> None: super().__init__(message, *args) self.message = message.rstrip('.') self.since = since self.expected_removal = expected_removal if expected_removal is not None else (since[0] + 1, 0) def __str__(self) -> str: message = ( f'{self.message}. Deprecated in Pydantic V{self.since[0]}.{self.since[1]}' f' to be removed in V{self.expected_removal[0]}.{self.expected_removal[1]}.' 
) if self.since == (2, 0): message += f' See Pydantic V2 Migration Guide at https://errors.pydantic.dev/{version_short()}/migration/' return message class PydanticDeprecatedSince20(PydanticDeprecationWarning): """A specific `PydanticDeprecationWarning` subclass defining functionality deprecated since Pydantic 2.0.""" def __init__(self, message: str, *args: object) -> None: super().__init__(message, *args, since=(2, 0), expected_removal=(3, 0)) class PydanticDeprecatedSince26(PydanticDeprecationWarning): """A specific `PydanticDeprecationWarning` subclass defining functionality deprecated since Pydantic 2.6.""" def __init__(self, message: str, *args: object) -> None: super().__init__(message, *args, since=(2, 6), expected_removal=(3, 0)) class PydanticDeprecatedSince29(PydanticDeprecationWarning): """A specific `PydanticDeprecationWarning` subclass defining functionality deprecated since Pydantic 2.9.""" def __init__(self, message: str, *args: object) -> None: super().__init__(message, *args, since=(2, 9), expected_removal=(3, 0)) class PydanticDeprecatedSince210(PydanticDeprecationWarning): """A specific `PydanticDeprecationWarning` subclass defining functionality deprecated since Pydantic 2.10.""" def __init__(self, message: str, *args: object) -> None: super().__init__(message, *args, since=(2, 10), expected_removal=(3, 0)) class GenericBeforeBaseModelWarning(Warning): pass class PydanticExperimentalWarning(Warning): """A Pydantic specific experimental functionality warning. This warning is raised when using experimental functionality in Pydantic. It is raised to warn users that the functionality may change or be removed in future versions of Pydantic. 
""" pydantic-2.10.6/pyproject.toml000066400000000000000000000175441474456633400163620ustar00rootroot00000000000000[build-system] requires = ['hatchling', 'hatch-fancy-pypi-readme>=22.5.0'] build-backend = 'hatchling.build' [project] name = 'pydantic' description = 'Data validation using Python type hints' authors = [ {name = 'Samuel Colvin', email = 's@muelcolvin.com'}, {name = 'Eric Jolibois', email = 'em.jolibois@gmail.com'}, {name = 'Hasan Ramezani', email = 'hasan.r67@gmail.com'}, {name = 'Adrian Garcia Badaracco', email = '1755071+adriangb@users.noreply.github.com'}, {name = 'Terrence Dorsey', email = 'terry@pydantic.dev'}, {name = 'David Montague', email = 'david@pydantic.dev'}, {name = 'Serge Matveenko', email = 'lig@countzero.co'}, {name = 'Marcelo Trylesinski', email = 'marcelotryle@gmail.com'}, {name = 'Sydney Runkle', email = 'sydneymarierunkle@gmail.com'}, {name = 'David Hewitt', email = 'mail@davidhewitt.io'}, {name = 'Alex Hall', email='alex.mojaki@gmail.com'}, {name = 'Victorien Plot', email='contact@vctrn.dev'}, ] license = 'MIT' classifiers = [ 'Development Status :: 5 - Production/Stable', 'Programming Language :: Python', 'Programming Language :: Python :: Implementation :: CPython', 'Programming Language :: Python :: Implementation :: PyPy', 'Programming Language :: Python :: 3', 'Programming Language :: Python :: 3 :: Only', 'Programming Language :: Python :: 3.8', 'Programming Language :: Python :: 3.9', 'Programming Language :: Python :: 3.10', 'Programming Language :: Python :: 3.11', 'Programming Language :: Python :: 3.12', 'Programming Language :: Python :: 3.13', 'Intended Audience :: Developers', 'Intended Audience :: Information Technology', 'License :: OSI Approved :: MIT License', 'Operating System :: OS Independent', 'Framework :: Hypothesis', 'Framework :: Pydantic', 'Topic :: Software Development :: Libraries :: Python Modules', 'Topic :: Internet', ] requires-python = '>=3.8' dependencies = [ 'typing-extensions>=4.12.2', 
'annotated-types>=0.6.0', 'pydantic-core==2.27.2', ] dynamic = ['version', 'readme'] [project.optional-dependencies] email = ['email-validator>=2.0.0'] timezone = [ # See: https://docs.python.org/3/library/zoneinfo.html#data-sources 'tzdata; python_version >= "3.9" and platform_system == "Windows"', ] [project.urls] Homepage = 'https://github.com/pydantic/pydantic' Documentation = 'https://docs.pydantic.dev' Funding = 'https://github.com/sponsors/samuelcolvin' Source = 'https://github.com/pydantic/pydantic' Changelog = 'https://docs.pydantic.dev/latest/changelog/' [dependency-groups] dev = [ 'coverage[toml]', 'pytz', 'dirty-equals', 'eval-type-backport', 'pytest', 'pytest-mock', 'pytest-pretty', 'pytest-examples', 'faker', 'pytest-benchmark', 'pytest-codspeed', 'pytest-memray; platform_python_implementation == "CPython" and platform_system != "Windows"', 'packaging', 'jsonschema', ] docs = [ 'autoflake', 'mkdocs', 'mkdocs-exclude', 'mkdocs-material[imaging]', 'mkdocs-redirects', 'mkdocstrings-python', 'tomli', 'pyupgrade', 'mike', 'pydantic-settings', 'pydantic-extra-types @ git+https://github.com/pydantic/pydantic-extra-types.git@main', 'requests', ] linting = [ 'eval-type-backport', 'ruff', 'pyright', ] testing-extra = [ 'cloudpickle', # used when generate devtools docs example 'ansi2html', 'devtools', # used in docs tests 'sqlalchemy', 'greenlet; python_version < "3.13"', ] typechecking = [ 'mypy', 'pyright', 'pydantic-settings', ] all = [ { include-group = 'dev' }, { include-group = 'docs' }, { include-group = 'linting' }, { include-group = 'testing-extra' }, { include-group = 'typechecking' }, ] [tool.hatch.version] path = 'pydantic/version.py' [tool.hatch.metadata] allow-direct-references = true [tool.hatch.build.targets.sdist] # limit which files are included in the sdist (.tar.gz) asset, # see https://github.com/pydantic/pydantic/pull/4542 include = [ '/README.md', '/HISTORY.md', '/Makefile', '/pydantic', '/tests', '/requirements', ] 
[tool.hatch.metadata.hooks.fancy-pypi-readme] content-type = 'text/markdown' # construct the PyPI readme from README.md and HISTORY.md fragments = [ {path = 'README.md'}, {text = "\n## Changelog\n\n"}, {path = 'HISTORY.md', pattern = '(.+?)'}, {text = "\n... see [here](https://docs.pydantic.dev/changelog/#v0322-2019-08-17) for earlier changes.\n"}, ] # convert GitHub issue/PR numbers and handles to links substitutions = [ {pattern = '(\s+)#(\d+)', replacement = '\1[#\2](https://github.com/pydantic/pydantic/issues/\2)'}, {pattern = '(\s+)@([\w\-]+)', replacement = '\1[@\2](https://github.com/\2)'}, {pattern = '@@', replacement = '@'}, ] [tool.pytest.ini_options] testpaths = 'tests' xfail_strict = true filterwarnings = [ 'error', 'ignore:path is deprecated.*:DeprecationWarning:', ] addopts = [ '--benchmark-columns', 'min,mean,stddev,outliers,rounds,iterations', '--benchmark-group-by', 'group', '--benchmark-warmup', 'on', '--benchmark-disable', # this is enabled by `make benchmark` when you actually want to run benchmarks ] markers = [ 'skip_json_schema_validation: Disable JSON Schema validation.', ] [tool.uv] default-groups = ['dev'] # configuring https://github.com/pydantic/hooky [tool.hooky] reviewers = ['sydney-runkle'] require_change_file = false unconfirmed_label = 'pending' [tool.ruff] line-length = 120 target-version = 'py38' extend-exclude = ['pydantic/v1', 'tests/mypy/'] [tool.ruff.lint] select = [ 'F', # Pyflakes 'E', # pycodestyle (Error) 'I', # isort 'D', # pydocstyle 'UP', # pyupgrade 'YTT', # flake8-2020 'B', # flake8-bugbear 'T10', # flake8-debugger 'T20', # flake8-print 'C4', # flake8-comprehensions 'PYI006', # flake8-pyi ] ignore = ['D105', 'D107', 'D205', 'D415', 'E501', 'B011', 'B028', 'B904'] flake8-quotes = {inline-quotes = 'single', multiline-quotes = 'double'} isort = { known-first-party = ['pydantic', 'tests'] } mccabe = { max-complexity = 14 } pydocstyle = { convention = 'google' } [tool.ruff.lint.per-file-ignores] 'docs/*' = ['D'] 
'pydantic/__init__.py' = ['F405', 'F403', 'D']
'tests/test_forward_ref.py' = ['F821']
'tests/*' = ['D', 'B', 'C4']
'pydantic/deprecated/*' = ['D']
'pydantic/json_schema.py' = ['D']

[tool.ruff.lint.extend-per-file-ignores]
"docs/**/*.py" = ['T']
"tests/**/*.py" = ['T', 'E721', 'F811']
"tests/benchmarks/**/*.py" = ['UP006', 'UP007']

[tool.ruff.format]
quote-style = 'single'

[tool.coverage.run]
source = ['pydantic']
omit = ['pydantic/deprecated/*', 'pydantic/v1/*']
branch = true
relative_files = true
context = '${CONTEXT}'

[tool.coverage.report]
precision = 2
exclude_lines = [
    'pragma: no cover',
    'raise NotImplementedError',
    'if TYPE_CHECKING:',
    'if typing.TYPE_CHECKING:',
    '@overload',
    '@typing.overload',
    '\(Protocol\):$',
    'typing.assert_never',
    'assert_never',
]

[tool.coverage.paths]
source = [
    'pydantic/',
    '/Users/runner/work/pydantic/pydantic/pydantic/',
    'D:\a\pydantic\pydantic\pydantic',
]

[tool.pyright]
include = ['pydantic', 'tests/test_pipeline.py']
exclude = ['pydantic/_hypothesis_plugin.py', 'pydantic/mypy.py', 'pydantic/v1']
# reportUnnecessaryTypeIgnoreComment can't be set since we run pyright with multiple python versions
# reportUnnecessaryTypeIgnoreComment = true
strict = ['tests/test_pipeline.py']
enableExperimentalFeatures = true

[tool.codespell]
skip = '.git,env*,pydantic/v1/*,uv.lock'
# `ser` - abbreviation for "ser"ialisation
# `crate` - a rust crate
ignore-words-list = 'gir,ser,crate'

[tool.codeflash]
module-root = "pydantic"
tests-root = "tests"
test-framework = "pytest"
ignore-paths = []
formatter-cmd = ["ruff check --exit-zero --fix $file", "ruff format $file"]

pydantic-2.10.6/release/README.md

# Release Instructions
**Note:** _This should only apply to maintainers when preparing for and publishing a new release._

Prerequisites:

* `gh` cli is installed - see installation instructions [here](https://docs.github.com/en/github-cli/github-cli/quickstart)
* Run `gh auth login` to authenticate with GitHub, which is needed for the API calls made in the release process.

To create a new release:

1. Edit `pydantic/version.py` to set the new version number and run `uv lock -P pydantic`
2. **(If the new version is a new minor or major release)** run `pre-commit run -a usage_docs` to update the usage links in docstrings.
3. Run `uv run release/make_history.py` to update `HISTORY.md` and `CITATION.cff`.
4. **Important:** curate the changes in `HISTORY.md`:
    - make sure the markdown is valid; in particular, check that text that should be in code blocks is.
    - mark any breaking changes with `**Breaking Change:**`
    - curate the list of pydantic-core updates in the `packaging` section:
        - check the corresponding pydantic-core releases for any highlights to manually add to the history
        - deduplicate the `packaging` entries to include only the most recent version bumps for each package
5. Create a pull request with these changes.
6. Once the pull request is merged, create a new release on GitHub:
    - the tag should be `v{VERSION}`
    - the title should be `v{VERSION} {DATE}`
    - the body should contain:
        - a copy-paste of the `HISTORY.md` section you prepared previously, plus
        - a full changelog link in the form `Full Changelog: https://github.com/pydantic/pydantic/compare/v{PREV_VERSION}...v{VERSION}/`
7. Ask @samuelcolvin or @dmontagu to approve the release once CI has run.
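Step 6 above derives the tag, title, and changelog link from the version strings; a minimal sketch of that formatting (the version numbers here are illustrative — in practice `VERSION` comes from `pydantic/version.py` and the previous version from the last git tag):

```python
from datetime import date

# Illustrative values only, not read from the repo:
VERSION = '2.10.6'
PREV_VERSION = '2.10.5'

# tag: v{VERSION}
tag = f'v{VERSION}'
# title: v{VERSION} {DATE}
title = f'{tag} {date.today():%Y-%m-%d}'
# full changelog link appended to the release body
changelog_link = f'Full Changelog: https://github.com/pydantic/pydantic/compare/v{PREV_VERSION}...{tag}/'
```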
pydantic-2.10.6/release/make_history.py000066400000000000000000000076601474456633400201340ustar00rootroot00000000000000from __future__ import annotations as _annotations import argparse import json import re import subprocess import sys from datetime import date from pathlib import Path import requests def main(): root_dir = Path(__file__).parent.parent parser = argparse.ArgumentParser() # For easier iteration, can generate the release notes without saving parser.add_argument('--preview', help='print preview of release notes to terminal without saving to HISTORY.md') args = parser.parse_args() if args.preview: new_version = args.preview else: version_file = root_dir / 'pydantic' / 'version.py' new_version = re.search(r"VERSION = '(.*)'", version_file.read_text()).group(1) history_path = root_dir / 'HISTORY.md' history_content = history_path.read_text() # use ( to avoid matching beta versions if f'## v{new_version} (' in history_content: print(f'WARNING: v{new_version} already in history, stopping') sys.exit(1) date_today_str = f'{date.today():%Y-%m-%d}' title = f'v{new_version} ({date_today_str})' notes = get_notes(new_version) new_chunk = ( f'## {title}\n\n' f'[GitHub release](https://github.com/pydantic/pydantic/releases/tag/v{new_version})\n\n' f'{notes}\n\n' ) if args.preview: print(new_chunk) return history = new_chunk + history_content history_path.write_text(history) print(f'\nSUCCESS: added "{title}" section to {history_path.relative_to(root_dir)}') citation_path = root_dir / 'CITATION.cff' citation_text = citation_path.read_text() if not (alpha_version := 'a' in new_version) and not (beta_version := 'b' in new_version): citation_text = re.sub(r'(?<=\nversion: ).*', f'v{new_version}', citation_text) citation_text = re.sub(r'(?<=date-released: ).*', date_today_str, citation_text) citation_path.write_text(citation_text) print( f'SUCCESS: updated version=v{new_version} and date-released={date_today_str} in {citation_path.relative_to(root_dir)}' ) else: print( 
f'WARNING: not updating CITATION.cff because version is {"alpha" if alpha_version else "beta"} version {new_version}' ) def get_notes(new_version: str) -> str: last_tag = get_last_tag() auth_token = get_gh_auth_token() data = {'target_committish': 'main', 'previous_tag_name': last_tag, 'tag_name': f'v{new_version}'} response = requests.post( 'https://api.github.com/repos/pydantic/pydantic/releases/generate-notes', headers={ 'Accept': 'application/vnd.github+json', 'Authorization': f'Bearer {auth_token}', 'x-github-api-version': '2022-11-28', }, data=json.dumps(data), ) response.raise_for_status() body = response.json()['body'] body = body.replace('\n\n', '') # Add one level to all headers so they match HISTORY.md, and add trailing newline body = re.sub(pattern='^(#+ .+?)$', repl=r'#\1\n', string=body, flags=re.MULTILINE) # Ensure a blank line before headers body = re.sub(pattern='([^\n])(\n#+ .+?\n)', repl=r'\1\n\2', string=body) # Render PR links nicely body = re.sub( pattern='https://github.com/pydantic/pydantic/pull/(\\d+)', repl=r'[#\1](https://github.com/pydantic/pydantic/pull/\1)', string=body, ) # Remove "full changelog" link body = re.sub( pattern=r'\*\*Full Changelog\*\*: https://.*$', repl='', string=body, ) return body.strip() def get_last_tag(): return run('git', 'describe', '--tags', '--abbrev=0') def get_gh_auth_token(): return run('gh', 'auth', 'token') def run(*args: str) -> str: p = subprocess.run(args, stdout=subprocess.PIPE, check=True, encoding='utf-8') return p.stdout.strip() if __name__ == '__main__': main() 
pydantic-2.10.6/tests/__init__.py
pydantic-2.10.6/tests/benchmarks/__init__.py
pydantic-2.10.6/tests/benchmarks/basemodel_eq_performance.py

from __future__ import annotations

import dataclasses
import enum
import gc
import itertools
import operator
import sys
import textwrap
import timeit
from importlib import metadata
from typing import TYPE_CHECKING, Any, Callable, Generic, Iterable, Sized, TypeVar

# Do not import additional dependencies at top-level
if TYPE_CHECKING:
    import matplotlib.pyplot as plt
    import numpy as np
    from matplotlib import axes, figure

import pydantic

PYTHON_VERSION = '.'.join(map(str, sys.version_info))
PYDANTIC_VERSION = metadata.version('pydantic')


# New implementation of pydantic.BaseModel.__eq__ to test
class OldImplementationModel(pydantic.BaseModel, frozen=True):
    def __eq__(self, other: Any) -> bool:
        if isinstance(other, pydantic.BaseModel):
            # When comparing instances of generic types for equality, as long as all field values are equal,
            # only require their generic origin types to be equal, rather than exact type equality.
            # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1).
self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__ other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__ return ( self_type == other_type and self.__dict__ == other.__dict__ and self.__pydantic_private__ == other.__pydantic_private__ and self.__pydantic_extra__ == other.__pydantic_extra__ ) else: return NotImplemented # delegate to the other item in the comparison class DictComprehensionEqModel(pydantic.BaseModel, frozen=True): def __eq__(self, other: Any) -> bool: if isinstance(other, pydantic.BaseModel): # When comparing instances of generic types for equality, as long as all field values are equal, # only require their generic origin types to be equal, rather than exact type equality. # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1). self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__ other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__ field_names = type(self).model_fields.keys() return ( self_type == other_type and ({k: self.__dict__[k] for k in field_names} == {k: other.__dict__[k] for k in field_names}) and self.__pydantic_private__ == other.__pydantic_private__ and self.__pydantic_extra__ == other.__pydantic_extra__ ) else: return NotImplemented # delegate to the other item in the comparison class ItemGetterEqModel(pydantic.BaseModel, frozen=True): def __eq__(self, other: Any) -> bool: if isinstance(other, pydantic.BaseModel): # When comparing instances of generic types for equality, as long as all field values are equal, # only require their generic origin types to be equal, rather than exact type equality. # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1). 
self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__ other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__ model_fields = type(self).model_fields.keys() getter = operator.itemgetter(*model_fields) if model_fields else lambda _: None return ( self_type == other_type and getter(self.__dict__) == getter(other.__dict__) and self.__pydantic_private__ == other.__pydantic_private__ and self.__pydantic_extra__ == other.__pydantic_extra__ ) else: return NotImplemented # delegate to the other item in the comparison class ItemGetterEqModelFastPath(pydantic.BaseModel, frozen=True): def __eq__(self, other: Any) -> bool: if isinstance(other, pydantic.BaseModel): # When comparing instances of generic types for equality, as long as all field values are equal, # only require their generic origin types to be equal, rather than exact type equality. # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1). self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__ other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__ # Perform common checks first if not ( self_type == other_type and self.__pydantic_private__ == other.__pydantic_private__ and self.__pydantic_extra__ == other.__pydantic_extra__ ): return False # Fix GH-7444 by comparing only pydantic fields # We provide a fast-path for performance: __dict__ comparison is *much* faster # See tests/benchmarks/test_basemodel_eq_performances.py and GH-7825 for benchmarks if self.__dict__ == other.__dict__: # If the check above passes, then pydantic fields are equal, we can return early return True else: # Else, we need to perform a more detailed, costlier comparison model_fields = type(self).model_fields.keys() getter = operator.itemgetter(*model_fields) if model_fields else lambda _: None return getter(self.__dict__) == getter(other.__dict__) else: return NotImplemented # delegate to the other item in the comparison K = TypeVar('K') V = 
TypeVar('V')

# We need a sentinel value for missing fields when comparing models.
# Models are equal if-and-only-if they miss the same fields, and since None is a legitimate value
# we can't default to None.
# We use the single-value enum trick to allow correct typing when using a sentinel
class _SentinelType(enum.Enum):
    SENTINEL = enum.auto()


_SENTINEL = _SentinelType.SENTINEL


@dataclasses.dataclass
class _SafeGetItemProxy(Generic[K, V]):
    """Wrapper redirecting `__getitem__` to `get` with a sentinel value

    This makes it safe to use in `operator.itemgetter` when some keys may be missing
    """

    wrapped: dict[K, V]

    def __getitem__(self, key: K, /) -> V | _SentinelType:
        return self.wrapped.get(key, _SENTINEL)

    def __contains__(self, key: K, /) -> bool:
        return self.wrapped.__contains__(key)


class SafeItemGetterEqModelFastPath(pydantic.BaseModel, frozen=True):
    def __eq__(self, other: Any) -> bool:
        if isinstance(other, pydantic.BaseModel):
            # When comparing instances of generic types for equality, as long as all field values are equal,
            # only require their generic origin types to be equal, rather than exact type equality.
            # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1).
self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__ other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__ # Perform common checks first if not ( self_type == other_type and self.__pydantic_private__ == other.__pydantic_private__ and self.__pydantic_extra__ == other.__pydantic_extra__ ): return False # Fix GH-7444 by comparing only pydantic fields # We provide a fast-path for performance: __dict__ comparison is *much* faster # See tests/benchmarks/test_basemodel_eq_performances.py and GH-7825 for benchmarks if self.__dict__ == other.__dict__: # If the check above passes, then pydantic fields are equal, we can return early return True else: # Else, we need to perform a more detailed, costlier comparison model_fields = type(self).model_fields.keys() getter = operator.itemgetter(*model_fields) if model_fields else lambda _: None return getter(_SafeGetItemProxy(self.__dict__)) == getter(_SafeGetItemProxy(other.__dict__)) else: return NotImplemented # delegate to the other item in the comparison class ItemGetterEqModelFastPathFallback(pydantic.BaseModel, frozen=True): def __eq__(self, other: Any) -> bool: if isinstance(other, pydantic.BaseModel): # When comparing instances of generic types for equality, as long as all field values are equal, # only require their generic origin types to be equal, rather than exact type equality. # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1). 
self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__ other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__ # Perform common checks first if not ( self_type == other_type and self.__pydantic_private__ == other.__pydantic_private__ and self.__pydantic_extra__ == other.__pydantic_extra__ ): return False # Fix GH-7444 by comparing only pydantic fields # We provide a fast-path for performance: __dict__ comparison is *much* faster # See tests/benchmarks/test_basemodel_eq_performances.py and GH-7825 for benchmarks if self.__dict__ == other.__dict__: # If the check above passes, then pydantic fields are equal, we can return early return True else: # Else, we need to perform a more detailed, costlier comparison model_fields = type(self).model_fields.keys() getter = operator.itemgetter(*model_fields) if model_fields else lambda _: None try: return getter(self.__dict__) == getter(other.__dict__) except KeyError: return getter(_SafeGetItemProxy(self.__dict__)) == getter(_SafeGetItemProxy(other.__dict__)) else: return NotImplemented # delegate to the other item in the comparison IMPLEMENTATIONS = { # Commented out because it is too slow for benchmark to complete in reasonable time # "dict comprehension": DictComprehensionEqModel, 'itemgetter': ItemGetterEqModel, 'itemgetter+fastpath': ItemGetterEqModelFastPath, # Commented-out because it is too slow to run with run_benchmark_random_unequal #'itemgetter+safety+fastpath': SafeItemGetterEqModelFastPath, 'itemgetter+fastpath+safe-fallback': ItemGetterEqModelFastPathFallback, } # Benchmark running & plotting code def plot_all_benchmark( bases: dict[str, type[pydantic.BaseModel]], sizes: list[int], ) -> figure.Figure: import matplotlib.pyplot as plt n_rows, n_cols = len(BENCHMARKS), 2 fig, axes = plt.subplots(n_rows, n_cols, figsize=(n_cols * 6, n_rows * 4)) for row, (name, benchmark) in enumerate(BENCHMARKS.items()): for col, mimic_cached_property in enumerate([False, True]): 
plot_benchmark( f'{name}, {mimic_cached_property=}', benchmark, bases=bases, sizes=sizes, mimic_cached_property=mimic_cached_property, ax=axes[row, col], ) for ax in axes.ravel(): ax.legend() fig.suptitle(f'python {PYTHON_VERSION}, pydantic {PYDANTIC_VERSION}') return fig def plot_benchmark( title: str, benchmark: Callable, bases: dict[str, type[pydantic.BaseModel]], sizes: list[int], mimic_cached_property: bool, ax: axes.Axes | None = None, ): import matplotlib.pyplot as plt import numpy as np ax = ax or plt.gca() arr_sizes = np.asarray(sizes) baseline = benchmark( title=f'{title}, baseline', base=OldImplementationModel, sizes=sizes, mimic_cached_property=mimic_cached_property, ) ax.plot(sizes, baseline / baseline, label='baseline') for name, base in bases.items(): times = benchmark( title=f'{title}, {name}', base=base, sizes=sizes, mimic_cached_property=mimic_cached_property, ) mask_valid = ~np.isnan(times) ax.plot(arr_sizes[mask_valid], times[mask_valid] / baseline[mask_valid], label=name) ax.set_title(title) ax.set_xlabel('Number of pydantic fields') ax.set_ylabel('Average time relative to baseline') return ax class SizedIterable(Sized, Iterable): pass def run_benchmark_nodiff( title: str, base: type[pydantic.BaseModel], sizes: SizedIterable, mimic_cached_property: bool, n_execution: int = 10_000, n_repeat: int = 5, ) -> np.ndarray: setup = textwrap.dedent( """ import pydantic Model = pydantic.create_model( "Model", __base__=Base, **{f'x{i}': (int, i) for i in range(%(size)d)} ) left = Model() right = Model() """ ) if mimic_cached_property: # Mimic functools.cached_property editing __dict__ # NOTE: we must edit both objects, otherwise the dict don't have the same size and # dict.__eq__ has a very fast path. 
This makes our timing comparison incorrect # However, the value must be different, otherwise *our* __dict__ == right.__dict__ # fast-path prevents our correct code from running setup += textwrap.dedent( """ object.__setattr__(left, 'cache', None) object.__setattr__(right, 'cache', -1) """ ) statement = 'left == right' namespace = {'Base': base} return run_benchmark( title, setup=setup, statement=statement, n_execution=n_execution, n_repeat=n_repeat, globals=namespace, params={'size': sizes}, ) def run_benchmark_first_diff( title: str, base: type[pydantic.BaseModel], sizes: SizedIterable, mimic_cached_property: bool, n_execution: int = 10_000, n_repeat: int = 5, ) -> np.ndarray: setup = textwrap.dedent( """ import pydantic Model = pydantic.create_model( "Model", __base__=Base, **{f'x{i}': (int, i) for i in range(%(size)d)} ) left = Model() right = Model(f0=-1) if %(size)d > 0 else Model() """ ) if mimic_cached_property: # Mimic functools.cached_property editing __dict__ # NOTE: we must edit both objects, otherwise the dict don't have the same size and # dict.__eq__ has a very fast path. 
This makes our timing comparison incorrect # However, the value must be different, otherwise *our* __dict__ == right.__dict__ # fast-path prevents our correct code from running setup += textwrap.dedent( """ object.__setattr__(left, 'cache', None) object.__setattr__(right, 'cache', -1) """ ) statement = 'left == right' namespace = {'Base': base} return run_benchmark( title, setup=setup, statement=statement, n_execution=n_execution, n_repeat=n_repeat, globals=namespace, params={'size': sizes}, ) def run_benchmark_last_diff( title: str, base: type[pydantic.BaseModel], sizes: SizedIterable, mimic_cached_property: bool, n_execution: int = 10_000, n_repeat: int = 5, ) -> np.ndarray: setup = textwrap.dedent( """ import pydantic Model = pydantic.create_model( "Model", __base__=Base, # shift the range() so that there is a field named size **{f'x{i}': (int, i) for i in range(1, %(size)d + 1)} ) left = Model() right = Model(f%(size)d=-1) if %(size)d > 0 else Model() """ ) if mimic_cached_property: # Mimic functools.cached_property editing __dict__ # NOTE: we must edit both objects, otherwise the dict don't have the same size and # dict.__eq__ has a very fast path. 
This makes our timing comparison incorrect # However, the value must be different, otherwise *our* __dict__ == right.__dict__ # fast-path prevents our correct code from running setup += textwrap.dedent( """ object.__setattr__(left, 'cache', None) object.__setattr__(right, 'cache', -1) """ ) statement = 'left == right' namespace = {'Base': base} return run_benchmark( title, setup=setup, statement=statement, n_execution=n_execution, n_repeat=n_repeat, globals=namespace, params={'size': sizes}, ) def run_benchmark_random_unequal( title: str, base: type[pydantic.BaseModel], sizes: SizedIterable, mimic_cached_property: bool, n_samples: int = 100, n_execution: int = 1_000, n_repeat: int = 5, ) -> np.ndarray: import numpy as np setup = textwrap.dedent( """ import pydantic Model = pydantic.create_model( "Model", __base__=Base, **{f'x{i}': (int, i) for i in range(%(size)d)} ) left = Model() right = Model(f%(field)d=-1) """ ) if mimic_cached_property: # Mimic functools.cached_property editing __dict__ # NOTE: we must edit both objects, otherwise the dict don't have the same size and # dict.__eq__ has a very fast path. 
This makes our timing comparison incorrect # However, the value must be different, otherwise *our* __dict__ == right.__dict__ # fast-path prevents our correct code from running setup += textwrap.dedent( """ object.__setattr__(left, 'cache', None) object.__setattr__(right, 'cache', -1) """ ) statement = 'left == right' namespace = {'Base': base} arr_sizes = np.fromiter(sizes, dtype=int) mask_valid_sizes = arr_sizes > 0 arr_valid_sizes = arr_sizes[mask_valid_sizes] # we can't support 0 when sampling the field rng = np.random.default_rng() arr_fields = rng.integers(arr_valid_sizes, size=(n_samples, *arr_valid_sizes.shape)) # broadcast the sizes against their sample, so we can iterate on (size, field) tuple # as parameters of the timing test arr_size_broadcast, _ = np.meshgrid(arr_valid_sizes, arr_fields[:, 0]) results = run_benchmark( title, setup=setup, statement=statement, n_execution=n_execution, n_repeat=n_repeat, globals=namespace, params={'size': arr_size_broadcast.ravel(), 'field': arr_fields.ravel()}, ) times = np.empty(arr_sizes.shape, dtype=float) times[~mask_valid_sizes] = np.nan times[mask_valid_sizes] = results.reshape((n_samples, *arr_valid_sizes.shape)).mean(axis=0) return times BENCHMARKS = { 'All field equals': run_benchmark_nodiff, 'First field unequal': run_benchmark_first_diff, 'Last field unequal': run_benchmark_last_diff, 'Random unequal field': run_benchmark_random_unequal, } def run_benchmark( title: str, setup: str, statement: str, n_execution: int = 10_000, n_repeat: int = 5, globals: dict[str, Any] | None = None, progress_bar: bool = True, params: dict[str, SizedIterable] | None = None, ) -> np.ndarray: import numpy as np import tqdm.auto as tqdm namespace = globals or {} # fast-path if not params: length = 1 packed_params = [()] else: length = len(next(iter(params.values()))) # This iterator yields a tuple of (key, value) pairs # First, make a list of N iterator over (key, value), where the provided values are iterated param_pairs = 
[zip(itertools.repeat(name), value) for name, value in params.items()] # Then pack our individual parameter iterator into one packed_params = zip(*param_pairs) times = [ # Take the min of the repeats as recommended by timeit doc min( timeit.Timer( setup=setup % dict(local_params), stmt=statement, globals=namespace, ).repeat(repeat=n_repeat, number=n_execution) ) / n_execution for local_params in tqdm.tqdm(packed_params, desc=title, total=length, disable=not progress_bar) ] gc.collect() return np.asarray(times, dtype=float) if __name__ == '__main__': # run with `uv run tests/benchmarks/test_basemodel_eq_performance.py` import argparse import pathlib try: import matplotlib # noqa: F401 import numpy # noqa: F401 import tqdm # noqa: F401 except ImportError as err: raise ImportError( 'This benchmark additionally depends on numpy, matplotlib and tqdm. ' 'Install those in your environment to run the benchmark.' ) from err parser = argparse.ArgumentParser( description='Test the performance of various BaseModel.__eq__ implementations fixing GH-7444.' ) parser.add_argument( '-o', '--output-path', default=None, type=pathlib.Path, help=( 'Output directory or file in which to save the benchmark results. ' 'If a directory is passed, a default filename is used.' ), ) parser.add_argument( '--min-n-fields', type=int, default=0, help=('Test the performance of BaseModel.__eq__ on models with at least this number of fields. Defaults to 0.'), ) parser.add_argument( '--max-n-fields', type=int, default=100, help=('Test the performance of BaseModel.__eq__ on models with up to this number of fields. 
Defaults to 100.'), ) args = parser.parse_args() import matplotlib.pyplot as plt sizes = list(range(args.min_n_fields, args.max_n_fields)) fig = plot_all_benchmark(IMPLEMENTATIONS, sizes=sizes) plt.tight_layout() if args.output_path is None: plt.show() else: if args.output_path.suffix: filepath = args.output_path else: filepath = args.output_path / f"eq-benchmark_python-{PYTHON_VERSION.replace('.', '-')}.png" fig.savefig( filepath, dpi=200, facecolor='white', transparent=False, ) print(f'wrote {filepath!s}', file=sys.stderr) pydantic-2.10.6/tests/benchmarks/generate_north_star_data.py000066400000000000000000000062451474456633400243210ustar00rootroot00000000000000from datetime import datetime from typing import Any, Callable, List, TypeVar, Union from faker import Faker f = Faker() Faker.seed(0) T = TypeVar('T') ## Helper functions # by default faker uses upper bound of now for datetime, which # is not helpful for reproducing benchmark data _END_DATETIME = datetime(2023, 1, 1, 0, 0, 0, 0) def one_of(*callables: Callable[[], Any]) -> Any: return f.random.choice(callables)() def list_of(callable: Callable[[], T], max_length: int) -> List[T]: return [callable() for _ in range(f.random_int(max=max_length))] def lax_int(*args: Any, **kwargs: Any) -> Union[int, float, str]: return f.random.choice((int, float, str))(f.random_int(*args, **kwargs)) def lax_float(*args: Any, **kwargs: Any) -> Union[int, float, str]: return f.random.choice((int, float, str))(f.pyfloat(*args, **kwargs)) def time_seconds() -> int: dt = f.date_time(end_datetime=_END_DATETIME) midnight = dt.replace(hour=0, minute=0, second=0, microsecond=0) return (dt - midnight).total_seconds() def time_microseconds() -> float: return float(time_seconds()) + (f.random_int(max=999999) * 1e-6) def time_string() -> str: return f.time() def lax_time() -> Union[int, float, str]: return one_of(time_seconds, time_microseconds, time_string) def date_string() -> str: return 
f.date(end_datetime=_END_DATETIME).format('%Y-%m-%d') def datetime_timestamp() -> int: dt = f.date_time(end_datetime=_END_DATETIME) midnight = dt.replace(hour=0, minute=0, second=0, microsecond=0) return (dt - midnight).total_seconds() def datetime_microseconds() -> float: return float(datetime_timestamp()) + (f.random_int(max=999999) * 1e-6) def datetime_str() -> str: return f.date_time(end_datetime=_END_DATETIME).isoformat() def lax_datetime() -> Union[int, float, str]: return one_of(datetime_timestamp, datetime_microseconds, datetime_str) ## Sample data generators def blog() -> dict: return { 'type': 'blog', 'title': f.text(max_nb_chars=40), 'post_count': lax_int(), 'readers': lax_int(), 'avg_post_rating': lax_float(min_value=0, max_value=5), 'url': f.url(), } def social_profile() -> dict: return { 'type': 'profile', 'username': f.user_name(), 'join_date': date_string(), **one_of(facebook_profile, twitter_profile, linkedin_profile), } def facebook_profile() -> dict: return {'network': 'facebook', 'friends': lax_int()} def twitter_profile() -> dict: return {'network': 'twitter', 'followers': lax_int()} def linkedin_profile() -> dict: return {'network': 'linkedin', 'connections': min(f.random_int(), 500)} def website() -> dict: return one_of(blog, social_profile) def person() -> dict: return { 'id': f.uuid4(), 'name': f.name(), 'height': str(f.pydecimal(min_value=1, max_value=2, right_digits=2)), 'entry_created_date': date_string(), 'entry_created_time': lax_time(), 'entry_updated_at': lax_datetime(), 'websites': list_of(website, max_length=5), } def person_data(length: int) -> List[dict]: return [person() for _ in range(length)] pydantic-2.10.6/tests/benchmarks/shared.py000066400000000000000000000115771474456633400205450ustar00rootroot00000000000000from collections import deque from datetime import date, datetime, time, timedelta from decimal import Decimal from enum import Enum, IntEnum from ipaddress import ( IPv4Address, IPv4Interface, IPv4Network, 
IPv6Address, IPv6Interface, IPv6Network, ) from pathlib import Path from re import Pattern from typing import ( Any, Callable, Deque, Dict, FrozenSet, Iterable, List, NamedTuple, Optional, Sequence, Set, Tuple, Type, Union, ) from uuid import UUID, uuid4, uuid5 from typing_extensions import Literal, TypedDict from pydantic import ( UUID1, UUID3, UUID4, UUID5, Base64Bytes, Base64Str, Base64UrlBytes, Base64UrlStr, BaseModel, ByteSize, DirectoryPath, FilePath, FiniteFloat, FutureDate, ImportString, Json, JsonValue, NegativeFloat, NegativeInt, NewPath, NonNegativeFloat, NonNegativeInt, NonPositiveFloat, NonPositiveInt, OnErrorOmit, PastDate, PastDatetime, PositiveFloat, PositiveInt, Secret, SecretBytes, SecretStr, StrictBool, ) class SimpleModel(BaseModel): field1: str field2: int field3: float class NestedModel(BaseModel): field1: str field2: List[int] field3: Dict[str, float] class OuterModel(BaseModel): nested: NestedModel optional_nested: Optional[NestedModel] class ComplexModel(BaseModel): field1: Union[str, int, float] field2: List[Dict[str, Union[int, float]]] field3: Optional[List[Union[str, int]]] class Color(Enum): RED = 'red' GREEN = 'green' BLUE = 'blue' class ToolEnum(IntEnum): spanner = 1 wrench = 2 screwdriver = 3 class Point(NamedTuple): x: int y: int class User(TypedDict): name: str id: int class Foo: pass StdLibTypes = [ deque, # collections.deque Deque[str], # typing.Deque Deque[int], # typing.Deque Deque[float], # typing.Deque Deque[bytes], # typing.Deque str, # str int, # int float, # float complex, # complex bool, # bool bytes, # bytes date, # datetime.date datetime, # datetime.datetime time, # datetime.time timedelta, # datetime.timedelta Decimal, # decimal.Decimal Color, # enum ToolEnum, # int enum IPv4Address, # ipaddress.IPv4Address IPv6Address, # ipaddress.IPv6Address IPv4Interface, # ipaddress.IPv4Interface IPv6Interface, # ipaddress.IPv6Interface IPv4Network, # ipaddress.IPv4Network IPv6Network, # ipaddress.IPv6Network Path, # pathlib.Path 
Pattern, # typing.Pattern UUID, # uuid.UUID uuid4, # uuid.uuid4 uuid5, # uuid.uuid5 Point, # named tuple list, # built-in list List[int], # typing.List List[str], # typing.List List[bytes], # typing.List List[float], # typing.List dict, # built-in dict Dict[str, float], # typing.Dict Dict[str, bytes], # typing.Dict Dict[str, int], # typing.Dict Dict[str, str], # typing.Dict User, # TypedDict tuple, # tuple Tuple[int, str, float], # typing.Tuple set, # built-in set Set[int], # typing.Set Set[str], # typing.Set frozenset, # built-in frozenset FrozenSet[int], # typing.FrozenSet FrozenSet[str], # typing.FrozenSet Optional[int], # typing.Optional Optional[str], # typing.Optional Optional[float], # typing.Optional Optional[bytes], # typing.Optional Optional[bool], # typing.Optional Sequence[int], # typing.Sequence Sequence[str], # typing.Sequence Sequence[bytes], # typing.Sequence Sequence[float], # typing.Sequence Iterable[int], # typing.Iterable Iterable[str], # typing.Iterable Iterable[bytes], # typing.Iterable Iterable[float], # typing.Iterable Callable[[int], int], # typing.Callable Callable[[str], str], # typing.Callable Literal['apple', 'pumpkin'], # Type[Foo], # typing.Type Any, # typing.Any ] PydanticTypes = [ StrictBool, PositiveInt, PositiveFloat, NegativeInt, NegativeFloat, NonNegativeInt, NonPositiveInt, NonNegativeFloat, NonPositiveFloat, FiniteFloat, UUID1, UUID3, UUID4, UUID5, FilePath, DirectoryPath, NewPath, Base64Bytes, Base64Str, Base64UrlBytes, Base64UrlStr, JsonValue, OnErrorOmit, ImportString, Json[Any], Json[List[int]], Json[List[str]], Json[List[bytes]], Json[List[float]], Json[List[Any]], Secret[bool], Secret[int], Secret[float], Secret[str], Secret[bytes], SecretStr, SecretBytes, ByteSize, PastDate, FutureDate, PastDatetime, ] class DeferredModel(BaseModel): model_config = {'defer_build': True} def rebuild_model(model: Type[BaseModel]) -> None: model.model_rebuild(force=True, _types_namespace={}) 
pydantic-2.10.6/tests/benchmarks/test_discriminated_unions.py

from __future__ import annotations

from typing import Literal, Union

import pytest
from typing_extensions import Annotated

from pydantic import BaseModel, Field, TypeAdapter


class NestedState(BaseModel):
    state_type: Literal['nested']
    substate: AnyState


class LoopState(BaseModel):
    state_type: Literal['loop']
    substate: AnyState


class LeafState(BaseModel):
    state_type: Literal['leaf']


AnyState = Annotated[Union[NestedState, LoopState, LeafState], Field(discriminator='state_type')]


@pytest.mark.benchmark
def test_schema_build(benchmark) -> None:
    @benchmark
    def run():
        adapter = TypeAdapter(AnyState)
        assert adapter.core_schema['schema']['type'] == 'tagged-union'


any_state_adapter = TypeAdapter(AnyState)


def build_nested_state(n):
    if n <= 0:
        return {'state_type': 'leaf'}
    else:
        return {'state_type': 'loop', 'substate': {'state_type': 'nested', 'substate': build_nested_state(n - 1)}}


@pytest.mark.benchmark
def test_efficiency_with_highly_nested_examples(benchmark) -> None:
    # can go much higher, but we keep it reasonably low here for a proof of concept
    @benchmark
    def run():
        for i in range(1, 12):
            very_nested_input = build_nested_state(i)
            any_state_adapter.validate_python(very_nested_input)


pydantic-2.10.6/tests/benchmarks/test_fastapi_startup_generics.py

"""https://github.com/pydantic/pydantic/issues/6768"""

from __future__ import annotations

from pathlib import Path
from typing import Any, Generic, List, TypeVar

from typing_extensions import Annotated

from pydantic import BaseModel, TypeAdapter, create_model
from pydantic.fields import FieldInfo

TYPES_DEFAULTS = {int: 0, str: '', bool: False}  # some dummy basic types with defaults for some fields
TYPES = [*TYPES_DEFAULTS.keys()]

# these are set low to minimise test time, they're increased below in the
cProfile call INNER_DATA_MODEL_COUNT = 5 OUTER_DATA_MODEL_COUNT = 5 def create_data_models() -> list[Any]: # Create varying inner models with different sizes and fields (not actually realistic) models = [] for i in range(INNER_DATA_MODEL_COUNT): fields = {} for j in range(i): type_ = TYPES[j % len(TYPES)] type_default = TYPES_DEFAULTS[type_] if j % 4 == 0: type_ = List[type_] type_default = [] default = ... if j % 2 == 0 else type_default fields[f'f{j}'] = (type_, default) models.append(create_model(f'M1{i}', **fields)) # Crate varying outer models where some fields use the inner models (not really realistic) models_with_nested = [] for i in range(OUTER_DATA_MODEL_COUNT): fields = {} for j in range(i): type_ = models[j % len(models)] if j % 2 == 0 else TYPES[j % len(TYPES)] if j % 4 == 0: type_ = List[type_] fields[f'f{j}'] = (type_, ...) models_with_nested.append(create_model(f'M2{i}', **fields)) return [*models, *models_with_nested] def test_fastapi_startup_perf(benchmark: Any): data_models = create_data_models() # API models for reading / writing the different data models T = TypeVar('T') class GetModel(BaseModel, Generic[T]): res: T class GetModel2(GetModel[T], Generic[T]): foo: str bar: str class GetManyModel(BaseModel, Generic[T]): res: list[T] class GetManyModel2(GetManyModel[T], Generic[T]): foo: str bar: str class GetManyModel3(BaseModel, Generic[T]): res: dict[str, T] class GetManyModel4(BaseModel, Generic[T]): res: dict[str, list[T]] class PutModel(BaseModel, Generic[T]): data: T class PutModel2(PutModel[T], Generic[T]): foo: str bar: str class PutManyModel(BaseModel, Generic[T]): data: list[T] class PutManyModel2(PutManyModel[T], Generic[T]): foo: str bar: str api_models: list[Any] = [ GetModel, GetModel2, GetManyModel, GetManyModel2, GetManyModel3, GetManyModel4, PutModel, PutModel2, PutManyModel, PutManyModel2, ] assert len(data_models) == INNER_DATA_MODEL_COUNT + OUTER_DATA_MODEL_COUNT def bench(): concrete_api_models = [] adapters = [] for 
outer_api_model in api_models: for data_model in data_models: concrete_api_model = outer_api_model[ data_model ] # Would be used eg as request or response body in FastAPI concrete_api_models.append(concrete_api_model) # Emulate FastAPI creating its TypeAdapters adapt = TypeAdapter(Annotated[concrete_api_model, FieldInfo(description='foo')]) adapters.append(adapt) adapt = TypeAdapter(Annotated[concrete_api_model, FieldInfo(description='bar')]) adapters.append(adapt) assert len(concrete_api_models) == len(data_models) * len(api_models) assert len(adapters) == len(concrete_api_models) * 2 benchmark(bench) if __name__ == '__main__': # run with `uv run tests/benchmarks/test_fastapi_startup.py` import cProfile import sys import time INNER_DATA_MODEL_COUNT = 50 OUTER_DATA_MODEL_COUNT = 50 print(f'Python version: {sys.version}') if sys.argv[-1] == 'cProfile': cProfile.run( 'test_fastapi_startup_perf(lambda f: f())', sort='tottime', filename=Path(__file__).name.strip('.py') + '.cprof', ) else: start = time.perf_counter() test_fastapi_startup_perf(lambda f: f()) end = time.perf_counter() print(f'Time taken: {end - start:.2f}s') pydantic-2.10.6/tests/benchmarks/test_fastapi_startup_simple.py000066400000000000000000000062571474456633400251170ustar00rootroot00000000000000"""https://github.com/pydantic/pydantic/issues/6768""" from __future__ import annotations from datetime import datetime from pathlib import Path from typing import Any, Callable, Dict, List, Tuple from uuid import UUID import pytest from annotated_types import Gt from typing_extensions import Annotated from pydantic import AnyUrl, BaseModel, EmailStr, TypeAdapter from pydantic.functional_validators import AfterValidator from pydantic.types import StringConstraints try: import email_validator except ImportError: email_validator = None @pytest.mark.skipif(not email_validator, reason='email_validator not installed') def test_fastapi_startup_perf(benchmark: Callable[[Callable[[], Any]], None]): def run() -> None: 
class User(BaseModel): id: int username: str email: EmailStr full_name: str | None = None class Address(BaseModel): street: str city: str state: Annotated[str, AfterValidator(lambda x: x.upper())] postal_code: Annotated[str, StringConstraints(min_length=5, max_length=5, pattern=r'[A-Z0-9]+')] class Product(BaseModel): id: int name: str price: Annotated[float, Gt(0)] description: str | None = None class BlogPost(BaseModel): title: Annotated[str, StringConstraints(pattern=r'[A-Za-z0-9]+')] content: str author: User published: bool = False class Website(BaseModel): name: str url: AnyUrl description: str | None = None class Order(BaseModel): order_id: str customer: User shipping_address: Address products: list[Product] class Comment(BaseModel): text: str author: User post: BlogPost created_at: datetime class Event(BaseModel): event_id: UUID name: str date: datetime location: str class Category(BaseModel): name: str description: str | None = None ReviewGroup = List[Dict[Tuple[User, Product], Comment]] data_models = [ User, Address, Product, BlogPost, Website, Order, Comment, Event, Category, ReviewGroup, ] for _ in range(5): # FastAPI creates a new TypeAdapter for each endpoint for model in data_models: TypeAdapter(model) benchmark(run) if __name__ == '__main__': # run with `uv run tests/benchmarks/test_fastapi_startup_simple.py` import cProfile import sys import time print(f'Python version: {sys.version}') if sys.argv[-1] == 'cProfile': cProfile.run( 'test_fastapi_startup_perf(lambda f: f())', sort='tottime', filename=Path(__file__).name.strip('.py') + '.cprof', ) else: start = time.perf_counter() test_fastapi_startup_perf(lambda f: f()) end = time.perf_counter() print(f'Time taken: {end - start:.6f}s') pydantic-2.10.6/tests/benchmarks/test_imports.py000066400000000000000000000005161474456633400220220ustar00rootroot00000000000000import pytest @pytest.mark.benchmark def test_import_basemodel(benchmark) -> None: @benchmark def run(): from pydantic import BaseModel assert 
BaseModel


@pytest.mark.benchmark
def test_import_field(benchmark) -> None:
    @benchmark
    def run():
        from pydantic import Field

        assert Field


pydantic-2.10.6/tests/benchmarks/test_isinstance.py
from pydantic import BaseModel


class ModelV2(BaseModel):
    my_str: str


mv2 = ModelV2(my_str='hello')


def test_isinstance_basemodel(benchmark) -> None:
    @benchmark
    def run():
        for _ in range(10000):
            assert isinstance(mv2, BaseModel)


pydantic-2.10.6/tests/benchmarks/test_model_schema_generation.py
from typing import (
    Any,
    Dict,
    Generic,
    List,
    Literal,
    Optional,
    Type,
    TypeVar,
    Union,
    get_origin,
)

import pytest
from typing_extensions import Annotated, Self

from pydantic import (
    AfterValidator,
    BaseModel,
    BeforeValidator,
    Discriminator,
    Field,
    PlainSerializer,
    PlainValidator,
    Tag,
    WrapSerializer,
    WrapValidator,
    create_model,
    model_serializer,
    model_validator,
)
from pydantic.dataclasses import dataclass, rebuild_dataclass

from .shared import DeferredModel, PydanticTypes, StdLibTypes, rebuild_model


@pytest.mark.benchmark(group='model_schema_generation')
def test_simple_model_schema_generation(benchmark) -> None:
    class SimpleModel(DeferredModel):
        field1: str
        field2: int
        field3: float

    benchmark(rebuild_model, SimpleModel)


@pytest.mark.benchmark(group='model_schema_generation')
def test_simple_model_schema_lots_of_fields_generation(benchmark) -> None:
    IntStr = Union[int, str]
    Model = create_model(
        'Model',
        __config__={'defer_build': True},
        **{f'f{i}': (IntStr, ...)
for i in range(100)}, ) benchmark(rebuild_model, Model) @pytest.mark.benchmark(group='model_schema_generation') def test_nested_model_schema_generation(benchmark) -> None: class NestedModel(BaseModel): field1: str field2: List[int] field3: Dict[str, float] class OuterModel(DeferredModel): nested: NestedModel optional_nested: Optional[NestedModel] benchmark(rebuild_model, OuterModel) @pytest.mark.benchmark(group='model_schema_generation') def test_complex_model_schema_generation(benchmark) -> None: class ComplexModel(DeferredModel): field1: Union[str, int, float] field2: List[Dict[str, Union[int, float]]] field3: Optional[List[Union[str, int]]] benchmark(rebuild_model, ComplexModel) @pytest.mark.benchmark(group='model_schema_generation') def test_recursive_model_schema_generation(benchmark) -> None: class RecursiveModel(DeferredModel): name: str children: Optional[List['RecursiveModel']] = None benchmark(rebuild_model, RecursiveModel) @pytest.mark.benchmark(group='model_schema_generation') def test_construct_dataclass_schema(benchmark): @dataclass(frozen=True, kw_only=True) class Cat: type: Literal['cat'] = 'cat' @dataclass(frozen=True, kw_only=True) class Dog: type: Literal['dog'] = 'dog' @dataclass(frozen=True, kw_only=True) class NestedDataClass: animal: Annotated[Union[Cat, Dog], Discriminator('type')] class NestedModel(BaseModel): animal: Annotated[Union[Cat, Dog], Discriminator('type')] @dataclass(frozen=True, kw_only=True, config={'defer_build': True}) class Root: data_class: NestedDataClass model: NestedModel benchmark(lambda: rebuild_dataclass(Root, force=True, _types_namespace={})) @pytest.mark.benchmark(group='model_schema_generation') def test_lots_of_models_with_lots_of_fields(benchmark): T = TypeVar('T') class GenericModel(BaseModel, Generic[T]): value: T class RecursiveModel(BaseModel): name: str children: Optional[List['RecursiveModel']] = None class Address(BaseModel): street: Annotated[str, Field(max_length=100)] city: Annotated[str, 
Field(min_length=2)] zipcode: Annotated[str, Field(pattern=r'^\d{5}$')] class Person(BaseModel): name: Annotated[str, Field(min_length=1)] age: Annotated[int, Field(ge=0, le=120)] address: Address class Company(BaseModel): name: Annotated[str, Field(min_length=1)] employees: Annotated[List[Person], Field(min_length=1)] class Product(BaseModel): id: Annotated[int, Field(ge=1)] name: Annotated[str, Field(min_length=1)] price: Annotated[float, Field(ge=0)] metadata: Dict[str, str] # Repeat the pattern for other models up to Model_99 models: list[type[BaseModel]] = [] for i in range(100): model_fields = {} field_types = [ Annotated[int, Field(ge=0, le=1000)], Annotated[str, Field(max_length=50)], Annotated[List[int], Field(min_length=1, max_length=10)], int, str, List[int], Dict[str, Union[str, int]], GenericModel[int], RecursiveModel, Address, Person, Company, Product, Union[ int, str, List[str], Dict[str, int], GenericModel[str], RecursiveModel, Address, Person, Company, Product, ], ] for j in range(100): field_type = field_types[j % len(field_types)] if get_origin(field_type) is Annotated: model_fields[f'field_{j}'] = field_type else: model_fields[f'field_{j}'] = (field_type, ...) 
model_name = f'Model_{i}' models.append(create_model(model_name, __config__={'defer_build': True}, **model_fields)) def rebuild_models(models: List[Type[BaseModel]]) -> None: for model in models: rebuild_model(model) benchmark(rebuild_models, models) @pytest.mark.benchmark(group='model_schema_generation') def test_field_validators_serializers(benchmark) -> None: class ModelWithFieldValidatorsSerializers(DeferredModel): field1: Annotated[Any, BeforeValidator(lambda v: v)] field2: Annotated[Any, AfterValidator(lambda v: v)] field3: Annotated[Any, PlainValidator(lambda v: v)] field4: Annotated[Any, WrapValidator(lambda v, h: h(v))] field5: Annotated[Any, PlainSerializer(lambda x: x, return_type=Any)] field6: Annotated[Any, WrapSerializer(lambda x, nxt: nxt(x), when_used='json')] benchmark(rebuild_model, ModelWithFieldValidatorsSerializers) @pytest.mark.benchmark(group='model_schema_generation') def test_model_validators_serializers(benchmark): class ModelWithValidator(DeferredModel): field: Any @model_validator(mode='before') @classmethod def validate_model_before(cls, data: Any) -> Any: return data @model_validator(mode='after') def validate_model_after(self) -> Self: return self @model_serializer def serialize_model(self) -> Any: return self.field benchmark(rebuild_model, ModelWithValidator) @pytest.mark.benchmark(group='model_schema_generation') def test_tagged_union_with_str_discriminator_schema_generation(benchmark): class Cat(BaseModel): pet_type: Literal['cat'] meows: int class Dog(BaseModel): pet_type: Literal['dog'] barks: float class Lizard(BaseModel): pet_type: Literal['reptile', 'lizard'] scales: bool class Model(DeferredModel): pet: Union[Cat, Dog, Lizard] = Field(discriminator='pet_type') n: int benchmark(rebuild_model, Model) @pytest.mark.benchmark(group='model_schema_generation') def test_tagged_union_with_callable_discriminator_schema_generation(benchmark): class Pie(BaseModel): time_to_cook: int num_ingredients: int class ApplePie(Pie): fruit: 
Literal['apple'] = 'apple' class PumpkinPie(Pie): filling: Literal['pumpkin'] = 'pumpkin' def get_discriminator_value(v: Any) -> str: if isinstance(v, dict): return v.get('fruit', v.get('filling')) return getattr(v, 'fruit', getattr(v, 'filling', None)) class ThanksgivingDinner(DeferredModel): dessert: Annotated[ Union[ Annotated[ApplePie, Tag('apple')], Annotated[PumpkinPie, Tag('pumpkin')], ], Discriminator(get_discriminator_value), ] benchmark(rebuild_model, ThanksgivingDinner) @pytest.mark.parametrize('field_type', StdLibTypes) @pytest.mark.benchmark(group='stdlib_schema_generation') @pytest.mark.skip('Clutters codspeed CI, but should be enabled on branches where we modify schema building.') def test_stdlib_type_schema_generation(benchmark, field_type): class StdlibTypeModel(DeferredModel): field: field_type benchmark(rebuild_model, StdlibTypeModel) @pytest.mark.parametrize('field_type', PydanticTypes) @pytest.mark.benchmark(group='pydantic_custom_types_schema_generation') @pytest.mark.skip('Clutters codspeed CI, but should be enabled on branches where we modify schema building.') def test_pydantic_custom_types_schema_generation(benchmark, field_type): class PydanticTypeModel(DeferredModel): field: field_type benchmark(rebuild_model, PydanticTypeModel) pydantic-2.10.6/tests/benchmarks/test_model_schema_generation_recursive.py000066400000000000000000000045231474456633400272510ustar00rootroot00000000000000from typing import Dict, Generic, Literal, Optional, TypeVar, Union import pytest from pydantic import Field from .shared import DeferredModel, rebuild_model @pytest.mark.benchmark(group='model_schema_generation_recursive') def test_simple_recursive_model_schema_generation(benchmark): class Foo(DeferredModel): a: int = 123 sibling: 'Foo' = None benchmark(rebuild_model, Foo) @pytest.mark.benchmark(group='model_schema_generation_recursive') def test_generic_recursive_model_schema_generation(benchmark): T = TypeVar('T') class GenericFoo(DeferredModel, Generic[T]): 
value: T sibling: Optional['GenericFoo[T]'] = None benchmark(rebuild_model, GenericFoo[int]) @pytest.mark.benchmark(group='model_schema_generation_recursive') def test_nested_recursive_model_schema_generation(benchmark): class Node(DeferredModel): value: int left: Optional['Node'] = None right: Optional['Node'] = None class Tree(DeferredModel): root: Node metadata: Dict[str, 'Tree'] = Field(default_factory=dict) benchmark(rebuild_model, Tree) @pytest.mark.benchmark(group='model_schema_generation_recursive') def test_nested_recursive_generic_model_schema_generation(benchmark): T = TypeVar('T') class GenericNode(DeferredModel, Generic[T]): value: T left: Optional['GenericNode[T]'] = None right: Optional['GenericNode[T]'] = None class GenericTree(DeferredModel, Generic[T]): root: GenericNode[T] metadata: Dict[str, 'GenericTree[T]'] = Field(default_factory=dict) benchmark(rebuild_model, GenericTree[int]) @pytest.mark.benchmark(group='model_schema_generation_recursive') def test_recursive_discriminated_union_with_base_model(benchmark) -> None: class Foo(DeferredModel): type: Literal['foo'] x: 'Foobar' class Bar(DeferredModel): type: Literal['bar'] class Foobar(DeferredModel): value: Union[Foo, Bar] = Field(discriminator='type') benchmark(rebuild_model, Foobar) @pytest.mark.benchmark(group='model_schema_generation_recursive') def test_deeply_nested_recursive_model_schema_generation(benchmark): class A(DeferredModel): b: 'B' class B(DeferredModel): c: 'C' class C(DeferredModel): a: Optional['A'] benchmark(rebuild_model, C) pydantic-2.10.6/tests/benchmarks/test_model_serialization.py000066400000000000000000000026531474456633400243660ustar00rootroot00000000000000from typing import List import pytest from pydantic import BaseModel from .shared import ComplexModel, NestedModel, OuterModel, SimpleModel @pytest.mark.benchmark(group='model_serialization') def test_simple_model_serialization(benchmark): model = SimpleModel(field1='test', field2=42, field3=3.14) 
benchmark(model.model_dump)


@pytest.mark.benchmark(group='model_serialization')
def test_nested_model_serialization(benchmark):
    model = OuterModel(
        nested=NestedModel(field1='test', field2=[1, 2, 3], field3={'a': 1.1, 'b': 2.2}), optional_nested=None
    )
    benchmark(model.model_dump)


@pytest.mark.benchmark(group='model_serialization')
def test_complex_model_serialization(benchmark):
    model = ComplexModel(field1='test', field2=[{'a': 1, 'b': 2.2}, {'c': 3, 'd': 4.4}], field3=['test', 1, 2, 'test2'])
    benchmark(model.model_dump)


@pytest.mark.benchmark(group='model_serialization')
def test_list_of_models_serialization(benchmark):
    class SimpleListModel(BaseModel):
        items: List[SimpleModel]

    model = SimpleListModel(items=[SimpleModel(field1=f'test{i}', field2=i, field3=float(i)) for i in range(10)])
    benchmark(model.model_dump)


@pytest.mark.benchmark(group='model_serialization')
def test_model_json_serialization(benchmark):
    model = ComplexModel(field1='test', field2=[{'a': 1, 'b': 2.2}, {'c': 3, 'd': 4.4}], field3=['test', 1, 2, 'test2'])
    benchmark(model.model_dump_json)


pydantic-2.10.6/tests/benchmarks/test_model_validation.py
from typing import List

import pytest

from pydantic import BaseModel

from .shared import ComplexModel, OuterModel, SimpleModel

pytestmark = [
    pytest.mark.benchmark(group='model_validation'),
    pytest.mark.parametrize('method', ['model_validate', '__init__']),
]


def test_simple_model_validation(method: str, benchmark):
    data = {'field1': 'test', 'field2': 42, 'field3': 3.14}
    if method == '__init__':
        benchmark(lambda data: SimpleModel(**data), data)
    else:
        benchmark(SimpleModel.model_validate, data)


def test_nested_model_validation(method: str, benchmark):
    data = {'nested': {'field1': 'test', 'field2': [1, 2, 3], 'field3': {'a': 1.1, 'b': 2.2}}, 'optional_nested': None}
    if method == '__init__':
        benchmark(lambda data: OuterModel(**data), data)
    else:
benchmark(OuterModel.model_validate, data) def test_complex_model_validation(method: str, benchmark): data = {'field1': 'test', 'field2': [{'a': 1, 'b': 2.2}, {'c': 3, 'd': 4.4}], 'field3': ['test', 1, 2, 'test2']} if method == '__init__': benchmark(lambda data: ComplexModel(**data), data) else: benchmark(ComplexModel.model_validate, data) def test_list_of_models_validation(method: str, benchmark): class SimpleListModel(BaseModel): items: List[SimpleModel] data = {'items': [{'field1': f'test{i}', 'field2': i, 'field3': float(i)} for i in range(10)]} if method == '__init__': benchmark(lambda data: SimpleListModel(**data), data) else: benchmark(SimpleListModel.model_validate, data) pydantic-2.10.6/tests/benchmarks/test_north_star.py000066400000000000000000000077221474456633400225160ustar00rootroot00000000000000""" An integration-style benchmark of a model with a class of what should (hopefully) be some of the most common field types used in pydantic validation. Used to gauge overall pydantic performance. 
""" import json from datetime import date, datetime, time from decimal import Decimal from pathlib import Path from typing import List, Union from uuid import UUID import pytest from typing_extensions import Annotated, Literal @pytest.fixture(scope='module') def pydantic_type_adapter(): from pydantic import BaseModel, Field, TypeAdapter from pydantic.networks import AnyHttpUrl class Blog(BaseModel): type: Literal['blog'] title: str post_count: int readers: int avg_post_rating: float url: AnyHttpUrl class SocialProfileBase(BaseModel): type: Literal['profile'] network: Literal['facebook', 'twitter', 'linkedin'] username: str join_date: date class FacebookProfile(SocialProfileBase): network: Literal['facebook'] friends: int class TwitterProfile(SocialProfileBase): network: Literal['twitter'] followers: int class LinkedinProfile(SocialProfileBase): network: Literal['linkedin'] connections: Annotated[int, Field(le=500)] SocialProfile = Annotated[Union[FacebookProfile, TwitterProfile, LinkedinProfile], Field(discriminator='network')] Website = Annotated[Union[Blog, SocialProfile], Field(discriminator='type')] class Person(BaseModel): id: UUID name: str height: Decimal entry_created_date: date entry_created_time: time entry_updated_at: datetime websites: List[Website] = Field(default_factory=list) return TypeAdapter(List[Person]) _NORTH_STAR_DATA_PATH = Path(__file__).parent / 'north_star_data.json' @pytest.fixture(scope='module') def north_star_data_bytes(): return _north_star_data_bytes() def _north_star_data_bytes() -> bytes: from .generate_north_star_data import person_data needs_generating = not _NORTH_STAR_DATA_PATH.exists() if needs_generating: data = json.dumps(person_data(length=1000)).encode() _NORTH_STAR_DATA_PATH.write_bytes(data) else: data = _NORTH_STAR_DATA_PATH.read_bytes() return data def test_north_star_validate_json(pydantic_type_adapter, north_star_data_bytes, benchmark): benchmark(pydantic_type_adapter.validate_json, north_star_data_bytes) def 
test_north_star_validate_json_strict(pydantic_type_adapter, north_star_data_bytes, benchmark): coerced_north_star_data = pydantic_type_adapter.dump_json( pydantic_type_adapter.validate_json(north_star_data_bytes) ) benchmark(pydantic_type_adapter.validate_json, coerced_north_star_data, strict=True) def test_north_star_dump_json(pydantic_type_adapter, north_star_data_bytes, benchmark): parsed = pydantic_type_adapter.validate_json(north_star_data_bytes) benchmark(pydantic_type_adapter.dump_json, parsed) def test_north_star_validate_python(pydantic_type_adapter, north_star_data_bytes, benchmark): benchmark(pydantic_type_adapter.validate_python, json.loads(north_star_data_bytes)) def test_north_star_validate_python_strict(pydantic_type_adapter, north_star_data_bytes, benchmark): coerced_north_star_data = pydantic_type_adapter.dump_python( pydantic_type_adapter.validate_json(north_star_data_bytes) ) benchmark(pydantic_type_adapter.validate_python, coerced_north_star_data, strict=True) def test_north_star_dump_python(pydantic_type_adapter, north_star_data_bytes, benchmark): parsed = pydantic_type_adapter.validate_python(json.loads(north_star_data_bytes)) benchmark(pydantic_type_adapter.dump_python, parsed) def test_north_star_json_loads(north_star_data_bytes, benchmark): benchmark(json.loads, north_star_data_bytes) def test_north_star_json_dumps(north_star_data_bytes, benchmark): parsed = json.loads(north_star_data_bytes) benchmark(json.dumps, parsed) pydantic-2.10.6/tests/check_usage_docs.py000066400000000000000000000021521474456633400204200ustar00rootroot00000000000000""" Check that all `Usage docs` tags in docstrings link to the latest version of pydantic. 
""" import re import sys from pathlib import Path ROOT_DIR = Path(__file__).parent.parent PYDANTIC_DIR = ROOT_DIR / 'pydantic' version_file = PYDANTIC_DIR / 'version.py' version = re.search(rb"VERSION = '(.*)'", version_file.read_bytes()).group(1) version_major_minor = b'.'.join(version.split(b'.')[:2]) expected_base = b'https://docs.pydantic.dev/' + version_major_minor + b'/' paths = sys.argv[1:] error_count = 0 for path_str in paths: path = ROOT_DIR / path_str b = path.read_bytes() changed = 0 def sub(m: re.Match) -> bytes: global changed if m.group(2) != expected_base: changed += 1 return m.group(1) + expected_base else: return m.group(0) b = re.sub(rb'(""" *usage.docs: *)(https://.+?/.+?/)', sub, b, flags=re.I) if changed: error_count += changed path.write_bytes(b) plural = 's' if changed > 1 else '' print(f'{path_str:50} {changed} usage docs link{plural} updated') if error_count: sys.exit(1) pydantic-2.10.6/tests/conftest.py000066400000000000000000000142361474456633400170020ustar00rootroot00000000000000from __future__ import annotations import importlib.util import inspect import os import re import secrets import subprocess import sys import textwrap from dataclasses import dataclass from pathlib import Path from types import FunctionType, ModuleType from typing import Any, Callable import pytest from _pytest.assertion.rewrite import AssertionRewritingHook from jsonschema import Draft202012Validator, SchemaError from pydantic._internal._generate_schema import GenerateSchema from pydantic.json_schema import GenerateJsonSchema def pytest_addoption(parser: pytest.Parser): parser.addoption('--test-mypy', action='store_true', help='run mypy tests') parser.addoption('--update-mypy', action='store_true', help='update mypy tests') def _extract_source_code_from_function(function: FunctionType): if function.__code__.co_argcount: raise RuntimeError(f'function {function.__qualname__} cannot have any arguments') code_lines = '' body_started = False for line in 
textwrap.dedent(inspect.getsource(function)).split('\n'): if line.startswith('def '): body_started = True continue elif body_started: code_lines += f'{line}\n' return textwrap.dedent(code_lines) def _create_module_file(code: str, tmp_path: Path, name: str) -> tuple[str, str]: # Max path length in Windows is 260. Leaving some buffer here max_name_len = 240 - len(str(tmp_path)) # Windows does not allow these characters in paths. Linux bans slashes only. sanitized_name = re.sub('[' + re.escape('<>:"/\\|?*') + ']', '-', name)[:max_name_len] name = f'{sanitized_name}_{secrets.token_hex(5)}' path = tmp_path / f'{name}.py' path.write_text(code) return name, str(path) @pytest.fixture(scope='session', autouse=True) def disable_error_urls(): # Don't add URLs during docs tests when printing # Otherwise we'll get version numbers in the URLs that will update frequently os.environ['PYDANTIC_ERRORS_INCLUDE_URL'] = 'false' @pytest.fixture def create_module( tmp_path: Path, request: pytest.FixtureRequest ) -> Callable[[FunctionType | str, bool, str | None], ModuleType]: def run( source_code_or_function: FunctionType | str, rewrite_assertions: bool = True, module_name_prefix: str | None = None, ) -> ModuleType: """ Create module object, execute it and return Can be used as a decorator of the function from the source code of which the module will be constructed :param source_code_or_function string or function with body as a source code for created module :param rewrite_assertions: whether to rewrite assertions in module or not :param module_name_prefix: string prefix to use in the name of the module, does not affect the name of the file. 
""" if isinstance(source_code_or_function, FunctionType): source_code = _extract_source_code_from_function(source_code_or_function) else: source_code = source_code_or_function module_name, filename = _create_module_file(source_code, tmp_path, request.node.name) if module_name_prefix: module_name = module_name_prefix + module_name if rewrite_assertions: loader = AssertionRewritingHook(config=request.config) loader.mark_rewrite(module_name) else: loader = None spec = importlib.util.spec_from_file_location(module_name, filename, loader=loader) sys.modules[module_name] = module = importlib.util.module_from_spec(spec) # pyright: ignore[reportArgumentType] spec.loader.exec_module(module) # pyright: ignore[reportOptionalMemberAccess] return module return run @pytest.fixture def subprocess_run_code(tmp_path: Path): def run_code(source_code_or_function) -> str: if isinstance(source_code_or_function, FunctionType): source_code = _extract_source_code_from_function(source_code_or_function) else: source_code = source_code_or_function py_file = tmp_path / 'test.py' py_file.write_text(source_code) return subprocess.check_output([sys.executable, str(py_file)], cwd=tmp_path, encoding='utf8') return run_code @dataclass class Err: message: str errors: Any | None = None def __repr__(self): if self.errors: return f'Err({self.message!r}, errors={self.errors!r})' else: return f'Err({self.message!r})' def message_escaped(self): return re.escape(self.message) @dataclass class CallCounter: count: int = 0 def reset(self) -> None: self.count = 0 @pytest.fixture def generate_schema_calls(monkeypatch: pytest.MonkeyPatch) -> CallCounter: orig_generate_schema = GenerateSchema.generate_schema counter = CallCounter() depth = 0 # generate_schema can be called recursively def generate_schema_call_counter(*args: Any, **kwargs: Any) -> Any: nonlocal depth counter.count += 1 if depth == 0 else 0 depth += 1 try: return orig_generate_schema(*args, **kwargs) finally: depth -= 1 
monkeypatch.setattr(GenerateSchema, 'generate_schema', generate_schema_call_counter) return counter @pytest.fixture(scope='function', autouse=True) def validate_json_schemas(monkeypatch: pytest.MonkeyPatch, request: pytest.FixtureRequest) -> None: orig_generate = GenerateJsonSchema.generate def generate(*args: Any, **kwargs: Any) -> Any: json_schema = orig_generate(*args, **kwargs) if not request.node.get_closest_marker('skip_json_schema_validation'): try: Draft202012Validator.check_schema(json_schema) except SchemaError: pytest.fail( 'Failed to validate the JSON Schema against the Draft 2020-12 spec. ' 'If this is expected, you can mark the test function with the `skip_json_schema_validation` ' 'marker. Note that this validation only takes place during tests, and is not active at runtime.' ) return json_schema monkeypatch.setattr(GenerateJsonSchema, 'generate', generate) pydantic-2.10.6/tests/mypy/000077500000000000000000000000001474456633400155735ustar00rootroot00000000000000pydantic-2.10.6/tests/mypy/README.md000066400000000000000000000037401474456633400170560ustar00rootroot00000000000000# Mypy plugin type checking suite > [!WARNING] > The test suite is subject to changes. It is currently not user friendly as the output and configuration > files are separated from the source modules, making it hard to navigate. In the future, we may switch > to using the [`pytest-mypy-plugins`][https://github.com/TypedDjango/pytest-mypy-plugins] library, which > provides more flexibility when it comes to merging different mypy configurations. The `test_mypy_results` test defined in [`test_mypy.py`](./test_mypy.py) runs Mypy on the files defined in [`modules/`](./modules/), using the configuration files from [`configs/`](./configs/). The Mypy output is merged with the source file and saved in the [`outputs/`](./outputs/) folder. 
For instance, with the following file:

```python
from pydantic import BaseModel


class Model(BaseModel):
    a: int


model = Model(a=1, b=2)
```

The output will look like:

```python
from pydantic import BaseModel


class Model(BaseModel):
    a: int


model = Model(a=1, b=2)
# MYPY: error: Unexpected keyword argument "b" for "Model"  [call-arg]
```

## Adding a new test

1. Define a new file in the [`modules/`](./modules/) folder:

   ```python
   # modules/new_test.py
   from pydantic import BaseModel


   class Model(BaseModel):
       a: int


   model = Model(a=1, b=2)
   ```

2. Add the new file to the `cases` list defined in [`test_mypy.py`](./test_mypy.py), together with a configuration file:

   ```python
   cases: list[ParameterSet | tuple[str, str]] = [
       ...,
       # One-off cases
       *[
           ('mypy-plugin.ini', 'custom_constructor.py'),
           ('mypy-plugin.ini', 'config_conditional_extra.py'),
           ...,
           ('mypy-plugin.ini', 'new_test.py'),  # <-- new test added.
       ]
   ```

3. Run `make test-mypy-update-all`. It should create a new output file for your new test.

4. Make sure the output contains the expected Mypy error message/code.

> [!NOTE]
> You can also edit existing module files. In that case, only steps 3 and 4 are relevant.
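The merge step described in the README — attaching Mypy's report lines to the matching source lines as `# MYPY:` comments — can be sketched as pure string processing. This is a hypothetical illustration: `merge_mypy_output` and the exact report format it parses are assumptions for the sketch, not the actual implementation in `test_mypy.py`.

```python
import re


def merge_mypy_output(source: str, mypy_stdout: str) -> str:
    """Append Mypy diagnostics to the source lines they refer to (illustrative sketch)."""
    # Mypy report lines look like: "path/to/file.py:LINE: severity: message  [code]".
    diagnostics: dict[int, list[str]] = {}
    for report_line in mypy_stdout.splitlines():
        match = re.match(r'^[^:]+:(\d+): (.+)$', report_line)
        if match:
            diagnostics.setdefault(int(match.group(1)), []).append(match.group(2))

    # Emit each source line, followed by any diagnostics reported for it.
    merged: list[str] = []
    for lineno, src_line in enumerate(source.splitlines(), start=1):
        merged.append(src_line)
        for message in diagnostics.get(lineno, []):
            merged.append(f'# MYPY: {message}')
    return '\n'.join(merged)


source = 'model = Model(a=1, b=2)\n'
stdout = 'module.py:1: error: Unexpected keyword argument "b" for "Model"  [call-arg]\n'
print(merge_mypy_output(source, stdout))
```

Keying diagnostics by line number (rather than text matching) means the merge stays correct even when the same expression appears on several lines of the module.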
pydantic-2.10.6/tests/mypy/__init__.py000066400000000000000000000000001474456633400176720ustar00rootroot00000000000000pydantic-2.10.6/tests/mypy/configs/000077500000000000000000000000001474456633400172235ustar00rootroot00000000000000pydantic-2.10.6/tests/mypy/configs/mypy-default.ini000066400000000000000000000007271474456633400223520ustar00rootroot00000000000000[mypy] follow_imports = silent strict_optional = True warn_redundant_casts = True warn_unused_ignores = True disallow_any_generics = True check_untyped_defs = True no_implicit_reexport = True python_version = 3.10 # for strict mypy: (this is the tricky one :-)) disallow_untyped_defs = True # TODO 3.9 drop the following line: force_uppercase_builtins = True # TODO 3.10 drop the following line: force_union_syntax = True [mypy-pydantic_core.*] follow_imports = skip pydantic-2.10.6/tests/mypy/configs/mypy-plugin-strict-no-any.ini000066400000000000000000000015261474456633400247270ustar00rootroot00000000000000[mypy] plugins = pydantic.mypy warn_unreachable = true follow_imports = silent strict_optional = True warn_redundant_casts = True warn_unused_ignores = True disallow_any_generics = True check_untyped_defs = True no_implicit_reexport = True disallow_untyped_defs = True disallow_any_decorated = True disallow_any_expr = True disallow_any_explicit = True # The following should be set to True, but results in a Mypy crash # (https://github.com/python/mypy/issues/17954) disallow_any_unimported = False disallow_subclassing_any = True warn_return_any = True python_version = 3.10 # TODO 3.9 drop the following line: force_uppercase_builtins = True # TODO 3.10 drop the following line: force_union_syntax = True [pydantic-mypy] init_forbid_extra = True init_typed = True warn_required_dynamic_aliases = True [mypy-pydantic_core.*] follow_imports = skip pydantic-2.10.6/tests/mypy/configs/mypy-plugin-strict.ini000066400000000000000000000010401474456633400235170ustar00rootroot00000000000000[mypy] plugins = pydantic.mypy 
follow_imports = silent strict_optional = True warn_redundant_casts = True warn_unused_ignores = True disallow_any_generics = True check_untyped_defs = True no_implicit_reexport = True disallow_untyped_defs = True python_version = 3.10 # TODO 3.9 drop the following line: force_uppercase_builtins = True # TODO 3.10 drop the following line: force_union_syntax = True [pydantic-mypy] init_forbid_extra = True init_typed = True warn_required_dynamic_aliases = True [mypy-pydantic_core.*] follow_imports = skip pydantic-2.10.6/tests/mypy/configs/mypy-plugin-very-strict.ini000066400000000000000000000005341474456633400245110ustar00rootroot00000000000000[mypy] plugins = pydantic.mypy strict = True follow_imports = silent python_version = 3.10 # TODO 3.9 drop the following line: force_uppercase_builtins = True # TODO 3.10 drop the following line: force_union_syntax = True [pydantic-mypy] init_forbid_extra = True init_typed = True warn_required_dynamic_aliases = True warn_untyped_fields = True pydantic-2.10.6/tests/mypy/configs/mypy-plugin.ini000066400000000000000000000007601474456633400222210ustar00rootroot00000000000000[mypy] plugins = pydantic.mypy follow_imports = silent strict_optional = True warn_redundant_casts = True warn_unused_ignores = True disallow_any_generics = True check_untyped_defs = True no_implicit_reexport = True python_version = 3.10 # for strict mypy: (this is the tricky one :-)) disallow_untyped_defs = True # TODO 3.9 drop the following line: force_uppercase_builtins = True # TODO 3.10 drop the following line: force_union_syntax = True [mypy-pydantic_core.*] follow_imports = skip pydantic-2.10.6/tests/mypy/configs/pyproject-default.toml000066400000000000000000000013641474456633400235650ustar00rootroot00000000000000[build-system] requires = ["poetry>=0.12"] build_backend = "poetry.masonry.api" [tool.poetry] name = "test" version = "0.0.1" readme = "README.md" authors = [ "author@example.com" ] [tool.poetry.dependencies] python = "*" 
[tool.pytest.ini_options] addopts = "-v -p no:warnings" [tool.mypy] follow_imports = "silent" strict_optional = true warn_redundant_casts = true warn_unused_ignores = true disallow_any_generics = true check_untyped_defs = true no_implicit_reexport = true disallow_untyped_defs = true python_version = '3.10' # TODO 3.9 drop the following line: force_uppercase_builtins = true # TODO 3.10 drop the following line: force_union_syntax = true [[tool.mypy.overrides]] module = [ 'pydantic_core.*', ] follow_imports = "skip" pydantic-2.10.6/tests/mypy/configs/pyproject-plugin-bad-param.toml000066400000000000000000000015651474456633400252640ustar00rootroot00000000000000[build-system] requires = ["poetry>=0.12"] build_backend = "poetry.masonry.api" [tool.poetry] name = "test" version = "0.0.1" readme = "README.md" authors = [ "author@example.com" ] [tool.poetry.dependencies] python = "*" [tool.pytest.ini_options] addopts = "-v -p no:warnings" [tool.mypy] plugins = [ "pydantic.mypy" ] follow_imports = "silent" strict_optional = true warn_redundant_casts = true warn_unused_ignores = true disallow_any_generics = true check_untyped_defs = true no_implicit_reexport = true disallow_untyped_defs = true python_version = '3.10' # TODO 3.9 drop the following line: force_uppercase_builtins = true # TODO 3.10 drop the following line: force_union_syntax = true [tool.pydantic-mypy] init_forbid_extra = "foo" # this will raise a ValueError for the config [[tool.mypy.overrides]] module = [ 'pydantic_core.*', ] follow_imports = "skip" pydantic-2.10.6/tests/mypy/configs/pyproject-plugin-no-strict-optional.toml000066400000000000000000000014311474456633400271750ustar00rootroot00000000000000[build-system] requires = ["poetry>=0.12"] build_backend = "poetry.masonry.api" [tool.poetry] name = "test" version = "0.0.1" readme = "README.md" authors = [ "author@example.com" ] [tool.poetry.dependencies] python = "*" [tool.pytest.ini_options] addopts = "-v -p no:warnings" [tool.mypy] plugins = [ 
"pydantic.mypy" ] follow_imports = "silent" no_strict_optional = true warn_redundant_casts = true warn_unused_ignores = true disallow_any_generics = true check_untyped_defs = true no_implicit_reexport = true disallow_untyped_defs = true python_version = '3.10' # TODO 3.9 drop the following line: force_uppercase_builtins = true # TODO 3.10 drop the following line: force_union_syntax = true [[tool.mypy.overrides]] module = [ 'pydantic_core.*', ] follow_imports = "skip" pydantic-2.10.6/tests/mypy/configs/pyproject-plugin-strict-equality.toml000066400000000000000000000020651474456633400265770ustar00rootroot00000000000000[build-system] requires = ["poetry>=0.12"] build_backend = "poetry.masonry.api" [tool.poetry] name = "test" version = "0.0.1" readme = "README.md" authors = [ "author@example.com" ] [tool.poetry.dependencies] python = "*" [tool.pytest.ini_options] addopts = "-v -p no:warnings" [tool.mypy] plugins = "pydantic.mypy" ignore_missing_imports = true warn_return_any = true warn_unreachable = true warn_unused_configs = true follow_imports = "normal" show_column_numbers = true strict_optional = true warn_redundant_casts = true pretty = false strict = true warn_unused_ignores = true check_untyped_defs = true disallow_untyped_calls = true disallow_untyped_defs = true disallow_untyped_decorators = false strict_equality = true python_version = '3.10' # TODO 3.9 drop the following line: force_uppercase_builtins = true # TODO 3.10 drop the following line: force_union_syntax = true [tool.pydantic-mypy] init_forbid_extra = true init_typed = true warn_required_dynamic_aliases = true [[tool.mypy.overrides]] module = [ 'pydantic_core.*', ] follow_imports = "skip" pydantic-2.10.6/tests/mypy/configs/pyproject-plugin-strict.toml000066400000000000000000000015741474456633400247500ustar00rootroot00000000000000[build-system] requires = ["poetry>=0.12"] build_backend = "poetry.masonry.api" [tool.poetry] name = "test" version = "0.0.1" readme = "README.md" authors = [ 
"author@example.com" ] [tool.poetry.dependencies] python = "*" [tool.pytest.ini_options] addopts = "-v -p no:warnings" [tool.mypy] plugins = [ "pydantic.mypy" ] follow_imports = "silent" strict_optional = true warn_redundant_casts = true warn_unused_ignores = true disallow_any_generics = true check_untyped_defs = true no_implicit_reexport = true disallow_untyped_defs = true python_version = '3.10' # TODO 3.9 drop the following line: force_uppercase_builtins = true # TODO 3.10 drop the following line: force_union_syntax = true [tool.pydantic-mypy] init_forbid_extra = true init_typed = true warn_required_dynamic_aliases = true [[tool.mypy.overrides]] module = [ 'pydantic_core.*', ] follow_imports = "skip" pydantic-2.10.6/tests/mypy/configs/pyproject-plugin.toml000066400000000000000000000014261474456633400234360ustar00rootroot00000000000000[build-system] requires = ["poetry>=0.12"] build_backend = "poetry.masonry.api" [tool.poetry] name = "test" version = "0.0.1" readme = "README.md" authors = [ "author@example.com" ] [tool.poetry.dependencies] python = "*" [tool.pytest.ini_options] addopts = "-v -p no:warnings" [tool.mypy] plugins = [ "pydantic.mypy" ] follow_imports = "silent" strict_optional = true warn_redundant_casts = true warn_unused_ignores = true disallow_any_generics = true check_untyped_defs = true no_implicit_reexport = true disallow_untyped_defs = true python_version = '3.10' # TODO 3.9 drop the following line: force_uppercase_builtins = true # TODO 3.10 drop the following line: force_union_syntax = true [[tool.mypy.overrides]] module = [ 'pydantic_core.*', ] follow_imports = "skip" pydantic-2.10.6/tests/mypy/modules/000077500000000000000000000000001474456633400172435ustar00rootroot00000000000000pydantic-2.10.6/tests/mypy/modules/config_conditional_extra.py000066400000000000000000000005431474456633400246520ustar00rootroot00000000000000"""Test that the mypy plugin does not change the config type checking. 
This test can most likely be removed when we drop support for the old V1 `Config` class. """ from pydantic import BaseModel, ConfigDict def condition() -> bool: return True class MyModel(BaseModel): model_config = ConfigDict(extra='ignore' if condition() else 'forbid') pydantic-2.10.6/tests/mypy/modules/covariant_typevar.py000066400000000000000000000002601474456633400233530ustar00rootroot00000000000000from typing import Generic, TypeVar from pydantic import BaseModel T = TypeVar("T", covariant=True) class Foo(BaseModel, Generic[T]): value: T class Bar(Foo[T]): ... pydantic-2.10.6/tests/mypy/modules/custom_constructor.py000066400000000000000000000004031474456633400235710ustar00rootroot00000000000000from pydantic import BaseModel class Person(BaseModel): id: int name: str birth_year: int def __init__(self, id: int) -> None: super().__init__(id=id, name='Patrick', birth_year=1991) Person(1) Person(id=1) Person(name='Patrick') pydantic-2.10.6/tests/mypy/modules/dataclass_no_any.py000066400000000000000000000002241474456633400231150ustar00rootroot00000000000000from pydantic.dataclasses import dataclass @dataclass class Foo: foo: int @dataclass(config={'title': 'Bar Title'}) class Bar: bar: str pydantic-2.10.6/tests/mypy/modules/fail_defaults.py000066400000000000000000000007161474456633400224230ustar00rootroot00000000000000from pydantic import BaseModel, Field class Model(BaseModel): # Required undefined_default_no_args: int = Field() undefined_default: int = Field(description='my desc') positional_ellipsis_default: int = Field(...) named_ellipsis_default: int = Field(default=...) 
# Not required positional_default: int = Field(1) named_default: int = Field(default=2) named_default_factory: int = Field(default_factory=lambda: 3) Model() pydantic-2.10.6/tests/mypy/modules/from_orm_v1_noconflict.py000066400000000000000000000012151474456633400242600ustar00rootroot00000000000000""" Test from_orm check does not raise pydantic-orm error on v1.BaseModel subclass """ from dataclasses import dataclass from typing import Optional from pydantic import BaseModel, ConfigDict from pydantic.v1 import BaseModel as BaseModelV1 @dataclass class CustomObject: x: int y: Optional[int] obj = CustomObject(x=1, y=2) class CustomModel(BaseModel): model_config = ConfigDict( from_attributes=True, strict=True, ) x: int cm = CustomModel.from_orm(obj) class CustomModelV1(BaseModelV1): class Config: orm_mode = True strict = True x: int cmv1 = CustomModelV1.from_orm(obj) pydantic-2.10.6/tests/mypy/modules/frozen_field.py000066400000000000000000000002021474456633400222550ustar00rootroot00000000000000from pydantic import BaseModel, Field class Foo(BaseModel): a: int = Field(default=1, frozen=True) foo = Foo() foo.a = 2 pydantic-2.10.6/tests/mypy/modules/generics.py000066400000000000000000000016751474456633400214250ustar00rootroot00000000000000from typing import Any, Dict, Generic, Optional, TypeVar from typing_extensions import assert_type from pydantic import BaseModel Tbody = TypeVar('Tbody') class Response(BaseModel, Generic[Tbody]): url: str body: Tbody class JsonBody(BaseModel): raw: str data: Dict[str, Any] class HtmlBody(BaseModel): raw: str doctype: str class JsonResponse(Response[JsonBody]): pass class HtmlResponse(Response[HtmlBody]): def custom_method(self) -> None: doctype = self.body.doctype print(f'self: {doctype}') example = {'url': 'foo.com', 'body': {'raw': '....', 'doctype': 'html'}} resp = HtmlResponse.model_validate(example) resp.custom_method() assert_type(resp.body, HtmlBody) T = TypeVar('T', int, str) class HistoryField(BaseModel, Generic[T]): value: 
Optional[T] class DomainType(HistoryField[int]): pass thing = DomainType(value=None) assert_type(thing.value, Optional[int]) pydantic-2.10.6/tests/mypy/modules/metaclass_args.py000066400000000000000000000010751474456633400226100ustar00rootroot00000000000000from pydantic import BaseModel, Field class ConfigClassUsed(BaseModel): i: int = Field(2, alias='j') class Config: populate_by_name = True ConfigClassUsed(i=None) class MetaclassArgumentsNoDefault(BaseModel, populate_by_name=True): i: int = Field(alias='j') MetaclassArgumentsNoDefault(i=None) class MetaclassArgumentsWithDefault(BaseModel, populate_by_name=True): i: int = Field(2, alias='j') MetaclassArgumentsWithDefault(i=None) class NoArguments(BaseModel): i: int = Field(2, alias='j') NoArguments(i=1) NoArguments(j=None) pydantic-2.10.6/tests/mypy/modules/no_strict_optional.py000066400000000000000000000007201474456633400235250ustar00rootroot00000000000000from typing import Optional, Union from pydantic import BaseModel, ConfigDict class MongoSettings(BaseModel): MONGO_PASSWORD: Union[str, None] class CustomBaseModel(BaseModel): model_config = ConfigDict( validate_assignment=True, validate_default=True, extra='forbid', frozen=True, ) class HealthStatus(CustomBaseModel): status: str description: Optional[str] = None hs = HealthStatus(status='healthy') pydantic-2.10.6/tests/mypy/modules/plugin_fail.py000066400000000000000000000117461474456633400221170ustar00rootroot00000000000000from typing import Generic, List, Optional, Set, TypeVar, Union from pydantic import BaseModel, ConfigDict, Extra, Field, field_validator from pydantic.dataclasses import dataclass class Model(BaseModel): model_config = ConfigDict(alias_generator=None, frozen=True, extra=Extra.forbid) x: int y: str def method(self) -> None: pass model = Model(x=1, y='y', z='z') model = Model(x=1) model.y = 'a' Model.from_orm({}) class KwargsModel(BaseModel, alias_generator=None, frozen=True, extra=Extra.forbid): x: int y: str def method(self) -> None: pass 
kwargs_model = KwargsModel(x=1, y='y', z='z') kwargs_model = KwargsModel(x=1) kwargs_model.y = 'a' KwargsModel.from_orm({}) class ForbidExtraModel(BaseModel): model_config = ConfigDict(extra=Extra.forbid) ForbidExtraModel(x=1) class KwargsForbidExtraModel(BaseModel, extra='forbid'): pass KwargsForbidExtraModel(x=1) class BadExtraModel(BaseModel): model_config = ConfigDict(extra=1) # type: ignore[typeddict-item] class KwargsBadExtraModel(BaseModel, extra=1): pass class BadConfig1(BaseModel): model_config = ConfigDict(from_attributes={}) # type: ignore[typeddict-item] class KwargsBadConfig1(BaseModel, from_attributes={}): pass class BadConfig2(BaseModel): model_config = ConfigDict(from_attributes=list) # type: ignore[typeddict-item] class KwargsBadConfig2(BaseModel, from_attributes=list): pass class InheritingModel(Model): model_config = ConfigDict(frozen=False) class KwargsInheritingModel(KwargsModel, frozen=False): pass class DefaultTestingModel(BaseModel): # Required a: int b: int = ... c: int = Field(...) d: Union[int, str] e = ... 
# Not required f: Optional[int] g: int = 1 h: int = Field(1) i: int = Field(None) j = 1 DefaultTestingModel() class UndefinedAnnotationModel(BaseModel): undefined: Undefined # noqa F821 UndefinedAnnotationModel() Model.model_construct(x=1) Model.model_construct(_fields_set={'x'}, x=1, y='2') Model.model_construct(x='1', y='2') # Strict mode fails inheriting = InheritingModel(x='1', y='1') Model(x='1', y='2') class Blah(BaseModel): fields_set: Optional[Set[str]] = None # (comment to keep line numbers unchanged) T = TypeVar('T') class Response(BaseModel, Generic[T]): data: T error: Optional[str] response = Response[Model](data=model, error=None) response = Response[Model](data=1, error=None) class AliasModel(BaseModel): x: str = Field(..., alias='y') z: int AliasModel(y=1, z=2) x_alias = 'y' class DynamicAliasModel(BaseModel): x: str = Field(..., alias=x_alias) z: int DynamicAliasModel(y='y', z='1') class DynamicAliasModel2(BaseModel): x: str = Field(..., alias=x_alias) z: int model_config = ConfigDict(populate_by_name=True) DynamicAliasModel2(y='y', z=1) DynamicAliasModel2(x='y', z=1) class KwargsDynamicAliasModel(BaseModel, populate_by_name=True): x: str = Field(..., alias=x_alias) z: int KwargsDynamicAliasModel(y='y', z=1) KwargsDynamicAliasModel(x='y', z=1) class AliasGeneratorModel(BaseModel): x: int model_config = ConfigDict(alias_generator=lambda x: x + '_') AliasGeneratorModel(x=1) AliasGeneratorModel(x_=1) AliasGeneratorModel(z=1) class AliasGeneratorModel2(BaseModel): x: int = Field(..., alias='y') model_config = ConfigDict(alias_generator=lambda x: x + '_') # type: ignore[pydantic-alias] class UntypedFieldModel(BaseModel): x: int = 1 y = 2 z = 2 # type: ignore[pydantic-field] AliasGeneratorModel2(x=1) AliasGeneratorModel2(y=1, z=1) class KwargsAliasGeneratorModel(BaseModel, alias_generator=lambda x: x + '_'): x: int KwargsAliasGeneratorModel(x=1) KwargsAliasGeneratorModel(x_=1) KwargsAliasGeneratorModel(z=1) class KwargsAliasGeneratorModel2(BaseModel, 
alias_generator=lambda x: x + '_'): x: int = Field(..., alias='y') KwargsAliasGeneratorModel2(x=1) KwargsAliasGeneratorModel2(y=1, z=1) class CoverageTester(Missing): # noqa F821 def from_orm(self) -> None: pass CoverageTester().from_orm() @dataclass(config={}) class AddProject: name: str slug: Optional[str] description: Optional[str] p = AddProject(name='x', slug='y', description='z') # Same as Model, but with frozen = True class FrozenModel(BaseModel): x: int y: str model_config = ConfigDict(alias_generator=None, frozen=True, extra=Extra.forbid) frozenmodel = FrozenModel(x=1, y='b') frozenmodel.y = 'a' class InheritingModel2(FrozenModel): model_config = ConfigDict(frozen=False) inheriting2 = InheritingModel2(x=1, y='c') inheriting2.y = 'd' class ModelWithAnnotatedValidator(BaseModel): name: str @field_validator('name') def noop_validator_with_annotations(self, name: str) -> str: # This is a mistake: the first argument to a validator is the class itself, # like a classmethod. self.instance_method() return name def instance_method(self) -> None: ... pydantic-2.10.6/tests/mypy/modules/plugin_fail_baseConfig.py000066400000000000000000000121731474456633400242320ustar00rootroot00000000000000from typing import Any, Generic, List, Optional, Set, TypeVar, Union from pydantic import BaseModel, Extra, Field, field_validator from pydantic.dataclasses import dataclass class Model(BaseModel): x: int y: str def method(self) -> None: pass class Config: alias_generator = None frozen = True extra = Extra.forbid def config_method(self) -> None: ... 
model = Model(x=1, y='y', z='z') model = Model(x=1) model.y = 'a' Model.from_orm({}) class KwargsModel(BaseModel, alias_generator=None, frozen=True, extra=Extra.forbid): x: int y: str def method(self) -> None: pass kwargs_model = KwargsModel(x=1, y='y', z='z') kwargs_model = KwargsModel(x=1) kwargs_model.y = 'a' KwargsModel.from_orm({}) class ForbidExtraModel(BaseModel): class Config: extra = 'forbid' ForbidExtraModel(x=1) class KwargsForbidExtraModel(BaseModel, extra='forbid'): pass KwargsForbidExtraModel(x=1) class BadExtraModel(BaseModel): class Config: extra = 1 # type: ignore[pydantic-config] extra = 1 class KwargsBadExtraModel(BaseModel, extra=1): pass class BadConfig1(BaseModel): class Config: from_attributes: Any = {} # not sensible, but should still be handled gracefully class KwargsBadConfig1(BaseModel, from_attributes={}): pass class BadConfig2(BaseModel): class Config: from_attributes = list # not sensible, but should still be handled gracefully class KwargsBadConfig2(BaseModel, from_attributes=list): pass class InheritingModel(Model): class Config: frozen = False class KwargsInheritingModel(KwargsModel, frozen=False): pass class DefaultTestingModel(BaseModel): # Required a: int b: int = ... c: int = Field(...) d: Union[int, str] e = ... 
# Not required f: Optional[int] g: int = 1 h: int = Field(1) i: int = Field(None) j = 1 DefaultTestingModel() class UndefinedAnnotationModel(BaseModel): undefined: Undefined # noqa F821 UndefinedAnnotationModel() Model.model_construct(x=1) Model.model_construct(_fields_set={'x'}, x=1, y='2') Model.model_construct(x='1', y='2') # Strict mode fails inheriting = InheritingModel(x='1', y='1') Model(x='1', y='2') class Blah(BaseModel): fields_set: Optional[Set[str]] = None # (comment to keep line numbers unchanged) T = TypeVar('T') class Response(BaseModel, Generic[T]): data: T error: Optional[str] response = Response[Model](data=model, error=None) response = Response[Model](data=1, error=None) class AliasModel(BaseModel): x: str = Field(..., alias='y') z: int AliasModel(y=1, z=2) x_alias = 'y' class DynamicAliasModel(BaseModel): x: str = Field(..., alias=x_alias) z: int DynamicAliasModel(y='y', z='1') class DynamicAliasModel2(BaseModel): x: str = Field(..., alias=x_alias) z: int class Config: populate_by_name = True DynamicAliasModel2(y='y', z=1) DynamicAliasModel2(x='y', z=1) class KwargsDynamicAliasModel(BaseModel, populate_by_name=True): x: str = Field(..., alias=x_alias) z: int KwargsDynamicAliasModel(y='y', z=1) KwargsDynamicAliasModel(x='y', z=1) class AliasGeneratorModel(BaseModel): x: int class Config: alias_generator = lambda x: x + '_' # noqa E731 AliasGeneratorModel(x=1) AliasGeneratorModel(x_=1) AliasGeneratorModel(z=1) class AliasGeneratorModel2(BaseModel): x: int = Field(..., alias='y') class Config: # type: ignore[pydantic-alias] alias_generator = lambda x: x + '_' # noqa E731 class UntypedFieldModel(BaseModel): x: int = 1 y = 2 z = 2 # type: ignore[pydantic-field] AliasGeneratorModel2(x=1) AliasGeneratorModel2(y=1, z=1) class KwargsAliasGeneratorModel(BaseModel, alias_generator=lambda x: x + '_'): x: int KwargsAliasGeneratorModel(x=1) KwargsAliasGeneratorModel(x_=1) KwargsAliasGeneratorModel(z=1) class KwargsAliasGeneratorModel2(BaseModel, 
alias_generator=lambda x: x + '_'): x: int = Field(..., alias='y') KwargsAliasGeneratorModel2(x=1) KwargsAliasGeneratorModel2(y=1, z=1) class CoverageTester(Missing): # noqa F821 def from_orm(self) -> None: pass CoverageTester().from_orm() @dataclass(config={}) class AddProject: name: str slug: Optional[str] description: Optional[str] p = AddProject(name='x', slug='y', description='z') # Same as Model, but with frozen = True class FrozenModel(BaseModel): x: int y: str class Config: alias_generator = None frozen = True extra = Extra.forbid frozenmodel = FrozenModel(x=1, y='b') frozenmodel.y = 'a' class InheritingModel2(FrozenModel): class Config: frozen = False inheriting2 = InheritingModel2(x=1, y='c') inheriting2.y = 'd' class ModelWithAnnotatedValidator(BaseModel): name: str @field_validator('name') def noop_validator_with_annotations(self, name: str) -> str: # This is a mistake: the first argument to a validator is the class itself, # like a classmethod. self.instance_method() return name def instance_method(self) -> None: ... 
pydantic-2.10.6/tests/mypy/modules/plugin_optional_inheritance.py000066400000000000000000000004411474456633400253700ustar00rootroot00000000000000from typing import Optional from pydantic import BaseModel class Foo(BaseModel): id: Optional[int] class Bar(BaseModel): foo: Optional[Foo] class Baz(Bar): name: str b = Bar(foo={'id': 1}) assert b.foo.id == 1 z = Baz(foo={'id': 1}, name='test') assert z.foo.id == 1 pydantic-2.10.6/tests/mypy/modules/plugin_strict_fields.py000066400000000000000000000016341474456633400240350ustar00rootroot00000000000000from pydantic import BaseModel, Field class Model(BaseModel): a: int b: int = Field(strict=True) c: int = Field(strict=False) # expected error: b Model(a='1', b='2', c='3') class ModelStrictMode(BaseModel): model_config = {'strict': True} a: int b: int = Field(strict=True) c: int = Field(strict=False) # expected error: a, b ModelStrictMode(a='1', b='2', c='3') class ModelOverride1(Model): b: int = Field(strict=False) c: int = Field(strict=True) # expected error: c ModelOverride1(a='1', b='2', c='3') class ModelOverride2(ModelStrictMode): b: int = Field(strict=False) c: int = Field(strict=True) # expected error: a, c ModelOverride2(a='1', b='2', c='3') class ModelOverrideStrictMode(ModelStrictMode): model_config = {'strict': False} # expected error: b ModelOverrideStrictMode(a='1', b='2', c='3') pydantic-2.10.6/tests/mypy/modules/plugin_success.py000066400000000000000000000134361474456633400226520ustar00rootroot00000000000000from dataclasses import InitVar from typing import Any, ClassVar, Generic, List, Optional, TypeVar, Union from typing_extensions import Self from pydantic import BaseModel, ConfigDict, Field, RootModel, create_model, field_validator, model_validator, validator from pydantic.dataclasses import dataclass class Model(BaseModel): x: float y: str model_config = ConfigDict(from_attributes=True) class SelfReferencingModel(BaseModel): submodel: Optional['SelfReferencingModel'] @property def prop(self) -> None: ... 
SelfReferencingModel.model_rebuild() model = Model(x=1, y='y') Model(x=1, y='y', z='z') model.x = 2 model.model_validate(model) self_referencing_model = SelfReferencingModel(submodel=SelfReferencingModel(submodel=None)) class KwargsModel(BaseModel, from_attributes=True): x: float y: str kwargs_model = KwargsModel(x=1, y='y') KwargsModel(x=1, y='y', z='z') kwargs_model.x = 2 kwargs_model.model_validate(kwargs_model.__dict__) class InheritingModel(Model): z: int = 1 InheritingModel.model_validate(model.__dict__) class ForwardReferencingModel(Model): future: 'FutureModel' class FutureModel(Model): pass ForwardReferencingModel.model_rebuild() future_model = FutureModel(x=1, y='a') forward_model = ForwardReferencingModel(x=1, y='a', future=future_model) class NoMutationModel(BaseModel): x: int model_config = ConfigDict(frozen=True) class MutationModel(NoMutationModel): a: int = 1 model_config = ConfigDict(frozen=False, from_attributes=True) MutationModel(x=1).x = 2 MutationModel.model_validate(model.__dict__) class KwargsNoMutationModel(BaseModel, frozen=True): x: int class KwargsMutationModel(KwargsNoMutationModel, frozen=False, from_attributes=True): a: int = 1 KwargsMutationModel(x=1).x = 2 KwargsMutationModel.model_validate(model.__dict__) class OverrideModel(Model): x: int OverrideModel(x=1, y='b') class Mixin: def f(self) -> None: pass class MultiInheritanceModel(BaseModel, Mixin): pass MultiInheritanceModel().f() class AliasModel(BaseModel): x: str = Field(..., alias='y') alias_model = AliasModel(y='hello') assert alias_model.x == 'hello' class ClassVarModel(BaseModel): x: int y: ClassVar[int] = 1 ClassVarModel(x=1) @dataclass(config={'validate_assignment': True}) class AddProject: name: str slug: Optional[str] description: Optional[str] p = AddProject(name='x', slug='y', description='z') class TypeAliasAsAttribute(BaseModel): __type_alias_attribute__ = Union[str, bytes] class NestedModel(BaseModel): class Model(BaseModel): id: str model: Model _ = 
NestedModel.Model DynamicModel = create_model('DynamicModel', __base__=Model) dynamic_model = DynamicModel(x=1, y='y') dynamic_model.x = 2 class FrozenModel(BaseModel): x: int model_config = ConfigDict(frozen=True) class NotFrozenModel(FrozenModel): a: int = 1 model_config = ConfigDict(frozen=False, from_attributes=True) NotFrozenModel(x=1).x = 2 NotFrozenModel.model_validate(model.__dict__) class KwargsFrozenModel(BaseModel, frozen=True): x: int class KwargsNotFrozenModel(FrozenModel, frozen=False, from_attributes=True): a: int = 1 KwargsNotFrozenModel(x=1).x = 2 KwargsNotFrozenModel.model_validate(model.__dict__) class ModelWithSelfField(BaseModel): self: str def f(name: str) -> str: return name class ModelWithAllowReuseValidator(BaseModel): name: str normalize_name = field_validator('name')(f) model_with_allow_reuse_validator = ModelWithAllowReuseValidator(name='xyz') T = TypeVar('T') class Response(BaseModel, Generic[T]): data: T error: Optional[str] response = Response[Model](data=model, error=None) class ModelWithAnnotatedValidator(BaseModel): name: str @field_validator('name') def noop_validator_with_annotations(cls, name: str) -> str: return name def _default_factory_str() -> str: return 'x' def _default_factory_list() -> List[int]: return [1, 2, 3] class FieldDefaultTestingModel(BaseModel): # Required a: int b: int = Field() c: int = Field(...) 
# Default d: int = Field(1) # Default factory g: List[int] = Field(default_factory=_default_factory_list) h: str = Field(default_factory=_default_factory_str) i: str = Field(default_factory=lambda: 'test') _TModel = TypeVar('_TModel') _TType = TypeVar('_TType') class OrmMixin(Generic[_TModel, _TType]): @classmethod def from_orm(cls, model: _TModel) -> _TType: raise NotImplementedError @classmethod def from_orm_optional(cls, model: Optional[_TModel]) -> Optional[_TType]: if model is None: return None return cls.from_orm(model) @dataclass class MyDataClass: foo: InitVar[str] bar: str MyDataClass(foo='foo', bar='bar') def get_my_custom_validator(field_name: str) -> Any: @validator(field_name, allow_reuse=True) def my_custom_validator(cls: Any, v: int) -> int: return v return my_custom_validator def foo() -> None: class MyModel(BaseModel): number: int custom_validator = get_my_custom_validator('number') # type: ignore[pydantic-field] @model_validator(mode='before') @classmethod def validate_before(cls, values: Any) -> Any: return values @model_validator(mode='after') def validate_after(self) -> Self: return self MyModel(number=2) class InnerModel(BaseModel): my_var: Union[str, None] = Field(default=None) class OuterModel(InnerModel): pass m = OuterModel() if m.my_var is None: # In https://github.com/pydantic/pydantic/issues/7399, this was unreachable print('not unreachable') class Foo(BaseModel): pass class Bar(Foo, RootModel[int]): pass pydantic-2.10.6/tests/mypy/modules/plugin_success_baseConfig.py000066400000000000000000000104351474456633400247660ustar00rootroot00000000000000from typing import ClassVar, Generic, List, Optional, TypeVar, Union from pydantic import BaseModel, Field, create_model, field_validator from pydantic.dataclasses import dataclass class Model(BaseModel): x: float y: str model_config = dict(from_attributes=True) class NotConfig: frozen = True class SelfReferencingModel(BaseModel): submodel: Optional['SelfReferencingModel'] @property def 
prop(self) -> None: ... SelfReferencingModel.model_rebuild() model = Model(x=1, y='y') Model(x=1, y='y', z='z') model.x = 2 model.model_validate(model) self_referencing_model = SelfReferencingModel(submodel=SelfReferencingModel(submodel=None)) class KwargsModel(BaseModel, from_attributes=True): x: float y: str class NotConfig: frozen = True kwargs_model = KwargsModel(x=1, y='y') KwargsModel(x=1, y='y', z='z') kwargs_model.x = 2 kwargs_model.model_validate(kwargs_model.__dict__) class InheritingModel(Model): z: int = 1 InheritingModel.model_validate(model.__dict__) class ForwardReferencingModel(Model): future: 'FutureModel' class FutureModel(Model): pass ForwardReferencingModel.model_rebuild() future_model = FutureModel(x=1, y='a') forward_model = ForwardReferencingModel(x=1, y='a', future=future_model) class NoMutationModel(BaseModel): x: int model_config = dict(frozen=True) class MutationModel(NoMutationModel): a: int = 1 model_config = dict(frozen=False, from_attributes=True) MutationModel(x=1).x = 2 MutationModel.model_validate(model.__dict__) class KwargsNoMutationModel(BaseModel, frozen=True): x: int class KwargsMutationModel(KwargsNoMutationModel, frozen=False, from_attributes=True): a: int = 1 KwargsMutationModel(x=1).x = 2 KwargsMutationModel.model_validate(model.__dict__) class OverrideModel(Model): x: int OverrideModel(x=1, y='b') class Mixin: def f(self) -> None: pass class MultiInheritanceModel(BaseModel, Mixin): pass MultiInheritanceModel().f() class AliasModel(BaseModel): x: str = Field(..., alias='y') alias_model = AliasModel(y='hello') assert alias_model.x == 'hello' class ClassVarModel(BaseModel): x: int y: ClassVar[int] = 1 ClassVarModel(x=1) @dataclass(config=dict(validate_assignment=True)) class AddProject: name: str slug: Optional[str] description: Optional[str] p = AddProject(name='x', slug='y', description='z') class TypeAliasAsAttribute(BaseModel): __type_alias_attribute__ = Union[str, bytes] class NestedModel(BaseModel): class 
Model(BaseModel): id: str model: Model _ = NestedModel.Model DynamicModel = create_model('DynamicModel', __base__=Model) dynamic_model = DynamicModel(x=1, y='y') dynamic_model.x = 2 class FrozenModel(BaseModel): x: int model_config = dict(frozen=True) class NotFrozenModel(FrozenModel): a: int = 1 model_config = dict(frozen=False, from_attributes=True) NotFrozenModel(x=1).x = 2 NotFrozenModel.model_validate(model.__dict__) class KwargsFrozenModel(BaseModel, frozen=True): x: int class KwargsNotFrozenModel(FrozenModel, frozen=False, from_attributes=True): a: int = 1 KwargsNotFrozenModel(x=1).x = 2 KwargsNotFrozenModel.model_validate(model.__dict__) class ModelWithSelfField(BaseModel): self: str def f(name: str) -> str: return name class ModelWithAllowReuseValidator(BaseModel): name: str normalize_name = field_validator('name')(f) model_with_allow_reuse_validator = ModelWithAllowReuseValidator(name='xyz') T = TypeVar('T') class Response(BaseModel, Generic[T]): data: T error: Optional[str] response = Response[Model](data=model, error=None) class ModelWithAnnotatedValidator(BaseModel): name: str @field_validator('name') def noop_validator_with_annotations(cls, name: str) -> str: return name def _default_factory_str() -> str: return 'x' def _default_factory_list() -> List[int]: return [1, 2, 3] class FieldDefaultTestingModel(BaseModel): # Required a: int b: int = Field() c: int = Field(...) 
# Default d: int = Field(1) # Default factory g: List[int] = Field(default_factory=_default_factory_list) h: str = Field(default_factory=_default_factory_str) i: str = Field(default_factory=lambda: 'test') pydantic-2.10.6/tests/mypy/modules/pydantic_settings.py000066400000000000000000000007151474456633400233530ustar00rootroot00000000000000from pydantic_settings import BaseSettings, SettingsConfigDict class Settings(BaseSettings): foo: str s = Settings() s = Settings(foo='test', _case_sensitive=True, _env_prefix='test__', _env_file='test') s = Settings(foo='test', _case_sensitive=1, _env_prefix=2, _env_file=3) class SettingsWithConfigDict(BaseSettings): bar: str model_config = SettingsConfigDict(env_file='.env', env_file_encoding='utf-8') scd = SettingsWithConfigDict() pydantic-2.10.6/tests/mypy/modules/root_models.py000066400000000000000000000005711474456633400221460ustar00rootroot00000000000000from typing import List from pydantic import RootModel class Pets1(RootModel[List[str]]): pass pets_construct = Pets1.model_construct(['dog']) Pets2 = RootModel[List[str]] class Pets3(RootModel): root: List[str] pets1 = Pets1(['dog', 'cat']) pets2 = Pets2(['dog', 'cat']) pets3 = Pets3(['dog', 'cat']) class Pets4(RootModel[List[str]]): pets: List[str] pydantic-2.10.6/tests/mypy/modules/strict_equality.py000066400000000000000000000002771474456633400230500ustar00rootroot00000000000000from pydantic import BaseModel class User(BaseModel): username: str user = User(username='test') print(user == 'test') print(user.username == int('1')) print(user.username == 'test') 
pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-default_ini/metaclass_args.py

from pydantic import BaseModel, Field

class ConfigClassUsed(BaseModel):
    i: int = Field(2, alias='j')

    class Config:
        populate_by_name = True

ConfigClassUsed(i=None)
# MYPY: error: Unexpected keyword argument "i" for "ConfigClassUsed" [call-arg]

class MetaclassArgumentsNoDefault(BaseModel, populate_by_name=True):
    i: int = Field(alias='j')

MetaclassArgumentsNoDefault(i=None)
# MYPY: error: Unexpected keyword argument "i" for "MetaclassArgumentsNoDefault" [call-arg]

class MetaclassArgumentsWithDefault(BaseModel, populate_by_name=True):
    i: int = Field(2, alias='j')

MetaclassArgumentsWithDefault(i=None)
# MYPY: error: Unexpected keyword argument "i" for "MetaclassArgumentsWithDefault" [call-arg]

class NoArguments(BaseModel):
    i: int = Field(2, alias='j')

NoArguments(i=1)
# MYPY: error: Unexpected keyword argument "i" for "NoArguments" [call-arg]
NoArguments(j=None)
# MYPY: error: Argument "j" to "NoArguments" has incompatible type "None"; expected "int" [arg-type]

pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-default_ini/plugin_success.py

from dataclasses import InitVar
from typing import Any, ClassVar, Generic, List, Optional, TypeVar, Union

from typing_extensions import Self

from pydantic import BaseModel, ConfigDict, Field, RootModel, create_model, field_validator, model_validator, validator
from pydantic.dataclasses import dataclass

class Model(BaseModel):
    x: float
    y: str

    model_config = ConfigDict(from_attributes=True)

class SelfReferencingModel(BaseModel):
    submodel: Optional['SelfReferencingModel']

    @property
    def prop(self) -> None:
        ...

SelfReferencingModel.model_rebuild()

model = Model(x=1, y='y')
Model(x=1, y='y', z='z')
# MYPY: error: Unexpected keyword argument "z" for "Model" [call-arg]
model.x = 2
model.model_validate(model)

self_referencing_model = SelfReferencingModel(submodel=SelfReferencingModel(submodel=None))

class KwargsModel(BaseModel, from_attributes=True):
    x: float
    y: str

kwargs_model = KwargsModel(x=1, y='y')
KwargsModel(x=1, y='y', z='z')
# MYPY: error: Unexpected keyword argument "z" for "KwargsModel" [call-arg]
kwargs_model.x = 2
kwargs_model.model_validate(kwargs_model.__dict__)

class InheritingModel(Model):
    z: int = 1

InheritingModel.model_validate(model.__dict__)

class ForwardReferencingModel(Model):
    future: 'FutureModel'

class FutureModel(Model):
    pass

ForwardReferencingModel.model_rebuild()

future_model = FutureModel(x=1, y='a')
forward_model = ForwardReferencingModel(x=1, y='a', future=future_model)

class NoMutationModel(BaseModel):
    x: int

    model_config = ConfigDict(frozen=True)

class MutationModel(NoMutationModel):
    a: int = 1

    model_config = ConfigDict(frozen=False, from_attributes=True)

MutationModel(x=1).x = 2
MutationModel.model_validate(model.__dict__)

class KwargsNoMutationModel(BaseModel, frozen=True):
    x: int

class KwargsMutationModel(KwargsNoMutationModel, frozen=False, from_attributes=True):
# MYPY: error: Cannot inherit non-frozen dataclass from a frozen one [misc]
    a: int = 1

KwargsMutationModel(x=1).x = 2
# MYPY: error: Property "x" defined in "KwargsNoMutationModel" is read-only [misc]
KwargsMutationModel.model_validate(model.__dict__)

class OverrideModel(Model):
    x: int

OverrideModel(x=1, y='b')

class Mixin:
    def f(self) -> None:
        pass

class MultiInheritanceModel(BaseModel, Mixin):
    pass

MultiInheritanceModel().f()

class AliasModel(BaseModel):
    x: str = Field(..., alias='y')

alias_model = AliasModel(y='hello')
assert alias_model.x == 'hello'

class ClassVarModel(BaseModel):
    x: int
    y: ClassVar[int] = 1

ClassVarModel(x=1)

@dataclass(config={'validate_assignment': True})
class AddProject:
    name: str
    slug: Optional[str]
    description: Optional[str]

p = AddProject(name='x', slug='y', description='z')

class TypeAliasAsAttribute(BaseModel):
    __type_alias_attribute__ = Union[str, bytes]

class NestedModel(BaseModel):
    class Model(BaseModel):
        id: str

    model: Model

_ = NestedModel.Model

DynamicModel = create_model('DynamicModel', __base__=Model)
dynamic_model = DynamicModel(x=1, y='y')
dynamic_model.x = 2

class FrozenModel(BaseModel):
    x: int

    model_config = ConfigDict(frozen=True)

class NotFrozenModel(FrozenModel):
    a: int = 1

    model_config = ConfigDict(frozen=False, from_attributes=True)

NotFrozenModel(x=1).x = 2
NotFrozenModel.model_validate(model.__dict__)

class KwargsFrozenModel(BaseModel, frozen=True):
    x: int

class KwargsNotFrozenModel(FrozenModel, frozen=False, from_attributes=True):
    a: int = 1

KwargsNotFrozenModel(x=1).x = 2
KwargsNotFrozenModel.model_validate(model.__dict__)

class ModelWithSelfField(BaseModel):
    self: str

def f(name: str) -> str:
    return name

class ModelWithAllowReuseValidator(BaseModel):
    name: str
    normalize_name = field_validator('name')(f)

model_with_allow_reuse_validator = ModelWithAllowReuseValidator(name='xyz')

T = TypeVar('T')

class Response(BaseModel, Generic[T]):
    data: T
    error: Optional[str]

response = Response[Model](data=model, error=None)

class ModelWithAnnotatedValidator(BaseModel):
    name: str

    @field_validator('name')
    def noop_validator_with_annotations(cls, name: str) -> str:
        return name

def _default_factory_str() -> str:
    return 'x'

def _default_factory_list() -> List[int]:
    return [1, 2, 3]

class FieldDefaultTestingModel(BaseModel):
    # Required
    a: int
    b: int = Field()
    c: int = Field(...)

    # Default
    d: int = Field(1)

    # Default factory
    g: List[int] = Field(default_factory=_default_factory_list)
    h: str = Field(default_factory=_default_factory_str)
    i: str = Field(default_factory=lambda: 'test')

_TModel = TypeVar('_TModel')
_TType = TypeVar('_TType')

class OrmMixin(Generic[_TModel, _TType]):
    @classmethod
    def from_orm(cls, model: _TModel) -> _TType:
        raise NotImplementedError

    @classmethod
    def from_orm_optional(cls, model: Optional[_TModel]) -> Optional[_TType]:
        if model is None:
            return None
        return cls.from_orm(model)

@dataclass
class MyDataClass:
    foo: InitVar[str]
    bar: str

MyDataClass(foo='foo', bar='bar')

def get_my_custom_validator(field_name: str) -> Any:
    @validator(field_name, allow_reuse=True)
    def my_custom_validator(cls: Any, v: int) -> int:
        return v

    return my_custom_validator

def foo() -> None:
    class MyModel(BaseModel):
        number: int
        custom_validator = get_my_custom_validator('number')  # type: ignore[pydantic-field]
# MYPY: error: Unused "type: ignore" comment [unused-ignore]

        @model_validator(mode='before')
        @classmethod
        def validate_before(cls, values: Any) -> Any:
            return values

        @model_validator(mode='after')
        def validate_after(self) -> Self:
            return self

    MyModel(number=2)

class InnerModel(BaseModel):
    my_var: Union[str, None] = Field(default=None)

class OuterModel(InnerModel):
    pass

m = OuterModel()
if m.my_var is None:
    # In https://github.com/pydantic/pydantic/issues/7399, this was unreachable
    print('not unreachable')

class Foo(BaseModel):
    pass

class Bar(Foo, RootModel[int]):
    pass

pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-default_ini/plugin_success_baseConfig.py

from typing import ClassVar, Generic, List, Optional, TypeVar, Union

from pydantic import BaseModel, Field, create_model, field_validator
from pydantic.dataclasses import dataclass

class Model(BaseModel):
    x: float
    y: str

    model_config = dict(from_attributes=True)

    class NotConfig:
        frozen = True

class SelfReferencingModel(BaseModel):
    submodel: Optional['SelfReferencingModel']

    @property
    def prop(self) -> None:
        ...

SelfReferencingModel.model_rebuild()

model = Model(x=1, y='y')
Model(x=1, y='y', z='z')
# MYPY: error: Unexpected keyword argument "z" for "Model" [call-arg]
model.x = 2
model.model_validate(model)

self_referencing_model = SelfReferencingModel(submodel=SelfReferencingModel(submodel=None))

class KwargsModel(BaseModel, from_attributes=True):
    x: float
    y: str

    class NotConfig:
        frozen = True

kwargs_model = KwargsModel(x=1, y='y')
KwargsModel(x=1, y='y', z='z')
# MYPY: error: Unexpected keyword argument "z" for "KwargsModel" [call-arg]
kwargs_model.x = 2
kwargs_model.model_validate(kwargs_model.__dict__)

class InheritingModel(Model):
    z: int = 1

InheritingModel.model_validate(model.__dict__)

class ForwardReferencingModel(Model):
    future: 'FutureModel'

class FutureModel(Model):
    pass

ForwardReferencingModel.model_rebuild()

future_model = FutureModel(x=1, y='a')
forward_model = ForwardReferencingModel(x=1, y='a', future=future_model)

class NoMutationModel(BaseModel):
    x: int

    model_config = dict(frozen=True)

class MutationModel(NoMutationModel):
    a: int = 1

    model_config = dict(frozen=False, from_attributes=True)

MutationModel(x=1).x = 2
MutationModel.model_validate(model.__dict__)

class KwargsNoMutationModel(BaseModel, frozen=True):
    x: int

class KwargsMutationModel(KwargsNoMutationModel, frozen=False, from_attributes=True):
# MYPY: error: Cannot inherit non-frozen dataclass from a frozen one [misc]
    a: int = 1

KwargsMutationModel(x=1).x = 2
# MYPY: error: Property "x" defined in "KwargsNoMutationModel" is read-only [misc]
KwargsMutationModel.model_validate(model.__dict__)

class OverrideModel(Model):
    x: int

OverrideModel(x=1, y='b')

class Mixin:
    def f(self) -> None:
        pass

class MultiInheritanceModel(BaseModel, Mixin):
    pass

MultiInheritanceModel().f()

class AliasModel(BaseModel):
    x: str = Field(..., alias='y')

alias_model = AliasModel(y='hello')
assert alias_model.x == 'hello'

class ClassVarModel(BaseModel):
    x: int
    y: ClassVar[int] = 1

ClassVarModel(x=1)

@dataclass(config=dict(validate_assignment=True))
class AddProject:
    name: str
    slug: Optional[str]
    description: Optional[str]

p = AddProject(name='x', slug='y', description='z')

class TypeAliasAsAttribute(BaseModel):
    __type_alias_attribute__ = Union[str, bytes]

class NestedModel(BaseModel):
    class Model(BaseModel):
        id: str

    model: Model

_ = NestedModel.Model

DynamicModel = create_model('DynamicModel', __base__=Model)
dynamic_model = DynamicModel(x=1, y='y')
dynamic_model.x = 2

class FrozenModel(BaseModel):
    x: int

    model_config = dict(frozen=True)

class NotFrozenModel(FrozenModel):
    a: int = 1

    model_config = dict(frozen=False, from_attributes=True)

NotFrozenModel(x=1).x = 2
NotFrozenModel.model_validate(model.__dict__)

class KwargsFrozenModel(BaseModel, frozen=True):
    x: int

class KwargsNotFrozenModel(FrozenModel, frozen=False, from_attributes=True):
    a: int = 1

KwargsNotFrozenModel(x=1).x = 2
KwargsNotFrozenModel.model_validate(model.__dict__)

class ModelWithSelfField(BaseModel):
    self: str

def f(name: str) -> str:
    return name

class ModelWithAllowReuseValidator(BaseModel):
    name: str
    normalize_name = field_validator('name')(f)

model_with_allow_reuse_validator = ModelWithAllowReuseValidator(name='xyz')

T = TypeVar('T')

class Response(BaseModel, Generic[T]):
    data: T
    error: Optional[str]

response = Response[Model](data=model, error=None)

class ModelWithAnnotatedValidator(BaseModel):
    name: str

    @field_validator('name')
    def noop_validator_with_annotations(cls, name: str) -> str:
        return name

def _default_factory_str() -> str:
    return 'x'

def _default_factory_list() -> List[int]:
    return [1, 2, 3]

class FieldDefaultTestingModel(BaseModel):
    # Required
    a: int
    b: int = Field()
    c: int = Field(...)

    # Default
    d: int = Field(1)

    # Default factory
    g: List[int] = Field(default_factory=_default_factory_list)
    h: str = Field(default_factory=_default_factory_str)
    i: str = Field(default_factory=lambda: 'test')

pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-default_ini/pydantic_settings.py

from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    foo: str

s = Settings()
# MYPY: error: Missing named argument "foo" for "Settings" [call-arg]
s = Settings(foo='test', _case_sensitive=True, _env_prefix='test__', _env_file='test')
# MYPY: error: Unexpected keyword argument "_case_sensitive" for "Settings" [call-arg]
# MYPY: error: Unexpected keyword argument "_env_prefix" for "Settings" [call-arg]
# MYPY: error: Unexpected keyword argument "_env_file" for "Settings" [call-arg]
s = Settings(foo='test', _case_sensitive=1, _env_prefix=2, _env_file=3)
# MYPY: error: Unexpected keyword argument "_case_sensitive" for "Settings" [call-arg]
# MYPY: error: Unexpected keyword argument "_env_prefix" for "Settings" [call-arg]
# MYPY: error: Unexpected keyword argument "_env_file" for "Settings" [call-arg]

class SettingsWithConfigDict(BaseSettings):
    bar: str

    model_config = SettingsConfigDict(env_file='.env', env_file_encoding='utf-8')

scd = SettingsWithConfigDict()
# MYPY: error: Missing named argument "bar" for "SettingsWithConfigDict" [call-arg]

pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-default_ini/root_models.py

from typing import List

from pydantic import RootModel

class Pets1(RootModel[List[str]]):
    pass

pets_construct = Pets1.model_construct(['dog'])

Pets2 = RootModel[List[str]]

class Pets3(RootModel):
# MYPY: error: Missing type parameters for generic type "RootModel" [type-arg]
    root: List[str]

pets1 = Pets1(['dog', 'cat'])
pets2 = Pets2(['dog', 'cat'])
pets3 = Pets3(['dog', 'cat'])

class Pets4(RootModel[List[str]]):
    pets: List[str]

pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-plugin-strict_ini/fail_defaults.py

from pydantic import BaseModel, Field

class Model(BaseModel):
    # Required
    undefined_default_no_args: int = Field()
    undefined_default: int = Field(description='my desc')
    positional_ellipsis_default: int = Field(...)
    named_ellipsis_default: int = Field(default=...)

    # Not required
    positional_default: int = Field(1)
    named_default: int = Field(default=2)
    named_default_factory: int = Field(default_factory=lambda: 3)

Model()
# MYPY: error: Missing named argument "undefined_default_no_args" for "Model" [call-arg]
# MYPY: error: Missing named argument "undefined_default" for "Model" [call-arg]
# MYPY: error: Missing named argument "positional_ellipsis_default" for "Model" [call-arg]
# MYPY: error: Missing named argument "named_ellipsis_default" for "Model" [call-arg]

pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-plugin-strict_ini/plugin_fail.py

from typing import Generic, List, Optional, Set, TypeVar, Union

from pydantic import BaseModel, ConfigDict, Extra, Field, field_validator
from pydantic.dataclasses import dataclass

class Model(BaseModel):
    model_config = ConfigDict(alias_generator=None, frozen=True, extra=Extra.forbid)
    x: int
    y: str

    def method(self) -> None:
        pass

model = Model(x=1, y='y', z='z')
# MYPY: error: Unexpected keyword argument "z" for "Model" [call-arg]
model = Model(x=1)
# MYPY: error: Missing named argument "y" for "Model" [call-arg]
model.y = 'a'
# MYPY: error: Property "y" defined in "Model" is read-only [misc]
Model.from_orm({})
# MYPY: error: "Model" does not have from_attributes=True [pydantic-orm]

class KwargsModel(BaseModel, alias_generator=None, frozen=True, extra=Extra.forbid):
    x: int
    y: str

    def method(self) -> None:
        pass

kwargs_model = KwargsModel(x=1, y='y', z='z')
# MYPY: error: Unexpected keyword argument "z" for "KwargsModel" [call-arg]
kwargs_model = KwargsModel(x=1)
# MYPY: error: Missing named argument "y" for "KwargsModel" [call-arg]
kwargs_model.y = 'a'
# MYPY: error: Property "y" defined in "KwargsModel" is read-only [misc]
KwargsModel.from_orm({})
# MYPY: error: "KwargsModel" does not have from_attributes=True [pydantic-orm]

class ForbidExtraModel(BaseModel):
    model_config = ConfigDict(extra=Extra.forbid)

ForbidExtraModel(x=1)
# MYPY: error: Unexpected keyword argument "x" for "ForbidExtraModel" [call-arg]

class KwargsForbidExtraModel(BaseModel, extra='forbid'):
    pass

KwargsForbidExtraModel(x=1)
# MYPY: error: Unexpected keyword argument "x" for "KwargsForbidExtraModel" [call-arg]

class BadExtraModel(BaseModel):
    model_config = ConfigDict(extra=1)  # type: ignore[typeddict-item]

class KwargsBadExtraModel(BaseModel, extra=1):
# MYPY: error: Invalid value for "Config.extra" [pydantic-config]
    pass

class BadConfig1(BaseModel):
    model_config = ConfigDict(from_attributes={})  # type: ignore[typeddict-item]
# MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config]
# MYPY: note: Error code "pydantic-config" not covered by "type: ignore" comment

class KwargsBadConfig1(BaseModel, from_attributes={}):
# MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config]
    pass

class BadConfig2(BaseModel):
    model_config = ConfigDict(from_attributes=list)  # type: ignore[typeddict-item]
# MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config]
# MYPY: note: Error code "pydantic-config" not covered by "type: ignore" comment

class KwargsBadConfig2(BaseModel, from_attributes=list):
# MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config]
    pass

class InheritingModel(Model):
    model_config = ConfigDict(frozen=False)

class KwargsInheritingModel(KwargsModel, frozen=False):
    pass

class DefaultTestingModel(BaseModel):
    # Required
    a: int
    b: int = ...
# MYPY: error: Incompatible types in assignment (expression has type "EllipsisType", variable has type "int") [assignment]
    c: int = Field(...)
    d: Union[int, str]
    e = ...
# MYPY: error: Untyped fields disallowed [pydantic-field]

    # Not required
    f: Optional[int]
    g: int = 1
    h: int = Field(1)
    i: int = Field(None)
# MYPY: error: Incompatible types in assignment (expression has type "None", variable has type "int") [assignment]
    j = 1
# MYPY: error: Untyped fields disallowed [pydantic-field]

DefaultTestingModel()
# MYPY: error: Missing named argument "a" for "DefaultTestingModel" [call-arg]
# MYPY: error: Missing named argument "b" for "DefaultTestingModel" [call-arg]
# MYPY: error: Missing named argument "c" for "DefaultTestingModel" [call-arg]
# MYPY: error: Missing named argument "d" for "DefaultTestingModel" [call-arg]
# MYPY: error: Missing named argument "f" for "DefaultTestingModel" [call-arg]

class UndefinedAnnotationModel(BaseModel):
    undefined: Undefined  # noqa F821
# MYPY: error: Name "Undefined" is not defined [name-defined]

UndefinedAnnotationModel()
# MYPY: error: Missing named argument "undefined" for "UndefinedAnnotationModel" [call-arg]

Model.model_construct(x=1)
# MYPY: error: Missing named argument "y" for "model_construct" of "Model" [call-arg]
Model.model_construct(_fields_set={'x'}, x=1, y='2')
Model.model_construct(x='1', y='2')
# MYPY: error: Argument "x" to "model_construct" of "Model" has incompatible type "str"; expected "int" [arg-type]

# Strict mode fails
inheriting = InheritingModel(x='1', y='1')
# MYPY: error: Argument "x" to "InheritingModel" has incompatible type "str"; expected "int" [arg-type]
Model(x='1', y='2')
# MYPY: error: Argument "x" to "Model" has incompatible type "str"; expected "int" [arg-type]

class Blah(BaseModel):
    fields_set: Optional[Set[str]] = None

# (comment to keep line numbers unchanged)
T = TypeVar('T')

class Response(BaseModel, Generic[T]):
    data: T
    error: Optional[str]

response = Response[Model](data=model, error=None)
response = Response[Model](data=1, error=None)
# MYPY: error: Argument "data" to "Response" has incompatible type "int"; expected "Model" [arg-type]

class AliasModel(BaseModel):
    x: str = Field(..., alias='y')
    z: int

AliasModel(y=1, z=2)
# MYPY: error: Argument "y" to "AliasModel" has incompatible type "int"; expected "str" [arg-type]

x_alias = 'y'

class DynamicAliasModel(BaseModel):
    x: str = Field(..., alias=x_alias)
# MYPY: error: Required dynamic aliases disallowed [pydantic-alias]
    z: int

DynamicAliasModel(y='y', z='1')
# MYPY: error: Argument "z" to "DynamicAliasModel" has incompatible type "str"; expected "int" [arg-type]

class DynamicAliasModel2(BaseModel):
    x: str = Field(..., alias=x_alias)
    z: int

    model_config = ConfigDict(populate_by_name=True)

DynamicAliasModel2(y='y', z=1)
# MYPY: error: Unexpected keyword argument "y" for "DynamicAliasModel2" [call-arg]
DynamicAliasModel2(x='y', z=1)

class KwargsDynamicAliasModel(BaseModel, populate_by_name=True):
    x: str = Field(..., alias=x_alias)
    z: int

KwargsDynamicAliasModel(y='y', z=1)
# MYPY: error: Unexpected keyword argument "y" for "KwargsDynamicAliasModel" [call-arg]
KwargsDynamicAliasModel(x='y', z=1)

class AliasGeneratorModel(BaseModel):
# MYPY: error: Required dynamic aliases disallowed [pydantic-alias]
    x: int

    model_config = ConfigDict(alias_generator=lambda x: x + '_')
# MYPY: error: Required dynamic aliases disallowed [pydantic-alias]

AliasGeneratorModel(x=1)
AliasGeneratorModel(x_=1)
AliasGeneratorModel(z=1)

class AliasGeneratorModel2(BaseModel):
# MYPY: error: Required dynamic aliases disallowed [pydantic-alias]
    x: int = Field(..., alias='y')

    model_config = ConfigDict(alias_generator=lambda x: x + '_')  # type: ignore[pydantic-alias]

class UntypedFieldModel(BaseModel):
    x: int = 1
    y = 2
# MYPY: error: Untyped fields disallowed [pydantic-field]
    z = 2  # type: ignore[pydantic-field]

AliasGeneratorModel2(x=1)
# MYPY: error: Unexpected keyword argument "x" for "AliasGeneratorModel2" [call-arg]
AliasGeneratorModel2(y=1, z=1)
# MYPY: error: Unexpected keyword argument "z" for "AliasGeneratorModel2" [call-arg]

class KwargsAliasGeneratorModel(BaseModel, alias_generator=lambda x: x + '_'):
# MYPY: error: Required dynamic aliases disallowed [pydantic-alias]
    x: int
# MYPY: error: Required dynamic aliases disallowed [pydantic-alias]

KwargsAliasGeneratorModel(x=1)
KwargsAliasGeneratorModel(x_=1)
KwargsAliasGeneratorModel(z=1)

class KwargsAliasGeneratorModel2(BaseModel, alias_generator=lambda x: x + '_'):
# MYPY: error: Required dynamic aliases disallowed [pydantic-alias]
    x: int = Field(..., alias='y')
# MYPY: error: Required dynamic aliases disallowed [pydantic-alias]

KwargsAliasGeneratorModel2(x=1)
# MYPY: error: Unexpected keyword argument "x" for "KwargsAliasGeneratorModel2" [call-arg]
KwargsAliasGeneratorModel2(y=1, z=1)
# MYPY: error: Unexpected keyword argument "z" for "KwargsAliasGeneratorModel2" [call-arg]

class CoverageTester(Missing):  # noqa F821
# MYPY: error: Name "Missing" is not defined [name-defined]
    def from_orm(self) -> None:
        pass

CoverageTester().from_orm()

@dataclass(config={})
class AddProject:
    name: str
    slug: Optional[str]
    description: Optional[str]

p = AddProject(name='x', slug='y', description='z')

# Same as Model, but with frozen = True
class FrozenModel(BaseModel):
    x: int
    y: str

    model_config = ConfigDict(alias_generator=None, frozen=True, extra=Extra.forbid)

frozenmodel = FrozenModel(x=1, y='b')
frozenmodel.y = 'a'
# MYPY: error: Property "y" defined in "FrozenModel" is read-only [misc]

class InheritingModel2(FrozenModel):
    model_config = ConfigDict(frozen=False)

inheriting2 = InheritingModel2(x=1, y='c')
inheriting2.y = 'd'

class ModelWithAnnotatedValidator(BaseModel):
    name: str

    @field_validator('name')
    def noop_validator_with_annotations(self, name: str) -> str:
        # This is a mistake: the first argument to a validator is the class itself,
        # like a classmethod.
        self.instance_method()
# MYPY: error: Missing positional argument "self" in call to "instance_method" of "ModelWithAnnotatedValidator" [call-arg]
        return name

    def instance_method(self) -> None:
        ...

pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-plugin-strict_ini/plugin_fail_baseConfig.py

from typing import Any, Generic, List, Optional, Set, TypeVar, Union

from pydantic import BaseModel, Extra, Field, field_validator
from pydantic.dataclasses import dataclass

class Model(BaseModel):
    x: int
    y: str

    def method(self) -> None:
        pass

    class Config:
        alias_generator = None
        frozen = True
        extra = Extra.forbid

        def config_method(self) -> None:
            ...

model = Model(x=1, y='y', z='z')
# MYPY: error: Unexpected keyword argument "z" for "Model" [call-arg]
model = Model(x=1)
# MYPY: error: Missing named argument "y" for "Model" [call-arg]
model.y = 'a'
# MYPY: error: Property "y" defined in "Model" is read-only [misc]
Model.from_orm({})
# MYPY: error: "Model" does not have from_attributes=True [pydantic-orm]

class KwargsModel(BaseModel, alias_generator=None, frozen=True, extra=Extra.forbid):
    x: int
    y: str

    def method(self) -> None:
        pass

kwargs_model = KwargsModel(x=1, y='y', z='z')
# MYPY: error: Unexpected keyword argument "z" for "KwargsModel" [call-arg]
kwargs_model = KwargsModel(x=1)
# MYPY: error: Missing named argument "y" for "KwargsModel" [call-arg]
kwargs_model.y = 'a'
# MYPY: error: Property "y" defined in "KwargsModel" is read-only [misc]
KwargsModel.from_orm({})
# MYPY: error: "KwargsModel" does not have from_attributes=True [pydantic-orm]

class ForbidExtraModel(BaseModel):
    class Config:
        extra = 'forbid'

ForbidExtraModel(x=1)
# MYPY: error: Unexpected keyword argument "x" for "ForbidExtraModel" [call-arg]

class KwargsForbidExtraModel(BaseModel, extra='forbid'):
    pass

KwargsForbidExtraModel(x=1)
# MYPY: error: Unexpected keyword argument "x" for "KwargsForbidExtraModel" [call-arg]

class BadExtraModel(BaseModel):
    class Config:
        extra = 1  # type: ignore[pydantic-config]
        extra = 1
# MYPY: error: Invalid value for "Config.extra" [pydantic-config]

class KwargsBadExtraModel(BaseModel, extra=1):
# MYPY: error: Invalid value for "Config.extra" [pydantic-config]
    pass

class BadConfig1(BaseModel):
    class Config:
        from_attributes: Any = {}  # not sensible, but should still be handled gracefully
# MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config]

class KwargsBadConfig1(BaseModel, from_attributes={}):
# MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config]
    pass

class BadConfig2(BaseModel):
    class Config:
        from_attributes = list  # not sensible, but should still be handled gracefully
# MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config]

class KwargsBadConfig2(BaseModel, from_attributes=list):
# MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config]
    pass

class InheritingModel(Model):
    class Config:
        frozen = False

class KwargsInheritingModel(KwargsModel, frozen=False):
    pass

class DefaultTestingModel(BaseModel):
    # Required
    a: int
    b: int = ...
# MYPY: error: Incompatible types in assignment (expression has type "EllipsisType", variable has type "int") [assignment]
    c: int = Field(...)
    d: Union[int, str]
    e = ...
# MYPY: error: Untyped fields disallowed [pydantic-field]

    # Not required
    f: Optional[int]
    g: int = 1
    h: int = Field(1)
    i: int = Field(None)
# MYPY: error: Incompatible types in assignment (expression has type "None", variable has type "int") [assignment]
    j = 1
# MYPY: error: Untyped fields disallowed [pydantic-field]

DefaultTestingModel()
# MYPY: error: Missing named argument "a" for "DefaultTestingModel" [call-arg]
# MYPY: error: Missing named argument "b" for "DefaultTestingModel" [call-arg]
# MYPY: error: Missing named argument "c" for "DefaultTestingModel" [call-arg]
# MYPY: error: Missing named argument "d" for "DefaultTestingModel" [call-arg]
# MYPY: error: Missing named argument "f" for "DefaultTestingModel" [call-arg]

class UndefinedAnnotationModel(BaseModel):
    undefined: Undefined  # noqa F821
# MYPY: error: Name "Undefined" is not defined [name-defined]

UndefinedAnnotationModel()
# MYPY: error: Missing named argument "undefined" for "UndefinedAnnotationModel" [call-arg]

Model.model_construct(x=1)
# MYPY: error: Missing named argument "y" for "model_construct" of "Model" [call-arg]
Model.model_construct(_fields_set={'x'}, x=1, y='2')
Model.model_construct(x='1', y='2')
# MYPY: error: Argument "x" to "model_construct" of "Model" has incompatible type "str"; expected "int" [arg-type]

# Strict mode fails
inheriting = InheritingModel(x='1', y='1')
# MYPY: error: Argument "x" to "InheritingModel" has incompatible type "str"; expected "int" [arg-type]
Model(x='1', y='2')
# MYPY: error: Argument "x" to "Model" has incompatible type "str"; expected "int" [arg-type]

class Blah(BaseModel):
    fields_set: Optional[Set[str]] = None

# (comment to keep line numbers unchanged)
T = TypeVar('T')

class Response(BaseModel, Generic[T]):
    data: T
    error: Optional[str]

response = Response[Model](data=model, error=None)
response = Response[Model](data=1, error=None)
# MYPY: error: Argument "data" to "Response" has incompatible type "int"; expected "Model" [arg-type]

class AliasModel(BaseModel):
    x: str = Field(..., alias='y')
    z: int

AliasModel(y=1, z=2)
# MYPY: error: Argument "y" to "AliasModel" has incompatible type "int"; expected "str" [arg-type]

x_alias = 'y'

class DynamicAliasModel(BaseModel):
    x: str = Field(..., alias=x_alias)
# MYPY: error: Required dynamic aliases disallowed [pydantic-alias]
    z: int

DynamicAliasModel(y='y', z='1')
# MYPY: error: Argument "z" to "DynamicAliasModel" has incompatible type "str"; expected "int" [arg-type]

class DynamicAliasModel2(BaseModel):
    x: str = Field(..., alias=x_alias)
    z: int

    class Config:
        populate_by_name = True

DynamicAliasModel2(y='y', z=1)
# MYPY: error: Unexpected keyword argument "y" for "DynamicAliasModel2" [call-arg]
DynamicAliasModel2(x='y', z=1)

class KwargsDynamicAliasModel(BaseModel, populate_by_name=True):
    x: str = Field(..., alias=x_alias)
    z: int

KwargsDynamicAliasModel(y='y', z=1)
# MYPY: error: Unexpected keyword argument "y" for "KwargsDynamicAliasModel" [call-arg]
KwargsDynamicAliasModel(x='y', z=1)

class AliasGeneratorModel(BaseModel):
# MYPY: error: Required dynamic aliases disallowed [pydantic-alias]
    x: int

    class Config:
# MYPY: error: Required dynamic aliases disallowed [pydantic-alias]
        alias_generator = lambda x: x + '_'  # noqa E731

AliasGeneratorModel(x=1)
AliasGeneratorModel(x_=1)
AliasGeneratorModel(z=1)

class AliasGeneratorModel2(BaseModel):
# MYPY: error: Required dynamic aliases disallowed [pydantic-alias]
    x: int = Field(..., alias='y')

    class Config:  # type: ignore[pydantic-alias]
        alias_generator = lambda x: x + '_'  # noqa E731

class UntypedFieldModel(BaseModel):
    x: int = 1
    y = 2
# MYPY: error: Untyped fields disallowed [pydantic-field]
    z = 2  # type: ignore[pydantic-field]

AliasGeneratorModel2(x=1)
# MYPY: error: Unexpected keyword argument "x" for "AliasGeneratorModel2" [call-arg]
AliasGeneratorModel2(y=1, z=1)
# MYPY: error: Unexpected keyword argument "z" for "AliasGeneratorModel2" [call-arg]

class KwargsAliasGeneratorModel(BaseModel, alias_generator=lambda x: x + '_'):
# MYPY: error: Required dynamic aliases disallowed [pydantic-alias]
    x: int
# MYPY: error: Required dynamic aliases disallowed [pydantic-alias]

KwargsAliasGeneratorModel(x=1)
KwargsAliasGeneratorModel(x_=1)
KwargsAliasGeneratorModel(z=1)

class KwargsAliasGeneratorModel2(BaseModel, alias_generator=lambda x: x + '_'):
# MYPY: error: Required dynamic aliases disallowed [pydantic-alias]
    x: int = Field(..., alias='y')
# MYPY: error: Required dynamic aliases disallowed [pydantic-alias]

KwargsAliasGeneratorModel2(x=1)
# MYPY: error: Unexpected keyword argument "x" for "KwargsAliasGeneratorModel2" [call-arg]
KwargsAliasGeneratorModel2(y=1, z=1)
# MYPY: error: Unexpected keyword argument "z" for "KwargsAliasGeneratorModel2" [call-arg]

class CoverageTester(Missing):  # noqa F821
# MYPY: error: Name "Missing" is not defined [name-defined]
    def from_orm(self) -> None:
        pass

CoverageTester().from_orm()

@dataclass(config={})
class AddProject:
    name: str
    slug: Optional[str]
    description: Optional[str]

p = AddProject(name='x', slug='y', description='z')

# Same as Model, but with frozen = True
class FrozenModel(BaseModel):
    x: int
    y: str

    class Config:
        alias_generator = None
        frozen = True
        extra = Extra.forbid

frozenmodel = FrozenModel(x=1, y='b')
frozenmodel.y = 'a'
# MYPY: error: Property "y" defined in "FrozenModel" is read-only [misc]

class InheritingModel2(FrozenModel):
    class Config:
        frozen = False

inheriting2 = InheritingModel2(x=1, y='c')
inheriting2.y = 'd'

class ModelWithAnnotatedValidator(BaseModel):
    name: str

    @field_validator('name')
    def noop_validator_with_annotations(self, name: str) -> str:
        # This is a mistake: the first argument to a validator is the class itself,
        # like a classmethod.
        self.instance_method()
# MYPY: error: Missing positional argument "self" in call to "instance_method" of "ModelWithAnnotatedValidator" [call-arg]
        return name

    def instance_method(self) -> None:
        ...
pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-plugin-strict_ini/plugin_success.py000066400000000000000000000136621474456633400301030ustar00rootroot00000000000000from dataclasses import InitVar from typing import Any, ClassVar, Generic, List, Optional, TypeVar, Union from typing_extensions import Self from pydantic import BaseModel, ConfigDict, Field, RootModel, create_model, field_validator, model_validator, validator from pydantic.dataclasses import dataclass class Model(BaseModel): x: float y: str model_config = ConfigDict(from_attributes=True) class SelfReferencingModel(BaseModel): submodel: Optional['SelfReferencingModel'] @property def prop(self) -> None: ... SelfReferencingModel.model_rebuild() model = Model(x=1, y='y') Model(x=1, y='y', z='z') # MYPY: error: Unexpected keyword argument "z" for "Model" [call-arg] model.x = 2 model.model_validate(model) self_referencing_model = SelfReferencingModel(submodel=SelfReferencingModel(submodel=None)) class KwargsModel(BaseModel, from_attributes=True): x: float y: str kwargs_model = KwargsModel(x=1, y='y') KwargsModel(x=1, y='y', z='z') # MYPY: error: Unexpected keyword argument "z" for "KwargsModel" [call-arg] kwargs_model.x = 2 kwargs_model.model_validate(kwargs_model.__dict__) class InheritingModel(Model): z: int = 1 InheritingModel.model_validate(model.__dict__) class ForwardReferencingModel(Model): future: 'FutureModel' class FutureModel(Model): pass ForwardReferencingModel.model_rebuild() future_model = FutureModel(x=1, y='a') forward_model = ForwardReferencingModel(x=1, y='a', future=future_model) class NoMutationModel(BaseModel): x: int model_config = ConfigDict(frozen=True) class MutationModel(NoMutationModel): a: int = 1 model_config = ConfigDict(frozen=False, from_attributes=True) MutationModel(x=1).x = 2 MutationModel.model_validate(model.__dict__) class KwargsNoMutationModel(BaseModel, frozen=True): x: int class KwargsMutationModel(KwargsNoMutationModel, frozen=False, from_attributes=True): a: int = 1 
KwargsMutationModel(x=1).x = 2 KwargsMutationModel.model_validate(model.__dict__) class OverrideModel(Model): x: int OverrideModel(x=1, y='b') class Mixin: def f(self) -> None: pass class MultiInheritanceModel(BaseModel, Mixin): pass MultiInheritanceModel().f() class AliasModel(BaseModel): x: str = Field(..., alias='y') alias_model = AliasModel(y='hello') assert alias_model.x == 'hello' class ClassVarModel(BaseModel): x: int y: ClassVar[int] = 1 ClassVarModel(x=1) @dataclass(config={'validate_assignment': True}) class AddProject: name: str slug: Optional[str] description: Optional[str] p = AddProject(name='x', slug='y', description='z') class TypeAliasAsAttribute(BaseModel): __type_alias_attribute__ = Union[str, bytes] class NestedModel(BaseModel): class Model(BaseModel): id: str model: Model _ = NestedModel.Model DynamicModel = create_model('DynamicModel', __base__=Model) dynamic_model = DynamicModel(x=1, y='y') dynamic_model.x = 2 class FrozenModel(BaseModel): x: int model_config = ConfigDict(frozen=True) class NotFrozenModel(FrozenModel): a: int = 1 model_config = ConfigDict(frozen=False, from_attributes=True) NotFrozenModel(x=1).x = 2 NotFrozenModel.model_validate(model.__dict__) class KwargsFrozenModel(BaseModel, frozen=True): x: int class KwargsNotFrozenModel(FrozenModel, frozen=False, from_attributes=True): a: int = 1 KwargsNotFrozenModel(x=1).x = 2 KwargsNotFrozenModel.model_validate(model.__dict__) class ModelWithSelfField(BaseModel): self: str def f(name: str) -> str: return name class ModelWithAllowReuseValidator(BaseModel): name: str normalize_name = field_validator('name')(f) model_with_allow_reuse_validator = ModelWithAllowReuseValidator(name='xyz') T = TypeVar('T') class Response(BaseModel, Generic[T]): data: T error: Optional[str] response = Response[Model](data=model, error=None) class ModelWithAnnotatedValidator(BaseModel): name: str @field_validator('name') def noop_validator_with_annotations(cls, name: str) -> str: return name def 
_default_factory_str() -> str: return 'x' def _default_factory_list() -> List[int]: return [1, 2, 3] class FieldDefaultTestingModel(BaseModel): # Required a: int b: int = Field() c: int = Field(...) # Default d: int = Field(1) # Default factory g: List[int] = Field(default_factory=_default_factory_list) h: str = Field(default_factory=_default_factory_str) i: str = Field(default_factory=lambda: 'test') _TModel = TypeVar('_TModel') _TType = TypeVar('_TType') class OrmMixin(Generic[_TModel, _TType]): @classmethod def from_orm(cls, model: _TModel) -> _TType: raise NotImplementedError @classmethod def from_orm_optional(cls, model: Optional[_TModel]) -> Optional[_TType]: if model is None: return None return cls.from_orm(model) @dataclass class MyDataClass: foo: InitVar[str] bar: str MyDataClass(foo='foo', bar='bar') def get_my_custom_validator(field_name: str) -> Any: @validator(field_name, allow_reuse=True) def my_custom_validator(cls: Any, v: int) -> int: return v return my_custom_validator def foo() -> None: class MyModel(BaseModel): number: int custom_validator = get_my_custom_validator('number') # type: ignore[pydantic-field] @model_validator(mode='before') @classmethod def validate_before(cls, values: Any) -> Any: return values @model_validator(mode='after') def validate_after(self) -> Self: return self MyModel(number=2) class InnerModel(BaseModel): my_var: Union[str, None] = Field(default=None) class OuterModel(InnerModel): pass m = OuterModel() if m.my_var is None: # In https://github.com/pydantic/pydantic/issues/7399, this was unreachable print('not unreachable') class Foo(BaseModel): pass class Bar(Foo, RootModel[int]): pass pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-plugin-strict_ini/plugin_success_baseConfig.py000066400000000000000000000106611474456633400322170ustar00rootroot00000000000000from typing import ClassVar, Generic, List, Optional, TypeVar, Union from pydantic import BaseModel, Field, create_model, field_validator from pydantic.dataclasses 
import dataclass class Model(BaseModel): x: float y: str model_config = dict(from_attributes=True) class NotConfig: frozen = True class SelfReferencingModel(BaseModel): submodel: Optional['SelfReferencingModel'] @property def prop(self) -> None: ... SelfReferencingModel.model_rebuild() model = Model(x=1, y='y') Model(x=1, y='y', z='z') # MYPY: error: Unexpected keyword argument "z" for "Model" [call-arg] model.x = 2 model.model_validate(model) self_referencing_model = SelfReferencingModel(submodel=SelfReferencingModel(submodel=None)) class KwargsModel(BaseModel, from_attributes=True): x: float y: str class NotConfig: frozen = True kwargs_model = KwargsModel(x=1, y='y') KwargsModel(x=1, y='y', z='z') # MYPY: error: Unexpected keyword argument "z" for "KwargsModel" [call-arg] kwargs_model.x = 2 kwargs_model.model_validate(kwargs_model.__dict__) class InheritingModel(Model): z: int = 1 InheritingModel.model_validate(model.__dict__) class ForwardReferencingModel(Model): future: 'FutureModel' class FutureModel(Model): pass ForwardReferencingModel.model_rebuild() future_model = FutureModel(x=1, y='a') forward_model = ForwardReferencingModel(x=1, y='a', future=future_model) class NoMutationModel(BaseModel): x: int model_config = dict(frozen=True) class MutationModel(NoMutationModel): a: int = 1 model_config = dict(frozen=False, from_attributes=True) MutationModel(x=1).x = 2 MutationModel.model_validate(model.__dict__) class KwargsNoMutationModel(BaseModel, frozen=True): x: int class KwargsMutationModel(KwargsNoMutationModel, frozen=False, from_attributes=True): a: int = 1 KwargsMutationModel(x=1).x = 2 KwargsMutationModel.model_validate(model.__dict__) class OverrideModel(Model): x: int OverrideModel(x=1, y='b') class Mixin: def f(self) -> None: pass class MultiInheritanceModel(BaseModel, Mixin): pass MultiInheritanceModel().f() class AliasModel(BaseModel): x: str = Field(..., alias='y') alias_model = AliasModel(y='hello') assert alias_model.x == 'hello' class 
ClassVarModel(BaseModel): x: int y: ClassVar[int] = 1 ClassVarModel(x=1) @dataclass(config=dict(validate_assignment=True)) class AddProject: name: str slug: Optional[str] description: Optional[str] p = AddProject(name='x', slug='y', description='z') class TypeAliasAsAttribute(BaseModel): __type_alias_attribute__ = Union[str, bytes] class NestedModel(BaseModel): class Model(BaseModel): id: str model: Model _ = NestedModel.Model DynamicModel = create_model('DynamicModel', __base__=Model) dynamic_model = DynamicModel(x=1, y='y') dynamic_model.x = 2 class FrozenModel(BaseModel): x: int model_config = dict(frozen=True) class NotFrozenModel(FrozenModel): a: int = 1 model_config = dict(frozen=False, from_attributes=True) NotFrozenModel(x=1).x = 2 NotFrozenModel.model_validate(model.__dict__) class KwargsFrozenModel(BaseModel, frozen=True): x: int class KwargsNotFrozenModel(FrozenModel, frozen=False, from_attributes=True): a: int = 1 KwargsNotFrozenModel(x=1).x = 2 KwargsNotFrozenModel.model_validate(model.__dict__) class ModelWithSelfField(BaseModel): self: str def f(name: str) -> str: return name class ModelWithAllowReuseValidator(BaseModel): name: str normalize_name = field_validator('name')(f) model_with_allow_reuse_validator = ModelWithAllowReuseValidator(name='xyz') T = TypeVar('T') class Response(BaseModel, Generic[T]): data: T error: Optional[str] response = Response[Model](data=model, error=None) class ModelWithAnnotatedValidator(BaseModel): name: str @field_validator('name') def noop_validator_with_annotations(cls, name: str) -> str: return name def _default_factory_str() -> str: return 'x' def _default_factory_list() -> List[int]: return [1, 2, 3] class FieldDefaultTestingModel(BaseModel): # Required a: int b: int = Field() c: int = Field(...) 
    # Default
    d: int = Field(1)
    # Default factory
    g: List[int] = Field(default_factory=_default_factory_list)
    h: str = Field(default_factory=_default_factory_str)
    i: str = Field(default_factory=lambda: 'test')


# === pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-plugin-very-strict_ini/metaclass_args.py ===
from pydantic import BaseModel, Field


class ConfigClassUsed(BaseModel):
    i: int = Field(2, alias='j')

    class Config:
        populate_by_name = True


ConfigClassUsed(i=None)
# MYPY: error: Argument "i" to "ConfigClassUsed" has incompatible type "None"; expected "int" [arg-type]


class MetaclassArgumentsNoDefault(BaseModel, populate_by_name=True):
    i: int = Field(alias='j')


MetaclassArgumentsNoDefault(i=None)
# MYPY: error: Argument "i" to "MetaclassArgumentsNoDefault" has incompatible type "None"; expected "int" [arg-type]


class MetaclassArgumentsWithDefault(BaseModel, populate_by_name=True):
    i: int = Field(2, alias='j')


MetaclassArgumentsWithDefault(i=None)
# MYPY: error: Argument "i" to "MetaclassArgumentsWithDefault" has incompatible type "None"; expected "int" [arg-type]


class NoArguments(BaseModel):
    i: int = Field(2, alias='j')


NoArguments(i=1)
# MYPY: error: Unexpected keyword argument "i" for "NoArguments" [call-arg]
NoArguments(j=None)
# MYPY: error: Argument "j" to "NoArguments" has incompatible type "None"; expected "int" [arg-type]


# === pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-plugin_ini/custom_constructor.py ===
from pydantic import BaseModel


class Person(BaseModel):
    id: int
    name: str
    birth_year: int

    def __init__(self, id: int) -> None:
# MYPY: note: "Person" defined here
        super().__init__(id=id, name='Patrick', birth_year=1991)


Person(1)
Person(id=1)
Person(name='Patrick')
# MYPY: error: Unexpected keyword argument "name" for "Person" [call-arg]


# === pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-plugin_ini/frozen_field.py ===
from pydantic import BaseModel, Field


class Foo(BaseModel):
    a: int = Field(default=1, frozen=True)


foo = Foo()
foo.a = 2
# MYPY: error: Property "a" defined in "Foo" is read-only [misc]


# === pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-plugin_ini/plugin_fail.py ===
from typing import Generic, List, Optional, Set, TypeVar, Union

from pydantic import BaseModel, ConfigDict, Extra, Field, field_validator
from pydantic.dataclasses import dataclass


class Model(BaseModel):
    model_config = ConfigDict(alias_generator=None, frozen=True, extra=Extra.forbid)

    x: int
    y: str

    def method(self) -> None:
        pass


model = Model(x=1, y='y', z='z')
# MYPY: error: Unexpected keyword argument "z" for "Model" [call-arg]
model = Model(x=1)
# MYPY: error: Missing named argument "y" for "Model" [call-arg]
model.y = 'a'
# MYPY: error: Property "y" defined in "Model" is read-only [misc]
Model.from_orm({})
# MYPY: error: "Model" does not have from_attributes=True [pydantic-orm]


class KwargsModel(BaseModel, alias_generator=None, frozen=True, extra=Extra.forbid):
    x: int
    y: str

    def method(self) -> None:
        pass


kwargs_model = KwargsModel(x=1, y='y', z='z')
# MYPY: error: Unexpected keyword argument "z" for "KwargsModel" [call-arg]
kwargs_model = KwargsModel(x=1)
# MYPY: error: Missing named argument "y" for "KwargsModel" [call-arg]
kwargs_model.y = 'a'
# MYPY: error: Property "y" defined in "KwargsModel" is read-only [misc]
KwargsModel.from_orm({})
# MYPY: error: "KwargsModel" does not have from_attributes=True [pydantic-orm]


class ForbidExtraModel(BaseModel):
    model_config =
ConfigDict(extra=Extra.forbid) ForbidExtraModel(x=1) # MYPY: error: Unexpected keyword argument "x" for "ForbidExtraModel" [call-arg] class KwargsForbidExtraModel(BaseModel, extra='forbid'): pass KwargsForbidExtraModel(x=1) # MYPY: error: Unexpected keyword argument "x" for "KwargsForbidExtraModel" [call-arg] class BadExtraModel(BaseModel): model_config = ConfigDict(extra=1) # type: ignore[typeddict-item] class KwargsBadExtraModel(BaseModel, extra=1): # MYPY: error: Invalid value for "Config.extra" [pydantic-config] pass class BadConfig1(BaseModel): model_config = ConfigDict(from_attributes={}) # type: ignore[typeddict-item] # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] # MYPY: note: Error code "pydantic-config" not covered by "type: ignore" comment class KwargsBadConfig1(BaseModel, from_attributes={}): # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] pass class BadConfig2(BaseModel): model_config = ConfigDict(from_attributes=list) # type: ignore[typeddict-item] # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] # MYPY: note: Error code "pydantic-config" not covered by "type: ignore" comment class KwargsBadConfig2(BaseModel, from_attributes=list): # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] pass class InheritingModel(Model): model_config = ConfigDict(frozen=False) class KwargsInheritingModel(KwargsModel, frozen=False): pass class DefaultTestingModel(BaseModel): # Required a: int b: int = ... # MYPY: error: Incompatible types in assignment (expression has type "EllipsisType", variable has type "int") [assignment] c: int = Field(...) d: Union[int, str] e = ... 
# MYPY: error: Untyped fields disallowed [pydantic-field] # Not required f: Optional[int] g: int = 1 h: int = Field(1) i: int = Field(None) # MYPY: error: Incompatible types in assignment (expression has type "None", variable has type "int") [assignment] j = 1 # MYPY: error: Untyped fields disallowed [pydantic-field] DefaultTestingModel() # MYPY: error: Missing named argument "a" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "b" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "c" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "d" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "f" for "DefaultTestingModel" [call-arg] class UndefinedAnnotationModel(BaseModel): undefined: Undefined # noqa F821 # MYPY: error: Name "Undefined" is not defined [name-defined] UndefinedAnnotationModel() # MYPY: error: Missing named argument "undefined" for "UndefinedAnnotationModel" [call-arg] Model.model_construct(x=1) # MYPY: error: Missing named argument "y" for "model_construct" of "Model" [call-arg] Model.model_construct(_fields_set={'x'}, x=1, y='2') Model.model_construct(x='1', y='2') # MYPY: error: Argument "x" to "model_construct" of "Model" has incompatible type "str"; expected "int" [arg-type] # Strict mode fails inheriting = InheritingModel(x='1', y='1') Model(x='1', y='2') class Blah(BaseModel): fields_set: Optional[Set[str]] = None # (comment to keep line numbers unchanged) T = TypeVar('T') class Response(BaseModel, Generic[T]): data: T error: Optional[str] response = Response[Model](data=model, error=None) response = Response[Model](data=1, error=None) class AliasModel(BaseModel): x: str = Field(..., alias='y') z: int AliasModel(y=1, z=2) x_alias = 'y' class DynamicAliasModel(BaseModel): x: str = Field(..., alias=x_alias) z: int DynamicAliasModel(y='y', z='1') class DynamicAliasModel2(BaseModel): x: str = Field(..., alias=x_alias) z: int model_config = 
ConfigDict(populate_by_name=True) DynamicAliasModel2(y='y', z=1) # MYPY: error: Missing named argument "x" for "DynamicAliasModel2" [call-arg] DynamicAliasModel2(x='y', z=1) class KwargsDynamicAliasModel(BaseModel, populate_by_name=True): x: str = Field(..., alias=x_alias) z: int KwargsDynamicAliasModel(y='y', z=1) # MYPY: error: Missing named argument "x" for "KwargsDynamicAliasModel" [call-arg] KwargsDynamicAliasModel(x='y', z=1) class AliasGeneratorModel(BaseModel): x: int model_config = ConfigDict(alias_generator=lambda x: x + '_') AliasGeneratorModel(x=1) AliasGeneratorModel(x_=1) AliasGeneratorModel(z=1) class AliasGeneratorModel2(BaseModel): x: int = Field(..., alias='y') model_config = ConfigDict(alias_generator=lambda x: x + '_') # type: ignore[pydantic-alias] # MYPY: error: Unused "type: ignore" comment [unused-ignore] class UntypedFieldModel(BaseModel): x: int = 1 y = 2 # MYPY: error: Untyped fields disallowed [pydantic-field] z = 2 # type: ignore[pydantic-field] AliasGeneratorModel2(x=1) AliasGeneratorModel2(y=1, z=1) class KwargsAliasGeneratorModel(BaseModel, alias_generator=lambda x: x + '_'): x: int KwargsAliasGeneratorModel(x=1) KwargsAliasGeneratorModel(x_=1) KwargsAliasGeneratorModel(z=1) class KwargsAliasGeneratorModel2(BaseModel, alias_generator=lambda x: x + '_'): x: int = Field(..., alias='y') KwargsAliasGeneratorModel2(x=1) KwargsAliasGeneratorModel2(y=1, z=1) class CoverageTester(Missing): # noqa F821 # MYPY: error: Name "Missing" is not defined [name-defined] def from_orm(self) -> None: pass CoverageTester().from_orm() @dataclass(config={}) class AddProject: name: str slug: Optional[str] description: Optional[str] p = AddProject(name='x', slug='y', description='z') # Same as Model, but with frozen = True class FrozenModel(BaseModel): x: int y: str model_config = ConfigDict(alias_generator=None, frozen=True, extra=Extra.forbid) frozenmodel = FrozenModel(x=1, y='b') frozenmodel.y = 'a' # MYPY: error: Property "y" defined in "FrozenModel" is 
read-only [misc] class InheritingModel2(FrozenModel): model_config = ConfigDict(frozen=False) inheriting2 = InheritingModel2(x=1, y='c') inheriting2.y = 'd' class ModelWithAnnotatedValidator(BaseModel): name: str @field_validator('name') def noop_validator_with_annotations(self, name: str) -> str: # This is a mistake: the first argument to a validator is the class itself, # like a classmethod. self.instance_method() # MYPY: error: Missing positional argument "self" in call to "instance_method" of "ModelWithAnnotatedValidator" [call-arg] return name def instance_method(self) -> None: ... pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-plugin_ini/plugin_fail_baseConfig.py000066400000000000000000000176411474456633400301610ustar00rootroot00000000000000from typing import Any, Generic, List, Optional, Set, TypeVar, Union from pydantic import BaseModel, Extra, Field, field_validator from pydantic.dataclasses import dataclass class Model(BaseModel): x: int y: str def method(self) -> None: pass class Config: alias_generator = None frozen = True extra = Extra.forbid def config_method(self) -> None: ... 
model = Model(x=1, y='y', z='z') # MYPY: error: Unexpected keyword argument "z" for "Model" [call-arg] model = Model(x=1) # MYPY: error: Missing named argument "y" for "Model" [call-arg] model.y = 'a' # MYPY: error: Property "y" defined in "Model" is read-only [misc] Model.from_orm({}) # MYPY: error: "Model" does not have from_attributes=True [pydantic-orm] class KwargsModel(BaseModel, alias_generator=None, frozen=True, extra=Extra.forbid): x: int y: str def method(self) -> None: pass kwargs_model = KwargsModel(x=1, y='y', z='z') # MYPY: error: Unexpected keyword argument "z" for "KwargsModel" [call-arg] kwargs_model = KwargsModel(x=1) # MYPY: error: Missing named argument "y" for "KwargsModel" [call-arg] kwargs_model.y = 'a' # MYPY: error: Property "y" defined in "KwargsModel" is read-only [misc] KwargsModel.from_orm({}) # MYPY: error: "KwargsModel" does not have from_attributes=True [pydantic-orm] class ForbidExtraModel(BaseModel): class Config: extra = 'forbid' ForbidExtraModel(x=1) # MYPY: error: Unexpected keyword argument "x" for "ForbidExtraModel" [call-arg] class KwargsForbidExtraModel(BaseModel, extra='forbid'): pass KwargsForbidExtraModel(x=1) # MYPY: error: Unexpected keyword argument "x" for "KwargsForbidExtraModel" [call-arg] class BadExtraModel(BaseModel): class Config: extra = 1 # type: ignore[pydantic-config] extra = 1 # MYPY: error: Invalid value for "Config.extra" [pydantic-config] class KwargsBadExtraModel(BaseModel, extra=1): # MYPY: error: Invalid value for "Config.extra" [pydantic-config] pass class BadConfig1(BaseModel): class Config: from_attributes: Any = {} # not sensible, but should still be handled gracefully # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] class KwargsBadConfig1(BaseModel, from_attributes={}): # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] pass class BadConfig2(BaseModel): class Config: from_attributes = list # not sensible, but should still be handled gracefully # 
MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] class KwargsBadConfig2(BaseModel, from_attributes=list): # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] pass class InheritingModel(Model): class Config: frozen = False class KwargsInheritingModel(KwargsModel, frozen=False): pass class DefaultTestingModel(BaseModel): # Required a: int b: int = ... # MYPY: error: Incompatible types in assignment (expression has type "EllipsisType", variable has type "int") [assignment] c: int = Field(...) d: Union[int, str] e = ... # MYPY: error: Untyped fields disallowed [pydantic-field] # Not required f: Optional[int] g: int = 1 h: int = Field(1) i: int = Field(None) # MYPY: error: Incompatible types in assignment (expression has type "None", variable has type "int") [assignment] j = 1 # MYPY: error: Untyped fields disallowed [pydantic-field] DefaultTestingModel() # MYPY: error: Missing named argument "a" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "b" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "c" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "d" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "f" for "DefaultTestingModel" [call-arg] class UndefinedAnnotationModel(BaseModel): undefined: Undefined # noqa F821 # MYPY: error: Name "Undefined" is not defined [name-defined] UndefinedAnnotationModel() # MYPY: error: Missing named argument "undefined" for "UndefinedAnnotationModel" [call-arg] Model.model_construct(x=1) # MYPY: error: Missing named argument "y" for "model_construct" of "Model" [call-arg] Model.model_construct(_fields_set={'x'}, x=1, y='2') Model.model_construct(x='1', y='2') # MYPY: error: Argument "x" to "model_construct" of "Model" has incompatible type "str"; expected "int" [arg-type] # Strict mode fails inheriting = InheritingModel(x='1', y='1') Model(x='1', y='2') class Blah(BaseModel): fields_set: 
Optional[Set[str]] = None # (comment to keep line numbers unchanged) T = TypeVar('T') class Response(BaseModel, Generic[T]): data: T error: Optional[str] response = Response[Model](data=model, error=None) response = Response[Model](data=1, error=None) class AliasModel(BaseModel): x: str = Field(..., alias='y') z: int AliasModel(y=1, z=2) x_alias = 'y' class DynamicAliasModel(BaseModel): x: str = Field(..., alias=x_alias) z: int DynamicAliasModel(y='y', z='1') class DynamicAliasModel2(BaseModel): x: str = Field(..., alias=x_alias) z: int class Config: populate_by_name = True DynamicAliasModel2(y='y', z=1) # MYPY: error: Missing named argument "x" for "DynamicAliasModel2" [call-arg] DynamicAliasModel2(x='y', z=1) class KwargsDynamicAliasModel(BaseModel, populate_by_name=True): x: str = Field(..., alias=x_alias) z: int KwargsDynamicAliasModel(y='y', z=1) # MYPY: error: Missing named argument "x" for "KwargsDynamicAliasModel" [call-arg] KwargsDynamicAliasModel(x='y', z=1) class AliasGeneratorModel(BaseModel): x: int class Config: alias_generator = lambda x: x + '_' # noqa E731 AliasGeneratorModel(x=1) AliasGeneratorModel(x_=1) AliasGeneratorModel(z=1) class AliasGeneratorModel2(BaseModel): x: int = Field(..., alias='y') class Config: # type: ignore[pydantic-alias] # MYPY: error: Unused "type: ignore" comment [unused-ignore] alias_generator = lambda x: x + '_' # noqa E731 class UntypedFieldModel(BaseModel): x: int = 1 y = 2 # MYPY: error: Untyped fields disallowed [pydantic-field] z = 2 # type: ignore[pydantic-field] AliasGeneratorModel2(x=1) AliasGeneratorModel2(y=1, z=1) class KwargsAliasGeneratorModel(BaseModel, alias_generator=lambda x: x + '_'): x: int KwargsAliasGeneratorModel(x=1) KwargsAliasGeneratorModel(x_=1) KwargsAliasGeneratorModel(z=1) class KwargsAliasGeneratorModel2(BaseModel, alias_generator=lambda x: x + '_'): x: int = Field(..., alias='y') KwargsAliasGeneratorModel2(x=1) KwargsAliasGeneratorModel2(y=1, z=1) class CoverageTester(Missing): # noqa F821 # 
MYPY: error: Name "Missing" is not defined [name-defined] def from_orm(self) -> None: pass CoverageTester().from_orm() @dataclass(config={}) class AddProject: name: str slug: Optional[str] description: Optional[str] p = AddProject(name='x', slug='y', description='z') # Same as Model, but with frozen = True class FrozenModel(BaseModel): x: int y: str class Config: alias_generator = None frozen = True extra = Extra.forbid frozenmodel = FrozenModel(x=1, y='b') frozenmodel.y = 'a' # MYPY: error: Property "y" defined in "FrozenModel" is read-only [misc] class InheritingModel2(FrozenModel): class Config: frozen = False inheriting2 = InheritingModel2(x=1, y='c') inheriting2.y = 'd' class ModelWithAnnotatedValidator(BaseModel): name: str @field_validator('name') def noop_validator_with_annotations(self, name: str) -> str: # This is a mistake: the first argument to a validator is the class itself, # like a classmethod. self.instance_method() # MYPY: error: Missing positional argument "self" in call to "instance_method" of "ModelWithAnnotatedValidator" [call-arg] return name def instance_method(self) -> None: ... 
# === pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-plugin_ini/plugin_optional_inheritance.py ===
from typing import Optional

from pydantic import BaseModel


class Foo(BaseModel):
    id: Optional[int]


class Bar(BaseModel):
    foo: Optional[Foo]


class Baz(Bar):
    name: str


b = Bar(foo={'id': 1})
assert b.foo.id == 1
# MYPY: error: Item "None" of "Optional[Foo]" has no attribute "id" [union-attr]

z = Baz(foo={'id': 1}, name='test')
assert z.foo.id == 1
# MYPY: error: Item "None" of "Optional[Foo]" has no attribute "id" [union-attr]


# === pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-plugin_ini/plugin_strict_fields.py ===
from pydantic import BaseModel, Field


class Model(BaseModel):
    a: int
    b: int = Field(strict=True)
    c: int = Field(strict=False)


# expected error: b
Model(a='1', b='2', c='3')
# MYPY: error: Argument "b" to "Model" has incompatible type "str"; expected "int" [arg-type]


class ModelStrictMode(BaseModel):
    model_config = {'strict': True}

    a: int
    b: int = Field(strict=True)
    c: int = Field(strict=False)


# expected error: a, b
ModelStrictMode(a='1', b='2', c='3')
# MYPY: error: Argument "a" to "ModelStrictMode" has incompatible type "str"; expected "int" [arg-type]
# MYPY: error: Argument "b" to "ModelStrictMode" has incompatible type "str"; expected "int" [arg-type]


class ModelOverride1(Model):
    b: int = Field(strict=False)
    c: int = Field(strict=True)


# expected error: c
ModelOverride1(a='1', b='2', c='3')
# MYPY: error: Argument "c" to "ModelOverride1" has incompatible type "str"; expected "int" [arg-type]


class ModelOverride2(ModelStrictMode):
    b: int = Field(strict=False)
    c: int = Field(strict=True)


# expected error: a, c
ModelOverride2(a='1', b='2', c='3')
# MYPY: error: Argument "a" to "ModelOverride2" has incompatible type "str"; expected "int" [arg-type]
# MYPY: error: Argument "c" to "ModelOverride2" has incompatible type "str"; expected "int" [arg-type]


class ModelOverrideStrictMode(ModelStrictMode):
    model_config = {'strict': False}


# expected error: b
ModelOverrideStrictMode(a='1', b='2', c='3')
# MYPY: error: Argument "b" to "ModelOverrideStrictMode" has incompatible type "str"; expected "int" [arg-type]


# === pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-plugin_ini/pydantic_settings.py ===
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    foo: str


s = Settings()
s = Settings(foo='test', _case_sensitive=True, _env_prefix='test__', _env_file='test')
s = Settings(foo='test', _case_sensitive=1, _env_prefix=2, _env_file=3)
# MYPY: error: Argument "_case_sensitive" to "Settings" has incompatible type "int"; expected "Optional[bool]" [arg-type]
# MYPY: error: Argument "_env_prefix" to "Settings" has incompatible type "int"; expected "Optional[str]" [arg-type]
# MYPY: error: Argument "_env_file" to "Settings" has incompatible type "int"; expected "Optional[Union[Path, str, Sequence[Union[Path, str]]]]" [arg-type]


class SettingsWithConfigDict(BaseSettings):
    bar: str

    model_config = SettingsConfigDict(env_file='.env', env_file_encoding='utf-8')


scd = SettingsWithConfigDict()


# === pydantic-2.10.6/tests/mypy/outputs/1.10.1/mypy-plugin_ini/root_models.py ===
from typing import List

from pydantic import RootModel


class Pets1(RootModel[List[str]]):
    pass


pets_construct = Pets1.model_construct(['dog'])

Pets2 = RootModel[List[str]]


class Pets3(RootModel):
# MYPY: error: Missing type parameters for generic type "RootModel" [type-arg]
    root: List[str]


pets1 = Pets1(['dog', 'cat'])
pets2 = Pets2(['dog', 'cat'])
pets3 = Pets3(['dog', 'cat'])


class Pets4(RootModel[List[str]]):
    pets: List[str]
# MYPY: error: Only `root` is allowed as a field of a `RootModel` [pydantic-field]
# === pydantic-2.10.6/tests/mypy/outputs/1.10.1/pyproject-default_toml/pydantic_settings.py ===
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    foo: str


s = Settings()
# MYPY: error: Missing named argument "foo" for "Settings" [call-arg]
s = Settings(foo='test', _case_sensitive=True, _env_prefix='test__', _env_file='test')
# MYPY: error: Unexpected keyword argument "_case_sensitive" for "Settings" [call-arg]
# MYPY: error: Unexpected keyword argument "_env_prefix" for "Settings" [call-arg]
# MYPY: error: Unexpected keyword argument "_env_file" for "Settings" [call-arg]
s = Settings(foo='test', _case_sensitive=1, _env_prefix=2, _env_file=3)
# MYPY: error: Unexpected keyword argument "_case_sensitive" for "Settings" [call-arg]
# MYPY: error: Unexpected keyword argument "_env_prefix" for "Settings" [call-arg]
# MYPY: error: Unexpected keyword argument "_env_file" for "Settings" [call-arg]


class SettingsWithConfigDict(BaseSettings):
    bar: str

    model_config = SettingsConfigDict(env_file='.env', env_file_encoding='utf-8')


scd = SettingsWithConfigDict()
# MYPY: error: Missing named argument "bar" for "SettingsWithConfigDict" [call-arg]


# === pydantic-2.10.6/tests/mypy/outputs/1.10.1/pyproject-default_toml/root_models.py ===
from typing import List

from pydantic import RootModel


class Pets1(RootModel[List[str]]):
    pass


pets_construct = Pets1.model_construct(['dog'])

Pets2 = RootModel[List[str]]


class Pets3(RootModel):
# MYPY: error: Missing type parameters for generic type "RootModel" [type-arg]
    root: List[str]


pets1 = Pets1(['dog', 'cat'])
pets2 = Pets2(['dog', 'cat'])
pets3 = Pets3(['dog', 'cat'])


class Pets4(RootModel[List[str]]):
    pets: List[str]
pydantic-2.10.6/tests/mypy/outputs/1.10.1/pyproject-plugin-strict-equality_toml/000077500000000000000000000000001474456633400275435ustar00rootroot00000000000000pydantic-2.10.6/tests/mypy/outputs/1.10.1/pyproject-plugin-strict-equality_toml/strict_equality.py000066400000000000000000000007001474456633400333370ustar00rootroot00000000000000from pydantic import BaseModel class User(BaseModel): username: str user = User(username='test') print(user == 'test') # MYPY: error: Non-overlapping equality check (left operand type: "User", right operand type: "Literal['test']") [comparison-overlap] print(user.username == int('1')) # MYPY: error: Non-overlapping equality check (left operand type: "str", right operand type: "int") [comparison-overlap] print(user.username == 'test') pydantic-2.10.6/tests/mypy/outputs/1.10.1/pyproject-plugin-strict_toml/000077500000000000000000000000001474456633400257105ustar00rootroot00000000000000pydantic-2.10.6/tests/mypy/outputs/1.10.1/pyproject-plugin-strict_toml/fail_defaults.py000066400000000000000000000014551474456633400310710ustar00rootroot00000000000000from pydantic import BaseModel, Field class Model(BaseModel): # Required undefined_default_no_args: int = Field() undefined_default: int = Field(description='my desc') positional_ellipsis_default: int = Field(...) named_ellipsis_default: int = Field(default=...) 
# Not required positional_default: int = Field(1) named_default: int = Field(default=2) named_default_factory: int = Field(default_factory=lambda: 3) Model() # MYPY: error: Missing named argument "undefined_default_no_args" for "Model" [call-arg] # MYPY: error: Missing named argument "undefined_default" for "Model" [call-arg] # MYPY: error: Missing named argument "positional_ellipsis_default" for "Model" [call-arg] # MYPY: error: Missing named argument "named_ellipsis_default" for "Model" [call-arg] pydantic-2.10.6/tests/mypy/outputs/1.10.1/pyproject-plugin-strict_toml/plugin_fail.py000066400000000000000000000223011474456633400305510ustar00rootroot00000000000000from typing import Generic, List, Optional, Set, TypeVar, Union from pydantic import BaseModel, ConfigDict, Extra, Field, field_validator from pydantic.dataclasses import dataclass class Model(BaseModel): model_config = ConfigDict(alias_generator=None, frozen=True, extra=Extra.forbid) x: int y: str def method(self) -> None: pass model = Model(x=1, y='y', z='z') # MYPY: error: Unexpected keyword argument "z" for "Model" [call-arg] model = Model(x=1) # MYPY: error: Missing named argument "y" for "Model" [call-arg] model.y = 'a' # MYPY: error: Property "y" defined in "Model" is read-only [misc] Model.from_orm({}) # MYPY: error: "Model" does not have from_attributes=True [pydantic-orm] class KwargsModel(BaseModel, alias_generator=None, frozen=True, extra=Extra.forbid): x: int y: str def method(self) -> None: pass kwargs_model = KwargsModel(x=1, y='y', z='z') # MYPY: error: Unexpected keyword argument "z" for "KwargsModel" [call-arg] kwargs_model = KwargsModel(x=1) # MYPY: error: Missing named argument "y" for "KwargsModel" [call-arg] kwargs_model.y = 'a' # MYPY: error: Property "y" defined in "KwargsModel" is read-only [misc] KwargsModel.from_orm({}) # MYPY: error: "KwargsModel" does not have from_attributes=True [pydantic-orm] class ForbidExtraModel(BaseModel): model_config = ConfigDict(extra=Extra.forbid) 
ForbidExtraModel(x=1) # MYPY: error: Unexpected keyword argument "x" for "ForbidExtraModel" [call-arg] class KwargsForbidExtraModel(BaseModel, extra='forbid'): pass KwargsForbidExtraModel(x=1) # MYPY: error: Unexpected keyword argument "x" for "KwargsForbidExtraModel" [call-arg] class BadExtraModel(BaseModel): model_config = ConfigDict(extra=1) # type: ignore[typeddict-item] class KwargsBadExtraModel(BaseModel, extra=1): # MYPY: error: Invalid value for "Config.extra" [pydantic-config] pass class BadConfig1(BaseModel): model_config = ConfigDict(from_attributes={}) # type: ignore[typeddict-item] # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] # MYPY: note: Error code "pydantic-config" not covered by "type: ignore" comment class KwargsBadConfig1(BaseModel, from_attributes={}): # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] pass class BadConfig2(BaseModel): model_config = ConfigDict(from_attributes=list) # type: ignore[typeddict-item] # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] # MYPY: note: Error code "pydantic-config" not covered by "type: ignore" comment class KwargsBadConfig2(BaseModel, from_attributes=list): # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] pass class InheritingModel(Model): model_config = ConfigDict(frozen=False) class KwargsInheritingModel(KwargsModel, frozen=False): pass class DefaultTestingModel(BaseModel): # Required a: int b: int = ... # MYPY: error: Incompatible types in assignment (expression has type "EllipsisType", variable has type "int") [assignment] c: int = Field(...) d: Union[int, str] e = ... 
# MYPY: error: Untyped fields disallowed [pydantic-field] # Not required f: Optional[int] g: int = 1 h: int = Field(1) i: int = Field(None) # MYPY: error: Incompatible types in assignment (expression has type "None", variable has type "int") [assignment] j = 1 # MYPY: error: Untyped fields disallowed [pydantic-field] DefaultTestingModel() # MYPY: error: Missing named argument "a" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "b" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "c" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "d" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "f" for "DefaultTestingModel" [call-arg] class UndefinedAnnotationModel(BaseModel): undefined: Undefined # noqa F821 # MYPY: error: Name "Undefined" is not defined [name-defined] UndefinedAnnotationModel() # MYPY: error: Missing named argument "undefined" for "UndefinedAnnotationModel" [call-arg] Model.model_construct(x=1) # MYPY: error: Missing named argument "y" for "model_construct" of "Model" [call-arg] Model.model_construct(_fields_set={'x'}, x=1, y='2') Model.model_construct(x='1', y='2') # MYPY: error: Argument "x" to "model_construct" of "Model" has incompatible type "str"; expected "int" [arg-type] # Strict mode fails inheriting = InheritingModel(x='1', y='1') # MYPY: error: Argument "x" to "InheritingModel" has incompatible type "str"; expected "int" [arg-type] Model(x='1', y='2') # MYPY: error: Argument "x" to "Model" has incompatible type "str"; expected "int" [arg-type] class Blah(BaseModel): fields_set: Optional[Set[str]] = None # (comment to keep line numbers unchanged) T = TypeVar('T') class Response(BaseModel, Generic[T]): data: T error: Optional[str] response = Response[Model](data=model, error=None) response = Response[Model](data=1, error=None) # MYPY: error: Argument "data" to "Response" has incompatible type "int"; expected "Model" [arg-type] class 
AliasModel(BaseModel): x: str = Field(..., alias='y') z: int AliasModel(y=1, z=2) # MYPY: error: Argument "y" to "AliasModel" has incompatible type "int"; expected "str" [arg-type] x_alias = 'y' class DynamicAliasModel(BaseModel): x: str = Field(..., alias=x_alias) # MYPY: error: Required dynamic aliases disallowed [pydantic-alias] z: int DynamicAliasModel(y='y', z='1') # MYPY: error: Argument "z" to "DynamicAliasModel" has incompatible type "str"; expected "int" [arg-type] class DynamicAliasModel2(BaseModel): x: str = Field(..., alias=x_alias) z: int model_config = ConfigDict(populate_by_name=True) DynamicAliasModel2(y='y', z=1) # MYPY: error: Unexpected keyword argument "y" for "DynamicAliasModel2" [call-arg] DynamicAliasModel2(x='y', z=1) class KwargsDynamicAliasModel(BaseModel, populate_by_name=True): x: str = Field(..., alias=x_alias) z: int KwargsDynamicAliasModel(y='y', z=1) # MYPY: error: Unexpected keyword argument "y" for "KwargsDynamicAliasModel" [call-arg] KwargsDynamicAliasModel(x='y', z=1) class AliasGeneratorModel(BaseModel): # MYPY: error: Required dynamic aliases disallowed [pydantic-alias] x: int model_config = ConfigDict(alias_generator=lambda x: x + '_') # MYPY: error: Required dynamic aliases disallowed [pydantic-alias] AliasGeneratorModel(x=1) AliasGeneratorModel(x_=1) AliasGeneratorModel(z=1) class AliasGeneratorModel2(BaseModel): # MYPY: error: Required dynamic aliases disallowed [pydantic-alias] x: int = Field(..., alias='y') model_config = ConfigDict(alias_generator=lambda x: x + '_') # type: ignore[pydantic-alias] class UntypedFieldModel(BaseModel): x: int = 1 y = 2 # MYPY: error: Untyped fields disallowed [pydantic-field] z = 2 # type: ignore[pydantic-field] AliasGeneratorModel2(x=1) # MYPY: error: Unexpected keyword argument "x" for "AliasGeneratorModel2" [call-arg] AliasGeneratorModel2(y=1, z=1) # MYPY: error: Unexpected keyword argument "z" for "AliasGeneratorModel2" [call-arg] class KwargsAliasGeneratorModel(BaseModel, 
alias_generator=lambda x: x + '_'): # MYPY: error: Required dynamic aliases disallowed [pydantic-alias] x: int # MYPY: error: Required dynamic aliases disallowed [pydantic-alias] KwargsAliasGeneratorModel(x=1) KwargsAliasGeneratorModel(x_=1) KwargsAliasGeneratorModel(z=1) class KwargsAliasGeneratorModel2(BaseModel, alias_generator=lambda x: x + '_'): # MYPY: error: Required dynamic aliases disallowed [pydantic-alias] x: int = Field(..., alias='y') # MYPY: error: Required dynamic aliases disallowed [pydantic-alias] KwargsAliasGeneratorModel2(x=1) # MYPY: error: Unexpected keyword argument "x" for "KwargsAliasGeneratorModel2" [call-arg] KwargsAliasGeneratorModel2(y=1, z=1) # MYPY: error: Unexpected keyword argument "z" for "KwargsAliasGeneratorModel2" [call-arg] class CoverageTester(Missing): # noqa F821 # MYPY: error: Name "Missing" is not defined [name-defined] def from_orm(self) -> None: pass CoverageTester().from_orm() @dataclass(config={}) class AddProject: name: str slug: Optional[str] description: Optional[str] p = AddProject(name='x', slug='y', description='z') # Same as Model, but with frozen = True class FrozenModel(BaseModel): x: int y: str model_config = ConfigDict(alias_generator=None, frozen=True, extra=Extra.forbid) frozenmodel = FrozenModel(x=1, y='b') frozenmodel.y = 'a' # MYPY: error: Property "y" defined in "FrozenModel" is read-only [misc] class InheritingModel2(FrozenModel): model_config = ConfigDict(frozen=False) inheriting2 = InheritingModel2(x=1, y='c') inheriting2.y = 'd' class ModelWithAnnotatedValidator(BaseModel): name: str @field_validator('name') def noop_validator_with_annotations(self, name: str) -> str: # This is a mistake: the first argument to a validator is the class itself, # like a classmethod. self.instance_method() # MYPY: error: Missing positional argument "self" in call to "instance_method" of "ModelWithAnnotatedValidator" [call-arg] return name def instance_method(self) -> None: ... 
pydantic-2.10.6/tests/mypy/outputs/1.10.1/pyproject-plugin-strict_toml/plugin_fail_baseConfig.py000066400000000000000000000223671474456633400327050ustar00rootroot00000000000000from typing import Any, Generic, List, Optional, Set, TypeVar, Union from pydantic import BaseModel, Extra, Field, field_validator from pydantic.dataclasses import dataclass class Model(BaseModel): x: int y: str def method(self) -> None: pass class Config: alias_generator = None frozen = True extra = Extra.forbid def config_method(self) -> None: ... model = Model(x=1, y='y', z='z') # MYPY: error: Unexpected keyword argument "z" for "Model" [call-arg] model = Model(x=1) # MYPY: error: Missing named argument "y" for "Model" [call-arg] model.y = 'a' # MYPY: error: Property "y" defined in "Model" is read-only [misc] Model.from_orm({}) # MYPY: error: "Model" does not have from_attributes=True [pydantic-orm] class KwargsModel(BaseModel, alias_generator=None, frozen=True, extra=Extra.forbid): x: int y: str def method(self) -> None: pass kwargs_model = KwargsModel(x=1, y='y', z='z') # MYPY: error: Unexpected keyword argument "z" for "KwargsModel" [call-arg] kwargs_model = KwargsModel(x=1) # MYPY: error: Missing named argument "y" for "KwargsModel" [call-arg] kwargs_model.y = 'a' # MYPY: error: Property "y" defined in "KwargsModel" is read-only [misc] KwargsModel.from_orm({}) # MYPY: error: "KwargsModel" does not have from_attributes=True [pydantic-orm] class ForbidExtraModel(BaseModel): class Config: extra = 'forbid' ForbidExtraModel(x=1) # MYPY: error: Unexpected keyword argument "x" for "ForbidExtraModel" [call-arg] class KwargsForbidExtraModel(BaseModel, extra='forbid'): pass KwargsForbidExtraModel(x=1) # MYPY: error: Unexpected keyword argument "x" for "KwargsForbidExtraModel" [call-arg] class BadExtraModel(BaseModel): class Config: extra = 1 # type: ignore[pydantic-config] extra = 1 # MYPY: error: Invalid value for "Config.extra" [pydantic-config] class KwargsBadExtraModel(BaseModel, extra=1): # 
MYPY: error: Invalid value for "Config.extra" [pydantic-config] pass class BadConfig1(BaseModel): class Config: from_attributes: Any = {} # not sensible, but should still be handled gracefully # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] class KwargsBadConfig1(BaseModel, from_attributes={}): # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] pass class BadConfig2(BaseModel): class Config: from_attributes = list # not sensible, but should still be handled gracefully # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] class KwargsBadConfig2(BaseModel, from_attributes=list): # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] pass class InheritingModel(Model): class Config: frozen = False class KwargsInheritingModel(KwargsModel, frozen=False): pass class DefaultTestingModel(BaseModel): # Required a: int b: int = ... # MYPY: error: Incompatible types in assignment (expression has type "EllipsisType", variable has type "int") [assignment] c: int = Field(...) d: Union[int, str] e = ... 
# MYPY: error: Untyped fields disallowed [pydantic-field] # Not required f: Optional[int] g: int = 1 h: int = Field(1) i: int = Field(None) # MYPY: error: Incompatible types in assignment (expression has type "None", variable has type "int") [assignment] j = 1 # MYPY: error: Untyped fields disallowed [pydantic-field] DefaultTestingModel() # MYPY: error: Missing named argument "a" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "b" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "c" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "d" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "f" for "DefaultTestingModel" [call-arg] class UndefinedAnnotationModel(BaseModel): undefined: Undefined # noqa F821 # MYPY: error: Name "Undefined" is not defined [name-defined] UndefinedAnnotationModel() # MYPY: error: Missing named argument "undefined" for "UndefinedAnnotationModel" [call-arg] Model.model_construct(x=1) # MYPY: error: Missing named argument "y" for "model_construct" of "Model" [call-arg] Model.model_construct(_fields_set={'x'}, x=1, y='2') Model.model_construct(x='1', y='2') # MYPY: error: Argument "x" to "model_construct" of "Model" has incompatible type "str"; expected "int" [arg-type] # Strict mode fails inheriting = InheritingModel(x='1', y='1') # MYPY: error: Argument "x" to "InheritingModel" has incompatible type "str"; expected "int" [arg-type] Model(x='1', y='2') # MYPY: error: Argument "x" to "Model" has incompatible type "str"; expected "int" [arg-type] class Blah(BaseModel): fields_set: Optional[Set[str]] = None # (comment to keep line numbers unchanged) T = TypeVar('T') class Response(BaseModel, Generic[T]): data: T error: Optional[str] response = Response[Model](data=model, error=None) response = Response[Model](data=1, error=None) # MYPY: error: Argument "data" to "Response" has incompatible type "int"; expected "Model" [arg-type] class 
AliasModel(BaseModel): x: str = Field(..., alias='y') z: int AliasModel(y=1, z=2) # MYPY: error: Argument "y" to "AliasModel" has incompatible type "int"; expected "str" [arg-type] x_alias = 'y' class DynamicAliasModel(BaseModel): x: str = Field(..., alias=x_alias) # MYPY: error: Required dynamic aliases disallowed [pydantic-alias] z: int DynamicAliasModel(y='y', z='1') # MYPY: error: Argument "z" to "DynamicAliasModel" has incompatible type "str"; expected "int" [arg-type] class DynamicAliasModel2(BaseModel): x: str = Field(..., alias=x_alias) z: int class Config: populate_by_name = True DynamicAliasModel2(y='y', z=1) # MYPY: error: Unexpected keyword argument "y" for "DynamicAliasModel2" [call-arg] DynamicAliasModel2(x='y', z=1) class KwargsDynamicAliasModel(BaseModel, populate_by_name=True): x: str = Field(..., alias=x_alias) z: int KwargsDynamicAliasModel(y='y', z=1) # MYPY: error: Unexpected keyword argument "y" for "KwargsDynamicAliasModel" [call-arg] KwargsDynamicAliasModel(x='y', z=1) class AliasGeneratorModel(BaseModel): # MYPY: error: Required dynamic aliases disallowed [pydantic-alias] x: int class Config: # MYPY: error: Required dynamic aliases disallowed [pydantic-alias] alias_generator = lambda x: x + '_' # noqa E731 AliasGeneratorModel(x=1) AliasGeneratorModel(x_=1) AliasGeneratorModel(z=1) class AliasGeneratorModel2(BaseModel): # MYPY: error: Required dynamic aliases disallowed [pydantic-alias] x: int = Field(..., alias='y') class Config: # type: ignore[pydantic-alias] alias_generator = lambda x: x + '_' # noqa E731 class UntypedFieldModel(BaseModel): x: int = 1 y = 2 # MYPY: error: Untyped fields disallowed [pydantic-field] z = 2 # type: ignore[pydantic-field] AliasGeneratorModel2(x=1) # MYPY: error: Unexpected keyword argument "x" for "AliasGeneratorModel2" [call-arg] AliasGeneratorModel2(y=1, z=1) # MYPY: error: Unexpected keyword argument "z" for "AliasGeneratorModel2" [call-arg] class KwargsAliasGeneratorModel(BaseModel, alias_generator=lambda 
x: x + '_'): # MYPY: error: Required dynamic aliases disallowed [pydantic-alias] x: int # MYPY: error: Required dynamic aliases disallowed [pydantic-alias] KwargsAliasGeneratorModel(x=1) KwargsAliasGeneratorModel(x_=1) KwargsAliasGeneratorModel(z=1) class KwargsAliasGeneratorModel2(BaseModel, alias_generator=lambda x: x + '_'): # MYPY: error: Required dynamic aliases disallowed [pydantic-alias] x: int = Field(..., alias='y') # MYPY: error: Required dynamic aliases disallowed [pydantic-alias] KwargsAliasGeneratorModel2(x=1) # MYPY: error: Unexpected keyword argument "x" for "KwargsAliasGeneratorModel2" [call-arg] KwargsAliasGeneratorModel2(y=1, z=1) # MYPY: error: Unexpected keyword argument "z" for "KwargsAliasGeneratorModel2" [call-arg] class CoverageTester(Missing): # noqa F821 # MYPY: error: Name "Missing" is not defined [name-defined] def from_orm(self) -> None: pass CoverageTester().from_orm() @dataclass(config={}) class AddProject: name: str slug: Optional[str] description: Optional[str] p = AddProject(name='x', slug='y', description='z') # Same as Model, but with frozen = True class FrozenModel(BaseModel): x: int y: str class Config: alias_generator = None frozen = True extra = Extra.forbid frozenmodel = FrozenModel(x=1, y='b') frozenmodel.y = 'a' # MYPY: error: Property "y" defined in "FrozenModel" is read-only [misc] class InheritingModel2(FrozenModel): class Config: frozen = False inheriting2 = InheritingModel2(x=1, y='c') inheriting2.y = 'd' class ModelWithAnnotatedValidator(BaseModel): name: str @field_validator('name') def noop_validator_with_annotations(self, name: str) -> str: # This is a mistake: the first argument to a validator is the class itself, # like a classmethod. self.instance_method() # MYPY: error: Missing positional argument "self" in call to "instance_method" of "ModelWithAnnotatedValidator" [call-arg] return name def instance_method(self) -> None: ... 
pydantic-2.10.6/tests/mypy/outputs/1.10.1/pyproject-plugin-strict_toml/plugin_success.py000066400000000000000000000136621474456633400313200ustar00rootroot00000000000000from dataclasses import InitVar from typing import Any, ClassVar, Generic, List, Optional, TypeVar, Union from typing_extensions import Self from pydantic import BaseModel, ConfigDict, Field, RootModel, create_model, field_validator, model_validator, validator from pydantic.dataclasses import dataclass class Model(BaseModel): x: float y: str model_config = ConfigDict(from_attributes=True) class SelfReferencingModel(BaseModel): submodel: Optional['SelfReferencingModel'] @property def prop(self) -> None: ... SelfReferencingModel.model_rebuild() model = Model(x=1, y='y') Model(x=1, y='y', z='z') # MYPY: error: Unexpected keyword argument "z" for "Model" [call-arg] model.x = 2 model.model_validate(model) self_referencing_model = SelfReferencingModel(submodel=SelfReferencingModel(submodel=None)) class KwargsModel(BaseModel, from_attributes=True): x: float y: str kwargs_model = KwargsModel(x=1, y='y') KwargsModel(x=1, y='y', z='z') # MYPY: error: Unexpected keyword argument "z" for "KwargsModel" [call-arg] kwargs_model.x = 2 kwargs_model.model_validate(kwargs_model.__dict__) class InheritingModel(Model): z: int = 1 InheritingModel.model_validate(model.__dict__) class ForwardReferencingModel(Model): future: 'FutureModel' class FutureModel(Model): pass ForwardReferencingModel.model_rebuild() future_model = FutureModel(x=1, y='a') forward_model = ForwardReferencingModel(x=1, y='a', future=future_model) class NoMutationModel(BaseModel): x: int model_config = ConfigDict(frozen=True) class MutationModel(NoMutationModel): a: int = 1 model_config = ConfigDict(frozen=False, from_attributes=True) MutationModel(x=1).x = 2 MutationModel.model_validate(model.__dict__) class KwargsNoMutationModel(BaseModel, frozen=True): x: int class KwargsMutationModel(KwargsNoMutationModel, frozen=False, from_attributes=True): a: int 
= 1 KwargsMutationModel(x=1).x = 2 KwargsMutationModel.model_validate(model.__dict__) class OverrideModel(Model): x: int OverrideModel(x=1, y='b') class Mixin: def f(self) -> None: pass class MultiInheritanceModel(BaseModel, Mixin): pass MultiInheritanceModel().f() class AliasModel(BaseModel): x: str = Field(..., alias='y') alias_model = AliasModel(y='hello') assert alias_model.x == 'hello' class ClassVarModel(BaseModel): x: int y: ClassVar[int] = 1 ClassVarModel(x=1) @dataclass(config={'validate_assignment': True}) class AddProject: name: str slug: Optional[str] description: Optional[str] p = AddProject(name='x', slug='y', description='z') class TypeAliasAsAttribute(BaseModel): __type_alias_attribute__ = Union[str, bytes] class NestedModel(BaseModel): class Model(BaseModel): id: str model: Model _ = NestedModel.Model DynamicModel = create_model('DynamicModel', __base__=Model) dynamic_model = DynamicModel(x=1, y='y') dynamic_model.x = 2 class FrozenModel(BaseModel): x: int model_config = ConfigDict(frozen=True) class NotFrozenModel(FrozenModel): a: int = 1 model_config = ConfigDict(frozen=False, from_attributes=True) NotFrozenModel(x=1).x = 2 NotFrozenModel.model_validate(model.__dict__) class KwargsFrozenModel(BaseModel, frozen=True): x: int class KwargsNotFrozenModel(FrozenModel, frozen=False, from_attributes=True): a: int = 1 KwargsNotFrozenModel(x=1).x = 2 KwargsNotFrozenModel.model_validate(model.__dict__) class ModelWithSelfField(BaseModel): self: str def f(name: str) -> str: return name class ModelWithAllowReuseValidator(BaseModel): name: str normalize_name = field_validator('name')(f) model_with_allow_reuse_validator = ModelWithAllowReuseValidator(name='xyz') T = TypeVar('T') class Response(BaseModel, Generic[T]): data: T error: Optional[str] response = Response[Model](data=model, error=None) class ModelWithAnnotatedValidator(BaseModel): name: str @field_validator('name') def noop_validator_with_annotations(cls, name: str) -> str: return name def 
_default_factory_str() -> str: return 'x' def _default_factory_list() -> List[int]: return [1, 2, 3] class FieldDefaultTestingModel(BaseModel): # Required a: int b: int = Field() c: int = Field(...) # Default d: int = Field(1) # Default factory g: List[int] = Field(default_factory=_default_factory_list) h: str = Field(default_factory=_default_factory_str) i: str = Field(default_factory=lambda: 'test') _TModel = TypeVar('_TModel') _TType = TypeVar('_TType') class OrmMixin(Generic[_TModel, _TType]): @classmethod def from_orm(cls, model: _TModel) -> _TType: raise NotImplementedError @classmethod def from_orm_optional(cls, model: Optional[_TModel]) -> Optional[_TType]: if model is None: return None return cls.from_orm(model) @dataclass class MyDataClass: foo: InitVar[str] bar: str MyDataClass(foo='foo', bar='bar') def get_my_custom_validator(field_name: str) -> Any: @validator(field_name, allow_reuse=True) def my_custom_validator(cls: Any, v: int) -> int: return v return my_custom_validator def foo() -> None: class MyModel(BaseModel): number: int custom_validator = get_my_custom_validator('number') # type: ignore[pydantic-field] @model_validator(mode='before') @classmethod def validate_before(cls, values: Any) -> Any: return values @model_validator(mode='after') def validate_after(self) -> Self: return self MyModel(number=2) class InnerModel(BaseModel): my_var: Union[str, None] = Field(default=None) class OuterModel(InnerModel): pass m = OuterModel() if m.my_var is None: # In https://github.com/pydantic/pydantic/issues/7399, this was unreachable print('not unreachable') class Foo(BaseModel): pass class Bar(Foo, RootModel[int]): pass pydantic-2.10.6/tests/mypy/outputs/1.10.1/pyproject-plugin-strict_toml/plugin_success_baseConfig.py000066400000000000000000000106611474456633400334340ustar00rootroot00000000000000from typing import ClassVar, Generic, List, Optional, TypeVar, Union from pydantic import BaseModel, Field, create_model, field_validator from pydantic.dataclasses 
import dataclass class Model(BaseModel): x: float y: str model_config = dict(from_attributes=True) class NotConfig: frozen = True class SelfReferencingModel(BaseModel): submodel: Optional['SelfReferencingModel'] @property def prop(self) -> None: ... SelfReferencingModel.model_rebuild() model = Model(x=1, y='y') Model(x=1, y='y', z='z') # MYPY: error: Unexpected keyword argument "z" for "Model" [call-arg] model.x = 2 model.model_validate(model) self_referencing_model = SelfReferencingModel(submodel=SelfReferencingModel(submodel=None)) class KwargsModel(BaseModel, from_attributes=True): x: float y: str class NotConfig: frozen = True kwargs_model = KwargsModel(x=1, y='y') KwargsModel(x=1, y='y', z='z') # MYPY: error: Unexpected keyword argument "z" for "KwargsModel" [call-arg] kwargs_model.x = 2 kwargs_model.model_validate(kwargs_model.__dict__) class InheritingModel(Model): z: int = 1 InheritingModel.model_validate(model.__dict__) class ForwardReferencingModel(Model): future: 'FutureModel' class FutureModel(Model): pass ForwardReferencingModel.model_rebuild() future_model = FutureModel(x=1, y='a') forward_model = ForwardReferencingModel(x=1, y='a', future=future_model) class NoMutationModel(BaseModel): x: int model_config = dict(frozen=True) class MutationModel(NoMutationModel): a: int = 1 model_config = dict(frozen=False, from_attributes=True) MutationModel(x=1).x = 2 MutationModel.model_validate(model.__dict__) class KwargsNoMutationModel(BaseModel, frozen=True): x: int class KwargsMutationModel(KwargsNoMutationModel, frozen=False, from_attributes=True): a: int = 1 KwargsMutationModel(x=1).x = 2 KwargsMutationModel.model_validate(model.__dict__) class OverrideModel(Model): x: int OverrideModel(x=1, y='b') class Mixin: def f(self) -> None: pass class MultiInheritanceModel(BaseModel, Mixin): pass MultiInheritanceModel().f() class AliasModel(BaseModel): x: str = Field(..., alias='y') alias_model = AliasModel(y='hello') assert alias_model.x == 'hello' class 
ClassVarModel(BaseModel): x: int y: ClassVar[int] = 1 ClassVarModel(x=1) @dataclass(config=dict(validate_assignment=True)) class AddProject: name: str slug: Optional[str] description: Optional[str] p = AddProject(name='x', slug='y', description='z') class TypeAliasAsAttribute(BaseModel): __type_alias_attribute__ = Union[str, bytes] class NestedModel(BaseModel): class Model(BaseModel): id: str model: Model _ = NestedModel.Model DynamicModel = create_model('DynamicModel', __base__=Model) dynamic_model = DynamicModel(x=1, y='y') dynamic_model.x = 2 class FrozenModel(BaseModel): x: int model_config = dict(frozen=True) class NotFrozenModel(FrozenModel): a: int = 1 model_config = dict(frozen=False, from_attributes=True) NotFrozenModel(x=1).x = 2 NotFrozenModel.model_validate(model.__dict__) class KwargsFrozenModel(BaseModel, frozen=True): x: int class KwargsNotFrozenModel(FrozenModel, frozen=False, from_attributes=True): a: int = 1 KwargsNotFrozenModel(x=1).x = 2 KwargsNotFrozenModel.model_validate(model.__dict__) class ModelWithSelfField(BaseModel): self: str def f(name: str) -> str: return name class ModelWithAllowReuseValidator(BaseModel): name: str normalize_name = field_validator('name')(f) model_with_allow_reuse_validator = ModelWithAllowReuseValidator(name='xyz') T = TypeVar('T') class Response(BaseModel, Generic[T]): data: T error: Optional[str] response = Response[Model](data=model, error=None) class ModelWithAnnotatedValidator(BaseModel): name: str @field_validator('name') def noop_validator_with_annotations(cls, name: str) -> str: return name def _default_factory_str() -> str: return 'x' def _default_factory_list() -> List[int]: return [1, 2, 3] class FieldDefaultTestingModel(BaseModel): # Required a: int b: int = Field() c: int = Field(...) 
# Default d: int = Field(1) # Default factory g: List[int] = Field(default_factory=_default_factory_list) h: str = Field(default_factory=_default_factory_str) i: str = Field(default_factory=lambda: 'test') pydantic-2.10.6/tests/mypy/outputs/1.10.1/pyproject-plugin_toml/000077500000000000000000000000001474456633400244025ustar00rootroot00000000000000pydantic-2.10.6/tests/mypy/outputs/1.10.1/pyproject-plugin_toml/plugin_fail.py000066400000000000000000000175531474456633400272600ustar00rootroot00000000000000from typing import Generic, List, Optional, Set, TypeVar, Union from pydantic import BaseModel, ConfigDict, Extra, Field, field_validator from pydantic.dataclasses import dataclass class Model(BaseModel): model_config = ConfigDict(alias_generator=None, frozen=True, extra=Extra.forbid) x: int y: str def method(self) -> None: pass model = Model(x=1, y='y', z='z') # MYPY: error: Unexpected keyword argument "z" for "Model" [call-arg] model = Model(x=1) # MYPY: error: Missing named argument "y" for "Model" [call-arg] model.y = 'a' # MYPY: error: Property "y" defined in "Model" is read-only [misc] Model.from_orm({}) # MYPY: error: "Model" does not have from_attributes=True [pydantic-orm] class KwargsModel(BaseModel, alias_generator=None, frozen=True, extra=Extra.forbid): x: int y: str def method(self) -> None: pass kwargs_model = KwargsModel(x=1, y='y', z='z') # MYPY: error: Unexpected keyword argument "z" for "KwargsModel" [call-arg] kwargs_model = KwargsModel(x=1) # MYPY: error: Missing named argument "y" for "KwargsModel" [call-arg] kwargs_model.y = 'a' # MYPY: error: Property "y" defined in "KwargsModel" is read-only [misc] KwargsModel.from_orm({}) # MYPY: error: "KwargsModel" does not have from_attributes=True [pydantic-orm] class ForbidExtraModel(BaseModel): model_config = ConfigDict(extra=Extra.forbid) ForbidExtraModel(x=1) # MYPY: error: Unexpected keyword argument "x" for "ForbidExtraModel" [call-arg] class KwargsForbidExtraModel(BaseModel, extra='forbid'): pass 
KwargsForbidExtraModel(x=1) # MYPY: error: Unexpected keyword argument "x" for "KwargsForbidExtraModel" [call-arg] class BadExtraModel(BaseModel): model_config = ConfigDict(extra=1) # type: ignore[typeddict-item] class KwargsBadExtraModel(BaseModel, extra=1): # MYPY: error: Invalid value for "Config.extra" [pydantic-config] pass class BadConfig1(BaseModel): model_config = ConfigDict(from_attributes={}) # type: ignore[typeddict-item] # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] # MYPY: note: Error code "pydantic-config" not covered by "type: ignore" comment class KwargsBadConfig1(BaseModel, from_attributes={}): # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] pass class BadConfig2(BaseModel): model_config = ConfigDict(from_attributes=list) # type: ignore[typeddict-item] # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] # MYPY: note: Error code "pydantic-config" not covered by "type: ignore" comment class KwargsBadConfig2(BaseModel, from_attributes=list): # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] pass class InheritingModel(Model): model_config = ConfigDict(frozen=False) class KwargsInheritingModel(KwargsModel, frozen=False): pass class DefaultTestingModel(BaseModel): # Required a: int b: int = ... # MYPY: error: Incompatible types in assignment (expression has type "EllipsisType", variable has type "int") [assignment] c: int = Field(...) d: Union[int, str] e = ... 
# MYPY: error: Untyped fields disallowed [pydantic-field] # Not required f: Optional[int] g: int = 1 h: int = Field(1) i: int = Field(None) # MYPY: error: Incompatible types in assignment (expression has type "None", variable has type "int") [assignment] j = 1 # MYPY: error: Untyped fields disallowed [pydantic-field] DefaultTestingModel() # MYPY: error: Missing named argument "a" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "b" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "c" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "d" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "f" for "DefaultTestingModel" [call-arg] class UndefinedAnnotationModel(BaseModel): undefined: Undefined # noqa F821 # MYPY: error: Name "Undefined" is not defined [name-defined] UndefinedAnnotationModel() # MYPY: error: Missing named argument "undefined" for "UndefinedAnnotationModel" [call-arg] Model.model_construct(x=1) # MYPY: error: Missing named argument "y" for "model_construct" of "Model" [call-arg] Model.model_construct(_fields_set={'x'}, x=1, y='2') Model.model_construct(x='1', y='2') # MYPY: error: Argument "x" to "model_construct" of "Model" has incompatible type "str"; expected "int" [arg-type] # Strict mode fails inheriting = InheritingModel(x='1', y='1') Model(x='1', y='2') class Blah(BaseModel): fields_set: Optional[Set[str]] = None # (comment to keep line numbers unchanged) T = TypeVar('T') class Response(BaseModel, Generic[T]): data: T error: Optional[str] response = Response[Model](data=model, error=None) response = Response[Model](data=1, error=None) class AliasModel(BaseModel): x: str = Field(..., alias='y') z: int AliasModel(y=1, z=2) x_alias = 'y' class DynamicAliasModel(BaseModel): x: str = Field(..., alias=x_alias) z: int DynamicAliasModel(y='y', z='1') class DynamicAliasModel2(BaseModel): x: str = Field(..., alias=x_alias) z: int model_config = 
ConfigDict(populate_by_name=True) DynamicAliasModel2(y='y', z=1) # MYPY: error: Missing named argument "x" for "DynamicAliasModel2" [call-arg] DynamicAliasModel2(x='y', z=1) class KwargsDynamicAliasModel(BaseModel, populate_by_name=True): x: str = Field(..., alias=x_alias) z: int KwargsDynamicAliasModel(y='y', z=1) # MYPY: error: Missing named argument "x" for "KwargsDynamicAliasModel" [call-arg] KwargsDynamicAliasModel(x='y', z=1) class AliasGeneratorModel(BaseModel): x: int model_config = ConfigDict(alias_generator=lambda x: x + '_') AliasGeneratorModel(x=1) AliasGeneratorModel(x_=1) AliasGeneratorModel(z=1) class AliasGeneratorModel2(BaseModel): x: int = Field(..., alias='y') model_config = ConfigDict(alias_generator=lambda x: x + '_') # type: ignore[pydantic-alias] # MYPY: error: Unused "type: ignore" comment [unused-ignore] class UntypedFieldModel(BaseModel): x: int = 1 y = 2 # MYPY: error: Untyped fields disallowed [pydantic-field] z = 2 # type: ignore[pydantic-field] AliasGeneratorModel2(x=1) AliasGeneratorModel2(y=1, z=1) class KwargsAliasGeneratorModel(BaseModel, alias_generator=lambda x: x + '_'): x: int KwargsAliasGeneratorModel(x=1) KwargsAliasGeneratorModel(x_=1) KwargsAliasGeneratorModel(z=1) class KwargsAliasGeneratorModel2(BaseModel, alias_generator=lambda x: x + '_'): x: int = Field(..., alias='y') KwargsAliasGeneratorModel2(x=1) KwargsAliasGeneratorModel2(y=1, z=1) class CoverageTester(Missing): # noqa F821 # MYPY: error: Name "Missing" is not defined [name-defined] def from_orm(self) -> None: pass CoverageTester().from_orm() @dataclass(config={}) class AddProject: name: str slug: Optional[str] description: Optional[str] p = AddProject(name='x', slug='y', description='z') # Same as Model, but with frozen = True class FrozenModel(BaseModel): x: int y: str model_config = ConfigDict(alias_generator=None, frozen=True, extra=Extra.forbid) frozenmodel = FrozenModel(x=1, y='b') frozenmodel.y = 'a' # MYPY: error: Property "y" defined in "FrozenModel" is 
read-only [misc] class InheritingModel2(FrozenModel): model_config = ConfigDict(frozen=False) inheriting2 = InheritingModel2(x=1, y='c') inheriting2.y = 'd' class ModelWithAnnotatedValidator(BaseModel): name: str @field_validator('name') def noop_validator_with_annotations(self, name: str) -> str: # This is a mistake: the first argument to a validator is the class itself, # like a classmethod. self.instance_method() # MYPY: error: Missing positional argument "self" in call to "instance_method" of "ModelWithAnnotatedValidator" [call-arg] return name def instance_method(self) -> None: ... pydantic-2.10.6/tests/mypy/outputs/1.10.1/pyproject-plugin_toml/plugin_fail_baseConfig.py000066400000000000000000000176411474456633400313760ustar00rootroot00000000000000from typing import Any, Generic, List, Optional, Set, TypeVar, Union from pydantic import BaseModel, Extra, Field, field_validator from pydantic.dataclasses import dataclass class Model(BaseModel): x: int y: str def method(self) -> None: pass class Config: alias_generator = None frozen = True extra = Extra.forbid def config_method(self) -> None: ... 
model = Model(x=1, y='y', z='z') # MYPY: error: Unexpected keyword argument "z" for "Model" [call-arg] model = Model(x=1) # MYPY: error: Missing named argument "y" for "Model" [call-arg] model.y = 'a' # MYPY: error: Property "y" defined in "Model" is read-only [misc] Model.from_orm({}) # MYPY: error: "Model" does not have from_attributes=True [pydantic-orm] class KwargsModel(BaseModel, alias_generator=None, frozen=True, extra=Extra.forbid): x: int y: str def method(self) -> None: pass kwargs_model = KwargsModel(x=1, y='y', z='z') # MYPY: error: Unexpected keyword argument "z" for "KwargsModel" [call-arg] kwargs_model = KwargsModel(x=1) # MYPY: error: Missing named argument "y" for "KwargsModel" [call-arg] kwargs_model.y = 'a' # MYPY: error: Property "y" defined in "KwargsModel" is read-only [misc] KwargsModel.from_orm({}) # MYPY: error: "KwargsModel" does not have from_attributes=True [pydantic-orm] class ForbidExtraModel(BaseModel): class Config: extra = 'forbid' ForbidExtraModel(x=1) # MYPY: error: Unexpected keyword argument "x" for "ForbidExtraModel" [call-arg] class KwargsForbidExtraModel(BaseModel, extra='forbid'): pass KwargsForbidExtraModel(x=1) # MYPY: error: Unexpected keyword argument "x" for "KwargsForbidExtraModel" [call-arg] class BadExtraModel(BaseModel): class Config: extra = 1 # type: ignore[pydantic-config] extra = 1 # MYPY: error: Invalid value for "Config.extra" [pydantic-config] class KwargsBadExtraModel(BaseModel, extra=1): # MYPY: error: Invalid value for "Config.extra" [pydantic-config] pass class BadConfig1(BaseModel): class Config: from_attributes: Any = {} # not sensible, but should still be handled gracefully # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] class KwargsBadConfig1(BaseModel, from_attributes={}): # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] pass class BadConfig2(BaseModel): class Config: from_attributes = list # not sensible, but should still be handled gracefully # 
MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] class KwargsBadConfig2(BaseModel, from_attributes=list): # MYPY: error: Invalid value for "Config.from_attributes" [pydantic-config] pass class InheritingModel(Model): class Config: frozen = False class KwargsInheritingModel(KwargsModel, frozen=False): pass class DefaultTestingModel(BaseModel): # Required a: int b: int = ... # MYPY: error: Incompatible types in assignment (expression has type "EllipsisType", variable has type "int") [assignment] c: int = Field(...) d: Union[int, str] e = ... # MYPY: error: Untyped fields disallowed [pydantic-field] # Not required f: Optional[int] g: int = 1 h: int = Field(1) i: int = Field(None) # MYPY: error: Incompatible types in assignment (expression has type "None", variable has type "int") [assignment] j = 1 # MYPY: error: Untyped fields disallowed [pydantic-field] DefaultTestingModel() # MYPY: error: Missing named argument "a" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "b" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "c" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "d" for "DefaultTestingModel" [call-arg] # MYPY: error: Missing named argument "f" for "DefaultTestingModel" [call-arg] class UndefinedAnnotationModel(BaseModel): undefined: Undefined # noqa F821 # MYPY: error: Name "Undefined" is not defined [name-defined] UndefinedAnnotationModel() # MYPY: error: Missing named argument "undefined" for "UndefinedAnnotationModel" [call-arg] Model.model_construct(x=1) # MYPY: error: Missing named argument "y" for "model_construct" of "Model" [call-arg] Model.model_construct(_fields_set={'x'}, x=1, y='2') Model.model_construct(x='1', y='2') # MYPY: error: Argument "x" to "model_construct" of "Model" has incompatible type "str"; expected "int" [arg-type] # Strict mode fails inheriting = InheritingModel(x='1', y='1') Model(x='1', y='2') class Blah(BaseModel): fields_set: 
Optional[Set[str]] = None # (comment to keep line numbers unchanged) T = TypeVar('T') class Response(BaseModel, Generic[T]): data: T error: Optional[str] response = Response[Model](data=model, error=None) response = Response[Model](data=1, error=None) class AliasModel(BaseModel): x: str = Field(..., alias='y') z: int AliasModel(y=1, z=2) x_alias = 'y' class DynamicAliasModel(BaseModel): x: str = Field(..., alias=x_alias) z: int DynamicAliasModel(y='y', z='1') class DynamicAliasModel2(BaseModel): x: str = Field(..., alias=x_alias) z: int class Config: populate_by_name = True DynamicAliasModel2(y='y', z=1) # MYPY: error: Missing named argument "x" for "DynamicAliasModel2" [call-arg] DynamicAliasModel2(x='y', z=1) class KwargsDynamicAliasModel(BaseModel, populate_by_name=True): x: str = Field(..., alias=x_alias) z: int KwargsDynamicAliasModel(y='y', z=1) # MYPY: error: Missing named argument "x" for "KwargsDynamicAliasModel" [call-arg] KwargsDynamicAliasModel(x='y', z=1) class AliasGeneratorModel(BaseModel): x: int class Config: alias_generator = lambda x: x + '_' # noqa E731 AliasGeneratorModel(x=1) AliasGeneratorModel(x_=1) AliasGeneratorModel(z=1) class AliasGeneratorModel2(BaseModel): x: int = Field(..., alias='y') class Config: # type: ignore[pydantic-alias] # MYPY: error: Unused "type: ignore" comment [unused-ignore] alias_generator = lambda x: x + '_' # noqa E731 class UntypedFieldModel(BaseModel): x: int = 1 y = 2 # MYPY: error: Untyped fields disallowed [pydantic-field] z = 2 # type: ignore[pydantic-field] AliasGeneratorModel2(x=1) AliasGeneratorModel2(y=1, z=1) class KwargsAliasGeneratorModel(BaseModel, alias_generator=lambda x: x + '_'): x: int KwargsAliasGeneratorModel(x=1) KwargsAliasGeneratorModel(x_=1) KwargsAliasGeneratorModel(z=1) class KwargsAliasGeneratorModel2(BaseModel, alias_generator=lambda x: x + '_'): x: int = Field(..., alias='y') KwargsAliasGeneratorModel2(x=1) KwargsAliasGeneratorModel2(y=1, z=1) class CoverageTester(Missing): # noqa F821 # 
MYPY: error: Name "Missing" is not defined [name-defined] def from_orm(self) -> None: pass CoverageTester().from_orm() @dataclass(config={}) class AddProject: name: str slug: Optional[str] description: Optional[str] p = AddProject(name='x', slug='y', description='z') # Same as Model, but with frozen = True class FrozenModel(BaseModel): x: int y: str class Config: alias_generator = None frozen = True extra = Extra.forbid frozenmodel = FrozenModel(x=1, y='b') frozenmodel.y = 'a' # MYPY: error: Property "y" defined in "FrozenModel" is read-only [misc] class InheritingModel2(FrozenModel): class Config: frozen = False inheriting2 = InheritingModel2(x=1, y='c') inheriting2.y = 'd' class ModelWithAnnotatedValidator(BaseModel): name: str @field_validator('name') def noop_validator_with_annotations(self, name: str) -> str: # This is a mistake: the first argument to a validator is the class itself, # like a classmethod. self.instance_method() # MYPY: error: Missing positional argument "self" in call to "instance_method" of "ModelWithAnnotatedValidator" [call-arg] return name def instance_method(self) -> None: ... 
# ---- pydantic-2.10.6/tests/mypy/outputs/1.10.1/pyproject-plugin_toml/pydantic_settings.py ----
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    foo: str


s = Settings()

s = Settings(foo='test', _case_sensitive=True, _env_prefix='test__', _env_file='test')

s = Settings(foo='test', _case_sensitive=1, _env_prefix=2, _env_file=3)
# MYPY: error: Argument "_case_sensitive" to "Settings" has incompatible type "int"; expected "Optional[bool]" [arg-type]
# MYPY: error: Argument "_env_prefix" to "Settings" has incompatible type "int"; expected "Optional[str]" [arg-type]
# MYPY: error: Argument "_env_file" to "Settings" has incompatible type "int"; expected "Optional[Union[Path, str, Sequence[Union[Path, str]]]]" [arg-type]


class SettingsWithConfigDict(BaseSettings):
    bar: str

    model_config = SettingsConfigDict(env_file='.env', env_file_encoding='utf-8')


scd = SettingsWithConfigDict()


# ---- pydantic-2.10.6/tests/mypy/test_mypy.py ----
from __future__ import annotations

import dataclasses
import importlib
import os
import re
import sys
from bisect import insort
from collections.abc import Collection
from pathlib import Path
from typing import TYPE_CHECKING

import pytest
from _pytest.mark import Mark, MarkDecorator
from _pytest.mark.structures import ParameterSet
from typing_extensions import TypeAlias

# Pyright doesn't like try/except blocks for imports:
if TYPE_CHECKING:
    from mypy import api as mypy_api
    from mypy.version import __version__ as mypy_version

    from pydantic.version import parse_mypy_version
else:
    try:
        from mypy import api as mypy_api
        from mypy.version import __version__ as mypy_version

        from pydantic.version import parse_mypy_version
    except ImportError:
        mypy_api = None
        mypy_version = None

        parse_mypy_version = lambda _: (0,)  # noqa: E731


MYPY_VERSION_TUPLE = parse_mypy_version(mypy_version)
PYDANTIC_ROOT = Path(__file__).parent.parent.parent pytestmark = pytest.mark.skipif( '--test-mypy' not in sys.argv and os.environ.get('PYCHARM_HOSTED') != '1', # never skip when running via the PyCharm runner reason='Test only with "--test-mypy" flag', ) # This ensures mypy can find the test files, no matter where tests are run from: os.chdir(Path(__file__).parent.parent.parent) # Type hint taken from the signature of `pytest.param`: Marks: TypeAlias = 'MarkDecorator | Collection[MarkDecorator | Mark]' def build_cases( configs: list[str], modules: list[str], marks: Marks = (), ) -> list[ParameterSet]: """Produces the cartesian product of the configs and modules, optionally with marks.""" return [pytest.param(config, module, marks=marks) for config in configs for module in modules] cases: list[ParameterSet | tuple[str, str]] = [ # No plugin *build_cases( ['mypy-default.ini', 'pyproject-default.toml'], ['pydantic_settings.py'], ), *build_cases( ['mypy-default.ini', 'pyproject-default.toml'], ['root_models.py'], pytest.mark.skipif( MYPY_VERSION_TUPLE < (1, 1, 1), reason='`dataclass_transform` only supported on mypy >= 1.1.1' ), ), *build_cases( ['mypy-default.ini'], ['plugin_success.py', 'plugin_success_baseConfig.py', 'metaclass_args.py'], ), # Default plugin config *build_cases( ['mypy-plugin.ini', 'pyproject-plugin.toml'], [ 'plugin_success.py', 'plugin_fail.py', 'plugin_success_baseConfig.py', 'plugin_fail_baseConfig.py', 'pydantic_settings.py', ], ), # Strict plugin config *build_cases( ['mypy-plugin-strict.ini', 'pyproject-plugin-strict.toml'], [ 'plugin_success.py', 'plugin_fail.py', 'fail_defaults.py', 'plugin_success_baseConfig.py', 'plugin_fail_baseConfig.py', ], ), # One-off cases *[ ('mypy-plugin.ini', 'custom_constructor.py'), ('mypy-plugin.ini', 'config_conditional_extra.py'), ('mypy-plugin.ini', 'covariant_typevar.py'), ('mypy-plugin.ini', 'frozen_field.py'), ('mypy-plugin.ini', 'plugin_optional_inheritance.py'), ('mypy-plugin.ini', 'generics.py'), 
('mypy-plugin.ini', 'root_models.py'), ('mypy-plugin.ini', 'plugin_strict_fields.py'), ('mypy-plugin-strict-no-any.ini', 'dataclass_no_any.py'), ('mypy-plugin-very-strict.ini', 'metaclass_args.py'), ('pyproject-plugin-no-strict-optional.toml', 'no_strict_optional.py'), ('pyproject-plugin-strict-equality.toml', 'strict_equality.py'), ('pyproject-plugin.toml', 'from_orm_v1_noconflict.py'), ], ] @dataclasses.dataclass class MypyTestConfig: existing_output_path: Path | None """The path pointing to the existing test result, or `None` if this is the first time the test is run.""" current_output_path: Path """The path pointing to the current test result to be created or compared to the existing one.""" def get_test_config(module_path: Path, config_path: Path) -> MypyTestConfig: """Given a file to test with a specific config, get a test config.""" outputs_dir = PYDANTIC_ROOT / 'tests/mypy/outputs' outputs_dir.mkdir(exist_ok=True) existing_versions = [ x.name for x in outputs_dir.iterdir() if x.is_dir() and re.match(r'[0-9]+(?:\.[0-9]+)*', x.name) ] def _convert_to_output_path(v: str) -> Path: return outputs_dir / v / config_path.name.replace('.', '_') / module_path.name existing: Path | None = None # Build sorted list of (parsed_version, version) pairs, including the current mypy version being used parsed_version_pairs = sorted((parse_mypy_version(v), v) for v in existing_versions) if MYPY_VERSION_TUPLE not in [x[0] for x in parsed_version_pairs]: insort(parsed_version_pairs, (MYPY_VERSION_TUPLE, mypy_version)) for parsed_version, version in parsed_version_pairs[::-1]: if parsed_version > MYPY_VERSION_TUPLE: continue output_path = _convert_to_output_path(version) if output_path.exists(): existing = output_path break return MypyTestConfig(existing, _convert_to_output_path(mypy_version)) def get_expected_return_code(source_code: str) -> int: """Return 1 if at least one `# MYPY:` comment was found, else 0.""" if re.findall(r'^\s*# MYPY:', source_code, flags=re.MULTILINE): 
return 1 return 0 @pytest.mark.parametrize( ['config_filename', 'python_filename'], cases, ) def test_mypy_results(config_filename: str, python_filename: str, request: pytest.FixtureRequest) -> None: input_path = PYDANTIC_ROOT / 'tests/mypy/modules' / python_filename config_path = PYDANTIC_ROOT / 'tests/mypy/configs' / config_filename test_config = get_test_config(input_path, config_path) # Specifying a different cache dir for each configuration dramatically speeds up subsequent execution # It also prevents cache-invalidation-related bugs in the tests cache_dir = f'.mypy_cache/test-{os.path.splitext(config_filename)[0]}' command = [ str(input_path), '--config-file', str(config_path), '--cache-dir', cache_dir, '--show-error-codes', '--show-traceback', ] print(f"\nExecuting: mypy {' '.join(command)}") # makes it easier to debug as necessary mypy_out, mypy_err, mypy_returncode = mypy_api.run(command) # Need to strip filenames due to differences in formatting by OS mypy_out = '\n'.join(['.py:'.join(line.split('.py:')[1:]) for line in mypy_out.split('\n') if line]).strip() mypy_out = re.sub(r'\n\s*\n', r'\n', mypy_out) if mypy_out: print('{0}\n{1:^100}\n{0}\n{2}\n{0}'.format('=' * 100, f'mypy {mypy_version} output', mypy_out)) assert mypy_err == '' input_code = input_path.read_text() existing_output_code: str | None = None if test_config.existing_output_path is not None: existing_output_code = test_config.existing_output_path.read_text() print(f'Comparing output with {test_config.existing_output_path}') else: print(f'Comparing output with {input_path} (expecting no mypy errors)') merged_output = merge_python_and_mypy_output(input_code, mypy_out) if merged_output == (existing_output_code or input_code): # Test passed, no changes needed pass elif request.config.getoption('update_mypy'): test_config.current_output_path.parent.mkdir(parents=True, exist_ok=True) test_config.current_output_path.write_text(merged_output) else: print('**** Merged Output ****') 
print(merged_output) print('***********************') assert existing_output_code is not None, 'No output file found, run `make test-mypy-update` to create it' assert merged_output == existing_output_code expected_returncode = get_expected_return_code(existing_output_code) assert mypy_returncode == expected_returncode def test_bad_toml_config() -> None: full_config_filename = 'tests/mypy/configs/pyproject-plugin-bad-param.toml' full_filename = 'tests/mypy/modules/generics.py' # File doesn't matter command = [full_filename, '--config-file', full_config_filename, '--show-error-codes'] print(f"\nExecuting: mypy {' '.join(command)}") # makes it easier to debug as necessary with pytest.raises(ValueError) as e: mypy_api.run(command) assert str(e.value) == 'Configuration value must be a boolean for key: init_forbid_extra' @pytest.mark.parametrize('module', ['dataclass_no_any', 'plugin_success', 'plugin_success_baseConfig']) def test_success_cases_run(module: str) -> None: """ Ensure the "success" files can actually be executed """ module_name = f'tests.mypy.modules.{module}' try: importlib.import_module(module_name) except Exception: pytest.fail(reason=f'Unable to execute module {module_name}') @pytest.mark.parametrize( ['v_str', 'v_tuple'], [ ('1.11.0', (1, 11, 0)), ('1.11.0+dev.d6d9d8cd4f27c52edac1f537e236ec48a01e54cb.dirty', (1, 11, 0)), ], ) def test_parse_mypy_version(v_str: str, v_tuple: tuple[int, int, int]) -> None: assert parse_mypy_version(v_str) == v_tuple def merge_python_and_mypy_output(source_code: str, mypy_output: str) -> str: merged_lines = [(line, False) for line in source_code.splitlines()] for line in mypy_output.splitlines()[::-1]: if not line: continue try: line_number, message = re.split(r':(?:\d+:)?', line, maxsplit=1) merged_lines.insert(int(line_number), (f'# MYPY: {message.strip()}', True)) except ValueError: # This could happen due to lack of a ':' in `line`, or the pre-':' contents not being a number # Either way, put all such lines at the top 
            # of the file
            merged_lines.insert(0, (f'# MYPY: {line.strip()}', True))

    merged_lines = [line for line, is_mypy in merged_lines if is_mypy or not line.strip().startswith('# MYPY: ')]
    return '\n'.join(merged_lines) + '\n'


# ---- pydantic-2.10.6/tests/plugin/example_plugin.py ----
from pydantic import BaseModel


class MyModel(BaseModel):
    x: int


m = MyModel(x='10')
if m.x != 10:
    raise ValueError('m.x should be 10')


log = []


class ValidatePythonHandler:
    def on_enter(self, *args, **kwargs) -> None:
        log.append(f'on_enter args={args} kwargs={kwargs}')

    def on_success(self, result) -> None:
        log.append(f'on_success result={result}')

    def on_error(self, error) -> None:
        log.append(f'on_error error={error}')


class Plugin:
    def new_schema_validator(self, schema, schema_type, schema_type_path, schema_kind, config, plugin_settings):
        return ValidatePythonHandler(), None, None


plugin = Plugin()


# ---- pydantic-2.10.6/tests/plugin/pyproject.toml ----
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "example_plugin"
version = "0.1.0"
requires-python = '>=3.8'

[project.entry-points.pydantic]
my_plugin = "example_plugin:plugin"


# ---- pydantic-2.10.6/tests/plugin/test_plugin.py ----
import os

import pytest

pytestmark = pytest.mark.skipif(not os.getenv('TEST_PLUGIN'), reason='Test only with `TEST_PLUGIN` env var set.')


def test_plugin_usage():
    from pydantic import BaseModel

    class MyModel(BaseModel):
        x: int
        y: str

    m = MyModel(x='10', y='hello')
    assert m.x == 10
    assert m.y == 'hello'

    from example_plugin import log

    assert log == [
        "on_enter args=({'x': '10', 'y': 'hello'},) kwargs={'self_instance': MyModel()}",
        "on_success result=x=10 y='hello'",
    ]
# ---- pydantic-2.10.6/tests/test_abc.py ----
import abc
import sys

import pytest

from pydantic import BaseModel


def test_model_subclassing_abstract_base_classes():
    class Model(BaseModel, abc.ABC):
        some_field: str


@pytest.mark.skipif(sys.version_info < (3, 12), reason='error value different on older versions')
def test_model_subclassing_abstract_base_classes_without_implementation_raises_exception():
    class Model(BaseModel, abc.ABC):
        some_field: str

        @abc.abstractmethod
        def my_abstract_method(self):
            pass

        @classmethod
        @abc.abstractmethod
        def my_abstract_classmethod(cls):
            pass

        @staticmethod
        @abc.abstractmethod
        def my_abstract_staticmethod():
            pass

        @property
        @abc.abstractmethod
        def my_abstract_property(self):
            pass

        @my_abstract_property.setter
        @abc.abstractmethod
        def my_abstract_property(self, val):
            pass

    with pytest.raises(TypeError) as excinfo:
        Model(some_field='some_value')
    assert str(excinfo.value) == (
        "Can't instantiate abstract class Model without an implementation for abstract methods "
        "'my_abstract_classmethod', 'my_abstract_method', 'my_abstract_property', 'my_abstract_staticmethod'"
    )


# ---- pydantic-2.10.6/tests/test_aliases.py ----
from contextlib import nullcontext as does_not_raise
from inspect import signature
from typing import Any, ContextManager, List, Optional

import pytest
from dirty_equals import IsStr
from pydantic_core import PydanticUndefined

from pydantic import (
    AliasChoices,
    AliasGenerator,
    AliasPath,
    BaseModel,
    ConfigDict,
    Field,
    ValidationError,
    computed_field,
)


def test_alias_generator():
    def to_camel(string: str):
        return ''.join(x.capitalize() for x in string.split('_'))

    class MyModel(BaseModel):
        model_config = ConfigDict(alias_generator=to_camel)

        a: List[str] = None
        foo_bar: str

    data = {'A': ['foo', 'bar'], 'FooBar': 'foobar'}
    v = MyModel(**data)
    assert v.a == ['foo', 'bar']
    assert v.foo_bar == 'foobar'
assert v.model_dump(by_alias=True) == data def test_alias_generator_wrong_type_error(): def return_bytes(string): return b'not a string' with pytest.raises(TypeError) as e: class MyModel(BaseModel): model_config = ConfigDict(alias_generator=return_bytes) bar: Any assert str(e.value) == IsStr(regex="alias_generator must return str, not ") def test_basic_alias(): class Model(BaseModel): a: str = Field('foobar', alias='_a') assert Model().a == 'foobar' assert Model(_a='different').a == 'different' assert repr(Model.model_fields['a']) == ( "FieldInfo(annotation=str, required=False, default='foobar', alias='_a', alias_priority=2)" ) def test_field_info_repr_with_aliases(): class Model(BaseModel): a: str = Field('foobar', alias='_a', validation_alias='a_val', serialization_alias='a_ser') assert repr(Model.model_fields['a']) == ( "FieldInfo(annotation=str, required=False, default='foobar', alias='_a', " "alias_priority=2, validation_alias='a_val', serialization_alias='a_ser')" ) def test_alias_error(): class Model(BaseModel): a: int = Field(123, alias='_a') assert Model(_a='123').a == 123 with pytest.raises(ValidationError) as exc_info: Model(_a='foo') assert exc_info.value.errors(include_url=False) == [ { 'input': 'foo', 'loc': ('_a',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'type': 'int_parsing', } ] def test_alias_error_loc_by_alias(): class Model(BaseModel): model_config = dict(loc_by_alias=False) a: int = Field(123, alias='_a') assert Model(_a='123').a == 123 with pytest.raises(ValidationError) as exc_info: Model(_a='foo') assert exc_info.value.errors(include_url=False) == [ { 'input': 'foo', 'loc': ('a',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'type': 'int_parsing', } ] def test_annotation_config(): class Model(BaseModel): b: float = Field(alias='foobar') a: int = 10 _c: str assert list(Model.model_fields.keys()) == ['b', 'a'] assert [f.alias for f in Model.model_fields.values()] == 
['foobar', None] assert Model(foobar='123').b == 123.0 def test_pop_by_field_name(): class Model(BaseModel): model_config = ConfigDict(extra='forbid', populate_by_name=True) last_updated_by: Optional[str] = Field(None, alias='lastUpdatedBy') assert Model(lastUpdatedBy='foo').model_dump() == {'last_updated_by': 'foo'} assert Model(last_updated_by='foo').model_dump() == {'last_updated_by': 'foo'} with pytest.raises(ValidationError) as exc_info: Model(lastUpdatedBy='foo', last_updated_by='bar') assert exc_info.value.errors(include_url=False) == [ { 'input': 'bar', 'loc': ('last_updated_by',), 'msg': 'Extra inputs are not permitted', 'type': 'extra_forbidden', } ] def test_alias_override_behavior(): class Parent(BaseModel): # Use `gt` to demonstrate that using `Field` to override an alias does not preserve other attributes x: int = Field(alias='x1', gt=0) class Child(Parent): x: int = Field(alias='x2') y: int = Field(alias='y2') assert Parent.model_fields['x'].alias == 'x1' assert Child.model_fields['x'].alias == 'x2' assert Child.model_fields['y'].alias == 'y2' Parent(x1=1) with pytest.raises(ValidationError) as exc_info: Parent(x1=-1) assert exc_info.value.errors(include_url=False) == [ {'ctx': {'gt': 0}, 'input': -1, 'loc': ('x1',), 'msg': 'Input should be greater than 0', 'type': 'greater_than'} ] Child(x2=1, y2=2) # Check the gt=0 is not preserved from Parent Child(x2=-1, y2=2) # Check the alias from Parent cannot be used with pytest.raises(ValidationError) as exc_info: Child(x1=1, y2=2) assert exc_info.value.errors(include_url=False) == [ {'input': {'x1': 1, 'y2': 2}, 'loc': ('x2',), 'msg': 'Field required', 'type': 'missing'} ] # Check the type hint from Parent _is_ preserved with pytest.raises(ValidationError) as exc_info: Child(x2='a', y2=2) assert exc_info.value.errors(include_url=False) == [ { 'input': 'a', 'loc': ('x2',), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', } ] def 
test_alias_generator_parent(): class Parent(BaseModel): model_config = ConfigDict(populate_by_name=True, alias_generator=lambda f_name: f_name + '1') x: int class Child(Parent): model_config = ConfigDict(alias_generator=lambda f_name: f_name + '2') y: int assert Child.model_fields['y'].alias == 'y2' assert Child.model_fields['x'].alias == 'x2' upper_alias_generator = [ pytest.param( lambda x: x.upper(), id='basic_callable', ), pytest.param( AliasGenerator(lambda x: x.upper()), id='alias_generator', ), ] @pytest.mark.parametrize('alias_generator', upper_alias_generator) def test_alias_generator_on_parent(alias_generator): class Parent(BaseModel): model_config = ConfigDict(alias_generator=alias_generator) x: bool = Field(alias='a_b_c') y: str class Child(Parent): y: str z: str assert Parent.model_fields['x'].alias == 'a_b_c' assert Parent.model_fields['y'].alias == 'Y' assert Child.model_fields['x'].alias == 'a_b_c' assert Child.model_fields['y'].alias == 'Y' assert Child.model_fields['z'].alias == 'Z' @pytest.mark.parametrize('alias_generator', upper_alias_generator) def test_alias_generator_on_child(alias_generator): class Parent(BaseModel): x: bool = Field(alias='abc') y: str class Child(Parent): model_config = ConfigDict(alias_generator=alias_generator) y: str z: str assert [f.alias for f in Parent.model_fields.values()] == ['abc', None] assert [f.alias for f in Child.model_fields.values()] == ['abc', 'Y', 'Z'] @pytest.mark.parametrize('alias_generator', upper_alias_generator) def test_alias_generator_used_by_default(alias_generator): class Model(BaseModel): model_config = ConfigDict(alias_generator=alias_generator) a: str b: str = Field(alias='b_alias') c: str = Field(validation_alias='c_val_alias') d: str = Field(serialization_alias='d_ser_alias') e: str = Field(alias='e_alias', validation_alias='e_val_alias') f: str = Field(alias='f_alias', serialization_alias='f_ser_alias') g: str = Field(alias='g_alias', validation_alias='g_val_alias', 
serialization_alias='g_ser_alias') assert { name: {k: getattr(f, k) for k in ('alias', 'validation_alias', 'serialization_alias')} for name, f in Model.model_fields.items() } == { # Validation/serialization aliases should be: # 1. The specific alias, if specified, or # 2. The alias, if specified, or # 3. The generated alias (i.e. the field name in upper case) 'a': { 'alias': 'A', 'validation_alias': 'A', 'serialization_alias': 'A', }, 'b': { 'alias': 'b_alias', 'validation_alias': 'b_alias', 'serialization_alias': 'b_alias', }, 'c': { 'alias': 'C', 'validation_alias': 'c_val_alias', 'serialization_alias': 'C', }, 'd': { 'alias': 'D', 'validation_alias': 'D', 'serialization_alias': 'd_ser_alias', }, 'e': { 'alias': 'e_alias', 'validation_alias': 'e_val_alias', 'serialization_alias': 'e_alias', }, 'f': { 'alias': 'f_alias', 'validation_alias': 'f_alias', 'serialization_alias': 'f_ser_alias', }, 'g': { 'alias': 'g_alias', 'validation_alias': 'g_val_alias', 'serialization_alias': 'g_ser_alias', }, } @pytest.mark.parametrize('alias_generator', upper_alias_generator) def test_low_priority_alias(alias_generator): class Parent(BaseModel): w: bool = Field(alias='w_', validation_alias='w_val_alias', serialization_alias='w_ser_alias') x: bool = Field( alias='abc', alias_priority=1, validation_alias='x_val_alias', serialization_alias='x_ser_alias' ) y: str class Child(Parent): model_config = ConfigDict(alias_generator=alias_generator) y: str z: str assert [f.alias for f in Parent.model_fields.values()] == ['w_', 'abc', None] assert [f.validation_alias for f in Parent.model_fields.values()] == ['w_val_alias', 'x_val_alias', None] assert [f.serialization_alias for f in Parent.model_fields.values()] == ['w_ser_alias', 'x_ser_alias', None] assert [f.alias for f in Child.model_fields.values()] == ['w_', 'X', 'Y', 'Z'] assert [f.validation_alias for f in Child.model_fields.values()] == ['w_val_alias', 'X', 'Y', 'Z'] assert [f.serialization_alias for f in Child.model_fields.values()] 
== ['w_ser_alias', 'X', 'Y', 'Z'] @pytest.mark.parametrize( 'cls_params, field_params, validation_key, serialization_key', [ pytest.param( {}, {'alias': 'x1', 'validation_alias': 'x2'}, 'x2', 'x1', id='alias-validation_alias', ), pytest.param( {'alias_generator': str.upper}, {'alias': 'x'}, 'x', 'x', id='alias_generator-alias', ), pytest.param( {'alias_generator': str.upper}, {'alias': 'x1', 'validation_alias': 'x2'}, 'x2', 'x1', id='alias_generator-alias-validation_alias', ), pytest.param( {'alias_generator': str.upper}, {'alias': 'x1', 'serialization_alias': 'x2'}, 'x1', 'x2', id='alias_generator-alias-serialization_alias', ), pytest.param( {'alias_generator': str.upper}, {'alias': 'x1', 'validation_alias': 'x2', 'serialization_alias': 'x3'}, 'x2', 'x3', id='alias_generator-alias-validation_alias-serialization_alias', ), ], ) def test_aliases_priority(cls_params, field_params, validation_key, serialization_key): class Model(BaseModel, **cls_params): x: int = Field(**field_params) model = Model(**{validation_key: 1}) assert model.x == 1 assert model.model_dump(by_alias=True).get(serialization_key, None) is not None def test_empty_string_alias(): class Model(BaseModel): empty_string_key: int = Field(alias='') data = {'': 123} m = Model(**data) assert m.empty_string_key == 123 assert m.model_dump(by_alias=True) == data @pytest.mark.parametrize( 'use_construct, populate_by_name_config, arg_name, expectation', [ [False, True, 'bar', does_not_raise()], [False, True, 'bar_', does_not_raise()], [False, False, 'bar', does_not_raise()], [False, False, 'bar_', pytest.raises(ValueError)], [True, True, 'bar', does_not_raise()], [True, True, 'bar_', does_not_raise()], [True, False, 'bar', does_not_raise()], [True, False, 'bar_', does_not_raise()], ], ) def test_populate_by_name_config( use_construct: bool, populate_by_name_config: bool, arg_name: str, expectation: ContextManager, ): expected_value: int = 7 class Foo(BaseModel): model_config = 
ConfigDict(populate_by_name=populate_by_name_config) bar_: int = Field(alias='bar') with expectation: if use_construct: f = Foo.model_construct(**{arg_name: expected_value}) else: f = Foo(**{arg_name: expected_value}) assert f.bar_ == expected_value def test_validation_alias(): class Model(BaseModel): x: str = Field(validation_alias='foo') data = {'foo': 'bar'} m = Model(**data) assert m.x == 'bar' with pytest.raises(ValidationError) as exc_info: Model(x='bar') assert exc_info.value.errors(include_url=False) == [ { 'type': 'missing', 'loc': ('foo',), 'msg': 'Field required', 'input': {'x': 'bar'}, } ] def test_validation_alias_with_alias(): class Model(BaseModel): x: str = Field(alias='x_alias', validation_alias='foo') data = {'foo': 'bar'} m = Model(**data) assert m.x == 'bar' sig = signature(Model) assert 'x_alias' in sig.parameters with pytest.raises(ValidationError) as exc_info: Model(x='bar') assert exc_info.value.errors(include_url=False) == [ { 'type': 'missing', 'loc': ('foo',), 'msg': 'Field required', 'input': {'x': 'bar'}, } ] def test_validation_alias_from_str_alias(): class Model(BaseModel): x: str = Field(alias='foo') data = {'foo': 'bar'} m = Model(**data) assert m.x == 'bar' sig = signature(Model) assert 'foo' in sig.parameters with pytest.raises(ValidationError) as exc_info: Model(x='bar') assert exc_info.value.errors(include_url=False) == [ { 'type': 'missing', 'loc': ('foo',), 'msg': 'Field required', 'input': {'x': 'bar'}, } ] def test_validation_alias_from_list_alias(): class Model(BaseModel): x: str = Field(alias=['foo', 'bar']) data = {'foo': {'bar': 'test'}} m = Model(**data) assert m.x == 'test' sig = signature(Model) assert 'x' in sig.parameters class Model(BaseModel): x: str = Field(alias=['foo', 1]) data = {'foo': ['bar0', 'bar1']} m = Model(**data) assert m.x == 'bar1' sig = signature(Model) assert 'x' in sig.parameters def test_serialization_alias(): class Model(BaseModel): x: str = Field(serialization_alias='foo') m = Model(x='bar') 
assert m.x == 'bar' assert m.model_dump() == {'x': 'bar'} assert m.model_dump(by_alias=True) == {'foo': 'bar'} def test_serialization_alias_with_alias(): class Model(BaseModel): x: str = Field(alias='x_alias', serialization_alias='foo') data = {'x_alias': 'bar'} m = Model(**data) assert m.x == 'bar' assert m.model_dump() == {'x': 'bar'} assert m.model_dump(by_alias=True) == {'foo': 'bar'} sig = signature(Model) assert 'x_alias' in sig.parameters def test_serialization_alias_from_alias(): class Model(BaseModel): x: str = Field(alias='foo') data = {'foo': 'bar'} m = Model(**data) assert m.x == 'bar' assert m.model_dump() == {'x': 'bar'} assert m.model_dump(by_alias=True) == {'foo': 'bar'} sig = signature(Model) assert 'foo' in sig.parameters @pytest.mark.parametrize( 'field,expected', [ pytest.param( Field(alias='x_alias', validation_alias='x_val_alias', serialization_alias='x_ser_alias'), { 'properties': {'x_val_alias': {'title': 'X Val Alias', 'type': 'string'}}, 'required': ['x_val_alias'], }, id='single_alias', ), pytest.param( Field(validation_alias=AliasChoices('y_alias', 'another_alias')), { 'properties': {'y_alias': {'title': 'Y Alias', 'type': 'string'}}, 'required': ['y_alias'], }, id='multiple_aliases', ), pytest.param( Field(validation_alias=AliasChoices(AliasPath('z_alias', 'even_another_alias'), 'and_another')), { 'properties': {'and_another': {'title': 'And Another', 'type': 'string'}}, 'required': ['and_another'], }, id='multiple_aliases_with_path', ), ], ) def test_aliases_json_schema(field, expected): class Model(BaseModel): x: str = field assert Model.model_json_schema() == {'title': 'Model', 'type': 'object', **expected} @pytest.mark.parametrize( 'value', [ 'a', AliasPath('a', 'b', 1), AliasChoices('a', 'b'), AliasChoices('a', AliasPath('b', 1)), ], ) def test_validation_alias_path(value): class Model(BaseModel): x: str = Field(validation_alias=value) assert Model.model_fields['x'].validation_alias == value def test_search_dict_for_alias_path(): 
ap = AliasPath('a', 1) assert ap.search_dict_for_path({'a': ['hello', 'world']}) == 'world' assert ap.search_dict_for_path({'a': 'hello'}) is PydanticUndefined def test_validation_alias_invalid_value_type(): m = 'Invalid `validation_alias` type. it should be `str`, `AliasChoices`, or `AliasPath`' with pytest.raises(TypeError, match=m): class Model(BaseModel): x: str = Field(validation_alias=123) def test_validation_alias_parse_data(): class Model(BaseModel): x: str = Field(validation_alias=AliasChoices('a', AliasPath('b', 1), 'c')) assert Model.model_fields['x'].validation_alias == AliasChoices('a', AliasPath('b', 1), 'c') assert Model.model_validate({'a': 'hello'}).x == 'hello' assert Model.model_validate({'b': ['hello', 'world']}).x == 'world' assert Model.model_validate({'c': 'test'}).x == 'test' with pytest.raises(ValidationError) as exc_info: Model.model_validate({'b': ['hello']}) assert exc_info.value.errors(include_url=False) == [ { 'type': 'missing', 'loc': ('a',), 'msg': 'Field required', 'input': {'b': ['hello']}, } ] def test_alias_generator_class() -> None: class Model(BaseModel): a: str model_config = ConfigDict( alias_generator=AliasGenerator( validation_alias=lambda field_name: f'validation_{field_name}', serialization_alias=lambda field_name: f'serialization_{field_name}', ) ) assert Model.model_fields['a'].validation_alias == 'validation_a' assert Model.model_fields['a'].serialization_alias == 'serialization_a' assert Model.model_fields['a'].alias is None def test_alias_generator_with_alias() -> None: class Model(BaseModel): a: str model_config = ConfigDict(alias_generator=AliasGenerator(alias=lambda field_name: f'{field_name}_alias')) assert Model.model_fields['a'].validation_alias == 'a_alias' assert Model.model_fields['a'].serialization_alias == 'a_alias' assert Model.model_fields['a'].alias == 'a_alias' def test_alias_generator_with_positional_arg() -> None: class Model(BaseModel): a: str model_config = 
ConfigDict(alias_generator=AliasGenerator(lambda field_name: f'{field_name}_alias')) assert Model.model_fields['a'].validation_alias == 'a_alias' assert Model.model_fields['a'].serialization_alias == 'a_alias' assert Model.model_fields['a'].alias == 'a_alias' @pytest.mark.parametrize('alias_generator', upper_alias_generator) def test_alias_generator_with_computed_field(alias_generator) -> None: class Rectangle(BaseModel): model_config = ConfigDict(populate_by_name=True, alias_generator=alias_generator) width: int height: int @computed_field @property def area(self) -> int: return self.width * self.height r = Rectangle(width=10, height=20) assert r.model_dump(by_alias=True) == {'WIDTH': 10, 'HEIGHT': 20, 'AREA': 200} def test_alias_generator_with_invalid_callables() -> None: for alias_kind in ('validation_alias', 'serialization_alias', 'alias'): with pytest.raises( TypeError, match=f'Invalid `{alias_kind}` type. `{alias_kind}` generator must produce one of' ): class Foo(BaseModel): a: str model_config = ConfigDict(alias_generator=AliasGenerator(**{alias_kind: lambda x: 1})) def test_all_alias_kinds_specified() -> None: class Foo(BaseModel): a: str model_config = ConfigDict( alias_generator=AliasGenerator( alias=lambda field_name: f'{field_name}_alias', validation_alias=lambda field_name: f'{field_name}_val_alias', serialization_alias=lambda field_name: f'{field_name}_ser_alias', ) ) assert Foo.model_fields['a'].alias == 'a_alias' assert Foo.model_fields['a'].validation_alias == 'a_val_alias' assert Foo.model_fields['a'].serialization_alias == 'a_ser_alias' # the same behavior we'd expect if we defined alias, validation_alias # and serialization_alias on the field itself f = Foo(a_val_alias='a') assert f.a == 'a' assert f.model_dump(by_alias=True) == {'a_ser_alias': 'a'} assert f.model_dump(by_alias=False) == {'a': 'a'} def test_alias_generator_with_computed_field_for_serialization() -> None: """Tests that the alias generator is used for computed fields, with 
serialization_alias taking precedence over alias."""

    class Rectangle(BaseModel):
        model_config = ConfigDict(
            alias_generator=AliasGenerator(
                validation_alias=lambda field_name: f'{field_name}_val_alias',
                alias=lambda field_name: f'{field_name}_alias',
                serialization_alias=lambda field_name: f'{field_name}_ser_alias',
            )
        )

        width: int
        height: int

        @computed_field
        def area(self) -> int:
            return self.width * self.height

    r = Rectangle(width_val_alias=10, height_val_alias=20)
    assert r.model_dump(by_alias=True) == {'width_ser_alias': 10, 'height_ser_alias': 20, 'area_ser_alias': 200}


empty_str_alias_generator = AliasGenerator(
    validation_alias=lambda x: '', alias=lambda x: f'{x}_alias', serialization_alias=lambda x: ''
)


def test_alias_gen_with_empty_string() -> None:
    class Model(BaseModel):
        a: str

        model_config = ConfigDict(alias_generator=empty_str_alias_generator)

    assert Model.model_fields['a'].validation_alias == ''
    assert Model.model_fields['a'].serialization_alias == ''
    assert Model.model_fields['a'].alias == 'a_alias'


def test_alias_gen_with_empty_string_and_computed_field() -> None:
    class Model(BaseModel):
        model_config = ConfigDict(alias_generator=empty_str_alias_generator)

        a: str

        @computed_field
        def b(self) -> str:
            return self.a

    assert Model.model_fields['a'].validation_alias == ''
    assert Model.model_fields['a'].serialization_alias == ''
    assert Model.model_fields['a'].alias == 'a_alias'
    assert Model.model_computed_fields['b'].alias == ''


# ===== pydantic-2.10.6/tests/test_allow_partial.py =====
from typing import Dict, List, Tuple

import pytest
from annotated_types import Ge
from typing_extensions import Annotated, TypedDict

from pydantic import TypeAdapter, ValidationError

from .conftest import Err


@pytest.mark.parametrize(
    'mode,value,expected',
    [
        ('python', {'a': 1, 'b': 'b', 'c': (3, '4')}, {'a': 1, 'b': 'b', 'c': (3, '4')}),
        ('python', {'a': 1, 'b': 'b', 'c': (3,)}, {'a': 1, 'b': 'b'}),
        ('python',
{'a': 1, 'b': 'b'}, {'a': 1, 'b': 'b'}), ('json', '{"a": 1, "b": "b", "c": [3, "4"]}', {'a': 1, 'b': 'b', 'c': (3, '4')}), ('json', '{"a": 1, "b": "b", "c": [3, "4"]}', {'a': 1, 'b': 'b', 'c': (3, '4')}), ('json', '{"a": 1, "b": "b", "c": [3]}', {'a': 1, 'b': 'b'}), ('json', '{"a": 1, "b": "b", "c": [3', {'a': 1, 'b': 'b'}), ('json', '{"a": 1, "b": "b', {'a': 1}), ('json', '{"a": 1, "b": ', {'a': 1}), ('python', {'a': 1, 'c': (3,), 'b': 'b'}, Err(r'c\.1\s+Field required')), ('json', '{"a": 1, "c": [3], "b": "b"}', Err(r'c\.1\s+Field required')), ], ) def test_typed_dict(mode, value, expected): class Foobar(TypedDict, total=False): a: int b: str c: Tuple[int, str] ta = TypeAdapter(Foobar) if mode == 'python': if isinstance(expected, Err): with pytest.raises(ValidationError, match=expected.message): ta.validate_python(value, experimental_allow_partial=True) else: assert ta.validate_python(value, experimental_allow_partial=True) == expected else: if isinstance(expected, Err): with pytest.raises(ValidationError, match=expected.message): ta.validate_json(value, experimental_allow_partial=True) else: assert ta.validate_json(value, experimental_allow_partial=True) == expected @pytest.mark.parametrize( 'mode,value,expected', [ ('python', [10, 20, 30], [10, 20, 30]), ('python', ['10', '20', '30'], [10, 20, 30]), ('python', [10, 20, 30], [10, 20, 30]), ('python', [10, 20, 3], [10, 20]), ('json', '[10, 20, 30]', [10, 20, 30]), ('json', '[10, 20, 30', [10, 20, 30]), ('json', '[10, 20, 3', [10, 20]), ], ) def test_list(mode, value, expected): ta = TypeAdapter(List[Annotated[int, Ge(10)]]) if mode == 'python': if isinstance(expected, Err): with pytest.raises(ValidationError, match=expected.message): ta.validate_python(value, experimental_allow_partial=True) else: assert ta.validate_python(value, experimental_allow_partial=True) == expected else: if isinstance(expected, Err): with pytest.raises(ValidationError, match=expected.message): ta.validate_json(value, 
experimental_allow_partial=True)
        else:
            assert ta.validate_json(value, experimental_allow_partial=True) == expected


def test_dict():
    ta = TypeAdapter(Dict[str, Annotated[int, Ge(10)]])
    eap = dict(experimental_allow_partial=True)

    assert ta.validate_python({'a': 10, 'b': 20, 'c': 30}, **eap) == {'a': 10, 'b': 20, 'c': 30}
    assert ta.validate_python({'a': 10, 'b': 20, 'c': 3}, **eap) == {'a': 10, 'b': 20}
    assert ta.validate_strings({'a': '10', 'b': '20', 'c': '30'}, strict=True, **eap) == {'a': 10, 'b': 20, 'c': 30}
    assert ta.validate_strings({'a': '10', 'b': '20', 'c': '3'}, strict=True, **eap) == {'a': 10, 'b': 20}
    assert ta.validate_json('{"a": 10, "b": 20, "c": 30}', **eap) == {'a': 10, 'b': 20, 'c': 30}
    assert ta.validate_json('{"a": 10, "b": 20, "c": 3', **eap) == {'a': 10, 'b': 20}
    assert ta.validate_json('{"a": 10, "b": 20, "c": 3}', **eap) == {'a': 10, 'b': 20}


# ===== pydantic-2.10.6/tests/test_annotated.py =====
import datetime as dt
import sys
from dataclasses import dataclass
from decimal import Decimal
from typing import Any, Callable, Generic, Iterator, List, Optional, Set, TypeVar

import pytest
import pytz
from annotated_types import BaseMetadata, GroupedMetadata, Gt, Lt, Not, Predicate
from pydantic_core import CoreSchema, PydanticUndefined, core_schema
from typing_extensions import Annotated

from pydantic import (
    BaseModel,
    BeforeValidator,
    Field,
    GetCoreSchemaHandler,
    PydanticUserError,
    TypeAdapter,
    ValidationError,
)
from pydantic.errors import PydanticSchemaGenerationError
from pydantic.fields import PrivateAttr
from pydantic.functional_validators import AfterValidator

NO_VALUE = object()


@pytest.mark.parametrize(
    'hint_fn,value,expected_repr',
    [
        (
            lambda: Annotated[int, Gt(0)],
            5,
            'FieldInfo(annotation=int, required=False, default=5, metadata=[Gt(gt=0)])',
        ),
        (
            lambda: Annotated[int, Field(gt=0)],
            5,
            'FieldInfo(annotation=int, required=False, default=5, metadata=[Gt(gt=0)])',
        ),
        (
lambda: int, Field(5, gt=0), 'FieldInfo(annotation=int, required=False, default=5, metadata=[Gt(gt=0)])', ), ( lambda: int, Field(default_factory=lambda: 5, gt=0), 'FieldInfo(annotation=int, required=False, default_factory=, metadata=[Gt(gt=0)])', ), ( lambda: Annotated[int, Lt(2)], Field(5, gt=0), 'FieldInfo(annotation=int, required=False, default=5, metadata=[Gt(gt=0), Lt(lt=2)])', ), ( lambda: Annotated[int, Gt(0)], NO_VALUE, 'FieldInfo(annotation=int, required=True, metadata=[Gt(gt=0)])', ), ( lambda: Annotated[int, Gt(0)], Field(), 'FieldInfo(annotation=int, required=True, metadata=[Gt(gt=0)])', ), ( lambda: int, Field(gt=0), 'FieldInfo(annotation=int, required=True, metadata=[Gt(gt=0)])', ), ( lambda: Annotated[int, Gt(0)], PydanticUndefined, 'FieldInfo(annotation=int, required=True, metadata=[Gt(gt=0)])', ), ( lambda: Annotated[int, Field(gt=0), Lt(2)], 5, 'FieldInfo(annotation=int, required=False, default=5, metadata=[Gt(gt=0), Lt(lt=2)])', ), ( lambda: Annotated[int, Field(alias='foobar')], PydanticUndefined, "FieldInfo(annotation=int, required=True, alias='foobar', alias_priority=2)", ), ], ) def test_annotated(hint_fn, value, expected_repr): hint = hint_fn() if value is NO_VALUE: class M(BaseModel): x: hint else: class M(BaseModel): x: hint = value assert repr(M.model_fields['x']) == expected_repr @pytest.mark.parametrize('metadata', [0, 'foo']) def test_annotated_allows_unknown(metadata): class M(BaseModel): x: Annotated[int, metadata] = 5 field_info = M.model_fields['x'] assert len(field_info.metadata) == 1 assert metadata in field_info.metadata, 'Records the unknown metadata' assert metadata in M.__annotations__['x'].__metadata__, 'Annotated type is recorded' @pytest.mark.parametrize( ['hint_fn', 'value', 'empty_init_ctx'], [ ( lambda: int, PydanticUndefined, pytest.raises(ValueError, match=r'Field required \[type=missing,'), ), ( lambda: Annotated[int, Field()], PydanticUndefined, pytest.raises(ValueError, match=r'Field required \[type=missing,'), ), 
], ) def test_annotated_instance_exceptions(hint_fn, value, empty_init_ctx): hint = hint_fn() class M(BaseModel): x: hint = value with empty_init_ctx: assert M().x == 5 def test_field_reuse(): field = Field(description='Long description') class Model(BaseModel): one: int = field assert Model(one=1).model_dump() == {'one': 1} class AnnotatedModel(BaseModel): one: Annotated[int, field] assert AnnotatedModel(one=1).model_dump() == {'one': 1} def test_config_field_info(): class Foo(BaseModel): a: Annotated[int, Field(description='descr', json_schema_extra={'foobar': 'hello'})] assert Foo.model_json_schema(by_alias=True)['properties'] == { 'a': {'title': 'A', 'description': 'descr', 'foobar': 'hello', 'type': 'integer'}, } @pytest.mark.skipif(sys.version_info < (3, 10), reason='repr different on older versions') def test_annotated_alias() -> None: # https://github.com/pydantic/pydantic/issues/2971 StrAlias = Annotated[str, Field(max_length=3)] IntAlias = Annotated[int, Field(default_factory=lambda: 2)] Nested = Annotated[List[StrAlias], Field(description='foo')] class MyModel(BaseModel): a: StrAlias = 'abc' b: StrAlias c: IntAlias d: IntAlias e: Nested fields_repr = {k: repr(v) for k, v in MyModel.model_fields.items()} assert fields_repr == { 'a': "FieldInfo(annotation=str, required=False, default='abc', metadata=[MaxLen(max_length=3)])", 'b': 'FieldInfo(annotation=str, required=True, metadata=[MaxLen(max_length=3)])', 'c': 'FieldInfo(annotation=int, required=False, default_factory=)', 'd': 'FieldInfo(annotation=int, required=False, default_factory=)', 'e': "FieldInfo(annotation=List[Annotated[str, FieldInfo(annotation=NoneType, required=True, metadata=[MaxLen(max_length=3)])]], required=True, description='foo')", } assert MyModel(b='def', e=['xyz']).model_dump() == dict(a='abc', b='def', c=2, d=2, e=['xyz']) def test_modify_get_schema_annotated() -> None: calls: List[str] = [] class CustomType: @classmethod def __get_pydantic_core_schema__(cls, source: Any, handler: 
GetCoreSchemaHandler) -> core_schema.CoreSchema: calls.append('CustomType:before') with pytest.raises(PydanticSchemaGenerationError): handler(source) schema = core_schema.no_info_plain_validator_function(lambda _: CustomType()) calls.append('CustomType:after') return schema class PydanticMetadata: def __get_pydantic_core_schema__(self, source: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: calls.append('PydanticMetadata:before') schema = handler(source) calls.append('PydanticMetadata:after') return schema class GroupedMetadataMarker(GroupedMetadata): def __iter__(self) -> Iterator[BaseMetadata]: # no way to actually hook into schema building # so just register when our iter is called calls.append('GroupedMetadataMarker:iter') yield from [] class _(BaseModel): x: Annotated[CustomType, GroupedMetadataMarker(), PydanticMetadata()] # insert_assert(calls) assert calls == [ 'GroupedMetadataMarker:iter', 'PydanticMetadata:before', 'CustomType:before', 'CustomType:after', 'PydanticMetadata:after', ] calls.clear() class _(BaseModel): x: Annotated[CustomType, PydanticMetadata(), GroupedMetadataMarker()] # insert_assert(calls) assert calls == [ 'GroupedMetadataMarker:iter', 'PydanticMetadata:before', 'CustomType:before', 'CustomType:after', 'PydanticMetadata:after', ] calls.clear() def test_annotated_alias_at_low_level() -> None: with pytest.warns( UserWarning, match=r'`alias` specification on field "low_level_alias_field" must be set on outermost annotation to take effect.', ): class Model(BaseModel): low_level_alias_field: Optional[Annotated[int, Field(alias='field_alias')]] = None assert Model(field_alias=1).low_level_alias_field is None def test_get_pydantic_core_schema_source_type() -> None: types: Set[Any] = set() class PydanticMarker: def __get_pydantic_core_schema__(self, source: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: types.add(source) return handler(source) class _(BaseModel): x: Annotated[Annotated[int, 'foo'], 
PydanticMarker()] assert types == {int} types.clear() T = TypeVar('T') class GenericModel(BaseModel, Generic[T]): y: T class _(BaseModel): x: Annotated[GenericModel[int], PydanticMarker()] assert types == {GenericModel[int]} types.clear() def test_merge_field_infos_type_adapter() -> None: ta = TypeAdapter( Annotated[ int, Field(gt=0), Field(lt=100), Field(gt=1), Field(description='abc'), Field(3), Field(description=None) ] ) default = ta.get_default_value() assert default is not None assert default.value == 3 # insert_assert(ta.validate_python(2)) assert ta.validate_python(2) == 2 with pytest.raises(ValidationError) as exc_info: ta.validate_python(1) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ {'type': 'greater_than', 'loc': (), 'msg': 'Input should be greater than 1', 'input': 1, 'ctx': {'gt': 1}} ] # insert_assert(ta.json_schema()) assert ta.json_schema() == { 'default': 3, 'description': 'abc', 'exclusiveMaximum': 100, 'exclusiveMinimum': 1, 'type': 'integer', } def test_merge_field_infos_model() -> None: class Model(BaseModel): x: Annotated[ int, Field(gt=0), Field(lt=100), Field(gt=1), Field(description='abc'), Field(3), Field(description=None) ] = Field(5) # insert_assert(Model.model_json_schema()) assert Model.model_json_schema() == { 'properties': { 'x': {'default': 5, 'exclusiveMaximum': 100, 'exclusiveMinimum': 1, 'title': 'X', 'type': 'integer'} }, 'title': 'Model', 'type': 'object', } def test_model_dump_doesnt_dump_annotated_dunder(): class Model(BaseModel): one: int AnnotatedModel = Annotated[Model, ...] 
# In Pydantic v1, `AnnotatedModel.dict()` would have returned # `{'one': 1, '__orig_class__': typing.Annotated[...]}` assert AnnotatedModel(one=1).model_dump() == {'one': 1} def test_merge_field_infos_ordering() -> None: TheType = Annotated[int, AfterValidator(lambda x: x), Field(le=2), AfterValidator(lambda x: x * 2), Field(lt=4)] class Model(BaseModel): x: TheType assert Model(x=1).x == 2 with pytest.raises(ValidationError) as exc_info: Model(x=2) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ {'type': 'less_than', 'loc': ('x',), 'msg': 'Input should be less than 4', 'input': 2, 'ctx': {'lt': 4}} ] with pytest.raises(ValidationError) as exc_info: Model(x=3) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'less_than_equal', 'loc': ('x',), 'msg': 'Input should be less than or equal to 2', 'input': 3, 'ctx': {'le': 2}, } ] def test_validate_float_inf_nan_python() -> None: ta = TypeAdapter(Annotated[float, AfterValidator(lambda x: x * 3), Field(allow_inf_nan=False)]) assert ta.validate_python(2.0) == 6.0 ta = TypeAdapter(Annotated[float, AfterValidator(lambda _: float('nan')), Field(allow_inf_nan=False)]) with pytest.raises(ValidationError) as exc_info: ta.validate_python(1.0) # insert_assert(exc_info.value.errors(include_url=False)) # TODO: input should be float('nan'), this seems like a subtle bug in pydantic-core assert exc_info.value.errors(include_url=False) == [ {'type': 'finite_number', 'loc': (), 'msg': 'Input should be a finite number', 'input': 1.0} ] def test_predicate_success_python() -> None: ta = TypeAdapter(Annotated[int, Predicate(lambda x: x > 0)]) assert ta.validate_python(1) == 1 def test_predicate_error_python() -> None: ta = TypeAdapter(Annotated[int, Predicate(lambda x: x > 0)]) with pytest.raises(ValidationError) as exc_info: ta.validate_python(-1) # insert_assert(exc_info.value.errors(include_url=False)) 
assert exc_info.value.errors(include_url=False) == [ { 'type': 'predicate_failed', 'loc': (), 'msg': 'Predicate test_predicate_error_python.. failed', 'input': -1, } ] def test_not_operation_error_python() -> None: ta = TypeAdapter(Annotated[int, Not(lambda x: x > 5)]) with pytest.raises(ValidationError) as exc_info: ta.validate_python(6) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'not_operation_failed', 'loc': (), 'msg': 'Not of test_not_operation_error_python.. failed', 'input': 6, } ] def test_annotated_field_info_not_lost_from_forwardref(): from pydantic import BaseModel class ForwardRefAnnotatedFieldModel(BaseModel): foo: 'Annotated[Integer, Field(alias="bar", default=1)]' = 2 foo2: 'Annotated[Integer, Field(alias="bar2", default=1)]' = Field(default=2, alias='baz') Integer = int ForwardRefAnnotatedFieldModel.model_rebuild() assert ForwardRefAnnotatedFieldModel(bar=3).foo == 3 assert ForwardRefAnnotatedFieldModel(baz=3).foo2 == 3 with pytest.raises(ValidationError) as exc_info: ForwardRefAnnotatedFieldModel(bar='bar') assert exc_info.value.errors(include_url=False) == [ { 'input': 'bar', 'loc': ('bar',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'type': 'int_parsing', } ] def test_annotated_private_field_with_default(): class AnnotatedPrivateFieldModel(BaseModel): _foo: Annotated[int, PrivateAttr(default=1)] _bar: Annotated[str, 'hello'] _baz: 'Annotated[str, PrivateAttr(default=2)]' model = AnnotatedPrivateFieldModel() assert model._foo == 1 assert model._baz == 2 assert model.__pydantic_private__ == {'_foo': 1, '_baz': 2} with pytest.raises(AttributeError): assert model._bar model._bar = 'world' assert model._bar == 'world' assert model.__pydantic_private__ == {'_foo': 1, '_bar': 'world', '_baz': 2} with pytest.raises(AttributeError): assert model.bar def test_min_length_field_info_not_lost(): class AnnotatedFieldModel(BaseModel): foo: 
'Annotated[String, Field(min_length=3)]' = Field(description='hello') String = str AnnotatedFieldModel.model_rebuild() assert AnnotatedFieldModel(foo='000').foo == '000' with pytest.raises(ValidationError) as exc_info: AnnotatedFieldModel(foo='00') assert exc_info.value.errors(include_url=False) == [ { 'loc': ('foo',), 'input': '00', 'ctx': {'min_length': 3}, 'msg': 'String should have at least 3 characters', 'type': 'string_too_short', } ] # Ensure that the inner annotation does not override the outer, even for metadata: class AnnotatedFieldModel2(BaseModel): foo: 'Annotated[String, Field(min_length=3)]' = Field(description='hello', min_length=2) AnnotatedFieldModel2(foo='00') class AnnotatedFieldModel4(BaseModel): foo: 'Annotated[String, Field(min_length=3)]' = Field(description='hello', min_length=4) with pytest.raises(ValidationError) as exc_info: AnnotatedFieldModel4(foo='00') assert exc_info.value.errors(include_url=False) == [ { 'loc': ('foo',), 'input': '00', 'ctx': {'min_length': 4}, 'msg': 'String should have at least 4 characters', 'type': 'string_too_short', } ] def test_tzinfo_validator_example_pattern() -> None: """Test that tzinfo custom validator pattern works as explained in the examples/validators docs.""" @dataclass(frozen=True) class MyDatetimeValidator: tz_constraint: Optional[str] = None def tz_constraint_validator( self, value: dt.datetime, handler: Callable, # (1)! ): """Validate tz_constraint and tz_info.""" # handle naive datetimes if self.tz_constraint is None: assert value.tzinfo is None, 'tz_constraint is None, but provided value is tz-aware.' return handler(value) # validate tz_constraint and tz-aware tzinfo if self.tz_constraint not in pytz.all_timezones: raise PydanticUserError( f'Invalid tz_constraint: {self.tz_constraint}', code='unevaluable-type-annotation' ) result = handler(value) # (2)! 
assert self.tz_constraint == str( result.tzinfo ), f'Invalid tzinfo: {str(result.tzinfo)}, expected: {self.tz_constraint}' return result def __get_pydantic_core_schema__( self, source_type: Any, handler: GetCoreSchemaHandler, ) -> CoreSchema: return core_schema.no_info_wrap_validator_function( self.tz_constraint_validator, handler(source_type), ) LA = 'America/Los_Angeles' # passing naive test ta = TypeAdapter(Annotated[dt.datetime, MyDatetimeValidator()]) ta.validate_python(dt.datetime.now()) # failing naive test ta = TypeAdapter(Annotated[dt.datetime, MyDatetimeValidator()]) with pytest.raises(Exception): ta.validate_python(dt.datetime.now(pytz.timezone(LA))) # passing tz-aware test ta = TypeAdapter(Annotated[dt.datetime, MyDatetimeValidator(LA)]) ta.validate_python(dt.datetime.now(pytz.timezone(LA))) # failing bad tz ta = TypeAdapter(Annotated[dt.datetime, MyDatetimeValidator('foo')]) with pytest.raises(Exception): ta.validate_python(dt.datetime.now()) # failing tz-aware test ta = TypeAdapter(Annotated[dt.datetime, MyDatetimeValidator(LA)]) with pytest.raises(Exception): ta.validate_python(dt.datetime.now()) def test_utcoffset_validator_example_pattern() -> None: """Test that utcoffset custom validator pattern works as explained in the examples/validators docs.""" @dataclass(frozen=True) class MyDatetimeValidator: lower_bound: int upper_bound: int def validate_tz_bounds(self, value: dt.datetime, handler: Callable): """Validate and test bounds""" assert value.utcoffset() is not None, 'UTC offset must exist' assert self.lower_bound <= self.upper_bound, 'Invalid bounds' result = handler(value) hours_offset = value.utcoffset().total_seconds() / 3600 assert self.lower_bound <= hours_offset <= self.upper_bound, 'Value out of bounds' return result def __get_pydantic_core_schema__( self, source_type: Any, handler: GetCoreSchemaHandler, ) -> CoreSchema: return core_schema.no_info_wrap_validator_function( self.validate_tz_bounds, handler(source_type), ) LA = 
'America/Los_Angeles'

    # test valid bound passing
    ta = TypeAdapter(Annotated[dt.datetime, MyDatetimeValidator(-10, 10)])
    ta.validate_python(dt.datetime.now(pytz.timezone(LA)))

    # test valid bound failing - missing TZ
    ta = TypeAdapter(Annotated[dt.datetime, MyDatetimeValidator(-12, 12)])
    with pytest.raises(Exception):
        ta.validate_python(dt.datetime.now())

    # test invalid bound
    ta = TypeAdapter(Annotated[dt.datetime, MyDatetimeValidator(0, 4)])
    with pytest.raises(Exception):
        ta.validate_python(dt.datetime.now(pytz.timezone(LA)))


def test_incompatible_metadata_error() -> None:
    ta = TypeAdapter(Annotated[List[int], Field(pattern='abc')])
    with pytest.raises(TypeError, match="Unable to apply constraint 'pattern'"):
        ta.validate_python([1, 2, 3])


def test_compatible_metadata_raises_correct_validation_error() -> None:
    """Using a no-op before validator to ensure that constraint is applied as part of a chain."""
    ta = TypeAdapter(Annotated[str, BeforeValidator(lambda x: x), Field(pattern='abc')])
    with pytest.raises(ValidationError, match="String should match pattern 'abc'"):
        ta.validate_python('def')


def test_decimal_constraints_after_annotation() -> None:
    DecimalAnnotation = Annotated[Decimal, BeforeValidator(lambda v: v), Field(max_digits=10, decimal_places=4)]

    ta = TypeAdapter(DecimalAnnotation)
    assert ta.validate_python(Decimal('123.4567')) == Decimal('123.4567')

    with pytest.raises(ValidationError) as e:
        ta.validate_python(Decimal('123.45678'))
    assert e.value.errors()[0]['type'] == 'decimal_max_places'

    with pytest.raises(ValidationError) as e:
        ta.validate_python(Decimal('12345678.901'))
    assert e.value.errors()[0]['type'] == 'decimal_max_digits'
pydantic-2.10.6/tests/test_assert_in_validators.py
"""
PYTEST_DONT_REWRITE
"""

import difflib
import pprint

import pytest
from dirty_equals import HasRepr

from pydantic import BaseModel, ValidationError, field_validator


def _pformat_lines(obj):
    return \
pprint.pformat(obj).splitlines(keepends=True)


def _assert_eq(left, right):
    if left != right:
        pytest.fail('\n' + '\n'.join(difflib.ndiff(_pformat_lines(left), _pformat_lines(right))))


def test_assert_raises_validation_error():
    class Model(BaseModel):
        a: str

        @field_validator('a')
        @classmethod
        def check_a(cls, v):
            assert v == 'a', 'invalid a'
            return v

    assert Model(a='a').a == 'a'

    with pytest.raises(ValidationError) as exc_info:
        Model(a='snap')

    _assert_eq(
        [
            {
                'ctx': {'error': HasRepr(repr(AssertionError('invalid a')))},
                'input': 'snap',
                'loc': ('a',),
                'msg': 'Assertion failed, invalid a',
                'type': 'assertion_error',
            }
        ],
        exc_info.value.errors(include_url=False),
    )
pydantic-2.10.6/tests/test_callable.py
import sys
from typing import Callable

import pytest

from pydantic import BaseModel, ValidationError

collection_callable_types = [Callable, Callable[[int], int]]
if sys.version_info >= (3, 9):
    from collections.abc import Callable as CollectionsCallable

    collection_callable_types += [CollectionsCallable, CollectionsCallable[[int], int]]


@pytest.mark.parametrize('annotation', collection_callable_types)
def test_callable(annotation):
    class Model(BaseModel):
        callback: annotation

    m = Model(callback=lambda x: x)
    assert callable(m.callback)


@pytest.mark.parametrize('annotation', collection_callable_types)
def test_non_callable(annotation):
    class Model(BaseModel):
        callback: annotation

    with pytest.raises(ValidationError):
        Model(callback=1)
pydantic-2.10.6/tests/test_color.py
from datetime import datetime

import pytest
from pydantic_core import PydanticCustomError

from pydantic import BaseModel, ValidationError
from pydantic.color import Color

pytestmark = pytest.mark.filterwarnings(
    'ignore:The `Color` class is deprecated, use `pydantic_extra_types` instead.*:DeprecationWarning'
)


@pytest.mark.parametrize(
    'raw_color, as_tuple',
    [
        # named colors
        ('aliceblue', (240, 248, 255)),
        ('Antiquewhite', (250, 235, 215)),
        ('#000000', (0, 0, 0)),
        ('#DAB', (221, 170, 187)),
        ('#dab', (221, 170, 187)),
        ('#000', (0, 0, 0)),
        ('0x797979', (121, 121, 121)),
        ('0x777', (119, 119, 119)),
        ('0x777777', (119, 119, 119)),
        ('0x777777cc', (119, 119, 119, 0.8)),
        ('777', (119, 119, 119)),
        ('777c', (119, 119, 119, 0.8)),
        (' 777', (119, 119, 119)),
        ('777 ', (119, 119, 119)),
        (' 777 ', (119, 119, 119)),
        ((0, 0, 128), (0, 0, 128)),
        ([0, 0, 128], (0, 0, 128)),
        ((0, 0, 205, 1.0), (0, 0, 205)),
        ((0, 0, 205, 0.5), (0, 0, 205, 0.5)),
        ('rgb(0, 0, 205)', (0, 0, 205)),
        ('rgb(0, 0, 205.2)', (0, 0, 205)),
        ('rgb(0, 0.2, 205)', (0, 0, 205)),
        ('rgba(0, 0, 128, 0.6)', (0, 0, 128, 0.6)),
        ('rgba(0, 0, 128, .6)', (0, 0, 128, 0.6)),
        ('rgba(0, 0, 128, 60%)', (0, 0, 128, 0.6)),
        (' rgba(0, 0, 128,0.6) ', (0, 0, 128, 0.6)),
        ('rgba(00,0,128,0.6 )', (0, 0, 128, 0.6)),
        ('rgba(0, 0, 128, 0)', (0, 0, 128, 0)),
        ('rgba(0, 0, 128, 1)', (0, 0, 128)),
        ('rgb(0 0.2 205)', (0, 0, 205)),
        ('rgb(0 0.2 205 / 0.6)', (0, 0, 205, 0.6)),
        ('rgb(0 0.2 205 / 60%)', (0, 0, 205, 0.6)),
        ('rgba(0 0 128)', (0, 0, 128)),
        ('rgba(0 0 128 / 0.6)', (0, 0, 128, 0.6)),
        ('rgba(0 0 128 / 60%)', (0, 0, 128, 0.6)),
        ('hsl(270, 60%, 70%)', (178, 133, 224)),
        ('hsl(180, 100%, 50%)', (0, 255, 255)),
        ('hsl(630, 60%, 70%)', (178, 133, 224)),
        ('hsl(270deg, 60%, 70%)', (178, 133, 224)),
        ('hsl(.75turn, 60%, 70%)', (178, 133, 224)),
        ('hsl(-.25turn, 60%, 70%)', (178, 133, 224)),
        ('hsl(-0.25turn, 60%, 70%)', (178, 133, 224)),
        ('hsl(4.71238rad, 60%, 70%)', (178, 133, 224)),
        ('hsl(10.9955rad, 60%, 70%)', (178, 133, 224)),
        ('hsl(270, 60%, 50%, .15)', (127, 51, 204, 0.15)),
        ('hsl(270.00deg, 60%, 50%, 15%)', (127, 51, 204, 0.15)),
        ('hsl(630 60% 70%)', (178, 133, 224)),
        ('hsl(270 60% 50% / .15)', (127, 51, 204, 0.15)),
        ('hsla(630, 60%, 70%)', (178, 133, 224)),
        ('hsla(630 60% 70%)', (178, 133, 224)),
        ('hsla(270 60% 50% / .15)', (127, 51, 204, 0.15)),
    ],
)
def test_color_success(raw_color,
as_tuple):
    c = Color(raw_color)
    assert c.as_rgb_tuple() == as_tuple
    assert c.original() == raw_color


@pytest.mark.parametrize(
    'color',
    [
        # named colors
        'nosuchname',
        'chucknorris',
        # hex
        '#0000000',
        'x000',
        # rgb/rgba tuples
        (256, 256, 256),
        (128, 128, 128, 0.5, 128),
        (0, 0, 'x'),
        (0, 0, 0, 1.5),
        (0, 0, 0, 'x'),
        (0, 0, 1280),
        (0, 0, 1205, 0.1),
        (0, 0, 1128, 0.5),
        (0, 0, 1128, -0.5),
        (0, 0, 1128, 1.5),
        # rgb/rgba strings
        'rgb(0, 0, 1205)',
        'rgb(0, 0, 1128)',
        'rgb(0, 0, 200 / 0.2)',
        'rgb(72 122 18, 0.3)',
        'rgba(0, 0, 11205, 0.1)',
        'rgba(0, 0, 128, 11.5)',
        'rgba(0, 0, 128 / 11.5)',
        'rgba(72 122 18 0.3)',
        # hsl/hsla strings
        'hsl(180, 101%, 50%)',
        'hsl(72 122 18 / 0.3)',
        'hsl(630 60% 70%, 0.3)',
        'hsla(72 122 18 / 0.3)',
        # neither a tuple, not a string
        datetime(2017, 10, 5, 19, 47, 7),
        object,
        range(10),
    ],
)
def test_color_fail(color):
    with pytest.raises(PydanticCustomError) as exc_info:
        Color(color)
    assert exc_info.value.type == 'color_error'


def test_model_validation():
    class Model(BaseModel):
        color: Color

    assert Model(color='red').color.as_hex() == '#f00'
    assert Model(color=Color('red')).color.as_hex() == '#f00'

    with pytest.raises(ValidationError) as exc_info:
        Model(color='snot')
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'color_error',
            'loc': ('color',),
            'msg': 'value is not a valid color: string not recognised as a valid color',
            'input': 'snot',
        }
    ]


def test_as_rgb():
    assert Color('bad').as_rgb() == 'rgb(187, 170, 221)'
    assert Color((1, 2, 3, 0.123456)).as_rgb() == 'rgba(1, 2, 3, 0.12)'
    assert Color((1, 2, 3, 0.1)).as_rgb() == 'rgba(1, 2, 3, 0.1)'


def test_as_rgb_tuple():
    assert Color((1, 2, 3)).as_rgb_tuple(alpha=None) == (1, 2, 3)
    assert Color((1, 2, 3, 1)).as_rgb_tuple(alpha=None) == (1, 2, 3)
    assert Color((1, 2, 3, 0.3)).as_rgb_tuple(alpha=None) == (1, 2, 3, 0.3)
    assert Color((1, 2, 3, 0.3)).as_rgb_tuple(alpha=None) == (1, 2, 3, 0.3)

    assert Color((1, 2, 3)).as_rgb_tuple(alpha=False) \
== (1, 2, 3)
    assert Color((1, 2, 3, 0.3)).as_rgb_tuple(alpha=False) == (1, 2, 3)

    assert Color((1, 2, 3)).as_rgb_tuple(alpha=True) == (1, 2, 3, 1)
    assert Color((1, 2, 3, 0.3)).as_rgb_tuple(alpha=True) == (1, 2, 3, 0.3)


def test_as_hsl():
    assert Color('bad').as_hsl() == 'hsl(260, 43%, 77%)'
    assert Color((1, 2, 3, 0.123456)).as_hsl() == 'hsl(210, 50%, 1%, 0.12)'
    assert Color('hsl(260, 43%, 77%)').as_hsl() == 'hsl(260, 43%, 77%)'


def test_as_hsl_tuple():
    c = Color('016997')
    h, s, l_, a = c.as_hsl_tuple(alpha=True)
    assert h == pytest.approx(0.551, rel=0.01)
    assert s == pytest.approx(0.986, rel=0.01)
    assert l_ == pytest.approx(0.298, rel=0.01)
    assert a == 1

    assert c.as_hsl_tuple(alpha=False) == c.as_hsl_tuple(alpha=None) == (h, s, l_)

    c = Color((3, 40, 50, 0.5))
    hsla = c.as_hsl_tuple(alpha=None)
    assert len(hsla) == 4
    assert hsla[3] == 0.5


def test_as_hex():
    assert Color((1, 2, 3)).as_hex() == '#010203'
    assert Color((119, 119, 119)).as_hex() == '#777'
    assert Color((119, 0, 238)).as_hex() == '#70e'
    assert Color('B0B').as_hex() == '#b0b'
    assert Color((1, 2, 3, 0.123456)).as_hex() == '#0102031f'
    assert Color((1, 2, 3, 0.1)).as_hex() == '#0102031a'


def test_as_named():
    assert Color((0, 255, 255)).as_named() == 'cyan'
    assert Color('#808000').as_named() == 'olive'
    assert Color('hsl(180, 100%, 50%)').as_named() == 'cyan'
    assert Color((240, 248, 255)).as_named() == 'aliceblue'

    with pytest.raises(ValueError) as exc_info:
        Color((1, 2, 3)).as_named()
    assert exc_info.value.args[0] == 'no named color found, use fallback=True, as_hex() or as_rgb()'

    assert Color((1, 2, 3)).as_named(fallback=True) == '#010203'
    assert Color((1, 2, 3, 0.1)).as_named(fallback=True) == '#0102031a'


def test_str_repr():
    assert str(Color('red')) == 'red'
    assert repr(Color('red')) == "Color('red', rgb=(255, 0, 0))"
    assert str(Color((1, 2, 3))) == '#010203'
    assert repr(Color((1, 2, 3))) == "Color('#010203', rgb=(1, 2, 3))"


def test_eq():
    assert Color('red') == Color('red')
    assert Color('red') != Color('blue')
    assert Color('red') != 'red'

    assert Color('red') == Color((255, 0, 0))
    assert Color('red') != Color((0, 0, 255))


def test_color_hashable():
    assert hash(Color('red')) != hash(Color('blue'))
    assert hash(Color('red')) == hash(Color((255, 0, 0)))
    assert hash(Color('red')) != hash(Color((255, 0, 0, 0.5)))
pydantic-2.10.6/tests/test_computed_fields.py
import random
import sys
from abc import ABC, abstractmethod
from functools import cached_property, lru_cache, singledispatchmethod
from typing import Any, Callable, ClassVar, Generic, List, Tuple, TypeVar

import pytest
from pydantic_core import ValidationError, core_schema
from typing_extensions import TypedDict

from pydantic import (
    BaseModel,
    Field,
    FieldSerializationInfo,
    GetCoreSchemaHandler,
    PrivateAttr,
    TypeAdapter,
    computed_field,
    dataclasses,
    field_serializer,
    field_validator,
)
from pydantic.alias_generators import to_camel
from pydantic.errors import PydanticUserError


def test_computed_fields_get():
    class Rectangle(BaseModel):
        width: int
        length: int

        @computed_field
        def area(self) -> int:
            """An awesome area"""
            return self.width * self.length

        @computed_field(title='Pikarea', description='Another area')
        @property
        def area2(self) -> int:
            return self.width * self.length

        @property
        def double_width(self) -> int:
            return self.width * 2

    rect = Rectangle(width=10, length=5)
    assert set(rect.model_fields) == {'width', 'length'}
    assert set(rect.model_computed_fields) == {'area', 'area2'}
    assert rect.__dict__ == {'width': 10, 'length': 5}

    assert rect.model_computed_fields['area'].description == 'An awesome area'
    assert rect.model_computed_fields['area2'].title == 'Pikarea'
    assert rect.model_computed_fields['area2'].description == 'Another area'

    assert rect.area == 50
    assert rect.double_width == 20
    assert rect.model_dump() == {'width': 10, 'length': 5, 'area': 50, 'area2': 50}
    assert rect.model_dump_json() == \
'{"width":10,"length":5,"area":50,"area2":50}' assert set(Rectangle.model_fields) == {'width', 'length'} assert set(Rectangle.model_computed_fields) == {'area', 'area2'} assert Rectangle.model_computed_fields['area'].description == 'An awesome area' assert Rectangle.model_computed_fields['area2'].title == 'Pikarea' assert Rectangle.model_computed_fields['area2'].description == 'Another area' def test_computed_fields_json_schema(): class Rectangle(BaseModel): width: int length: int @computed_field def area(self) -> int: """An awesome area""" return self.width * self.length @computed_field( title='Pikarea', description='Another area', examples=[100, 200], json_schema_extra={'foo': 42}, ) @property def area2(self) -> int: return self.width * self.length @property def double_width(self) -> int: return self.width * 2 assert Rectangle.model_json_schema(mode='serialization') == { 'title': 'Rectangle', 'type': 'object', 'properties': { 'width': { 'title': 'Width', 'type': 'integer', }, 'length': { 'title': 'Length', 'type': 'integer', }, 'area': { 'title': 'Area', 'description': 'An awesome area', 'type': 'integer', 'readOnly': True, }, 'area2': { 'title': 'Pikarea', 'description': 'Another area', 'examples': [100, 200], 'foo': 42, 'type': 'integer', 'readOnly': True, }, }, 'required': ['width', 'length', 'area', 'area2'], } def test_computed_fields_set(): class Square(BaseModel): side: float @computed_field @property def area(self) -> float: return self.side**2 @computed_field @property def area_string(self) -> str: return f'{self.area} square units' @field_serializer('area_string') def serialize_area_string(self, area_string): return area_string.upper() @area.setter def area(self, new_area: int): self.side = new_area**0.5 s = Square(side=10) assert s.model_dump() == {'side': 10.0, 'area': 100.0, 'area_string': '100.0 SQUARE UNITS'} s.area = 64 assert s.model_dump() == {'side': 8.0, 'area': 64.0, 'area_string': '64.0 SQUARE UNITS'} assert 
Square.model_computed_fields['area'].wrapped_property is Square.area


def test_computed_fields_del():
    class User(BaseModel):
        first: str
        last: str

        @computed_field
        def fullname(self) -> str:
            return f'{self.first} {self.last}'

        @fullname.setter
        def fullname(self, new_fullname: str) -> None:
            self.first, self.last = new_fullname.split()

        @fullname.deleter
        def fullname(self):
            self.first = ''
            self.last = ''

    user = User(first='John', last='Smith')
    assert user.model_dump() == {'first': 'John', 'last': 'Smith', 'fullname': 'John Smith'}
    user.fullname = 'Pika Chu'
    assert user.model_dump() == {'first': 'Pika', 'last': 'Chu', 'fullname': 'Pika Chu'}
    del user.fullname
    assert user.model_dump() == {'first': '', 'last': '', 'fullname': ' '}


def test_cached_property():
    class Model(BaseModel):
        minimum: int = Field(alias='min')
        maximum: int = Field(alias='max')

        @computed_field(alias='the magic number')
        @cached_property
        def random_number(self) -> int:
            """An awesome area"""
            return random.randint(self.minimum, self.maximum)

        @cached_property
        def cached_property_2(self) -> int:
            return 42

        @cached_property
        def _cached_property_3(self) -> int:
            return 43

    rect = Model(min=10, max=10_000)
    assert rect.__private_attributes__ == {}
    assert rect.cached_property_2 == 42
    assert rect._cached_property_3 == 43
    first_n = rect.random_number
    second_n = rect.random_number
    assert first_n == second_n
    assert rect.model_dump() == {'minimum': 10, 'maximum': 10_000, 'random_number': first_n}
    assert rect.model_dump(by_alias=True) == {'min': 10, 'max': 10_000, 'the magic number': first_n}
    assert rect.model_dump(by_alias=True, exclude={'random_number'}) == {'min': 10, 'max': 10000}

    # `cached_property` is a non-data descriptor, assert that you can assign a value to it:
    rect2 = Model(min=1, max=1)
    rect2.cached_property_2 = 1
    rect2._cached_property_3 = 2
    assert rect2.cached_property_2 == 1
    assert rect2._cached_property_3 == 2


def test_properties_and_computed_fields():
    class Model(BaseModel):
        x: str
        _private_float: float = \
PrivateAttr(0)

        @property
        def public_int(self) -> int:
            return int(self._private_float)

        @public_int.setter
        def public_int(self, v: float) -> None:
            self._private_float = v

        @computed_field
        @property
        def public_str(self) -> str:
            return f'public {self.public_int}'

    m = Model(x='pika')
    assert m.model_dump() == {'x': 'pika', 'public_str': 'public 0'}
    m._private_float = 3.1
    assert m.model_dump() == {'x': 'pika', 'public_str': 'public 3'}
    m.public_int = 2
    assert m._private_float == 2.0
    assert m.model_dump() == {'x': 'pika', 'public_str': 'public 2'}


def test_computed_fields_repr():
    class Model(BaseModel):
        x: int

        @computed_field(repr=False)
        @property
        def double(self) -> int:
            return self.x * 2

        @computed_field  # repr=True by default
        @property
        def triple(self) -> int:
            return self.x * 3

    assert repr(Model(x=2)) == 'Model(x=2, triple=6)'


def test_functools():
    class Model(BaseModel, frozen=True):
        x: int

        @lru_cache
        def x_pow(self, p):
            return self.x**p

        @singledispatchmethod
        def neg(self, arg):
            raise NotImplementedError('Cannot negate a')

        @neg.register
        def _(self, arg: int):
            return -arg

        @neg.register
        def _(self, arg: bool):
            return not arg

    m = Model(x=2)
    assert m.x_pow(1) == 2
    assert m.x_pow(2) == 4
    assert m.neg(1) == -1
    assert m.neg(True) is False


def test_include_exclude():
    class Model(BaseModel):
        x: int
        y: int

        @computed_field
        def x_list(self) -> List[int]:
            return [self.x, self.x + 1]

        @computed_field
        def y_list(self) -> List[int]:
            return [self.y, self.y + 1, self.y + 2]

    m = Model(x=1, y=2)
    assert m.model_dump() == {'x': 1, 'y': 2, 'x_list': [1, 2], 'y_list': [2, 3, 4]}

    assert m.model_dump(include={'x'}) == {'x': 1}
    assert m.model_dump(include={'x': None, 'x_list': {0}}) == {'x': 1, 'x_list': [1]}
    assert m.model_dump(exclude={'x': ..., 'y_list': {2}}) == {'y': 2, 'x_list': [1, 2], 'y_list': [2, 3]}


def test_exclude_none():
    class Model(BaseModel):
        x: int
        y: int

        @computed_field
        def sum(self) -> int:
            return self.x + self.y

        @computed_field
        def none(self) -> None:
            return None

    m = Model(x=1,
y=2)
    assert m.model_dump(exclude_none=False) == {'x': 1, 'y': 2, 'sum': 3, 'none': None}
    assert m.model_dump(exclude_none=True) == {'x': 1, 'y': 2, 'sum': 3}
    assert m.model_dump(mode='json', exclude_none=False) == {'x': 1, 'y': 2, 'sum': 3, 'none': None}
    assert m.model_dump(mode='json', exclude_none=True) == {'x': 1, 'y': 2, 'sum': 3}


def test_expected_type():
    class Model(BaseModel):
        x: int
        y: int

        @computed_field
        def x_list(self) -> List[int]:
            return [self.x, self.x + 1]

        @computed_field
        def y_str(self) -> bytes:
            s = f'y={self.y}'
            return s.encode()

    m = Model(x=1, y=2)
    assert m.model_dump() == {'x': 1, 'y': 2, 'x_list': [1, 2], 'y_str': b'y=2'}
    assert m.model_dump(mode='json') == {'x': 1, 'y': 2, 'x_list': [1, 2], 'y_str': 'y=2'}
    assert m.model_dump_json() == '{"x":1,"y":2,"x_list":[1,2],"y_str":"y=2"}'


def test_expected_type_wrong():
    class Model(BaseModel):
        x: int

        @computed_field
        def x_list(self) -> List[int]:
            return 'not a list'

    m = Model(x=1)
    with pytest.warns(UserWarning, match=r'Expected `list\[int\]` but got `str`'):
        m.model_dump()
    with pytest.warns(UserWarning, match=r'Expected `list\[int\]` but got `str`'):
        m.model_dump(mode='json')
    with pytest.warns(UserWarning, match=r'Expected `list\[int\]` but got `str`'):
        m.model_dump_json()


def test_inheritance():
    class Base(BaseModel):
        x: int

        @computed_field
        def double(self) -> int:
            return self.x * 2

    class Child(Base):
        y: int

        @computed_field
        def triple(self) -> int:
            return self.y * 3

    c = Child(x=2, y=3)
    assert c.double == 4
    assert c.triple == 9
    assert c.model_dump() == {'x': 2, 'y': 3, 'double': 4, 'triple': 9}


def test_dataclass():
    @dataclasses.dataclass
    class MyDataClass:
        x: int

        @computed_field
        def double(self) -> int:
            return self.x * 2

    m = MyDataClass(x=2)
    assert m.double == 4
    assert TypeAdapter(MyDataClass).dump_python(m) == {'x': 2, 'double': 4}


def test_free_function():
    @property
    def double_func(self) -> int:
        return self.x * 2

    class MyModel(BaseModel):
        x: int
        double = computed_field(double_func)

    m = MyModel(x=2)
    assert set(m.model_fields) == {'x'}
    assert m.__private_attributes__ == {}
    assert m.double == 4
    assert repr(m) == 'MyModel(x=2, double=4)'
    assert m.model_dump() == {'x': 2, 'double': 4}


def test_private_computed_field():
    class MyModel(BaseModel):
        x: int

        @computed_field(repr=True)
        def _double(self) -> int:
            return self.x * 2

    m = MyModel(x=2)
    assert repr(m) == 'MyModel(x=2, _double=4)'
    assert m.__private_attributes__ == {}
    assert m._double == 4
    assert m.model_dump() == {'x': 2, '_double': 4}


@pytest.mark.skipif(
    sys.version_info < (3, 9) or sys.version_info >= (3, 13),
    reason='@computed_field @classmethod @property only works in 3.9-3.12',
)
def test_classmethod():
    class MyModel(BaseModel):
        x: int
        y: ClassVar[int] = 4

        @computed_field
        @classmethod
        @property
        def two_y(cls) -> int:
            return cls.y * 2

    m = MyModel(x=1)
    assert m.two_y == 8
    assert m.model_dump() == {'x': 1, 'two_y': 8}


def test_frozen():
    class Square(BaseModel, frozen=True):
        side: float

        @computed_field
        @property
        def area(self) -> float:
            return self.side**2

        @area.setter
        def area(self, new_area: int):
            self.side = new_area**0.5

    m = Square(side=4)
    assert m.area == 16.0
    assert m.model_dump() == {'side': 4.0, 'area': 16.0}

    with pytest.raises(ValidationError) as exc_info:
        m.area = 4

    assert exc_info.value.errors(include_url=False) == [
        {'type': 'frozen_instance', 'loc': ('area',), 'msg': 'Instance is frozen', 'input': 4}
    ]


def test_validate_assignment():
    class Square(BaseModel, validate_assignment=True):
        side: float

        @field_validator('side')
        def small_side(cls, s):
            if s < 2:
                raise ValueError('must be >=2')
            return float(round(s))

        @computed_field
        @property
        def area(self) -> float:
            return self.side**2

        @area.setter
        def area(self, new_area: int):
            self.side = new_area**0.5

    with pytest.raises(ValidationError, match=r'side\s+Value error, must be >=2'):
        Square(side=1)

    m = Square(side=4.0)
    assert m.area == 16.0
    assert m.model_dump() == {'side': 4.0, 'area': 16.0}

    m.area = 10.0
    assert m.side == 3.0

    with \
pytest.raises(ValidationError, match=r'side\s+Value error, must be >=2'):
        m.area = 3


def test_abstractmethod():
    class AbstractSquare(BaseModel):
        side: float

        @computed_field
        @property
        @abstractmethod
        def area(self) -> float:
            raise NotImplementedError()

    class Square(AbstractSquare):
        @computed_field
        @property
        def area(self) -> float:
            return self.side + 1

    m = Square(side=4.0)
    assert m.model_dump() == {'side': 4.0, 'area': 5.0}


@pytest.mark.skipif(sys.version_info < (3, 12), reason='error message is different on older versions')
@pytest.mark.parametrize(
    'bases',
    [
        (BaseModel, ABC),
        (ABC, BaseModel),
        (BaseModel,),
    ],
)
def test_abstractmethod_missing(bases: Tuple[Any, ...]):
    class AbstractSquare(*bases):
        side: float

        @computed_field
        @property
        @abstractmethod
        def area(self) -> float:
            raise NotImplementedError()

    class Square(AbstractSquare):
        pass

    with pytest.raises(
        TypeError, match="Can't instantiate abstract class Square without an implementation for abstract method 'area'"
    ):
        Square(side=4.0)


class CustomType(str):
    @classmethod
    def __get_pydantic_core_schema__(cls, source: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema:
        schema = handler(str)
        schema['serialization'] = core_schema.plain_serializer_function_ser_schema(lambda x: '123')
        return schema


def test_computed_fields_infer_return_type():
    class Model(BaseModel):
        @computed_field
        def cfield(self) -> CustomType:
            return CustomType('abc')

    assert Model().model_dump() == {'cfield': '123'}
    assert Model().model_dump_json() == '{"cfield":"123"}'


def test_computed_fields_missing_return_type():
    with pytest.raises(PydanticUserError, match='Computed field is missing return type annotation'):

        class _Model(BaseModel):
            @computed_field
            def cfield(self):
                raise NotImplementedError

    class Model(BaseModel):
        @computed_field(return_type=CustomType)
        def cfield(self):
            return CustomType('abc')

    assert Model().model_dump() == {'cfield': '123'}
    assert Model().model_dump_json() == '{"cfield":"123"}'


def test_alias_generator():
    class \
MyModel(BaseModel):
        my_standard_field: int

        @computed_field  # *will* be overridden by alias generator
        @property
        def my_computed_field(self) -> int:
            return self.my_standard_field + 1

        @computed_field(alias='my_alias_none')  # will *not* be overridden by alias generator
        @property
        def my_aliased_computed_field_none(self) -> int:
            return self.my_standard_field + 2

        @computed_field(alias='my_alias_1', alias_priority=1)  # *will* be overridden by alias generator
        @property
        def my_aliased_computed_field_1(self) -> int:
            return self.my_standard_field + 3

        @computed_field(alias='my_alias_2', alias_priority=2)  # will *not* be overridden by alias generator
        @property
        def my_aliased_computed_field_2(self) -> int:
            return self.my_standard_field + 4

    class MySubModel(MyModel):
        model_config = dict(alias_generator=to_camel, populate_by_name=True)

    model = MyModel(my_standard_field=1)
    assert model.model_dump() == {
        'my_standard_field': 1,
        'my_computed_field': 2,
        'my_aliased_computed_field_none': 3,
        'my_aliased_computed_field_1': 4,
        'my_aliased_computed_field_2': 5,
    }
    assert model.model_dump(by_alias=True) == {
        'my_standard_field': 1,
        'my_computed_field': 2,
        'my_alias_none': 3,
        'my_alias_1': 4,
        'my_alias_2': 5,
    }

    submodel = MySubModel(my_standard_field=1)
    assert submodel.model_dump() == {
        'my_standard_field': 1,
        'my_computed_field': 2,
        'my_aliased_computed_field_none': 3,
        'my_aliased_computed_field_1': 4,
        'my_aliased_computed_field_2': 5,
    }
    assert submodel.model_dump(by_alias=True) == {
        'myStandardField': 1,
        'myComputedField': 2,
        'my_alias_none': 3,
        'myAliasedComputedField1': 4,
        'my_alias_2': 5,
    }


def make_base_model() -> Any:
    class CompModel(BaseModel):
        pass

    class Model(BaseModel):
        @computed_field
        @property
        def comp_1(self) -> CompModel:
            return CompModel()

        @computed_field
        @property
        def comp_2(self) -> CompModel:
            return CompModel()

    return Model


def make_dataclass() -> Any:
    class CompModel(BaseModel):
        pass

    @dataclasses.dataclass
    class Model:
        @computed_field
        @property
        def comp_1(self) -> \
CompModel:
            return CompModel()

        @computed_field
        @property
        def comp_2(self) -> CompModel:
            return CompModel()

    return Model


def make_typed_dict() -> Any:
    class CompModel(BaseModel):
        pass

    class Model(TypedDict):
        @computed_field  # type: ignore
        @property
        def comp_1(self) -> CompModel:
            return CompModel()

        @computed_field  # type: ignore
        @property
        def comp_2(self) -> CompModel:
            return CompModel()

    return Model


@pytest.mark.parametrize(
    'model_factory',
    [
        make_base_model,
        pytest.param(
            make_typed_dict,
            marks=pytest.mark.xfail(
                reason='computed fields do not work with TypedDict yet. See https://github.com/pydantic/pydantic-core/issues/657'
            ),
        ),
        make_dataclass,
    ],
)
def test_multiple_references_to_schema(model_factory: Callable[[], Any]) -> None:
    """
    https://github.com/pydantic/pydantic/issues/5980
    """
    model = model_factory()
    ta = TypeAdapter(model)
    assert ta.dump_python(model()) == {'comp_1': {}, 'comp_2': {}}
    assert ta.json_schema() == {'type': 'object', 'properties': {}, 'title': 'Model'}
    assert ta.json_schema(mode='serialization') == {
        '$defs': {'CompModel': {'properties': {}, 'title': 'CompModel', 'type': 'object'}},
        'properties': {
            'comp_1': {'$ref': '#/$defs/CompModel', 'readOnly': True},
            'comp_2': {'$ref': '#/$defs/CompModel', 'readOnly': True},
        },
        'required': ['comp_1', 'comp_2'],
        'title': 'Model',
        'type': 'object',
    }


def test_generic_computed_field():
    T = TypeVar('T')

    class A(BaseModel, Generic[T]):
        x: T

        @computed_field
        @property
        def double_x(self) -> T:
            return self.x * 2

    assert A[int](x=1).model_dump() == {'x': 1, 'double_x': 2}
    assert A[str](x='abc').model_dump() == {'x': 'abc', 'double_x': 'abcabc'}

    assert A(x='xxxxxx').model_computed_fields['double_x'].return_type == T
    assert A[int](x=123).model_computed_fields['double_x'].return_type == int
    assert A[str](x='x').model_computed_fields['double_x'].return_type == str

    class B(BaseModel, Generic[T]):
        @computed_field
        @property
        def double_x(self) -> T:
            return 'abc'  # this may not match the annotated return type, and will warn if not

    with pytest.warns(
        UserWarning, match="Expected `int` but got `str` with value `'abc'` - serialized value may not be as expected"
    ):
        B[int]().model_dump()


def test_computed_field_override_raises():
    class Model(BaseModel):
        name: str = 'foo'

    with pytest.raises(ValueError, match="you can't override a field with a computed field"):

        class SubModel(Model):
            @computed_field
            @property
            def name(self) -> str:
                return 'bar'


@pytest.mark.skip(reason='waiting on next pydantic-core version, right now, causes a recursion error')
def test_computed_field_excluded_from_model_dump_recursive() -> None:
    # see https://github.com/pydantic/pydantic/issues/9015 for a more contextualized example
    class Model(BaseModel):
        bar: int

        @computed_field
        @property
        def id(self) -> str:
            str_obj = self.model_dump_json(exclude={'id'})
            # you could imagine hashing str_obj, etc. but for simplicity, just wrap it in a descriptive string
            return f'id: {str_obj}'

    m = Model(bar=42)
    assert m.model_dump() == {'bar': 42, 'id': 'id: {"bar":42}'}


def test_computed_field_with_field_serializer():
    class MyModel(BaseModel):
        other_field: int = 42

        @computed_field
        @property
        def my_field(self) -> str:
            return 'foo'

        @field_serializer('*')
        def my_field_serializer(self, value: Any, info: FieldSerializationInfo) -> Any:
            return f'{info.field_name} = {value}'

    assert MyModel().model_dump() == {'my_field': 'my_field = foo', 'other_field': 'other_field = 42'}


def test_fields_on_instance_and_cls() -> None:
    """For now, we support `model_fields` and `model_computed_fields` access on both instances and classes.
    In V3, we should only support class access, though we need to preserve the current behavior for V2 compatibility."""

    class Rectangle(BaseModel):
        x: int
        y: int

        @computed_field
        @property
        def area(self) -> int:
            return self.x * self.y

    r = Rectangle(x=10, y=5)
    for attr in {'model_fields', 'model_computed_fields'}:
        assert getattr(r, attr) == getattr(Rectangle, attr)

    assert set(r.model_fields) == {'x', 'y'}
    assert set(r.model_computed_fields) == {'area'}
pydantic-2.10.6/tests/test_config.py
import json
import re
import sys
from contextlib import nullcontext as does_not_raise
from decimal import Decimal
from inspect import signature
from typing import Any, ContextManager, Dict, Iterable, NamedTuple, Optional, Type, Union

from dirty_equals import HasRepr, IsPartialDict
from pydantic_core import SchemaError, SchemaSerializer, SchemaValidator

from pydantic import (
    BaseConfig,
    BaseModel,
    Field,
    PrivateAttr,
    PydanticDeprecatedSince20,
    PydanticSchemaGenerationError,
    ValidationError,
    create_model,
    field_validator,
    validate_call,
    with_config,
)
from pydantic._internal._config import ConfigWrapper, config_defaults
from pydantic._internal._generate_schema import GenerateSchema
from pydantic._internal._mock_val_ser import MockValSer
from pydantic._internal._typing_extra import get_type_hints
from pydantic.config import ConfigDict, JsonValue
from pydantic.dataclasses import dataclass as pydantic_dataclass
from pydantic.dataclasses import rebuild_dataclass
from pydantic.errors import PydanticUserError
from pydantic.fields import ComputedFieldInfo, FieldInfo
from pydantic.type_adapter import TypeAdapter
from pydantic.warnings import PydanticDeprecatedSince210, PydanticDeprecationWarning

from .conftest import CallCounter

if sys.version_info < (3, 9):
    from typing_extensions import Annotated
else:
    from typing import Annotated

import pytest


@pytest.fixture(scope='session', name='BaseConfigModelWithStrictConfig')
def model_with_strict_config():
    class ModelWithStrictConfig(BaseModel):
        a: int
        # strict=False overrides the Config
        b: Annotated[int, Field(strict=False)]
        # strict=None or not including it is equivalent
        # lets this field be overridden by the Config
        c: Annotated[int, Field(strict=None)]
        d: Annotated[int, Field()]
        model_config = ConfigDict(strict=True)

    return ModelWithStrictConfig


def _equals(a: Union[str, Iterable[str]], b: Union[str, Iterable[str]]) -> bool:
    """
    Compare strings with spaces removed
    """
    if isinstance(a, str) and isinstance(b, str):
        return a.replace(' ', '') == b.replace(' ', '')
    elif isinstance(a, Iterable) and isinstance(b, Iterable):
        return all(_equals(a_, b_) for a_, b_ in zip(a, b))
    else:
        raise TypeError(f'arguments must be both strings or both lists, not {type(a)}, {type(b)}')


def test_config_dict_missing_keys():
    assert ConfigDict().get('missing_property') is None

    with pytest.raises(KeyError, match="'missing_property'"):
        ConfigDict()['missing_property']


class TestsBaseConfig:
    @pytest.mark.filterwarnings('ignore:.* is deprecated.*:DeprecationWarning')
    def test_base_config_equality_defaults_of_config_dict_class(self):
        for key, value in config_defaults.items():
            assert getattr(BaseConfig, key) == value

    def test_config_and_module_config_cannot_be_used_together(self):
        with pytest.raises(PydanticUserError):

            class MyModel(BaseModel):
                model_config = ConfigDict(title='MyTitle')

                class Config:
                    title = 'MyTitleConfig'

    @pytest.mark.filterwarnings('ignore:.* is deprecated.*:DeprecationWarning')
    def test_base_config_properly_converted_to_dict(self):
        class MyConfig(BaseConfig):
            title = 'MyTitle'
            frozen = True

        class MyBaseModel(BaseModel):
            class Config(MyConfig): ...

        class MyModel(MyBaseModel): ...
MyModel.model_config['title'] = 'MyTitle' MyModel.model_config['frozen'] = True assert 'str_to_lower' not in MyModel.model_config def test_base_config_custom_init_signature(self): class MyModel(BaseModel): id: int name: str = 'John Doe' f__: str = Field(alias='foo') model_config = ConfigDict(extra='allow') def __init__(self, id: int = 1, bar=2, *, baz: Any, **data): super().__init__(id=id, **data) self.bar = bar self.baz = baz sig = signature(MyModel) assert _equals( map(str, sig.parameters.values()), ('id: int = 1', 'bar=2', 'baz: Any', "name: str = 'John Doe'", 'foo: str', '**data'), ) assert _equals(str(sig), "(id: int = 1, bar=2, *, baz: Any, name: str = 'John Doe', foo: str, **data) -> None") def test_base_config_custom_init_signature_with_no_var_kw(self): class Model(BaseModel): a: float b: int = 2 c: int def __init__(self, a: float, b: int): super().__init__(a=a, b=b, c=1) model_config = ConfigDict(extra='allow') assert _equals(str(signature(Model)), '(a: float, b: int) -> None') def test_base_config_use_field_name(self): class Foo(BaseModel): foo: str = Field(alias='this is invalid') model_config = ConfigDict(populate_by_name=True) assert _equals(str(signature(Foo)), '(*, foo: str) -> None') def test_base_config_does_not_use_reserved_word(self): class Foo(BaseModel): from_: str = Field(alias='from') model_config = ConfigDict(populate_by_name=True) assert _equals(str(signature(Foo)), '(*, from_: str) -> None') def test_base_config_extra_allow_no_conflict(self): class Model(BaseModel): spam: str model_config = ConfigDict(extra='allow') assert _equals(str(signature(Model)), '(*, spam: str, **extra_data: Any) -> None') def test_base_config_extra_allow_conflict_twice(self): class Model(BaseModel): extra_data: str extra_data_: str model_config = ConfigDict(extra='allow') assert _equals(str(signature(Model)), '(*, extra_data: str, extra_data_: str, **extra_data__: Any) -> None') def test_base_config_extra_allow_conflict_custom_signature(self): class 
Model(BaseModel): extra_data: int def __init__(self, extra_data: int = 1, **foobar: Any): super().__init__(extra_data=extra_data, **foobar) model_config = ConfigDict(extra='allow') assert _equals(str(signature(Model)), '(extra_data: int = 1, **foobar: Any) -> None') def test_base_config_private_attribute_intersection_with_extra_field(self): class Model(BaseModel): _foo = PrivateAttr('private_attribute') model_config = ConfigDict(extra='allow') assert set(Model.__private_attributes__) == {'_foo'} m = Model(_foo='field') assert m._foo == 'private_attribute' assert m.__dict__ == {} assert m.__pydantic_extra__ == {'_foo': 'field'} assert m.model_dump() == {'_foo': 'field'} m._foo = 'still_private' assert m._foo == 'still_private' assert m.__dict__ == {} assert m.__pydantic_extra__ == {'_foo': 'field'} assert m.model_dump() == {'_foo': 'field'} def test_base_config_parse_model_with_strict_config_disabled( self, BaseConfigModelWithStrictConfig: Type[BaseModel] ) -> None: class Model(BaseConfigModelWithStrictConfig): model_config = ConfigDict(strict=False) values = [ Model(a='1', b=2, c=3, d=4), Model(a=1, b=2, c='3', d=4), Model(a=1, b=2, c=3, d='4'), Model(a=1, b='2', c=3, d=4), Model(a=1, b=2, c=3, d=4), ] assert all(v.model_dump() == {'a': 1, 'b': 2, 'c': 3, 'd': 4} for v in values) def test_finite_float_config(self): class Model(BaseModel): a: float model_config = ConfigDict(allow_inf_nan=False) assert Model(a=42).a == 42 with pytest.raises(ValidationError) as exc_info: Model(a=float('nan')) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'finite_number', 'loc': ('a',), 'msg': 'Input should be a finite number', 'input': HasRepr('nan'), } ] @pytest.mark.parametrize( 'enabled,str_check,result_str_check', [ (True, ' 123 ', '123'), (True, ' 123\t\n', '123'), (False, ' 123 ', ' 123 '), ], ) def test_str_strip_whitespace(self, enabled, str_check, result_str_check): class Model(BaseModel): str_check: 
str model_config = ConfigDict(str_strip_whitespace=enabled) m = Model(str_check=str_check) assert m.str_check == result_str_check @pytest.mark.parametrize( 'enabled,str_check,result_str_check', [(True, 'ABCDefG', 'ABCDEFG'), (False, 'ABCDefG', 'ABCDefG')], ) def test_str_to_upper(self, enabled, str_check, result_str_check): class Model(BaseModel): str_check: str model_config = ConfigDict(str_to_upper=enabled) m = Model(str_check=str_check) assert m.str_check == result_str_check @pytest.mark.parametrize( 'enabled,str_check,result_str_check', [(True, 'ABCDefG', 'abcdefg'), (False, 'ABCDefG', 'ABCDefG')], ) def test_str_to_lower(self, enabled, str_check, result_str_check): class Model(BaseModel): str_check: str model_config = ConfigDict(str_to_lower=enabled) m = Model(str_check=str_check) assert m.str_check == result_str_check def test_namedtuple_arbitrary_type(self): class CustomClass: pass class Tup(NamedTuple): c: CustomClass class Model(BaseModel): x: Tup model_config = ConfigDict(arbitrary_types_allowed=True) data = {'x': Tup(c=CustomClass())} model = Model.model_validate(data) assert isinstance(model.x.c, CustomClass) with pytest.raises(PydanticSchemaGenerationError): class ModelNoArbitraryTypes(BaseModel): x: Tup @pytest.mark.parametrize( 'use_construct, populate_by_name_config, arg_name, expectation', [ [False, True, 'bar', does_not_raise()], [False, True, 'bar_', does_not_raise()], [False, False, 'bar', does_not_raise()], [False, False, 'bar_', pytest.raises(ValueError)], [True, True, 'bar', does_not_raise()], [True, True, 'bar_', does_not_raise()], [True, False, 'bar', does_not_raise()], [True, False, 'bar_', does_not_raise()], ], ) def test_populate_by_name_config( self, use_construct: bool, populate_by_name_config: bool, arg_name: str, expectation: ContextManager, ): expected_value: int = 7 class Foo(BaseModel): bar_: int = Field(alias='bar') model_config = dict(populate_by_name=populate_by_name_config) with expectation: if use_construct: f = 
Foo.model_construct(**{arg_name: expected_value}) else: f = Foo(**{arg_name: expected_value}) assert f.bar_ == expected_value def test_immutable_copy_with_frozen(self): class Model(BaseModel): a: int b: int model_config = ConfigDict(frozen=True) m = Model(a=40, b=10) assert m == m.model_copy() def test_config_class_is_deprecated(self): with pytest.warns(PydanticDeprecatedSince20) as all_warnings: class Config(BaseConfig): pass # typing-extensions swallows one of the warnings, so we need to support # both ways for now. assert len(all_warnings) in [1, 2] expected_warnings = [ 'Support for class-based `config` is deprecated, use ConfigDict instead', ] if len(all_warnings) == 2: expected_warnings.insert(0, 'BaseConfig is deprecated. Use the `pydantic.ConfigDict` instead') assert [w.message.message for w in all_warnings] == expected_warnings def test_config_class_attributes_are_deprecated(self): with pytest.warns(PydanticDeprecatedSince20) as all_warnings: assert BaseConfig.validate_assignment is False assert BaseConfig().validate_assignment is False class Config(BaseConfig): pass assert Config.validate_assignment is False assert Config().validate_assignment is False assert len(all_warnings) == 7 expected_warnings = { 'Support for class-based `config` is deprecated, use ConfigDict instead', 'BaseConfig is deprecated. 
Use the `pydantic.ConfigDict` instead', } assert set(w.message.message for w in all_warnings) <= expected_warnings @pytest.mark.filterwarnings('ignore:.* is deprecated.*:DeprecationWarning') def test_config_class_missing_attributes(self): with pytest.raises(AttributeError, match="type object 'BaseConfig' has no attribute 'missing_attribute'"): BaseConfig.missing_attribute with pytest.raises(AttributeError, match="'BaseConfig' object has no attribute 'missing_attribute'"): BaseConfig().missing_attribute class Config(BaseConfig): pass with pytest.raises(AttributeError, match="type object 'Config' has no attribute 'missing_attribute'"): Config.missing_attribute with pytest.raises(AttributeError, match="'Config' object has no attribute 'missing_attribute'"): Config().missing_attribute def test_config_key_deprecation(): config_dict = { 'allow_mutation': None, 'error_msg_templates': None, 'fields': None, 'getter_dict': None, 'schema_extra': None, 'smart_union': None, 'underscore_attrs_are_private': None, 'allow_population_by_field_name': None, 'anystr_lower': None, 'anystr_strip_whitespace': None, 'anystr_upper': None, 'keep_untouched': None, 'max_anystr_length': None, 'min_anystr_length': None, 'orm_mode': None, 'validate_all': None, } warning_message = """ Valid config keys have changed in V2: * 'allow_population_by_field_name' has been renamed to 'populate_by_name' * 'anystr_lower' has been renamed to 'str_to_lower' * 'anystr_strip_whitespace' has been renamed to 'str_strip_whitespace' * 'anystr_upper' has been renamed to 'str_to_upper' * 'keep_untouched' has been renamed to 'ignored_types' * 'max_anystr_length' has been renamed to 'str_max_length' * 'min_anystr_length' has been renamed to 'str_min_length' * 'orm_mode' has been renamed to 'from_attributes' * 'schema_extra' has been renamed to 'json_schema_extra' * 'validate_all' has been renamed to 'validate_default' * 'allow_mutation' has been removed * 'error_msg_templates' has been removed * 'fields' has been 
removed * 'getter_dict' has been removed * 'smart_union' has been removed * 'underscore_attrs_are_private' has been removed """.strip() with pytest.warns(UserWarning, match=re.escape(warning_message)): class MyModel(BaseModel): model_config = config_dict with pytest.warns(UserWarning, match=re.escape(warning_message)): create_model('MyCreatedModel', __config__=config_dict) with pytest.warns(UserWarning, match=re.escape(warning_message)): @pydantic_dataclass(config=config_dict) class MyDataclass: pass with pytest.warns(UserWarning, match=re.escape(warning_message)): @validate_call(config=config_dict) def my_function(): pass def test_invalid_extra(): extra_error = re.escape( "Input should be 'allow', 'forbid' or 'ignore'" " [type=literal_error, input_value='invalid-value', input_type=str]" ) config_dict = {'extra': 'invalid-value'} with pytest.raises(SchemaError, match=extra_error): class MyModel(BaseModel): model_config = config_dict with pytest.raises(SchemaError, match=extra_error): create_model('MyCreatedModel', __config__=config_dict) with pytest.raises(SchemaError, match=extra_error): @pydantic_dataclass(config=config_dict) class MyDataclass: pass def test_invalid_config_keys(): @validate_call(config={'alias_generator': lambda x: x}) def my_function(): pass def test_multiple_inheritance_config(): class Parent(BaseModel): model_config = ConfigDict(frozen=True, extra='forbid') class Mixin(BaseModel): model_config = ConfigDict(use_enum_values=True) class Child(Mixin, Parent): model_config = ConfigDict(populate_by_name=True) assert BaseModel.model_config.get('frozen') is None assert BaseModel.model_config.get('populate_by_name') is None assert BaseModel.model_config.get('extra') is None assert BaseModel.model_config.get('use_enum_values') is None assert Parent.model_config.get('frozen') is True assert Parent.model_config.get('populate_by_name') is None assert Parent.model_config.get('extra') == 'forbid' assert Parent.model_config.get('use_enum_values') is None 
assert Mixin.model_config.get('frozen') is None assert Mixin.model_config.get('populate_by_name') is None assert Mixin.model_config.get('extra') is None assert Mixin.model_config.get('use_enum_values') is True assert Child.model_config.get('frozen') is True assert Child.model_config.get('populate_by_name') is True assert Child.model_config.get('extra') == 'forbid' assert Child.model_config.get('use_enum_values') is True def test_config_wrapper_match(): localns = { '_GenerateSchema': GenerateSchema, 'GenerateSchema': GenerateSchema, 'JsonValue': JsonValue, 'FieldInfo': FieldInfo, 'ComputedFieldInfo': ComputedFieldInfo, } config_dict_annotations = [(k, str(v)) for k, v in get_type_hints(ConfigDict, localns=localns).items()] config_dict_annotations.sort() # remove config config_wrapper_annotations = [ (k, str(v)) for k, v in get_type_hints(ConfigWrapper, localns=localns).items() if k != 'config_dict' ] config_wrapper_annotations.sort() assert ( config_dict_annotations == config_wrapper_annotations ), 'ConfigDict and ConfigWrapper must have the same annotations (except ConfigWrapper.config_dict)' @pytest.mark.skipif(sys.version_info < (3, 11), reason='requires backport pre 3.11, fully tested in pydantic core') def test_config_validation_error_cause(): class Foo(BaseModel): foo: int @field_validator('foo') def check_foo(cls, v): assert v > 5, 'Must be greater than 5' # Should be disabled by default: with pytest.raises(ValidationError) as exc_info: Foo(foo=4) assert exc_info.value.__cause__ is None Foo.model_config = ConfigDict(validation_error_cause=True) Foo.model_rebuild(force=True) with pytest.raises(ValidationError) as exc_info: Foo(foo=4) # Confirm python error attached as a cause, and error location specified in a note: assert exc_info.value.__cause__ is not None assert isinstance(exc_info.value.__cause__, ExceptionGroup) # noqa: F821 assert len(exc_info.value.__cause__.exceptions) == 1 src_exc = exc_info.value.__cause__.exceptions[0] assert repr(src_exc) == 
"AssertionError('Must be greater than 5\\nassert 4 > 5')" assert len(src_exc.__notes__) == 1 assert src_exc.__notes__[0] == '\nPydantic: cause of loc: foo' def test_config_defaults_match(): localns = { '_GenerateSchema': GenerateSchema, 'GenerateSchema': GenerateSchema, 'FieldInfo': FieldInfo, 'ComputedFieldInfo': ComputedFieldInfo, } config_dict_keys = sorted(list(get_type_hints(ConfigDict, localns=localns).keys())) config_defaults_keys = sorted(list(config_defaults.keys())) assert config_dict_keys == config_defaults_keys, 'ConfigDict and config_defaults must have the same keys' def test_config_is_not_inherited_in_model_fields(): from typing import List from pydantic import BaseModel, ConfigDict class Inner(BaseModel): a: str class Outer(BaseModel): # this cause the inner model incorrectly dumpped: model_config = ConfigDict(str_to_lower=True) x: List[str] # should be converted to lower inner: Inner # should not have fields converted to lower m = Outer.model_validate(dict(x=['Abc'], inner=dict(a='Def'))) assert m.model_dump() == {'x': ['abc'], 'inner': {'a': 'Def'}} @pytest.mark.parametrize( 'config,input_str', ( ({}, 'type=string_type, input_value=123, input_type=int'), ({'hide_input_in_errors': False}, 'type=string_type, input_value=123, input_type=int'), ({'hide_input_in_errors': True}, 'type=string_type'), ), ) def test_hide_input_in_errors(config, input_str): class Model(BaseModel): x: str model_config = ConfigDict(**config) with pytest.raises(ValidationError, match=re.escape(f'Input should be a valid string [{input_str}]')): Model(x=123) parametrize_inf_nan_capable_type = pytest.mark.parametrize('inf_nan_capable_type', [float, Decimal]) parametrize_inf_nan_capable_value = pytest.mark.parametrize('inf_nan_value', ['Inf', 'NaN']) @parametrize_inf_nan_capable_value @parametrize_inf_nan_capable_type def test_config_inf_nan_enabled(inf_nan_capable_type, inf_nan_value): class Model(BaseModel): model_config = ConfigDict(allow_inf_nan=True) value: 
inf_nan_capable_type assert Model(value=inf_nan_capable_type(inf_nan_value)) @parametrize_inf_nan_capable_value @parametrize_inf_nan_capable_type def test_config_inf_nan_disabled(inf_nan_capable_type, inf_nan_value): class Model(BaseModel): model_config = ConfigDict(allow_inf_nan=False) value: inf_nan_capable_type with pytest.raises(ValidationError) as e: Model(value=inf_nan_capable_type(inf_nan_value)) assert e.value.errors(include_url=False)[0] == IsPartialDict( { 'loc': ('value',), 'msg': 'Input should be a finite number', 'type': 'finite_number', } ) @pytest.mark.parametrize( 'config,expected', ( (ConfigDict(), 'ConfigWrapper()'), (ConfigDict(title='test'), "ConfigWrapper(title='test')"), ), ) def test_config_wrapper_repr(config, expected): assert repr(ConfigWrapper(config=config)) == expected def test_config_wrapper_get_item(): config_wrapper = ConfigWrapper(config=ConfigDict(title='test')) assert config_wrapper.title == 'test' with pytest.raises(AttributeError, match="Config has no attribute 'test'"): config_wrapper.test def test_config_inheritance_with_annotations(): class Parent(BaseModel): model_config: ConfigDict = {'extra': 'allow'} class Child(Parent): model_config: ConfigDict = {'str_to_lower': True} assert Child.model_config == {'extra': 'allow', 'str_to_lower': True} def test_json_encoders_model() -> None: with pytest.warns(PydanticDeprecationWarning): class Model(BaseModel): model_config = ConfigDict(json_encoders={Decimal: lambda x: str(x * 2), int: lambda x: str(x * 3)}) value: Decimal x: int assert json.loads(Model(value=Decimal('1.1'), x=1).model_dump_json()) == {'value': '2.2', 'x': '3'} @pytest.mark.filterwarnings('ignore::pydantic.warnings.PydanticDeprecationWarning') def test_json_encoders_type_adapter() -> None: config = ConfigDict(json_encoders={Decimal: lambda x: str(x * 2), int: lambda x: str(x * 3)}) ta = TypeAdapter(int, config=config) assert json.loads(ta.dump_json(1)) == '3' ta = TypeAdapter(Decimal, config=config) assert 
json.loads(ta.dump_json(Decimal('1.1'))) == '2.2' ta = TypeAdapter(Union[Decimal, int], config=config) assert json.loads(ta.dump_json(Decimal('1.1'))) == '2.2' assert json.loads(ta.dump_json(1)) == '2' @pytest.mark.parametrize('defer_build', [True, False]) def test_config_model_defer_build(defer_build: bool, generate_schema_calls: CallCounter): config = ConfigDict(defer_build=defer_build) class MyModel(BaseModel): model_config = config x: int if defer_build: assert isinstance(MyModel.__pydantic_validator__, MockValSer) assert isinstance(MyModel.__pydantic_serializer__, MockValSer) assert generate_schema_calls.count == 0, 'Should respect defer_build' else: assert isinstance(MyModel.__pydantic_validator__, SchemaValidator) assert isinstance(MyModel.__pydantic_serializer__, SchemaSerializer) assert generate_schema_calls.count == 1, 'Should respect defer_build' m = MyModel(x=1) assert m.x == 1 assert m.model_dump()['x'] == 1 assert m.model_validate({'x': 2}).x == 2 assert m.model_json_schema()['type'] == 'object' assert isinstance(MyModel.__pydantic_validator__, SchemaValidator) assert isinstance(MyModel.__pydantic_serializer__, SchemaSerializer) assert generate_schema_calls.count == 1, 'Should not build duplicated core schemas' @pytest.mark.parametrize('defer_build', [True, False]) def test_config_dataclass_defer_build(defer_build: bool, generate_schema_calls: CallCounter) -> None: config = ConfigDict(defer_build=defer_build) @pydantic_dataclass(config=config) class MyDataclass: x: int if defer_build: assert isinstance(MyDataclass.__pydantic_validator__, MockValSer) assert isinstance(MyDataclass.__pydantic_serializer__, MockValSer) assert generate_schema_calls.count == 0, 'Should respect defer_build' else: assert isinstance(MyDataclass.__pydantic_validator__, SchemaValidator) assert isinstance(MyDataclass.__pydantic_serializer__, SchemaSerializer) assert generate_schema_calls.count == 1, 'Should respect defer_build' m = MyDataclass(x=1) assert m.x == 1 assert 
isinstance(MyDataclass.__pydantic_validator__, SchemaValidator) assert isinstance(MyDataclass.__pydantic_serializer__, SchemaSerializer) assert generate_schema_calls.count == 1, 'Should not build duplicated core schemas' def test_dataclass_defer_build_override_on_rebuild_dataclass(generate_schema_calls: CallCounter) -> None: config = ConfigDict(defer_build=True) @pydantic_dataclass(config=config) class MyDataclass: x: int assert isinstance(MyDataclass.__pydantic_validator__, MockValSer) assert isinstance(MyDataclass.__pydantic_serializer__, MockValSer) assert generate_schema_calls.count == 0, 'Should respect defer_build' rebuild_dataclass(MyDataclass, force=True) assert isinstance(MyDataclass.__pydantic_validator__, SchemaValidator) assert isinstance(MyDataclass.__pydantic_serializer__, SchemaSerializer) assert generate_schema_calls.count == 1, 'Should have called generate_schema once' @pytest.mark.parametrize('defer_build', [True, False]) def test_config_model_type_adapter_defer_build(defer_build: bool, generate_schema_calls: CallCounter): config = ConfigDict(defer_build=defer_build) class MyModel(BaseModel): model_config = config x: int assert generate_schema_calls.count == (0 if defer_build is True else 1) generate_schema_calls.reset() ta = TypeAdapter(MyModel) assert generate_schema_calls.count == 0, 'Should use model generated schema' assert ta.validate_python({'x': 1}).x == 1 assert ta.validate_python({'x': 2}).x == 2 assert ta.dump_python(MyModel.model_construct(x=1))['x'] == 1 assert ta.json_schema()['type'] == 'object' assert generate_schema_calls.count == (1 if defer_build is True else 0), 'Should not build duplicate core schemas' @pytest.mark.parametrize('defer_build', [True, False]) def test_config_plain_type_adapter_defer_build(defer_build: bool, generate_schema_calls: CallCounter): config = ConfigDict(defer_build=defer_build) ta = TypeAdapter(Dict[str, int], config=config) assert generate_schema_calls.count == (0 if defer_build else 1) 
generate_schema_calls.reset() assert ta.validate_python({}) == {} assert ta.validate_python({'x': 1}) == {'x': 1} assert ta.dump_python({'x': 2}) == {'x': 2} assert ta.json_schema()['type'] == 'object' assert generate_schema_calls.count == (1 if defer_build else 0), 'Should not build duplicate core schemas' @pytest.mark.parametrize('defer_build', [True, False]) def test_config_model_defer_build_nested(defer_build: bool, generate_schema_calls: CallCounter): config = ConfigDict(defer_build=defer_build) assert generate_schema_calls.count == 0 class MyNestedModel(BaseModel): model_config = config x: int class MyModel(BaseModel): y: MyNestedModel assert isinstance(MyModel.__pydantic_validator__, SchemaValidator) assert isinstance(MyModel.__pydantic_serializer__, SchemaSerializer) expected_schema_count = 1 if defer_build is True else 2 assert generate_schema_calls.count == expected_schema_count, 'Should respect defer_build' if defer_build: assert isinstance(MyNestedModel.__pydantic_validator__, MockValSer) assert isinstance(MyNestedModel.__pydantic_serializer__, MockValSer) else: assert isinstance(MyNestedModel.__pydantic_validator__, SchemaValidator) assert isinstance(MyNestedModel.__pydantic_serializer__, SchemaSerializer) m = MyModel(y={'x': 1}) assert m.y.x == 1 assert m.model_dump() == {'y': {'x': 1}} assert m.model_validate({'y': {'x': 1}}).y.x == 1 assert m.model_json_schema()['type'] == 'object' if defer_build: assert isinstance(MyNestedModel.__pydantic_validator__, MockValSer) assert isinstance(MyNestedModel.__pydantic_serializer__, MockValSer) else: assert isinstance(MyNestedModel.__pydantic_validator__, SchemaValidator) assert isinstance(MyNestedModel.__pydantic_serializer__, SchemaSerializer) assert generate_schema_calls.count == expected_schema_count, 'Should not build duplicated core schemas' def test_config_model_defer_build_ser_first(): class M1(BaseModel, defer_build=True): a: str class M2(BaseModel, defer_build=True): b: M1 m = M2.model_validate({'b': 
{'a': 'foo'}}) assert m.b.model_dump() == {'a': 'foo'} def test_defer_build_json_schema(): class M(BaseModel, defer_build=True): a: int assert M.model_json_schema() == { 'title': 'M', 'type': 'object', 'properties': {'a': {'title': 'A', 'type': 'integer'}}, 'required': ['a'], } def test_partial_creation_with_defer_build(): class M(BaseModel): a: int b: int def create_partial(model, optionals): override_fields = {} model.model_rebuild() for name, field in model.model_fields.items(): if field.is_required() and name in optionals: assert field.annotation is not None override_fields[name] = (Optional[field.annotation], FieldInfo.merge_field_infos(field, default=None)) return create_model(f'Partial{model.__name__}', __base__=model, **override_fields) partial = create_partial(M, {'a'}) # Comment this away and the last assertion works assert M.model_json_schema()['required'] == ['a', 'b'] # AssertionError: assert ['a', 'b'] == ['b'] assert partial.model_json_schema()['required'] == ['b'] def test_model_config_as_model_field_raises(): with pytest.raises(PydanticUserError) as exc_info: class MyModel(BaseModel): model_config: str assert exc_info.value.code == 'model-config-invalid-field-name' def test_dataclass_allows_model_config_as_model_field(): config_title = 'from_config' field_title = 'from_field' @pydantic_dataclass(config={'title': config_title}) class MyDataclass: model_config: dict m = MyDataclass(model_config={'title': field_title}) assert m.model_config['title'] == field_title assert m.__pydantic_config__['title'] == config_title def test_with_config_disallowed_with_model(): msg = 'Cannot use `with_config` on Model as it is a Pydantic model' with pytest.raises(PydanticUserError, match=msg): @with_config({'coerce_numbers_to_str': True}) class Model(BaseModel): pass def test_empty_config_with_annotations(): class Model(BaseModel): model_config: ConfigDict = {} assert Model.model_config == {} def test_generate_schema_deprecation_warning() -> None: with pytest.warns( 
PydanticDeprecatedSince210, match='The `schema_generator` setting has been deprecated since v2.10.' ): class Model(BaseModel): model_config = ConfigDict(schema_generator=GenerateSchema) pydantic-2.10.6/tests/test_construction.py000066400000000000000000000373511474456633400207510ustar00rootroot00000000000000import pickle from typing import Any, List, Optional import pytest from pydantic_core import PydanticUndefined, ValidationError from pydantic import AliasChoices, AliasPath, BaseModel, ConfigDict, Field, PrivateAttr, PydanticDeprecatedSince20 class Model(BaseModel): a: float b: int = 10 def test_simple_construct(): m = Model.model_construct(a=3.14) assert m.a == 3.14 assert m.b == 10 assert m.model_fields_set == {'a'} assert m.model_dump() == {'a': 3.14, 'b': 10} def test_construct_misuse(): m = Model.model_construct(b='foobar') assert m.b == 'foobar' with pytest.warns(UserWarning, match='Expected `int` but got `str`'): assert m.model_dump() == {'b': 'foobar'} with pytest.raises(AttributeError, match="'Model' object has no attribute 'a'"): print(m.a) def test_construct_fields_set(): m = Model.model_construct(a=3.0, b=-1, _fields_set={'a'}) assert m.a == 3 assert m.b == -1 assert m.model_fields_set == {'a'} assert m.model_dump() == {'a': 3, 'b': -1} def test_construct_allow_extra(): """model_construct() should allow extra fields only in the case of extra='allow'""" class Foo(BaseModel, extra='allow'): x: int model = Foo.model_construct(x=1, y=2) assert model.x == 1 assert model.y == 2 @pytest.mark.parametrize('extra', ['ignore', 'forbid']) def test_construct_ignore_extra(extra: str) -> None: """model_construct() should ignore extra fields only in the case of extra='ignore' or extra='forbid'""" class Foo(BaseModel, extra=extra): x: int model = Foo.model_construct(x=1, y=2) assert model.x == 1 assert model.__pydantic_extra__ is None assert 'y' not in model.__dict__ def test_construct_keep_order(): class Foo(BaseModel): a: int b: int = 42 c: float instance = Foo(a=1, 
b=321, c=3.14) instance_construct = Foo.model_construct(**instance.model_dump()) assert instance == instance_construct assert instance.model_dump() == instance_construct.model_dump() assert instance.model_dump_json() == instance_construct.model_dump_json() def test_construct_with_aliases(): class MyModel(BaseModel): x: int = Field(alias='x_alias') my_model = MyModel.model_construct(x_alias=1) assert my_model.x == 1 assert my_model.model_fields_set == {'x'} assert my_model.model_dump() == {'x': 1} def test_construct_with_validation_aliases(): class MyModel(BaseModel): x: int = Field(validation_alias='x_alias') my_model = MyModel.model_construct(x_alias=1) assert my_model.x == 1 assert my_model.model_fields_set == {'x'} assert my_model.model_dump() == {'x': 1} def test_large_any_str(): class Model(BaseModel): a: bytes b: str content_bytes = b'x' * (2**16 + 1) content_str = 'x' * (2**16 + 1) m = Model(a=content_bytes, b=content_str) assert m.a == content_bytes assert m.b == content_str def deprecated_copy(m: BaseModel, *, include=None, exclude=None, update=None, deep=False): """ This should only be used to make calls to the deprecated `copy` method with arguments that have been removed from `model_copy`. Otherwise, use the `copy_method` fixture below """ with pytest.warns( PydanticDeprecatedSince20, match=( 'The `copy` method is deprecated; use `model_copy` instead. ' 'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.' ), ): return m.copy(include=include, exclude=exclude, update=update, deep=deep) @pytest.fixture(params=['copy', 'model_copy']) def copy_method(request): """ Fixture to test both the old/deprecated `copy` and new `model_copy` methods. 
""" if request.param == 'copy': return deprecated_copy else: def new_copy_method(m, *, update=None, deep=False): return m.model_copy(update=update, deep=deep) return new_copy_method def test_simple_copy(copy_method): m = Model(a=24) m2 = copy_method(m) assert m.a == m2.a == 24 assert m.b == m2.b == 10 assert m == m2 assert m.model_fields == m2.model_fields @pytest.fixture(scope='session', name='ModelTwo') def model_two_fixture(): class ModelTwo(BaseModel): _foo_ = PrivateAttr({'private'}) a: float b: int = 10 c: str = 'foobar' d: Model return ModelTwo def test_deep_copy(ModelTwo, copy_method): m = ModelTwo(a=24, d=Model(a='12')) m._foo_ = {'new value'} m2 = copy_method(m, deep=True) assert m.a == m2.a == 24 assert m.b == m2.b == 10 assert m.c == m2.c == 'foobar' assert m.d is not m2.d assert m == m2 assert m.model_fields == m2.model_fields assert m._foo_ == m2._foo_ assert m._foo_ is not m2._foo_ def test_copy_exclude(ModelTwo): m = ModelTwo(a=24, d=Model(a='12')) m2 = deprecated_copy(m, exclude={'b'}) assert m.a == m2.a == 24 assert isinstance(m2.d, Model) assert m2.d.a == 12 assert hasattr(m2, 'c') assert not hasattr(m2, 'b') assert set(m.model_dump().keys()) == {'a', 'b', 'c', 'd'} assert set(m2.model_dump().keys()) == {'a', 'c', 'd'} assert m != m2 def test_copy_include(ModelTwo): m = ModelTwo(a=24, d=Model(a='12')) m2 = deprecated_copy(m, include={'a'}) assert m.a == m2.a == 24 assert set(m.model_dump().keys()) == {'a', 'b', 'c', 'd'} assert set(m2.model_dump().keys()) == {'a'} assert m != m2 def test_copy_include_exclude(ModelTwo): m = ModelTwo(a=24, d=Model(a='12')) m2 = deprecated_copy(m, include={'a', 'b', 'c'}, exclude={'c'}) assert set(m.model_dump().keys()) == {'a', 'b', 'c', 'd'} assert set(m2.model_dump().keys()) == {'a', 'b'} def test_copy_advanced_exclude(): class SubSubModel(BaseModel): a: str b: str class SubModel(BaseModel): c: str d: List[SubSubModel] class Model(BaseModel): e: str f: SubModel m = Model(e='e', f=SubModel(c='foo', 
d=[SubSubModel(a='a', b='b'), SubSubModel(a='c', b='e')])) m2 = deprecated_copy(m, exclude={'f': {'c': ..., 'd': {-1: {'a'}}}}) assert hasattr(m.f, 'c') assert not hasattr(m2.f, 'c') assert m2.model_dump() == {'e': 'e', 'f': {'d': [{'a': 'a', 'b': 'b'}, {'b': 'e'}]}} m2 = deprecated_copy(m, exclude={'e': ..., 'f': {'d'}}) assert m2.model_dump() == {'f': {'c': 'foo'}} def test_copy_advanced_include(): class SubSubModel(BaseModel): a: str b: str class SubModel(BaseModel): c: str d: List[SubSubModel] class Model(BaseModel): e: str f: SubModel m = Model(e='e', f=SubModel(c='foo', d=[SubSubModel(a='a', b='b'), SubSubModel(a='c', b='e')])) m2 = deprecated_copy(m, include={'f': {'c'}}) assert hasattr(m.f, 'c') assert hasattr(m2.f, 'c') assert m2.model_dump() == {'f': {'c': 'foo'}} m2 = deprecated_copy(m, include={'e': ..., 'f': {'d': {-1}}}) assert m2.model_dump() == {'e': 'e', 'f': {'d': [{'a': 'c', 'b': 'e'}]}} def test_copy_advanced_include_exclude(): class SubSubModel(BaseModel): a: str b: str class SubModel(BaseModel): c: str d: List[SubSubModel] class Model(BaseModel): e: str f: SubModel m = Model(e='e', f=SubModel(c='foo', d=[SubSubModel(a='a', b='b'), SubSubModel(a='c', b='e')])) m2 = deprecated_copy(m, include={'e': ..., 'f': {'d'}}, exclude={'e': ..., 'f': {'d': {0}}}) assert m2.model_dump() == {'f': {'d': [{'a': 'c', 'b': 'e'}]}} def test_copy_update(ModelTwo, copy_method): m = ModelTwo(a=24, d=Model(a='12')) m2 = copy_method(m, update={'a': 'different'}) assert m.a == 24 assert m2.a == 'different' m_keys = m.model_dump().keys() with pytest.warns(UserWarning, match='Expected `float` but got `str`'): m2_keys = m2.model_dump().keys() assert set(m_keys) == set(m2_keys) == {'a', 'b', 'c', 'd'} assert m != m2 def test_copy_update_unset(copy_method): class Foo(BaseModel): foo: Optional[str] = None bar: Optional[str] = None assert ( copy_method(Foo(foo='hello'), update={'bar': 'world'}).model_dump_json(exclude_unset=True) == '{"foo":"hello","bar":"world"}' ) class 
ExtraModel(BaseModel, extra='allow'): pass def test_copy_deep_extra(copy_method): class Foo(BaseModel, extra='allow'): pass m = Foo(extra=[]) assert copy_method(m).extra is m.extra assert copy_method(m, deep=True).extra == m.extra assert copy_method(m, deep=True).extra is not m.extra def test_copy_set_fields(ModelTwo, copy_method): m = ModelTwo(a=24, d=Model(a='12')) m2 = copy_method(m) assert m.model_dump(exclude_unset=True) == {'a': 24.0, 'd': {'a': 12}} assert m.model_dump(exclude_unset=True) == m2.model_dump(exclude_unset=True) def test_simple_pickle(): m = Model(a='24') b = pickle.dumps(m) m2 = pickle.loads(b) assert m.a == m2.a == 24 assert m.b == m2.b == 10 assert m == m2 assert m is not m2 assert tuple(m) == (('a', 24.0), ('b', 10)) assert tuple(m2) == (('a', 24.0), ('b', 10)) assert m.model_fields == m2.model_fields def test_recursive_pickle(create_module): @create_module def module(): from pydantic import BaseModel, PrivateAttr class PickleModel(BaseModel): a: float b: int = 10 class PickleModelTwo(BaseModel): _foo_ = PrivateAttr({'private'}) a: float b: int = 10 c: str = 'foobar' d: PickleModel m = module.PickleModelTwo(a=24, d=module.PickleModel(a='123.45')) m2 = pickle.loads(pickle.dumps(m)) assert m == m2 assert m.d.a == 123.45 assert m2.d.a == 123.45 assert m.model_fields == m2.model_fields assert m._foo_ == m2._foo_ def test_pickle_undefined(create_module): @create_module def module(): from pydantic import BaseModel, PrivateAttr class PickleModel(BaseModel): a: float b: int = 10 class PickleModelTwo(BaseModel): _foo_ = PrivateAttr({'private'}) a: float b: int = 10 c: str = 'foobar' d: PickleModel m = module.PickleModelTwo(a=24, d=module.PickleModel(a='123.45')) m2 = pickle.loads(pickle.dumps(m)) assert m2._foo_ == {'private'} m._foo_ = PydanticUndefined m3 = pickle.loads(pickle.dumps(m)) assert not hasattr(m3, '_foo_') def test_copy_undefined(ModelTwo, copy_method): m = ModelTwo(a=24, d=Model(a='123.45')) m2 = copy_method(m) assert m2._foo_ == 
{'private'} m._foo_ = PydanticUndefined m3 = copy_method(m) assert not hasattr(m3, '_foo_') def test_immutable_copy_with_frozen(copy_method): class Model(BaseModel): model_config = ConfigDict(frozen=True) a: int b: int m = Model(a=40, b=10) assert m == copy_method(m) assert repr(m) == 'Model(a=40, b=10)' m2 = copy_method(m, update={'b': 12}) assert repr(m2) == 'Model(a=40, b=12)' with pytest.raises(ValidationError): m2.b = 13 def test_pickle_fields_set(): m = Model(a=24) assert m.model_dump(exclude_unset=True) == {'a': 24} m2 = pickle.loads(pickle.dumps(m)) assert m2.model_dump(exclude_unset=True) == {'a': 24} def test_pickle_preserves_extra(): m = ExtraModel(a=24) assert m.model_extra == {'a': 24} m2 = pickle.loads(pickle.dumps(m)) assert m2.model_extra == {'a': 24} def test_copy_update_exclude(): class SubModel(BaseModel): a: str b: str class Model(BaseModel): c: str d: SubModel m = Model(c='ex', d=dict(a='ax', b='bx')) assert m.model_dump() == {'c': 'ex', 'd': {'a': 'ax', 'b': 'bx'}} assert deprecated_copy(m, exclude={'c'}).model_dump() == {'d': {'a': 'ax', 'b': 'bx'}} with pytest.warns(UserWarning, match='Expected `str` but got `int`'): assert deprecated_copy(m, exclude={'c'}, update={'c': 42}).model_dump() == { 'c': 42, 'd': {'a': 'ax', 'b': 'bx'}, } with pytest.warns( PydanticDeprecatedSince20, match='The private method `_calculate_keys` will be removed and should no longer be used.', ): assert m._calculate_keys(exclude={'x': ...}, include=None, exclude_unset=False) == {'c', 'd'} assert m._calculate_keys(exclude={'x': ...}, include=None, exclude_unset=False, update={'c': 42}) == {'d'} def test_shallow_copy_modify(copy_method): class X(BaseModel): val: int deep: Any x = X(val=1, deep={'deep_thing': [1, 2]}) y = copy_method(x) y.val = 2 y.deep['deep_thing'].append(3) assert x.val == 1 assert y.val == 2 # deep['deep_thing'] gets modified assert x.deep['deep_thing'] == [1, 2, 3] assert y.deep['deep_thing'] == [1, 2, 3] def test_construct_default_factory(): class 
Model(BaseModel):
        foo: List[int] = Field(default_factory=list)
        bar: str = 'Baz'

    m = Model.model_construct()
    assert m.foo == []
    assert m.bar == 'Baz'


def test_copy_with_excluded_fields():
    class User(BaseModel):
        name: str
        age: int
        dob: str

    user = User(name='test_user', age=23, dob='01/01/2000')
    user_copy = deprecated_copy(user, exclude={'dob': ...})
    assert 'dob' in user.model_fields_set
    assert 'dob' not in user_copy.model_fields_set


def test_dunder_copy(ModelTwo):
    m = ModelTwo(a=24, d=Model(a='12'))
    m2 = m.__copy__()

    assert m is not m2
    assert m.a == m2.a == 24
    assert isinstance(m2.d, Model)
    assert m.d is m2.d
    assert m.d.a == m2.d.a == 12

    m.a = 12
    assert m.a != m2.a


def test_dunder_deepcopy(ModelTwo):
    m = ModelTwo(a=24, d=Model(a='12'))
    m2 = m.__deepcopy__()

    assert m is not m2
    assert m.a == m2.a == 24
    assert isinstance(m2.d, Model)
    assert m.d is not m2.d
    assert m.d.a == m2.d.a == 12

    m.a = 12
    assert m.a != m2.a


def test_model_copy(ModelTwo):
    m = ModelTwo(a=24, d=Model(a='12'))
    m2 = m.model_copy()

    assert m is not m2
    assert m.a == m2.a == 24
    assert isinstance(m2.d, Model)
    assert m.d is m2.d
    assert m.d.a == m2.d.a == 12

    m.a = 12
    assert m.a != m2.a


def test_pydantic_extra():
    class Model(BaseModel):
        model_config = dict(extra='allow')

        x: int

    m = Model.model_construct(x=1, y=2)
    assert m.__pydantic_extra__ == {'y': 2}


def test_retain_order_of_fields():
    class MyModel(BaseModel):
        a: str = 'a'
        b: str

    m = MyModel.model_construct(b='b')

    assert m.model_dump_json() == '{"a":"a","b":"b"}'


def test_initialize_with_private_attr():
    class MyModel(BaseModel):
        _a: str

    m = MyModel.model_construct(_a='a')
    assert m._a == 'a'
    assert '_a' in m.__pydantic_private__


def test_model_construct_with_alias_choices() -> None:
    class MyModel(BaseModel):
        a: str = Field(validation_alias=AliasChoices('aaa', 'AAA'))

    assert MyModel.model_construct(a='a_value').a == 'a_value'
    assert MyModel.model_construct(aaa='a_value').a == 'a_value'
    assert MyModel.model_construct(AAA='a_value').a == 'a_value'


def test_model_construct_with_alias_path() -> None:
    class MyModel(BaseModel):
        a: str = Field(validation_alias=AliasPath('aaa', 'AAA'))

    assert MyModel.model_construct(a='a_value').a == 'a_value'
    assert MyModel.model_construct(aaa={'AAA': 'a_value'}).a == 'a_value'


def test_model_construct_with_alias_choices_and_path() -> None:
    class MyModel(BaseModel):
        a: str = Field(validation_alias=AliasChoices('aaa', AliasPath('AAA', 'aaa')))

    assert MyModel.model_construct(a='a_value').a == 'a_value'
    assert MyModel.model_construct(aaa='a_value').a == 'a_value'
    assert MyModel.model_construct(AAA={'aaa': 'a_value'}).a == 'a_value'


pydantic-2.10.6/tests/test_create_model.py

import platform
import re
from typing import Generic, Optional, Tuple, TypeVar

import pytest
from typing_extensions import Annotated

from pydantic import (
    BaseModel,
    ConfigDict,
    Field,
    PrivateAttr,
    PydanticDeprecatedSince20,
    PydanticUserError,
    ValidationError,
    create_model,
    errors,
    field_validator,
    validator,
)
from pydantic.fields import ModelPrivateAttr


def test_create_model():
    model = create_model('FooModel', foo=(str, ...), bar=(int, 123))
    assert issubclass(model, BaseModel)
    assert model.model_config == BaseModel.model_config
    assert model.__name__ == 'FooModel'
    assert model.model_fields.keys() == {'foo', 'bar'}
    assert not model.__pydantic_decorators__.validators
    assert not model.__pydantic_decorators__.root_validators
    assert not model.__pydantic_decorators__.field_validators
    assert not model.__pydantic_decorators__.field_serializers
    assert model.__module__ == 'tests.test_create_model'


def test_create_model_usage():
    model = create_model('FooModel', foo=(str, ...), bar=(int, 123))
    m = model(foo='hello')
    assert m.foo == 'hello'
    assert m.bar == 123
    with pytest.raises(ValidationError):
        model()
    with pytest.raises(ValidationError):
        model(foo='hello', bar='xxx')


def test_create_model_pickle(create_module):
    """
    Pickling works for a dynamically created model only if it is defined globally under its
    class name and the module where it is defined is specified via `__module__`.
    """

    @create_module
    def module():
        import pickle

        from pydantic import create_model

        FooModel = create_model('FooModel', foo=(str, ...), bar=(int, 123), __module__=__name__)

        m = FooModel(foo='hello')
        d = pickle.dumps(m)
        m2 = pickle.loads(d)
        assert m2.foo == m.foo == 'hello'
        assert m2.bar == m.bar == 123
        assert m2 == m
        assert m2 is not m


def test_create_model_multi_inheritance():
    class Mixin:
        pass

    Generic_T = Generic[TypeVar('T')]
    FooModel = create_model('FooModel', value=(int, ...), __base__=(BaseModel, Generic_T))

    assert FooModel.__orig_bases__ == (BaseModel, Generic_T)


def test_create_model_must_not_reset_parent_namespace():
    # It's important to use the annotation `'namespace'` as this is a particular string that is present
    # in the parent namespace if you reset the parent namespace in the call to `create_model`.
    AbcModel = create_model('AbcModel', abc=('namespace', None))
    with pytest.raises(
        PydanticUserError,
        match=re.escape(
            '`AbcModel` is not fully defined; you should define `namespace`, then call `AbcModel.model_rebuild()`.'
), ): AbcModel(abc=1) # Rebuild the model now that `namespace` is defined namespace = int # noqa F841 AbcModel.model_rebuild() assert AbcModel(abc=1).abc == 1 with pytest.raises(ValidationError) as exc_info: AbcModel(abc='a') # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('abc',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'a', } ] def test_invalid_name(): with pytest.warns(RuntimeWarning): model = create_model('FooModel', _foo=(str, ...)) assert len(model.model_fields) == 0 def test_field_wrong_tuple(): with pytest.raises(errors.PydanticUserError): create_model('FooModel', foo=(1, 2, 3)) def test_config_and_base(): with pytest.raises(errors.PydanticUserError): create_model('FooModel', __config__=BaseModel.model_config, __base__=BaseModel) def test_inheritance(): class BarModel(BaseModel): x: int = 1 y: int = 2 model = create_model('FooModel', foo=(str, ...), bar=(int, 123), __base__=BarModel) assert model.model_fields.keys() == {'foo', 'bar', 'x', 'y'} m = model(foo='a', x=4) assert m.model_dump() == {'bar': 123, 'foo': 'a', 'x': 4, 'y': 2} # bases as a tuple model = create_model('FooModel', foo=(str, ...), bar=(int, 123), __base__=(BarModel,)) assert model.model_fields.keys() == {'foo', 'bar', 'x', 'y'} m = model(foo='a', x=4) assert m.model_dump() == {'bar': 123, 'foo': 'a', 'x': 4, 'y': 2} def test_custom_config(): config = ConfigDict(frozen=True) expected_config = BaseModel.model_config.copy() expected_config['frozen'] = True model = create_model('FooModel', foo=(int, ...), __config__=config) m = model(**{'foo': '987'}) assert m.foo == 987 assert model.model_config == expected_config with pytest.raises(ValidationError): m.foo = 654 def test_custom_config_inherits(): class Config(ConfigDict): custom_config: bool config = Config(custom_config=True, validate_assignment=True) expected_config = Config(BaseModel.model_config) 
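The inheritance tests above exercise `create_model` with a `__base__` class whose fields are merged with the dynamically declared ones. A minimal standalone sketch of that pattern (the model names `BarModel`/`FooModel` are illustrative, mirroring the tests):

```python
from pydantic import BaseModel, create_model

class BarModel(BaseModel):
    x: int = 1

# fields from the dynamic definition are merged with those of the base class
FooModel = create_model('FooModel', foo=(str, ...), __base__=BarModel)

m = FooModel(foo='a')
assert m.model_dump() == {'x': 1, 'foo': 'a'}
```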
expected_config.update(config) model = create_model('FooModel', foo=(int, ...), __config__=config) m = model(**{'foo': '987'}) assert m.foo == 987 assert model.model_config == expected_config with pytest.raises(ValidationError): m.foo = ['123'] def test_custom_config_extras(): config = ConfigDict(extra='forbid') model = create_model('FooModel', foo=(int, ...), __config__=config) assert model(foo=654) with pytest.raises(ValidationError): model(bar=654) def test_inheritance_validators(): class BarModel(BaseModel): @field_validator('a', check_fields=False) @classmethod def check_a(cls, v): if 'foobar' not in v: raise ValueError('"foobar" not found in a') return v model = create_model('FooModel', a=(str, 'cake'), __base__=BarModel) assert model().a == 'cake' assert model(a='this is foobar good').a == 'this is foobar good' with pytest.raises(ValidationError): model(a='something else') def test_inheritance_validators_always(): class BarModel(BaseModel): @field_validator('a', check_fields=False) @classmethod def check_a(cls, v): if 'foobar' not in v: raise ValueError('"foobar" not found in a') return v model = create_model('FooModel', a=(str, Field('cake', validate_default=True)), __base__=BarModel) with pytest.raises(ValidationError): model() assert model(a='this is foobar good').a == 'this is foobar good' with pytest.raises(ValidationError): model(a='something else') def test_inheritance_validators_all(): with pytest.warns(PydanticDeprecatedSince20, match='Pydantic V1 style `@validator` validators are deprecated'): class BarModel(BaseModel): @validator('*') @classmethod def check_all(cls, v): return v * 2 model = create_model('FooModel', a=(int, ...), b=(int, ...), __base__=BarModel) assert model(a=2, b=6).model_dump() == {'a': 4, 'b': 12} def test_funky_name(): model = create_model('FooModel', **{'this-is-funky': (int, ...)}) m = model(**{'this-is-funky': '123'}) assert m.model_dump() == {'this-is-funky': 123} with pytest.raises(ValidationError) as exc_info: model() 
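The custom-config tests rely on `__config__` being applied to the dynamic model; a self-contained sketch of the frozen case from `test_custom_config` (names are illustrative):

```python
from pydantic import ConfigDict, ValidationError, create_model

# a frozen config makes instances of the dynamic model immutable
FrozenModel = create_model('FrozenModel', foo=(int, ...), __config__=ConfigDict(frozen=True))

m = FrozenModel(foo='987')  # lax mode still coerces the numeric string on creation
assert m.foo == 987

try:
    m.foo = 654
    raised = False
except ValidationError:
    raised = True
assert raised  # assignment to a frozen instance fails validation
```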
assert exc_info.value.errors(include_url=False) == [ {'input': {}, 'loc': ('this-is-funky',), 'msg': 'Field required', 'type': 'missing'} ] def test_repeat_base_usage(): class Model(BaseModel): a: str assert Model.model_fields.keys() == {'a'} model = create_model('FooModel', b=(int, 1), __base__=Model) assert Model.model_fields.keys() == {'a'} assert model.model_fields.keys() == {'a', 'b'} model2 = create_model('Foo2Model', c=(int, 1), __base__=Model) assert Model.model_fields.keys() == {'a'} assert model.model_fields.keys() == {'a', 'b'} assert model2.model_fields.keys() == {'a', 'c'} model3 = create_model('Foo2Model', d=(int, 1), __base__=model) assert Model.model_fields.keys() == {'a'} assert model.model_fields.keys() == {'a', 'b'} assert model2.model_fields.keys() == {'a', 'c'} assert model3.model_fields.keys() == {'a', 'b', 'd'} def test_dynamic_and_static(): class A(BaseModel): x: int y: float z: str DynamicA = create_model('A', x=(int, ...), y=(float, ...), z=(str, ...)) for field_name in ('x', 'y', 'z'): assert A.model_fields[field_name].default == DynamicA.model_fields[field_name].default def test_create_model_field_and_model_title(): m = create_model('M', __config__=ConfigDict(title='abc'), a=(str, Field(title='field-title'))) assert m.model_json_schema() == { 'properties': {'a': {'title': 'field-title', 'type': 'string'}}, 'required': ['a'], 'title': 'abc', 'type': 'object', } def test_create_model_field_description(): m = create_model('M', a=(str, Field(description='descr')), __doc__='Some doc') assert m.model_json_schema() == { 'properties': {'a': {'description': 'descr', 'title': 'A', 'type': 'string'}}, 'required': ['a'], 'title': 'M', 'type': 'object', 'description': 'Some doc', } def test_create_model_with_doc(): model = create_model('FooModel', foo=(str, ...), bar=(int, 123), __doc__='The Foo model') assert model.__name__ == 'FooModel' assert model.__doc__ == 'The Foo model' @pytest.mark.parametrize('base', [ModelPrivateAttr, object]) 
@pytest.mark.parametrize('use_annotation', [True, False]) def test_private_descriptors(base, use_annotation): set_name_calls = [] get_calls = [] set_calls = [] delete_calls = [] class MyDescriptor(base): def __init__(self, fn): super().__init__() self.fn = fn self.name = '' def __set_name__(self, owner, name): set_name_calls.append((owner, name)) self.name = name def __get__(self, obj, type=None): get_calls.append((obj, type)) return self.fn(obj) if obj else self def __set__(self, obj, value): set_calls.append((obj, value)) self.fn = lambda obj: value def __delete__(self, obj): delete_calls.append(obj) def fail(obj): # I have purposely not used the exact formatting you'd get if the attribute wasn't defined, # to make it clear this function is being called, while also having sensible behavior raise AttributeError(f'{self.name!r} is not defined on {obj!r}') self.fn = fail class A(BaseModel): x: int if use_annotation: _some_func: MyDescriptor = MyDescriptor(lambda self: self.x) else: _some_func = MyDescriptor(lambda self: self.x) @property def _double_x(self): return self.x * 2 assert set(A.__private_attributes__) == {'_some_func'} assert set_name_calls == [(A, '_some_func')] a = A(x=2) assert a._double_x == 4 # Ensure properties with leading underscores work fine and don't become private attributes assert get_calls == [] assert a._some_func == 2 assert get_calls == [(a, A)] assert set_calls == [] a._some_func = 3 assert set_calls == [(a, 3)] assert a._some_func == 3 assert get_calls == [(a, A), (a, A)] assert delete_calls == [] del a._some_func assert delete_calls == [a] with pytest.raises(AttributeError, match=r"'_some_func' is not defined on A\(x=2\)"): a._some_func assert get_calls == [(a, A), (a, A), (a, A)] def test_private_attr_set_name(): class SetNameInt(int): _owner_attr_name: Optional[str] = None def __set_name__(self, owner, name): self._owner_attr_name = f'{owner.__name__}.{name}' _private_attr_default = SetNameInt(1) class Model(BaseModel): 
_private_attr_1: int = PrivateAttr(default=_private_attr_default) _private_attr_2: SetNameInt = SetNameInt(2) assert _private_attr_default._owner_attr_name == 'Model._private_attr_1' m = Model() assert m._private_attr_1 == 1 assert m._private_attr_1._owner_attr_name == 'Model._private_attr_1' assert m._private_attr_2 == 2 assert m._private_attr_2._owner_attr_name == 'Model._private_attr_2' def test_private_attr_default_descriptor_attribute_error(): class SetNameInt(int): def __get__(self, obj, cls): return self _private_attr_default = SetNameInt(1) class Model(BaseModel): _private_attr: int = PrivateAttr(default=_private_attr_default) assert Model.__private_attributes__['_private_attr'].__get__(None, Model) == _private_attr_default with pytest.raises(AttributeError, match="'ModelPrivateAttr' object has no attribute 'some_attr'"): Model.__private_attributes__['_private_attr'].some_attr def test_private_attr_set_name_do_not_crash_if_not_callable(): class SetNameInt(int): __set_name__ = None _private_attr_default = SetNameInt(2) class Model(BaseModel): _private_attr: int = PrivateAttr(default=_private_attr_default) # Checks below are just to ensure that everything is the same as in `test_private_attr_set_name` # The main check is that model class definition above doesn't crash assert Model()._private_attr == 2 def test_del_model_attr(): class Model(BaseModel): some_field: str m = Model(some_field='value') assert hasattr(m, 'some_field') del m.some_field assert not hasattr(m, 'some_field') @pytest.mark.skipif( platform.python_implementation() == 'PyPy', reason='In this single case `del` behaves weird on pypy', ) def test_del_model_attr_error(): class Model(BaseModel): some_field: str m = Model(some_field='value') assert not hasattr(m, 'other_field') with pytest.raises(AttributeError, match='other_field'): del m.other_field def test_del_model_attr_with_privat_attrs(): class Model(BaseModel): _private_attr: int = PrivateAttr(default=1) some_field: str m = 
Model(some_field='value') assert hasattr(m, 'some_field') del m.some_field assert not hasattr(m, 'some_field') @pytest.mark.skipif( platform.python_implementation() == 'PyPy', reason='In this single case `del` behaves weird on pypy', ) def test_del_model_attr_with_privat_attrs_error(): class Model(BaseModel): _private_attr: int = PrivateAttr(default=1) some_field: str m = Model(some_field='value') assert not hasattr(m, 'other_field') with pytest.raises(AttributeError, match="'Model' object has no attribute 'other_field'"): del m.other_field def test_del_model_attr_with_privat_attrs_twice_error(): class Model(BaseModel): _private_attr: int = 1 some_field: str m = Model(some_field='value') assert hasattr(m, '_private_attr') del m._private_attr with pytest.raises(AttributeError, match="'Model' object has no attribute '_private_attr'"): del m._private_attr def test_create_model_with_slots(): field_definitions = {'__slots__': (Optional[Tuple[str, ...]], None), 'foobar': (Optional[int], None)} with pytest.warns(RuntimeWarning, match='__slots__ should not be passed to create_model'): model = create_model('PartialPet', **field_definitions) assert model.model_fields.keys() == {'foobar'} def test_create_model_non_annotated(): with pytest.raises( TypeError, match='A non-annotated attribute was detected: `bar = 123`. 
All model fields require a type annotation', ): create_model('FooModel', foo=(str, ...), bar=123) @pytest.mark.parametrize( 'annotation_type,field_info', [ (bool, Field(alias='foo_bool_alias', description='foo boolean')), (str, Field(alias='foo_str_alis', description='foo string')), ], ) def test_create_model_typing_annotated_field_info(annotation_type, field_info): annotated_foo = Annotated[annotation_type, field_info] model = create_model('FooModel', foo=annotated_foo, bar=(int, 123)) assert model.model_fields.keys() == {'foo', 'bar'} foo = model.model_fields.get('foo') assert foo is not None assert foo.annotation == annotation_type assert foo.alias == field_info.alias assert foo.description == field_info.description def test_create_model_expect_field_info_as_metadata_typing(): annotated_foo = Annotated[int, 10] with pytest.raises(PydanticUserError, match=r'Field definitions should be a Annotated\[, \]'): create_model('FooModel', foo=annotated_foo) def test_create_model_tuple(): model = create_model('FooModel', foo=(Tuple[int, int], (1, 2))) assert model().foo == (1, 2) assert model(foo=(3, 4)).foo == (3, 4) def test_create_model_tuple_3(): with pytest.raises(PydanticUserError, match=r'^Field definitions should be a `\(, \)`\.\n'): create_model('FooModel', foo=(Tuple[int, int], (1, 2), 'more')) def test_create_model_protected_namespace_default(): with pytest.warns( UserWarning, match='Field "model_dump_something" in Model has conflict with protected namespace "model_dump"' ): create_model('Model', model_dump_something=(str, ...)) def test_create_model_custom_protected_namespace(): with pytest.warns(UserWarning, match='Field "test_field" in Model has conflict with protected namespace "test_"'): create_model( 'Model', __config__=ConfigDict(protected_namespaces=('test_',)), model_prefixed_field=(str, ...), test_field=(str, ...), ) def test_create_model_multiple_protected_namespace(): with pytest.warns( UserWarning, match='Field "also_protect_field" in Model has 
conflict with protected namespace "also_protect_"' ): create_model( 'Model', __config__=ConfigDict(protected_namespaces=('protect_me_', 'also_protect_')), also_protect_field=(str, ...), ) def test_json_schema_with_inner_models_with_duplicate_names(): model_a = create_model( 'a', inner=(str, ...), ) model_b = create_model( 'a', outer=(model_a, ...), ) assert model_b.model_json_schema() == { '$defs': { 'a': { 'properties': {'inner': {'title': 'Inner', 'type': 'string'}}, 'required': ['inner'], 'title': 'a', 'type': 'object', } }, 'properties': {'outer': {'$ref': '#/$defs/a'}}, 'required': ['outer'], 'title': 'a', 'type': 'object', } def test_resolving_forward_refs_across_modules(create_module): module = create_module( # language=Python """\ from __future__ import annotations from dataclasses import dataclass from pydantic import BaseModel class X(BaseModel): pass @dataclass class Y: x: X """ ) Z = create_model('Z', y=(module.Y, ...)) assert Z(y={'x': {}}).y is not None def test_type_field_in_the_same_module(): class A: pass B = create_model('B', a_cls=(type, A)) b = B() assert b.a_cls == A pydantic-2.10.6/tests/test_dataclasses.py000066400000000000000000002477761474456633400205240ustar00rootroot00000000000000import dataclasses import inspect import pickle import re import sys import traceback from collections.abc import Hashable from dataclasses import InitVar from datetime import date, datetime from pathlib import Path from typing import Any, Callable, ClassVar, Dict, FrozenSet, Generic, List, Optional, Set, TypeVar, Union import pytest from dirty_equals import HasRepr from pydantic_core import ArgsKwargs, SchemaValidator from typing_extensions import Annotated, Literal import pydantic from pydantic import ( BaseModel, BeforeValidator, ConfigDict, PydanticDeprecatedSince20, PydanticUndefinedAnnotation, PydanticUserError, RootModel, TypeAdapter, ValidationError, ValidationInfo, computed_field, field_serializer, field_validator, model_validator, with_config, ) from 
pydantic._internal._mock_val_ser import MockValSer from pydantic.dataclasses import is_pydantic_dataclass, rebuild_dataclass from pydantic.fields import Field, FieldInfo from pydantic.json_schema import model_json_schema def test_cannot_create_dataclass_from_basemodel_subclass(): msg = 'Cannot create a Pydantic dataclass from SubModel as it is already a Pydantic model' with pytest.raises(PydanticUserError, match=msg): @pydantic.dataclasses.dataclass class SubModel(BaseModel): pass def test_simple(): @pydantic.dataclasses.dataclass class MyDataclass: a: int b: float d = MyDataclass('1', '2.5') assert d.a == 1 assert d.b == 2.5 d = MyDataclass(b=10, a=20) assert d.a == 20 assert d.b == 10 def test_model_name(): @pydantic.dataclasses.dataclass class MyDataClass: model_name: str d = MyDataClass('foo') assert d.model_name == 'foo' d = MyDataClass(model_name='foo') assert d.model_name == 'foo' def test_value_error(): @pydantic.dataclasses.dataclass class MyDataclass: a: int b: int with pytest.raises(ValidationError) as exc_info: MyDataclass(1, 'wrong') # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': (1,), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'wrong', } ] def test_frozen(): @pydantic.dataclasses.dataclass(frozen=True) class MyDataclass: a: int d = MyDataclass(1) assert d.a == 1 with pytest.raises(AttributeError): d.a = 7 def test_validate_assignment(): @pydantic.dataclasses.dataclass(config=ConfigDict(validate_assignment=True)) class MyDataclass: a: int d = MyDataclass(1) assert d.a == 1 d.a = '7' assert d.a == 7 def test_validate_assignment_error(): @pydantic.dataclasses.dataclass(config=ConfigDict(validate_assignment=True)) class MyDataclass: a: int d = MyDataclass(1) with pytest.raises(ValidationError) as exc_info: d.a = 'xxx' assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('a',), 'msg': 
'Input should be a valid integer, unable to parse string as an integer', 'input': 'xxx', } ] def test_not_validate_assignment(): @pydantic.dataclasses.dataclass class MyDataclass: a: int d = MyDataclass(1) assert d.a == 1 d.a = '7' assert d.a == '7' def test_validate_assignment_value_change(): @pydantic.dataclasses.dataclass(config=ConfigDict(validate_assignment=True), frozen=False) class MyDataclass: a: int @field_validator('a') @classmethod def double_a(cls, v: int) -> int: return v * 2 d = MyDataclass(2) assert d.a == 4 d.a = 3 assert d.a == 6 @pytest.mark.parametrize( 'config', [ ConfigDict(validate_assignment=False), ConfigDict(extra=None), ConfigDict(extra='forbid'), ConfigDict(extra='ignore'), ConfigDict(validate_assignment=False, extra=None), ConfigDict(validate_assignment=False, extra='forbid'), ConfigDict(validate_assignment=False, extra='ignore'), ConfigDict(validate_assignment=False, extra='allow'), ConfigDict(validate_assignment=True, extra='allow'), ], ) def test_validate_assignment_extra_unknown_field_assigned_allowed(config: ConfigDict): @pydantic.dataclasses.dataclass(config=config) class MyDataclass: a: int d = MyDataclass(1) assert d.a == 1 d.extra_field = 123 assert d.extra_field == 123 @pytest.mark.parametrize( 'config', [ ConfigDict(validate_assignment=True), ConfigDict(validate_assignment=True, extra=None), ConfigDict(validate_assignment=True, extra='forbid'), ConfigDict(validate_assignment=True, extra='ignore'), ], ) def test_validate_assignment_extra_unknown_field_assigned_errors(config: ConfigDict): @pydantic.dataclasses.dataclass(config=config) class MyDataclass: a: int d = MyDataclass(1) assert d.a == 1 with pytest.raises(ValidationError) as exc_info: d.extra_field = 1.23 assert exc_info.value.errors(include_url=False) == [ { 'type': 'no_such_attribute', 'loc': ('extra_field',), 'msg': "Object has no attribute 'extra_field'", 'input': 1.23, 'ctx': {'attribute': 'extra_field'}, } ] def test_post_init(): post_init_called = False 
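The vanilla-dataclass tests here depend on the fact that stdlib dataclasses perform no validation, so `__post_init__` sees the raw input; a self-contained sketch of that baseline behavior (same shape as `test_post_init_validation`):

```python
import dataclasses

@dataclasses.dataclass
class DC:
    a: int  # annotation only; stdlib dataclasses never enforce it

    def __post_init__(self):
        self.a *= 2  # runs on the raw, unvalidated value

# '2' * 2 is string repetition, so the result is '22', not 4
assert DC(a='2').a == '22'
```

Wrapping the same class with `pydantic.dataclasses.dataclass` validates first, so `__post_init__` would receive the coerced `int` and produce `4` instead.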
    @pydantic.dataclasses.dataclass
    class MyDataclass:
        a: int

        def __post_init__(self):
            nonlocal post_init_called
            post_init_called = True

    d = MyDataclass('1')
    assert d.a == 1
    assert post_init_called


def test_post_init_validation():
    @dataclasses.dataclass
    class DC:
        a: int

        def __post_init__(self):
            self.a *= 2

    assert DC(a='2').a == '22'
    PydanticDC = pydantic.dataclasses.dataclass(DC)
    assert DC(a='2').a == '22'
    assert PydanticDC(a='2').a == 4


def test_convert_vanilla_dc():
    @dataclasses.dataclass
    class DC:
        a: int
        b: str = dataclasses.field(init=False)

        def __post_init__(self):
            self.a *= 2
            self.b = 'hello'

    dc1 = DC(a='2')
    assert dc1.a == '22'
    assert dc1.b == 'hello'
    PydanticDC = pydantic.dataclasses.dataclass(DC)
    dc2 = DC(a='2')
    assert dc2.a == '22'
    assert dc2.b == 'hello'

    py_dc = PydanticDC(a='2')
    assert py_dc.a == 4
    assert py_dc.b == 'hello'


def test_std_dataclass_with_parent():
    @dataclasses.dataclass
    class DCParent:
        a: int

    @dataclasses.dataclass
    class DC(DCParent):
        b: int

        def __post_init__(self):
            self.a *= 2

    assert dataclasses.asdict(DC(a='2', b='1')) == {'a': '22', 'b': '1'}
    PydanticDC = pydantic.dataclasses.dataclass(DC)
    assert dataclasses.asdict(DC(a='2', b='1')) == {'a': '22', 'b': '1'}
    assert dataclasses.asdict(PydanticDC(a='2', b='1')) == {'a': 4, 'b': 1}


def test_post_init_inheritance_chain():
    parent_post_init_called = False
    post_init_called = False

    @pydantic.dataclasses.dataclass
    class ParentDataclass:
        a: int

        def __post_init__(self):
            nonlocal parent_post_init_called
            parent_post_init_called = True

    @pydantic.dataclasses.dataclass
    class MyDataclass(ParentDataclass):
        b: int

        def __post_init__(self):
            super().__post_init__()
            nonlocal post_init_called
            post_init_called = True

    d = MyDataclass(a=1, b=2)
    assert d.a == 1
    assert d.b == 2
    assert parent_post_init_called
    assert post_init_called


def test_post_init_post_parse():
    with pytest.warns(PydanticDeprecatedSince20, match='Support for `__post_init_post_parse__` has been dropped'):

        @pydantic.dataclasses.dataclass
        class MyDataclass:
            a: int

            def __post_init_post_parse__(self):
                pass


def test_post_init_assignment():
    from dataclasses import field

    # Based on: https://docs.python.org/3/library/dataclasses.html#post-init-processing
    @pydantic.dataclasses.dataclass
    class C:
        a: float
        b: float
        c: float = field(init=False)

        def __post_init__(self):
            self.c = self.a + self.b

    c = C(0.1, 0.2)
    assert c.a == 0.1
    assert c.b == 0.2
    assert c.c == 0.30000000000000004


def test_inheritance():
    @pydantic.dataclasses.dataclass
    class A:
        a: str = None

    a_ = A(a=b'a')
    assert a_.a == 'a'

    @pydantic.dataclasses.dataclass
    class B(A):
        b: int = None

    b = B(a='a', b=12)
    assert b.a == 'a'
    assert b.b == 12

    with pytest.raises(ValidationError):
        B(a='a', b='b')

    a_ = A(a=b'a')
    assert a_.a == 'a'


def test_validate_long_string_error():
    @pydantic.dataclasses.dataclass(config=dict(str_max_length=3))
    class MyDataclass:
        a: str

    with pytest.raises(ValidationError) as exc_info:
        MyDataclass('xxxx')

    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'string_too_long',
            'loc': (0,),
            'msg': 'String should have at most 3 characters',
            'input': 'xxxx',
            'ctx': {'max_length': 3},
        }
    ]


def test_validate_assignment_long_string_error():
    @pydantic.dataclasses.dataclass(config=ConfigDict(str_max_length=3, validate_assignment=True))
    class MyDataclass:
        a: str

    d = MyDataclass('xxx')
    with pytest.raises(ValidationError) as exc_info:
        d.a = 'xxxx'

    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'string_too_long',
            'loc': ('a',),
            'msg': 'String should have at most 3 characters',
            'input': 'xxxx',
            'ctx': {'max_length': 3},
        }
    ]


def test_no_validate_assignment_long_string_error():
    @pydantic.dataclasses.dataclass(config=ConfigDict(str_max_length=3, validate_assignment=False))
    class MyDataclass:
        a: str

    d = MyDataclass('xxx')
    d.a = 'xxxx'

    assert d.a == 'xxxx'


def test_nested_dataclass():
    @pydantic.dataclasses.dataclass
    class Nested:
        number: int

    @pydantic.dataclasses.dataclass
    class Outer:
        n: Nested

    navbar = Outer(n=Nested(number='1'))
    assert isinstance(navbar.n, Nested)
    assert navbar.n.number == 1

    navbar = Outer(n={'number': '3'})
    assert isinstance(navbar.n, Nested)
    assert navbar.n.number == 3

    with pytest.raises(ValidationError) as exc_info:
        Outer(n='not nested')
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'dataclass_type',
            'loc': ('n',),
            'msg': 'Input should be a dictionary or an instance of Nested',
            'input': 'not nested',
            'ctx': {'class_name': 'Nested'},
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        Outer(n={'number': 'x'})
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_parsing',
            'loc': ('n', 'number'),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'x',
        }
    ]


def test_arbitrary_types_allowed():
    class Button:
        def __init__(self, href: str):
            self.href = href

    @pydantic.dataclasses.dataclass(config=dict(arbitrary_types_allowed=True))
    class Navbar:
        button: Button

    btn = Button(href='a')
    navbar = Navbar(button=btn)
    assert navbar.button.href == 'a'

    with pytest.raises(ValidationError) as exc_info:
        Navbar(button=('b',))
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'is_instance_of',
            'loc': ('button',),
            'msg': 'Input should be an instance of test_arbitrary_types_allowed.<locals>.Button',
            'input': ('b',),
            'ctx': {'class': 'test_arbitrary_types_allowed.<locals>.Button'},
        }
    ]


def test_nested_dataclass_model():
    @pydantic.dataclasses.dataclass
    class Nested:
        number: int

    class Outer(BaseModel):
        n: Nested

    navbar = Outer(n=Nested(number='1'))
    assert navbar.n.number == 1


def test_fields():
    @pydantic.dataclasses.dataclass
    class User:
        id: int
        name: str = 'John Doe'
        signup_ts: datetime = None

    user = User(id=123)
    fields = user.__pydantic_fields__

    assert fields['id'].is_required() is True

    assert fields['name'].is_required() is False
    assert fields['name'].default == 'John Doe'

    assert fields['signup_ts'].is_required() is False
    assert fields['signup_ts'].default is None


@pytest.mark.parametrize('field_constructor', [dataclasses.field, pydantic.dataclasses.Field])
def test_default_factory_field(field_constructor: Callable):
    @pydantic.dataclasses.dataclass
    class User:
        id: int
        other: Dict[str, str] = field_constructor(default_factory=lambda: {'John': 'Joey'})

    user = User(id=123)
    assert user.id == 123
    assert user.other == {'John': 'Joey'}
    fields = user.__pydantic_fields__

    assert fields['id'].is_required() is True
    assert repr(fields['id'].default) == 'PydanticUndefined'

    assert fields['other'].is_required() is False
    assert fields['other'].default_factory() == {'John': 'Joey'}


def test_default_factory_singleton_field():
    class MySingleton:
        pass

    MY_SINGLETON = MySingleton()

    @pydantic.dataclasses.dataclass(config=dict(arbitrary_types_allowed=True))
    class Foo:
        singleton: MySingleton = dataclasses.field(default_factory=lambda: MY_SINGLETON)

    # Returning a singleton from a default_factory is supported
    assert Foo().singleton is Foo().singleton


def test_schema():
    @pydantic.dataclasses.dataclass
    class User:
        id: int
        name: str = 'John Doe'
        aliases: Dict[str, str] = dataclasses.field(default_factory=lambda: {'John': 'Joey'})
        signup_ts: datetime = None
        age: Optional[int] = dataclasses.field(
            default=None, metadata=dict(title='The age of the user', description='do not lie!')
        )
        height: Optional[int] = pydantic.Field(None, title='The height in cm', ge=50, le=300)

    User(id=123)
    assert model_json_schema(User) == {
        'properties': {
            'age': {
                'anyOf': [{'type': 'integer'}, {'type': 'null'}],
                'default': None,
                'title': 'The age of the user',
                'description': 'do not lie!',
            },
            'aliases': {
                'additionalProperties': {'type': 'string'},
                'title': 'Aliases',
                'type': 'object',
            },
            'height': {
                'anyOf': [{'maximum': 300, 'minimum': 50, 'type': 'integer'}, {'type': 'null'}],
                'default': None,
                'title': 'The height in cm',
            },
            'id': {'title': 'Id', 'type': 'integer'},
            'name': {'default': 'John Doe', 'title': 'Name', 'type': 'string'},
            'signup_ts': {'default': None, 'format': 'date-time', 'title': 'Signup Ts', 'type': 'string'},
        },
        'required': ['id'],
        'title': 'User',
        'type': 'object',
    }


def test_nested_schema():
    @pydantic.dataclasses.dataclass
    class Nested:
        number: int

    @pydantic.dataclasses.dataclass
    class Outer:
        n: Nested

    assert model_json_schema(Outer) == {
        '$defs': {
            'Nested': {
                'properties': {'number': {'title': 'Number', 'type': 'integer'}},
                'required': ['number'],
                'title': 'Nested',
                'type': 'object',
            }
        },
        'properties': {'n': {'$ref': '#/$defs/Nested'}},
        'required': ['n'],
        'title': 'Outer',
        'type': 'object',
    }


def test_initvar():
    @pydantic.dataclasses.dataclass
    class TestInitVar:
        x: int
        y: dataclasses.InitVar

    tiv = TestInitVar(1, 2)
    assert tiv.x == 1
    with pytest.raises(AttributeError):
        tiv.y


def test_derived_field_from_initvar():
    @pydantic.dataclasses.dataclass
    class DerivedWithInitVar:
        plusone: int = dataclasses.field(init=False)
        number: dataclasses.InitVar[int]

        def __post_init__(self, number):
            self.plusone = number + 1

    derived = DerivedWithInitVar('1')
    assert derived.plusone == 2
    with pytest.raises(ValidationError, match='Input should be a valid integer, unable to parse string as an integer'):
        DerivedWithInitVar('Not A Number')


def test_initvars_post_init():
    @pydantic.dataclasses.dataclass
    class PathDataPostInit:
        path: Path
        base_path: dataclasses.InitVar[Optional[Path]] = None

        def __post_init__(self, base_path):
            if base_path is not None:
                self.path = base_path / self.path

    path_data = PathDataPostInit('world')
    assert 'path' in path_data.__dict__
    assert 'base_path' not in path_data.__dict__
    assert path_data.path == Path('world')

    p = PathDataPostInit('world', base_path='/hello')
    assert p.path == Path('/hello/world')


def test_classvar():
    @pydantic.dataclasses.dataclass
    class TestClassVar:
        klassvar: ClassVar = "I'm a Class variable"
        x: int

    tcv = TestClassVar(2)
    assert tcv.klassvar == "I'm a Class variable"


def test_frozenset_field():
    @pydantic.dataclasses.dataclass
    class TestFrozenSet:
        set: FrozenSet[int]

    test_set = frozenset({1, 2, 3})
    object_under_test = TestFrozenSet(set=test_set)

    assert object_under_test.set == test_set


def test_inheritance_post_init():
    post_init_called = False

    @pydantic.dataclasses.dataclass
    class Base:
        a: int

        def __post_init__(self):
            nonlocal post_init_called
            post_init_called = True

    @pydantic.dataclasses.dataclass
    class Child(Base):
        b: int

    Child(a=1, b=2)
    assert post_init_called


def test_hashable_required():
    @pydantic.dataclasses.dataclass
    class MyDataclass:
        v: Hashable

    MyDataclass(v=None)
    with pytest.raises(ValidationError) as exc_info:
        MyDataclass(v=[])
    assert exc_info.value.errors(include_url=False) == [
        {'input': [], 'loc': ('v',), 'msg': 'Input should be hashable', 'type': 'is_hashable'}
    ]
    with pytest.raises(ValidationError) as exc_info:
        # Should this raise a TypeError instead? https://github.com/pydantic/pydantic/issues/5487
        MyDataclass()
    assert exc_info.value.errors(include_url=False) == [
        {'input': HasRepr('ArgsKwargs(())'), 'loc': ('v',), 'msg': 'Field required', 'type': 'missing'}
    ]


@pytest.mark.parametrize('default', [1, None])
def test_default_value(default):
    @pydantic.dataclasses.dataclass
    class MyDataclass:
        v: int = default

    assert dataclasses.asdict(MyDataclass()) == {'v': default}
    assert dataclasses.asdict(MyDataclass(v=42)) == {'v': 42}


def test_default_value_ellipsis():
    """
    https://github.com/pydantic/pydantic/issues/5488
    """

    @pydantic.dataclasses.dataclass
    class MyDataclass:
        v: int = ...

    assert dataclasses.asdict(MyDataclass(v=42)) == {'v': 42}
    with pytest.raises(ValidationError, match='type=missing'):
        MyDataclass()


def test_override_builtin_dataclass():
    @dataclasses.dataclass
    class File:
        hash: str
        name: Optional[str]
        size: int
        content: Optional[bytes] = None

    ValidFile = pydantic.dataclasses.dataclass(File)

    file = File(hash='xxx', name=b'whatever.txt', size='456')
    valid_file = ValidFile(hash='xxx', name=b'whatever.txt', size='456')

    assert file.name == b'whatever.txt'
    assert file.size == '456'

    assert valid_file.name == 'whatever.txt'
    assert valid_file.size == 456

    assert isinstance(valid_file, File)
    assert isinstance(valid_file, ValidFile)

    with pytest.raises(ValidationError) as e:
        ValidFile(hash=[1], name='name', size=3)

    assert e.value.errors(include_url=False) == [
        {
            'type': 'string_type',
            'loc': ('hash',),
            'msg': 'Input should be a valid string',
            'input': [1],
        },
    ]


def test_override_builtin_dataclass_2():
    @dataclasses.dataclass
    class Meta:
        modified_date: Optional[datetime]
        seen_count: int

    Meta(modified_date='not-validated', seen_count=0)

    @pydantic.dataclasses.dataclass
    @dataclasses.dataclass
    class File(Meta):
        filename: str

    Meta(modified_date='still-not-validated', seen_count=0)

    f = File(filename=b'thefilename', modified_date='2020-01-01T00:00', seen_count='7')
    assert f.filename == 'thefilename'
    assert f.modified_date == datetime(2020, 1, 1, 0, 0)
    assert f.seen_count == 7


def test_override_builtin_dataclass_nested():
    @dataclasses.dataclass
    class Meta:
        modified_date: Optional[datetime]
        seen_count: int

        __pydantic_config__ = {'revalidate_instances': 'always'}

    @dataclasses.dataclass
    class File:
        filename: str
        meta: Meta

    FileChecked = pydantic.dataclasses.dataclass(File)
    f = FileChecked(filename=b'thefilename', meta=Meta(modified_date='2020-01-01T00:00', seen_count='7'))
    assert f.filename == 'thefilename'
    assert f.meta.modified_date == datetime(2020, 1, 1, 0, 0)
    assert f.meta.seen_count == 7

    with pytest.raises(ValidationError) as e:
        FileChecked(filename=b'thefilename', meta=Meta(modified_date='2020-01-01T00:00', seen_count=['7']))
    # insert_assert(e.value.errors(include_url=False))
    assert e.value.errors(include_url=False) == [
        {'type': 'int_type', 'loc': ('meta', 'seen_count'), 'msg': 'Input should be a valid integer', 'input': ['7']}
    ]

    class Foo(
        BaseModel,
    ):
        file: File

    foo = Foo.model_validate(
        {
            'file': {
                'filename': b'thefilename',
                'meta': {'modified_date': '2020-01-01T00:00', 'seen_count': '7'},
            },
        }
    )
    assert foo.file.filename == 'thefilename'
    assert foo.file.meta.modified_date == datetime(2020, 1, 1, 0, 0)
    assert foo.file.meta.seen_count == 7


def test_override_builtin_dataclass_nested_schema():
    @dataclasses.dataclass
    class Meta:
        modified_date: Optional[datetime]
        seen_count: int

    @dataclasses.dataclass
    class File:
        filename: str
        meta: Meta

    FileChecked = pydantic.dataclasses.dataclass(File)
    assert model_json_schema(FileChecked) == {
        '$defs': {
            'Meta': {
                'properties': {
                    'modified_date': {
                        'anyOf': [{'format': 'date-time', 'type': 'string'}, {'type': 'null'}],
                        'title': 'Modified Date',
                    },
                    'seen_count': {'title': 'Seen Count', 'type': 'integer'},
                },
                'required': ['modified_date', 'seen_count'],
                'title': 'Meta',
                'type': 'object',
            }
        },
        'properties': {'filename': {'title': 'Filename', 'type': 'string'}, 'meta': {'$ref': '#/$defs/Meta'}},
        'required': ['filename', 'meta'],
        'title': 'File',
        'type': 'object',
    }


def test_inherit_builtin_dataclass():
    @dataclasses.dataclass
    class Z:
        z: int

    @dataclasses.dataclass
    class Y(Z):
        y: int

    @pydantic.dataclasses.dataclass
    class X(Y):
        x: int

    pika = X(x='2', y='4', z='3')
    assert pika.x == 2
    assert pika.y == 4
    assert pika.z == 3


def test_forward_stdlib_dataclass_params():
    @dataclasses.dataclass(frozen=True)
    class Item:
        name: str

    class Example(BaseModel):
        item: Item
        other: str

        model_config = ConfigDict(arbitrary_types_allowed=True)

    e = Example(item=Item(name='pika'), other='bulbi')
    e.other = 'bulbi2'
    with pytest.raises(dataclasses.FrozenInstanceError):
        e.item.name = 'pika2'


def test_pydantic_callable_field():
    """pydantic callable fields behaviour should be the same as stdlib dataclass"""

    def foo(arg1, arg2):
        return arg1, arg2

    def bar(x: int, y: float, z: str) -> bool:
        return str(x + y) == z

    class PydanticModel(BaseModel):
        required_callable: Callable
        required_callable_2: Callable[[int, float, str], bool]
        default_callable: Callable = foo
        default_callable_2: Callable[[int, float, str], bool] = bar

    @pydantic.dataclasses.dataclass
    class PydanticDataclass:
        required_callable: Callable
        required_callable_2: Callable[[int, float, str], bool]
        default_callable: Callable = foo
        default_callable_2: Callable[[int, float, str], bool] = bar

    @dataclasses.dataclass
    class StdlibDataclass:
        required_callable: Callable
        required_callable_2: Callable[[int, float, str], bool]
        default_callable: Callable = foo
        default_callable_2: Callable[[int, float, str], bool] = bar

    pyd_m = PydanticModel(required_callable=foo, required_callable_2=bar)
    pyd_dc = PydanticDataclass(required_callable=foo, required_callable_2=bar)
    std_dc = StdlibDataclass(required_callable=foo, required_callable_2=bar)

    assert (
        pyd_m.required_callable
        is pyd_m.default_callable
        is pyd_dc.required_callable
        is pyd_dc.default_callable
        is std_dc.required_callable
        is std_dc.default_callable
    )
    assert (
        pyd_m.required_callable_2
        is pyd_m.default_callable_2
        is pyd_dc.required_callable_2
        is pyd_dc.default_callable_2
        is std_dc.required_callable_2
        is std_dc.default_callable_2
    )


def test_pickle_overridden_builtin_dataclass(create_module: Any):
    module = create_module(
        # language=Python
        """\
import dataclasses
import pydantic


@pydantic.dataclasses.dataclass(config=pydantic.config.ConfigDict(validate_assignment=True))
class BuiltInDataclassForPickle:
    value: int
        """
    )
    obj = module.BuiltInDataclassForPickle(value=5)

    pickled_obj = pickle.dumps(obj)
    restored_obj = pickle.loads(pickled_obj)

    assert restored_obj.value == 5
    assert restored_obj == obj

    # ensure the restored dataclass is still a pydantic dataclass
    with pytest.raises(ValidationError):
        restored_obj.value = 'value of a wrong type'


def lazy_cases_for_dataclass_equality_checks():
    """
    The reason for the convoluted structure of this function is to avoid creating the classes
    while collecting tests, which may trigger breakpoints etc. while working on one specific test.
    """
    cases = []

    def get_cases():
        if cases:
            return cases  # cases already "built"

        @dataclasses.dataclass(frozen=True)
        class StdLibFoo:
            a: str
            b: int

        @pydantic.dataclasses.dataclass(frozen=True)
        class PydanticFoo:
            a: str
            b: int

        @dataclasses.dataclass(frozen=True)
        class StdLibBar:
            c: StdLibFoo

        @pydantic.dataclasses.dataclass(frozen=True)
        class PydanticBar:
            c: PydanticFoo

        @dataclasses.dataclass(frozen=True)
        class StdLibBaz:
            c: PydanticFoo

        @pydantic.dataclasses.dataclass(frozen=True)
        class PydanticBaz:
            c: StdLibFoo

        foo = StdLibFoo(a='Foo', b=1)
        cases.append((foo, StdLibBar(c=foo)))

        foo = PydanticFoo(a='Foo', b=1)
        cases.append((foo, PydanticBar(c=foo)))

        foo = PydanticFoo(a='Foo', b=1)
        cases.append((foo, StdLibBaz(c=foo)))

        foo = StdLibFoo(a='Foo', b=1)
        cases.append((foo, PydanticBaz(c=foo)))

        return cases

    case_ids = ['stdlib_stdlib', 'pydantic_pydantic', 'pydantic_stdlib', 'stdlib_pydantic']

    def case(i):
        def get_foo_bar():
            return get_cases()[i]

        get_foo_bar.__name__ = case_ids[i]  # get nice names in pytest output
        return get_foo_bar

    return [case(i) for i in range(4)]


@pytest.mark.parametrize('foo_bar_getter', lazy_cases_for_dataclass_equality_checks())
def test_dataclass_equality_for_field_values(foo_bar_getter):
    # Related to issue #2162
    foo, bar = foo_bar_getter()
    assert dataclasses.asdict(foo) == dataclasses.asdict(bar.c)
    assert dataclasses.astuple(foo) == dataclasses.astuple(bar.c)
    assert foo == bar.c


def test_issue_2383():
    @dataclasses.dataclass
    class A:
        s: str

        def __hash__(self):
            return 123

    class B(pydantic.BaseModel):
        a: A

    a = A('')
    b = B(a=a)

    assert hash(a) == 123
    assert hash(b.a) == 123


def test_issue_2398():
    @dataclasses.dataclass(order=True)
    class DC:
        num: int = 42

    class Model(pydantic.BaseModel):
        dc: DC

    real_dc = DC()
    model = Model(dc=real_dc)

    # This works as expected.
    assert real_dc <= real_dc
    assert model.dc <= model.dc
    assert real_dc <= model.dc


def test_issue_2424():
    @dataclasses.dataclass
    class Base:
        x: str

    @dataclasses.dataclass
    class Thing(Base):
        y: str = dataclasses.field(default_factory=str)

    assert Thing(x='hi').y == ''

    @pydantic.dataclasses.dataclass
    class ValidatedThing(Base):
        y: str = dataclasses.field(default_factory=str)

    assert Thing(x='hi').y == ''
    assert ValidatedThing(x='hi').y == ''


def test_issue_2541():
    @dataclasses.dataclass(frozen=True)
    class Infos:
        id: int

    @dataclasses.dataclass(frozen=True)
    class Item:
        name: str
        infos: Infos

    class Example(BaseModel):
        item: Item

    e = Example.model_validate({'item': {'name': '123', 'infos': {'id': '1'}}})
    assert e.item.name == '123'
    assert e.item.infos.id == 1
    with pytest.raises(dataclasses.FrozenInstanceError):
        e.item.infos.id = 2


def test_complex_nested_vanilla_dataclass():
    @dataclasses.dataclass
    class Span:
        first: int
        last: int

    @dataclasses.dataclass
    class LabeledSpan(Span):
        label: str

    @dataclasses.dataclass
    class BinaryRelation:
        subject: LabeledSpan
        object: LabeledSpan
        label: str

    @dataclasses.dataclass
    class Sentence:
        relations: BinaryRelation

    class M(pydantic.BaseModel):
        s: Sentence

    assert M.model_json_schema() == {
        '$defs': {
            'BinaryRelation': {
                'properties': {
                    'label': {'title': 'Label', 'type': 'string'},
                    'object': {'$ref': '#/$defs/LabeledSpan'},
                    'subject': {'$ref': '#/$defs/LabeledSpan'},
                },
                'required': ['subject', 'object', 'label'],
                'title': 'BinaryRelation',
                'type': 'object',
            },
            'LabeledSpan': {
                'properties': {
                    'first': {'title': 'First', 'type': 'integer'},
                    'label': {'title': 'Label', 'type': 'string'},
                    'last': {'title': 'Last', 'type': 'integer'},
                },
                'required': ['first', 'last', 'label'],
                'title': 'LabeledSpan',
                'type': 'object',
            },
            'Sentence': {
                'properties': {'relations': {'$ref': '#/$defs/BinaryRelation'}},
                'required': ['relations'],
                'title': 'Sentence',
                'type': 'object',
            },
        },
        'properties': {'s': {'$ref': '#/$defs/Sentence'}},
        'required': ['s'],
        'title': 'M',
        'type': 'object',
    }


def test_json_schema_with_computed_field():
    @dataclasses.dataclass
    class MyDataclass:
        x: int

        @computed_field
        @property
        def double_x(self) -> int:
            return 2 * self.x

    class Model(BaseModel):
        dc: MyDataclass

    assert Model.model_json_schema(mode='validation') == {
        '$defs': {
            'MyDataclass': {
                'properties': {'x': {'title': 'X', 'type': 'integer'}},
                'required': ['x'],
                'title': 'MyDataclass',
                'type': 'object',
            }
        },
        'properties': {'dc': {'$ref': '#/$defs/MyDataclass'}},
        'required': ['dc'],
        'title': 'Model',
        'type': 'object',
    }
    assert Model.model_json_schema(mode='serialization') == {
        '$defs': {
            'MyDataclass': {
                'properties': {
                    'double_x': {'readOnly': True, 'title': 'Double X', 'type': 'integer'},
                    'x': {'title': 'X', 'type': 'integer'},
                },
                'required': ['x', 'double_x'],
                'title': 'MyDataclass',
                'type': 'object',
            }
        },
        'properties': {'dc': {'$ref': '#/$defs/MyDataclass'}},
        'required': ['dc'],
        'title': 'Model',
        'type': 'object',
    }


def test_issue_2594():
    @dataclasses.dataclass
    class Empty:
        pass

    @pydantic.dataclasses.dataclass
    class M:
        e: Empty

    assert isinstance(M(e={}).e, Empty)


def test_schema_description_unset():
    @pydantic.dataclasses.dataclass
    class A:
        x: int

    assert 'description' not in model_json_schema(A)

    @pydantic.dataclasses.dataclass
    @dataclasses.dataclass
    class B:
        x: int

    assert 'description' not in model_json_schema(B)


def test_schema_description_set():
    @pydantic.dataclasses.dataclass
    class A:
        """my description"""

        x: int

    assert model_json_schema(A)['description'] == 'my description'

    @pydantic.dataclasses.dataclass
    @dataclasses.dataclass
    class B:
        """my description"""

        x: int

    assert model_json_schema(A)['description'] == 'my description'


def test_issue_3011():
    """Validation of a subclass of a dataclass"""

    @dataclasses.dataclass
    class A:
        thing_a: str

    class B(A):
        thing_b: str

    @pydantic.dataclasses.dataclass
    class C:
        thing: A

    b = B('Thing A')
    c = C(thing=b)
    assert c.thing.thing_a == 'Thing A'


def test_issue_3162():
    @dataclasses.dataclass
    class User:
        id: int
        name: str

    class Users(BaseModel):
        user: User
        other_user: User

    assert Users.model_json_schema() == {
        '$defs': {
            'User': {
                'properties': {'id': {'title': 'Id', 'type': 'integer'}, 'name': {'title': 'Name', 'type': 'string'}},
                'required': ['id', 'name'],
                'title': 'User',
                'type': 'object',
            }
        },
        'properties': {'other_user': {'$ref': '#/$defs/User'}, 'user': {'$ref': '#/$defs/User'}},
        'required': ['user', 'other_user'],
        'title': 'Users',
        'type': 'object',
    }


def test_discriminated_union_basemodel_instance_value():
    @pydantic.dataclasses.dataclass
    class A:
        l: Literal['a']  # noqa: E741

    @pydantic.dataclasses.dataclass
    class B:
        l: Literal['b']  # noqa: E741

    @pydantic.dataclasses.dataclass
    class Top:
        sub: Union[A, B] = dataclasses.field(metadata=dict(discriminator='l'))

    t = Top(sub=A(l='a'))
    assert isinstance(t, Top)

    # insert_assert(model_json_schema(Top))
    assert model_json_schema(Top) == {
        'title': 'Top',
        'type': 'object',
        'properties': {
            'sub': {
                'title': 'Sub',
                'discriminator': {'mapping': {'a': '#/$defs/A', 'b': '#/$defs/B'}, 'propertyName': 'l'},
                'oneOf': [{'$ref': '#/$defs/A'}, {'$ref': '#/$defs/B'}],
            }
        },
        'required': ['sub'],
        '$defs': {
            'A': {
                'properties': {'l': {'const': 'a', 'title': 'L', 'type': 'string'}},
                'required': ['l'],
                'title': 'A',
                'type': 'object',
            },
            'B': {
                'properties': {'l': {'const': 'b', 'title': 'L', 'type': 'string'}},
                'required': ['l'],
                'title': 'B',
                'type': 'object',
            },
        },
    }


def test_post_init_after_validation():
    @dataclasses.dataclass
    class SetWrapper:
        set: Set[int]

        def __post_init__(self):
            assert isinstance(
                self.set, set
            ), f"self.set should be a set but it's {self.set!r} of type {type(self.set).__name__}"

    class Model(pydantic.BaseModel):
        set_wrapper: SetWrapper

    model = Model(set_wrapper=SetWrapper({1, 2, 3}))
    json_text = model.model_dump_json()
    assert Model.model_validate_json(json_text).model_dump() == model.model_dump()


def test_new_not_called():
    """
    pydantic dataclasses do not preserve sunder attributes set in __new__
    """

    class StandardClass:
        """Class which modifies instance creation."""

        a: str

        def __new__(cls, *args, **kwargs):
            instance = super().__new__(cls)
            instance._special_property = 1
            return instance

    StandardLibDataclass = dataclasses.dataclass(StandardClass)
    PydanticDataclass = pydantic.dataclasses.dataclass(StandardClass)

    test_string = 'string'
    std_instance = StandardLibDataclass(a=test_string)
    assert std_instance._special_property == 1
    assert std_instance.a == test_string

    pyd_instance = PydanticDataclass(a=test_string)
    assert not hasattr(pyd_instance, '_special_property')
    assert pyd_instance.a == test_string


def test_ignore_extra():
    @pydantic.dataclasses.dataclass(config=ConfigDict(extra='ignore'))
    class Foo:
        x: int

    foo = Foo(**{'x': '1', 'y': '2'})
    assert foo.__dict__ == {'x': 1}


def test_ignore_extra_subclass():
    @pydantic.dataclasses.dataclass(config=ConfigDict(extra='ignore'))
    class Foo:
        x: int

    @pydantic.dataclasses.dataclass(config=ConfigDict(extra='ignore'))
    class Bar(Foo):
        y: int

    bar = Bar(**{'x': '1', 'y': '2', 'z': '3'})
    assert bar.__dict__ == {'x': 1, 'y': 2}


def test_allow_extra():
    @pydantic.dataclasses.dataclass(config=ConfigDict(extra='allow'))
    class Foo:
        x: int

    foo = Foo(**{'x': '1', 'y': '2'})
    assert foo.__dict__ == {'x': 1, 'y': '2'}


def test_allow_extra_subclass():
    @pydantic.dataclasses.dataclass(config=ConfigDict(extra='allow'))
    class Foo:
        x: int

    @pydantic.dataclasses.dataclass(config=ConfigDict(extra='allow'))
    class Bar(Foo):
        y: int

    bar = Bar(**{'x': '1', 'y': '2', 'z': '3'})
    assert bar.__dict__ == {'x': 1, 'y': 2, 'z': '3'}


def test_forbid_extra():
    @pydantic.dataclasses.dataclass(config=ConfigDict(extra='forbid'))
    class Foo:
        x: int

    msg = re.escape("Unexpected keyword argument [type=unexpected_keyword_argument, input_value='2', input_type=str]")
    with pytest.raises(ValidationError, match=msg):
        Foo(**{'x': '1', 'y': '2'})


def test_self_reference_dataclass():
    @pydantic.dataclasses.dataclass
    class MyDataclass:
        self_reference: Optional['MyDataclass'] = None

    assert MyDataclass.__pydantic_fields__['self_reference'].annotation == Optional[MyDataclass]

    instance = MyDataclass(self_reference=MyDataclass(self_reference=MyDataclass()))
    assert TypeAdapter(MyDataclass).dump_python(instance) == {
        'self_reference': {'self_reference': {'self_reference': None}}
    }

    with pytest.raises(ValidationError) as exc_info:
        MyDataclass(self_reference=1)

    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'dataclass_type',
            'loc': ('self_reference',),
            'msg': 'Input should be a dictionary or an instance of MyDataclass',
            'input': 1,
            'ctx': {'class_name': 'MyDataclass'},
        }
    ]


def test_cyclic_reference_dataclass(create_module):
    @pydantic.dataclasses.dataclass(config=ConfigDict(extra='forbid'))
    class D1:
        d2: Optional['D2'] = None

    @create_module
    def module():
        from typing import Optional

        import pydantic

        @pydantic.dataclasses.dataclass(config=pydantic.ConfigDict(extra='forbid'))
        class D2:
            d1: Optional['D1'] = None

    # Ensure D2 is in the local namespace; note everything works even though it wasn't _defined_ in this namespace
    D2 = module.D2

    # Confirm D1 and D2 require rebuilding
    assert isinstance(D1.__pydantic_validator__, MockValSer)
    assert isinstance(D2.__pydantic_validator__, MockValSer)

    # Note: the rebuilds of D1 and D2 happen automatically, and works since it grabs the locals here as the namespace,
    # which contains D1 and D2
    instance = D1(d2=D2(d1=D1(d2=D2(d1=D1()))))

    # Confirm D1 and D2 have been rebuilt
    assert isinstance(D1.__pydantic_validator__, SchemaValidator)
    assert isinstance(D2.__pydantic_validator__, SchemaValidator)

    assert TypeAdapter(D1).dump_python(instance) == {'d2': {'d1': {'d2': {'d1': {'d2': None}}}}}

    with pytest.raises(ValidationError) as exc_info:
        D2(d1=D2())

    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'dataclass_type',
            'loc': ('d1',),
            'msg': 'Input should be a dictionary or an instance of D1',
            'input': D2(d1=None),
            'ctx': {'class_name': 'D1'},
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        TypeAdapter(D1).validate_python(dict(d2=dict(d1=dict(d2=dict(d2=dict())))))

    assert exc_info.value.errors(include_url=False) == [
        {
            'input': {},
            'loc': ('d2', 'd1', 'd2', 'd2'),
            'msg': 'Unexpected keyword argument',
            'type': 'unexpected_keyword_argument',
        }
    ]


def test_cross_module_cyclic_reference_dataclass(create_module):
    @pydantic.dataclasses.dataclass(config=ConfigDict(extra='forbid'))
    class D1:
        d2: Optional['D2'] = None  # noqa F821

    @create_module
    def module():
        from typing import Optional

        import pydantic

        @pydantic.dataclasses.dataclass(config=pydantic.ConfigDict(extra='forbid'))
        class D2:
            d1: Optional['D1'] = None

    # Since D2 is not in locals, it will not be picked up by the auto-rebuild:
    with pytest.raises(
        PydanticUserError,
        match=re.escape(
            '`D1` is not fully defined; you should define `D2`, then call'
            ' `pydantic.dataclasses.rebuild_dataclass(D1)`.'
        ),
    ):
        D1()

    # Explicitly rebuild D1, specifying the appropriate types namespace
    rebuild_dataclass(D1, _types_namespace={'D2': module.D2, 'D1': D1})

    # Confirm D2 still requires a rebuild (it will happen automatically)
    assert isinstance(module.D2.__pydantic_validator__, MockValSer)

    instance = D1(d2=module.D2(d1=D1(d2=module.D2(d1=D1()))))

    # Confirm auto-rebuild of D2 has now happened
    assert isinstance(module.D2.__pydantic_validator__, SchemaValidator)

    assert TypeAdapter(D1).dump_python(instance) == {'d2': {'d1': {'d2': {'d1': {'d2': None}}}}}

    with pytest.raises(ValidationError) as exc_info:
        module.D2(d1=module.D2())

    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'dataclass_type',
            'input': module.D2(d1=None),
            'loc': ('d1',),
            'msg': 'Input should be a dictionary or an instance of D1',
            'ctx': {'class_name': 'D1'},
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        TypeAdapter(D1).validate_python(dict(d2=dict(d1=dict(d2=dict(d2=dict())))))

    assert exc_info.value.errors(include_url=False) == [
        {
            'input': {},
            'loc': ('d2', 'd1', 'd2', 'd2'),
            'msg': 'Unexpected keyword argument',
            'type': 'unexpected_keyword_argument',
        }
    ]


@pytest.mark.parametrize(
    'dataclass_decorator',
    [
        pydantic.dataclasses.dataclass,
        dataclasses.dataclass,
    ],
    ids=['pydantic', 'stdlib'],
)
def test_base_dataclasses_annotations_resolving(create_module, dataclass_decorator: Callable):
    @create_module
    def module():
        import dataclasses
        from typing import NewType

        OddInt = NewType('OddInt', int)

        @dataclasses.dataclass
        class D1:
            d1: 'OddInt'
            s: str

            __pydantic_config__ = {'str_to_lower': True}

    @dataclass_decorator
    class D2(module.D1):
        d2: int

    assert TypeAdapter(D2).validate_python({'d1': 1, 'd2': 2, 's': 'ABC'}) == D2(d1=1, d2=2, s='abc')


@pytest.mark.parametrize(
    'dataclass_decorator',
    [
        pydantic.dataclasses.dataclass,
        dataclasses.dataclass,
    ],
    ids=['pydantic', 'stdlib'],
)
def test_base_dataclasses_annotations_resolving_with_override(create_module, dataclass_decorator: Callable):
    @create_module
    def module1():
        import dataclasses
        from typing import NewType

        IDType = NewType('IDType', int)

        @dataclasses.dataclass
        class D1:
            db_id: 'IDType'

            __pydantic_config__ = {'str_to_lower': True}

    @create_module
    def module2():
        import dataclasses
        from typing import NewType

        IDType = NewType('IDType', str)

        @dataclasses.dataclass
        class D2:
            db_id: 'IDType'
            s: str

            __pydantic_config__ = {'str_to_lower': False}

    @dataclass_decorator
    class D3(module1.D1, module2.D2):
        ...

    assert TypeAdapter(D3).validate_python({'db_id': 42, 's': 'ABC'}) == D3(db_id=42, s='abc')


@pytest.mark.skipif(sys.version_info < (3, 10), reason='kw_only is not available in python < 3.10')
def test_kw_only():
    @pydantic.dataclasses.dataclass(kw_only=True)
    class A:
        a: int | None = None
        b: str

    with pytest.raises(ValidationError):
        A(1, '')

    assert A(b='hi').b == 'hi'


@pytest.mark.skipif(sys.version_info < (3, 10), reason='kw_only is not available in python < 3.10')
def test_kw_only_subclass():
    @pydantic.dataclasses.dataclass
    class A:
        x: int
        y: int = pydantic.Field(default=0, kw_only=True)

    @pydantic.dataclasses.dataclass
    class B(A):
        z: int

    assert B(1, 2) == B(x=1, y=0, z=2)
    assert B(1, y=2, z=3) == B(x=1, y=2, z=3)


@pytest.mark.parametrize('field_constructor', [pydantic.dataclasses.Field, dataclasses.field])
def test_repr_false(field_constructor: Callable):
    @pydantic.dataclasses.dataclass
    class A:
        visible_field: str
        hidden_field: str = field_constructor(repr=False)

    instance = A(visible_field='this_should_be_included', hidden_field='this_should_not_be_included')
    assert "visible_field='this_should_be_included'" in repr(instance)
    assert "hidden_field='this_should_not_be_included'" not in repr(instance)


def dataclass_decorators(include_identity: bool = False, exclude_combined: bool = False):
    decorators = [pydantic.dataclasses.dataclass, dataclasses.dataclass]
    ids = ['pydantic', 'stdlib']

    if not exclude_combined:

        def combined_decorator(cls):
            """
            Should be equivalent to:
            @pydantic.dataclasses.dataclass
            @dataclasses.dataclass
            """
            return pydantic.dataclasses.dataclass(dataclasses.dataclass(cls))

        decorators.append(combined_decorator)
        ids.append('combined')

    if include_identity:

        def identity_decorator(cls):
            return cls

        decorators.append(identity_decorator)
        ids.append('identity')

    return {'argvalues': decorators, 'ids': ids}


@pytest.mark.skipif(sys.version_info < (3, 10), reason='kw_only is not available in python < 3.10')
@pytest.mark.parametrize('decorator1', **dataclass_decorators(exclude_combined=True))
@pytest.mark.parametrize('decorator2', **dataclass_decorators(exclude_combined=True))
def test_kw_only_inheritance(decorator1, decorator2):
    # Exclude combined from the decorators since it doesn't know how to accept kw_only
    @decorator1(kw_only=True)
    class Parent:
        x: int

    @decorator2
    class Child(Parent):
        y: int

    child = Child(1, x=2)
    assert child.x == 2
    assert child.y == 1


def test_extra_forbid_list_no_error():
    @pydantic.dataclasses.dataclass(config=dict(extra='forbid'))
    class Bar:
        ...

    @pydantic.dataclasses.dataclass
    class Foo:
        a: List[Bar]

    assert isinstance(Foo(a=[Bar()]).a[0], Bar)


def test_extra_forbid_list_error():
    @pydantic.dataclasses.dataclass(config=ConfigDict(extra='forbid'))
    class Bar:
        ...

    with pytest.raises(ValidationError, match=r'a\s+Unexpected keyword argument'):
        Bar(a=1)


def test_field_validator():
    @pydantic.dataclasses.dataclass
    class MyDataclass:
        a: int
        b: float

        @field_validator('b')
        @classmethod
        def double_b(cls, v):
            return v * 2

    d = MyDataclass('1', '2.5')
    assert d.a == 1
    assert d.b == 5.0


def test_model_validator_before():
    @pydantic.dataclasses.dataclass
    class MyDataclass:
        a: int
        b: float

        @model_validator(mode='before')
        @classmethod
        def double_b(cls, v: ArgsKwargs):
            v.kwargs['b'] *= 2
            return v

    d = MyDataclass('1', b='2')
    assert d.a == 1
    assert d.b == 22.0


def test_model_validator_after():
    @pydantic.dataclasses.dataclass
    class MyDataclass:
        a: int
        b: float

        @model_validator(mode='after')
        def double_b(self) -> 'MyDataclass':
            self.b *= 2
            return self

    d = MyDataclass('1', b='2')
    assert d.a == 1
    assert d.b == 4


def test_parent_post_init():
    """
    Test that the parent's __post_init__ gets called and the order in which it gets called relative to validation.

    In V1 we called it before validation, in V2 it gets called after.
    """

    @dataclasses.dataclass
    class A:
        a: float

        def __post_init__(self):
            self.a *= 2

    assert A(a=1.2).a == 2.4

    @pydantic.dataclasses.dataclass
    class B(A):
        @field_validator('a')
        @classmethod
        def validate_a(cls, value, _):
            value += 3
            return value

    assert B(a=1).a == 8  # (1 + 3) * 2 = 8


def test_subclass_post_init_order():
    @dataclasses.dataclass
    class A:
        a: float

    @pydantic.dataclasses.dataclass
    class B(A):
        def __post_init__(self):
            self.a *= 2

        @field_validator('a')
        @classmethod
        def validate_a(cls, value):
            value += 3
            return value

    assert B(a=1).a == 8  # (1 + 3) * 2 = 8


def test_subclass_post_init_inheritance():
    @dataclasses.dataclass
    class A:
        a: int

    @pydantic.dataclasses.dataclass
    class B(A):
        def __post_init__(self):
            self.a *= 2

        @field_validator('a')
        @classmethod
        def validate_a(cls, value):
            value += 3
            return value

    @pydantic.dataclasses.dataclass
    class C(B):
        def __post_init__(self):
            self.a *= 3

    assert C(1).a == 12  # (1 + 3) * 3


def test_config_as_type_deprecated():
    class Config:
        validate_assignment = True

    with pytest.warns(
        PydanticDeprecatedSince20, match='Support for class-based `config` is deprecated, use ConfigDict instead.'
    ):

        @pydantic.dataclasses.dataclass(config=Config)
        class MyDataclass:
            a: int

    assert MyDataclass.__pydantic_config__ == ConfigDict(validate_assignment=True)


def test_validator_info_field_name_data_before():
    """
    Test accessing info.field_name and info.data
    We only test the `before` validator because they all share the same implementation.
    """

    @pydantic.dataclasses.dataclass
    class Model:
        a: str
        b: str

        @field_validator('b', mode='before')
        @classmethod
        def check_a(cls, v: Any, info: ValidationInfo) -> Any:
            assert v == b'but my barbaz is better'
            assert info.field_name == 'b'
            assert info.data == {'a': 'your foobar is good'}
            return 'just kidding!'

    assert Model(a=b'your foobar is good', b=b'but my barbaz is better').b == 'just kidding!'
@pytest.mark.parametrize( 'decorator1, expected_parent, expected_child', [ ( pydantic.dataclasses.dataclass, ['parent before', 'parent', 'parent after'], ['parent before', 'child', 'parent after', 'child before', 'child after'], ), (dataclasses.dataclass, [], ['parent before', 'child', 'parent after', 'child before', 'child after']), ], ids=['pydantic', 'stdlib'], ) def test_inheritance_replace(decorator1: Callable[[Any], Any], expected_parent: List[str], expected_child: List[str]): """We promise that if you add a validator with the same _function_ name as an existing validator it replaces the existing validator and is run instead of it. """ @decorator1 class Parent: a: List[str] @field_validator('a') @classmethod def parent_val_before(cls, v: List[str]): v.append('parent before') return v @field_validator('a') @classmethod def val(cls, v: List[str]): v.append('parent') return v @field_validator('a') @classmethod def parent_val_after(cls, v: List[str]): v.append('parent after') return v @pydantic.dataclasses.dataclass class Child(Parent): @field_validator('a') @classmethod def child_val_before(cls, v: List[str]): v.append('child before') return v @field_validator('a') @classmethod def val(cls, v: List[str]): v.append('child') return v @field_validator('a') @classmethod def child_val_after(cls, v: List[str]): v.append('child after') return v assert Parent(a=[]).a == expected_parent assert Child(a=[]).a == expected_child @pytest.mark.parametrize( 'decorator1', [ pydantic.dataclasses.dataclass, dataclasses.dataclass, ], ids=['pydantic', 'stdlib'], ) @pytest.mark.parametrize( 'default', [1, dataclasses.field(default=1), Field(default=1)], ids=['1', 'dataclasses.field(default=1)', 'pydantic.Field(default=1)'], ) def test_dataclasses_inheritance_default_value_is_not_deleted( decorator1: Callable[[Any], Any], default: Literal[1] ) -> None: if decorator1 is dataclasses.dataclass and isinstance(default, FieldInfo): pytest.skip(reason="stdlib dataclasses don't support 
Pydantic fields") @decorator1 class Parent: a: int = default assert Parent.a == 1 assert Parent().a == 1 @pydantic.dataclasses.dataclass class Child(Parent): pass assert Child.a == 1 assert Child().a == 1 def test_dataclass_config_validate_default(): @pydantic.dataclasses.dataclass class Model: x: int = -1 @field_validator('x') @classmethod def force_x_positive(cls, v): assert v > 0 return v assert Model().x == -1 @pydantic.dataclasses.dataclass(config=ConfigDict(validate_default=True)) class ValidatingModel(Model): pass with pytest.raises(ValidationError) as exc_info: ValidatingModel() assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'error': HasRepr(repr(AssertionError('assert -1 > 0')))}, 'input': -1, 'loc': ('x',), 'msg': 'Assertion failed, assert -1 > 0', 'type': 'assertion_error', } ] @pytest.mark.parametrize('dataclass_decorator', **dataclass_decorators()) def test_unparametrized_generic_dataclass(dataclass_decorator): T = TypeVar('T') @dataclass_decorator class GenericDataclass(Generic[T]): x: T # In principle we could call GenericDataclass(...) below, but this won't do validation # for standard dataclasses, so we just use TypeAdapter to get validation for each. 
validator = pydantic.TypeAdapter(GenericDataclass) assert validator.validate_python({'x': None}).x is None assert validator.validate_python({'x': 1}).x == 1 with pytest.raises(ValidationError) as exc_info: validator.validate_python({'y': None}) assert exc_info.value.errors(include_url=False) == [ {'input': {'y': None}, 'loc': ('x',), 'msg': 'Field required', 'type': 'missing'} ] @pytest.mark.parametrize('dataclass_decorator', **dataclass_decorators()) @pytest.mark.parametrize( 'annotation,input_value,error,output_value', [ (int, 1, False, 1), (str, 'a', False, 'a'), ( int, 'a', True, [ { 'input': 'a', 'loc': ('x',), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', } ], ), ], ) def test_parametrized_generic_dataclass(dataclass_decorator, annotation, input_value, error, output_value): T = TypeVar('T') @dataclass_decorator class GenericDataclass(Generic[T]): x: T # Need to use TypeAdapter here because GenericDataclass[annotation] will be a GenericAlias, which delegates # method calls to the (non-parametrized) origin class. This is essentially a limitation of typing._GenericAlias. 
validator = pydantic.TypeAdapter(GenericDataclass[annotation]) if not error: assert validator.validate_python({'x': input_value}).x == output_value else: with pytest.raises(ValidationError) as exc_info: validator.validate_python({'x': input_value}) assert exc_info.value.errors(include_url=False) == output_value def test_multiple_parametrized_generic_dataclasses(): T = TypeVar('T') @pydantic.dataclasses.dataclass class GenericDataclass(Generic[T]): x: T validator1 = pydantic.TypeAdapter(GenericDataclass[int]) validator2 = pydantic.TypeAdapter(GenericDataclass[str]) # verify that generic parameters are showing up in the type ref for generic dataclasses # this can probably be removed if the schema changes in some way that makes this part of the test fail assert '[int:' in validator1.core_schema['ref'] assert '[str:' in validator2.core_schema['ref'] assert validator1.validate_python({'x': 1}).x == 1 assert validator2.validate_python({'x': 'hello world'}).x == 'hello world' with pytest.raises(ValidationError) as exc_info: validator2.validate_python({'x': 1}) assert exc_info.value.errors(include_url=False) == [ {'input': 1, 'loc': ('x',), 'msg': 'Input should be a valid string', 'type': 'string_type'} ] with pytest.raises(ValidationError) as exc_info: validator1.validate_python({'x': 'hello world'}) assert exc_info.value.errors(include_url=False) == [ { 'input': 'hello world', 'loc': ('x',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'type': 'int_parsing', } ] @pytest.mark.parametrize('dataclass_decorator', **dataclass_decorators(include_identity=True)) def test_pydantic_dataclass_preserves_metadata(dataclass_decorator: Callable[[Any], Any]) -> None: @dataclass_decorator class FooStd: """Docstring""" FooPydantic = pydantic.dataclasses.dataclass(FooStd) assert FooPydantic.__module__ == FooStd.__module__ assert FooPydantic.__name__ == FooStd.__name__ assert FooPydantic.__qualname__ == FooStd.__qualname__ def 
test_recursive_dataclasses_gh_4509(create_module) -> None: @create_module def module(): import dataclasses from typing import List import pydantic @dataclasses.dataclass class Recipe: author: 'Cook' @dataclasses.dataclass class Cook: recipes: List[Recipe] @pydantic.dataclasses.dataclass class Foo(Cook): pass gordon = module.Cook([]) burger = module.Recipe(author=gordon) me = module.Foo([burger]) assert me.recipes == [burger] def test_dataclass_alias_generator(): def alias_generator(name: str) -> str: return 'alias_' + name @pydantic.dataclasses.dataclass(config=ConfigDict(alias_generator=alias_generator)) class User: name: str score: int = Field(alias='my_score') user = User(**{'alias_name': 'test name', 'my_score': 2}) assert user.name == 'test name' assert user.score == 2 with pytest.raises(ValidationError) as exc_info: User(name='test name', score=2) assert exc_info.value.errors(include_url=False) == [ { 'type': 'missing', 'loc': ('alias_name',), 'msg': 'Field required', 'input': ArgsKwargs((), {'name': 'test name', 'score': 2}), }, { 'type': 'missing', 'loc': ('my_score',), 'msg': 'Field required', 'input': ArgsKwargs((), {'name': 'test name', 'score': 2}), }, ] def test_init_vars_inheritance(): init_vars = [] @pydantic.dataclasses.dataclass class Foo: init: 'InitVar[int]' @pydantic.dataclasses.dataclass class Bar(Foo): arg: int def __post_init__(self, init: int) -> None: init_vars.append(init) bar = Bar(init=1, arg=2) assert TypeAdapter(Bar).dump_python(bar) == {'arg': 2} assert init_vars == [1] with pytest.raises(ValidationError) as exc_info: Bar(init='a', arg=2) assert exc_info.value.errors(include_url=False) == [ { 'input': 'a', 'loc': ('init',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'type': 'int_parsing', } ] @pytest.mark.skipif(not hasattr(pydantic.dataclasses, '_call_initvar'), reason='InitVar was not modified') @pytest.mark.parametrize('remove_monkeypatch', [True, False]) def 
test_init_vars_call_monkeypatch(remove_monkeypatch, monkeypatch): # Parametrizing like this allows us to test that the behavior is the same with or without the monkeypatch if remove_monkeypatch: monkeypatch.delattr(InitVar, '__call__') InitVar(int) # this is what is produced by InitVar[int]; note monkeypatching __call__ doesn't break this with pytest.raises(TypeError, match="'InitVar' object is not callable") as exc: InitVar[int]() # Check that the custom __call__ was called precisely if the monkeypatch was not removed stack_depth = len(traceback.extract_tb(exc.value.__traceback__)) assert stack_depth == 1 if remove_monkeypatch else 2 @pytest.mark.parametrize('decorator1', **dataclass_decorators()) @pytest.mark.parametrize('decorator2', **dataclass_decorators()) def test_decorators_in_model_field(decorator1, decorator2): @decorator1 class Demo1: int1: int @field_validator('int1', mode='before') def set_int_1(cls, v): return v + 100 @field_serializer('int1') def serialize_int_1(self, v): return v + 10 @decorator2 class Demo2(Demo1): int2: int @field_validator('int2', mode='before') def set_int_2(cls, v): return v + 200 @field_serializer('int2') def serialize_int_2(self, v): return v + 20 class Model(BaseModel): x: Demo2 m = Model.model_validate(dict(x=dict(int1=1, int2=2))) assert m.x.int1 == 101 assert m.x.int2 == 202 assert m.model_dump() == {'x': {'int1': 111, 'int2': 222}} @pytest.mark.parametrize('decorator1', **dataclass_decorators()) @pytest.mark.parametrize('decorator2', **dataclass_decorators()) def test_vanilla_dataclass_decorators_in_type_adapter(decorator1, decorator2): @decorator1 class Demo1: int1: int @field_validator('int1', mode='before') def set_int_1(cls, v): return v + 100 @field_serializer('int1') def serialize_int_1(self, v): return v + 10 @decorator2 class Demo2(Demo1): int2: int @field_validator('int2', mode='before') def set_int_2(cls, v): return v + 200 @field_serializer('int2') def serialize_int_2(self, v): return v + 20 adapter = 
TypeAdapter(Demo2) m = adapter.validate_python(dict(int1=1, int2=2)) assert m.int1 == 101 assert m.int2 == 202 assert adapter.dump_python(m) == {'int1': 111, 'int2': 222} @pytest.mark.parametrize( 'dataclass_decorator', [ pydantic.dataclasses.dataclass, dataclasses.dataclass, ], ids=['pydantic', 'stdlib'], ) @pytest.mark.skipif(sys.version_info < (3, 10), reason='slots are only supported for dataclasses in Python >= 3.10') def test_dataclass_slots(dataclass_decorator): @dataclass_decorator(slots=True) class Model: a: str b: str dc = TypeAdapter(Model).validate_python({'a': 'foo', 'b': 'bar'}) assert dc.a == 'foo' assert dc.b == 'bar' @pytest.mark.parametrize( 'dataclass_decorator', [ pydantic.dataclasses.dataclass, dataclasses.dataclass, ], ids=['pydantic', 'stdlib'], ) @pytest.mark.skipif(sys.version_info < (3, 10), reason='slots are only supported for dataclasses in Python >= 3.10') def test_dataclass_slots_mixed(dataclass_decorator): @dataclass_decorator(slots=True) class Model: x: int y: dataclasses.InitVar[str] z: ClassVar[str] = 'z-classvar' @dataclass_decorator class SubModel(Model): x2: int y2: dataclasses.InitVar[str] z2: ClassVar[str] = 'z2-classvar' dc = TypeAdapter(SubModel).validate_python({'x': 1, 'y': 'a', 'x2': 2, 'y2': 'b'}) assert dc.x == 1 assert dc.x2 == 2 assert SubModel.z == 'z-classvar' assert SubModel.z2 == 'z2-classvar' def test_rebuild_dataclass(): @pydantic.dataclasses.dataclass class MyDataClass: x: str assert rebuild_dataclass(MyDataClass) is None @pydantic.dataclasses.dataclass class MyDataClass1: d2: Optional['Foo'] = None # noqa F821 with pytest.raises(PydanticUndefinedAnnotation, match="name 'Foo' is not defined"): rebuild_dataclass(MyDataClass1, _parent_namespace_depth=0) @pydantic.dataclasses.dataclass class MyDataClass2: x: 'Foo' # noqa F821 assert not MyDataClass2.__pydantic_complete__ assert rebuild_dataclass(MyDataClass2, _types_namespace={'Foo': int}) assert MyDataClass2.__pydantic_complete__ @pytest.mark.parametrize( 
'dataclass_decorator', [ pydantic.dataclasses.dataclass, dataclasses.dataclass, ], ids=['pydantic', 'stdlib'], ) def test_model_config(dataclass_decorator: Any) -> None: @dataclass_decorator class Model: x: str __pydantic_config__ = ConfigDict(str_to_lower=True) ta = TypeAdapter(Model) assert ta.validate_python({'x': 'ABC'}).x == 'abc' def test_model_config_override_in_decorator() -> None: with pytest.warns( UserWarning, match='`config` is set via both the `dataclass` decorator and `__pydantic_config__`' ): @pydantic.dataclasses.dataclass(config=ConfigDict(str_to_lower=False, str_strip_whitespace=True)) class Model: x: str __pydantic_config__ = ConfigDict(str_to_lower=True) ta = TypeAdapter(Model) assert ta.validate_python({'x': 'ABC '}).x == 'ABC' def test_model_config_override_in_decorator_empty_config() -> None: with pytest.warns( UserWarning, match='`config` is set via both the `dataclass` decorator and `__pydantic_config__`' ): @pydantic.dataclasses.dataclass(config=ConfigDict()) class Model: x: str __pydantic_config__ = ConfigDict(str_to_lower=True) ta = TypeAdapter(Model) assert ta.validate_python({'x': 'ABC '}).x == 'ABC ' def test_dataclasses_with_config_decorator(): @dataclasses.dataclass @with_config(ConfigDict(str_to_lower=True)) class Model1: x: str ta = TypeAdapter(Model1) assert ta.validate_python({'x': 'ABC'}).x == 'abc' @with_config(ConfigDict(str_to_lower=True)) @dataclasses.dataclass class Model2: x: str ta = TypeAdapter(Model2) assert ta.validate_python({'x': 'ABC'}).x == 'abc' def test_pydantic_field_annotation(): @pydantic.dataclasses.dataclass class Model: x: Annotated[int, Field(gt=0)] with pytest.raises(ValidationError) as exc_info: Model(x=-1) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'gt': 0}, 'input': -1, 'loc': ('x',), 'msg': 'Input should be greater than 0', 'type': 'greater_than', } ] def test_combined_field_annotations(): """ This test is included to document the fact that `Field` and `field` can be used 
together. That said, if you mix them like this, there is a good chance you'll run into surprising behavior/bugs. (E.g., `x: Annotated[int, Field(gt=1, validate_default=True)] = field(default=0)` doesn't cause an error) I would generally advise against doing this, and if we do change the behavior in the future to somehow merge pydantic.FieldInfo and dataclasses.Field in a way that changes runtime behavior for existing code, I would probably consider it a bugfix rather than a breaking change. """ @pydantic.dataclasses.dataclass class Model: x: Annotated[int, Field(gt=1)] = dataclasses.field(default=1) assert Model().x == 1 with pytest.raises(ValidationError) as exc_info: Model(x=0) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'gt': 1}, 'input': 0, 'loc': ('x',), 'msg': 'Input should be greater than 1', 'type': 'greater_than', } ] def test_dataclass_field_default_factory_with_init(): @pydantic.dataclasses.dataclass class Model: x: int = dataclasses.field(default_factory=lambda: 3, init=False) m = Model() assert 'x' in Model.__pydantic_fields__ assert m.x == 3 assert RootModel[Model](m).model_dump() == {'x': 3} def test_dataclass_field_default_with_init(): @pydantic.dataclasses.dataclass class Model: x: int = dataclasses.field(default=3, init=False) m = Model() assert 'x' in Model.__pydantic_fields__ assert m.x == 3 assert RootModel[Model](m).model_dump() == {'x': 3} def test_metadata(): @dataclasses.dataclass class Test: value: int = dataclasses.field(metadata={'info': 'Some int value', 'json_schema_extra': {'a': 'b'}}) PydanticTest = pydantic.dataclasses.dataclass(Test) assert TypeAdapter(PydanticTest).json_schema() == { 'properties': {'value': {'a': 'b', 'title': 'Value', 'type': 'integer'}}, 'required': ['value'], 'title': 'Test', 'type': 'object', } def test_signature(): @pydantic.dataclasses.dataclass class Model: x: int y: str = 'y' z: float = dataclasses.field(default=1.0) a: float = dataclasses.field(default_factory=float) b: float = 
Field(default=1.0)
        c: float = Field(default_factory=float)
        d: int = dataclasses.field(metadata={'alias': 'dd'}, default=1)

    assert str(inspect.signature(Model)) == (
        "(x: int, y: str = 'y', z: float = 1.0, a: float = <factory>, "
        "b: float = 1.0, c: float = <factory>, dd: int = 1) -> None"
    )


def test_inherited_dataclass_signature():
    @pydantic.dataclasses.dataclass
    class A:
        a: int

    @pydantic.dataclasses.dataclass
    class B(A):
        b: int

    assert str(inspect.signature(A)) == '(a: int) -> None'
    assert str(inspect.signature(B)) == '(a: int, b: int) -> None'


def test_dataclasses_with_slots_and_default():
    @pydantic.dataclasses.dataclass(slots=True)
    class A:
        a: int = 0

    assert A().a == 0

    @pydantic.dataclasses.dataclass(slots=True)
    class B:
        b: int = Field(1)

    assert B().b == 1


@pytest.mark.parametrize('decorator1', **dataclass_decorators())
def test_annotated_before_validator_called_once(decorator1):
    count = 0

    def convert(value: int) -> str:
        nonlocal count
        count += 1
        return str(value)

    IntToStr = Annotated[str, BeforeValidator(convert)]

    @decorator1
    class A:
        a: IntToStr

    assert count == 0
    TypeAdapter(A).validate_python({'a': 123})
    assert count == 1


def test_is_pydantic_dataclass():
    @pydantic.dataclasses.dataclass
    class PydanticDataclass:
        a: int

    @dataclasses.dataclass
    class StdLibDataclass:
        b: int

    assert is_pydantic_dataclass(PydanticDataclass) is True
    assert is_pydantic_dataclass(StdLibDataclass) is False


def test_can_inherit_stdlib_dataclasses_with_defaults():
    @dataclasses.dataclass
    class Base:
        a: None = None

    class Model(BaseModel, Base):
        pass

    assert Model().a is None


def test_can_inherit_stdlib_dataclasses_default_factories_and_use_them():
    """This test documents that default factories are not supported"""

    @dataclasses.dataclass
    class Base:
        a: str = dataclasses.field(default_factory=lambda: 'TEST')

    class Model(BaseModel, Base):
        pass

    with pytest.raises(ValidationError):
        assert Model().a == 'TEST'


def test_can_inherit_stdlib_dataclasses_default_factories_and_provide_a_value():
    @dataclasses.dataclass
    class
Base: a: str = dataclasses.field(default_factory=lambda: 'TEST') class Model(BaseModel, Base): pass assert Model(a='NOT_THE_SAME').a == 'NOT_THE_SAME' def test_can_inherit_stdlib_dataclasses_with_dataclass_fields(): @dataclasses.dataclass class Base: a: int = dataclasses.field(default=5) class Model(BaseModel, Base): pass assert Model().a == 5 def test_alias_with_dashes(): """Test for fix issue #7226.""" @pydantic.dataclasses.dataclass class Foo: some_var: str = Field(alias='some-var') obj = Foo(**{'some-var': 'some_value'}) assert obj.some_var == 'some_value' def test_validate_strings(): @pydantic.dataclasses.dataclass class Nested: d: date class Model(BaseModel): n: Nested assert Model.model_validate_strings({'n': {'d': '2017-01-01'}}).n.d == date(2017, 1, 1) @pytest.mark.parametrize('field_constructor', [dataclasses.field, pydantic.dataclasses.Field]) @pytest.mark.parametrize('extra', ['ignore', 'forbid']) def test_init_false_not_in_signature(extra, field_constructor): @pydantic.dataclasses.dataclass(config=ConfigDict(extra=extra)) class MyDataclass: a: int = field_constructor(init=False, default=-1) b: int = pydantic.dataclasses.Field(default=2) signature = inspect.signature(MyDataclass) # `a` should not be in the __init__ assert 'a' not in signature.parameters.keys() assert 'b' in signature.parameters.keys() init_test_cases = [ ({'a': 2, 'b': -1}, 'ignore', {'a': 2, 'b': 1}), ({'a': 2}, 'ignore', {'a': 2, 'b': 1}), ( {'a': 2, 'b': -1}, 'forbid', [ { 'type': 'unexpected_keyword_argument', 'loc': ('b',), 'msg': 'Unexpected keyword argument', 'input': -1, } ], ), ({'a': 2}, 'forbid', {'a': 2, 'b': 1}), ] @pytest.mark.parametrize('field_constructor', [dataclasses.field, pydantic.dataclasses.Field]) @pytest.mark.parametrize( 'input_data,extra,expected', init_test_cases, ) def test_init_false_with_post_init(input_data, extra, expected, field_constructor): @pydantic.dataclasses.dataclass(config=ConfigDict(extra=extra)) class MyDataclass: a: int b: int = 
field_constructor(init=False) def __post_init__(self): self.b = 1 if isinstance(expected, list): with pytest.raises(ValidationError) as exc_info: MyDataclass(**input_data) assert exc_info.value.errors(include_url=False) == expected else: assert dataclasses.asdict(MyDataclass(**input_data)) == expected @pytest.mark.parametrize('field_constructor', [dataclasses.field, pydantic.dataclasses.Field]) @pytest.mark.parametrize( 'input_data,extra,expected', init_test_cases, ) def test_init_false_with_default(input_data, extra, expected, field_constructor): @pydantic.dataclasses.dataclass(config=ConfigDict(extra=extra)) class MyDataclass: a: int b: int = field_constructor(init=False, default=1) if isinstance(expected, list): with pytest.raises(ValidationError) as exc_info: MyDataclass(**input_data) assert exc_info.value.errors(include_url=False) == expected else: assert dataclasses.asdict(MyDataclass(**input_data)) == expected def test_disallow_extra_allow_and_init_false() -> None: with pytest.raises(PydanticUserError, match='This combination is not allowed.'): @pydantic.dataclasses.dataclass(config=ConfigDict(extra='allow')) class A: a: int = Field(init=False, default=1) def test_disallow_init_false_and_init_var_true() -> None: with pytest.raises(PydanticUserError, match='mutually exclusive.'): @pydantic.dataclasses.dataclass class Foo: bar: str = Field(init=False, init_var=True) def test_annotations_valid_for_field_inheritance() -> None: # testing https://github.com/pydantic/pydantic/issues/8670 @pydantic.dataclasses.dataclass() class A: a: int = pydantic.dataclasses.Field() @pydantic.dataclasses.dataclass() class B(A): ... 
assert B.__pydantic_fields__['a'].annotation is int assert B(a=1).a == 1 def test_annotations_valid_for_field_inheritance_with_existing_field() -> None: # variation on testing https://github.com/pydantic/pydantic/issues/8670 @pydantic.dataclasses.dataclass() class A: a: int = pydantic.dataclasses.Field() @pydantic.dataclasses.dataclass() class B(A): b: str = pydantic.dataclasses.Field() assert B.__pydantic_fields__['a'].annotation is int assert B.__pydantic_fields__['b'].annotation is str b = B(a=1, b='b') assert b.a == 1 assert b.b == 'b' def test_annotation_with_double_override() -> None: @pydantic.dataclasses.dataclass() class A: a: int b: int c: int = pydantic.dataclasses.Field() d: int = pydantic.dataclasses.Field() # note, the order of fields is different here, as to test that the annotation # is correctly set on the field no matter the base's default / current class's default @pydantic.dataclasses.dataclass() class B(A): a: str c: str b: str = pydantic.dataclasses.Field() d: str = pydantic.dataclasses.Field() @pydantic.dataclasses.dataclass() class C(B): ... 
for class_ in [B, C]: instance = class_(a='a', b='b', c='c', d='d') for field_name in ['a', 'b', 'c', 'd']: assert class_.__pydantic_fields__[field_name].annotation is str assert getattr(instance, field_name) == field_name def test_schema_valid_for_inner_generic() -> None: T = TypeVar('T') @pydantic.dataclasses.dataclass() class Inner(Generic[T]): x: T @pydantic.dataclasses.dataclass() class Outer: inner: Inner[int] assert Outer(inner={'x': 1}).inner.x == 1 # note, this isn't Inner[Int] like it is for the BaseModel case, but the type of x is substituted, which is the important part assert Outer.__pydantic_core_schema__['schema']['fields'][0]['schema']['cls'] == Inner assert ( Outer.__pydantic_core_schema__['schema']['fields'][0]['schema']['schema']['fields'][0]['schema']['type'] == 'int' ) def test_validation_works_for_cyclical_forward_refs() -> None: @pydantic.dataclasses.dataclass() class X: y: Union['Y', None] @pydantic.dataclasses.dataclass() class Y: x: Union[X, None] assert Y(x={'y': None}).x.y is None def test_annotated_with_field_default_factory() -> None: """ https://github.com/pydantic/pydantic/issues/9947 """ field = dataclasses.field @pydantic.dataclasses.dataclass() class A: a: Annotated[int, Field(default_factory=lambda: 1)] b: Annotated[int, Field(default_factory=lambda: 1)] = Field() c: Annotated[int, Field(default_factory=lambda: 2), Field(default_factory=lambda: 1)] = Field() d: Annotated[int, Field] = Field(default_factory=lambda: 2) e: int = Field(default_factory=lambda: 2) f: Annotated[int, Field(default_factory=lambda: 1)] = Field(default_factory=lambda: 2) # check the same tests for dataclasses.field @pydantic.dataclasses.dataclass() class B: a: Annotated[int, Field(default_factory=lambda: 1)] b: Annotated[int, Field(default_factory=lambda: 1)] = field() c: Annotated[int, field(default_factory=lambda: 2), Field(default_factory=lambda: 1)] = field() d: Annotated[int, field] = Field(default_factory=lambda: 2) e: int = 
field(default_factory=lambda: 2) f: Annotated[int, Field(default_factory=lambda: 1)] = field(default_factory=lambda: 2) for cls in (A, B): instance = cls() # type: ignore field_names = ('a', 'b', 'c', 'd', 'e', 'f') results = (1, 1, 1, 2, 2, 2) for field_name, result in zip(field_names, results): assert getattr(instance, field_name) == result def test_simple_frozen() -> None: @pydantic.dataclasses.dataclass(frozen=True) class MyDataclass: x: str inst = MyDataclass('hello') try: inst.x = 'other' except Exception as e: assert "cannot assign to field 'x'" in repr(e) @pydantic.dataclasses.dataclass(config=ConfigDict(frozen=True)) class MyDataclass2: x: str inst = MyDataclass2('hello') try: inst.x = 'other' except Exception as e: assert "cannot assign to field 'x'" in repr(e) def test_frozen_with_validate_assignment() -> None: """Test for https://github.com/pydantic/pydantic/issues/10041.""" @pydantic.dataclasses.dataclass(frozen=True, config=ConfigDict(validate_assignment=True)) class MyDataclass: x: str inst = MyDataclass('hello') try: inst.x = 'other' except Exception as e: assert "cannot assign to field 'x'" in repr(e) @pydantic.dataclasses.dataclass(config=ConfigDict(frozen=True, validate_assignment=True)) class MyDataclass2: x: str inst = MyDataclass2('hello') # we want to make sure that the error raised relates to the frozen nature of the instance try: inst.x = 'other' except ValidationError as e: assert 'Instance is frozen' in repr(e) def test_warns_on_double_frozen() -> None: with pytest.warns(UserWarning, match='`frozen` is set via both the `dataclass` decorator and `config`'): @pydantic.dataclasses.dataclass(frozen=True, config=ConfigDict(frozen=True)) class DC: x: int def test_warns_on_double_config() -> None: with pytest.warns( UserWarning, match='`config` is set via both the `dataclass` decorator and `__pydantic_config__`' ): @pydantic.dataclasses.dataclass(config=ConfigDict(title='from decorator')) class Foo: __pydantic_config__ = ConfigDict(title='from 
__pydantic_config__')


def test_config_pushdown_vanilla_dc() -> None:
    class ArbitraryType:
        pass

    @dataclasses.dataclass
    class DC:
        a: ArbitraryType

    class Model(BaseModel):
        model_config = ConfigDict(arbitrary_types_allowed=True)

        dc: DC


def test_deferred_dataclass_fields_available() -> None:
    # This aligns with deferred Pydantic models:
    @pydantic.dataclasses.dataclass(config={'defer_build': True})
    class A:
        a: int

    assert 'a' in A.__pydantic_fields__  # pyright: ignore[reportAttributeAccessIssue]


# pydantic-2.10.6/tests/test_datetime.py

import re
from datetime import date, datetime, time, timedelta, timezone

import pytest
from dirty_equals import HasRepr
from typing_extensions import Annotated

from pydantic import (
    AwareDatetime,
    BaseModel,
    FutureDate,
    FutureDatetime,
    NaiveDatetime,
    PastDate,
    PastDatetime,
    ValidationError,
    condate,
)

from .conftest import Err


def create_tz(minutes):
    return timezone(timedelta(minutes=minutes))


@pytest.fixture(scope='module', params=[FutureDate, Annotated[date, FutureDate()]])
def future_date_type(request):
    return request.param


@pytest.fixture(scope='module', params=[PastDate, Annotated[date, PastDate()]])
def past_date_type(request):
    return request.param


@pytest.fixture(scope='module', params=[FutureDatetime, Annotated[datetime, FutureDatetime()]])
def future_datetime_type(request):
    return request.param


@pytest.fixture(scope='module', params=[PastDatetime, Annotated[datetime, PastDatetime()]])
def past_datetime_type(request):
    return request.param


@pytest.fixture(scope='module', params=[AwareDatetime, Annotated[datetime, AwareDatetime()]])
def aware_datetime_type(request):
    return request.param


@pytest.fixture(scope='module', params=[NaiveDatetime, Annotated[datetime, NaiveDatetime()]])
def naive_datetime_type(request):
    return request.param


@pytest.fixture(scope='module', name='DateModel')
def date_model_fixture():
    class DateModel(BaseModel):
        d: date

    return DateModel
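# The numeric cases below exercise the "watershed" in timestamp parsing:
# magnitudes up to roughly 2e10 are read as seconds since the Unix epoch,
# larger ones as milliseconds (hence the "just before/after watershed" cases).
# A minimal sketch using values taken from the cases below (the `EpochDate`
# model name is illustrative, not part of the suite):

```python
from datetime import date

from pydantic import BaseModel


class EpochDate(BaseModel):
    d: date


# The same instant expressed in seconds and in milliseconds parses to the
# same date: the unit is chosen from the magnitude of the number.
assert EpochDate(d=1_493_942_400).d == date(2017, 5, 5)      # seconds
assert EpochDate(d=1_493_942_400_000).d == date(2017, 5, 5)  # milliseconds
```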
@pytest.mark.parametrize(
    'value,result',
    [
        # Valid inputs
        (1_493_942_400, date(2017, 5, 5)),
        (1_493_942_400_000, date(2017, 5, 5)),
        (0, date(1970, 1, 1)),
        ('2012-04-23', date(2012, 4, 23)),
        (b'2012-04-23', date(2012, 4, 23)),
        (date(2012, 4, 9), date(2012, 4, 9)),
        (datetime(2012, 4, 9, 0, 0), date(2012, 4, 9)),
        # Invalid inputs
        (datetime(2012, 4, 9, 12, 15), Err('Datetimes provided to dates should have zero time - e.g. be exact dates')),
        ('x20120423', Err('Input should be a valid date or datetime, input is too short')),
        ('2012-04-56', Err('Input should be a valid date or datetime, day value is outside expected range')),
        (19_999_958_400, date(2603, 10, 11)),  # just before watershed
        (20000044800, Err('type=date_from_datetime_inexact,')),  # just after watershed
        (1_549_238_400, date(2019, 2, 4)),  # nowish in s
        (1_549_238_400_000, date(2019, 2, 4)),  # nowish in ms
        (1_549_238_400_000_000, Err('Input should be a valid date or datetime, dates after 9999')),  # nowish in μs
        (1_549_238_400_000_000_000, Err('Input should be a valid date or datetime, dates after 9999')),  # nowish in ns
        ('infinity', Err('Input should be a valid date or datetime, input is too short')),
        (float('inf'), Err('Input should be a valid date or datetime, dates after 9999')),
        (int('1' + '0' * 100), Err('Input should be a valid date or datetime, dates after 9999')),
        (1e1000, Err('Input should be a valid date or datetime, dates after 9999')),
        (float('-infinity'), Err('Input should be a valid date or datetime, dates before 0000')),
        (float('nan'), Err('Input should be a valid date or datetime, NaN values not permitted')),
    ],
)
def test_date_parsing(DateModel, value, result):
    if isinstance(result, Err):
        with pytest.raises(ValidationError, match=result.message_escaped()):
            DateModel(d=value)
    else:
        assert DateModel(d=value).d == result


@pytest.fixture(scope='module', name='TimeModel')
def time_model_fixture():
    class TimeModel(BaseModel):
        d: time

    return TimeModel


@pytest.mark.parametrize(
    'value,result',
    [
        # Valid inputs
        ('09:15:00', time(9, 15)),
        ('10:10', time(10, 10)),
        ('10:20:30.400', time(10, 20, 30, 400_000)),
        (b'10:20:30.400', time(10, 20, 30, 400_000)),
        (time(4, 8, 16), time(4, 8, 16)),
        (3610, time(1, 0, 10, tzinfo=timezone.utc)),
        (3600.5, time(1, 0, 0, 500000, tzinfo=timezone.utc)),
        (86400 - 1, time(23, 59, 59, tzinfo=timezone.utc)),
        # Invalid inputs
        ('4:8:16', Err('Input should be in a valid time format, invalid character in hour [type=time_parsing,')),
        (86400, Err('Input should be in a valid time format, numeric times may not exceed 86,399 seconds')),
        ('xxx', Err('Input should be in a valid time format, input is too short [type=time_parsing,')),
        ('091500', Err('Input should be in a valid time format, invalid time separator, expected `:`')),
        (b'091500', Err('Input should be in a valid time format, invalid time separator, expected `:`')),
        ('09:15:90', Err('Input should be in a valid time format, second value is outside expected range of 0-59')),
        ('11:05:00Y', Err('Input should be in a valid time format, invalid timezone sign')),
        # https://github.com/pydantic/speedate/issues/10
        ('11:05:00-05:30', time(11, 5, 0, tzinfo=create_tz(-330))),
        ('11:05:00-0530', time(11, 5, 0, tzinfo=create_tz(-330))),
        ('11:05:00Z', time(11, 5, 0, tzinfo=timezone.utc)),
        ('11:05:00+00:00', time(11, 5, 0, tzinfo=timezone.utc)),
        ('11:05-06:00', time(11, 5, 0, tzinfo=create_tz(-360))),
        ('11:05+06:00', time(11, 5, 0, tzinfo=create_tz(360))),
        ('11:05:00-25:00', Err('Input should be in a valid time format, timezone offset must be less than 24 hours')),
    ],
)
def test_time_parsing(TimeModel, value, result):
    if isinstance(result, Err):
        with pytest.raises(ValidationError, match=result.message_escaped()):
            TimeModel(d=value)
    else:
        assert TimeModel(d=value).d == result


@pytest.fixture(scope='module', name='DatetimeModel')
def datetime_model_fixture():
    class DatetimeModel(BaseModel):
        dt: datetime

    return DatetimeModel


@pytest.mark.parametrize(
    'value,result',
    [
        # Valid inputs
        # values in seconds
        (1_494_012_444.883_309, datetime(2017, 5, 5, 19, 27, 24, 883_309, tzinfo=timezone.utc)),
        (1_494_012_444, datetime(2017, 5, 5, 19, 27, 24, tzinfo=timezone.utc)),
        # values in ms
        (1_494_012_444_000, datetime(2017, 5, 5, 19, 27, 24, tzinfo=timezone.utc)),
        ('2012-04-23T09:15:00', datetime(2012, 4, 23, 9, 15)),
        ('2012-04-23T09:15:00Z', datetime(2012, 4, 23, 9, 15, 0, 0, timezone.utc)),
        ('2012-04-23T10:20:30.400+02:30', datetime(2012, 4, 23, 10, 20, 30, 400_000, create_tz(150))),
        ('2012-04-23T10:20:30.400+02:00', datetime(2012, 4, 23, 10, 20, 30, 400_000, create_tz(120))),
        ('2012-04-23T10:20:30.400-02:00', datetime(2012, 4, 23, 10, 20, 30, 400_000, create_tz(-120))),
        (b'2012-04-23T10:20:30.400-02:00', datetime(2012, 4, 23, 10, 20, 30, 400_000, create_tz(-120))),
        (datetime(2017, 5, 5), datetime(2017, 5, 5)),
        (0, datetime(1970, 1, 1, 0, 0, 0, tzinfo=timezone.utc)),
        # Numeric inputs as strings
        ('1494012444.883309', datetime(2017, 5, 5, 19, 27, 24, 883309, tzinfo=timezone.utc)),
        ('1494012444', datetime(2017, 5, 5, 19, 27, 24, tzinfo=timezone.utc)),
        (b'1494012444', datetime(2017, 5, 5, 19, 27, 24, tzinfo=timezone.utc)),
        ('1494012444000.883309', datetime(2017, 5, 5, 19, 27, 24, 883, tzinfo=timezone.utc)),
        ('-1494012444000.883309', datetime(1922, 8, 29, 4, 32, 35, 999117, tzinfo=timezone.utc)),
        (19_999_999_999, datetime(2603, 10, 11, 11, 33, 19, tzinfo=timezone.utc)),  # just before watershed
        (20_000_000_001, datetime(1970, 8, 20, 11, 33, 20, 1000, tzinfo=timezone.utc)),  # just after watershed
        (1_549_316_052, datetime(2019, 2, 4, 21, 34, 12, 0, tzinfo=timezone.utc)),  # nowish in s
        (1_549_316_052_104, datetime(2019, 2, 4, 21, 34, 12, 104_000, tzinfo=timezone.utc)),  # nowish in ms
        # Invalid inputs
        (1_549_316_052_104_324, Err('Input should be a valid datetime, dates after 9999')),  # nowish in μs
        (1_549_316_052_104_324_096, Err('Input should be a valid datetime, dates after 9999')),  # nowish in ns
        (float('inf'), Err('Input should be a valid datetime, dates after 9999')),
        (float('-inf'), Err('Input should be a valid datetime, dates before 0000')),
        (1e50, Err('Input should be a valid datetime, dates after 9999')),
        (float('nan'), Err('Input should be a valid datetime, NaN values not permitted')),
    ],
)
def test_datetime_parsing(DatetimeModel, value, result):
    if isinstance(result, Err):
        with pytest.raises(ValidationError, match=result.message_escaped()):
            DatetimeModel(dt=value)
    else:
        assert DatetimeModel(dt=value).dt == result


@pytest.mark.parametrize(
    'value,result',
    [
        # Invalid inputs
        ('2012-4-9 4:8:16', Err('Input should be a valid datetime or date, invalid character in month')),
        ('x20120423091500', Err('Input should be a valid datetime or date, invalid character in year')),
        ('2012-04-56T09:15:90', Err('Input should be a valid datetime or date, day value is outside expected range')),
        (
            '2012-04-23T11:05:00-25:00',
            Err('Input should be a valid datetime or date, unexpected extra characters at the end of the input'),
        ),
        ('infinity', Err('Input should be a valid datetime or date, input is too short')),
    ],
)
def test_datetime_parsing_from_str(DatetimeModel, value, result):
    if isinstance(result, Err):
        with pytest.raises(ValidationError, match=result.message_escaped()):
            DatetimeModel(dt=value)
    else:
        assert DatetimeModel(dt=value).dt == result


def test_aware_datetime_validation_success(aware_datetime_type):
    class Model(BaseModel):
        foo: aware_datetime_type

    value = datetime.now(tz=timezone.utc)
    assert Model(foo=value).foo == value


def test_aware_datetime_validation_fails(aware_datetime_type):
    class Model(BaseModel):
        foo: aware_datetime_type

    value = datetime.now()
    with pytest.raises(ValidationError) as exc_info:
        Model(foo=value)

    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'timezone_aware',
            'loc': ('foo',),
            'msg': 'Input should have timezone info',
            'input': value,
        }
    ]


def test_naive_datetime_validation_success(naive_datetime_type):
    class Model(BaseModel):
        foo: naive_datetime_type

    value = datetime.now()
    assert Model(foo=value).foo == value


def test_naive_datetime_validation_fails(naive_datetime_type):
    class Model(BaseModel):
        foo: naive_datetime_type

    value = datetime.now(tz=timezone.utc)
    with pytest.raises(ValidationError) as exc_info:
        Model(foo=value)

    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'timezone_naive',
            'loc': ('foo',),
            'msg': 'Input should not have timezone info',
            'input': value,
        }
    ]


@pytest.fixture(scope='module', name='TimedeltaModel')
def timedelta_model_fixture():
    class TimedeltaModel(BaseModel):
        d: timedelta

    return TimedeltaModel


@pytest.mark.parametrize(
    'delta',
    [
        timedelta(days=4, minutes=15, seconds=30, milliseconds=100),  # fractions of seconds
        timedelta(hours=10, minutes=15, seconds=30),  # hours, minutes, seconds
        timedelta(days=4, minutes=15, seconds=30),  # multiple days
        timedelta(days=1, minutes=00, seconds=00),  # single day
        timedelta(days=-4, minutes=15, seconds=30),  # negative durations
        timedelta(minutes=15, seconds=30),  # minute & seconds
        timedelta(seconds=30),  # seconds
    ],
)
def test_parse_python_format(TimedeltaModel, delta):
    assert TimedeltaModel(d=delta).d == delta
    # assert TimedeltaModel(d=str(delta)).d == delta


@pytest.mark.parametrize(
    'value,result',
    [
        # seconds
        (timedelta(seconds=30), timedelta(seconds=30)),
        (30, timedelta(seconds=30)),
        (30.1, timedelta(seconds=30, milliseconds=100)),
        (9.9e-05, timedelta(microseconds=99)),
        # minutes seconds
        ('00:15:30', timedelta(minutes=15, seconds=30)),
        ('00:05:30', timedelta(minutes=5, seconds=30)),
        # hours minutes seconds
        ('10:15:30', timedelta(hours=10, minutes=15, seconds=30)),
        ('01:15:30', timedelta(hours=1, minutes=15, seconds=30)),
        # ('100:200:300', timedelta(hours=100, minutes=200, seconds=300)),
        # days
        ('4d,00:15:30', timedelta(days=4, minutes=15, seconds=30)),
        ('4d,10:15:30', timedelta(days=4, hours=10, minutes=15, seconds=30)),
        # fractions of seconds
        ('00:15:30.1', timedelta(minutes=15, seconds=30, milliseconds=100)),
        ('00:15:30.01', timedelta(minutes=15, seconds=30, milliseconds=10)),
        ('00:15:30.001', timedelta(minutes=15, seconds=30, milliseconds=1)),
        ('00:15:30.0001', timedelta(minutes=15, seconds=30, microseconds=100)),
        ('00:15:30.00001', timedelta(minutes=15, seconds=30, microseconds=10)),
        ('00:15:30.000001', timedelta(minutes=15, seconds=30, microseconds=1)),
        (b'00:15:30.000001', timedelta(minutes=15, seconds=30, microseconds=1)),
        # negative
        ('-4d,00:15:30', timedelta(days=-4, minutes=-15, seconds=-30)),
        (-172800, timedelta(days=-2)),
        ('-00:15:30', timedelta(minutes=-15, seconds=-30)),
        ('-01:15:30', timedelta(hours=-1, minutes=-15, seconds=-30)),
        (-30.1, timedelta(seconds=-30, milliseconds=-100)),
        # iso_8601
        ('30', Err('Input should be a valid timedelta, "day" identifier')),
        ('P4Y', timedelta(days=1460)),
        ('P4M', timedelta(days=120)),
        ('P4W', timedelta(days=28)),
        ('P4D', timedelta(days=4)),
        ('P0.5D', timedelta(hours=12)),
        ('PT5H', timedelta(hours=5)),
        ('PT5M', timedelta(minutes=5)),
        ('PT5S', timedelta(seconds=5)),
        ('PT0.000005S', timedelta(microseconds=5)),
        (b'PT0.000005S', timedelta(microseconds=5)),
    ],
)
def test_parse_durations(TimedeltaModel, value, result):
    if isinstance(result, Err):
        with pytest.raises(ValidationError, match=result.message_escaped()):
            TimedeltaModel(d=value)
    else:
        assert TimedeltaModel(d=value).d == result


@pytest.mark.parametrize(
    'field, value, error_message',
    [
        ('dt', [], 'Input should be a valid datetime'),
        ('dt', {}, 'Input should be a valid datetime'),
        ('dt', object, 'Input should be a valid datetime'),
        ('d', [], 'Input should be a valid date'),
        ('d', {}, 'Input should be a valid date'),
        ('d', object, 'Input should be a valid date'),
        ('t', [], 'Input should be a valid time'),
        ('t', {}, 'Input should be a valid time'),
        ('t', object, 'Input should be a valid time'),
        ('td', [], 'Input should be a valid timedelta'),
        ('td', {}, 'Input should be a valid timedelta'),
        ('td', object, 'Input should be a valid timedelta'),
    ],
)
def test_model_type_errors(field, value, error_message):
    class Model(BaseModel):
        dt: datetime = None
        d: date = None
        t: time = None
        td: timedelta = None

    with pytest.raises(ValidationError) as exc_info:
        Model(**{field: value})

    assert len(exc_info.value.errors(include_url=False)) == 1
    error = exc_info.value.errors(include_url=False)[0]
    assert error['msg'] == error_message


@pytest.mark.parametrize('field', ['dt', 'd', 't', 'dt'])
def test_unicode_decode_error(field):
    class Model(BaseModel):
        dt: datetime = None
        d: date = None
        t: time = None
        td: timedelta = None

    with pytest.raises(ValidationError) as exc_info:
        Model(**{field: b'\x81\x81\x81\x81\x81\x81\x81\x81'})
    assert exc_info.value.error_count() == 1
    # errors vary


def test_nan():
    class Model(BaseModel):
        dt: datetime
        d: date

    with pytest.raises(ValidationError) as exc_info:
        Model(dt=float('nan'), d=float('nan'))

    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'datetime_parsing',
            'loc': ('dt',),
            'msg': 'Input should be a valid datetime, NaN values not permitted',
            'input': HasRepr('nan'),
            'ctx': {'error': 'NaN values not permitted'},
        },
        {
            'type': 'date_from_datetime_parsing',
            'loc': ('d',),
            'msg': 'Input should be a valid date or datetime, NaN values not permitted',
            'input': HasRepr('nan'),
            'ctx': {'error': 'NaN values not permitted'},
        },
    ]


@pytest.mark.parametrize(
    'constraint,msg,ok_value,error_value',
    [
        ('gt', 'greater than', date(2020, 1, 2), date(2019, 12, 31)),
        ('gt', 'greater than', date(2020, 1, 2), date(2020, 1, 1)),
        ('ge', 'greater than or equal to', date(2020, 1, 2), date(2019, 12, 31)),
        ('ge', 'greater than or equal to', date(2020, 1, 1), date(2019, 12, 31)),
        ('lt', 'less than', date(2019, 12, 31), date(2020, 1, 2)),
        ('lt', 'less than', date(2019, 12, 31), date(2020, 1, 1)),
        ('le', 'less than or equal to', date(2019, 12, 31), date(2020, 1, 2)),
        ('le', 'less than or equal to', date(2020, 1, 1), date(2020, 1, 2)),
    ],
)
def test_date_constraints(constraint, msg, ok_value, error_value):
    class Model(BaseModel):
        a: condate(**{constraint: date(2020, 1, 1)})

    assert Model(a=ok_value).model_dump() == {'a': ok_value}

    with pytest.raises(ValidationError, match=re.escape(f'Input should be {msg} 2020-01-01')):
        Model(a=error_value)


@pytest.mark.parametrize(
    'value,result',
    (
        ('1996-01-22', date(1996, 1, 22)),
        (date(1996, 1, 22), date(1996, 1, 22)),
    ),
)
def test_past_date_validation_success(value, result, past_date_type):
    class Model(BaseModel):
        foo: past_date_type

    assert Model(foo=value).foo == result


@pytest.mark.parametrize(
    'value',
    (
        date.today(),
        date.today() + timedelta(1),
        '2064-06-01',
    ),
)
def test_past_date_validation_fails(value, past_date_type):
    class Model(BaseModel):
        foo: past_date_type

    with pytest.raises(ValidationError) as exc_info:
        Model(foo=value)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'date_past',
            'loc': ('foo',),
            'msg': 'Date should be in the past',
            'input': value,
        }
    ]


@pytest.mark.parametrize(
    'value,result',
    (
        (date.today() + timedelta(1), date.today() + timedelta(1)),
        ('2064-06-01', date(2064, 6, 1)),
    ),
)
def test_future_date_validation_success(value, result, future_date_type):
    class Model(BaseModel):
        foo: future_date_type

    assert Model(foo=value).foo == result


@pytest.mark.parametrize(
    'value',
    (
        date.today(),
        date.today() - timedelta(1),
        '1996-01-22',
    ),
)
def test_future_date_validation_fails(value, future_date_type):
    class Model(BaseModel):
        foo: future_date_type

    with pytest.raises(ValidationError) as exc_info:
        Model(foo=value)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'date_future',
            'loc': ('foo',),
            'msg': 'Date should be in the future',
            'input': value,
        }
    ]


@pytest.mark.parametrize(
    'value,result',
    (
        ('1996-01-22T10:20:30', datetime(1996, 1, 22, 10, 20, 30)),
        (datetime(1996, 1, 22, 10, 20, 30), datetime(1996, 1, 22, 10, 20, 30)),
    ),
)
def test_past_datetime_validation_success(value, result, past_datetime_type):
    class Model(BaseModel):
        foo: past_datetime_type

    assert Model(foo=value).foo == result


@pytest.mark.parametrize(
    'value',
    (
        datetime.now() + timedelta(1),
        '2064-06-01T10:20:30',
    ),
)
def test_past_datetime_validation_fails(value, past_datetime_type):
    class Model(BaseModel):
        foo: past_datetime_type

    with pytest.raises(ValidationError) as exc_info:
        Model(foo=value)
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'datetime_past',
            'loc': ('foo',),
            'msg': 'Input should be in the past',
            'input': value,
        }
    ]


def test_future_datetime_validation_success(future_datetime_type):
    class Model(BaseModel):
        foo: future_datetime_type

    d = datetime.now() + timedelta(1)
    assert Model(foo=d).foo == d
    assert Model(foo='2064-06-01T10:20:30').foo == datetime(2064, 6, 1, 10, 20, 30)


@pytest.mark.parametrize(
    'value',
    (
        datetime.now(),
        datetime.now() - timedelta(1),
        '1996-01-22T10:20:30',
    ),
)
def test_future_datetime_validation_fails(value, future_datetime_type):
    class Model(BaseModel):
        foo: future_datetime_type

    with pytest.raises(ValidationError) as exc_info:
        Model(foo=value)
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'datetime_future',
            'loc': ('foo',),
            'msg': 'Input should be in the future',
            'input': value,
        }
    ]


@pytest.mark.parametrize(
    'annotation',
    (
        PastDate,
        PastDatetime,
        FutureDate,
        FutureDatetime,
        NaiveDatetime,
        AwareDatetime,
    ),
)
def test_invalid_annotated_type(annotation):
    with pytest.raises(TypeError, match=f"'{annotation.__name__}' cannot annotate 'str'."):

        class Model(BaseModel):
            foo: Annotated[str, annotation()]


def test_tzinfo_could_be_reused():
    class Model(BaseModel):
        value: datetime

    m = Model(value='2015-10-21T15:28:00.000000+01:00')
    assert m.model_dump_json() == '{"value":"2015-10-21T15:28:00+01:00"}'

    target = datetime(1955, 11, 12, 14, 38, tzinfo=m.value.tzinfo)
    assert target == datetime(1955, 11, 12, 14, 38, tzinfo=timezone(timedelta(hours=1)))

    now = datetime.now(tz=m.value.tzinfo)
    assert isinstance(now, datetime)


def test_datetime_from_date_str():
    class Model(BaseModel):
        value: datetime

    m = Model(value='2015-10-21')
    assert m.value == datetime(2015, 10, 21, 0, 0)


# ---- pydantic-2.10.6/tests/test_decorators.py ----

import pytest

from pydantic import PydanticUserError
from pydantic._internal._decorators import inspect_annotated_serializer, inspect_validator


def _two_pos_required_args(a, b):
    pass


def _two_pos_required_args_extra_optional(a, b, c=1, d=2, *, e=3):
    pass


def _three_pos_required_args(a, b, c):
    pass


def _one_pos_required_arg_one_optional(a, b=1):
    pass


@pytest.mark.parametrize(
    [
        'obj',
        'mode',
        'expected',
    ],
    [
        (str, 'plain', False),
        (float, 'plain', False),
        (int, 'plain', False),
        (lambda a: str(a), 'plain', False),
        (lambda a='': str(a), 'plain', False),
        (_two_pos_required_args, 'plain', True),
        (_two_pos_required_args, 'wrap', False),
        (_two_pos_required_args_extra_optional, 'plain', True),
        (_two_pos_required_args_extra_optional, 'wrap', False),
        (_three_pos_required_args, 'wrap', True),
        (_one_pos_required_arg_one_optional, 'plain', False),
    ],
)
def test_inspect_validator(obj, mode, expected):
    assert inspect_validator(obj, mode=mode) == expected


def test_inspect_validator_error_wrap():
    def validator1(arg1):
        pass

    def validator4(arg1, arg2, arg3, arg4):
        pass

    with pytest.raises(PydanticUserError) as e:
        inspect_validator(validator1, mode='wrap')

    assert e.value.code == 'validator-signature'

    with pytest.raises(PydanticUserError) as e:
        inspect_validator(validator4, mode='wrap')

    assert e.value.code == 'validator-signature'


@pytest.mark.parametrize('mode', ['before', 'after', 'plain'])
def test_inspect_validator_error(mode):
    def validator():
        pass

    def validator3(arg1, arg2, arg3):
        pass

    with pytest.raises(PydanticUserError) as e:
        inspect_validator(validator, mode=mode)

    assert e.value.code == 'validator-signature'

    with pytest.raises(PydanticUserError) as e:
        inspect_validator(validator3, mode=mode)

    assert e.value.code == 'validator-signature'


@pytest.mark.parametrize(
    [
        'obj',
        'mode',
        'expected',
    ],
    [
        (str, 'plain', False),
        (float, 'plain', False),
        (int, 'plain', False),
        (lambda a: str(a), 'plain', False),
        (lambda a='': str(a), 'plain', False),
        (_two_pos_required_args, 'plain', True),
        (_two_pos_required_args, 'wrap', False),
        (_two_pos_required_args_extra_optional, 'plain', True),
        (_two_pos_required_args_extra_optional, 'wrap', False),
        (_three_pos_required_args, 'wrap', True),
        (_one_pos_required_arg_one_optional, 'plain', False),
    ],
)
def test_inspect_annotated_serializer(obj, mode, expected):
    assert inspect_annotated_serializer(obj, mode=mode) == expected


@pytest.mark.parametrize('mode', ['plain', 'wrap'])
def test_inspect_annotated_serializer_invalid_number_of_arguments(mode):
    # TODO: add more erroneous cases
    def serializer():
        pass

    with pytest.raises(PydanticUserError) as e:
        inspect_annotated_serializer(serializer, mode=mode)

    assert e.value.code == 'field-serializer-signature'


# ---- pydantic-2.10.6/tests/test_deprecated.py ----

import platform
import re
from datetime import date, timedelta
from pathlib import Path
from types import SimpleNamespace
from typing import Any, Dict, Iterable, List, Type

import pytest
from pydantic_core import CoreSchema, core_schema
from typing_extensions import Literal

from pydantic import (
    BaseModel,
    ConfigDict,
    Field,
    GetCoreSchemaHandler,
    GetJsonSchemaHandler,
    PydanticDeprecatedSince20,
    PydanticUserError,
    ValidationError,
    conlist,
    root_validator,
)
from pydantic.config import Extra
from pydantic.deprecated.decorator import validate_arguments
from pydantic.deprecated.json import custom_pydantic_encoder, pydantic_encoder, timedelta_isoformat
from pydantic.deprecated.parse import load_file, load_str_bytes
from pydantic.deprecated.tools import parse_obj_as, schema_json_of, schema_of
from pydantic.functional_serializers import model_serializer
from pydantic.json_schema import JsonSchemaValue
from pydantic.type_adapter import TypeAdapter


def deprecated_from_orm(model_type: Type[BaseModel], obj: Any) -> Any:
    with pytest.warns(
        PydanticDeprecatedSince20,
        match=re.escape(
            "The `from_orm` method is deprecated; set `model_config['from_attributes']=True` "
            'and use `model_validate` instead.'
        ),
    ):
        return model_type.from_orm(obj)


def test_from_attributes_root():
    class PokemonCls:
        def __init__(self, *, en_name: str, jp_name: str):
            self.en_name = en_name
            self.jp_name = jp_name

    class Pokemon(BaseModel):
        model_config = ConfigDict(from_attributes=True)

        en_name: str
        jp_name: str

    with pytest.warns(
        PydanticDeprecatedSince20, match='Pydantic V1 style `@root_validator` validators are deprecated.'
    ):

        class PokemonList(BaseModel):
            root: List[Pokemon]

            @root_validator(pre=True)
            @classmethod
            def populate_root(cls, values):
                return {'root': values}

            @model_serializer(mode='wrap')
            def _serialize(self, handler, info):
                data = handler(self)
                if info.mode == 'json':
                    return data['root']
                else:
                    return data

            @classmethod
            def model_modify_json_schema(cls, json_schema):
                return json_schema['properties']['root']

            model_config = ConfigDict(from_attributes=True)

    pika = PokemonCls(en_name='Pikachu', jp_name='ピカチュウ')
    bulbi = PokemonCls(en_name='Bulbasaur', jp_name='フシギダネ')

    pokemons = deprecated_from_orm(PokemonList, [pika, bulbi])
    assert pokemons.root == [
        Pokemon(en_name='Pikachu', jp_name='ピカチュウ'),
        Pokemon(en_name='Bulbasaur', jp_name='フシギダネ'),
    ]

    with pytest.warns(
        PydanticDeprecatedSince20, match='Pydantic V1 style `@root_validator` validators are deprecated.'
    ):

        class PokemonDict(BaseModel):
            root: Dict[str, Pokemon]
            model_config = ConfigDict(from_attributes=True)

            @root_validator(pre=True)
            @classmethod
            def populate_root(cls, values):
                return {'root': values}

            @model_serializer(mode='wrap')
            def _serialize(self, handler, info):
                data = handler(self)
                if info.mode == 'json':
                    return data['root']
                else:
                    return data

            @classmethod
            def model_modify_json_schema(cls, json_schema):
                return json_schema['properties']['root']

    pokemons = deprecated_from_orm(PokemonDict, {'pika': pika, 'bulbi': bulbi})
    assert pokemons.root == {
        'pika': Pokemon(en_name='Pikachu', jp_name='ピカチュウ'),
        'bulbi': Pokemon(en_name='Bulbasaur', jp_name='フシギダネ'),
    }


def test_from_attributes():
    class PetCls:
        def __init__(self, *, name: str, species: str):
            self.name = name
            self.species = species

    class PersonCls:
        def __init__(self, *, name: str, age: float = None, pets: List[PetCls]):
            self.name = name
            self.age = age
            self.pets = pets

    class Pet(BaseModel):
        model_config = ConfigDict(from_attributes=True)
        name: str
        species: str

    class Person(BaseModel):
        model_config = ConfigDict(from_attributes=True)
        name: str
        age: float = None
        pets: List[Pet]

    bones = PetCls(name='Bones', species='dog')
    orion = PetCls(name='Orion', species='cat')
    anna = PersonCls(name='Anna', age=20, pets=[bones, orion])
    anna_model = deprecated_from_orm(Person, anna)
    assert anna_model.model_dump() == {
        'name': 'Anna',
        'pets': [{'name': 'Bones', 'species': 'dog'}, {'name': 'Orion', 'species': 'cat'}],
        'age': 20.0,
    }


def test_not_from_attributes():
    class Pet(BaseModel):
        name: str
        species: str

    with pytest.raises(PydanticUserError):
        deprecated_from_orm(Pet, None)


def test_object_with_getattr():
    class FooGetAttr:
        def __getattr__(self, key: str):
            if key == 'foo':
                return 'Foo'
            else:
                raise AttributeError

    class Model(BaseModel):
        model_config = ConfigDict(from_attributes=True)
        foo: str
        bar: int = 1

    class ModelInvalid(BaseModel):
        model_config = ConfigDict(from_attributes=True)
        foo: str
        bar: int

    foo = FooGetAttr()
    model = deprecated_from_orm(Model, foo)
    assert model.foo == 'Foo'
    assert model.bar == 1
    assert model.model_dump(exclude_unset=True) == {'foo': 'Foo'}

    with pytest.raises(ValidationError):
        deprecated_from_orm(ModelInvalid, foo)


def test_properties():
    class XyProperty:
        x = 4

        @property
        def y(self):
            return '5'

    class Model(BaseModel):
        model_config = ConfigDict(from_attributes=True)
        x: int
        y: int

    model = deprecated_from_orm(Model, XyProperty())
    assert model.x == 4
    assert model.y == 5


@pytest.mark.parametrize('extra', ['ignore', 'forbid', 'allow'])
def test_extra_allow_from_orm(extra: Literal['ignore', 'forbid', 'allow']):
    class TestCls:
        x = 1
        y = 2

    class Model(BaseModel):
        model_config = ConfigDict(from_attributes=True, extra=extra)
        x: int

    model = deprecated_from_orm(Model, TestCls())
    assert model.x == 1
    assert not hasattr(model, 'y')


@pytest.mark.filterwarnings('ignore:Pydantic V1 style `@root_validator` validators are deprecated.*:DeprecationWarning')
def test_root_validator():
    validator_value = None

    class TestCls:
        x = 1
        y = 2

    class Model(BaseModel):
        model_config = ConfigDict(from_attributes=True)
        x: int
        y: int
        z: int

        @root_validator(pre=True)
        def change_input_data(cls, value):
            nonlocal validator_value
            validator_value = value
            return {'x': value.x, 'y': value.y, 'z': value.x + value.y}

    model = deprecated_from_orm(Model, TestCls())
    assert model.model_dump() == {'x': 1, 'y': 2, 'z': 3}
    # assert isinstance(validator_value, GetterDict)
    assert isinstance(validator_value, TestCls)


def test_nested_orm():
    class User(BaseModel):
        model_config = ConfigDict(from_attributes=True)
        first_name: str
        last_name: str

    class State(BaseModel):
        model_config = ConfigDict(from_attributes=True)
        user: User

    # Pass an "orm instance"
    deprecated_from_orm(State, SimpleNamespace(user=SimpleNamespace(first_name='John', last_name='Appleseed')))

    # Pass dictionary data directly
    State(**{'user': {'first_name': 'John', 'last_name': 'Appleseed'}})


def test_parse_raw_pass():
    class Model(BaseModel):
        x: int
        y: int

    with pytest.warns(PydanticDeprecatedSince20) as all_warnings:
        model = Model.parse_raw('{"x": 1, "y": 2}')
    assert model.model_dump() == {'x': 1, 'y': 2}

    assert len(all_warnings) == 2
    expected_warnings = [
        'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, otherwise load the data then use `model_validate` instead',
        '`load_str_bytes` is deprecated',
    ]
    assert [w.message.message for w in all_warnings] == expected_warnings


@pytest.mark.skipif(platform.python_implementation() == 'PyPy', reason='Different error str on PyPy')
def test_parse_raw_pass_fail():
    class Model(BaseModel):
        x: int
        y: int

    with pytest.warns(PydanticDeprecatedSince20) as all_warnings:
        with pytest.raises(ValidationError, match='1 validation error for Model') as exc_info:
            Model.parse_raw('invalid')

    assert len(all_warnings) == 2
    expected_warnings = [
        'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, otherwise load the data then use `model_validate` instead',
        '`load_str_bytes` is deprecated',
    ]
    assert [w.message.message for w in all_warnings] == expected_warnings

    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'value_error.jsondecode',
            'loc': ('__root__',),
            'msg': 'Expecting value: line 1 column 1 (char 0)',
            'input': 'invalid',
        }
    ]


def test_fields():
    class Model(BaseModel):
        x: int
        y: int = 2

    m = Model(x=1)

    assert len(Model.model_fields) == 2
    assert len(m.model_fields) == 2

    match = '^The `__fields__` attribute is deprecated, use `model_fields` instead.'
    with pytest.warns(PydanticDeprecatedSince20, match=match):
        assert len(Model.__fields__) == 2
    with pytest.warns(PydanticDeprecatedSince20, match=match):
        assert len(m.__fields__) == 2


def test_fields_set():
    class Model(BaseModel):
        x: int
        y: int = 2

    m = Model(x=1)
    assert m.model_fields_set == {'x'}

    match = '^The `__fields_set__` attribute is deprecated, use `model_fields_set` instead.'
    with pytest.warns(PydanticDeprecatedSince20, match=match):
        assert m.__fields_set__ == {'x'}


def test_fields_dir():
    class Model(BaseModel):
        x: int
        y: int = 2

    assert '__fields__' not in dir(Model)


@pytest.mark.parametrize('attribute,value', [('allow', 'allow'), ('ignore', 'ignore'), ('forbid', 'forbid')])
def test_extra_used_as_enum(
    attribute: str,
    value: str,
) -> None:
    with pytest.warns(
        PydanticDeprecatedSince20,
        match=re.escape("`pydantic.config.Extra` is deprecated, use literal values instead (e.g. `extra='allow'`)"),
    ):
        assert getattr(Extra, attribute) == value


def test_field_min_items_deprecation():
    m = '`min_items` is deprecated and will be removed. use `min_length` instead'
    with pytest.warns(PydanticDeprecatedSince20, match=m):

        class Model(BaseModel):
            x: List[int] = Field(None, min_items=1)

    with pytest.raises(ValidationError) as exc_info:
        Model(x=[])
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'too_short',
            'loc': ('x',),
            'msg': 'List should have at least 1 item after validation, not 0',
            'input': [],
            'ctx': {'field_type': 'List', 'min_length': 1, 'actual_length': 0},
        }
    ]


def test_field_min_items_with_min_length():
    m = '`min_items` is deprecated and will be removed. use `min_length` instead'
    with pytest.warns(PydanticDeprecatedSince20, match=m):

        class Model(BaseModel):
            x: List[int] = Field(None, min_items=1, min_length=2)

    with pytest.raises(ValidationError) as exc_info:
        Model(x=[1])
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'too_short',
            'loc': ('x',),
            'msg': 'List should have at least 2 items after validation, not 1',
            'input': [1],
            'ctx': {'field_type': 'List', 'min_length': 2, 'actual_length': 1},
        }
    ]


def test_field_max_items():
    m = '`max_items` is deprecated and will be removed. use `max_length` instead'
    with pytest.warns(PydanticDeprecatedSince20, match=m):

        class Model(BaseModel):
            x: List[int] = Field(None, max_items=1)

    with pytest.raises(ValidationError) as exc_info:
        Model(x=[1, 2])
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'too_long',
            'loc': ('x',),
            'msg': 'List should have at most 1 item after validation, not 2',
            'input': [1, 2],
            'ctx': {'field_type': 'List', 'max_length': 1, 'actual_length': 2},
        }
    ]


def test_field_max_items_with_max_length():
    m = '`max_items` is deprecated and will be removed. use `max_length` instead'
    with pytest.warns(PydanticDeprecatedSince20, match=m):

        class Model(BaseModel):
            x: List[int] = Field(None, max_items=1, max_length=2)

    with pytest.raises(ValidationError) as exc_info:
        Model(x=[1, 2, 3])
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'too_long',
            'loc': ('x',),
            'msg': 'List should have at most 2 items after validation, not 3',
            'input': [1, 2, 3],
            'ctx': {'field_type': 'List', 'max_length': 2, 'actual_length': 3},
        }
    ]


def test_field_const():
    with pytest.raises(PydanticUserError, match='`const` is removed. use `Literal` instead'):

        class Model(BaseModel):
            x: str = Field('test', const=True)


def test_field_include_deprecation():
    with pytest.warns(PydanticDeprecatedSince20) as all_warnings:

        class Model(BaseModel):
            x: int = Field(include=True)

    assert len(all_warnings) == 2
    expected_warnings = [
        "Using extra keyword arguments on `Field` is deprecated and will be removed. Use `json_schema_extra` instead. (Extra keys: 'include')",
        '`include` is deprecated and does nothing. It will be removed, use `exclude` instead',
    ]
    assert [w.message.message for w in all_warnings] == expected_warnings


def test_unique_items_items():
    with pytest.raises(PydanticUserError, match='`unique_items` is removed. use `Set` instead'):

        class Model(BaseModel):
            x: List[int] = Field(None, unique_items=True)


def test_unique_items_conlist():
    with pytest.raises(PydanticUserError, match='`unique_items` is removed. use `Set` instead'):

        class Model(BaseModel):
            x: conlist(int, unique_items=True)


def test_allow_mutation():
    m = '`allow_mutation` is deprecated and will be removed. use `frozen` instead'
    with pytest.warns(PydanticDeprecatedSince20, match=m):

        class Model(BaseModel):
            model_config = ConfigDict(validate_assignment=True)
            x: int = Field(allow_mutation=False)
            y: int = Field(allow_mutation=True)

    m = Model(x=1, y=2)
    assert m.x == 1
    with pytest.raises(ValidationError) as exc_info:
        m.x = 2
    assert exc_info.value.errors(include_url=False) == [
        {'input': 2, 'loc': ('x',), 'msg': 'Field is frozen', 'type': 'frozen_field'}
    ]

    m.y = 3
    assert m.y == 3


def test_field_regex():
    with pytest.raises(PydanticUserError, match='`regex` is removed. use `pattern` instead'):

        class Model(BaseModel):
            x: str = Field('test', regex=r'^test$')


def test_modify_schema_error():
    with pytest.raises(
        PydanticUserError,
        match='The `__modify_schema__` method is not supported in Pydantic v2. '
        'Use `__get_pydantic_json_schema__` instead in class `Model`.',
    ):

        class Model(BaseModel):
            @classmethod
            def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None:
                pass


def test_modify_schema_on_nested_class_error() -> None:
    class SomeLongName:
        @classmethod
        def __modify_schema__(cls, field_schema):
            pass

    with pytest.raises(
        PydanticUserError,
        match='The `__modify_schema__` method is not supported in Pydantic v2. '
        'Use `__get_pydantic_json_schema__` instead in class `SomeLongName`.',
    ):

        class B(BaseModel):
            model_config = ConfigDict(arbitrary_types_allowed=True)
            a: SomeLongName


def test_v1_v2_custom_type_compatibility() -> None:
    """Create a custom type that works with V1 and V2"""

    class MyType:
        @classmethod
        def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema:
            return core_schema.int_schema()

        @classmethod
        def __get_pydantic_json_schema__(
            cls, core_schema: CoreSchema, handler: GetJsonSchemaHandler
        ) -> JsonSchemaValue:
            return {'anyOf': [{'type': 'string'}, {'type': 'number'}]}

        @classmethod
        def __modify_schema__(cls, field_schema: Dict[str, Any]) -> None:
            raise NotImplementedError  # not actually called, we just want to make sure the method can exist

        @classmethod
        def __get_validators__(cls) -> Iterable[Any]:
            raise NotImplementedError  # not actually called, we just want to make sure the method can exist
            yield

    ta = TypeAdapter(MyType)

    assert ta.validate_python('123') == 123
    assert ta.json_schema() == {'anyOf': [{'type': 'string'}, {'type': 'number'}]}


def test_v1_get_validators():
    class CustomDate(date):
        @classmethod
        def __get_validators__(cls):
            yield cls.validate1
            yield cls.validate2

        @classmethod
        def validate1(cls, v, i):
            print(v)
            if v.year < 2000:
                raise ValueError('Invalid year')
            return v

        @classmethod
        def validate2(cls, v, i):
            return date.today().replace(month=1, day=1)

    with pytest.warns(
        PydanticDeprecatedSince20,
        match='^`__get_validators__` is deprecated and will be removed, use `__get_pydantic_core_schema__` instead.',
    ):

        class Model(BaseModel):
            x: CustomDate

    with pytest.raises(ValidationError, match='Value error, Invalid year'):
        Model(x=date(1999, 1, 1))

    m = Model(x=date.today())
    assert m.x.day == 1


def test_v1_get_validators_invalid_validator():
    class InvalidValidator:
        @classmethod
        def __get_validators__(cls):
            yield cls.has_wrong_arguments

        @classmethod
        def has_wrong_arguments(cls):
            pass

    with pytest.warns(
        PydanticDeprecatedSince20,
        match='^`__get_validators__` is deprecated and will be removed, use `__get_pydantic_core_schema__` instead.',
    ):

        class InvalidValidatorModel(BaseModel):
            x: InvalidValidator

    with pytest.raises(TypeError, match='takes 1 positional argument but 3 were given'):
        InvalidValidatorModel(x=1)


def test_field_extra_arguments():
    m = re.escape(
        'Using extra keyword arguments on `Field` is deprecated and will be removed. Use `json_schema_extra` instead. '
        "(Extra keys: 'test', 'foo')"
    )
    with pytest.warns(PydanticDeprecatedSince20, match=m):

        class Model(BaseModel):
            x: str = Field('test', test='test', foo='bar')

    assert Model.model_json_schema(by_alias=True)['properties'] == {
        'x': {'default': 'test', 'foo': 'bar', 'test': 'test', 'title': 'X', 'type': 'string'}
    }


def test_field_extra_does_not_rewrite_json_schema_extra():
    m = 'Using extra keyword arguments on `Field` is deprecated and will be removed. Use `json_schema_extra` instead'
    with pytest.warns(PydanticDeprecatedSince20, match=m):

        class Model(BaseModel):
            x: str = Field('test', test='test', json_schema_extra={'test': 'json_schema_extra value'})

    assert Model.model_json_schema(by_alias=True)['properties'] == {
        'x': {'default': 'test', 'test': 'json_schema_extra value', 'title': 'X', 'type': 'string'}
    }


class SimpleModel(BaseModel):
    x: int


def test_dict():
    m = SimpleModel(x=1)
    with pytest.warns(PydanticDeprecatedSince20, match=r'^The `dict` method is deprecated; use `model_dump` instead\.'):
        assert m.dict() == {'x': 1}


def test_json():
    m = SimpleModel(x=1)
    with pytest.warns(
        PydanticDeprecatedSince20, match=r'^The `json` method is deprecated; use `model_dump_json` instead\.'
): assert m.json() == '{"x":1}' with pytest.warns(PydanticDeprecatedSince20): with pytest.raises(TypeError, match='The `encoder` argument is no longer supported'): m.json(encoder=1) with pytest.raises(TypeError, match='The `models_as_dict` argument is no longer supported'): m.json(models_as_dict=True) with pytest.raises(TypeError, match='`dumps_kwargs` keyword arguments are no longer supported.'): m.json(foo=4) def test_parse_obj(): with pytest.warns( PydanticDeprecatedSince20, match='^The `parse_obj` method is deprecated; use `model_validate` instead.' ): m = SimpleModel.parse_obj({'x': 1}) assert m.model_dump() == {'x': 1} def test_parse_file(tmp_path): path = tmp_path / 'test.json' path.write_text('{"x": 12}') with pytest.warns(PydanticDeprecatedSince20) as all_warnings: assert SimpleModel.parse_file(str(path)).model_dump() == {'x': 12} assert len(all_warnings) == 4 expected_warnings = [ 'The `parse_file` method is deprecated; load the data from file, then if your data is JSON use `model_validate_json`, otherwise `model_validate` instead', '`load_file` is deprecated', '`load_str_bytes` is deprecated', 'The `parse_obj` method is deprecated; use `model_validate` instead', ] assert [w.message.message for w in all_warnings] == expected_warnings def test_construct(): with pytest.warns( PydanticDeprecatedSince20, match='The `construct` method is deprecated; use `model_construct` instead.' ): m = SimpleModel.construct(x='not an int') assert m.x == 'not an int' def test_json_schema(): m = SimpleModel(x=1) with pytest.warns( PydanticDeprecatedSince20, match='^The `schema` method is deprecated; use `model_json_schema` instead.' ): assert m.schema() == { 'title': 'SimpleModel', 'type': 'object', 'properties': {'x': {'title': 'X', 'type': 'integer'}}, 'required': ['x'], } def test_validate(): with pytest.warns( PydanticDeprecatedSince20, match='^The `validate` method is deprecated; use `model_validate` instead.' 
): m = SimpleModel.validate({'x': 1}) assert m.model_dump() == {'x': 1} def test_update_forward_refs(): with pytest.warns(PydanticDeprecatedSince20, match='^The `update_forward_refs` method is deprecated;'): SimpleModel.update_forward_refs() def test_copy_and_set_values(): m = SimpleModel(x=1) with pytest.warns( PydanticDeprecatedSince20, match='^The private method `_copy_and_set_values` will be removed and ' ): m2 = m._copy_and_set_values(values={'x': 2}, fields_set={'x'}, deep=False) assert m2.x == 2 def test_get_value(): m = SimpleModel(x=1) with pytest.warns(PydanticDeprecatedSince20, match='^The private method `_get_value` will be removed and '): v = m._get_value( [1, 2, 3], to_dict=False, by_alias=False, include=None, exclude=None, exclude_unset=False, exclude_defaults=False, exclude_none=False, ) assert v == [1, 2, 3] def test_deprecated_module(tmp_path: Path) -> None: class Model(BaseModel): x: int with pytest.warns(PydanticDeprecatedSince20) as all_warnings: assert hasattr(parse_obj_as, '__deprecated__') parse_obj_as(Model, {'x': 1}) assert hasattr(schema_json_of, '__deprecated__') schema_json_of(Model) assert hasattr(schema_of, '__deprecated__') schema_of(Model) assert hasattr(load_str_bytes, '__deprecated__') load_str_bytes('{"x": 1}') assert hasattr(load_file, '__deprecated__') file = tmp_path / 'main.py' file.write_text('{"x": 1}') load_file(file) assert hasattr(pydantic_encoder, '__deprecated__') pydantic_encoder(Model(x=1)) assert hasattr(custom_pydantic_encoder, '__deprecated__') custom_pydantic_encoder({int: lambda x: str(x)}, Model(x=1)) assert hasattr(timedelta_isoformat, '__deprecated__') timedelta_isoformat(timedelta(seconds=1)) def test(a: int, b: int): pass validate_arguments()(test) assert len(all_warnings) == 12 expected_warnings = [ '`parse_obj_as` is deprecated. Use `pydantic.TypeAdapter.validate_python` instead', '`schema_json_of` is deprecated. Use `pydantic.TypeAdapter.json_schema` instead', '`schema_of` is deprecated. 
Use `pydantic.TypeAdapter.json_schema` instead',
        '`schema_of` is deprecated. Use `pydantic.TypeAdapter.json_schema` instead',
        '`load_str_bytes` is deprecated',
        '`load_file` is deprecated',
        '`load_str_bytes` is deprecated',
        '`pydantic_encoder` is deprecated, use `pydantic_core.to_jsonable_python` instead',
        '`custom_pydantic_encoder` is deprecated, use `BaseModel.model_dump` instead',
        '`pydantic_encoder` is deprecated, use `pydantic_core.to_jsonable_python` instead',
        '`timedelta_isoformat` is deprecated',
        'The `validate_arguments` method is deprecated; use `validate_call` instead',
    ]
    assert [w.message.message for w in all_warnings] == expected_warnings


def test_deprecated_color():
    from pydantic.color import Color

    with pytest.warns(
        PydanticDeprecatedSince20, match='The `Color` class is deprecated, use `pydantic_extra_types` instead.'
    ):
        Color('red')


def test_deprecated_payment():
    from pydantic import PaymentCardNumber

    with pytest.warns(
        PydanticDeprecatedSince20,
        match='The `PaymentCardNumber` class is deprecated, use `pydantic_extra_types` instead.',
    ):
        PaymentCardNumber('4242424242424242')


# ---- pydantic-2.10.6/tests/test_deprecated_fields.py ----
import importlib.metadata

import pytest
from packaging.version import Version
from typing_extensions import Annotated, Self, deprecated

from pydantic import BaseModel, Field, computed_field, field_validator, model_validator


def test_deprecated_fields():
    class Model(BaseModel):
        a: Annotated[int, Field(deprecated='')]
        b: Annotated[int, Field(deprecated='This is deprecated')]
        c: Annotated[int, Field(deprecated=None)]

    assert Model.model_json_schema() == {
        'properties': {
            'a': {'deprecated': True, 'title': 'A', 'type': 'integer'},
            'b': {'deprecated': True, 'title': 'B', 'type': 'integer'},
            'c': {'title': 'C', 'type': 'integer'},
        },
        'required': ['a', 'b', 'c'],
        'title': 'Model',
        'type': 'object',
    }

    instance = Model(a=1, b=1, c=1)
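The feature exercised by `test_deprecated_fields` above, marking a field deprecated via `Field(deprecated=...)` so that attribute access emits a `DeprecationWarning`, can be sketched outside of pytest as follows. This is a minimal sketch; the model name `LegacyUser` and its fields are illustrative, not from the test suite.

```python
import warnings
from typing import Annotated

from pydantic import BaseModel, Field


class LegacyUser(BaseModel):
    # Hypothetical model: `nick` is deprecated in favor of `name`.
    nick: Annotated[str, Field(deprecated='use `name` instead')] = ''
    name: str = ''


user = LegacyUser(nick='sam', name='Sam')

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')
    value = user.nick  # attribute access triggers the deprecation warning

assert value == 'sam'
assert any(issubclass(w.category, DeprecationWarning) for w in caught)
```

The warning fires on attribute *access*, not on construction, which is why the tests below wrap `instance.a` (and not `Model(a=1)`) in `pytest.warns`.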
    pytest.warns(DeprecationWarning, lambda: instance.a, match='^$')

    with pytest.warns(DeprecationWarning, match='^This is deprecated$'):
        b = instance.b
    assert b == 1


@pytest.mark.skipif(
    Version(importlib.metadata.version('typing_extensions')) < Version('4.9'),
    reason='`deprecated` type annotation requires typing_extensions>=4.9',
)
def test_deprecated_fields_deprecated_class():
    class Model(BaseModel):
        a: Annotated[int, deprecated('')]
        b: Annotated[int, deprecated('This is deprecated')] = 1
        c: Annotated[int, Field(deprecated=deprecated('This is deprecated'))] = 1

    assert Model.model_json_schema() == {
        'properties': {
            'a': {'deprecated': True, 'title': 'A', 'type': 'integer'},
            'b': {'default': 1, 'deprecated': True, 'title': 'B', 'type': 'integer'},
            'c': {'default': 1, 'deprecated': True, 'title': 'C', 'type': 'integer'},
        },
        'required': ['a'],
        'title': 'Model',
        'type': 'object',
    }

    instance = Model(a=1)
    pytest.warns(DeprecationWarning, lambda: instance.a, match='^$')
    pytest.warns(DeprecationWarning, lambda: instance.b, match='^This is deprecated$')
    pytest.warns(DeprecationWarning, lambda: instance.c, match='^This is deprecated$')


def test_deprecated_fields_field_validator():
    class Model(BaseModel):
        x: int = Field(deprecated='x is deprecated')

        @field_validator('x')
        @classmethod
        def validate_x(cls, v: int) -> int:
            return v * 2

    instance = Model(x=1)
    with pytest.warns(DeprecationWarning):
        assert instance.x == 2


def test_deprecated_fields_model_validator():
    class Model(BaseModel):
        x: int = Field(deprecated='x is deprecated')

        @model_validator(mode='after')
        def validate_x(self) -> Self:
            self.x = self.x * 2
            return self

    with pytest.warns(DeprecationWarning):
        instance = Model(x=1)
        assert instance.x == 2


def test_deprecated_fields_validate_assignment():
    class Model(BaseModel):
        x: int = Field(deprecated='x is deprecated')
        model_config = {'validate_assignment': True}

    instance = Model(x=1)
    with pytest.warns(DeprecationWarning):
        assert instance.x == 1
    instance.x = 2
    with \
pytest.warns(DeprecationWarning): assert instance.x == 2 def test_computed_field_deprecated(): class Model(BaseModel): @computed_field @property @deprecated('This is deprecated') def p1(self) -> int: return 1 @computed_field(deprecated='This is deprecated') @property @deprecated('This is deprecated (this message is overridden)') def p2(self) -> int: return 1 @computed_field(deprecated='') @property def p3(self) -> int: return 1 @computed_field(deprecated='This is deprecated') @property def p4(self) -> int: return 1 @computed_field @deprecated('This is deprecated') def p5(self) -> int: return 1 assert Model.model_json_schema(mode='serialization') == { 'properties': { 'p1': {'deprecated': True, 'readOnly': True, 'title': 'P1', 'type': 'integer'}, 'p2': {'deprecated': True, 'readOnly': True, 'title': 'P2', 'type': 'integer'}, 'p3': {'deprecated': True, 'readOnly': True, 'title': 'P3', 'type': 'integer'}, 'p4': {'deprecated': True, 'readOnly': True, 'title': 'P4', 'type': 'integer'}, 'p5': {'deprecated': True, 'readOnly': True, 'title': 'P5', 'type': 'integer'}, }, 'required': ['p1', 'p2', 'p3', 'p4', 'p5'], 'title': 'Model', 'type': 'object', } instance = Model() pytest.warns(DeprecationWarning, lambda: instance.p1, match='^This is deprecated$') pytest.warns(DeprecationWarning, lambda: instance.p2, match='^This is deprecated$') pytest.warns(DeprecationWarning, lambda: instance.p4, match='^This is deprecated$') pytest.warns(DeprecationWarning, lambda: instance.p5, match='^This is deprecated$') with pytest.warns(DeprecationWarning, match='^$'): p3 = instance.p3 assert p3 == 1 @pytest.mark.skipif( Version(importlib.metadata.version('typing_extensions')) < Version('4.9'), reason='`deprecated` type annotation requires typing_extensions>=4.9', ) def test_computed_field_deprecated_deprecated_class(): class Model(BaseModel): @computed_field(deprecated=deprecated('This is deprecated')) @property def p1(self) -> int: return 1 @computed_field(deprecated=True) @property def 
p2(self) -> int: return 2 @computed_field(deprecated='This is a deprecated string') @property def p3(self) -> int: return 3 assert Model.model_json_schema(mode='serialization') == { 'properties': { 'p1': {'deprecated': True, 'readOnly': True, 'title': 'P1', 'type': 'integer'}, 'p2': {'deprecated': True, 'readOnly': True, 'title': 'P2', 'type': 'integer'}, 'p3': {'deprecated': True, 'readOnly': True, 'title': 'P3', 'type': 'integer'}, }, 'required': ['p1', 'p2', 'p3'], 'title': 'Model', 'type': 'object', } instance = Model() with pytest.warns(DeprecationWarning, match='^This is deprecated$'): p1 = instance.p1 with pytest.warns(DeprecationWarning, match='^deprecated$'): p2 = instance.p2 with pytest.warns(DeprecationWarning, match='^This is a deprecated string$'): p3 = instance.p3 assert p1 == 1 assert p2 == 2 assert p3 == 3 def test_deprecated_with_boolean() -> None: class Model(BaseModel): a: Annotated[int, Field(deprecated=True)] b: Annotated[int, Field(deprecated=False)] assert Model.model_json_schema() == { 'properties': { 'a': {'deprecated': True, 'title': 'A', 'type': 'integer'}, 'b': {'title': 'B', 'type': 'integer'}, }, 'required': ['a', 'b'], 'title': 'Model', 'type': 'object', } instance = Model(a=1, b=1) pytest.warns(DeprecationWarning, lambda: instance.a, match='deprecated') def test_computed_field_deprecated_class_access() -> None: class Model(BaseModel): @computed_field(deprecated=True) def prop(self) -> int: return 1 assert isinstance(Model.prop, property) def test_computed_field_deprecated_subclass() -> None: """https://github.com/pydantic/pydantic/issues/10384""" class Base(BaseModel): @computed_field(deprecated=True) def prop(self) -> int: return 1 class Sub(Base): pass pydantic-2.10.6/tests/test_deprecated_validate_arguments.py000066400000000000000000000275201474456633400242520ustar00rootroot00000000000000import asyncio import inspect from pathlib import Path from typing import List import pytest from dirty_equals import IsInstance from 
typing_extensions import Annotated from pydantic import BaseModel, Field, PydanticDeprecatedSince20, ValidationError from pydantic.deprecated.decorator import ValidatedFunction from pydantic.deprecated.decorator import validate_arguments as validate_arguments_deprecated from pydantic.errors import PydanticUserError def validate_arguments(*args, **kwargs): with pytest.warns( PydanticDeprecatedSince20, match='^The `validate_arguments` method is deprecated; use `validate_call`' ): return validate_arguments_deprecated(*args, **kwargs) def test_args(): @validate_arguments def foo(a: int, b: int): return f'{a}, {b}' assert foo(1, 2) == '1, 2' assert foo(*[1, 2]) == '1, 2' assert foo(*(1, 2)) == '1, 2' assert foo(*[1], 2) == '1, 2' with pytest.raises(ValidationError) as exc_info: foo() assert exc_info.value.errors(include_url=False) == [ {'input': {}, 'loc': ('a',), 'msg': 'Field required', 'type': 'missing'}, {'input': {}, 'loc': ('b',), 'msg': 'Field required', 'type': 'missing'}, ] with pytest.raises(ValidationError) as exc_info: foo(1, 'x') assert exc_info.value.errors(include_url=False) == [ { 'input': 'x', 'loc': ('b',), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', } ] with pytest.raises(TypeError, match='2 positional arguments expected but 3 given'): foo(1, 2, 3) with pytest.raises(TypeError, match="unexpected keyword argument: 'apple'"): foo(1, 2, apple=3) with pytest.raises(TypeError, match="multiple values for argument: 'a'"): foo(1, 2, a=3) with pytest.raises(TypeError, match="multiple values for arguments: 'a', 'b'"): foo(1, 2, a=3, b=4) def test_wrap(): @validate_arguments def foo_bar(a: int, b: int): """This is the foo_bar method.""" return f'{a}, {b}' assert foo_bar.__doc__ == 'This is the foo_bar method.' 
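These tests wrap the deprecated `validate_arguments` decorator; its v2 replacement named in the deprecation message is `validate_call`. A minimal sketch of the replacement (the function `add` is illustrative, not from the test suite):

```python
from pydantic import ValidationError, validate_call


@validate_call
def add(a: int, b: int) -> int:
    return a + b


# Arguments are validated (and coerced in lax mode) before the call runs.
assert add(2, 3) == 5
assert add('1', 2) == 3  # '1' is coerced to 1

# Invalid input raises a ValidationError instead of reaching the body.
try:
    add('not a number', 2)
except ValidationError as exc:
    assert exc.errors()[0]['type'] == 'int_parsing'
```

Unlike `validate_arguments`, `validate_call` does not expose a `.model`/`.vd` attribute pair, which is part of what the deprecated-path tests in this file pin down.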
assert foo_bar.__name__ == 'foo_bar' assert foo_bar.__module__ == 'tests.test_deprecated_validate_arguments' assert foo_bar.__qualname__ == 'test_wrap..foo_bar' assert isinstance(foo_bar.vd, ValidatedFunction) assert callable(foo_bar.raw_function) assert foo_bar.vd.arg_mapping == {0: 'a', 1: 'b'} assert foo_bar.vd.positional_only_args == set() assert issubclass(foo_bar.model, BaseModel) assert foo_bar.model.model_fields.keys() == {'a', 'b', 'args', 'kwargs', 'v__duplicate_kwargs'} assert foo_bar.model.__name__ == 'FooBar' assert foo_bar.model.model_json_schema()['title'] == 'FooBar' assert repr(inspect.signature(foo_bar)) == '' def test_kwargs(): @validate_arguments def foo(*, a: int, b: int): return a + b assert foo.model.model_fields.keys() == {'a', 'b', 'args', 'kwargs'} assert foo(a=1, b=3) == 4 with pytest.raises(ValidationError) as exc_info: foo(a=1, b='x') assert exc_info.value.errors(include_url=False) == [ { 'input': 'x', 'loc': ('b',), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', } ] with pytest.raises(TypeError, match='0 positional arguments expected but 2 given'): foo(1, 'x') def test_untyped(): @validate_arguments def foo(a, b, c='x', *, d='y'): return ', '.join(str(arg) for arg in [a, b, c, d]) assert foo(1, 2) == '1, 2, x, y' assert foo(1, {'x': 2}, c='3', d='4') == "1, {'x': 2}, 3, 4" @pytest.mark.parametrize('validated', (True, False)) def test_var_args_kwargs(validated): def foo(a, b, *args, d=3, **kwargs): return f'a={a!r}, b={b!r}, args={args!r}, d={d!r}, kwargs={kwargs!r}' if validated: foo = validate_arguments(foo) assert foo(1, 2) == 'a=1, b=2, args=(), d=3, kwargs={}' assert foo(1, 2, 3, d=4) == 'a=1, b=2, args=(3,), d=4, kwargs={}' assert foo(*[1, 2, 3], d=4) == 'a=1, b=2, args=(3,), d=4, kwargs={}' assert foo(1, 2, args=(10, 11)) == "a=1, b=2, args=(), d=3, kwargs={'args': (10, 11)}" assert foo(1, 2, 3, args=(10, 11)) == "a=1, b=2, args=(3,), d=3, kwargs={'args': (10, 11)}" 
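The variadic-argument behavior checked here carries over to the `validate_call` replacement: a `*args` annotation is applied to each positional argument individually. A minimal sketch under that assumption (the function `total` is illustrative):

```python
from pydantic import validate_call


@validate_call
def total(*nums: int) -> int:
    # Each positional argument is validated against `int` separately.
    return sum(nums)


assert total(1, '2', 3) == 6  # '2' is coerced per-item
assert total() == 0
```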
assert foo(1, 2, 3, e=10) == "a=1, b=2, args=(3,), d=3, kwargs={'e': 10}" assert foo(1, 2, kwargs=4) == "a=1, b=2, args=(), d=3, kwargs={'kwargs': 4}" assert foo(1, 2, kwargs=4, e=5) == "a=1, b=2, args=(), d=3, kwargs={'kwargs': 4, 'e': 5}" def test_field_can_provide_factory() -> None: @validate_arguments def foo(a: int, b: int = Field(default_factory=lambda: 99), *args: int) -> int: """mypy is happy with this""" return a + b + sum(args) assert foo(3) == 102 assert foo(1, 2, 3) == 6 def test_positional_only(create_module): with pytest.warns(PydanticDeprecatedSince20): module = create_module( # language=Python """ from pydantic.deprecated.decorator import validate_arguments @validate_arguments def foo(a, b, /, c=None): return f'{a}, {b}, {c}' """ ) assert module.foo(1, 2) == '1, 2, None' assert module.foo(1, 2, 44) == '1, 2, 44' assert module.foo(1, 2, c=44) == '1, 2, 44' with pytest.raises(TypeError, match="positional-only argument passed as keyword argument: 'b'"): module.foo(1, b=2) with pytest.raises(TypeError, match="positional-only arguments passed as keyword arguments: 'a', 'b'"): module.foo(a=1, b=2) def test_args_name(): @validate_arguments def foo(args: int, kwargs: int): return f'args={args!r}, kwargs={kwargs!r}' assert foo.model.model_fields.keys() == {'args', 'kwargs', 'v__args', 'v__kwargs', 'v__duplicate_kwargs'} assert foo(1, 2) == 'args=1, kwargs=2' with pytest.raises(TypeError, match="unexpected keyword argument: 'apple'"): foo(1, 2, apple=4) with pytest.raises(TypeError, match="unexpected keyword arguments: 'apple', 'banana'"): foo(1, 2, apple=4, banana=5) with pytest.raises(TypeError, match='2 positional arguments expected but 3 given'): foo(1, 2, 3) def test_v_args(): with pytest.raises( PydanticUserError, match='"v__args", "v__kwargs", "v__positional_only" and "v__duplicate_kwargs" are not permitted', ): @validate_arguments def foo1(v__args: int): pass with pytest.raises( PydanticUserError, match='"v__args", "v__kwargs", "v__positional_only" 
and "v__duplicate_kwargs" are not permitted', ): @validate_arguments def foo2(v__kwargs: int): pass with pytest.raises( PydanticUserError, match='"v__args", "v__kwargs", "v__positional_only" and "v__duplicate_kwargs" are not permitted', ): @validate_arguments def foo3(v__positional_only: int): pass with pytest.raises( PydanticUserError, match='"v__args", "v__kwargs", "v__positional_only" and "v__duplicate_kwargs" are not permitted', ): @validate_arguments def foo4(v__duplicate_kwargs: int): pass def test_async(): @validate_arguments async def foo(a, b): return f'a={a} b={b}' async def run(): v = await foo(1, 2) assert v == 'a=1 b=2' asyncio.run(run()) with pytest.raises(ValidationError) as exc_info: asyncio.run(foo('x')) assert exc_info.value.errors(include_url=False) == [ {'input': {'a': 'x'}, 'loc': ('b',), 'msg': 'Field required', 'type': 'missing'} ] def test_string_annotation(): @validate_arguments def foo(a: 'List[int]', b: 'Path'): return f'a={a!r} b={b!r}' assert foo([1, 2, 3], '/') with pytest.raises(ValidationError) as exc_info: foo(['x']) assert exc_info.value.errors(include_url=False) == [ { 'input': 'x', 'loc': ('a', 0), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', }, {'input': {'a': ['x']}, 'loc': ('b',), 'msg': 'Field required', 'type': 'missing'}, ] def test_item_method(): class X: def __init__(self, v): self.v = v @validate_arguments def foo(self, a: int, b: int): assert self.v == a return f'{a}, {b}' x = X(4) assert x.foo(4, 2) == '4, 2' assert x.foo(*[4, 2]) == '4, 2' with pytest.raises(ValidationError) as exc_info: x.foo() assert exc_info.value.errors(include_url=False) == [ {'input': {'self': IsInstance(X)}, 'loc': ('a',), 'msg': 'Field required', 'type': 'missing'}, {'input': {'self': IsInstance(X)}, 'loc': ('b',), 'msg': 'Field required', 'type': 'missing'}, ] def test_class_method(): class X: @classmethod @validate_arguments def foo(cls, a: int, b: int): assert cls == X return 
f'{a}, {b}' x = X() assert x.foo(4, 2) == '4, 2' assert x.foo(*[4, 2]) == '4, 2' with pytest.raises(ValidationError) as exc_info: x.foo() assert exc_info.value.errors(include_url=False) == [ {'input': {'cls': X}, 'loc': ('a',), 'msg': 'Field required', 'type': 'missing'}, {'input': {'cls': X}, 'loc': ('b',), 'msg': 'Field required', 'type': 'missing'}, ] def test_config_title(): @validate_arguments(config=dict(title='Testing')) def foo(a: int, b: int): return f'{a}, {b}' assert foo(1, 2) == '1, 2' assert foo(1, b=2) == '1, 2' assert foo.model.model_json_schema()['title'] == 'Testing' def test_config_title_cls(): class Config: title = 'Testing' @validate_arguments(config={'title': 'Testing'}) def foo(a: int, b: int): return f'{a}, {b}' assert foo(1, 2) == '1, 2' assert foo(1, b=2) == '1, 2' assert foo.model.model_json_schema()['title'] == 'Testing' def test_config_fields(): with pytest.raises(PydanticUserError, match='Setting the "alias_generator" property on custom Config for @'): @validate_arguments(config=dict(alias_generator=lambda x: x)) def foo(a: int, b: int): return f'{a}, {b}' def test_config_arbitrary_types_allowed(): class EggBox: def __str__(self) -> str: return 'EggBox()' @validate_arguments(config=dict(arbitrary_types_allowed=True)) def foo(a: int, b: EggBox): return f'{a}, {b}' assert foo(1, EggBox()) == '1, EggBox()' with pytest.raises(ValidationError) as exc_info: assert foo(1, 2) == '1, 2' assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'class': 'test_config_arbitrary_types_allowed..EggBox'}, 'input': 2, 'loc': ('b',), 'msg': 'Input should be an instance of ' 'test_config_arbitrary_types_allowed..EggBox', 'type': 'is_instance_of', } ] def test_validate(mocker): stub = mocker.stub(name='on_something_stub') @validate_arguments def func(s: str, count: int, *, separator: bytes = b''): stub(s, count, separator) func.validate('qwe', 2) with pytest.raises(ValidationError): func.validate(['qwe'], 2) stub.assert_not_called() def 
test_use_of_alias():
    @validate_arguments
    def foo(c: int = Field(default_factory=lambda: 20), a: int = Field(default_factory=lambda: 10, alias='b')):
        return a + c

    assert foo(b=10) == 30


def test_populate_by_name():
    @validate_arguments(config=dict(populate_by_name=True))
    def foo(a: Annotated[int, Field(alias='b')], c: Annotated[int, Field(alias='d')]):
        return a + c

    assert foo(a=10, d=1) == 11
    assert foo(b=10, c=1) == 11
    assert foo(a=10, c=1) == 11


# ---- pydantic-2.10.6/tests/test_discriminated_union.py ----
import re
import sys
from dataclasses import dataclass
from enum import Enum, IntEnum
from types import SimpleNamespace
from typing import Any, Callable, Generic, List, Optional, Sequence, TypeVar, Union

import pytest
from dirty_equals import HasRepr, IsStr
from pydantic_core import SchemaValidator, core_schema
from typing_extensions import Annotated, Literal, TypedDict

from pydantic import (
    BaseModel,
    ConfigDict,
    Discriminator,
    Field,
    PlainSerializer,
    TypeAdapter,
    ValidationError,
    field_validator,
)
from pydantic._internal._discriminated_union import apply_discriminator
from pydantic.dataclasses import dataclass as pydantic_dataclass
from pydantic.errors import PydanticUserError
from pydantic.fields import FieldInfo
from pydantic.functional_validators import model_validator
from pydantic.json_schema import GenerateJsonSchema
from pydantic.types import Tag


def test_discriminated_union_type():
    with pytest.raises(
        TypeError, match="'str' is not a valid discriminated union variant; should be a `BaseModel` or `dataclass`"
    ):

        class Model(BaseModel):
            x: str = Field(discriminator='qwe')


@pytest.mark.parametrize('union', [True, False])
def test_discriminated_single_variant(union):
    class InnerModel(BaseModel):
        qwe: Literal['qwe']
        y: int

    class Model(BaseModel):
        if union:
            x: Union[InnerModel] = Field(discriminator='qwe')
        else:
            x: InnerModel = Field(discriminator='qwe')

    assert Model(x={'qwe': 'qwe', 'y':
1}).x.qwe == 'qwe' with pytest.raises(ValidationError) as exc_info: Model(x={'qwe': 'asd', 'y': 'a'}) # note: incorrect type of "y" is not reported due to discriminator failure assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'discriminator': "'qwe'", 'expected_tags': "'qwe'", 'tag': 'asd'}, 'input': {'qwe': 'asd', 'y': 'a'}, 'loc': ('x',), 'msg': "Input tag 'asd' found using 'qwe' does not match any of the expected " "tags: 'qwe'", 'type': 'union_tag_invalid', } ] def test_discriminated_union_single_variant(): class InnerModel(BaseModel): qwe: Literal['qwe'] class Model(BaseModel): x: Union[InnerModel] = Field(discriminator='qwe') assert Model(x={'qwe': 'qwe'}).x.qwe == 'qwe' def test_discriminated_union_invalid_type(): with pytest.raises( TypeError, match="'str' is not a valid discriminated union variant; should be a `BaseModel` or `dataclass`" ): class Model(BaseModel): x: Union[str, int] = Field(discriminator='qwe') def test_discriminated_union_defined_discriminator(): class Cat(BaseModel): c: str class Dog(BaseModel): pet_type: Literal['dog'] d: str with pytest.raises(PydanticUserError, match="Model 'Cat' needs a discriminator field for key 'pet_type'"): class Model(BaseModel): pet: Union[Cat, Dog] = Field(discriminator='pet_type') number: int def test_discriminated_union_literal_discriminator(): class Cat(BaseModel): pet_type: int c: str class Dog(BaseModel): pet_type: Literal['dog'] d: str with pytest.raises(PydanticUserError, match="Model 'Cat' needs field 'pet_type' to be of type `Literal`"): class Model(BaseModel): pet: Union[Cat, Dog] = Field(discriminator='pet_type') number: int def test_discriminated_union_root_same_discriminator(): class BlackCat(BaseModel): pet_type: Literal['blackcat'] class WhiteCat(BaseModel): pet_type: Literal['whitecat'] Cat = Union[BlackCat, WhiteCat] class Dog(BaseModel): pet_type: Literal['dog'] CatDog = TypeAdapter(Annotated[Union[Cat, Dog], Field(discriminator='pet_type')]).validate_python CatDog({'pet_type': 
'blackcat'})
    CatDog({'pet_type': 'whitecat'})
    CatDog({'pet_type': 'dog'})
    with pytest.raises(ValidationError) as exc_info:
        CatDog({'pet_type': 'llama'})
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'discriminator': "'pet_type'", 'expected_tags': "'blackcat', 'whitecat', 'dog'", 'tag': 'llama'},
            'input': {'pet_type': 'llama'},
            'loc': (),
            'msg': "Input tag 'llama' found using 'pet_type' does not match any of the "
            "expected tags: 'blackcat', 'whitecat', 'dog'",
            'type': 'union_tag_invalid',
        }
    ]


@pytest.mark.parametrize('color_discriminator_kind', ['discriminator', 'field_str', 'field_discriminator'])
@pytest.mark.parametrize('pet_discriminator_kind', ['discriminator', 'field_str', 'field_discriminator'])
def test_discriminated_union_validation(color_discriminator_kind, pet_discriminator_kind):
    def _get_str_discriminator(discriminator: str, kind: str):
        if kind == 'discriminator':
            return Discriminator(discriminator)
        elif kind == 'field_str':
            return Field(discriminator=discriminator)
        elif kind == 'field_discriminator':
            return Field(discriminator=Discriminator(discriminator))
        raise ValueError(f'Invalid kind: {kind}')

    class BlackCat(BaseModel):
        pet_type: Literal['cat']
        color: Literal['black']
        black_infos: str

    class WhiteCat(BaseModel):
        pet_type: Literal['cat']
        color: Literal['white']
        white_infos: str

    color_discriminator = _get_str_discriminator('color', color_discriminator_kind)
    Cat = Annotated[Union[BlackCat, WhiteCat], color_discriminator]

    class Dog(BaseModel):
        pet_type: Literal['dog']
        d: str

    class Lizard(BaseModel):
        pet_type: Literal['reptile', 'lizard']
        m: str

    pet_discriminator = _get_str_discriminator('pet_type', pet_discriminator_kind)

    class Model(BaseModel):
        pet: Annotated[Union[Cat, Dog, Lizard], pet_discriminator]
        number: int

    with pytest.raises(ValidationError) as exc_info:
        Model.model_validate({'pet': {'pet_typ': 'cat'}, 'number': 'x'})
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'discriminator': "'pet_type'"},
            'input': {'pet_typ': 'cat'},
            'loc': ('pet',),
            'msg': "Unable to extract tag using discriminator 'pet_type'",
            'type': 'union_tag_not_found',
        },
        {
            'input': 'x',
            'loc': ('number',),
            'msg': 'Input should be a valid integer, unable to parse string as an '
            'integer',
            'type': 'int_parsing',
        },
    ]

    with pytest.raises(ValidationError) as exc_info:
        Model.model_validate({'pet': 'fish', 'number': 2})
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'model_attributes_type',
            'loc': ('pet',),
            'msg': 'Input should be a valid dictionary or object to extract fields from',
            'input': 'fish',
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        Model.model_validate({'pet': {'pet_type': 'fish'}, 'number': 2})
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'discriminator': "'pet_type'", 'expected_tags': "'cat', 'dog', 'reptile', 'lizard'", 'tag': 'fish'},
            'input': {'pet_type': 'fish'},
            'loc': ('pet',),
            'msg': "Input tag 'fish' found using 'pet_type' does not match any of the "
            "expected tags: 'cat', 'dog', 'reptile', 'lizard'",
            'type': 'union_tag_invalid',
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        Model.model_validate({'pet': {'pet_type': 'lizard'}, 'number': 2})
    assert exc_info.value.errors(include_url=False) == [
        {'input': {'pet_type': 'lizard'}, 'loc': ('pet', 'lizard', 'm'), 'msg': 'Field required', 'type': 'missing'}
    ]

    m = Model.model_validate({'pet': {'pet_type': 'lizard', 'm': 'pika'}, 'number': 2})
    assert isinstance(m.pet, Lizard)
    assert m.model_dump() == {'pet': {'pet_type': 'lizard', 'm': 'pika'}, 'number': 2}

    with pytest.raises(ValidationError) as exc_info:
        Model.model_validate({'pet': {'pet_type': 'cat', 'color': 'white'}, 'number': 2})
    assert exc_info.value.errors(include_url=False) == [
        {
            'input': {'color': 'white', 'pet_type': 'cat'},
            'loc': ('pet', 'cat', 'white', 'white_infos'),
            'msg': 'Field required',
            'type': 'missing',
        }
    ]

    m = Model.model_validate({'pet': {'pet_type': 'cat', 'color': 'white', 'white_infos': 'pika'}, 'number': 2})
    assert isinstance(m.pet, WhiteCat)


def test_discriminated_annotated_union():
    class BlackCat(BaseModel):
        pet_type: Literal['cat']
        color: Literal['black']
        black_infos: str

    class WhiteCat(BaseModel):
        pet_type: Literal['cat']
        color: Literal['white']
        white_infos: str

    Cat = Annotated[Union[BlackCat, WhiteCat], Field(discriminator='color')]

    class Dog(BaseModel):
        pet_type: Literal['dog']
        dog_name: str

    Pet = Annotated[Union[Cat, Dog], Field(discriminator='pet_type')]

    class Model(BaseModel):
        pet: Pet
        number: int

    with pytest.raises(ValidationError) as exc_info:
        Model.model_validate({'pet': {'pet_typ': 'cat'}, 'number': 'x'})
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'discriminator': "'pet_type'"},
            'input': {'pet_typ': 'cat'},
            'loc': ('pet',),
            'msg': "Unable to extract tag using discriminator 'pet_type'",
            'type': 'union_tag_not_found',
        },
        {
            'input': 'x',
            'loc': ('number',),
            'msg': 'Input should be a valid integer, unable to parse string as an '
            'integer',
            'type': 'int_parsing',
        },
    ]

    with pytest.raises(ValidationError) as exc_info:
        Model.model_validate({'pet': {'pet_type': 'fish'}, 'number': 2})
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'discriminator': "'pet_type'", 'expected_tags': "'cat', 'dog'", 'tag': 'fish'},
            'input': {'pet_type': 'fish'},
            'loc': ('pet',),
            'msg': "Input tag 'fish' found using 'pet_type' does not match any of the "
            "expected tags: 'cat', 'dog'",
            'type': 'union_tag_invalid',
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        Model.model_validate({'pet': {'pet_type': 'dog'}, 'number': 2})
    assert exc_info.value.errors(include_url=False) == [
        {'input': {'pet_type': 'dog'}, 'loc': ('pet', 'dog', 'dog_name'), 'msg': 'Field required', 'type': 'missing'}
    ]

    m = Model.model_validate({'pet': {'pet_type': 'dog', 'dog_name': 'milou'}, 'number': 2})
    assert isinstance(m.pet, Dog)

    with pytest.raises(ValidationError) as exc_info:
        Model.model_validate({'pet': {'pet_type': 'cat', 'color': 'red'}, 'number': 2})
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'discriminator': "'color'", 'expected_tags': "'black', 'white'", 'tag': 'red'},
            'input': {'color': 'red', 'pet_type': 'cat'},
            'loc': ('pet', 'cat'),
            'msg': "Input tag 'red' found using 'color' does not match any of the "
            "expected tags: 'black', 'white'",
            'type': 'union_tag_invalid',
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        Model.model_validate({'pet': {'pet_type': 'cat', 'color': 'white'}, 'number': 2})
    assert exc_info.value.errors(include_url=False) == [
        {
            'input': {'color': 'white', 'pet_type': 'cat'},
            'loc': ('pet', 'cat', 'white', 'white_infos'),
            'msg': 'Field required',
            'type': 'missing',
        }
    ]

    m = Model.model_validate({'pet': {'pet_type': 'cat', 'color': 'white', 'white_infos': 'pika'}, 'number': 2})
    assert isinstance(m.pet, WhiteCat)


def test_discriminated_union_basemodel_instance_value():
    class A(BaseModel):
        foo: Literal['a']

    class B(BaseModel):
        foo: Literal['b']

    class Top(BaseModel):
        sub: Union[A, B] = Field(discriminator='foo')

    t = Top(sub=A(foo='a'))
    assert isinstance(t, Top)


def test_discriminated_union_basemodel_instance_value_with_alias():
    class A(BaseModel):
        literal: Literal['a'] = Field(alias='lit')

    class B(BaseModel):
        model_config = ConfigDict(populate_by_name=True)
        literal: Literal['b'] = Field(alias='lit')

    class Top(BaseModel):
        sub: Union[A, B] = Field(discriminator='literal')

    with pytest.raises(ValidationError) as exc_info:
        Top(sub=A(literal='a'))
    assert exc_info.value.errors(include_url=False) == [
        {'input': {'literal': 'a'}, 'loc': ('lit',), 'msg': 'Field required', 'type': 'missing'}
    ]
    assert Top(sub=A(lit='a')).sub.literal == 'a'
    assert Top(sub=B(lit='b')).sub.literal == 'b'
    assert Top(sub=B(literal='b')).sub.literal == 'b'


def test_discriminated_union_int():
    class A(BaseModel):
        m: Literal[1]

    class B(BaseModel):
        m: Literal[2]

    class Top(BaseModel):
        sub: Union[A, B] = Field(discriminator='m')

    assert isinstance(Top.model_validate({'sub': {'m': 2}}).sub, B)
    with pytest.raises(ValidationError) as exc_info:
        Top.model_validate({'sub': {'m': 3}})
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'discriminator': "'m'", 'expected_tags': '1, 2', 'tag': '3'},
            'input': {'m': 3},
            'loc': ('sub',),
            'msg': "Input tag '3' found using 'm' does not match any of the expected "
            'tags: 1, 2',
            'type': 'union_tag_invalid',
        }
    ]


class FooIntEnum(int, Enum):
    pass


class FooStrEnum(str, Enum):
    pass


ENUM_TEST_CASES = [
    pytest.param(Enum, {'a': 1, 'b': 2}),
    pytest.param(Enum, {'a': 'v_a', 'b': 'v_b'}),
    (FooIntEnum, {'a': 1, 'b': 2}),
    (IntEnum, {'a': 1, 'b': 2}),
    (FooStrEnum, {'a': 'v_a', 'b': 'v_b'}),
]
if sys.version_info >= (3, 11):
    from enum import StrEnum

    ENUM_TEST_CASES.append((StrEnum, {'a': 'v_a', 'b': 'v_b'}))


@pytest.mark.skipif(sys.version_info[:2] == (3, 8), reason='https://github.com/python/cpython/issues/103592')
@pytest.mark.parametrize('base_class,choices', ENUM_TEST_CASES)
def test_discriminated_union_enum(base_class, choices):
    EnumValue = base_class('EnumValue', choices)

    class A(BaseModel):
        m: Literal[EnumValue.a]

    class B(BaseModel):
        m: Literal[EnumValue.b]

    class Top(BaseModel):
        sub: Union[A, B] = Field(discriminator='m')

    assert isinstance(Top.model_validate({'sub': {'m': EnumValue.b}}).sub, B)
    if isinstance(EnumValue.b, (int, str)):
        assert isinstance(Top.model_validate({'sub': {'m': EnumValue.b.value}}).sub, B)
    with pytest.raises(ValidationError) as exc_info:
        Top.model_validate({'sub': {'m': 3}})
    expected_tags = f'{EnumValue.a!r}, {EnumValue.b!r}'
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'union_tag_invalid',
            'loc': ('sub',),
            'msg': f"Input tag '3' found using 'm' does not match any of the expected tags: {expected_tags}",
            'input': {'m': 3},
            'ctx': {'discriminator': "'m'", 'tag': '3', 'expected_tags': expected_tags},
        }
    ]


def test_alias_different():
    class Cat(BaseModel):
        pet_type: Literal['cat'] = Field(alias='U')
        c: str

    class Dog(BaseModel):
        pet_type: Literal['dog'] = Field(alias='T')
        d: str

    with pytest.raises(TypeError, match=re.escape("Aliases for discriminator 'pet_type' must be the same (got T, U)")):

        class Model(BaseModel):
            pet: Union[Cat, Dog] = Field(discriminator='pet_type')


def test_alias_same():
    class Cat(BaseModel):
        pet_type: Literal['cat'] = Field(alias='typeOfPet')
        c: str

    class Dog(BaseModel):
        pet_type: Literal['dog'] = Field(alias='typeOfPet')
        d: str

    class Model(BaseModel):
        pet: Union[Cat, Dog] = Field(discriminator='pet_type')

    assert Model(**{'pet': {'typeOfPet': 'dog', 'd': 'milou'}}).pet.pet_type == 'dog'


def test_nested():
    class Cat(BaseModel):
        pet_type: Literal['cat']
        name: str

    class Dog(BaseModel):
        pet_type: Literal['dog']
        name: str

    CommonPet = Annotated[Union[Cat, Dog], Field(discriminator='pet_type')]

    class Lizard(BaseModel):
        pet_type: Literal['reptile', 'lizard']
        name: str

    class Model(BaseModel):
        pet: Union[CommonPet, Lizard] = Field(discriminator='pet_type')
        n: int

    assert isinstance(Model(**{'pet': {'pet_type': 'dog', 'name': 'Milou'}, 'n': 5}).pet, Dog)


def test_generic():
    T = TypeVar('T')

    class Success(BaseModel, Generic[T]):
        type: Literal['Success'] = 'Success'
        data: T

    class Failure(BaseModel):
        type: Literal['Failure'] = 'Failure'
        error_message: str

    class Container(BaseModel, Generic[T]):
        result: Union[Success[T], Failure] = Field(discriminator='type')

    with pytest.raises(ValidationError, match="Unable to extract tag using discriminator 'type'"):
        Container[str].model_validate({'result': {}})

    with pytest.raises(
        ValidationError,
        match=re.escape(
            "Input tag 'Other' found using 'type' does not match any of the expected tags: 'Success', 'Failure'"
        ),
    ):
        Container[str].model_validate({'result': {'type': 'Other'}})

    with pytest.raises(ValidationError, match=r'Container\[str\]\nresult\.Success\.data') as exc_info:
        Container[str].model_validate({'result': {'type': 'Success'}})
    assert exc_info.value.errors(include_url=False) == [
        {'input': {'type': 'Success'}, 'loc': ('result', 'Success', 'data'), 'msg': 'Field required', 'type': 'missing'}
    ]

    # invalid types error
    with pytest.raises(ValidationError) as exc_info:
        Container[str].model_validate({'result': {'type': 'Success', 'data': 1}})
    assert exc_info.value.errors(include_url=False) == [
        {
            'input': 1,
            'loc': ('result', 'Success', 'data'),
            'msg': 'Input should be a valid string',
            'type': 'string_type',
        }
    ]

    assert Container[str].model_validate({'result': {'type': 'Success', 'data': '1'}}).result.data == '1'


def test_optional_union():
    class Cat(BaseModel):
        pet_type: Literal['cat']
        name: str

    class Dog(BaseModel):
        pet_type: Literal['dog']
        name: str

    class Pet(BaseModel):
        pet: Optional[Union[Cat, Dog]] = Field(discriminator='pet_type')

    assert Pet(pet={'pet_type': 'cat', 'name': 'Milo'}).model_dump() == {'pet': {'name': 'Milo', 'pet_type': 'cat'}}
    assert Pet(pet={'pet_type': 'dog', 'name': 'Otis'}).model_dump() == {'pet': {'name': 'Otis', 'pet_type': 'dog'}}
    assert Pet(pet=None).model_dump() == {'pet': None}

    with pytest.raises(ValidationError) as exc_info:
        Pet()
    assert exc_info.value.errors(include_url=False) == [
        {'input': {}, 'loc': ('pet',), 'msg': 'Field required', 'type': 'missing'}
    ]

    with pytest.raises(ValidationError) as exc_info:
        Pet(pet={'name': 'Benji'})
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'discriminator': "'pet_type'"},
            'input': {'name': 'Benji'},
            'loc': ('pet',),
            'msg': "Unable to extract tag using discriminator 'pet_type'",
            'type': 'union_tag_not_found',
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        Pet(pet={'pet_type': 'lizard'})
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'discriminator': "'pet_type'", 'expected_tags': "'cat', 'dog'", 'tag': 'lizard'},
            'input': {'pet_type': 'lizard'},
            'loc': ('pet',),
            'msg': "Input tag 'lizard' found using 'pet_type' does not match any of the "
            "expected tags: 'cat', 'dog'",
            'type': 'union_tag_invalid',
        }
    ]


def test_optional_union_with_defaults():
    class Cat(BaseModel):
        pet_type: Literal['cat'] = 'cat'
        name: str

    class Dog(BaseModel):
        pet_type: Literal['dog'] = 'dog'
        name: str

    class Pet(BaseModel):
        pet: Optional[Union[Cat, Dog]] = Field(default=None, discriminator='pet_type')

    assert Pet(pet={'pet_type': 'cat', 'name': 'Milo'}).model_dump() == {'pet': {'name': 'Milo', 'pet_type': 'cat'}}
    assert Pet(pet={'pet_type': 'dog', 'name': 'Otis'}).model_dump() == {'pet': {'name': 'Otis', 'pet_type': 'dog'}}
    assert Pet(pet=None).model_dump() == {'pet': None}
    assert Pet().model_dump() == {'pet': None}

    with pytest.raises(ValidationError) as exc_info:
        Pet(pet={'name': 'Benji'})
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'discriminator': "'pet_type'"},
            'input': {'name': 'Benji'},
            'loc': ('pet',),
            'msg': "Unable to extract tag using discriminator 'pet_type'",
            'type': 'union_tag_not_found',
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        Pet(pet={'pet_type': 'lizard'})
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'discriminator': "'pet_type'", 'expected_tags': "'cat', 'dog'", 'tag': 'lizard'},
            'input': {'pet_type': 'lizard'},
            'loc': ('pet',),
            'msg': "Input tag 'lizard' found using 'pet_type' does not match any of the "
            "expected tags: 'cat', 'dog'",
            'type': 'union_tag_invalid',
        }
    ]


def test_aliases_matching_is_not_sufficient() -> None:
    class Case1(BaseModel):
        kind_one: Literal['1'] = Field(alias='kind')

    class Case2(BaseModel):
        kind_two: Literal['2'] = Field(alias='kind')

    with pytest.raises(PydanticUserError, match="Model 'Case1' needs a discriminator field for key 'kind'"):

        class TaggedParent(BaseModel):
            tagged: Union[Case1, Case2] = Field(discriminator='kind')


def test_nested_optional_unions() -> None:
    class Cat(BaseModel):
        pet_type: Literal['cat'] = 'cat'

    class Dog(BaseModel):
        pet_type: Literal['dog'] = 'dog'

    class Lizard(BaseModel):
        pet_type: Literal['lizard', 'reptile'] = 'lizard'

    MaybeCatDog = Annotated[Optional[Union[Cat, Dog]], Field(discriminator='pet_type')]
    MaybeDogLizard = Annotated[Union[Dog, Lizard, None], Field(discriminator='pet_type')]

    class Pet(BaseModel):
        pet: Union[MaybeCatDog, MaybeDogLizard] = Field(discriminator='pet_type')

    Pet.model_validate({'pet': {'pet_type': 'dog'}})
    Pet.model_validate({'pet': {'pet_type': 'cat'}})
    Pet.model_validate({'pet': {'pet_type': 'lizard'}})
    Pet.model_validate({'pet': {'pet_type': 'reptile'}})
    Pet.model_validate({'pet': None})

    with pytest.raises(ValidationError) as exc_info:
        Pet.model_validate({'pet': {'pet_type': None}})
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'union_tag_invalid',
            'loc': ('pet',),
            'msg': "Input tag 'None' found using 'pet_type' does not match any of the expected tags: 'cat', 'dog', 'lizard', 'reptile'",
            'input': {'pet_type': None},
            'ctx': {'discriminator': "'pet_type'", 'tag': 'None', 'expected_tags': "'cat', 'dog', 'lizard', 'reptile'"},
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        Pet.model_validate({'pet': {'pet_type': 'fox'}})
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'union_tag_invalid',
            'loc': ('pet',),
            'msg': "Input tag 'fox' found using 'pet_type' does not match any of the expected tags: 'cat', 'dog', 'lizard', 'reptile'",
            'input': {'pet_type': 'fox'},
            'ctx': {'discriminator': "'pet_type'", 'tag': 'fox', 'expected_tags': "'cat', 'dog', 'lizard', 'reptile'"},
        }
    ]


def test_nested_discriminated_union() -> None:
    class Cat(BaseModel):
        pet_type: Literal['cat', 'CAT']

    class Dog(BaseModel):
        pet_type: Literal['dog', 'DOG']

    class Lizard(BaseModel):
        pet_type: Literal['lizard', 'LIZARD']

    CatDog = Annotated[Union[Cat, Dog], Field(discriminator='pet_type')]
    CatDogLizard = Annotated[Union[CatDog, Lizard], Field(discriminator='pet_type')]

    class Pet(BaseModel):
        pet: CatDogLizard

    Pet.model_validate({'pet': {'pet_type': 'dog'}})
    Pet.model_validate({'pet': {'pet_type': 'cat'}})
    Pet.model_validate({'pet': {'pet_type': 'lizard'}})

    with pytest.raises(ValidationError) as exc_info:
        Pet.model_validate({'pet': {'pet_type': 'reptile'}})
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'union_tag_invalid',
            'loc': ('pet',),
            'msg': "Input tag 'reptile' found using 'pet_type' does not match any of the expected tags: 'cat', 'CAT', 'dog', 'DOG', 'lizard', 'LIZARD'",
            'input': {'pet_type': 'reptile'},
            'ctx': {
                'discriminator': "'pet_type'",
                'tag': 'reptile',
                'expected_tags': "'cat', 'CAT', 'dog', 'DOG', 'lizard', 'LIZARD'",
            },
        }
    ]


def test_unions_of_optionals() -> None:
    class Cat(BaseModel):
        pet_type: Literal['cat'] = Field(alias='typeOfPet')
        c: str

    class Dog(BaseModel):
        pet_type: Literal['dog'] = Field(alias='typeOfPet')
        d: str

    class Lizard(BaseModel):
        pet_type: Literal['lizard'] = Field(alias='typeOfPet')

    MaybeCat = Annotated[Union[Cat, None], 'some annotation']
    MaybeDogLizard = Annotated[Optional[Union[Dog, Lizard]], 'some other annotation']

    class Model(BaseModel):
        maybe_pet: Union[MaybeCat, MaybeDogLizard] = Field(discriminator='pet_type')

    assert Model(**{'maybe_pet': None}).maybe_pet is None
    assert Model(**{'maybe_pet': {'typeOfPet': 'dog', 'd': 'milou'}}).maybe_pet.pet_type == 'dog'
    assert Model(**{'maybe_pet': {'typeOfPet': 'lizard'}}).maybe_pet.pet_type == 'lizard'


def test_union_discriminator_literals() -> None:
    class Cat(BaseModel):
        pet_type: Union[Literal['cat'], Literal['CAT']] = Field(alias='typeOfPet')

    class Dog(BaseModel):
        pet_type: Literal['dog'] = Field(alias='typeOfPet')

    class Model(BaseModel):
        pet: Union[Cat, Dog] = Field(discriminator='pet_type')

    assert Model(**{'pet': {'typeOfPet': 'dog'}}).pet.pet_type == 'dog'
    assert Model(**{'pet': {'typeOfPet': 'cat'}}).pet.pet_type == 'cat'
    assert Model(**{'pet': {'typeOfPet': 'CAT'}}).pet.pet_type == 'CAT'
    with pytest.raises(ValidationError) as exc_info:
        Model(**{'pet': {'typeOfPet': 'Cat'}})
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'union_tag_invalid',
            'loc': ('pet',),
            'msg': "Input tag 'Cat' found using 'pet_type' | 'typeOfPet' does not match any of the expected tags: 'cat', 'CAT', 'dog'",
            'input': {'typeOfPet': 'Cat'},
            'ctx': {'discriminator': "'pet_type' | 'typeOfPet'", 'tag': 'Cat', 'expected_tags': "'cat', 'CAT', 'dog'"},
        }
    ]


def test_none_schema() -> None:
    cat_fields = {'kind': core_schema.typed_dict_field(core_schema.literal_schema(['cat']))}
    dog_fields = {'kind': core_schema.typed_dict_field(core_schema.literal_schema(['dog']))}
    cat = core_schema.typed_dict_schema(cat_fields)
    dog = core_schema.typed_dict_schema(dog_fields)
    schema = core_schema.union_schema([cat, dog, core_schema.none_schema()])
    schema = apply_discriminator(schema, 'kind')
    validator = SchemaValidator(schema)
    assert validator.validate_python({'kind': 'cat'})['kind'] == 'cat'
    assert validator.validate_python({'kind': 'dog'})['kind'] == 'dog'
    assert validator.validate_python(None) is None
    with pytest.raises(ValidationError) as exc_info:
        validator.validate_python({'kind': 'lizard'})
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'discriminator': "'kind'", 'expected_tags': "'cat', 'dog'", 'tag': 'lizard'},
            'input': {'kind': 'lizard'},
            'loc': (),
            'msg': "Input tag 'lizard' found using 'kind' does not match any of the "
            "expected tags: 'cat', 'dog'",
            'type': 'union_tag_invalid',
        }
    ]


def test_nested_unwrapping() -> None:
    cat_fields = {'kind': core_schema.typed_dict_field(core_schema.literal_schema(['cat']))}
    dog_fields = {'kind': core_schema.typed_dict_field(core_schema.literal_schema(['dog']))}
    cat = core_schema.typed_dict_schema(cat_fields)
    dog = core_schema.typed_dict_schema(dog_fields)
    schema = core_schema.union_schema([cat, dog])
    for _ in range(3):
        schema = core_schema.nullable_schema(schema)
        schema = core_schema.nullable_schema(schema)
        schema = core_schema.definitions_schema(schema, [])
        schema = core_schema.definitions_schema(schema, [])

    schema = apply_discriminator(schema, 'kind')

    validator = SchemaValidator(schema)
    assert validator.validate_python({'kind': 'cat'})['kind'] == 'cat'
    assert validator.validate_python({'kind': 'dog'})['kind'] == 'dog'
    assert validator.validate_python(None) is None
    with pytest.raises(ValidationError) as exc_info:
        validator.validate_python({'kind': 'lizard'})
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'discriminator': "'kind'", 'expected_tags': "'cat', 'dog'", 'tag': 'lizard'},
            'input': {'kind': 'lizard'},
            'loc': (),
            'msg': "Input tag 'lizard' found using 'kind' does not match any of the "
            "expected tags: 'cat', 'dog'",
            'type': 'union_tag_invalid',
        }
    ]


def test_distinct_choices() -> None:
    class Cat(BaseModel):
        pet_type: Literal['cat', 'dog'] = Field(alias='typeOfPet')

    class Dog(BaseModel):
        pet_type: Literal['dog'] = Field(alias='typeOfPet')

    with pytest.raises(TypeError, match="Value 'dog' for discriminator 'pet_type' mapped to multiple choices"):

        class Model(BaseModel):
            pet: Union[Cat, Dog] = Field(discriminator='pet_type')


def test_invalid_discriminated_union_type() -> None:
    class Cat(BaseModel):
        pet_type: Literal['cat'] = Field(alias='typeOfPet')

    class Dog(BaseModel):
        pet_type: Literal['dog'] = Field(alias='typeOfPet')

    with pytest.raises(
        TypeError, match="'str' is not a valid discriminated union variant; should be a `BaseModel` or `dataclass`"
    ):

        class Model(BaseModel):
            pet: Union[Cat, Dog, str] = Field(discriminator='pet_type')


def test_invalid_alias() -> None:
    cat_fields = {
        'kind': core_schema.typed_dict_field(core_schema.literal_schema(['cat']), validation_alias=['cat', 'CAT'])
    }
    dog_fields = {'kind': core_schema.typed_dict_field(core_schema.literal_schema(['dog']))}
    cat = core_schema.typed_dict_schema(cat_fields)
    dog = core_schema.typed_dict_schema(dog_fields)
    schema = core_schema.union_schema([cat, dog])

    with pytest.raises(TypeError, match=re.escape("Alias ['cat', 'CAT'] is not supported in a discriminated union")):
        apply_discriminator(schema, 'kind')


def test_invalid_discriminator_type() -> None:
    cat_fields = {'kind': core_schema.typed_dict_field(core_schema.int_schema())}
    dog_fields = {'kind': core_schema.typed_dict_field(core_schema.str_schema())}
    cat = core_schema.typed_dict_schema(cat_fields)
    dog = core_schema.typed_dict_schema(dog_fields)

    with pytest.raises(TypeError, match=re.escape("TypedDict needs field 'kind' to be of type `Literal`")):
        apply_discriminator(core_schema.union_schema([cat, dog]), 'kind')


def test_missing_discriminator_field() -> None:
    cat_fields = {'kind': core_schema.typed_dict_field(core_schema.int_schema())}
    dog_fields = {}
    cat = core_schema.typed_dict_schema(cat_fields)
    dog = core_schema.typed_dict_schema(dog_fields)

    with pytest.raises(TypeError, match=re.escape("TypedDict needs a discriminator field for key 'kind'")):
        apply_discriminator(core_schema.union_schema([dog, cat]), 'kind')


def test_wrap_function_schema() -> None:
    cat_fields = {'kind': core_schema.typed_dict_field(core_schema.literal_schema(['cat']))}
    dog_fields = {'kind': core_schema.typed_dict_field(core_schema.literal_schema(['dog']))}
    cat = core_schema.with_info_wrap_validator_function(lambda x, y, z: None, core_schema.typed_dict_schema(cat_fields))
    dog = core_schema.typed_dict_schema(dog_fields)
    schema = core_schema.union_schema([cat, dog])

    assert apply_discriminator(schema, 'kind') == {
        'choices': {
            'cat': {
                'function': {
                    'type': 'with-info',
                    'function': HasRepr(IsStr(regex=r'\. at 0x[0-9a-fA-F]+>')),
                },
                'schema': {
                    'fields': {
                        'kind': {'schema': {'expected': ['cat'], 'type': 'literal'}, 'type': 'typed-dict-field'}
                    },
                    'type': 'typed-dict',
                },
                'type': 'function-wrap',
            },
            'dog': {
                'fields': {'kind': {'schema': {'expected': ['dog'], 'type': 'literal'}, 'type': 'typed-dict-field'}},
                'type': 'typed-dict',
            },
        },
        'discriminator': 'kind',
        'from_attributes': True,
        'strict': False,
        'type': 'tagged-union',
    }


def test_plain_function_schema_is_invalid() -> None:
    with pytest.raises(
        TypeError,
        match="'function-plain' is not a valid discriminated union variant; "
        'should be a `BaseModel` or `dataclass`',
    ):
        apply_discriminator(
            core_schema.union_schema(
                [core_schema.with_info_plain_validator_function(lambda x, y: None), core_schema.int_schema()]
            ),
            'kind',
        )


def test_invalid_str_choice_discriminator_values() -> None:
    cat = core_schema.typed_dict_schema({'kind': core_schema.typed_dict_field(core_schema.literal_schema(['cat']))})
    dog = core_schema.str_schema()
    schema = core_schema.union_schema(
        [
            cat,
            # NOTE: Wrapping the union with a validator results in failure to more thoroughly decompose the tagged
            # union. I think this would be difficult to avoid in the general case, and I would suggest that we not
            # attempt to do more than this until presented with scenarios where it is helpful/necessary.
            core_schema.with_info_wrap_validator_function(lambda x, y, z: x, dog),
        ]
    )

    with pytest.raises(
        TypeError, match="'str' is not a valid discriminated union variant; should be a `BaseModel` or `dataclass`"
    ):
        apply_discriminator(schema, 'kind')


def test_lax_or_strict_definitions() -> None:
    cat = core_schema.typed_dict_schema({'kind': core_schema.typed_dict_field(core_schema.literal_schema(['cat']))})
    lax_dog = core_schema.typed_dict_schema({'kind': core_schema.typed_dict_field(core_schema.literal_schema(['DOG']))})
    strict_dog = core_schema.definitions_schema(
        core_schema.typed_dict_schema({'kind': core_schema.typed_dict_field(core_schema.literal_schema(['dog']))}),
        [core_schema.int_schema(ref='my-int-definition')],
    )
    dog = core_schema.definitions_schema(
        core_schema.lax_or_strict_schema(lax_schema=lax_dog, strict_schema=strict_dog),
        [core_schema.str_schema(ref='my-str-definition')],
    )
    discriminated_schema = apply_discriminator(core_schema.union_schema([cat, dog]), 'kind')
    # insert_assert(discriminated_schema)
    assert discriminated_schema == {
        'type': 'tagged-union',
        'choices': {
            'cat': {
                'type': 'typed-dict',
                'fields': {'kind': {'type': 'typed-dict-field', 'schema': {'type': 'literal', 'expected': ['cat']}}},
            },
            'DOG': {
                'type': 'lax-or-strict',
                'lax_schema': {
                    'type': 'typed-dict',
                    'fields': {
                        'kind': {'type': 'typed-dict-field', 'schema': {'type': 'literal', 'expected': ['DOG']}}
                    },
                },
                'strict_schema': {
                    'type': 'definitions',
                    'schema': {
                        'type': 'typed-dict',
                        'fields': {
                            'kind': {'type': 'typed-dict-field', 'schema': {'type': 'literal', 'expected': ['dog']}}
                        },
                    },
                    'definitions': [{'type': 'int', 'ref': 'my-int-definition'}],
                },
            },
            'dog': {
                'type': 'lax-or-strict',
                'lax_schema': {
                    'type': 'typed-dict',
                    'fields': {
                        'kind': {'type': 'typed-dict-field', 'schema': {'type': 'literal', 'expected': ['DOG']}}
                    },
                },
                'strict_schema': {
                    'type': 'definitions',
                    'schema': {
                        'type': 'typed-dict',
                        'fields': {
                            'kind': {'type': 'typed-dict-field', 'schema': {'type': 'literal', 'expected': ['dog']}}
                        },
                    },
                    'definitions': [{'type': 'int', 'ref': 'my-int-definition'}],
                },
            },
        },
        'discriminator': 'kind',
        'strict': False,
        'from_attributes': True,
    }


def test_wrapped_nullable_union() -> None:
    cat = core_schema.typed_dict_schema({'kind': core_schema.typed_dict_field(core_schema.literal_schema(['cat']))})
    dog = core_schema.typed_dict_schema({'kind': core_schema.typed_dict_field(core_schema.literal_schema(['dog']))})
    ant = core_schema.typed_dict_schema({'kind': core_schema.typed_dict_field(core_schema.literal_schema(['ant']))})

    schema = core_schema.union_schema(
        [
            ant,
            # NOTE: Wrapping the union with a validator results in failure to more thoroughly decompose the tagged
            # union. I think this would be difficult to avoid in the general case, and I would suggest that we not
            # attempt to do more than this until presented with scenarios where it is helpful/necessary.
            core_schema.with_info_wrap_validator_function(
                lambda x, y, z: x, core_schema.nullable_schema(core_schema.union_schema([cat, dog]))
            ),
        ]
    )
    discriminated_schema = apply_discriminator(schema, 'kind')
    validator = SchemaValidator(discriminated_schema)
    assert validator.validate_python({'kind': 'ant'})['kind'] == 'ant'
    assert validator.validate_python({'kind': 'cat'})['kind'] == 'cat'
    assert validator.validate_python(None) is None
    with pytest.raises(ValidationError) as exc_info:
        validator.validate_python({'kind': 'armadillo'})
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'discriminator': "'kind'", 'expected_tags': "'ant', 'cat', 'dog'", 'tag': 'armadillo'},
            'input': {'kind': 'armadillo'},
            'loc': (),
            'msg': "Input tag 'armadillo' found using 'kind' does not match any of the "
            "expected tags: 'ant', 'cat', 'dog'",
            'type': 'union_tag_invalid',
        }
    ]

    # insert_assert(discriminated_schema)
    assert discriminated_schema == {
        'type': 'nullable',
        'schema': {
            'type': 'tagged-union',
            'choices': {
                'ant': {
                    'type': 'typed-dict',
                    'fields': {
                        'kind': {'type': 'typed-dict-field', 'schema': {'type': 'literal', 'expected': ['ant']}}
                    },
                },
                'cat': {
                    'type': 'function-wrap',
                    'function': {
                        'type': 'with-info',
                        'function': HasRepr(IsStr(regex=r'\. at 0x[0-9a-fA-F]+>')),
                    },
                    'schema': {
                        'type': 'nullable',
                        'schema': {
                            'type': 'union',
                            'choices': [
                                {
                                    'type': 'typed-dict',
                                    'fields': {
                                        'kind': {
                                            'type': 'typed-dict-field',
                                            'schema': {'type': 'literal', 'expected': ['cat']},
                                        }
                                    },
                                },
                                {
                                    'type': 'typed-dict',
                                    'fields': {
                                        'kind': {
                                            'type': 'typed-dict-field',
                                            'schema': {'type': 'literal', 'expected': ['dog']},
                                        }
                                    },
                                },
                            ],
                        },
                    },
                },
                'dog': {
                    'type': 'function-wrap',
                    'function': {
                        'type': 'with-info',
                        'function': HasRepr(IsStr(regex=r'\. at 0x[0-9a-fA-F]+>')),
                    },
                    'schema': {
                        'type': 'nullable',
                        'schema': {
                            'type': 'union',
                            'choices': [
                                {
                                    'type': 'typed-dict',
                                    'fields': {
                                        'kind': {
                                            'type': 'typed-dict-field',
                                            'schema': {'type': 'literal', 'expected': ['cat']},
                                        }
                                    },
                                },
                                {
                                    'type': 'typed-dict',
                                    'fields': {
                                        'kind': {
                                            'type': 'typed-dict-field',
                                            'schema': {'type': 'literal', 'expected': ['dog']},
                                        }
                                    },
                                },
                            ],
                        },
                    },
                },
            },
            'discriminator': 'kind',
            'strict': False,
            'from_attributes': True,
        },
    }


def test_union_in_submodel() -> None:
    class UnionModel1(BaseModel):
        type: Literal[1] = 1
        other: Literal['UnionModel1'] = 'UnionModel1'

    class UnionModel2(BaseModel):
        type: Literal[2] = 2
        other: Literal['UnionModel2'] = 'UnionModel2'

    UnionModel = Annotated[Union[UnionModel1, UnionModel2], Field(discriminator='type')]

    class SubModel1(BaseModel):
        union_model: UnionModel

    class SubModel2(BaseModel):
        union_model: UnionModel

    class TestModel(BaseModel):
        submodel: Union[SubModel1, SubModel2]

    m = TestModel.model_validate({'submodel': {'union_model': {'type': 1}}})
    assert isinstance(m.submodel, SubModel1)
    assert isinstance(m.submodel.union_model, UnionModel1)

    m = TestModel.model_validate({'submodel': {'union_model': {'type': 2}}})
    assert isinstance(m.submodel, SubModel1)
    assert isinstance(m.submodel.union_model, UnionModel2)

    with pytest.raises(ValidationError) as exc_info:
        TestModel.model_validate({'submodel': {'union_model': {'type': 1, 'other': 'UnionModel2'}}})
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'literal_error',
            'loc': ('submodel', 'SubModel1', 'union_model', 1, 'other'),
            'msg': "Input should be 'UnionModel1'",
            'input': 'UnionModel2',
            'ctx': {'expected': "'UnionModel1'"},
        },
        {
            'type': 'literal_error',
            'loc': ('submodel', 'SubModel2', 'union_model', 1, 'other'),
            'msg': "Input should be 'UnionModel1'",
            'input': 'UnionModel2',
            'ctx': {'expected': "'UnionModel1'"},
        },
    ]

    # insert_assert(TestModel.model_json_schema())
    assert TestModel.model_json_schema() == {
        '$defs': {
            'SubModel1': {
                'properties': {
                    'union_model': {
                        'discriminator': {
                            'mapping': {'1': '#/$defs/UnionModel1', '2': '#/$defs/UnionModel2'},
                            'propertyName': 'type',
                        },
                        'oneOf': [{'$ref': '#/$defs/UnionModel1'}, {'$ref': '#/$defs/UnionModel2'}],
                        'title': 'Union Model',
                    }
                },
                'required': ['union_model'],
                'title': 'SubModel1',
                'type': 'object',
            },
            'SubModel2': {
                'properties': {
                    'union_model': {
                        'discriminator': {
                            'mapping': {'1': '#/$defs/UnionModel1', '2': '#/$defs/UnionModel2'},
                            'propertyName': 'type',
                        },
                        'oneOf': [{'$ref': '#/$defs/UnionModel1'}, {'$ref': '#/$defs/UnionModel2'}],
                        'title': 'Union Model',
                    }
                },
                'required': ['union_model'],
                'title': 'SubModel2',
                'type': 'object',
            },
            'UnionModel1': {
                'properties': {
                    'type': {'const': 1, 'default': 1, 'title': 'Type', 'type': 'integer'},
                    'other': {
                        'const': 'UnionModel1',
                        'default': 'UnionModel1',
                        'title': 'Other',
                        'type': 'string',
                    },
                },
                'title': 'UnionModel1',
                'type': 'object',
            },
            'UnionModel2': {
                'properties': {
                    'type': {'const': 2, 'default': 2, 'title': 'Type', 'type': 'integer'},
                    'other': {
                        'const': 'UnionModel2',
                        'default': 'UnionModel2',
                        'title': 'Other',
                        'type': 'string',
                    },
                },
                'title': 'UnionModel2',
                'type': 'object',
            },
        },
        'properties': {
            'submodel': {'anyOf': [{'$ref': '#/$defs/SubModel1'}, {'$ref': '#/$defs/SubModel2'}], 'title': 'Submodel'}
        },
        'required': ['submodel'],
        'title': 'TestModel',
        'type': 'object',
    }


def test_function_after_discriminator():
    class CatModel(BaseModel):
        name: Literal['kitty', 'cat']

        @field_validator('name', mode='after')
        def replace_name(cls, v):
            return 'cat'

    class DogModel(BaseModel):
        name: Literal['puppy', 'dog']

        # comment out the 2 field validators and model will work!
        @field_validator('name', mode='after')
        def replace_name(cls, v):
            return 'dog'

    AllowedAnimal = Annotated[Union[CatModel, DogModel], Field(discriminator='name')]

    class Model(BaseModel):
        x: AllowedAnimal

    m = Model(x={'name': 'kitty'})
    assert m.x.name == 'cat'

    # Ensure a discriminated union is actually being used during validation
    with pytest.raises(ValidationError) as exc_info:
        Model(x={'name': 'invalid'})
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'discriminator': "'name'", 'expected_tags': "'kitty', 'cat', 'puppy', 'dog'", 'tag': 'invalid'},
            'input': {'name': 'invalid'},
            'loc': ('x',),
            'msg': "Input tag 'invalid' found using 'name' does not match any of the "
            "expected tags: 'kitty', 'cat', 'puppy', 'dog'",
            'type': 'union_tag_invalid',
        }
    ]


def test_sequence_discriminated_union():
    class Cat(BaseModel):
        pet_type: Literal['cat']
        meows: int

    class Dog(BaseModel):
        pet_type: Literal['dog']
        barks: float

    class Lizard(BaseModel):
        pet_type: Literal['reptile', 'lizard']
        scales: bool

    Pet = Annotated[Union[Cat, Dog, Lizard], Field(discriminator='pet_type')]

    class Model(BaseModel):
        pet: Sequence[Pet]
        n: int

    # insert_assert(Model.model_json_schema())
    assert Model.model_json_schema() == {
        '$defs': {
            'Cat': {
                'properties': {
                    'pet_type': {'const': 'cat', 'title': 'Pet Type', 'type': 'string'},
                    'meows': {'title': 'Meows', 'type': 'integer'},
                },
                'required': ['pet_type', 'meows'],
                'title': 'Cat',
                'type': 'object',
            },
            'Dog': {
                'properties': {
                    'pet_type': {'const': 'dog', 'title': 'Pet Type', 'type': 'string'},
                    'barks': {'title': 'Barks', 'type': 'number'},
                },
                'required': ['pet_type', 'barks'],
                'title': 'Dog',
                'type': 'object',
            },
            'Lizard': {
                'properties': {
                    'pet_type': {'enum': ['reptile', 'lizard'], 'title': 'Pet Type', 'type': 'string'},
                    'scales': {'title': 'Scales', 'type': 'boolean'},
                },
                'required': ['pet_type', 'scales'],
                'title': 'Lizard',
                'type': 'object',
            },
        },
        'properties': {
            'pet': {
                'items': {
                    'discriminator': {
                        'mapping': {
                            'cat': '#/$defs/Cat',
                            'dog': '#/$defs/Dog',
                            'lizard': '#/$defs/Lizard',
                            'reptile': '#/$defs/Lizard',
                        },
                        'propertyName': 'pet_type',
                    },
                    'oneOf': [{'$ref': '#/$defs/Cat'}, {'$ref': '#/$defs/Dog'}, {'$ref': '#/$defs/Lizard'}],
                },
                'title': 'Pet',
                'type': 'array',
            },
            'n': {'title': 'N', 'type': 'integer'},
        },
        'required': ['pet', 'n'],
        'title': 'Model',
        'type': 'object',
    }


def test_sequence_discriminated_union_validation():
    """
    Related issue: https://github.com/pydantic/pydantic/issues/9872
    """

    class A(BaseModel):
        type: Literal['a']
        a_field: str

    class B(BaseModel):
        type: Literal['b']
        b_field: str

    class Model(BaseModel):
        items: Sequence[Annotated[Union[A, B], Field(discriminator='type')]]

    import json

    data_json = '{"items": [{"type": "b"}]}'
    data_dict = json.loads(data_json)
    expected_error = {
        'type': 'missing',
        'loc': ('items', 0, 'b', 'b_field'),
        'msg': 'Field required',
        'input': {'type': 'b'},
    }

    # missing field should be `b_field` only, not including `a_field`
    # also `literal_error` should not be reported on `type`
    with pytest.raises(ValidationError) as exc_info:
        Model.model_validate(data_dict)
    assert exc_info.value.errors(include_url=False) == [expected_error]

    with pytest.raises(ValidationError) as exc_info:
        Model.model_validate_json(data_json)
    assert exc_info.value.errors(include_url=False) == [expected_error]


def test_sequence_discriminated_union_validation_with_validator():
    """
    This is the same as the previous test, but add validators to both class.
    """

    class A(BaseModel):
        type: Literal['a']
        a_field: str

        @model_validator(mode='after')
        def check_a(self):
            return self

    class B(BaseModel):
        type: Literal['b']
        b_field: str

        @model_validator(mode='after')
        def check_b(self):
            return self

    class Model(BaseModel):
        items: Sequence[Annotated[Union[A, B], Field(discriminator='type')]]

    import json

    data_json = '{"items": [{"type": "b"}]}'
    data_dict = json.loads(data_json)
    expected_error = {
        'type': 'missing',
        'loc': ('items', 0, 'b', 'b_field'),
        'msg': 'Field required',
        'input': {'type': 'b'},
    }

    # missing field should be `b_field` only, not including `a_field`
    # also `literal_error` should not be reported on `type`
    with pytest.raises(ValidationError) as exc_info:
        Model.model_validate(data_dict)
    assert exc_info.value.errors(include_url=False) == [expected_error]


@pytest.fixture(scope='session', name='animals')
def callable_discriminated_union_animals() -> SimpleNamespace:
    class Cat(BaseModel):
        pet_type: Literal['cat'] = 'cat'

    class Dog(BaseModel):
        pet_kind: Literal['dog'] = 'dog'

    class Fish(BaseModel):
        pet_kind: Literal['fish'] = 'fish'

    class Lizard(BaseModel):
        pet_variety: Literal['lizard'] = 'lizard'

    animals = SimpleNamespace(cat=Cat, dog=Dog, fish=Fish, lizard=Lizard)
    return animals


@pytest.fixture(scope='session', name='get_pet_discriminator_value')
def shared_pet_discriminator_value() -> Callable[[Any], str]:
    def get_discriminator_value(v):
        if isinstance(v, dict):
            return v.get('pet_type', v.get('pet_kind'))
        return getattr(v, 'pet_type', getattr(v, 'pet_kind', None))

    return get_discriminator_value


def test_callable_discriminated_union_with_type_adapter(
    animals: SimpleNamespace, get_pet_discriminator_value: Callable[[Any], str]
) -> None:
    pet_adapter = TypeAdapter(
        Annotated[
            Union[Annotated[animals.cat, Tag('cat')], Annotated[animals.dog, Tag('dog')]],
            Discriminator(get_pet_discriminator_value),
        ]
    )

    assert pet_adapter.validate_python({'pet_type': 'cat'}).pet_type == 'cat'
    assert pet_adapter.validate_python({'pet_kind': 
'dog'}).pet_kind == 'dog' assert pet_adapter.validate_python(animals.cat()).pet_type == 'cat' assert pet_adapter.validate_python(animals.dog()).pet_kind == 'dog' assert pet_adapter.validate_json('{"pet_type":"cat"}').pet_type == 'cat' assert pet_adapter.validate_json('{"pet_kind":"dog"}').pet_kind == 'dog' # Unexpected discriminator value for dict with pytest.raises(ValidationError) as exc_info: pet_adapter.validate_python({'pet_kind': 'fish'}) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'discriminator': 'get_discriminator_value()', 'expected_tags': "'cat', 'dog'", 'tag': 'fish'}, 'input': {'pet_kind': 'fish'}, 'loc': (), 'msg': "Input tag 'fish' found using get_discriminator_value() does not " "match any of the expected tags: 'cat', 'dog'", 'type': 'union_tag_invalid', } ] # Missing discriminator key for dict with pytest.raises(ValidationError) as exc_info: pet_adapter.validate_python({'pet_variety': 'lizard'}) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'discriminator': 'get_discriminator_value()'}, 'input': {'pet_variety': 'lizard'}, 'loc': (), 'msg': 'Unable to extract tag using discriminator get_discriminator_value()', 'type': 'union_tag_not_found', } ] # Unexpected discriminator value for instance with pytest.raises(ValidationError) as exc_info: pet_adapter.validate_python(animals.fish()) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'discriminator': 'get_discriminator_value()', 'expected_tags': "'cat', 'dog'", 'tag': 'fish'}, 'input': animals.fish(pet_kind='fish'), 'loc': (), 'msg': "Input tag 'fish' found using get_discriminator_value() does not " "match any of the expected tags: 'cat', 'dog'", 'type': 'union_tag_invalid', } ] # Missing discriminator key for instance with pytest.raises(ValidationError) as exc_info: pet_adapter.validate_python(animals.lizard()) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'discriminator': 'get_discriminator_value()'}, 'input': 
animals.lizard(pet_variety='lizard'), 'loc': (), 'msg': 'Unable to extract tag using discriminator get_discriminator_value()', 'type': 'union_tag_not_found', } ] def test_various_syntax_options_for_callable_union( animals: SimpleNamespace, get_pet_discriminator_value: Callable[[Any], str] ) -> None: class PetModelField(BaseModel): pet: Union[Annotated[animals.cat, Tag('cat')], Annotated[animals.dog, Tag('dog')]] = Field( discriminator=Discriminator(get_pet_discriminator_value) ) class PetModelAnnotated(BaseModel): pet: Annotated[ Union[Annotated[animals.cat, Tag('cat')], Annotated[animals.dog, Tag('dog')]], Discriminator(get_pet_discriminator_value), ] class PetModelAnnotatedWithField(BaseModel): pet: Annotated[ Union[Annotated[animals.cat, Tag('cat')], Annotated[animals.dog, Tag('dog')]], Field(discriminator=Discriminator(get_pet_discriminator_value)), ] models = [PetModelField, PetModelAnnotated, PetModelAnnotatedWithField] for model in models: assert model.model_validate({'pet': {'pet_type': 'cat'}}).pet.pet_type == 'cat' assert model.model_validate({'pet': {'pet_kind': 'dog'}}).pet.pet_kind == 'dog' assert model(pet=animals.cat()).pet.pet_type == 'cat' assert model(pet=animals.dog()).pet.pet_kind == 'dog' assert model.model_validate_json('{"pet": {"pet_type":"cat"}}').pet.pet_type == 'cat' assert model.model_validate_json('{"pet": {"pet_kind":"dog"}}').pet.pet_kind == 'dog' # Unexpected discriminator value for dict with pytest.raises(ValidationError) as exc_info: model.model_validate({'pet': {'pet_kind': 'fish'}}) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'discriminator': 'get_discriminator_value()', 'expected_tags': "'cat', 'dog'", 'tag': 'fish'}, 'input': {'pet_kind': 'fish'}, 'loc': ('pet',), 'msg': "Input tag 'fish' found using get_discriminator_value() does not " "match any of the expected tags: 'cat', 'dog'", 'type': 'union_tag_invalid', } ] # Missing discriminator key for dict with pytest.raises(ValidationError) as exc_info: 
model.model_validate({'pet': {'pet_variety': 'lizard'}}) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'discriminator': 'get_discriminator_value()'}, 'input': {'pet_variety': 'lizard'}, 'loc': ('pet',), 'msg': 'Unable to extract tag using discriminator get_discriminator_value()', 'type': 'union_tag_not_found', } ] # Unexpected discriminator value for instance with pytest.raises(ValidationError) as exc_info: model(pet=animals.fish()) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'discriminator': 'get_discriminator_value()', 'expected_tags': "'cat', 'dog'", 'tag': 'fish'}, 'input': animals.fish(pet_kind='fish'), 'loc': ('pet',), 'msg': "Input tag 'fish' found using get_discriminator_value() does not " "match any of the expected tags: 'cat', 'dog'", 'type': 'union_tag_invalid', } ] # Missing discriminator key for instance with pytest.raises(ValidationError) as exc_info: model(pet=animals.lizard()) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'discriminator': 'get_discriminator_value()'}, 'input': animals.lizard(pet_variety='lizard'), 'loc': ('pet',), 'msg': 'Unable to extract tag using discriminator get_discriminator_value()', 'type': 'union_tag_not_found', } ] def test_callable_discriminated_union_recursive(): # Demonstrate that the errors are very verbose without a callable discriminator: class Model(BaseModel): x: Union[str, 'Model'] with pytest.raises(ValidationError) as exc_info: Model.model_validate({'x': {'x': {'x': 1}}}) assert exc_info.value.errors(include_url=False) == [ {'input': {'x': {'x': 1}}, 'loc': ('x', 'str'), 'msg': 'Input should be a valid string', 'type': 'string_type'}, { 'input': {'x': 1}, 'loc': ('x', 'Model', 'x', 'str'), 'msg': 'Input should be a valid string', 'type': 'string_type', }, { 'input': 1, 'loc': ('x', 'Model', 'x', 'Model', 'x', 'str'), 'msg': 'Input should be a valid string', 'type': 'string_type', }, { 'ctx': {'class_name': 'Model'}, 'input': 1, 'loc': ('x', 'Model', 'x', 
'Model', 'x', 'Model'), 'msg': 'Input should be a valid dictionary or instance of Model', 'type': 'model_type', }, ] with pytest.raises(ValidationError) as exc_info: Model.model_validate({'x': {'x': {'x': {}}}}) assert exc_info.value.errors(include_url=False) == [ { 'input': {'x': {'x': {}}}, 'loc': ('x', 'str'), 'msg': 'Input should be a valid string', 'type': 'string_type', }, { 'input': {'x': {}}, 'loc': ('x', 'Model', 'x', 'str'), 'msg': 'Input should be a valid string', 'type': 'string_type', }, { 'input': {}, 'loc': ('x', 'Model', 'x', 'Model', 'x', 'str'), 'msg': 'Input should be a valid string', 'type': 'string_type', }, { 'input': {}, 'loc': ('x', 'Model', 'x', 'Model', 'x', 'Model', 'x'), 'msg': 'Field required', 'type': 'missing', }, ] # Demonstrate that the errors are less verbose _with_ a callable discriminator: def model_x_discriminator(v): if isinstance(v, str): return 'str' if isinstance(v, (dict, BaseModel)): return 'model' class DiscriminatedModel(BaseModel): x: Annotated[ Union[Annotated[str, Tag('str')], Annotated['DiscriminatedModel', Tag('model')]], Discriminator( model_x_discriminator, custom_error_type='invalid_union_member', custom_error_message='Invalid union member', custom_error_context={'discriminator': 'str_or_model'}, ), ] with pytest.raises(ValidationError) as exc_info: DiscriminatedModel.model_validate({'x': {'x': {'x': 1}}}) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'discriminator': 'str_or_model'}, 'input': 1, 'loc': ('x', 'model', 'x', 'model', 'x'), 'msg': 'Invalid union member', 'type': 'invalid_union_member', } ] with pytest.raises(ValidationError) as exc_info: DiscriminatedModel.model_validate({'x': {'x': {'x': {}}}}) assert exc_info.value.errors(include_url=False) == [ { 'input': {}, 'loc': ('x', 'model', 'x', 'model', 'x', 'model', 'x'), 'msg': 'Field required', 'type': 'missing', } ] # Demonstrate that the data is still handled properly when valid: data = {'x': {'x': {'x': 'a'}}} m = 
DiscriminatedModel.model_validate(data) assert m == DiscriminatedModel(x=DiscriminatedModel(x=DiscriminatedModel(x='a'))) assert m.model_dump() == data def test_callable_discriminated_union_with_missing_tag() -> None: def model_x_discriminator(v): if isinstance(v, str): return 'str' if isinstance(v, (dict, BaseModel)): return 'model' try: class DiscriminatedModel(BaseModel): x: Annotated[ Union[str, 'DiscriminatedModel'], Discriminator(model_x_discriminator), ] except PydanticUserError as exc_info: assert exc_info.code == 'callable-discriminator-no-tag' try: class DiscriminatedModel(BaseModel): x: Annotated[ Union[Annotated[str, Tag('str')], 'DiscriminatedModel'], Discriminator(model_x_discriminator), ] except PydanticUserError as exc_info: assert exc_info.code == 'callable-discriminator-no-tag' try: class DiscriminatedModel(BaseModel): x: Annotated[ Union[str, Annotated['DiscriminatedModel', Tag('model')]], Discriminator(model_x_discriminator), ] except PydanticUserError as exc_info: assert exc_info.code == 'callable-discriminator-no-tag' @pytest.mark.xfail( reason='Issue not yet fixed, see: https://github.com/pydantic/pydantic/issues/8271. At the moment, JSON schema gen warns with a PydanticJsonSchemaWarning.' 
) def test_presence_of_discriminator_when_generating_type_adaptor_json_schema_definitions() -> None: class ItemType(str, Enum): ITEM1 = 'item1' ITEM2 = 'item2' class CreateItem1(BaseModel): item_type: Annotated[Literal[ItemType.ITEM1], Field(alias='type')] id: int class CreateItem2(BaseModel): item_type: Annotated[Literal[ItemType.ITEM2], Field(alias='type')] id: int class CreateObjectDto(BaseModel): id: int items: List[ Annotated[ Union[ CreateItem1, CreateItem2, ], Field(discriminator='item_type'), ] ] adapter = TypeAdapter( Annotated[CreateObjectDto, FieldInfo(examples=[{'id': 1, 'items': [{'id': 3, 'type': 'ITEM1'}]}])] ) schema_map, definitions = GenerateJsonSchema().generate_definitions([(adapter, 'validation', adapter.core_schema)]) assert definitions == { 'CreateItem1': { 'properties': {'id': {'title': 'Id', 'type': 'integer'}, 'type': {'const': 'item1', 'title': 'Type'}}, 'required': ['type', 'id'], 'title': 'CreateItem1', 'type': 'object', }, 'CreateItem2': { 'properties': {'id': {'title': 'Id', 'type': 'integer'}, 'type': {'const': 'item2', 'title': 'Type'}}, 'required': ['type', 'id'], 'title': 'CreateItem2', 'type': 'object', }, 'CreateObjectDto': { 'properties': { 'id': {'title': 'Id', 'type': 'integer'}, 'items': { 'items': { 'discriminator': { 'mapping': {'item1': '#/$defs/CreateItem1', 'item2': '#/$defs/CreateItem2'}, 'propertyName': 'type', }, 'oneOf': [{'$ref': '#/$defs/CreateItem1'}, {'$ref': '#/$defs/CreateItem2'}], }, 'title': 'Items', 'type': 'array', }, }, 'required': ['id', 'items'], 'title': 'CreateObjectDto', 'type': 'object', }, } def test_nested_discriminator() -> None: """ The exact details of the JSON schema produced are not necessarily important; the test was added in response to a regression that caused the inner union to lose its discriminator. Even if the schema changes, the important thing is that the core schema (and therefore JSON schema) produced has an actual discriminated union in it. 
For more context, see: https://github.com/pydantic/pydantic/issues/8688. """ class Step_A(BaseModel): type: Literal['stepA'] count: int class Step_B(BaseModel): type: Literal['stepB'] value: float class MyModel(BaseModel): type: Literal['mixed'] sub_models: List['SubModel'] steps: Union[Step_A, Step_B] = Field( default=None, discriminator='type', ) class SubModel(MyModel): type: Literal['mixed'] blending: float MyModel.model_rebuild() # insert_assert(MyModel.model_json_schema()) assert MyModel.model_json_schema() == { '$defs': { 'Step_A': { 'properties': { 'count': {'title': 'Count', 'type': 'integer'}, 'type': {'const': 'stepA', 'title': 'Type', 'type': 'string'}, }, 'required': ['type', 'count'], 'title': 'Step_A', 'type': 'object', }, 'Step_B': { 'properties': { 'type': {'const': 'stepB', 'title': 'Type', 'type': 'string'}, 'value': {'title': 'Value', 'type': 'number'}, }, 'required': ['type', 'value'], 'title': 'Step_B', 'type': 'object', }, 'SubModel': { 'properties': { 'blending': {'title': 'Blending', 'type': 'number'}, 'steps': { 'default': None, 'discriminator': { 'mapping': {'stepA': '#/$defs/Step_A', 'stepB': '#/$defs/Step_B'}, 'propertyName': 'type', }, 'oneOf': [{'$ref': '#/$defs/Step_A'}, {'$ref': '#/$defs/Step_B'}], 'title': 'Steps', }, 'sub_models': {'items': {'$ref': '#/$defs/SubModel'}, 'title': 'Sub Models', 'type': 'array'}, 'type': {'const': 'mixed', 'title': 'Type', 'type': 'string'}, }, 'required': ['type', 'sub_models', 'blending'], 'title': 'SubModel', 'type': 'object', }, }, 'properties': { 'steps': { 'default': None, 'discriminator': { 'mapping': {'stepA': '#/$defs/Step_A', 'stepB': '#/$defs/Step_B'}, 'propertyName': 'type', }, 'oneOf': [{'$ref': '#/$defs/Step_A'}, {'$ref': '#/$defs/Step_B'}], 'title': 'Steps', }, 'sub_models': {'items': {'$ref': '#/$defs/SubModel'}, 'title': 'Sub Models', 'type': 'array'}, 'type': {'const': 'mixed', 'title': 'Type', 'type': 'string'}, }, 'required': ['type', 'sub_models'], 'title': 'MyModel', 'type': 
'object', } def test_nested_schema_gen_uses_tagged_union_in_ref() -> None: class NestedState(BaseModel): state_type: Literal['nested'] substate: 'AnyState' # If this type is left out, the model behaves normally again class LoopState(BaseModel): state_type: Literal['loop'] substate: 'AnyState' class LeafState(BaseModel): state_type: Literal['leaf'] AnyState = Annotated[Union[NestedState, LoopState, LeafState], Field(discriminator='state_type')] adapter = TypeAdapter(AnyState) assert adapter.core_schema['schema']['type'] == 'tagged-union' for definition in adapter.core_schema['definitions']: if definition['schema']['model_name'] in ['NestedState', 'LoopState']: assert definition['schema']['fields']['substate']['schema']['type'] == 'tagged-union' def test_recursive_discriminiated_union_with_typed_dict() -> None: class Foo(TypedDict): type: Literal['foo'] x: 'Foobar' class Bar(TypedDict): type: Literal['bar'] Foobar = Annotated[Union[Foo, Bar], Field(discriminator='type')] ta = TypeAdapter(Foobar) # len of errors should be 1 for each case, bc we're using a tagged union with pytest.raises(ValidationError) as e: ta.validate_python({'type': 'wrong'}) assert len(e.value.errors()) == 1 with pytest.raises(ValidationError) as e: ta.validate_python({'type': 'foo', 'x': {'type': 'wrong'}}) assert len(e.value.errors()) == 1 core_schema = ta.core_schema assert core_schema['schema']['type'] == 'tagged-union' for definition in core_schema['definitions']: if 'Foo' in definition['ref']: assert definition['fields']['x']['schema']['type'] == 'tagged-union' def test_recursive_discriminiated_union_with_base_model() -> None: class Foo(BaseModel): type: Literal['foo'] x: 'Foobar' class Bar(BaseModel): type: Literal['bar'] Foobar = Annotated[Union[Foo, Bar], Field(discriminator='type')] ta = TypeAdapter(Foobar) # len of errors should be 1 for each case, bc we're using a tagged union with pytest.raises(ValidationError) as e: ta.validate_python({'type': 'wrong'}) assert len(e.value.errors()) 
== 1 with pytest.raises(ValidationError) as e: ta.validate_python({'type': 'foo', 'x': {'type': 'wrong'}}) assert len(e.value.errors()) == 1 core_schema = ta.core_schema assert core_schema['schema']['type'] == 'tagged-union' for definition in core_schema['definitions']: if 'Foo' in definition['ref']: assert definition['schema']['fields']['x']['schema']['type'] == 'tagged-union' def test_recursive_discriminated_union_with_pydantic_dataclass() -> None: @pydantic_dataclass class Foo: type: Literal['foo'] x: 'Foobar' @pydantic_dataclass class Bar: type: Literal['bar'] Foobar = Annotated[Union[Foo, Bar], Field(discriminator='type')] ta = TypeAdapter(Foobar) # len of errors should be 1 for each case, bc we're using a tagged union with pytest.raises(ValidationError) as e: ta.validate_python({'type': 'wrong'}) assert len(e.value.errors()) == 1 with pytest.raises(ValidationError) as e: ta.validate_python({'type': 'foo', 'x': {'type': 'wrong'}}) assert len(e.value.errors()) == 1 core_schema = ta.core_schema assert core_schema['schema']['type'] == 'tagged-union' for definition in core_schema['definitions']: if 'Foo' in definition['ref']: for field in definition['schema']['fields']: assert field['schema']['type'] == 'tagged-union' if field['name'] == 'x' else True def test_discriminated_union_with_nested_dataclass() -> None: @pydantic_dataclass class Cat: type: Literal['cat'] = 'cat' @pydantic_dataclass class Dog: type: Literal['dog'] = 'dog' @pydantic_dataclass class NestedDataClass: animal: Annotated[Union[Cat, Dog], Discriminator('type')] @pydantic_dataclass class Root: data_class: NestedDataClass ta = TypeAdapter(Root) assert ta.core_schema['schema']['fields'][0]['schema']['schema']['fields'][0]['schema']['type'] == 'tagged-union' def test_discriminated_union_with_nested_typed_dicts() -> None: class Cat(TypedDict): type: Literal['cat'] class Dog(TypedDict): type: Literal['dog'] class NestedTypedDict(TypedDict): animal: Annotated[Union[Cat, Dog], Discriminator('type')] 
class Root(TypedDict): data_class: NestedTypedDict ta = TypeAdapter(Root) assert ta.core_schema['fields']['data_class']['schema']['fields']['animal']['schema']['type'] == 'tagged-union' def test_discriminated_union_with_unsubstituted_type_var() -> None: T = TypeVar('T') class Dog(BaseModel, Generic[T]): type_: Literal['dog'] friends: List['GenericPet'] id: T class Cat(BaseModel, Generic[T]): type_: Literal['cat'] friends: List['GenericPet'] id: T GenericPet = Annotated[Union[Dog[T], Cat[T]], Field(discriminator='type_')] ta = TypeAdapter(Dog[int]) int_dog = { 'type_': 'dog', 'friends': [{'type_': 'dog', 'friends': [], 'id': 2}, {'type_': 'cat', 'friends': [], 'id': 3}], 'id': 1, } assert ta.validate_python(int_dog).id == 1 assert ta.validate_python(int_dog).friends[0].id == 2 assert ta.validate_python(int_dog).friends[1].id == 3 def test_discriminated_union_model_dump_with_nested_class() -> None: class SomeEnum(str, Enum): CAT = 'cat' DOG = 'dog' class Dog(BaseModel): type: Literal[SomeEnum.DOG] = SomeEnum.DOG name: str class Cat(BaseModel): type: Literal[SomeEnum.CAT] = SomeEnum.CAT name: str class Yard(BaseModel): pet: Union[Dog, Cat] = Field(discriminator='type') yard = Yard(pet=Dog(name='Rex')) yard_dict = yard.model_dump(mode='json') assert isinstance(yard_dict['pet']['type'], str) assert not isinstance(yard_dict['pet']['type'], SomeEnum) assert str(yard_dict['pet']['type']) == 'dog' @pytest.mark.xfail(reason='Waiting for union serialization fixes via https://github.com/pydantic/pydantic/issues/9688.') def test_discriminated_union_serializer() -> None: """Reported via https://github.com/pydantic/pydantic/issues/9590.""" @dataclass class FooId: _id: int @dataclass class BarId: _id: int FooOrBarId = Annotated[ Annotated[FooId, PlainSerializer(lambda v: {'tag': 'foo', '_id': v._id}), Tag('foo')] | Annotated[BarId, PlainSerializer(lambda v: {'tag': 'bar', '_id': v._id}), Tag('bar')], Discriminator(lambda v: v['tag']), ] adapter = TypeAdapter(FooOrBarId) assert 
adapter.dump_python(FooId(1)) == {'tag': 'foo', '_id': 1} assert adapter.dump_python(BarId(2)) == {'tag': 'bar', '_id': 2} pydantic-2.10.6/tests/test_docs.py000066400000000000000000000234341474456633400171440ustar00rootroot00000000000000from __future__ import annotations as _annotations import os import platform import re import subprocess import sys from datetime import datetime from pathlib import Path from tempfile import NamedTemporaryFile from typing import Any import pytest from pydantic_core import core_schema from pytest_examples import CodeExample, EvalExample, find_examples from pydantic.errors import PydanticErrorCodes INDEX_MAIN = None DOCS_ROOT = Path(__file__).parent.parent / 'docs' SOURCES_ROOT = Path(__file__).parent.parent / 'pydantic' def skip_docs_tests(): if sys.platform not in {'linux', 'darwin'}: return 'not in linux or macos' if platform.python_implementation() != 'CPython': return 'not cpython' try: import devtools # noqa: F401 except ImportError: return 'devtools not installed' try: import sqlalchemy # noqa: F401 except ImportError: return 'sqlalchemy not installed' try: import ansi2html # noqa: F401 except ImportError: return 'ansi2html not installed' class GroupModuleGlobals: def __init__(self) -> None: self.name = None self.module_dict: dict[str, str] = {} def get(self, name: str | None): if name is not None and name == self.name: return self.module_dict def set(self, name: str | None, module_dict: dict[str, str]): self.name = name if self.name is None: self.module_dict = None else: self.module_dict = module_dict group_globals = GroupModuleGlobals() class MockedDatetime(datetime): @classmethod def now(cls, *args, tz=None, **kwargs): return datetime(2032, 1, 2, 3, 4, 5, 6, tzinfo=tz) skip_reason = skip_docs_tests() LINE_LENGTH = 80 def print_callback(print_statement: str) -> str: return re.sub(r'(https://errors.pydantic.dev)/.+?/', r'\1/2/', print_statement) def run_example(example: CodeExample, eval_example: EvalExample, mocker: Any) -> 
None: # noqa C901 eval_example.print_callback = print_callback prefix_settings = example.prefix_settings() test_settings = prefix_settings.get('test', '') lint_settings = prefix_settings.get('lint', '') if test_settings.startswith('skip') and lint_settings.startswith('skip'): pytest.skip('both running code and lint skipped') requires_settings = prefix_settings.get('requires') if requires_settings: major, minor = map(int, requires_settings.split('.')) if sys.version_info < (major, minor): pytest.skip(f'requires python {requires_settings}') group_name = prefix_settings.get('group') eval_example.set_config(ruff_ignore=['D', 'T', 'B', 'C4', 'E721', 'Q001'], line_length=LINE_LENGTH) if '# ignore-above' in example.source: eval_example.set_config(ruff_ignore=eval_example.config.ruff_ignore + ['E402'], line_length=LINE_LENGTH) if group_name: eval_example.set_config(ruff_ignore=eval_example.config.ruff_ignore + ['F821'], line_length=LINE_LENGTH) if not lint_settings.startswith('skip'): if eval_example.update_examples: eval_example.format(example) else: if example.in_py_file(): # Ignore isort as double newlines will cause it to fail, but we remove them in py files eval_example.set_config(ruff_ignore=eval_example.config.ruff_ignore + ['I001'], line_length=LINE_LENGTH) eval_example.lint(example) if test_settings.startswith('skip'): pytest.skip(test_settings[4:].lstrip(' -') or 'running code skipped') group_name = prefix_settings.get('group') d = group_globals.get(group_name) mocker.patch('datetime.datetime', MockedDatetime) mocker.patch('random.randint', return_value=3) xfail = None if test_settings.startswith('xfail'): xfail = test_settings[5:].lstrip(' -') rewrite_assertions = prefix_settings.get('rewrite_assert', 'true') == 'true' try: if test_settings == 'no-print-intercept': d2 = eval_example.run(example, module_globals=d, rewrite_assertions=rewrite_assertions) elif eval_example.update_examples: d2 = eval_example.run_print_update(example, module_globals=d, 
rewrite_assertions=rewrite_assertions) else: d2 = eval_example.run_print_check(example, module_globals=d, rewrite_assertions=rewrite_assertions) except BaseException as e: # run_print_check raises a BaseException if xfail: pytest.xfail(f'{xfail}, {type(e).__name__}: {e}') raise else: if xfail: pytest.fail('expected xfail') group_globals.set(group_name, d2) @pytest.mark.filterwarnings('ignore:(parse_obj_as|schema_json_of|schema_of) is deprecated.*:DeprecationWarning') @pytest.mark.skipif(bool(skip_reason), reason=skip_reason or 'not skipping') @pytest.mark.parametrize('example', find_examples(str(SOURCES_ROOT), skip=sys.platform == 'win32'), ids=str) def test_docstrings_examples(example: CodeExample, eval_example: EvalExample, tmp_path: Path, mocker): if str(example.path).startswith(str(SOURCES_ROOT / 'v1')): pytest.skip('skip v1 examples') run_example(example, eval_example, mocker) @pytest.fixture(scope='module', autouse=True) def set_cwd(): # `test_docs_examples` needs to be run from this folder or relative paths will be wrong and some tests fail execution_path = str(DOCS_ROOT.parent) cwd = os.getcwd() os.chdir(execution_path) try: yield finally: os.chdir(cwd) @pytest.mark.filterwarnings('ignore:(parse_obj_as|schema_json_of|schema_of) is deprecated.*:DeprecationWarning') @pytest.mark.filterwarnings('ignore::pydantic.warnings.PydanticExperimentalWarning') @pytest.mark.skipif(bool(skip_reason), reason=skip_reason or 'not skipping') @pytest.mark.parametrize('example', find_examples(str(DOCS_ROOT), skip=sys.platform == 'win32'), ids=str) def test_docs_examples(example: CodeExample, eval_example: EvalExample, tmp_path: Path, mocker): global INDEX_MAIN if example.path.name == 'index.md': if INDEX_MAIN is None: INDEX_MAIN = example.source else: (tmp_path / 'index_main.py').write_text(INDEX_MAIN) sys.path.append(str(tmp_path)) if example.path.name == 'devtools.md': pytest.skip('tested below') run_example(example, eval_example, mocker) 
@pytest.mark.skipif(bool(skip_reason), reason=skip_reason or 'not skipping') @pytest.mark.skipif(sys.version_info >= (3, 13), reason='python-devtools does not yet support python 3.13') @pytest.mark.parametrize( 'example', find_examples(str(DOCS_ROOT / 'integrations/devtools.md'), skip=sys.platform == 'win32'), ids=str ) def test_docs_devtools_example(example: CodeExample, eval_example: EvalExample, tmp_path: Path): from ansi2html import Ansi2HTMLConverter eval_example.set_config(ruff_ignore=['D', 'T', 'B', 'C4'], line_length=LINE_LENGTH) if eval_example.update_examples: eval_example.format(example) else: eval_example.lint(example) with NamedTemporaryFile(mode='w', suffix='.py') as f: f.write(example.source) f.flush() os.environ['PY_DEVTOOLS_HIGHLIGHT'] = 'true' p = subprocess.run((sys.executable, f.name), stdout=subprocess.PIPE, check=True, encoding='utf8') conv = Ansi2HTMLConverter() # replace ugly file path with "devtools_example.py" output = re.sub(r'/.+?\.py', 'devtools_example.py', p.stdout) output_html = conv.convert(output, full=False) output_html = ( '\n' f'{output_html}' ) output_file = DOCS_ROOT / 'plugins/devtools_output.html' if eval_example.update_examples: output_file.write_text(output_html) elif not output_file.exists(): pytest.fail(f'output file {output_file} does not exist') else: assert output_html == output_file.read_text() def test_error_codes(): error_text = (DOCS_ROOT / 'errors/usage_errors.md').read_text() code_error_codes = PydanticErrorCodes.__args__ documented_error_codes = tuple(re.findall(r'^## .+ \{#(.+?)}$', error_text, flags=re.MULTILINE)) assert code_error_codes == documented_error_codes, 'Error codes in code and docs do not match' def test_validation_error_codes(): error_text = (DOCS_ROOT / 'errors/validation_errors.md').read_text() expected_validation_error_codes = set(core_schema.ErrorType.__args__) # Remove codes that are not currently accessible from pydantic: expected_validation_error_codes.remove('timezone_offset') # not 
currently exposed for configuration in pydantic test_failures = [] documented_validation_error_codes = [] error_code_section = None printed_error_code = None for line in error_text.splitlines(): section_match = re.fullmatch(r'## `(.+)`', line) if section_match: if error_code_section is not None and printed_error_code != error_code_section: test_failures.append(f'Error code {error_code_section!r} is not printed in its example') error_code_section = section_match.group(1) if error_code_section not in expected_validation_error_codes: test_failures.append(f'Documented error code {error_code_section!r} is not a member of ErrorType') documented_validation_error_codes.append(error_code_section) printed_error_code = None continue printed_match = re.search("#> '(.+)'", line) if printed_match: printed_error_code = printed_match.group(1) assert test_failures == [] code_validation_error_codes = sorted(expected_validation_error_codes) assert code_validation_error_codes == documented_validation_error_codes, 'Error codes in code and docs do not match' pydantic-2.10.6/tests/test_docs_extraction.py000066400000000000000000000202171474456633400214000ustar00rootroot00000000000000import textwrap from typing import Generic, TypeVar from typing_extensions import Annotated, TypedDict from pydantic import BaseModel, ConfigDict, Field, TypeAdapter, create_model from pydantic.dataclasses import dataclass as pydantic_dataclass T = TypeVar('T') def dec_noop(obj): return obj def test_model_no_docs_extraction(): class MyModel(BaseModel): a: int = 1 """A docs""" b: str = '1' """B docs""" assert MyModel.model_fields['a'].description is None assert MyModel.model_fields['b'].description is None def test_model_docs_extraction(): # Using a couple dummy decorators to make sure the frame is pointing at # the `class` line: @dec_noop @dec_noop class MyModel(BaseModel): a: int """A docs""" b: int = 1 """B docs""" c: int = 1 # This isn't used as a description. 
d: int def dummy_method(self) -> None: """Docs for dummy that won't be used for d""" e: Annotated[int, Field(description='Real description')] """Won't be used""" f: int """F docs""" """Useless docs""" g: int """G docs""" model_config = ConfigDict( use_attribute_docstrings=True, ) assert MyModel.model_fields['a'].description == 'A docs' assert MyModel.model_fields['b'].description == 'B docs' assert MyModel.model_fields['c'].description is None assert MyModel.model_fields['d'].description is None assert MyModel.model_fields['e'].description == 'Real description' assert MyModel.model_fields['g'].description == 'G docs' def test_model_docs_duplicate_class(): """Ensure source parsing is working correctly when using frames.""" @dec_noop class MyModel(BaseModel): a: int """A docs""" model_config = ConfigDict( use_attribute_docstrings=True, ) @dec_noop class MyModel(BaseModel): b: int """B docs""" model_config = ConfigDict( use_attribute_docstrings=True, ) assert MyModel.model_fields['b'].description == 'B docs' # With https://github.com/python/cpython/pull/106815/ introduced, # inspect will fallback to the last found class in the source file. 
# The following is to ensure using frames will still get the correct one if True: class MyModel(BaseModel): a: int """A docs""" model_config = ConfigDict( use_attribute_docstrings=True, ) else: class MyModel(BaseModel): b: int """B docs""" model_config = ConfigDict( use_attribute_docstrings=True, ) assert MyModel.model_fields['a'].description == 'A docs' def test_model_docs_dedented_string(): # fmt: off class MyModel(BaseModel): def bar(self): """ An inconveniently dedented string """ a: int """A docs""" model_config = ConfigDict( use_attribute_docstrings=True, ) # fmt: on assert MyModel.model_fields['a'].description == 'A docs' def test_model_docs_inheritance(): class MyModel(BaseModel): a: int """A docs""" b: int """B docs""" model_config = ConfigDict( use_attribute_docstrings=True, ) FirstModel = MyModel class MyModel(FirstModel): a: int """A overridden docs""" assert FirstModel.model_fields['a'].description == 'A docs' assert MyModel.model_fields['a'].description == 'A overridden docs' assert MyModel.model_fields['b'].description == 'B docs' def test_model_different_name(): # As we extract docstrings from cls in `ModelMetaclass.__new__`, # we are not affected by `__name__` being altered in any way. 
class MyModel(BaseModel): a: int """A docs""" model_config = ConfigDict( use_attribute_docstrings=True, ) MyModel.__name__ = 'OtherModel' print(MyModel.__name__) assert MyModel.model_fields['a'].description == 'A docs' def test_model_generic(): class MyModel(BaseModel, Generic[T]): a: T """A docs""" model_config = ConfigDict( use_attribute_docstrings=True, ) assert MyModel.model_fields['a'].description == 'A docs' class MyParameterizedModel(MyModel[int]): a: int """A parameterized docs""" assert MyParameterizedModel.model_fields['a'].description == 'A parameterized docs' assert MyModel[int].model_fields['a'].description == 'A docs' def test_dataclass_no_docs_extraction(): @pydantic_dataclass class MyModel: a: int = 1 """A docs""" b: str = '1' """B docs""" assert MyModel.__pydantic_fields__['a'].description is None assert MyModel.__pydantic_fields__['b'].description is None def test_dataclass_docs_extraction(): @pydantic_dataclass( config=ConfigDict(use_attribute_docstrings=True), ) @dec_noop class MyModel: a: int """A docs""" b: int = 1 """B docs""" c: int = 1 # This isn't used as a description. 
d: int = 1 def dummy_method(self) -> None: """Docs for dummy_method that won't be used for d""" e: int = Field(1, description='Real description') """Won't be used""" f: int = 1 """F docs""" """Useless docs""" g: int = 1 """G docs""" h = 1 """H docs""" i: Annotated[int, Field(description='Real description')] = 1 """Won't be used""" assert MyModel.__pydantic_fields__['a'].description == 'A docs' assert MyModel.__pydantic_fields__['b'].description == 'B docs' assert MyModel.__pydantic_fields__['c'].description is None assert MyModel.__pydantic_fields__['d'].description is None assert MyModel.__pydantic_fields__['e'].description == 'Real description' assert MyModel.__pydantic_fields__['g'].description == 'G docs' assert MyModel.__pydantic_fields__['i'].description == 'Real description' def test_typeddict(): class MyModel(TypedDict): a: int """A docs""" ta = TypeAdapter(MyModel) assert ta.json_schema() == { 'properties': {'a': {'title': 'A', 'type': 'integer'}}, 'required': ['a'], 'title': 'MyModel', 'type': 'object', } class MyModel(TypedDict): a: int """A docs""" __pydantic_config__ = ConfigDict(use_attribute_docstrings=True) ta = TypeAdapter(MyModel) assert ta.json_schema() == { 'properties': {'a': {'title': 'A', 'type': 'integer', 'description': 'A docs'}}, 'required': ['a'], 'title': 'MyModel', 'type': 'object', } def test_typeddict_as_field(): class ModelTDAsField(TypedDict): a: int """A docs""" __pydantic_config__ = ConfigDict(use_attribute_docstrings=True) class MyModel(BaseModel): td: ModelTDAsField a_property = MyModel.model_json_schema()['$defs']['ModelTDAsField']['properties']['a'] assert a_property['description'] == 'A docs' def test_create_model_test(): # Duplicate class creation to ensure create_model # doesn't fallback to using inspect, which could # in turn use the wrong class: class MyModel(BaseModel): foo: str = '123' """Shouldn't be used""" model_config = ConfigDict( use_attribute_docstrings=True, ) assert MyModel.model_fields['foo'].description == 
"Shouldn't be used"

    MyModel = create_model(
        'MyModel',
        foo=(int, 123),
        __config__=ConfigDict(use_attribute_docstrings=True),
    )

    assert MyModel.model_fields['foo'].description is None


def test_exec_cant_be_parsed():
    source = textwrap.dedent(
        '''
        class MyModel(BaseModel):
            a: int
            """A docs"""

            model_config = ConfigDict(use_attribute_docstrings=True)
        '''
    )

    locals_dict = {}
    exec(source, globals(), locals_dict)
    assert locals_dict['MyModel'].model_fields['a'].description is None


# ===== pydantic-2.10.6/tests/test_dunder_all.py =====

def test_explicit_reexports() -> None:
    from pydantic import __all__ as root_all
    from pydantic.deprecated.tools import __all__ as tools
    from pydantic.main import __all__ as main
    from pydantic.networks import __all__ as networks
    from pydantic.types import __all__ as types

    for name, export_all in [('main', main), ('networks', networks), ('deprecated.tools', tools), ('types', types)]:
        for export in export_all:
            assert export in root_all, f'{export} is in `pydantic.{name}.__all__` but missing in `pydantic.__all__`'


def test_explicit_reexports_exist() -> None:
    import pydantic

    for name in pydantic.__all__:
        assert hasattr(pydantic, name), f'{name} is in `pydantic.__all__` but `from pydantic import {name}` fails'


# ===== pydantic-2.10.6/tests/test_edge_cases.py =====

import functools
import importlib.util
import re
import sys
from abc import ABC, abstractmethod
from collections.abc import Hashable
from decimal import Decimal
from enum import Enum, auto
from typing import (
    Any,
    Callable,
    Dict,
    ForwardRef,
    FrozenSet,
    Generic,
    List,
    Optional,
    Sequence,
    Set,
    Tuple,
    Type,
    TypeVar,
    Union,
)

import pytest
from dirty_equals import HasRepr, IsStr
from pydantic_core import ErrorDetails, InitErrorDetails, PydanticSerializationError, PydanticUndefined, core_schema
from typing_extensions import Annotated, Literal, TypeAliasType, TypedDict,
get_args from pydantic import ( BaseModel, ConfigDict, GetCoreSchemaHandler, PrivateAttr, PydanticDeprecatedSince20, PydanticSchemaGenerationError, RootModel, TypeAdapter, ValidationError, constr, errors, field_validator, model_validator, root_validator, validator, ) from pydantic._internal._model_construction import ModelMetaclass from pydantic.fields import Field, computed_field from pydantic.functional_serializers import ( field_serializer, model_serializer, ) def test_str_bytes(): class Model(BaseModel): v: Union[str, bytes] m = Model(v='s') assert m.v == 's' assert repr(m.model_fields['v']) == 'FieldInfo(annotation=Union[str, bytes], required=True)' m = Model(v=b'b') assert m.v == b'b' with pytest.raises(ValidationError) as exc_info: Model(v=None) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ {'type': 'string_type', 'loc': ('v', 'str'), 'msg': 'Input should be a valid string', 'input': None}, {'type': 'bytes_type', 'loc': ('v', 'bytes'), 'msg': 'Input should be a valid bytes', 'input': None}, ] def test_str_bytes_none(): class Model(BaseModel): v: Union[None, str, bytes] = ... m = Model(v='s') assert m.v == 's' m = Model(v=b'b') assert m.v == b'b' m = Model(v=None) assert m.v is None def test_union_int_str(): class Model(BaseModel): v: Union[int, str] = ... 
m = Model(v=123) assert m.v == 123 m = Model(v='123') assert m.v == '123' m = Model(v=b'foobar') assert m.v == 'foobar' m = Model(v=12.0) assert m.v == 12 with pytest.raises(ValidationError) as exc_info: Model(v=None) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ {'type': 'int_type', 'loc': ('v', 'int'), 'msg': 'Input should be a valid integer', 'input': None}, { 'type': 'string_type', 'loc': ('v', 'str'), 'msg': 'Input should be a valid string', 'input': None, }, ] def test_union_int_any(): class Model(BaseModel): v: Union[int, Any] m = Model(v=123) assert m.v == 123 m = Model(v='123') assert m.v == '123' m = Model(v='foobar') assert m.v == 'foobar' m = Model(v=None) assert m.v is None def test_typed_list(): class Model(BaseModel): v: List[int] = ... m = Model(v=[1, 2, '3']) assert m.v == [1, 2, 3] with pytest.raises(ValidationError) as exc_info: Model(v=[1, 'x', 'y']) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('v', 1), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'x', }, { 'type': 'int_parsing', 'loc': ('v', 2), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'y', }, ] with pytest.raises(ValidationError) as exc_info: Model(v=1) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ {'type': 'list_type', 'loc': ('v',), 'msg': 'Input should be a valid list', 'input': 1} ] def test_typed_set(): class Model(BaseModel): v: Set[int] = ... 
assert Model(v={1, 2, '3'}).v == {1, 2, 3} assert Model(v=[1, 2, '3']).v == {1, 2, 3} with pytest.raises(ValidationError) as exc_info: Model(v=[1, 'x']) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('v', 1), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'x', } ] def test_dict_dict(): class Model(BaseModel): v: Dict[str, int] = ... assert Model(v={'foo': 1}).model_dump() == {'v': {'foo': 1}} def test_none_list(): class Model(BaseModel): v: List[None] = [None] assert Model.model_json_schema() == { 'title': 'Model', 'type': 'object', 'properties': {'v': {'title': 'V', 'default': [None], 'type': 'array', 'items': {'type': 'null'}}}, } @pytest.mark.parametrize( 'value,result', [ ({'a': 2, 'b': 4}, {'a': 2, 'b': 4}), ({b'a': '2', 'b': 4}, {'a': 2, 'b': 4}), # ([('a', 2), ('b', 4)], {'a': 2, 'b': 4}), ], ) def test_typed_dict(value, result): class Model(BaseModel): v: Dict[str, int] = ... assert Model(v=value).v == result @pytest.mark.parametrize( 'value,errors', [ (1, [{'type': 'dict_type', 'loc': ('v',), 'msg': 'Input should be a valid dictionary', 'input': 1}]), ( {'a': 'b'}, [ { 'type': 'int_parsing', 'loc': ('v', 'a'), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'b', } ], ), ( [1, 2, 3], [{'type': 'dict_type', 'loc': ('v',), 'msg': 'Input should be a valid dictionary', 'input': [1, 2, 3]}], ), ], ) def test_typed_dict_error(value, errors): class Model(BaseModel): v: Dict[str, int] = ... with pytest.raises(ValidationError) as exc_info: Model(v=value) assert exc_info.value.errors(include_url=False) == errors def test_dict_key_error(): class Model(BaseModel): v: Dict[int, int] = ... 
assert Model(v={1: 2, '3': '4'}).v == {1: 2, 3: 4} with pytest.raises(ValidationError) as exc_info: Model(v={'foo': 2, '3': '4'}) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('v', 'foo', '[key]'), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'foo', } ] def test_tuple(): class Model(BaseModel): v: Tuple[int, float, bool] m = Model(v=['1.0', '2.2', 'true']) assert m.v == (1, 2.2, True) def test_tuple_more(): class Model(BaseModel): empty_tuple: Tuple[()] simple_tuple: tuple = None tuple_of_different_types: Tuple[int, float, str, bool] = None tuple_of_single_tuples: Tuple[Tuple[int], ...] = () m = Model( empty_tuple=[], simple_tuple=[1, 2, 3, 4], tuple_of_different_types=[4, 3.1, 'str', 1], tuple_of_single_tuples=(('1',), (2,)), ) assert m.model_dump() == { 'empty_tuple': (), 'simple_tuple': (1, 2, 3, 4), 'tuple_of_different_types': (4, 3.1, 'str', True), 'tuple_of_single_tuples': ((1,), (2,)), } @pytest.mark.parametrize( 'dict_cls,frozenset_cls,list_cls,set_cls,tuple_cls,type_cls', [ (Dict, FrozenSet, List, Set, Tuple, Type), (dict, frozenset, list, set, tuple, type), ], ) def test_pep585_generic_types(dict_cls, frozenset_cls, list_cls, set_cls, tuple_cls, type_cls): class Type1: pass class Type2: pass class Model(BaseModel, arbitrary_types_allowed=True): a: dict_cls a1: 'dict_cls[str, int]' b: frozenset_cls b1: 'frozenset_cls[int]' c: list_cls c1: 'list_cls[int]' d: set_cls d1: 'set_cls[int]' e: tuple_cls e1: 'tuple_cls[int]' e2: 'tuple_cls[int, ...]' e3: 'tuple_cls[()]' f: type_cls f1: 'type_cls[Type1]' default_model_kwargs = dict( a={}, a1={'a': '1'}, b=[], b1=('1',), c=[], c1=('1',), d=[], d1=['1'], e=[], e1=['1'], e2=['1', '2'], e3=[], f=Type1, f1=Type1, ) m = Model(**default_model_kwargs) assert m.a == {} assert m.a1 == {'a': 1} assert m.b == frozenset() assert m.b1 == frozenset({1}) assert m.c == [] assert m.c1 == [1] 
assert m.d == set() assert m.d1 == {1} assert m.e == () assert m.e1 == (1,) assert m.e2 == (1, 2) assert m.e3 == () assert m.f == Type1 assert m.f1 == Type1 with pytest.raises(ValidationError) as exc_info: Model(**{**default_model_kwargs, 'e3': (1,)}) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'too_long', 'loc': ('e3',), 'msg': 'Tuple should have at most 0 items after validation, not 1', 'input': (1,), 'ctx': {'field_type': 'Tuple', 'max_length': 0, 'actual_length': 1}, } ] Model(**{**default_model_kwargs, 'f': Type2}) with pytest.raises(ValidationError) as exc_info: Model(**{**default_model_kwargs, 'f1': Type2}) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'is_subclass_of', 'loc': ('f1',), 'msg': 'Input should be a subclass of test_pep585_generic_types..Type1', 'input': HasRepr(IsStr(regex=r".+\.Type2'>")), 'ctx': {'class': 'test_pep585_generic_types..Type1'}, } ] def test_tuple_length_error(): class Model(BaseModel): v: Tuple[int, float, bool] w: Tuple[()] with pytest.raises(ValidationError) as exc_info: Model(v=[1, 2], w=[1]) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ {'type': 'missing', 'loc': ('v', 2), 'msg': 'Field required', 'input': [1, 2]}, { 'type': 'too_long', 'loc': ('w',), 'msg': 'Tuple should have at most 0 items after validation, not 1', 'input': [1], 'ctx': {'field_type': 'Tuple', 'max_length': 0, 'actual_length': 1}, }, ] def test_tuple_invalid(): class Model(BaseModel): v: Tuple[int, float, bool] with pytest.raises(ValidationError) as exc_info: Model(v='xxx') assert exc_info.value.errors(include_url=False) == [ {'type': 'tuple_type', 'loc': ('v',), 'msg': 'Input should be a valid tuple', 'input': 'xxx'} ] def test_tuple_value_error(): class Model(BaseModel): v: Tuple[int, float, Decimal] with pytest.raises(ValidationError) as 
exc_info: Model(v=['x', 'y', 'x']) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('v', 0), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'x', }, { 'type': 'float_parsing', 'loc': ('v', 1), 'msg': 'Input should be a valid number, unable to parse string as a number', 'input': 'y', }, { 'type': 'decimal_parsing', 'loc': ('v', 2), 'msg': 'Input should be a valid decimal', 'input': 'x', }, ] def test_recursive_list(): class SubModel(BaseModel): name: str = ... count: int = None class Model(BaseModel): v: List[SubModel] = [] m = Model(v=[]) assert m.v == [] m = Model(v=[{'name': 'testing', 'count': 4}]) assert repr(m) == "Model(v=[SubModel(name='testing', count=4)])" assert m.v[0].name == 'testing' assert m.v[0].count == 4 assert m.model_dump() == {'v': [{'count': 4, 'name': 'testing'}]} with pytest.raises(ValidationError) as exc_info: Model(v=['x']) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'model_type', 'loc': ('v', 0), 'msg': 'Input should be a valid dictionary or instance of SubModel', 'input': 'x', 'ctx': {'class_name': 'SubModel'}, } ] def test_recursive_list_error(): class SubModel(BaseModel): name: str = ... 
count: int = None class Model(BaseModel): v: List[SubModel] = [] with pytest.raises(ValidationError) as exc_info: Model(v=[{}]) assert exc_info.value.errors(include_url=False) == [ {'input': {}, 'loc': ('v', 0, 'name'), 'msg': 'Field required', 'type': 'missing'} ] def test_list_unions(): class Model(BaseModel): v: List[Union[int, str]] assert Model(v=[123, '456', 'foobar']).v == [123, '456', 'foobar'] with pytest.raises(ValidationError) as exc_info: Model(v=[1, 2, None]) # the reason that we need to do an unordered list comparison here is that previous tests use Union[str, int] # and Python's cache makes it such that the above Model has `v` associated with a List[Union[str, int]] instead # of the expected List[Union[int, str]] # for more info, see https://github.com/python/cpython/issues/103749 and # https://github.com/pydantic/pydantic/pull/10244#issuecomment-2312796647 errors = exc_info.value.errors(include_url=False) expected_errors = [ {'input': None, 'loc': ('v', 2, 'int'), 'msg': 'Input should be a valid integer', 'type': 'int_type'}, {'input': None, 'loc': ('v', 2, 'str'), 'msg': 'Input should be a valid string', 'type': 'string_type'}, ] assert sorted(errors, key=str) == sorted(expected_errors, key=str) def test_recursive_lists(): class Model(BaseModel): v: List[List[Union[int, float]]] = ... assert Model(v=[[1, 2], [3, '4', '4.1']]).v == [[1, 2], [3, 4, 4.1]] assert Model.model_fields['v'].annotation == List[List[Union[int, float]]] assert Model.model_fields['v'].is_required() class StrEnum(str, Enum): a = 'a10' b = 'b10' def test_str_enum(): class Model(BaseModel): v: StrEnum = ... assert Model(v='a10').v is StrEnum.a with pytest.raises(ValidationError): Model(v='different') def test_any_dict(): class Model(BaseModel): v: Dict[int, Any] = ... 
assert Model(v={1: 'foobar'}).model_dump() == {'v': {1: 'foobar'}} assert Model(v={123: 456}).model_dump() == {'v': {123: 456}} assert Model(v={2: [1, 2, 3]}).model_dump() == {'v': {2: [1, 2, 3]}} def test_success_values_include(): class Model(BaseModel): a: int = 1 b: int = 2 c: int = 3 m = Model() assert m.model_dump() == {'a': 1, 'b': 2, 'c': 3} assert m.model_dump(include={'a'}) == {'a': 1} assert m.model_dump(exclude={'a'}) == {'b': 2, 'c': 3} assert m.model_dump(include={'a', 'b'}, exclude={'a'}) == {'b': 2} def test_include_exclude_unset(): class Model(BaseModel): a: int b: int c: int = 3 d: int = 4 e: int = 5 f: int = 6 m = Model(a=1, b=2, e=5, f=7) assert m.model_dump() == {'a': 1, 'b': 2, 'c': 3, 'd': 4, 'e': 5, 'f': 7} assert m.model_fields_set == {'a', 'b', 'e', 'f'} assert m.model_dump(exclude_unset=True) == {'a': 1, 'b': 2, 'e': 5, 'f': 7} assert m.model_dump(include={'a'}, exclude_unset=True) == {'a': 1} assert m.model_dump(include={'c'}, exclude_unset=True) == {} assert m.model_dump(exclude={'a'}, exclude_unset=True) == {'b': 2, 'e': 5, 'f': 7} assert m.model_dump(exclude={'c'}, exclude_unset=True) == {'a': 1, 'b': 2, 'e': 5, 'f': 7} assert m.model_dump(include={'a', 'b', 'c'}, exclude={'b'}, exclude_unset=True) == {'a': 1} assert m.model_dump(include={'a', 'b', 'c'}, exclude={'a', 'c'}, exclude_unset=True) == {'b': 2} def test_include_exclude_defaults(): class Model(BaseModel): a: int b: int c: int = 3 d: int = 4 e: int = 5 f: int = 6 m = Model(a=1, b=2, e=5, f=7) assert m.model_dump() == {'a': 1, 'b': 2, 'c': 3, 'd': 4, 'e': 5, 'f': 7} assert m.model_fields_set == {'a', 'b', 'e', 'f'} assert m.model_dump(exclude_defaults=True) == {'a': 1, 'b': 2, 'f': 7} assert m.model_dump(include={'a'}, exclude_defaults=True) == {'a': 1} assert m.model_dump(include={'c'}, exclude_defaults=True) == {} assert m.model_dump(exclude={'a'}, exclude_defaults=True) == {'b': 2, 'f': 7} assert m.model_dump(exclude={'c'}, exclude_defaults=True) == {'a': 1, 'b': 2, 'f': 7} 
assert m.model_dump(include={'a', 'b', 'c'}, exclude={'b'}, exclude_defaults=True) == {'a': 1} assert m.model_dump(include={'a', 'b', 'c'}, exclude={'a', 'c'}, exclude_defaults=True) == {'b': 2} assert m.model_dump(include={'a': 1}.keys()) == {'a': 1} assert m.model_dump(exclude={'a': 1}.keys()) == {'b': 2, 'c': 3, 'd': 4, 'e': 5, 'f': 7} assert m.model_dump(include={'a': 1}.keys(), exclude_unset=True) == {'a': 1} assert m.model_dump(exclude={'a': 1}.keys(), exclude_unset=True) == {'b': 2, 'e': 5, 'f': 7} assert m.model_dump(include=['a']) == {'a': 1} assert m.model_dump(exclude=['a']) == {'b': 2, 'c': 3, 'd': 4, 'e': 5, 'f': 7} assert m.model_dump(include=['a'], exclude_unset=True) == {'a': 1} assert m.model_dump(exclude=['a'], exclude_unset=True) == {'b': 2, 'e': 5, 'f': 7} def test_advanced_exclude(): class SubSubModel(BaseModel): a: str b: str class SubModel(BaseModel): c: str d: List[SubSubModel] class Model(BaseModel): e: str f: SubModel m = Model(e='e', f=SubModel(c='foo', d=[SubSubModel(a='a', b='b'), SubSubModel(a='c', b='e')])) assert m.model_dump(exclude={'f': {'c': ..., 'd': {-1: {'a'}}}}) == { 'e': 'e', 'f': {'d': [{'a': 'a', 'b': 'b'}, {'b': 'e'}]}, } assert m.model_dump(exclude={'e': ..., 'f': {'d'}}) == {'f': {'c': 'foo'}} def test_advanced_exclude_by_alias(): class SubSubModel(BaseModel): a: str aliased_b: str = Field(alias='b_alias') class SubModel(BaseModel): aliased_c: str = Field(alias='c_alias') aliased_d: List[SubSubModel] = Field(alias='d_alias') class Model(BaseModel): aliased_e: str = Field(alias='e_alias') aliased_f: SubModel = Field(alias='f_alias') m = Model( e_alias='e', f_alias=SubModel(c_alias='foo', d_alias=[SubSubModel(a='a', b_alias='b'), SubSubModel(a='c', b_alias='e')]), ) excludes = {'aliased_f': {'aliased_c': ..., 'aliased_d': {-1: {'a'}}}} assert m.model_dump(exclude=excludes, by_alias=True) == { 'e_alias': 'e', 'f_alias': {'d_alias': [{'a': 'a', 'b_alias': 'b'}, {'b_alias': 'e'}]}, } excludes = {'aliased_e': ..., 
'aliased_f': {'aliased_d'}} assert m.model_dump(exclude=excludes, by_alias=True) == {'f_alias': {'c_alias': 'foo'}} def test_advanced_value_include(): class SubSubModel(BaseModel): a: str b: str class SubModel(BaseModel): c: str d: List[SubSubModel] class Model(BaseModel): e: str f: SubModel m = Model(e='e', f=SubModel(c='foo', d=[SubSubModel(a='a', b='b'), SubSubModel(a='c', b='e')])) assert m.model_dump(include={'f'}) == {'f': {'c': 'foo', 'd': [{'a': 'a', 'b': 'b'}, {'a': 'c', 'b': 'e'}]}} assert m.model_dump(include={'e'}) == {'e': 'e'} assert m.model_dump(include={'f': {'d': {0: ..., -1: {'b'}}}}) == {'f': {'d': [{'a': 'a', 'b': 'b'}, {'b': 'e'}]}} def test_advanced_value_exclude_include(): class SubSubModel(BaseModel): a: str b: str class SubModel(BaseModel): c: str d: List[SubSubModel] class Model(BaseModel): e: str f: SubModel m = Model(e='e', f=SubModel(c='foo', d=[SubSubModel(a='a', b='b'), SubSubModel(a='c', b='e')])) assert m.model_dump(exclude={'f': {'c': ..., 'd': {-1: {'a'}}}}, include={'f'}) == { 'f': {'d': [{'a': 'a', 'b': 'b'}, {'b': 'e'}]} } assert m.model_dump(exclude={'e': ..., 'f': {'d'}}, include={'e', 'f'}) == {'f': {'c': 'foo'}} assert m.model_dump(exclude={'f': {'d': {-1: {'a'}}}}, include={'f': {'d'}}) == { 'f': {'d': [{'a': 'a', 'b': 'b'}, {'b': 'e'}]} } @pytest.mark.parametrize( 'exclude,expected', [ pytest.param( {'subs': {'__all__': {'subsubs': {'__all__': {'i'}}}}}, {'subs': [{'k': 1, 'subsubs': [{'j': 1}, {'j': 2}]}, {'k': 2, 'subsubs': [{'j': 3}]}]}, id='Normal nested __all__', ), pytest.param( {'subs': {'__all__': {'subsubs': {'__all__': {'i'}}}, 0: {'subsubs': {'__all__': {'j'}}}}}, {'subs': [{'k': 1, 'subsubs': [{}, {}]}, {'k': 2, 'subsubs': [{'j': 3}]}]}, id='Merge sub dicts 1', ), pytest.param( {'subs': {'__all__': {'subsubs': ...}, 0: {'subsubs': {'__all__': {'j'}}}}}, {'subs': [{'k': 1, 'subsubs': [{'i': 1}, {'i': 2}]}, {'k': 2}]}, # {'subs': [{'k': 1 }, {'k': 2}]} id='Merge sub sets 2', ), pytest.param( {'subs': {'__all__': 
{'subsubs': {'__all__': {'j'}}}, 0: {'subsubs': ...}}}, {'subs': [{'k': 1}, {'k': 2, 'subsubs': [{'i': 3}]}]}, id='Merge sub sets 3', ), pytest.param( {'subs': {'__all__': {'subsubs': {0}}, 0: {'subsubs': {1}}}}, {'subs': [{'k': 1, 'subsubs': []}, {'k': 2, 'subsubs': []}]}, id='Merge sub sets 1', ), pytest.param( {'subs': {'__all__': {'subsubs': {0: {'i'}}}, 0: {'subsubs': {1}}}}, {'subs': [{'k': 1, 'subsubs': [{'j': 1}]}, {'k': 2, 'subsubs': [{'j': 3}]}]}, id='Merge sub dict-set', ), pytest.param({'subs': {'__all__': {'subsubs'}, 0: {'k'}}}, {'subs': [{}, {'k': 2}]}, id='Different keys 1'), pytest.param( {'subs': {'__all__': {'subsubs': ...}, 0: {'k'}}}, {'subs': [{}, {'k': 2}]}, id='Different keys 2' ), pytest.param( {'subs': {'__all__': {'subsubs'}, 0: {'k': ...}}}, {'subs': [{}, {'k': 2}]}, id='Different keys 3' ), pytest.param( {'subs': {'__all__': {'subsubs': {'__all__': {'i'}, 0: {'j'}}}}}, {'subs': [{'k': 1, 'subsubs': [{}, {'j': 2}]}, {'k': 2, 'subsubs': [{}]}]}, id='Nested different keys 1', ), pytest.param( {'subs': {'__all__': {'subsubs': {'__all__': {'i': ...}, 0: {'j'}}}}}, {'subs': [{'k': 1, 'subsubs': [{}, {'j': 2}]}, {'k': 2, 'subsubs': [{}]}]}, id='Nested different keys 2', ), pytest.param( {'subs': {'__all__': {'subsubs': {'__all__': {'i'}, 0: {'j': ...}}}}}, {'subs': [{'k': 1, 'subsubs': [{}, {'j': 2}]}, {'k': 2, 'subsubs': [{}]}]}, id='Nested different keys 3', ), pytest.param( {'subs': {'__all__': {'subsubs'}, 0: {'subsubs': {'__all__': {'j'}}}}}, {'subs': [{'k': 1, 'subsubs': [{'i': 1}, {'i': 2}]}, {'k': 2}]}, id='Ignore __all__ for index with defined exclude 1', ), pytest.param( {'subs': {'__all__': {'subsubs': {'__all__': {'j'}}}, 0: ...}}, {'subs': [{'k': 2, 'subsubs': [{'i': 3}]}]}, id='Ignore __all__ for index with defined exclude 2', ), pytest.param( {'subs': {'__all__': ..., 0: {'subsubs'}}}, {'subs': [{'k': 1}]}, id='Ignore __all__ for index with defined exclude 3', ), ], ) def test_advanced_exclude_nested_lists(exclude, expected): 
class SubSubModel(BaseModel): i: int j: int class SubModel(BaseModel): k: int subsubs: List[SubSubModel] class Model(BaseModel): subs: List[SubModel] m = Model(subs=[dict(k=1, subsubs=[dict(i=1, j=1), dict(i=2, j=2)]), dict(k=2, subsubs=[dict(i=3, j=3)])]) assert m.model_dump(exclude=exclude) == expected @pytest.mark.parametrize( 'include,expected', [ pytest.param( {'subs': {'__all__': {'subsubs': {'__all__': {'i'}}}}}, {'subs': [{'subsubs': [{'i': 1}, {'i': 2}]}, {'subsubs': [{'i': 3}]}]}, id='Normal nested __all__', ), pytest.param( {'subs': {'__all__': {'subsubs': {'__all__': {'i'}}}, 0: {'subsubs': {'__all__': {'j'}}}}}, {'subs': [{'subsubs': [{'i': 1, 'j': 1}, {'i': 2, 'j': 2}]}, {'subsubs': [{'i': 3}]}]}, id='Merge sub dicts 1', ), pytest.param( {'subs': {'__all__': {'subsubs': ...}, 0: {'subsubs': {'__all__': {'j'}}}}}, {'subs': [{'subsubs': [{'j': 1}, {'j': 2}]}, {'subsubs': [{'i': 3, 'j': 3}]}]}, id='Merge sub dicts 2', ), pytest.param( {'subs': {'__all__': {'subsubs': {'__all__': {'j'}}}, 0: {'subsubs': ...}}}, {'subs': [{'subsubs': [{'i': 1, 'j': 1}, {'i': 2, 'j': 2}]}, {'subsubs': [{'j': 3}]}]}, id='Merge sub dicts 3', ), pytest.param( {'subs': {'__all__': {'subsubs': {0}}, 0: {'subsubs': {1}}}}, {'subs': [{'subsubs': [{'i': 1, 'j': 1}, {'i': 2, 'j': 2}]}, {'subsubs': [{'i': 3, 'j': 3}]}]}, id='Merge sub sets', ), pytest.param( {'subs': {'__all__': {'subsubs': {0: {'i'}}}, 0: {'subsubs': {1}}}}, {'subs': [{'subsubs': [{'i': 1}, {'i': 2, 'j': 2}]}, {'subsubs': [{'i': 3}]}]}, id='Merge sub dict-set', ), pytest.param( {'subs': {'__all__': {'subsubs'}, 0: {'k'}}}, {'subs': [{'k': 1, 'subsubs': [{'i': 1, 'j': 1}, {'i': 2, 'j': 2}]}, {'subsubs': [{'i': 3, 'j': 3}]}]}, id='Nested different keys 1', ), pytest.param( {'subs': {'__all__': {'subsubs': ...}, 0: {'k'}}}, {'subs': [{'k': 1, 'subsubs': [{'i': 1, 'j': 1}, {'i': 2, 'j': 2}]}, {'subsubs': [{'i': 3, 'j': 3}]}]}, id='Nested different keys 2', ), pytest.param( {'subs': {'__all__': {'subsubs'}, 0: {'k': 
...}}}, {'subs': [{'k': 1, 'subsubs': [{'i': 1, 'j': 1}, {'i': 2, 'j': 2}]}, {'subsubs': [{'i': 3, 'j': 3}]}]}, id='Nested different keys 3', ), pytest.param( {'subs': {'__all__': {'subsubs': {'__all__': {'i'}, 0: {'j'}}}}}, {'subs': [{'subsubs': [{'i': 1, 'j': 1}, {'i': 2}]}, {'subsubs': [{'i': 3, 'j': 3}]}]}, id='Nested different keys 1', ), pytest.param( {'subs': {'__all__': {'subsubs': {'__all__': {'i': ...}, 0: {'j'}}}}}, {'subs': [{'subsubs': [{'i': 1, 'j': 1}, {'i': 2}]}, {'subsubs': [{'i': 3, 'j': 3}]}]}, id='Nested different keys 2', ), pytest.param( {'subs': {'__all__': {'subsubs': {'__all__': {'i'}, 0: {'j': ...}}}}}, {'subs': [{'subsubs': [{'i': 1, 'j': 1}, {'i': 2}]}, {'subsubs': [{'i': 3, 'j': 3}]}]}, id='Nested different keys 3', ), pytest.param( {'subs': {'__all__': {'subsubs'}, 0: {'subsubs': {'__all__': {'j'}}}}}, {'subs': [{'subsubs': [{'j': 1}, {'j': 2}]}, {'subsubs': [{'i': 3, 'j': 3}]}]}, id='Ignore __all__ for index with defined include 1', ), pytest.param( {'subs': {'__all__': {'subsubs': {'__all__': {'j'}}}, 0: ...}}, {'subs': [{'k': 1, 'subsubs': [{'i': 1, 'j': 1}, {'i': 2, 'j': 2}]}, {'subsubs': [{'j': 3}]}]}, id='Ignore __all__ for index with defined include 2', ), pytest.param( {'subs': {'__all__': ..., 0: {'subsubs'}}}, {'subs': [{'subsubs': [{'i': 1, 'j': 1}, {'i': 2, 'j': 2}]}, {'k': 2, 'subsubs': [{'i': 3, 'j': 3}]}]}, id='Ignore __all__ for index with defined include 3', ), ], ) def test_advanced_include_nested_lists(include, expected): class SubSubModel(BaseModel): i: int j: int class SubModel(BaseModel): k: int subsubs: List[SubSubModel] class Model(BaseModel): subs: List[SubModel] m = Model(subs=[dict(k=1, subsubs=[dict(i=1, j=1), dict(i=2, j=2)]), dict(k=2, subsubs=[dict(i=3, j=3)])]) assert m.model_dump(include=include) == expected def test_field_set_ignore_extra(): class Model(BaseModel): model_config = ConfigDict(extra='ignore') a: int b: int c: int = 3 m = Model(a=1, b=2) assert m.model_dump() == {'a': 1, 'b': 2, 'c': 3} 
assert m.model_fields_set == {'a', 'b'} assert m.model_dump(exclude_unset=True) == {'a': 1, 'b': 2} m2 = Model(a=1, b=2, d=4) assert m2.model_dump() == {'a': 1, 'b': 2, 'c': 3} assert m2.model_fields_set == {'a', 'b'} assert m2.model_dump(exclude_unset=True) == {'a': 1, 'b': 2} def test_field_set_allow_extra(): class Model(BaseModel): model_config = ConfigDict(extra='allow') a: int b: int c: int = 3 m = Model(a=1, b=2) assert m.model_dump() == {'a': 1, 'b': 2, 'c': 3} assert m.model_fields_set == {'a', 'b'} assert m.model_dump(exclude_unset=True) == {'a': 1, 'b': 2} m2 = Model(a=1, b=2, d=4) assert m2.model_dump() == {'a': 1, 'b': 2, 'c': 3, 'd': 4} assert m2.model_fields_set == {'a', 'b', 'd'} assert m2.model_dump(exclude_unset=True) == {'a': 1, 'b': 2, 'd': 4} def test_field_set_field_name(): class Model(BaseModel): a: int field_set: int b: int = 3 assert Model(a=1, field_set=2).model_dump() == {'a': 1, 'field_set': 2, 'b': 3} assert Model(a=1, field_set=2).model_dump(exclude_unset=True) == {'a': 1, 'field_set': 2} assert Model.model_construct(a=1, field_set=3).model_dump() == {'a': 1, 'field_set': 3, 'b': 3} def test_values_order(): class Model(BaseModel): a: int = 1 b: int = 2 c: int = 3 m = Model(c=30, b=20, a=10) assert list(m) == [('a', 10), ('b', 20), ('c', 30)] def test_inheritance(): class Foo(BaseModel): a: float = ... with pytest.raises( TypeError, match=( "Field 'a' defined on a base class was overridden by a non-annotated attribute. " 'All field definitions, including overrides, require a type annotation.' 
), ): class Bar(Foo): x: float = 12.3 a = 123.0 class Bar2(Foo): x: float = 12.3 a: float = 123.0 assert Bar2().model_dump() == {'x': 12.3, 'a': 123.0} class Bar3(Foo): x: float = 12.3 a: float = Field(default=123.0) assert Bar3().model_dump() == {'x': 12.3, 'a': 123.0} def test_inheritance_subclass_default(): class MyStr(str): pass # Confirm hint supports a subclass default class Simple(BaseModel): x: str = MyStr('test') model_config = dict(arbitrary_types_allowed=True) # Confirm hint on a base can be overridden with a subclass default on a subclass class Base(BaseModel): x: str y: str class Sub(Base): x: str = MyStr('test') y: MyStr = MyStr('test') # force subtype model_config = dict(arbitrary_types_allowed=True) assert Sub.model_fields['x'].annotation == str assert Sub.model_fields['y'].annotation == MyStr def test_invalid_type(): with pytest.raises(PydanticSchemaGenerationError) as exc_info: class Model(BaseModel): x: 43 = 123 assert 'Unable to generate pydantic-core schema for 43' in exc_info.value.args[0] class CustomStr(str): def foobar(self): return 7 @pytest.mark.parametrize( 'value,expected', [ ('a string', 'a string'), (b'some bytes', 'some bytes'), (bytearray('foobar', encoding='utf8'), 'foobar'), (StrEnum.a, 'a10'), (CustomStr('whatever'), 'whatever'), ], ) def test_valid_string_types(value, expected): class Model(BaseModel): v: str assert Model(v=value).v == expected @pytest.mark.parametrize( 'value,errors', [ ( {'foo': 'bar'}, [{'input': {'foo': 'bar'}, 'loc': ('v',), 'msg': 'Input should be a valid string', 'type': 'string_type'}], ), ( [1, 2, 3], [{'input': [1, 2, 3], 'loc': ('v',), 'msg': 'Input should be a valid string', 'type': 'string_type'}], ), ], ) def test_invalid_string_types(value, errors): class Model(BaseModel): v: str with pytest.raises(ValidationError) as exc_info: Model(v=value) assert exc_info.value.errors(include_url=False) == errors def test_inheritance_config(): class Parent(BaseModel): a: str class Child(Parent): model_config = 
ConfigDict(str_to_lower=True) b: str m1 = Parent(a='A') m2 = Child(a='A', b='B') assert repr(m1) == "Parent(a='A')" assert repr(m2) == "Child(a='a', b='b')" def test_partial_inheritance_config(): class Parent(BaseModel): a: int = Field(ge=0) class Child(Parent): b: int = Field(ge=0) Child(a=0, b=0) with pytest.raises(ValidationError) as exc_info: Child(a=-1, b=0) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'ge': 0}, 'input': -1, 'loc': ('a',), 'msg': 'Input should be greater than or equal to 0', 'type': 'greater_than_equal', } ] with pytest.raises(ValidationError) as exc_info: Child(a=0, b=-1) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'ge': 0}, 'input': -1, 'loc': ('b',), 'msg': 'Input should be greater than or equal to 0', 'type': 'greater_than_equal', } ] def test_annotation_inheritance(): class A(BaseModel): integer: int = 1 class B(A): integer: int = 2 assert B.model_fields['integer'].annotation == int class C(A): integer: str = 'G' assert C.__annotations__['integer'] == str assert C.model_fields['integer'].annotation == str with pytest.raises( TypeError, match=( "Field 'integer' defined on a base class was overridden by a non-annotated attribute. " 'All field definitions, including overrides, require a type annotation.' ), ): class D(A): integer = 'G' def test_string_none(): class Model(BaseModel): model_config = ConfigDict(extra='ignore') a: constr(min_length=20, max_length=1000) = ... 
with pytest.raises(ValidationError) as exc_info: Model(a=None) assert exc_info.value.errors(include_url=False) == [ {'input': None, 'loc': ('a',), 'msg': 'Input should be a valid string', 'type': 'string_type'} ] def test_optional_required(): class Model(BaseModel): bar: Optional[int] assert Model(bar=123).model_dump() == {'bar': 123} assert Model(bar=None).model_dump() == {'bar': None} with pytest.raises(ValidationError) as exc_info: Model() assert exc_info.value.errors(include_url=False) == [ {'input': {}, 'loc': ('bar',), 'msg': 'Field required', 'type': 'missing'} ] def test_unable_to_infer(): with pytest.raises( errors.PydanticUserError, match=re.escape( 'A non-annotated attribute was detected: `x = None`. 
All model fields require a type annotation; ' 'if `x` is not meant to be a field, you may be able to resolve this error by annotating it as a ' "`ClassVar` or updating `model_config['ignored_types']`" ), ): class InvalidDefinitionModel(BaseModel): x = None def test_multiple_errors(): class Model(BaseModel): a: Union[None, int, float, Decimal] with pytest.raises(ValidationError) as exc_info: Model(a='foobar') # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('a', 'int'), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'foobar', }, { 'type': 'float_parsing', 'loc': ('a', 'float'), 'msg': 'Input should be a valid number, unable to parse string as a number', 'input': 'foobar', }, { 'type': 'decimal_parsing', 'loc': ('a', 'decimal'), 'msg': 'Input should be a valid decimal', 'input': 'foobar', }, ] assert Model(a=1.5).a == 1.5 assert Model(a=None).a is None def test_validate_default(): class Model(BaseModel): model_config = ConfigDict(validate_default=True) a: int b: int with pytest.raises(ValidationError) as exc_info: Model() assert exc_info.value.errors(include_url=False) == [ {'input': {}, 'loc': ('a',), 'msg': 'Field required', 'type': 'missing'}, {'input': {}, 'loc': ('b',), 'msg': 'Field required', 'type': 'missing'}, ] def test_force_extra(): class Model(BaseModel): model_config = ConfigDict(extra='ignore') foo: int assert Model.model_config['extra'] == 'ignore' def test_submodel_different_type(): class Foo(BaseModel): a: int class Bar(BaseModel): b: int class Spam(BaseModel): c: Foo assert Spam(c={'a': '123'}).model_dump() == {'c': {'a': 123}} with pytest.raises(ValidationError): Spam(c={'b': '123'}) assert Spam(c=Foo(a='123')).model_dump() == {'c': {'a': 123}} with pytest.raises(ValidationError): Spam(c=Bar(b='123')) def test_self(): class Model(BaseModel): self: str m = Model.model_validate(dict(self='some value')) assert 
m.model_dump() == {'self': 'some value'} assert m.self == 'some value' assert m.model_json_schema() == { 'title': 'Model', 'type': 'object', 'properties': {'self': {'title': 'Self', 'type': 'string'}}, 'required': ['self'], } def test_no_name_conflict_in_constructor(): class Model(BaseModel): self: int m = Model(**{'__pydantic_self__': 4, 'self': 2}) assert m.self == 2 def test_self_recursive(): class SubModel(BaseModel): self: int class Model(BaseModel): sm: SubModel m = Model.model_validate({'sm': {'self': '123'}}) assert m.model_dump() == {'sm': {'self': 123}} def test_custom_init(): class Model(BaseModel): x: int def __init__(self, x: int, y: int): if isinstance(y, str): y = len(y) super().__init__(x=x + int(y)) assert Model(x=1, y=1).x == 2 assert Model.model_validate({'x': 1, 'y': 1}).x == 2 assert Model.model_validate_json('{"x": 1, "y": 2}').x == 3 # For documentation purposes: type hints on __init__ are not currently used for validation: assert Model.model_validate({'x': 1, 'y': 'abc'}).x == 4 def test_nested_custom_init(): class NestedModel(BaseModel): self: str modified_number: int = 1 def __init__(someinit, **kwargs): super().__init__(**kwargs) someinit.modified_number += 1 class TopModel(BaseModel): self: str nest: NestedModel m = TopModel.model_validate(dict(self='Top Model', nest=dict(self='Nested Model', modified_number=0))) assert m.self == 'Top Model' assert m.nest.self == 'Nested Model' assert m.nest.modified_number == 1 def test_init_inspection(): calls = [] class Foobar(BaseModel): x: int def __init__(self, **data) -> None: with pytest.raises(AttributeError): calls.append(data) assert self.x super().__init__(**data) Foobar(x=1) Foobar.model_validate({'x': 2}) Foobar.model_validate_json('{"x": 3}') assert calls == [{'x': 1}, {'x': 2}, {'x': 3}] def test_type_on_annotation(): class FooBar: pass class Model(BaseModel): a: Type[int] b: Type[int] = int c: Type[FooBar] d: Type[FooBar] = FooBar e: Sequence[Type[FooBar]] = [FooBar] f: 
Union[Type[FooBar], Sequence[Type[FooBar]]] = FooBar g: Union[Type[FooBar], Sequence[Type[FooBar]]] = [FooBar] model_config = dict(arbitrary_types_allowed=True) assert Model.model_fields.keys() == set('abcdefg') def test_type_union(): class Model(BaseModel): a: Type[Union[str, bytes]] b: Type[Union[Any, str]] m = Model(a=bytes, b=int) assert m.model_dump() == {'a': bytes, 'b': int} assert m.a == bytes def test_type_on_none(): class Model(BaseModel): a: Type[None] Model(a=type(None)) with pytest.raises(ValidationError) as exc_info: Model(a=None) assert exc_info.value.errors(include_url=False) == [ { 'type': 'is_subclass_of', 'loc': ('a',), 'msg': 'Input should be a subclass of NoneType', 'input': None, 'ctx': {'class': 'NoneType'}, } ] def test_type_on_typealias(): Float = TypeAliasType('Float', float) class MyFloat(float): ... adapter = TypeAdapter(Type[Float]) adapter.validate_python(float) adapter.validate_python(MyFloat) with pytest.raises(ValidationError) as exc_info: adapter.validate_python(str) assert exc_info.value.errors(include_url=False) == [ { 'type': 'is_subclass_of', 'loc': (), 'msg': 'Input should be a subclass of float', 'input': str, 'ctx': {'class': 'float'}, } ] def test_type_on_annotated(): class Model(BaseModel): a: Type[Annotated[int, ...]] Model(a=int) with pytest.raises(ValidationError) as exc_info: Model(a=str) assert exc_info.value.errors(include_url=False) == [ { 'type': 'is_subclass_of', 'loc': ('a',), 'msg': 'Input should be a subclass of int', 'input': str, 'ctx': {'class': 'int'}, } ] def test_type_assign(): class Parent: def echo(self): return 'parent' class Child(Parent): def echo(self): return 'child' class Different: def echo(self): return 'different' class Model(BaseModel): v: Type[Parent] = Parent assert Model(v=Parent).v().echo() == 'parent' assert Model().v().echo() == 'parent' assert Model(v=Child).v().echo() == 'child' with pytest.raises(ValidationError) as exc_info: Model(v=Different) assert 
exc_info.value.errors(include_url=False) == [ { 'ctx': {'class': Parent.__qualname__}, 'input': HasRepr(repr(Different)), 'loc': ('v',), 'msg': f'Input should be a subclass of {Parent.__qualname__}', 'type': 'is_subclass_of', } ] def test_optional_subfields(): class Model(BaseModel): a: Optional[int] assert Model.model_fields['a'].annotation == Optional[int] with pytest.raises(ValidationError) as exc_info: Model(a='foobar') assert exc_info.value.errors(include_url=False) == [ { 'input': 'foobar', 'loc': ('a',), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', } ] with pytest.raises(ValidationError) as exc_info: Model() assert exc_info.value.errors(include_url=False) == [ {'input': {}, 'loc': ('a',), 'msg': 'Field required', 'type': 'missing'} ] assert Model(a=None).a is None assert Model(a=12).a == 12 def test_validated_optional_subfields(): class Model(BaseModel): a: Optional[int] @field_validator('a') @classmethod def check_a(cls, v): return v assert Model.model_fields['a'].annotation == Optional[int] with pytest.raises(ValidationError) as exc_info: Model(a='foobar') assert exc_info.value.errors(include_url=False) == [ { 'input': 'foobar', 'loc': ('a',), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', } ] with pytest.raises(ValidationError) as exc_info: Model() assert exc_info.value.errors(include_url=False) == [ {'input': {}, 'loc': ('a',), 'msg': 'Field required', 'type': 'missing'} ] assert Model(a=None).a is None assert Model(a=12).a == 12 def test_optional_field_constraints(): class MyModel(BaseModel): my_int: Optional[int] = Field(ge=3) with pytest.raises(ValidationError) as exc_info: MyModel(my_int=2) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'ge': 3}, 'input': 2, 'loc': ('my_int',), 'msg': 'Input should be greater than or equal to 3', 'type': 'greater_than_equal', } ] def test_field_str_shape(): class Model(BaseModel): a: 
List[int] assert repr(Model.model_fields['a']) == 'FieldInfo(annotation=List[int], required=True)' assert str(Model.model_fields['a']) == 'annotation=List[int] required=True' T1 = TypeVar('T1') T2 = TypeVar('T2') class DisplayGen(Generic[T1, T2]): def __init__(self, t1: T1, t2: T2): self.t1 = t1 self.t2 = t2 @pytest.mark.parametrize( 'type_,expected', [ (int, 'int'), (Optional[int], 'Union[int, NoneType]'), (Union[None, int, str], 'Union[NoneType, int, str]'), (Union[int, str, bytes], 'Union[int, str, bytes]'), (List[int], 'List[int]'), (Tuple[int, str, bytes], 'Tuple[int, str, bytes]'), (Union[List[int], Set[bytes]], 'Union[List[int], Set[bytes]]'), (List[Tuple[int, int]], 'List[Tuple[int, int]]'), (Dict[int, str], 'Dict[int, str]'), (FrozenSet[int], 'FrozenSet[int]'), (Tuple[int, ...], 'Tuple[int, ...]'), (Optional[List[int]], 'Union[List[int], NoneType]'), (dict, 'dict'), pytest.param( DisplayGen[bool, str], 'DisplayGen[bool, str]', marks=pytest.mark.skipif(sys.version_info[:2] <= (3, 9), reason='difference in __name__ between versions'), ), pytest.param( DisplayGen[bool, str], 'tests.test_edge_cases.DisplayGen[bool, str]', marks=pytest.mark.skipif(sys.version_info[:2] > (3, 9), reason='difference in __name__ between versions'), ), ], ) def test_field_type_display(type_, expected): class Model(BaseModel): a: type_ model_config = dict(arbitrary_types_allowed=True) assert re.search(rf'\(annotation={re.escape(expected)},', str(Model.model_fields)) def test_any_none(): class MyModel(BaseModel): foo: Any m = MyModel(foo=None) assert dict(m) == {'foo': None} def test_type_var_any(): Foobar = TypeVar('Foobar') class MyModel(BaseModel): foo: Foobar assert MyModel.model_json_schema() == { 'properties': {'foo': {'title': 'Foo'}}, 'required': ['foo'], 'title': 'MyModel', 'type': 'object', } assert MyModel(foo=None).foo is None assert MyModel(foo='x').foo == 'x' assert MyModel(foo=123).foo == 123 def test_type_var_constraint(): Foobar = TypeVar('Foobar', int, str) class 
MyModel(BaseModel): foo: Foobar assert MyModel.model_json_schema() == { 'title': 'MyModel', 'type': 'object', 'properties': {'foo': {'title': 'Foo', 'anyOf': [{'type': 'integer'}, {'type': 'string'}]}}, 'required': ['foo'], } with pytest.raises(ValidationError) as exc_info: MyModel(foo=None) assert exc_info.value.errors(include_url=False) == [ {'input': None, 'loc': ('foo', 'int'), 'msg': 'Input should be a valid integer', 'type': 'int_type'}, {'input': None, 'loc': ('foo', 'str'), 'msg': 'Input should be a valid string', 'type': 'string_type'}, ] with pytest.raises(ValidationError) as exc_info: MyModel(foo=[1, 2, 3]) assert exc_info.value.errors(include_url=False) == [ {'input': [1, 2, 3], 'loc': ('foo', 'int'), 'msg': 'Input should be a valid integer', 'type': 'int_type'}, {'input': [1, 2, 3], 'loc': ('foo', 'str'), 'msg': 'Input should be a valid string', 'type': 'string_type'}, ] assert MyModel(foo='x').foo == 'x' assert MyModel(foo=123).foo == 123 def test_type_var_bound(): Foobar = TypeVar('Foobar', bound=int) class MyModel(BaseModel): foo: Foobar assert MyModel.model_json_schema() == { 'title': 'MyModel', 'type': 'object', 'properties': {'foo': {'title': 'Foo', 'type': 'integer'}}, 'required': ['foo'], } with pytest.raises(ValidationError) as exc_info: MyModel(foo=None) assert exc_info.value.errors(include_url=False) == [ {'input': None, 'loc': ('foo',), 'msg': 'Input should be a valid integer', 'type': 'int_type'} ] with pytest.raises(ValidationError) as exc_info: MyModel(foo='x') assert exc_info.value.errors(include_url=False) == [ { 'input': 'x', 'loc': ('foo',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'type': 'int_parsing', } ] assert MyModel(foo=123).foo == 123 def test_dict_bare(): class MyModel(BaseModel): foo: Dict m = MyModel(foo={'x': 'a', 'y': None}) assert m.foo == {'x': 'a', 'y': None} def test_list_bare(): class MyModel(BaseModel): foo: List m = MyModel(foo=[1, 2, None]) assert m.foo == [1, 2, None] def test_dict_any(): class MyModel(BaseModel): foo: Dict[str, Any] m = 
MyModel(foo={'x': 'a', 'y': None}) assert m.foo == {'x': 'a', 'y': None} def test_modify_fields(): class Foo(BaseModel): foo: List[List[int]] @field_validator('foo') @classmethod def check_something(cls, value): return value class Bar(Foo): pass assert repr(Foo.model_fields['foo']) == 'FieldInfo(annotation=List[List[int]], required=True)' assert repr(Bar.model_fields['foo']) == 'FieldInfo(annotation=List[List[int]], required=True)' assert Foo(foo=[[0, 1]]).foo == [[0, 1]] assert Bar(foo=[[0, 1]]).foo == [[0, 1]] def test_exclude_none(): class MyModel(BaseModel): a: Optional[int] = None b: int = 2 m = MyModel(a=5) assert m.model_dump(exclude_none=True) == {'a': 5, 'b': 2} m = MyModel(b=3) assert m.model_dump(exclude_none=True) == {'b': 3} assert m.model_dump_json(exclude_none=True) == '{"b":3}' def test_exclude_none_recursive(): class ModelA(BaseModel): a: Optional[int] = None b: int = 1 class ModelB(BaseModel): c: int d: int = 2 e: ModelA f: Optional[str] = None m = ModelB(c=5, e={'a': 0}) assert m.model_dump() == {'c': 5, 'd': 2, 'e': {'a': 0, 'b': 1}, 'f': None} assert m.model_dump(exclude_none=True) == {'c': 5, 'd': 2, 'e': {'a': 0, 'b': 1}} assert dict(m) == {'c': 5, 'd': 2, 'e': ModelA(a=0), 'f': None} m = ModelB(c=5, e={'b': 20}, f='test') assert m.model_dump() == {'c': 5, 'd': 2, 'e': {'a': None, 'b': 20}, 'f': 'test'} assert m.model_dump(exclude_none=True) == {'c': 5, 'd': 2, 'e': {'b': 20}, 'f': 'test'} assert dict(m) == {'c': 5, 'd': 2, 'e': ModelA(b=20), 'f': 'test'} def test_exclude_none_with_extra(): class MyModel(BaseModel): model_config = ConfigDict(extra='allow') a: str = 'default' b: Optional[str] = None m = MyModel(a='a', c='c') assert m.model_dump(exclude_none=True) == {'a': 'a', 'c': 'c'} assert m.model_dump() == {'a': 'a', 'b': None, 'c': 'c'} m = MyModel(a='a', b='b', c=None) assert m.model_dump(exclude_none=True) == {'a': 'a', 'b': 'b'} assert m.model_dump() == {'a': 'a', 'b': 'b', 'c': None} def test_str_method_inheritance(): import pydantic 
class Foo(pydantic.BaseModel): x: int = 3 y: int = 4 def __str__(self): return str(self.y + self.x) class Bar(Foo): z: bool = False assert str(Foo()) == '7' assert str(Bar()) == '7' def test_repr_method_inheritance(): import pydantic class Foo(pydantic.BaseModel): x: int = 3 y: int = 4 def __repr__(self): return repr(self.y + self.x) class Bar(Foo): z: bool = False assert repr(Foo()) == '7' assert repr(Bar()) == '7' def test_optional_validator(): val_calls = [] class Model(BaseModel): something: Optional[str] @field_validator('something') @classmethod def check_something(cls, v): val_calls.append(v) return v with pytest.raises(ValidationError) as exc_info: assert Model().model_dump() == {'something': None} assert exc_info.value.errors(include_url=False) == [ {'input': {}, 'loc': ('something',), 'msg': 'Field required', 'type': 'missing'} ] assert Model(something=None).model_dump() == {'something': None} assert Model(something='hello').model_dump() == {'something': 'hello'} assert val_calls == [None, 'hello'] def test_required_optional(): class Model(BaseModel): nullable1: Optional[int] = ... nullable2: Optional[int] = Field(...) 
with pytest.raises(ValidationError) as exc_info: Model() assert exc_info.value.errors(include_url=False) == [ {'input': {}, 'loc': ('nullable1',), 'msg': 'Field required', 'type': 'missing'}, {'input': {}, 'loc': ('nullable2',), 'msg': 'Field required', 'type': 'missing'}, ] with pytest.raises(ValidationError) as exc_info: Model(nullable1=1) assert exc_info.value.errors(include_url=False) == [ {'input': {'nullable1': 1}, 'loc': ('nullable2',), 'msg': 'Field required', 'type': 'missing'} ] with pytest.raises(ValidationError) as exc_info: Model(nullable2=2) assert exc_info.value.errors(include_url=False) == [ {'input': {'nullable2': 2}, 'loc': ('nullable1',), 'msg': 'Field required', 'type': 'missing'} ] assert Model(nullable1=None, nullable2=None).model_dump() == {'nullable1': None, 'nullable2': None} assert Model(nullable1=1, nullable2=2).model_dump() == {'nullable1': 1, 'nullable2': 2} with pytest.raises(ValidationError) as exc_info: Model(nullable1='some text') assert exc_info.value.errors(include_url=False) == [ { 'input': 'some text', 'loc': ('nullable1',), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', }, {'input': {'nullable1': 'some text'}, 'loc': ('nullable2',), 'msg': 'Field required', 'type': 'missing'}, ] def test_ellipsis_forward_ref_annotated() -> None: """This previously resulted in the ellipsis being used as a default value.""" class Model(BaseModel): f: 'Forward' Forward = Annotated[int, Field(...)] assert Model.model_fields['f'].default is PydanticUndefined def test_private_attr_ellipsis() -> None: class Model(BaseModel): _a: int = PrivateAttr(...) assert not hasattr(Model(), '_a') def test_required_any(): class Model(BaseModel): optional1: Any optional2: Any = None optional3: Optional[Any] = None nullable1: Any = ... nullable2: Any = Field(...) 
nullable3: Optional[Any] with pytest.raises(ValidationError) as exc_info: Model() assert exc_info.value.errors(include_url=False) == [ {'input': {}, 'loc': ('optional1',), 'msg': 'Field required', 'type': 'missing'}, {'input': {}, 'loc': ('nullable1',), 'msg': 'Field required', 'type': 'missing'}, {'input': {}, 'loc': ('nullable2',), 'msg': 'Field required', 'type': 'missing'}, {'input': {}, 'loc': ('nullable3',), 'msg': 'Field required', 'type': 'missing'}, ] with pytest.raises(ValidationError) as exc_info: Model(nullable1='a') assert exc_info.value.errors(include_url=False) == [ {'input': {'nullable1': 'a'}, 'loc': ('optional1',), 'msg': 'Field required', 'type': 'missing'}, {'input': {'nullable1': 'a'}, 'loc': ('nullable2',), 'msg': 'Field required', 'type': 'missing'}, {'input': {'nullable1': 'a'}, 'loc': ('nullable3',), 'msg': 'Field required', 'type': 'missing'}, ] with pytest.raises(ValidationError) as exc_info: Model(nullable2=False) assert exc_info.value.errors(include_url=False) == [ {'input': {'nullable2': False}, 'loc': ('optional1',), 'msg': 'Field required', 'type': 'missing'}, {'input': {'nullable2': False}, 'loc': ('nullable1',), 'msg': 'Field required', 'type': 'missing'}, {'input': {'nullable2': False}, 'loc': ('nullable3',), 'msg': 'Field required', 'type': 'missing'}, ] with pytest.raises(ValidationError) as exc_info: assert Model(nullable1=None, nullable2=None).model_dump() == { 'optional1': None, 'optional2': None, 'nullable1': None, 'nullable2': None, } assert exc_info.value.errors(include_url=False) == [ { 'input': {'nullable1': None, 'nullable2': None}, 'loc': ('optional1',), 'msg': 'Field required', 'type': 'missing', }, { 'input': {'nullable1': None, 'nullable2': None}, 'loc': ('nullable3',), 'msg': 'Field required', 'type': 'missing', }, ] assert Model(optional1=None, nullable1=1, nullable2='two', nullable3=None).model_dump() == { 'optional1': None, 'optional2': None, 'optional3': None, 'nullable1': 1, 'nullable2': 'two', 'nullable3': 
None, } assert Model(optional1='op1', optional2=False, nullable1=1, nullable2='two', nullable3='three').model_dump() == { 'optional1': 'op1', 'optional2': False, 'optional3': None, 'nullable1': 1, 'nullable2': 'two', 'nullable3': 'three', } def test_custom_generic_validators(): T1 = TypeVar('T1') T2 = TypeVar('T2') class MyGen(Generic[T1, T2]): def __init__(self, t1: T1, t2: T2): self.t1 = t1 self.t2 = t2 @classmethod def __get_pydantic_core_schema__(cls, source: Any, handler: GetCoreSchemaHandler): schema = core_schema.is_instance_schema(cls) args = get_args(source) if not args: return schema t1_f = TypeAdapter(args[0]).validate_python t2_f = TypeAdapter(args[1]).validate_python def convert_to_init_error(e: ErrorDetails, loc: str) -> InitErrorDetails: init_e = {'type': e['type'], 'loc': e['loc'] + (loc,), 'input': e['input']} if 'ctx' in e: init_e['ctx'] = e['ctx'] return init_e def validate(v, _info): if not args: return v try: v.t1 = t1_f(v.t1) except ValidationError as exc: raise ValidationError.from_exception_data( exc.title, [convert_to_init_error(e, 't1') for e in exc.errors()] ) from exc try: v.t2 = t2_f(v.t2) except ValidationError as exc: raise ValidationError.from_exception_data( exc.title, [convert_to_init_error(e, 't2') for e in exc.errors()] ) from exc return v return core_schema.with_info_after_validator_function(validate, schema) class Model(BaseModel): a: str gen: MyGen[str, bool] gen2: MyGen model_config = dict(arbitrary_types_allowed=True) with pytest.raises(ValidationError) as exc_info: Model(a='foo', gen='invalid', gen2='invalid') assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'class': 'test_custom_generic_validators.<locals>.MyGen'}, 'input': 'invalid', 'loc': ('gen',), 'msg': 'Input should be an instance of test_custom_generic_validators.<locals>.MyGen', 'type': 'is_instance_of', }, { 'ctx': {'class': 'test_custom_generic_validators.<locals>.MyGen'}, 'input': 'invalid', 'loc': ('gen2',), 'msg': 'Input should be an instance of 
test_custom_generic_validators.<locals>.MyGen', 'type': 'is_instance_of', }, ] with pytest.raises(ValidationError) as exc_info: Model(a='foo', gen=MyGen(t1='bar', t2='baz'), gen2=MyGen(t1='bar', t2='baz')) assert exc_info.value.errors(include_url=False) == [ { 'input': 'baz', 'loc': ('gen', 't2'), 'msg': 'Input should be a valid boolean, unable to interpret input', 'type': 'bool_parsing', } ] m = Model(a='foo', gen=MyGen(t1='bar', t2=True), gen2=MyGen(t1=1, t2=2)) assert m.a == 'foo' assert m.gen.t1 == 'bar' assert m.gen.t2 is True assert m.gen2.t1 == 1 assert m.gen2.t2 == 2 def test_custom_generic_arbitrary_allowed(): T1 = TypeVar('T1') T2 = TypeVar('T2') class MyGen(Generic[T1, T2]): def __init__(self, t1: T1, t2: T2): self.t1 = t1 self.t2 = t2 class Model(BaseModel): a: str gen: MyGen[str, bool] model_config = dict(arbitrary_types_allowed=True) with pytest.raises(ValidationError) as exc_info: Model(a='foo', gen='invalid') assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'class': 'test_custom_generic_arbitrary_allowed.<locals>.MyGen'}, 'input': 'invalid', 'loc': ('gen',), 'msg': 'Input should be an instance of ' 'test_custom_generic_arbitrary_allowed.<locals>.MyGen', 'type': 'is_instance_of', } ] # No validation, no exception m = Model(a='foo', gen=MyGen(t1='bar', t2='baz')) assert m.a == 'foo' assert m.gen.t1 == 'bar' assert m.gen.t2 == 'baz' m = Model(a='foo', gen=MyGen(t1='bar', t2=True)) assert m.a == 'foo' assert m.gen.t1 == 'bar' assert m.gen.t2 is True def test_custom_generic_disallowed(): T1 = TypeVar('T1') T2 = TypeVar('T2') class MyGen(Generic[T1, T2]): def __init__(self, t1: T1, t2: T2): self.t1 = t1 self.t2 = t2 match = ( r'Unable to generate pydantic-core schema for (.*)MyGen\[str, bool\](.*). 
' r'Set `arbitrary_types_allowed=True` in the model_config to ignore this error' ) with pytest.raises(TypeError, match=match): class Model(BaseModel): a: str gen: MyGen[str, bool] def test_hashable_required(): class Model(BaseModel): v: Hashable Model(v=None) with pytest.raises(ValidationError) as exc_info: Model(v=[]) assert exc_info.value.errors(include_url=False) == [ {'input': [], 'loc': ('v',), 'msg': 'Input should be hashable', 'type': 'is_hashable'} ] with pytest.raises(ValidationError) as exc_info: Model() assert exc_info.value.errors(include_url=False) == [ {'input': {}, 'loc': ('v',), 'msg': 'Field required', 'type': 'missing'} ] @pytest.mark.parametrize('default', [1, None]) def test_hashable_optional(default): class Model(BaseModel): v: Hashable = default Model(v=None) Model() def test_hashable_serialization(): class Model(BaseModel): v: Hashable class HashableButNotSerializable: def __hash__(self): return 0 assert Model(v=(1,)).model_dump_json() == '{"v":[1]}' m = Model(v=HashableButNotSerializable()) with pytest.raises( PydanticSerializationError, match='Unable to serialize unknown type:.*HashableButNotSerializable' ): m.model_dump_json() def test_hashable_validate_json(): class Model(BaseModel): v: Hashable ta = TypeAdapter(Model) # Test a large nested dict for validate in (Model.model_validate_json, ta.validate_json): for testcase in ( '{"v": "a"}', '{"v": 1}', '{"v": 1.0}', '{"v": true}', '{"v": null}', ): assert hash(validate(testcase).v) == hash(validate(testcase).v) @pytest.mark.parametrize( 'non_hashable', [ '{"v": []}', '{"v": {"a": 0}}', ], ) def test_hashable_invalid_json(non_hashable) -> None: """This is primarily included in order to document the behavior / limitations of the `Hashable` type's validation logic. Specifically, we don't do any coercions to arrays / dicts when loading from JSON, and thus they are not considered hashable. This would be different if we, for example, coerced arrays to tuples. 
""" class Model(BaseModel): v: Hashable with pytest.raises(ValidationError): Model.model_validate_json(non_hashable) def test_hashable_json_schema(): class Model(BaseModel): v: Hashable assert Model.model_json_schema() == { 'properties': {'v': {'title': 'V'}}, 'required': ['v'], 'title': 'Model', 'type': 'object', } def test_default_factory_called_once(): """It should never call `default_factory` more than once even when `validate_all` is set""" v = 0 def factory() -> int: nonlocal v v += 1 return v class MyModel(BaseModel): model_config = ConfigDict(validate_default=True) id: int = Field(default_factory=factory) m1 = MyModel() assert m1.id == 1 class MyBadModel(BaseModel): model_config = ConfigDict(validate_default=True) id: List[str] = Field(default_factory=factory) with pytest.raises(ValidationError) as exc_info: MyBadModel() assert v == 2 # `factory` has been called to run validation assert exc_info.value.errors(include_url=False) == [ {'input': 2, 'loc': ('id',), 'msg': 'Input should be a valid list', 'type': 'list_type'} ] def test_default_factory_validator_child(): class Parent(BaseModel): foo: List[str] = Field(default_factory=list) @field_validator('foo', mode='before') @classmethod def mutate_foo(cls, v): return [f'{x}-1' for x in v] assert Parent(foo=['a', 'b']).foo == ['a-1', 'b-1'] class Child(Parent): pass assert Child(foo=['a', 'b']).foo == ['a-1', 'b-1'] def test_resolve_annotations_module_missing(tmp_path): # see https://github.com/pydantic/pydantic/issues/2363 file_path = tmp_path / 'module_to_load.py' # language=Python file_path.write_text( """ from pydantic import BaseModel class User(BaseModel): id: int name: str = 'Jane Doe' """ ) spec = importlib.util.spec_from_file_location('my_test_module', file_path) module = importlib.util.module_from_spec(spec) spec.loader.exec_module(module) assert module.User(id=12).model_dump() == {'id': 12, 'name': 'Jane Doe'} def test_iter_coverage(): class MyModel(BaseModel): x: int = 1 y: str = 'a' with 
pytest.warns( PydanticDeprecatedSince20, match='The private method `_iter` will be removed and should no longer be used.' ): assert list(MyModel()._iter(by_alias=True)) == [('x', 1), ('y', 'a')] def test_frozen_config_and_field(): class Foo(BaseModel): model_config = ConfigDict(frozen=False, validate_assignment=True) a: str assert Foo.model_fields['a'].metadata == [] f = Foo(a='x') f.a = 'y' assert f.model_dump() == {'a': 'y'} class Bar(BaseModel): model_config = ConfigDict(validate_assignment=True) a: str = Field(frozen=True) c: Annotated[str, Field(frozen=True)] assert Bar.model_fields['a'].frozen b = Bar(a='x', c='z') with pytest.raises(ValidationError) as exc_info: b.a = 'y' assert exc_info.value.errors(include_url=False) == [ {'input': 'y', 'loc': ('a',), 'msg': 'Field is frozen', 'type': 'frozen_field'} ] with pytest.raises(ValidationError) as exc_info: b.c = 'y' assert exc_info.value.errors(include_url=False) == [ {'input': 'y', 'loc': ('c',), 'msg': 'Field is frozen', 'type': 'frozen_field'} ] assert b.model_dump() == {'a': 'x', 'c': 'z'} def test_arbitrary_types_allowed_custom_eq(): class Foo: def __eq__(self, other): if other.__class__ is not Foo: raise TypeError(f'Cannot interpret {other.__class__.__name__!r} as a valid type') return True class Model(BaseModel): model_config = ConfigDict(arbitrary_types_allowed=True) x: Foo = Foo() assert Model().x == Foo() def test_bytes_subclass(): class MyModel(BaseModel): my_bytes: bytes class BytesSubclass(bytes): def __new__(cls, data: bytes): self = bytes.__new__(cls, data) return self m = MyModel(my_bytes=BytesSubclass(b'foobar')) assert m.my_bytes.__class__ == BytesSubclass def test_int_subclass(): class MyModel(BaseModel): my_int: int class IntSubclass(int): def __new__(cls, data: int): self = int.__new__(cls, data) return self m = MyModel(my_int=IntSubclass(123)) # This is expected behavior in `V2` because in pydantic-core we cast the value to a rust i64, # so the sub-type information is lost. 
# (more detail about how to handle this in: https://github.com/pydantic/pydantic/pull/5151#discussion_r1130691036) assert m.my_int.__class__ != IntSubclass assert isinstance(m.my_int, int) def test_model_issubclass(): assert not issubclass(int, BaseModel) class MyModel(BaseModel): x: int assert issubclass(MyModel, BaseModel) class Custom: __fields__ = True assert not issubclass(Custom, BaseModel) def test_long_int(): """ see https://github.com/pydantic/pydantic/issues/1477 and in turn, https://github.com/python/cpython/issues/95778 """ class Model(BaseModel): x: int assert Model(x='1' * 4_300).x == int('1' * 4_300) too_long = '1' * 4_301 with pytest.raises(ValidationError) as exc_info: Model(x=too_long) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing_size', 'loc': ('x',), 'msg': 'Unable to parse input string as an integer, exceeded maximum size', 'input': too_long, } ] # this used to hang indefinitely with pytest.raises(ValidationError): Model(x='1' * (10**7)) def test_parent_field_with_default(): class Parent(BaseModel): a: int = 1 b: int = Field(2) class Child(Parent): c: int = 3 c = Child() assert c.a == 1 assert c.b == 2 assert c.c == 3 @pytest.mark.skipif(sys.version_info < (3, 12), reason='error message different on older versions') @pytest.mark.parametrize( 'bases', [ (BaseModel, ABC), (ABC, BaseModel), (BaseModel,), ], ) def test_abstractmethod_missing_for_all_decorators(bases): class AbstractSquare(*bases): side: float @field_validator('side') @classmethod @abstractmethod def my_field_validator(cls, v): raise NotImplementedError @model_validator(mode='wrap') @classmethod @abstractmethod def my_model_validator(cls, values, handler, info): raise NotImplementedError with pytest.warns(PydanticDeprecatedSince20): @root_validator(skip_on_failure=True) @classmethod @abstractmethod def my_root_validator(cls, values): raise NotImplementedError with 
pytest.warns(PydanticDeprecatedSince20): @validator('side') @classmethod @abstractmethod def my_validator(cls, value, **kwargs): raise NotImplementedError @model_serializer(mode='wrap') @abstractmethod def my_model_serializer(self, handler, info): raise NotImplementedError @field_serializer('side') @abstractmethod def my_serializer(self, v, _info): raise NotImplementedError @computed_field @property @abstractmethod def my_computed_field(self) -> Any: raise NotImplementedError class Square(AbstractSquare): pass with pytest.raises( TypeError, match=( "Can't instantiate abstract class Square without an implementation for abstract methods" " 'my_computed_field'," " 'my_field_validator'," " 'my_model_serializer'," " 'my_model_validator'," " 'my_root_validator'," " 'my_serializer'," " 'my_validator'" ), ): Square(side=1.0) def test_generic_wrapped_forwardref(): class Operation(BaseModel): callbacks: 'list[PathItem]' class PathItem(BaseModel): pass Operation.model_rebuild() Operation.model_validate({'callbacks': [PathItem()]}) with pytest.raises(ValidationError) as exc_info: Operation.model_validate({'callbacks': [1]}) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'model_type', 'loc': ('callbacks', 0), 'msg': 'Input should be a valid dictionary or instance of PathItem', 'input': 1, 'ctx': {'class_name': 'PathItem'}, } ] def test_plain_basemodel_field(): class Model(BaseModel): x: BaseModel class Model2(BaseModel): pass assert Model(x=Model2()).x == Model2() with pytest.raises(ValidationError) as exc_info: Model(x=1) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'model_type', 'loc': ('x',), 'msg': 'Input should be a valid dictionary or instance of BaseModel', 'input': 1, 'ctx': {'class_name': 'BaseModel'}, } ] def test_invalid_forward_ref_model(): """ This test is to document the fact that forward refs to a type with the same 
name as that of a field can cause problems, and to demonstrate a way to work around this. """ # The problem: if sys.version_info >= (3, 11): # See PR #8243, this was a RecursionError raised by Python, but is now caught on the Pydantic side error = errors.PydanticUserError else: error = TypeError with pytest.raises(error): class M(BaseModel): B: ForwardRef('B') = Field(default=None) # The solution: class A(BaseModel): B: ForwardRef('__types["B"]') = Field() # F821 assert A.model_fields['B'].annotation == ForwardRef('__types["B"]') # F821 A.model_rebuild(raise_errors=False) assert A.model_fields['B'].annotation == ForwardRef('__types["B"]') # F821 class B(BaseModel): pass class C(BaseModel): pass assert not A.__pydantic_complete__ types = {'B': B} A.model_rebuild(_types_namespace={'__types': types}) assert A.__pydantic_complete__ assert A(B=B()).B == B() with pytest.raises(ValidationError) as exc_info: A(B=C()) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'model_type', 'loc': ('B',), 'msg': 'Input should be a valid dictionary or instance of B', 'input': C(), 'ctx': {'class_name': 'B'}, } ] @pytest.mark.parametrize( ('sequence_type', 'input_data', 'expected_error_type', 'expected_error_msg', 'expected_error_ctx'), [ pytest.param(List[str], '1bc', 'list_type', 'Input should be a valid list', None, id='list[str]'), pytest.param( Sequence[str], '1bc', 'sequence_str', "'str' instances are not allowed as a Sequence value", {'type_name': 'str'}, id='Sequence[str]', ), pytest.param( Sequence[bytes], b'1bc', 'sequence_str', "'bytes' instances are not allowed as a Sequence value", {'type_name': 'bytes'}, id='Sequence[bytes]', ), ], ) def test_sequences_str(sequence_type, input_data, expected_error_type, expected_error_msg, expected_error_ctx): input_sequence = [input_data[:1], input_data[1:]] expected_error = { 'type': expected_error_type, 'input': input_data, 'loc': ('str_sequence',), 'msg': 
expected_error_msg, } if expected_error_ctx is not None: expected_error.update(ctx=expected_error_ctx) class Model(BaseModel): str_sequence: sequence_type assert Model(str_sequence=input_sequence).str_sequence == input_sequence with pytest.raises(ValidationError) as e: Model(str_sequence=input_data) assert e.value.errors(include_url=False) == [expected_error] def test_multiple_enums(): """See https://github.com/pydantic/pydantic/issues/6270""" class MyEnum(Enum): a = auto() class MyModel(TypedDict): a: Optional[MyEnum] b: Optional[MyEnum] TypeAdapter(MyModel) @pytest.mark.parametrize( ('literal_type', 'other_type', 'data', 'json_value', 'data_reversed', 'json_value_reversed'), [ (Literal[False], str, False, 'false', False, 'false'), (Literal[True], str, True, 'true', True, 'true'), (Literal[False], str, 'abc', '"abc"', 'abc', '"abc"'), (Literal[False], int, False, 'false', False, 'false'), (Literal[True], int, True, 'true', True, 'true'), (Literal[False], int, 42, '42', 42, '42'), ], ) def test_union_literal_with_other_type(literal_type, other_type, data, json_value, data_reversed, json_value_reversed): class Model(BaseModel): value: Union[literal_type, other_type] value_types_reversed: Union[other_type, literal_type] m = Model(value=data, value_types_reversed=data) assert m.model_dump() == {'value': data, 'value_types_reversed': data_reversed} assert m.model_dump_json() == f'{{"value":{json_value},"value_types_reversed":{json_value_reversed}}}' def test_model_repr_before_validation(): log = [] class MyModel(BaseModel): x: int def __init__(self, **kwargs): log.append(f'before={self!r}') super().__init__(**kwargs) log.append(f'after={self!r}') m = MyModel(x='10') assert m.x == 10 # insert_assert(log) assert log == ['before=MyModel()', 'after=MyModel(x=10)'] def test_custom_exception_handler(): from traceback import TracebackException from pydantic import BaseModel traceback_exceptions = [] class MyModel(BaseModel): name: str class CustomErrorCatcher: def 
__enter__(self): return None def __exit__(self, _exception_type, exception, exception_traceback): if exception is not None: traceback_exceptions.append( TracebackException( exc_type=type(exception), exc_value=exception, exc_traceback=exception_traceback, capture_locals=True, ) ) return True return False with CustomErrorCatcher(): data = {'age': 'John Doe'} MyModel(**data) assert len(traceback_exceptions) == 1 def test_recursive_walk_fails_on_double_diamond_composition(): class A(BaseModel): pass class B(BaseModel): a_1: A a_2: A class C(BaseModel): b: B class D(BaseModel): c_1: C c_2: C class E(BaseModel): c: C # This is just to check that above model contraption doesn't fail assert E(c=C(b=B(a_1=A(), a_2=A()))).model_dump() == {'c': {'b': {'a_1': {}, 'a_2': {}}}} def test_recursive_root_models_in_discriminated_union(): class Model1(BaseModel): kind: Literal['1'] = '1' two: Optional['Model2'] class Model2(BaseModel): kind: Literal['2'] = '2' one: Optional[Model1] class Root1(RootModel[Model1]): @property def kind(self): # Ensures discriminated union validation works even with model instances return self.root.kind class Root2(RootModel[Model2]): @property def kind(self): # Ensures discriminated union validation works even with model instances return self.root.kind class Outer(BaseModel): a: Annotated[Union[Root1, Root2], Field(discriminator='kind')] b: Annotated[Union[Root1, Root2], Field(discriminator='kind')] validated = Outer.model_validate({'a': {'kind': '1', 'two': None}, 'b': {'kind': '2', 'one': None}}) assert validated == Outer(a=Root1(root=Model1(two=None)), b=Root2(root=Model2(one=None))) # insert_assert(Outer.model_json_schema()) assert Outer.model_json_schema() == { '$defs': { 'Model1': { 'properties': { 'kind': {'const': '1', 'default': '1', 'title': 'Kind', 'type': 'string'}, 'two': {'anyOf': [{'$ref': '#/$defs/Model2'}, {'type': 'null'}]}, }, 'required': ['two'], 'title': 'Model1', 'type': 'object', }, 'Model2': { 'properties': { 'kind': {'const': 
'2', 'default': '2', 'title': 'Kind', 'type': 'string'}, 'one': {'anyOf': [{'$ref': '#/$defs/Model1'}, {'type': 'null'}]}, }, 'required': ['one'], 'title': 'Model2', 'type': 'object', }, 'Root1': {'$ref': '#/$defs/Model1', 'title': 'Root1'}, 'Root2': {'$ref': '#/$defs/Model2', 'title': 'Root2'}, }, 'properties': { 'a': { 'discriminator': {'mapping': {'1': '#/$defs/Root1', '2': '#/$defs/Root2'}, 'propertyName': 'kind'}, 'oneOf': [{'$ref': '#/$defs/Root1'}, {'$ref': '#/$defs/Root2'}], 'title': 'A', }, 'b': { 'discriminator': {'mapping': {'1': '#/$defs/Root1', '2': '#/$defs/Root2'}, 'propertyName': 'kind'}, 'oneOf': [{'$ref': '#/$defs/Root1'}, {'$ref': '#/$defs/Root2'}], 'title': 'B', }, }, 'required': ['a', 'b'], 'title': 'Outer', 'type': 'object', } def test_eq_with_cached_property(): """ Test BaseModel.__eq__ compatibility with functools.cached_property See GH-7444: https://github.com/pydantic/pydantic/issues/7444 Previously, pydantic BaseModel.__eq__ compared the full __dict__ of model instances. This is not compatible with e.g. functools.cached_property, which caches the computed values in the instance's __dict__ """ class Model(BaseModel): attr: int @functools.cached_property def cached(self) -> int: return 0 obj1 = Model(attr=1) obj2 = Model(attr=1) # ensure the instances are indeed equal before __dict__ mutations assert obj1 == obj2 # This access to the cached_property has the side-effect of modifying obj1.__dict__ # See functools.cached_property documentation and source code obj1.cached # Ensure the objects still compare equals after caching a property assert obj1 == obj2 def test_model_metaclass_on_other_class() -> None: """Test that `ModelMetaclass` can be used as a metaclass on an unrelated class. This is done by some libraries to offer compatibility between Pydantic versions. 
""" class OtherClass(metaclass=ModelMetaclass): pass @pytest.mark.skipif(sys.version_info < (3, 12), reason='requires Python 3.12+') def test_nested_type_statement(): # https://docs.python.org/3/reference/simple_stmts.html#type globs = {} exec( """ from pydantic import BaseModel class A(BaseModel): type Int = int a: Int """, globs, ) A = globs['A'] assert A(a=1).a == 1 def test_method_descriptors_default() -> None: class SomeModel(BaseModel): @staticmethod def default_int_factory() -> int: ... int_factory: Callable[[], int] = Field(default=default_int_factory) assert SomeModel.model_fields['int_factory'].default is SomeModel.default_int_factory pydantic-2.10.6/tests/test_errors.py000066400000000000000000000020611474456633400175210ustar00rootroot00000000000000import re import pytest from pydantic import BaseModel, PydanticUserError, ValidationError from pydantic.version import version_short def test_user_error_url(): with pytest.raises(PydanticUserError) as exc_info: BaseModel() # insert_assert(str(exc_info.value)) assert str(exc_info.value) == ( 'Pydantic models should inherit from BaseModel, BaseModel cannot be instantiated directly\n\n' f'For further information visit https://errors.pydantic.dev/{version_short()}/u/base-model-instantiated' ) @pytest.mark.parametrize( 'hide_input,input_str', ((False, 'type=greater_than, input_value=4, input_type=int'), (True, 'type=greater_than')), ) def test_raise_validation_error_hide_input(hide_input, input_str): with pytest.raises(ValidationError, match=re.escape(f'Input should be greater than 5 [{input_str}]')): raise ValidationError.from_exception_data( 'Foobar', [{'type': 'greater_than', 'loc': ('a', 2), 'input': 4, 'ctx': {'gt': 5}}], hide_input=hide_input, ) pydantic-2.10.6/tests/test_exports.py000066400000000000000000000110001474456633400177020ustar00rootroot00000000000000import importlib import importlib.util import json import platform import sys from pathlib import Path from types import ModuleType import pytest 
import pydantic @pytest.mark.filterwarnings('ignore::DeprecationWarning') def test_init_export(): for name in dir(pydantic): getattr(pydantic, name) @pytest.mark.filterwarnings('ignore::DeprecationWarning') @pytest.mark.parametrize(('attr_name', 'value'), list(pydantic._dynamic_imports.items())) def test_public_api_dynamic_imports(attr_name, value): package, module_name = value if module_name == '__module__': module = importlib.import_module(attr_name, package=package) assert isinstance(module, ModuleType) else: imported_object = getattr(importlib.import_module(module_name, package=package), attr_name) assert isinstance(imported_object, object) @pytest.mark.skipif( platform.python_implementation() == 'PyPy' and platform.python_version_tuple() < ('3', '8'), reason='Produces a weird error on pypy<3.8', ) @pytest.mark.filterwarnings('ignore::DeprecationWarning') @pytest.mark.filterwarnings('ignore::pydantic.warnings.PydanticExperimentalWarning') def test_public_internal(): """ check we don't make anything from _internal public """ public_internal_attributes = [] def _test_file(file: Path, module_name: str): if file.name != '__init__.py' and not file.name.startswith('_'): module = sys.modules.get(module_name) if module is None: spec = importlib.util.spec_from_file_location(module_name, str(file)) module = importlib.util.module_from_spec(spec) sys.modules[module_name] = module try: spec.loader.exec_module(module) except ImportError: return for name, attr in vars(module).items(): if not name.startswith('_'): attr_module = getattr(attr, '__module__', '') if attr_module.startswith('pydantic._internal'): public_internal_attributes.append(f'{module.__name__}:{name} from {attr_module}') pydantic_files = (Path(__file__).parent.parent / 'pydantic').glob('*.py') experimental_files = (Path(__file__).parent.parent / 'pydantic' / 'experimental').glob('*.py') for file in pydantic_files: _test_file(file, f'pydantic.{file.stem}') for file in experimental_files: _test_file(file, 
f'pydantic.experimental.{file.stem}')

    if public_internal_attributes:
        pytest.fail('The following should not be publicly accessible:\n ' + '\n '.join(public_internal_attributes))


# language=Python
IMPORTED_PYDANTIC_CODE = """
import sys
import pydantic

modules = list(sys.modules.keys())

import json
print(json.dumps(modules))
"""


def test_import_pydantic(subprocess_run_code):
    output = subprocess_run_code(IMPORTED_PYDANTIC_CODE)
    imported_modules = json.loads(output)
    # debug(imported_modules)
    assert 'pydantic' in imported_modules
    assert 'pydantic.deprecated' not in imported_modules


# language=Python
IMPORTED_BASEMODEL_CODE = """
import sys
from pydantic import BaseModel

modules = list(sys.modules.keys())

import json
print(json.dumps(modules))
"""


def test_import_base_model(subprocess_run_code):
    output = subprocess_run_code(IMPORTED_BASEMODEL_CODE)
    imported_modules = json.loads(output)
    # debug(sorted(imported_modules))
    assert 'pydantic' in imported_modules
    assert 'pydantic.fields' not in imported_modules
    assert 'pydantic.types' not in imported_modules
    assert 'annotated_types' not in imported_modules


def test_dataclass_import(subprocess_run_code):
    @subprocess_run_code
    def run_in_subprocess():
        import pydantic

        assert pydantic.dataclasses.__name__ == 'pydantic.dataclasses'

        @pydantic.dataclasses.dataclass
        class Foo:
            a: int

        try:
            Foo('not an int')
        except ValueError:
            pass
        else:
            raise AssertionError('Should have raised a ValueError')


def test_dataclass_import2(subprocess_run_code):
    @subprocess_run_code
    def run_in_subprocess():
        import pydantic.dataclasses

        assert pydantic.dataclasses.__name__ == 'pydantic.dataclasses'

        @pydantic.dataclasses.dataclass
        class Foo:
            a: int

        try:
            Foo('not an int')
        except ValueError:
            pass
        else:
            raise AssertionError('Should have raised a ValueError')


# file: pydantic-2.10.6/tests/test_fastapi.sh
#!/usr/bin/env bash

set -x
set -e

# waiting on a fix for a bug introduced in v72.0.0, see https://github.com/pypa/setuptools/issues/4519
echo "PIP_CONSTRAINT=setuptools<72.0.0" >> $GITHUB_ENV

cd fastapi
git fetch --tags

pip install -r requirements.txt

# Install the version of pydantic from the current branch, not the released version used by fastapi
pip uninstall -y pydantic
cd .. && pip install . && cd fastapi

# ./scripts/test.sh accepts arbitrary arguments and passes them to the pytest call.
# This may be necessary if we make low-consequence changes to pydantic, such as minor changes to the details of a JSON
# schema or the contents of a ValidationError
#
# To skip a specific test, add '--deselect path/to/test.py::test_name' to the end of this command
#
# To update the list of deselected tests, remove all deselections, run the tests, and re-add any remaining failures

# Remove the first one once that test is fixed, see https://github.com/pydantic/pydantic/pull/10029
# The remaining tests are all failing because we now correctly add a `'deprecated': True` attribute to the JSON schema,
# so it's the FastAPI tests that need to be updated here
./scripts/test.sh -vv \
  --deselect tests/test_openapi_examples.py::test_openapi_schema \
  --deselect tests/test_tutorial/test_query_params_str_validations/test_tutorial010.py::test_openapi_schema \
  --deselect tests/test_tutorial/test_query_params_str_validations/test_tutorial010_an.py::test_openapi_schema \
  --deselect tests/test_tutorial/test_query_params_str_validations/test_tutorial010_an_py310.py::test_openapi_schema \
  --deselect tests/test_tutorial/test_query_params_str_validations/test_tutorial010_an_py39.py::test_openapi_schema \
  --deselect tests/test_tutorial/test_query_params_str_validations/test_tutorial010_py310.py::test_openapi_schema \


# file: pydantic-2.10.6/tests/test_fastapi_json_schema.py
"""
This file contains an initial proposal that can be scrapped and
reworked if/when appropriate. Either way, this test file should probably be removed once the actual FastAPI implementation is complete and has integration tests with pydantic v2. However, we are including it here for now to get an early warning if this approach would require modification for compatibility with any future changes to the JSON schema generation logic, etc. See the original PR for more details: https://github.com/pydantic/pydantic/pull/5094 """ from __future__ import annotations from dataclasses import dataclass from typing import Any from dirty_equals import HasRepr, IsInstance, IsStr from pydantic import BaseModel, ConfigDict from pydantic._internal._core_utils import CoreSchemaOrField from pydantic.errors import PydanticInvalidForJsonSchema from pydantic.json_schema import GenerateJsonSchema, JsonSchemaValue class _ErrorKey(str): pass class FastAPIGenerateJsonSchema(GenerateJsonSchema): """ Idea: This class would be exported from FastAPI, and if users want to modify the way JSON schema is generated in FastAPI, they should inherit from it and override it as appropriate. In the JSON schema generation logic, FastAPI _could_ also attempt to work with classes that inherit directly from GenerateJsonSchema by doing something like: if UserGenerateJsonSchema.handle_invalid_for_json_schema is GenerateJsonSchema.handle_invalid_for_json_schema: # The method has not been overridden; inherit from FastAPIGenerateJsonSchema UserGenerateJsonSchema = type( "UserGenerateJsonSchema", (FastAPIGenerateJsonSchema, UserGenerateJsonSchema), {} ) else: raise TypeError(f"{UserGenerateJsonSchema.__name__} should inherit from FastAPIGenerateJsonSchema") I'm not sure which approach is better. 
""" def handle_invalid_for_json_schema(self, schema: CoreSchemaOrField, error_info: str) -> JsonSchemaValue: if schema.get('metadata', {}).get('pydantic_js_modify_function') is not None: # Since there is a json schema modify function, assume that this type is meant to be handled, # and the modify function will set all properties as appropriate return {} else: error = PydanticInvalidForJsonSchema(f'Cannot generate a JsonSchema for {error_info}') return {_ErrorKey('error'): error} @dataclass class ErrorDetails: path: list[Any] error: PydanticInvalidForJsonSchema def collect_errors(schema: JsonSchemaValue) -> list[ErrorDetails]: errors: list[ErrorDetails] = [] def _collect_errors(schema: JsonSchemaValue, path: list[Any]) -> None: if isinstance(schema, dict): for k, v in schema.items(): if isinstance(k, _ErrorKey): errors.append(ErrorDetails(path, schema[k])) _collect_errors(v, list(path) + [k]) elif isinstance(schema, list): for i, v in enumerate(schema): _collect_errors(v, list(path) + [i]) _collect_errors(schema, []) return errors def test_inheritance_detection() -> None: class GenerateJsonSchema2(GenerateJsonSchema): pass assert GenerateJsonSchema2.handle_invalid_for_json_schema is GenerateJsonSchema.handle_invalid_for_json_schema # this is just a quick proof of the note above indicating that you can detect whether a specific method # is overridden, for the purpose of allowing direct inheritance from GenerateJsonSchema. 
    assert (
        FastAPIGenerateJsonSchema.handle_invalid_for_json_schema
        is not GenerateJsonSchema.handle_invalid_for_json_schema
    )


def test_collect_errors() -> None:
    class Car:
        def __init__(self, make: str, model: str, year: int):
            self.make = make
            self.model = model
            self.year = year

    class Model(BaseModel):
        f1: int = 1
        f2: Car

        model_config = ConfigDict(arbitrary_types_allowed=True)

    schema = Model.model_json_schema(schema_generator=FastAPIGenerateJsonSchema)
    assert schema == {
        'title': 'Model',
        'type': 'object',
        'properties': {
            'f1': {'type': 'integer', 'default': 1, 'title': 'F1'},
            'f2': {
                'error': HasRepr(IsStr(regex=r'PydanticInvalidForJsonSchema\(.*\)')),
                'title': 'F2',
            },
        },
        'required': ['f2'],
    }

    collected_errors = collect_errors(schema)
    assert collected_errors == [
        ErrorDetails(
            path=['properties', 'f2'],
            error=IsInstance(PydanticInvalidForJsonSchema),
        )
    ]


# file: pydantic-2.10.6/tests/test_fields.py
from typing import Union

import pytest

import pydantic.dataclasses
from pydantic import BaseModel, ConfigDict, Field, PydanticUserError, RootModel, ValidationError, computed_field, fields


def test_field_info_annotation_keyword_argument():
    """This tests that `FieldInfo.from_field` raises an error if passed the `annotation` kwarg.

    At the time of writing this test there is no way `FieldInfo.from_field` could receive
    the `annotation` kwarg from anywhere inside Pydantic code. However, it is possible that
    this API is still in use by applications and third-party tools.
    """
    with pytest.raises(TypeError) as e:
        fields.FieldInfo.from_field(annotation=())
    assert e.value.args == ('"annotation" is not permitted as a Field keyword argument',)


def test_field_info_annotated_attribute_name_clashing():
    """This tests that `FieldInfo.from_annotated_attribute` will raise a `PydanticUserError`
    if an attribute name clashes with a type.
""" with pytest.raises(PydanticUserError): class SubModel(BaseModel): a: int = 1 class Model(BaseModel): SubModel: SubModel = Field() def test_init_var_field(): @pydantic.dataclasses.dataclass class Foo: bar: str baz: str = Field(init_var=True) class Model(BaseModel): foo: Foo model = Model(foo=Foo('bar', baz='baz')) assert 'bar' in model.foo.__pydantic_fields__ assert 'baz' not in model.foo.__pydantic_fields__ def test_root_model_arbitrary_field_name_error(): with pytest.raises( NameError, match="Unexpected field with name 'a_field'; only 'root' is allowed as a field of a `RootModel`" ): class Model(RootModel[int]): a_field: str def test_root_model_arbitrary_private_field_works(): class Model(RootModel[int]): _a_field: str = 'value 1' m = Model(1) assert m._a_field == 'value 1' m._a_field = 'value 2' assert m._a_field == 'value 2' def test_root_model_field_override(): # Weird as this is, I think it's probably best to allow it to ensure it is possible to override # the annotation in subclasses of RootModel subclasses. Basically, I think retaining the flexibility # is worth the increased potential for weird/confusing "accidental" overrides. 
# I'm mostly including this test now to document the behavior class Model(RootModel[int]): root: str assert Model.model_validate('abc').root == 'abc' with pytest.raises(ValidationError) as exc_info: Model.model_validate(1) assert exc_info.value.errors(include_url=False) == [ {'input': 1, 'loc': (), 'msg': 'Input should be a valid string', 'type': 'string_type'} ] class SubModel(Model): root: float with pytest.raises(ValidationError) as exc_info: SubModel.model_validate('abc') assert exc_info.value.errors(include_url=False) == [ { 'input': 'abc', 'loc': (), 'msg': 'Input should be a valid number, unable to parse string as a number', 'type': 'float_parsing', } ] validated = SubModel.model_validate_json('1').root assert validated == 1.0 assert isinstance(validated, float) def test_frozen_field_repr(): class Model(BaseModel): non_frozen_field: int = Field(frozen=False) frozen_field: int = Field(frozen=True) assert repr(Model.model_fields['non_frozen_field']) == 'FieldInfo(annotation=int, required=True)' assert repr(Model.model_fields['frozen_field']) == 'FieldInfo(annotation=int, required=True, frozen=True)' def test_model_field_default_info(): """Test that __repr_args__ of FieldInfo includes the default value when it's set to None.""" class Model(BaseModel): a: Union[int, None] = Field(default=None) b: Union[int, None] = None assert str(Model.model_fields) == ( "{'a': FieldInfo(annotation=Union[int, NoneType], required=False, default=None), " "'b': FieldInfo(annotation=Union[int, NoneType], required=False, default=None)}" ) def test_computed_field_raises_correct_attribute_error(): class Model(BaseModel): model_config = ConfigDict(extra='allow') @computed_field def comp_field(self) -> str: raise AttributeError('Computed field attribute error') @property def prop_field(self): raise AttributeError('Property attribute error') with pytest.raises(AttributeError, match='Computed field attribute error'): Model().comp_field with pytest.raises(AttributeError, match='Property 
attribute error'): Model().prop_field with pytest.raises(AttributeError, match=f"'{Model.__name__}' object has no attribute 'invalid_field'"): Model().invalid_field @pytest.mark.parametrize('number', (1, 42, 443, 11.11, 0.553)) def test_coerce_numbers_to_str_field_option(number): class Model(BaseModel): field: str = Field(coerce_numbers_to_str=True, max_length=10) assert Model(field=number).field == str(number) @pytest.mark.parametrize('number', (1, 42, 443, 11.11, 0.553)) def test_coerce_numbers_to_str_field_precedence(number): class Model(BaseModel): model_config = ConfigDict(coerce_numbers_to_str=True) field: str = Field(coerce_numbers_to_str=False) with pytest.raises(ValidationError): Model(field=number) class Model(BaseModel): model_config = ConfigDict(coerce_numbers_to_str=False) field: str = Field(coerce_numbers_to_str=True) assert Model(field=number).field == str(number) pydantic-2.10.6/tests/test_forward_ref.py000066400000000000000000001077321474456633400205200ustar00rootroot00000000000000import dataclasses import re import sys import typing from typing import Any, Optional, Tuple import pytest from pydantic import BaseModel, PydanticUserError, ValidationError def test_postponed_annotations(create_module): module = create_module( # language=Python """ from __future__ import annotations from pydantic import BaseModel class Model(BaseModel): a: int """ ) m = module.Model(a='123') assert m.model_dump() == {'a': 123} def test_postponed_annotations_auto_model_rebuild(create_module): module = create_module( # language=Python """ from __future__ import annotations from pydantic import BaseModel class Model(BaseModel): a: Model """ ) assert module.Model.model_fields['a'].annotation.__name__ == 'Model' def test_forward_ref_auto_update_no_model(create_module): @create_module def module(): from typing import Optional import pytest from pydantic import BaseModel, PydanticUserError class Foo(BaseModel): a: Optional['Bar'] = None with pytest.raises(PydanticUserError, 
                match='`Foo` is not fully defined; you should define `Bar`,'):
            Foo(a={'b': {'a': {}}})

        class Bar(BaseModel):
            b: 'Foo'

    assert module.Bar.__pydantic_complete__ is True
    assert repr(module.Bar.model_fields['b']) == 'FieldInfo(annotation=Foo, required=True)'
    # Bar should be complete and ready to use
    b = module.Bar(b={'a': {'b': {}}})
    assert b.model_dump() == {'b': {'a': {'b': {'a': None}}}}

    # model_fields is complete on Foo
    assert repr(module.Foo.model_fields['a']) == (
        'FieldInfo(annotation=Union[Bar, NoneType], required=False, default=None)'
    )

    assert module.Foo.__pydantic_complete__ is False
    # Foo gets auto-rebuilt during the first attempt at validation
    f = module.Foo(a={'b': {'a': {'b': {'a': None}}}})
    assert module.Foo.__pydantic_complete__ is True
    assert f.model_dump() == {'a': {'b': {'a': {'b': {'a': None}}}}}


def test_forward_ref_one_of_fields_not_defined(create_module):
    @create_module
    def module():
        from pydantic import BaseModel

        class Foo(BaseModel):
            foo: 'Foo'
            bar: 'Bar'

    assert {k: repr(v) for k, v in module.Foo.model_fields.items()} == {
        'foo': 'FieldInfo(annotation=Foo, required=True)',
        'bar': "FieldInfo(annotation=ForwardRef('Bar'), required=True)",
    }


def test_basic_forward_ref(create_module):
    @create_module
    def module():
        from typing import ForwardRef, Optional

        from pydantic import BaseModel

        class Foo(BaseModel):
            a: int

        FooRef = ForwardRef('Foo')

        class Bar(BaseModel):
            b: Optional[FooRef] = None

    assert module.Bar().model_dump() == {'b': None}
    assert module.Bar(b={'a': '123'}).model_dump() == {'b': {'a': 123}}


def test_self_forward_ref_module(create_module):
    @create_module
    def module():
        from typing import ForwardRef, Optional

        from pydantic import BaseModel

        FooRef = ForwardRef('Foo')

        class Foo(BaseModel):
            a: int = 123
            b: Optional[FooRef] = None

    assert module.Foo().model_dump() == {'a': 123, 'b': None}
    assert module.Foo(b={'a': '321'}).model_dump() == {'a': 123, 'b': {'a': 321, 'b': None}}


def test_self_forward_ref_collection(create_module):
    @create_module
    def module():
        from typing import Dict, List

        from pydantic import BaseModel

        class Foo(BaseModel):
            a: int = 123
            b: 'Foo' = None
            c: 'List[Foo]' = []
            d: 'Dict[str, Foo]' = {}

    assert module.Foo().model_dump() == {'a': 123, 'b': None, 'c': [], 'd': {}}
    assert module.Foo(b={'a': '321'}, c=[{'a': 234}], d={'bar': {'a': 345}}).model_dump() == {
        'a': 123,
        'b': {'a': 321, 'b': None, 'c': [], 'd': {}},
        'c': [{'a': 234, 'b': None, 'c': [], 'd': {}}],
        'd': {'bar': {'a': 345, 'b': None, 'c': [], 'd': {}}},
    }

    with pytest.raises(ValidationError) as exc_info:
        module.Foo(b={'a': '321'}, c=[{'b': 234}], d={'bar': {'a': 345}})
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'model_type',
            'loc': ('c', 0, 'b'),
            'msg': 'Input should be a valid dictionary or instance of Foo',
            'input': 234,
            'ctx': {'class_name': 'Foo'},
        }
    ]

    assert repr(module.Foo.model_fields['a']) == 'FieldInfo(annotation=int, required=False, default=123)'
    assert repr(module.Foo.model_fields['b']) == 'FieldInfo(annotation=Foo, required=False, default=None)'
    if sys.version_info < (3, 10):
        return
    assert repr(module.Foo.model_fields['c']) == ('FieldInfo(annotation=List[Foo], required=False, ' 'default=[])')
    assert repr(module.Foo.model_fields['d']) == ('FieldInfo(annotation=Dict[str, Foo], required=False, default={})')


def test_self_forward_ref_local(create_module):
    @create_module
    def module():
        from typing import ForwardRef

        from pydantic import BaseModel

        def main():
            Foo = ForwardRef('Foo')

            class Foo(BaseModel):
                a: int = 123
                b: Foo = None

            return Foo

    Foo = module.main()
    assert Foo().model_dump() == {'a': 123, 'b': None}
    assert Foo(b={'a': '321'}).model_dump() == {'a': 123, 'b': {'a': 321, 'b': None}}


def test_forward_ref_dataclass(create_module):
    @create_module
    def module():
        from typing import Optional

        from pydantic.dataclasses import dataclass

        @dataclass
        class MyDataclass:
            a: int
            b: Optional['MyDataclass'] = None

    dc = module.MyDataclass(a=1, b={'a': 2, 'b': {'a': 3}})
    assert dataclasses.asdict(dc) == {'a': 1, 'b': {'a': 2, 'b': {'a': 3, 'b': None}}}


def test_forward_ref_sub_types(create_module):
    @create_module
    def module():
        from typing import ForwardRef, Union

        from pydantic import BaseModel

        class Leaf(BaseModel):
            a: str

        TreeType = Union[ForwardRef('Node'), Leaf]

        class Node(BaseModel):
            value: int
            left: TreeType
            right: TreeType

    Node = module.Node
    Leaf = module.Leaf

    data = {'value': 3, 'left': {'a': 'foo'}, 'right': {'value': 5, 'left': {'a': 'bar'}, 'right': {'a': 'buzz'}}}
    node = Node(**data)
    assert isinstance(node.left, Leaf)
    assert isinstance(node.right, Node)


def test_forward_ref_nested_sub_types(create_module):
    @create_module
    def module():
        from typing import ForwardRef, Tuple, Union

        from pydantic import BaseModel

        class Leaf(BaseModel):
            a: str

        TreeType = Union[Union[Tuple[ForwardRef('Node'), str], int], Leaf]

        class Node(BaseModel):
            value: int
            left: TreeType
            right: TreeType

    Node = module.Node
    Leaf = module.Leaf

    data = {
        'value': 3,
        'left': {'a': 'foo'},
        'right': [{'value': 5, 'left': {'a': 'bar'}, 'right': {'a': 'buzz'}}, 'test'],
    }
    node = Node(**data)
    assert isinstance(node.left, Leaf)
    assert isinstance(node.right[0], Node)


def test_self_reference_json_schema(create_module):
    @create_module
    def module():
        from typing import List

        from pydantic import BaseModel

        class Account(BaseModel):
            name: str
            subaccounts: List['Account'] = []

    Account = module.Account
    assert Account.model_json_schema() == {
        '$ref': '#/$defs/Account',
        '$defs': {
            'Account': {
                'title': 'Account',
                'type': 'object',
                'properties': {
                    'name': {'title': 'Name', 'type': 'string'},
                    'subaccounts': {
                        'title': 'Subaccounts',
                        'default': [],
                        'type': 'array',
                        'items': {'$ref': '#/$defs/Account'},
                    },
                },
                'required': ['name'],
            }
        },
    }


def test_self_reference_json_schema_with_future_annotations(create_module):
    module = create_module(
        # language=Python
        """
from __future__ import annotations
from pydantic import BaseModel

class Account(BaseModel):
    name: str
    subaccounts: list[Account] = []
"""
    )
    Account = module.Account
    assert Account.model_json_schema() == {
        '$ref': '#/$defs/Account',
        '$defs': {
            'Account': {
                'title': 'Account',
                'type': 'object',
                'properties': {
                    'name': {'title': 'Name', 'type': 'string'},
                    'subaccounts': {
                        'title': 'Subaccounts',
                        'default': [],
                        'type': 'array',
                        'items': {'$ref': '#/$defs/Account'},
                    },
                },
                'required': ['name'],
            }
        },
    }


def test_circular_reference_json_schema(create_module):
    @create_module
    def module():
        from typing import List

        from pydantic import BaseModel

        class Owner(BaseModel):
            account: 'Account'

        class Account(BaseModel):
            name: str
            owner: 'Owner'
            subaccounts: List['Account'] = []

    Account = module.Account
    assert Account.model_json_schema() == {
        '$ref': '#/$defs/Account',
        '$defs': {
            'Account': {
                'title': 'Account',
                'type': 'object',
                'properties': {
                    'name': {'title': 'Name', 'type': 'string'},
                    'owner': {'$ref': '#/$defs/Owner'},
                    'subaccounts': {
                        'title': 'Subaccounts',
                        'default': [],
                        'type': 'array',
                        'items': {'$ref': '#/$defs/Account'},
                    },
                },
                'required': ['name', 'owner'],
            },
            'Owner': {
                'title': 'Owner',
                'type': 'object',
                'properties': {'account': {'$ref': '#/$defs/Account'}},
                'required': ['account'],
            },
        },
    }


def test_circular_reference_json_schema_with_future_annotations(create_module):
    module = create_module(
        # language=Python
        """
from __future__ import annotations
from pydantic import BaseModel

class Owner(BaseModel):
    account: Account

class Account(BaseModel):
    name: str
    owner: Owner
    subaccounts: list[Account] = []
"""
    )
    Account = module.Account
    assert Account.model_json_schema() == {
        '$ref': '#/$defs/Account',
        '$defs': {
            'Account': {
                'title': 'Account',
                'type': 'object',
                'properties': {
                    'name': {'title': 'Name', 'type': 'string'},
                    'owner': {'$ref': '#/$defs/Owner'},
                    'subaccounts': {
                        'title': 'Subaccounts',
                        'default': [],
                        'type': 'array',
                        'items': {'$ref': '#/$defs/Account'},
                    },
                },
                'required': ['name', 'owner'],
            },
            'Owner': {
                'title': 'Owner',
                'type': 'object',
                'properties': {'account': {'$ref': '#/$defs/Account'}},
                'required': ['account'],
            },
        },
    }


def test_forward_ref_with_field(create_module):
    @create_module
    def module():
        import re
        from typing import ForwardRef, List

        import pytest

        from pydantic import BaseModel, Field

        Foo = ForwardRef('Foo')

        class Foo(BaseModel):
            c: List[Foo] = Field(gt=0)

        with pytest.raises(TypeError, match=re.escape("Unable to apply constraint 'gt' to supplied value []")):
            Foo(c=[Foo(c=[])])


def test_forward_ref_optional(create_module):
    module = create_module(
        # language=Python
        """
from __future__ import annotations
from pydantic import BaseModel, Field


class Spec(BaseModel):
    spec_fields: list[str] = Field(alias="fields")
    filter: str | None = None
    sort: str | None


class PSpec(Spec):
    g: GSpec | None = None


class GSpec(Spec):
    p: PSpec | None

# PSpec.model_rebuild()


class Filter(BaseModel):
    g: GSpec | None = None
    p: PSpec | None
"""
    )
    Filter = module.Filter
    assert isinstance(Filter(p={'sort': 'some_field:asc', 'fields': []}), Filter)


def test_forward_ref_with_create_model(create_module):
    @create_module
    def module():
        import pydantic

        Sub = pydantic.create_model('Sub', foo=(str, 'bar'), __module__=__name__)
        assert Sub  # get rid of "local variable 'Sub' is assigned to but never used"
        Main = pydantic.create_model('Main', sub=('Sub', ...), __module__=__name__)
        instance = Main(sub={})
        assert instance.sub.model_dump() == {'foo': 'bar'}


def test_resolve_forward_ref_dataclass(create_module):
    module = create_module(
        # language=Python
        """
from __future__ import annotations

from dataclasses import dataclass

from pydantic import BaseModel
from typing_extensions import Literal


@dataclass
class Base:
    literal: Literal[1, 2]


class What(BaseModel):
    base: Base
"""
    )
    m = module.What(base=module.Base(literal=1))
    assert m.base.literal == 1


def test_nested_forward_ref():
    class NestedTuple(BaseModel):
        x: Tuple[int, Optional['NestedTuple']]

    obj = NestedTuple.model_validate({'x': ('1', {'x': ('2', {'x': ('3', None)})})})
    assert obj.model_dump() == {'x': (1, {'x': (2, {'x': (3, None)})})}


def test_discriminated_union_forward_ref(create_module):
    @create_module
    def module():
        from typing import Union

        from typing_extensions import Literal

        from pydantic import BaseModel, Field

        class Pet(BaseModel):
            pet: Union['Cat', 'Dog'] = Field(discriminator='type')

        class Cat(BaseModel):
            type: Literal['cat']

        class Dog(BaseModel):
            type: Literal['dog']

    assert module.Pet.__pydantic_complete__ is False

    with pytest.raises(
        ValidationError,
        match="Input tag 'pika' found using 'type' does not match any of the expected tags: 'cat', 'dog'",
    ):
        module.Pet.model_validate({'pet': {'type': 'pika'}})

    # Ensure the rebuild has happened automatically despite validation failure
    assert module.Pet.__pydantic_complete__ is True

    # insert_assert(module.Pet.model_json_schema())
    assert module.Pet.model_json_schema() == {
        'title': 'Pet',
        'required': ['pet'],
        'type': 'object',
        'properties': {
            'pet': {
                'title': 'Pet',
                'discriminator': {'mapping': {'cat': '#/$defs/Cat', 'dog': '#/$defs/Dog'}, 'propertyName': 'type'},
                'oneOf': [{'$ref': '#/$defs/Cat'}, {'$ref': '#/$defs/Dog'}],
            }
        },
        '$defs': {
            'Cat': {
                'title': 'Cat',
                'type': 'object',
                'properties': {'type': {'const': 'cat', 'title': 'Type', 'type': 'string'}},
                'required': ['type'],
            },
            'Dog': {
                'title': 'Dog',
                'type': 'object',
                'properties': {'type': {'const': 'dog', 'title': 'Type', 'type': 'string'}},
                'required': ['type'],
            },
        },
    }


def test_class_var_as_string(create_module):
    module = create_module(
        # language=Python
        """
from __future__ import annotations
from typing import ClassVar, ClassVar as CV
from typing_extensions import Annotated
from pydantic import BaseModel

class Model(BaseModel):
    a: ClassVar[int]
    _b: ClassVar[int]
    _c: ClassVar[Forward]
    _d: Annotated[ClassVar[int], ...]
    _e: CV[int]
    _f: Annotated[CV[int], ...]
    # Doesn't work as of today:
    # _g: CV[Forward]

Forward = int
"""
    )

    assert module.Model.__class_vars__ == {'a', '_b', '_c', '_d', '_e', '_f'}
    assert module.Model.__private_attributes__ == {}


def test_private_attr_annotation_not_evaluated() -> None:
    class Model(BaseModel):
        _a: 'UnknownAnnotation'

    assert '_a' in Model.__private_attributes__


def test_json_encoder_str(create_module):
    module = create_module(
        # language=Python
        """
from pydantic import BaseModel, ConfigDict, field_serializer


class User(BaseModel):
    x: str


FooUser = User


class User(BaseModel):
    y: str


class Model(BaseModel):
    foo_user: FooUser
    user: User

    @field_serializer('user')
    def serialize_user(self, v):
        return f'User({v.y})'
"""
    )

    m = module.Model(foo_user={'x': 'user1'}, user={'y': 'user2'})
    # TODO: How can we replicate this custom-encoder functionality without affecting the serialization of `User`?
    assert m.model_dump_json() == '{"foo_user":{"x":"user1"},"user":"User(user2)"}'


skip_pep585 = pytest.mark.skipif(
    sys.version_info < (3, 9), reason='PEP585 generics only supported for python 3.9 and above'
)


@skip_pep585
def test_pep585_self_referencing_generics(create_module):
    module = create_module(
        # language=Python
        """
from __future__ import annotations
from pydantic import BaseModel

class SelfReferencing(BaseModel):
    names: list[SelfReferencing]  # noqa: F821
"""
    )

    SelfReferencing = module.SelfReferencing
    if sys.version_info >= (3, 10):
        assert (
            repr(SelfReferencing.model_fields['names']) == 'FieldInfo(annotation=list[SelfReferencing], required=True)'
        )

    # test that object creation works
    obj = SelfReferencing(names=[SelfReferencing(names=[])])
    assert obj.names == [SelfReferencing(names=[])]


@skip_pep585
def test_pep585_recursive_generics(create_module):
    @create_module
    def module():
        from typing import ForwardRef

        from pydantic import BaseModel

        HeroRef = ForwardRef('Hero')

        class Team(BaseModel):
            name: str
            heroes: list[HeroRef]

        class Hero(BaseModel):
            name: str
            teams: list[Team]

        Team.model_rebuild()

    assert repr(module.Team.model_fields['heroes']) == 'FieldInfo(annotation=list[Hero], required=True)'
    assert repr(module.Hero.model_fields['teams']) == 'FieldInfo(annotation=list[Team], required=True)'

    h = module.Hero(name='Ivan', teams=[module.Team(name='TheBest', heroes=[])])
    # insert_assert(h.model_dump())
    assert h.model_dump() == {'name': 'Ivan', 'teams': [{'name': 'TheBest', 'heroes': []}]}


def test_class_var_forward_ref(create_module):
    # see #3679
    create_module(
        # language=Python
        """
from __future__ import annotations
from typing import ClassVar
from pydantic import BaseModel

class WithClassVar(BaseModel):
    Instances: ClassVar[dict[str, WithClassVar]] = {}
"""
    )


def test_recursive_model(create_module):
    module = create_module(
        # language=Python
        """
from __future__ import annotations
from typing import Optional
from pydantic import BaseModel

class Foobar(BaseModel):
    x: int
    y: Optional[Foobar] = None
"""
    )
    f = module.Foobar(x=1, y={'x': 2})
    assert f.model_dump() == {'x': 1, 'y': {'x': 2, 'y': None}}
    assert f.model_fields_set == {'x', 'y'}
    assert f.y.model_fields_set == {'x'}


@pytest.mark.skipif(sys.version_info < (3, 10), reason='needs 3.10 or newer')
def test_recursive_models_union(create_module):
    # This test should pass because PydanticRecursiveRef.__or__ is implemented,
    # not because `eval_type_backport` magically makes `|` work,
    # since it's installed for tests but otherwise optional.
    sys.modules['eval_type_backport'] = None  # type: ignore
    try:
        module = create_module(
            # language=Python
            """
from __future__ import annotations

from pydantic import BaseModel
from typing import TypeVar, Generic

T = TypeVar("T")

class Foo(BaseModel):
    bar: Bar[str] | None = None
    bar2: int | Bar[float]

class Bar(BaseModel, Generic[T]):
    foo: Foo
"""
        )
    finally:
        del sys.modules['eval_type_backport']

    assert module.Foo.model_fields['bar'].annotation == typing.Optional[module.Bar[str]]
    assert module.Foo.model_fields['bar2'].annotation == typing.Union[int, module.Bar[float]]
    assert module.Bar.model_fields['foo'].annotation == module.Foo


def test_recursive_models_union_backport(create_module):
    module = create_module(
        # language=Python
        """
from __future__ import annotations

from pydantic import BaseModel
from typing import TypeVar, Generic

T = TypeVar("T")

class Foo(BaseModel):
    bar: Bar[str] | None = None
    # The `int | str` here differs from the previous test and requires the backport.
    # At the same time, `PydanticRecursiveRef.__or__` means that the second `|` works normally,
    # which actually triggered a bug in the backport that needed fixing.
    bar2: int | str | Bar[float]

class Bar(BaseModel, Generic[T]):
    foo: Foo
"""
    )

    assert module.Foo.model_fields['bar'].annotation == typing.Optional[module.Bar[str]]
    assert module.Foo.model_fields['bar2'].annotation == typing.Union[int, str, module.Bar[float]]
    assert module.Bar.model_fields['foo'].annotation == module.Foo


def test_force_rebuild():
    class Foobar(BaseModel):
        b: int

    assert Foobar.__pydantic_complete__ is True
    assert Foobar.model_rebuild() is None
    assert Foobar.model_rebuild(force=True) is True


def test_rebuild_subclass_of_built_model():
    class Model(BaseModel):
        x: int

    class FutureReferencingModel(Model):
        y: 'FutureModel'

    class FutureModel(BaseModel):
        pass

    FutureReferencingModel.model_rebuild()

    assert FutureReferencingModel(x=1, y=FutureModel()).model_dump() == {'x': 1, 'y': {}}


def test_nested_annotation(create_module):
    module = create_module(
        # language=Python
        """
from __future__ import annotations
from pydantic import BaseModel

def nested():
    class Foo(BaseModel):
        a: int

    class Bar(BaseModel):
        b: Foo

    return Bar
"""
    )

    bar_model = module.nested()
    assert bar_model.__pydantic_complete__ is True
    assert bar_model(b={'a': 1}).model_dump() == {'b': {'a': 1}}


def test_nested_more_annotation(create_module):
    @create_module
    def module():
        from pydantic import BaseModel

        def nested():
            class Foo(BaseModel):
                a: int

            def more_nested():
                class Bar(BaseModel):
                    b: 'Foo'

                return Bar

            return more_nested()

    bar_model = module.nested()
    # this does not work because Foo is in a parent scope
    assert bar_model.__pydantic_complete__ is False


def test_nested_annotation_priority(create_module):
    @create_module
    def module():
        from annotated_types import Gt
        from typing_extensions import Annotated

        from pydantic import BaseModel

        Foobar = Annotated[int, Gt(0)]  # noqa: F841

        def nested():
            Foobar = Annotated[int, Gt(10)]

            class Bar(BaseModel):
                b: 'Foobar'

            return Bar

    bar_model = module.nested()
    assert bar_model.model_fields['b'].metadata[0].gt == 10
    assert bar_model(b=11).model_dump() == {'b': 11}
    with pytest.raises(ValidationError, match=r'Input should be greater than 10 \[type=greater_than,'):
        bar_model(b=1)


def test_nested_model_rebuild(create_module):
    @create_module
    def module():
        from pydantic import BaseModel

        def nested():
            class Bar(BaseModel):
                b: 'Foo'

            class Foo(BaseModel):
                a: int

            assert Bar.__pydantic_complete__ is False
            Bar.model_rebuild()
            return Bar

    bar_model = module.nested()
    assert bar_model.__pydantic_complete__ is True
    assert bar_model(b={'a': 1}).model_dump() == {'b': {'a': 1}}


def pytest_raises_user_error_for_undefined_type(defining_class_name, missing_type_name):
    """
    Returns a `pytest.raises` context manager that checks the error message when an undefined type is present.

    usage:
    `with pytest_raises_user_error_for_undefined_type(class_name='Foobar', missing_class_name='UndefinedType'):`
    """
    return pytest.raises(
        PydanticUserError,
        match=re.escape(
            f'`{defining_class_name}` is not fully defined; you should define `{missing_type_name}`, then call'
            f' `{defining_class_name}.model_rebuild()`.'
        ),
    )


# NOTE: the `undefined_types_warning` tests below are "statically parameterized" (i.e. have Duplicate Code).
# The initial attempt to refactor them into a single parameterized test was not straightforward due to the use of the
# `create_module` fixture and the requirement that `from __future__ import annotations` be the first line in a module.
#
# Test Parameters:
#   1. config setting: (1a) default behavior vs (1b) overriding with Config setting:
#   2. type checking approach: (2a) `from __future__ import annotations` vs (2b) `ForwardRef`
#
# The parameter tags "1a", "1b", "2a", and "2b" are used in the test names below, to indicate which combination
# of conditions the test is covering.


def test_undefined_types_warning_1a_raised_by_default_2a_future_annotations(create_module):
    with pytest_raises_user_error_for_undefined_type(defining_class_name='Foobar', missing_type_name='UndefinedType'):
        create_module(
            # language=Python
            """
from __future__ import annotations
from pydantic import BaseModel

class Foobar(BaseModel):
    a: UndefinedType

# Trigger the error for an undefined type:
Foobar(a=1)
"""
        )


def test_undefined_types_warning_1a_raised_by_default_2b_forward_ref(create_module):
    with pytest_raises_user_error_for_undefined_type(defining_class_name='Foobar', missing_type_name='UndefinedType'):

        @create_module
        def module():
            from typing import ForwardRef

            from pydantic import BaseModel

            UndefinedType = ForwardRef('UndefinedType')

            class Foobar(BaseModel):
                a: UndefinedType

            # Trigger the error for an undefined type
            Foobar(a=1)


def test_undefined_types_warning_1b_suppressed_via_config_2a_future_annotations(create_module):
    module = create_module(
        # language=Python
        """
from __future__ import annotations
from pydantic import BaseModel

# Because we don't instantiate the type, no error for an undefined type is raised
class Foobar(BaseModel):
    a: UndefinedType
"""
    )
    # Since we're testing the absence of an error, it's important to confirm pydantic was actually run.
    # The presence of the `__pydantic_complete__` is a good indicator of this.
    assert module.Foobar.__pydantic_complete__ is False


def test_undefined_types_warning_1b_suppressed_via_config_2b_forward_ref(create_module):
    @create_module
    def module():
        from typing import ForwardRef

        from pydantic import BaseModel

        UndefinedType = ForwardRef('UndefinedType')

        # Because we don't instantiate the type, no error for an undefined type is raised
        class Foobar(BaseModel):
            a: UndefinedType

    # Since we're testing the absence of a warning, it's important to confirm pydantic was actually run.
    # The presence of the `__pydantic_complete__` is a good indicator of this.
    assert module.Foobar.__pydantic_complete__ is False


def test_undefined_types_warning_raised_by_usage(create_module):
    with pytest_raises_user_error_for_undefined_type('Foobar', 'UndefinedType'):

        @create_module
        def module():
            from typing import ForwardRef

            from pydantic import BaseModel

            UndefinedType = ForwardRef('UndefinedType')

            class Foobar(BaseModel):
                a: UndefinedType

            Foobar(a=1)


def test_rebuild_recursive_schema():
    from typing import ForwardRef, List

    class Expressions_(BaseModel):
        model_config = dict(undefined_types_warning=False)
        items: List["types['Expression']"]

    class Expression_(BaseModel):
        model_config = dict(undefined_types_warning=False)
        Or: ForwardRef("types['allOfExpressions']")
        Not: ForwardRef("types['allOfExpression']")

    class allOfExpression_(BaseModel):
        model_config = dict(undefined_types_warning=False)
        Not: ForwardRef("types['Expression']")

    class allOfExpressions_(BaseModel):
        model_config = dict(undefined_types_warning=False)
        items: List["types['Expression']"]

    types_namespace = {
        'types': {
            'Expression': Expression_,
            'Expressions': Expressions_,
            'allOfExpression': allOfExpression_,
            'allOfExpressions': allOfExpressions_,
        }
    }

    models = [allOfExpressions_, Expressions_]
    for m in models:
        m.model_rebuild(_types_namespace=types_namespace)


def test_forward_ref_in_generic(create_module: Any) -> None:
    """https://github.com/pydantic/pydantic/issues/6503"""

    @create_module
    def module():
        import typing as tp

        from pydantic import BaseModel

        class Foo(BaseModel):
            x: tp.Dict['tp.Type[Bar]', tp.Type['Bar']]

        class Bar(BaseModel):
            pass

    Foo = module.Foo
    Bar = module.Bar

    assert Foo(x={Bar: Bar}).x[Bar] is Bar


def test_forward_ref_in_generic_separate_modules(create_module: Any) -> None:
    """https://github.com/pydantic/pydantic/issues/6503"""

    @create_module
    def module_1():
        import typing as tp

        from pydantic import BaseModel

        class Foo(BaseModel):
            x: tp.Dict['tp.Type[Bar]', tp.Type['Bar']]

    @create_module
    def module_2():
        from pydantic import BaseModel

        class Bar(BaseModel):
            pass

    Foo = module_1.Foo
    Bar = module_2.Bar
    Foo.model_rebuild(_types_namespace={'tp': typing, 'Bar': Bar})

    assert Foo(x={Bar: Bar}).x[Bar] is Bar


def test_invalid_forward_ref() -> None:
    class CustomType:
        """A custom type that isn't subscriptable."""

    msg = "Unable to evaluate type annotation 'CustomType[int]'."

    with pytest.raises(TypeError, match=re.escape(msg)):

        class Model(BaseModel):
            foo: 'CustomType[int]'


def test_pydantic_extra_forward_ref_separate_module(create_module: Any) -> None:
    """https://github.com/pydantic/pydantic/issues/10069"""

    @create_module
    def module_1():
        from typing import Dict

        from pydantic import BaseModel, ConfigDict

        class Bar(BaseModel):
            model_config = ConfigDict(defer_build=True, extra='allow')

            __pydantic_extra__: 'Dict[str, int]'

    module_2 = create_module(
        f"""
from pydantic import BaseModel

from {module_1.__name__} import Bar

class Foo(BaseModel):
    bar: Bar
"""
    )

    extras_schema = module_2.Foo.__pydantic_core_schema__['schema']['fields']['bar']['schema']['schema'][
        'extras_schema'
    ]
    assert extras_schema == {'type': 'int'}


@pytest.mark.xfail(
    reason='While `get_cls_type_hints` uses the correct module ns for each base, `collect_model_fields` '
    'will still use the `FieldInfo` instances from each base (see the `parent_fields_lookup` logic). '
    'This means that `f` is still a forward ref in `Foo.model_fields`, and it gets evaluated in '
    '`GenerateSchema._model_schema`, where only the module of `Foo` is considered.'
)
def test_uses_the_correct_globals_to_resolve_model_forward_refs(create_module):
    @create_module
    def module_1():
        from pydantic import BaseModel

        class Bar(BaseModel):
            f: 'A'

        A = int

    module_2 = create_module(
        f"""
from {module_1.__name__} import Bar

A = str

class Foo(Bar):
    pass
"""
    )

    assert module_2.Foo.model_fields['f'].annotation is int


@pytest.mark.xfail(
    reason='We should keep a reference to the parent frame, not `f_locals`. '
    "It's probably only reasonable to support this in Python 3.14 with PEP 649."
)
def test_can_resolve_forward_refs_in_parent_frame_after_class_definition():
    def func():
        class Model(BaseModel):
            a: 'A'

        class A(BaseModel):
            pass

        return Model

    Model = func()

    Model.model_rebuild()


def test_uses_correct_global_ns_for_type_defined_in_separate_module(create_module):
    @create_module
    def module_1():
        from dataclasses import dataclass

        @dataclass
        class Bar:
            f: 'A'

        A = int

    module_2 = create_module(
        f"""
from pydantic import BaseModel

from {module_1.__name__} import Bar

A = str

class Foo(BaseModel):
    bar: Bar
"""
    )

    module_2.Foo(bar={'f': 1})


def test_uses_the_local_namespace_when_generating_schema():
    def func():
        A = int

        class Model(BaseModel):
            __pydantic_extra__: 'dict[str, A]'

            model_config = {'defer_build': True, 'extra': 'allow'}

        return Model

    Model = func()
    A = str  # noqa: F841

    Model.model_rebuild()
    Model(extra_value=1)


def test_uses_the_correct_globals_to_resolve_dataclass_forward_refs(create_module):
    @create_module
    def module_1():
        from dataclasses import dataclass

        A = int

        @dataclass
        class DC1:
            a: 'A'

    module_2 = create_module(f"""
from dataclasses import dataclass

from pydantic import BaseModel

from {module_1.__name__} import DC1

A = str

@dataclass
class DC2(DC1):
    b: 'A'

class Model(BaseModel):
    dc: DC2
""")

    Model = module_2.Model

    Model(dc=dict(a=1, b='not_an_int'))


@pytest.mark.skipif(sys.version_info < (3, 12), reason='Requires PEP 695 syntax')
def test_class_locals_are_kept_during_schema_generation(create_module):
    create_module(
        """
from pydantic import BaseModel

class Model(BaseModel):
    type Test = int
    a: 'Test | Forward'

Forward = str

Model.model_rebuild()
"""
    )


def test_validate_call_does_not_override_the_global_ns_with_the_local_ns_where_it_is_used(create_module):
    from pydantic import validate_call

    @create_module
    def module_1():
        A = int

        def func(a: 'A'):
            pass

    def inner():
        A = str  # noqa: F841

        from module_1 import func

        func_val = validate_call(func)
        func_val(a=1)


def test_uses_the_correct_globals_to_resolve_forward_refs_on_serializers(create_module):
    # Note: unlike `test_uses_the_correct_globals_to_resolve_model_forward_refs`,
    # we use the globals of the underlying func to resolve the return type.
    @create_module
    def module_1():
        from typing_extensions import Annotated

        from pydantic import (
            BaseModel,
            PlainSerializer,  # or WrapSerializer
            field_serializer,  # or model_serializer, computed_field
        )

        MyStr = str

        def ser_func(value) -> 'MyStr':
            return str(value)

        class Model(BaseModel):
            a: int
            b: Annotated[int, PlainSerializer(ser_func)]

            @field_serializer('a')
            def ser(self, value) -> 'MyStr':
                return str(self.a)

    class Sub(module_1.Model):
        pass

    Sub.model_rebuild()


@pytest.mark.xfail(reason='parent namespace is used for every type in `NsResolver`, for backwards compatibility.')
def test_do_not_use_parent_ns_when_outside_the_function(create_module):
    @create_module
    def module_1():
        import dataclasses

        from pydantic import BaseModel

        @dataclasses.dataclass
        class A:
            a: 'Model'  # shouldn't resolve
            b: 'Test'  # same

        def inner():
            Test = int  # noqa: F841

            class Model(BaseModel, A):
                pass

            return Model

        ReturnedModel = inner()  # noqa: F841

    assert module_1.ReturnedModel.__pydantic_complete__ is False
pydantic-2.10.6/tests/test_generics.py000066400000000000000000002526411474456633400200150ustar00rootroot00000000000000
import gc
import itertools
import json
import platform
import re
import sys
from collections import deque
from enum import Enum, IntEnum
from typing import (
    Any,
    Callable,
    ClassVar,
    Counter,
    DefaultDict,
    Deque,
    Dict,
    FrozenSet,
    Generic,
    Iterable,
    List,
    Mapping,
    NamedTuple,
    Optional,
    OrderedDict,
    Sequence,
    Set,
    Tuple,
    Type,
    TypeVar,
    Union,
)

import pytest
from dirty_equals import HasRepr, IsStr
from pydantic_core import CoreSchema, core_schema
from typing_extensions import (
    Annotated,
    Literal,
    Never,
    NotRequired,
    ParamSpec,
    TypedDict,
    TypeVarTuple,
    Unpack,
    get_args,
)
from typing_extensions import (
    TypeVar as TypingExtensionsTypeVar,
)

from pydantic import (
    BaseModel,
    Field,
    GetCoreSchemaHandler,
    Json,
    PositiveInt,
    PydanticSchemaGenerationError,
    PydanticUserError,
    TypeAdapter,
    ValidationError,
    ValidationInfo,
    computed_field,
    field_validator,
    model_validator,
)
from pydantic._internal._core_utils import collect_invalid_schemas
from pydantic._internal._generics import (
    _GENERIC_TYPES_CACHE,
    _LIMITED_DICT_SIZE,
    LimitedDict,
    generic_recursion_self_type,
    iter_contained_typevars,
    recursively_defined_type_refs,
    replace_types,
)
from pydantic.warnings import GenericBeforeBaseModelWarning


@pytest.fixture()
def clean_cache():
    # cleans up _GENERIC_TYPES_CACHE for checking item counts in the cache
    _GENERIC_TYPES_CACHE.clear()
    gc.collect(0)
    gc.collect(1)
    gc.collect(2)


def test_generic_name():
    data_type = TypeVar('data_type')

    class Result(BaseModel, Generic[data_type]):
        data: data_type

    if sys.version_info >= (3, 9):
        assert Result[list[int]].__name__ == 'Result[list[int]]'
    assert Result[List[int]].__name__ == 'Result[List[int]]'
    assert Result[int].__name__ == 'Result[int]'


def test_double_parameterize_error():
    data_type = TypeVar('data_type')

    class Result(BaseModel, Generic[data_type]):
        data: data_type

    with pytest.raises(TypeError) as exc_info:
        Result[int][int]

    assert str(exc_info.value) == " is not a generic class"


def test_value_validation():
    T = TypeVar('T', bound=Dict[Any, Any])

    class Response(BaseModel, Generic[T]):
        data: T

        @field_validator('data')
        @classmethod
        def validate_value_nonzero(cls, v: Any):
            if any(x == 0 for x in v.values()):
                raise ValueError('some value is zero')
            return v

        @model_validator(mode='after')
        def validate_sum(self) -> 'Response[T]':
            data = self.data
            if sum(data.values()) > 5:
                raise ValueError('sum too large')
            return self

    assert Response[Dict[int, int]](data={1: '4'}).model_dump() == {'data': {1: 4}}
    with pytest.raises(ValidationError) as exc_info:
        Response[Dict[int, int]](data={1: 'a'})
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_parsing',
            'loc': ('data', 1),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'a',
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        Response[Dict[int, int]](data={1: 0})
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'error': HasRepr(repr(ValueError('some value is zero')))},
            'input': {1: 0},
            'loc': ('data',),
            'msg': 'Value error, some value is zero',
            'type': 'value_error',
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        Response[Dict[int, int]](data={1: 3, 2: 6})
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'error': HasRepr(repr(ValueError('sum too large')))},
            'input': {'data': {1: 3, 2: 6}},
            'loc': (),
            'msg': 'Value error, sum too large',
            'type': 'value_error',
        }
    ]


def test_methods_are_inherited():
    class CustomModel(BaseModel):
        def method(self):
            return self.data

    T = TypeVar('T')

    class Model(CustomModel, Generic[T]):
        data: T

    instance = Model[int](data=1)

    assert instance.method() == 1


def test_config_is_inherited():
    class CustomGenericModel(BaseModel, frozen=True): ...

    T = TypeVar('T')

    class Model(CustomGenericModel, Generic[T]):
        data: T

    instance = Model[int](data=1)

    with pytest.raises(ValidationError) as exc_info:
        instance.data = 2

    assert exc_info.value.errors(include_url=False) == [
        {'type': 'frozen_instance', 'loc': ('data',), 'msg': 'Instance is frozen', 'input': 2}
    ]


def test_default_argument():
    T = TypeVar('T')

    class Result(BaseModel, Generic[T]):
        data: T
        other: bool = True

    result = Result[int](data=1)
    assert result.other is True


def test_default_argument_for_typevar():
    T = TypeVar('T')

    class Result(BaseModel, Generic[T]):
        data: T = 4

    result = Result[int]()
    assert result.data == 4

    result = Result[float]()
    assert result.data == 4

    result = Result[int](data=1)
    assert result.data == 1


def test_classvar():
    T = TypeVar('T')

    class Result(BaseModel, Generic[T]):
        data: T
        other: ClassVar[int] = 1

    assert Result.other == 1
    assert Result[int].other == 1
    assert Result[int](data=1).other == 1
    assert 'other' not in Result.model_fields


def test_non_annotated_field():
    T = TypeVar('T')

    with pytest.raises(PydanticUserError, match='A non-annotated attribute was detected: `other = True`'):

        class Result(BaseModel, Generic[T]):
            data: T
            other = True


def test_non_generic_field():
    T = TypeVar('T')

    class Result(BaseModel, Generic[T]):
        data: T
        other: bool = True

    assert 'other' in Result.model_fields
    assert 'other' in Result[int].model_fields

    result = Result[int](data=1)
    assert result.other is True


def test_must_inherit_from_generic():
    with pytest.raises(TypeError) as exc_info:

        class Result(BaseModel):
            pass

        Result[int]

    assert str(exc_info.value) == (
        ".Result'> cannot be "
        'parametrized because it does not inherit from typing.Generic'
    )


def test_parameters_placed_on_generic():
    T = TypeVar('T')
    with pytest.raises(TypeError, match='Type parameters should be placed on typing.Generic, not BaseModel'):

        class Result(BaseModel[T]):
            pass


def test_parameters_must_be_typevar():
    with pytest.raises(TypeError, match='Type parameters should be placed on typing.Generic, not BaseModel'):

        class Result(BaseModel[int]):
            pass


def test_subclass_can_be_genericized():
    T = TypeVar('T')

    class Result(BaseModel, Generic[T]):
        pass

    Result[T]


def test_parameter_count():
    T = TypeVar('T')
    S = TypeVar('S')

    class Model(BaseModel, Generic[T, S]):
        x: T
        y: S

    with pytest.raises(TypeError) as exc_info:
        Model[int, int, int]

    # This error message, which comes from `typing`, changed 'parameters' to 'arguments' in 3.11
    error_message = str(exc_info.value)
    assert error_message.startswith('Too many parameters') or error_message.startswith('Too many arguments')
    assert error_message.endswith(" for .Model'>; actual 3, expected 2")


def test_cover_cache(clean_cache):
    cache_size = len(_GENERIC_TYPES_CACHE)
    T = TypeVar('T')

    class Model(BaseModel, Generic[T]):
        x: T

    models = []  # keep references to models to get cache size

    models.append(Model[int])  # adds both with-tuple and without-tuple version to cache
    assert len(_GENERIC_TYPES_CACHE) == cache_size + 3

    models.append(Model[int])  # uses the cache
    assert len(_GENERIC_TYPES_CACHE) == cache_size + 3

    del models


def test_cache_keys_are_hashable(clean_cache):
    cache_size = len(_GENERIC_TYPES_CACHE)
    T = TypeVar('T')
    C = Callable[[str, Dict[str, Any]], Iterable[str]]

    class MyGenericModel(BaseModel, Generic[T]):
        t: T

    # Callable's first params get converted to a list, which is not hashable.
    # Make sure we can handle that special case
    Simple = MyGenericModel[Callable[[int], str]]

    models = []  # keep references to models to get cache size

    models.append(Simple)
    assert len(_GENERIC_TYPES_CACHE) == cache_size + 3

    # Nested Callables
    models.append(MyGenericModel[Callable[[C], Iterable[str]]])
    assert len(_GENERIC_TYPES_CACHE) == cache_size + 6

    models.append(MyGenericModel[Callable[[Simple], Iterable[int]]])
    assert len(_GENERIC_TYPES_CACHE) == cache_size + 9

    models.append(MyGenericModel[Callable[[MyGenericModel[C]], Iterable[int]]])
    assert len(_GENERIC_TYPES_CACHE) == cache_size + 15

    class Model(BaseModel):
        x: MyGenericModel[Callable[[C], Iterable[str]]]

    models.append(Model)
    assert len(_GENERIC_TYPES_CACHE) == cache_size + 15

    del models


@pytest.mark.skipif(platform.python_implementation() == 'PyPy', reason='PyPy does not play nice with PyO3 gc')
def test_caches_get_cleaned_up(clean_cache):
    initial_types_cache_size = len(_GENERIC_TYPES_CACHE)
    T = TypeVar('T')

    class MyGenericModel(BaseModel, Generic[T]):
        x: T

        model_config = dict(arbitrary_types_allowed=True)

    n_types = 200
    types = []
    for i in range(n_types):

        class MyType(int):
            pass

        types.append(MyGenericModel[MyType])  # retain a reference

    assert len(_GENERIC_TYPES_CACHE) == initial_types_cache_size + 3 * n_types
    types.clear()
    gc.collect(0)
    gc.collect(1)
    gc.collect(2)
    assert len(_GENERIC_TYPES_CACHE) < initial_types_cache_size + _LIMITED_DICT_SIZE


@pytest.mark.skipif(platform.python_implementation() == 'PyPy', reason='PyPy does not play nice with PyO3 gc')
def test_caches_get_cleaned_up_with_aliased_parametrized_bases(clean_cache):
    types_cache_size = len(_GENERIC_TYPES_CACHE)

    def run() -> None:  # Run inside nested function to get classes in local vars cleaned also
        T1 = TypeVar('T1')
        T2 = TypeVar('T2')

        class A(BaseModel, Generic[T1, T2]):
            x: T1
            y: T2

        B = A[int, T2]
        C = B[str]
        assert len(_GENERIC_TYPES_CACHE) == types_cache_size + 5
        del C
        del B
        gc.collect()

    run()

    gc.collect(0)
    gc.collect(1)
    gc.collect(2)
    assert len(_GENERIC_TYPES_CACHE) < types_cache_size + _LIMITED_DICT_SIZE


@pytest.mark.skipif(platform.python_implementation() == 'PyPy', reason='PyPy does not play nice with PyO3 gc')
@pytest.mark.skipif(sys.version_info[:2] == (3, 9), reason='The test randomly fails on Python 3.9')
def test_circular_generic_refs_get_cleaned_up():
    initial_cache_size = len(_GENERIC_TYPES_CACHE)

    def fn():
        T = TypeVar('T')
        C = TypeVar('C')

        class Inner(BaseModel, Generic[T, C]):
            a: T
            b: C

        class Outer(BaseModel, Generic[C]):
            c: Inner[int, C]

        klass = Outer[str]
        assert len(_GENERIC_TYPES_CACHE) > initial_cache_size
        assert klass in _GENERIC_TYPES_CACHE.values()

    fn()

    gc.collect(0)
    gc.collect(1)
    gc.collect(2)

    assert len(_GENERIC_TYPES_CACHE) == initial_cache_size


def test_generics_work_with_many_parametrized_base_models(clean_cache):
    cache_size = len(_GENERIC_TYPES_CACHE)
    count_create_models = 1000
    T = TypeVar('T')
    C = TypeVar('C')

    class A(BaseModel, Generic[T, C]):
        x: T
        y: C

    class B(A[int, C], BaseModel, Generic[C]):
        pass

    models = []
    for i in range(count_create_models):

        class M(BaseModel):
            pass

        M.__name__ = f'M{i}'
        models.append(M)

    generics = []
    for m in models:
        Working = B[m]
        generics.append(Working)

    target_size = cache_size + count_create_models * 3 + 2
    assert len(_GENERIC_TYPES_CACHE) < target_size + _LIMITED_DICT_SIZE
    del models
    del generics


def test_generic_config():
    data_type = TypeVar('data_type')

    class Result(BaseModel, Generic[data_type], frozen=True):
        data: data_type

    result = Result[int](data=1)
    assert result.data == 1
    with pytest.raises(ValidationError):
        result.data = 2


def test_enum_generic():
    T = TypeVar('T')

    class MyEnum(IntEnum):
        x = 1
        y = 2

    class Model(BaseModel, Generic[T]):
        enum: T

    Model[MyEnum](enum=MyEnum.x)
    Model[MyEnum](enum=2)


def test_generic():
    data_type = TypeVar('data_type')
    error_type = TypeVar('error_type')

    class Result(BaseModel, Generic[data_type, error_type]):
        data: Optional[List[data_type]] = None
        error: Optional[error_type] = None
        positive_number: int

        @field_validator('error')
        @classmethod
        def validate_error(cls, v: Optional[error_type], info: ValidationInfo) -> Optional[error_type]:
            values = info.data
            if values.get('data', None) is None and v is None:
                raise ValueError('Must provide data or error')
            if values.get('data', None) is not None and v is not None:
                raise ValueError('Must not provide both data and error')
            return v

        @field_validator('positive_number')
        @classmethod
        def validate_positive_number(cls, v: int) -> int:
            if v < 0:
                raise ValueError
            return v

    class Error(BaseModel):
        message: str

    class Data(BaseModel):
        number: int
        text: str

    success1 = Result[Data, Error](data=[Data(number=1, text='a')], positive_number=1)
    assert success1.model_dump() == {'data': [{'number': 1, 'text': 'a'}], 'error': None, 'positive_number': 1}
    assert repr(success1) == (
        'Result[test_generic..Data,'
        " test_generic..Error](data=[Data(number=1, text='a')], error=None, positive_number=1)"
    )

    success2 = Result[Data, Error](error=Error(message='error'), positive_number=1)
    assert success2.model_dump() == {'data': None, 'error': {'message': 'error'}, 'positive_number': 1}
    assert repr(success2) == (
        'Result[test_generic..Data, test_generic..Error]'
        "(data=None, error=Error(message='error'), positive_number=1)"
    )
    with pytest.raises(ValidationError) as exc_info:
        Result[Data, Error](error=Error(message='error'), positive_number=-1)
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'error': HasRepr(repr(ValueError()))},
            'input': -1,
            'loc': ('positive_number',),
            'msg': 'Value error, ',
            'type': 'value_error',
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        Result[Data, Error](data=[Data(number=1, text='a')],
error=Error(message='error'), positive_number=1) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'error': HasRepr(repr(ValueError('Must not provide both data and error')))}, 'input': Error(message='error'), 'loc': ('error',), 'msg': 'Value error, Must not provide both data and error', 'type': 'value_error', } ] def test_alongside_concrete_generics(): T = TypeVar('T') class MyModel(BaseModel, Generic[T]): item: T metadata: Dict[str, Any] model = MyModel[int](item=1, metadata={}) assert model.item == 1 assert model.metadata == {} def test_complex_nesting(): T = TypeVar('T') class MyModel(BaseModel, Generic[T]): item: List[Dict[Union[int, T], str]] item = [{1: 'a', 'a': 'a'}] model = MyModel[str](item=item) assert model.item == item def test_required_value(): T = TypeVar('T') class MyModel(BaseModel, Generic[T]): a: int with pytest.raises(ValidationError) as exc_info: MyModel[int]() assert exc_info.value.errors(include_url=False) == [ {'input': {}, 'loc': ('a',), 'msg': 'Field required', 'type': 'missing'} ] def test_optional_value(): T = TypeVar('T') class MyModel(BaseModel, Generic[T]): a: Optional[int] = 1 model = MyModel[int]() assert model.model_dump() == {'a': 1} def test_custom_schema(): T = TypeVar('T') class MyModel(BaseModel, Generic[T]): a: int = Field(1, description='Custom') schema = MyModel[int].model_json_schema() assert schema['properties']['a'].get('description') == 'Custom' def test_child_schema(): T = TypeVar('T') class Model(BaseModel, Generic[T]): a: T class Child(Model[T], Generic[T]): pass schema = Child[int].model_json_schema() assert schema == { 'title': 'Child[int]', 'type': 'object', 'properties': {'a': {'title': 'A', 'type': 'integer'}}, 'required': ['a'], } def test_custom_generic_naming(): T = TypeVar('T') class MyModel(BaseModel, Generic[T]): value: Optional[T] @classmethod def model_parametrized_name(cls, params: Tuple[Type[Any], ...]) -> str: param_names = [param.__name__ if hasattr(param, '__name__') else str(param) for 
param in params] title = param_names[0].title() return f'Optional{title}Wrapper' assert repr(MyModel[int](value=1)) == 'OptionalIntWrapper(value=1)' assert repr(MyModel[str](value=None)) == 'OptionalStrWrapper(value=None)' def test_nested(): AT = TypeVar('AT') class InnerT(BaseModel, Generic[AT]): a: AT inner_int = InnerT[int](a=8) inner_str = InnerT[str](a='ate') inner_dict_any = InnerT[Any](a={}) inner_int_any = InnerT[Any](a=7) class OuterT_SameType(BaseModel, Generic[AT]): i: InnerT[AT] OuterT_SameType[int](i={'a': 8}) OuterT_SameType[int](i=inner_int) OuterT_SameType[str](i=inner_str) OuterT_SameType[int](i=inner_int_any) with pytest.raises(ValidationError) as exc_info: OuterT_SameType[int](i=inner_str.model_dump()) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('i', 'a'), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'ate', } ] with pytest.raises(ValidationError) as exc_info: OuterT_SameType[int](i=inner_str) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('i', 'a'), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'ate', } ] with pytest.raises(ValidationError) as exc_info: OuterT_SameType[int](i=inner_dict_any.model_dump()) assert exc_info.value.errors(include_url=False) == [ {'type': 'int_type', 'loc': ('i', 'a'), 'msg': 'Input should be a valid integer', 'input': {}} ] with pytest.raises(ValidationError) as exc_info: OuterT_SameType[int](i=inner_dict_any) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ {'type': 'int_type', 'loc': ('i', 'a'), 'msg': 'Input should be a valid integer', 'input': {}} ] def test_partial_specification(): AT = TypeVar('AT') BT = TypeVar('BT') class Model(BaseModel, Generic[AT, BT]): a: AT b: BT partial_model = Model[int, BT] concrete_model = 
partial_model[str] concrete_model(a=1, b='abc') with pytest.raises(ValidationError) as exc_info: concrete_model(a='abc', b=None) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('a',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'abc', }, {'type': 'string_type', 'loc': ('b',), 'msg': 'Input should be a valid string', 'input': None}, ] def test_partial_specification_with_inner_typevar(): AT = TypeVar('AT') BT = TypeVar('BT') class Model(BaseModel, Generic[AT, BT]): a: List[AT] b: List[BT] partial_model = Model[int, BT] assert partial_model.__pydantic_generic_metadata__['parameters'] concrete_model = partial_model[int] assert not concrete_model.__pydantic_generic_metadata__['parameters'] # nested resolution of partial models should work as expected nested_resolved = concrete_model(a=['123'], b=['456']) assert nested_resolved.a == [123] assert nested_resolved.b == [456] @pytest.mark.skipif(sys.version_info < (3, 12), reason='repr different on older versions') def test_partial_specification_name(): AT = TypeVar('AT') BT = TypeVar('BT') class Model(BaseModel, Generic[AT, BT]): a: AT b: BT partial_model = Model[int, BT] assert partial_model.__name__ == 'Model[int, TypeVar]' concrete_model = partial_model[str] assert concrete_model.__name__ == 'Model[int, str]' def test_partial_specification_instantiation(): AT = TypeVar('AT') BT = TypeVar('BT') class Model(BaseModel, Generic[AT, BT]): a: AT b: BT partial_model = Model[int, BT] partial_model(a=1, b=2) partial_model(a=1, b='a') with pytest.raises(ValidationError) as exc_info: partial_model(a='a', b=2) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('a',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'a', } ] def test_partial_specification_instantiation_bounded(): AT = TypeVar('AT') BT = TypeVar('BT', bound=int) class Model(BaseModel, Generic[AT, BT]): a: AT b: 
BT Model(a=1, b=1) with pytest.raises(ValidationError) as exc_info: Model(a=1, b='a') assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('b',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'a', } ] partial_model = Model[int, BT] partial_model(a=1, b=1) with pytest.raises(ValidationError) as exc_info: partial_model(a=1, b='a') assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('b',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'a', } ] def test_typevar_parametrization(): AT = TypeVar('AT') BT = TypeVar('BT') class Model(BaseModel, Generic[AT, BT]): a: AT b: BT CT = TypeVar('CT', bound=int) DT = TypeVar('DT', bound=int) with pytest.raises(ValidationError) as exc_info: Model[CT, DT](a='a', b='b') assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('a',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'a', }, { 'type': 'int_parsing', 'loc': ('b',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'b', }, ] def test_multiple_specification(): AT = TypeVar('AT') BT = TypeVar('BT') class Model(BaseModel, Generic[AT, BT]): a: AT b: BT CT = TypeVar('CT') partial_model = Model[CT, CT] concrete_model = partial_model[str] with pytest.raises(ValidationError) as exc_info: concrete_model(a=None, b=None) assert exc_info.value.errors(include_url=False) == [ {'type': 'string_type', 'loc': ('a',), 'msg': 'Input should be a valid string', 'input': None}, {'type': 'string_type', 'loc': ('b',), 'msg': 'Input should be a valid string', 'input': None}, ] def test_generic_subclass_of_concrete_generic(): T = TypeVar('T') U = TypeVar('U') class GenericBaseModel(BaseModel, Generic[T]): data: T class GenericSub(GenericBaseModel[int], Generic[U]): extra: U ConcreteSub = GenericSub[int] with pytest.raises(ValidationError): 
ConcreteSub(data=2, extra='wrong') with pytest.raises(ValidationError): ConcreteSub(data='wrong', extra=2) ConcreteSub(data=2, extra=3) def test_generic_model_pickle(create_module): # Using create_module because pickle doesn't support # objects with in their __qualname__ (e.g. defined in function) @create_module def module(): import pickle from typing import Generic, TypeVar from pydantic import BaseModel t = TypeVar('t') class Model(BaseModel): a: float b: int = 10 class MyGeneric(BaseModel, Generic[t]): value: t original = MyGeneric[Model](value=Model(a='24')) dumped = pickle.dumps(original) loaded = pickle.loads(dumped) assert loaded.value.a == original.value.a == 24 assert loaded.value.b == original.value.b == 10 assert loaded == original def test_generic_model_from_function_pickle_fail(create_module): @create_module def module(): import pickle from typing import Generic, TypeVar import pytest from pydantic import BaseModel t = TypeVar('t') class Model(BaseModel): a: float b: int = 10 class MyGeneric(BaseModel, Generic[t]): value: t def get_generic(t): return MyGeneric[t] original = get_generic(Model)(value=Model(a='24')) with pytest.raises(pickle.PicklingError): pickle.dumps(original) def test_generic_model_redefined_without_cache_fail(create_module, monkeypatch): # match identity checker otherwise we never get to the redefinition check monkeypatch.setattr('pydantic._internal._utils.all_identical', lambda left, right: False) @create_module def module(): from typing import Generic, TypeVar from pydantic import BaseModel from pydantic._internal._generics import _GENERIC_TYPES_CACHE t = TypeVar('t') class MyGeneric(BaseModel, Generic[t]): value: t class Model(BaseModel): ... concrete = MyGeneric[Model] _GENERIC_TYPES_CACHE.clear() second_concrete = MyGeneric[Model] class Model(BaseModel): # same name, but type different, so it's not in cache ... 
third_concrete = MyGeneric[Model] assert concrete is not second_concrete assert concrete is not third_concrete assert second_concrete is not third_concrete assert globals()['MyGeneric[Model]'] is concrete assert globals()['MyGeneric[Model]_'] is second_concrete assert globals()['MyGeneric[Model]__'] is third_concrete def test_generic_model_caching_detect_order_of_union_args_basic(create_module): # Basic variant of https://github.com/pydantic/pydantic/issues/4474 @create_module def module(): from typing import Generic, TypeVar, Union from pydantic import BaseModel t = TypeVar('t') class Model(BaseModel, Generic[t]): data: t int_or_float_model = Model[Union[int, float]] float_or_int_model = Model[Union[float, int]] assert type(int_or_float_model(data='1').data) is int assert type(float_or_int_model(data='1').data) is float @pytest.mark.skip( reason=""" Depends on similar issue in CPython itself: https://github.com/python/cpython/issues/86483 Documented and skipped for possible fix later. """ ) def test_generic_model_caching_detect_order_of_union_args_nested(create_module): # Nested variant of https://github.com/pydantic/pydantic/issues/4474 @create_module def module(): from typing import Generic, List, TypeVar, Union from pydantic import BaseModel t = TypeVar('t') class Model(BaseModel, Generic[t]): data: t int_or_float_model = Model[List[Union[int, float]]] float_or_int_model = Model[List[Union[float, int]]] assert type(int_or_float_model(data=['1']).data[0]) is int assert type(float_or_int_model(data=['1']).data[0]) is float def test_get_caller_frame_info(create_module): @create_module def module(): from pydantic._internal._generics import _get_caller_frame_info def function(): assert _get_caller_frame_info() == (__name__, True) another_function() def another_function(): assert _get_caller_frame_info() == (__name__, False) third_function() def third_function(): assert _get_caller_frame_info() == (__name__, False) function() def 
test_get_caller_frame_info_called_from_module(create_module): @create_module def module(): from unittest.mock import patch import pytest from pydantic._internal._generics import _get_caller_frame_info with pytest.raises(RuntimeError, match='This function must be used inside another function'): with patch('sys._getframe', side_effect=ValueError('getframe_exc')): _get_caller_frame_info() def test_get_caller_frame_info_when_sys_getframe_undefined(): from pydantic._internal._generics import _get_caller_frame_info getframe = sys._getframe del sys._getframe try: assert _get_caller_frame_info() == (None, False) finally: # just to make sure we always setting original attribute back sys._getframe = getframe def test_iter_contained_typevars(): T = TypeVar('T') T2 = TypeVar('T2') class Model(BaseModel, Generic[T]): a: T assert list(iter_contained_typevars(Model[T])) == [T] assert list(iter_contained_typevars(Optional[List[Union[str, Model[T]]]])) == [T] assert list(iter_contained_typevars(Optional[List[Union[str, Model[int]]]])) == [] assert list(iter_contained_typevars(Optional[List[Union[str, Model[T], Callable[[T2, T], str]]]])) == [T, T2, T] def test_nested_identity_parameterization(): T = TypeVar('T') T2 = TypeVar('T2') class Model(BaseModel, Generic[T]): a: T assert Model[T][T][T] is Model assert Model[T] is Model assert Model[T2] is not Model def test_replace_types(): T = TypeVar('T') class Model(BaseModel, Generic[T]): a: T assert replace_types(T, {T: int}) is int assert replace_types(List[Union[str, list, T]], {T: int}) == List[Union[str, list, int]] assert replace_types(Callable, {T: int}) == Callable assert replace_types(Callable[[int, str, T], T], {T: int}) == Callable[[int, str, int], int] assert replace_types(T, {}) is T assert replace_types(Model[List[T]], {T: int}) == Model[List[int]] assert replace_types(Model[List[T]], {T: int}) == Model[List[T]][int] assert ( replace_types(Model[List[T]], {T: int}).model_fields['a'].annotation == 
Model[List[T]][int].model_fields['a'].annotation ) assert replace_types(T, {}) is T assert replace_types(Type[T], {T: int}) == Type[int] assert replace_types(Model[T], {T: T}) == Model[T] assert replace_types(Json[T], {T: int}) == Json[int] if sys.version_info >= (3, 9): # Check generic aliases (subscripted builtin types) to make sure they # resolve correctly (don't get translated to typing versions for # example) assert replace_types(list[Union[str, list, T]], {T: int}) == list[Union[str, list, int]] if sys.version_info >= (3, 10): # Check that types.UnionType gets handled properly assert replace_types(str | list[T] | float, {T: int}) == str | list[int] | float def test_replace_types_with_user_defined_generic_type_field(): # noqa: C901 """Test that using user defined generic types as generic model fields are handled correctly.""" T = TypeVar('T') KT = TypeVar('KT') VT = TypeVar('VT') class CustomCounter(Counter[T]): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: return core_schema.no_info_after_validator_function(cls, handler(Counter[get_args(source_type)[0]])) class CustomDefaultDict(DefaultDict[KT, VT]): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: keys_type, values_type = get_args(source_type) return core_schema.no_info_after_validator_function( lambda x: cls(x.default_factory, x), handler(DefaultDict[keys_type, values_type]) ) class CustomDeque(Deque[T]): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: return core_schema.no_info_after_validator_function(cls, handler(Deque[get_args(source_type)[0]])) class CustomDict(Dict[KT, VT]): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: keys_type, values_type = get_args(source_type) return core_schema.no_info_after_validator_function(cls, 
handler(Dict[keys_type, values_type])) class CustomFrozenset(FrozenSet[T]): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: return core_schema.no_info_after_validator_function(cls, handler(FrozenSet[get_args(source_type)[0]])) class CustomIterable(Iterable[T]): def __init__(self, iterable): self.iterable = iterable def __iter__(self): return self def __next__(self): return next(self.iterable) @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: return core_schema.no_info_after_validator_function(cls, handler(Iterable[get_args(source_type)[0]])) class CustomList(List[T]): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: return core_schema.no_info_after_validator_function(cls, handler(List[get_args(source_type)[0]])) class CustomMapping(Mapping[KT, VT]): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: keys_type, values_type = get_args(source_type) return handler(Mapping[keys_type, values_type]) class CustomOrderedDict(OrderedDict[KT, VT]): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: keys_type, values_type = get_args(source_type) return core_schema.no_info_after_validator_function(cls, handler(OrderedDict[keys_type, values_type])) class CustomSet(Set[T]): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: return core_schema.no_info_after_validator_function(cls, handler(Set[get_args(source_type)[0]])) class CustomTuple(Tuple[T]): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: return core_schema.no_info_after_validator_function(cls, handler(Tuple[get_args(source_type)[0]])) class CustomLongTuple(Tuple[T, VT]): 
@classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: return core_schema.no_info_after_validator_function(cls, handler(Tuple[get_args(source_type)])) class Model(BaseModel, Generic[T, KT, VT]): counter_field: CustomCounter[T] default_dict_field: CustomDefaultDict[KT, VT] deque_field: CustomDeque[T] dict_field: CustomDict[KT, VT] frozenset_field: CustomFrozenset[T] iterable_field: CustomIterable[T] list_field: CustomList[T] mapping_field: CustomMapping[KT, VT] ordered_dict_field: CustomOrderedDict[KT, VT] set_field: CustomSet[T] tuple_field: CustomTuple[T] long_tuple_field: CustomLongTuple[T, VT] assert replace_types(Model, {T: bool, KT: str, VT: int}) == Model[bool, str, int] assert replace_types(Model[T, KT, VT], {T: bool, KT: str, VT: int}) == Model[bool, str, int] assert replace_types(Model[T, VT, KT], {T: bool, KT: str, VT: int}) == Model[T, VT, KT][bool, int, str] m = Model[bool, str, int]( counter_field=Counter([True, False]), default_dict_field={'a': 1}, deque_field=[True, False], dict_field={'a': 1}, frozenset_field=frozenset([True, False]), iterable_field=[True, False], list_field=[True, False], mapping_field={'a': 2}, ordered_dict_field=OrderedDict([('a', 1)]), set_field={True, False}, tuple_field=(True,), long_tuple_field=(True, 42), ) # The following assertions are just to document the current behavior, and should # be updated if/when we do a better job of respecting the exact annotated type assert type(m.counter_field) is CustomCounter # assert type(m.default_dict_field) is CustomDefaultDict assert type(m.deque_field) is CustomDeque assert type(m.dict_field) is CustomDict assert type(m.frozenset_field) is CustomFrozenset assert type(m.iterable_field) is CustomIterable assert type(m.list_field) is CustomList assert type(m.mapping_field) is dict # this is determined in CustomMapping.__get_pydantic_core_schema__ assert type(m.ordered_dict_field) is CustomOrderedDict assert type(m.set_field) is 
CustomSet assert type(m.tuple_field) is CustomTuple assert type(m.long_tuple_field) is CustomLongTuple assert m.model_dump() == { 'counter_field': {False: 1, True: 1}, 'default_dict_field': {'a': 1}, 'deque_field': deque([True, False]), 'dict_field': {'a': 1}, 'frozenset_field': frozenset({False, True}), 'iterable_field': HasRepr(IsStr(regex=r'SerializationIterator\(index=0, iterator=.*CustomIterable.*')), 'list_field': [True, False], 'mapping_field': {'a': 2}, 'ordered_dict_field': {'a': 1}, 'set_field': {False, True}, 'tuple_field': (True,), 'long_tuple_field': (True, 42), } def test_custom_sequence_behavior(): T = TypeVar('T') class CustomSequence(Sequence[T]): pass with pytest.raises( PydanticSchemaGenerationError, match=( r'Unable to generate pydantic-core schema for .*' ' Set `arbitrary_types_allowed=True` in the model_config to ignore this error' ' or implement `__get_pydantic_core_schema__` on your type to fully support it' ), ): class Model(BaseModel, Generic[T]): x: CustomSequence[T] def test_replace_types_identity_on_unchanged(): T = TypeVar('T') U = TypeVar('U') type_ = List[Union[str, Callable[[list], Optional[str]], U]] assert replace_types(type_, {T: int}) is type_ def test_deep_generic(): T = TypeVar('T') S = TypeVar('S') R = TypeVar('R') class OuterModel(BaseModel, Generic[T, S, R]): a: Dict[R, Optional[List[T]]] b: Optional[Union[S, R]] c: R d: float class InnerModel(BaseModel, Generic[T, R]): c: T d: R class NormalModel(BaseModel): e: int f: str inner_model = InnerModel[int, str] generic_model = OuterModel[inner_model, NormalModel, int] inner_models = [inner_model(c=1, d='a')] generic_model(a={1: inner_models, 2: None}, b=None, c=1, d=1.5) generic_model(a={}, b=NormalModel(e=1, f='a'), c=1, d=1.5) generic_model(a={}, b=1, c=1, d=1.5) assert InnerModel.__pydantic_generic_metadata__['parameters'] # i.e., InnerModel is not concrete assert not inner_model.__pydantic_generic_metadata__['parameters'] # i.e., inner_model is concrete def 
test_deep_generic_with_inner_typevar(): T = TypeVar('T') class OuterModel(BaseModel, Generic[T]): a: List[T] class InnerModel(OuterModel[T], Generic[T]): pass assert not InnerModel[int].__pydantic_generic_metadata__['parameters'] # i.e., InnerModel[int] is concrete assert InnerModel.__pydantic_generic_metadata__['parameters'] # i.e., InnerModel is not concrete with pytest.raises(ValidationError): InnerModel[int](a=['wrong']) assert InnerModel[int](a=['1']).a == [1] def test_deep_generic_with_referenced_generic(): T = TypeVar('T') R = TypeVar('R') class ReferencedModel(BaseModel, Generic[R]): a: R class OuterModel(BaseModel, Generic[T]): a: ReferencedModel[T] class InnerModel(OuterModel[T], Generic[T]): pass assert not InnerModel[int].__pydantic_generic_metadata__['parameters'] assert InnerModel.__pydantic_generic_metadata__['parameters'] with pytest.raises(ValidationError): InnerModel[int](a={'a': 'wrong'}) assert InnerModel[int](a={'a': 1}).a.a == 1 def test_deep_generic_with_referenced_inner_generic(): T = TypeVar('T') class ReferencedModel(BaseModel, Generic[T]): a: T class OuterModel(BaseModel, Generic[T]): a: Optional[List[Union[ReferencedModel[T], str]]] class InnerModel(OuterModel[T], Generic[T]): pass assert not InnerModel[int].__pydantic_generic_metadata__['parameters'] assert InnerModel.__pydantic_generic_metadata__['parameters'] with pytest.raises(ValidationError): InnerModel[int](a=['s', {'a': 'wrong'}]) assert InnerModel[int](a=['s', {'a': 1}]).a[1].a == 1 assert InnerModel[int].model_fields['a'].annotation == Optional[List[Union[ReferencedModel[int], str]]] def test_deep_generic_with_multiple_typevars(): T = TypeVar('T') U = TypeVar('U') class OuterModel(BaseModel, Generic[T]): data: List[T] class InnerModel(OuterModel[T], Generic[U, T]): extra: U ConcreteInnerModel = InnerModel[int, float] assert ConcreteInnerModel.model_fields['data'].annotation == List[float] assert ConcreteInnerModel.model_fields['extra'].annotation == int assert 
ConcreteInnerModel(data=['1'], extra='2').model_dump() == {'data': [1.0], 'extra': 2} def test_deep_generic_with_multiple_inheritance(): K = TypeVar('K') V = TypeVar('V') T = TypeVar('T') class OuterModelA(BaseModel, Generic[K, V]): data: Dict[K, V] class OuterModelB(BaseModel, Generic[T]): stuff: List[T] class InnerModel(OuterModelA[K, V], OuterModelB[T], Generic[K, V, T]): extra: int ConcreteInnerModel = InnerModel[int, float, str] assert ConcreteInnerModel.model_fields['data'].annotation == Dict[int, float] assert ConcreteInnerModel.model_fields['stuff'].annotation == List[str] assert ConcreteInnerModel.model_fields['extra'].annotation == int with pytest.raises(ValidationError) as exc_info: ConcreteInnerModel(data={1.1: '5'}, stuff=[123], extra=5) assert exc_info.value.errors(include_url=False) == [ {'input': 123, 'loc': ('stuff', 0), 'msg': 'Input should be a valid string', 'type': 'string_type'}, { 'input': 1.1, 'loc': ('data', '1.1', '[key]'), 'msg': 'Input should be a valid integer, got a number with a fractional part', 'type': 'int_from_float', }, ] assert ConcreteInnerModel(data={1: 5}, stuff=['123'], extra=5).model_dump() == { 'data': {1: 5}, 'stuff': ['123'], 'extra': 5, } def test_generic_with_referenced_generic_type_1(): T = TypeVar('T') class ModelWithType(BaseModel, Generic[T]): # Type resolves to type origin of "type" which is non-subscriptible for # python < 3.9 so we want to make sure it works for other versions some_type: Type[T] class ReferenceModel(BaseModel, Generic[T]): abstract_base_with_type: ModelWithType[T] ReferenceModel[int] def test_generic_with_referenced_generic_type_bound(): T = TypeVar('T', bound=int) class ModelWithType(BaseModel, Generic[T]): # Type resolves to type origin of "type" which is non-subscriptible for # python < 3.9 so we want to make sure it works for other versions some_type: Type[T] class ReferenceModel(BaseModel, Generic[T]): abstract_base_with_type: ModelWithType[T] class MyInt(int): ... 
ReferenceModel[MyInt] def test_generic_with_referenced_generic_union_type_bound(): T = TypeVar('T', bound=Union[str, int]) class ModelWithType(BaseModel, Generic[T]): some_type: Type[T] class MyInt(int): ... class MyStr(str): ... ModelWithType[MyInt] ModelWithType[MyStr] def test_generic_with_referenced_generic_type_constraints(): T = TypeVar('T', int, str) class ModelWithType(BaseModel, Generic[T]): # Type resolves to type origin of "type" which is non-subscriptible for # python < 3.9 so we want to make sure it works for other versions some_type: Type[T] class ReferenceModel(BaseModel, Generic[T]): abstract_base_with_type: ModelWithType[T] ReferenceModel[int] def test_generic_with_referenced_nested_typevar(): T = TypeVar('T') class ModelWithType(BaseModel, Generic[T]): # Type resolves to type origin of "collections.abc.Sequence" which is # non-subscriptible for # python < 3.9 so we want to make sure it works for other versions some_type: Sequence[T] class ReferenceModel(BaseModel, Generic[T]): abstract_base_with_type: ModelWithType[T] ReferenceModel[int] def test_generic_with_callable(): T = TypeVar('T') class Model(BaseModel, Generic[T]): # Callable is a test for any type that accepts a list as an argument some_callable: Callable[[Optional[int], T], None] assert not Model[str].__pydantic_generic_metadata__['parameters'] assert Model.__pydantic_generic_metadata__['parameters'] def test_generic_with_partial_callable(): T = TypeVar('T') U = TypeVar('U') class Model(BaseModel, Generic[T, U]): t: T u: U # Callable is a test for any type that accepts a list as an argument some_callable: Callable[[Optional[int], str], None] assert Model[str, U].__pydantic_generic_metadata__['parameters'] == (U,) assert not Model[str, int].__pydantic_generic_metadata__['parameters'] def test_generic_recursive_models(create_module): @create_module def module(): from typing import Generic, TypeVar, Union from pydantic import BaseModel T = TypeVar('T') class Model1(BaseModel, Generic[T]): 
ref: 'Model2[T]' class Model2(BaseModel, Generic[T]): ref: Union[T, Model1[T]] Model1.model_rebuild() Model1 = module.Model1 Model2 = module.Model2 with pytest.raises(ValidationError) as exc_info: Model1[str].model_validate(dict(ref=dict(ref=dict(ref=dict(ref=123))))) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'string_type', 'loc': ('ref', 'ref', 'str'), 'msg': 'Input should be a valid string', 'input': {'ref': {'ref': 123}}, }, { 'type': 'string_type', 'loc': ('ref', 'ref', 'Model1[str]', 'ref', 'ref', 'str'), 'msg': 'Input should be a valid string', 'input': 123, }, { 'type': 'model_type', 'loc': ('ref', 'ref', 'Model1[str]', 'ref', 'ref', 'Model1[str]'), 'msg': 'Input should be a valid dictionary or instance of Model1[str]', 'input': 123, 'ctx': {'class_name': 'Model1[str]'}, }, ] result = Model1(ref=Model2(ref=Model1(ref=Model2(ref='123')))) assert result.model_dump() == {'ref': {'ref': {'ref': {'ref': '123'}}}} result = Model1[str].model_validate(dict(ref=dict(ref=dict(ref=dict(ref='123'))))) assert result.model_dump() == {'ref': {'ref': {'ref': {'ref': '123'}}}} def test_generic_recursive_models_separate_parameters(create_module): @create_module def module(): from typing import Generic, TypeVar, Union from pydantic import BaseModel T = TypeVar('T') class Model1(BaseModel, Generic[T]): ref: 'Model2[T]' S = TypeVar('S') class Model2(BaseModel, Generic[S]): ref: Union[S, Model1[S]] Model1.model_rebuild() Model1 = module.Model1 # Model2 = module.Model2 with pytest.raises(ValidationError) as exc_info: Model1[str].model_validate(dict(ref=dict(ref=dict(ref=dict(ref=123))))) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'string_type', 'loc': ('ref', 'ref', 'str'), 'msg': 'Input should be a valid string', 'input': {'ref': {'ref': 123}}, }, { 'type': 'string_type', 'loc': ('ref', 'ref', 'Model1[str]', 'ref', 'ref', 
'str'), 'msg': 'Input should be a valid string', 'input': 123, }, { 'type': 'model_type', 'loc': ('ref', 'ref', 'Model1[str]', 'ref', 'ref', 'Model1[str]'), 'msg': 'Input should be a valid dictionary or instance of Model1[str]', 'input': 123, 'ctx': {'class_name': 'Model1[str]'}, }, ] # TODO: Unlike in the previous test, the following (commented) line currently produces this error: # > result = Model1(ref=Model2(ref=Model1(ref=Model2(ref='123')))) # E pydantic_core._pydantic_core.ValidationError: 1 validation error for Model2[~T] # E ref # E Input should be a valid dictionary [type=dict_type, input_value=Model2(ref='123'), input_type=Model2] # The root of this problem is that Model2[T] ends up being a proper subclass of Model2 since T != S. # I am sure we can solve this problem, just need to put a bit more effort in. # While I don't think we should block merging this functionality on getting the next line to pass, # I think we should come back and resolve this at some point. # result = Model1(ref=Model2(ref=Model1(ref=Model2(ref='123')))) # assert result.model_dump() == {'ref': {'ref': {'ref': {'ref': '123'}}}} result = Model1[str].model_validate(dict(ref=dict(ref=dict(ref=dict(ref='123'))))) assert result.model_dump() == {'ref': {'ref': {'ref': {'ref': '123'}}}} def test_generic_recursive_models_repeated_separate_parameters(create_module): @create_module def module(): from typing import Generic, TypeVar, Union from pydantic import BaseModel T = TypeVar('T') class Model1(BaseModel, Generic[T]): ref: 'Model2[T]' ref2: Union['Model2[T]', None] = None S = TypeVar('S') class Model2(BaseModel, Generic[S]): ref: Union[S, Model1[S]] ref2: Union[S, Model1[S], None] = None Model1.model_rebuild() Model1 = module.Model1 # Model2 = module.Model2 with pytest.raises(ValidationError) as exc_info: Model1[str].model_validate(dict(ref=dict(ref=dict(ref=dict(ref=123))))) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 
'type': 'string_type', 'loc': ('ref', 'ref', 'str'), 'msg': 'Input should be a valid string', 'input': {'ref': {'ref': 123}}, }, { 'type': 'string_type', 'loc': ('ref', 'ref', 'Model1[str]', 'ref', 'ref', 'str'), 'msg': 'Input should be a valid string', 'input': 123, }, { 'type': 'model_type', 'loc': ('ref', 'ref', 'Model1[str]', 'ref', 'ref', 'Model1[str]'), 'msg': 'Input should be a valid dictionary or instance of Model1[str]', 'input': 123, 'ctx': {'class_name': 'Model1[str]'}, }, ] result = Model1[str].model_validate(dict(ref=dict(ref=dict(ref=dict(ref='123'))))) assert result.model_dump() == { 'ref': {'ref': {'ref': {'ref': '123', 'ref2': None}, 'ref2': None}, 'ref2': None}, 'ref2': None, } def test_generic_recursive_models_triple(create_module): @create_module def module(): from typing import Generic, TypeVar, Union from pydantic import BaseModel T1 = TypeVar('T1') T2 = TypeVar('T2') T3 = TypeVar('T3') class A1(BaseModel, Generic[T1]): a1: 'A2[T1]' class A2(BaseModel, Generic[T2]): a2: 'A3[T2]' class A3(BaseModel, Generic[T3]): a3: Union['A1[T3]', T3] A1.model_rebuild() A1 = module.A1 with pytest.raises(ValidationError) as exc_info: A1[str].model_validate({'a1': {'a2': {'a3': 1}}}) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'model_type', 'loc': ('a1', 'a2', 'a3', 'A1[str]'), 'msg': 'Input should be a valid dictionary or instance of A1[str]', 'input': 1, 'ctx': {'class_name': 'A1[str]'}, }, {'type': 'string_type', 'loc': ('a1', 'a2', 'a3', 'str'), 'msg': 'Input should be a valid string', 'input': 1}, ] A1[int].model_validate({'a1': {'a2': {'a3': 1}}}) def test_generic_recursive_models_with_a_concrete_parameter(create_module): @create_module def module(): from typing import Generic, TypeVar, Union from pydantic import BaseModel V1 = TypeVar('V1') V2 = TypeVar('V2') V3 = TypeVar('V3') class M1(BaseModel, Generic[V1, V2]): a: V1 m: 'M2[V2]' class M2(BaseModel, Generic[V3]): m: 
Union[M1[int, V3], V3] M1.model_rebuild() M1 = module.M1 # assert M1.__pydantic_core_schema__ == {} assert collect_invalid_schemas(M1.__pydantic_core_schema__) is False def test_generic_recursive_models_complicated(create_module): """ Note: If we drop the use of LimitedDict and use WeakValueDictionary only, this test will fail if run by itself. This is due to weird behavior with the WeakValueDictionary used for caching. As part of the next batch of generics work, we should attempt to fix this if possible. In the meantime, if this causes issues, or the test otherwise starts failing, please make it xfail with strict=False """ @create_module def module(): from typing import Generic, TypeVar, Union from pydantic import BaseModel T1 = TypeVar('T1') T2 = TypeVar('T2') T3 = TypeVar('T3') class A1(BaseModel, Generic[T1]): a1: 'A2[T1]' class A2(BaseModel, Generic[T2]): a2: 'A3[T2]' class A3(BaseModel, Generic[T3]): a3: Union[A1[T3], T3] A1.model_rebuild() S1 = TypeVar('S1') S2 = TypeVar('S2') class B1(BaseModel, Generic[S1]): a1: 'B2[S1]' class B2(BaseModel, Generic[S2]): a2: 'B1[S2]' B1.model_rebuild() V1 = TypeVar('V1') V2 = TypeVar('V2') V3 = TypeVar('V3') class M1(BaseModel, Generic[V1, V2]): a: int b: B1[V2] m: 'M2[V1]' class M2(BaseModel, Generic[V3]): m: Union[M1[V3, int], V3] M1.model_rebuild() M1 = module.M1 assert collect_invalid_schemas(M1.__pydantic_core_schema__) is False def test_generic_recursive_models_in_container(create_module): @create_module def module(): from typing import Generic, List, Optional, TypeVar from pydantic import BaseModel T = TypeVar('T') class MyGenericModel(BaseModel, Generic[T]): foobar: Optional[List['MyGenericModel[T]']] spam: T MyGenericModel = module.MyGenericModel instance = MyGenericModel[int](foobar=[{'foobar': [], 'spam': 1}], spam=1) assert type(instance.foobar[0]) == MyGenericModel[int] def test_generic_enum(): T = TypeVar('T') class SomeGenericModel(BaseModel, Generic[T]): some_field: T class SomeStringEnum(str, Enum): A = 
'A' B = 'B' class MyModel(BaseModel): my_gen: SomeGenericModel[SomeStringEnum] m = MyModel.model_validate({'my_gen': {'some_field': 'A'}}) assert m.my_gen.some_field is SomeStringEnum.A def test_generic_literal(): FieldType = TypeVar('FieldType') ValueType = TypeVar('ValueType') class GModel(BaseModel, Generic[FieldType, ValueType]): field: Dict[FieldType, ValueType] Fields = Literal['foo', 'bar'] m = GModel[Fields, str](field={'foo': 'x'}) assert m.model_dump() == {'field': {'foo': 'x'}} def test_generic_enums(): T = TypeVar('T') class GModel(BaseModel, Generic[T]): x: T class EnumA(str, Enum): a = 'a' class EnumB(str, Enum): b = 'b' class Model(BaseModel): g_a: GModel[EnumA] g_b: GModel[EnumB] assert set(Model.model_json_schema()['$defs']) == {'EnumA', 'EnumB', 'GModel_EnumA_', 'GModel_EnumB_'} def test_generic_with_user_defined_generic_field(): T = TypeVar('T') class GenericList(List[T]): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: return core_schema.no_info_after_validator_function(GenericList, handler(List[get_args(source_type)[0]])) class Model(BaseModel, Generic[T]): field: GenericList[T] model = Model[int](field=[5]) assert model.field[0] == 5 with pytest.raises(ValidationError): model = Model[int](field=['a']) def test_generic_annotated(): T = TypeVar('T') class SomeGenericModel(BaseModel, Generic[T]): some_field: Annotated[T, Field(alias='the_alias')] SomeGenericModel[str](the_alias='qwe') def test_generic_subclass(): T = TypeVar('T') class A(BaseModel, Generic[T]): ... class B(A[T], Generic[T]): ... class C(B[T], Generic[T]): ... assert B[int].__name__ == 'B[int]' assert issubclass(B[int], B) assert issubclass(B[int], A) assert not issubclass(B[int], C) def test_generic_subclass_with_partial_application(): T = TypeVar('T') S = TypeVar('S') class A(BaseModel, Generic[T]): ... class B(A[S], Generic[T, S]): ... 
PartiallyAppliedB = B[str, T] assert issubclass(PartiallyAppliedB[int], A) def test_multilevel_generic_binding(): T = TypeVar('T') S = TypeVar('S') class A(BaseModel, Generic[T, S]): ... class B(A[str, T], Generic[T]): ... assert B[int].__name__ == 'B[int]' assert issubclass(B[int], A) def test_generic_subclass_with_extra_type(): T = TypeVar('T') S = TypeVar('S') class A(BaseModel, Generic[T]): ... class B(A[S], Generic[T, S]): ... assert B[int, str].__name__ == 'B[int, str]', B[int, str].__name__ assert issubclass(B[str, int], B) assert issubclass(B[str, int], A) def test_generic_subclass_with_extra_type_requires_all_params(): T = TypeVar('T') S = TypeVar('S') class A(BaseModel, Generic[T]): ... with pytest.raises( TypeError, match=re.escape( 'All parameters must be present on typing.Generic; you should inherit from typing.Generic[~T, ~S]' ), ): class B(A[T], Generic[S]): ... def test_generic_subclass_with_extra_type_with_hint_message(): E = TypeVar('E', bound=BaseModel) D = TypeVar('D') with pytest.warns( GenericBeforeBaseModelWarning, match='Classes should inherit from `BaseModel` before generic classes', ): class BaseGenericClass(Generic[E, D], BaseModel): uid: str name: str with pytest.raises( TypeError, match=re.escape( 'All parameters must be present on typing.Generic; you should inherit from typing.Generic[~E, ~D].' ' Note: `typing.Generic` must go last:' ' `class ChildGenericClass(BaseGenericClass, typing.Generic[~E, ~D]): ...`' ), ): with pytest.warns( GenericBeforeBaseModelWarning, match='Classes should inherit from `BaseModel` before generic classes', ): class ChildGenericClass(BaseGenericClass[E, Dict[str, Any]]): ... def test_multi_inheritance_generic_binding(): T = TypeVar('T') class A(BaseModel, Generic[T]): ... class B(A[int], Generic[T]): ... class C(B[str], Generic[T]): ... 
assert C[float].__name__ == 'C[float]' assert issubclass(C[float], B) assert issubclass(C[float], A) assert not issubclass(B[float], C) def test_parent_field_parametrization(): T = TypeVar('T') class A(BaseModel, Generic[T]): a: T class B(A, Generic[T]): b: T with pytest.raises(ValidationError) as exc_info: B[int](a='a', b=1) assert exc_info.value.errors(include_url=False) == [ { 'input': 'a', 'loc': ('a',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'type': 'int_parsing', } ] def test_multi_inheritance_generic_defaults(): T = TypeVar('T') class A(BaseModel, Generic[T]): a: T x: str = 'a' class B(A[int], Generic[T]): b: Optional[T] = None y: str = 'b' class C(B[str], Generic[T]): c: T z: str = 'c' assert C(a=1, c=...).model_dump() == {'a': 1, 'b': None, 'c': ..., 'x': 'a', 'y': 'b', 'z': 'c'} def test_parse_generic_json(): T = TypeVar('T') class MessageWrapper(BaseModel, Generic[T]): message: Json[T] class Payload(BaseModel): payload_field: str raw = json.dumps({'payload_field': 'payload'}) record = MessageWrapper[Payload](message=raw) assert isinstance(record.message, Payload) validation_schema = record.model_json_schema(mode='validation') assert validation_schema == { '$defs': { 'Payload': { 'properties': {'payload_field': {'title': 'Payload Field', 'type': 'string'}}, 'required': ['payload_field'], 'title': 'Payload', 'type': 'object', } }, 'properties': { 'message': { 'contentMediaType': 'application/json', 'contentSchema': {'$ref': '#/$defs/Payload'}, 'title': 'Message', 'type': 'string', } }, 'required': ['message'], 'title': 'MessageWrapper[test_parse_generic_json..Payload]', 'type': 'object', } serialization_schema = record.model_json_schema(mode='serialization') assert serialization_schema == { '$defs': { 'Payload': { 'properties': {'payload_field': {'title': 'Payload Field', 'type': 'string'}}, 'required': ['payload_field'], 'title': 'Payload', 'type': 'object', } }, 'properties': {'message': {'$ref': 
'#/$defs/Payload', 'title': 'Message'}}, 'required': ['message'], 'title': 'MessageWrapper[test_parse_generic_json..Payload]', 'type': 'object', } @pytest.mark.skipif(sys.version_info > (3, 12), reason="memray doesn't yet support Python 3.13") def memray_limit_memory(limit): if '--memray' in sys.argv: return pytest.mark.limit_memory(limit) else: return pytest.mark.skip(reason='memray not enabled') @memray_limit_memory('100 MB') def test_generics_memory_use(): """See: - https://github.com/pydantic/pydantic/issues/3829 - https://github.com/pydantic/pydantic/pull/4083 - https://github.com/pydantic/pydantic/pull/5052 """ T = TypeVar('T') U = TypeVar('U') V = TypeVar('V') class MyModel(BaseModel, Generic[T, U, V]): message: Json[T] field: Dict[U, V] class Outer(BaseModel, Generic[T]): inner: T types = [ int, str, float, bool, bytes, ] containers = [ List, Tuple, Set, FrozenSet, ] all = [*types, *[container[tp] for container in containers for tp in types]] total = list(itertools.product(all, all, all)) for t1, t2, t3 in total: class Foo(MyModel[t1, t2, t3]): pass class _(Outer[Foo]): pass @pytest.mark.xfail(reason='Generic models are not type aliases', raises=TypeError) def test_generic_model_as_parameter_to_generic_type_alias() -> None: T = TypeVar('T') class GenericPydanticModel(BaseModel, Generic[T]): x: T GenericPydanticModelList = List[GenericPydanticModel[T]] GenericPydanticModelList[int] def test_double_typevar_substitution() -> None: T = TypeVar('T') class GenericPydanticModel(BaseModel, Generic[T]): x: T = [] assert GenericPydanticModel[List[T]](x=[1, 2, 3]).model_dump() == {'x': [1, 2, 3]} @pytest.fixture(autouse=True) def ensure_contextvar_gets_reset(): # Ensure that the generic recursion contextvar is empty at the start of every test assert not recursively_defined_type_refs() def test_generic_recursion_contextvar(): T = TypeVar('T') class TestingException(Exception): pass class Model(BaseModel, Generic[T]): pass # Make sure that the contextvar-managed 
recursive types cache begins empty assert not recursively_defined_type_refs() try: with generic_recursion_self_type(Model, (int,)): # Make sure that something has been added to the contextvar-managed recursive types cache assert recursively_defined_type_refs() raise TestingException except TestingException: pass # Make sure that an exception causes the contextvar-managed recursive types cache to be reset assert not recursively_defined_type_refs() def test_limited_dict(): d = LimitedDict(10) d[1] = '1' d[2] = '2' assert list(d.items()) == [(1, '1'), (2, '2')] for no in '34567890': d[int(no)] = no assert list(d.items()) == [ (1, '1'), (2, '2'), (3, '3'), (4, '4'), (5, '5'), (6, '6'), (7, '7'), (8, '8'), (9, '9'), (0, '0'), ] d[11] = '11' # reduce size to 9 after setting 11 assert len(d) == 9 assert list(d.items()) == [ (3, '3'), (4, '4'), (5, '5'), (6, '6'), (7, '7'), (8, '8'), (9, '9'), (0, '0'), (11, '11'), ] d[12] = '12' assert len(d) == 10 d[13] = '13' assert len(d) == 9 def test_construct_generic_model_with_validation(): T = TypeVar('T') class Page(BaseModel, Generic[T]): page: int = Field(ge=42) items: Sequence[T] unenforced: PositiveInt = Field(lt=10) with pytest.raises(ValidationError) as exc_info: Page[int](page=41, items=[], unenforced=11) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'ge': 42}, 'input': 41, 'loc': ('page',), 'msg': 'Input should be greater than or equal to 42', 'type': 'greater_than_equal', }, { 'ctx': {'lt': 10}, 'input': 11, 'loc': ('unenforced',), 'msg': 'Input should be less than 10', 'type': 'less_than', }, ] def test_construct_other_generic_model_with_validation(): # based on the test-case from https://github.com/samuelcolvin/pydantic/issues/2581 T = TypeVar('T') class Page(BaseModel, Generic[T]): page: int = Field(ge=42) items: Sequence[T] # Check we can perform this assignment, this is the actual test concrete_model = Page[str] print(concrete_model) assert concrete_model.__name__ == 'Page[str]' # Sanity check the 
resulting type works as expected valid = concrete_model(page=42, items=[]) assert valid.page == 42 with pytest.raises(ValidationError) as exc_info: concrete_model(page=41, items=[]) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'ge': 42}, 'input': 41, 'loc': ('page',), 'msg': 'Input should be greater than or equal to 42', 'type': 'greater_than_equal', } ] def test_generic_enum_bound(): T = TypeVar('T', bound=Enum) class MyEnum(Enum): a = 1 class OtherEnum(Enum): b = 2 class Model(BaseModel, Generic[T]): x: T m = Model(x=MyEnum.a) assert m.x == MyEnum.a with pytest.raises(ValidationError) as exc_info: Model(x=1) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'class': 'Enum'}, 'input': 1, 'loc': ('x',), 'msg': 'Input should be an instance of Enum', 'type': 'is_instance_of', } ] m2 = Model[MyEnum](x=MyEnum.a) assert m2.x == MyEnum.a with pytest.raises(ValidationError) as exc_info: Model[MyEnum](x=OtherEnum.b) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'expected': '1'}, 'input': OtherEnum.b, 'loc': ('x',), 'msg': 'Input should be 1', 'type': 'enum', } ] # insert_assert(Model[MyEnum].model_json_schema()) assert Model[MyEnum].model_json_schema() == { '$defs': {'MyEnum': {'enum': [1], 'title': 'MyEnum', 'type': 'integer'}}, 'properties': {'x': {'$ref': '#/$defs/MyEnum'}}, 'required': ['x'], 'title': 'Model[test_generic_enum_bound..MyEnum]', 'type': 'object', } def test_generic_intenum_bound(): T = TypeVar('T', bound=IntEnum) class MyEnum(IntEnum): a = 1 class OtherEnum(IntEnum): b = 2 class Model(BaseModel, Generic[T]): x: T m = Model(x=MyEnum.a) assert m.x == MyEnum.a with pytest.raises(ValidationError) as exc_info: Model(x=1) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'class': 'IntEnum'}, 'input': 1, 'loc': ('x',), 'msg': 'Input should be an instance of IntEnum', 'type': 'is_instance_of', } ] m2 = Model[MyEnum](x=MyEnum.a) assert m2.x == MyEnum.a with pytest.raises(ValidationError) as exc_info: 
Model[MyEnum](x=OtherEnum.b) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'expected': '1'}, 'input': 2, 'loc': ('x',), 'msg': 'Input should be 1', 'type': 'enum', } ] # insert_assert(Model[MyEnum].model_json_schema()) assert Model[MyEnum].model_json_schema() == { '$defs': {'MyEnum': {'enum': [1], 'title': 'MyEnum', 'type': 'integer'}}, 'properties': {'x': {'$ref': '#/$defs/MyEnum'}}, 'required': ['x'], 'title': 'Model[test_generic_intenum_bound..MyEnum]', 'type': 'object', } @pytest.mark.skipif(sys.version_info < (3, 11), reason='requires python 3.11 or higher') @pytest.mark.xfail( reason='TODO: Variadic generic parametrization is not supported yet;' ' Issue: https://github.com/pydantic/pydantic/issues/5804' ) def test_variadic_generic_init(): class ComponentModel(BaseModel): pass class Wrench(ComponentModel): pass class Screwdriver(ComponentModel): pass ComponentVar = TypeVar('ComponentVar', bound=ComponentModel) NumberOfComponents = TypeVarTuple('NumberOfComponents') class VariadicToolbox(BaseModel, Generic[ComponentVar, Unpack[NumberOfComponents]]): main_component: ComponentVar left_component_pocket: Optional[list[ComponentVar]] = Field(default_factory=list) right_component_pocket: Optional[list[ComponentVar]] = Field(default_factory=list) @computed_field @property def all_components(self) -> tuple[ComponentVar, Unpack[NumberOfComponents]]: return (self.main_component, *self.left_component_pocket, *self.right_component_pocket) sa, sb, w = Screwdriver(), Screwdriver(), Wrench() my_toolbox = VariadicToolbox[Screwdriver, Screwdriver, Wrench]( main_component=sa, left_component_pocket=[w], right_component_pocket=[sb] ) assert my_toolbox.all_components == [sa, w, sb] @pytest.mark.skipif(sys.version_info < (3, 11), reason='requires python 3.11 or higher') @pytest.mark.xfail( reason='TODO: Variadic fields are not supported yet; Issue: https://github.com/pydantic/pydantic/issues/5804' ) def test_variadic_generic_with_variadic_fields(): class 
ComponentModel(BaseModel): pass class Wrench(ComponentModel): pass class Screwdriver(ComponentModel): pass ComponentVar = TypeVar('ComponentVar', bound=ComponentModel) NumberOfComponents = TypeVarTuple('NumberOfComponents') class VariadicToolbox(BaseModel, Generic[ComponentVar, Unpack[NumberOfComponents]]): toolbelt_cm_size: Optional[tuple[Unpack[NumberOfComponents]]] = Field(default_factory=tuple) manual_toolset: Optional[tuple[ComponentVar, Unpack[NumberOfComponents]]] = Field(default_factory=tuple) MyToolboxClass = VariadicToolbox[Screwdriver, Screwdriver, Wrench] sa, sb, w = Screwdriver(), Screwdriver(), Wrench() MyToolboxClass(toolbelt_cm_size=(5, 10.5, 4), manual_toolset=(sa, sb, w)) with pytest.raises(TypeError): # Should raise error because integer 5 does not meet the bound requirements of ComponentVar MyToolboxClass(manual_toolset=(sa, sb, 5)) @pytest.mark.skipif( sys.version_info < (3, 11), reason=( 'Multiple inheritance with NamedTuple and the corresponding type annotations' " aren't supported before Python 3.11" ), ) def test_generic_namedtuple(): T = TypeVar('T') class FlaggedValue(NamedTuple, Generic[T]): value: T flag: bool class Model(BaseModel): f_value: FlaggedValue[float] assert Model(f_value=(1, True)).model_dump() == {'f_value': (1, True)} with pytest.raises(ValidationError): Model(f_value=(1, 'abc')) with pytest.raises(ValidationError): Model(f_value=('abc', True)) def test_generic_none(): T = TypeVar('T') class Container(BaseModel, Generic[T]): value: T assert Container[type(None)](value=None).value is None assert Container[None](value=None).value is None @pytest.mark.skipif(platform.python_implementation() == 'PyPy', reason='PyPy does not allow ParamSpec in generics') def test_paramspec_is_usable(): # This used to cause a recursion error due to `P in P is True` # This test doesn't actually test that ParamSpec works properly for validation or anything. 
P = ParamSpec('P') class MyGenericParamSpecClass(Generic[P]): def __init__(self, func: Callable[P, None], *args: P.args, **kwargs: P.kwargs) -> None: super().__init__() class ParamSpecGenericModel(BaseModel, Generic[P]): my_generic: MyGenericParamSpecClass[P] model_config = dict(arbitrary_types_allowed=True) def test_parametrize_with_basemodel(): T = TypeVar('T') class SimpleGenericModel(BaseModel, Generic[T]): pass class Concrete(SimpleGenericModel[BaseModel]): pass def test_no_generic_base(): T = TypeVar('T') class A(BaseModel, Generic[T]): a: T class B(A[T]): b: T class C(B[int]): pass assert C(a='1', b='2').model_dump() == {'a': 1, 'b': 2} with pytest.raises(ValidationError) as exc_info: C(a='a', b='b') assert exc_info.value.errors(include_url=False) == [ { 'input': 'a', 'loc': ('a',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'type': 'int_parsing', }, { 'input': 'b', 'loc': ('b',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'type': 'int_parsing', }, ] def test_reverse_order_generic_hashability(): T = TypeVar('T') with pytest.warns( GenericBeforeBaseModelWarning, match='Classes should inherit from `BaseModel` before generic classes', ): class Model(Generic[T], BaseModel): x: T model_config = dict(frozen=True) m1 = Model[int](x=1) m2 = Model[int](x=1) assert len({m1, m2}) == 1 def test_serialize_unsubstituted_typevars_bound() -> None: class ErrorDetails(BaseModel): foo: str # This version of `TypeVar` does not support `default` on Python <3.12 ErrorDataT = TypeVar('ErrorDataT', bound=ErrorDetails) class Error(BaseModel, Generic[ErrorDataT]): message: str details: ErrorDataT class MyErrorDetails(ErrorDetails): bar: str sample_error = Error( message='We just had an error', details=MyErrorDetails(foo='var', bar='baz'), ) assert sample_error.details.model_dump() == { 'foo': 'var', 'bar': 'baz', } assert sample_error.model_dump() == { 'message': 'We just had an error', 'details': { 'foo': 
'var', 'bar': 'baz', }, } sample_error = Error[ErrorDetails]( message='We just had an error', details=MyErrorDetails(foo='var', bar='baz'), ) assert sample_error.details.model_dump() == { 'foo': 'var', 'bar': 'baz', } assert sample_error.model_dump() == { 'message': 'We just had an error', 'details': { 'foo': 'var', }, } sample_error = Error[MyErrorDetails]( message='We just had an error', details=MyErrorDetails(foo='var', bar='baz'), ) assert sample_error.details.model_dump() == { 'foo': 'var', 'bar': 'baz', } assert sample_error.model_dump() == { 'message': 'We just had an error', 'details': { 'foo': 'var', 'bar': 'baz', }, } def test_serialize_unsubstituted_typevars_bound_default_supported() -> None: class ErrorDetails(BaseModel): foo: str # This version of `TypeVar` always support `default` ErrorDataT = TypingExtensionsTypeVar('ErrorDataT', bound=ErrorDetails) class Error(BaseModel, Generic[ErrorDataT]): message: str details: ErrorDataT class MyErrorDetails(ErrorDetails): bar: str sample_error = Error( message='We just had an error', details=MyErrorDetails(foo='var', bar='baz'), ) assert sample_error.details.model_dump() == { 'foo': 'var', 'bar': 'baz', } assert sample_error.model_dump() == { 'message': 'We just had an error', 'details': { 'foo': 'var', 'bar': 'baz', }, } sample_error = Error[ErrorDetails]( message='We just had an error', details=MyErrorDetails(foo='var', bar='baz'), ) assert sample_error.details.model_dump() == { 'foo': 'var', 'bar': 'baz', } assert sample_error.model_dump() == { 'message': 'We just had an error', 'details': { 'foo': 'var', }, } sample_error = Error[MyErrorDetails]( message='We just had an error', details=MyErrorDetails(foo='var', bar='baz'), ) assert sample_error.details.model_dump() == { 'foo': 'var', 'bar': 'baz', } assert sample_error.model_dump() == { 'message': 'We just had an error', 'details': { 'foo': 'var', 'bar': 'baz', }, } @pytest.mark.parametrize( 'type_var', [ TypingExtensionsTypeVar('ErrorDataT', 
default=BaseModel), TypeVar('ErrorDataT', BaseModel, str), ], ids=['default', 'constraint'], ) def test_serialize_unsubstituted_typevars_variants( type_var: Type[BaseModel], ) -> None: class ErrorDetails(BaseModel): foo: str class Error(BaseModel, Generic[type_var]): # type: ignore message: str details: type_var class MyErrorDetails(ErrorDetails): bar: str sample_error = Error( message='We just had an error', details=MyErrorDetails(foo='var', bar='baz'), ) assert sample_error.details.model_dump() == { 'foo': 'var', 'bar': 'baz', } assert sample_error.model_dump() == { 'message': 'We just had an error', 'details': {}, } sample_error = Error[ErrorDetails]( message='We just had an error', details=MyErrorDetails(foo='var', bar='baz'), ) assert sample_error.details.model_dump() == { 'foo': 'var', 'bar': 'baz', } assert sample_error.model_dump() == { 'message': 'We just had an error', 'details': { 'foo': 'var', }, } sample_error = Error[MyErrorDetails]( message='We just had an error', details=MyErrorDetails(foo='var', bar='baz'), ) assert sample_error.details.model_dump() == { 'foo': 'var', 'bar': 'baz', } assert sample_error.model_dump() == { 'message': 'We just had an error', 'details': { 'foo': 'var', 'bar': 'baz', }, } def test_mix_default_and_constraints() -> None: T = TypingExtensionsTypeVar('T', str, int, default=str) msg = 'Pydantic does not support mixing more than one of TypeVar bounds, constraints and defaults' with pytest.raises(NotImplementedError, match=msg): class _(BaseModel, Generic[T]): x: T def test_generic_with_not_required_in_typed_dict() -> None: T = TypingExtensionsTypeVar('T') class FooStr(TypedDict): type: NotRequired[str] class FooGeneric(TypedDict, Generic[T]): type: NotRequired[T] ta_foo_str = TypeAdapter(FooStr) assert ta_foo_str.validate_python({'type': 'tomato'}) == {'type': 'tomato'} assert ta_foo_str.validate_python({}) == {} ta_foo_generic = TypeAdapter(FooGeneric[str]) assert ta_foo_generic.validate_python({'type': 'tomato'}) == 
{'type': 'tomato'}
    assert ta_foo_generic.validate_python({}) == {}


def test_generic_with_allow_extra():
    T = TypeVar('T')

    # This used to raise an error related to accessing the __annotations__ attribute of the Generic class
    class AllowExtraGeneric(BaseModel, Generic[T], extra='allow'):
        data: T


def test_generic_field():
    """Test for https://github.com/pydantic/pydantic/issues/10039.

    This was originally fixed by defining a custom MRO for Pydantic models, but the fix
    from https://github.com/pydantic/pydantic/pull/10666 seemed better.
    Test is still kept for historical purposes.
    """
    T = TypeVar('T')

    class A(BaseModel, Generic[T]): ...

    class B(A[T]): ...

    class C(B[bool]): ...

    class Model(BaseModel):
        input_bool: A[bool]

    Model(input_bool=C())


def test_generic_any_or_never() -> None:
    T = TypeVar('T')

    class GenericModel(BaseModel, Generic[T]):
        f: Union[T, int]

    any_json_schema = GenericModel[Any].model_json_schema()
    assert any_json_schema['properties']['f'] == {'title': 'F'}  # any type

    never_json_schema = GenericModel[Never].model_json_schema()
    assert never_json_schema['properties']['f'] == {'type': 'integer', 'title': 'F'}


def test_revalidation_against_any() -> None:
    T = TypeVar('T')

    class ResponseModel(BaseModel, Generic[T]):
        content: T

    class Product(BaseModel):
        name: str
        price: float

    class Order(BaseModel):
        id: int
        product: ResponseModel[Any]

    product = Product(name='Apple', price=0.5)
    response1: ResponseModel[Any] = ResponseModel[Any](content=product)
    response2: ResponseModel[Any] = ResponseModel(content=product)
    response3: ResponseModel[Any] = ResponseModel[Product](content=product)
    for response in response1, response2, response3:
        order = Order(id=1, product=response)
        assert isinstance(order.product.content, Product)


def test_revalidation_without_explicit_parametrization() -> None:
    """Note, this is seen in the test above as well, but is added here for thoroughness."""
    T1 = TypeVar('T1', bound=BaseModel)

    class InnerModel(BaseModel, Generic[T1]):
        model: T1

    T2 = TypeVar('T2',
bound=InnerModel)

    class OuterModel(BaseModel, Generic[T2]):
        inner: T2

    class MyModel(BaseModel):
        foo: int

    # Construct two instances, with and without generic annotation in the constructor:
    inner1 = InnerModel[MyModel](model=MyModel(foo=42))
    inner2 = InnerModel(model=MyModel(foo=42))
    assert inner1 == inner2

    outer1 = OuterModel[InnerModel[MyModel]](inner=inner1)
    outer2 = OuterModel[InnerModel[MyModel]](inner=inner2)
    # implies that validation succeeds for both
    assert outer1 == outer2


def test_revalidation_with_basic_inference() -> None:
    T = TypeVar('T')

    class Inner(BaseModel, Generic[T]):
        inner: T

    class Holder(BaseModel, Generic[T]):
        inner: Inner[T]

    holder1 = Holder[int](inner=Inner[int](inner=1))
    holder2 = Holder(inner=Inner(inner=1))
    # implies that validation succeeds for both
    assert holder1 == holder2
pydantic-2.10.6/tests/test_internal.py
"""
Tests for internal things that are complex enough to warrant their own unit tests.
""" from dataclasses import dataclass from decimal import Decimal import pytest from pydantic_core import CoreSchema, SchemaValidator from pydantic_core import core_schema as cs from pydantic._internal._core_utils import ( Walk, collect_invalid_schemas, simplify_schema_references, walk_core_schema, ) from pydantic._internal._repr import Representation from pydantic._internal._validators import _extract_decimal_digits_info def remove_metadata(schema: CoreSchema) -> CoreSchema: def inner(s: CoreSchema, recurse: Walk) -> CoreSchema: s = s.copy() s.pop('metadata', None) return recurse(s, inner) return walk_core_schema(schema, inner) @pytest.mark.parametrize( 'input_schema,inlined', [ # Test case 1: Simple schema with no references (cs.list_schema(cs.int_schema()), cs.list_schema(cs.int_schema())), # Test case 2: Schema with single-level nested references ( cs.definitions_schema( cs.list_schema(cs.definition_reference_schema('list_of_ints')), definitions=[ cs.list_schema(cs.definition_reference_schema('int'), ref='list_of_ints'), cs.int_schema(ref='int'), ], ), cs.list_schema(cs.list_schema(cs.int_schema(ref='int'), ref='list_of_ints')), ), # Test case 3: Schema with multiple single-level nested references ( cs.list_schema( cs.definitions_schema(cs.definition_reference_schema('int'), definitions=[cs.int_schema(ref='int')]) ), cs.list_schema(cs.int_schema(ref='int')), ), # Test case 4: A simple recursive schema ( cs.list_schema(cs.definition_reference_schema(schema_ref='list'), ref='list'), cs.definitions_schema( cs.definition_reference_schema(schema_ref='list'), definitions=[cs.list_schema(cs.definition_reference_schema(schema_ref='list'), ref='list')], ), ), # Test case 5: Deeply nested schema with multiple references ( cs.definitions_schema( cs.list_schema(cs.definition_reference_schema('list_of_lists_of_ints')), definitions=[ cs.list_schema(cs.definition_reference_schema('list_of_ints'), ref='list_of_lists_of_ints'), 
cs.list_schema(cs.definition_reference_schema('int'), ref='list_of_ints'), cs.int_schema(ref='int'), ], ), cs.list_schema( cs.list_schema( cs.list_schema(cs.int_schema(ref='int'), ref='list_of_ints'), ref='list_of_lists_of_ints' ) ), ), # Test case 6: More complex recursive schema ( cs.definitions_schema( cs.list_schema(cs.definition_reference_schema(schema_ref='list_of_ints_and_lists')), definitions=[ cs.list_schema( cs.definitions_schema( cs.definition_reference_schema(schema_ref='int_or_list'), definitions=[ cs.int_schema(ref='int'), cs.tuple_variable_schema( cs.definition_reference_schema(schema_ref='list_of_ints_and_lists'), ref='a tuple' ), ], ), ref='list_of_ints_and_lists', ), cs.int_schema(ref='int_or_list'), ], ), cs.list_schema(cs.list_schema(cs.int_schema(ref='int_or_list'), ref='list_of_ints_and_lists')), ), # Test case 7: Schema with multiple definitions and nested references, some of which are unused ( cs.definitions_schema( cs.list_schema(cs.definition_reference_schema('list_of_ints')), definitions=[ cs.list_schema( cs.definitions_schema( cs.definition_reference_schema('int'), definitions=[cs.int_schema(ref='int')] ), ref='list_of_ints', ) ], ), cs.list_schema(cs.list_schema(cs.int_schema(ref='int'), ref='list_of_ints')), ), # Test case 8: Reference is used in multiple places ( cs.definitions_schema( cs.union_schema( [ cs.definition_reference_schema('list_of_ints'), cs.tuple_variable_schema(cs.definition_reference_schema('int')), ] ), definitions=[ cs.list_schema(cs.definition_reference_schema('int'), ref='list_of_ints'), cs.int_schema(ref='int'), ], ), cs.definitions_schema( cs.union_schema( [ cs.list_schema(cs.definition_reference_schema('int'), ref='list_of_ints'), cs.tuple_variable_schema(cs.definition_reference_schema('int')), ] ), definitions=[cs.int_schema(ref='int')], ), ), # Test case 9: https://github.com/pydantic/pydantic/issues/6270 ( cs.definitions_schema( cs.definition_reference_schema('model'), definitions=[ cs.typed_dict_schema( { 
                            'a': cs.typed_dict_field(
                                cs.nullable_schema(
                                    cs.int_schema(ref='ref'),
                                ),
                            ),
                            'b': cs.typed_dict_field(
                                cs.nullable_schema(
                                    cs.int_schema(ref='ref'),
                                ),
                            ),
                        },
                        ref='model',
                    ),
                ],
            ),
            cs.definitions_schema(
                cs.typed_dict_schema(
                    {
                        'a': cs.typed_dict_field(
                            cs.nullable_schema(cs.definition_reference_schema(schema_ref='ref')),
                        ),
                        'b': cs.typed_dict_field(
                            cs.nullable_schema(cs.definition_reference_schema(schema_ref='ref')),
                        ),
                    },
                    ref='model',
                ),
                definitions=[
                    cs.int_schema(ref='ref'),
                ],
            ),
        ),
    ],
)
def test_build_schema_defs(input_schema: cs.CoreSchema, inlined: cs.CoreSchema):
    actual_inlined = remove_metadata(simplify_schema_references(input_schema))
    assert actual_inlined == inlined
    SchemaValidator(actual_inlined)  # check for validity


def test_representation_integrations():
    devtools = pytest.importorskip('devtools')

    @dataclass
    class Obj(Representation):
        int_attr: int = 42
        str_attr: str = 'Marvin'

    obj = Obj()

    assert str(devtools.debug.format(obj)).split('\n')[1:] == [
        '    Obj(',
        '        int_attr=42,',
        "        str_attr='Marvin',",
        '    ) (Obj)',
    ]
    assert list(obj.__rich_repr__()) == [('int_attr', 42), ('str_attr', 'Marvin')]


def test_schema_is_valid():
    assert collect_invalid_schemas(cs.none_schema()) is False
    assert collect_invalid_schemas(cs.invalid_schema()) is True
    assert collect_invalid_schemas(cs.nullable_schema(cs.invalid_schema())) is True


@pytest.mark.parametrize(
    'decimal,decimal_places,digits',
    [
        (Decimal('0.0'), 1, 1),
        (Decimal('0.'), 0, 1),
        (Decimal('0.000'), 3, 3),
        (Decimal('0.0001'), 4, 4),
        (Decimal('.0001'), 4, 4),
        (Decimal('123.123'), 3, 6),
        (Decimal('123.1230'), 4, 7),
    ],
)
def test_decimal_digits_calculation(decimal: Decimal, decimal_places: int, digits: int) -> None:
    assert _extract_decimal_digits_info(decimal) == (decimal_places, digits)

pydantic-2.10.6/tests/test_json.py

import json
import math
import re
import sys
from dataclasses import dataclass as vanilla_dataclass
from datetime import
date, datetime, time, timedelta, timezone from decimal import Decimal from enum import Enum from ipaddress import IPv4Address, IPv4Interface, IPv4Network, IPv6Address, IPv6Interface, IPv6Network from pathlib import Path from typing import Any, Generator, List, Optional, Pattern, Union from uuid import UUID import pytest from pydantic_core import CoreSchema, SchemaSerializer, core_schema from typing_extensions import Annotated from pydantic import ( AfterValidator, BaseModel, ConfigDict, GetCoreSchemaHandler, GetJsonSchemaHandler, NameEmail, PlainSerializer, RootModel, ) from pydantic._internal._config import ConfigWrapper from pydantic._internal._generate_schema import GenerateSchema from pydantic.color import Color from pydantic.dataclasses import dataclass as pydantic_dataclass from pydantic.deprecated.json import pydantic_encoder, timedelta_isoformat from pydantic.functional_serializers import ( field_serializer, ) from pydantic.json_schema import JsonSchemaValue from pydantic.type_adapter import TypeAdapter from pydantic.types import DirectoryPath, FilePath, SecretBytes, SecretStr, condecimal try: import email_validator except ImportError: email_validator = None pytestmark = pytest.mark.filterwarnings('ignore::DeprecationWarning') class MyEnum(Enum): foo = 'bar' snap = 'crackle' class MyModel(BaseModel): a: str = 'b' c: str = 'd' @pytest.mark.parametrize( 'ser_type,gen_value,json_output', [ (UUID, lambda: UUID('ebcdab58-6eb8-46fb-a190-d07a33e9eac8'), b'"ebcdab58-6eb8-46fb-a190-d07a33e9eac8"'), (IPv4Address, lambda: '192.168.0.1', b'"192.168.0.1"'), (Color, lambda: Color('#000'), b'"black"'), (Color, lambda: Color((1, 12, 123)), b'"#010c7b"'), (SecretStr, lambda: SecretStr('abcd'), b'"**********"'), (SecretStr, lambda: SecretStr(''), b'""'), (SecretBytes, lambda: SecretBytes(b'xyz'), b'"**********"'), (SecretBytes, lambda: SecretBytes(b''), b'""'), (IPv6Address, lambda: IPv6Address('::1:0:1'), b'"::1:0:1"'), (IPv4Interface, lambda: 
IPv4Interface('192.168.0.0/24'), b'"192.168.0.0/24"'), (IPv6Interface, lambda: IPv6Interface('2001:db00::/120'), b'"2001:db00::/120"'), (IPv4Network, lambda: IPv4Network('192.168.0.0/24'), b'"192.168.0.0/24"'), (IPv6Network, lambda: IPv6Network('2001:db00::/120'), b'"2001:db00::/120"'), (datetime, lambda: datetime(2032, 1, 1, 1, 1), b'"2032-01-01T01:01:00"'), (datetime, lambda: datetime(2032, 1, 1, 1, 1, tzinfo=timezone.utc), b'"2032-01-01T01:01:00Z"'), (datetime, lambda: datetime(2032, 1, 1), b'"2032-01-01T00:00:00"'), (time, lambda: time(12, 34, 56), b'"12:34:56"'), (timedelta, lambda: timedelta(days=12, seconds=34, microseconds=56), b'"P12DT34.000056S"'), (timedelta, lambda: timedelta(seconds=-1), b'"-PT1S"'), (set, lambda: {1, 2, 3}, b'[1,2,3]'), (frozenset, lambda: frozenset([1, 2, 3]), b'[1,2,3]'), (Generator[int, None, None], lambda: (v for v in range(4)), b'[0,1,2,3]'), (bytes, lambda: b'this is bytes', b'"this is bytes"'), (Decimal, lambda: Decimal('12.34'), b'"12.34"'), (MyModel, lambda: MyModel(), b'{"a":"b","c":"d"}'), (MyEnum, lambda: MyEnum.foo, b'"bar"'), (Pattern, lambda: re.compile('^regex$'), b'"^regex$"'), ], ) def test_json_serialization(ser_type, gen_value, json_output): ta: TypeAdapter[Any] = TypeAdapter(ser_type) assert ta.dump_json(gen_value()) == json_output @pytest.mark.skipif(not email_validator, reason='email_validator not installed') def test_json_serialization_email(): config_wrapper = ConfigWrapper({'arbitrary_types_allowed': False}) gen = GenerateSchema(config_wrapper, None) schema = gen.generate_schema(NameEmail) serializer = SchemaSerializer(schema) assert serializer.to_json(NameEmail('foo bar', 'foobaR@example.com')) == b'"foo bar "' @pytest.mark.skipif(sys.platform.startswith('win'), reason='paths look different on windows') def test_path_encoding(tmpdir): class PathModel(BaseModel): path: Path file_path: FilePath dir_path: DirectoryPath tmpdir = Path(tmpdir) file_path = tmpdir / 'bar' file_path.touch() dir_path = tmpdir / 'baz' 
dir_path.mkdir() model = PathModel(path=Path('/path/test/example/'), file_path=file_path, dir_path=dir_path) expected = f'{{"path": "/path/test/example", "file_path": "{file_path}", "dir_path": "{dir_path}"}}' assert json.dumps(model, default=pydantic_encoder) == expected def test_model_encoding(): class ModelA(BaseModel): x: int y: str class Model(BaseModel): a: float b: bytes c: Decimal d: ModelA m = Model(a=10.2, b='foobar', c='10.2', d={'x': 123, 'y': '123'}) assert m.model_dump() == {'a': 10.2, 'b': b'foobar', 'c': Decimal('10.2'), 'd': {'x': 123, 'y': '123'}} assert m.model_dump_json() == '{"a":10.2,"b":"foobar","c":"10.2","d":{"x":123,"y":"123"}}' assert m.model_dump_json(exclude={'b'}) == '{"a":10.2,"c":"10.2","d":{"x":123,"y":"123"}}' def test_subclass_encoding(): class SubDate(datetime): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: def val(v: datetime) -> SubDate: return SubDate.fromtimestamp(v.timestamp()) return core_schema.no_info_after_validator_function(val, handler(datetime)) class Model(BaseModel): a: datetime b: SubDate m = Model(a=datetime(2032, 1, 1, 1, 1), b=SubDate(2020, 2, 29, 12, 30)) assert m.model_dump() == {'a': datetime(2032, 1, 1, 1, 1), 'b': SubDate(2020, 2, 29, 12, 30)} assert m.model_dump_json() == '{"a":"2032-01-01T01:01:00","b":"2020-02-29T12:30:00"}' def test_subclass_custom_encoding(): class SubDt(datetime): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: def val(v: datetime) -> SubDt: return SubDt.fromtimestamp(v.timestamp()) return core_schema.no_info_after_validator_function(val, handler(datetime)) class SubDelta(timedelta): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: def val(v: timedelta) -> SubDelta: return cls(seconds=v.total_seconds()) return core_schema.no_info_after_validator_function(val, handler(timedelta)) class 
Model(BaseModel): a: SubDt b: SubDelta @field_serializer('a', when_used='json') def serialize_a(self, v: SubDt, _info): return v.strftime('%a, %d %b %C %H:%M:%S') model_config = ConfigDict(ser_json_timedelta='float') m = Model(a=SubDt(2032, 1, 1, 1, 1), b=SubDelta(hours=100)) assert m.model_dump() == {'a': SubDt(2032, 1, 1, 1, 1), 'b': SubDelta(days=4, seconds=14400)} assert m.model_dump(mode='json') == {'a': 'Thu, 01 Jan 20 01:01:00', 'b': 360000.0} assert m.model_dump_json() == '{"a":"Thu, 01 Jan 20 01:01:00","b":360000.0}' def test_invalid_model(): class Foo: pass with pytest.raises(TypeError): json.dumps(Foo, default=pydantic_encoder) @pytest.mark.parametrize( 'input,output', [ (timedelta(days=12, seconds=34, microseconds=56), 'P12DT0H0M34.000056S'), (timedelta(days=1001, hours=1, minutes=2, seconds=3, microseconds=654_321), 'P1001DT1H2M3.654321S'), (timedelta(seconds=-1), '-P1DT23H59M59.000000S'), (timedelta(), 'P0DT0H0M0.000000S'), ], ) def test_iso_timedelta(input, output): assert output == timedelta_isoformat(input) def test_custom_encoder(): class Model(BaseModel): x: timedelta y: Decimal z: date @field_serializer('x') def serialize_x(self, v: timedelta, _info): return f'{v.total_seconds():0.3f}s' @field_serializer('y') def serialize_y(self, v: Decimal, _info): return 'a decimal' assert Model(x=123, y=5, z='2032-06-01').model_dump_json() == '{"x":"123.000s","y":"a decimal","z":"2032-06-01"}' def test_iso_timedelta_simple(): class Model(BaseModel): x: timedelta m = Model(x=123) json_data = m.model_dump_json() assert json_data == '{"x":"PT2M3S"}' assert Model.model_validate_json(json_data).x == timedelta(seconds=123) def test_con_decimal_encode() -> None: """ Makes sure a decimal with decimal_places = 0, as well as one with places can handle a encode/decode roundtrip. 
""" class Obj(BaseModel): id: condecimal(gt=0, max_digits=22, decimal_places=0) price: Decimal = Decimal('0.01') json_data = '{"id":"1","price":"0.01"}' assert Obj(id=1).model_dump_json() == json_data assert Obj.model_validate_json(json_data) == Obj(id=1) def test_json_encoder_simple_inheritance(): class Parent(BaseModel): dt: datetime = datetime.now() timedt: timedelta = timedelta(hours=100) @field_serializer('dt') def serialize_dt(self, _v: datetime, _info): return 'parent_encoder' class Child(Parent): @field_serializer('timedt') def serialize_timedt(self, _v: timedelta, _info): return 'child_encoder' assert Child().model_dump_json() == '{"dt":"parent_encoder","timedt":"child_encoder"}' def test_encode_dataclass(): @vanilla_dataclass class Foo: bar: int spam: str f = Foo(bar=123, spam='apple pie') assert '{"bar": 123, "spam": "apple pie"}' == json.dumps(f, default=pydantic_encoder) def test_encode_pydantic_dataclass(): @pydantic_dataclass class Foo: bar: int spam: str f = Foo(bar=123, spam='apple pie') assert json.dumps(f, default=pydantic_encoder) == '{"bar": 123, "spam": "apple pie"}' def test_json_nested_encode_models(): class Phone(BaseModel): manufacturer: str number: int class User(BaseModel): name: str SSN: int birthday: datetime phone: Phone friend: Optional['User'] = None @field_serializer('birthday') def serialize_birthday(self, v: datetime, _info): return v.timestamp() @field_serializer('phone', when_used='unless-none') def serialize_phone(self, v: Phone, _info): return v.number @field_serializer('friend', when_used='unless-none') def serialize_user(self, v, _info): return v.SSN User.model_rebuild() iphone = Phone(manufacturer='Apple', number=18002752273) galaxy = Phone(manufacturer='Samsung', number=18007267864) timon = User(name='Timon', SSN=123, birthday=datetime(1993, 6, 1, tzinfo=timezone.utc), phone=iphone) pumbaa = User(name='Pumbaa', SSN=234, birthday=datetime(1993, 5, 15, tzinfo=timezone.utc), phone=galaxy) timon.friend = pumbaa assert 
iphone.model_dump_json() == '{"manufacturer":"Apple","number":18002752273}' assert ( pumbaa.model_dump_json() == '{"name":"Pumbaa","SSN":234,"birthday":737424000.0,"phone":18007267864,"friend":null}' ) assert ( timon.model_dump_json() == '{"name":"Timon","SSN":123,"birthday":738892800.0,"phone":18002752273,"friend":234}' ) def test_custom_encode_fallback_basemodel(): class MyExoticType: pass class Foo(BaseModel): x: MyExoticType @field_serializer('x') def serialize_x(self, _v: MyExoticType, _info): return 'exo' model_config = ConfigDict(arbitrary_types_allowed=True) class Bar(BaseModel): foo: Foo assert Bar(foo=Foo(x=MyExoticType())).model_dump_json() == '{"foo":{"x":"exo"}}' def test_recursive(create_module): module = create_module( # language=Python """ from __future__ import annotations from typing import Optional from pydantic import BaseModel class Model(BaseModel): value: int nested: Optional[Model] = None """ ) M = module.Model assert M(value=1, nested=M(value=2)).model_dump_json(exclude_none=True) == '{"value":1,"nested":{"value":2}}' def test_resolve_ref_schema_recursive_model(): class Model(BaseModel): mini_me: Union['Model', None] @classmethod def __get_pydantic_json_schema__( cls, core_schema: CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: json_schema = super().__get_pydantic_json_schema__(core_schema, handler) json_schema = handler.resolve_ref_schema(json_schema) json_schema['examples'] = [{'foo': {'mini_me': None}}] return json_schema # insert_assert(Model.model_json_schema()) assert Model.model_json_schema() == { '$defs': { 'Model': { 'examples': [{'foo': {'mini_me': None}}], 'properties': {'mini_me': {'anyOf': [{'$ref': '#/$defs/Model'}, {'type': 'null'}]}}, 'required': ['mini_me'], 'title': 'Model', 'type': 'object', } }, '$ref': '#/$defs/Model', } def test_custom_json_encoder_config(): class Model(BaseModel): x: timedelta y: Decimal z: date model_config = ConfigDict( json_encoders={timedelta: lambda v: 
f'{v.total_seconds():0.3f}s', Decimal: lambda v: 'a decimal'} ) assert json.loads(Model(x=123, y=5, z='2032-06-01').model_dump_json()) == { 'x': '123.000s', 'y': 'a decimal', 'z': '2032-06-01', } def test_custom_iso_timedelta(): class Model(BaseModel): x: timedelta model_config = ConfigDict(json_encoders={timedelta: lambda _: 'P0DT0H2M3.000000S'}) m = Model(x=321) assert json.loads(m.model_dump_json()) == {'x': 'P0DT0H2M3.000000S'} def test_json_encoders_config_simple_inheritance(): """json_encoders is not "inheritable", this is different than v1 but much simpler""" class Parent(BaseModel): dt: datetime = datetime.now() timedt: timedelta = timedelta(hours=100) model_config = ConfigDict(json_encoders={timedelta: lambda _: 'parent_encoder'}) class Child(Parent): model_config = ConfigDict(json_encoders={datetime: lambda _: 'child_encoder'}) # insert_assert(Child().model_dump()) assert json.loads(Child().model_dump_json()) == {'dt': 'child_encoder', 'timedt': 'P4DT4H'} def test_custom_iso_timedelta_annotated(): class Model(BaseModel): # the json_encoders config applies to the type but the annotation overrides it y: timedelta x: Annotated[timedelta, AfterValidator(lambda x: x), PlainSerializer(lambda _: 'P0DT0H1M2.000000S')] model_config = ConfigDict(json_encoders={timedelta: lambda _: 'P0DT0H2M3.000000S'}) m = Model(x=321, y=456) assert json.loads(m.model_dump_json()) == {'x': 'P0DT0H1M2.000000S', 'y': 'P0DT0H2M3.000000S'} def test_json_encoders_on_model() -> None: """Make sure that applying json_encoders to a BaseModel does not edit its schema in place. 
""" class Model(BaseModel): x: int class Outer1(BaseModel): m: Model model_config = ConfigDict(json_encoders={Model: lambda x: 'encoded!'}) class Outer2(BaseModel): m: Model class Outermost(BaseModel): inner: Union[Outer1, Outer2] m = Outermost(inner=Outer1(m=Model(x=1))) # insert_assert(m.model_dump()) assert json.loads(m.model_dump_json()) == {'inner': {'m': 'encoded!'}} m = Outermost(inner=Outer2(m=Model(x=1))) # insert_assert(m.model_dump()) assert json.loads(m.model_dump_json()) == {'inner': {'m': {'x': 1}}} def test_json_encoders_not_used_for_python_dumps() -> None: class Model(BaseModel): x: int model_config = ConfigDict(json_encoders={int: lambda x: 'encoded!'}) m = Model(x=1) assert m.model_dump() == {'x': 1} assert m.model_dump_json() == '{"x":"encoded!"}' def test_json_encoders_types() -> None: class MyEnum(Enum): A = 'a' B = 'b' class A(BaseModel): a: MyEnum b: List[int] c: Decimal model_config = ConfigDict( json_encoders={Enum: lambda val: val.name, List[int]: lambda val: 'list!', Decimal: lambda val: 'decimal!'} ) m = A(a=MyEnum.A, b=[1, 2, 3], c=Decimal('0')) assert m.model_dump_json() == '{"a":"A","b":"list!","c":"decimal!"}' assert m.model_dump() == {'a': MyEnum.A, 'b': [1, 2, 3], 'c': Decimal('0')} @pytest.mark.parametrize( 'float_value,encoded_str', [ (float('inf'), 'Infinity'), (float('-inf'), '-Infinity'), (float('nan'), 'NaN'), ], ) def test_json_inf_nan_allow(float_value, encoded_str): class R(RootModel[float]): model_config = ConfigDict(ser_json_inf_nan='strings') r = R(float_value) r_encoded = f'"{encoded_str}"' assert r.model_dump_json() == r_encoded if math.isnan(float_value): assert math.isnan(R.model_validate_json(r_encoded).root) else: assert R.model_validate_json(r_encoded) == r class M(BaseModel): f: float model_config = R.model_config m = M(f=float_value) m_encoded = f'{{"f":{r_encoded}}}' assert m.model_dump_json() == m_encoded if math.isnan(float_value): assert math.isnan(M.model_validate_json(m_encoded).f) else: assert 
M.model_validate_json(m_encoded) == m


def test_json_bytes_base64_round_trip():
    class R(RootModel[bytes]):
        model_config = ConfigDict(ser_json_bytes='base64', val_json_bytes='base64')

    r = R(b'hello')
    r_encoded = '"aGVsbG8="'
    assert r.model_dump_json() == r_encoded
    assert R.model_validate_json(r_encoded) == r

    class M(BaseModel):
        key: bytes

        model_config = R.model_config

    m = M(key=b'hello')
    m_encoded = f'{{"key":{r_encoded}}}'
    assert m.model_dump_json() == m_encoded
    assert M.model_validate_json(m_encoded) == m


def test_json_bytes_hex_round_trip():
    class R(RootModel[bytes]):
        model_config = ConfigDict(ser_json_bytes='hex', val_json_bytes='hex')

    r = R(b'hello')
    r_encoded = '"68656c6c6f"'
    assert r.model_dump_json() == r_encoded
    assert R.model_validate_json(r_encoded) == r

    class M(BaseModel):
        key: bytes

        model_config = R.model_config

    m = M(key=b'hello')
    m_encoded = f'{{"key":{r_encoded}}}'
    assert m.model_dump_json() == m_encoded
    assert M.model_validate_json(m_encoded) == m

pydantic-2.10.6/tests/test_json_schema.py

import dataclasses
import importlib.metadata
import json
import math
import re
import sys
import typing
from datetime import date, datetime, time, timedelta
from decimal import Decimal
from enum import Enum, IntEnum
from ipaddress import IPv4Address, IPv4Interface, IPv4Network, IPv6Address, IPv6Interface, IPv6Network
from pathlib import Path
from typing import (
    Any,
    Callable,
    Deque,
    Dict,
    FrozenSet,
    Generic,
    Iterable,
    List,
    NamedTuple,
    NewType,
    Optional,
    Pattern,
    Sequence,
    Set,
    Tuple,
    Type,
    TypeVar,
    Union,
)
from uuid import UUID

import pytest
from dirty_equals import HasRepr
from packaging.version import Version
from pydantic_core import CoreSchema, SchemaValidator, core_schema, to_jsonable_python
from pydantic_core.core_schema import ValidatorFunctionWrapHandler
from typing_extensions import Annotated, Literal, Self, TypedDict, deprecated

import pydantic
from pydantic import (
AfterValidator, BaseModel, BeforeValidator, Field, GetCoreSchemaHandler, GetJsonSchemaHandler, ImportString, InstanceOf, PlainSerializer, PlainValidator, PydanticDeprecatedSince20, PydanticDeprecatedSince29, PydanticUserError, RootModel, ValidationError, WithJsonSchema, WrapValidator, computed_field, field_serializer, field_validator, ) from pydantic.color import Color from pydantic.config import ConfigDict from pydantic.dataclasses import dataclass from pydantic.errors import PydanticInvalidForJsonSchema from pydantic.json_schema import ( DEFAULT_REF_TEMPLATE, Examples, GenerateJsonSchema, JsonSchemaValue, PydanticJsonSchemaWarning, SkipJsonSchema, model_json_schema, models_json_schema, ) from pydantic.networks import ( AnyUrl, EmailStr, IPvAnyAddress, IPvAnyInterface, IPvAnyNetwork, NameEmail, _CoreMultiHostUrl, ) from pydantic.type_adapter import TypeAdapter from pydantic.types import ( UUID1, UUID3, UUID4, UUID5, ByteSize, DirectoryPath, FilePath, Json, NegativeFloat, NegativeInt, NewPath, NonNegativeFloat, NonNegativeInt, NonPositiveFloat, NonPositiveInt, PositiveFloat, PositiveInt, SecretBytes, SecretStr, StrictBool, StrictStr, StringConstraints, conbytes, condate, condecimal, confloat, conint, constr, ) try: import email_validator except ImportError: email_validator = None T = TypeVar('T') def test_by_alias(): class ApplePie(BaseModel): model_config = ConfigDict(title='Apple Pie') a: float = Field(alias='Snap') b: int = Field(10, alias='Crackle') assert ApplePie.model_json_schema() == { 'type': 'object', 'title': 'Apple Pie', 'properties': { 'Snap': {'type': 'number', 'title': 'Snap'}, 'Crackle': {'type': 'integer', 'title': 'Crackle', 'default': 10}, }, 'required': ['Snap'], } assert list(ApplePie.model_json_schema(by_alias=True)['properties'].keys()) == ['Snap', 'Crackle'] assert list(ApplePie.model_json_schema(by_alias=False)['properties'].keys()) == ['a', 'b'] def test_ref_template(): class KeyLimePie(BaseModel): x: str = None class ApplePie(BaseModel): 
model_config = ConfigDict(title='Apple Pie') a: float = None key_lime: Optional[KeyLimePie] = None assert ApplePie.model_json_schema(ref_template='foobar/{model}.json') == { 'title': 'Apple Pie', 'type': 'object', 'properties': { 'a': {'default': None, 'title': 'A', 'type': 'number'}, 'key_lime': { 'anyOf': [{'$ref': 'foobar/KeyLimePie.json'}, {'type': 'null'}], 'default': None, }, }, '$defs': { 'KeyLimePie': { 'title': 'KeyLimePie', 'type': 'object', 'properties': {'x': {'default': None, 'title': 'X', 'type': 'string'}}, } }, } assert ApplePie.model_json_schema()['properties']['key_lime'] == { 'anyOf': [{'$ref': '#/$defs/KeyLimePie'}, {'type': 'null'}], 'default': None, } json_schema = json.dumps(ApplePie.model_json_schema(ref_template='foobar/{model}.json')) assert 'foobar/KeyLimePie.json' in json_schema assert '#/$defs/KeyLimePie' not in json_schema def test_by_alias_generator(): class ApplePie(BaseModel): model_config = ConfigDict(alias_generator=lambda x: x.upper()) a: float b: int = 10 assert ApplePie.model_json_schema() == { 'title': 'ApplePie', 'type': 'object', 'properties': {'A': {'title': 'A', 'type': 'number'}, 'B': {'title': 'B', 'default': 10, 'type': 'integer'}}, 'required': ['A'], } assert ApplePie.model_json_schema(by_alias=False)['properties'].keys() == {'a', 'b'} def test_sub_model(): class Foo(BaseModel): """hello""" b: float class Bar(BaseModel): a: int b: Optional[Foo] = None assert Bar.model_json_schema() == { 'type': 'object', 'title': 'Bar', '$defs': { 'Foo': { 'type': 'object', 'title': 'Foo', 'description': 'hello', 'properties': {'b': {'type': 'number', 'title': 'B'}}, 'required': ['b'], } }, 'properties': { 'a': {'type': 'integer', 'title': 'A'}, 'b': {'anyOf': [{'$ref': '#/$defs/Foo'}, {'type': 'null'}], 'default': None}, }, 'required': ['a'], } def test_schema_class(): class Model(BaseModel): foo: int = Field(4, title='Foo is Great') bar: str = Field(description='this description of bar') with pytest.raises(ValidationError): Model() m 
= Model(bar='123') assert m.model_dump() == {'foo': 4, 'bar': '123'} assert Model.model_json_schema() == { 'type': 'object', 'title': 'Model', 'properties': { 'foo': {'type': 'integer', 'title': 'Foo is Great', 'default': 4}, 'bar': {'type': 'string', 'title': 'Bar', 'description': 'this description of bar'}, }, 'required': ['bar'], } def test_schema_repr(): s = Field(4, title='Foo is Great') assert str(s) == "annotation=NoneType required=False default=4 title='Foo is Great'" assert repr(s) == "FieldInfo(annotation=NoneType, required=False, default=4, title='Foo is Great')" def test_schema_class_by_alias(): class Model(BaseModel): foo: int = Field(4, alias='foofoo') assert list(Model.model_json_schema()['properties'].keys()) == ['foofoo'] assert list(Model.model_json_schema(by_alias=False)['properties'].keys()) == ['foo'] def test_choices(): FooEnum = Enum('FooEnum', {'foo': 'f', 'bar': 'b'}) BarEnum = IntEnum('BarEnum', {'foo': 1, 'bar': 2}) class SpamEnum(str, Enum): foo = 'f' bar = 'b' class Model(BaseModel): foo: FooEnum bar: BarEnum spam: SpamEnum = Field(None) # insert_assert(Model.model_json_schema()) assert Model.model_json_schema() == { '$defs': { 'BarEnum': {'enum': [1, 2], 'title': 'BarEnum', 'type': 'integer'}, 'FooEnum': {'enum': ['f', 'b'], 'title': 'FooEnum', 'type': 'string'}, 'SpamEnum': {'enum': ['f', 'b'], 'title': 'SpamEnum', 'type': 'string'}, }, 'properties': { 'foo': {'$ref': '#/$defs/FooEnum'}, 'bar': {'$ref': '#/$defs/BarEnum'}, 'spam': {'$ref': '#/$defs/SpamEnum', 'default': None}, }, 'required': ['foo', 'bar'], 'title': 'Model', 'type': 'object', } def test_enum_modify_schema(): class SpamEnum(str, Enum): """ Spam enum. 
""" foo = 'f' bar = 'b' @classmethod def __get_pydantic_json_schema__( cls, core_schema: CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: field_schema = handler(core_schema) field_schema = handler.resolve_ref_schema(field_schema) existing_comment = field_schema.get('$comment', '') field_schema['$comment'] = existing_comment + 'comment' # make sure this function is only called once field_schema['tsEnumNames'] = [e.name for e in cls] return field_schema class Model(BaseModel): spam: Optional[SpamEnum] = Field(None) # insert_assert(Model.model_json_schema()) assert Model.model_json_schema() == { '$defs': { 'SpamEnum': { '$comment': 'comment', 'description': 'Spam enum.', 'enum': ['f', 'b'], 'title': 'SpamEnum', 'tsEnumNames': ['foo', 'bar'], 'type': 'string', } }, 'properties': {'spam': {'anyOf': [{'$ref': '#/$defs/SpamEnum'}, {'type': 'null'}], 'default': None}}, 'title': 'Model', 'type': 'object', } def test_enum_schema_custom_field(): class FooBarEnum(str, Enum): foo = 'foo' bar = 'bar' class Model(BaseModel): pika: FooBarEnum = Field(alias='pikalias', title='Pikapika!', description='Pika is definitely the best!') bulbi: FooBarEnum = Field('foo', alias='bulbialias', title='Bulbibulbi!', description='Bulbi is not...') cara: FooBarEnum # insert_assert(Model.model_json_schema()) assert Model.model_json_schema() == { 'type': 'object', 'properties': { 'pikalias': { 'title': 'Pikapika!', 'description': 'Pika is definitely the best!', '$ref': '#/$defs/FooBarEnum', }, 'bulbialias': { '$ref': '#/$defs/FooBarEnum', 'default': 'foo', 'title': 'Bulbibulbi!', 'description': 'Bulbi is not...', }, 'cara': {'$ref': '#/$defs/FooBarEnum'}, }, 'required': ['pikalias', 'cara'], 'title': 'Model', '$defs': {'FooBarEnum': {'enum': ['foo', 'bar'], 'title': 'FooBarEnum', 'type': 'string'}}, } def test_enum_and_model_have_same_behaviour(): class Names(str, Enum): rick = 'Rick' morty = 'Morty' summer = 'Summer' class Pika(BaseModel): a: str class Foo(BaseModel): enum: Names 
titled_enum: Names = Field( ..., title='Title of enum', description='Description of enum', ) model: Pika titled_model: Pika = Field( ..., title='Title of model', description='Description of model', ) # insert_assert(Foo.model_json_schema()) assert Foo.model_json_schema() == { 'type': 'object', 'properties': { 'enum': {'$ref': '#/$defs/Names'}, 'titled_enum': { 'title': 'Title of enum', 'description': 'Description of enum', '$ref': '#/$defs/Names', }, 'model': {'$ref': '#/$defs/Pika'}, 'titled_model': { 'title': 'Title of model', 'description': 'Description of model', '$ref': '#/$defs/Pika', }, }, 'required': ['enum', 'titled_enum', 'model', 'titled_model'], 'title': 'Foo', '$defs': { 'Names': {'enum': ['Rick', 'Morty', 'Summer'], 'title': 'Names', 'type': 'string'}, 'Pika': { 'type': 'object', 'properties': {'a': {'type': 'string', 'title': 'A'}}, 'required': ['a'], 'title': 'Pika', }, }, } def test_enum_includes_extra_without_other_params(): class Names(str, Enum): rick = 'Rick' morty = 'Morty' summer = 'Summer' class Foo(BaseModel): enum: Names extra_enum: Names = Field(json_schema_extra={'extra': 'Extra field'}) assert Foo.model_json_schema() == { '$defs': { 'Names': { 'enum': ['Rick', 'Morty', 'Summer'], 'title': 'Names', 'type': 'string', }, }, 'properties': { 'enum': {'$ref': '#/$defs/Names'}, 'extra_enum': {'$ref': '#/$defs/Names', 'extra': 'Extra field'}, }, 'required': ['enum', 'extra_enum'], 'title': 'Foo', 'type': 'object', } def test_invalid_json_schema_extra(): class MyModel(BaseModel): model_config = ConfigDict(json_schema_extra=1) name: str with pytest.raises( ValueError, match=re.escape("model_config['json_schema_extra']=1 should be a dict, callable, or None") ): MyModel.model_json_schema() def test_list_enum_schema_extras(): class FoodChoice(str, Enum): spam = 'spam' egg = 'egg' chips = 'chips' class Model(BaseModel): foods: List[FoodChoice] = Field(examples=[['spam', 'egg']]) assert Model.model_json_schema() == { '$defs': { 'FoodChoice': { 'enum': 
                    ['spam', 'egg', 'chips'],
                'title': 'FoodChoice',
                'type': 'string',
            }
        },
        'properties': {
            'foods': {
                'title': 'Foods',
                'type': 'array',
                'items': {'$ref': '#/$defs/FoodChoice'},
                'examples': [['spam', 'egg']],
            },
        },
        'required': ['foods'],
        'title': 'Model',
        'type': 'object',
    }


def test_enum_schema_cleandoc():
    class FooBar(str, Enum):
        """
        This is docstring which needs to be cleaned up
        """

        foo = 'foo'
        bar = 'bar'

    class Model(BaseModel):
        enum: FooBar

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {'enum': {'$ref': '#/$defs/FooBar'}},
        'required': ['enum'],
        '$defs': {
            'FooBar': {
                'title': 'FooBar',
                'description': 'This is docstring which needs to be cleaned up',
                'enum': ['foo', 'bar'],
                'type': 'string',
            }
        },
    }


def test_decimal_json_schema():
    class Model(BaseModel):
        a: bytes = b'foobar'
        b: Decimal = Decimal('12.34')

    model_json_schema_validation = Model.model_json_schema(mode='validation')
    model_json_schema_serialization = Model.model_json_schema(mode='serialization')

    assert model_json_schema_validation == {
        'properties': {
            'a': {'default': 'foobar', 'format': 'binary', 'title': 'A', 'type': 'string'},
            'b': {'anyOf': [{'type': 'number'}, {'type': 'string'}], 'default': '12.34', 'title': 'B'},
        },
        'title': 'Model',
        'type': 'object',
    }
    assert model_json_schema_serialization == {
        'properties': {
            'a': {'default': 'foobar', 'format': 'binary', 'title': 'A', 'type': 'string'},
            'b': {'default': '12.34', 'title': 'B', 'type': 'string'},
        },
        'title': 'Model',
        'type': 'object',
    }


def test_list_sub_model():
    class Foo(BaseModel):
        a: float

    class Bar(BaseModel):
        b: List[Foo]

    assert Bar.model_json_schema() == {
        'title': 'Bar',
        'type': 'object',
        '$defs': {
            'Foo': {
                'title': 'Foo',
                'type': 'object',
                'properties': {'a': {'type': 'number', 'title': 'A'}},
                'required': ['a'],
            }
        },
        'properties': {'b': {'type': 'array', 'items': {'$ref': '#/$defs/Foo'}, 'title': 'B'}},
        'required': ['b'],
    }


def test_optional():
    class Model(BaseModel):
        a: Optional[str]

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {'a': {'anyOf': [{'type': 'string'}, {'type': 'null'}], 'title': 'A'}},
        'required': ['a'],
    }


def test_optional_modify_schema():
    class MyNone(Type[None]):
        @classmethod
        def __get_pydantic_core_schema__(
            cls, source_type: Any, handler: GetCoreSchemaHandler
        ) -> core_schema.CoreSchema:
            return core_schema.nullable_schema(core_schema.none_schema())

    class Model(BaseModel):
        x: MyNone

    assert Model.model_json_schema() == {
        'properties': {'x': {'title': 'X', 'type': 'null'}},
        'required': ['x'],
        'title': 'Model',
        'type': 'object',
    }


def test_any():
    class Model(BaseModel):
        a: Any
        b: object

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {
            'a': {'title': 'A'},
            'b': {'title': 'B'},
        },
        'required': ['a', 'b'],
    }


def test_set():
    class Model(BaseModel):
        a: Set[int]
        b: set
        c: set = {1}

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {
            'a': {'title': 'A', 'type': 'array', 'uniqueItems': True, 'items': {'type': 'integer'}},
            'b': {'title': 'B', 'type': 'array', 'items': {}, 'uniqueItems': True},
            'c': {'title': 'C', 'type': 'array', 'items': {}, 'default': [1], 'uniqueItems': True},
        },
        'required': ['a', 'b'],
    }


@pytest.mark.parametrize(
    'field_type,extra_props',
    [
        pytest.param(tuple, {'items': {}}, id='tuple'),
        pytest.param(Tuple, {'items': {}}, id='Tuple'),
        pytest.param(
            Tuple[str, int, Union[str, int, float], float],
            {
                'prefixItems': [
                    {'type': 'string'},
                    {'type': 'integer'},
                    {'anyOf': [{'type': 'string'}, {'type': 'integer'}, {'type': 'number'}]},
                    {'type': 'number'},
                ],
                'minItems': 4,
                'maxItems': 4,
            },
            id='Tuple[str, int, Union[str, int, float], float]',
        ),
        pytest.param(Tuple[str], {'prefixItems': [{'type': 'string'}], 'minItems': 1, 'maxItems': 1}, id='Tuple[str]'),
        pytest.param(Tuple[()], {'maxItems': 0, 'minItems': 0}, id='Tuple[()]'),
        pytest.param(Tuple[str, ...], {'items': {'type': 'string'}, 'type': 'array'},
                     id='Tuple[str, ...]'),
    ],
)
def test_tuple(field_type, extra_props):
    class Model(BaseModel):
        a: field_type

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {'a': {'title': 'A', 'type': 'array', **extra_props}},
        'required': ['a'],
    }
    ta = TypeAdapter(field_type)
    assert ta.json_schema() == {'type': 'array', **extra_props}


def test_deque():
    class Model(BaseModel):
        a: Deque[str]

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {'a': {'title': 'A', 'type': 'array', 'items': {'type': 'string'}}},
        'required': ['a'],
    }


def test_bool():
    class Model(BaseModel):
        a: bool

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {'a': {'title': 'A', 'type': 'boolean'}},
        'required': ['a'],
    }


def test_strict_bool():
    class Model(BaseModel):
        a: StrictBool

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {'a': {'title': 'A', 'type': 'boolean'}},
        'required': ['a'],
    }


def test_dict():
    class Model(BaseModel):
        a: dict

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {'a': {'title': 'A', 'type': 'object'}},
        'required': ['a'],
    }


def test_list():
    class Model(BaseModel):
        a: list

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {'a': {'title': 'A', 'type': 'array', 'items': {}}},
        'required': ['a'],
    }


class Foo(BaseModel):
    a: float


@pytest.mark.parametrize(
    'field_type,expected_schema',
    [
        (
            Union[int, str],
            {
                'properties': {'a': {'title': 'A', 'anyOf': [{'type': 'integer'}, {'type': 'string'}]}},
                'required': ['a'],
            },
        ),
        (
            List[int],
            {'properties': {'a': {'title': 'A', 'type': 'array', 'items': {'type': 'integer'}}}, 'required': ['a']},
        ),
        (
            Dict[str, Foo],
            {
                '$defs': {
                    'Foo': {
                        'title': 'Foo',
                        'type': 'object',
                        'properties': {'a': {'title': 'A', 'type': 'number'}},
                        'required': ['a'],
                    }
                },
                'properties': {'a': {'title': 'A', 'type': 'object', 'additionalProperties':
                    {'$ref': '#/$defs/Foo'}}},
                'required': ['a'],
            },
        ),
        (
            Union[None, Foo],
            {
                '$defs': {
                    'Foo': {
                        'title': 'Foo',
                        'type': 'object',
                        'properties': {'a': {'title': 'A', 'type': 'number'}},
                        'required': ['a'],
                    }
                },
                'properties': {'a': {'anyOf': [{'$ref': '#/$defs/Foo'}, {'type': 'null'}]}},
                'required': ['a'],
                'title': 'Model',
                'type': 'object',
            },
        ),
        (
            Union[int, int],
            {'properties': {'a': {'title': 'A', 'type': 'integer'}}, 'required': ['a']},
        ),
        (Dict[str, Any], {'properties': {'a': {'title': 'A', 'type': 'object'}}, 'required': ['a']}),
    ],
)
def test_list_union_dict(field_type, expected_schema):
    class Model(BaseModel):
        a: field_type

    base_schema = {'title': 'Model', 'type': 'object'}
    base_schema.update(expected_schema)

    assert Model.model_json_schema() == base_schema


@pytest.mark.parametrize(
    'field_type,expected_schema',
    [
        (datetime, {'type': 'string', 'format': 'date-time'}),
        (date, {'type': 'string', 'format': 'date'}),
        (time, {'type': 'string', 'format': 'time'}),
        (timedelta, {'type': 'string', 'format': 'duration'}),
    ],
)
def test_date_types(field_type, expected_schema):
    class Model(BaseModel):
        a: field_type

    attribute_schema = {'title': 'A'}
    attribute_schema.update(expected_schema)

    base_schema = {'title': 'Model', 'type': 'object', 'properties': {'a': attribute_schema}, 'required': ['a']}

    assert Model.model_json_schema() == base_schema


@pytest.mark.parametrize(
    'field_type',
    [
        condate(),
        condate(gt=date(2010, 1, 1), lt=date(2021, 2, 2)),
        condate(ge=date(2010, 1, 1), le=date(2021, 2, 2)),
    ],
)
def test_date_constrained_types_no_constraints(field_type):
    """No constraints added, see https://github.com/json-schema-org/json-schema-spec/issues/116."""

    class Model(BaseModel):
        a: field_type

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {'a': {'title': 'A', 'type': 'string', 'format': 'date'}},
        'required': ['a'],
    }


def test_complex_types():
    class Model(BaseModel):
        a: complex

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {'a': {'title': 'A', 'type': 'string'}},
        'required': ['a'],
    }


@pytest.mark.parametrize(
    'field_type,expected_schema',
    [
        (Optional[str], {'properties': {'a': {'anyOf': [{'type': 'string'}, {'type': 'null'}], 'title': 'A'}}}),
        (
            Optional[bytes],
            {'properties': {'a': {'title': 'A', 'anyOf': [{'type': 'string', 'format': 'binary'}, {'type': 'null'}]}}},
        ),
        (
            Union[str, bytes],
            {
                'properties': {
                    'a': {'title': 'A', 'anyOf': [{'type': 'string'}, {'type': 'string', 'format': 'binary'}]}
                },
            },
        ),
        (
            Union[None, str, bytes],
            {
                'properties': {
                    'a': {
                        'title': 'A',
                        'anyOf': [{'type': 'string'}, {'type': 'string', 'format': 'binary'}, {'type': 'null'}],
                    }
                }
            },
        ),
    ],
)
def test_str_basic_types(field_type, expected_schema):
    class Model(BaseModel):
        a: field_type

    base_schema = {'title': 'Model', 'type': 'object', 'required': ['a']}
    base_schema.update(expected_schema)
    assert Model.model_json_schema() == base_schema


@pytest.mark.parametrize(
    'field_type,expected_schema',
    [
        (Pattern, {'type': 'string', 'format': 'regex'}),
        (Pattern[str], {'type': 'string', 'format': 'regex'}),
        (Pattern[bytes], {'type': 'string', 'format': 'regex'}),
    ],
)
def test_pattern(field_type, expected_schema) -> None:
    class Model(BaseModel):
        a: field_type

    expected_schema.update({'title': 'A'})
    full_schema = {'title': 'Model', 'type': 'object', 'required': ['a'], 'properties': {'a': expected_schema}}
    assert Model.model_json_schema() == full_schema


@pytest.mark.parametrize(
    'field_type,expected_schema',
    [
        (StrictStr, {'title': 'A', 'type': 'string'}),
        # (ConstrainedStr, {'title': 'A', 'type': 'string'}),
        (
            constr(min_length=3, max_length=5, pattern='^text$'),
            {'title': 'A', 'type': 'string', 'minLength': 3, 'maxLength': 5, 'pattern': '^text$'},
        ),
    ],
)
def test_str_constrained_types(field_type, expected_schema):
    class Model(BaseModel):
        a: field_type

    model_schema = Model.model_json_schema()
    assert model_schema['properties']['a'] == expected_schema

    base_schema = {'title': 'Model',
                   'type': 'object', 'properties': {'a': expected_schema}, 'required': ['a']}
    assert model_schema == base_schema


@pytest.mark.parametrize(
    'field_type,expected_schema',
    [
        (AnyUrl, {'title': 'A', 'type': 'string', 'format': 'uri', 'minLength': 1}),
        (
            Annotated[AnyUrl, Field(max_length=2**16)],
            {'title': 'A', 'type': 'string', 'format': 'uri', 'minLength': 1, 'maxLength': 2**16},
        ),
        (_CoreMultiHostUrl, {'title': 'A', 'type': 'string', 'format': 'multi-host-uri', 'minLength': 1}),
    ],
)
def test_special_str_types(field_type, expected_schema):
    class Model(BaseModel):
        a: field_type

    base_schema = {'title': 'Model', 'type': 'object', 'properties': {'a': {}}, 'required': ['a']}
    base_schema['properties']['a'] = expected_schema

    assert Model.model_json_schema() == base_schema


@pytest.mark.skipif(not email_validator, reason='email_validator not installed')
@pytest.mark.parametrize('field_type,expected_schema', [(EmailStr, 'email'), (NameEmail, 'name-email')])
def test_email_str_types(field_type, expected_schema):
    class Model(BaseModel):
        a: field_type

    base_schema = {
        'title': 'Model',
        'type': 'object',
        'properties': {'a': {'title': 'A', 'type': 'string'}},
        'required': ['a'],
    }
    base_schema['properties']['a']['format'] = expected_schema

    assert Model.model_json_schema() == base_schema


@pytest.mark.parametrize('field_type,inner_type', [(SecretBytes, 'string'), (SecretStr, 'string')])
def test_secret_types(field_type, inner_type):
    class Model(BaseModel):
        a: field_type

    base_schema = {
        'title': 'Model',
        'type': 'object',
        'properties': {'a': {'title': 'A', 'type': inner_type, 'writeOnly': True, 'format': 'password'}},
        'required': ['a'],
    }

    assert Model.model_json_schema() == base_schema


@pytest.mark.parametrize(
    'field_type,expected_schema',
    [
        # (ConstrainedInt, {}),
        (conint(gt=5, lt=10), {'exclusiveMinimum': 5, 'exclusiveMaximum': 10}),
        (conint(ge=5, le=10), {'minimum': 5, 'maximum': 10}),
        (conint(multiple_of=5), {'multipleOf': 5}),
        (PositiveInt, {'exclusiveMinimum': 0}),
        (NegativeInt,
                      {'exclusiveMaximum': 0}),
        (NonNegativeInt, {'minimum': 0}),
        (NonPositiveInt, {'maximum': 0}),
    ],
)
def test_special_int_types(field_type, expected_schema):
    class Model(BaseModel):
        a: field_type

    base_schema = {
        'title': 'Model',
        'type': 'object',
        'properties': {'a': {'title': 'A', 'type': 'integer'}},
        'required': ['a'],
    }
    base_schema['properties']['a'].update(expected_schema)

    assert Model.model_json_schema() == base_schema


@pytest.mark.parametrize(
    'field_type,expected_schema',
    [
        (confloat(gt=5, lt=10), {'exclusiveMinimum': 5, 'exclusiveMaximum': 10}),
        (confloat(ge=5, le=10), {'minimum': 5, 'maximum': 10}),
        (confloat(multiple_of=5), {'multipleOf': 5}),
        (PositiveFloat, {'exclusiveMinimum': 0}),
        (NegativeFloat, {'exclusiveMaximum': 0}),
        (NonNegativeFloat, {'minimum': 0}),
        (NonPositiveFloat, {'maximum': 0}),
    ],
)
def test_special_float_types(field_type, expected_schema):
    class Model(BaseModel):
        a: field_type

    base_schema = {
        'title': 'Model',
        'type': 'object',
        'properties': {'a': {'title': 'A', 'type': 'number'}},
        'required': ['a'],
    }
    base_schema['properties']['a'].update(expected_schema)

    assert Model.model_json_schema() == base_schema


@pytest.mark.parametrize(
    'field_type,expected_schema',
    [
        (condecimal(gt=5, lt=10), {'exclusiveMinimum': 5, 'exclusiveMaximum': 10}),
        (condecimal(ge=5, le=10), {'minimum': 5, 'maximum': 10}),
        (condecimal(multiple_of=5), {'multipleOf': 5}),
    ],
)
def test_special_decimal_types(field_type, expected_schema):
    class Model(BaseModel):
        a: field_type

    base_schema = {
        'title': 'Model',
        'type': 'object',
        'properties': {'a': {'anyOf': [{'type': 'number'}, {'type': 'string'}], 'title': 'A'}},
        'required': ['a'],
    }
    base_schema['properties']['a']['anyOf'][0].update(expected_schema)

    assert Model.model_json_schema() == base_schema


@pytest.mark.parametrize(
    'field_type,expected_schema',
    [(UUID, 'uuid'), (UUID1, 'uuid1'), (UUID3, 'uuid3'), (UUID4, 'uuid4'), (UUID5, 'uuid5')],
)
def test_uuid_types(field_type, expected_schema):
    class Model(BaseModel):
        a: field_type

    base_schema = {
        'title': 'Model',
        'type': 'object',
        'properties': {'a': {'title': 'A', 'type': 'string', 'format': 'uuid'}},
        'required': ['a'],
    }
    base_schema['properties']['a']['format'] = expected_schema

    assert Model.model_json_schema() == base_schema


@pytest.mark.parametrize(
    'field_type,expected_schema',
    [(FilePath, 'file-path'), (DirectoryPath, 'directory-path'), (NewPath, 'path'), (Path, 'path')],
)
def test_path_types(field_type, expected_schema):
    class Model(BaseModel):
        a: field_type

    base_schema = {
        'title': 'Model',
        'type': 'object',
        'properties': {'a': {'title': 'A', 'type': 'string', 'format': ''}},
        'required': ['a'],
    }
    base_schema['properties']['a']['format'] = expected_schema

    assert Model.model_json_schema() == base_schema


def test_json_type():
    class Model(BaseModel):
        a: Json
        b: Json[int]
        c: Json[Any]

    assert Model.model_json_schema() == {
        'properties': {
            'a': {'contentMediaType': 'application/json', 'contentSchema': {}, 'title': 'A', 'type': 'string'},
            'b': {
                'contentMediaType': 'application/json',
                'contentSchema': {'type': 'integer'},
                'title': 'B',
                'type': 'string',
            },
            'c': {'contentMediaType': 'application/json', 'contentSchema': {}, 'title': 'C', 'type': 'string'},
        },
        'required': ['a', 'b', 'c'],
        'title': 'Model',
        'type': 'object',
    }
    assert Model.model_json_schema(mode='serialization') == {
        'properties': {'a': {'title': 'A'}, 'b': {'title': 'B', 'type': 'integer'}, 'c': {'title': 'C'}},
        'required': ['a', 'b', 'c'],
        'title': 'Model',
        'type': 'object',
    }


def test_ipv4address_type():
    class Model(BaseModel):
        ip_address: IPv4Address

    model_schema = Model.model_json_schema()
    assert model_schema == {
        'title': 'Model',
        'type': 'object',
        'properties': {'ip_address': {'title': 'Ip Address', 'type': 'string', 'format': 'ipv4'}},
        'required': ['ip_address'],
    }


def test_ipv6address_type():
    class Model(BaseModel):
        ip_address: IPv6Address

    model_schema = Model.model_json_schema()
    assert model_schema == {
        'title': 'Model',
        'type': 'object',
        'properties':
            {'ip_address': {'title': 'Ip Address', 'type': 'string', 'format': 'ipv6'}},
        'required': ['ip_address'],
    }


def test_ipvanyaddress_type():
    class Model(BaseModel):
        ip_address: IPvAnyAddress

    model_schema = Model.model_json_schema()
    assert model_schema == {
        'title': 'Model',
        'type': 'object',
        'properties': {'ip_address': {'title': 'Ip Address', 'type': 'string', 'format': 'ipvanyaddress'}},
        'required': ['ip_address'],
    }


def test_ipv4interface_type():
    class Model(BaseModel):
        ip_interface: IPv4Interface

    model_schema = Model.model_json_schema()
    assert model_schema == {
        'title': 'Model',
        'type': 'object',
        'properties': {'ip_interface': {'title': 'Ip Interface', 'type': 'string', 'format': 'ipv4interface'}},
        'required': ['ip_interface'],
    }


def test_ipv6interface_type():
    class Model(BaseModel):
        ip_interface: IPv6Interface

    model_schema = Model.model_json_schema()
    assert model_schema == {
        'title': 'Model',
        'type': 'object',
        'properties': {'ip_interface': {'title': 'Ip Interface', 'type': 'string', 'format': 'ipv6interface'}},
        'required': ['ip_interface'],
    }


def test_ipvanyinterface_type():
    class Model(BaseModel):
        ip_interface: IPvAnyInterface

    model_schema = Model.model_json_schema()
    assert model_schema == {
        'title': 'Model',
        'type': 'object',
        'properties': {'ip_interface': {'title': 'Ip Interface', 'type': 'string', 'format': 'ipvanyinterface'}},
        'required': ['ip_interface'],
    }


def test_ipv4network_type():
    class Model(BaseModel):
        ip_network: IPv4Network

    model_schema = Model.model_json_schema()
    assert model_schema == {
        'title': 'Model',
        'type': 'object',
        'properties': {'ip_network': {'title': 'Ip Network', 'type': 'string', 'format': 'ipv4network'}},
        'required': ['ip_network'],
    }


def test_ipv6network_type():
    class Model(BaseModel):
        ip_network: IPv6Network

    model_schema = Model.model_json_schema()
    assert model_schema == {
        'title': 'Model',
        'type': 'object',
        'properties': {'ip_network': {'title': 'Ip Network', 'type': 'string', 'format': 'ipv6network'}},
        'required':
            ['ip_network'],
    }


def test_ipvanynetwork_type():
    class Model(BaseModel):
        ip_network: IPvAnyNetwork

    model_schema = Model.model_json_schema()
    assert model_schema == {
        'title': 'Model',
        'type': 'object',
        'properties': {'ip_network': {'title': 'Ip Network', 'type': 'string', 'format': 'ipvanynetwork'}},
        'required': ['ip_network'],
    }


@pytest.mark.parametrize(
    'type_,default_value',
    (
        (Callable, ...),
        (Callable, lambda x: x),
        (Callable[[int], int], ...),
        (Callable[[int], int], lambda x: x),
    ),
)
@pytest.mark.parametrize(
    'base_json_schema,properties',
    [
        (
            {'a': 'b'},
            {
                'callback': {'title': 'Callback', 'a': 'b'},
                'foo': {'title': 'Foo', 'type': 'integer'},
            },
        ),
        (
            None,
            {
                'foo': {'title': 'Foo', 'type': 'integer'},
            },
        ),
    ],
)
def test_callable_type(type_, default_value, base_json_schema, properties):
    class Model(BaseModel):
        callback: type_ = default_value
        foo: int

    with pytest.raises(PydanticInvalidForJsonSchema):
        Model.model_json_schema()

    class ModelWithOverride(BaseModel):
        callback: Annotated[type_, WithJsonSchema(base_json_schema)] = default_value
        foo: int

    if default_value is Ellipsis or base_json_schema is None:
        model_schema = ModelWithOverride.model_json_schema()
    else:
        with pytest.warns(
            PydanticJsonSchemaWarning,
            match='Default value .* is not JSON serializable; excluding'
            r' default from JSON schema \[non-serializable-default]',
        ):
            model_schema = ModelWithOverride.model_json_schema()
    assert model_schema['properties'] == properties


@pytest.mark.parametrize(
    'default_value,properties',
    (
        (Field(...), {'callback': {'title': 'Callback', 'type': 'integer'}}),
        (1, {'callback': {'default': 1, 'title': 'Callback', 'type': 'integer'}}),
    ),
)
def test_callable_type_with_fallback(default_value, properties):
    class Model(BaseModel):
        callback: Union[int, Callable[[int], int]] = default_value

    class MyGenerator(GenerateJsonSchema):
        ignored_warning_kinds = ()

    with pytest.warns(
        PydanticJsonSchemaWarning,
        match=re.escape('Cannot generate a JsonSchema for core_schema.CallableSchema [skipped-choice]'),
    ):
        model_schema = Model.model_json_schema(schema_generator=MyGenerator)
    assert model_schema['properties'] == properties


def test_byte_size_type():
    class Model(BaseModel):
        a: ByteSize
        b: ByteSize = ByteSize(1000000)
        c: ByteSize = Field(default='1MB', validate_default=True)

    assert Model.model_json_schema(mode='validation') == {
        'properties': {
            'a': {
                'anyOf': [
                    {'pattern': '^\\s*(\\d*\\.?\\d+)\\s*(\\w+)?', 'type': 'string'},
                    {'minimum': 0, 'type': 'integer'},
                ],
                'title': 'A',
            },
            'b': {
                'anyOf': [
                    {'pattern': '^\\s*(\\d*\\.?\\d+)\\s*(\\w+)?', 'type': 'string'},
                    {'minimum': 0, 'type': 'integer'},
                ],
                'default': 1000000,
                'title': 'B',
            },
            'c': {
                'anyOf': [
                    {'pattern': '^\\s*(\\d*\\.?\\d+)\\s*(\\w+)?', 'type': 'string'},
                    {'minimum': 0, 'type': 'integer'},
                ],
                'default': '1MB',
                'title': 'C',
            },
        },
        'required': ['a'],
        'title': 'Model',
        'type': 'object',
    }

    with pytest.warns(
        PydanticJsonSchemaWarning,
        match=re.escape(
            "Unable to serialize value '1MB' with the plain serializer; excluding default from JSON schema"
        ),
    ):
        assert Model.model_json_schema(mode='serialization') == {
            'properties': {
                'a': {'minimum': 0, 'title': 'A', 'type': 'integer'},
                'b': {'default': 1000000, 'minimum': 0, 'title': 'B', 'type': 'integer'},
                'c': {'minimum': 0, 'title': 'C', 'type': 'integer'},
            },
            'required': ['a'],
            'title': 'Model',
            'type': 'object',
        }


@pytest.mark.parametrize(
    'type_,default_value,properties',
    (
        (
            Dict[Any, Any],
            {(lambda x: x): 1},
            {'callback': {'title': 'Callback', 'type': 'object'}},
        ),
        (
            Union[int, Callable[[int], int]],
            lambda x: x,
            {'callback': {'title': 'Callback', 'type': 'integer'}},
        ),
    ),
)
def test_non_serializable_default(type_, default_value, properties):
    class Model(BaseModel):
        callback: type_ = default_value

    with pytest.warns(
        PydanticJsonSchemaWarning,
        match=(
            'Default value .* is not JSON serializable; excluding default from JSON schema '
            r'\[non-serializable-default\]'
        ),
    ):
        model_schema = Model.model_json_schema()
    assert model_schema['properties'] == properties
    assert model_schema.get('required') is None


def test_callable_fallback_with_non_serializable_default():
    class Model(BaseModel):
        callback: Union[int, Callable[[int], int]] = lambda x: x

    class MyGenerator(GenerateJsonSchema):
        ignored_warning_kinds = ()

    inner_match = (
        r'Default value .* is not JSON serializable; excluding default from JSON schema \[non-serializable-default\]'
    )
    outer_match = r'Cannot generate a JsonSchema for core_schema.CallableSchema \[skipped-choice\]'
    with pytest.warns(PydanticJsonSchemaWarning, match=outer_match):
        with pytest.warns(PydanticJsonSchemaWarning, match=inner_match):
            model_schema = Model.model_json_schema(schema_generator=MyGenerator)
    assert model_schema == {
        'properties': {'callback': {'title': 'Callback', 'type': 'integer'}},
        'title': 'Model',
        'type': 'object',
    }


def test_importstring_json_schema():
    class Model(BaseModel):
        a: ImportString

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {'a': {'title': 'A', 'type': 'string'}},
        'required': ['a'],
    }


def test_schema_overrides():
    class Foo(BaseModel):
        a: str

    class Bar(BaseModel):
        b: Foo = Foo(a='foo')

    class Baz(BaseModel):
        c: Optional[Bar]

    class Model(BaseModel):
        d: Baz

    model_schema = Model.model_json_schema()
    assert model_schema == {
        'title': 'Model',
        'type': 'object',
        '$defs': {
            'Foo': {
                'title': 'Foo',
                'type': 'object',
                'properties': {'a': {'title': 'A', 'type': 'string'}},
                'required': ['a'],
            },
            'Bar': {
                'title': 'Bar',
                'type': 'object',
                'properties': {'b': {'$ref': '#/$defs/Foo', 'default': {'a': 'foo'}}},
            },
            'Baz': {
                'title': 'Baz',
                'type': 'object',
                'properties': {'c': {'anyOf': [{'$ref': '#/$defs/Bar'}, {'type': 'null'}]}},
                'required': ['c'],
            },
        },
        'properties': {'d': {'$ref': '#/$defs/Baz'}},
        'required': ['d'],
    }


def test_schema_overrides_w_union():
    class Foo(BaseModel):
        pass

    class Bar(BaseModel):
        pass

    class Spam(BaseModel):
        a: Union[Foo, Bar] = Field(description='xxx')

    assert Spam.model_json_schema()['properties'] == {
        'a': {
            'title': 'A',
            'description': 'xxx',
            'anyOf': [{'$ref': '#/$defs/Foo'}, {'$ref': '#/$defs/Bar'}],
        },
    }


def test_schema_from_models():
    class Foo(BaseModel):
        a: str

    class Bar(BaseModel):
        b: Foo

    class Baz(BaseModel):
        c: Bar

    class Model(BaseModel):
        d: Baz

    class Ingredient(BaseModel):
        name: str

    class Pizza(BaseModel):
        name: str
        ingredients: List[Ingredient]

    json_schemas_map, model_schema = models_json_schema(
        [(Model, 'validation'), (Pizza, 'validation')],
        title='Multi-model schema',
        description='Single JSON Schema with multiple definitions',
    )
    assert json_schemas_map == {
        (Pizza, 'validation'): {'$ref': '#/$defs/Pizza'},
        (Model, 'validation'): {'$ref': '#/$defs/Model'},
    }
    assert model_schema == {
        'title': 'Multi-model schema',
        'description': 'Single JSON Schema with multiple definitions',
        '$defs': {
            'Pizza': {
                'title': 'Pizza',
                'type': 'object',
                'properties': {
                    'name': {'title': 'Name', 'type': 'string'},
                    'ingredients': {
                        'title': 'Ingredients',
                        'type': 'array',
                        'items': {'$ref': '#/$defs/Ingredient'},
                    },
                },
                'required': ['name', 'ingredients'],
            },
            'Ingredient': {
                'title': 'Ingredient',
                'type': 'object',
                'properties': {'name': {'title': 'Name', 'type': 'string'}},
                'required': ['name'],
            },
            'Model': {
                'title': 'Model',
                'type': 'object',
                'properties': {'d': {'$ref': '#/$defs/Baz'}},
                'required': ['d'],
            },
            'Baz': {
                'title': 'Baz',
                'type': 'object',
                'properties': {'c': {'$ref': '#/$defs/Bar'}},
                'required': ['c'],
            },
            'Bar': {
                'title': 'Bar',
                'type': 'object',
                'properties': {'b': {'$ref': '#/$defs/Foo'}},
                'required': ['b'],
            },
            'Foo': {
                'title': 'Foo',
                'type': 'object',
                'properties': {'a': {'title': 'A', 'type': 'string'}},
                'required': ['a'],
            },
        },
    }


def test_schema_with_refs():
    ref_template = '#/components/schemas/{model}'

    class Foo(BaseModel):
        a: str

    class Bar(BaseModel):
        b: Foo

    class Baz(BaseModel):
        c: Bar

    keys_map, model_schema = models_json_schema([(Bar, 'validation'), (Baz, 'validation')], ref_template=ref_template)
    assert keys_map == {
        (Bar, 'validation'): {'$ref': '#/components/schemas/Bar'},
        (Baz, 'validation'): {'$ref': '#/components/schemas/Baz'},
    }
    assert model_schema == {
        '$defs': {
            'Baz': {
                'title': 'Baz',
                'type': 'object',
                'properties': {'c': {'$ref': '#/components/schemas/Bar'}},
                'required': ['c'],
            },
            'Bar': {
                'title': 'Bar',
                'type': 'object',
                'properties': {'b': {'$ref': '#/components/schemas/Foo'}},
                'required': ['b'],
            },
            'Foo': {
                'title': 'Foo',
                'type': 'object',
                'properties': {'a': {'title': 'A', 'type': 'string'}},
                'required': ['a'],
            },
        }
    }


def test_schema_with_custom_ref_template():
    class Foo(BaseModel):
        a: str

    class Bar(BaseModel):
        b: Foo

    class Baz(BaseModel):
        c: Bar

    keys_map, model_schema = models_json_schema(
        [(Bar, 'validation'), (Baz, 'validation')], ref_template='/schemas/{model}.json#/'
    )
    assert keys_map == {
        (Bar, 'validation'): {'$ref': '/schemas/Bar.json#/'},
        (Baz, 'validation'): {'$ref': '/schemas/Baz.json#/'},
    }
    assert model_schema == {
        '$defs': {
            'Baz': {
                'title': 'Baz',
                'type': 'object',
                'properties': {'c': {'$ref': '/schemas/Bar.json#/'}},
                'required': ['c'],
            },
            'Bar': {
                'title': 'Bar',
                'type': 'object',
                'properties': {'b': {'$ref': '/schemas/Foo.json#/'}},
                'required': ['b'],
            },
            'Foo': {
                'title': 'Foo',
                'type': 'object',
                'properties': {'a': {'title': 'A', 'type': 'string'}},
                'required': ['a'],
            },
        }
    }


def test_schema_ref_template_key_error():
    class Foo(BaseModel):
        a: str

    class Bar(BaseModel):
        b: Foo

    class Baz(BaseModel):
        c: Bar

    with pytest.raises(KeyError):
        models_json_schema([(Bar, 'validation'), (Baz, 'validation')], ref_template='/schemas/{bad_name}.json#/')


def test_external_ref():
    """https://github.com/pydantic/pydantic/issues/9783"""

    class Model(BaseModel):
        json_schema: Annotated[
            dict,
            WithJsonSchema({'$ref': 'https://json-schema.org/draft/2020-12/schema'}),
        ]

    assert Model.model_json_schema() == {
        'properties': {'json_schema': {'$ref': 'https://json-schema.org/draft/2020-12/schema', 'title': 'Json Schema'}},
        'required': ['json_schema'],
        'title':
            'Model',
        'type': 'object',
    }


def test_schema_no_definitions():
    keys_map, model_schema = models_json_schema([], title='Schema without definitions')
    assert keys_map == {}
    assert model_schema == {'title': 'Schema without definitions'}


def test_list_default():
    class UserModel(BaseModel):
        friends: List[int] = [1]

    assert UserModel.model_json_schema() == {
        'title': 'UserModel',
        'type': 'object',
        'properties': {'friends': {'title': 'Friends', 'default': [1], 'type': 'array', 'items': {'type': 'integer'}}},
    }


def test_enum_str_default():
    class MyEnum(str, Enum):
        FOO = 'foo'

    class UserModel(BaseModel):
        friends: MyEnum = MyEnum.FOO

    default_value = UserModel.model_json_schema()['properties']['friends']['default']
    assert type(default_value) is str
    assert default_value == MyEnum.FOO.value


def test_enum_int_default():
    class MyEnum(IntEnum):
        FOO = 1

    class UserModel(BaseModel):
        friends: MyEnum = MyEnum.FOO

    default_value = UserModel.model_json_schema()['properties']['friends']['default']
    assert type(default_value) is int
    assert default_value == MyEnum.FOO.value


def test_enum_dict():
    class MyEnum(str, Enum):
        FOO = 'foo'
        BAR = 'bar'

    class MyModel(BaseModel):
        enum_dict: Dict[MyEnum, str]

    assert MyModel.model_json_schema() == {
        '$defs': {
            'MyEnum': {'enum': ['foo', 'bar'], 'title': 'MyEnum', 'type': 'string'},
        },
        'title': 'MyModel',
        'type': 'object',
        'properties': {
            'enum_dict': {
                'title': 'Enum Dict',
                'type': 'object',
                'additionalProperties': {'type': 'string'},
                'propertyNames': {'$ref': '#/$defs/MyEnum'},
            }
        },
        'required': ['enum_dict'],
    }


def test_property_names_constraint():
    class MyModel(BaseModel):
        my_dict: Dict[Annotated[str, StringConstraints(max_length=1)], str]

    assert MyModel.model_json_schema() == {
        'properties': {
            'my_dict': {
                'additionalProperties': {'type': 'string'},
                'propertyNames': {'maxLength': 1},
                'title': 'My Dict',
                'type': 'object',
            }
        },
        'required': ['my_dict'],
        'title': 'MyModel',
        'type': 'object',
    }


def test_dict_default():
    class UserModel(BaseModel):
        friends: Dict[str, float] = {'a': 1.1, 'b': 2.2}

    assert UserModel.model_json_schema() == {
        'title': 'UserModel',
        'type': 'object',
        'properties': {
            'friends': {
                'title': 'Friends',
                'default': {'a': 1.1, 'b': 2.2},
                'type': 'object',
                'additionalProperties': {'type': 'number'},
            }
        },
    }


def test_model_default():
    """Make sure inner model types are encoded properly"""

    class Inner(BaseModel):
        a: Dict[Path, str] = {Path(): ''}

    class Outer(BaseModel):
        inner: Inner = Inner()

    assert Outer.model_json_schema() == {
        '$defs': {
            'Inner': {
                'properties': {
                    'a': {
                        'additionalProperties': {'type': 'string'},
                        'default': {'.': ''},
                        'propertyNames': {'format': 'path'},
                        'title': 'A',
                        'type': 'object',
                    }
                },
                'title': 'Inner',
                'type': 'object',
            }
        },
        'properties': {'inner': {'$ref': '#/$defs/Inner', 'default': {'a': {'.': ''}}}},
        'title': 'Outer',
        'type': 'object',
    }


@pytest.mark.parametrize(
    'ser_json_timedelta,properties',
    [
        ('float', {'duration': {'default': 300.0, 'title': 'Duration', 'type': 'number'}}),
        ('iso8601', {'duration': {'default': 'PT5M', 'format': 'duration', 'title': 'Duration', 'type': 'string'}}),
    ],
)
def test_model_default_timedelta(ser_json_timedelta: Literal['float', 'iso8601'], properties: typing.Dict[str, Any]):
    class Model(BaseModel):
        model_config = ConfigDict(ser_json_timedelta=ser_json_timedelta)

        duration: timedelta = timedelta(minutes=5)

    # insert_assert(Model.model_json_schema(mode='serialization'))
    assert Model.model_json_schema(mode='serialization') == {
        'properties': properties,
        'title': 'Model',
        'type': 'object',
    }


@pytest.mark.parametrize(
    'ser_json_bytes,properties',
    [
        ('base64', {'data': {'default': 'Zm9vYmFy', 'format': 'base64url', 'title': 'Data', 'type': 'string'}}),
        ('utf8', {'data': {'default': 'foobar', 'format': 'binary', 'title': 'Data', 'type': 'string'}}),
    ],
)
def test_model_default_bytes(ser_json_bytes: Literal['base64', 'utf8'], properties: typing.Dict[str, Any]):
    class Model(BaseModel):
        model_config = ConfigDict(ser_json_bytes=ser_json_bytes)

        data: bytes = b'foobar'

    # insert_assert(Model.model_json_schema(mode='serialization'))
    assert Model.model_json_schema(mode='serialization') == {
        'properties': properties,
        'title': 'Model',
        'type': 'object',
    }


@pytest.mark.parametrize(
    'ser_json_timedelta,properties',
    [
        ('float', {'duration': {'default': 300.0, 'title': 'Duration', 'type': 'number'}}),
        ('iso8601', {'duration': {'default': 'PT5M', 'format': 'duration', 'title': 'Duration', 'type': 'string'}}),
    ],
)
def test_dataclass_default_timedelta(
    ser_json_timedelta: Literal['float', 'iso8601'], properties: typing.Dict[str, Any]
):
    @dataclass(config=ConfigDict(ser_json_timedelta=ser_json_timedelta))
    class Dataclass:
        duration: timedelta = timedelta(minutes=5)

    # insert_assert(TypeAdapter(Dataclass).json_schema(mode='serialization'))
    assert TypeAdapter(Dataclass).json_schema(mode='serialization') == {
        'properties': properties,
        'title': 'Dataclass',
        'type': 'object',
    }


@pytest.mark.parametrize(
    'ser_json_bytes,properties',
    [
        ('base64', {'data': {'default': 'Zm9vYmFy', 'format': 'base64url', 'title': 'Data', 'type': 'string'}}),
        ('utf8', {'data': {'default': 'foobar', 'format': 'binary', 'title': 'Data', 'type': 'string'}}),
    ],
)
def test_dataclass_default_bytes(ser_json_bytes: Literal['base64', 'utf8'], properties: typing.Dict[str, Any]):
    @dataclass(config=ConfigDict(ser_json_bytes=ser_json_bytes))
    class Dataclass:
        data: bytes = b'foobar'

    # insert_assert(TypeAdapter(Dataclass).json_schema(mode='serialization'))
    assert TypeAdapter(Dataclass).json_schema(mode='serialization') == {
        'properties': properties,
        'title': 'Dataclass',
        'type': 'object',
    }


@pytest.mark.parametrize(
    'ser_json_timedelta,properties',
    [
        ('float', {'duration': {'default': 300.0, 'title': 'Duration', 'type': 'number'}}),
        ('iso8601', {'duration': {'default': 'PT5M', 'format': 'duration', 'title': 'Duration', 'type': 'string'}}),
    ],
)
def test_typeddict_default_timedelta(
    ser_json_timedelta: Literal['float', 'iso8601'], properties: typing.Dict[str, Any]
):
    class MyTypedDict(TypedDict):
        __pydantic_config__ = ConfigDict(ser_json_timedelta=ser_json_timedelta)

        duration: Annotated[timedelta, Field(timedelta(minutes=5))]

    # insert_assert(TypeAdapter(MyTypedDict).json_schema(mode='serialization'))
    assert TypeAdapter(MyTypedDict).json_schema(mode='serialization') == {
        'properties': properties,
        'title': 'MyTypedDict',
        'type': 'object',
    }


@pytest.mark.parametrize(
    'ser_json_bytes,properties',
    [
        ('base64', {'data': {'default': 'Zm9vYmFy', 'format': 'base64url', 'title': 'Data', 'type': 'string'}}),
        ('utf8', {'data': {'default': 'foobar', 'format': 'binary', 'title': 'Data', 'type': 'string'}}),
    ],
)
def test_typeddict_default_bytes(ser_json_bytes: Literal['base64', 'utf8'], properties: typing.Dict[str, Any]):
    class MyTypedDict(TypedDict):
        __pydantic_config__ = ConfigDict(ser_json_bytes=ser_json_bytes)

        data: Annotated[bytes, Field(b'foobar')]

    # insert_assert(TypeAdapter(MyTypedDict).json_schema(mode='serialization'))
    assert TypeAdapter(MyTypedDict).json_schema(mode='serialization') == {
        'properties': properties,
        'title': 'MyTypedDict',
        'type': 'object',
    }


def test_model_subclass_metadata():
    class A(BaseModel):
        """A Model docstring"""

    class B(A):
        pass

    assert A.model_json_schema() == {
        'title': 'A',
        'description': 'A Model docstring',
        'type': 'object',
        'properties': {},
    }
    assert B.model_json_schema() == {'title': 'B', 'type': 'object', 'properties': {}}


@pytest.mark.parametrize(
    'docstring,description',
    [
        ('foobar', 'foobar'),
        ('\n foobar\n ', 'foobar'),
        ('foobar\n ', 'foobar\n '),
        ('foo\n bar\n ', 'foo\nbar'),
        ('\n foo\n bar\n ', 'foo\nbar'),
    ],
)
def test_docstring(docstring, description):
    class A(BaseModel):
        x: int

    A.__doc__ = docstring
    assert A.model_json_schema()['description'] == description


@pytest.mark.parametrize(
    'kwargs,type_,expected_extra',
    [
        ({'max_length': 5}, str, {'type': 'string', 'maxLength': 5}),
        ({}, constr(max_length=6), {'type': 'string', 'maxLength': 6}),
        ({'min_length': 2}, str, {'type': 'string',
'minLength': 2}), ({'max_length': 5}, bytes, {'type': 'string', 'maxLength': 5, 'format': 'binary'}), ({'pattern': '^foo$'}, str, {'type': 'string', 'pattern': '^foo$'}), ({'gt': 2}, int, {'type': 'integer', 'exclusiveMinimum': 2}), ({'lt': 5}, int, {'type': 'integer', 'exclusiveMaximum': 5}), ({'ge': 2}, int, {'type': 'integer', 'minimum': 2}), ({'le': 5}, int, {'type': 'integer', 'maximum': 5}), ({'multiple_of': 5}, int, {'type': 'integer', 'multipleOf': 5}), ({'gt': 2}, float, {'type': 'number', 'exclusiveMinimum': 2}), ({'lt': 5}, float, {'type': 'number', 'exclusiveMaximum': 5}), ({'ge': 2}, float, {'type': 'number', 'minimum': 2}), ({'le': 5}, float, {'type': 'number', 'maximum': 5}), ({'gt': -math.inf}, float, {'type': 'number'}), ({'lt': math.inf}, float, {'type': 'number'}), ({'ge': -math.inf}, float, {'type': 'number'}), ({'le': math.inf}, float, {'type': 'number'}), ({'multiple_of': 5}, float, {'type': 'number', 'multipleOf': 5}), ({'gt': 2}, Decimal, {'anyOf': [{'exclusiveMinimum': 2.0, 'type': 'number'}, {'type': 'string'}]}), ({'lt': 5}, Decimal, {'anyOf': [{'type': 'number', 'exclusiveMaximum': 5}, {'type': 'string'}]}), ({'ge': 2}, Decimal, {'anyOf': [{'type': 'number', 'minimum': 2}, {'type': 'string'}]}), ({'le': 5}, Decimal, {'anyOf': [{'type': 'number', 'maximum': 5}, {'type': 'string'}]}), ({'multiple_of': 5}, Decimal, {'anyOf': [{'type': 'number', 'multipleOf': 5}, {'type': 'string'}]}), ], ) def test_constraints_schema_validation(kwargs, type_, expected_extra): class Foo(BaseModel): a: type_ = Field('foo', title='A title', description='A description', **kwargs) expected_schema = { 'title': 'Foo', 'type': 'object', 'properties': {'a': {'title': 'A title', 'description': 'A description', 'default': 'foo'}}, } expected_schema['properties']['a'].update(expected_extra) assert Foo.model_json_schema(mode='validation') == expected_schema @pytest.mark.parametrize( 'kwargs,type_,expected_extra', [ ({'max_length': 5}, str, {'type': 'string', 
'maxLength': 5}), ({}, constr(max_length=6), {'type': 'string', 'maxLength': 6}), ({'min_length': 2}, str, {'type': 'string', 'minLength': 2}), ({'max_length': 5}, bytes, {'type': 'string', 'maxLength': 5, 'format': 'binary'}), ({'pattern': '^foo$'}, str, {'type': 'string', 'pattern': '^foo$'}), ({'gt': 2}, int, {'type': 'integer', 'exclusiveMinimum': 2}), ({'lt': 5}, int, {'type': 'integer', 'exclusiveMaximum': 5}), ({'ge': 2}, int, {'type': 'integer', 'minimum': 2}), ({'le': 5}, int, {'type': 'integer', 'maximum': 5}), ({'multiple_of': 5}, int, {'type': 'integer', 'multipleOf': 5}), ({'gt': 2}, float, {'type': 'number', 'exclusiveMinimum': 2}), ({'lt': 5}, float, {'type': 'number', 'exclusiveMaximum': 5}), ({'ge': 2}, float, {'type': 'number', 'minimum': 2}), ({'le': 5}, float, {'type': 'number', 'maximum': 5}), ({'gt': -math.inf}, float, {'type': 'number'}), ({'lt': math.inf}, float, {'type': 'number'}), ({'ge': -math.inf}, float, {'type': 'number'}), ({'le': math.inf}, float, {'type': 'number'}), ({'multiple_of': 5}, float, {'type': 'number', 'multipleOf': 5}), ({'gt': 2}, Decimal, {'type': 'string'}), ({'lt': 5}, Decimal, {'type': 'string'}), ({'ge': 2}, Decimal, {'type': 'string'}), ({'le': 5}, Decimal, {'type': 'string'}), ({'multiple_of': 5}, Decimal, {'type': 'string'}), ], ) def test_constraints_schema_serialization(kwargs, type_, expected_extra): class Foo(BaseModel): a: type_ = Field('foo', title='A title', description='A description', **kwargs) expected_schema = { 'title': 'Foo', 'type': 'object', 'properties': {'a': {'title': 'A title', 'description': 'A description', 'default': 'foo'}}, } expected_schema['properties']['a'].update(expected_extra) assert Foo.model_json_schema(mode='serialization') == expected_schema @pytest.mark.parametrize( 'kwargs,type_,value', [ ({'max_length': 5}, str, 'foo'), ({'min_length': 2}, str, 'foo'), ({'max_length': 5}, bytes, b'foo'), ({'pattern': '^foo$'}, str, 'foo'), ({'gt': 2}, int, 3), ({'lt': 5}, int, 3), ({'ge': 
2}, int, 3), ({'ge': 2}, int, 2), ({'gt': 2}, int, '3'), ({'le': 5}, int, 3), ({'le': 5}, int, 5), ({'gt': 2}, float, 3.0), ({'gt': 2}, float, 2.1), ({'lt': 5}, float, 3.0), ({'lt': 5}, float, 4.9), ({'ge': 2}, float, 3.0), ({'ge': 2}, float, 2.0), ({'le': 5}, float, 3.0), ({'le': 5}, float, 5.0), ({'gt': 2}, float, 3), ({'gt': 2}, float, '3'), ({'gt': 2}, Decimal, Decimal(3)), ({'lt': 5}, Decimal, Decimal(3)), ({'ge': 2}, Decimal, Decimal(3)), ({'ge': 2}, Decimal, Decimal(2)), ({'le': 5}, Decimal, Decimal(3)), ({'le': 5}, Decimal, Decimal(5)), ], ) def test_constraints_schema_validation_passes(kwargs, type_, value): class Foo(BaseModel): a: type_ = Field('foo', title='A title', description='A description', **kwargs) assert Foo(a=value) @pytest.mark.parametrize( 'kwargs,type_,value', [ ({'max_length': 5}, str, 'foobar'), ({'min_length': 2}, str, 'f'), ({'pattern': '^foo$'}, str, 'bar'), ({'gt': 2}, int, 2), ({'lt': 5}, int, 5), ({'ge': 2}, int, 1), ({'le': 5}, int, 6), ({'gt': 2}, float, 2.0), ({'lt': 5}, float, 5.0), ({'ge': 2}, float, 1.9), ({'le': 5}, float, 5.1), ({'gt': 2}, Decimal, Decimal(2)), ({'lt': 5}, Decimal, Decimal(5)), ({'ge': 2}, Decimal, Decimal(1)), ({'le': 5}, Decimal, Decimal(6)), ], ) def test_constraints_schema_validation_raises(kwargs, type_, value): class Foo(BaseModel): a: type_ = Field('foo', title='A title', description='A description', **kwargs) with pytest.raises(ValidationError): Foo(a=value) def test_schema_kwargs(): class Foo(BaseModel): a: str = Field('foo', examples=['bar']) assert Foo.model_json_schema() == { 'title': 'Foo', 'type': 'object', 'properties': {'a': {'type': 'string', 'title': 'A', 'default': 'foo', 'examples': ['bar']}}, } def test_schema_dict_constr(): regex_str = r'^([a-zA-Z_][a-zA-Z0-9_]*)$' ConStrType = constr(pattern=regex_str) ConStrKeyDict = Dict[ConStrType, str] class Foo(BaseModel): a: ConStrKeyDict = {} assert Foo.model_json_schema() == { 'title': 'Foo', 'type': 'object', 'properties': { 'a': {'type': 
'object', 'title': 'A', 'default': {}, 'patternProperties': {regex_str: {'type': 'string'}}} }, } @pytest.mark.parametrize( 'field_type,expected_schema', [ # (ConstrainedBytes, {'title': 'A', 'type': 'string', 'format': 'binary'}), ( conbytes(min_length=3, max_length=5), {'title': 'A', 'type': 'string', 'format': 'binary', 'minLength': 3, 'maxLength': 5}, ), ], ) def test_bytes_constrained_types(field_type, expected_schema): class Model(BaseModel): a: field_type base_schema = {'title': 'Model', 'type': 'object', 'properties': {'a': {}}, 'required': ['a']} base_schema['properties']['a'] = expected_schema assert Model.model_json_schema() == base_schema def test_optional_dict(): class Model(BaseModel): something: Optional[Dict[str, Any]] = None assert Model.model_json_schema() == { 'title': 'Model', 'type': 'object', 'properties': { 'something': {'anyOf': [{'type': 'object'}, {'type': 'null'}], 'default': None, 'title': 'Something'} }, } assert Model().model_dump() == {'something': None} assert Model(something={'foo': 'Bar'}).model_dump() == {'something': {'foo': 'Bar'}} def test_optional_validator(): class Model(BaseModel): something: Optional[str] = None @field_validator('something') def check_something(cls, v): if v is not None and 'x' in v: raise ValueError('should not contain x') return v assert Model.model_json_schema() == { 'title': 'Model', 'type': 'object', 'properties': { 'something': { 'title': 'Something', 'anyOf': [{'type': 'string'}, {'type': 'null'}], 'default': None, } }, } assert Model().model_dump() == {'something': None} assert Model(something=None).model_dump() == {'something': None} assert Model(something='hello').model_dump() == {'something': 'hello'} with pytest.raises(ValidationError) as exc_info: Model(something='hellox') assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'error': HasRepr(repr(ValueError('should not contain x')))}, 'input': 'hellox', 'loc': ('something',), 'msg': 'Value error, should not contain x', 'type': 
'value_error', } ] def test_field_with_validator(): class Model(BaseModel): something: Optional[int] = None @field_validator('something') def check_field(cls, v, info): return v assert Model.model_json_schema() == { 'title': 'Model', 'type': 'object', 'properties': { 'something': {'anyOf': [{'type': 'integer'}, {'type': 'null'}], 'default': None, 'title': 'Something'} }, } def test_unparameterized_schema_generation(): class FooList(BaseModel): d: List class BarList(BaseModel): d: list assert model_json_schema(FooList) == { 'title': 'FooList', 'type': 'object', 'properties': {'d': {'items': {}, 'title': 'D', 'type': 'array'}}, 'required': ['d'], } foo_list_schema = model_json_schema(FooList) bar_list_schema = model_json_schema(BarList) bar_list_schema['title'] = 'FooList' # to check for equality assert foo_list_schema == bar_list_schema class FooDict(BaseModel): d: Dict class BarDict(BaseModel): d: dict assert model_json_schema(FooDict) == { 'title': 'FooDict', 'type': 'object', 'properties': {'d': {'title': 'D', 'type': 'object'}}, 'required': ['d'], } foo_dict_schema = model_json_schema(FooDict) bar_dict_schema = model_json_schema(BarDict) bar_dict_schema['title'] = 'FooDict' # to check for equality assert foo_dict_schema == bar_dict_schema def test_known_model_optimization(): class Dep(BaseModel): number: int class Model(BaseModel): dep: Dep dep_l: List[Dep] expected = { 'title': 'Model', 'type': 'object', 'properties': { 'dep': {'$ref': '#/$defs/Dep'}, 'dep_l': {'title': 'Dep L', 'type': 'array', 'items': {'$ref': '#/$defs/Dep'}}, }, 'required': ['dep', 'dep_l'], '$defs': { 'Dep': { 'title': 'Dep', 'type': 'object', 'properties': {'number': {'title': 'Number', 'type': 'integer'}}, 'required': ['number'], } }, } assert Model.model_json_schema() == expected def test_new_type_schema(): a_type = NewType('a_type', int) b_type = NewType('b_type', a_type) c_type = NewType('c_type', str) class Model(BaseModel): a: a_type b: b_type c: c_type assert
Model.model_json_schema() == { 'properties': { 'a': {'title': 'A', 'type': 'integer'}, 'b': {'title': 'B', 'type': 'integer'}, 'c': {'title': 'C', 'type': 'string'}, }, 'required': ['a', 'b', 'c'], 'title': 'Model', 'type': 'object', } def test_literal_schema(): class Model(BaseModel): a: Literal[1] b: Literal['a'] c: Literal['a', 1] d: Literal['a', Literal['b'], 1, 2] e: Literal[1.0] f: Literal[['a', 1]] # insert_assert(Model.model_json_schema()) assert Model.model_json_schema() == { 'properties': { 'a': {'const': 1, 'title': 'A', 'type': 'integer'}, 'b': {'const': 'a', 'title': 'B', 'type': 'string'}, 'c': {'enum': ['a', 1], 'title': 'C'}, 'd': {'enum': ['a', 'b', 1, 2], 'title': 'D'}, 'e': {'const': 1.0, 'title': 'E', 'type': 'number'}, 'f': {'const': ['a', 1], 'title': 'F', 'type': 'array'}, }, 'required': ['a', 'b', 'c', 'd', 'e', 'f'], 'title': 'Model', 'type': 'object', } def test_literal_enum(): class MyEnum(str, Enum): FOO = 'foo' BAR = 'bar' class Model(BaseModel): kind: Literal[MyEnum.FOO] other: Literal[MyEnum.FOO, MyEnum.BAR] # insert_assert(Model.model_json_schema()) assert Model.model_json_schema() == { 'properties': { 'kind': {'const': 'foo', 'title': 'Kind', 'type': 'string'}, 'other': {'enum': ['foo', 'bar'], 'title': 'Other', 'type': 'string'}, }, 'required': ['kind', 'other'], 'title': 'Model', 'type': 'object', } @pytest.mark.skipif(sys.version_info[:2] == (3, 8), reason="ListEnum doesn't work in 3.8") def test_literal_types() -> None: """Test that we properly add `type` to json schema enums when there is a single type.""" # for float and array we use an Enum because Literal can only accept str, int, bool or None class FloatEnum(float, Enum): a = 123.0 b = 123.1 class ListEnum(List[int], Enum): a = [123] b = [456] class Model(BaseModel): str_literal: Literal['foo', 'bar'] int_literal: Literal[123, 456] float_literal: FloatEnum bool_literal: Literal[True, False] none_literal: Literal[None] # ends up as a const since there's only 1 list_literal: 
ListEnum mixed_literal: Literal[123, 'abc'] # insert_assert(Model.model_json_schema()) assert Model.model_json_schema() == { '$defs': { 'FloatEnum': {'enum': [123.0, 123.1], 'title': 'FloatEnum', 'type': 'number'}, 'ListEnum': {'enum': [[123], [456]], 'title': 'ListEnum', 'type': 'array'}, }, 'properties': { 'str_literal': {'enum': ['foo', 'bar'], 'title': 'Str Literal', 'type': 'string'}, 'int_literal': {'enum': [123, 456], 'title': 'Int Literal', 'type': 'integer'}, 'float_literal': {'$ref': '#/$defs/FloatEnum'}, 'bool_literal': {'enum': [True, False], 'title': 'Bool Literal', 'type': 'boolean'}, 'none_literal': {'const': None, 'title': 'None Literal', 'type': 'null'}, 'list_literal': {'$ref': '#/$defs/ListEnum'}, 'mixed_literal': {'enum': [123, 'abc'], 'title': 'Mixed Literal'}, }, 'required': [ 'str_literal', 'int_literal', 'float_literal', 'bool_literal', 'none_literal', 'list_literal', 'mixed_literal', ], 'title': 'Model', 'type': 'object', } def test_color_type(): class Model(BaseModel): color: Color model_schema = Model.model_json_schema() assert model_schema == { 'title': 'Model', 'type': 'object', 'properties': {'color': {'title': 'Color', 'type': 'string', 'format': 'color'}}, 'required': ['color'], } def test_model_with_extra_forbidden(): class Model(BaseModel): model_config = ConfigDict(extra='forbid') a: str assert Model.model_json_schema() == { 'title': 'Model', 'type': 'object', 'properties': {'a': {'title': 'A', 'type': 'string'}}, 'required': ['a'], 'additionalProperties': False, } def test_model_with_extra_allow(): class Model(BaseModel): model_config = ConfigDict(extra='allow') a: str assert Model.model_json_schema() == { 'title': 'Model', 'type': 'object', 'properties': {'a': {'title': 'A', 'type': 'string'}}, 'required': ['a'], 'additionalProperties': True, } def test_model_with_extra_ignore(): class Model(BaseModel): model_config = ConfigDict(extra='ignore') a: str assert Model.model_json_schema() == { 'title': 'Model', 'type': 'object', 
'properties': {'a': {'title': 'A', 'type': 'string'}}, 'required': ['a'], } def test_dataclass_with_extra_allow(): @pydantic.dataclasses.dataclass class Model: __pydantic_config__ = ConfigDict(extra='allow') a: str assert TypeAdapter(Model).json_schema() == { 'title': 'Model', 'type': 'object', 'properties': {'a': {'title': 'A', 'type': 'string'}}, 'required': ['a'], 'additionalProperties': True, } def test_dataclass_with_extra_ignore(): @pydantic.dataclasses.dataclass class Model: __pydantic_config__ = ConfigDict(extra='ignore') a: str assert TypeAdapter(Model).json_schema() == { 'title': 'Model', 'type': 'object', 'properties': {'a': {'title': 'A', 'type': 'string'}}, 'required': ['a'], } def test_dataclass_with_extra_forbid(): @pydantic.dataclasses.dataclass class Model: __pydantic_config__ = ConfigDict(extra='forbid') a: str assert TypeAdapter(Model).json_schema() == { 'title': 'Model', 'type': 'object', 'properties': {'a': {'title': 'A', 'type': 'string'}}, 'required': ['a'], 'additionalProperties': False, } def test_typeddict_with_extra_allow(): class Model(TypedDict): __pydantic_config__ = ConfigDict(extra='allow') # type: ignore a: str assert TypeAdapter(Model).json_schema() == { 'title': 'Model', 'type': 'object', 'properties': {'a': {'title': 'A', 'type': 'string'}}, 'required': ['a'], 'additionalProperties': True, } def test_typeddict_with_extra_ignore(): class Model(TypedDict): __pydantic_config__ = ConfigDict(extra='ignore') # type: ignore a: str assert TypeAdapter(Model).json_schema() == { 'title': 'Model', 'type': 'object', 'properties': {'a': {'title': 'A', 'type': 'string'}}, 'required': ['a'], } def test_typeddict_with_extra_forbid(): class Model(TypedDict): __pydantic_config__ = ConfigDict(extra='forbid') # type: ignore a: str assert TypeAdapter(Model).json_schema() == { 'title': 'Model', 'type': 'object', 'properties': {'a': {'title': 'A', 'type': 'string'}}, 'required': ['a'], 'additionalProperties': False, } def test_typeddict_with_title(): class Model(TypedDict): 
__pydantic_config__ = ConfigDict(title='Test') # type: ignore a: str assert TypeAdapter(Model).json_schema() == { 'title': 'Test', 'type': 'object', 'properties': {'a': {'title': 'A', 'type': 'string'}}, 'required': ['a'], } def test_typeddict_with_json_schema_extra(): class Model(TypedDict): __pydantic_config__ = ConfigDict(title='Test', json_schema_extra={'foobar': 'hello'}) # type: ignore a: str assert TypeAdapter(Model).json_schema() == { 'title': 'Test', 'type': 'object', 'properties': {'a': {'title': 'A', 'type': 'string'}}, 'required': ['a'], 'foobar': 'hello', } @pytest.mark.skip_json_schema_validation(reason='Custom type used.') def test_typeddict_with__callable_json_schema_extra(): def json_schema_extra(schema, model_class): schema.pop('properties') schema['type'] = 'override' assert model_class is Model class Model(TypedDict): __pydantic_config__ = ConfigDict(title='Test', json_schema_extra=json_schema_extra) # type: ignore a: str assert TypeAdapter(Model).json_schema() == {'title': 'Test', 'type': 'override', 'required': ['a']} @pytest.mark.parametrize( 'annotation,kwargs,field_schema', [ (int, dict(gt=0), {'title': 'A', 'exclusiveMinimum': 0, 'type': 'integer'}), ( Optional[int], dict(gt=0), {'title': 'A', 'anyOf': [{'exclusiveMinimum': 0, 'type': 'integer'}, {'type': 'null'}]}, ), ( Tuple[Annotated[int, Field(gt=0)], ...], {}, {'items': {'exclusiveMinimum': 0, 'type': 'integer'}, 'title': 'A', 'type': 'array'}, ), ( Tuple[Annotated[int, Field(gt=0)], Annotated[int, Field(gt=0)], Annotated[int, Field(gt=0)]], {}, { 'title': 'A', 'type': 'array', 'prefixItems': [ {'exclusiveMinimum': 0, 'type': 'integer'}, {'exclusiveMinimum': 0, 'type': 'integer'}, {'exclusiveMinimum': 0, 'type': 'integer'}, ], 'minItems': 3, 'maxItems': 3, }, ), ( Union[Annotated[int, Field(gt=0)], Annotated[float, Field(gt=0)]], {}, { 'title': 'A', 'anyOf': [{'exclusiveMinimum': 0, 'type': 'integer'}, {'exclusiveMinimum': 0, 'type': 'number'}], }, ), ( List[Annotated[int, 
Field(gt=0)]], {}, {'title': 'A', 'type': 'array', 'items': {'exclusiveMinimum': 0, 'type': 'integer'}}, ), ( Dict[str, Annotated[int, Field(gt=0)]], {}, { 'title': 'A', 'type': 'object', 'additionalProperties': {'exclusiveMinimum': 0, 'type': 'integer'}, }, ), ( Union[Annotated[str, Field(max_length=5)], Annotated[int, Field(gt=0)]], {}, {'title': 'A', 'anyOf': [{'maxLength': 5, 'type': 'string'}, {'exclusiveMinimum': 0, 'type': 'integer'}]}, ), ], ) def test_enforced_constraints(annotation, kwargs, field_schema): class Model(BaseModel): a: annotation = Field(**kwargs) schema = Model.model_json_schema() # debug(schema['properties']['a']) assert schema['properties']['a'] == field_schema def test_real_constraints(): class Model1(BaseModel): model_config = ConfigDict(title='Test Model') foo: int = Field(gt=123) with pytest.raises(ValidationError, match='should be greater than 123'): Model1(foo=123) assert Model1(foo=124).model_dump() == {'foo': 124} assert Model1.model_json_schema() == { 'title': 'Test Model', 'type': 'object', 'properties': {'foo': {'title': 'Foo', 'exclusiveMinimum': 123, 'type': 'integer'}}, 'required': ['foo'], } def test_subfield_field_info(): class MyModel(BaseModel): entries: Dict[str, List[int]] assert MyModel.model_json_schema() == { 'title': 'MyModel', 'type': 'object', 'properties': { 'entries': { 'title': 'Entries', 'type': 'object', 'additionalProperties': {'type': 'array', 'items': {'type': 'integer'}}, } }, 'required': ['entries'], } def test_dataclass(): @dataclass class Model: a: bool assert models_json_schema([(Model, 'validation')]) == ( {(Model, 'validation'): {'$ref': '#/$defs/Model'}}, { '$defs': { 'Model': { 'title': 'Model', 'type': 'object', 'properties': {'a': {'title': 'A', 'type': 'boolean'}}, 'required': ['a'], } } }, ) assert model_json_schema(Model) == { 'title': 'Model', 'type': 'object', 'properties': {'a': {'title': 'A', 'type': 'boolean'}}, 'required': ['a'], } def test_schema_attributes(): class ExampleEnum(Enum): 
"""This is a test description.""" gt = 'GT' lt = 'LT' ge = 'GE' le = 'LE' max_length = 'ML' multiple_of = 'MO' regex = 'RE' class Example(BaseModel): example: ExampleEnum # insert_assert(Example.model_json_schema()) assert Example.model_json_schema() == { '$defs': { 'ExampleEnum': { 'description': 'This is a test description.', 'enum': ['GT', 'LT', 'GE', 'LE', 'ML', 'MO', 'RE'], 'title': 'ExampleEnum', 'type': 'string', } }, 'properties': {'example': {'$ref': '#/$defs/ExampleEnum'}}, 'required': ['example'], 'title': 'Example', 'type': 'object', } def test_tuple_with_extra_schema(): class MyTuple(Tuple[int, str]): @classmethod def __get_pydantic_core_schema__(cls, _source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: return core_schema.tuple_schema( [core_schema.int_schema(), core_schema.str_schema(), core_schema.int_schema()], variadic_item_index=2 ) class Model(BaseModel): x: MyTuple assert Model.model_json_schema() == { 'properties': { 'x': { 'items': {'type': 'integer'}, 'minItems': 2, 'prefixItems': [{'type': 'integer'}, {'type': 'string'}], 'title': 'X', 'type': 'array', } }, 'required': ['x'], 'title': 'Model', 'type': 'object', } def test_path_modify_schema(): class MyPath(Path): @classmethod def __get_pydantic_core_schema__(cls, _source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: return handler(Path) @classmethod def __get_pydantic_json_schema__( cls, core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: schema = handler(core_schema) schema.update(foobar=123) return schema class Model(BaseModel): path1: Path path2: MyPath path3: List[MyPath] assert Model.model_json_schema() == { 'title': 'Model', 'type': 'object', 'properties': { 'path1': {'title': 'Path1', 'type': 'string', 'format': 'path'}, 'path2': {'title': 'Path2', 'type': 'string', 'format': 'path', 'foobar': 123}, 'path3': {'title': 'Path3', 'type': 'array', 'items': {'type': 'string', 'format': 'path', 'foobar': 123}}, }, 'required': 
['path1', 'path2', 'path3'], } def test_frozen_set(): class Model(BaseModel): a: FrozenSet[int] = frozenset({1, 2, 3}) b: FrozenSet = frozenset({1, 2, 3}) c: frozenset = frozenset({1, 2, 3}) d: frozenset = ... assert Model.model_json_schema() == { 'title': 'Model', 'type': 'object', 'properties': { 'a': { 'title': 'A', 'default': [1, 2, 3], 'type': 'array', 'items': {'type': 'integer'}, 'uniqueItems': True, }, 'b': {'title': 'B', 'default': [1, 2, 3], 'type': 'array', 'items': {}, 'uniqueItems': True}, 'c': {'title': 'C', 'default': [1, 2, 3], 'type': 'array', 'items': {}, 'uniqueItems': True}, 'd': {'title': 'D', 'type': 'array', 'items': {}, 'uniqueItems': True}, }, 'required': ['d'], } def test_iterable(): class Model(BaseModel): a: Iterable[int] assert Model.model_json_schema() == { 'title': 'Model', 'type': 'object', 'properties': {'a': {'title': 'A', 'type': 'array', 'items': {'type': 'integer'}}}, 'required': ['a'], } def test_new_type(): new_type = NewType('NewStr', str) class Model(BaseModel): a: new_type assert Model.model_json_schema() == { 'title': 'Model', 'type': 'object', 'properties': {'a': {'title': 'A', 'type': 'string'}}, 'required': ['a'], } def test_multiple_models_with_same_input_output(create_module): module = create_module( # language=Python """ from pydantic import BaseModel class ModelOne(BaseModel): class NestedModel(BaseModel): a: float nested: NestedModel class ModelTwo(BaseModel): class NestedModel(BaseModel): b: float nested: NestedModel class NestedModel(BaseModel): c: float """ ) # All validation keys_map, schema = models_json_schema( [(module.ModelOne, 'validation'), (module.ModelTwo, 'validation'), (module.NestedModel, 'validation')] ) model_names = set(schema['$defs'].keys()) expected_model_names = { 'ModelOne', 'ModelTwo', f'{module.__name__}__ModelOne__NestedModel', f'{module.__name__}__ModelTwo__NestedModel', f'{module.__name__}__NestedModel', } assert model_names == expected_model_names # Validation + serialization keys_map, 
schema = models_json_schema( [ (module.ModelOne, 'validation'), (module.ModelTwo, 'validation'), (module.NestedModel, 'validation'), (module.ModelOne, 'serialization'), (module.ModelTwo, 'serialization'), (module.NestedModel, 'serialization'), ] ) model_names = set(schema['$defs'].keys()) expected_model_names = { 'ModelOne', 'ModelTwo', f'{module.__name__}__ModelOne__NestedModel', f'{module.__name__}__ModelTwo__NestedModel', f'{module.__name__}__NestedModel', } assert model_names == expected_model_names def test_multiple_models_with_same_name_different_input_output(create_module): module = create_module( # language=Python """ from decimal import Decimal from pydantic import BaseModel class ModelOne(BaseModel): class NestedModel(BaseModel): a: Decimal nested: NestedModel class ModelTwo(BaseModel): class NestedModel(BaseModel): b: Decimal nested: NestedModel class NestedModel(BaseModel): c: Decimal """ ) # All validation keys_map, schema = models_json_schema( [(module.ModelOne, 'validation'), (module.ModelTwo, 'validation'), (module.NestedModel, 'validation')] ) model_names = set(schema['$defs'].keys()) expected_model_names = { 'ModelOne', 'ModelTwo', f'{module.__name__}__ModelOne__NestedModel', f'{module.__name__}__ModelTwo__NestedModel', f'{module.__name__}__NestedModel', } assert model_names == expected_model_names # Validation + serialization keys_map, schema = models_json_schema( [ (module.ModelOne, 'validation'), (module.ModelTwo, 'validation'), (module.NestedModel, 'validation'), (module.ModelOne, 'serialization'), (module.ModelTwo, 'serialization'), (module.NestedModel, 'serialization'), ] ) model_names = set(schema['$defs'].keys()) expected_model_names = { 'ModelOne-Input', 'ModelOne-Output', 'ModelTwo-Input', 'ModelTwo-Output', f'{module.__name__}__ModelOne__NestedModel-Input', f'{module.__name__}__ModelOne__NestedModel-Output', f'{module.__name__}__ModelTwo__NestedModel-Input', f'{module.__name__}__ModelTwo__NestedModel-Output', 
f'{module.__name__}__NestedModel-Input', f'{module.__name__}__NestedModel-Output', } assert model_names == expected_model_names def test_multiple_enums_with_same_name(create_module): module_1 = create_module( # language=Python """ from enum import Enum from pydantic import BaseModel class MyEnum(str, Enum): a = 'a' b = 'b' c = 'c' class MyModel(BaseModel): my_enum_1: MyEnum """ ) module_2 = create_module( # language=Python """ from enum import Enum from pydantic import BaseModel class MyEnum(str, Enum): d = 'd' e = 'e' f = 'f' class MyModel(BaseModel): my_enum_2: MyEnum """ ) class Model(BaseModel): my_model_1: module_1.MyModel my_model_2: module_2.MyModel assert len(Model.model_json_schema()['$defs']) == 4 assert set(Model.model_json_schema()['$defs']) == { f'{module_1.__name__}__MyEnum', f'{module_1.__name__}__MyModel', f'{module_2.__name__}__MyEnum', f'{module_2.__name__}__MyModel', } def test_mode_name_causes_no_conflict(): class Organization(BaseModel): pass class OrganizationInput(BaseModel): pass class OrganizationOutput(BaseModel): pass class Model(BaseModel): # Ensure the validation and serialization schemas are different: x: Organization = Field(validation_alias='x_validation', serialization_alias='x_serialization') y: OrganizationInput z: OrganizationOutput assert Model.model_json_schema(mode='validation') == { '$defs': { 'Organization': {'properties': {}, 'title': 'Organization', 'type': 'object'}, 'OrganizationInput': {'properties': {}, 'title': 'OrganizationInput', 'type': 'object'}, 'OrganizationOutput': {'properties': {}, 'title': 'OrganizationOutput', 'type': 'object'}, }, 'properties': { 'x_validation': {'$ref': '#/$defs/Organization'}, 'y': {'$ref': '#/$defs/OrganizationInput'}, 'z': {'$ref': '#/$defs/OrganizationOutput'}, }, 'required': ['x_validation', 'y', 'z'], 'title': 'Model', 'type': 'object', } assert Model.model_json_schema(mode='serialization') == { '$defs': { 'Organization': {'properties': {}, 'title': 'Organization', 'type': 
'object'}, 'OrganizationInput': {'properties': {}, 'title': 'OrganizationInput', 'type': 'object'}, 'OrganizationOutput': {'properties': {}, 'title': 'OrganizationOutput', 'type': 'object'}, }, 'properties': { 'x_serialization': {'$ref': '#/$defs/Organization'}, 'y': {'$ref': '#/$defs/OrganizationInput'}, 'z': {'$ref': '#/$defs/OrganizationOutput'}, }, 'required': ['x_serialization', 'y', 'z'], 'title': 'Model', 'type': 'object', } def test_ref_conflict_resolution_without_mode_difference(): class OrganizationInput(BaseModel): pass class Organization(BaseModel): x: int schema_with_defs, defs = GenerateJsonSchema().generate_definitions( [ (Organization, 'validation', Organization.__pydantic_core_schema__), (Organization, 'serialization', Organization.__pydantic_core_schema__), (OrganizationInput, 'validation', OrganizationInput.__pydantic_core_schema__), ] ) assert schema_with_defs == { (Organization, 'serialization'): {'$ref': '#/$defs/Organization'}, (Organization, 'validation'): {'$ref': '#/$defs/Organization'}, (OrganizationInput, 'validation'): {'$ref': '#/$defs/OrganizationInput'}, } assert defs == { 'OrganizationInput': {'properties': {}, 'title': 'OrganizationInput', 'type': 'object'}, 'Organization': { 'properties': {'x': {'title': 'X', 'type': 'integer'}}, 'required': ['x'], 'title': 'Organization', 'type': 'object', }, } def test_ref_conflict_resolution_with_mode_difference(): class OrganizationInput(BaseModel): pass class Organization(BaseModel): x: int @field_serializer('x') def serialize_x(self, v: int) -> str: return str(v) schema_with_defs, defs = GenerateJsonSchema().generate_definitions( [ (Organization, 'validation', Organization.__pydantic_core_schema__), (Organization, 'serialization', Organization.__pydantic_core_schema__), (OrganizationInput, 'validation', OrganizationInput.__pydantic_core_schema__), ] ) assert schema_with_defs == { (Organization, 'serialization'): {'$ref': '#/$defs/Organization-Output'}, (Organization, 'validation'): {'$ref': 
'#/$defs/Organization-Input'}, (OrganizationInput, 'validation'): {'$ref': '#/$defs/OrganizationInput'}, } assert defs == { 'OrganizationInput': {'properties': {}, 'title': 'OrganizationInput', 'type': 'object'}, 'Organization-Input': { 'properties': {'x': {'title': 'X', 'type': 'integer'}}, 'required': ['x'], 'title': 'Organization', 'type': 'object', }, 'Organization-Output': { 'properties': {'x': {'title': 'X', 'type': 'string'}}, 'required': ['x'], 'title': 'Organization', 'type': 'object', }, } def test_conflicting_names(): class Organization__Input(BaseModel): pass class Organization(BaseModel): x: int @field_serializer('x') def serialize_x(self, v: int) -> str: return str(v) schema_with_defs, defs = GenerateJsonSchema().generate_definitions( [ (Organization, 'validation', Organization.__pydantic_core_schema__), (Organization, 'serialization', Organization.__pydantic_core_schema__), (Organization__Input, 'validation', Organization__Input.__pydantic_core_schema__), ] ) assert schema_with_defs == { (Organization, 'serialization'): {'$ref': '#/$defs/Organization-Output'}, (Organization, 'validation'): {'$ref': '#/$defs/Organization-Input'}, (Organization__Input, 'validation'): {'$ref': '#/$defs/Organization__Input'}, } assert defs == { 'Organization__Input': {'properties': {}, 'title': 'Organization__Input', 'type': 'object'}, 'Organization-Input': { 'properties': {'x': {'title': 'X', 'type': 'integer'}}, 'required': ['x'], 'title': 'Organization', 'type': 'object', }, 'Organization-Output': { 'properties': {'x': {'title': 'X', 'type': 'string'}}, 'required': ['x'], 'title': 'Organization', 'type': 'object', }, } @pytest.mark.skip_json_schema_validation(reason='Custom type used.') def test_schema_for_generic_field(): T = TypeVar('T') class GenModel(Generic[T]): def __init__(self, data: Any): self.data = data @classmethod def __get_validators__(cls): yield cls.validate @classmethod def validate(cls, v: Any): return v @classmethod def __get_pydantic_core_schema__( 
cls, source: Any, handler: GetCoreSchemaHandler, ) -> core_schema.PlainValidatorFunctionSchema: source_args = getattr(source, '__args__', [Any]) param = source_args[0] metadata = {'pydantic_js_functions': [lambda _c, h: h(handler.generate_schema(param))]} return core_schema.with_info_plain_validator_function( GenModel, metadata=metadata, ) class Model(BaseModel): data: GenModel[str] data1: GenModel model_config = dict(arbitrary_types_allowed=True) assert Model.model_json_schema() == { 'title': 'Model', 'type': 'object', 'properties': { 'data': {'type': 'string', 'title': 'Data'}, 'data1': { 'title': 'Data1', }, }, 'required': ['data', 'data1'], } class GenModelModified(GenModel, Generic[T]): @classmethod def __get_pydantic_json_schema__( cls, core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: field_schema = handler(core_schema) type = field_schema.pop('type', 'other') field_schema.update(anyOf=[{'type': type}, {'type': 'array', 'items': {'type': type}}]) return field_schema class ModelModified(BaseModel): data: GenModelModified[str] data1: GenModelModified model_config = dict(arbitrary_types_allowed=True) assert ModelModified.model_json_schema() == { 'title': 'ModelModified', 'type': 'object', 'properties': { 'data': {'title': 'Data', 'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}, 'data1': {'title': 'Data1', 'anyOf': [{'type': 'other'}, {'type': 'array', 'items': {'type': 'other'}}]}, }, 'required': ['data', 'data1'], } def test_namedtuple_default(): class Coordinates(NamedTuple): x: float y: float class LocationBase(BaseModel): coords: Coordinates = Coordinates(34, 42) assert LocationBase(coords=Coordinates(1, 2)).coords == Coordinates(1, 2) assert LocationBase.model_json_schema() == { '$defs': { 'Coordinates': { 'maxItems': 2, 'minItems': 2, 'prefixItems': [{'title': 'X', 'type': 'number'}, {'title': 'Y', 'type': 'number'}], 'type': 'array', } }, 'properties': {'coords': {'$ref': 
'#/$defs/Coordinates', 'default': [34, 42]}}, 'title': 'LocationBase', 'type': 'object', } def test_namedtuple_modify_schema(): class Coordinates(NamedTuple): x: float y: float class CustomCoordinates(Coordinates): @classmethod def __get_pydantic_core_schema__(cls, source: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: schema = handler(source) schema['arguments_schema']['metadata']['pydantic_js_prefer_positional_arguments'] = False return schema class Location(BaseModel): coords: CustomCoordinates = CustomCoordinates(34, 42) assert Location.model_json_schema() == { '$defs': { 'CustomCoordinates': { 'additionalProperties': False, 'properties': {'x': {'title': 'X', 'type': 'number'}, 'y': {'title': 'Y', 'type': 'number'}}, 'required': ['x', 'y'], 'type': 'object', } }, 'properties': {'coords': {'$ref': '#/$defs/CustomCoordinates', 'default': [34, 42]}}, 'title': 'Location', 'type': 'object', } def test_advanced_generic_schema(): # noqa: C901 T = TypeVar('T') K = TypeVar('K') class Gen(Generic[T]): def __init__(self, data: Any): self.data = data @classmethod def __get_validators__(cls): yield cls.validate @classmethod def validate(cls, v: Any): return v @classmethod def __get_pydantic_core_schema__(cls, source: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: if hasattr(source, '__args__'): arg = source.__args__[0] def js_func(s, h): # ignore the schema we were given and get a new CoreSchema s = handler.generate_schema(Optional[arg]) return h(s) return core_schema.with_info_plain_validator_function( Gen, metadata={'pydantic_js_annotation_functions': [js_func]}, ) else: return handler(source) @classmethod def __get_pydantic_json_schema__( cls, core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: try: field_schema = handler(core_schema) except PydanticInvalidForJsonSchema: field_schema = {} the_type = field_schema.pop('anyOf', [{'type': 'string'}])[0] field_schema.update(title='Gen title', 
anyOf=[the_type, {'type': 'array', 'items': the_type}])
            return field_schema

    class GenTwoParams(Generic[T, K]):
        def __init__(self, x: str, y: Any):
            self.x = x
            self.y = y

        @classmethod
        def __get_validators__(cls):
            yield cls.validate

        @classmethod
        def validate(cls, v: Any):
            return cls(*v)

        @classmethod
        def __get_pydantic_core_schema__(
            cls, source: Any, handler: GetCoreSchemaHandler, **_kwargs: Any
        ) -> core_schema.CoreSchema:
            if hasattr(source, '__args__'):
                # the js_function ignores the schema we were given and gets a new Tuple CoreSchema
                metadata = {'pydantic_js_functions': [lambda _c, h: h(handler(Tuple[source.__args__]))]}
                return core_schema.with_info_plain_validator_function(
                    GenTwoParams,
                    metadata=metadata,
                )
            return handler(source)

        @classmethod
        def __get_pydantic_json_schema__(
            cls, core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler
        ) -> JsonSchemaValue:
            field_schema = handler(core_schema)
            field_schema.pop('minItems')
            field_schema.pop('maxItems')
            field_schema.update(examples=[['a', 'e0add881-8b94-4368-8286-f8607928924e']])
            return field_schema

    class CustomType(Enum):
        A = 'a'
        B = 'b'

        @classmethod
        def __get_pydantic_json_schema__(
            cls, core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler
        ) -> JsonSchemaValue:
            json_schema = handler(core_schema)
            json_schema.update(title='CustomType title', type='string')
            return json_schema

    class Model(BaseModel):
        data0: Gen
        data1: Gen[CustomType] = Field(title='Data1 title', description='Data 1 description')
        data2: GenTwoParams[CustomType, UUID4] = Field(title='Data2 title', description='Data 2')
        # check Tuple because changes in code touch that type
        data3: Tuple
        data4: Tuple[CustomType]
        data5: Tuple[CustomType, str]

        model_config = {'arbitrary_types_allowed': True}

    # insert_assert(Model.model_json_schema())
    assert Model.model_json_schema() == {
        '$defs': {'CustomType': {'enum': ['a', 'b'], 'title': 'CustomType title', 'type': 'string'}},
        'properties': {
            'data0': {
                'anyOf': [{'type': 'string'},
{'items': {'type': 'string'}, 'type': 'array'}], 'title': 'Gen title', }, 'data1': { 'anyOf': [{'$ref': '#/$defs/CustomType'}, {'items': {'$ref': '#/$defs/CustomType'}, 'type': 'array'}], 'description': 'Data 1 description', 'title': 'Data1 title', }, 'data2': { 'description': 'Data 2', 'examples': [['a', 'e0add881-8b94-4368-8286-f8607928924e']], 'prefixItems': [{'$ref': '#/$defs/CustomType'}, {'format': 'uuid4', 'type': 'string'}], 'title': 'Data2 title', 'type': 'array', }, 'data3': {'items': {}, 'title': 'Data3', 'type': 'array'}, 'data4': { 'maxItems': 1, 'minItems': 1, 'prefixItems': [{'$ref': '#/$defs/CustomType'}], 'title': 'Data4', 'type': 'array', }, 'data5': { 'maxItems': 2, 'minItems': 2, 'prefixItems': [{'$ref': '#/$defs/CustomType'}, {'type': 'string'}], 'title': 'Data5', 'type': 'array', }, }, 'required': ['data0', 'data1', 'data2', 'data3', 'data4', 'data5'], 'title': 'Model', 'type': 'object', } def test_nested_generic(): """ Test a nested BaseModel that is also a Generic """ class Ref(BaseModel, Generic[T]): uuid: str def resolve(self) -> T: ... class Model(BaseModel): ref: Ref['Model'] assert Model.model_json_schema() == { 'title': 'Model', 'type': 'object', '$defs': { 'Ref_Model_': { 'title': 'Ref[Model]', 'type': 'object', 'properties': { 'uuid': {'title': 'Uuid', 'type': 'string'}, }, 'required': ['uuid'], }, }, 'properties': { 'ref': {'$ref': '#/$defs/Ref_Model_'}, }, 'required': ['ref'], } def test_nested_generic_model(): """ Test a nested generic model """ class Box(BaseModel, Generic[T]): uuid: str data: T class Model(BaseModel): box_str: Box[str] box_int: Box[int] assert Model.model_json_schema() == { 'title': 'Model', 'type': 'object', '$defs': { 'Box_str_': Box[str].model_json_schema(), 'Box_int_': Box[int].model_json_schema(), }, 'properties': { 'box_str': {'$ref': '#/$defs/Box_str_'}, 'box_int': {'$ref': '#/$defs/Box_int_'}, }, 'required': ['box_str', 'box_int'], } def test_complex_nested_generic(): """ Handle a union of a generic. 
""" class Ref(BaseModel, Generic[T]): uuid: str def resolve(self) -> T: ... class Model(BaseModel): uuid: str model: Union[Ref['Model'], 'Model'] def resolve(self) -> 'Model': ... Model.model_rebuild() assert Model.model_json_schema() == { '$defs': { 'Model': { 'title': 'Model', 'type': 'object', 'properties': { 'uuid': {'title': 'Uuid', 'type': 'string'}, 'model': { 'title': 'Model', 'anyOf': [ {'$ref': '#/$defs/Ref_Model_'}, {'$ref': '#/$defs/Model'}, ], }, }, 'required': ['uuid', 'model'], }, 'Ref_Model_': { 'title': 'Ref[Model]', 'type': 'object', 'properties': {'uuid': {'title': 'Uuid', 'type': 'string'}}, 'required': ['uuid'], }, }, '$ref': '#/$defs/Model', } def test_modify_schema_dict_keys() -> None: class MyType: @classmethod def __get_pydantic_json_schema__( cls, core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: return {'test': 'passed'} class MyModel(BaseModel): my_field: Dict[str, MyType] model_config = dict(arbitrary_types_allowed=True) assert MyModel.model_json_schema() == { 'properties': { 'my_field': {'additionalProperties': {'test': 'passed'}, 'title': 'My Field', 'type': 'object'} # <---- }, 'required': ['my_field'], 'title': 'MyModel', 'type': 'object', } def test_remove_anyof_redundancy() -> None: class A: @classmethod def __get_pydantic_json_schema__( cls, core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: return handler({'type': 'str'}) class B: @classmethod def __get_pydantic_json_schema__( cls, core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: return handler({'type': 'str'}) class MyModel(BaseModel): model_config = ConfigDict(arbitrary_types_allowed=True) # Union of two objects should give a JSON with an `anyOf` field, but in this case # since the fields are the same, the `anyOf` is removed. 
field: Union[A, B] assert MyModel.model_json_schema() == { 'properties': {'field': {'title': 'Field', 'type': 'string'}}, 'required': ['field'], 'title': 'MyModel', 'type': 'object', } def test_discriminated_union(): class Cat(BaseModel): pet_type: Literal['cat'] class Dog(BaseModel): pet_type: Literal['dog'] class Lizard(BaseModel): pet_type: Literal['reptile', 'lizard'] class Model(BaseModel): pet: Union[Cat, Dog, Lizard] = Field(discriminator='pet_type') # insert_assert(Model.model_json_schema()) assert Model.model_json_schema() == { '$defs': { 'Cat': { 'properties': {'pet_type': {'const': 'cat', 'title': 'Pet Type', 'type': 'string'}}, 'required': ['pet_type'], 'title': 'Cat', 'type': 'object', }, 'Dog': { 'properties': {'pet_type': {'const': 'dog', 'title': 'Pet Type', 'type': 'string'}}, 'required': ['pet_type'], 'title': 'Dog', 'type': 'object', }, 'Lizard': { 'properties': {'pet_type': {'enum': ['reptile', 'lizard'], 'title': 'Pet Type', 'type': 'string'}}, 'required': ['pet_type'], 'title': 'Lizard', 'type': 'object', }, }, 'properties': { 'pet': { 'discriminator': { 'mapping': { 'cat': '#/$defs/Cat', 'dog': '#/$defs/Dog', 'lizard': '#/$defs/Lizard', 'reptile': '#/$defs/Lizard', }, 'propertyName': 'pet_type', }, 'oneOf': [{'$ref': '#/$defs/Cat'}, {'$ref': '#/$defs/Dog'}, {'$ref': '#/$defs/Lizard'}], 'title': 'Pet', } }, 'required': ['pet'], 'title': 'Model', 'type': 'object', } def test_discriminated_annotated_union(): class Cat(BaseModel): pet_type: Literal['cat'] class Dog(BaseModel): pet_type: Literal['dog'] class Lizard(BaseModel): pet_type: Literal['reptile', 'lizard'] class Model(BaseModel): pet: Annotated[Union[Cat, Dog, Lizard], Field(discriminator='pet_type')] # insert_assert(Model.model_json_schema()) assert Model.model_json_schema() == { '$defs': { 'Cat': { 'properties': {'pet_type': {'const': 'cat', 'title': 'Pet Type', 'type': 'string'}}, 'required': ['pet_type'], 'title': 'Cat', 'type': 'object', }, 'Dog': { 'properties': {'pet_type': 
{'const': 'dog', 'title': 'Pet Type', 'type': 'string'}}, 'required': ['pet_type'], 'title': 'Dog', 'type': 'object', }, 'Lizard': { 'properties': {'pet_type': {'enum': ['reptile', 'lizard'], 'title': 'Pet Type', 'type': 'string'}}, 'required': ['pet_type'], 'title': 'Lizard', 'type': 'object', }, }, 'properties': { 'pet': { 'discriminator': { 'mapping': { 'cat': '#/$defs/Cat', 'dog': '#/$defs/Dog', 'lizard': '#/$defs/Lizard', 'reptile': '#/$defs/Lizard', }, 'propertyName': 'pet_type', }, 'oneOf': [{'$ref': '#/$defs/Cat'}, {'$ref': '#/$defs/Dog'}, {'$ref': '#/$defs/Lizard'}], 'title': 'Pet', } }, 'required': ['pet'], 'title': 'Model', 'type': 'object', } def test_nested_discriminated_union(): class BlackCatWithHeight(BaseModel): color: Literal['black'] info: Literal['height'] height: float class BlackCatWithWeight(BaseModel): color: Literal['black'] info: Literal['weight'] weight: float BlackCat = Annotated[Union[BlackCatWithHeight, BlackCatWithWeight], Field(discriminator='info')] class WhiteCat(BaseModel): color: Literal['white'] white_cat_info: str class Cat(BaseModel): pet: Annotated[Union[BlackCat, WhiteCat], Field(discriminator='color')] # insert_assert(Cat.model_json_schema()) assert Cat.model_json_schema() == { '$defs': { 'BlackCatWithHeight': { 'properties': { 'color': {'const': 'black', 'title': 'Color', 'type': 'string'}, 'height': {'title': 'Height', 'type': 'number'}, 'info': {'const': 'height', 'title': 'Info', 'type': 'string'}, }, 'required': ['color', 'info', 'height'], 'title': 'BlackCatWithHeight', 'type': 'object', }, 'BlackCatWithWeight': { 'properties': { 'color': {'const': 'black', 'title': 'Color', 'type': 'string'}, 'info': {'const': 'weight', 'title': 'Info', 'type': 'string'}, 'weight': {'title': 'Weight', 'type': 'number'}, }, 'required': ['color', 'info', 'weight'], 'title': 'BlackCatWithWeight', 'type': 'object', }, 'WhiteCat': { 'properties': { 'color': {'const': 'white', 'title': 'Color', 'type': 'string'}, 'white_cat_info': 
{'title': 'White Cat Info', 'type': 'string'}, }, 'required': ['color', 'white_cat_info'], 'title': 'WhiteCat', 'type': 'object', }, }, 'properties': { 'pet': { 'discriminator': { 'mapping': { 'black': { 'discriminator': { 'mapping': { 'height': '#/$defs/BlackCatWithHeight', 'weight': '#/$defs/BlackCatWithWeight', }, 'propertyName': 'info', }, 'oneOf': [{'$ref': '#/$defs/BlackCatWithHeight'}, {'$ref': '#/$defs/BlackCatWithWeight'}], }, 'white': '#/$defs/WhiteCat', }, 'propertyName': 'color', }, 'oneOf': [ { 'discriminator': { 'mapping': {'height': '#/$defs/BlackCatWithHeight', 'weight': '#/$defs/BlackCatWithWeight'}, 'propertyName': 'info', }, 'oneOf': [{'$ref': '#/$defs/BlackCatWithHeight'}, {'$ref': '#/$defs/BlackCatWithWeight'}], }, {'$ref': '#/$defs/WhiteCat'}, ], 'title': 'Pet', } }, 'required': ['pet'], 'title': 'Cat', 'type': 'object', } def test_deeper_nested_discriminated_annotated_union(): class BlackCatWithHeight(BaseModel): pet_type: Literal['cat'] color: Literal['black'] info: Literal['height'] black_infos: str class BlackCatWithWeight(BaseModel): pet_type: Literal['cat'] color: Literal['black'] info: Literal['weight'] black_infos: str BlackCat = Annotated[Union[BlackCatWithHeight, BlackCatWithWeight], Field(discriminator='info')] class WhiteCat(BaseModel): pet_type: Literal['cat'] color: Literal['white'] white_infos: str Cat = Annotated[Union[BlackCat, WhiteCat], Field(discriminator='color')] class Dog(BaseModel): pet_type: Literal['dog'] dog_name: str Pet = Annotated[Union[Cat, Dog], Field(discriminator='pet_type')] class Model(BaseModel): pet: Pet number: int # insert_assert(Model.model_json_schema()) assert Model.model_json_schema() == { '$defs': { 'BlackCatWithHeight': { 'properties': { 'black_infos': {'title': 'Black Infos', 'type': 'string'}, 'color': {'const': 'black', 'title': 'Color', 'type': 'string'}, 'info': {'const': 'height', 'title': 'Info', 'type': 'string'}, 'pet_type': {'const': 'cat', 'title': 'Pet Type', 'type': 'string'}, }, 
'required': ['pet_type', 'color', 'info', 'black_infos'], 'title': 'BlackCatWithHeight', 'type': 'object', }, 'BlackCatWithWeight': { 'properties': { 'black_infos': {'title': 'Black Infos', 'type': 'string'}, 'color': {'const': 'black', 'title': 'Color', 'type': 'string'}, 'info': {'const': 'weight', 'title': 'Info', 'type': 'string'}, 'pet_type': {'const': 'cat', 'title': 'Pet Type', 'type': 'string'}, }, 'required': ['pet_type', 'color', 'info', 'black_infos'], 'title': 'BlackCatWithWeight', 'type': 'object', }, 'Dog': { 'properties': { 'dog_name': {'title': 'Dog Name', 'type': 'string'}, 'pet_type': {'const': 'dog', 'title': 'Pet Type', 'type': 'string'}, }, 'required': ['pet_type', 'dog_name'], 'title': 'Dog', 'type': 'object', }, 'WhiteCat': { 'properties': { 'color': {'const': 'white', 'title': 'Color', 'type': 'string'}, 'pet_type': {'const': 'cat', 'title': 'Pet Type', 'type': 'string'}, 'white_infos': {'title': 'White Infos', 'type': 'string'}, }, 'required': ['pet_type', 'color', 'white_infos'], 'title': 'WhiteCat', 'type': 'object', }, }, 'properties': { 'number': {'title': 'Number', 'type': 'integer'}, 'pet': { 'discriminator': { 'mapping': { 'cat': { 'discriminator': { 'mapping': { 'black': { 'discriminator': { 'mapping': { 'height': '#/$defs/BlackCatWithHeight', 'weight': '#/$defs/BlackCatWithWeight', }, 'propertyName': 'info', }, 'oneOf': [ {'$ref': '#/$defs/BlackCatWithHeight'}, {'$ref': '#/$defs/BlackCatWithWeight'}, ], }, 'white': '#/$defs/WhiteCat', }, 'propertyName': 'color', }, 'oneOf': [ { 'discriminator': { 'mapping': { 'height': '#/$defs/BlackCatWithHeight', 'weight': '#/$defs/BlackCatWithWeight', }, 'propertyName': 'info', }, 'oneOf': [ {'$ref': '#/$defs/BlackCatWithHeight'}, {'$ref': '#/$defs/BlackCatWithWeight'}, ], }, {'$ref': '#/$defs/WhiteCat'}, ], }, 'dog': '#/$defs/Dog', }, 'propertyName': 'pet_type', }, 'oneOf': [ { 'discriminator': { 'mapping': { 'black': { 'discriminator': { 'mapping': { 'height': '#/$defs/BlackCatWithHeight', 
'weight': '#/$defs/BlackCatWithWeight', }, 'propertyName': 'info', }, 'oneOf': [ {'$ref': '#/$defs/BlackCatWithHeight'}, {'$ref': '#/$defs/BlackCatWithWeight'}, ], }, 'white': '#/$defs/WhiteCat', }, 'propertyName': 'color', }, 'oneOf': [ { 'discriminator': { 'mapping': { 'height': '#/$defs/BlackCatWithHeight', 'weight': '#/$defs/BlackCatWithWeight', }, 'propertyName': 'info', }, 'oneOf': [ {'$ref': '#/$defs/BlackCatWithHeight'}, {'$ref': '#/$defs/BlackCatWithWeight'}, ], }, {'$ref': '#/$defs/WhiteCat'}, ], }, {'$ref': '#/$defs/Dog'}, ], 'title': 'Pet', }, }, 'required': ['pet', 'number'], 'title': 'Model', 'type': 'object', } def test_discriminated_annotated_union_literal_enum(): class PetType(Enum): cat = 'cat' dog = 'dog' class PetColor(str, Enum): black = 'black' white = 'white' class PetInfo(Enum): height = 0 weight = 1 class BlackCatWithHeight(BaseModel): pet_type: Literal[PetType.cat] color: Literal[PetColor.black] info: Literal[PetInfo.height] black_infos: str class BlackCatWithWeight(BaseModel): pet_type: Literal[PetType.cat] color: Literal[PetColor.black] info: Literal[PetInfo.weight] black_infos: str BlackCat = Annotated[Union[BlackCatWithHeight, BlackCatWithWeight], Field(discriminator='info')] class WhiteCat(BaseModel): pet_type: Literal[PetType.cat] color: Literal[PetColor.white] white_infos: str Cat = Annotated[Union[BlackCat, WhiteCat], Field(discriminator='color')] class Dog(BaseModel): pet_type: Literal[PetType.dog] dog_name: str Pet = Annotated[Union[Cat, Dog], Field(discriminator='pet_type')] class Model(BaseModel): pet: Pet number: int # insert_assert(Model.model_json_schema()) assert Model.model_json_schema() == { '$defs': { 'BlackCatWithHeight': { 'properties': { 'black_infos': {'title': 'Black Infos', 'type': 'string'}, 'color': {'const': 'black', 'title': 'Color', 'type': 'string'}, 'info': {'const': 0, 'title': 'Info', 'type': 'integer'}, 'pet_type': {'const': 'cat', 'title': 'Pet Type', 'type': 'string'}, }, 'required': ['pet_type', 
'color', 'info', 'black_infos'], 'title': 'BlackCatWithHeight', 'type': 'object', }, 'BlackCatWithWeight': { 'properties': { 'black_infos': {'title': 'Black Infos', 'type': 'string'}, 'color': {'const': 'black', 'title': 'Color', 'type': 'string'}, 'info': {'const': 1, 'title': 'Info', 'type': 'integer'}, 'pet_type': {'const': 'cat', 'title': 'Pet Type', 'type': 'string'}, }, 'required': ['pet_type', 'color', 'info', 'black_infos'], 'title': 'BlackCatWithWeight', 'type': 'object', }, 'Dog': { 'properties': { 'dog_name': {'title': 'Dog Name', 'type': 'string'}, 'pet_type': {'const': 'dog', 'title': 'Pet Type', 'type': 'string'}, }, 'required': ['pet_type', 'dog_name'], 'title': 'Dog', 'type': 'object', }, 'WhiteCat': { 'properties': { 'color': {'const': 'white', 'title': 'Color', 'type': 'string'}, 'pet_type': {'const': 'cat', 'title': 'Pet Type', 'type': 'string'}, 'white_infos': {'title': 'White Infos', 'type': 'string'}, }, 'required': ['pet_type', 'color', 'white_infos'], 'title': 'WhiteCat', 'type': 'object', }, }, 'properties': { 'number': {'title': 'Number', 'type': 'integer'}, 'pet': { 'discriminator': { 'mapping': { 'cat': { 'discriminator': { 'mapping': { 'black': { 'discriminator': { 'mapping': { '0': '#/$defs/BlackCatWithHeight', '1': '#/$defs/BlackCatWithWeight', }, 'propertyName': 'info', }, 'oneOf': [ {'$ref': '#/$defs/BlackCatWithHeight'}, {'$ref': '#/$defs/BlackCatWithWeight'}, ], }, 'white': '#/$defs/WhiteCat', }, 'propertyName': 'color', }, 'oneOf': [ { 'discriminator': { 'mapping': { '0': '#/$defs/BlackCatWithHeight', '1': '#/$defs/BlackCatWithWeight', }, 'propertyName': 'info', }, 'oneOf': [ {'$ref': '#/$defs/BlackCatWithHeight'}, {'$ref': '#/$defs/BlackCatWithWeight'}, ], }, {'$ref': '#/$defs/WhiteCat'}, ], }, 'dog': '#/$defs/Dog', }, 'propertyName': 'pet_type', }, 'oneOf': [ { 'discriminator': { 'mapping': { 'black': { 'discriminator': { 'mapping': { '0': '#/$defs/BlackCatWithHeight', '1': '#/$defs/BlackCatWithWeight', }, 'propertyName': 
'info', }, 'oneOf': [ {'$ref': '#/$defs/BlackCatWithHeight'}, {'$ref': '#/$defs/BlackCatWithWeight'}, ], }, 'white': '#/$defs/WhiteCat', }, 'propertyName': 'color', }, 'oneOf': [ { 'discriminator': { 'mapping': {'0': '#/$defs/BlackCatWithHeight', '1': '#/$defs/BlackCatWithWeight'}, 'propertyName': 'info', }, 'oneOf': [ {'$ref': '#/$defs/BlackCatWithHeight'}, {'$ref': '#/$defs/BlackCatWithWeight'}, ], }, {'$ref': '#/$defs/WhiteCat'}, ], }, {'$ref': '#/$defs/Dog'}, ], 'title': 'Pet', }, }, 'required': ['pet', 'number'], 'title': 'Model', 'type': 'object', } def test_alias_same(): class Cat(BaseModel): pet_type: Literal['cat'] = Field(alias='typeOfPet') c: str class Dog(BaseModel): pet_type: Literal['dog'] = Field(alias='typeOfPet') d: str class Model(BaseModel): pet: Union[Cat, Dog] = Field(discriminator='pet_type') number: int # insert_assert(Model.model_json_schema()) assert Model.model_json_schema() == { '$defs': { 'Cat': { 'properties': { 'c': {'title': 'C', 'type': 'string'}, 'typeOfPet': {'const': 'cat', 'title': 'Typeofpet', 'type': 'string'}, }, 'required': ['typeOfPet', 'c'], 'title': 'Cat', 'type': 'object', }, 'Dog': { 'properties': { 'd': {'title': 'D', 'type': 'string'}, 'typeOfPet': {'const': 'dog', 'title': 'Typeofpet', 'type': 'string'}, }, 'required': ['typeOfPet', 'd'], 'title': 'Dog', 'type': 'object', }, }, 'properties': { 'number': {'title': 'Number', 'type': 'integer'}, 'pet': { 'oneOf': [{'$ref': '#/$defs/Cat'}, {'$ref': '#/$defs/Dog'}], 'title': 'Pet', 'discriminator': {'mapping': {'cat': '#/$defs/Cat', 'dog': '#/$defs/Dog'}, 'propertyName': 'typeOfPet'}, }, }, 'required': ['pet', 'number'], 'title': 'Model', 'type': 'object', } def test_nested_python_dataclasses(): """ Test schema generation for nested python dataclasses """ from dataclasses import dataclass as python_dataclass @python_dataclass class ChildModel: name: str @python_dataclass class NestedModel: """ Custom description """ # Note: the Custom description will not be preserved as 
this is a vanilla dataclass # This is the same behavior as in v1 child: List[ChildModel] # insert_assert(model_json_schema(dataclass(NestedModel))) assert model_json_schema(dataclass(NestedModel)) == { '$defs': { 'ChildModel': { 'properties': {'name': {'title': 'Name', 'type': 'string'}}, 'required': ['name'], 'title': 'ChildModel', 'type': 'object', } }, 'properties': {'child': {'items': {'$ref': '#/$defs/ChildModel'}, 'title': 'Child', 'type': 'array'}}, 'required': ['child'], 'title': 'NestedModel', 'type': 'object', } def test_discriminated_union_in_list(): class BlackCat(BaseModel): pet_type: Literal['cat'] color: Literal['black'] black_name: str class WhiteCat(BaseModel): pet_type: Literal['cat'] color: Literal['white'] white_name: str Cat = Annotated[Union[BlackCat, WhiteCat], Field(discriminator='color')] class Dog(BaseModel): pet_type: Literal['dog'] name: str Pet = Annotated[Union[Cat, Dog], Field(discriminator='pet_type')] class Model(BaseModel): pets: Pet n: int # insert_assert(Model.model_json_schema()) assert Model.model_json_schema() == { '$defs': { 'BlackCat': { 'properties': { 'black_name': {'title': 'Black Name', 'type': 'string'}, 'color': {'const': 'black', 'title': 'Color', 'type': 'string'}, 'pet_type': {'const': 'cat', 'title': 'Pet Type', 'type': 'string'}, }, 'required': ['pet_type', 'color', 'black_name'], 'title': 'BlackCat', 'type': 'object', }, 'Dog': { 'properties': { 'name': {'title': 'Name', 'type': 'string'}, 'pet_type': {'const': 'dog', 'title': 'Pet Type', 'type': 'string'}, }, 'required': ['pet_type', 'name'], 'title': 'Dog', 'type': 'object', }, 'WhiteCat': { 'properties': { 'color': {'const': 'white', 'title': 'Color', 'type': 'string'}, 'pet_type': {'const': 'cat', 'title': 'Pet Type', 'type': 'string'}, 'white_name': {'title': 'White Name', 'type': 'string'}, }, 'required': ['pet_type', 'color', 'white_name'], 'title': 'WhiteCat', 'type': 'object', }, }, 'properties': { 'n': {'title': 'N', 'type': 'integer'}, 'pets': { 
'discriminator': { 'mapping': { 'cat': { 'discriminator': { 'mapping': {'black': '#/$defs/BlackCat', 'white': '#/$defs/WhiteCat'}, 'propertyName': 'color', }, 'oneOf': [{'$ref': '#/$defs/BlackCat'}, {'$ref': '#/$defs/WhiteCat'}], }, 'dog': '#/$defs/Dog', }, 'propertyName': 'pet_type', }, 'oneOf': [ { 'discriminator': { 'mapping': {'black': '#/$defs/BlackCat', 'white': '#/$defs/WhiteCat'}, 'propertyName': 'color', }, 'oneOf': [{'$ref': '#/$defs/BlackCat'}, {'$ref': '#/$defs/WhiteCat'}], }, {'$ref': '#/$defs/Dog'}, ], 'title': 'Pets', }, }, 'required': ['pets', 'n'], 'title': 'Model', 'type': 'object', } def test_model_with_type_attributes(): class Foo: a: float class Bar(BaseModel): b: int class Baz(BaseModel): a: Type[Foo] b: Type[Bar] assert Baz.model_json_schema() == { 'title': 'Baz', 'type': 'object', 'properties': {'a': {'title': 'A'}, 'b': {'title': 'B'}}, 'required': ['a', 'b'], } @pytest.mark.parametrize('secret_cls', [SecretStr, SecretBytes]) @pytest.mark.parametrize( 'field_kw,schema_kw', [ # [{}, {}], [{'min_length': 6}, {'minLength': 6}], [{'max_length': 10}, {'maxLength': 10}], [{'min_length': 6, 'max_length': 10}, {'minLength': 6, 'maxLength': 10}], ], ids=['min-constraint', 'max-constraint', 'min-max-constraints'], ) def test_secrets_schema(secret_cls, field_kw, schema_kw): class Foobar(BaseModel): password: secret_cls = Field(**field_kw) assert Foobar.model_json_schema() == { 'title': 'Foobar', 'type': 'object', 'properties': { 'password': {'title': 'Password', 'type': 'string', 'writeOnly': True, 'format': 'password', **schema_kw} }, 'required': ['password'], } def test_override_generate_json_schema(): class MyGenerateJsonSchema(GenerateJsonSchema): def generate(self, schema, mode='validation'): json_schema = super().generate(schema, mode=mode) json_schema['$schema'] = self.schema_dialect return json_schema class MyBaseModel(BaseModel): @classmethod def model_json_schema( cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, 
schema_generator: Type[GenerateJsonSchema] = MyGenerateJsonSchema, mode='validation', ) -> Dict[str, Any]: return super().model_json_schema(by_alias, ref_template, schema_generator, mode) class MyModel(MyBaseModel): x: int assert MyModel.model_json_schema() == { '$schema': 'https://json-schema.org/draft/2020-12/schema', 'properties': {'x': {'title': 'X', 'type': 'integer'}}, 'required': ['x'], 'title': 'MyModel', 'type': 'object', } def test_generate_json_schema_generate_twice(): generator = GenerateJsonSchema() class Model(BaseModel): title: str generator.generate(Model.__pydantic_core_schema__) with pytest.raises( PydanticUserError, match=re.escape( 'This JSON schema generator has already been used to generate a JSON schema. ' 'You must create a new instance of GenerateJsonSchema to generate a new JSON schema.' ), ): generator.generate(Model.__pydantic_core_schema__) generator = GenerateJsonSchema() generator.generate_definitions([(Model, 'validation', Model.__pydantic_core_schema__)]) with pytest.raises( PydanticUserError, match=re.escape( 'This JSON schema generator has already been used to generate a JSON schema. ' 'You must create a new instance of GenerateJsonSchema to generate a new JSON schema.' 
        ),
    ):
        generator.generate_definitions([(Model, 'validation', Model.__pydantic_core_schema__)])


def test_nested_default_json_schema():
    class InnerModel(BaseModel):
        foo: str = 'bar'
        baz: str = Field(default='foobar', alias='my_alias')

    class OuterModel(BaseModel):
        nested_field: InnerModel = InnerModel()

    assert OuterModel.model_json_schema() == {
        '$defs': {
            'InnerModel': {
                'properties': {
                    'foo': {'default': 'bar', 'title': 'Foo', 'type': 'string'},
                    'my_alias': {'default': 'foobar', 'title': 'My Alias', 'type': 'string'},
                },
                'title': 'InnerModel',
                'type': 'object',
            }
        },
        'properties': {'nested_field': {'$ref': '#/$defs/InnerModel', 'default': {'my_alias': 'foobar', 'foo': 'bar'}}},
        'title': 'OuterModel',
        'type': 'object',
    }


@pytest.mark.xfail(
    reason=(
        'We are calling __get_pydantic_json_schema__ too many times.'
        ' The second time we analyze a model we get the CoreSchema from __pydantic_core_schema__.'
        ' But then we proceed to append to the metadata json schema functions.'
    )
)
def test_get_pydantic_core_schema_calls() -> None:
    """Verify when/how many times `__get_pydantic_json_schema__` gets called"""
    calls: List[str] = []

    class Model(BaseModel):
        @classmethod
        def __get_pydantic_json_schema__(cls, schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue:
            calls.append('Model::before')
            json_schema = handler(schema)
            calls.append('Model::after')
            return json_schema

    schema = Model.model_json_schema()
    expected: JsonSchemaValue = {'type': 'object', 'properties': {}, 'title': 'Model'}
    assert schema == expected
    assert calls == ['Model::before', 'Model::after']

    calls.clear()

    class CustomAnnotation(NamedTuple):
        name: str

        def __get_pydantic_json_schema__(self, schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue:
            calls.append(f'CustomAnnotation({self.name})::before')
            json_schema = handler(schema)
            calls.append(f'CustomAnnotation({self.name})::after')
            return json_schema

    AnnotatedType = Annotated[str, CustomAnnotation('foo'), CustomAnnotation('bar')]
    schema =
TypeAdapter(AnnotatedType).json_schema()
    expected: JsonSchemaValue = {'type': 'string'}
    assert schema == expected
    assert calls == [
        'CustomAnnotation(bar)::before',
        'CustomAnnotation(foo)::before',
        'CustomAnnotation(foo)::after',
        'CustomAnnotation(bar)::after',
    ]

    calls.clear()

    class OuterModel(BaseModel):
        x: Model

        @classmethod
        def __get_pydantic_json_schema__(cls, schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue:
            calls.append('OuterModel::before')
            json_schema = handler(schema)
            calls.append('OuterModel::after')
            return json_schema

    schema = OuterModel.model_json_schema()
    expected: JsonSchemaValue = {
        'type': 'object',
        'properties': {'x': {'$ref': '#/$defs/Model'}},
        'required': ['x'],
        'title': 'OuterModel',
        '$defs': {'Model': {'type': 'object', 'properties': {}, 'title': 'Model'}},
    }
    assert schema == expected
    assert calls == [
        'OuterModel::before',
        'Model::before',
        'Model::after',
        'OuterModel::after',
    ]

    calls.clear()

    AnnotatedModel = Annotated[Model, CustomAnnotation('foo')]

    schema = TypeAdapter(AnnotatedModel).json_schema()
    expected: JsonSchemaValue = {}
    assert schema == expected
    assert calls == [
        'CustomAnnotation(foo)::before',
        'Model::before',
        'Model::after',
        'CustomAnnotation(foo)::after',
    ]

    calls.clear()

    class OuterModelWithAnnotatedField(BaseModel):
        x: AnnotatedModel

    schema = OuterModelWithAnnotatedField.model_json_schema()
    expected: JsonSchemaValue = {
        'type': 'object',
        'properties': {'x': {'$ref': '#/$defs/Model'}},
        'required': ['x'],
        'title': 'OuterModel',
        '$defs': {'Model': {'type': 'object', 'properties': {}, 'title': 'Model'}},
    }
    assert schema == expected
    assert calls == [
        'OuterModel::before',
        'CustomAnnotation(foo)::before',
        'Model::before',
        'Model::after',
        'CustomAnnotation(foo)::after',
        'OuterModel::after',
    ]

    calls.clear()


def test_annotated_get_json_schema() -> None:
    calls: List[int] = []

    class CustomType(str):
        @classmethod
        def __get_pydantic_core_schema__(
            cls, source_type: Any, handler: GetCoreSchemaHandler
        ) -> core_schema.CoreSchema:
            return handler(str)

        @classmethod
        def __get_pydantic_json_schema__(cls, schema: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue:
            calls.append(1)
            json_schema = handler(schema)
            return json_schema

    TypeAdapter(Annotated[CustomType, 123]).json_schema()

    assert sum(calls) == 1


def test_model_with_strict_mode():
    class Model(BaseModel):
        model_config = ConfigDict(strict=True)

        a: str

    assert Model.model_json_schema() == {
        'properties': {'a': {'title': 'A', 'type': 'string'}},
        'required': ['a'],
        'title': 'Model',
        'type': 'object',
    }


def test_model_with_schema_extra():
    class Model(BaseModel):
        a: str

        model_config = dict(json_schema_extra={'examples': [{'a': 'Foo'}]})

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {'a': {'title': 'A', 'type': 'string'}},
        'required': ['a'],
        'examples': [{'a': 'Foo'}],
    }


@pytest.mark.skip_json_schema_validation(reason='Custom type used.')
def test_model_with_schema_extra_callable():
    class Model(BaseModel):
        name: str = None

        @staticmethod
        def json_schema_extra(schema, model_class):
            schema.pop('properties')
            schema['type'] = 'override'
            assert model_class is Model

        model_config = dict(json_schema_extra=json_schema_extra)

    assert Model.model_json_schema() == {'title': 'Model', 'type': 'override'}


@pytest.mark.skip_json_schema_validation(reason='Custom type used.')
def test_model_with_schema_extra_callable_no_model_class():
    class Model(BaseModel):
        name: str = None

        @classmethod
        def json_schema_extra(cls, schema):
            schema.pop('properties')
            schema['type'] = 'override'

        model_config = dict(json_schema_extra=json_schema_extra)

    assert Model.model_json_schema() == {'title': 'Model', 'type': 'override'}


@pytest.mark.skip_json_schema_validation(reason='Custom type used.')
def test_model_with_schema_extra_callable_config_class():
    with pytest.warns(PydanticDeprecatedSince20, match='use ConfigDict instead'):

        class Model(BaseModel):
            name: str = None

            class Config:
                @staticmethod
                def json_schema_extra(schema, model_class):
                    schema.pop('properties')
                    schema['type'] = 'override'
                    assert model_class is Model

        assert Model.model_json_schema() == {'title': 'Model', 'type': 'override'}


@pytest.mark.skip_json_schema_validation(reason='Custom type used.')
def test_model_with_schema_extra_callable_no_model_class_config_class():
    with pytest.warns(PydanticDeprecatedSince20):

        class Model(BaseModel):
            name: str = None

            class Config:
                @staticmethod
                def json_schema_extra(schema):
                    schema.pop('properties')
                    schema['type'] = 'override'

        assert Model.model_json_schema() == {'title': 'Model', 'type': 'override'}


@pytest.mark.skip_json_schema_validation(reason='Custom type used.')
def test_model_with_schema_extra_callable_classmethod():
    with pytest.warns(PydanticDeprecatedSince20):

        class Model(BaseModel):
            name: str = None

            class Config:
                type = 'foo'

                @classmethod
                def json_schema_extra(cls, schema, model_class):
                    schema.pop('properties')
                    schema['type'] = cls.type
                    assert model_class is Model

        assert Model.model_json_schema() == {'title': 'Model', 'type': 'foo'}


@pytest.mark.skip_json_schema_validation(reason='Custom type used.')
def test_model_with_schema_extra_callable_instance_method():
    with pytest.warns(PydanticDeprecatedSince20):

        class Model(BaseModel):
            name: str = None

            class Config:
                def json_schema_extra(schema, model_class):
                    schema.pop('properties')
                    schema['type'] = 'override'
                    assert model_class is Model

        assert Model.model_json_schema() == {'title': 'Model', 'type': 'override'}


def test_serialization_validation_interaction():
    class Inner(BaseModel):
        x: Json[int]

    class Outer(BaseModel):
        inner: Inner

    _, v_schema = models_json_schema([(Outer, 'validation')])
    assert v_schema == {
        '$defs': {
            'Inner': {
                'properties': {
                    'x': {
                        'contentMediaType': 'application/json',
                        'contentSchema': {'type': 'integer'},
                        'title': 'X',
                        'type': 'string',
                    }
                },
                'required': ['x'],
                'title': 'Inner',
                'type': 'object',
            },
            'Outer': {
                'properties': {'inner': {'$ref': '#/$defs/Inner'}},
                'required': ['inner'],
                'title': 'Outer',
                'type': 'object',
            },
        }
    }

    _, s_schema = models_json_schema([(Outer, 'serialization')])
    assert s_schema == {
        '$defs': {
            'Inner': {
                'properties': {'x': {'title': 'X', 'type': 'integer'}},
                'required': ['x'],
                'title': 'Inner',
                'type': 'object',
            },
            'Outer': {
                'properties': {'inner': {'$ref': '#/$defs/Inner'}},
                'required': ['inner'],
                'title': 'Outer',
                'type': 'object',
            },
        }
    }

    _, vs_schema = models_json_schema([(Outer, 'validation'), (Outer, 'serialization')])
    assert vs_schema == {
        '$defs': {
            'Inner-Input': {
                'properties': {
                    'x': {
                        'contentMediaType': 'application/json',
                        'contentSchema': {'type': 'integer'},
                        'title': 'X',
                        'type': 'string',
                    }
                },
                'required': ['x'],
                'title': 'Inner',
                'type': 'object',
            },
            'Inner-Output': {
                'properties': {'x': {'title': 'X', 'type': 'integer'}},
                'required': ['x'],
                'title': 'Inner',
                'type': 'object',
            },
            'Outer-Input': {
                'properties': {'inner': {'$ref': '#/$defs/Inner-Input'}},
                'required': ['inner'],
                'title': 'Outer',
                'type': 'object',
            },
            'Outer-Output': {
                'properties': {'inner': {'$ref': '#/$defs/Inner-Output'}},
                'required': ['inner'],
                'title': 'Outer',
                'type': 'object',
            },
        }
    }


def test_extras_and_examples_are_json_encoded():
    class Toy(BaseModel):
        name: Annotated[str, Field(examples=['mouse', 'ball'])]

    class Cat(BaseModel):
        toys: Annotated[
            List[Toy],
            Field(examples=[[Toy(name='mouse'), Toy(name='ball')]], json_schema_extra={'special': Toy(name='bird')}),
        ]

    assert Cat.model_json_schema()['properties']['toys']['examples'] == [[{'name': 'mouse'}, {'name': 'ball'}]]
    assert Cat.model_json_schema()['properties']['toys']['special'] == {'name': 'bird'}


def test_computed_field():
    class Model(BaseModel):
        x: int

        @computed_field
        @property
        def double_x(self) -> int:
            return 2 * self.x

    assert Model.model_json_schema(mode='validation') == {
        'properties': {'x': {'title': 'X', 'type': 'integer'}},
        'required': ['x'],
        'title': 'Model',
        'type': 'object',
    }
    assert Model.model_json_schema(mode='serialization') == {
        'properties': {
            'double_x': {'readOnly': True, 'title': 'Double X', 'type': 'integer'},
            'x': {'title': 'X', 'type': 'integer'},
        },
        'required': ['x', 'double_x'],
        'title': 'Model',
        'type': 'object',
    }


def test_serialization_schema_with_exclude():
    class MyGenerateJsonSchema(GenerateJsonSchema):
        def field_is_present(self, field) -> bool:
            # Always include fields in the JSON schema, even if excluded from serialization
            return True

    class Model(BaseModel):
        x: int
        y: int = Field(exclude=True)

    assert Model(x=1, y=1).model_dump() == {'x': 1}
    assert Model.model_json_schema(mode='serialization') == {
        'properties': {'x': {'title': 'X', 'type': 'integer'}},
        'required': ['x'],
        'title': 'Model',
        'type': 'object',
    }
    assert Model.model_json_schema(mode='serialization', schema_generator=MyGenerateJsonSchema) == {
        'properties': {'x': {'title': 'X', 'type': 'integer'}, 'y': {'title': 'Y', 'type': 'integer'}},
        'required': ['x', 'y'],
        'title': 'Model',
        'type': 'object',
    }


@pytest.mark.parametrize('mapping_type', [typing.Dict, typing.Mapping])
def test_mappings_str_int_json_schema(mapping_type: Any):
    class Model(BaseModel):
        str_int_map: mapping_type[str, int]

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {
            'str_int_map': {
                'title': 'Str Int Map',
                'type': 'object',
                'additionalProperties': {'type': 'integer'},
            }
        },
        'required': ['str_int_map'],
    }


@pytest.mark.parametrize(('sequence_type',), [pytest.param(List), pytest.param(Sequence)])
def test_sequence_schema(sequence_type):
    class Model(BaseModel):
        field: sequence_type[int]

    assert Model.model_json_schema() == {
        'properties': {
            'field': {'items': {'type': 'integer'}, 'title': 'Field', 'type': 'array'},
        },
        'required': ['field'],
        'title': 'Model',
        'type': 'object',
    }


@pytest.mark.parametrize(('sequence_type',), [pytest.param(List), pytest.param(Sequence)])
def test_sequence_schema_with_max_length(sequence_type):
    class Model(BaseModel):
        field: sequence_type[int] = Field(max_length=5)

    assert Model.model_json_schema() == {
        'properties': {
            'field': {'items': {'type': 'integer'}, 'maxItems': 5, 'title': 'Field', 'type': 'array'},
        },
        'required': ['field'],
        'title': 'Model',
        'type': 'object',
    }


@pytest.mark.parametrize(('sequence_type',), [pytest.param(List), pytest.param(Sequence)])
def test_sequence_schema_with_min_length(sequence_type):
    class Model(BaseModel):
        field: sequence_type[int] = Field(min_length=1)

    assert Model.model_json_schema() == {
        'properties': {
            'field': {'items': {'type': 'integer'}, 'minItems': 1, 'title': 'Field', 'type': 'array'},
        },
        'required': ['field'],
        'title': 'Model',
        'type': 'object',
    }


@pytest.mark.parametrize(('sequence_type',), [pytest.param(List), pytest.param(Sequence)])
def test_sequences_int_json_schema(sequence_type):
    class Model(BaseModel):
        int_seq: sequence_type[int]

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {
            'int_seq': {
                'title': 'Int Seq',
                'type': 'array',
                'items': {'type': 'integer'},
            },
        },
        'required': ['int_seq'],
    }
    assert Model.model_validate_json('{"int_seq": [1, 2, 3]}')


@pytest.mark.parametrize(
    'field_schema,model_schema',
    [
        (None, {'properties': {}, 'title': 'Model', 'type': 'object'}),
        (
            {'a': 'b'},
            {'properties': {'x': {'a': 'b', 'title': 'X'}}, 'required': ['x'], 'title': 'Model', 'type': 'object'},
        ),
    ],
)
@pytest.mark.parametrize('instance_of', [True, False])
def test_arbitrary_type_json_schema(field_schema, model_schema, instance_of):
    class ArbitraryClass:
        pass

    if instance_of:

        class Model(BaseModel):
            x: Annotated[InstanceOf[ArbitraryClass], WithJsonSchema(field_schema)]

    else:

        class Model(BaseModel):
            model_config = dict(arbitrary_types_allowed=True)

            x: Annotated[ArbitraryClass, WithJsonSchema(field_schema)]

    assert Model.model_json_schema() == model_schema


@pytest.mark.parametrize(
    'metadata,json_schema',
    [
        (
            WithJsonSchema({'type': 'number'}),
            {
                'properties': {'x': {'anyOf': [{'type': 'number'}, {'type': 'null'}], 'title': 'X'}},
                'required': ['x'],
                'title': 'Model',
                'type': 'object',
            },
        ),
        (
            Examples([1, 2, 3]),
            {
                'properties': {
                    'x': {
                        'anyOf': [{'examples': [1, 2, 3], 'type': 'integer'}, {'type': 'null'}],
                        'title': 'X',
                    }
                },
                'required': ['x'],
                'title': 'Model',
                'type': 'object',
            },
        ),
    ],
)
def test_hashable_types(metadata, json_schema):
    class Model(BaseModel):
        x: Union[Annotated[int, metadata], None]

    assert Model.model_json_schema() == json_schema


def test_root_model():
    class A(RootModel[int]):
        """A Model docstring"""

    assert A.model_json_schema() == {'title': 'A', 'description': 'A Model docstring', 'type': 'integer'}

    class B(RootModel[A]):
        pass

    assert B.model_json_schema() == {
        '$defs': {'A': {'description': 'A Model docstring', 'title': 'A', 'type': 'integer'}},
        '$ref': '#/$defs/A',
        'title': 'B',
    }

    class C(RootModel[A]):
        """C Model docstring"""

    assert C.model_json_schema() == {
        '$defs': {'A': {'description': 'A Model docstring', 'title': 'A', 'type': 'integer'}},
        '$ref': '#/$defs/A',
        'title': 'C',
        'description': 'C Model docstring',
    }


def test_type_adapter_json_schemas_title_description():
    class Model(BaseModel):
        a: str

    _, json_schema = TypeAdapter.json_schemas([(Model, 'validation', TypeAdapter(Model))])
    assert 'title' not in json_schema
    assert 'description' not in json_schema

    _, json_schema = TypeAdapter.json_schemas(
        [(Model, 'validation', TypeAdapter(Model))],
        title='test title',
        description='test description',
    )
    assert json_schema['title'] == 'test title'
    assert json_schema['description'] == 'test description'


def test_type_adapter_json_schemas_without_definitions():
    _, json_schema = TypeAdapter.json_schemas(
        [(int, 'validation', TypeAdapter(int))],
        ref_template='#/components/schemas/{model}',
    )
    assert 'definitions' not in json_schema


def test_custom_chain_schema():
    class MySequence:
        @classmethod
        def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema:
            list_schema = core_schema.list_schema()
            return core_schema.chain_schema([list_schema])

    class Model(BaseModel):
        model_config = ConfigDict(arbitrary_types_allowed=True)

        a: MySequence

    assert Model.model_json_schema() == {
        'properties': {'a': {'items': {}, 'title': 'A', 'type': 'array'}},
        'required': ['a'],
        'title': 'Model',
        'type': 'object',
    }


def test_json_or_python_schema():
    class MyJsonOrPython:
        @classmethod
        def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema:
            int_schema = core_schema.int_schema()
            return core_schema.json_or_python_schema(json_schema=int_schema, python_schema=int_schema)

    class Model(BaseModel):
        model_config = ConfigDict(arbitrary_types_allowed=True)

        a: MyJsonOrPython

    assert Model.model_json_schema() == {
        'properties': {'a': {'title': 'A', 'type': 'integer'}},
        'required': ['a'],
        'title': 'Model',
        'type': 'object',
    }


def test_lax_or_strict_schema():
    class MyLaxOrStrict:
        @classmethod
        def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema:
            int_schema = core_schema.int_schema()
            return core_schema.lax_or_strict_schema(lax_schema=int_schema, strict_schema=int_schema, strict=True)

    class Model(BaseModel):
        model_config = ConfigDict(arbitrary_types_allowed=True)

        a: MyLaxOrStrict

    assert Model.model_json_schema() == {
        'properties': {'a': {'title': 'A', 'type': 'integer'}},
        'required': ['a'],
        'title': 'Model',
        'type': 'object',
    }


def test_override_enum_json_schema():
    class CustomType(Enum):
        A = 'a'
        B = 'b'

        @classmethod
        def __get_pydantic_json_schema__(
            cls, core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler
        ) -> core_schema.CoreSchema:
            json_schema = handler(core_schema)
            json_schema.update(title='CustomType title', type='string')
            return json_schema

    class Model(BaseModel):
        x: CustomType

    # insert_assert(Model.model_json_schema())
    assert Model.model_json_schema() == {
        '$defs': {'CustomType': {'enum': ['a', 'b'], 'title': 'CustomType title', 'type': 'string'}},
        'properties': {'x': {'$ref': '#/$defs/CustomType'}},
        'required': ['x'],
        'title': 'Model',
        'type': 'object',
    }


def test_json_schema_extras_on_ref() -> None:
    @dataclass
    class JsonSchemaExamples:
        examples: List[Any]

        def __get_pydantic_json_schema__(
            self, core_schema: CoreSchema, handler: GetJsonSchemaHandler
        ) -> JsonSchemaValue:
            json_schema = handler(core_schema)
            assert json_schema.keys() == {'$ref'}
            json_schema['examples'] = to_jsonable_python(self.examples)
            return json_schema

    @dataclass
    class JsonSchemaTitle:
        title: str

        def __get_pydantic_json_schema__(
            self, core_schema: CoreSchema, handler: GetJsonSchemaHandler
        ) -> JsonSchemaValue:
            json_schema = handler(core_schema)
            assert json_schema.keys() == {'$ref', 'examples'}
            json_schema['title'] = self.title
            return json_schema

    class Model(BaseModel):
        name: str
        age: int

    ta = TypeAdapter(Annotated[Model, JsonSchemaExamples([Model(name='John', age=28)]), JsonSchemaTitle('ModelTitle')])

    # insert_assert(ta.json_schema())
    assert ta.json_schema() == {
        '$defs': {
            'Model': {
                'properties': {'age': {'title': 'Age', 'type': 'integer'}, 'name': {'title': 'Name', 'type': 'string'}},
                'required': ['name', 'age'],
                'title': 'Model',
                'type': 'object',
            }
        },
        '$ref': '#/$defs/Model',
        'examples': [{'name': 'John', 'age': 28}],
        'title': 'ModelTitle',
    }


def test_inclusion_of_defaults():
    class Model(BaseModel):
        x: int = 1
        y: int = Field(default_factory=lambda: 2)

    assert Model.model_json_schema() == {
        'properties': {'x': {'default': 1, 'title': 'X', 'type': 'integer'}, 'y': {'title': 'Y', 'type': 'integer'}},
        'title': 'Model',
        'type': 'object',
    }


def test_resolve_def_schema_from_core_schema() -> None:
    class Inner(BaseModel):
        x: int

    class Marker:
        def __get_pydantic_json_schema__(
            self, core_schema: CoreSchema, handler: GetJsonSchemaHandler
        ) -> JsonSchemaValue:
            field_schema = handler(core_schema)
            field_schema['title'] = 'Foo'
            original_schema = handler.resolve_ref_schema(field_schema)
            original_schema['title'] = 'Bar'
            return field_schema

    class Outer(BaseModel):
        inner: Annotated[Inner, Marker()]

    # insert_assert(Outer.model_json_schema())
    assert Outer.model_json_schema() == {
        '$defs': {
            'Inner': {
                'properties': {'x': {'title': 'X', 'type': 'integer'}},
                'required': ['x'],
                'title': 'Bar',
                'type': 'object',
            }
        },
        'properties': {'inner': {'$ref': '#/$defs/Inner', 'title': 'Foo'}},
        'required': ['inner'],
        'title': 'Outer',
        'type': 'object',
    }


def test_examples_annotation() -> None:
    ListWithExamples = Annotated[
        List[float],
        Examples([[1, 1, 2, 3, 5], [1, 2, 3]]),
    ]

    ta = TypeAdapter(ListWithExamples)

    assert ta.json_schema() == {
        'examples': [[1, 1, 2, 3, 5], [1, 2, 3]],
        'items': {'type': 'number'},
        'type': 'array',
    }

    ListWithExtraExample = Annotated[
        ListWithExamples,
        Examples([[3.14, 2.71]]),
    ]

    ta = TypeAdapter(ListWithExtraExample)

    assert ta.json_schema() == {
        'examples': [[1, 1, 2, 3, 5], [1, 2, 3], [3.14, 2.71]],
        'items': {'type': 'number'},
        'type': 'array',
    }


@pytest.mark.skip_json_schema_validation(reason='Uses old examples format, planned for removal in v3.0.')
def test_examples_annotation_dict() -> None:
    with pytest.warns(PydanticDeprecatedSince29):
        ListWithExamples = Annotated[
            List[float],
            Examples({'Fibonacci': [1, 1, 2, 3, 5]}),
        ]

    ta = TypeAdapter(ListWithExamples)

    # insert_assert(ta.json_schema())
    assert ta.json_schema() == {
        'examples': {'Fibonacci': [1, 1, 2, 3, 5]},
        'items': {'type': 'number'},
        'type': 'array',
    }

    with pytest.warns(PydanticDeprecatedSince29):
        ListWithMoreExamples = Annotated[
            ListWithExamples,
            Examples(
                {
                    'Constants': [
                        3.14,
                        2.71,
                    ]
                }
            ),
        ]

    ta = TypeAdapter(ListWithMoreExamples)

    assert ta.json_schema() == {
        'examples': {'Constants': [3.14, 2.71], 'Fibonacci': [1, 1, 2, 3, 5]},
        'items': {'type': 'number'},
        'type': 'array',
    }


def test_examples_mixed_types() -> None:
    with pytest.warns(PydanticDeprecatedSince29):
        ListThenDict = Annotated[
            int,
            Examples([1, 2]),
            Examples({'some_example': [3, 4]}),
        ]
        DictThenList = Annotated[
            int,
            Examples({'some_example': [3, 4]}),
            Examples([1, 2]),
        ]

    list_then_dict_ta = TypeAdapter(ListThenDict)
    dict_then_list_ta = TypeAdapter(DictThenList)

    with pytest.warns(
        UserWarning,
        match=re.escape('Updating existing JSON Schema examples of type list with examples of type dict.'),
    ):
        assert list_then_dict_ta.json_schema() == {
            'examples': [1, 2, 3, 4],
            'type': 'integer',
        }

    with pytest.warns(
        UserWarning,
        match=re.escape('Updating existing JSON Schema examples of type dict with examples of type list.'),
    ):
        assert dict_then_list_ta.json_schema() == {
            'examples': [3, 4, 1, 2],
            'type': 'integer',
        }


def test_skip_json_schema_annotation() -> None:
    class Model(BaseModel):
        x: Union[int, SkipJsonSchema[None]] = None
        y: Union[int, SkipJsonSchema[None]] = 1
        z: Union[int, SkipJsonSchema[str]] = 'foo'

    assert Model(y=None).y is None

    # insert_assert(Model.model_json_schema())
    assert Model.model_json_schema() == {
        'properties': {
            'x': {'default': None, 'title': 'X', 'type': 'integer'},
            'y': {'default': 1, 'title': 'Y', 'type': 'integer'},
            'z': {'default': 'foo', 'title': 'Z', 'type': 'integer'},
        },
        'title': 'Model',
        'type': 'object',
    }


def test_skip_json_schema_exclude_default():
    class Model(BaseModel):
        x: Union[int, SkipJsonSchema[None]] = Field(default=None, json_schema_extra=lambda s: s.pop('default'))

    assert Model().x is None

    # insert_assert(Model.model_json_schema())
    assert Model.model_json_schema() == {
        'properties': {
            'x': {'title': 'X', 'type': 'integer'},
        },
        'title': 'Model',
        'type': 'object',
    }


def test_typeddict_field_required_missing() -> None:
    """https://github.com/pydantic/pydantic/issues/6192"""

    class CustomType:
        def __init__(self, data: Dict[str, int]) -> None:
            self.data = data

        @classmethod
        def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema:
            data_schema = core_schema.typed_dict_schema(
                {
                    'subunits': core_schema.typed_dict_field(
                        core_schema.int_schema(),
                    ),
                }
            )
            return core_schema.no_info_after_validator_function(cls, data_schema)

    class Model(BaseModel):
        t: CustomType

    m = Model(t={'subunits': 123})
    assert type(m.t) is CustomType
    assert m.t.data == {'subunits': 123}

    with pytest.raises(ValidationError) as exc_info:
        Model(t={'subunits': 'abc'})

    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_parsing',
            'loc': ('t', 'subunits'),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'abc',
        }
    ]


def test_json_schema_keys_sorting() -> None:
    """We sort all keys except those under a 'property' parent key"""

    class Model(BaseModel):
        b: int
        a: str

    class OuterModel(BaseModel):
        inner: List[Model] = Field(default=[Model(b=1, a='fruit')])

    # verify the schema contents
    # this is just to get a nicer error message / diff if it fails
    expected = {
        '$defs': {
            'Model': {
                'properties': {'b': {'title': 'B', 'type': 'integer'}, 'a': {'title': 'A', 'type': 'string'}},
                'required': ['b', 'a'],
                'title': 'Model',
                'type': 'object',
            }
        },
        'properties': {
            'inner': {
                'default': [{'b': 1, 'a': 'fruit'}],
                'items': {'$ref': '#/$defs/Model'},
                'title': 'Inner',
                'type': 'array',
            }
        },
        'title': 'OuterModel',
        'type': 'object',
    }
    actual = OuterModel.model_json_schema()
    assert actual == expected

    # verify order
    # dumping to json just happens to be a simple way to verify the order
    assert json.dumps(actual, indent=2) == json.dumps(expected, indent=2)


def test_custom_type_gets_unpacked_ref() -> None:
    class Annotation:
        def __get_pydantic_json_schema__(
            self, schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler
        ) -> JsonSchemaValue:
            json_schema = handler(schema)
            json_schema['title'] = 'Set from annotation'
            return json_schema

    class Model(BaseModel):
        x: int

        @classmethod
        def __get_pydantic_json_schema__(
            cls, schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler
        ) -> JsonSchemaValue:
            json_schema = handler(schema)
            return json_schema

    ta = TypeAdapter(Annotated[Model, Annotation()])

    # insert_assert(ta.json_schema())
    assert ta.json_schema() == {
        '$defs': {
            'Model': {
                'properties': {'x': {'title': 'X', 'type': 'integer'}},
                'required': ['x'],
                'title': 'Model',
                'type': 'object',
            }
        },
        '$ref': '#/$defs/Model',
        'title': 'Set from annotation',
    }


@pytest.mark.parametrize(
    'annotation, expected',
    [
        (Annotated[int, Field(json_schema_extra={'title': 'abc'})], {'type': 'integer', 'title': 'abc'}),
        (
            Annotated[int, Field(title='abc'), Field(description='xyz')],
            {'type': 'integer', 'title': 'abc', 'description': 'xyz'},
        ),
        (Annotated[int, Field(gt=0)], {'type': 'integer', 'exclusiveMinimum': 0}),
        (
            Annotated[int, Field(gt=0), Field(lt=100)],
            {'type': 'integer', 'exclusiveMinimum': 0, 'exclusiveMaximum': 100},
        ),
        (Annotated[int, Field(examples=[1])], {'type': 'integer', 'examples': [1]}),
    ],
    ids=repr,
)
def test_field_json_schema_metadata(annotation: Type[Any], expected: JsonSchemaValue) -> None:
    ta = TypeAdapter(annotation)
    assert ta.json_schema() == expected


def test_multiple_models_with_same_qualname():
    from pydantic import create_model

    model_a1 = create_model(
        'A',
        inner_a1=(str, ...),
    )
    model_a2 = create_model(
        'A',
        inner_a2=(str, ...),
    )
    model_c = create_model(
        'B',
        outer_a1=(model_a1, ...),
        outer_a2=(model_a2, ...),
    )

    # insert_assert(model_c.model_json_schema())
    assert model_c.model_json_schema() == {
        '$defs': {
            'tests__test_json_schema__A__1': {
                'properties': {'inner_a1': {'title': 'Inner A1', 'type': 'string'}},
                'required': ['inner_a1'],
                'title': 'A',
                'type': 'object',
            },
            'tests__test_json_schema__A__2': {
                'properties': {'inner_a2': {'title': 'Inner A2', 'type': 'string'}},
                'required': ['inner_a2'],
                'title': 'A',
                'type': 'object',
            },
        },
        'properties': {
            'outer_a1': {'$ref': '#/$defs/tests__test_json_schema__A__1'},
            'outer_a2': {'$ref': '#/$defs/tests__test_json_schema__A__2'},
        },
        'required': ['outer_a1', 'outer_a2'],
        'title': 'B',
        'type': 'object',
    }


def test_generate_definitions_for_no_ref_schemas():
    decimal_schema = TypeAdapter(Decimal).core_schema

    class Model(BaseModel):
        pass

    result = GenerateJsonSchema().generate_definitions(
        [
            ('Decimal', 'validation', decimal_schema),
            ('Decimal', 'serialization', decimal_schema),
            ('Model', 'validation', Model.__pydantic_core_schema__),
        ]
    )
    assert result == (
        {
            ('Decimal', 'serialization'): {'type': 'string'},
            ('Decimal', 'validation'): {'anyOf': [{'type': 'number'}, {'type': 'string'}]},
            ('Model', 'validation'): {'$ref': '#/$defs/Model'},
        },
        {'Model': {'properties': {}, 'title': 'Model', 'type': 'object'}},
    )


def test_chain_schema():
    # this is a contrived schema which requires a string input that can be coerced to an int:
    s = core_schema.chain_schema([core_schema.str_schema(), core_schema.int_schema()])
    assert SchemaValidator(s).validate_python('1') == 1  # proof it works this way

    assert GenerateJsonSchema().generate(s, mode='validation') == {'type': 'string'}
    assert GenerateJsonSchema().generate(s, mode='serialization') == {'type': 'integer'}


def test_deferred_json_schema():
    class Foo(BaseModel):
        x: 'Bar'

    with pytest.raises(PydanticUserError, match='`Foo` is not fully defined'):
        Foo.model_json_schema()

    class Bar(BaseModel):
        pass

    Foo.model_rebuild()
    assert Foo.model_json_schema() == {
        '$defs': {'Bar': {'properties': {}, 'title': 'Bar', 'type': 'object'}},
        'properties': {'x': {'$ref': '#/$defs/Bar'}},
        'required': ['x'],
        'title': 'Foo',
        'type': 'object',
    }


def test_dollar_ref_alias():
    class MyModel(BaseModel):
        my_field: str = Field(alias='$ref')

    assert MyModel.model_json_schema() == {
        'properties': {'$ref': {'title': '$Ref', 'type': 'string'}},
        'required': ['$ref'],
        'title': 'MyModel',
        'type': 'object',
    }


def test_multiple_parametrization_of_generic_model() -> None:
    """https://github.com/pydantic/pydantic/issues/6708"""
    T = TypeVar('T')

    calls = 0

    class Inner(BaseModel):
        a: int

        @classmethod
        def __get_pydantic_json_schema__(
            cls, core_schema: CoreSchema, handler: GetJsonSchemaHandler
        ) -> JsonSchemaValue:
            nonlocal calls
            calls += 1
            json_schema = handler(core_schema)
            return json_schema

    class Outer(BaseModel, Generic[T]):
        b: Optional[T]

    class ModelTest(BaseModel):
        c: Outer[Inner]

    for _ in range(sys.getrecursionlimit() + 1):

        class ModelTest(BaseModel):
            c: Outer[Inner]

    ModelTest.model_json_schema()

    # this is not necessarily a promise we make
    # (in fact, we've had bugs in the past where this was not the case and we'd
    # call the __get_pydantic_json_schema__ method multiple times)
    # but it's much easier to test for than absence of a recursion limit
    assert calls == 1


def test_callable_json_schema_extra():
    def pop_default(s):
        s.pop('default')

    class Model(BaseModel):
        a: int = Field(default=1, json_schema_extra=pop_default)
        b: Annotated[int, Field(default=2), Field(json_schema_extra=pop_default)]
        c: Annotated[int, Field(default=3)] = Field(json_schema_extra=pop_default)

    assert Model().model_dump() == {'a': 1, 'b': 2, 'c': 3}
    assert Model(a=11, b=12, c=13).model_dump() == {
        'a': 11,
        'b': 12,
        'c': 13,
    }
    json_schema = Model.model_json_schema()
    for key in 'abc':
        assert json_schema['properties'][key] == {'title': key.upper(), 'type': 'integer'}  # default is not present


def test_callable_json_schema_extra_dataclass():
    def pop_default(s):
        s.pop('default')

    @pydantic.dataclasses.dataclass
    class MyDataclass:
        # Note that a and b here have to come first since dataclasses requires annotation-only fields to come before
        # fields with defaults (for similar reasons to why function arguments with defaults must come later)
        # But otherwise, everything seems to work properly
        a: Annotated[int, Field(json_schema_extra=pop_default), Field(default=1)]
        b: Annotated[int, Field(default=2), Field(json_schema_extra=pop_default)]
        c: int = Field(default=3, json_schema_extra=pop_default)
        d: Annotated[int, Field(json_schema_extra=pop_default)] = 4
        e: Annotated[int, Field(json_schema_extra=pop_default)] = Field(default=5)
        f: Annotated[int, Field(default=6)] = Field(json_schema_extra=pop_default)

    adapter = TypeAdapter(MyDataclass)

    assert adapter.dump_python(MyDataclass()) == {'a': 1, 'b': 2, 'c': 3, 'd': 4, 'e': 5, 'f': 6}
    assert adapter.dump_python(MyDataclass(a=11, b=12, c=13, d=14, e=15, f=16)) == {
        'a': 11,
        'b': 12,
        'c': 13,
        'd': 14,
        'e': 15,
        'f': 16,
    }

    json_schema = adapter.json_schema()
    for key in 'abcdef':
        assert json_schema['properties'][key] == {'title': key.upper(), 'type': 'integer'}  # default is not present


def test_model_rebuild_happens_even_with_parent_classes(create_module):
    module = create_module(
        # language=Python
        """
from __future__ import annotations
from pydantic import BaseModel


class MyBaseModel(BaseModel):
    pass


class B(MyBaseModel):
    b: A


class A(MyBaseModel):
    a: str
"""
    )
    assert module.B.model_json_schema() == {
        '$defs': {
            'A': {
                'properties': {'a': {'title': 'A', 'type': 'string'}},
                'required': ['a'],
                'title': 'A',
                'type': 'object',
            }
        },
        'properties': {'b': {'$ref': '#/$defs/A'}},
        'required': ['b'],
        'title': 'B',
        'type': 'object',
    }


def test_enum_complex_value() -> None:
    """https://github.com/pydantic/pydantic/issues/7045"""

    class MyEnum(Enum):
        foo = (1, 2)
        bar = (2, 3)

    ta = TypeAdapter(MyEnum)

    # insert_assert(ta.json_schema())
    assert ta.json_schema() == {'enum': [[1, 2], [2, 3]], 'title': 'MyEnum', 'type': 'array'}


def test_json_schema_serialization_defaults_required():
    class Model(BaseModel):
        a: str = 'a'

    class SerializationDefaultsRequiredModel(Model):
        model_config = ConfigDict(json_schema_serialization_defaults_required=True)

    model_schema = Model.model_json_schema(mode='serialization')
    sdr_model_schema = SerializationDefaultsRequiredModel.model_json_schema(mode='serialization')

    assert 'required' not in model_schema
    assert sdr_model_schema['required'] == ['a']


def test_json_schema_mode_override():
    class Model(BaseModel):
        a: Json[int]  # requires a string to validate, but will dump an int

    class ValidationModel(Model):
        model_config = ConfigDict(json_schema_mode_override='validation', title='Model')

    class SerializationModel(Model):
        model_config = ConfigDict(json_schema_mode_override='serialization', title='Model')

    # Ensure the ValidationModel and SerializationModel schemas do not depend on the value of the mode
    assert ValidationModel.model_json_schema(mode='validation') == ValidationModel.model_json_schema(
        mode='serialization'
    )
    assert SerializationModel.model_json_schema(mode='validation') == SerializationModel.model_json_schema(
        mode='serialization'
    )

    # Ensure the two submodels have different JSON schemas
    assert ValidationModel.model_json_schema() != SerializationModel.model_json_schema()

    # Ensure the submodels' JSON schemas match the expected mode even when the opposite value is specified:
    assert ValidationModel.model_json_schema(mode='serialization') == Model.model_json_schema(mode='validation')
    assert SerializationModel.model_json_schema(mode='validation') == Model.model_json_schema(mode='serialization')


def test_models_json_schema_generics() -> None:
    class G(BaseModel, Generic[T]):
        foo: T

    class M(BaseModel):
        foo: Literal['a', 'b']

    GLiteral = G[Literal['a', 'b']]

    assert models_json_schema(
        [
            (GLiteral, 'serialization'),
            (GLiteral, 'validation'),
            (M, 'validation'),
        ]
    ) == (
        {
            (GLiteral, 'serialization'): {'$ref': '#/$defs/G_Literal__a____b___'},
            (GLiteral, 'validation'): {'$ref': '#/$defs/G_Literal__a____b___'},
            (M, 'validation'): {'$ref': '#/$defs/M'},
        },
        {
            '$defs': {
                'G_Literal__a____b___': {
                    'properties': {'foo': {'enum': ['a', 'b'], 'title': 'Foo', 'type': 'string'}},
                    'required': ['foo'],
                    'title': "G[Literal['a', 'b']]",
                    'type': 'object',
                },
                'M': {
                    'properties': {'foo': {'enum': ['a', 'b'], 'title': 'Foo', 'type': 'string'}},
                    'required': ['foo'],
                    'title': 'M',
                    'type': 'object',
                },
            }
        },
    )


def test_recursive_non_generic_model() -> None:
    class Foo(BaseModel):
        maybe_bar: Union[None, 'Bar']

    class Bar(BaseModel):
        foo: Foo

    # insert_assert(Bar(foo=Foo(maybe_bar=None)).model_dump())
    assert Bar.model_validate({'foo': {'maybe_bar': None}}).model_dump() == {'foo': {'maybe_bar': None}}
    # insert_assert(Bar.model_json_schema())
    assert Bar.model_json_schema() == {
        '$defs': {
            'Bar': {
                'properties': {'foo': {'$ref': '#/$defs/Foo'}},
                'required': ['foo'],
                'title': 'Bar',
                'type': 'object',
            },
            'Foo': {
                'properties': {'maybe_bar': {'anyOf': [{'$ref': '#/$defs/Bar'}, {'type': 'null'}]}},
                'required': ['maybe_bar'],
                'title': 'Foo',
                'type': 'object',
            },
        },
        '$ref': '#/$defs/Bar',
    }


def test_module_with_colon_in_name(create_module):
    module = create_module(
        # language=Python
        """
from pydantic import BaseModel


class Foo(BaseModel):
    x: int
""",
        module_name_prefix='C:\\',
    )
    foo_model = module.Foo
    _, v_schema = models_json_schema([(foo_model, 'validation')])
    assert v_schema == {
        '$defs': {
            'Foo': {
                'properties': {'x': {'title': 'X', 'type': 'integer'}},
                'required': ['x'],
                'title': 'Foo',
                'type': 'object',
            }
        }
    }


def test_repeated_custom_type():
    class Numeric(pydantic.BaseModel):
        value: float

        @classmethod
        def __get_pydantic_core_schema__(cls, source_type: Any, handler: pydantic.GetCoreSchemaHandler) -> CoreSchema:
            return core_schema.no_info_before_validator_function(cls.validate, handler(source_type))

        @classmethod
        def validate(cls, v: Any) -> Union[Dict[str, Any], Self]:
            if isinstance(v, (str, float, int)):
                return cls(value=v)
            if isinstance(v, Numeric):
                return v
            if isinstance(v, dict):
                return v
            raise ValueError(f'Invalid value for {cls}: {v}')

    def is_positive(value: Numeric):
        assert value.value > 0.0, 'Must be positive'

    class OuterModel(pydantic.BaseModel):
        x: Numeric
        y: Numeric
        z: Annotated[Numeric, AfterValidator(is_positive)]

    assert OuterModel(x=2, y=-1, z=1)

    with pytest.raises(ValidationError):
        OuterModel(x=2, y=-1, z=-1)


def test_description_not_included_for_basemodel() -> None:
    class Model(BaseModel):
        x: BaseModel

    assert 'description' not in Model.model_json_schema()['$defs']['BaseModel']


def test_recursive_json_schema_build() -> None:
    """
    Schema build for this case is a bit complicated due to the recursive nature of the models.
    This was reported as broken in https://github.com/pydantic/pydantic/issues/8689, which was
    originally caused by the change made in https://github.com/pydantic/pydantic/pull/8583, which has
    since been reverted.
    """

    class AllowedValues(str, Enum):
        VAL1 = 'Val1'
        VAL2 = 'Val2'

    class ModelA(BaseModel):
        modelA_1: AllowedValues = Field(max_length=60)

    class ModelB(ModelA):
        modelB_1: typing.List[ModelA]

    class ModelC(BaseModel):
        modelC_1: ModelB

    class Model(BaseModel):
        b: ModelB
        c: ModelC

    assert Model.model_json_schema()


def test_json_schema_annotated_with_field() -> None:
    """Ensure a field specified with Annotated in a create_model call is still marked as required."""
    from pydantic import create_model

    Model = create_model(
        'test_model',
        bar=(Annotated[int, Field(description='Bar description')], ...),
    )

    assert Model.model_json_schema() == {
        'properties': {
            'bar': {'description': 'Bar description', 'title': 'Bar', 'type': 'integer'},
        },
        'required': ['bar'],
        'title': 'test_model',
        'type': 'object',
    }


def test_required_fields_in_annotated_with_create_model() -> None:
    """Ensure multiple fields specified with Annotated in a create_model call are still marked as required."""
    from pydantic import create_model

    Model = create_model(
        'test_model',
        foo=(int, ...),
        bar=(Annotated[int, Field(description='Bar description')], ...),
        baz=(Annotated[int, Field(description='Baz description')], ...),
    )

    assert Model.model_json_schema() == {
        'properties': {
            'foo': {'title': 'Foo', 'type': 'integer'},
            'bar': {'description': 'Bar description', 'title': 'Bar', 'type': 'integer'},
            'baz': {'description': 'Baz description', 'title': 'Baz', 'type': 'integer'},
        },
        'required': ['foo', 'bar', 'baz'],
        'title': 'test_model',
        'type': 'object',
    }


def test_required_fields_in_annotated_with_basemodel() -> None:
    """
    Ensure multiple fields specified with Annotated in a BaseModel are marked as required.
    """

    class Model(BaseModel):
        a: int = ...
        b: Annotated[int, 'placeholder'] = ...
        c: Annotated[int, Field()] = ...

    assert Model.model_fields['a'].is_required()
    assert Model.model_fields['b'].is_required()
    assert Model.model_fields['c'].is_required()


@pytest.mark.parametrize(
    'field_type,default_value,expected_schema',
    [
        (
            IPvAnyAddress,
            IPv4Address('127.0.0.1'),
            {
                'properties': {
                    'field': {'default': '127.0.0.1', 'format': 'ipvanyaddress', 'title': 'Field', 'type': 'string'}
                },
                'title': 'Model',
                'type': 'object',
            },
        ),
        (
            IPvAnyAddress,
            IPv6Address('::1'),
            {
                'properties': {
                    'field': {'default': '::1', 'format': 'ipvanyaddress', 'title': 'Field', 'type': 'string'}
                },
                'title': 'Model',
                'type': 'object',
            },
        ),
    ],
)
def test_default_value_encoding(field_type, default_value, expected_schema):
    class Model(BaseModel):
        field: field_type = default_value

    schema = Model.model_json_schema()
    assert schema == expected_schema


def _generate_deprecated_classes():
    @deprecated('MyModel is deprecated')
    class MyModel(BaseModel):
        pass

    @deprecated('MyPydanticDataclass is deprecated')
    @pydantic.dataclasses.dataclass
    class MyPydanticDataclass:
        pass

    @deprecated('MyBuiltinDataclass is deprecated')
    @dataclasses.dataclass
    class MyBuiltinDataclass:
        pass

    @deprecated('MyTypedDict is deprecated')
    class MyTypedDict(TypedDict):
        pass

    return [
        pytest.param(MyModel, id='BaseModel'),
        pytest.param(MyPydanticDataclass, id='pydantic-dataclass'),
        pytest.param(MyBuiltinDataclass, id='builtin-dataclass'),
        pytest.param(MyTypedDict, id='TypedDict'),
    ]


@pytest.mark.skipif(
    Version(importlib.metadata.version('typing_extensions')) < Version('4.9'),
    reason='`deprecated` type annotation requires typing_extensions>=4.9',
)
@pytest.mark.parametrize('cls', _generate_deprecated_classes())
def test_deprecated_classes_json_schema(cls):
    assert hasattr(cls, '__deprecated__')
    assert TypeAdapter(cls).json_schema()['deprecated']


@pytest.mark.skipif(
    Version(importlib.metadata.version('typing_extensions')) < Version('4.9'),
    reason='`deprecated` type annotation requires typing_extensions>=4.9',
)
@pytest.mark.parametrize('cls', _generate_deprecated_classes())
def test_deprecated_subclasses_json_schema(cls):
    class Model(BaseModel):
        subclass: cls

    assert Model.model_json_schema() == {
        '$defs': {cls.__name__: {'deprecated': True, 'properties': {}, 'title': f'{cls.__name__}', 'type': 'object'}},
        'properties': {'subclass': {'$ref': f'#/$defs/{cls.__name__}'}},
        'required': ['subclass'],
        'title': 'Model',
        'type': 'object',
    }


@pytest.mark.skipif(
    Version(importlib.metadata.version('typing_extensions')) < Version('4.9'),
    reason='`deprecated` type annotation requires typing_extensions>=4.9',
)
@pytest.mark.parametrize('cls', _generate_deprecated_classes())
def test_deprecated_class_usage_warns(cls):
    if issubclass(cls, dict):
        pytest.skip('TypedDict does not generate a DeprecationWarning on usage')

    with pytest.warns(DeprecationWarning, match=f'{cls.__name__} is deprecated'):
        cls()


@dataclasses.dataclass
class BuiltinDataclassParent:
    name: str


@pydantic.dataclasses.dataclass
class PydanticDataclassParent:
    name: str


class TypedDictParent(TypedDict):
    name: str


class ModelParent(BaseModel):
    name: str


@pytest.mark.parametrize(
    'pydantic_type,expected_json_schema',
    [
        pytest.param(
            BuiltinDataclassParent,
            {
                '$defs': {
                    'BuiltinDataclassParent': {
                        'properties': {'name': {'title': 'Name', 'type': 'string'}},
                        'required': ['name'],
                        'title': 'BuiltinDataclassParent',
                        'type': 'object',
                    }
                },
                'properties': {'parent': {'$ref': '#/$defs/BuiltinDataclassParent', 'default': {'name': 'Jon Doe'}}},
                'title': 'child',
                'type': 'object',
            },
            id='builtin-dataclass',
        ),
        pytest.param(
            PydanticDataclassParent,
            {
                '$defs': {
                    'PydanticDataclassParent': {
                        'properties': {'name': {'title': 'Name', 'type': 'string'}},
                        'required': ['name'],
                        'title': 'PydanticDataclassParent',
                        'type': 'object',
                    }
                },
                'properties': {'parent': {'$ref': '#/$defs/PydanticDataclassParent', 'default': {'name': 'Jon Doe'}}},
                'title': 'child',
                'type': 'object',
            },
            id='pydantic-dataclass',
        ),
        pytest.param(
            TypedDictParent,
            {
                '$defs': {
                    'TypedDictParent': {
'properties': {'name': {'title': 'Name', 'type': 'string'}}, 'required': ['name'], 'title': 'TypedDictParent', 'type': 'object', } }, 'properties': {'parent': {'$ref': '#/$defs/TypedDictParent', 'default': {'name': 'Jon Doe'}}}, 'title': 'child', 'type': 'object', }, id='typeddict', ), pytest.param( ModelParent, { '$defs': { 'ModelParent': { 'properties': {'name': {'title': 'Name', 'type': 'string'}}, 'required': ['name'], 'title': 'ModelParent', 'type': 'object', } }, 'properties': {'parent': {'$ref': '#/$defs/ModelParent', 'default': {'name': 'Jon Doe'}}}, 'title': 'child', 'type': 'object', }, id='model', ), ], ) def test_pydantic_types_as_default_values(pydantic_type, expected_json_schema): class Child(BaseModel): model_config = ConfigDict(title='child') parent: pydantic_type = pydantic_type(name='Jon Doe') assert Child.model_json_schema() == expected_json_schema def test_str_schema_with_pattern() -> None: assert TypeAdapter(Annotated[str, Field(pattern='abc')]).json_schema() == {'type': 'string', 'pattern': 'abc'} assert TypeAdapter(Annotated[str, Field(pattern=re.compile('abc'))]).json_schema() == { 'type': 'string', 'pattern': 'abc', } def test_plain_serializer_applies_to_default() -> None: class Model(BaseModel): custom_str: Annotated[str, PlainSerializer(lambda x: f'serialized-{x}', return_type=str)] = 'foo' assert Model.model_json_schema(mode='validation') == { 'properties': {'custom_str': {'default': 'foo', 'title': 'Custom Str', 'type': 'string'}}, 'title': 'Model', 'type': 'object', } assert Model.model_json_schema(mode='serialization') == { 'properties': {'custom_str': {'default': 'serialized-foo', 'title': 'Custom Str', 'type': 'string'}}, 'title': 'Model', 'type': 'object', } def test_plain_serializer_does_not_apply_with_unless_none() -> None: """Test plain serializers aren't used to compute the JSON Schema default if mode is 'json-unless-none' and default value is `None`.""" class Model(BaseModel): custom_decimal_json_unless_none: Annotated[ 
Optional[Decimal], PlainSerializer(lambda x: float(x), when_used='json-unless-none', return_type=float) ] = None custom_decimal_unless_none: Annotated[ Optional[Decimal], PlainSerializer(lambda x: float(x), when_used='unless-none', return_type=float) ] = None assert Model.model_json_schema(mode='serialization') == { 'properties': { 'custom_decimal_json_unless_none': { 'anyOf': [{'type': 'null'}, {'type': 'number'}], 'default': None, 'title': 'Custom Decimal Json Unless None', }, 'custom_decimal_unless_none': { 'anyOf': [{'type': 'null'}, {'type': 'number'}], 'default': None, 'title': 'Custom Decimal Unless None', }, }, 'title': 'Model', 'type': 'object', } def test_merge_json_schema_extra_from_field_infos() -> None: class Model(BaseModel): f: Annotated[str, Field(json_schema_extra={'a': 1, 'b': 2})] f_with_overwrite: Annotated[str, Field('bar', json_schema_extra={'a': 1}), Field(json_schema_extra={'a': 2})] f_with_additional: Annotated[str, Field('bar', json_schema_extra={'a': 1}), Field(json_schema_extra={'b': 2})] # insert_assert(Model.model_json_schema()) assert Model.model_json_schema() == { 'properties': { 'f': {'a': 1, 'b': 2, 'title': 'F', 'type': 'string'}, 'f_with_overwrite': { 'a': 2, 'default': 'bar', 'title': 'F With Overwrite', 'type': 'string', }, 'f_with_additional': { 'a': 1, 'b': 2, 'default': 'bar', 'title': 'F With Additional', 'type': 'string', }, }, 'required': ['f'], 'title': 'Model', 'type': 'object', } def test_ta_and_bm_same_json_schema() -> None: MyStr = Annotated[str, Field(json_schema_extra={'key1': 'value1'}), Field(json_schema_extra={'key2': 'value2'})] class Foo(BaseModel): v: MyStr ta_json_schema = TypeAdapter(MyStr).json_schema() bm_json_schema = Foo.model_json_schema()['properties']['v'] bm_json_schema.pop('title') assert ta_json_schema == bm_json_schema def test_min_and_max_in_schema() -> None: TSeq = TypeAdapter(Annotated[Sequence[int], Field(min_length=2, max_length=5)]) assert TSeq.json_schema() == {'items': {'type': 
'integer'}, 'maxItems': 5, 'minItems': 2, 'type': 'array'} def test_plain_field_validator_serialization() -> None: """`PlainValidator` internally creates a wrap ser. schema. This tests that we can still generate a JSON Schema in `'serialization'` mode. """ class Foo(BaseModel): a: Annotated[int, PlainValidator(lambda x: x)] assert Foo.model_json_schema(mode='serialization') == { 'properties': {'a': {'title': 'A', 'type': 'integer'}}, 'required': ['a'], 'title': 'Foo', 'type': 'object', } def test_annotated_field_validator_input_type() -> None: class Model(BaseModel): # `json_schema_input_type` defaults to `PydanticUndefined`, so `int` will be used to generate the JSON Schema: a: Annotated[int, BeforeValidator(lambda v: v)] b: Annotated[int, WrapValidator(lambda v, h: h(v))] # `json_schema_input_type` defaults to `Any`: c: Annotated[int, PlainValidator(lambda v: v)] d: Annotated[int, BeforeValidator(lambda v: v, json_schema_input_type=Union[int, str])] e: Annotated[int, WrapValidator(lambda v, h: h(v), json_schema_input_type=Union[int, str])] f: Annotated[int, PlainValidator(lambda v: v, json_schema_input_type=Union[int, str])] assert Model.model_json_schema(mode='validation')['properties'] == { 'a': {'type': 'integer', 'title': 'A'}, 'b': {'type': 'integer', 'title': 'B'}, 'c': {'title': 'C'}, 'd': {'anyOf': [{'type': 'integer'}, {'type': 'string'}], 'title': 'D'}, 'e': {'anyOf': [{'type': 'integer'}, {'type': 'string'}], 'title': 'E'}, 'f': {'anyOf': [{'type': 'integer'}, {'type': 'string'}], 'title': 'F'}, } assert Model.model_json_schema(mode='serialization')['properties'] == { 'a': {'title': 'A', 'type': 'integer'}, 'b': {'title': 'B', 'type': 'integer'}, 'c': {'title': 'C', 'type': 'integer'}, 'd': {'title': 'D', 'type': 'integer'}, 'e': {'title': 'E', 'type': 'integer'}, 'f': {'title': 'F', 'type': 'integer'}, } def test_decorator_field_validator_input_type() -> None: class Model(BaseModel): a: int b: int c: int d: int e: int f: int @field_validator('a', 
mode='before') @classmethod def validate_a(cls, value: Any) -> int: ... @field_validator('b', mode='wrap') @classmethod def validate_b(cls, value: Any, handler: ValidatorFunctionWrapHandler) -> int: ... @field_validator('c', mode='plain') @classmethod def validate_c(cls, value: Any) -> int: ... @field_validator('d', mode='before', json_schema_input_type=Union[int, str]) @classmethod def validate_d(cls, value: Any) -> int: ... @field_validator('e', mode='wrap', json_schema_input_type=Union[int, str]) @classmethod def validate_e(cls, value: Any, handler: ValidatorFunctionWrapHandler) -> int: ... @field_validator('f', mode='plain', json_schema_input_type=Union[int, str]) @classmethod def validate_f(cls, value: Any) -> int: ... assert Model.model_json_schema(mode='validation')['properties'] == { 'a': {'type': 'integer', 'title': 'A'}, 'b': {'type': 'integer', 'title': 'B'}, 'c': {'title': 'C'}, 'd': {'anyOf': [{'type': 'integer'}, {'type': 'string'}], 'title': 'D'}, 'e': {'anyOf': [{'type': 'integer'}, {'type': 'string'}], 'title': 'E'}, 'f': {'anyOf': [{'type': 'integer'}, {'type': 'string'}], 'title': 'F'}, } assert Model.model_json_schema(mode='serialization')['properties'] == { 'a': {'title': 'A', 'type': 'integer'}, 'b': {'title': 'B', 'type': 'integer'}, 'c': {'title': 'C', 'type': 'integer'}, 'd': {'title': 'D', 'type': 'integer'}, 'e': {'title': 'E', 'type': 'integer'}, 'f': {'title': 'F', 'type': 'integer'}, } @pytest.mark.parametrize( 'validator', [ PlainValidator(lambda v: v, json_schema_input_type='Union[Sub1, Sub2]'), BeforeValidator(lambda v: v, json_schema_input_type='Union[Sub1, Sub2]'), WrapValidator(lambda v, h: h(v), json_schema_input_type='Union[Sub1, Sub2]'), ], ) def test_json_schema_input_type_with_refs(validator) -> None: """Test that `'definition-ref` schemas for `json_schema_input_type` are supported. See: https://github.com/pydantic/pydantic/issues/10434. 
See: https://github.com/pydantic/pydantic/issues/11033 """ class Sub1(BaseModel): pass class Sub2(BaseModel): pass class Model(BaseModel): sub: Annotated[ Union[Sub1, Sub2], PlainSerializer(lambda v: v, return_type=Union[Sub1, Sub2]), validator, ] json_schema = Model.model_json_schema() assert 'Sub1' in json_schema['$defs'] assert 'Sub2' in json_schema['$defs'] assert json_schema['properties']['sub'] == { 'anyOf': [{'$ref': '#/$defs/Sub1'}, {'$ref': '#/$defs/Sub2'}], 'title': 'Sub', } @pytest.mark.parametrize( 'validator', [ PlainValidator(lambda v: v, json_schema_input_type='Model'), BeforeValidator(lambda v: v, json_schema_input_type='Model'), WrapValidator(lambda v, h: h(v), json_schema_input_type='Model'), ], ) def test_json_schema_input_type_with_recursive_refs(validator) -> None: """Test that recursive `'definition-ref` schemas for `json_schema_input_type` are not inlined.""" class Model(BaseModel): model: Annotated[ 'Model', PlainSerializer(lambda v: v, return_type='Model'), validator, ] json_schema = Model.model_json_schema() assert 'Model' in json_schema['$defs'] assert json_schema['$ref'] == '#/$defs/Model' def test_title_strip() -> None: class Model(BaseModel): some_field: str = Field(alias='_some_field') assert Model.model_json_schema()['properties']['_some_field']['title'] == 'Some Field' def test_arbitrary_ref_in_json_schema() -> None: """See https://github.com/pydantic/pydantic/issues/9981.""" class Test(BaseModel): x: dict = Field(examples=[{'$ref': '#/components/schemas/Pet'}]) assert Test.model_json_schema() == { 'properties': {'x': {'examples': [{'$ref': '#/components/schemas/Pet'}], 'title': 'X', 'type': 'object'}}, 'required': ['x'], 'title': 'Test', 'type': 'object', } def test_examples_as_property_key() -> None: """https://github.com/pydantic/pydantic/issues/11304. A regression of https://github.com/pydantic/pydantic/issues/9981 (see `test_arbitrary_ref_in_json_schema`). 
""" class Model1(BaseModel): pass class Model(BaseModel): examples: Model1 assert Model.model_json_schema() == { '$defs': {'Model1': {'properties': {}, 'title': 'Model1', 'type': 'object'}}, 'properties': {'examples': {'$ref': '#/$defs/Model1'}}, 'required': ['examples'], 'title': 'Model', 'type': 'object', } def test_warn_on_mixed_compose() -> None: with pytest.warns( PydanticJsonSchemaWarning, match='Composing `dict` and `callable` type `json_schema_extra` is not supported.' ): class Model(BaseModel): field: Annotated[int, Field(json_schema_extra={'a': 'dict'}), Field(json_schema_extra=lambda x: x.pop('a'))] # type: ignore def test_blank_title_is_respected() -> None: class Model(BaseModel): x: int model_config = ConfigDict(title='') assert Model.model_json_schema()['title'] == '' pydantic-2.10.6/tests/test_main.py000066400000000000000000003015301474456633400171340ustar00rootroot00000000000000import json import platform import re import sys import warnings from collections import defaultdict from copy import deepcopy from dataclasses import dataclass from datetime import date, datetime from enum import Enum from functools import partial from typing import ( Any, Callable, ClassVar, Dict, Final, Generic, List, Mapping, Optional, Set, Type, TypeVar, Union, get_type_hints, ) from uuid import UUID, uuid4 import pytest from pydantic_core import CoreSchema, core_schema from typing_extensions import Annotated, Literal from pydantic import ( AfterValidator, BaseModel, ConfigDict, Field, GetCoreSchemaHandler, PrivateAttr, PydanticUndefinedAnnotation, PydanticUserError, SecretStr, StringConstraints, TypeAdapter, ValidationError, ValidationInfo, constr, field_validator, ) from pydantic._internal._generate_schema import GenerateSchema from pydantic._internal._mock_val_ser import MockCoreSchema from pydantic.dataclasses import dataclass as pydantic_dataclass from pydantic.v1 import BaseModel as BaseModelV1 def test_success(): # same as below but defined here so class 
    # definition occurs inside the test
    class Model(BaseModel):
        a: float
        b: int = 10

    m = Model(a=10.2)
    assert m.a == 10.2
    assert m.b == 10


@pytest.fixture(name='UltraSimpleModel', scope='session')
def ultra_simple_model_fixture():
    class UltraSimpleModel(BaseModel):
        a: float
        b: int = 10

    return UltraSimpleModel


def test_ultra_simple_missing(UltraSimpleModel):
    with pytest.raises(ValidationError) as exc_info:
        UltraSimpleModel()
    assert exc_info.value.errors(include_url=False) == [
        {'loc': ('a',), 'msg': 'Field required', 'type': 'missing', 'input': {}}
    ]
    assert str(exc_info.value) == (
        '1 validation error for UltraSimpleModel\n'
        'a\n'
        '  Field required [type=missing, input_value={}, input_type=dict]'
    )


def test_ultra_simple_failed(UltraSimpleModel):
    with pytest.raises(ValidationError) as exc_info:
        UltraSimpleModel(a='x', b='x')
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'float_parsing',
            'loc': ('a',),
            'msg': 'Input should be a valid number, unable to parse string as a number',
            'input': 'x',
        },
        {
            'type': 'int_parsing',
            'loc': ('b',),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'x',
        },
    ]


def test_ultra_simple_repr(UltraSimpleModel):
    m = UltraSimpleModel(a=10.2)
    assert str(m) == 'a=10.2 b=10'
    assert repr(m) == 'UltraSimpleModel(a=10.2, b=10)'
    assert repr(m.model_fields['a']) == 'FieldInfo(annotation=float, required=True)'
    assert repr(m.model_fields['b']) == 'FieldInfo(annotation=int, required=False, default=10)'
    assert dict(m) == {'a': 10.2, 'b': 10}
    assert m.model_dump() == {'a': 10.2, 'b': 10}
    assert m.model_dump_json() == '{"a":10.2,"b":10}'
    assert str(m) == 'a=10.2 b=10'


def test_recursive_repr() -> None:
    class A(BaseModel):
        a: object = None

    class B(BaseModel):
        a: Optional[A] = None

    a = A()
    a.a = a
    b = B(a=a)

    assert re.match(r"B\(a=A\(a='<Recursion on A with id=\d+>'\)\)", repr(b)) is not None


def test_default_factory_field():
    def myfunc():
        return 1

    class Model(BaseModel):
        a: int = Field(default_factory=myfunc)

    m = Model()
    assert str(m) ==
'a=1' assert repr(m.model_fields['a']) == 'FieldInfo(annotation=int, required=False, default_factory=myfunc)' assert dict(m) == {'a': 1} assert m.model_dump_json() == '{"a":1}' def test_comparing(UltraSimpleModel): m = UltraSimpleModel(a=10.2, b='100') assert m.model_dump() == {'a': 10.2, 'b': 100} assert m != {'a': 10.2, 'b': 100} assert m == UltraSimpleModel(a=10.2, b=100) @pytest.fixture(scope='session', name='NoneCheckModel') def none_check_model_fix(): class NoneCheckModel(BaseModel): existing_str_value: str = 'foo' required_str_value: str = ... required_str_none_value: Optional[str] = ... existing_bytes_value: bytes = b'foo' required_bytes_value: bytes = ... required_bytes_none_value: Optional[bytes] = ... return NoneCheckModel def test_nullable_strings_success(NoneCheckModel): m = NoneCheckModel( required_str_value='v1', required_str_none_value=None, required_bytes_value='v2', required_bytes_none_value=None ) assert m.required_str_value == 'v1' assert m.required_str_none_value is None assert m.required_bytes_value == b'v2' assert m.required_bytes_none_value is None def test_nullable_strings_fails(NoneCheckModel): with pytest.raises(ValidationError) as exc_info: NoneCheckModel( required_str_value=None, required_str_none_value=None, required_bytes_value=None, required_bytes_none_value=None, ) assert exc_info.value.errors(include_url=False) == [ { 'type': 'string_type', 'loc': ('required_str_value',), 'msg': 'Input should be a valid string', 'input': None, }, { 'type': 'bytes_type', 'loc': ('required_bytes_value',), 'msg': 'Input should be a valid bytes', 'input': None, }, ] @pytest.fixture(name='ParentModel', scope='session') def parent_sub_model_fixture(): class UltraSimpleModel(BaseModel): a: float b: int = 10 class ParentModel(BaseModel): grape: bool banana: UltraSimpleModel return ParentModel def test_parent_sub_model(ParentModel): m = ParentModel(grape=1, banana={'a': 1}) assert m.grape is True assert m.banana.a == 1.0 assert m.banana.b == 10 assert 
repr(m) == 'ParentModel(grape=True, banana=UltraSimpleModel(a=1.0, b=10))' def test_parent_sub_model_fails(ParentModel): with pytest.raises(ValidationError): ParentModel(grape=1, banana=123) def test_not_required(): class Model(BaseModel): a: float = None assert Model(a=12.2).a == 12.2 assert Model().a is None with pytest.raises(ValidationError) as exc_info: Model(a=None) assert exc_info.value.errors(include_url=False) == [ { 'type': 'float_type', 'loc': ('a',), 'msg': 'Input should be a valid number', 'input': None, }, ] def test_allow_extra(): class Model(BaseModel): model_config = ConfigDict(extra='allow') a: float m = Model(a='10.2', b=12) assert m.__dict__ == {'a': 10.2} assert m.__pydantic_extra__ == {'b': 12} assert m.a == 10.2 assert m.b == 12 assert m.model_extra == {'b': 12} m.c = 42 assert 'c' not in m.__dict__ assert m.__pydantic_extra__ == {'b': 12, 'c': 42} assert m.model_dump() == {'a': 10.2, 'b': 12, 'c': 42} @pytest.mark.parametrize('extra', ['ignore', 'forbid', 'allow']) def test_allow_extra_from_attributes(extra: Literal['ignore', 'forbid', 'allow']): class Model(BaseModel): a: float model_config = ConfigDict(extra=extra, from_attributes=True) class TestClass: a = 1.0 b = 12 m = Model.model_validate(TestClass()) assert m.a == 1.0 assert not hasattr(m, 'b') def test_allow_extra_repr(): class Model(BaseModel): model_config = ConfigDict(extra='allow') a: float = ... 
assert str(Model(a='10.2', b=12)) == 'a=10.2 b=12' def test_forbidden_extra_success(): class ForbiddenExtra(BaseModel): model_config = ConfigDict(extra='forbid') foo: str = 'whatever' m = ForbiddenExtra() assert m.foo == 'whatever' def test_forbidden_extra_fails(): class ForbiddenExtra(BaseModel): model_config = ConfigDict(extra='forbid') foo: str = 'whatever' with pytest.raises(ValidationError) as exc_info: ForbiddenExtra(foo='ok', bar='wrong', spam='xx') assert exc_info.value.errors(include_url=False) == [ { 'type': 'extra_forbidden', 'loc': ('bar',), 'msg': 'Extra inputs are not permitted', 'input': 'wrong', }, { 'type': 'extra_forbidden', 'loc': ('spam',), 'msg': 'Extra inputs are not permitted', 'input': 'xx', }, ] def test_assign_extra_no_validate(): class Model(BaseModel): model_config = ConfigDict(validate_assignment=True) a: float model = Model(a=0.2) with pytest.raises(ValidationError, match=r"b\s+Object has no attribute 'b'"): model.b = 2 def test_assign_extra_validate(): class Model(BaseModel): model_config = ConfigDict(validate_assignment=True) a: float model = Model(a=0.2) with pytest.raises(ValidationError, match=r"b\s+Object has no attribute 'b'"): model.b = 2 def test_model_property_attribute_error(): class Model(BaseModel): @property def a_property(self): raise AttributeError('Internal Error') with pytest.raises(AttributeError, match='Internal Error'): Model().a_property def test_extra_allowed(): class Model(BaseModel): model_config = ConfigDict(extra='allow') a: float model = Model(a=0.2, b=0.1) assert model.b == 0.1 assert not hasattr(model, 'c') model.c = 1 assert hasattr(model, 'c') assert model.c == 1 def test_reassign_instance_method_with_extra_allow(): class Model(BaseModel): model_config = ConfigDict(extra='allow') name: str def not_extra_func(self) -> str: return f'hello {self.name}' def not_extra_func_replacement(self_sub: Model) -> str: return f'hi {self_sub.name}' m = Model(name='james') assert m.not_extra_func() == 'hello james' 
m.not_extra_func = partial(not_extra_func_replacement, m) assert m.not_extra_func() == 'hi james' assert 'not_extra_func' in m.__dict__ def test_extra_ignored(): class Model(BaseModel): model_config = ConfigDict(extra='ignore') a: float model = Model(a=0.2, b=0.1) assert not hasattr(model, 'b') with pytest.raises(ValueError, match='"Model" object has no field "b"'): model.b = 1 assert model.model_extra is None def test_field_order_is_preserved_with_extra(): """This test covers https://github.com/pydantic/pydantic/issues/1234.""" class Model(BaseModel): model_config = ConfigDict(extra='allow') a: int b: str c: float model = Model(a=1, b='2', c=3.0, d=4) assert repr(model) == "Model(a=1, b='2', c=3.0, d=4)" assert str(model.model_dump()) == "{'a': 1, 'b': '2', 'c': 3.0, 'd': 4}" assert str(model.model_dump_json()) == '{"a":1,"b":"2","c":3.0,"d":4}' def test_extra_broken_via_pydantic_extra_interference(): """ At the time of writing this test there is `_model_construction.model_extra_getattr` being assigned to model's `__getattr__`. The method then expects `BaseModel.__pydantic_extra__` isn't `None`. Both this actions happen when `model_config.extra` is set to `True`. However, this behavior could be accidentally broken in a subclass of `BaseModel`. In that case `AttributeError` should be thrown when `__getattr__` is being accessed essentially disabling the `extra` functionality. 
""" class BrokenExtraBaseModel(BaseModel): def model_post_init(self, __context: Any) -> None: super().model_post_init(__context) object.__setattr__(self, '__pydantic_extra__', None) class Model(BrokenExtraBaseModel): model_config = ConfigDict(extra='allow') m = Model(extra_field='some extra value') with pytest.raises(AttributeError) as e: m.extra_field assert e.value.args == ("'Model' object has no attribute 'extra_field'",) def test_model_extra_is_none_when_extra_is_forbid(): class Foo(BaseModel): model_config = ConfigDict(extra='forbid') assert Foo().model_extra is None def test_set_attr(UltraSimpleModel): m = UltraSimpleModel(a=10.2) assert m.model_dump() == {'a': 10.2, 'b': 10} m.b = 20 assert m.model_dump() == {'a': 10.2, 'b': 20} def test_set_attr_invalid(): class UltraSimpleModel(BaseModel): a: float = ... b: int = 10 m = UltraSimpleModel(a=10.2) assert m.model_dump() == {'a': 10.2, 'b': 10} with pytest.raises(ValueError) as exc_info: m.c = 20 assert '"UltraSimpleModel" object has no field "c"' in exc_info.value.args[0] def test_any(): class AnyModel(BaseModel): a: Any = 10 b: object = 20 m = AnyModel() assert m.a == 10 assert m.b == 20 m = AnyModel(a='foobar', b='barfoo') assert m.a == 'foobar' assert m.b == 'barfoo' def test_population_by_field_name(): class Model(BaseModel): model_config = ConfigDict(populate_by_name=True) a: str = Field(alias='_a') assert Model(a='different').a == 'different' assert Model(a='different').model_dump() == {'a': 'different'} assert Model(a='different').model_dump(by_alias=True) == {'_a': 'different'} def test_field_order(): class Model(BaseModel): c: float b: int = 10 a: str d: dict = {} assert list(Model.model_fields.keys()) == ['c', 'b', 'a', 'd'] def test_required(): # same as below but defined here so class definition occurs inside the test class Model(BaseModel): a: float b: int = 10 m = Model(a=10.2) assert m.model_dump() == dict(a=10.2, b=10) with pytest.raises(ValidationError) as exc_info: Model() assert 
exc_info.value.errors(include_url=False) == [ {'type': 'missing', 'loc': ('a',), 'msg': 'Field required', 'input': {}} ] def test_mutability(): class TestModel(BaseModel): a: int = 10 model_config = ConfigDict(extra='forbid', frozen=False) m = TestModel() assert m.a == 10 m.a = 11 assert m.a == 11 def test_frozen_model(): class FrozenModel(BaseModel): model_config = ConfigDict(extra='forbid', frozen=True) a: int = 10 m = FrozenModel() assert m.a == 10 with pytest.raises(ValidationError) as exc_info: m.a = 11 assert exc_info.value.errors(include_url=False) == [ {'type': 'frozen_instance', 'loc': ('a',), 'msg': 'Instance is frozen', 'input': 11} ] with pytest.raises(ValidationError) as exc_info: del m.a assert exc_info.value.errors(include_url=False) == [ {'type': 'frozen_instance', 'loc': ('a',), 'msg': 'Instance is frozen', 'input': None} ] assert m.a == 10 def test_frozen_field(): class FrozenModel(BaseModel): a: int = Field(10, frozen=True) m = FrozenModel() assert m.a == 10 with pytest.raises(ValidationError) as exc_info: m.a = 11 assert exc_info.value.errors(include_url=False) == [ {'type': 'frozen_field', 'loc': ('a',), 'msg': 'Field is frozen', 'input': 11} ] with pytest.raises(ValidationError) as exc_info: del m.a assert exc_info.value.errors(include_url=False) == [ {'type': 'frozen_field', 'loc': ('a',), 'msg': 'Field is frozen', 'input': None} ] assert m.a == 10 def test_not_frozen_are_not_hashable(): class TestModel(BaseModel): a: int = 10 m = TestModel() with pytest.raises(TypeError) as exc_info: hash(m) assert "unhashable type: 'TestModel'" in exc_info.value.args[0] def test_with_declared_hash(): class Foo(BaseModel): x: int def __hash__(self): return self.x**2 class Bar(Foo): y: int def __hash__(self): return self.y**3 class Buz(Bar): z: int assert hash(Foo(x=2)) == 4 assert hash(Bar(x=2, y=3)) == 27 assert hash(Buz(x=2, y=3, z=4)) == 27 def test_frozen_with_hashable_fields_are_hashable(): class TestModel(BaseModel): model_config = 
ConfigDict(frozen=True) a: int = 10 m = TestModel() assert m.__hash__ is not None assert isinstance(hash(m), int) def test_frozen_with_unhashable_fields_are_not_hashable(): class TestModel(BaseModel): model_config = ConfigDict(frozen=True) a: int = 10 y: List[int] = [1, 2, 3] m = TestModel() with pytest.raises(TypeError) as exc_info: hash(m) assert "unhashable type: 'list'" in exc_info.value.args[0] def test_hash_function_empty_model(): class TestModel(BaseModel): model_config = ConfigDict(frozen=True) m = TestModel() m2 = TestModel() assert m == m2 assert hash(m) == hash(m2) def test_hash_function_give_different_result_for_different_object(): class TestModel(BaseModel): model_config = ConfigDict(frozen=True) a: int = 10 m = TestModel() m2 = TestModel() m3 = TestModel(a=11) assert hash(m) == hash(m2) assert hash(m) != hash(m3) def test_hash_function_works_when_instance_dict_modified(): class TestModel(BaseModel): model_config = ConfigDict(frozen=True) a: int b: int m = TestModel(a=1, b=2) h = hash(m) # Test edge cases where __dict__ is modified # @functools.cached_property can add keys to __dict__, these should be ignored. m.__dict__['c'] = 1 assert hash(m) == h # Order of keys can be changed, e.g. with the deprecated copy method, which shouldn't matter. m.__dict__ = {'b': 2, 'a': 1} assert hash(m) == h # Keys can be missing, e.g. when using the deprecated copy method. 
# This could change the hash, and more importantly hashing shouldn't raise a KeyError # We don't assert here, because a hash collision is possible: the hash is not guaranteed to change # However, hashing must not raise an exception, which simply calling hash() checks for del m.__dict__['a'] hash(m) def test_default_hash_function_overrides_default_hash_function(): class A(BaseModel): model_config = ConfigDict(frozen=True) x: int class B(A): model_config = ConfigDict(frozen=True) y: int assert A.__hash__ != B.__hash__ assert hash(A(x=1)) != hash(B(x=1, y=2)) != hash(B(x=1, y=3)) def test_hash_method_is_inherited_for_frozen_models(): from functools import lru_cache class MyBaseModel(BaseModel): """A base model with sensible configurations.""" model_config = ConfigDict(frozen=True) def __hash__(self): return hash(id(self)) class MySubClass(MyBaseModel): x: Dict[str, int] @lru_cache(maxsize=None) def cached_method(self): return len(self.x) my_instance = MySubClass(x={'a': 1, 'b': 2}) assert my_instance.cached_method() == 2 object.__setattr__(my_instance, 'x', {}) # can't change the "normal" way due to frozen assert my_instance.cached_method() == 2 @pytest.fixture(name='ValidateAssignmentModel', scope='session') def validate_assignment_fixture(): class ValidateAssignmentModel(BaseModel): model_config = ConfigDict(validate_assignment=True) a: int = 2 b: constr(min_length=1) return ValidateAssignmentModel def test_validating_assignment_pass(ValidateAssignmentModel): p = ValidateAssignmentModel(a=5, b='hello') p.a = 2 assert p.a == 2 assert p.model_dump() == {'a': 2, 'b': 'hello'} p.b = 'hi' assert p.b == 'hi' assert p.model_dump() == {'a': 2, 'b': 'hi'} def test_validating_assignment_fail(ValidateAssignmentModel): p = ValidateAssignmentModel(a=5, b='hello') with pytest.raises(ValidationError) as exc_info: p.a = 'b' assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('a',), 'msg': 'Input should be a valid integer, unable to parse string as 
an integer',
            'input': 'b',
        },
    ]

    with pytest.raises(ValidationError) as exc_info:
        p.b = ''
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'string_too_short',
            'loc': ('b',),
            'msg': 'String should have at least 1 character',
            'input': '',
            'ctx': {'min_length': 1},
        }
    ]


class Foo(Enum):
    FOO = 'foo'
    BAR = 'bar'


@pytest.mark.parametrize('value', [Foo.FOO, Foo.FOO.value, 'foo'])
def test_enum_values(value: Any) -> None:
    class Model(BaseModel):
        foo: Foo
        model_config = ConfigDict(use_enum_values=True)

    m = Model(foo=value)

    foo = m.foo
    assert type(foo) is str, type(foo)
    assert foo == 'foo'

    foo = m.model_dump()['foo']
    assert type(foo) is str, type(foo)
    assert foo == 'foo'


def test_literal_enum_values():
    FooEnum = Enum('FooEnum', {'foo': 'foo_value', 'bar': 'bar_value'})

    class Model(BaseModel):
        baz: Literal[FooEnum.foo]
        boo: str = 'hoo'
        model_config = ConfigDict(use_enum_values=True)

    m = Model(baz=FooEnum.foo)
    assert m.model_dump() == {'baz': 'foo_value', 'boo': 'hoo'}
    assert m.model_dump(mode='json') == {'baz': 'foo_value', 'boo': 'hoo'}
    assert m.baz == 'foo_value'

    with pytest.raises(ValidationError) as exc_info:
        Model(baz=FooEnum.bar)

    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'literal_error',
            'loc': ('baz',),
            'msg': "Input should be <FooEnum.foo: 'foo_value'>",
            'input': FooEnum.bar,
            'ctx': {'expected': "<FooEnum.foo: 'foo_value'>"},
        }
    ]


class StrFoo(str, Enum):
    FOO = 'foo'
    BAR = 'bar'


@pytest.mark.parametrize('value', [StrFoo.FOO, StrFoo.FOO.value, 'foo', 'hello'])
def test_literal_use_enum_values_multi_type(value) -> None:
    class Model(BaseModel):
        baz: Literal[StrFoo.FOO, 'hello']
        model_config = ConfigDict(use_enum_values=True)

    assert isinstance(Model(baz=value).baz, str)


def test_literal_use_enum_values_with_default() -> None:
    class Model(BaseModel):
        baz: Literal[StrFoo.FOO] = Field(default=StrFoo.FOO)
        model_config = ConfigDict(use_enum_values=True, validate_default=True)

    validated = Model()
    assert type(validated.baz) is str
    assert
type(validated.model_dump()['baz']) is str validated = Model.model_validate_json('{"baz": "foo"}') assert type(validated.baz) is str assert type(validated.model_dump()['baz']) is str validated = Model.model_validate({'baz': StrFoo.FOO}) assert type(validated.baz) is str assert type(validated.model_dump()['baz']) is str def test_strict_enum_values(): class MyEnum(Enum): val = 'val' class Model(BaseModel): model_config = ConfigDict(use_enum_values=True) x: MyEnum assert Model.model_validate({'x': MyEnum.val}, strict=True).x == 'val' def test_union_enum_values(): class MyEnum(Enum): val = 'val' class NormalModel(BaseModel): x: Union[MyEnum, int] class UseEnumValuesModel(BaseModel): model_config = ConfigDict(use_enum_values=True) x: Union[MyEnum, int] assert NormalModel(x=MyEnum.val).x != 'val' assert UseEnumValuesModel(x=MyEnum.val).x == 'val' def test_enum_raw(): FooEnum = Enum('FooEnum', {'foo': 'foo', 'bar': 'bar'}) class Model(BaseModel): foo: FooEnum = None m = Model(foo='foo') assert isinstance(m.foo, FooEnum) assert m.foo != 'foo' assert m.foo.value == 'foo' def test_set_tuple_values(): class Model(BaseModel): foo: set bar: tuple m = Model(foo=['a', 'b'], bar=['c', 'd']) assert m.foo == {'a', 'b'} assert m.bar == ('c', 'd') assert m.model_dump() == {'foo': {'a', 'b'}, 'bar': ('c', 'd')} def test_default_copy(): class User(BaseModel): friends: List[int] = Field(default_factory=lambda: []) u1 = User() u2 = User() assert u1.friends is not u2.friends class ArbitraryType: pass def test_arbitrary_type_allowed_validation_success(): class ArbitraryTypeAllowedModel(BaseModel): model_config = ConfigDict(arbitrary_types_allowed=True) t: ArbitraryType arbitrary_type_instance = ArbitraryType() m = ArbitraryTypeAllowedModel(t=arbitrary_type_instance) assert m.t == arbitrary_type_instance class OtherClass: pass def test_arbitrary_type_allowed_validation_fails(): class ArbitraryTypeAllowedModel(BaseModel): model_config = ConfigDict(arbitrary_types_allowed=True) t: 
ArbitraryType input_value = OtherClass() with pytest.raises(ValidationError) as exc_info: ArbitraryTypeAllowedModel(t=input_value) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'is_instance_of', 'loc': ('t',), 'msg': 'Input should be an instance of ArbitraryType', 'input': input_value, 'ctx': {'class': 'ArbitraryType'}, } ] def test_arbitrary_types_not_allowed(): with pytest.raises(TypeError, match='Unable to generate pydantic-core schema for str: return cls.__name__ assert Model.class_name == 'Model' assert Model().class_name == 'Model' def test_model_iteration(): class Foo(BaseModel): a: int = 1 b: int = 2 class Bar(BaseModel): c: int d: Foo m = Bar(c=3, d={}) assert m.model_dump() == {'c': 3, 'd': {'a': 1, 'b': 2}} assert list(m) == [('c', 3), ('d', Foo())] assert dict(m) == {'c': 3, 'd': Foo()} def test_model_iteration_extra() -> None: class Foo(BaseModel): x: int = 1 class Bar(BaseModel): a: int b: Foo model_config = ConfigDict(extra='allow') m = Bar.model_validate({'a': 1, 'b': {}, 'c': 2, 'd': Foo()}) assert m.model_dump() == {'a': 1, 'b': {'x': 1}, 'c': 2, 'd': {'x': 1}} assert list(m) == [('a', 1), ('b', Foo()), ('c', 2), ('d', Foo())] assert dict(m) == {'a': 1, 'b': Foo(), 'c': 2, 'd': Foo()} @pytest.mark.parametrize( 'exclude,expected,raises_match', [ pytest.param( None, {'c': 3, 'foos': [{'a': 1, 'b': 2}, {'a': 3, 'b': 4}]}, None, id='exclude nothing', ), pytest.param( {'foos': {0: {'a'}, 1: {'a'}}}, {'c': 3, 'foos': [{'b': 2}, {'b': 4}]}, None, id='excluding fields of indexed list items', ), pytest.param( {'foos': {'a'}}, {'c': 3, 'foos': [{'a': 1, 'b': 2}, {'a': 3, 'b': 4}]}, None, id='Trying to exclude string keys on list field should be ignored (1)', ), pytest.param( {'foos': {0: ..., 'a': ...}}, {'c': 3, 'foos': [{'a': 3, 'b': 4}]}, None, id='Trying to exclude string keys on list field should be ignored (2)', ), pytest.param( {'foos': {0: 1}}, TypeError, '`exclude` argument 
must be a set or dict', id='value as int should be an error', ), pytest.param( {'foos': {'__all__': {1}}}, {'c': 3, 'foos': [{'a': 1, 'b': 2}, {'a': 3, 'b': 4}]}, None, id='excluding int in dict should have no effect', ), pytest.param( {'foos': {'__all__': {'a'}}}, {'c': 3, 'foos': [{'b': 2}, {'b': 4}]}, None, id='using "__all__" to exclude specific nested field', ), pytest.param( {'foos': {0: {'b'}, '__all__': {'a'}}}, {'c': 3, 'foos': [{}, {'b': 4}]}, None, id='using "__all__" to exclude specific nested field in combination with more specific exclude', ), pytest.param( {'foos': {'__all__'}}, {'c': 3, 'foos': []}, None, id='using "__all__" to exclude all list items', ), pytest.param( {'foos': {1, '__all__'}}, {'c': 3, 'foos': []}, None, id='using "__all__" and other items should get merged together, still excluding all list items', ), pytest.param( {'foos': {-1: {'b'}}}, {'c': 3, 'foos': [{'a': 1, 'b': 2}, {'a': 3}]}, None, id='negative indexes', ), ], ) def test_model_export_nested_list(exclude, expected, raises_match): class Foo(BaseModel): a: int = 1 b: int = 2 class Bar(BaseModel): c: int foos: List[Foo] m = Bar(c=3, foos=[Foo(a=1, b=2), Foo(a=3, b=4)]) if raises_match is not None: with pytest.raises(expected, match=raises_match): m.model_dump(exclude=exclude) else: original_exclude = deepcopy(exclude) assert m.model_dump(exclude=exclude) == expected assert exclude == original_exclude @pytest.mark.parametrize( 'excludes,expected', [ pytest.param( {'bars': {0}}, {'a': 1, 'bars': [{'y': 2}, {'w': -1, 'z': 3}]}, id='excluding first item from list field using index', ), pytest.param({'bars': {'__all__'}}, {'a': 1, 'bars': []}, id='using "__all__" to exclude all list items'), pytest.param( {'bars': {'__all__': {'w'}}}, {'a': 1, 'bars': [{'x': 1}, {'y': 2}, {'z': 3}]}, id='exclude single dict key from all list items', ), ], ) def test_model_export_dict_exclusion(excludes, expected): class Foo(BaseModel): a: int = 1 bars: List[Dict[str, int]] m = Foo(a=1, bars=[{'w': 
0, 'x': 1}, {'y': 2}, {'w': -1, 'z': 3}]) original_excludes = deepcopy(excludes) assert m.model_dump(exclude=excludes) == expected assert excludes == original_excludes def test_field_exclude(): class User(BaseModel): _priv: int = PrivateAttr() id: int username: str password: SecretStr = Field(exclude=True) hobbies: List[str] my_user = User(id=42, username='JohnDoe', password='hashedpassword', hobbies=['scuba diving']) my_user._priv = 13 assert my_user.id == 42 assert my_user.password.get_secret_value() == 'hashedpassword' assert my_user.model_dump() == {'id': 42, 'username': 'JohnDoe', 'hobbies': ['scuba diving']} def test_revalidate_instances_never(): class User(BaseModel): hobbies: List[str] my_user = User(hobbies=['scuba diving']) class Transaction(BaseModel): user: User t = Transaction(user=my_user) assert t.user is my_user assert t.user.hobbies is my_user.hobbies class SubUser(User): sins: List[str] my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying']) t = Transaction(user=my_sub_user) assert t.user is my_sub_user assert t.user.hobbies is my_sub_user.hobbies def test_revalidate_instances_sub_instances(): class User(BaseModel, revalidate_instances='subclass-instances'): hobbies: List[str] my_user = User(hobbies=['scuba diving']) class Transaction(BaseModel): user: User t = Transaction(user=my_user) assert t.user is my_user assert t.user.hobbies is my_user.hobbies class SubUser(User): sins: List[str] my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying']) t = Transaction(user=my_sub_user) assert t.user is not my_sub_user assert t.user.hobbies is not my_sub_user.hobbies assert not hasattr(t.user, 'sins') def test_revalidate_instances_always(): class User(BaseModel, revalidate_instances='always'): hobbies: List[str] my_user = User(hobbies=['scuba diving']) class Transaction(BaseModel): user: User t = Transaction(user=my_user) assert t.user is not my_user assert t.user.hobbies is not my_user.hobbies class SubUser(User): sins: List[str] my_sub_user 
= SubUser(hobbies=['scuba diving'], sins=['lying']) t = Transaction(user=my_sub_user) assert t.user is not my_sub_user assert t.user.hobbies is not my_sub_user.hobbies assert not hasattr(t.user, 'sins') def test_revalidate_instances_always_list_of_model_instance(): class A(BaseModel): model_config = ConfigDict(revalidate_instances='always') name: str class B(BaseModel): list_a: List[A] a = A(name='a') b = B(list_a=[a]) assert b.list_a == [A(name='a')] a.name = 'b' assert b.list_a == [A(name='a')] @pytest.mark.skip(reason='not implemented') @pytest.mark.parametrize( 'kinds', [ {'sub_fields', 'model_fields', 'model_config', 'sub_config', 'combined_config'}, {'sub_fields', 'model_fields', 'combined_config'}, {'sub_fields', 'model_fields'}, {'combined_config'}, {'model_config', 'sub_config'}, {'model_config', 'sub_fields'}, {'model_fields', 'sub_config'}, ], ) @pytest.mark.parametrize( 'exclude,expected', [ (None, {'a': 0, 'c': {'a': [3, 5], 'c': 'foobar'}, 'd': {'c': 'foobar'}}), ({'c', 'd'}, {'a': 0}), ({'a': ..., 'c': ..., 'd': {'a': ..., 'c': ...}}, {'d': {}}), ], ) def test_model_export_exclusion_with_fields_and_config(kinds, exclude, expected): """Test that exporting models with fields using the export parameter works.""" class ChildConfig: pass if 'sub_config' in kinds: ChildConfig.fields = {'b': {'exclude': ...}, 'a': {'exclude': {1}}} class ParentConfig: pass if 'combined_config' in kinds: ParentConfig.fields = { 'b': {'exclude': ...}, 'c': {'exclude': {'b': ..., 'a': {1}}}, 'd': {'exclude': {'a': ..., 'b': ...}}, } elif 'model_config' in kinds: ParentConfig.fields = {'b': {'exclude': ...}, 'd': {'exclude': {'a'}}} class Sub(BaseModel): a: List[int] = Field([3, 4, 5], exclude={1} if 'sub_fields' in kinds else None) b: int = Field(4, exclude=... if 'sub_fields' in kinds else None) c: str = 'foobar' Config = ChildConfig class Model(BaseModel): a: int = 0 b: int = Field(2, exclude=... 
if 'model_fields' in kinds else None) c: Sub = Sub() d: Sub = Field(Sub(), exclude={'a'} if 'model_fields' in kinds else None) Config = ParentConfig m = Model() assert m.model_dump(exclude=exclude) == expected, 'Unexpected model export result' @pytest.mark.skip(reason='not implemented') def test_model_export_exclusion_inheritance(): class Sub(BaseModel): s1: str = 'v1' s2: str = 'v2' s3: str = 'v3' s4: str = Field('v4', exclude=...) class Parent(BaseModel): model_config = ConfigDict(fields={'a': {'exclude': ...}, 's': {'exclude': {'s1'}}}) a: int b: int = Field(exclude=...) c: int d: int s: Sub = Sub() class Child(Parent): model_config = ConfigDict(fields={'c': {'exclude': ...}, 's': {'exclude': {'s2'}}}) actual = Child(a=0, b=1, c=2, d=3).model_dump() expected = {'d': 3, 's': {'s3': 'v3'}} assert actual == expected, 'Unexpected model export result' @pytest.mark.skip(reason='not implemented') def test_model_export_with_true_instead_of_ellipsis(): class Sub(BaseModel): s1: int = 1 class Model(BaseModel): model_config = ConfigDict(fields={'c': {'exclude': True}}) a: int = 2 b: int = Field(3, exclude=True) c: int = Field(4) s: Sub = Sub() m = Model() assert m.model_dump(exclude={'s': True}) == {'a': 2} @pytest.mark.skip(reason='not implemented') def test_model_export_inclusion(): class Sub(BaseModel): s1: str = 'v1' s2: str = 'v2' s3: str = 'v3' s4: str = 'v4' class Model(BaseModel): model_config = ConfigDict( fields={'a': {'include': {'s2', 's1', 's3'}}, 'b': {'include': {'s1', 's2', 's3', 's4'}}} ) a: Sub = Sub() b: Sub = Field(Sub(), include={'s1'}) c: Sub = Field(Sub(), include={'s1', 's2'}) assert Model.model_fields['a'].field_info.include == {'s1': ..., 's2': ..., 's3': ...} assert Model.model_fields['b'].field_info.include == {'s1': ...} assert Model.model_fields['c'].field_info.include == {'s1': ..., 's2': ...} actual = Model().model_dump(include={'a': {'s3', 's4'}, 'b': ..., 'c': ...}) # s1 included via field, s2 via config and s3 via .dict call: expected = 
{'a': {'s3': 'v3'}, 'b': {'s1': 'v1'}, 'c': {'s1': 'v1', 's2': 'v2'}} assert actual == expected, 'Unexpected model export result' @pytest.mark.skip(reason='not implemented') def test_model_export_inclusion_inheritance(): class Sub(BaseModel): s1: str = Field('v1', include=...) s2: str = Field('v2', include=...) s3: str = Field('v3', include=...) s4: str = 'v4' class Parent(BaseModel): # b will be included since fields are set independently model_config = ConfigDict(fields={'b': {'include': ...}}) a: int b: int c: int s: Sub = Field(Sub(), include={'s1', 's2'}) # overrides includes set in Sub model class Child(Parent): # b is still included even if it doesn't occur here since fields # are still considered separately. # s however, is merged, resulting in only s1 being included. model_config = ConfigDict(fields={'a': {'include': ...}, 's': {'include': {'s1'}}}) actual = Child(a=0, b=1, c=2).model_dump() expected = {'a': 0, 'b': 1, 's': {'s1': 'v1'}} assert actual == expected, 'Unexpected model export result' def test_untyped_fields_warning(): with pytest.raises( PydanticUserError, match=re.escape( 'A non-annotated attribute was detected: `x = 1`. All model fields require a type annotation; ' 'if `x` is not meant to be a field, you may be able to resolve this error by annotating it ' "as a `ClassVar` or updating `model_config['ignored_types']`." 
), ): class WarningModel(BaseModel): x = 1 # Prove that annotating with ClassVar prevents the warning class NonWarningModel(BaseModel): x: ClassVar = 1 def test_untyped_fields_error(): with pytest.raises(TypeError, match="Field 'a' requires a type annotation"): class Model(BaseModel): a = Field('foobar') def test_custom_init_subclass_params(): class DerivedModel(BaseModel): def __init_subclass__(cls, something): cls.something = something # if this raises a TypeError, then there is a regression of issue 867: # pydantic.main.MetaModel.__new__ should include **kwargs at the end of the # method definition and pass them on to the super call at the end in order # to allow the special method __init_subclass__ to be defined with custom # parameters on extended BaseModel classes. class NewModel(DerivedModel, something=2): something: ClassVar = 1 assert NewModel.something == 2 def test_recursive_model(): class MyModel(BaseModel): field: Optional['MyModel'] m = MyModel(field={'field': {'field': None}}) assert m.model_dump() == {'field': {'field': {'field': None}}} def test_recursive_cycle_with_repeated_field(): class A(BaseModel): b: 'B' class B(BaseModel): a1: Optional[A] = None a2: Optional[A] = None A.model_rebuild() assert A.model_validate({'b': {'a1': {'b': {'a1': None}}}}) == A(b=B(a1=A(b=B(a1=None)))) with pytest.raises(ValidationError) as exc_info: A.model_validate({'b': {'a1': {'a1': None}}}) assert exc_info.value.errors(include_url=False) == [ {'input': {'a1': None}, 'loc': ('b', 'a1', 'b'), 'msg': 'Field required', 'type': 'missing'} ] def test_two_defaults(): with pytest.raises(TypeError, match='^cannot specify both default and default_factory$'): class Model(BaseModel): a: int = Field(default=3, default_factory=lambda: 3) def test_default_factory(): class ValueModel(BaseModel): uid: UUID = uuid4() m1 = ValueModel() m2 = ValueModel() assert m1.uid == m2.uid class DynamicValueModel(BaseModel): uid: UUID = Field(default_factory=uuid4) m1 = DynamicValueModel() m2 = 
DynamicValueModel() assert isinstance(m1.uid, UUID) assert m1.uid != m2.uid # With a callable: we should still be able to set callables as defaults class FunctionModel(BaseModel): a: int = 1 uid: Callable[[], UUID] = Field(uuid4) m = FunctionModel() assert m.uid is uuid4 # Returning a singleton from a default_factory is supported class MySingleton: pass MY_SINGLETON = MySingleton() class SingletonFieldModel(BaseModel): model_config = ConfigDict(arbitrary_types_allowed=True) singleton: MySingleton = Field(default_factory=lambda: MY_SINGLETON) assert SingletonFieldModel().singleton is SingletonFieldModel().singleton def test_default_factory_called_once(): """The given default factory should be called exactly once per model instantiation""" class Seq: def __init__(self): self.v = 0 def __call__(self): self.v += 1 return self.v class MyModel(BaseModel): id: int = Field(default_factory=Seq()) m1 = MyModel() assert m1.id == 1 m2 = MyModel() assert m2.id == 2 assert m1.id == 1 def test_default_factory_called_once_2(): """The given default factory should be called exactly once per model instantiation""" v = 0 def factory(): nonlocal v v += 1 return v class MyModel(BaseModel): id: int = Field(default_factory=factory) m1 = MyModel() assert m1.id == 1 m2 = MyModel() assert m2.id == 2 def test_default_factory_validate_children(): class Child(BaseModel): x: int class Parent(BaseModel): children: List[Child] = Field(default_factory=list) Parent(children=[{'x': 1}, {'x': 2}]) with pytest.raises(ValidationError) as exc_info: Parent(children=[{'x': 1}, {'y': 2}]) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ {'type': 'missing', 'loc': ('children', 1, 'x'), 'msg': 'Field required', 'input': {'y': 2}} ] def test_default_factory_parse(): class Inner(BaseModel): val: int = Field(0) class Outer(BaseModel): inner_1: Inner = Field(default_factory=Inner) inner_2: Inner = Field(Inner()) default = Outer().model_dump() parsed = Outer.model_validate(default) assert
parsed.model_dump() == {'inner_1': {'val': 0}, 'inner_2': {'val': 0}} assert repr(parsed) == 'Outer(inner_1=Inner(val=0), inner_2=Inner(val=0))' def test_default_factory_validated_data_arg() -> None: class Model(BaseModel): a: int = 1 b: int = Field(default_factory=lambda data: data['a']) model = Model() assert model.b == 1 model = Model.model_construct(a=1) assert model.b == 1 class InvalidModel(BaseModel): a: int = Field(default_factory=lambda data: data['b']) b: int with pytest.raises(KeyError): InvalidModel(b=2) def test_default_factory_validated_data_arg_not_required() -> None: def fac(data: Optional[Dict[str, Any]] = None): if data is not None: return data['a'] return 3 class Model(BaseModel): a: int = 1 b: int = Field(default_factory=fac) model = Model() assert model.b == 3 def test_reuse_same_field(): required_field = Field() class Model1(BaseModel): required: str = required_field class Model2(BaseModel): required: str = required_field with pytest.raises(ValidationError): Model1.model_validate({}) with pytest.raises(ValidationError): Model2.model_validate({}) def test_base_config_type_hinting(): class M(BaseModel): a: int get_type_hints(type(M.model_config)) def test_frozen_field_with_validate_assignment(): """assigning a frozen=True field should raise a TypeError""" class Entry(BaseModel): model_config = ConfigDict(validate_assignment=True) id: float = Field(frozen=True) val: float r = Entry(id=1, val=100) assert r.val == 100 r.val = 101 assert r.val == 101 assert r.id == 1 with pytest.raises(ValidationError) as exc_info: r.id = 2 assert exc_info.value.errors(include_url=False) == [ {'input': 2, 'loc': ('id',), 'msg': 'Field is frozen', 'type': 'frozen_field'} ] def test_repr_field(): class Model(BaseModel): a: int = Field() b: float = Field(repr=True) c: bool = Field(repr=False) m = Model(a=1, b=2.5, c=True) assert repr(m) == 'Model(a=1, b=2.5)' assert repr(m.model_fields['a']) == 'FieldInfo(annotation=int, required=True)' assert repr(m.model_fields['b']) 
== 'FieldInfo(annotation=float, required=True)' assert repr(m.model_fields['c']) == 'FieldInfo(annotation=bool, required=True, repr=False)' def test_inherited_model_field_copy(): """It should copy models used as fields by default""" class Image(BaseModel): path: str def __hash__(self): return id(self) class Item(BaseModel): images: Set[Image] image_1 = Image(path='my_image1.png') image_2 = Image(path='my_image2.png') item = Item(images={image_1, image_2}) assert image_1 in item.images assert id(image_1) in {id(image) for image in item.images} assert id(image_2) in {id(image) for image in item.images} def test_mapping_subclass_as_input(): class CustomMap(dict): pass class Model(BaseModel): x: Mapping[str, int] d = CustomMap() d['one'] = 1 d['two'] = 2 v = Model(x=d).x # we don't promise that this will or will not be a CustomMap # all we promise is that it _will_ be a mapping assert isinstance(v, Mapping) # but the current behavior is that it will be a dict, not a CustomMap # so document that here assert not isinstance(v, CustomMap) assert v == {'one': 1, 'two': 2} def test_typing_coercion_dict(): class Model(BaseModel): x: Dict[str, int] m = Model(x={'one': 1, 'two': 2}) assert repr(m) == "Model(x={'one': 1, 'two': 2})" KT = TypeVar('KT') VT = TypeVar('VT') class MyDict(Dict[KT, VT]): def __repr__(self): return f'MyDict({super().__repr__()})' def test_class_kwargs_config(): class Base(BaseModel, extra='forbid', alias_generator=str.upper): a: int assert Base.model_config['extra'] == 'forbid' assert Base.model_config['alias_generator'] is str.upper # assert Base.model_fields['a'].alias == 'A' class Model(Base, extra='allow'): b: int assert Model.model_config['extra'] == 'allow' # overwritten as intended assert Model.model_config['alias_generator'] is str.upper # inherited as intended # assert Model.model_fields['b'].alias == 'B' # alias_generator still works def test_class_kwargs_config_and_attr_conflict(): class Model(BaseModel, extra='allow', 
alias_generator=str.upper): model_config = ConfigDict(extra='forbid', title='Foobar') b: int assert Model.model_config['extra'] == 'allow' assert Model.model_config['alias_generator'] is str.upper assert Model.model_config['title'] == 'Foobar' def test_class_kwargs_custom_config(): if platform.python_implementation() == 'PyPy': msg = r"__init_subclass__\(\) got an unexpected keyword argument 'some_config'" else: msg = r'__init_subclass__\(\) takes no keyword arguments' with pytest.raises(TypeError, match=msg): class Model(BaseModel, some_config='new_value'): a: int def test_new_union_origin(): """On 3.10+, origin of `int | str` is `types.UnionType`, not `typing.Union`""" class Model(BaseModel): x: 'int | str' assert Model(x=3).x == 3 assert Model(x='3').x == '3' assert Model(x='pika').x == 'pika' assert Model.model_json_schema() == { 'title': 'Model', 'type': 'object', 'properties': {'x': {'title': 'X', 'anyOf': [{'type': 'integer'}, {'type': 'string'}]}}, 'required': ['x'], } @pytest.mark.parametrize( 'ann', [Final, Final[int]], ids=['no-arg', 'with-arg'], ) @pytest.mark.parametrize( 'value', [None, Field()], ids=['none', 'field'], ) def test_frozen_field_decl_without_default_val(ann, value): class Model(BaseModel): a: ann if value is not None: a = value assert 'a' not in Model.__class_vars__ assert 'a' in Model.model_fields assert Model.model_fields['a'].frozen @pytest.mark.parametrize( 'ann', [Final, Final[int]], ids=['no-arg', 'with-arg'], ) def test_final_field_decl_with_default_val(ann): class Model(BaseModel): a: ann = 10 assert 'a' in Model.__class_vars__ assert 'a' not in Model.model_fields def test_final_field_reassignment(): class Model(BaseModel): model_config = ConfigDict(validate_assignment=True) a: Final[int] obj = Model(a=10) with pytest.raises(ValidationError) as exc_info: obj.a = 20 assert exc_info.value.errors(include_url=False) == [ {'input': 20, 'loc': ('a',), 'msg': 'Field is frozen', 'type': 'frozen_field'} ] def 
test_field_by_default_is_not_frozen(): class Model(BaseModel): a: int assert not Model.model_fields['a'].frozen def test_annotated_final(): class Model(BaseModel): a: Annotated[Final[int], Field(title='abc')] assert Model.model_fields['a'].frozen assert Model.model_fields['a'].title == 'abc' class Model2(BaseModel): a: Final[Annotated[int, Field(title='def')]] assert Model2.model_fields['a'].frozen assert Model2.model_fields['a'].title == 'def' def test_post_init(): calls = [] class InnerModel(BaseModel): a: int b: int def model_post_init(self, __context) -> None: super().model_post_init(__context) # this is included just to show it doesn't error assert self.model_dump() == {'a': 3, 'b': 4} calls.append('inner_model_post_init') class Model(BaseModel): c: int d: int sub: InnerModel def model_post_init(self, __context) -> None: assert self.model_dump() == {'c': 1, 'd': 2, 'sub': {'a': 3, 'b': 4}} calls.append('model_post_init') m = Model(c=1, d='2', sub={'a': 3, 'b': '4'}) assert calls == ['inner_model_post_init', 'model_post_init'] assert m.model_dump() == {'c': 1, 'd': 2, 'sub': {'a': 3, 'b': 4}} class SubModel(Model): def model_post_init(self, __context) -> None: assert self.model_dump() == {'c': 1, 'd': 2, 'sub': {'a': 3, 'b': 4}} super().model_post_init(__context) calls.append('submodel_post_init') calls.clear() m = SubModel(c=1, d='2', sub={'a': 3, 'b': '4'}) assert calls == ['inner_model_post_init', 'model_post_init', 'submodel_post_init'] assert m.model_dump() == {'c': 1, 'd': 2, 'sub': {'a': 3, 'b': 4}} @pytest.mark.parametrize('include_private_attribute', [True, False]) def test_post_init_call_signatures(include_private_attribute): calls = [] class Model(BaseModel): a: int b: int if include_private_attribute: _x: int = PrivateAttr(1) def model_post_init(self, *args, **kwargs) -> None: calls.append((args, kwargs)) Model(a=1, b=2) assert calls == [((None,), {})] Model.model_construct(a=3, b=4) assert calls == [((None,), {}), ((None,), {})] def 
test_post_init_not_called_without_override(): calls = [] def monkey_patched_model_post_init(cls, __context): calls.append('BaseModel.model_post_init') original_base_model_post_init = BaseModel.model_post_init try: BaseModel.model_post_init = monkey_patched_model_post_init class WithoutOverrideModel(BaseModel): pass WithoutOverrideModel() WithoutOverrideModel.model_construct() assert calls == [] class WithOverrideModel(BaseModel): def model_post_init(self, __context: Any) -> None: calls.append('WithOverrideModel.model_post_init') WithOverrideModel() assert calls == ['WithOverrideModel.model_post_init'] WithOverrideModel.model_construct() assert calls == ['WithOverrideModel.model_post_init', 'WithOverrideModel.model_post_init'] finally: BaseModel.model_post_init = original_base_model_post_init def test_model_post_init_subclass_private_attrs(): """https://github.com/pydantic/pydantic/issues/7293""" calls = [] class A(BaseModel): a: int = 1 def model_post_init(self, __context: Any) -> None: calls.append(f'{self.__class__.__name__}.model_post_init') class B(A): pass class C(B): _private: bool = True C() assert calls == ['C.model_post_init'] def test_model_post_init_supertype_private_attrs(): """https://github.com/pydantic/pydantic/issues/9098""" class Model(BaseModel): _private: int = 12 class SubModel(Model): def model_post_init(self, __context: Any) -> None: if self._private == 12: self._private = 13 super().model_post_init(__context) m = SubModel() assert m._private == 13 def test_model_post_init_subclass_setting_private_attrs(): """https://github.com/pydantic/pydantic/issues/7091""" class Model(BaseModel): _priv1: int = PrivateAttr(91) _priv2: int = PrivateAttr(92) def model_post_init(self, __context) -> None: self._priv1 = 100 class SubModel(Model): _priv3: int = PrivateAttr(93) _priv4: int = PrivateAttr(94) _priv5: int = PrivateAttr() _priv6: int = PrivateAttr() def model_post_init(self, __context) -> None: self._priv3 = 200 self._priv5 = 300 
super().model_post_init(__context) m = SubModel() assert m._priv1 == 100 assert m._priv2 == 92 assert m._priv3 == 200 assert m._priv4 == 94 assert m._priv5 == 300 with pytest.raises(AttributeError): assert m._priv6 == 94 def test_model_post_init_correct_mro(): """https://github.com/pydantic/pydantic/issues/7293""" calls = [] class A(BaseModel): a: int = 1 class B(BaseModel): b: int = 1 def model_post_init(self, __context: Any) -> None: calls.append(f'{self.__class__.__name__}.model_post_init') class C(A, B): _private: bool = True C() assert calls == ['C.model_post_init'] def test_deeper_recursive_model(): class A(BaseModel): b: 'B' class B(BaseModel): c: 'C' class C(BaseModel): a: Optional['A'] A.model_rebuild() B.model_rebuild() C.model_rebuild() m = A(b=B(c=C(a=None))) assert m.model_dump() == {'b': {'c': {'a': None}}} def test_model_rebuild_localns(): class A(BaseModel): x: int class B(BaseModel): a: 'Model' # noqa: F821 B.model_rebuild(_types_namespace={'Model': A}) m = B(a={'x': 1}) assert m.model_dump() == {'a': {'x': 1}} assert isinstance(m.a, A) class C(BaseModel): a: 'Model' # noqa: F821 with pytest.raises(PydanticUndefinedAnnotation, match="name 'Model' is not defined"): C.model_rebuild(_types_namespace={'A': A}) def test_model_rebuild_zero_depth(): class Model(BaseModel): x: 'X_Type' X_Type = str with pytest.raises(NameError, match='X_Type'): Model.model_rebuild(_parent_namespace_depth=0) Model.__pydantic_parent_namespace__.update({'X_Type': int}) Model.model_rebuild(_parent_namespace_depth=0) m = Model(x=42) assert m.model_dump() == {'x': 42} @pytest.fixture(scope='session', name='InnerEqualityModel') def inner_equality_fixture(): class InnerEqualityModel(BaseModel): iw: int ix: int = 0 _iy: int = PrivateAttr() _iz: int = PrivateAttr(0) return InnerEqualityModel @pytest.fixture(scope='session', name='EqualityModel') def equality_fixture(InnerEqualityModel): class EqualityModel(BaseModel): w: int x: int = 0 _y: int = PrivateAttr() _z: int = 
PrivateAttr(0) model: InnerEqualityModel return EqualityModel def test_model_equality(EqualityModel, InnerEqualityModel): m1 = EqualityModel(w=0, x=0, model=InnerEqualityModel(iw=0)) m2 = EqualityModel(w=0, x=0, model=InnerEqualityModel(iw=0)) assert m1 == m2 def test_model_equality_type(EqualityModel, InnerEqualityModel): class Model1(BaseModel): x: int class Model2(BaseModel): x: int m1 = Model1(x=1) m2 = Model2(x=1) assert m1.model_dump() == m2.model_dump() assert m1 != m2 def test_model_equality_dump(EqualityModel, InnerEqualityModel): inner_model = InnerEqualityModel(iw=0) assert inner_model != inner_model.model_dump() model = EqualityModel(w=0, x=0, model=inner_model) assert model != dict(model) assert dict(model) != model.model_dump() # Due to presence of inner model def test_model_equality_fields_set(InnerEqualityModel): m1 = InnerEqualityModel(iw=0) m2 = InnerEqualityModel(iw=0, ix=0) assert m1.model_fields_set != m2.model_fields_set assert m1 == m2 def test_model_equality_private_attrs(InnerEqualityModel): m = InnerEqualityModel(iw=0, ix=0) m1 = m.model_copy() m2 = m.model_copy() m3 = m.model_copy() m2._iy = 1 m3._iz = 1 models = [m1, m2, m3] for i, first_model in enumerate(models): for j, second_model in enumerate(models): if i == j: assert first_model == second_model else: assert first_model != second_model m2_equal = m.model_copy() m2_equal._iy = 1 assert m2 == m2_equal m3_equal = m.model_copy() m3_equal._iz = 1 assert m3 == m3_equal def test_model_copy_extra(): class Model(BaseModel, extra='allow'): x: int m = Model(x=1, y=2) assert m.model_dump() == {'x': 1, 'y': 2} assert m.model_extra == {'y': 2} m2 = m.model_copy() assert m2.model_dump() == {'x': 1, 'y': 2} assert m2.model_extra == {'y': 2} m3 = m.model_copy(update={'x': 4, 'z': 3}) assert m3.model_dump() == {'x': 4, 'y': 2, 'z': 3} assert m3.model_extra == {'y': 2, 'z': 3} m4 = m.model_copy(update={'x': 4, 'z': 3}) assert m4.model_dump() == {'x': 4, 'y': 2, 'z': 3} assert m4.model_extra == {'y': 
2, 'z': 3} m = Model(x=1, a=2) m.__pydantic_extra__ = None m5 = m.model_copy(update={'x': 4, 'b': 3}) assert m5.model_dump() == {'x': 4, 'b': 3} assert m5.model_extra == {'b': 3} def test_model_parametrized_name_not_generic(): class Model(BaseModel): x: int with pytest.raises(TypeError, match='Concrete names should only be generated for generic models.'): Model.model_parametrized_name(()) def test_model_equality_generics(): T = TypeVar('T') class GenericModel(BaseModel, Generic[T], frozen=True): x: T class ConcreteModel(BaseModel): x: int assert ConcreteModel(x=1) != GenericModel(x=1) assert ConcreteModel(x=1) != GenericModel[Any](x=1) assert ConcreteModel(x=1) != GenericModel[int](x=1) assert GenericModel(x=1) != GenericModel(x=2) S = TypeVar('S') models = [ GenericModel(x=1), GenericModel[S](x=1), GenericModel[Any](x=1), GenericModel[int](x=1), GenericModel[float](x=1), ] for m1 in models: for m2 in models: # Test that it works with nesting as well m3 = GenericModel[type(m1)](x=m1) m4 = GenericModel[type(m2)](x=m2) assert m1 == m2 assert m3 == m4 assert hash(m1) == hash(m2) assert hash(m3) == hash(m4) def test_model_validate_strict() -> None: class LaxModel(BaseModel): x: int model_config = ConfigDict(strict=False) class StrictModel(BaseModel): x: int model_config = ConfigDict(strict=True) assert LaxModel.model_validate({'x': '1'}, strict=None) == LaxModel(x=1) assert LaxModel.model_validate({'x': '1'}, strict=False) == LaxModel(x=1) with pytest.raises(ValidationError) as exc_info: LaxModel.model_validate({'x': '1'}, strict=True) # there's no such thing on the model itself # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ {'type': 'int_type', 'loc': ('x',), 'msg': 'Input should be a valid integer', 'input': '1'} ] with pytest.raises(ValidationError) as exc_info: StrictModel.model_validate({'x': '1'}) assert exc_info.value.errors(include_url=False) == [ {'type': 'int_type', 'loc': ('x',), 'msg': 'Input 
should be a valid integer', 'input': '1'} ] assert StrictModel.model_validate({'x': '1'}, strict=False) == StrictModel(x=1) with pytest.raises(ValidationError) as exc_info: StrictModel.model_validate({'x': '1'}, strict=True) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ {'type': 'int_type', 'loc': ('x',), 'msg': 'Input should be a valid integer', 'input': '1'} ] @pytest.mark.xfail( reason='strict=True in model_validate_json does not overwrite strict=False given in ConfigDict. ' 'See issue: https://github.com/pydantic/pydantic/issues/8930' ) def test_model_validate_list_strict() -> None: # FIXME: This change must be implemented in pydantic-core. The argument strict=True # in model_validate_json method is not overwriting the one set with ConfigDict(strict=False) # for sequence like types. See: https://github.com/pydantic/pydantic/issues/8930 class LaxModel(BaseModel): x: List[str] model_config = ConfigDict(strict=False) assert LaxModel.model_validate_json(json.dumps({'x': ('a', 'b', 'c')}), strict=None) == LaxModel(x=('a', 'b', 'c')) assert LaxModel.model_validate_json(json.dumps({'x': ('a', 'b', 'c')}), strict=False) == LaxModel(x=('a', 'b', 'c')) with pytest.raises(ValidationError) as exc_info: LaxModel.model_validate_json(json.dumps({'x': ('a', 'b', 'c')}), strict=True) assert exc_info.value.errors(include_url=False) == [ {'type': 'list_type', 'loc': ('x',), 'msg': 'Input should be a valid list', 'input': ('a', 'b', 'c')} ] def test_model_validate_json_strict() -> None: class LaxModel(BaseModel): x: int model_config = ConfigDict(strict=False) class StrictModel(BaseModel): x: int model_config = ConfigDict(strict=True) assert LaxModel.model_validate_json(json.dumps({'x': '1'}), strict=None) == LaxModel(x=1) assert LaxModel.model_validate_json(json.dumps({'x': '1'}), strict=False) == LaxModel(x=1) with pytest.raises(ValidationError) as exc_info: LaxModel.model_validate_json(json.dumps({'x': '1'}), 
strict=True) assert exc_info.value.errors(include_url=False) == [ {'type': 'int_type', 'loc': ('x',), 'msg': 'Input should be a valid integer', 'input': '1'} ] with pytest.raises(ValidationError) as exc_info: StrictModel.model_validate_json(json.dumps({'x': '1'}), strict=None) assert exc_info.value.errors(include_url=False) == [ {'type': 'int_type', 'loc': ('x',), 'msg': 'Input should be a valid integer', 'input': '1'} ] assert StrictModel.model_validate_json(json.dumps({'x': '1'}), strict=False) == StrictModel(x=1) with pytest.raises(ValidationError) as exc_info: StrictModel.model_validate_json(json.dumps({'x': '1'}), strict=True) assert exc_info.value.errors(include_url=False) == [ {'type': 'int_type', 'loc': ('x',), 'msg': 'Input should be a valid integer', 'input': '1'} ] def test_validate_python_context() -> None: contexts: List[Any] = [None, None, {'foo': 'bar'}] class Model(BaseModel): x: int @field_validator('x') def val_x(cls, v: int, info: ValidationInfo) -> int: assert info.context == contexts.pop(0) return v Model.model_validate({'x': 1}) Model.model_validate({'x': 1}, context=None) Model.model_validate({'x': 1}, context={'foo': 'bar'}) assert contexts == [] def test_validate_json_context() -> None: contexts: List[Any] = [None, None, {'foo': 'bar'}] class Model(BaseModel): x: int @field_validator('x') def val_x(cls, v: int, info: ValidationInfo) -> int: assert info.context == contexts.pop(0) return v Model.model_validate_json(json.dumps({'x': 1})) Model.model_validate_json(json.dumps({'x': 1}), context=None) Model.model_validate_json(json.dumps({'x': 1}), context={'foo': 'bar'}) assert contexts == [] def test_pydantic_init_subclass() -> None: calls = [] class MyModel(BaseModel): def __init_subclass__(cls, **kwargs): super().__init_subclass__() # can't pass kwargs to object.__init_subclass__, weirdly calls.append((cls.__name__, '__init_subclass__', kwargs)) @classmethod def __pydantic_init_subclass__(cls, **kwargs): 
super().__pydantic_init_subclass__(**kwargs) calls.append((cls.__name__, '__pydantic_init_subclass__', kwargs)) class MySubModel(MyModel, a=1): pass assert calls == [ ('MySubModel', '__init_subclass__', {'a': 1}), ('MySubModel', '__pydantic_init_subclass__', {'a': 1}), ] def test_model_validate_with_context(): class InnerModel(BaseModel): x: int @field_validator('x') def validate(cls, value, info): return value * info.context.get('multiplier', 1) class OuterModel(BaseModel): inner: InnerModel assert OuterModel.model_validate({'inner': {'x': 2}}, context={'multiplier': 1}).inner.x == 2 assert OuterModel.model_validate({'inner': {'x': 2}}, context={'multiplier': 2}).inner.x == 4 assert OuterModel.model_validate({'inner': {'x': 2}}, context={'multiplier': 3}).inner.x == 6 def test_extra_equality(): class MyModel(BaseModel, extra='allow'): pass assert MyModel(x=1) != MyModel() def test_equality_delegation(): from unittest.mock import ANY class MyModel(BaseModel): foo: str assert MyModel(foo='bar') == ANY def test_recursion_loop_error(): class Model(BaseModel): x: List['Model'] data = {'x': []} data['x'].append(data) with pytest.raises(ValidationError) as exc_info: Model(**data) assert repr(exc_info.value.errors(include_url=False)[0]) == ( "{'type': 'recursion_loop', 'loc': ('x', 0, 'x', 0), 'msg': " "'Recursion error - cyclic reference detected', 'input': {'x': [{...}]}}" ) def test_protected_namespace_default(): with pytest.warns( UserWarning, match='Field "model_dump_something" in Model has conflict with protected namespace "model_dump"' ): class Model(BaseModel): model_dump_something: str def test_custom_protected_namespace(): with pytest.warns(UserWarning, match='Field "test_field" in Model has conflict with protected namespace "test_"'): class Model(BaseModel): # this field won't raise error because we changed the default value for the # `protected_namespaces` config. 
model_prefixed_field: str test_field: str model_config = ConfigDict(protected_namespaces=('test_',)) def test_multiple_protected_namespace(): with pytest.warns( UserWarning, match='Field "also_protect_field" in Model has conflict with protected namespace "also_protect_"' ): class Model(BaseModel): also_protect_field: str model_config = ConfigDict(protected_namespaces=('protect_me_', 'also_protect_')) def test_protected_namespace_pattern() -> None: with pytest.warns(UserWarning, match=r'Field "perfect_match" in Model has conflict with protected namespace .*'): class Model(BaseModel): perfect_match: str model_config = ConfigDict(protected_namespaces=(re.compile(r'^perfect_match$'),)) def test_model_get_core_schema() -> None: class Model(BaseModel): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: schema = handler(int) schema.pop('metadata', None) # we don't care about this in tests assert schema == {'type': 'int'} schema = handler.generate_schema(int) schema.pop('metadata', None) # we don't care about this in tests assert schema == {'type': 'int'} return handler(source_type) Model() def test_nested_types_ignored(): from pydantic import BaseModel class NonNestedType: pass # Defining a nested type does not error class GoodModel(BaseModel): class NestedType: pass # You can still store such types on the class by annotating as a ClassVar MyType: ClassVar[Type[Any]] = NonNestedType # For documentation: you _can_ give multiple names to a nested type and it won't error: # It might be better if it did, but this seems to be rare enough that I'm not concerned x = NestedType assert GoodModel.MyType is NonNestedType assert GoodModel.x is GoodModel.NestedType with pytest.raises(PydanticUserError, match='A non-annotated attribute was detected'): class BadModel(BaseModel): x = NonNestedType def test_validate_python_from_attributes() -> None: class Model(BaseModel): x: int class ModelFromAttributesTrue(Model): 
model_config = ConfigDict(from_attributes=True) class ModelFromAttributesFalse(Model): model_config = ConfigDict(from_attributes=False) @dataclass class UnrelatedClass: x: int = 1 input = UnrelatedClass(1) for from_attributes in (False, None): with pytest.raises(ValidationError) as exc_info: Model.model_validate(UnrelatedClass(), from_attributes=from_attributes) assert exc_info.value.errors(include_url=False) == [ { 'type': 'model_type', 'loc': (), 'msg': 'Input should be a valid dictionary or instance of Model', 'input': input, 'ctx': {'class_name': 'Model'}, } ] res = Model.model_validate(UnrelatedClass(), from_attributes=True) assert res == Model(x=1) with pytest.raises(ValidationError) as exc_info: ModelFromAttributesTrue.model_validate(UnrelatedClass(), from_attributes=False) assert exc_info.value.errors(include_url=False) == [ { 'type': 'model_type', 'loc': (), 'msg': 'Input should be a valid dictionary or instance of ModelFromAttributesTrue', 'input': input, 'ctx': {'class_name': 'ModelFromAttributesTrue'}, } ] for from_attributes in (True, None): res = ModelFromAttributesTrue.model_validate(UnrelatedClass(), from_attributes=from_attributes) assert res == ModelFromAttributesTrue(x=1) for from_attributes in (False, None): with pytest.raises(ValidationError) as exc_info: ModelFromAttributesFalse.model_validate(UnrelatedClass(), from_attributes=from_attributes) assert exc_info.value.errors(include_url=False) == [ { 'type': 'model_type', 'loc': (), 'msg': 'Input should be a valid dictionary or instance of ModelFromAttributesFalse', 'input': input, 'ctx': {'class_name': 'ModelFromAttributesFalse'}, } ] res = ModelFromAttributesFalse.model_validate(UnrelatedClass(), from_attributes=True) assert res == ModelFromAttributesFalse(x=1) @pytest.mark.parametrize( 'field_type,input_value,expected,raises_match,strict', [ (bool, 'true', True, None, False), (bool, 'true', True, None, True), (bool, 'false', False, None, False), (bool, 'e', ValidationError, 
'type=bool_parsing', False), (int, '1', 1, None, False), (int, '1', 1, None, True), (int, 'xxx', ValidationError, 'type=int_parsing', True), (float, '1.1', 1.1, None, False), (float, '1.10', 1.1, None, False), (float, '1.1', 1.1, None, True), (float, '1.10', 1.1, None, True), (date, '2017-01-01', date(2017, 1, 1), None, False), (date, '2017-01-01', date(2017, 1, 1), None, True), (date, '2017-01-01T12:13:14.567', ValidationError, 'type=date_from_datetime_inexact', False), (date, '2017-01-01T12:13:14.567', ValidationError, 'type=date_parsing', True), (date, '2017-01-01T00:00:00', date(2017, 1, 1), None, False), (date, '2017-01-01T00:00:00', ValidationError, 'type=date_parsing', True), (datetime, '2017-01-01T12:13:14.567', datetime(2017, 1, 1, 12, 13, 14, 567_000), None, False), (datetime, '2017-01-01T12:13:14.567', datetime(2017, 1, 1, 12, 13, 14, 567_000), None, True), ], ids=repr, ) def test_model_validate_strings(field_type, input_value, expected, raises_match, strict): class Model(BaseModel): x: field_type if raises_match is not None: with pytest.raises(expected, match=raises_match): Model.model_validate_strings({'x': input_value}, strict=strict) else: assert Model.model_validate_strings({'x': input_value}, strict=strict).x == expected @pytest.mark.parametrize('strict', [True, False]) def test_model_validate_strings_dict(strict): class Model(BaseModel): x: Dict[int, date] assert Model.model_validate_strings({'x': {'1': '2017-01-01', '2': '2017-01-02'}}, strict=strict).x == { 1: date(2017, 1, 1), 2: date(2017, 1, 2), } def test_model_signature_annotated() -> None: class Model(BaseModel): x: Annotated[int, 123] # we used to accidentally convert `__metadata__` to a list # which caused things like `typing.get_args()` to fail assert Model.__signature__.parameters['x'].annotation.__metadata__ == (123,) def test_get_core_schema_unpacks_refs_for_source_type() -> None: # use a list to track since we end up calling `__get_pydantic_core_schema__` multiple times for models # 
e.g. InnerModel.__get_pydantic_core_schema__ gets called: # 1. When InnerModel is defined # 2. When OuterModel is defined # 3. When we use the TypeAdapter received_schemas: dict[str, list[str]] = defaultdict(list) @dataclass class Marker: name: str def __get_pydantic_core_schema__(self, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: schema = handler(source_type) received_schemas[self.name].append(schema['type']) return schema class InnerModel(BaseModel): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: schema = handler(source_type) received_schemas['InnerModel'].append(schema['type']) schema['metadata'] = schema.get('metadata', {}) schema['metadata']['foo'] = 'inner was here!' return deepcopy(schema) class OuterModel(BaseModel): inner: Annotated[InnerModel, Marker('Marker("inner")')] @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: schema = handler(source_type) received_schemas['OuterModel'].append(schema['type']) return schema ta = TypeAdapter(Annotated[OuterModel, Marker('Marker("outer")')]) # super hacky check but it works in all cases and avoids a complex and fragile iteration over CoreSchema # the point here is to verify that `__get_pydantic_core_schema__` assert 'inner was here' in str(ta.core_schema) assert received_schemas == { 'InnerModel': ['model', 'model', 'model'], 'Marker("inner")': ['definition-ref', 'definition-ref'], 'OuterModel': ['model', 'model'], 'Marker("outer")': ['definition-ref'], } def test_get_core_schema_return_new_ref() -> None: class InnerModel(BaseModel): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: schema = handler(source_type) schema = deepcopy(schema) schema['metadata'] = schema.get('metadata', {}) schema['metadata']['foo'] = 'inner was here!' 
return deepcopy(schema) class OuterModel(BaseModel): inner: InnerModel x: int = 1 @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: schema = handler(source_type) def set_x(m: 'OuterModel') -> 'OuterModel': m.x += 1 return m return core_schema.no_info_after_validator_function(set_x, schema, ref=schema.pop('ref')) cs = OuterModel.__pydantic_core_schema__ # super hacky check but it works in all cases and avoids a complex and fragile iteration over CoreSchema # the point here is to verify that `__get_pydantic_core_schema__` assert 'inner was here' in str(cs) assert OuterModel(inner=InnerModel()).x == 2 def test_resolve_def_schema_from_core_schema() -> None: class Inner(BaseModel): x: int class Marker: def __get_pydantic_core_schema__(self, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: schema = handler(source_type) resolved = handler.resolve_ref_schema(schema) assert resolved['type'] == 'model' assert resolved['cls'] is Inner def modify_inner(v: Inner) -> Inner: v.x += 1 return v return core_schema.no_info_after_validator_function(modify_inner, schema) class Outer(BaseModel): inner: Annotated[Inner, Marker()] assert Outer.model_validate({'inner': {'x': 1}}).inner.x == 2 def test_extra_validator_scalar() -> None: class Model(BaseModel): model_config = ConfigDict(extra='allow') class Child(Model): __pydantic_extra__: Dict[str, int] m = Child(a='1') assert m.__pydantic_extra__ == {'a': 1} # insert_assert(Child.model_json_schema()) assert Child.model_json_schema() == { 'additionalProperties': {'type': 'integer'}, 'properties': {}, 'title': 'Child', 'type': 'object', } def test_extra_validator_field() -> None: class Model(BaseModel, extra='allow'): # use Field(init=False) to ensure this is not treated as a field by dataclass_transform __pydantic_extra__: Dict[str, int] = Field(init=False) m = Model(a='1') assert m.__pydantic_extra__ == {'a': 1} with pytest.raises(ValidationError) as exc_info: 
Model(a='a') assert exc_info.value.errors(include_url=False) == [ { 'input': 'a', 'loc': ('a',), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', } ] # insert_assert(Child.model_json_schema()) assert Model.model_json_schema() == { 'additionalProperties': {'type': 'integer'}, 'properties': {}, 'title': 'Model', 'type': 'object', } def test_extra_validator_named() -> None: class Foo(BaseModel): x: int class Model(BaseModel): model_config = ConfigDict(extra='allow') __pydantic_extra__: 'dict[str, Foo]' class Child(Model): y: int m = Child(a={'x': '1'}, y=2) assert m.__pydantic_extra__ == {'a': Foo(x=1)} # insert_assert(Child.model_json_schema()) assert Child.model_json_schema() == { '$defs': { 'Foo': { 'properties': {'x': {'title': 'X', 'type': 'integer'}}, 'required': ['x'], 'title': 'Foo', 'type': 'object', } }, 'additionalProperties': {'$ref': '#/$defs/Foo'}, 'properties': {'y': {'title': 'Y', 'type': 'integer'}}, 'required': ['y'], 'title': 'Child', 'type': 'object', } def test_super_getattr_extra(): class Model(BaseModel): model_config = {'extra': 'allow'} def __getattr__(self, item): if item == 'test': return 'success' return super().__getattr__(item) m = Model(x=1) assert m.x == 1 with pytest.raises(AttributeError): m.y assert m.test == 'success' def test_super_getattr_private(): class Model(BaseModel): _x: int = PrivateAttr() def __getattr__(self, item): if item == 'test': return 'success' else: return super().__getattr__(item) m = Model() m._x = 1 assert m._x == 1 with pytest.raises(AttributeError): m._y assert m.test == 'success' def test_super_delattr_extra(): test_calls = [] class Model(BaseModel): model_config = {'extra': 'allow'} def __delattr__(self, item): if item == 'test': test_calls.append('success') else: super().__delattr__(item) m = Model(x=1) assert m.x == 1 del m.x with pytest.raises(AttributeError): m._x assert test_calls == [] del m.test assert test_calls == ['success'] def 
test_super_delattr_private(): test_calls = [] class Model(BaseModel): _x: int = PrivateAttr() def __delattr__(self, item): if item == 'test': test_calls.append('success') else: super().__delattr__(item) m = Model() m._x = 1 assert m._x == 1 del m._x with pytest.raises(AttributeError): m._x assert test_calls == [] del m.test assert test_calls == ['success'] def test_arbitrary_types_not_a_type() -> None: """https://github.com/pydantic/pydantic/issues/6477""" class Foo: pass class Bar: pass with pytest.warns(UserWarning, match='is not a Python type'): ta = TypeAdapter(Foo(), config=ConfigDict(arbitrary_types_allowed=True)) bar = Bar() assert ta.validate_python(bar) is bar @pytest.mark.parametrize('is_dataclass', [False, True]) def test_deferred_core_schema(is_dataclass: bool) -> None: if is_dataclass: @pydantic_dataclass class Foo: x: 'Bar' else: class Foo(BaseModel): x: 'Bar' assert isinstance(Foo.__pydantic_core_schema__, MockCoreSchema) with pytest.raises(PydanticUserError, match='`Foo` is not fully defined'): Foo.__pydantic_core_schema__['type'] class Bar(BaseModel): pass assert Foo.__pydantic_core_schema__['type'] == ('dataclass' if is_dataclass else 'model') assert isinstance(Foo.__pydantic_core_schema__, dict) def test_help(create_module): module = create_module( # language=Python """ import pydoc from pydantic import BaseModel class Model(BaseModel): x: int help_result_string = pydoc.render_doc(Model) """ ) assert 'class Model' in module.help_result_string def test_cannot_use_leading_underscore_field_names(): with pytest.raises( NameError, match="Fields must not use names with leading underscores; e.g., use 'x' instead of '_x'" ): class Model1(BaseModel): _x: int = Field(alias='x') with pytest.raises( NameError, match="Fields must not use names with leading underscores; e.g., use 'x__' instead of '__x__'" ): class Model2(BaseModel): __x__: int = Field() with pytest.raises( NameError, match="Fields must not use names with leading underscores; e.g., use 
'my_field' instead of '___'" ): class Model3(BaseModel): ___: int = Field(default=1) def test_customize_type_constraints_order() -> None: class Model(BaseModel): # whitespace will be stripped first, then max length will be checked, should pass on ' 1 ' x: Annotated[str, AfterValidator(lambda x: x.strip()), StringConstraints(max_length=1)] # max length will be checked first, then whitespace will be stripped, should fail on ' 1 ' y: Annotated[str, StringConstraints(max_length=1), AfterValidator(lambda x: x.strip())] with pytest.raises(ValidationError) as exc_info: Model(x=' 1 ', y=' 1 ') # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'string_too_long', 'loc': ('y',), 'msg': 'String should have at most 1 character', 'input': ' 1 ', 'ctx': {'max_length': 1}, } ] def test_shadow_attribute() -> None: """https://github.com/pydantic/pydantic/issues/7108""" class Model(BaseModel): foo: str @classmethod def __pydantic_init_subclass__(cls, **kwargs: Any): super().__pydantic_init_subclass__(**kwargs) for key in cls.model_fields.keys(): setattr(cls, key, getattr(cls, key, '') + ' edited!') class One(Model): foo: str = 'abc' with pytest.warns(UserWarning, match=r'"foo" in ".*Two" shadows an attribute in parent ".*One"'): class Two(One): foo: str with pytest.warns(UserWarning, match=r'"foo" in ".*Three" shadows an attribute in parent ".*One"'): class Three(One): foo: str = 'xyz' # unlike dataclasses BaseModel does not preserve the value of defaults # so when we access the attribute in `Model.__pydantic_init_subclass__` there is no default # and hence we append `edited!` to an empty string # we've talked about changing this but this is the current behavior as of this test assert getattr(Model, 'foo', None) is None assert getattr(One, 'foo', None) == ' edited!' assert getattr(Two, 'foo', None) == ' edited! edited!' assert getattr(Three, 'foo', None) == ' edited! edited!' 
def test_shadow_attribute_warn_for_redefined_fields() -> None: """https://github.com/pydantic/pydantic/issues/9107""" # A simple class which defines a field class Parent: foo: bool = False # When inheriting from the parent class, as long as the field is not defined at all, there should be no warning # about shadowed fields. with warnings.catch_warnings(record=True) as captured_warnings: # Start capturing all warnings warnings.simplefilter('always') class ChildWithoutRedefinedField(BaseModel, Parent): pass # Check that no warnings were captured assert len(captured_warnings) == 0 # But when inheriting from the parent class and a parent field is redefined, a warning should be raised about # shadowed fields irrespective of whether it is defined with a type that is still compatible or narrower, or # with a different default that is still compatible with the type definition. with pytest.warns( UserWarning, match=r'"foo" in ".*ChildWithRedefinedField" shadows an attribute in parent ".*Parent"', ): class ChildWithRedefinedField(BaseModel, Parent): foo: bool = True def test_eval_type_backport(): class Model(BaseModel): foo: 'list[int | str]' assert Model(foo=[1, '2']).model_dump() == {'foo': [1, '2']} with pytest.raises(ValidationError) as exc_info: Model(foo='not a list') # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'list_type', 'loc': ('foo',), 'msg': 'Input should be a valid list', 'input': 'not a list', } ] with pytest.raises(ValidationError) as exc_info: Model(foo=[{'not a str or int'}]) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_type', 'loc': ('foo', 0, 'int'), 'msg': 'Input should be a valid integer', 'input': {'not a str or int'}, }, { 'type': 'string_type', 'loc': ('foo', 0, 'str'), 'msg': 'Input should be a valid string', 'input': {'not a str or int'}, }, ] def test_inherited_class_vars(create_module): 
@create_module def module(): import typing from pydantic import BaseModel class Base(BaseModel): CONST1: 'typing.ClassVar[str]' = 'a' CONST2: 'ClassVar[str]' = 'b' class Child(module.Base): pass assert Child.CONST1 == 'a' assert Child.CONST2 == 'b' def test_schema_valid_for_inner_generic() -> None: T = TypeVar('T') class Inner(BaseModel, Generic[T]): x: T class Outer(BaseModel): inner: Inner[int] assert Outer(inner={'x': 1}).inner.x == 1 # confirming that the typevars are substituted in the outer model schema assert Outer.__pydantic_core_schema__['schema']['fields']['inner']['schema']['cls'] == Inner[int] assert ( Outer.__pydantic_core_schema__['schema']['fields']['inner']['schema']['schema']['fields']['x']['schema']['type'] == 'int' ) def test_validation_works_for_cyclical_forward_refs() -> None: class X(BaseModel): y: Union['Y', None] class Y(BaseModel): x: Union[X, None] assert Y(x={'y': None}).x.y is None def test_model_construct_with_model_post_init_and_model_copy() -> None: class Model(BaseModel): id: int def model_post_init(self, context: Any) -> None: super().model_post_init(context) m = Model.model_construct(id=1) copy = m.model_copy(deep=True) assert m == copy assert id(m) != id(copy) def test_subclassing_gen_schema_warns() -> None: with pytest.warns(UserWarning, match='Subclassing `GenerateSchema` is not supported.'): class MyGenSchema(GenerateSchema): ... def test_nested_v1_model_warns() -> None: with pytest.warns( UserWarning, match=r'Mixing V1 models and V2 models \(or constructs, like `TypeAdapter`\) is not supported. 
Please upgrade `V1Model` to V2.', ): class V1Model(BaseModelV1): a: int class V2Model(BaseModel): inner: V1Model @pytest.mark.skipif(sys.version_info < (3, 13), reason='requires python 3.13') def test_replace() -> None: from copy import replace class Model(BaseModel): x: int y: int m = Model(x=1, y=2) assert replace(m, x=3) == Model(x=3, y=2) pydantic-2.10.6/tests/test_meta.py000066400000000000000000000006141474456633400171350ustar00rootroot00000000000000"""Meta tests, testing the test utils and fixtures.""" import pytest from typing_extensions import Annotated from pydantic import TypeAdapter from pydantic.json_schema import WithJsonSchema @pytest.mark.xfail(reason='Invalid JSON Schemas are expected to fail.') def test_invalid_json_schema_raises() -> None: TypeAdapter(Annotated[int, WithJsonSchema({'type': 'invalid'})]).json_schema() pydantic-2.10.6/tests/test_migration.py000066400000000000000000000033331474456633400202010ustar00rootroot00000000000000import importlib import pytest from pydantic._migration import DEPRECATED_MOVED_IN_V2, MOVED_IN_V2, REDIRECT_TO_V1, REMOVED_IN_V2, getattr_migration from pydantic.errors import PydanticImportError def import_from(dotted_path: str): if ':' in dotted_path: module, obj_name = dotted_path.rsplit(':', 1) module = importlib.import_module(module) return getattr(module, obj_name) else: return importlib.import_module(dotted_path) @pytest.mark.filterwarnings('ignore::UserWarning') @pytest.mark.parametrize('module', MOVED_IN_V2.keys()) def test_moved_on_v2(module: str): import_from(module) @pytest.mark.parametrize('module', DEPRECATED_MOVED_IN_V2.keys()) def test_moved_but_not_warn_on_v2(module: str): import_from(module) @pytest.mark.filterwarnings('ignore::UserWarning') @pytest.mark.parametrize('module', REDIRECT_TO_V1.keys()) def test_redirect_to_v1(module: str): import_from(module) @pytest.mark.parametrize('module', REMOVED_IN_V2) def test_removed_on_v2(module: str): with pytest.raises(PydanticImportError, match=f'`{module}` 
has been removed in V2.'): import_from(module) assert False, f'{module} should not be importable' def test_base_settings_removed(): with pytest.raises(PydanticImportError, match='`BaseSettings` has been moved to the `pydantic-settings` package. '): import_from('pydantic:BaseSettings') assert False, 'pydantic:BaseSettings should not be importable' def test_getattr_migration(): get_attr = getattr_migration(__name__) assert callable(get_attr('test_getattr_migration')) is True with pytest.raises(AttributeError, match="module 'tests.test_migration' has no attribute 'foo'"): get_attr('foo') pydantic-2.10.6/tests/test_model_signature.py000066400000000000000000000136021474456633400213710ustar00rootroot00000000000000import sys from inspect import Parameter, Signature, signature from typing import Any, Generic, Iterable, Optional, TypeVar, Union import pytest from typing_extensions import Annotated from pydantic import BaseModel, ConfigDict, Field, create_model from pydantic._internal._typing_extra import is_annotated def _equals(a: Union[str, Iterable[str]], b: Union[str, Iterable[str]]) -> bool: """ compare strings with spaces removed """ if isinstance(a, str) and isinstance(b, str): return a.replace(' ', '') == b.replace(' ', '') elif isinstance(a, Iterable) and isinstance(b, Iterable): return all(_equals(a_, b_) for a_, b_ in zip(a, b)) else: raise TypeError(f'arguments must be both strings or both lists, not {type(a)}, {type(b)}') def test_model_signature(): class Model(BaseModel): a: float = Field(title='A') b: int = Field(10) c: int = Field(default_factory=lambda: 1) sig = signature(Model) assert sig != signature(BaseModel) assert _equals(map(str, sig.parameters.values()), ('a: float', 'b: int = 10', 'c: int = ')) assert _equals(str(sig), '(*, a: float, b: int = 10, c: int = ) -> None') def test_generic_model_signature(): T = TypeVar('T') class Model(BaseModel, Generic[T]): a: T sig = signature(Model[int]) assert sig != signature(BaseModel) assert _equals(map(str, 
sig.parameters.values()), ('a: int',))
    assert _equals(str(sig), '(*, a: int) -> None')


def test_custom_init_signature():
    class MyModel(BaseModel):
        id: int
        name: str = 'John Doe'
        f__: str = Field(alias='foo')
        model_config = ConfigDict(extra='allow')

        def __init__(self, id: int = 1, bar=2, *, baz: Any, **data):
            super().__init__(id=id, **data)
            self.bar = bar
            self.baz = baz

    sig = signature(MyModel)
    assert _equals(
        map(str, sig.parameters.values()),
        ('id: int = 1', 'bar=2', 'baz: Any', "name: str = 'John Doe'", 'foo: str', '**data'),
    )
    assert _equals(str(sig), "(id: int = 1, bar=2, *, baz: Any, name: str = 'John Doe', foo: str, **data) -> None")


def test_custom_init_signature_with_no_var_kw():
    class Model(BaseModel):
        a: float
        b: int = 2
        c: int

        def __init__(self, a: float, b: int):
            super().__init__(a=a, b=b, c=1)

        model_config = ConfigDict(extra='allow')

    assert _equals(str(signature(Model)), '(a: float, b: int) -> None')


def test_invalid_identifiers_signature():
    model = create_model(
        'Model',
        **{'123 invalid identifier!': (int, Field(123, alias='valid_identifier')), '!': (int, Field(0, alias='yeah'))},
    )
    assert _equals(str(signature(model)), '(*, valid_identifier: int = 123, yeah: int = 0) -> None')

    model = create_model('Model', **{'123 invalid identifier!': (int, 123), '!': (int, Field(0, alias='yeah'))})
    assert _equals(str(signature(model)), '(*, yeah: int = 0, **extra_data: Any) -> None')


def test_use_field_name():
    class Foo(BaseModel):
        foo: str = Field(alias='this is invalid')
        model_config = ConfigDict(populate_by_name=True)

    assert _equals(str(signature(Foo)), '(*, foo: str) -> None')


def test_does_not_use_reserved_word():
    class Foo(BaseModel):
        from_: str = Field(alias='from')
        model_config = ConfigDict(populate_by_name=True)

    assert _equals(str(signature(Foo)), '(*, from_: str) -> None')


def test_extra_allow_no_conflict():
    class Model(BaseModel):
        spam: str
        model_config = ConfigDict(extra='allow')

    assert _equals(str(signature(Model)), '(*, spam: str, **extra_data: Any) -> None')


def test_extra_allow_conflict():
    class Model(BaseModel):
        extra_data: str
        model_config = ConfigDict(extra='allow')

    assert _equals(str(signature(Model)), '(*, extra_data: str, **extra_data_: Any) -> None')


def test_extra_allow_conflict_twice():
    class Model(BaseModel):
        extra_data: str
        extra_data_: str
        model_config = ConfigDict(extra='allow')

    assert _equals(str(signature(Model)), '(*, extra_data: str, extra_data_: str, **extra_data__: Any) -> None')


def test_extra_allow_conflict_custom_signature():
    class Model(BaseModel):
        extra_data: int

        def __init__(self, extra_data: int = 1, **foobar: Any):
            super().__init__(extra_data=extra_data, **foobar)

        model_config = ConfigDict(extra='allow')

    assert _equals(str(signature(Model)), '(extra_data: int = 1, **foobar: Any) -> None')


def test_signature_is_class_only():
    class Model(BaseModel):
        foo: int = 123

        def __call__(self, a: int) -> bool:
            pass

    assert _equals(str(signature(Model)), '(*, foo: int = 123) -> None')
    assert _equals(str(signature(Model())), '(a: int) -> bool')
    assert not hasattr(Model(), '__signature__')


def test_optional_field():
    class Model(BaseModel):
        foo: Optional[int] = None

    assert signature(Model) == Signature(
        [Parameter('foo', Parameter.KEYWORD_ONLY, default=None, annotation=Optional[int])], return_annotation=None
    )


@pytest.mark.skipif(sys.version_info < (3, 12), reason='repr different on older versions')
def test_annotated_field():
    from annotated_types import Gt

    class Model(BaseModel):
        foo: Annotated[int, Gt(1)] = 1

    sig = signature(Model)
    assert str(sig) == '(*, foo: Annotated[int, Gt(gt=1)] = 1) -> None'
    # check that the `Annotated` we created is a valid `Annotated`
    assert is_annotated(sig.parameters['foo'].annotation)


@pytest.mark.skipif(sys.version_info < (3, 10), reason='repr different on older versions')
def test_annotated_optional_field():
    from annotated_types import Gt

    class Model(BaseModel):
        foo: Annotated[Optional[int], Gt(1)] = None

    assert str(signature(Model)) == '(*, foo: Annotated[Optional[int], Gt(gt=1)] = None) -> None'


# ==== pydantic-2.10.6/tests/test_model_validator.py ====

from __future__ import annotations

from typing import Any, Dict, cast

import pytest

from pydantic import BaseModel, ValidationInfo, ValidatorFunctionWrapHandler, model_validator


def test_model_validator_wrap() -> None:
    class Model(BaseModel):
        x: int
        y: int

        @model_validator(mode='wrap')
        @classmethod
        def val_model(cls, values: dict[str, Any] | Model, handler: ValidatorFunctionWrapHandler) -> Model:
            if isinstance(values, dict):
                assert values == {'x': 1, 'y': 2}
                model = handler({'x': 2, 'y': 3})
            else:
                assert values.x == 1
                assert values.y == 2
                model = handler(Model.model_construct(x=2, y=3))
            assert model.x == 2
            assert model.y == 3
            model.x = 20
            model.y = 30
            return model

    assert Model(x=1, y=2).model_dump() == {'x': 20, 'y': 30}
    assert Model.model_validate(Model.model_construct(x=1, y=2)).model_dump() == {'x': 20, 'y': 30}


@pytest.mark.parametrize('classmethod_decorator', [classmethod, lambda x: x])
def test_model_validator_before(classmethod_decorator: Any) -> None:
    class Model(BaseModel):
        x: int
        y: int

        @model_validator(mode='before')
        @classmethod_decorator
        def val_model(cls, values: Any, info: ValidationInfo) -> dict[str, Any] | Model:
            assert not info.context
            if isinstance(values, dict):
                values = cast(Dict[str, Any], values)
                values['x'] += 1
                values['y'] += 1
            else:
                assert isinstance(values, Model)
                values.x += 1
                values.y += 1
            return values

    m = Model(x=1, y=2)
    assert m.model_dump() == {'x': 2, 'y': 3}
    # model not changed because we don't revalidate m
    assert Model.model_validate(m).model_dump() == {'x': 2, 'y': 3}


@pytest.mark.parametrize('classmethod_decorator', [classmethod, lambda x: x])
def test_model_validator_before_revalidate_always(classmethod_decorator: Any) -> None:
    class Model(BaseModel, revalidate_instances='always'):
        x: int
        y: int

        @model_validator(mode='before')
        @classmethod_decorator
        def val_model(cls, values: Any, info: ValidationInfo) -> dict[str, Any] | Model:
            assert not info.context
            if isinstance(values, dict):
                values = cast(Dict[str, Any], values)
                values['x'] += 1
                values['y'] += 1
            else:
                assert isinstance(values, Model)
                values.x += 1
                values.y += 1
            return values

    assert Model(x=1, y=2).model_dump() == {'x': 2, 'y': 3}
    assert Model.model_validate(Model(x=1, y=2)).model_dump() == {'x': 3, 'y': 4}


def test_model_validator_after() -> None:
    class Model(BaseModel):
        x: int
        y: int

        @model_validator(mode='after')
        def val_model(self, info: ValidationInfo) -> Model:
            assert not info.context
            self.x += 1
            self.y += 1
            return self

    assert Model(x=1, y=2).model_dump() == {'x': 2, 'y': 3}
    assert Model.model_validate(Model(x=1, y=2)).model_dump() == {'x': 3, 'y': 4}


def test_subclass() -> None:
    class Human(BaseModel):
        @model_validator(mode='before')
        @classmethod
        def run_model_validator(cls, values: dict[str, Any]) -> dict[str, Any]:
            values['age'] *= 2
            return values

    class Person(Human):
        age: int

    assert Person(age=28).age == 56


def test_nested_models() -> None:
    calls: list[str] = []

    class Model(BaseModel):
        inner: Model | None

        @model_validator(mode='before')
        @classmethod
        def validate_model_before(cls, values: dict[str, Any]) -> dict[str, Any]:
            calls.append('before')
            return values

        @model_validator(mode='after')
        def validate_model_after(self) -> Model:
            calls.append('after')
            return self

    Model.model_validate({'inner': None})
    assert calls == ['before', 'after']
    calls.clear()

    Model.model_validate({'inner': {'inner': {'inner': None}}})
    assert calls == ['before'] * 3 + ['after'] * 3
    calls.clear()


# ==== pydantic-2.10.6/tests/test_networks.py ====

import json
from typing import Any, Union

import pytest
from pydantic_core import PydanticCustomError, PydanticSerializationError, Url
from typing_extensions import Annotated

from pydantic import (
    AfterValidator,
    AmqpDsn,
    AnyHttpUrl,
    AnyUrl,
    BaseModel,
    ClickHouseDsn,
    CockroachDsn,
    Field,
    FileUrl,
    FtpUrl,
    HttpUrl,
    KafkaDsn,
    MariaDBDsn,
    MongoDsn,
    MySQLDsn,
    NameEmail,
    NatsDsn,
    PostgresDsn,
    RedisDsn,
    SnowflakeDsn,
    Strict,
    TypeAdapter,
    UrlConstraints,
    ValidationError,
    WebsocketUrl,
)
from pydantic.networks import import_email_validator, validate_email

try:
    import email_validator
except ImportError:
    email_validator = None


@pytest.mark.parametrize(
    'value',
    [
        'http://example.org',
        'http://test',
        'http://localhost',
        'https://example.org/whatever/next/',
        'postgres://user:pass@localhost:5432/app',
        'postgres://just-user@localhost:5432/app',
        'postgresql+asyncpg://user:pass@localhost:5432/app',
        'postgresql+pg8000://user:pass@localhost:5432/app',
        'postgresql+psycopg://postgres:postgres@localhost:5432/hatch',
        'postgresql+psycopg2://postgres:postgres@localhost:5432/hatch',
        'postgresql+psycopg2cffi://user:pass@localhost:5432/app',
        'postgresql+py-postgresql://user:pass@localhost:5432/app',
        'postgresql+pygresql://user:pass@localhost:5432/app',
        'mysql://user:pass@localhost:3306/app',
        'mysql+mysqlconnector://user:pass@localhost:3306/app',
        'mysql+aiomysql://user:pass@localhost:3306/app',
        'mysql+asyncmy://user:pass@localhost:3306/app',
        'mysql+mysqldb://user:pass@localhost:3306/app',
        'mysql+pymysql://user:pass@localhost:3306/app?charset=utf8mb4',
        'mysql+cymysql://user:pass@localhost:3306/app',
        'mysql+pyodbc://user:pass@localhost:3306/app',
        'mariadb://user:pass@localhost:3306/app',
        'mariadb+mariadbconnector://user:pass@localhost:3306/app',
        'mariadb+pymysql://user:pass@localhost:3306/app',
        'snowflake://user:pass@myorganization-myaccount',
        'snowflake://user:pass@myorganization-myaccount/testdb/public?warehouse=testwh&role=myrole',
        'foo-bar://example.org',
        'foo.bar://example.org',
        'foo0bar://example.org',
        'https://example.org',
        'http://localhost',
        'http://localhost/',
        'http://localhost:8000',
        'http://localhost:8000/',
        'https://foo_bar.example.com/',
        'ftp://example.org',
        'ftps://example.org',
        'http://example.co.jp',
        'http://www.example.com/a%C2%B1b',
        'http://www.example.com/~username/',
        'http://info.example.com?fred',
        'http://info.example.com/?fred',
        'http://xn--mgbh0fb.xn--kgbechtv/',
        'http://example.com/blue/red%3Fand+green',
        'http://www.example.com/?array%5Bkey%5D=value',
        'http://xn--rsum-bpad.example.org/',
        'http://123.45.67.8/',
        'http://123.45.67.8:8329/',
        'http://[2001:db8::ff00:42]:8329',
        'http://[2001::1]:8329',
        'http://[2001:db8::1]/',
        'http://www.example.com:8000/foo',
        'http://www.cwi.nl:80/%7Eguido/Python.html',
        'https://www.python.org/путь',
        'http://андрей@example.com',
        # AnyUrl('https://example.com', scheme='https', host='example.com'),
        'https://exam_ple.com/',
        'http://twitter.com/@handle/',
        'http://11.11.11.11.example.com/action',
        'http://abc.11.11.11.11.example.com/action',
        'http://example#',
        'http://example/#',
        'http://example/#fragment',
        'http://example/?#',
        'http://example.org/path#',
        'http://example.org/path#fragment',
        'http://example.org/path?query#',
        'http://example.org/path?query#fragment',
    ],
)
def test_any_url_success(value):
    class Model(BaseModel):
        v: AnyUrl

    assert Model(v=value).v, value


@pytest.mark.parametrize(
    'value,err_type,err_msg',
    [
        ('http:///', 'url_parsing', 'Input should be a valid URL, empty host'),
        ('http://??', 'url_parsing', 'Input should be a valid URL, empty host'),
        (
            'https://example.org more',
            'url_parsing',
            'Input should be a valid URL, invalid domain character',
        ),
        ('$https://example.org', 'url_parsing', 'Input should be a valid URL, relative URL without a base'),
        ('../icons/logo.gif', 'url_parsing', 'Input should be a valid URL, relative URL without a base'),
        ('abc', 'url_parsing', 'Input should be a valid URL, relative URL without a base'),
        ('..', 'url_parsing', 'Input should be a valid URL, relative URL without a base'),
        ('/', 'url_parsing', 'Input should be a valid URL, relative URL without a base'),
        ('+http://example.com/', 'url_parsing', 'Input should be a valid URL, relative URL without a base'),
        ('ht*tp://example.com/', 'url_parsing', 'Input should be a valid URL, relative URL without a base'),
        (' ', 'url_parsing', 'Input should be a valid URL, relative URL without a base'),
        ('', 'url_parsing', 'Input should be a valid URL, input is empty'),
        (None, 'url_type', 'URL input should be a string or URL'),
        (
            'http://2001:db8::ff00:42:8329',
            'url_parsing',
            'Input should be a valid URL, invalid port number',
        ),
        ('http://[192.168.1.1]:8329', 'url_parsing', 'Input should be a valid URL, invalid IPv6 address'),
        ('http://example.com:99999', 'url_parsing', 'Input should be a valid URL, invalid port number'),
    ],
)
def test_any_url_invalid(value, err_type, err_msg):
    class Model(BaseModel):
        v: AnyUrl

    with pytest.raises(ValidationError) as exc_info:
        Model(v=value)
    assert len(exc_info.value.errors(include_url=False)) == 1, exc_info.value.errors(include_url=False)
    error = exc_info.value.errors(include_url=False)[0]
    # debug(error)
    assert {'type': error['type'], 'msg': error['msg']} == {'type': err_type, 'msg': err_msg}


def validate_url(s):
    class Model(BaseModel):
        v: AnyUrl

    return Model(v=s).v


def test_any_url_parts():
    url = validate_url('http://example.org')
    assert str(url) == 'http://example.org/'
    assert repr(url) == "AnyUrl('http://example.org/')"
    assert url.scheme == 'http'
    assert url.host == 'example.org'
    assert url.port == 80


def test_url_repr():
    url = validate_url('http://user:password@example.org:1234/the/path/?query=here#fragment=is;this=bit')
    assert str(url) == 'http://user:password@example.org:1234/the/path/?query=here#fragment=is;this=bit'
    assert repr(url) == "AnyUrl('http://user:password@example.org:1234/the/path/?query=here#fragment=is;this=bit')"
    assert url.scheme == 'http'
    assert url.username == 'user'
    assert url.password == 'password'
    assert url.host == 'example.org'
    assert url.port == 1234
    assert url.path == '/the/path/'
    assert url.query == 'query=here'
    assert url.fragment == 'fragment=is;this=bit'


def test_ipv4_port():
    url = validate_url('ftp://123.45.67.8:8329/')
    assert url.scheme == 'ftp'
    assert url.host == '123.45.67.8'
    assert url.port == 8329
    assert url.username is None
    assert url.password is None


def test_ipv4_no_port():
    url = validate_url('ftp://123.45.67.8')
    assert url.scheme == 'ftp'
    assert url.host == '123.45.67.8'
    assert url.port == 21
    assert url.username is None
    assert url.password is None


def test_ipv6_port():
    url = validate_url('wss://[2001:db8::ff00:42]:8329')
    assert url.scheme == 'wss'
    assert url.host == '[2001:db8::ff00:42]'
    assert url.port == 8329


def test_int_domain():
    url = validate_url('https://£££.org')
    assert url.host == 'xn--9aaa.org'
    assert str(url) == 'https://xn--9aaa.org/'


def test_co_uk():
    url = validate_url('http://example.co.uk')
    assert str(url) == 'http://example.co.uk/'
    assert url.scheme == 'http'
    assert url.host == 'example.co.uk'


def test_user_no_password():
    url = validate_url('http://user:@example.org')
    assert url.username == 'user'
    assert url.password is None
    assert url.host == 'example.org'


def test_user_info_no_user():
    url = validate_url('http://:password@example.org')
    assert url.username is None
    assert url.password == 'password'
    assert url.host == 'example.org'


def test_at_in_path():
    url = validate_url('https://twitter.com/@handle')
    assert url.scheme == 'https'
    assert url.host == 'twitter.com'
    assert url.username is None
    assert url.password is None
    assert url.path == '/@handle'


def test_fragment_without_query():
    url = validate_url('https://docs.pydantic.dev/usage/types/#constrained-types')
    assert url.scheme == 'https'
    assert url.host == 'docs.pydantic.dev'
    assert url.path == '/usage/types/'
    assert url.query is None
    assert url.fragment == 'constrained-types'


@pytest.mark.parametrize(
    'value,expected',
    [
        ('http://example.org', 'http://example.org/'),
        ('http://example.org/foobar', 'http://example.org/foobar'),
        ('http://example.org.', 'http://example.org./'),
        ('http://example.org./foobar', 'http://example.org./foobar'),
        ('HTTP://EXAMPLE.ORG', 'http://example.org/'),
        ('https://example.org', 'https://example.org/'),
        ('https://example.org?a=1&b=2', 'https://example.org/?a=1&b=2'),
        ('https://example.org#a=3;b=3', 'https://example.org/#a=3;b=3'),
        ('https://foo_bar.example.com/', 'https://foo_bar.example.com/'),
        ('https://exam_ple.com/', 'https://exam_ple.com/'),
        ('https://example.xn--p1ai', 'https://example.xn--p1ai/'),
        ('https://example.xn--vermgensberatung-pwb', 'https://example.xn--vermgensberatung-pwb/'),
        ('https://example.xn--zfr164b', 'https://example.xn--zfr164b/'),
    ],
)
def test_http_url_success(value, expected):
    class Model(BaseModel):
        v: HttpUrl

    assert str(Model(v=value).v) == expected


def test_nullable_http_url():
    class Model(BaseModel):
        v: Union[HttpUrl, None]

    assert Model(v=None).v is None
    assert str(Model(v='http://example.org').v) == 'http://example.org/'


@pytest.mark.parametrize(
    'value,err_type,err_msg',
    [
        (
            'ftp://example.com/',
            'url_scheme',
            "URL scheme should be 'http' or 'https'",
        ),
        (
            'x' * 2084,
            'url_too_long',
            'URL should have at most 2083 characters',
        ),
    ],
)
def test_http_url_invalid(value, err_type, err_msg):
    class Model(BaseModel):
        v: HttpUrl

    with pytest.raises(ValidationError) as exc_info:
        Model(v=value)
    assert len(exc_info.value.errors(include_url=False)) == 1, exc_info.value.errors(include_url=False)
    error = exc_info.value.errors(include_url=False)[0]
    assert {'type': error['type'], 'msg': error['msg']} == {'type': err_type, 'msg': err_msg}


@pytest.mark.parametrize(
    'input,output',
    [
        (' https://www.example.com \n', 'https://www.example.com/'),
        (b'https://www.example.com', 'https://www.example.com/'),
        # https://www.xudongz.com/blog/2017/idn-phishing/ accepted but converted
        ('https://www.аррӏе.com/', 'https://www.xn--80ak6aa92e.com/'),
        ('https://exampl£e.org', 'https://xn--example-gia.org/'),
        ('https://example.珠宝', 'https://example.xn--pbt977c/'),
        ('https://example.vermögensberatung', 'https://example.xn--vermgensberatung-pwb/'),
        ('https://example.рф', 'https://example.xn--p1ai/'),
        ('https://exampl£e.珠宝', 'https://xn--example-gia.xn--pbt977c/'),
    ],
)
def test_coerce_url(input, output): class Model(BaseModel): v: HttpUrl assert str(Model(v=input).v) == output @pytest.mark.parametrize( 'value,expected', [ ('file:///foo/bar', 'file:///foo/bar'), ('file://localhost/foo/bar', 'file:///foo/bar'), ('file:////localhost/foo/bar', 'file:///localhost/foo/bar'), ], ) def test_file_url_success(value, expected): class Model(BaseModel): v: FileUrl assert str(Model(v=value).v) == expected @pytest.mark.parametrize( 'url,expected_port, expected_str', [ ('https://www.example.com/', 443, 'https://www.example.com/'), ('https://www.example.com:443/', 443, 'https://www.example.com/'), ('https://www.example.com:8089/', 8089, 'https://www.example.com:8089/'), ('http://www.example.com/', 80, 'http://www.example.com/'), ('http://www.example.com:80/', 80, 'http://www.example.com/'), ('http://www.example.com:8080/', 8080, 'http://www.example.com:8080/'), ], ) def test_http_urls_default_port(url, expected_port, expected_str): class Model(BaseModel): v: HttpUrl m = Model(v=url) assert m.v.port == expected_port assert str(m.v) == expected_str @pytest.mark.parametrize( 'value,expected', [ ('ws://example.com', 'ws://example.com/'), ('wss://example.com', 'wss://example.com/'), ('wss://ws.example.com/', 'wss://ws.example.com/'), ('ws://ws.example.com/', 'ws://ws.example.com/'), ('ws://example.com:8080', 'ws://example.com:8080/'), ('ws://example.com/path', 'ws://example.com/path'), ('wss://example.com:4433', 'wss://example.com:4433/'), ('wss://example.com/path', 'wss://example.com/path'), ], ) def test_websocket_url_success(value, expected): class Schema(BaseModel): ws: WebsocketUrl assert Schema(ws=value).ws.unicode_string() == expected @pytest.mark.parametrize( 'value,expected', [ ('ws://example.com', 80), ('wss://example.com', 443), ('wss://ws.example.com/', 443), ('ws://ws.example.com/', 80), ('ws://example.com:8080', 8080), ('ws://example.com:9999/path', 9999), ('wss://example.com:4433', 4433), ('wss://example.com/path', 443), ], ) def 
test_websocket_url_port_success(value, expected): class Schema(BaseModel): ws: WebsocketUrl assert Schema(ws=value).ws.port == expected @pytest.mark.parametrize( 'value,expected', [ ('ws://example.com', '/'), ('wss://example.com', '/'), ('wss://ws.example.com/', '/'), ('ws://ws.example.com/', '/'), ('ws://example.com:8080', '/'), ('ws://example.com:9999/path', '/path'), ('wss://example.com:4433', '/'), ('wss://example.com/path/to/ws', '/path/to/ws'), ], ) def test_websocket_url_path_success(value, expected): class Schema(BaseModel): ws: WebsocketUrl assert Schema(ws=value).ws.path == expected @pytest.mark.parametrize( 'value,expected', [ ('ftp://example.com', 'ftp://example.com/'), ('ftp://example.com/path/to/ftp', 'ftp://example.com/path/to/ftp'), ('ftp://example.com:21', 'ftp://example.com/'), ('ftp://example.com:21/path/to/ftp', 'ftp://example.com/path/to/ftp'), ('ftp://example.com', 'ftp://example.com/'), ('ftp://example.com/path/to/ftp', 'ftp://example.com/path/to/ftp'), ('ftp://example.com:990', 'ftp://example.com:990/'), ('ftp://example.com:990/path/to/ftp', 'ftp://example.com:990/path/to/ftp'), ], ) def test_ftp_url_success(value, expected): class Schema(BaseModel): ftp: FtpUrl assert Schema(ftp=value).ftp.unicode_string() == expected @pytest.mark.parametrize( 'value,expected', [ ('ftp://example.com', 21), ('ftp://example.com/path/to/ftp', 21), ('ftp://example.com:21', 21), ('ftp://exaMplФ.com:221/path/to/ftp', 221), ('ftp://example.com:144', 144), ('ftp://example.com:990/path/to/ftp', 990), ], ) def test_ftp_url_port_success(value, expected): class Schema(BaseModel): ftp: FtpUrl assert Schema(ftp=value).ftp.port == expected @pytest.mark.parametrize( 'dsn', [ 'postgres://user:pass@localhost:5432/app', 'postgresql://user:pass@localhost:5432/app', 'postgresql+asyncpg://user:pass@localhost:5432/app', 'postgres://user:pass@host1.db.net,host2.db.net:6432/app', 'postgres://user:pass@%2Fvar%2Flib%2Fpostgresql/dbname', ], ) def test_postgres_dsns(dsn): class 
Model(BaseModel): a: PostgresDsn assert str(Model(a=dsn).a) == dsn @pytest.mark.parametrize( 'dsn', [ 'mysql://user:pass@localhost:3306/app', 'mysql+mysqlconnector://user:pass@localhost:3306/app', 'mysql+aiomysql://user:pass@localhost:3306/app', 'mysql+asyncmy://user:pass@localhost:3306/app', 'mysql+mysqldb://user:pass@localhost:3306/app', 'mysql+pymysql://user:pass@localhost:3306/app?charset=utf8mb4', 'mysql+cymysql://user:pass@localhost:3306/app', 'mysql+pyodbc://user:pass@localhost:3306/app', ], ) def test_mysql_dsns(dsn): class Model(BaseModel): a: MySQLDsn assert str(Model(a=dsn).a) == dsn @pytest.mark.parametrize( 'dsn', [ 'mariadb://user:pass@localhost:3306/app', 'mariadb+mariadbconnector://user:pass@localhost:3306/app', 'mariadb+pymysql://user:pass@localhost:3306/app', ], ) def test_mariadb_dsns(dsn): class Model(BaseModel): a: MariaDBDsn assert str(Model(a=dsn).a) == dsn @pytest.mark.parametrize( 'dsn', [ 'clickhouse+native://user:pass@localhost:9000/app', 'clickhouse+asynch://user:pass@localhost:9000/app', ], ) def test_clickhouse_dsns(dsn): class Model(BaseModel): a: ClickHouseDsn assert str(Model(a=dsn).a) == dsn @pytest.mark.parametrize( 'dsn', [ 'snowflake://user:pass@myorganization-myaccount', 'snowflake://user:pass@myorganization-myaccount/testdb/public?warehouse=testwh&role=myrole', ], ) def test_snowflake_dsns(dsn): class Model(BaseModel): a: SnowflakeDsn assert str(Model(a=dsn).a) == dsn @pytest.mark.parametrize( 'dsn,error_message', ( ( 'postgres://user:pass@host1.db.net:4321,/foo/bar:5432/app', { 'type': 'url_parsing', 'loc': ('a',), 'msg': 'Input should be a valid URL, empty host', 'input': 'postgres://user:pass@host1.db.net:4321,/foo/bar:5432/app', }, ), ( 'postgres://user:pass@host1.db.net,/app', { 'type': 'url_parsing', 'loc': ('a',), 'msg': 'Input should be a valid URL, empty host', 'input': 'postgres://user:pass@host1.db.net,/app', }, ), ( 'postgres://user:pass@/foo/bar:5432,host1.db.net:4321/app', { 'type': 'url_parsing', 'loc': ('a',), 
'msg': 'Input should be a valid URL, empty host', 'input': 'postgres://user:pass@/foo/bar:5432,host1.db.net:4321/app', }, ), ( 'postgres://user@/foo/bar:5432/app', { 'type': 'url_parsing', 'loc': ('a',), 'msg': 'Input should be a valid URL, empty host', 'input': 'postgres://user@/foo/bar:5432/app', }, ), ( 'http://example.org', { 'type': 'url_scheme', 'loc': ('a',), 'msg': ( "URL scheme should be 'postgres', 'postgresql', 'postgresql+asyncpg', 'postgresql+pg8000', " "'postgresql+psycopg', 'postgresql+psycopg2', 'postgresql+psycopg2cffi', " "'postgresql+py-postgresql' or 'postgresql+pygresql'" ), 'input': 'http://example.org', }, ), ), ) def test_postgres_dsns_validation_error(dsn, error_message): class Model(BaseModel): a: PostgresDsn with pytest.raises(ValidationError) as exc_info: Model(a=dsn) error = exc_info.value.errors(include_url=False)[0] error.pop('ctx', None) assert error == error_message def test_multihost_postgres_dsns(): class Model(BaseModel): a: PostgresDsn any_multihost_url = Model(a='postgres://user:pass@host1.db.net:4321,host2.db.net:6432/app').a assert str(any_multihost_url) == 'postgres://user:pass@host1.db.net:4321,host2.db.net:6432/app' assert any_multihost_url.scheme == 'postgres' assert any_multihost_url.path == '/app' # insert_assert(any_multihost_url.hosts()) assert any_multihost_url.hosts() == [ {'username': 'user', 'password': 'pass', 'host': 'host1.db.net', 'port': 4321}, {'username': None, 'password': None, 'host': 'host2.db.net', 'port': 6432}, ] any_multihost_url = Model(a='postgres://user:pass@host.db.net:4321/app').a assert any_multihost_url.scheme == 'postgres' assert str(any_multihost_url) == 'postgres://user:pass@host.db.net:4321/app' assert any_multihost_url.path == '/app' # insert_assert(any_multihost_url.hosts()) assert any_multihost_url.hosts() == [{'username': 'user', 'password': 'pass', 'host': 'host.db.net', 'port': 4321}] def test_cockroach_dsns(): class Model(BaseModel): a: CockroachDsn assert 
str(Model(a='cockroachdb://user:pass@localhost:5432/app').a) == 'cockroachdb://user:pass@localhost:5432/app' assert ( str(Model(a='cockroachdb+psycopg2://user:pass@localhost:5432/app').a) == 'cockroachdb+psycopg2://user:pass@localhost:5432/app' ) assert ( str(Model(a='cockroachdb+asyncpg://user:pass@localhost:5432/app').a) == 'cockroachdb+asyncpg://user:pass@localhost:5432/app' ) with pytest.raises(ValidationError) as exc_info: Model(a='http://example.org') assert exc_info.value.errors(include_url=False)[0]['type'] == 'url_scheme' def test_amqp_dsns(): class Model(BaseModel): a: AmqpDsn m = Model(a='amqp://user:pass@localhost:1234/app') assert str(m.a) == 'amqp://user:pass@localhost:1234/app' assert m.a.username == 'user' assert m.a.password == 'pass' m = Model(a='amqps://user:pass@localhost:5432//') assert str(m.a) == 'amqps://user:pass@localhost:5432//' with pytest.raises(ValidationError) as exc_info: Model(a='http://example.org') assert exc_info.value.errors(include_url=False)[0]['type'] == 'url_scheme' # Password is not required for AMQP protocol m = Model(a='amqp://localhost:1234/app') assert str(m.a) == 'amqp://localhost:1234/app' assert m.a.username is None assert m.a.password is None # Only schema is required for AMQP protocol. 
# https://www.rabbitmq.com/uri-spec.html m = Model(a='amqps://') assert m.a.scheme == 'amqps' assert m.a.host is None assert m.a.port is None assert m.a.path is None def test_redis_dsns(): class Model(BaseModel): a: RedisDsn m = Model(a='redis://user:pass@localhost:1234/app') assert str(m.a) == 'redis://user:pass@localhost:1234/app' assert m.a.username == 'user' assert m.a.password == 'pass' m = Model(a='rediss://user:pass@localhost:1234/app') assert str(m.a) == 'rediss://user:pass@localhost:1234/app' m = Model(a='rediss://:pass@localhost:1234') assert str(m.a) == 'rediss://:pass@localhost:1234/0' with pytest.raises(ValidationError) as exc_info: Model(a='http://example.org') assert exc_info.value.errors(include_url=False)[0]['type'] == 'url_scheme' # Password is not required for Redis protocol m = Model(a='redis://localhost:1234/app') assert str(m.a) == 'redis://localhost:1234/app' assert m.a.username is None assert m.a.password is None # Only schema is required for Redis protocol. Otherwise it will be set to default # https://www.iana.org/assignments/uri-schemes/prov/redis m = Model(a='rediss://') assert m.a.scheme == 'rediss' assert m.a.host == 'localhost' assert m.a.port == 6379 assert m.a.path == '/0' def test_mongodb_dsns(): class Model(BaseModel): a: MongoDsn # TODO: Need to unit tests about "Replica Set", "Sharded cluster" and other deployment modes of MongoDB m = Model(a='mongodb://user:pass@localhost:1234/app') assert str(m.a) == 'mongodb://user:pass@localhost:1234/app' # insert_assert(m.a.hosts()) assert m.a.hosts() == [{'username': 'user', 'password': 'pass', 'host': 'localhost', 'port': 1234}] with pytest.raises(ValidationError) as exc_info: Model(a='http://example.org') assert exc_info.value.errors(include_url=False)[0]['type'] == 'url_scheme' # Password is not required for MongoDB protocol m = Model(a='mongodb://localhost:1234/app') assert str(m.a) == 'mongodb://localhost:1234/app' # insert_assert(m.a.hosts()) assert m.a.hosts() == [{'username': None, 
'password': None, 'host': 'localhost', 'port': 1234}] # Only schema and host is required for MongoDB protocol m = Model(a='mongodb://localhost') assert m.a.scheme == 'mongodb' # insert_assert(m.a.hosts()) assert m.a.hosts() == [{'username': None, 'password': None, 'host': 'localhost', 'port': 27017}] @pytest.mark.parametrize( ('dsn', 'expected'), [ ('mongodb://user:pass@localhost/app', 'mongodb://user:pass@localhost:27017/app'), pytest.param( 'mongodb+srv://user:pass@localhost/app', 'mongodb+srv://user:pass@localhost/app', marks=pytest.mark.xfail( reason=( 'This case is not supported. ' 'Check https://github.com/pydantic/pydantic/pull/7116 for more details.' ) ), ), ], ) def test_mongodsn_default_ports(dsn: str, expected: str): class Model(BaseModel): dsn: MongoDsn m = Model(dsn=dsn) assert str(m.dsn) == expected def test_kafka_dsns(): class Model(BaseModel): a: KafkaDsn m = Model(a='kafka://') assert m.a.scheme == 'kafka' assert m.a.host == 'localhost' assert m.a.port == 9092 assert str(m.a) == 'kafka://localhost:9092' m = Model(a='kafka://kafka1') assert str(m.a) == 'kafka://kafka1:9092' with pytest.raises(ValidationError) as exc_info: Model(a='http://example.org') assert exc_info.value.errors(include_url=False)[0]['type'] == 'url_scheme' m = Model(a='kafka://kafka3:9093') assert m.a.username is None assert m.a.password is None @pytest.mark.parametrize( 'dsn,result', [ ('nats://user:pass@localhost:4222', 'nats://user:pass@localhost:4222'), ('tls://user@localhost', 'tls://user@localhost:4222'), ('ws://localhost:2355', 'ws://localhost:2355/'), ('tls://', 'tls://localhost:4222'), ('ws://:password@localhost:9999', 'ws://:password@localhost:9999/'), ], ) def test_nats_dsns(dsn, result): class Model(BaseModel): dsn: NatsDsn assert str(Model(dsn=dsn).dsn) == result def test_custom_schemes(): class Model(BaseModel): v: Annotated[Url, UrlConstraints(allowed_schemes=['ws', 'wss']), Strict()] class Model2(BaseModel): v: Annotated[Url, UrlConstraints(host_required=False, 
allowed_schemes=['foo'])] assert str(Model(v='ws://example.org').v) == 'ws://example.org/' assert str(Model2(v='foo:///foo/bar').v) == 'foo:///foo/bar' with pytest.raises(ValidationError, match=r"URL scheme should be 'ws' or 'wss' \[type=url_scheme,"): Model(v='http://example.org') with pytest.raises(ValidationError, match='leading or trailing control or space character are ignored in URLs'): Model(v='ws://example.org ') with pytest.raises(ValidationError, match=r'syntax rules, expected // \[type=url_syntax_violation,'): Model(v='ws:///foo/bar') @pytest.mark.parametrize( 'options', [ # Ensures the hash is generated correctly when a field is null {'max_length': None}, {'allowed_schemes': None}, {'host_required': None}, {'default_host': None}, {'default_port': None}, {'default_path': None}, ], ) def test_url_constraints_hash_equal(options): defaults = { 'max_length': 1, 'allowed_schemes': ['scheme'], 'host_required': False, 'default_host': 'host', 'default_port': 0, 'default_path': 'path', } options = {**defaults, **options} assert hash(UrlConstraints(**options)) == hash(UrlConstraints(**options)) @pytest.mark.parametrize( 'changes', [ {'max_length': 2}, {'allowed_schemes': ['new-scheme']}, {'host_required': True}, {'default_host': 'new-host'}, {'default_port': 1}, {'default_path': 'new-path'}, {'max_length': None}, {'allowed_schemes': None}, {'host_required': None}, {'default_host': None}, {'default_port': None}, {'default_path': None}, ], ) def test_url_constraints_hash_inequal(changes): options = { 'max_length': 1, 'allowed_schemes': ['scheme'], 'host_required': False, 'default_host': 'host', 'default_port': 0, 'default_path': 'path', } assert hash(UrlConstraints(**options)) != hash(UrlConstraints(**{**options, **changes})) def test_json(): class Model(BaseModel): v: HttpUrl m = Model(v='http://foo@example.net') assert m.model_dump_json() == '{"v":"http://foo@example.net/"}' @pytest.mark.skipif(not email_validator, reason='email_validator not installed') 
@pytest.mark.parametrize(
    'value,name,email',
    [
        ('foobar@example.com', 'foobar', 'foobar@example.com'),
        ('s@muelcolvin.com', 's', 's@muelcolvin.com'),
        ('Samuel Colvin <s@muelcolvin.com>', 'Samuel Colvin', 's@muelcolvin.com'),
        ('foobar <foobar@example.com>', 'foobar', 'foobar@example.com'),
        (' foo.bar@example.com', 'foo.bar', 'foo.bar@example.com'),
        ('foo.bar@example.com ', 'foo.bar', 'foo.bar@example.com'),
        ('foo BAR <foobar@example.com>', 'foo BAR', 'foobar@example.com'),
        ('FOO bar <foobar@example.com>', 'FOO bar', 'foobar@example.com'),
        (' Whatever <foobar@example.com>', 'Whatever', 'foobar@example.com'),
        ('Whatever < foobar@example.com>', 'Whatever', 'foobar@example.com'),
        ('Whatever <foobar@example.com>', 'Whatever', 'foobar@example.com'),
        ('Whatever < foobar@example.com >', 'Whatever', 'foobar@example.com'),
        ('<FOOBAR@example.com> ', 'FOOBAR', 'FOOBAR@example.com'),
        ('ñoñó@example.com', 'ñoñó', 'ñoñó@example.com'),
        ('我買@example.com', '我買', '我買@example.com'),
        ('甲斐黒川日本@example.com', '甲斐黒川日本', '甲斐黒川日本@example.com'),
        (
            'чебурашкаящик-с-апельсинами.рф@example.com',
            'чебурашкаящик-с-апельсинами.рф',
            'чебурашкаящик-с-апельсинами.рф@example.com',
        ),
        ('उदाहरण.परीक्ष@domain.with.idn.tld', 'उदाहरण.परीक्ष', 'उदाहरण.परीक्ष@domain.with.idn.tld'),
        ('foo.bar@example.com', 'foo.bar', 'foo.bar@example.com'),
        ('foo.bar@exam-ple.com ', 'foo.bar', 'foo.bar@exam-ple.com'),
        ('ιωάννης@εεττ.gr', 'ιωάννης', 'ιωάννης@εεττ.gr'),
        ('foobar@аррӏе.com', 'foobar', 'foobar@аррӏе.com'),
        ('foobar@xn--80ak6aa92e.com', 'foobar', 'foobar@аррӏе.com'),
        ('аррӏе@example.com', 'аррӏе', 'аррӏе@example.com'),
        ('xn--80ak6aa92e@example.com', 'xn--80ak6aa92e', 'xn--80ak6aa92e@example.com'),
        ('葉士豪@臺網中心.tw', '葉士豪', '葉士豪@臺網中心.tw'),
        ('"first.last" <first.last@example.com>', 'first.last', 'first.last@example.com'),
        ("Shaquille O'Neal <shaq@example.com>", "Shaquille O'Neal", 'shaq@example.com'),
    ],
)
def test_address_valid(value, name, email):
    assert validate_email(value) == (name, email)


@pytest.mark.skipif(not email_validator, reason='email_validator not installed')
@pytest.mark.parametrize(
    'value,reason',
    [
        ('@example.com', 'There must be something before the @-sign.'),
        ('f oo.bar@example.com', 'The email address contains invalid characters before the @-sign'),
        ('foobar', 'An email address must have an @-sign.'),
        ('foobar@localhost', 'The part after the @-sign is not valid. It should have a period.'),
        ('foobar@127.0.0.1', 'The part after the @-sign is not valid. It is not within a valid top-level domain.'),
        ('foo.bar@exam\nple.com ', None),
        ('foobar ', None),
        ('foobar >', None),
        ('foobar <', None),
        ('foobar <>', None),
        ('first.last ', None),
        pytest.param('foobar <' + 'a' * 4096 + '@example.com>', 'Length must not exceed 2048 characters', id='long'),
    ],
)
def test_address_invalid(value: str, reason: Union[str, None]):
    with pytest.raises(PydanticCustomError, match=f'value is not a valid email address: {reason or ""}'):
        validate_email(value)


def test_email_validator_not_installed(mocker):
    mocker.patch('pydantic.networks.email_validator', None)
    m = mocker.patch('pydantic.networks.import_email_validator', side_effect=ImportError)
    with pytest.raises(ImportError):
        validate_email('s@muelcolvin.com')
    m.assert_called_once()


def test_import_email_validator_not_installed(mocker):
    mocker.patch.dict('sys.modules', {'email_validator': None})
    with pytest.raises(ImportError, match=r'email-validator is not installed, run `pip install pydantic\[email\]`'):
        import_email_validator()


@pytest.mark.skipif(not email_validator, reason='email_validator not installed')
def test_import_email_validator_invalid_version(mocker):
    mocker.patch('pydantic.networks.version', return_value='1.0.0')
    with pytest.raises(
        ImportError, match=r'email-validator version >= 2.0 required, run pip install -U email-validator'
    ):
        import_email_validator()


@pytest.mark.skipif(not email_validator, reason='email_validator not installed')
def test_name_email():
    class Model(BaseModel):
        v: NameEmail

    assert str(Model(v=NameEmail('foo bar', 'foobaR@example.com')).v) == 'foo bar <foobaR@example.com>'
    assert str(Model(v='foo bar <foobaR@example.com>').v) == 'foo bar <foobaR@example.com>'
    assert str(Model(v='foobaR@example.com').v) == 'foobaR <foobaR@example.com>'
    assert NameEmail('foo bar', 'foobaR@example.com') == NameEmail('foo bar', 'foobaR@example.com')
    assert NameEmail('foo bar', 'foobaR@example.com') != NameEmail('foo bar', 'different@example.com')
    assert Model.model_validate_json('{"v":"foo bar <foobaR@example.com>"}').v == NameEmail(
        'foo bar', 'foobaR@example.com'
    )
    assert str(Model.model_validate_json('{"v":"foobaR@example.com"}').v) == 'foobaR <foobaR@example.com>'
    assert (
        Model(v=NameEmail('foo bar', 'foobaR@example.com')).model_dump_json()
        == '{"v":"foo bar <foobaR@example.com>"}'
    )
    with pytest.raises(ValidationError) as exc_info:
        Model(v=1)

    assert exc_info.value.errors() == [
        {'input': 1, 'loc': ('v',), 'msg': 'Input is not a valid NameEmail', 'type': 'name_email_type'}
    ]


@pytest.mark.skipif(not email_validator, reason='email_validator not installed')
def test_name_email_serialization():
    class Model(BaseModel):
        email: NameEmail

    m = Model.model_validate({'email': '"name@mailbox.com" <name@mailbox.com>'})
    assert m.email.name == 'name@mailbox.com'
    assert str(m.email) == '"name@mailbox.com" <name@mailbox.com>'
    obj = json.loads(m.model_dump_json())
    Model(email=obj['email'])


def test_specialized_urls() -> None:
    ta = TypeAdapter(HttpUrl)
    http_url = ta.validate_python('http://example.com/something')
    assert str(http_url) == 'http://example.com/something'
    assert repr(http_url) == "HttpUrl('http://example.com/something')"
    assert http_url.__class__ == HttpUrl
    assert http_url.host == 'example.com'
    assert http_url.path == '/something'
    assert http_url.username is None
    assert http_url.password is None

    http_url2 = ta.validate_python(http_url)
    assert str(http_url2) == 'http://example.com/something'
    assert repr(http_url2) == "HttpUrl('http://example.com/something')"
    assert http_url2.__class__ == HttpUrl
    assert http_url2.host == 'example.com'
    assert http_url2.path == '/something'
    assert http_url2.username is None
    assert http_url2.password is None


def test_url_equality() -> None:
    # works for descendants of _BaseUrl and _BaseMultiHostUrl
    assert HttpUrl('http://example.com/something') == HttpUrl('http://example.com/something')
    assert PostgresDsn('postgres://user:pass@localhost:5432/app') == PostgresDsn(
        'postgres://user:pass@localhost:5432/app'
    )


def test_equality_independent_of_init() -> None:
    ta = TypeAdapter(HttpUrl)
    from_str = ta.validate_python('http://example.com/something')
    from_url = ta.validate_python(HttpUrl('http://example.com/something'))
    from_validated = ta.validate_python(from_str)
    assert from_str == from_url == from_validated


def test_url_subclasses_any_url() -> None:
    http_url = AnyHttpUrl('https://localhost')
    assert isinstance(http_url, AnyUrl)
    assert isinstance(http_url, AnyHttpUrl)
    url = TypeAdapter(AnyUrl).validate_python(http_url)
    assert url is http_url


def test_custom_constraints() -> None:
    HttpUrl = Annotated[AnyUrl, UrlConstraints(allowed_schemes=['http', 'https'])]
    ta = TypeAdapter(HttpUrl)
    assert ta.validate_python('https://example.com')
    with pytest.raises(ValidationError):
        ta.validate_python('ftp://example.com')


def test_after_validator() -> None:
    def remove_trailing_slash(url: AnyUrl) -> str:
        """Custom url -> str transformer that removes trailing slash."""
        return str(url._url).rstrip('/')

    HttpUrl = Annotated[
        AnyUrl,
        UrlConstraints(allowed_schemes=['http', 'https']),
        AfterValidator(lambda url: remove_trailing_slash(url)),
    ]
    ta = TypeAdapter(HttpUrl)
    assert ta.validate_python('https://example.com/') == 'https://example.com'


def test_serialize_as_any() -> None:
    ta = TypeAdapter(Any)
    assert ta.dump_python(HttpUrl('https://example.com')) == HttpUrl('https://example.com/')
    assert ta.dump_json('https://example.com') == b'"https://example.com"'


def test_any_url_hashable() -> None:
    example_url_1a = AnyUrl('https://example1.com')
    example_url_1b = AnyUrl('https://example1.com')
    example_url_2 = AnyUrl('https://example2.com')

    assert hash(example_url_1a) == hash(example_url_1b)
    assert hash(example_url_1a) != hash(example_url_2)
    assert len({example_url_1a, example_url_1b, example_url_2}) == 2

    example_multi_host_url_1a = PostgresDsn('postgres://user:pass@host1:5432,host2:5432/app')
    example_multi_host_url_1b = PostgresDsn('postgres://user:pass@host1:5432,host2:5432/app')
    example_multi_host_url_2 = PostgresDsn('postgres://user:pass@host1:5432,host3:5432/app')

    assert hash(example_multi_host_url_1a) == hash(example_multi_host_url_1b)
    assert hash(example_multi_host_url_1a) != hash(example_multi_host_url_2)
    assert len({example_multi_host_url_1a, example_multi_host_url_1b, example_multi_host_url_2}) == 2


def test_host_not_required_for_2_9_compatibility() -> None:
    data_uri = 'file:///path/to/data'
    url = AnyUrl(data_uri)
    assert url.host is None


def test_json_schema() -> None:
    ta = TypeAdapter(HttpUrl)
    val_json_schema = ta.json_schema(mode='validation')
    assert val_json_schema == {'type': 'string', 'format': 'uri', 'minLength': 1, 'maxLength': 2083}
    ser_json_schema = ta.json_schema(mode='serialization')
    assert ser_json_schema == {'type': 'string', 'format': 'uri', 'minLength': 1, 'maxLength': 2083}


def test_any_url_comparison() -> None:
    first_url = AnyUrl('https://a.com')
    second_url = AnyUrl('https://b.com')

    assert first_url < second_url
    assert second_url > first_url
    assert first_url <= second_url
    assert second_url >= first_url


def test_max_length_base_url() -> None:
    class Model(BaseModel):
        url: AnyUrl = Field(max_length=20)

    # _BaseUrl/AnyUrl adds trailing slash: https://github.com/pydantic/pydantic/issues/7186
    # once solved the second expected line can be removed
    expected = 'https://example.com'
    expected = f'{expected}/'
    assert len(Model(url='https://example.com').url) == len(expected)

    with pytest.raises(ValidationError, match=r'Value should have at most 20 items after validation'):
        Model(url='https://example.com/longer')


def test_max_length_base_multi_host() -> None:
    class Model(BaseModel):
        postgres: PostgresDsn = Field(max_length=45)

    expected = 'postgres://user:pass@localhost:5432/foobar'
    assert len(Model(postgres=expected).postgres) == len(expected)

    with pytest.raises(ValidationError,
                       match=r'Value should have at most 45 items after validation'):
        Model(postgres='postgres://user:pass@localhost:5432/foobarbazfoo')


def test_unexpected_ser() -> None:
    ta = TypeAdapter(HttpUrl)
    with pytest.raises(
        PydanticSerializationError,
        match="Expected `` but got `` with value `'http://example.com'`",
    ):
        ta.dump_python('http://example.com', warnings='error')


def test_url_ser() -> None:
    ta = TypeAdapter(HttpUrl)
    assert ta.dump_python(HttpUrl('http://example.com')) == HttpUrl('http://example.com')
    assert ta.dump_json(HttpUrl('http://example.com')) == b'"http://example.com/"'


def test_url_ser_as_any() -> None:
    ta = TypeAdapter(Any)
    assert ta.dump_python(HttpUrl('http://example.com')) == HttpUrl('http://example.com')
    assert ta.dump_json(HttpUrl('http://example.com')) == b'"http://example.com/"'

pydantic-2.10.6/tests/test_networks_ipaddress.py

import json
from ipaddress import IPv4Address, IPv4Interface, IPv4Network, IPv6Address, IPv6Interface, IPv6Network
from typing import Any, List

import pytest

from pydantic import BaseModel, IPvAnyAddress, IPvAnyInterface, IPvAnyNetwork, ValidationError
from pydantic.config import ConfigDict


@pytest.mark.parametrize(
    'value,cls',
    [
        ('0.0.0.0', IPv4Address),
        ('1.1.1.1', IPv4Address),
        ('10.10.10.10', IPv4Address),
        ('192.168.0.1', IPv4Address),
        ('255.255.255.255', IPv4Address),
        ('::1:0:1', IPv6Address),
        ('ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff', IPv6Address),
        (b'\x00\x00\x00\x00', IPv4Address),
        (b'\x01\x01\x01\x01', IPv4Address),
        (b'\n\n\n\n', IPv4Address),
        (b'\xc0\xa8\x00\x01', IPv4Address),
        (b'\xff\xff\xff\xff', IPv4Address),
        (b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x01', IPv6Address),
        (b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff', IPv6Address),
        (0, IPv4Address),
        (16_843_009, IPv4Address),
        (168_430_090, IPv4Address),
        (3_232_235_521, IPv4Address),
        (4_294_967_295, IPv4Address),
        (4_294_967_297, IPv6Address),
(340_282_366_920_938_463_463_374_607_431_768_211_455, IPv6Address), (IPv4Address('192.168.0.1'), IPv4Address), (IPv6Address('::1:0:1'), IPv6Address), ], ) def test_ipaddress_success(value, cls): class Model(BaseModel): ip: IPvAnyAddress assert Model(ip=value).ip == cls(value) @pytest.mark.parametrize( 'value', [ '0.0.0.0', '1.1.1.1', '10.10.10.10', '192.168.0.1', '255.255.255.255', b'\x00\x00\x00\x00', b'\x01\x01\x01\x01', b'\n\n\n\n', b'\xc0\xa8\x00\x01', b'\xff\xff\xff\xff', 0, 16_843_009, 168_430_090, 3_232_235_521, 4_294_967_295, IPv4Address('0.0.0.0'), IPv4Address('1.1.1.1'), IPv4Address('10.10.10.10'), IPv4Address('192.168.0.1'), IPv4Address('255.255.255.255'), ], ) def test_ipv4address_success(value): class Model(BaseModel): ipv4: IPv4Address assert Model(ipv4=value).ipv4 == IPv4Address(value) @pytest.mark.parametrize( 'tp,value,errors', [ ( IPv4Address, IPv4Address('0.0.0.0'), [ { 'type': 'is_instance_of', 'loc': ('v',), 'msg': 'Input should be an instance of IPv4Address', 'input': '0.0.0.0', 'ctx': {'class': 'IPv4Address'}, } ], ), ( IPv4Interface, IPv4Interface('192.168.0.0/24'), [ { 'type': 'is_instance_of', 'loc': ('v',), 'msg': 'Input should be an instance of IPv4Interface', 'input': '192.168.0.0/24', 'ctx': {'class': 'IPv4Interface'}, } ], ), ( IPv4Network, IPv4Network('192.168.0.0/24'), [ { 'type': 'is_instance_of', 'loc': ('v',), 'msg': 'Input should be an instance of IPv4Network', 'input': '192.168.0.0/24', 'ctx': {'class': 'IPv4Network'}, } ], ), ( IPv6Address, IPv6Address('::1:0:1'), [ { 'type': 'is_instance_of', 'loc': ('v',), 'msg': 'Input should be an instance of IPv6Address', 'input': '::1:0:1', 'ctx': {'class': 'IPv6Address'}, } ], ), ( IPv6Interface, IPv6Interface('2001:db00::0/120'), [ { 'type': 'is_instance_of', 'loc': ('v',), 'msg': 'Input should be an instance of IPv6Interface', 'input': '2001:db00::/120', 'ctx': {'class': 'IPv6Interface'}, } ], ), ( IPv6Network, IPv6Network('2001:db00::0/120'), [ { 'type': 'is_instance_of', 'loc': 
('v',), 'msg': 'Input should be an instance of IPv6Network', 'input': '2001:db00::/120', 'ctx': {'class': 'IPv6Network'}, } ], ), ], ) def test_ip_strict(tp: Any, value: Any, errors: List[Any]) -> None: class Model(BaseModel): v: tp model_config = ConfigDict(strict=True) with pytest.raises(ValidationError) as exc_info: Model(v=str(value)) assert exc_info.value.errors(include_url=False) == errors assert Model(v=value).v == value @pytest.mark.parametrize( 'value', [ '::1:0:1', 'ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff', b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x01', b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff', 4_294_967_297, 340_282_366_920_938_463_463_374_607_431_768_211_455, IPv6Address('::1:0:1'), IPv6Address('ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff'), ], ) def test_ipv6address_success(value): class Model(BaseModel): ipv6: IPv6Address assert Model(ipv6=value).ipv6 == IPv6Address(value) @pytest.mark.parametrize('value', ['hello,world', '192.168.0.1.1.1', -1, 2**128 + 1]) def test_ipaddress_fails(value): class Model(BaseModel): ip: IPvAnyAddress with pytest.raises(ValidationError) as exc_info: Model(ip=value) assert exc_info.value.error_count() == 1 assert exc_info.value.errors(include_url=False)[0] == { 'type': 'ip_any_address', 'loc': ('ip',), 'msg': 'value is not a valid IPv4 or IPv6 address', 'input': value, } @pytest.mark.parametrize('value', ['hello,world', '192.168.0.1.1.1', -1, 2**32 + 1, IPv6Address('::0:1:0')]) def test_ipv4address_fails(value): class Model(BaseModel): ipv4: IPv4Address with pytest.raises(ValidationError) as exc_info: Model(ipv4=value) assert exc_info.value.error_count() == 1 assert exc_info.value.errors(include_url=False)[0] == { 'type': 'ip_v4_address', 'loc': ('ipv4',), 'msg': 'Input is not a valid IPv4 address', 'input': value, } @pytest.mark.parametrize('value', ['hello,world', '192.168.0.1.1.1', -1, 2**128 + 1, IPv4Address('192.168.0.1')]) def test_ipv6address_fails(value): class 
Model(BaseModel): ipv6: IPv6Address with pytest.raises(ValidationError) as exc_info: Model(ipv6=value) assert exc_info.value.error_count() == 1 # insert_assert(exc_info.value.errors(include_url=False)[0]) assert exc_info.value.errors(include_url=False)[0] == { 'type': 'ip_v6_address', 'loc': ('ipv6',), 'msg': 'Input is not a valid IPv6 address', 'input': value, } @pytest.mark.parametrize( 'value,cls', [ ('192.168.0.0/24', IPv4Network), ('192.168.128.0/30', IPv4Network), ('2001:db00::0/120', IPv6Network), (2**32 - 1, IPv4Network), # no mask equals to mask /32 (20_282_409_603_651_670_423_947_251_286_015, IPv6Network), # /128 (b'\xff\xff\xff\xff', IPv4Network), # /32 (b'\x00\x00\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff', IPv6Network), (('192.168.0.0', 24), IPv4Network), (('2001:db00::0', 120), IPv6Network), (IPv4Network('192.168.0.0/24'), IPv4Network), ], ) def test_ipnetwork_success(value, cls): class Model(BaseModel): ip: IPvAnyNetwork = None assert Model(ip=value).ip == cls(value) @pytest.mark.parametrize( 'value,cls', [ ('192.168.0.0/24', IPv4Network), ('192.168.128.0/30', IPv4Network), (2**32 - 1, IPv4Network), # no mask equals to mask /32 (b'\xff\xff\xff\xff', IPv4Network), # /32 (('192.168.0.0', 24), IPv4Network), (IPv4Network('192.168.0.0/24'), IPv4Network), ], ) def test_ip_v4_network_success(value, cls): class Model(BaseModel): ip: IPv4Network = None assert Model(ip=value).ip == cls(value) @pytest.mark.parametrize( 'value,cls', [ ('2001:db00::0/120', IPv6Network), (20_282_409_603_651_670_423_947_251_286_015, IPv6Network), # /128 (b'\x00\x00\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff', IPv6Network), (('2001:db00::0', 120), IPv6Network), (IPv6Network('2001:db00::0/120'), IPv6Network), ], ) def test_ip_v6_network_success(value, cls): class Model(BaseModel): ip: IPv6Network = None assert Model(ip=value).ip == cls(value) @pytest.mark.parametrize('value', ['hello,world', '192.168.0.1.1.1/24', -1, 2**128 + 1]) def 
test_ipnetwork_fails(value): class Model(BaseModel): ip: IPvAnyNetwork = None with pytest.raises(ValidationError) as exc_info: Model(ip=value) assert exc_info.value.error_count() == 1 # insert_assert(exc_info.value.errors(include_url=False)[0]) assert exc_info.value.errors(include_url=False)[0] == { 'type': 'ip_any_network', 'loc': ('ip',), 'msg': 'value is not a valid IPv4 or IPv6 network', 'input': value, } @pytest.mark.parametrize('value', ['hello,world', '192.168.0.1.1.1/24', -1, 2**128 + 1, '2001:db00::1/120']) def test_ip_v4_network_fails(value): class Model(BaseModel): ip: IPv4Network = None with pytest.raises(ValidationError) as exc_info: Model(ip=value) assert exc_info.value.error_count() == 1 # insert_assert(exc_info.value.errors(include_url=False)[0]) assert exc_info.value.errors(include_url=False)[0] == { 'type': 'ip_v4_network', 'loc': ('ip',), 'msg': 'Input is not a valid IPv4 network', 'input': value, } @pytest.mark.parametrize('value', ['hello,world', '192.168.0.1.1.1/24', -1, 2**128 + 1, '192.168.0.1/24']) def test_ip_v6_network_fails(value): class Model(BaseModel): ip: IPv6Network = None with pytest.raises(ValidationError) as exc_info: Model(ip=value) assert exc_info.value.error_count() == 1 # insert_assert(exc_info.value.errors(include_url=False)[0]) assert exc_info.value.errors(include_url=False)[0] == { 'type': 'ip_v6_network', 'loc': ('ip',), 'msg': 'Input is not a valid IPv6 network', 'input': value, } def test_ipvany_serialization(): class Model(BaseModel): address: IPvAnyAddress network: IPvAnyNetwork interface: IPvAnyInterface m = Model(address='127.0.0.1', network='192.0.2.0/27', interface='127.0.0.1/32') assert json.loads(m.model_dump_json()) == { 'address': '127.0.0.1', 'interface': '127.0.0.1/32', 'network': '192.0.2.0/27', } @pytest.mark.parametrize( 'value,cls', [ ('192.168.0.0/24', IPv4Interface), ('192.168.0.1/24', IPv4Interface), ('192.168.128.0/30', IPv4Interface), ('192.168.128.1/30', IPv4Interface), ('2001:db00::0/120', 
IPv6Interface), ('2001:db00::1/120', IPv6Interface), (2**32 - 1, IPv4Interface), # no mask equals to mask /32 (2**32 - 1, IPv4Interface), # so `strict` has no effect (20_282_409_603_651_670_423_947_251_286_015, IPv6Interface), # /128 (20_282_409_603_651_670_423_947_251_286_014, IPv6Interface), (b'\xff\xff\xff\xff', IPv4Interface), # /32 (b'\xff\xff\xff\xff', IPv4Interface), (b'\x00\x00\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff', IPv6Interface), (b'\x00\x00\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff', IPv6Interface), (('192.168.0.0', 24), IPv4Interface), (('192.168.0.1', 24), IPv4Interface), (('2001:db00::0', 120), IPv6Interface), (('2001:db00::1', 120), IPv6Interface), (IPv4Interface('192.168.0.0/24'), IPv4Interface), (IPv4Interface('192.168.0.1/24'), IPv4Interface), (IPv6Interface('2001:db00::0/120'), IPv6Interface), (IPv6Interface('2001:db00::1/120'), IPv6Interface), ], ) def test_ipinterface_success(value, cls): class Model(BaseModel): ip: IPvAnyInterface = None assert Model(ip=value).ip == cls(value) @pytest.mark.parametrize( 'value,cls', [ ('192.168.0.0/24', IPv4Interface), ('192.168.0.1/24', IPv4Interface), ('192.168.128.0/30', IPv4Interface), ('192.168.128.1/30', IPv4Interface), (2**32 - 1, IPv4Interface), # no mask equals to mask /32 (2**32 - 1, IPv4Interface), # so `strict` has no effect (b'\xff\xff\xff\xff', IPv4Interface), # /32 (b'\xff\xff\xff\xff', IPv4Interface), (('192.168.0.0', 24), IPv4Interface), (('192.168.0.1', 24), IPv4Interface), (IPv4Interface('192.168.0.0/24'), IPv4Interface), (IPv4Interface('192.168.0.1/24'), IPv4Interface), ], ) def test_ip_v4_interface_success(value, cls): class Model(BaseModel): ip: IPv4Interface assert Model(ip=value).ip == cls(value) @pytest.mark.parametrize( 'value,cls', [ ('2001:db00::0/120', IPv6Interface), ('2001:db00::1/120', IPv6Interface), (20_282_409_603_651_670_423_947_251_286_015, IPv6Interface), # /128 (20_282_409_603_651_670_423_947_251_286_014, IPv6Interface), 
(b'\x00\x00\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff', IPv6Interface), (b'\x00\x00\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff', IPv6Interface), (('2001:db00::0', 120), IPv6Interface), (('2001:db00::1', 120), IPv6Interface), (IPv6Interface('2001:db00::0/120'), IPv6Interface), (IPv6Interface('2001:db00::1/120'), IPv6Interface), ], ) def test_ip_v6_interface_success(value, cls): class Model(BaseModel): ip: IPv6Interface = None assert Model(ip=value).ip == cls(value) @pytest.mark.parametrize('value', ['hello,world', '192.168.0.1.1.1/24', -1, 2**128 + 1]) def test_ipinterface_fails(value): class Model(BaseModel): ip: IPvAnyInterface = None with pytest.raises(ValidationError) as exc_info: Model(ip=value) assert exc_info.value.error_count() == 1 # insert_assert(exc_info.value.errors(include_url=False)[0]) assert exc_info.value.errors(include_url=False)[0] == { 'type': 'ip_any_interface', 'loc': ('ip',), 'msg': 'value is not a valid IPv4 or IPv6 interface', 'input': value, } @pytest.mark.parametrize('value', ['hello,world', '192.168.0.1.1.1/24', -1, 2**128 + 1]) def test_ip_v4_interface_fails(value): class Model(BaseModel): ip: IPv4Interface = None with pytest.raises(ValidationError) as exc_info: Model(ip=value) assert exc_info.value.error_count() == 1 # insert_assert(exc_info.value.errors(include_url=False)[0]) assert exc_info.value.errors(include_url=False)[0] == { 'type': 'ip_v4_interface', 'loc': ('ip',), 'msg': 'Input is not a valid IPv4 interface', 'input': value, } @pytest.mark.parametrize('value', ['hello,world', '192.168.0.1.1.1/24', -1, 2**128 + 1]) def test_ip_v6_interface_fails(value): class Model(BaseModel): ip: IPv6Interface = None with pytest.raises(ValidationError) as exc_info: Model(ip=value) assert exc_info.value.error_count() == 1 # insert_assert(exc_info.value.errors(include_url=False)[0]) assert exc_info.value.errors(include_url=False)[0] == { 'type': 'ip_v6_interface', 'loc': ('ip',), 'msg': 'Input is not a valid IPv6 
interface',
        'input': value,
    }

pydantic-2.10.6/tests/test_parse.py

from typing import List, Tuple

import pytest
from pydantic_core import CoreSchema

from pydantic import BaseModel, GetJsonSchemaHandler, ValidationError, model_validator, parse_obj_as
from pydantic.functional_serializers import model_serializer
from pydantic.json_schema import JsonSchemaValue


class Model(BaseModel):
    a: float
    b: int = 10


def test_obj():
    m = Model.model_validate(dict(a=10.2))
    assert str(m) == 'a=10.2 b=10'


def test_model_validate_fails():
    with pytest.raises(ValidationError) as exc_info:
        Model.model_validate([1, 2, 3])
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'model_type',
            'loc': (),
            'msg': 'Input should be a valid dictionary or instance of Model',
            'input': [1, 2, 3],
            'ctx': {'class_name': 'Model'},
        }
    ]


def test_model_validate_submodel():
    m = Model.model_validate(Model(a=10.2))
    assert m.model_dump() == {'a': 10.2, 'b': 10}


def test_model_validate_wrong_model():
    class Foo(BaseModel):
        c: int = 123

    with pytest.raises(ValidationError) as exc_info:
        Model.model_validate(Foo())
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'model_type',
            'loc': (),
            'msg': 'Input should be a valid dictionary or instance of Model',
            'input': Foo(),
            'ctx': {'class_name': 'Model'},
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        Model.model_validate(Foo().model_dump())
    assert exc_info.value.errors(include_url=False) == [
        {'input': {'c': 123}, 'loc': ('a',), 'msg': 'Field required', 'type': 'missing'}
    ]


def test_root_model_error():
    with pytest.raises(
        TypeError, match="To define root models, use `pydantic.RootModel` rather than a field called '__root__'"
    ):

        class MyModel(BaseModel):
            __root__: str


def test_model_validate_root():
    class MyModel(BaseModel):
        root: str

        # Note that the following three definitions require no changes across all __root__ models
        # I couldn't see a nice way to create a
decorator that reduces the boilerplate, # but if we want to discourage this pattern, perhaps that's okay? @model_validator(mode='before') @classmethod def populate_root(cls, values): return {'root': values} @model_serializer(mode='wrap') def _serialize(self, handler, info): data = handler(self) if info.mode == 'json': return data['root'] else: return data @classmethod def __get_pydantic_json_schema__( cls, core_schema: CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: json_schema = handler(core_schema) root = handler.resolve_ref_schema(json_schema)['properties']['root'] return root # Validation m = MyModel.model_validate('a') assert m.root == 'a' # Serialization assert m.model_dump() == {'root': 'a'} assert m.model_dump_json() == '"a"' # JSON schema assert m.model_json_schema() == {'title': 'Root', 'type': 'string'} def test_parse_root_list(): class MyModel(BaseModel): root: List[str] @model_validator(mode='before') @classmethod def populate_root(cls, values): return {'root': values} @model_serializer(mode='wrap') def _serialize(self, handler, info): data = handler(self) if info.mode == 'json': return data['root'] else: return data @classmethod def model_modify_json_schema(cls, json_schema): return json_schema['properties']['root'] m = MyModel.model_validate(['a']) assert m.model_dump() == {'root': ['a']} assert m.model_dump_json() == '["a"]' assert m.root == ['a'] def test_parse_nested_root_list(): class NestedData(BaseModel): id: str class NestedModel(BaseModel): root: List[NestedData] @model_validator(mode='before') @classmethod def populate_root(cls, values): return {'root': values} @model_serializer(mode='wrap') def _serialize(self, handler, info): data = handler(self) if info.mode == 'json': return data['root'] else: return data @classmethod def model_modify_json_schema(cls, json_schema): return json_schema['properties']['root'] class MyModel(BaseModel): nested: NestedModel m = MyModel.model_validate({'nested': [{'id': 'foo'}]}) assert 
isinstance(m.nested, NestedModel) assert isinstance(m.nested.root[0], NestedData) @pytest.mark.filterwarnings('ignore:`parse_obj_as` is deprecated.*:DeprecationWarning') def test_parse_nested_root_tuple(): class NestedData(BaseModel): id: str class NestedModel(BaseModel): root: Tuple[int, NestedData] @model_validator(mode='before') @classmethod def populate_root(cls, values): return {'root': values} @model_serializer(mode='wrap') def _serialize(self, handler, info): data = handler(self) if info.mode == 'json': return data['root'] else: return data @classmethod def model_modify_json_schema(cls, json_schema): return json_schema['properties']['root'] class MyModel(BaseModel): nested: List[NestedModel] data = [0, {'id': 'foo'}] m = MyModel.model_validate({'nested': [data]}) assert isinstance(m.nested[0], NestedModel) assert isinstance(m.nested[0].root[1], NestedData) nested = parse_obj_as(NestedModel, data) assert isinstance(nested, NestedModel) def test_parse_nested_custom_root(): class NestedModel(BaseModel): root: List[str] @model_validator(mode='before') @classmethod def populate_root(cls, values): return {'root': values} @model_serializer(mode='wrap') def _serialize(self, handler, info): data = handler(self) if info.mode == 'json': return data['root'] else: return data @classmethod def model_modify_json_schema(cls, json_schema): return json_schema['properties']['root'] class MyModel(BaseModel): root: NestedModel @model_validator(mode='before') @classmethod def populate_root(cls, values): return {'root': values} @model_serializer(mode='wrap') def _serialize(self, handler, info): data = handler(self) if info.mode == 'json': return data['root'] else: return data @classmethod def model_modify_json_schema(cls, json_schema): return json_schema['properties']['root'] nested = ['foo', 'bar'] m = MyModel.model_validate(nested) assert isinstance(m, MyModel) assert isinstance(m.root, NestedModel) assert isinstance(m.root.root, List) assert isinstance(m.root.root[0], str) def 
test_json():
    assert Model.model_validate_json('{"a": 12, "b": 8}') == Model(a=12, b=8)

pydantic-2.10.6/tests/test_pickle.py

import dataclasses
import gc
import pickle
from typing import Optional, Type

import pytest

import pydantic
from pydantic import BaseModel, PositiveFloat, ValidationError
from pydantic._internal._model_construction import _PydanticWeakRef
from pydantic.config import ConfigDict

try:
    import cloudpickle
except ImportError:
    cloudpickle = None

pytestmark = pytest.mark.skipif(cloudpickle is None, reason='cloudpickle is not installed')


class IntWrapper:
    def __init__(self, v: int):
        self._v = v

    def get(self) -> int:
        return self._v

    def __eq__(self, other: 'IntWrapper') -> bool:
        return self.get() == other.get()


def test_pickle_pydantic_weakref():
    obj1 = IntWrapper(1)
    ref1 = _PydanticWeakRef(obj1)
    assert ref1() is obj1

    obj2 = IntWrapper(2)
    ref2 = _PydanticWeakRef(obj2)
    assert ref2() is obj2

    ref3 = _PydanticWeakRef(IntWrapper(3))
    gc.collect()  # PyPy does not use reference counting and always relies on GC.
    assert ref3() is None

    d = {
        # Hold a hard reference to the underlying object for ref1 that will also
        # be pickled.
        'hard_ref': obj1,
        # ref1's underlying object has a hard reference in the pickled object so it
        # should maintain the reference after deserialization.
        'has_hard_ref': ref1,
        # ref2's underlying object has no hard reference in the pickled object so it
        # should be `None` after deserialization.
        'has_no_hard_ref': ref2,
        # ref3's underlying object had already gone out of scope before pickling so it
        # should be `None` after deserialization.
        'ref_out_of_scope': ref3,
    }

    loaded = pickle.loads(pickle.dumps(d))
    gc.collect()  # PyPy does not use reference counting and always relies on GC.
assert loaded['hard_ref'] == IntWrapper(1) assert loaded['has_hard_ref']() is loaded['hard_ref'] assert loaded['has_no_hard_ref']() is None assert loaded['ref_out_of_scope']() is None class ImportableModel(BaseModel): foo: str bar: Optional[str] = None val: PositiveFloat = 0.7 def model_factory() -> Type: class NonImportableModel(BaseModel): foo: str bar: Optional[str] = None val: PositiveFloat = 0.7 return NonImportableModel @pytest.mark.parametrize( 'model_type,use_cloudpickle', [ # Importable model can be pickled with either pickle or cloudpickle. (ImportableModel, False), (ImportableModel, True), # Locally-defined model can only be pickled with cloudpickle. (model_factory(), True), ], ) def test_pickle_model(model_type: Type, use_cloudpickle: bool): if use_cloudpickle: model_type = cloudpickle.loads(cloudpickle.dumps(model_type)) else: model_type = pickle.loads(pickle.dumps(model_type)) m = model_type(foo='hi', val=1) assert m.foo == 'hi' assert m.bar is None assert m.val == 1.0 if use_cloudpickle: m = cloudpickle.loads(cloudpickle.dumps(m)) else: m = pickle.loads(pickle.dumps(m)) assert m.foo == 'hi' assert m.bar is None assert m.val == 1.0 with pytest.raises(ValidationError): model_type(foo='hi', val=-1.1) class ImportableNestedModel(BaseModel): inner: ImportableModel def nested_model_factory() -> Type: class NonImportableNestedModel(BaseModel): inner: ImportableModel return NonImportableNestedModel @pytest.mark.parametrize( 'model_type,use_cloudpickle', [ # Importable model can be pickled with either pickle or cloudpickle. (ImportableNestedModel, False), (ImportableNestedModel, True), # Locally-defined model can only be pickled with cloudpickle. 
(nested_model_factory(), True), ], ) def test_pickle_nested_model(model_type: Type, use_cloudpickle: bool): if use_cloudpickle: model_type = cloudpickle.loads(cloudpickle.dumps(model_type)) else: model_type = pickle.loads(pickle.dumps(model_type)) m = model_type(inner=ImportableModel(foo='hi', val=1)) assert m.inner.foo == 'hi' assert m.inner.bar is None assert m.inner.val == 1.0 if use_cloudpickle: m = cloudpickle.loads(cloudpickle.dumps(m)) else: m = pickle.loads(pickle.dumps(m)) assert m.inner.foo == 'hi' assert m.inner.bar is None assert m.inner.val == 1.0 @pydantic.dataclasses.dataclass class ImportableDataclass: a: int b: float def dataclass_factory() -> Type: @pydantic.dataclasses.dataclass class NonImportableDataclass: a: int b: float return NonImportableDataclass @dataclasses.dataclass class ImportableBuiltinDataclass: a: int b: float def builtin_dataclass_factory() -> Type: @dataclasses.dataclass class NonImportableBuiltinDataclass: a: int b: float return NonImportableBuiltinDataclass class ImportableChildDataclass(ImportableDataclass): pass def child_dataclass_factory() -> Type: class NonImportableChildDataclass(ImportableDataclass): pass return NonImportableChildDataclass @pytest.mark.parametrize( 'dataclass_type,use_cloudpickle', [ # Importable Pydantic dataclass can be pickled with either pickle or cloudpickle. (ImportableDataclass, False), (ImportableDataclass, True), (ImportableChildDataclass, False), (ImportableChildDataclass, True), # Locally-defined Pydantic dataclass can only be pickled with cloudpickle. (dataclass_factory(), True), (child_dataclass_factory(), True), # Pydantic dataclass generated from builtin can only be pickled with cloudpickle. (pydantic.dataclasses.dataclass(ImportableBuiltinDataclass), True), # Pydantic dataclass generated from locally-defined builtin can only be pickled with cloudpickle. 
(pydantic.dataclasses.dataclass(builtin_dataclass_factory()), True), ], ) def test_pickle_dataclass(dataclass_type: Type, use_cloudpickle: bool): if use_cloudpickle: dataclass_type = cloudpickle.loads(cloudpickle.dumps(dataclass_type)) else: dataclass_type = pickle.loads(pickle.dumps(dataclass_type)) d = dataclass_type('1', '2.5') assert d.a == 1 assert d.b == 2.5 if use_cloudpickle: d = cloudpickle.loads(cloudpickle.dumps(d)) else: d = pickle.loads(pickle.dumps(d)) assert d.a == 1 assert d.b == 2.5 d = dataclass_type(b=10, a=20) assert d.a == 20 assert d.b == 10 if use_cloudpickle: d = cloudpickle.loads(cloudpickle.dumps(d)) else: d = pickle.loads(pickle.dumps(d)) assert d.a == 20 assert d.b == 10 class ImportableNestedDataclassModel(BaseModel): inner: ImportableBuiltinDataclass def nested_dataclass_model_factory() -> Type: class NonImportableNestedDataclassModel(BaseModel): inner: ImportableBuiltinDataclass return NonImportableNestedDataclassModel @pytest.mark.parametrize( 'model_type,use_cloudpickle', [ # Importable model can be pickled with either pickle or cloudpickle. (ImportableNestedDataclassModel, False), (ImportableNestedDataclassModel, True), # Locally-defined model can only be pickled with cloudpickle. 
        (nested_dataclass_model_factory(), True),
    ],
)
def test_pickle_dataclass_nested_in_model(model_type: Type, use_cloudpickle: bool):
    if use_cloudpickle:
        model_type = cloudpickle.loads(cloudpickle.dumps(model_type))
    else:
        model_type = pickle.loads(pickle.dumps(model_type))

    m = model_type(inner=ImportableBuiltinDataclass(a=10, b=20))
    assert m.inner.a == 10
    assert m.inner.b == 20

    if use_cloudpickle:
        m = cloudpickle.loads(cloudpickle.dumps(m))
    else:
        m = pickle.loads(pickle.dumps(m))

    assert m.inner.a == 10
    assert m.inner.b == 20


class ImportableModelWithConfig(BaseModel):
    model_config = ConfigDict(title='MyTitle')


def model_with_config_factory() -> Type:
    class NonImportableModelWithConfig(BaseModel):
        model_config = ConfigDict(title='MyTitle')

    return NonImportableModelWithConfig


@pytest.mark.parametrize(
    'model_type,use_cloudpickle',
    [
        (ImportableModelWithConfig, False),
        (ImportableModelWithConfig, True),
        (model_with_config_factory(), True),
    ],
)
def test_pickle_model_with_config(model_type: Type, use_cloudpickle: bool):
    if use_cloudpickle:
        model_type = cloudpickle.loads(cloudpickle.dumps(model_type))
    else:
        model_type = pickle.loads(pickle.dumps(model_type))

    assert model_type.model_config['title'] == 'MyTitle'

pydantic-2.10.6/tests/test_pipeline.py

"""Tests for the experimental transform module."""

from __future__ import annotations

import datetime
import sys
import warnings
from decimal import Decimal
from typing import Any, Callable, Dict, FrozenSet, List, Set, Tuple, Union

import pytest
import pytz
from annotated_types import Interval
from typing_extensions import Annotated

if sys.version_info >= (3, 9):
    pass

from pydantic import PydanticExperimentalWarning, TypeAdapter, ValidationError

with warnings.catch_warnings():
    warnings.filterwarnings('ignore', category=PydanticExperimentalWarning)
    from pydantic.experimental.pipeline import _Pipeline, transform, validate_as  # type: ignore
@pytest.mark.parametrize('potato_variation', ['potato', ' potato ', ' potato', 'potato ', ' POTATO ', ' PoTatO ']) def test_parse_str(potato_variation: str) -> None: ta_lower = TypeAdapter[str](Annotated[str, validate_as(...).str_strip().str_lower()]) assert ta_lower.validate_python(potato_variation) == 'potato' def test_parse_str_with_pattern() -> None: ta_pattern = TypeAdapter[str](Annotated[str, validate_as(...).str_pattern(r'[a-z]+')]) assert ta_pattern.validate_python('potato') == 'potato' with pytest.raises(ValueError): ta_pattern.validate_python('POTATO') @pytest.mark.parametrize( 'type_, pipeline, valid_cases, invalid_cases', [ (int, validate_as(...).ge(0), [0, 1, 100], [-1, -100]), (float, validate_as(...).ge(0.0), [1.8, 0.0], [-1.0]), (Decimal, validate_as(...).ge(Decimal(0.0)), [Decimal(1), Decimal(0.0)], [Decimal(-1.0)]), (int, validate_as(...).le(5), [2, 4], [6, 100]), (float, validate_as(...).le(1.0), [0.5, 0.0], [100.0]), (Decimal, validate_as(...).le(Decimal(1.0)), [Decimal(1)], [Decimal(5.0)]), (int, validate_as(...).gt(0), [1, 2, 100], [0, -1]), (float, validate_as(...).gt(0.0), [0.1, 1.8], [0.0, -1.0]), (Decimal, validate_as(...).gt(Decimal(0.0)), [Decimal(1)], [Decimal(0.0), Decimal(-1.0)]), (int, validate_as(...).lt(5), [2, 4], [5, 6, 100]), (float, validate_as(...).lt(1.0), [0.5, 0.0], [1.0, 100.0]), (Decimal, validate_as(...).lt(Decimal(1.0)), [Decimal(0.5)], [Decimal(1.0), Decimal(5.0)]), ], ) def test_ge_le_gt_lt( type_: Any, pipeline: _Pipeline[Any, Any], valid_cases: list[Any], invalid_cases: list[Any] ) -> None: ta = TypeAdapter[Any](Annotated[type_, pipeline]) for x in valid_cases: assert ta.validate_python(x) == x for y in invalid_cases: with pytest.raises(ValueError): ta.validate_python(y) @pytest.mark.parametrize( 'type_, pipeline, valid_cases, invalid_cases', [ (int, validate_as(int).multiple_of(5), [5, 20, 0], [18, 7]), (float, validate_as(float).multiple_of(2.5), [2.5, 5.0, 7.5], [3.0, 1.1]), ( Decimal, 
validate_as(Decimal).multiple_of(Decimal('1.5')), [Decimal('1.5'), Decimal('3.0'), Decimal('4.5')], [Decimal('1.4'), Decimal('2.1')], ), ], ) def test_parse_multipleOf(type_: Any, pipeline: Any, valid_cases: list[Any], invalid_cases: list[Any]) -> None: ta = TypeAdapter[Any](Annotated[type_, pipeline]) for x in valid_cases: assert ta.validate_python(x) == x for y in invalid_cases: with pytest.raises(ValueError): ta.validate_python(y) @pytest.mark.parametrize( 'type_, pipeline, valid_cases, invalid_cases', [ (int, validate_as(int).constrain(Interval(ge=0, le=10)), [0, 5, 10], [11]), (float, validate_as(float).constrain(Interval(gt=0.0, lt=10.0)), [0.1, 9.9], [10.0]), ( Decimal, validate_as(Decimal).constrain(Interval(ge=Decimal('1.0'), lt=Decimal('10.0'))), [Decimal('1.0'), Decimal('5.5'), Decimal('9.9')], [Decimal('0.0'), Decimal('10.0')], ), (int, validate_as(int).constrain(Interval(gt=1, lt=5)), [2, 4], [1, 5]), (float, validate_as(float).constrain(Interval(ge=1.0, le=5.0)), [1.0, 3.0, 5.0], [0.9, 5.1]), ], ) def test_interval_constraints(type_: Any, pipeline: Any, valid_cases: list[Any], invalid_cases: list[Any]) -> None: ta = TypeAdapter[Any](Annotated[type_, pipeline]) for x in valid_cases: assert ta.validate_python(x) == x for y in invalid_cases: with pytest.raises(ValueError): ta.validate_python(y) @pytest.mark.parametrize( 'type_, pipeline, valid_cases, invalid_cases', [ ( str, validate_as(str).len(min_len=2, max_len=5), ['ab', 'abc', 'abcd', 'abcde'], ['a', 'abcdef'], ), ( List[int], validate_as(List[int]).len(min_len=1, max_len=3), [[1], [1, 2], [1, 2, 3]], [[], [1, 2, 3, 4]], ), (Tuple[int, ...], validate_as(Tuple[int, ...]).len(min_len=1, max_len=2), [(1,), (1, 2)], [(), (1, 2, 3)]), ( Set[int], validate_as(Set[int]).len(min_len=2, max_len=4), [{1, 2}, {1, 2, 3}, {1, 2, 3, 4}], [{1}, {1, 2, 3, 4, 5}], ), ( FrozenSet[int], validate_as(FrozenSet[int]).len(min_len=2, max_len=3), [frozenset({1, 2}), frozenset({1, 2, 3})], [frozenset({1}), frozenset({1, 2, 
3, 4})], ), ( Dict[str, int], validate_as(Dict[str, int]).len(min_len=1, max_len=2), [{'a': 1}, {'a': 1, 'b': 2}], [{}, {'a': 1, 'b': 2, 'c': 3}], ), ( str, validate_as(str).len(min_len=2), # max_len is None ['ab', 'abc', 'abcd', 'abcde', 'abcdef'], ['a'], ), ], ) def test_len_constraints(type_: Any, pipeline: Any, valid_cases: list[Any], invalid_cases: list[Any]) -> None: ta = TypeAdapter[Any](Annotated[type_, pipeline]) for x in valid_cases: assert ta.validate_python(x) == x for y in invalid_cases: with pytest.raises(ValueError): ta.validate_python(y) def test_parse_tz() -> None: ta_tz = TypeAdapter[datetime.datetime]( Annotated[ datetime.datetime, validate_as(datetime.datetime).datetime_tz_naive(), ] ) date = datetime.datetime(2032, 6, 4, 11, 15, 30, 400000) assert ta_tz.validate_python(date) == date date_a = datetime.datetime(2032, 6, 4, 11, 15, 30, 400000, tzinfo=pytz.UTC) with pytest.raises(ValueError): ta_tz.validate_python(date_a) ta_tza = TypeAdapter[datetime.datetime]( Annotated[ datetime.datetime, validate_as(datetime.datetime).datetime_tz_aware(), ] ) date_a = datetime.datetime(2032, 6, 4, 11, 15, 30, 400000, pytz.UTC) assert ta_tza.validate_python(date_a) == date_a with pytest.raises(ValueError): ta_tza.validate_python(date) @pytest.mark.parametrize( 'method, method_arg, input_string, expected_output', [ # transforms ('lower', None, 'POTATO', 'potato'), ('upper', None, 'potato', 'POTATO'), ('title', None, 'potato potato', 'Potato Potato'), ('strip', None, ' potato ', 'potato'), # constraints ('pattern', r'[a-z]+', 'potato', 'potato'), # check lowercase # predicates ('contains', 'pot', 'potato', 'potato'), ('starts_with', 'pot', 'potato', 'potato'), ('ends_with', 'ato', 'potato', 'potato'), ], ) def test_string_validator_valid(method: str, method_arg: str | None, input_string: str, expected_output: str): # annotated metadata is equivalent to validate_as(str).str_method(method_arg) # ex: validate_as(str).str_contains('pot') annotated_metadata = 
getattr(validate_as(str), 'str_' + method) annotated_metadata = annotated_metadata(method_arg) if method_arg else annotated_metadata() ta = TypeAdapter[str](Annotated[str, annotated_metadata]) assert ta.validate_python(input_string) == expected_output def test_string_validator_invalid() -> None: ta_contains = TypeAdapter[str](Annotated[str, validate_as(str).str_contains('potato')]) with pytest.raises(ValidationError): ta_contains.validate_python('tomato') ta_starts_with = TypeAdapter[str](Annotated[str, validate_as(str).str_starts_with('potato')]) with pytest.raises(ValidationError): ta_starts_with.validate_python('tomato') ta_ends_with = TypeAdapter[str](Annotated[str, validate_as(str).str_ends_with('potato')]) with pytest.raises(ValidationError): ta_ends_with.validate_python('tomato') def test_parse_int() -> None: ta_gt = TypeAdapter[int](Annotated[int, validate_as(int).gt(0)]) assert ta_gt.validate_python(1) == 1 assert ta_gt.validate_python('1') == 1 with pytest.raises(ValidationError): ta_gt.validate_python(0) ta_gt_strict = TypeAdapter[int](Annotated[int, validate_as(int, strict=True).gt(0)]) assert ta_gt_strict.validate_python(1) == 1 with pytest.raises(ValidationError): ta_gt_strict.validate_python('1') with pytest.raises(ValidationError): ta_gt_strict.validate_python(0) def test_parse_str_to_int() -> None: ta = TypeAdapter[int](Annotated[int, validate_as(str).str_strip().validate_as(int)]) assert ta.validate_python('1') == 1 assert ta.validate_python(' 1 ') == 1 with pytest.raises(ValidationError): ta.validate_python('a') def test_predicates() -> None: ta_int = TypeAdapter[int](Annotated[int, validate_as(int).predicate(lambda x: x % 2 == 0)]) assert ta_int.validate_python(2) == 2 with pytest.raises(ValidationError): ta_int.validate_python(1) ta_str = TypeAdapter[int](Annotated[str, validate_as(str).predicate(lambda x: x != 'potato')]) assert ta_str.validate_python('tomato') == 'tomato' with pytest.raises(ValidationError): ta_str.validate_python('potato') 
@pytest.mark.parametrize( 'model, expected_val_schema, expected_ser_schema', [ ( Annotated[Union[int, str], validate_as(...) | validate_as(str)], {'anyOf': [{'type': 'integer'}, {'type': 'string'}]}, {'anyOf': [{'type': 'integer'}, {'type': 'string'}]}, ), ( Annotated[int, validate_as(...) | validate_as(str).validate_as(int)], {'anyOf': [{'type': 'integer'}, {'type': 'string'}]}, {'type': 'integer'}, ), ( Annotated[int, validate_as(...) | validate_as(str).validate_as(int)], {'anyOf': [{'type': 'integer'}, {'type': 'string'}]}, {'type': 'integer'}, ), ( Annotated[int, validate_as(...) | validate_as(str).transform(int).validate_as(int)], {'anyOf': [{'type': 'integer'}, {'type': 'string'}]}, {'type': 'integer'}, ), ( Annotated[int, validate_as(int).gt(0).lt(100)], {'type': 'integer', 'exclusiveMinimum': 0, 'exclusiveMaximum': 100}, {'type': 'integer', 'exclusiveMinimum': 0, 'exclusiveMaximum': 100}, ), ( Annotated[int, validate_as(int).gt(0) | validate_as(int).lt(100)], {'anyOf': [{'type': 'integer', 'exclusiveMinimum': 0}, {'type': 'integer', 'exclusiveMaximum': 100}]}, {'anyOf': [{'type': 'integer', 'exclusiveMinimum': 0}, {'type': 'integer', 'exclusiveMaximum': 100}]}, ), ( Annotated[List[int], validate_as(...).len(0, 100)], {'type': 'array', 'items': {'type': 'integer'}, 'maxItems': 100}, {'type': 'array', 'items': {'type': 'integer'}, 'maxItems': 100}, ), # note - we added this to confirm the fact that the transform doesn't impact the JSON schema, # as it's applied as a function after validator ( Annotated[int, validate_as(str).transform(int)], {'type': 'string'}, {'type': 'string'}, # see this is still string ), # in juxtaposition to the case above, when we use validate_as (recommended), # the JSON schema is updated appropriately ( Annotated[int, validate_as(str).validate_as(int)], {'type': 'string'}, {'type': 'integer'}, # aha, this is now an integer ), ], ) def test_json_schema( model: type[Any], expected_val_schema: dict[str, Any], expected_ser_schema: 
    dict[str, Any],
) -> None:
    ta = TypeAdapter(model)

    schema = ta.json_schema(mode='validation')
    assert schema == expected_val_schema

    schema = ta.json_schema(mode='serialization')
    assert schema == expected_ser_schema


def test_transform_first_step() -> None:
    """Check that when transform() is used as the first step in a pipeline it runs after parsing."""
    ta = TypeAdapter[int](Annotated[int, transform(lambda x: x + 1)])
    assert ta.validate_python('1') == 2


def test_not_eq() -> None:
    ta = TypeAdapter[int](Annotated[str, validate_as(str).not_eq('potato')])
    assert ta.validate_python('tomato') == 'tomato'
    with pytest.raises(ValidationError):
        ta.validate_python('potato')


def test_eq() -> None:
    ta = TypeAdapter[int](Annotated[str, validate_as(str).eq('potato')])
    assert ta.validate_python('potato') == 'potato'
    with pytest.raises(ValidationError):
        ta.validate_python('tomato')


def test_not_in() -> None:
    ta = TypeAdapter[int](Annotated[str, validate_as(str).not_in(['potato', 'tomato'])])
    assert ta.validate_python('carrot') == 'carrot'
    with pytest.raises(ValidationError):
        ta.validate_python('potato')


def test_in() -> None:
    ta = TypeAdapter[int](Annotated[str, validate_as(str).in_(['potato', 'tomato'])])
    assert ta.validate_python('potato') == 'potato'
    with pytest.raises(ValidationError):
        ta.validate_python('carrot')


def test_composition() -> None:
    ta = TypeAdapter[int](Annotated[int, validate_as(int).gt(10) | validate_as(int).lt(5)])
    assert ta.validate_python(1) == 1
    assert ta.validate_python(20) == 20
    with pytest.raises(ValidationError):
        ta.validate_python(9)

    ta = TypeAdapter[int](Annotated[int, validate_as(int).gt(10) & validate_as(int).le(20)])
    assert ta.validate_python(15) == 15
    with pytest.raises(ValidationError):
        ta.validate_python(9)
    with pytest.raises(ValidationError):
        ta.validate_python(21)

    # test that sticking a transform in the middle doesn't break the composition
    calls: list[tuple[str, int]] = []

    def tf(step: str) -> Callable[[int], int]:
        def inner(x: int) -> int:
            calls.append((step, x))
            return x

        return inner

    ta = TypeAdapter[int](
        Annotated[
            int,
            validate_as(int).transform(tf('1')).gt(10).transform(tf('2'))
            | validate_as(int).transform(tf('3')).lt(5).transform(tf('4')),
        ]
    )
    assert ta.validate_python(1) == 1
    assert calls == [('1', 1), ('3', 1), ('4', 1)]
    calls.clear()

    assert ta.validate_python(20) == 20
    assert calls == [('1', 20), ('2', 20)]
    calls.clear()

    with pytest.raises(ValidationError):
        ta.validate_python(9)
    assert calls == [('1', 9), ('3', 9)]
    calls.clear()

    ta = TypeAdapter[int](
        Annotated[
            int,
            validate_as(int).transform(tf('1')).gt(10).transform(tf('2'))
            & validate_as(int).transform(tf('3')).le(20).transform(tf('4')),
        ]
    )
    assert ta.validate_python(15) == 15
    assert calls == [('1', 15), ('2', 15), ('3', 15), ('4', 15)]
    calls.clear()

    with pytest.raises(ValidationError):
        ta.validate_python(9)
    assert calls == [('1', 9)]
    calls.clear()

    with pytest.raises(ValidationError):
        ta.validate_python(21)
    assert calls == [('1', 21), ('2', 21), ('3', 21)]
    calls.clear()

pydantic-2.10.6/tests/test_plugin_loader.py

import importlib.metadata as importlib_metadata
import os
from unittest.mock import patch

import pytest

import pydantic.plugin._loader as loader


class EntryPoint:
    def __init__(self, name, value, group):
        self.name = name
        self.value = value
        self.group = group

    def load(self):
        return self.value


class Dist:
    entry_points = []

    def __init__(self, entry_points):
        self.entry_points = entry_points


@pytest.fixture
def reset_plugins():
    global loader
    initial_plugins = loader._plugins
    loader._plugins = None
    yield
    # teardown
    loader._plugins = initial_plugins


@pytest.fixture(autouse=True)
def mock():
    mock_entry_1 = EntryPoint(name='test_plugin1', value='test_plugin:plugin1', group='pydantic')
    mock_entry_2 = EntryPoint(name='test_plugin2', value='test_plugin:plugin2', group='pydantic')
    mock_entry_3 = EntryPoint(name='test_plugin3', value='test_plugin:plugin3',
group='pydantic') mock_dist = Dist([mock_entry_1, mock_entry_2, mock_entry_3]) with patch.object(importlib_metadata, 'distributions', return_value=[mock_dist]): yield def test_loader(reset_plugins): res = loader.get_plugins() assert list(res) == ['test_plugin:plugin1', 'test_plugin:plugin2', 'test_plugin:plugin3'] def test_disable_all(reset_plugins): os.environ['PYDANTIC_DISABLE_PLUGINS'] = '__all__' res = loader.get_plugins() assert res == () def test_disable_all_1(reset_plugins): os.environ['PYDANTIC_DISABLE_PLUGINS'] = '1' res = loader.get_plugins() assert res == () def test_disable_true(reset_plugins): os.environ['PYDANTIC_DISABLE_PLUGINS'] = 'true' res = loader.get_plugins() assert res == () def test_disable_one(reset_plugins): os.environ['PYDANTIC_DISABLE_PLUGINS'] = 'test_plugin1' res = loader.get_plugins() assert len(list(res)) == 2 assert 'test_plugin:plugin1' not in list(res) def test_disable_multiple(reset_plugins): os.environ['PYDANTIC_DISABLE_PLUGINS'] = 'test_plugin1,test_plugin2' res = loader.get_plugins() assert len(list(res)) == 1 assert 'test_plugin:plugin1' not in list(res) assert 'test_plugin:plugin2' not in list(res) pydantic-2.10.6/tests/test_plugins.py000066400000000000000000000405261474456633400176760ustar00rootroot00000000000000from __future__ import annotations import contextlib from functools import partial from typing import Any, Generator, List from pydantic_core import ValidationError from pydantic import BaseModel, TypeAdapter, create_model, dataclasses, field_validator, validate_call from pydantic.plugin import ( PydanticPluginProtocol, SchemaTypePath, ValidateJsonHandlerProtocol, ValidatePythonHandlerProtocol, ValidateStringsHandlerProtocol, ) from pydantic.plugin._loader import _plugins @contextlib.contextmanager def install_plugin(plugin: PydanticPluginProtocol) -> Generator[None, None, None]: _plugins[plugin.__class__.__qualname__] = plugin try: yield finally: _plugins.clear() def test_on_validate_json_on_success() -> None: class 
CustomOnValidateJson(ValidateJsonHandlerProtocol): def on_enter( self, input: str | bytes | bytearray, *, strict: bool | None = None, context: dict[str, Any] | None = None, self_instance: Any | None = None, ) -> None: assert input == '{"a": 1}' assert strict is None assert context is None assert self_instance is None def on_success(self, result: Any) -> None: assert isinstance(result, Model) class CustomPlugin(PydanticPluginProtocol): def new_schema_validator(self, schema, schema_type, schema_type_path, schema_kind, config, plugin_settings): assert config == {'title': 'Model'} assert plugin_settings == {'observe': 'all'} assert schema_type.__name__ == 'Model' assert schema_type_path == SchemaTypePath( 'tests.test_plugins', 'test_on_validate_json_on_success..Model' ) assert schema_kind == 'BaseModel' return None, CustomOnValidateJson(), None plugin = CustomPlugin() with install_plugin(plugin): class Model(BaseModel, plugin_settings={'observe': 'all'}): a: int assert Model.model_validate({'a': 1}) == Model(a=1) assert Model.model_validate_json('{"a": 1}') == Model(a=1) assert Model.__pydantic_validator__.title == 'Model' def test_on_validate_json_on_error() -> None: class CustomOnValidateJson: def on_enter( self, input: str | bytes | bytearray, *, strict: bool | None = None, context: dict[str, Any] | None = None, self_instance: Any | None = None, ) -> None: assert input == '{"a": "potato"}' assert strict is None assert context is None assert self_instance is None def on_error(self, error: ValidationError) -> None: assert error.title == 'Model' assert error.errors(include_url=False) == [ { 'input': 'potato', 'loc': ('a',), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', }, ] class Plugin(PydanticPluginProtocol): def new_schema_validator(self, schema, schema_type, schema_type_path, schema_kind, config, plugin_settings): assert config == {'title': 'Model'} assert plugin_settings == {'observe': 'all'} return None, 
CustomOnValidateJson(), None plugin = Plugin() with install_plugin(plugin): class Model(BaseModel, plugin_settings={'observe': 'all'}): a: int assert Model.model_validate({'a': 1}) == Model(a=1) with contextlib.suppress(ValidationError): Model.model_validate_json('{"a": "potato"}') def test_on_validate_python_on_success() -> None: class CustomOnValidatePython(ValidatePythonHandlerProtocol): def on_enter( self, input: Any, *, strict: bool | None = None, from_attributes: bool | None = None, context: dict[str, Any] | None = None, self_instance: Any | None = None, ) -> None: assert input == {'a': 1} assert strict is None assert context is None assert self_instance is None def on_success(self, result: Any) -> None: assert isinstance(result, Model) class Plugin: def new_schema_validator(self, schema, schema_type, schema_type_path, schema_kind, config, plugin_settings): assert config == {'title': 'Model'} assert plugin_settings == {'observe': 'all'} assert schema_type.__name__ == 'Model' assert schema_kind == 'BaseModel' return CustomOnValidatePython(), None, None plugin = Plugin() with install_plugin(plugin): class Model(BaseModel, plugin_settings={'observe': 'all'}): a: int assert Model.model_validate({'a': 1}).model_dump() == {'a': 1} assert Model.model_validate_json('{"a": 1}').model_dump() == {'a': 1} def test_on_validate_python_on_error() -> None: class CustomOnValidatePython(ValidatePythonHandlerProtocol): def on_enter( self, input: Any, *, strict: bool | None = None, from_attributes: bool | None = None, context: dict[str, Any] | None = None, self_instance: Any | None = None, ) -> None: assert input == {'a': 'potato'} assert strict is None assert context is None assert self_instance is None def on_error(self, error: ValidationError) -> None: assert error.title == 'Model' assert error.errors(include_url=False) == [ { 'input': 'potato', 'loc': ('a',), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', }, ] class 
Plugin(PydanticPluginProtocol): def new_schema_validator(self, schema, schema_type, schema_type_path, schema_kind, config, plugin_settings): assert config == {'title': 'Model'} assert plugin_settings == {'observe': 'all'} assert schema_type.__name__ == 'Model' assert schema_kind == 'BaseModel' return CustomOnValidatePython(), None, None plugin = Plugin() with install_plugin(plugin): class Model(BaseModel, plugin_settings={'observe': 'all'}): a: int with contextlib.suppress(ValidationError): Model.model_validate({'a': 'potato'}) assert Model.model_validate_json('{"a": 1}').model_dump() == {'a': 1} def test_stateful_plugin() -> None: stack: list[Any] = [] class CustomOnValidatePython(ValidatePythonHandlerProtocol): def on_enter( self, input: Any, *, strict: bool | None = None, from_attributes: bool | None = None, context: dict[str, Any] | None = None, self_instance: Any | None = None, ) -> None: stack.append(input) def on_success(self, result: Any) -> None: stack.pop() def on_error(self, error: Exception) -> None: stack.pop() def on_exception(self, exception: Exception) -> None: stack.pop() class Plugin(PydanticPluginProtocol): def new_schema_validator(self, schema, schema_type, schema_type_path, schema_kind, config, plugin_settings): return CustomOnValidatePython(), None, None plugin = Plugin() class MyException(Exception): pass with install_plugin(plugin): class Model(BaseModel, plugin_settings={'observe': 'all'}): a: int @field_validator('a') def validate_a(cls, v: int) -> int: if v < 0: raise MyException return v with contextlib.suppress(ValidationError): Model.model_validate({'a': 'potato'}) assert not stack with contextlib.suppress(MyException): Model.model_validate({'a': -1}) assert not stack assert Model.model_validate({'a': 1}).a == 1 assert not stack def test_all_handlers(): log = [] class Python(ValidatePythonHandlerProtocol): def on_enter(self, input, **kwargs) -> None: log.append(f'python enter input={input} kwargs={kwargs}') def on_success(self, result: 
Any) -> None: log.append(f'python success result={result}') def on_error(self, error: ValidationError) -> None: log.append(f'python error error={error}') class Json(ValidateJsonHandlerProtocol): def on_enter(self, input, **kwargs) -> None: log.append(f'json enter input={input} kwargs={kwargs}') def on_success(self, result: Any) -> None: log.append(f'json success result={result}') def on_error(self, error: ValidationError) -> None: log.append(f'json error error={error}') class Strings(ValidateStringsHandlerProtocol): def on_enter(self, input, **kwargs) -> None: log.append(f'strings enter input={input} kwargs={kwargs}') def on_success(self, result: Any) -> None: log.append(f'strings success result={result}') def on_error(self, error: ValidationError) -> None: log.append(f'strings error error={error}') class Plugin(PydanticPluginProtocol): def new_schema_validator(self, schema, schema_type, schema_type_path, schema_kind, config, plugin_settings): return Python(), Json(), Strings() plugin = Plugin() with install_plugin(plugin): class Model(BaseModel): a: int assert Model(a=1).model_dump() == {'a': 1} # insert_assert(log) assert log == ["python enter input={'a': 1} kwargs={'self_instance': Model()}", 'python success result=a=1'] log.clear() assert Model.model_validate_json('{"a": 2}', context={'c': 2}).model_dump() == {'a': 2} # insert_assert(log) assert log == [ "json enter input={\"a\": 2} kwargs={'strict': None, 'context': {'c': 2}}", 'json success result=a=2', ] log.clear() assert Model.model_validate_strings({'a': '3'}, strict=True, context={'c': 3}).model_dump() == {'a': 3} # insert_assert(log) assert log == [ "strings enter input={'a': '3'} kwargs={'strict': True, 'context': {'c': 3}}", 'strings success result=a=3', ] def test_plugin_path_dataclass() -> None: class CustomOnValidatePython(ValidatePythonHandlerProtocol): pass class Plugin: def new_schema_validator(self, schema, schema_type, schema_type_path, schema_kind, config, plugin_settings): assert 
schema_type.__name__ == 'Bar' assert schema_type_path == SchemaTypePath('tests.test_plugins', 'test_plugin_path_dataclass..Bar') assert schema_kind == 'dataclass' return CustomOnValidatePython(), None, None plugin = Plugin() with install_plugin(plugin): @dataclasses.dataclass class Bar: a: int def test_plugin_path_type_adapter() -> None: class CustomOnValidatePython(ValidatePythonHandlerProtocol): pass class Plugin: def new_schema_validator(self, schema, schema_type, schema_type_path, schema_kind, config, plugin_settings): assert str(schema_type) == 'typing.List[str]' assert schema_type_path == SchemaTypePath('tests.test_plugins', 'typing.List[str]') assert schema_kind == 'TypeAdapter' return CustomOnValidatePython(), None, None plugin = Plugin() with install_plugin(plugin): adapter = TypeAdapter(List[str]) adapter.validate_python(['a', 'b']) def test_plugin_path_type_adapter_with_module() -> None: class CustomOnValidatePython(ValidatePythonHandlerProtocol): pass class Plugin: def new_schema_validator(self, schema, schema_type, schema_type_path, schema_kind, config, plugin_settings): assert str(schema_type) == 'typing.List[str]' assert schema_type_path == SchemaTypePath('provided_module_by_type_adapter', 'typing.List[str]') assert schema_kind == 'TypeAdapter' return CustomOnValidatePython(), None, None plugin = Plugin() with install_plugin(plugin): TypeAdapter(List[str], module='provided_module_by_type_adapter') def test_plugin_path_type_adapter_without_name_in_globals() -> None: class CustomOnValidatePython(ValidatePythonHandlerProtocol): pass class Plugin: def new_schema_validator(self, schema, schema_type, schema_type_path, schema_kind, config, plugin_settings): assert str(schema_type) == 'typing.List[str]' assert schema_type_path == SchemaTypePath('', 'typing.List[str]') assert schema_kind == 'TypeAdapter' return CustomOnValidatePython(), None, None plugin = Plugin() with install_plugin(plugin): code = """ from typing import List import pydantic 
pydantic.TypeAdapter(List[str]) """ exec(code, {'bar': 'baz'}) def test_plugin_path_validate_call() -> None: class CustomOnValidatePython(ValidatePythonHandlerProtocol): pass class Plugin1: def new_schema_validator(self, schema, schema_type, schema_type_path, schema_kind, config, plugin_settings): assert schema_type.__name__ == 'foo' assert schema_type_path == SchemaTypePath( 'tests.test_plugins', 'test_plugin_path_validate_call..foo' ) assert schema_kind == 'validate_call' return CustomOnValidatePython(), None, None plugin = Plugin1() with install_plugin(plugin): @validate_call() def foo(a: int): return a class Plugin2: def new_schema_validator(self, schema, schema_type, schema_type_path, schema_kind, config, plugin_settings): assert schema_type.__name__ == 'my_wrapped_function' assert schema_type_path == SchemaTypePath( 'tests.test_plugins', 'partial(test_plugin_path_validate_call..my_wrapped_function)' ) assert schema_kind == 'validate_call' return CustomOnValidatePython(), None, None plugin = Plugin2() with install_plugin(plugin): def my_wrapped_function(a: int, b: int, c: int): return a + b + c my_partial_function = partial(my_wrapped_function, c=3) validate_call(my_partial_function) def test_plugin_path_create_model() -> None: class CustomOnValidatePython(ValidatePythonHandlerProtocol): pass class Plugin: def new_schema_validator(self, schema, schema_type, schema_type_path, schema_kind, config, plugin_settings): assert schema_type.__name__ == 'FooModel' assert list(schema_type.model_fields.keys()) == ['foo', 'bar'] assert schema_type_path == SchemaTypePath('tests.test_plugins', 'FooModel') assert schema_kind == 'create_model' return CustomOnValidatePython(), None, None plugin = Plugin() with install_plugin(plugin): create_model('FooModel', foo=(str, ...), bar=(int, 123)) def test_plugin_path_complex() -> None: paths: list[tuple(str, str)] = [] class CustomOnValidatePython(ValidatePythonHandlerProtocol): pass class Plugin: def new_schema_validator(self, 
schema, schema_type, schema_type_path, schema_kind, config, plugin_settings): paths.append((schema_type.__name__, schema_type_path, schema_kind)) return CustomOnValidatePython(), None, None plugin = Plugin() with install_plugin(plugin): def foo(): class Model1(BaseModel): pass def bar(): class Model2(BaseModel): pass foo() bar() assert paths == [ ( 'Model1', SchemaTypePath('tests.test_plugins', 'test_plugin_path_complex..foo..Model1'), 'BaseModel', ), ( 'Model2', SchemaTypePath('tests.test_plugins', 'test_plugin_path_complex..bar..Model2'), 'BaseModel', ), ] pydantic-2.10.6/tests/test_private_attributes.py000066400000000000000000000261621474456633400221350ustar00rootroot00000000000000import functools from typing import ClassVar, Generic, TypeVar import pytest from pydantic_core import PydanticUndefined from pydantic import BaseModel, ConfigDict, PrivateAttr, computed_field def test_private_attribute(): default = {'a': {}} class Model(BaseModel): _foo = PrivateAttr(default) assert set(Model.__private_attributes__) == {'_foo'} m = Model() assert m._foo == default assert m._foo is not default assert m._foo['a'] is not default['a'] m._foo = None assert m._foo is None assert m.model_dump() == {} assert m.__dict__ == {} def test_private_attribute_double_leading_underscore(): default = {'a': {}} class Model(BaseModel): __foo = PrivateAttr(default) assert set(Model.__private_attributes__) == {'_Model__foo'} m = Model() with pytest.raises(AttributeError, match='__foo'): m.__foo assert m._Model__foo == default assert m._Model__foo is not default assert m._Model__foo['a'] is not default['a'] m._Model__foo = None assert m._Model__foo is None assert m.model_dump() == {} assert m.__dict__ == {} def test_private_attribute_nested(): class SubModel(BaseModel): _foo = PrivateAttr(42) x: int class Model(BaseModel): y: int sub: SubModel m = Model(y=1, sub={'x': 2}) assert m.sub._foo == 42 def test_private_attribute_factory(): default = {'a': {}} def factory(): return default class 
Model(BaseModel): _foo = PrivateAttr(default_factory=factory) assert Model.__private_attributes__ == {'_foo': PrivateAttr(default_factory=factory)} m = Model() assert m._foo == default assert m._foo is default assert m._foo['a'] is default['a'] m._foo = None assert m._foo is None assert m.model_dump() == {} assert m.__dict__ == {} def test_private_attribute_annotation(): class Model(BaseModel): """The best model""" _foo: str assert Model.__private_attributes__ == {'_foo': PrivateAttr(PydanticUndefined)} assert repr(Model.__doc__) == "'The best model'" m = Model() with pytest.raises(AttributeError): m._foo m._foo = '123' assert m._foo == '123' m._foo = None assert m._foo is None del m._foo with pytest.raises(AttributeError): m._foo m._foo = '123' assert m._foo == '123' assert m.model_dump() == {} assert m.__dict__ == {} def test_underscore_attrs_are_private(): class Model(BaseModel): _foo: str = 'abc' _bar: ClassVar[str] = 'cba' assert Model._bar == 'cba' assert Model.__private_attributes__ == {'_foo': PrivateAttr('abc')} m = Model() assert m._foo == 'abc' m._foo = None assert m._foo is None with pytest.raises( AttributeError, match=( "'_bar' is a ClassVar of `Model` and cannot be set on an instance. " 'If you want to set a value on the class, use `Model._bar = value`.' 
), ): m._bar = 1 def test_private_attribute_intersection_with_extra_field(): class Model(BaseModel): _foo = PrivateAttr('private_attribute') model_config = ConfigDict(extra='allow') assert set(Model.__private_attributes__) == {'_foo'} m = Model(_foo='field') assert m._foo == 'private_attribute' assert m.__dict__ == {} assert m.__pydantic_extra__ == {'_foo': 'field'} assert m.model_dump() == {'_foo': 'field'} m._foo = 'still_private' assert m._foo == 'still_private' assert m.__dict__ == {} assert m.__pydantic_extra__ == {'_foo': 'field'} assert m.model_dump() == {'_foo': 'field'} def test_private_attribute_invalid_name(): with pytest.raises( NameError, match="Private attributes must not use valid field names; use sunder names, e.g. '_foo' instead of 'foo'.", ): class Model(BaseModel): foo = PrivateAttr() def test_slots_are_ignored(): class Model(BaseModel): __slots__ = ( 'foo', '_bar', ) def __init__(self): super().__init__() for attr_ in self.__slots__: object.__setattr__(self, attr_, 'spam') assert Model.__private_attributes__ == {} assert set(Model.__slots__) == {'foo', '_bar'} m1 = Model() m2 = Model() for attr in Model.__slots__: assert object.__getattribute__(m1, attr) == 'spam' # In v2, you are always allowed to set instance attributes if the name starts with `_`. 
m1._bar = 'not spam' assert m1._bar == 'not spam' assert m2._bar == 'spam' with pytest.raises(ValueError, match='"Model" object has no field "foo"'): m1.foo = 'not spam' def test_default_and_default_factory_used_error(): with pytest.raises(TypeError, match='cannot specify both default and default_factory'): PrivateAttr(default=123, default_factory=lambda: 321) def test_config_override_init(): class MyModel(BaseModel): x: str _private_attr: int def __init__(self, **data) -> None: super().__init__(**data) self._private_attr = 123 m = MyModel(x='hello') assert m.model_dump() == {'x': 'hello'} assert m._private_attr == 123 def test_generic_private_attribute(): T = TypeVar('T') class Model(BaseModel, Generic[T]): value: T _private_value: T m = Model[int](value=1, _private_attr=3) m._private_value = 3 assert m.model_dump() == {'value': 1} def test_private_attribute_multiple_inheritance(): # We need to test this since PrivateAttr uses __slots__ and that has some restrictions with regards to # multiple inheritance default = {'a': {}} class GrandParentModel(BaseModel): _foo = PrivateAttr(default) class ParentAModel(GrandParentModel): pass class ParentBModel(GrandParentModel): _bar = PrivateAttr(default) class Model(ParentAModel, ParentBModel): _baz = PrivateAttr(default) assert GrandParentModel.__private_attributes__ == { '_foo': PrivateAttr(default), } assert ParentBModel.__private_attributes__ == { '_foo': PrivateAttr(default), '_bar': PrivateAttr(default), } assert Model.__private_attributes__ == { '_foo': PrivateAttr(default), '_bar': PrivateAttr(default), '_baz': PrivateAttr(default), } m = Model() assert m._foo == default assert m._foo is not default assert m._foo['a'] is not default['a'] assert m._bar == default assert m._bar is not default assert m._bar['a'] is not default['a'] assert m._baz == default assert m._baz is not default assert m._baz['a'] is not default['a'] m._foo = None assert m._foo is None m._bar = None assert m._bar is None m._baz = None assert 
m._baz is None assert m.model_dump() == {} assert m.__dict__ == {} def test_private_attributes_not_dunder() -> None: with pytest.raises( NameError, match='Private attributes must not use dunder names;' " use a single underscore prefix instead of '__foo__'.", ): class MyModel(BaseModel): __foo__ = PrivateAttr({'private'}) def test_ignored_types_are_ignored() -> None: class IgnoredType: pass class MyModel(BaseModel): model_config = ConfigDict(ignored_types=(IgnoredType,)) _a = IgnoredType() _b: int = IgnoredType() _c: IgnoredType _d: IgnoredType = IgnoredType() # The following are included to document existing behavior, which is to make them into PrivateAttrs # this can be updated if the current behavior is not the desired behavior _e: int _f: int = 1 _g = 1 assert sorted(MyModel.__private_attributes__.keys()) == ['_e', '_f', '_g'] @pytest.mark.skipif(not hasattr(functools, 'cached_property'), reason='cached_property is not available') def test_ignored_types_are_ignored_cached_property(): """Demonstrate the members of functools are ignore here as with fields.""" class MyModel(BaseModel): _a: functools.cached_property _b: int assert set(MyModel.__private_attributes__) == {'_b'} def test_none_as_private_attr(): from pydantic import BaseModel class A(BaseModel): _x: None a = A() a._x = None assert a._x is None def test_layout_compatible_multiple_private_parents(): import typing as t import pydantic class ModelMixin(pydantic.BaseModel): _mixin_private: t.Optional[str] = pydantic.PrivateAttr(None) class Model(pydantic.BaseModel): public: str = 'default' _private: t.Optional[str] = pydantic.PrivateAttr(None) class NewModel(ModelMixin, Model): pass assert set(NewModel.__private_attributes__) == {'_mixin_private', '_private'} m = NewModel() m._mixin_private = 1 m._private = 2 assert m.__pydantic_private__ == {'_mixin_private': 1, '_private': 2} assert m._mixin_private == 1 assert m._private == 2 def test_unannotated_private_attr(): from pydantic import BaseModel, PrivateAttr 
    class A(BaseModel):
        _x = PrivateAttr()
        _y = 52

    a = A()
    assert a._y == 52
    assert a.__pydantic_private__ == {'_y': 52}

    a._x = 1
    assert a.__pydantic_private__ == {'_x': 1, '_y': 52}


def test_classvar_collision_prevention(create_module):
    module = create_module(
        # language=Python
        """
from __future__ import annotations

from pydantic import BaseModel
import typing as t


class BaseConfig(BaseModel):
    _FIELD_UPDATE_STRATEGY: t.ClassVar[t.Dict[str, t.Any]] = {}
"""
    )

    assert module.BaseConfig._FIELD_UPDATE_STRATEGY == {}


@pytest.mark.skipif(not hasattr(functools, 'cached_property'), reason='cached_property is not available')
def test_private_properties_not_included_in_iter_cached_property() -> None:
    class Model(BaseModel):
        foo: int

        @computed_field
        @functools.cached_property
        def _foo(self) -> int:
            return -self.foo

    m = Model(foo=1)
    assert '_foo' not in list(k for k, _ in m)


def test_private_properties_not_included_in_iter_property() -> None:
    class Model(BaseModel):
        foo: int

        @computed_field
        @property
        def _foo(self) -> int:
            return -self.foo

    m = Model(foo=1)
    assert '_foo' not in list(k for k, _ in m)


def test_private_properties_not_included_in_repr_by_default_property() -> None:
    class Model(BaseModel):
        foo: int

        @computed_field
        @property
        def _private_property(self) -> int:
            return -self.foo

    m = Model(foo=1)
    m_repr = repr(m)
    assert '_private_property' not in m_repr


@pytest.mark.skipif(not hasattr(functools, 'cached_property'), reason='cached_property is not available')
def test_private_properties_not_included_in_repr_by_default_cached_property() -> None:
    class Model(BaseModel):
        foo: int

        @computed_field
        @functools.cached_property
        def _private_cached_property(self) -> int:
            return -self.foo

    m = Model(foo=1)
    m_repr = repr(m)
    assert '_private_cached_property' not in m_repr

pydantic-2.10.6/tests/test_pydantic_extra_types.sh
#!/usr/bin/env bash
set -x
set -e

pushd "$(dirname $0)/../pydantic-extra-types"
make install
pip install -e ../
make test
popd

pydantic-2.10.6/tests/test_pydantic_settings.sh
#!/usr/bin/env bash
set -x
set -e

pushd "$(dirname $0)/../pydantic-settings"
make install
pip install -e ../
make test
popd

pydantic-2.10.6/tests/test_rich_repr.py
from datetime import datetime
from typing import List, Optional

import pytest

from pydantic import BaseModel
from pydantic.color import Color


@pytest.fixture(scope='session', name='User')
def user_fixture():
    class User(BaseModel):
        id: int
        name: str = 'John Doe'
        signup_ts: Optional[datetime] = None
        friends: List[int] = []

    return User


def test_rich_repr(User):
    user = User(id=22)
    rich_repr = list(user.__rich_repr__())

    assert rich_repr == [
        ('id', 22),
        ('name', 'John Doe'),
        ('signup_ts', None),
        ('friends', []),
    ]


@pytest.mark.filterwarnings('ignore::DeprecationWarning')
def test_rich_repr_color(User):
    color = Color((10, 20, 30, 0.1))
    rich_repr = list(color.__rich_repr__())

    assert rich_repr == ['#0a141e1a', ('rgb', (10, 20, 30, 0.1))]

pydantic-2.10.6/tests/test_root_model.py
import pickle
from datetime import date, datetime
from typing import Any, Dict, Generic, List, Optional, Union

import pytest
from pydantic_core import CoreSchema
from pydantic_core.core_schema import SerializerFunctionWrapHandler
from typing_extensions import Annotated, Literal, TypeVar

from pydantic import (
    Base64Str,
    BaseModel,
    ConfigDict,
    Field,
    PrivateAttr,
    PydanticDeprecatedSince20,
    PydanticUserError,
    RootModel,
    ValidationError,
    field_serializer,
    model_validator,
)


def parametrize_root_model():
    class InnerModel(BaseModel):
        int_field: int
        str_field: str

    return pytest.mark.parametrize(
        ('root_type', 'root_value', 'dump_value'),
        [
pytest.param(int, 42, 42, id='int'), pytest.param(str, 'forty two', 'forty two', id='str'), pytest.param(Dict[int, bool], {1: True, 2: False}, {1: True, 2: False}, id='dict[int, bool]'), pytest.param(List[int], [4, 2, -1], [4, 2, -1], id='list[int]'), pytest.param( InnerModel, InnerModel(int_field=42, str_field='forty two'), {'int_field': 42, 'str_field': 'forty two'}, id='InnerModel', ), ], ) def check_schema(schema: CoreSchema) -> None: # we assume the shape of the core schema here, which is not a guarantee # pydantic makes to its users but is useful to check here to make sure # we are doing the right thing internally assert schema['type'] == 'model' assert schema['root_model'] is True assert schema['custom_init'] is False @parametrize_root_model() def test_root_model_specialized(root_type, root_value, dump_value): Model = RootModel[root_type] check_schema(Model.__pydantic_core_schema__) m = Model(root_value) assert m.model_dump() == dump_value assert dict(m) == {'root': m.root} assert m.__pydantic_fields_set__ == {'root'} @parametrize_root_model() def test_root_model_inherited(root_type, root_value, dump_value): class Model(RootModel[root_type]): pass check_schema(Model.__pydantic_core_schema__) m = Model(root_value) assert m.model_dump() == dump_value assert dict(m) == {'root': m.root} assert m.__pydantic_fields_set__ == {'root'} def test_root_model_validation_error(): Model = RootModel[int] with pytest.raises(ValidationError) as e: Model('forty two') assert e.value.errors(include_url=False) == [ { 'input': 'forty two', 'loc': (), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', }, ] def test_root_model_repr(): SpecializedRootModel = RootModel[int] class SubRootModel(RootModel): pass class SpecializedSubRootModel(RootModel[int]): pass assert repr(SpecializedRootModel(1)) == 'RootModel[int](root=1)' assert repr(SubRootModel(1)) == 'SubRootModel(root=1)' assert repr(SpecializedSubRootModel(1)) == 
'SpecializedSubRootModel(root=1)' def test_root_model_recursive(): class A(RootModel[List['B']]): def my_a_method(self): pass class B(RootModel[Dict[str, Optional[A]]]): def my_b_method(self): pass assert repr(A.model_validate([{}])) == 'A(root=[B(root={})])' def test_root_model_nested(): calls = [] class B(RootModel[int]): def my_b_method(self): calls.append(('my_b_method', self.root)) class A(RootModel[B]): def my_a_method(self): calls.append(('my_a_method', self.root.root)) m1 = A.model_validate(1) m1.my_a_method() m1.root.my_b_method() assert calls == [('my_a_method', 1), ('my_b_method', 1)] calls.clear() m2 = A.model_validate_json('2') m2.my_a_method() m2.root.my_b_method() assert calls == [('my_a_method', 2), ('my_b_method', 2)] def test_root_model_as_field(): class MyRootModel(RootModel[int]): pass class MyModel(BaseModel): root_model: MyRootModel m = MyModel.model_validate({'root_model': 1}) assert isinstance(m.root_model, MyRootModel) def test_v1_compatibility_serializer(): class MyInnerModel(BaseModel): x: int class MyRootModel(RootModel[MyInnerModel]): # The following field_serializer can be added to achieve the same behavior as v1 had for .dict() @field_serializer('root', mode='wrap') def embed_in_dict(self, v: Any, handler: SerializerFunctionWrapHandler): return {'__root__': handler(v)} class MyOuterModel(BaseModel): my_root: MyRootModel m = MyOuterModel.model_validate({'my_root': {'x': 1}}) assert m.model_dump() == {'my_root': {'__root__': {'x': 1}}} with pytest.warns(PydanticDeprecatedSince20): assert m.dict() == {'my_root': {'__root__': {'x': 1}}} def test_construct(): class Base64Root(RootModel[Base64Str]): pass v = Base64Root.model_construct('test') assert v.model_dump() == 'dGVzdA==' def test_construct_nested(): class Base64RootProperty(BaseModel): data: RootModel[Base64Str] v = Base64RootProperty.model_construct(data=RootModel[Base64Str].model_construct('test')) assert v.model_dump() == {'data': 'dGVzdA=='} # Note: model_construct requires the 
inputs to be valid; the root model value does not get "validated" into # an actual root model instance: v = Base64RootProperty.model_construct(data='test') assert isinstance(v.data, str) # should be RootModel[Base64Str], but model_construct skipped validation with pytest.raises(AttributeError, match="'str' object has no attribute 'root'"): v.model_dump() def test_assignment(): Model = RootModel[int] m = Model(1) assert m.model_fields_set == {'root'} assert m.root == 1 m.root = 2 assert m.root == 2 def test_model_validator_before(): class Model(RootModel[int]): @model_validator(mode='before') @classmethod def words(cls, v): if v == 'one': return 1 elif v == 'two': return 2 else: return v assert Model('one').root == 1 assert Model('two').root == 2 assert Model('3').root == 3 with pytest.raises(ValidationError) as exc_info: Model('three') # insert_assert(exc_info.value.errors()) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': (), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'three', } ] def test_model_validator_after(): class Model(RootModel[int]): @model_validator(mode='after') def double(self) -> 'Model': self.root *= 2 return self assert Model('1').root == 2 assert Model('21').root == 42 def test_private_attr(): class Model(RootModel[int]): _private_attr: str _private_attr_default: str = PrivateAttr(default='abc') m = Model(42) assert m.root == 42 assert m._private_attr_default == 'abc' with pytest.raises(AttributeError, match='_private_attr'): m._private_attr m._private_attr = 7 m._private_attr_default = 8 m._other_private_attr = 9 # TODO: Should this be an `AttributeError`? 
with pytest.raises(ValueError, match='other_attr'): m.other_attr = 10 assert m._private_attr == 7 assert m._private_attr_default == 8 assert m._other_private_attr == 9 assert m.model_dump() == 42 def test_validate_assignment_false(): Model = RootModel[int] m = Model(42) m.root = 'abc' assert m.root == 'abc' def test_validate_assignment_true(): class Model(RootModel[int]): model_config = ConfigDict(validate_assignment=True) m = Model(42) with pytest.raises(ValidationError) as e: m.root = 'abc' assert e.value.errors(include_url=False) == [ { 'input': 'abc', 'loc': (), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'type': 'int_parsing', } ] def test_root_model_literal(): assert RootModel[int](42).root == 42 def test_root_model_equality(): assert RootModel[int](42) == RootModel[int](42) assert RootModel[int](42) != RootModel[int](7) assert RootModel[int](42) != RootModel[float](42) assert RootModel[int](42) == RootModel[int].model_construct(42) def test_root_model_with_private_attrs_equality(): class Model(RootModel[int]): _private_attr: str = PrivateAttr(default='abc') m = Model(42) assert m == Model(42) m._private_attr = 'xyz' assert m != Model(42) def test_root_model_nested_equality(): class Model(BaseModel): value: RootModel[int] assert Model(value=42).value == RootModel[int](42) def test_root_model_base_model_equality(): class R(RootModel[int]): pass class B(BaseModel): root: int assert R(42) != B(root=42) assert B(root=42) != R(42) @pytest.mark.parametrize('extra_value', ['ignore', 'allow', 'forbid']) def test_extra_error(extra_value): with pytest.raises(PydanticUserError, match='extra'): class Model(RootModel[int]): model_config = ConfigDict(extra=extra_value) def test_root_model_default_value(): class Model(RootModel): root: int = 42 m = Model() assert m.root == 42 assert m.model_dump() == 42 assert m.__pydantic_fields_set__ == set() def test_root_model_default_factory(): class Model(RootModel): root: int = 
Field(default_factory=lambda: 42) m = Model() assert m.root == 42 assert m.model_dump() == 42 assert m.__pydantic_fields_set__ == set() def test_root_model_wrong_default_value_without_validate_default(): class Model(RootModel): root: int = '42' assert Model().root == '42' def test_root_model_default_value_with_validate_default(): class Model(RootModel): model_config = ConfigDict(validate_default=True) root: int = '42' m = Model() assert m.root == 42 assert m.model_dump() == 42 assert m.__pydantic_fields_set__ == set() def test_root_model_default_value_with_validate_default_on_field(): class Model(RootModel): root: Annotated[int, Field(validate_default=True, default='42')] m = Model() assert m.root == 42 assert m.model_dump() == 42 assert m.__pydantic_fields_set__ == set() def test_root_model_as_attr_with_validate_default(): class Model(BaseModel): model_config = ConfigDict(validate_default=True) rooted_value: RootModel[int] = 42 m = Model() assert m.rooted_value == RootModel[int](42) assert m.model_dump() == {'rooted_value': 42} assert m.rooted_value.__pydantic_fields_set__ == {'root'} def test_root_model_in_root_model_default(): class Nested(RootModel): root: int = 42 class Model(RootModel): root: Nested = Nested() m = Model() assert m.root.root == 42 assert m.__pydantic_fields_set__ == set() assert m.root.__pydantic_fields_set__ == set() def test_nested_root_model_naive_default(): class Nested(RootModel): root: int = 42 class Model(BaseModel): value: Nested m = Model(value=Nested()) assert m.value.root == 42 assert m.value.__pydantic_fields_set__ == set() def test_nested_root_model_proper_default(): class Nested(RootModel): root: int = 42 class Model(BaseModel): value: Nested = Field(default_factory=Nested) m = Model() assert m.value.root == 42 assert m.value.__pydantic_fields_set__ == set() def test_root_model_json_schema_meta(): ParametrizedModel = RootModel[int] class SubclassedModel(RootModel): """Subclassed Model docstring""" root: int 
parametrized_json_schema = ParametrizedModel.model_json_schema() subclassed_json_schema = SubclassedModel.model_json_schema() assert parametrized_json_schema.get('title') == 'RootModel[int]' assert parametrized_json_schema.get('description') is None assert subclassed_json_schema.get('title') == 'SubclassedModel' assert subclassed_json_schema.get('description') == 'Subclassed Model docstring' @pytest.mark.parametrize('order', ['BR', 'RB']) def test_root_model_dump_with_base_model(order): class BModel(BaseModel): value: str class RModel(RootModel): root: int if order == 'BR': class Model(RootModel): root: List[Union[BModel, RModel]] elif order == 'RB': class Model(RootModel): root: List[Union[RModel, BModel]] m = Model([1, 2, {'value': 'abc'}]) assert m.root == [RModel(1), RModel(2), BModel.model_construct(value='abc')] assert m.model_dump() == [1, 2, {'value': 'abc'}] assert m.model_dump_json() == '[1,2,{"value":"abc"}]' @pytest.mark.parametrize( 'data', [ pytest.param({'kind': 'IModel', 'int_value': 42}, id='IModel'), pytest.param({'kind': 'SModel', 'str_value': 'abc'}, id='SModel'), ], ) def test_mixed_discriminated_union(data): class IModel(BaseModel): kind: Literal['IModel'] int_value: int class RModel(RootModel): root: IModel class SModel(BaseModel): kind: Literal['SModel'] str_value: str class Model(RootModel): root: Union[SModel, RModel] = Field(discriminator='kind') if data['kind'] == 'IModel': with pytest.warns(UserWarning, match='Failed to get discriminator value for tagged union serialization'): assert Model(data).model_dump() == data assert Model(**data).model_dump() == data else: assert Model(data).model_dump() == data assert Model(**data).model_dump() == data def test_list_rootmodel(): class A(BaseModel): type: Literal['a'] a: str class B(BaseModel): type: Literal['b'] b: str class D(RootModel[Annotated[Union[A, B], Field(discriminator='type')]]): pass LD = RootModel[List[D]] obj = LD.model_validate([{'type': 'a', 'a': 'a'}, {'type': 'b', 'b': 'b'}]) 
assert obj.model_dump() == [{'type': 'a', 'a': 'a'}, {'type': 'b', 'b': 'b'}] def test_root_and_data_error(): class BModel(BaseModel): value: int other_value: str Model = RootModel[BModel] with pytest.raises( ValueError, match='"RootModel.__init__" accepts either a single positional argument or arbitrary keyword arguments', ): Model({'value': 42}, other_value='abc') def test_pickle_root_model(create_module): @create_module def module(): from pydantic import RootModel class MyRootModel(RootModel[str]): pass MyRootModel = module.MyRootModel assert MyRootModel(root='abc') == pickle.loads(pickle.dumps(MyRootModel(root='abc'))) def test_json_schema_extra_on_model(): class Model(RootModel): model_config = ConfigDict(json_schema_extra={'schema key': 'schema value'}) root: str assert Model.model_json_schema() == { 'schema key': 'schema value', 'title': 'Model', 'type': 'string', } def test_json_schema_extra_on_field(): class Model(RootModel): root: str = Field(json_schema_extra={'schema key': 'schema value'}) assert Model.model_json_schema() == { 'schema key': 'schema value', 'title': 'Model', 'type': 'string', } def test_json_schema_extra_on_model_and_on_field(): class Model(RootModel): model_config = ConfigDict(json_schema_extra={'schema key on model': 'schema value on model'}) root: str = Field(json_schema_extra={'schema key on field': 'schema value on field'}) with pytest.raises(ValueError, match=r'json_schema_extra.*?must not be set simultaneously'): Model.model_json_schema() def test_help(create_module): module = create_module( # language=Python """ import pydoc from pydantic import RootModel help_result_string = pydoc.render_doc(RootModel) """ ) assert 'class RootModel' in module.help_result_string def test_copy_preserves_equality(): model = RootModel() copied = model.__copy__() assert model == copied deepcopied = model.__deepcopy__() assert model == deepcopied @pytest.mark.parametrize( 'root_type,input_value,expected,raises_match,strict', [ (bool, 'true', True, 
None, False), (bool, 'true', True, None, True), (bool, 'false', False, None, False), (bool, 'e', ValidationError, 'type=bool_parsing', False), (int, '1', 1, None, False), (int, '1', 1, None, True), (int, 'xxx', ValidationError, 'type=int_parsing', True), (float, '1.1', 1.1, None, False), (float, '1.10', 1.1, None, False), (float, '1.1', 1.1, None, True), (float, '1.10', 1.1, None, True), (date, '2017-01-01', date(2017, 1, 1), None, False), (date, '2017-01-01', date(2017, 1, 1), None, True), (date, '2017-01-01T12:13:14.567', ValidationError, 'type=date_from_datetime_inexact', False), (date, '2017-01-01T12:13:14.567', ValidationError, 'type=date_parsing', True), (date, '2017-01-01T00:00:00', date(2017, 1, 1), None, False), (date, '2017-01-01T00:00:00', ValidationError, 'type=date_parsing', True), (datetime, '2017-01-01T12:13:14.567', datetime(2017, 1, 1, 12, 13, 14, 567_000), None, False), (datetime, '2017-01-01T12:13:14.567', datetime(2017, 1, 1, 12, 13, 14, 567_000), None, True), ], ids=repr, ) def test_model_validate_strings(root_type, input_value, expected, raises_match, strict): Model = RootModel[root_type] if raises_match is not None: with pytest.raises(expected, match=raises_match): Model.model_validate_strings(input_value, strict=strict) else: assert Model.model_validate_strings(input_value, strict=strict).root == expected def test_model_construction_with_invalid_generic_specification() -> None: T_ = TypeVar('T_', bound=BaseModel) with pytest.raises(TypeError, match='You should parametrize RootModel directly'): class GenericRootModel(RootModel, Generic[T_]): root: Union[T_, int] def test_model_with_field_description() -> None: class AModel(RootModel): root: int = Field(description='abc') assert AModel.model_json_schema() == {'title': 'AModel', 'type': 'integer', 'description': 'abc'} def test_model_with_both_docstring_and_field_description() -> None: """Check if the docstring is used as the description when both are present.""" class AModel(RootModel): 
"""More detailed description""" root: int = Field(description='abc') assert AModel.model_json_schema() == { 'title': 'AModel', 'type': 'integer', 'description': 'More detailed description', } pydantic-2.10.6/tests/test_serialize.py000066400000000000000000001140721474456633400202020ustar00rootroot00000000000000""" New tests for v2 of serialization logic. """ import json import re import sys from enum import Enum from functools import partial, partialmethod from typing import Any, Callable, ClassVar, Dict, List, Optional, Pattern, Union import pytest from pydantic_core import PydanticSerializationError, core_schema, to_jsonable_python from typing_extensions import Annotated, TypedDict from pydantic import ( BaseModel, Field, FieldSerializationInfo, SerializationInfo, SerializerFunctionWrapHandler, TypeAdapter, computed_field, errors, field_serializer, model_serializer, ) from pydantic.config import ConfigDict from pydantic.functional_serializers import PlainSerializer, WrapSerializer def test_serialize_extra_allow() -> None: class Model(BaseModel): x: int model_config = ConfigDict(extra='allow') m = Model(x=1, y=2) assert m.y == 2 assert m.model_dump() == {'x': 1, 'y': 2} assert json.loads(m.model_dump_json()) == {'x': 1, 'y': 2} def test_serialize_extra_allow_subclass_1() -> None: class Parent(BaseModel): x: int class Child(Parent): model_config = ConfigDict(extra='allow') class Model(BaseModel): inner: Parent m = Model(inner=Child(x=1, y=2)) assert m.inner.y == 2 assert m.model_dump() == {'inner': {'x': 1}} assert json.loads(m.model_dump_json()) == {'inner': {'x': 1}} def test_serialize_extra_allow_subclass_2() -> None: class Parent(BaseModel): x: int model_config = ConfigDict(extra='allow') class Child(Parent): y: int class Model(BaseModel): inner: Parent m = Model(inner=Child(x=1, y=2)) assert m.inner.y == 2 assert m.model_dump() == {'inner': {'x': 1}} assert json.loads(m.model_dump_json()) == {'inner': {'x': 1}} m = Model(inner=Parent(x=1, y=2)) assert m.inner.y 
== 2 assert m.model_dump() == {'inner': {'x': 1, 'y': 2}} assert json.loads(m.model_dump_json()) == {'inner': {'x': 1, 'y': 2}} def test_serializer_annotated_plain_always(): FancyInt = Annotated[int, PlainSerializer(lambda x: f'{x:,}', return_type=str)] class MyModel(BaseModel): x: FancyInt assert MyModel(x=1234).model_dump() == {'x': '1,234'} assert MyModel(x=1234).model_dump(mode='json') == {'x': '1,234'} assert MyModel(x=1234).model_dump_json() == '{"x":"1,234"}' def test_serializer_annotated_plain_json(): FancyInt = Annotated[int, PlainSerializer(lambda x: f'{x:,}', return_type=str, when_used='json')] class MyModel(BaseModel): x: FancyInt assert MyModel(x=1234).model_dump() == {'x': 1234} assert MyModel(x=1234).model_dump(mode='json') == {'x': '1,234'} assert MyModel(x=1234).model_dump_json() == '{"x":"1,234"}' def test_serializer_annotated_wrap_always(): def ser_wrap(v: Any, nxt: SerializerFunctionWrapHandler) -> str: return f'{nxt(v + 1):,}' FancyInt = Annotated[int, WrapSerializer(ser_wrap, return_type=str)] class MyModel(BaseModel): x: FancyInt assert MyModel(x=1234).model_dump() == {'x': '1,235'} assert MyModel(x=1234).model_dump(mode='json') == {'x': '1,235'} assert MyModel(x=1234).model_dump_json() == '{"x":"1,235"}' def test_serializer_annotated_wrap_json(): def ser_wrap(v: Any, nxt: SerializerFunctionWrapHandler) -> str: return f'{nxt(v + 1):,}' FancyInt = Annotated[int, WrapSerializer(ser_wrap, when_used='json')] class MyModel(BaseModel): x: FancyInt assert MyModel(x=1234).model_dump() == {'x': 1234} assert MyModel(x=1234).model_dump(mode='json') == {'x': '1,235'} assert MyModel(x=1234).model_dump_json() == '{"x":"1,235"}' @pytest.mark.parametrize( 'serializer, func', [ (PlainSerializer, lambda v: f'{v + 1:,}'), (WrapSerializer, lambda v, nxt: f'{nxt(v + 1):,}'), ], ) def test_serializer_annotated_typing_cache(serializer, func): FancyInt = Annotated[int, serializer(func)] class FancyIntModel(BaseModel): x: Optional[FancyInt] assert 
FancyIntModel(x=1234).model_dump() == {'x': '1,235'} def test_serialize_decorator_always(): class MyModel(BaseModel): x: Optional[int] @field_serializer('x') def customise_x_serialization(v, _info) -> str: return f'{v:,}' assert MyModel(x=1234).model_dump() == {'x': '1,234'} assert MyModel(x=1234).model_dump(mode='json') == {'x': '1,234'} assert MyModel(x=1234).model_dump_json() == '{"x":"1,234"}' m = MyModel(x=None) # can't use v:, on None, hence error error_msg = ( 'Error calling function `customise_x_serialization`: ' 'TypeError: unsupported format string passed to NoneType.__format__' ) with pytest.raises(PydanticSerializationError, match=error_msg): m.model_dump() with pytest.raises(PydanticSerializationError, match=error_msg): m.model_dump_json() def test_serialize_decorator_json(): class MyModel(BaseModel): x: int @field_serializer('x', when_used='json') def customise_x_serialization(v) -> str: return f'{v:,}' assert MyModel(x=1234).model_dump() == {'x': 1234} assert MyModel(x=1234).model_dump(mode='json') == {'x': '1,234'} assert MyModel(x=1234).model_dump_json() == '{"x":"1,234"}' def test_serialize_decorator_unless_none(): class MyModel(BaseModel): x: Optional[int] @field_serializer('x', when_used='unless-none') def customise_x_serialization(v): return f'{v:,}' assert MyModel(x=1234).model_dump() == {'x': '1,234'} assert MyModel(x=None).model_dump() == {'x': None} assert MyModel(x=1234).model_dump(mode='json') == {'x': '1,234'} assert MyModel(x=None).model_dump(mode='json') == {'x': None} assert MyModel(x=1234).model_dump_json() == '{"x":"1,234"}' assert MyModel(x=None).model_dump_json() == '{"x":null}' def test_annotated_customisation(): def parse_int(s: str, _: Any) -> int: return int(s.replace(',', '')) class CommaFriendlyIntLogic: @classmethod def __get_pydantic_core_schema__(cls, _source, _handler): # here we ignore the schema argument (which is just `{'type': 'int'}`) and return our own return core_schema.with_info_before_validator_function( 
parse_int, core_schema.int_schema(), serialization=core_schema.format_ser_schema(',', when_used='unless-none'), ) CommaFriendlyInt = Annotated[int, CommaFriendlyIntLogic] class MyModel(BaseModel): x: CommaFriendlyInt m = MyModel(x='1,000') assert m.x == 1000 assert m.model_dump(mode='json') == {'x': '1,000'} assert m.model_dump_json() == '{"x":"1,000"}' def test_serialize_valid_signatures(): def ser_plain(v: Any, info: SerializationInfo) -> Any: return f'{v:,}' def ser_plain_no_info(v: Any, unrelated_arg: int = 1, other_unrelated_arg: int = 2) -> Any: # Arguments with default values are not treated as info arg. return f'{v:,}' def ser_wrap(v: Any, nxt: SerializerFunctionWrapHandler, info: SerializationInfo) -> Any: return f'{nxt(v):,}' class MyModel(BaseModel): f1: int f2: int f3: int f4: int f5: int @field_serializer('f1') def ser_f1(self, v: Any, info: FieldSerializationInfo) -> Any: assert self.f1 == 1_000 assert v == 1_000 assert info.field_name == 'f1' return f'{v:,}' @field_serializer('f2', mode='wrap') def ser_f2(self, v: Any, nxt: SerializerFunctionWrapHandler, info: FieldSerializationInfo) -> Any: assert self.f2 == 2_000 assert v == 2_000 assert info.field_name == 'f2' return f'{nxt(v):,}' ser_f3 = field_serializer('f3')(ser_plain) ser_f4 = field_serializer('f4')(ser_plain_no_info) ser_f5 = field_serializer('f5', mode='wrap')(ser_wrap) m = MyModel(**{f'f{x}': x * 1_000 for x in range(1, 9)}) assert m.model_dump() == { 'f1': '1,000', 'f2': '2,000', 'f3': '3,000', 'f4': '4,000', 'f5': '5,000', } assert m.model_dump_json() == '{"f1":"1,000","f2":"2,000","f3":"3,000","f4":"4,000","f5":"5,000"}' def test_invalid_signature_no_params() -> None: with pytest.raises(TypeError, match='Unrecognized field_serializer function signature'): class _(BaseModel): x: int # caught by type checkers @field_serializer('x') def no_args() -> Any: ... 
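The signature variants that `test_serialize_valid_signatures` exercises boil down to two modes: plain (value, with an optional info argument) and wrap (value plus a handler running the default serialization). A minimal standalone sketch, assuming pydantic v2 is installed (the `Point` model is illustrative, not part of the suite):

```python
from pydantic import BaseModel, SerializerFunctionWrapHandler, field_serializer


class Point(BaseModel):
    x: int
    y: int

    # plain mode: receives the field value; the info argument may be omitted
    @field_serializer('x')
    def ser_x(self, v: int) -> str:
        return f'{v:,}'

    # wrap mode: receives the value plus a handler that performs default serialization
    @field_serializer('y', mode='wrap')
    def ser_y(self, v: int, handler: SerializerFunctionWrapHandler) -> str:
        return f'{handler(v):,}'


p = Point(x=1234, y=5678)
print(p.model_dump())  # -> {'x': '1,234', 'y': '5,678'}
```

The same serializers apply in JSON mode, so `p.model_dump_json()` yields `'{"x":"1,234","y":"5,678"}'`.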
def test_invalid_signature_single_params() -> None: with pytest.raises(TypeError, match='Unrecognized field_serializer function signature'): class _(BaseModel): x: int # not caught by type checkers @field_serializer('x') def no_args(self) -> Any: ... def test_invalid_signature_too_many_params_1() -> None: with pytest.raises(TypeError, match='Unrecognized field_serializer function signature'): class _(BaseModel): x: int # caught by type checkers @field_serializer('x') def no_args(self, value: Any, nxt: Any, info: Any, extra_param: Any) -> Any: ... def test_invalid_signature_too_many_params_2() -> None: with pytest.raises(TypeError, match='Unrecognized field_serializer function signature'): class _(BaseModel): x: int # caught by type checkers @field_serializer('x') @staticmethod def no_args(not_self: Any, value: Any, nxt: Any, info: Any) -> Any: ... def test_invalid_signature_bad_plain_signature() -> None: with pytest.raises(TypeError, match='Unrecognized field_serializer function signature for'): class _(BaseModel): x: int # caught by type checkers @field_serializer('x', mode='plain') def no_args(self, value: Any, nxt: Any, info: Any) -> Any: ... 
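# The serializer tests above all lean on Python's ',' format specifier, and
# `test_serialize_decorator_always` relies on the fact that None does not
# support it. The following is a stdlib-only sketch of that behavior (not part
# of the pydantic suite; `comma_format` is an illustrative helper name):

```python
def comma_format(v):
    """Format a value the way the example field serializers do: f'{v:,}'."""
    return f'{v:,}'


# ints (and floats) gain thousands separators
assert comma_format(1234) == '1,234'
assert comma_format(1_000_000) == '1,000,000'

# None has no support for the ',' format spec; pydantic wraps this TypeError
# in the PydanticSerializationError asserted in the test above
try:
    comma_format(None)
except TypeError as e:
    assert 'unsupported format string' in str(e)
```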
def test_serialize_ignore_info_plain():
    class MyModel(BaseModel):
        x: int

        @field_serializer('x')
        def ser_x(v: Any) -> str:
            return f'{v:,}'

    assert MyModel(x=1234).model_dump() == {'x': '1,234'}


def test_serialize_ignore_info_wrap():
    class MyModel(BaseModel):
        x: int

        @field_serializer('x', mode='wrap')
        def ser_x(v: Any, handler: SerializerFunctionWrapHandler) -> str:
            return f'{handler(v):,}'

    assert MyModel(x=1234).model_dump() == {'x': '1,234'}


def test_serialize_decorator_self_info():
    class MyModel(BaseModel):
        x: Optional[int]

        @field_serializer('x')
        def customise_x_serialization(self, v, info) -> str:
            return f'{info.mode}:{v:,}'

    assert MyModel(x=1234).model_dump() == {'x': 'python:1,234'}
    assert MyModel(x=1234).model_dump(mode='foobar') == {'x': 'foobar:1,234'}


def test_serialize_decorator_self_no_info():
    class MyModel(BaseModel):
        x: Optional[int]

        @field_serializer('x')
        def customise_x_serialization(self, v) -> str:
            return f'{v:,}'

    assert MyModel(x=1234).model_dump() == {'x': '1,234'}


def test_model_serializer_plain():
    class MyModel(BaseModel):
        a: int
        b: bytes

        @model_serializer
        def _serialize(self):
            if self.b == b'custom':
                return f'MyModel(a={self.a!r}, b={self.b!r})'
            else:
                return self.__dict__

    m = MyModel(a=1, b='boom')
    assert m.model_dump() == {'a': 1, 'b': b'boom'}
    assert m.model_dump(mode='json') == {'a': 1, 'b': 'boom'}
    assert m.model_dump_json() == '{"a":1,"b":"boom"}'

    assert m.model_dump(exclude={'a'}) == {'a': 1, 'b': b'boom'}  # exclude is ignored as we used self.__dict__
    assert m.model_dump(mode='json', exclude={'a'}) == {'a': 1, 'b': 'boom'}
    assert m.model_dump_json(exclude={'a'}) == '{"a":1,"b":"boom"}'

    m = MyModel(a=1, b='custom')
    assert m.model_dump() == "MyModel(a=1, b=b'custom')"
    assert m.model_dump(mode='json') == "MyModel(a=1, b=b'custom')"
    assert m.model_dump_json() == '"MyModel(a=1, b=b\'custom\')"'


def test_model_serializer_plain_info():
    class MyModel(BaseModel):
        a: int
        b: bytes

        @model_serializer
        def _serialize(self, info):
            if info.exclude:
                return {k: v for k, v in self.__dict__.items() if k not in info.exclude}
            else:
                return self.__dict__

    m = MyModel(a=1, b='boom')
    assert m.model_dump() == {'a': 1, 'b': b'boom'}
    assert m.model_dump(mode='json') == {'a': 1, 'b': 'boom'}
    assert m.model_dump_json() == '{"a":1,"b":"boom"}'

    assert m.model_dump(exclude={'a'}) == {'b': b'boom'}  # exclude is not ignored
    assert m.model_dump(mode='json', exclude={'a'}) == {'b': 'boom'}
    assert m.model_dump_json(exclude={'a'}) == '{"b":"boom"}'


def test_model_serializer_wrap():
    class MyModel(BaseModel):
        a: int
        b: bytes
        c: bytes = Field(exclude=True)

        @model_serializer(mode='wrap')
        def _serialize(self, handler):
            d = handler(self)
            d['extra'] = 42
            return d

    m = MyModel(a=1, b='boom', c='excluded')
    assert m.model_dump() == {'a': 1, 'b': b'boom', 'extra': 42}
    assert m.model_dump(mode='json') == {'a': 1, 'b': 'boom', 'extra': 42}
    assert m.model_dump_json() == '{"a":1,"b":"boom","extra":42}'

    assert m.model_dump(exclude={'a'}) == {'b': b'boom', 'extra': 42}
    assert m.model_dump(mode='json', exclude={'a'}) == {'b': 'boom', 'extra': 42}
    assert m.model_dump_json(exclude={'a'}) == '{"b":"boom","extra":42}'


def test_model_serializer_wrap_info():
    class MyModel(BaseModel):
        a: int
        b: bytes
        c: bytes = Field(exclude=True)

        @model_serializer(mode='wrap')
        def _serialize(self, handler, info):
            d = handler(self)
            d['info'] = f'mode={info.mode} exclude={info.exclude}'
            return d

    m = MyModel(a=1, b='boom', c='excluded')
    assert m.model_dump() == {'a': 1, 'b': b'boom', 'info': 'mode=python exclude=None'}
    assert m.model_dump(mode='json') == {'a': 1, 'b': 'boom', 'info': 'mode=json exclude=None'}
    assert m.model_dump_json() == '{"a":1,"b":"boom","info":"mode=json exclude=None"}'

    assert m.model_dump(exclude={'a'}) == {'b': b'boom', 'info': "mode=python exclude={'a'}"}
    assert m.model_dump(mode='json', exclude={'a'}) == {'b': 'boom', 'info': "mode=json exclude={'a'}"}
    assert m.model_dump_json(exclude={'a'}) == '{"b":"boom","info":"mode=json exclude={\'a\'}"}'


def test_model_serializer_plain_json_return_type():
    class MyModel(BaseModel):
        a: int

        @model_serializer(when_used='json')
        def _serialize(self) -> str:
            if self.a == 666:
                return self.a
            else:
                return f'MyModel(a={self.a!r})'

    m = MyModel(a=1)
    assert m.model_dump() == {'a': 1}
    assert m.model_dump(mode='json') == 'MyModel(a=1)'
    assert m.model_dump_json() == '"MyModel(a=1)"'

    m = MyModel(a=666)
    assert m.model_dump() == {'a': 666}
    with pytest.warns(
        UserWarning, match='Expected `str` but got `int` with value `666` - serialized value may not be as expected'
    ):
        assert m.model_dump(mode='json') == 666
    with pytest.warns(
        UserWarning, match='Expected `str` but got `int` with value `666` - serialized value may not be as expected'
    ):
        assert m.model_dump_json() == '666'


def test_model_serializer_wrong_args():
    m = (
        r'Unrecognized model_serializer function signature for '
        r'<.+MyModel._serialize at 0x\w+> with `mode=plain`:\(self, x, y, z\)'
    )
    with pytest.raises(TypeError, match=m):

        class MyModel(BaseModel):
            a: int

            @model_serializer
            def _serialize(self, x, y, z):
                return self


def test_model_serializer_no_self():
    with pytest.raises(TypeError, match='`@model_serializer` must be applied to instance methods'):

        class MyModel(BaseModel):
            a: int

            @model_serializer
            def _serialize(slf, x, y, z):
                return slf


def test_model_serializer_classmethod():
    with pytest.raises(TypeError, match='`@model_serializer` must be applied to instance methods'):

        class MyModel(BaseModel):
            a: int

            @model_serializer
            @classmethod
            def _serialize(self, x, y, z):
                return self


def test_field_multiple_serializer():
    m = "Multiple field serializer functions were defined for field 'x', this is not allowed."
    with pytest.raises(TypeError, match=m):

        class MyModel(BaseModel):
            x: int
            y: int

            @field_serializer('x', 'y')
            def serializer1(v) -> str:
                return f'{v:,}'

            @field_serializer('x')
            def serializer2(v) -> str:
                return v


def test_field_multiple_serializer_subclass():
    class MyModel(BaseModel):
        x: int

        @field_serializer('x')
        def serializer1(v) -> str:
            return f'{v:,}'

    class MySubModel(MyModel):
        @field_serializer('x')
        def serializer1(v) -> str:
            return f'{v}'

    assert MyModel(x=1234).model_dump() == {'x': '1,234'}
    assert MySubModel(x=1234).model_dump() == {'x': '1234'}


def test_serialize_all_fields():
    class MyModel(BaseModel):
        x: int

        @field_serializer('*')
        @classmethod
        def serialize_all(cls, v: Any):
            return v * 2

    assert MyModel(x=10).model_dump() == {'x': 20}


def int_ser_func_without_info1(v: int, expected: int) -> str:
    return f'{v:,}'


def int_ser_func_without_info2(v: int, *, expected: int) -> str:
    return f'{v:,}'


def int_ser_func_with_info1(v: int, info: FieldSerializationInfo, expected: int) -> str:
    return f'{v:,}'


def int_ser_func_with_info2(v: int, info: FieldSerializationInfo, *, expected: int) -> str:
    return f'{v:,}'


def int_ser_instance_method_without_info1(self: Any, v: int, *, expected: int) -> str:
    assert self.x == v
    return f'{v:,}'


def int_ser_instance_method_without_info2(self: Any, v: int, expected: int) -> str:
    assert self.x == v
    return f'{v:,}'


def int_ser_instance_method_with_info1(self: Any, v: int, info: FieldSerializationInfo, expected: int) -> str:
    assert self.x == v
    return f'{v:,}'


def int_ser_instance_method_with_info2(self: Any, v: int, info: FieldSerializationInfo, *, expected: int) -> str:
    assert self.x == v
    return f'{v:,}'


@pytest.mark.parametrize(
    'func',
    [
        int_ser_func_with_info1,
        int_ser_func_with_info2,
        int_ser_func_without_info1,
        int_ser_func_without_info2,
        int_ser_instance_method_with_info1,
        int_ser_instance_method_with_info2,
        int_ser_instance_method_without_info1,
        int_ser_instance_method_without_info2,
    ],
)
def test_serialize_partial(
    func: Any,
):
    class MyModel(BaseModel):
        x: int

        ser = field_serializer('x', return_type=str)(partial(func, expected=1234))

    assert MyModel(x=1234).model_dump() == {'x': '1,234'}


@pytest.mark.parametrize(
    'func',
    [
        int_ser_func_with_info1,
        int_ser_func_with_info2,
        int_ser_func_without_info1,
        int_ser_func_without_info2,
        int_ser_instance_method_with_info1,
        int_ser_instance_method_with_info2,
        int_ser_instance_method_without_info1,
        int_ser_instance_method_without_info2,
    ],
)
def test_serialize_partialmethod(
    func: Any,
):
    class MyModel(BaseModel):
        x: int

        ser = field_serializer('x', return_type=str)(partialmethod(func, expected=1234))

    assert MyModel(x=1234).model_dump() == {'x': '1,234'}


def test_serializer_allow_reuse_inheritance_override():
    class Parent(BaseModel):
        x: int

        @field_serializer('x')
        def ser_x(self, _v: int, _info: SerializationInfo) -> str:
            return 'parent_encoder'

    # overriding a serializer with a function / class var
    # of the same name is allowed
    # to mimic how inheritance works
    # the serializer in the child class replaces the parent
    # (without modifying the parent class itself)
    class Child1(Parent):
        @field_serializer('x')
        def ser_x(self, _v: int, _info: SerializationInfo) -> str:
            return 'child1_encoder' + ' ' + super().ser_x(_v, _info)

    assert Parent(x=1).model_dump_json() == '{"x":"parent_encoder"}'
    assert Child1(x=1).model_dump_json() == '{"x":"child1_encoder parent_encoder"}'

    # defining a _different_ serializer, on the other hand, is not allowed
    # because they would both "exist", thus causing confusion
    # since it's not clear if both or just one will run
    msg = 'Multiple field serializer functions were defined ' "for field 'x', this is not allowed."
    with pytest.raises(TypeError, match=msg):

        class _(Parent):
            @field_serializer('x')
            def ser_x_other(self, _v: int) -> str:
                return 'err'

    # the same thing applies if defined on the same class
    with pytest.raises(TypeError, match=msg):

        class _(BaseModel):
            x: int

            @field_serializer('x')
            def ser_x(self, _v: int) -> str:
                return 'parent_encoder'

            @field_serializer('x')
            def other_func_name(self, _v: int) -> str:
                return 'parent_encoder'


def test_serializer_allow_reuse_same_field():
    with pytest.warns(UserWarning, match='`ser_x` overrides an existing Pydantic `@field_serializer` decorator'):

        class Model(BaseModel):
            x: int

            @field_serializer('x')
            def ser_x(self, _v: int) -> str:
                return 'ser_1'

            @field_serializer('x')
            def ser_x(self, _v: int) -> str:
                return 'ser_2'

    assert Model(x=1).model_dump() == {'x': 'ser_2'}


def test_serializer_allow_reuse_different_field_1():
    with pytest.warns(UserWarning, match='`ser` overrides an existing Pydantic `@field_serializer` decorator'):

        class Model(BaseModel):
            x: int
            y: int

            @field_serializer('x')
            def ser(self, _v: int) -> str:
                return 'x'

            @field_serializer('y')
            def ser(self, _v: int) -> str:
                return 'y'

    assert Model(x=1, y=2).model_dump() == {'x': 1, 'y': 'y'}


def test_serializer_allow_reuse_different_field_2():
    with pytest.warns(UserWarning, match='`ser_x` overrides an existing Pydantic `@field_serializer` decorator'):

        def ser(self: Any, _v: int, _info: Any) -> str:
            return 'ser'

        class Model(BaseModel):
            x: int
            y: int

            @field_serializer('x')
            def ser_x(self, _v: int) -> str:
                return 'ser_x'

            ser_x = field_serializer('y')(ser)

    assert Model(x=1, y=2).model_dump() == {'x': 1, 'y': 'ser'}


def test_serializer_allow_reuse_different_field_3():
    with pytest.warns(UserWarning, match='`ser_x` overrides an existing Pydantic `@field_serializer` decorator'):

        def ser1(self: Any, _v: int, _info: Any) -> str:
            return 'ser1'

        def ser2(self: Any, _v: int, _info: Any) -> str:
            return 'ser2'

        class Model(BaseModel):
            x: int
            y: int

            ser_x = field_serializer('x')(ser1)
            ser_x = field_serializer('y')(ser2)

    assert Model(x=1, y=2).model_dump() == {'x': 1, 'y': 'ser2'}


def test_serializer_allow_reuse_different_field_4():
    def ser(self: Any, _v: int, _info: Any) -> str:
        return f'{_v:,}'

    class Model(BaseModel):
        x: int
        y: int

        ser_x = field_serializer('x')(ser)
        not_ser_x = field_serializer('y')(ser)

    assert Model(x=1_000, y=2_000).model_dump() == {'x': '1,000', 'y': '2,000'}


def test_serialize_any_model():
    class Model(BaseModel):
        m: str

        @field_serializer('m')
        def ser_m(self, v: str, _info: SerializationInfo) -> str:
            return f'custom:{v}'

    class AnyModel(BaseModel):
        x: Any

    m = Model(m='test')
    assert m.model_dump() == {'m': 'custom:test'}
    assert to_jsonable_python(AnyModel(x=m)) == {'x': {'m': 'custom:test'}}
    assert AnyModel(x=m).model_dump() == {'x': {'m': 'custom:test'}}


def test_invalid_field():
    msg = (
        r'Decorators defined with incorrect fields:'
        r' tests.test_serialize.test_invalid_field.<locals>.Model:\d+.customise_b_serialization'
        r" \(use check_fields=False if you're inheriting from the model and intended this\)"
    )
    with pytest.raises(errors.PydanticUserError, match=msg):

        class Model(BaseModel):
            a: str

            @field_serializer('b')
            def customise_b_serialization(v):
                return v


def test_serialize_with_extra():
    class Inner(BaseModel):
        a: str = 'a'

    class Outer(BaseModel):
        # this caused the inner model to be dumped incorrectly:
        model_config = ConfigDict(extra='allow')
        inner: Inner = Field(default_factory=Inner)

    m = Outer.model_validate({})
    assert m.model_dump() == {'inner': {'a': 'a'}}


def test_model_serializer_nested_models() -> None:
    class Model(BaseModel):
        x: int
        inner: Optional['Model']

        @model_serializer(mode='wrap')
        def ser_model(self, handler: Callable[['Model'], Dict[str, Any]]) -> Dict[str, Any]:
            inner = handler(self)
            inner['x'] += 1
            return inner

    assert Model(x=0, inner=None).model_dump() == {'x': 1, 'inner': None}
    assert Model(x=2, inner=Model(x=1, inner=Model(x=0, inner=None))).model_dump() == {
        'x': 3,
        'inner': {'x': 2, 'inner': {'x': 1, 'inner': None}},
    }


def test_pattern_serialize():
    ta = TypeAdapter(Pattern[str])
    pattern = re.compile('^regex$')
    assert ta.dump_python(pattern) == pattern
    assert ta.dump_python(pattern, mode='json') == '^regex$'
    assert ta.dump_json(pattern) == b'"^regex$"'


def test_custom_return_schema():
    class Model(BaseModel):
        x: int

        @field_serializer('x', return_type=str)
        def ser_model(self, v) -> int:
            return repr(v)

    return_serializer = re.search(r'return_serializer: *\w+', repr(Model.__pydantic_serializer__)).group(0)
    assert return_serializer == 'return_serializer: Str'


def test_clear_return_schema():
    class Model(BaseModel):
        x: int

        @field_serializer('x', return_type=Any)
        def ser_model(self, v) -> int:
            return repr(v)

    return_serializer = re.search(r'return_serializer: *\w+', repr(Model.__pydantic_serializer__)).group(0)
    assert return_serializer == 'return_serializer: Any'


def test_serializer_return_type_model() -> None:
    """https://github.com/pydantic/pydantic/issues/10443"""

    class Sub(BaseModel):
        pass

    class Model(BaseModel):
        sub: Annotated[
            Sub,
            PlainSerializer(lambda v: v, return_type=Sub),
        ]

    assert Model(sub=Sub()).model_dump() == {'sub': {}}


def test_type_adapter_dump_json():
    class Model(TypedDict):
        x: int
        y: float

        @model_serializer(mode='plain')
        def ser_model(self) -> Dict[str, Any]:
            return {'x': self['x'] * 2, 'y': self['y'] * 3}

    ta = TypeAdapter(Model)
    assert ta.dump_json(Model({'x': 1, 'y': 2.5})) == b'{"x":2,"y":7.5}'


def test_type_adapter_dump_with_context():
    class Model(TypedDict):
        x: int
        y: float

        @model_serializer(mode='wrap')
        def _serialize(self, handler, info: SerializationInfo):
            data = handler(self)
            if info.context and info.context.get('mode') == 'x-only':
                data.pop('y')
            return data

    ta = TypeAdapter(Model)
    assert ta.dump_json(Model({'x': 1, 'y': 2.5}), context={'mode': 'x-only'}) == b'{"x":1}'


@pytest.mark.parametrize('as_annotation', [True, False])
@pytest.mark.parametrize('mode', ['plain', 'wrap'])
def test_forward_ref_for_serializers(as_annotation, mode):
    if mode == 'plain':

        def ser_model_func(v) -> 'SomeOtherModel':  # noqa F821
            return OtherModel(y=v + 1)

        def ser_model_method(self, v) -> 'SomeOtherModel':  # noqa F821
            return ser_model_func(v)

        annotation = PlainSerializer(ser_model_func)
    else:

        def ser_model_func(v, handler) -> 'SomeOtherModel':  # noqa F821
            return OtherModel(y=v + 1)

        def ser_model_method(self, v, handler) -> 'SomeOtherModel':  # noqa F821
            return ser_model_func(v, handler)

        annotation = WrapSerializer(ser_model_func)

    class Model(BaseModel):
        if as_annotation:
            x: Annotated[int, annotation]
        else:
            x: int
            ser_model = field_serializer('x', mode=mode)(ser_model_method)

    class OtherModel(BaseModel):
        y: int

    Model.model_rebuild(_types_namespace={'SomeOtherModel': OtherModel})
    assert Model(x=1).model_dump() == {'x': {'y': 2}}
    assert Model.model_json_schema(mode='serialization') == {
        '$defs': {
            'OtherModel': {
                'properties': {'y': {'title': 'Y', 'type': 'integer'}},
                'required': ['y'],
                'title': 'OtherModel',
                'type': 'object',
            }
        },
        'properties': {'x': {'$ref': '#/$defs/OtherModel', 'title': 'X'}},
        'required': ['x'],
        'title': 'Model',
        'type': 'object',
    }


def test_forward_ref_for_computed_fields():
    class Model(BaseModel):
        x: int

        @computed_field
        @property
        def two_x(self) -> 'IntAlias':  # noqa F821
            return self.x * 2

    Model.model_rebuild(_types_namespace={'IntAlias': int})
    assert Model.model_json_schema(mode='serialization') == {
        'properties': {
            'two_x': {'readOnly': True, 'title': 'Two X', 'type': 'integer'},
            'x': {'title': 'X', 'type': 'integer'},
        },
        'required': ['x', 'two_x'],
        'title': 'Model',
        'type': 'object',
    }
    assert Model(x=1).model_dump() == {'two_x': 2, 'x': 1}


def test_computed_field_custom_serializer():
    class Model(BaseModel):
        x: int

        @computed_field
        @property
        def two_x(self) -> int:
            return self.x * 2

        @field_serializer('two_x', when_used='json')
        def ser_two_x(self, v):
            return f'The double of x is {v}'

    m = Model(x=1)
    assert m.model_dump() == {'two_x': 2, 'x': 1}
    assert json.loads(m.model_dump_json()) == {'two_x': 'The double of x is 2', 'x': 1}


def test_annotated_computed_field_custom_serializer():
    class Model(BaseModel):
        x: int

        @computed_field
        @property
        def two_x(self) -> Annotated[int, PlainSerializer(lambda v: f'The double of x is {v}', return_type=str)]:
            return self.x * 2

        @computed_field
        @property
        def triple_x(self) -> Annotated[int, PlainSerializer(lambda v: f'The triple of x is {v}', return_type=str)]:
            return self.two_x * 3

        @computed_field
        @property
        def quadruple_x_plus_one(self) -> Annotated[int, PlainSerializer(lambda v: v + 1, return_type=int)]:
            return self.two_x * 2

    m = Model(x=1)
    assert m.x == 1
    assert m.two_x == 2
    assert m.triple_x == 6
    assert m.quadruple_x_plus_one == 4

    # insert_assert(m.model_dump())
    assert m.model_dump() == {
        'x': 1,
        'two_x': 'The double of x is 2',
        'triple_x': 'The triple of x is 6',
        'quadruple_x_plus_one': 5,
    }

    # insert_assert(json.loads(m.model_dump_json()))
    assert json.loads(m.model_dump_json()) == {
        'x': 1,
        'two_x': 'The double of x is 2',
        'triple_x': 'The triple of x is 6',
        'quadruple_x_plus_one': 5,
    }

    # insert_assert(Model.model_json_schema(mode='serialization'))
    assert Model.model_json_schema(mode='serialization') == {
        'properties': {
            'x': {'title': 'X', 'type': 'integer'},
            'two_x': {'readOnly': True, 'title': 'Two X', 'type': 'string'},
            'triple_x': {'readOnly': True, 'title': 'Triple X', 'type': 'string'},
            'quadruple_x_plus_one': {'readOnly': True, 'title': 'Quadruple X Plus One', 'type': 'integer'},
        },
        'required': ['x', 'two_x', 'triple_x', 'quadruple_x_plus_one'],
        'title': 'Model',
        'type': 'object',
    }


@pytest.mark.skipif(
    sys.version_info < (3, 9) or sys.version_info >= (3, 13),
    reason='@computed_field @classmethod @property only works in 3.9-3.12',
)
def test_forward_ref_for_classmethod_computed_fields():
    class Model(BaseModel):
        y: ClassVar[int] = 4

        @computed_field
        @classmethod
        @property
        def two_y(cls) -> 'IntAlias':  # noqa F821
            return cls.y * 2

    Model.model_rebuild(_types_namespace={'IntAlias': int})
    assert Model.model_json_schema(mode='serialization') == {
        'properties': {
            'two_y': {'readOnly': True, 'title': 'Two Y', 'type': 'integer'},
        },
        'required': ['two_y'],
        'title': 'Model',
        'type': 'object',
    }
    assert Model().model_dump() == {'two_y': 8}


def test_enum_as_dict_key() -> None:
    # See https://github.com/pydantic/pydantic/issues/7639
    class MyEnum(Enum):
        A = 'a'
        B = 'b'

    class MyModel(BaseModel):
        foo: Dict[MyEnum, str]
        bar: MyEnum

    assert MyModel(foo={MyEnum.A: 'hello'}, bar=MyEnum.B).model_dump_json() == '{"foo":{"a":"hello"},"bar":"b"}'


def test_subclass_support_unions() -> None:
    class Pet(BaseModel):
        name: str

    class Dog(Pet):
        breed: str

    class Kid(BaseModel):
        age: str

    class Home(BaseModel):
        little_guys: Union[List[Pet], List[Kid]]

    class Shelter(BaseModel):
        pets: List[Pet]

    h1 = Home(little_guys=[Pet(name='spot'), Pet(name='buddy')])
    assert h1.model_dump() == {'little_guys': [{'name': 'spot'}, {'name': 'buddy'}]}

    h2 = Home(little_guys=[Dog(name='fluffy', breed='lab'), Dog(name='patches', breed='boxer')])
    assert h2.model_dump() == {'little_guys': [{'name': 'fluffy'}, {'name': 'patches'}]}

    # confirming same serialization + validation behavior as for a single list (not a union)
    s = Shelter(pets=[Dog(name='fluffy', breed='lab'), Dog(name='patches', breed='boxer')])
    assert s.model_dump() == {'pets': [{'name': 'fluffy'}, {'name': 'patches'}]}


def test_subclass_support_unions_with_forward_ref() -> None:
    class Bar(BaseModel):
        bar_id: int

    class Baz(Bar):
        baz_id: int

    class Foo(BaseModel):
        items: Union[List['Foo'], List[Bar]]

    foo = Foo(items=[Baz(bar_id=1, baz_id=2), Baz(bar_id=3, baz_id=4)])
    assert foo.model_dump() == {'items': [{'bar_id': 1}, {'bar_id': 3}]}

    foo_recursive = Foo(items=[Foo(items=[Baz(bar_id=42, baz_id=99)])])
    assert foo_recursive.model_dump() == {'items': [{'items': [{'bar_id': 42}]}]}


def test_serialize_python_context() -> None:
    contexts: List[Any] = [None, None, {'foo': 'bar'}]

    class Model(BaseModel):
        x: int

        @field_serializer('x')
        def serialize_x(self, v: int, info: SerializationInfo) -> int:
            assert info.context == contexts.pop(0)
            return v

    m = Model.model_construct(**{'x': 1})
    m.model_dump()
    m.model_dump(context=None)
    m.model_dump(context={'foo': 'bar'})
    assert contexts == []


def test_serialize_json_context() -> None:
    contexts: List[Any] = [None, None, {'foo': 'bar'}]

    class Model(BaseModel):
        x: int

        @field_serializer('x')
        def serialize_x(self, v: int, info: SerializationInfo) -> int:
            assert info.context == contexts.pop(0)
            return v

    m = Model.model_construct(**{'x': 1})
    m.model_dump_json()
    m.model_dump_json(context=None)
    m.model_dump_json(context={'foo': 'bar'})
    assert contexts == []


def test_plain_serializer_with_std_type() -> None:
    """Ensure that a plain serializer can be used with a standard type constructor, rather than having to use
    lambda x: std_type(x)."""

    class MyModel(BaseModel):
        x: Annotated[int, PlainSerializer(float)]

    m = MyModel(x=1)
    assert m.model_dump() == {'x': 1.0}
    assert m.model_dump_json() == '{"x":1.0}'

    assert m.model_json_schema(mode='serialization') == {
        'properties': {'x': {'title': 'X', 'type': 'number'}},
        'required': ['x'],
        'title': 'MyModel',
        'type': 'object',
    }


@pytest.mark.xfail(reason='Waiting for union serialization fixes via https://github.com/pydantic/pydantic/issues/9688.')
def smart_union_serialization() -> None:
    """Initially reported via https://github.com/pydantic/pydantic/issues/9417, effectively a round tripping problem
    with type consistency."""

    class FloatThenInt(BaseModel):
        value: Union[float, int, str] = Field(union_mode='smart')

    class IntThenFloat(BaseModel):
        value: Union[int, float, str] = Field(union_mode='smart')

    float_then_int = FloatThenInt(value=100)
    assert type(json.loads(float_then_int.model_dump_json())['value']) is int

    int_then_float = IntThenFloat(value=100)
    assert type(json.loads(int_then_float.model_dump_json())['value']) is int


def test_serialize_with_custom_ser() -> None:
    class Item(BaseModel):
        id: int

        @model_serializer
        def dump(self) -> Dict[str, Any]:
            return {'id': self.id}

    class ItemContainer(BaseModel):
        item_or_items: Union[Item, List[Item]]

    items = [Item(id=i) for i in range(5)]
    assert (
        ItemContainer(item_or_items=items).model_dump_json()
        == '{"item_or_items":[{"id":0},{"id":1},{"id":2},{"id":3},{"id":4}]}'
    )


def test_field_serializers_use_enum_ref() -> None:
    """See https://github.com/pydantic/pydantic/issues/9394 for the original issue."""

    class MyEnum(Enum):
        A = 'a'
        B = 'b'

    class MyModel(BaseModel):
        @computed_field
        @property
        def computed_a_or_b(self) -> MyEnum:
            return MyEnum.B

        @field_serializer('computed_a_or_b')
        def serialize_my_enum(self, a_or_b: MyEnum) -> str:
            return a_or_b.value

    m = MyModel()
    assert m.model_dump()['computed_a_or_b'] == 'b'


# ----------------------------------------------------------------------
# pydantic-2.10.6/tests/test_serialize_as_any.py
# ----------------------------------------------------------------------

import json
from dataclasses import dataclass
from typing import List, Optional

import pytest
from typing_extensions import TypedDict

from pydantic import BaseModel, ConfigDict, RootModel, SecretStr, SerializeAsAny, TypeAdapter
from pydantic.dataclasses import dataclass as pydantic_dataclass


class User(BaseModel):
    name: str


class UserLogin(User):
    password: SecretStr


user = User(name='pydantic')
user_login = UserLogin(name='pydantic', password='password')


def test_serialize_as_any_annotation() -> None:
    class OuterModel(BaseModel):
        maybe_as_any: Optional[SerializeAsAny[User]] = None
        as_any: SerializeAsAny[User]
        without: User

    # insert_assert(json.loads(OuterModel(as_any=user, without=user).model_dump_json()))
    assert json.loads(OuterModel(maybe_as_any=user_login, as_any=user_login, without=user_login).model_dump_json()) == {
        'maybe_as_any': {'name': 'pydantic', 'password': '**********'},
        'as_any': {'name': 'pydantic', 'password': '**********'},
        'without': {'name': 'pydantic'},
    }


def test_serialize_as_any_runtime() -> None:
    class OuterModel(BaseModel):
        user: User

    assert json.loads(OuterModel(user=user_login).model_dump_json(serialize_as_any=False)) == {
        'user': {'name': 'pydantic'}
    }
    assert json.loads(OuterModel(user=user_login).model_dump_json(serialize_as_any=True)) == {
        'user': {'name': 'pydantic', 'password': '**********'}
    }


def test_serialize_as_any_runtime_recursive() -> None:
    class User(BaseModel):
        name: str
        friends: List['User']

    class UserLogin(User):
        password: SecretStr

    class OuterModel(BaseModel):
        user: User

    user = UserLogin(
        name='pydantic', password='password', friends=[UserLogin(name='pydantic', password='password', friends=[])]
    )
    assert json.loads(OuterModel(user=user).model_dump_json(serialize_as_any=False)) == {
        'user': {
            'name': 'pydantic',
            'friends': [{'name': 'pydantic', 'friends': []}],
        },
    }
    assert json.loads(OuterModel(user=user).model_dump_json(serialize_as_any=True)) == {
        'user': {
            'name': 'pydantic',
            'password': '**********',
            'friends': [{'name': 'pydantic', 'password': '**********', 'friends': []}],
        },
    }


def test_serialize_as_any_with_rootmodel() -> None:
    UserRoot = RootModel[User]

    assert json.loads(UserRoot(root=user_login).model_dump_json(serialize_as_any=False)) == {'name': 'pydantic'}
    assert json.loads(UserRoot(root=user_login).model_dump_json(serialize_as_any=True)) == {
        'name': 'pydantic',
        'password': '**********',
    }


def test_serialize_as_any_type_adapter() -> None:
    ta = TypeAdapter(User)

    assert json.loads(ta.dump_json(user_login, serialize_as_any=False)) == {'name': 'pydantic'}
    assert json.loads(ta.dump_json(user_login, serialize_as_any=True)) == {'name': 'pydantic', 'password': '**********'}


@pytest.mark.parametrize('dataclass_constructor', [dataclass, pydantic_dataclass])
def test_serialize_as_any_with_dataclasses(dataclass_constructor) -> None:
    @dataclass_constructor
    class User:
        name: str

    @dataclass_constructor
    class UserLogin(User):
        password: str

    user_login = UserLogin(name='pydantic', password='password')

    ta = TypeAdapter(User)
    assert json.loads(ta.dump_json(user_login, serialize_as_any=False, warnings=False)) == {'name': 'pydantic'}
    assert json.loads(ta.dump_json(user_login, serialize_as_any=True, warnings=False)) == {
        'name': 'pydantic',
        'password': 'password',
    }


def test_serialize_as_any_with_typed_dict() -> None:
    class User(TypedDict):
        name: str

    class UserLogin(User):
        password: str

    user_login = UserLogin(name='pydantic', password='password')

    ta = TypeAdapter(User)
    assert json.loads(ta.dump_json(user_login, serialize_as_any=False, warnings=False)) == {'name': 'pydantic'}
    assert json.loads(ta.dump_json(user_login, serialize_as_any=True, warnings=False)) == {
        'name': 'pydantic',
        'password': 'password',
    }


def test_serialize_as_any_flag_on_unrelated_models() -> None:
    class Parent(BaseModel):
        x: int

    class Other(BaseModel):
        y: str

        model_config = ConfigDict(extra='allow')

    ta = TypeAdapter(Parent)
    other = Other(x=1, y='hello')
    assert ta.dump_python(other, serialize_as_any=False) == {}
    assert ta.dump_python(other, serialize_as_any=True) == {'y': 'hello', 'x': 1}


def test_serialize_as_any_annotation_on_unrelated_models() -> None:
    class Parent(BaseModel):
        x: int

    class Other(BaseModel):
        y: str

        model_config = ConfigDict(extra='allow')

    ta = TypeAdapter(Parent)
    other = Other(x=1, y='hello')
    assert ta.dump_python(other) == {}

    ta_any = TypeAdapter(SerializeAsAny[Parent])
    assert ta_any.dump_python(other) == {'y': 'hello', 'x': 1}


def test_serialize_as_any_with_inner_models() -> None:
    """As with other serialization flags, serialize_as_any affects nested models as well."""

    class Inner(BaseModel):
        x: int

    class Outer(BaseModel):
        inner: Inner

    class InnerChild(Inner):
        y: int

    ta = TypeAdapter(Outer)
    inner_child = InnerChild(x=1, y=2)
    outer = Outer(inner=inner_child)
    assert ta.dump_python(outer, serialize_as_any=False) == {'inner': {'x': 1}}
    assert ta.dump_python(outer, serialize_as_any=True) == {'inner': {'x': 1, 'y': 2}}


def test_serialize_as_any_annotation_with_inner_models() -> None:
    """The SerializeAsAny annotation does not affect nested models."""

    class Inner(BaseModel):
        x: int

    class Outer(BaseModel):
        inner: Inner

    class InnerChild(Inner):
        y: int

    ta = TypeAdapter(SerializeAsAny[Outer])
inner_child = InnerChild(x=1, y=2) outer = Outer(inner=inner_child) assert ta.dump_python(outer) == {'inner': {'x': 1}} def test_serialize_as_any_flag_with_incorrect_list_el_type() -> None: # a warning is raised when using the `serialize_as_any` flag ta = TypeAdapter(List[int]) with pytest.warns(UserWarning, match='Expected `int` but got `str`'): assert ta.dump_python(['a', 'b', 'c'], serialize_as_any=False) == ['a', 'b', 'c'] def test_serialize_as_any_annotation_with_incorrect_list_el_type() -> None: # notably, the warning is not raised when using the SerializeAsAny annotation ta = TypeAdapter(SerializeAsAny[List[int]]) assert ta.dump_python(['a', 'b', 'c']) == ['a', 'b', 'c'] pydantic-2.10.6/tests/test_strict.py000066400000000000000000000061731474456633400175250ustar00rootroot00000000000000import sys from typing import Any, Type if sys.version_info < (3, 9): from typing_extensions import Annotated else: from typing import Annotated import pytest from pydantic import BaseModel, ConfigDict, Field, ValidationError @pytest.fixture(scope='session', name='ModelWithStrictField') def model_with_strict_field(): class ModelWithStrictField(BaseModel): a: Annotated[int, Field(strict=True)] return ModelWithStrictField @pytest.mark.parametrize( 'value', [ '1', True, 1.0, ], ) def test_parse_strict_mode_on_field_invalid(value: Any, ModelWithStrictField: Type[BaseModel]) -> None: with pytest.raises(ValidationError) as exc_info: ModelWithStrictField(a=value) assert exc_info.value.errors(include_url=False) == [ {'type': 'int_type', 'loc': ('a',), 'msg': 'Input should be a valid integer', 'input': value} ] def test_parse_strict_mode_on_field_valid(ModelWithStrictField: Type[BaseModel]) -> None: value = ModelWithStrictField(a=1) assert value.model_dump() == {'a': 1} @pytest.fixture(scope='session', name='ModelWithStrictConfig') def model_with_strict_config_false(): class ModelWithStrictConfig(BaseModel): a: int # strict=False overrides the Config b: Annotated[int, 
Field(strict=False)]
        # strict=None or not including it is equivalent
        # lets this field be overridden by the Config
        c: Annotated[int, Field(strict=None)]
        d: Annotated[int, Field()]
        model_config = ConfigDict(strict=True)

    return ModelWithStrictConfig


def test_parse_model_with_strict_config_enabled(ModelWithStrictConfig: Type[BaseModel]) -> None:
    with pytest.raises(ValidationError) as exc_info:
        ModelWithStrictConfig(a='1', b=2, c=3, d=4)
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'int_type', 'loc': ('a',), 'msg': 'Input should be a valid integer', 'input': '1'}
    ]
    with pytest.raises(ValidationError) as exc_info:
        ModelWithStrictConfig(a=1, b=2, c='3', d=4)
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'int_type', 'loc': ('c',), 'msg': 'Input should be a valid integer', 'input': '3'}
    ]
    with pytest.raises(ValidationError) as exc_info:
        ModelWithStrictConfig(a=1, b=2, c=3, d='4')
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'int_type', 'loc': ('d',), 'msg': 'Input should be a valid integer', 'input': '4'}
    ]
    values = [
        ModelWithStrictConfig(a=1, b='2', c=3, d=4),
        ModelWithStrictConfig(a=1, b=2, c=3, d=4),
    ]
    assert all(v.model_dump() == {'a': 1, 'b': 2, 'c': 3, 'd': 4} for v in values)


def test_parse_model_with_strict_config_disabled(ModelWithStrictConfig: Type[BaseModel]) -> None:
    class Model(ModelWithStrictConfig):
        model_config = ConfigDict(strict=False)

    values = [
        Model(a='1', b=2, c=3, d=4),
        Model(a=1, b=2, c='3', d=4),
        Model(a=1, b=2, c=3, d='4'),
        Model(a=1, b='2', c=3, d=4),
        Model(a=1, b=2, c=3, d=4),
    ]
    assert all(v.model_dump() == {'a': 1, 'b': 2, 'c': 3, 'd': 4} for v in values)


# ---- pydantic-2.10.6/tests/test_structural_pattern_matching.py ----

import sys

import pytest


@pytest.mark.skipif(sys.version_info < (3, 10), reason='requires python 3.10 or higher')
def test_match_kwargs(create_module):
    module = create_module(
        # language=Python
        """
from
pydantic import BaseModel

class Model(BaseModel):
    a: str
    b: str

def main(model):
    match model:
        case Model(a='a', b=b):
            return b
        case Model(a='a2'):
            return 'b2'
        case _:
            return None
"""
    )
    assert module.main(module.Model(a='a', b='b')) == 'b'
    assert module.main(module.Model(a='a2', b='b')) == 'b2'
    assert module.main(module.Model(a='x', b='b')) is None


# ---- pydantic-2.10.6/tests/test_titles.py ----

import re
import typing
from typing import Any, Callable, List

import pytest
import typing_extensions

import pydantic
from pydantic import BaseModel, ConfigDict, Field, TypeAdapter, computed_field
from pydantic.fields import FieldInfo
from pydantic.json_schema import model_json_schema

from .test_types_typeddict import fixture_typed_dict, fixture_typed_dict_all  # noqa


@pytest.fixture(
    name='Annotated',
    params=[
        pytest.param(typing, id='typing.Annotated'),
        pytest.param(typing_extensions, id='t_e.Annotated'),
    ],
)
def fixture_annotated(request):
    try:
        return request.param.Annotated
    except AttributeError:
        pytest.skip(f'Annotated is not available from {request.param}')


def make_title(name: str, _):
    def _capitalize(v: str):
        return v[0].upper() + v[1:]

    return re.sub(r'(?<=[a-z])([A-Z])', r' \1', _capitalize(name))


FIELD_TITLE_GENERATORS: List[Callable[[str, Any], str]] = [
    lambda t, _: t.lower(),
    lambda t, _: t * 2,
    lambda t, _: 'My Title',
    make_title,
]

MODEL_TITLE_GENERATORS: List[Callable[[Any], str]] = [
    lambda m: m.__name__.upper(),
    lambda m: m.__name__ * 2,
    lambda m: 'My Model',
]


@pytest.mark.parametrize('model_title_generator', MODEL_TITLE_GENERATORS)
def test_model_model_title_generator(model_title_generator):
    class Model(BaseModel):
        model_config = ConfigDict(model_title_generator=model_title_generator)

    assert Model.model_json_schema() == {
        'properties': {},
        'title': model_title_generator(Model),
        'type': 'object',
    }


@pytest.mark.parametrize('model_title_generator', MODEL_TITLE_GENERATORS)
def
test_model_title_generator_in_submodel(model_title_generator): class SubModel(BaseModel): model_config = ConfigDict(model_title_generator=model_title_generator) class Model(BaseModel): sub: SubModel assert Model.model_json_schema() == { '$defs': {'SubModel': {'properties': {}, 'title': model_title_generator(SubModel), 'type': 'object'}}, 'properties': {'sub': {'$ref': '#/$defs/SubModel'}}, 'required': ['sub'], 'title': 'Model', 'type': 'object', } @pytest.mark.parametrize('field_title_generator', FIELD_TITLE_GENERATORS) def test_field_title_generator_in_model_fields(field_title_generator): class Model(BaseModel): field_a: str = Field(field_title_generator=field_title_generator) field_b: int = Field(field_title_generator=field_title_generator) @computed_field(field_title_generator=field_title_generator) def field_c(self) -> str: return self.field_a assert Model.model_json_schema(mode='serialization') == { 'properties': { 'field_a': {'title': field_title_generator('field_a', Model.model_fields['field_a']), 'type': 'string'}, 'field_b': {'title': field_title_generator('field_b', Model.model_fields['field_b']), 'type': 'integer'}, 'field_c': { 'readOnly': True, 'title': field_title_generator('field_c', Model.model_computed_fields['field_c']), 'type': 'string', }, }, 'required': ['field_a', 'field_b', 'field_c'], 'title': 'Model', 'type': 'object', } @pytest.mark.parametrize('field_title_generator', FIELD_TITLE_GENERATORS) def test_model_config_field_title_generator(field_title_generator): class Model(BaseModel): model_config = ConfigDict(field_title_generator=field_title_generator) field_a: str field_b: int field___c: bool @computed_field def field_d(self) -> str: return self.field_a assert Model.model_json_schema(mode='serialization') == { 'properties': { 'field_a': {'title': field_title_generator('field_a', Model.model_fields['field_a']), 'type': 'string'}, 'field_b': {'title': field_title_generator('field_b', Model.model_fields['field_b']), 'type': 'integer'}, 
'field___c': { 'title': field_title_generator('field___c', Model.model_fields['field___c']), 'type': 'boolean', }, 'field_d': { 'readOnly': True, 'title': field_title_generator('field_d', Model.model_computed_fields['field_d']), 'type': 'string', }, }, 'required': ['field_a', 'field_b', 'field___c', 'field_d'], 'title': 'Model', 'type': 'object', } @pytest.mark.parametrize('model_title_generator', MODEL_TITLE_GENERATORS) def test_dataclass_model_title_generator(model_title_generator): @pydantic.dataclasses.dataclass(config=ConfigDict(model_title_generator=model_title_generator)) class MyDataclass: field_a: int assert model_json_schema(MyDataclass) == { 'properties': {'field_a': {'title': 'Field A', 'type': 'integer'}}, 'required': ['field_a'], 'title': model_title_generator(MyDataclass), 'type': 'object', } @pytest.mark.parametrize('field_title_generator', FIELD_TITLE_GENERATORS) def test_field_title_generator_in_dataclass_fields(field_title_generator): @pydantic.dataclasses.dataclass class MyDataclass: field_a: str = Field(field_title_generator=field_title_generator) field_b: int = Field(field_title_generator=field_title_generator) assert model_json_schema(MyDataclass) == { 'properties': { 'field_a': { 'title': field_title_generator('field_a', MyDataclass.__pydantic_fields__['field_a']), 'type': 'string', }, 'field_b': { 'title': field_title_generator('field_b', MyDataclass.__pydantic_fields__['field_b']), 'type': 'integer', }, }, 'required': ['field_a', 'field_b'], 'title': 'MyDataclass', 'type': 'object', } @pytest.mark.parametrize('field_title_generator', FIELD_TITLE_GENERATORS) def test_dataclass_config_field_title_generator(field_title_generator): @pydantic.dataclasses.dataclass(config=ConfigDict(field_title_generator=field_title_generator)) class MyDataclass: field_a: str field_b: int field___c: bool assert model_json_schema(MyDataclass) == { 'properties': { 'field_a': { 'title': field_title_generator('field_a', MyDataclass.__pydantic_fields__['field_a']), 
'type': 'string', }, 'field_b': { 'title': field_title_generator('field_b', MyDataclass.__pydantic_fields__['field_b']), 'type': 'integer', }, 'field___c': { 'title': field_title_generator('field___c', MyDataclass.__pydantic_fields__['field___c']), 'type': 'boolean', }, }, 'required': ['field_a', 'field_b', 'field___c'], 'title': 'MyDataclass', 'type': 'object', } @pytest.mark.parametrize('model_title_generator', MODEL_TITLE_GENERATORS) def test_typeddict_model_title_generator(model_title_generator, TypedDict): class MyTypedDict(TypedDict): __pydantic_config__ = ConfigDict(model_title_generator=model_title_generator) pass assert TypeAdapter(MyTypedDict).json_schema() == { 'properties': {}, 'title': model_title_generator(MyTypedDict), 'type': 'object', } @pytest.mark.parametrize('field_title_generator', FIELD_TITLE_GENERATORS) def test_field_title_generator_in_typeddict_fields(field_title_generator, TypedDict, Annotated): class MyTypedDict(TypedDict): field_a: Annotated[str, Field(field_title_generator=field_title_generator)] field_b: Annotated[int, Field(field_title_generator=field_title_generator)] assert TypeAdapter(MyTypedDict).json_schema() == { 'properties': { 'field_a': { 'title': field_title_generator( 'field_a', FieldInfo.from_annotation(MyTypedDict.__annotations__['field_a']) ), 'type': 'string', }, 'field_b': { 'title': field_title_generator( 'field_b', FieldInfo.from_annotation(MyTypedDict.__annotations__['field_a']) ), 'type': 'integer', }, }, 'required': ['field_a', 'field_b'], 'title': 'MyTypedDict', 'type': 'object', } @pytest.mark.parametrize('field_title_generator', FIELD_TITLE_GENERATORS) def test_typeddict_config_field_title_generator(field_title_generator, TypedDict): class MyTypedDict(TypedDict): __pydantic_config__ = ConfigDict(field_title_generator=field_title_generator) field_a: str field_b: int field___c: bool assert TypeAdapter(MyTypedDict).json_schema() == { 'properties': { 'field_a': { 'title': field_title_generator( 'field_a', 
FieldInfo.from_annotation(MyTypedDict.__annotations__['field_a']) ), 'type': 'string', }, 'field_b': { 'title': field_title_generator( 'field_b', FieldInfo.from_annotation(MyTypedDict.__annotations__['field_b']) ), 'type': 'integer', }, 'field___c': { 'title': field_title_generator( 'field___c', FieldInfo.from_annotation(MyTypedDict.__annotations__['field___c']) ), 'type': 'boolean', }, }, 'required': ['field_a', 'field_b', 'field___c'], 'title': 'MyTypedDict', 'type': 'object', } @pytest.mark.parametrize( 'field_level_title_generator,config_level_title_generator', ((lambda f, _: f.lower(), lambda f, _: f.upper()), (lambda f, _: f, make_title)), ) def test_field_level_field_title_generator_precedence_over_config_level( field_level_title_generator, config_level_title_generator, TypedDict, Annotated ): class MyModel(BaseModel): model_config = ConfigDict(field_title_generator=field_level_title_generator) field_a: str = Field(field_title_generator=field_level_title_generator) assert MyModel.model_json_schema() == { 'properties': { 'field_a': { 'title': field_level_title_generator('field_a', MyModel.model_fields['field_a']), 'type': 'string', } }, 'required': ['field_a'], 'title': 'MyModel', 'type': 'object', } @pydantic.dataclasses.dataclass(config=ConfigDict(field_title_generator=field_level_title_generator)) class MyDataclass: field_a: str = Field(field_title_generator=field_level_title_generator) assert model_json_schema(MyDataclass) == { 'properties': { 'field_a': { 'title': field_level_title_generator('field_a', MyDataclass.__pydantic_fields__['field_a']), 'type': 'string', } }, 'required': ['field_a'], 'title': 'MyDataclass', 'type': 'object', } class MyTypedDict(TypedDict): __pydantic_config__ = ConfigDict(field_title_generator=field_level_title_generator) field_a: Annotated[str, Field(field_title_generator=field_level_title_generator)] assert TypeAdapter(MyTypedDict).json_schema() == { 'properties': { 'field_a': { 'title': field_level_title_generator( 
'field_a', FieldInfo.from_annotation(MyTypedDict.__annotations__['field_a']) ), 'type': 'string', } }, 'required': ['field_a'], 'title': 'MyTypedDict', 'type': 'object', } def test_field_title_precedence_over_generators(TypedDict, Annotated): class Model(BaseModel): model_config = ConfigDict(field_title_generator=lambda f, _: f.upper()) field_a: str = Field(title='MyFieldA', field_title_generator=lambda f, _: f.upper()) @computed_field(title='MyFieldB', field_title_generator=lambda f, _: f.upper()) def field_b(self) -> str: return self.field_a assert Model.model_json_schema(mode='serialization') == { 'properties': { 'field_a': {'title': 'MyFieldA', 'type': 'string'}, 'field_b': {'readOnly': True, 'title': 'MyFieldB', 'type': 'string'}, }, 'required': ['field_a', 'field_b'], 'title': 'Model', 'type': 'object', } @pydantic.dataclasses.dataclass(config=ConfigDict(field_title_generator=lambda f, _: f.upper())) class MyDataclass: field_a: str = Field(title='MyTitle', field_title_generator=lambda f, _: f.upper()) assert model_json_schema(MyDataclass) == { 'properties': {'field_a': {'title': 'MyTitle', 'type': 'string'}}, 'required': ['field_a'], 'title': 'MyDataclass', 'type': 'object', } class MyTypedDict(TypedDict): __pydantic_config__ = ConfigDict(field_title_generator=lambda f, _: f.upper()) field_a: Annotated[str, Field(title='MyTitle', field_title_generator=lambda f, _: f.upper())] assert TypeAdapter(MyTypedDict).json_schema() == { 'properties': {'field_a': {'title': 'MyTitle', 'type': 'string'}}, 'required': ['field_a'], 'title': 'MyTypedDict', 'type': 'object', } def test_class_title_precedence_over_generator(): class Model(BaseModel): model_config = ConfigDict(title='MyTitle', model_title_generator=lambda m: m.__name__.upper()) assert Model.model_json_schema() == { 'properties': {}, 'title': 'MyTitle', 'type': 'object', } @pydantic.dataclasses.dataclass( config=ConfigDict(title='MyTitle', model_title_generator=lambda m: m.__name__.upper()) ) class MyDataclass: 
pass assert model_json_schema(MyDataclass) == { 'properties': {}, 'title': 'MyTitle', 'type': 'object', } @pytest.mark.parametrize('invalid_return_value', (1, 2, 3, tuple(), list(), object())) def test_model_title_generator_returns_invalid_type(invalid_return_value, TypedDict): with pytest.raises( TypeError, match=f'model_title_generator .* must return str, not {invalid_return_value.__class__}' ): class Model(BaseModel): model_config = ConfigDict(model_title_generator=lambda m: invalid_return_value) Model.model_json_schema() with pytest.raises( TypeError, match=f'model_title_generator .* must return str, not {invalid_return_value.__class__}' ): @pydantic.dataclasses.dataclass(config=ConfigDict(model_title_generator=lambda m: invalid_return_value)) class MyDataclass: pass TypeAdapter(MyDataclass).json_schema() with pytest.raises( TypeError, match=f'model_title_generator .* must return str, not {invalid_return_value.__class__}' ): class MyTypedDict(TypedDict): __pydantic_config__ = ConfigDict(model_title_generator=lambda m: invalid_return_value) pass TypeAdapter(MyTypedDict).json_schema() @pytest.mark.parametrize('invalid_return_value', (1, 2, 3, tuple(), list(), object())) def test_config_field_title_generator_returns_invalid_type(invalid_return_value, TypedDict): with pytest.raises( TypeError, match=f'field_title_generator .* must return str, not {invalid_return_value.__class__}' ): class Model(BaseModel): model_config = ConfigDict(field_title_generator=lambda f, _: invalid_return_value) field_a: str with pytest.raises( TypeError, match=f'field_title_generator .* must return str, not {invalid_return_value.__class__}' ): @pydantic.dataclasses.dataclass(config=ConfigDict(field_title_generator=lambda f, _: invalid_return_value)) class MyDataclass: field_a: str with pytest.raises( TypeError, match=f'field_title_generator .* must return str, not {invalid_return_value.__class__}' ): class MyTypedDict(TypedDict): __pydantic_config__ = 
ConfigDict(field_title_generator=lambda f, _: invalid_return_value)
            field_a: str

        TypeAdapter(MyTypedDict)


@pytest.mark.parametrize('invalid_return_value', (1, 2, 3, tuple(), list(), object()))
def test_field_title_generator_returns_invalid_type(invalid_return_value, TypedDict, Annotated):
    with pytest.raises(
        TypeError, match=f'field_title_generator .* must return str, not {invalid_return_value.__class__}'
    ):

        class Model(BaseModel):
            field_a: Any = Field(field_title_generator=lambda f, _: invalid_return_value)

        Model(field_a=invalid_return_value).model_json_schema()

    with pytest.raises(
        TypeError, match=f'field_title_generator .* must return str, not {invalid_return_value.__class__}'
    ):

        @pydantic.dataclasses.dataclass
        class MyDataclass:
            field_a: Any = Field(field_title_generator=lambda f, _: invalid_return_value)

        model_json_schema(MyDataclass)

    with pytest.raises(
        TypeError, match=f'field_title_generator .* must return str, not {invalid_return_value.__class__}'
    ):

        class MyTypedDict(TypedDict):
            field_a: Annotated[str, Field(field_title_generator=lambda f, _: invalid_return_value)]

        TypeAdapter(MyTypedDict)


# ---- pydantic-2.10.6/tests/test_tools.py ----

from typing import Dict, List, Mapping, Union

import pytest

from pydantic import BaseModel, PydanticDeprecatedSince20, ValidationError
from pydantic.dataclasses import dataclass
from pydantic.deprecated.tools import parse_obj_as, schema_json_of, schema_of

pytestmark = pytest.mark.filterwarnings('ignore::DeprecationWarning')


@pytest.mark.parametrize('obj,type_,parsed', [('1', int, 1), (['1'], List[int], [1])])
def test_parse_obj(obj, type_, parsed):
    assert parse_obj_as(type_, obj) == parsed


def test_parse_obj_as_model():
    class Model(BaseModel):
        x: int
        y: bool
        z: str

    model_inputs = {'x': '1', 'y': 'true', 'z': 'abc'}
    assert parse_obj_as(Model, model_inputs) == Model(**model_inputs)


def test_parse_obj_preserves_subclasses():
    class ModelA(BaseModel):
        a:
Mapping[int, str]

    class ModelB(ModelA):
        b: int

    model_b = ModelB(a={1: 'f'}, b=2)

    parsed = parse_obj_as(List[ModelA], [model_b])
    assert parsed == [model_b]


def test_parse_obj_fails():
    with pytest.raises(ValidationError) as exc_info:
        parse_obj_as(int, 'a')
    assert exc_info.value.errors(include_url=False) == [
        {
            'input': 'a',
            'loc': (),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'type': 'int_parsing',
        }
    ]


def test_parsing_model_naming():
    with pytest.raises(ValidationError) as exc_info:
        parse_obj_as(int, 'a')
    assert str(exc_info.value).split('\n')[0] == '1 validation error for int'

    with pytest.raises(ValidationError) as exc_info:
        with pytest.warns(PydanticDeprecatedSince20, match='The type_name parameter is deprecated'):
            parse_obj_as(int, 'a', type_name='ParsingModel')
    assert str(exc_info.value).split('\n')[0] == '1 validation error for int'


def test_parse_as_dataclass():
    @dataclass
    class PydanticDataclass:
        x: int

    inputs = {'x': '1'}
    assert parse_obj_as(PydanticDataclass, inputs) == PydanticDataclass(1)


def test_parse_mapping_as():
    inputs = {'1': '2'}
    assert parse_obj_as(Dict[int, int], inputs) == {1: 2}


def test_schema():
    assert schema_of(Union[int, str], title='IntOrStr') == {
        'title': 'IntOrStr',
        'anyOf': [{'type': 'integer'}, {'type': 'string'}],
    }
    assert schema_json_of(Union[int, str], title='IntOrStr', indent=2) == (
        '{\n'
        '  "anyOf": [\n'
        '    {\n'
        '      "type": "integer"\n'
        '    },\n'
        '    {\n'
        '      "type": "string"\n'
        '    }\n'
        '  ],\n'
        '  "title": "IntOrStr"\n'
        '}'
    )


# ---- pydantic-2.10.6/tests/test_type_adapter.py ----

import json
import sys
from dataclasses import dataclass
from datetime import date, datetime
from typing import Any, Dict, ForwardRef, Generic, List, NamedTuple, Optional, Tuple, TypeVar, Union

import pytest
from pydantic_core import ValidationError
from typing_extensions import Annotated, TypeAlias, TypedDict

from pydantic import BaseModel, Field,
TypeAdapter, ValidationInfo, create_model, field_validator from pydantic._internal import _mock_val_ser from pydantic._internal._typing_extra import annotated_type from pydantic.config import ConfigDict from pydantic.dataclasses import dataclass as pydantic_dataclass from pydantic.errors import PydanticUndefinedAnnotation, PydanticUserError from pydantic.type_adapter import _type_has_config ItemType = TypeVar('ItemType') NestedList = List[List[ItemType]] class PydanticModel(BaseModel): x: int T = TypeVar('T') class GenericPydanticModel(BaseModel, Generic[T]): x: NestedList[T] class SomeTypedDict(TypedDict): x: int class SomeNamedTuple(NamedTuple): x: int @pytest.mark.parametrize( 'tp, val, expected', [ (PydanticModel, PydanticModel(x=1), PydanticModel(x=1)), (PydanticModel, {'x': 1}, PydanticModel(x=1)), (SomeTypedDict, {'x': 1}, {'x': 1}), (SomeNamedTuple, SomeNamedTuple(x=1), SomeNamedTuple(x=1)), (List[str], ['1', '2'], ['1', '2']), (Tuple[str], ('1',), ('1',)), (Tuple[str, int], ('1', 1), ('1', 1)), (Tuple[str, ...], ('1',), ('1',)), (Dict[str, int], {'foo': 123}, {'foo': 123}), (Union[int, str], 1, 1), (Union[int, str], '2', '2'), (GenericPydanticModel[int], {'x': [[1]]}, GenericPydanticModel[int](x=[[1]])), (GenericPydanticModel[int], {'x': [['1']]}, GenericPydanticModel[int](x=[[1]])), (NestedList[int], [[1]], [[1]]), (NestedList[int], [['1']], [[1]]), ], ) def test_types(tp: Any, val: Any, expected: Any): v = TypeAdapter(tp).validate_python assert expected == v(val) IntList = List[int] OuterDict = Dict[str, 'IntList'] @pytest.mark.parametrize('defer_build', [False, True]) @pytest.mark.parametrize('method', ['validate', 'serialize', 'json_schema', 'json_schemas']) def test_global_namespace_variables(defer_build: bool, method: str, generate_schema_calls): config = ConfigDict(defer_build=True) if defer_build else None ta = TypeAdapter(OuterDict, config=config) assert generate_schema_calls.count == (0 if defer_build else 1), 'Should be built deferred' if method 
== 'validate': assert ta.validate_python({'foo': [1, '2']}) == {'foo': [1, 2]} elif method == 'serialize': assert ta.dump_python({'foo': [1, 2]}) == {'foo': [1, 2]} elif method == 'json_schema': assert ta.json_schema()['type'] == 'object' else: assert method == 'json_schemas' schemas, _ = TypeAdapter.json_schemas([(OuterDict, 'validation', ta)]) assert schemas[(OuterDict, 'validation')]['type'] == 'object' @pytest.mark.parametrize('defer_build', [False, True]) @pytest.mark.parametrize('method', ['validate', 'serialize', 'json_schema', 'json_schemas']) def test_model_global_namespace_variables(defer_build: bool, method: str, generate_schema_calls): class MyModel(BaseModel): model_config = ConfigDict(defer_build=defer_build) x: OuterDict ta = TypeAdapter(MyModel) assert generate_schema_calls.count == (0 if defer_build else 1), 'Should be built deferred' if method == 'validate': assert ta.validate_python({'x': {'foo': [1, '2']}}) == MyModel(x={'foo': [1, 2]}) elif method == 'serialize': assert ta.dump_python(MyModel(x={'foo': [1, 2]})) == {'x': {'foo': [1, 2]}} elif method == 'json_schema': assert ta.json_schema()['title'] == 'MyModel' else: assert method == 'json_schemas' _, json_schema = TypeAdapter.json_schemas([(MyModel, 'validation', TypeAdapter(MyModel))]) assert 'MyModel' in json_schema['$defs'] @pytest.mark.parametrize('defer_build', [False, True]) @pytest.mark.parametrize('method', ['validate', 'serialize', 'json_schema', 'json_schemas']) def test_local_namespace_variables(defer_build: bool, method: str, generate_schema_calls): IntList = List[int] # noqa: F841 OuterDict = Dict[str, 'IntList'] config = ConfigDict(defer_build=True) if defer_build else None ta = TypeAdapter(OuterDict, config=config) assert generate_schema_calls.count == (0 if defer_build else 1), 'Should be built deferred' if method == 'validate': assert ta.validate_python({'foo': [1, '2']}) == {'foo': [1, 2]} elif method == 'serialize': assert ta.dump_python({'foo': [1, 2]}) == {'foo': [1, 2]} 
elif method == 'json_schema': assert ta.json_schema()['type'] == 'object' else: assert method == 'json_schemas' schemas, _ = TypeAdapter.json_schemas([(OuterDict, 'validation', ta)]) assert schemas[(OuterDict, 'validation')]['type'] == 'object' @pytest.mark.parametrize('defer_build', [False, True]) @pytest.mark.parametrize('method', ['validate', 'serialize', 'json_schema', 'json_schemas']) def test_model_local_namespace_variables(defer_build: bool, method: str, generate_schema_calls): IntList = List[int] # noqa: F841 class MyModel(BaseModel): model_config = ConfigDict(defer_build=defer_build) x: Dict[str, 'IntList'] ta = TypeAdapter(MyModel) assert generate_schema_calls.count == (0 if defer_build else 1), 'Should be built deferred' if method == 'validate': assert ta.validate_python({'x': {'foo': [1, '2']}}) == MyModel(x={'foo': [1, 2]}) elif method == 'serialize': assert ta.dump_python(MyModel(x={'foo': [1, 2]})) == {'x': {'foo': [1, 2]}} elif method == 'json_schema': assert ta.json_schema()['title'] == 'MyModel' else: assert method == 'json_schemas' _, json_schema = TypeAdapter.json_schemas([(MyModel, 'validation', ta)]) assert 'MyModel' in json_schema['$defs'] @pytest.mark.parametrize('defer_build', [False, True]) @pytest.mark.parametrize('method', ['validate', 'serialize', 'json_schema', 'json_schemas']) @pytest.mark.skipif(sys.version_info < (3, 9), reason="ForwardRef doesn't accept module as a parameter in Python < 3.9") def test_top_level_fwd_ref(defer_build: bool, method: str, generate_schema_calls): config = ConfigDict(defer_build=True) if defer_build else None FwdRef = ForwardRef('OuterDict', module=__name__) ta = TypeAdapter(FwdRef, config=config) assert generate_schema_calls.count == (0 if defer_build else 1), 'Should be built deferred' if method == 'validate': assert ta.validate_python({'foo': [1, '2']}) == {'foo': [1, 2]} elif method == 'serialize': assert ta.dump_python({'foo': [1, 2]}) == {'foo': [1, 2]} elif method == 'json_schema': assert 
ta.json_schema()['type'] == 'object' else: assert method == 'json_schemas' schemas, _ = TypeAdapter.json_schemas([(FwdRef, 'validation', ta)]) assert schemas[(FwdRef, 'validation')]['type'] == 'object' MyUnion: TypeAlias = 'Union[str, int]' def test_type_alias(): MyList = List[MyUnion] v = TypeAdapter(MyList).validate_python res = v([1, '2']) assert res == [1, '2'] def test_validate_python_strict() -> None: class Model(TypedDict): x: int class ModelStrict(Model): __pydantic_config__ = ConfigDict(strict=True) # type: ignore lax_validator = TypeAdapter(Model) strict_validator = TypeAdapter(ModelStrict) assert lax_validator.validate_python({'x': '1'}, strict=None) == Model(x=1) assert lax_validator.validate_python({'x': '1'}, strict=False) == Model(x=1) with pytest.raises(ValidationError) as exc_info: lax_validator.validate_python({'x': '1'}, strict=True) assert exc_info.value.errors(include_url=False) == [ {'type': 'int_type', 'loc': ('x',), 'msg': 'Input should be a valid integer', 'input': '1'} ] with pytest.raises(ValidationError) as exc_info: strict_validator.validate_python({'x': '1'}) assert exc_info.value.errors(include_url=False) == [ {'type': 'int_type', 'loc': ('x',), 'msg': 'Input should be a valid integer', 'input': '1'} ] assert strict_validator.validate_python({'x': '1'}, strict=False) == Model(x=1) with pytest.raises(ValidationError) as exc_info: strict_validator.validate_python({'x': '1'}, strict=True) assert exc_info.value.errors(include_url=False) == [ {'type': 'int_type', 'loc': ('x',), 'msg': 'Input should be a valid integer', 'input': '1'} ] @pytest.mark.xfail(reason='Need to fix this in https://github.com/pydantic/pydantic/pull/5944') def test_validate_json_strict() -> None: class Model(TypedDict): x: int class ModelStrict(Model): __pydantic_config__ = ConfigDict(strict=True) # type: ignore lax_validator = TypeAdapter(Model, config=ConfigDict(strict=False)) strict_validator = TypeAdapter(ModelStrict) assert 
lax_validator.validate_json(json.dumps({'x': '1'}), strict=None) == Model(x=1) assert lax_validator.validate_json(json.dumps({'x': '1'}), strict=False) == Model(x=1) with pytest.raises(ValidationError) as exc_info: lax_validator.validate_json(json.dumps({'x': '1'}), strict=True) assert exc_info.value.errors(include_url=False) == [ {'type': 'int_type', 'loc': ('x',), 'msg': 'Input should be a valid integer', 'input': '1'} ] with pytest.raises(ValidationError) as exc_info: strict_validator.validate_json(json.dumps({'x': '1'}), strict=None) assert exc_info.value.errors(include_url=False) == [ {'type': 'int_type', 'loc': ('x',), 'msg': 'Input should be a valid integer', 'input': '1'} ] assert strict_validator.validate_json(json.dumps({'x': '1'}), strict=False) == Model(x=1) with pytest.raises(ValidationError) as exc_info: strict_validator.validate_json(json.dumps({'x': '1'}), strict=True) assert exc_info.value.errors(include_url=False) == [ {'type': 'int_type', 'loc': ('x',), 'msg': 'Input should be a valid integer', 'input': '1'} ] def test_validate_python_context() -> None: contexts: List[Any] = [None, None, {'foo': 'bar'}] class Model(BaseModel): x: int @field_validator('x') def val_x(cls, v: int, info: ValidationInfo) -> int: assert info.context == contexts.pop(0) return v validator = TypeAdapter(Model) validator.validate_python({'x': 1}) validator.validate_python({'x': 1}, context=None) validator.validate_python({'x': 1}, context={'foo': 'bar'}) assert contexts == [] def test_validate_json_context() -> None: contexts: List[Any] = [None, None, {'foo': 'bar'}] class Model(BaseModel): x: int @field_validator('x') def val_x(cls, v: int, info: ValidationInfo) -> int: assert info.context == contexts.pop(0) return v validator = TypeAdapter(Model) validator.validate_json(json.dumps({'x': 1})) validator.validate_json(json.dumps({'x': 1}), context=None) validator.validate_json(json.dumps({'x': 1}), context={'foo': 'bar'}) assert contexts == [] def 
test_validate_python_from_attributes() -> None: class Model(BaseModel): x: int class ModelFromAttributesTrue(Model): model_config = ConfigDict(from_attributes=True) class ModelFromAttributesFalse(Model): model_config = ConfigDict(from_attributes=False) @dataclass class UnrelatedClass: x: int = 1 input = UnrelatedClass(1) ta = TypeAdapter(Model) for from_attributes in (False, None): with pytest.raises(ValidationError) as exc_info: ta.validate_python(UnrelatedClass(), from_attributes=from_attributes) assert exc_info.value.errors(include_url=False) == [ { 'type': 'model_type', 'loc': (), 'msg': 'Input should be a valid dictionary or instance of Model', 'input': input, 'ctx': {'class_name': 'Model'}, } ] res = ta.validate_python(UnrelatedClass(), from_attributes=True) assert res == Model(x=1) ta = TypeAdapter(ModelFromAttributesTrue) with pytest.raises(ValidationError) as exc_info: ta.validate_python(UnrelatedClass(), from_attributes=False) assert exc_info.value.errors(include_url=False) == [ { 'type': 'model_type', 'loc': (), 'msg': 'Input should be a valid dictionary or instance of ModelFromAttributesTrue', 'input': input, 'ctx': {'class_name': 'ModelFromAttributesTrue'}, } ] for from_attributes in (True, None): res = ta.validate_python(UnrelatedClass(), from_attributes=from_attributes) assert res == ModelFromAttributesTrue(x=1) ta = TypeAdapter(ModelFromAttributesFalse) for from_attributes in (False, None): with pytest.raises(ValidationError) as exc_info: ta.validate_python(UnrelatedClass(), from_attributes=from_attributes) assert exc_info.value.errors(include_url=False) == [ { 'type': 'model_type', 'loc': (), 'msg': 'Input should be a valid dictionary or instance of ModelFromAttributesFalse', 'input': input, 'ctx': {'class_name': 'ModelFromAttributesFalse'}, } ] res = ta.validate_python(UnrelatedClass(), from_attributes=True) assert res == ModelFromAttributesFalse(x=1) @pytest.mark.parametrize( 'field_type,input_value,expected,raises_match,strict', [ (bool, 'true', 
True, None, False), (bool, 'true', True, None, True), (bool, 'false', False, None, False), (bool, 'e', ValidationError, 'type=bool_parsing', False), (int, '1', 1, None, False), (int, '1', 1, None, True), (int, 'xxx', ValidationError, 'type=int_parsing', True), (float, '1.1', 1.1, None, False), (float, '1.10', 1.1, None, False), (float, '1.1', 1.1, None, True), (float, '1.10', 1.1, None, True), (date, '2017-01-01', date(2017, 1, 1), None, False), (date, '2017-01-01', date(2017, 1, 1), None, True), (date, '2017-01-01T12:13:14.567', ValidationError, 'type=date_from_datetime_inexact', False), (date, '2017-01-01T12:13:14.567', ValidationError, 'type=date_parsing', True), (date, '2017-01-01T00:00:00', date(2017, 1, 1), None, False), (date, '2017-01-01T00:00:00', ValidationError, 'type=date_parsing', True), (datetime, '2017-01-01T12:13:14.567', datetime(2017, 1, 1, 12, 13, 14, 567_000), None, False), (datetime, '2017-01-01T12:13:14.567', datetime(2017, 1, 1, 12, 13, 14, 567_000), None, True), ], ids=repr, ) @pytest.mark.parametrize('defer_build', [False, True]) def test_validate_strings( field_type, input_value, expected, raises_match, strict, defer_build: bool, generate_schema_calls ): config = ConfigDict(defer_build=True) if defer_build else None ta = TypeAdapter(field_type, config=config) assert generate_schema_calls.count == (0 if defer_build else 1), 'Should be built deferred' if raises_match is not None: with pytest.raises(expected, match=raises_match): ta.validate_strings(input_value, strict=strict) else: assert ta.validate_strings(input_value, strict=strict) == expected assert generate_schema_calls.count == 1, 'Should not build duplicates' @pytest.mark.parametrize('strict', [True, False]) def test_validate_strings_dict(strict): assert TypeAdapter(Dict[int, date]).validate_strings({'1': '2017-01-01', '2': '2017-01-02'}, strict=strict) == { 1: date(2017, 1, 1), 2: date(2017, 1, 2), } def test_annotated_type_disallows_config() -> None: class Model(BaseModel): x: int 
with pytest.raises(PydanticUserError, match='Cannot use `config`'): TypeAdapter(Annotated[Model, ...], config=ConfigDict(strict=False)) def test_ta_config_with_annotated_type() -> None: class TestValidator(BaseModel): x: str model_config = ConfigDict(str_to_lower=True) assert TestValidator(x='ABC').x == 'abc' assert TypeAdapter(TestValidator).validate_python({'x': 'ABC'}).x == 'abc' assert TypeAdapter(Annotated[TestValidator, ...]).validate_python({'x': 'ABC'}).x == 'abc' class TestSerializer(BaseModel): some_bytes: bytes model_config = ConfigDict(ser_json_bytes='base64') result = TestSerializer(some_bytes=b'\xaa') assert result.model_dump(mode='json') == {'some_bytes': 'qg=='} assert TypeAdapter(TestSerializer).dump_python(result, mode='json') == {'some_bytes': 'qg=='} # cases where SchemaSerializer is constructed within TypeAdapter's __init__ assert TypeAdapter(Annotated[TestSerializer, ...]).dump_python(result, mode='json') == {'some_bytes': 'qg=='} assert TypeAdapter(Annotated[List[TestSerializer], ...]).dump_python([result], mode='json') == [ {'some_bytes': 'qg=='} ] def test_eval_type_backport(): v = TypeAdapter('list[int | str]').validate_python assert v([1, '2']) == [1, '2'] with pytest.raises(ValidationError) as exc_info: v([{'not a str or int'}]) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_type', 'loc': (0, 'int'), 'msg': 'Input should be a valid integer', 'input': {'not a str or int'}, }, { 'type': 'string_type', 'loc': (0, 'str'), 'msg': 'Input should be a valid string', 'input': {'not a str or int'}, }, ] with pytest.raises(ValidationError) as exc_info: v('not a list') # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ {'type': 'list_type', 'loc': (), 'msg': 'Input should be a valid list', 'input': 'not a list'} ] def defer_build_test_models(config: ConfigDict) -> List[Any]: class Model(BaseModel): model_config = 
config x: int class SubModel(Model): y: Optional[int] = None @pydantic_dataclass(config=config) class DataClassModel: x: int @pydantic_dataclass class SubDataClassModel(DataClassModel): y: Optional[int] = None class TypedDictModel(TypedDict): __pydantic_config__ = config # type: ignore x: int models = [ Model, SubModel, create_model('DynamicModel', __base__=Model), create_model('DynamicSubModel', __base__=SubModel), DataClassModel, SubDataClassModel, TypedDictModel, Dict[str, int], ] return [ *models, # FastAPI heavily uses Annotated so test that as well *[Annotated[model, Field(title='abc')] for model in models], ] CONFIGS = [ ConfigDict(defer_build=False), ConfigDict(defer_build=True), ] MODELS_CONFIGS: List[Tuple[Any, ConfigDict]] = [ (model, config) for config in CONFIGS for model in defer_build_test_models(config) ] @pytest.mark.parametrize('model, config', MODELS_CONFIGS) @pytest.mark.parametrize('method', ['schema', 'validate', 'dump']) def test_core_schema_respects_defer_build(model: Any, config: ConfigDict, method: str, generate_schema_calls) -> None: type_ = annotated_type(model) or model dumped = dict(x=1) if 'Dict[' in str(type_) else type_(x=1) generate_schema_calls.reset() type_adapter = TypeAdapter(model) if _type_has_config(model) else TypeAdapter(model, config=config) if config.get('defer_build'): assert generate_schema_calls.count == 0, 'Should be built deferred' assert isinstance(type_adapter.core_schema, _mock_val_ser.MockCoreSchema), 'Should be initialized deferred' assert isinstance(type_adapter.validator, _mock_val_ser.MockValSer), 'Should be initialized deferred' assert isinstance(type_adapter.serializer, _mock_val_ser.MockValSer), 'Should be initialized deferred' else: built_inside_type_adapter = 'Dict' in str(model) or 'Annotated' in str(model) assert generate_schema_calls.count == (1 if built_inside_type_adapter else 0), f'Should be built ({model})' assert not isinstance( type_adapter.core_schema, _mock_val_ser.MockCoreSchema ), 'Should 
be initialized before usage'
        assert not isinstance(type_adapter.validator, _mock_val_ser.MockValSer), 'Should be initialized before usage'
        assert not isinstance(type_adapter.serializer, _mock_val_ser.MockValSer), 'Should be initialized before usage'

    if method == 'schema':
        json_schema = type_adapter.json_schema()  # Use it
        assert "'type': 'integer'" in str(json_schema)  # Sanity check
        # Do not check generate_schema_calls count here as the json_schema generation uses generate schema internally
        # assert generate_schema_calls.count < 2, 'Should not build duplicates'
    elif method == 'validate':
        validated = type_adapter.validate_python({'x': 1})  # Use it
        assert (validated['x'] if isinstance(validated, dict) else validated.x) == 1  # Sanity check
        assert generate_schema_calls.count < 2, 'Should not build duplicates'
    else:
        assert method == 'dump'
        raw = type_adapter.dump_json(dumped)  # Use it
        assert json.loads(raw.decode())['x'] == 1  # Sanity check
        assert generate_schema_calls.count < 2, 'Should not build duplicates'

    assert not isinstance(
        type_adapter.core_schema, _mock_val_ser.MockCoreSchema
    ), 'Should be initialized after the usage'
    assert not isinstance(type_adapter.validator, _mock_val_ser.MockValSer), 'Should be initialized after the usage'
    assert not isinstance(type_adapter.serializer, _mock_val_ser.MockValSer), 'Should be initialized after the usage'


def test_defer_build_raise_errors() -> None:
    ta = TypeAdapter('MyInt', config=ConfigDict(defer_build=True))  # pyright: ignore[reportUndefinedVariable]
    assert isinstance(ta.core_schema, _mock_val_ser.MockCoreSchema)

    with pytest.raises(PydanticUndefinedAnnotation):
        # `True` is the `raise_errors` default for the `rebuild` method, but we include here for clarity
        ta.rebuild(raise_errors=True)

    ta.rebuild(raise_errors=False)
    assert isinstance(ta.core_schema, _mock_val_ser.MockCoreSchema)

    MyInt = int  # noqa: F841

    ta.rebuild(raise_errors=True)
    assert not isinstance(ta.core_schema, _mock_val_ser.MockCoreSchema)


@dataclass
class SimpleDataclass:
    x: int


@pytest.mark.parametrize('type_,repr_', [(int, 'int'), (List[int], 'List[int]'), (SimpleDataclass, 'SimpleDataclass')])
def test_ta_repr(type_: Any, repr_: str) -> None:
    ta = TypeAdapter(type_)
    assert repr(ta) == f'TypeAdapter({repr_})'


def test_correct_frame_used_parametrized(create_module) -> None:
    """https://github.com/pydantic/pydantic/issues/10892"""

    @create_module
    def module_1() -> None:
        from pydantic import TypeAdapter

        Any = int  # noqa: F841

        # 'Any' should resolve to `int`, not `typing.Any`:
        ta = TypeAdapter[int]('Any')  # noqa: F841

    with pytest.raises(ValidationError):
        module_1.ta.validate_python('a')

pydantic-2.10.6/tests/test_type_alias_type.py

import datetime
from dataclasses import dataclass
from typing import Dict, Generic, List, Sequence, Tuple, TypeVar, Union

import pytest
from annotated_types import MaxLen
from typing_extensions import Annotated, Literal, TypeAliasType

from pydantic import BaseModel, Field, PydanticUserError, TypeAdapter, ValidationError

T = TypeVar('T')

JsonType = TypeAliasType('JsonType', Union[List['JsonType'], Dict[str, 'JsonType'], str, int, float, bool, None])
RecursiveGenericAlias = TypeAliasType(
    'RecursiveGenericAlias', List[Union['RecursiveGenericAlias[T]', T]], type_params=(T,)
)
MyList = TypeAliasType('MyList', List[T], type_params=(T,))
# try mixing with implicit type aliases
ShortMyList = Annotated[MyList[T], MaxLen(1)]
ShortRecursiveGenericAlias = Annotated[RecursiveGenericAlias[T], MaxLen(1)]


def test_type_alias() -> None:
    t = TypeAdapter(MyList[int])

    assert t.validate_python(['1', '2']) == [1, 2]

    with pytest.raises(ValidationError) as exc_info:
        t.validate_python(['a'])
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_parsing',
            'loc': (0,),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'a',
        }
    ]

    assert t.json_schema() == {'type': 'array', 'items': {'type':
'integer'}} def test_recursive_type_alias() -> None: t = TypeAdapter(JsonType) assert t.validate_python({'a': [True, [{'b': None}]]}) == {'a': [True, [{'b': None}]]} with pytest.raises(ValidationError) as exc_info: t.validate_python({'a': datetime.date(year=1992, month=12, day=11)}) assert exc_info.value.errors(include_url=False) == [ { 'type': 'list_type', 'loc': ('list[nullable[union[list[...],dict[str,...],str,int,float,bool]]]',), 'msg': 'Input should be a valid list', 'input': {'a': datetime.date(1992, 12, 11)}, }, { 'type': 'list_type', 'loc': ('dict[str,...]', 'a', 'list[nullable[union[list[...],dict[str,...],str,int,float,bool]]]'), 'msg': 'Input should be a valid list', 'input': datetime.date(1992, 12, 11), }, { 'type': 'dict_type', 'loc': ('dict[str,...]', 'a', 'dict[str,...]'), 'msg': 'Input should be a valid dictionary', 'input': datetime.date(1992, 12, 11), }, { 'type': 'string_type', 'loc': ('dict[str,...]', 'a', 'str'), 'msg': 'Input should be a valid string', 'input': datetime.date(1992, 12, 11), }, { 'type': 'int_type', 'loc': ('dict[str,...]', 'a', 'int'), 'msg': 'Input should be a valid integer', 'input': datetime.date(1992, 12, 11), }, { 'type': 'float_type', 'loc': ('dict[str,...]', 'a', 'float'), 'msg': 'Input should be a valid number', 'input': datetime.date(1992, 12, 11), }, { 'type': 'bool_type', 'loc': ('dict[str,...]', 'a', 'bool'), 'msg': 'Input should be a valid boolean', 'input': datetime.date(1992, 12, 11), }, { 'type': 'string_type', 'loc': ('str',), 'msg': 'Input should be a valid string', 'input': {'a': datetime.date(1992, 12, 11)}, }, { 'type': 'int_type', 'loc': ('int',), 'msg': 'Input should be a valid integer', 'input': {'a': datetime.date(1992, 12, 11)}, }, { 'type': 'float_type', 'loc': ('float',), 'msg': 'Input should be a valid number', 'input': {'a': datetime.date(1992, 12, 11)}, }, { 'type': 'bool_type', 'loc': ('bool',), 'msg': 'Input should be a valid boolean', 'input': {'a': datetime.date(1992, 12, 11)}, }, ] assert 
t.json_schema() == { '$ref': '#/$defs/JsonType', '$defs': { 'JsonType': { 'anyOf': [ {'type': 'array', 'items': {'$ref': '#/$defs/JsonType'}}, {'type': 'object', 'additionalProperties': {'$ref': '#/$defs/JsonType'}}, {'type': 'string'}, {'type': 'integer'}, {'type': 'number'}, {'type': 'boolean'}, {'type': 'null'}, ] } }, } def test_recursive_type_alias_name(): T = TypeVar('T') @dataclass class MyGeneric(Generic[T]): field: T MyRecursiveType = TypeAliasType('MyRecursiveType', Union[MyGeneric['MyRecursiveType'], int]) json_schema = TypeAdapter(MyRecursiveType).json_schema() assert sorted(json_schema['$defs'].keys()) == ['MyGeneric_MyRecursiveType_', 'MyRecursiveType'] def test_type_alias_annotated() -> None: t = TypeAdapter(ShortMyList[int]) assert t.validate_python(['1']) == [1] with pytest.raises(ValidationError) as exc_info: t.validate_python([1, 2]) assert exc_info.value.errors(include_url=False) == [ { 'type': 'too_long', 'loc': (), 'msg': 'List should have at most 1 item after validation, not 2', 'input': [1, 2], 'ctx': {'field_type': 'List', 'max_length': 1, 'actual_length': 2}, } ] assert t.json_schema() == {'type': 'array', 'items': {'type': 'integer'}, 'maxItems': 1} def test_type_alias_annotated_defs() -> None: # force use of refs by referencing the schema in multiple places t = TypeAdapter(Tuple[ShortMyList[int], ShortMyList[int]]) assert t.validate_python((['1'], ['2'])) == ([1], [2]) with pytest.raises(ValidationError) as exc_info: t.validate_python(([1, 2], [1, 2])) assert exc_info.value.errors(include_url=False) == [ { 'type': 'too_long', 'loc': (0,), 'msg': 'List should have at most 1 item after validation, not 2', 'input': [1, 2], 'ctx': {'field_type': 'List', 'max_length': 1, 'actual_length': 2}, }, { 'type': 'too_long', 'loc': (1,), 'msg': 'List should have at most 1 item after validation, not 2', 'input': [1, 2], 'ctx': {'field_type': 'List', 'max_length': 1, 'actual_length': 2}, }, ] assert t.json_schema() == { 'type': 'array', 'minItems': 2, 
'prefixItems': [ {'$ref': '#/$defs/MyList_int__MaxLen_max_length_1_'}, {'$ref': '#/$defs/MyList_int__MaxLen_max_length_1_'}, ], 'maxItems': 2, '$defs': {'MyList_int__MaxLen_max_length_1_': {'type': 'array', 'items': {'type': 'integer'}, 'maxItems': 1}}, } def test_recursive_generic_type_alias() -> None: t = TypeAdapter(RecursiveGenericAlias[int]) assert t.validate_python([[['1']]]) == [[[1]]] with pytest.raises(ValidationError) as exc_info: t.validate_python([[['a']]]) assert exc_info.value.errors(include_url=False) == [ { 'type': 'list_type', 'loc': (0, 'list[union[...,int]]', 0, 'list[union[...,int]]', 0, 'list[union[...,int]]'), 'msg': 'Input should be a valid list', 'input': 'a', }, { 'type': 'int_parsing', 'loc': (0, 'list[union[...,int]]', 0, 'list[union[...,int]]', 0, 'int'), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'a', }, { 'type': 'int_type', 'loc': (0, 'list[union[...,int]]', 0, 'int'), 'msg': 'Input should be a valid integer', 'input': ['a'], }, {'type': 'int_type', 'loc': (0, 'int'), 'msg': 'Input should be a valid integer', 'input': [['a']]}, ] assert t.json_schema() == { '$ref': '#/$defs/RecursiveGenericAlias_int_', '$defs': { 'RecursiveGenericAlias_int_': { 'type': 'array', 'items': {'anyOf': [{'$ref': '#/$defs/RecursiveGenericAlias_int_'}, {'type': 'integer'}]}, } }, } def test_recursive_generic_type_alias_annotated() -> None: t = TypeAdapter(ShortRecursiveGenericAlias[int]) assert t.validate_python([[]]) == [[]] with pytest.raises(ValidationError) as exc_info: t.validate_python([[], []]) assert exc_info.value.errors(include_url=False) == [ { 'type': 'too_long', 'loc': (), 'msg': 'List should have at most 1 item after validation, not 2', 'input': [[], []], 'ctx': {'field_type': 'List', 'max_length': 1, 'actual_length': 2}, } ] # insert_assert(t.json_schema()) assert t.json_schema() == { 'type': 'array', 'items': {'anyOf': [{'$ref': '#/$defs/RecursiveGenericAlias_int_'}, {'type': 'integer'}]}, 
'maxItems': 1, '$defs': { 'RecursiveGenericAlias_int_': { 'type': 'array', 'items': {'anyOf': [{'$ref': '#/$defs/RecursiveGenericAlias_int_'}, {'type': 'integer'}]}, } }, } def test_recursive_generic_type_alias_annotated_defs() -> None: # force use of refs by referencing the schema in multiple places t = TypeAdapter(Tuple[ShortRecursiveGenericAlias[int], ShortRecursiveGenericAlias[int]]) assert t.validate_python(([[]], [[]])) == ([[]], [[]]) with pytest.raises(ValidationError) as exc_info: t.validate_python(([[], []], [[]])) assert exc_info.value.errors(include_url=False) == [ { 'type': 'too_long', 'loc': (0,), 'msg': 'List should have at most 1 item after validation, not 2', 'input': [[], []], 'ctx': {'field_type': 'List', 'max_length': 1, 'actual_length': 2}, } ] # insert_assert(t.json_schema()) assert t.json_schema() == { 'type': 'array', 'minItems': 2, 'prefixItems': [ {'$ref': '#/$defs/RecursiveGenericAlias_int__MaxLen_max_length_1_'}, {'$ref': '#/$defs/RecursiveGenericAlias_int__MaxLen_max_length_1_'}, ], 'maxItems': 2, '$defs': { 'RecursiveGenericAlias_int_': { 'type': 'array', 'items': {'anyOf': [{'$ref': '#/$defs/RecursiveGenericAlias_int_'}, {'type': 'integer'}]}, }, 'RecursiveGenericAlias_int__MaxLen_max_length_1_': { 'type': 'array', 'items': {'anyOf': [{'$ref': '#/$defs/RecursiveGenericAlias_int_'}, {'type': 'integer'}]}, 'maxItems': 1, }, }, } @pytest.mark.xfail(reason='description is currently dropped') def test_field() -> None: SomeAlias = TypeAliasType('SomeAlias', Annotated[int, Field(description='number')]) ta = TypeAdapter(Annotated[SomeAlias, Field(title='abc')]) # insert_assert(ta.json_schema()) assert ta.json_schema() == { '$defs': {'SomeAlias': {'type': 'integer', 'description': 'number'}}, '$ref': '#/$defs/SomeAlias', 'title': 'abc', } def test_nested_generic_type_alias_type() -> None: class MyModel(BaseModel): field_1: MyList[bool] field_2: MyList[str] model = MyModel(field_1=[True], field_2=['abc']) assert model.model_json_schema() == { 
'$defs': { 'MyList_bool_': {'items': {'type': 'boolean'}, 'type': 'array'}, 'MyList_str_': {'items': {'type': 'string'}, 'type': 'array'}, }, 'properties': {'field_1': {'$ref': '#/$defs/MyList_bool_'}, 'field_2': {'$ref': '#/$defs/MyList_str_'}}, 'required': ['field_1', 'field_2'], 'title': 'MyModel', 'type': 'object', } def test_non_specified_generic_type_alias_type() -> None: assert TypeAdapter(MyList).json_schema() == {'items': {}, 'type': 'array'} def test_redefined_type_alias(): MyType = TypeAliasType('MyType', str) class MyInnerModel(BaseModel): x: MyType MyType = TypeAliasType('MyType', int) class MyOuterModel(BaseModel): inner: MyInnerModel y: MyType data = {'inner': {'x': 'hello'}, 'y': 1} assert MyOuterModel.model_validate(data).model_dump() == data def test_type_alias_to_type_with_ref(): class Div(BaseModel): type: Literal['Div'] = 'Div' components: List['AnyComponent'] AnyComponent = TypeAliasType('AnyComponent', Div) adapter = TypeAdapter(AnyComponent) adapter.validate_python({'type': 'Div', 'components': [{'type': 'Div', 'components': []}]}) with pytest.raises(ValidationError) as exc_info: adapter.validate_python({'type': 'Div', 'components': [{'type': 'NotDiv', 'components': []}]}) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'expected': "'Div'"}, 'input': 'NotDiv', 'loc': ('components', 0, 'type'), 'msg': "Input should be 'Div'", 'type': 'literal_error', } ] def test_intermediate_type_aliases() -> None: # https://github.com/pydantic/pydantic/issues/8984 MySeq = TypeAliasType('MySeq', Sequence[T], type_params=(T,)) MyIntSeq = TypeAliasType('MyIntSeq', MySeq[int]) class MyModel(BaseModel): my_int_seq: MyIntSeq assert MyModel(my_int_seq=range(1, 4)).my_int_seq == [1, 2, 3] assert MyModel.model_json_schema() == { '$defs': {'MySeq_int_': {'items': {'type': 'integer'}, 'type': 'array'}}, 'properties': {'my_int_seq': {'$ref': '#/$defs/MySeq_int_'}}, 'required': ['my_int_seq'], 'title': 'MyModel', 'type': 'object', } def 
test_intermediate_type_aliases_json_type() -> None:
    JSON = TypeAliasType('JSON', Union[str, int, bool, 'JSONSeq', 'JSONObj', None])
    JSONObj = TypeAliasType('JSONObj', Dict[str, JSON])
    JSONSeq = TypeAliasType('JSONSeq', List[JSON])
    MyJSONAlias1 = TypeAliasType('MyJSONAlias1', JSON)
    MyJSONAlias2 = TypeAliasType('MyJSONAlias2', MyJSONAlias1)
    JSONs = TypeAliasType('JSONs', List[MyJSONAlias2])

    adapter = TypeAdapter(JSONs)
    assert adapter.validate_python([{'a': 1}, 2, '3', [4, 5], True, None]) == [{'a': 1}, 2, '3', [4, 5], True, None]


def test_intermediate_type_aliases_chain() -> None:
    A = TypeAliasType('A', int)
    B = TypeAliasType('B', A)
    C = TypeAliasType('C', B)
    D = TypeAliasType('D', C)
    E = TypeAliasType('E', D)

    TypeAdapter(E)


def test_circular_type_aliases() -> None:
    A = TypeAliasType('A', 'C')
    B = TypeAliasType('B', A)
    C = TypeAliasType('C', B)

    with pytest.raises(PydanticUserError) as exc_info:

        class MyModel(BaseModel):
            a: C

    assert exc_info.value.code == 'circular-reference-schema'
    assert exc_info.value.message.startswith('tests.test_type_alias_type.C')

pydantic-2.10.6/tests/test_type_hints.py

"""
Test pydantic model type hints (annotations) and that they can be queried by
:py:meth:`typing.get_type_hints`.
"""

import inspect
import sys
from functools import lru_cache
from typing import (
    Any,
    Dict,
    Generic,
    Optional,
    Set,
    TypeVar,
)

import pytest
import typing_extensions

from pydantic import (
    BaseModel,
    RootModel,
)
from pydantic.dataclasses import dataclass

DEPRECATED_MODEL_MEMBERS = {
    'construct',
    'copy',
    'dict',
    'from_orm',
    'json',
    'json_schema',
    'parse_file',
    'parse_obj',
}

# Disable deprecation warnings, as we enumerate members that may be deprecated,
# i.e. pydantic.warnings.PydanticDeprecatedSince20: The `__fields__` attribute is deprecated,
# use `model_fields` instead.
# Additionally, only run these tests for 3.10+ pytestmark = [ pytest.mark.filterwarnings('ignore::DeprecationWarning'), pytest.mark.skipif(sys.version_info < (3, 10), reason='requires python3.10 or higher to work properly'), ] @pytest.fixture(name='ParentModel', scope='session') def parent_sub_model_fixture(): class UltraSimpleModel(BaseModel): a: float b: int = 10 class ParentModel(BaseModel): grape: bool banana: UltraSimpleModel return ParentModel @lru_cache def get_type_checking_only_ns(): """ When creating `BaseModel` in `pydantic.main`, some globals are imported only when `TYPE_CHECKING` is `True`, so we have to manually include them when calling `typing.get_type_hints`. """ from inspect import Signature from pydantic_core import CoreSchema, SchemaSerializer, SchemaValidator from pydantic.deprecated.parse import Protocol as DeprecatedParseProtocol from pydantic.fields import ComputedFieldInfo, FieldInfo, ModelPrivateAttr from pydantic.fields import PrivateAttr as _PrivateAttr return { 'Signature': Signature, 'CoreSchema': CoreSchema, 'SchemaSerializer': SchemaSerializer, 'SchemaValidator': SchemaValidator, 'DeprecatedParseProtocol': DeprecatedParseProtocol, 'ComputedFieldInfo': ComputedFieldInfo, 'FieldInfo': FieldInfo, 'ModelPrivateAttr': ModelPrivateAttr, '_PrivateAttr': _PrivateAttr, } def inspect_type_hints( obj_type, members: Optional[Set[str]] = None, exclude_members: Optional[Set[str]] = None, recursion_limit: int = 3 ): """ Test an object and its members to make sure type hints can be resolved. 
:param obj_type: Type to check :param members: Explicit set of members to check, None to check all :param exclude_members: Set of member names to exclude :param recursion_limit: Recursion limit (0 to disallow) """ try: hints = typing_extensions.get_type_hints(obj_type, localns=get_type_checking_only_ns()) assert isinstance(hints, dict), f'Type annotation(s) on {obj_type} are invalid' except NameError as ex: raise AssertionError(f'Type annotation(s) on {obj_type} are invalid: {str(ex)}') from ex if recursion_limit <= 0: return if isinstance(obj_type, type): # Check class members for member_name, member_obj in inspect.getmembers(obj_type): if member_name.startswith('_'): # Ignore private members continue if (members and member_name not in members) or (exclude_members and member_name in exclude_members): continue if inspect.isclass(member_obj) or inspect.isfunction(member_obj): # Inspect all child members (can't exclude specific ones) inspect_type_hints(member_obj, recursion_limit=recursion_limit - 1) @pytest.mark.parametrize( ('obj_type', 'members', 'exclude_members'), [ (BaseModel, None, DEPRECATED_MODEL_MEMBERS), (RootModel, None, DEPRECATED_MODEL_MEMBERS), ], ) def test_obj_type_hints(obj_type, members: Optional[Set[str]], exclude_members: Optional[Set[str]]): """ Test an object and its members to make sure type hints can be resolved. 
    :param obj_type: Type to check
    :param members: Explicit set of members to check, None to check all
    :param exclude_members: Set of member names to exclude
    """
    inspect_type_hints(obj_type, members, exclude_members)


def test_parent_sub_model(ParentModel):
    inspect_type_hints(ParentModel, None, DEPRECATED_MODEL_MEMBERS)


def test_root_model_as_field():
    class MyRootModel(RootModel[int]):
        pass

    class MyModel(BaseModel):
        root_model: MyRootModel

    inspect_type_hints(MyRootModel, None, DEPRECATED_MODEL_MEMBERS)
    inspect_type_hints(MyModel, None, DEPRECATED_MODEL_MEMBERS)


def test_generics():
    data_type = TypeVar('data_type')

    class Result(BaseModel, Generic[data_type]):
        data: data_type

    inspect_type_hints(Result, None, DEPRECATED_MODEL_MEMBERS)
    inspect_type_hints(Result[Dict[str, Any]], None, DEPRECATED_MODEL_MEMBERS)


def test_dataclasses():
    @dataclass
    class MyDataclass:
        a: int
        b: float

    inspect_type_hints(MyDataclass)

pydantic-2.10.6/tests/test_types.py

import collections
import ipaddress
import itertools
import json
import math
import os
import platform
import re
import sys
import typing
import uuid
from collections import OrderedDict, defaultdict, deque
from dataclasses import dataclass
from datetime import date, datetime, time, timedelta, timezone
from decimal import Decimal
from enum import Enum, IntEnum
from fractions import Fraction
from numbers import Number
from pathlib import Path
from typing import (
    Any,
    Callable,
    Counter,
    DefaultDict,
    Deque,
    Dict,
    FrozenSet,
    Iterable,
    List,
    NewType,
    Optional,
    Pattern,
    Sequence,
    Set,
    Tuple,
    TypeVar,
    Union,
)
from uuid import UUID

import annotated_types
import dirty_equals
import pytest
from dirty_equals import HasRepr, IsFloatNan, IsOneOf, IsStr
from pydantic_core import (
    CoreSchema,
    PydanticCustomError,
    SchemaError,
    core_schema,
)
from typing_extensions import Annotated, Literal, NotRequired, TypedDict, get_args

from pydantic import (
    UUID1,
    UUID3,
    UUID4,
UUID5, AfterValidator, AllowInfNan, AwareDatetime, Base64Bytes, Base64Str, Base64UrlBytes, Base64UrlStr, BaseModel, BeforeValidator, ByteSize, ConfigDict, DirectoryPath, EmailStr, FailFast, Field, FilePath, FiniteFloat, FutureDate, FutureDatetime, GetCoreSchemaHandler, GetPydanticSchema, ImportString, InstanceOf, Json, JsonValue, NaiveDatetime, NameEmail, NegativeFloat, NegativeInt, NewPath, NonNegativeFloat, NonNegativeInt, NonPositiveFloat, NonPositiveInt, OnErrorOmit, PastDate, PastDatetime, PlainSerializer, PositiveFloat, PositiveInt, PydanticInvalidForJsonSchema, PydanticSchemaGenerationError, Secret, SecretBytes, SecretStr, SerializeAsAny, SkipValidation, SocketPath, Strict, StrictBool, StrictBytes, StrictFloat, StrictInt, StrictStr, StringConstraints, Tag, TypeAdapter, ValidationError, conbytes, condate, condecimal, confloat, confrozenset, conint, conlist, conset, constr, field_serializer, field_validator, validate_call, ) from pydantic.dataclasses import dataclass as pydantic_dataclass try: import email_validator except ImportError: email_validator = None # TODO add back tests for Iterator @pytest.fixture(scope='session', name='ConBytesModel') def con_bytes_model_fixture(): class ConBytesModel(BaseModel): v: conbytes(max_length=10) = b'foobar' return ConBytesModel def test_constrained_bytes_good(ConBytesModel): m = ConBytesModel(v=b'short') assert m.v == b'short' def test_constrained_bytes_default(ConBytesModel): m = ConBytesModel() assert m.v == b'foobar' def test_strict_raw_type(): class Model(BaseModel): v: Annotated[str, Strict] assert Model(v='foo').v == 'foo' with pytest.raises(ValidationError, match=r'Input should be a valid string \[type=string_type,'): Model(v=b'fo') @pytest.mark.parametrize( ('data', 'valid'), [(b'this is too long', False), ('⪶⓲⽷01'.encode(), False), (b'not long90', True), ('⪶⓲⽷0'.encode(), True)], ) def test_constrained_bytes_too_long(ConBytesModel, data: bytes, valid: bool): if valid: assert ConBytesModel(v=data).model_dump() == 
{'v': data} else: with pytest.raises(ValidationError) as exc_info: ConBytesModel(v=data) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'max_length': 10}, 'input': data, 'loc': ('v',), 'msg': 'Data should have at most 10 bytes', 'type': 'bytes_too_long', } ] def test_constrained_bytes_strict_true(): class Model(BaseModel): v: conbytes(strict=True) assert Model(v=b'foobar').v == b'foobar' with pytest.raises(ValidationError): Model(v=bytearray('foobar', 'utf-8')) with pytest.raises(ValidationError): Model(v='foostring') with pytest.raises(ValidationError): Model(v=42) with pytest.raises(ValidationError): Model(v=0.42) def test_constrained_bytes_strict_false(): class Model(BaseModel): v: conbytes(strict=False) assert Model(v=b'foobar').v == b'foobar' assert Model(v=bytearray('foobar', 'utf-8')).v == b'foobar' assert Model(v='foostring').v == b'foostring' with pytest.raises(ValidationError): Model(v=42) with pytest.raises(ValidationError): Model(v=0.42) def test_constrained_bytes_strict_default(): class Model(BaseModel): v: conbytes() assert Model(v=b'foobar').v == b'foobar' assert Model(v=bytearray('foobar', 'utf-8')).v == b'foobar' assert Model(v='foostring').v == b'foostring' with pytest.raises(ValidationError): Model(v=42) with pytest.raises(ValidationError): Model(v=0.42) def test_constrained_list_good(): class ConListModelMax(BaseModel): v: conlist(int) = [] m = ConListModelMax(v=[1, 2, 3]) assert m.v == [1, 2, 3] def test_constrained_list_default(): class ConListModelMax(BaseModel): v: conlist(int) = [] m = ConListModelMax() assert m.v == [] def test_constrained_list_too_long(): class ConListModelMax(BaseModel): v: conlist(int, max_length=10) = [] with pytest.raises(ValidationError) as exc_info: ConListModelMax(v=list(str(i) for i in range(11))) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'too_long', 'loc': 
            ('v',),
            'msg': 'List should have at most 10 items after validation, not 11',
            'input': ['0', '1', '2', '3', '4', '5', '6', '7', '8', '9', '10'],
            'ctx': {'field_type': 'List', 'max_length': 10, 'actual_length': 11},
        }
    ]


def test_constrained_list_too_short():
    class ConListModelMin(BaseModel):
        v: conlist(int, min_length=1)

    with pytest.raises(ValidationError) as exc_info:
        ConListModelMin(v=[])
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'too_short',
            'loc': ('v',),
            'msg': 'List should have at least 1 item after validation, not 0',
            'input': [],
            'ctx': {'field_type': 'List', 'min_length': 1, 'actual_length': 0},
        }
    ]


def test_constrained_list_optional():
    class Model(BaseModel):
        req: Optional[conlist(str, min_length=1)]
        opt: Optional[conlist(str, min_length=1)] = None

    assert Model(req=None).model_dump() == {'req': None, 'opt': None}
    assert Model(req=None, opt=None).model_dump() == {'req': None, 'opt': None}

    with pytest.raises(ValidationError) as exc_info:
        Model(req=[], opt=[])
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'too_short',
            'loc': ('req',),
            'msg': 'List should have at least 1 item after validation, not 0',
            'input': [],
            'ctx': {'field_type': 'List', 'min_length': 1, 'actual_length': 0},
        },
        {
            'type': 'too_short',
            'loc': ('opt',),
            'msg': 'List should have at least 1 item after validation, not 0',
            'input': [],
            'ctx': {'field_type': 'List', 'min_length': 1, 'actual_length': 0},
        },
    ]

    assert Model(req=['a'], opt=['a']).model_dump() == {'req': ['a'], 'opt': ['a']}


def test_constrained_list_constraints():
    class ConListModelBoth(BaseModel):
        v: conlist(int, min_length=7, max_length=11)

    m = ConListModelBoth(v=list(range(7)))
    assert m.v == list(range(7))

    m = ConListModelBoth(v=list(range(11)))
    assert m.v == list(range(11))

    with pytest.raises(ValidationError) as exc_info:
        ConListModelBoth(v=list(range(6)))
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'too_short',
            'loc': ('v',),
            'msg': 'List should have at least 7 items after validation, not 6',
            'input': [0, 1, 2, 3, 4, 5],
            'ctx': {'field_type': 'List', 'min_length': 7, 'actual_length': 6},
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        ConListModelBoth(v=list(range(12)))
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'too_long',
            'loc': ('v',),
            'msg': 'List should have at most 11 items after validation, not 12',
            'input': [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11],
            'ctx': {'field_type': 'List', 'max_length': 11, 'actual_length': 12},
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        ConListModelBoth(v=1)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'list_type', 'loc': ('v',), 'msg': 'Input should be a valid list', 'input': 1}
    ]


def test_constrained_list_item_type_fails():
    class ConListModel(BaseModel):
        v: conlist(int) = []

    with pytest.raises(ValidationError) as exc_info:
        ConListModel(v=['a', 'b', 'c'])
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_parsing',
            'loc': ('v', 0),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'a',
        },
        {
            'type': 'int_parsing',
            'loc': ('v', 1),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'b',
        },
        {
            'type': 'int_parsing',
            'loc': ('v', 2),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'c',
        },
    ]


def test_conlist():
    class Model(BaseModel):
        foo: List[int] = Field(min_length=2, max_length=4)
        bar: conlist(str, min_length=1, max_length=4) = None

    assert Model(foo=[1, 2], bar=['spoon']).model_dump() == {'foo': [1, 2], 'bar': ['spoon']}

    msg = r'List should have at least 2 items after validation, not 1 \[type=too_short,'
    with pytest.raises(ValidationError, match=msg):
        Model(foo=[1])

    msg = r'List should have at most 4 items after validation, not 5 \[type=too_long,'
    with pytest.raises(ValidationError, match=msg):
        Model(foo=list(range(5)))

    with pytest.raises(ValidationError) as exc_info:
        Model(foo=[1, 'x', 'y'])
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_parsing',
            'loc': ('foo', 1),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'x',
        },
        {
            'type': 'int_parsing',
            'loc': ('foo', 2),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'y',
        },
    ]

    with pytest.raises(ValidationError) as exc_info:
        Model(foo=1)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'list_type', 'loc': ('foo',), 'msg': 'Input should be a valid list', 'input': 1}
    ]


def test_conlist_wrong_type_default():
    """It should not validate default value by default"""

    class Model(BaseModel):
        v: conlist(int) = 'a'

    m = Model()
    assert m.v == 'a'


def test_constrained_set_good():
    class Model(BaseModel):
        v: conset(int) = []

    m = Model(v=[1, 2, 3])
    assert m.v == {1, 2, 3}


def test_constrained_set_default():
    class Model(BaseModel):
        v: conset(int) = set()

    m = Model()
    assert m.v == set()


def test_constrained_set_default_invalid():
    class Model(BaseModel):
        v: conset(int) = 'not valid, not validated'

    m = Model()
    assert m.v == 'not valid, not validated'


def test_constrained_set_too_long():
    class ConSetModelMax(BaseModel):
        v: conset(int, max_length=10) = []

    with pytest.raises(ValidationError) as exc_info:
        ConSetModelMax(v={str(i) for i in range(11)})
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'too_long',
            'loc': ('v',),
            'msg': 'Set should have at most 10 items after validation, not more',
            'input': {'4', '3', '10', '9', '5', '6', '1', '8', '0', '7', '2'},
            'ctx': {'field_type': 'Set', 'max_length': 10, 'actual_length': None},
        }
    ]


def test_constrained_set_too_short():
    class ConSetModelMin(BaseModel):
        v: conset(int, min_length=1)

    with pytest.raises(ValidationError) as exc_info:
        ConSetModelMin(v=[])
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'too_short',
            'loc': ('v',),
            'msg': 'Set should have at least 1 item after validation, not 0',
            'input': [],
            'ctx': {'field_type': 'Set', 'min_length': 1, 'actual_length': 0},
        }
    ]


def test_constrained_set_optional():
    class Model(BaseModel):
        req: Optional[conset(str, min_length=1)]
        opt: Optional[conset(str, min_length=1)] = None

    assert Model(req=None).model_dump() == {'req': None, 'opt': None}
    assert Model(req=None, opt=None).model_dump() == {'req': None, 'opt': None}

    with pytest.raises(ValidationError) as exc_info:
        Model(req=set(), opt=set())
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'too_short',
            'loc': ('req',),
            'msg': 'Set should have at least 1 item after validation, not 0',
            'input': set(),
            'ctx': {'field_type': 'Set', 'min_length': 1, 'actual_length': 0},
        },
        {
            'type': 'too_short',
            'loc': ('opt',),
            'msg': 'Set should have at least 1 item after validation, not 0',
            'input': set(),
            'ctx': {'field_type': 'Set', 'min_length': 1, 'actual_length': 0},
        },
    ]

    assert Model(req={'a'}, opt={'a'}).model_dump() == {'req': {'a'}, 'opt': {'a'}}


def test_constrained_set_constraints():
    class ConSetModelBoth(BaseModel):
        v: conset(int, min_length=7, max_length=11)

    m = ConSetModelBoth(v=set(range(7)))
    assert m.v == set(range(7))

    m = ConSetModelBoth(v=set(range(11)))
    assert m.v == set(range(11))

    with pytest.raises(ValidationError) as exc_info:
        ConSetModelBoth(v=set(range(6)))
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'too_short',
            'loc': ('v',),
            'msg': 'Set should have at least 7 items after validation, not 6',
            'input': {0, 1, 2, 3, 4, 5},
            'ctx': {'field_type': 'Set', 'min_length': 7, 'actual_length': 6},
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        ConSetModelBoth(v=set(range(12)))
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'too_long',
            'loc': ('v',),
            'msg': 'Set should have at most 11 items after validation, not more',
            'input': {0, 8, 1, 9, 2, 10, 3, 7, 11, 4, 6, 5},
            'ctx': {'field_type': 'Set', 'max_length': 11, 'actual_length': None},
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        ConSetModelBoth(v=1)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'set_type', 'loc': ('v',), 'msg': 'Input should be a valid set', 'input': 1}
    ]


def test_constrained_set_item_type_fails():
    class ConSetModel(BaseModel):
        v: conset(int) = []

    with pytest.raises(ValidationError) as exc_info:
        ConSetModel(v=['a', 'b', 'c'])
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_parsing',
            'loc': ('v', 0),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'a',
        },
        {
            'type': 'int_parsing',
            'loc': ('v', 1),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'b',
        },
        {
            'type': 'int_parsing',
            'loc': ('v', 2),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'c',
        },
    ]


def test_conset():
    class Model(BaseModel):
        foo: Set[int] = Field(min_length=2, max_length=4)
        bar: conset(str, min_length=1, max_length=4) = None

    assert Model(foo=[1, 2], bar=['spoon']).model_dump() == {'foo': {1, 2}, 'bar': {'spoon'}}
    assert Model(foo=[1, 1, 1, 2, 2], bar=['spoon']).model_dump() == {'foo': {1, 2}, 'bar': {'spoon'}}

    with pytest.raises(ValidationError, match='Set should have at least 2 items after validation, not 1'):
        Model(foo=[1])

    with pytest.raises(ValidationError, match='Set should have at most 4 items after validation, not more'):
        Model(foo=list(range(5)))

    with pytest.raises(ValidationError) as exc_info:
        Model(foo=[1, 'x', 'y'])
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_parsing',
            'loc': ('foo', 1),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'x',
        },
        {
            'type': 'int_parsing',
            'loc': ('foo', 2),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'y',
        },
    ]

    with pytest.raises(ValidationError) as exc_info:
        Model(foo=1)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'set_type', 'loc': ('foo',), 'msg': 'Input should be a valid set', 'input': 1}
    ]


def test_conset_not_required():
    class Model(BaseModel):
        foo: Optional[Set[int]] = None

    assert Model(foo=None).foo is None
    assert Model().foo is None


def test_confrozenset():
    class Model(BaseModel):
        foo: FrozenSet[int] = Field(min_length=2, max_length=4)
        bar: confrozenset(str, min_length=1, max_length=4) = None

    m = Model(foo=[1, 2], bar=['spoon'])
    assert m.model_dump() == {'foo': {1, 2}, 'bar': {'spoon'}}
    assert isinstance(m.foo, frozenset)
    assert isinstance(m.bar, frozenset)
    assert Model(foo=[1, 1, 1, 2, 2], bar=['spoon']).model_dump() == {'foo': {1, 2}, 'bar': {'spoon'}}

    with pytest.raises(ValidationError, match='Frozenset should have at least 2 items after validation, not 1'):
        Model(foo=[1])

    with pytest.raises(ValidationError, match='Frozenset should have at most 4 items after validation, not more'):
        Model(foo=list(range(5)))

    with pytest.raises(ValidationError) as exc_info:
        Model(foo=[1, 'x', 'y'])
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_parsing',
            'loc': ('foo', 1),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'x',
        },
        {
            'type': 'int_parsing',
            'loc': ('foo', 2),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'y',
        },
    ]

    with pytest.raises(ValidationError) as exc_info:
        Model(foo=1)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'frozen_set_type', 'loc': ('foo',), 'msg': 'Input should be a valid frozenset', 'input': 1}
    ]


def test_confrozenset_not_required():
    class Model(BaseModel):
        foo: Optional[FrozenSet[int]] = None

    assert Model(foo=None).foo is None
    assert Model().foo is None


def test_constrained_frozenset_optional():
    class Model(BaseModel):
        req: Optional[confrozenset(str, min_length=1)]
        opt: Optional[confrozenset(str, min_length=1)] = None

    assert Model(req=None).model_dump() == {'req': None, 'opt': None}
    assert Model(req=None, opt=None).model_dump() == {'req': None, 'opt': None}

    with pytest.raises(ValidationError) as exc_info:
        Model(req=frozenset(), opt=frozenset())
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'too_short',
            'loc': ('req',),
            'msg': 'Frozenset should have at least 1 item after validation, not 0',
            'input': frozenset(),
            'ctx': {'field_type': 'Frozenset', 'min_length': 1, 'actual_length': 0},
        },
        {
            'type': 'too_short',
            'loc': ('opt',),
            'msg': 'Frozenset should have at least 1 item after validation, not 0',
            'input': frozenset(),
            'ctx': {'field_type': 'Frozenset', 'min_length': 1, 'actual_length': 0},
        },
    ]

    assert Model(req={'a'}, opt={'a'}).model_dump() == {'req': {'a'}, 'opt': {'a'}}


@pytest.fixture(scope='session', name='ConStringModel')
def constring_model_fixture():
    class ConStringModel(BaseModel):
        v: constr(max_length=10) = 'foobar'

    return ConStringModel


def test_constrained_str_good(ConStringModel):
    m = ConStringModel(v='short')
    assert m.v == 'short'


def test_constrained_str_default(ConStringModel):
    m = ConStringModel()
    assert m.v == 'foobar'


@pytest.mark.parametrize(
    ('data', 'valid'),
    [('this is too long', False), ('⛄' * 11, False), ('not long90', True), ('⛄' * 10, True)],
)
def test_constrained_str_too_long(ConStringModel, data, valid):
    if valid:
        assert ConStringModel(v=data).model_dump() == {'v': data}
    else:
        with pytest.raises(ValidationError) as exc_info:
            ConStringModel(v=data)
        # insert_assert(exc_info.value.errors(include_url=False))
        assert exc_info.value.errors(include_url=False) == [
            {
                'ctx': {'max_length': 10},
                'input': data,
                'loc': ('v',),
                'msg': 'String should have at most 10 characters',
                'type': 'string_too_long',
            }
        ]


@pytest.mark.parametrize(
    'to_upper, value, result',
    [
        (True, 'abcd', 'ABCD'),
        (False, 'aBcD', 'aBcD'),
    ],
)
def test_constrained_str_upper(to_upper, value, result):
    class Model(BaseModel):
        v: constr(to_upper=to_upper)

    m = Model(v=value)
    assert m.v == result


@pytest.mark.parametrize(
    'to_lower, value, result',
    [
        (True, 'ABCD', 'abcd'),
        (False, 'ABCD', 'ABCD'),
    ],
)
def test_constrained_str_lower(to_lower, value, result):
    class Model(BaseModel):
        v: constr(to_lower=to_lower)

    m = Model(v=value)
    assert m.v == result


def test_constrained_str_max_length_0():
    class Model(BaseModel):
        v: constr(max_length=0)

    m = Model(v='')
    assert m.v == ''
    with pytest.raises(ValidationError) as exc_info:
        Model(v='qwe')
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'string_too_long',
            'loc': ('v',),
            'msg': 'String should have at most 0 characters',
            'input': 'qwe',
            'ctx': {'max_length': 0},
        }
    ]


@pytest.mark.parametrize(
    'annotation',
    [
        ImportString[Callable[[Any], Any]],
        Annotated[Callable[[Any], Any], ImportString],
    ],
)
def test_string_import_callable(annotation):
    class PyObjectModel(BaseModel):
        callable: annotation

    m = PyObjectModel(callable='math.cos')
    assert m.callable == math.cos

    m = PyObjectModel(callable=math.cos)
    assert m.callable == math.cos

    with pytest.raises(ValidationError) as exc_info:
        PyObjectModel(callable='foobar')
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'import_error',
            'loc': ('callable',),
            'msg': "Invalid python path: No module named 'foobar'",
            'input': 'foobar',
            'ctx': {'error': "No module named 'foobar'"},
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        PyObjectModel(callable='os.missing')
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'import_error',
            'loc': ('callable',),
            'msg': "Invalid python path: No module named 'os.missing'",
            'input': 'os.missing',
            'ctx': {'error': "No module named 'os.missing'"},
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        PyObjectModel(callable='os.path')
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'callable_type', 'loc': ('callable',), 'msg': 'Input should be callable', 'input': os.path}
    ]

    with pytest.raises(ValidationError) as exc_info:
        PyObjectModel(callable=[1, 2, 3])
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'callable_type', 'loc': ('callable',), 'msg': 'Input should be callable', 'input': [1, 2, 3]}
    ]


@pytest.mark.parametrize(
    ('value', 'expected', 'mode'),
    [
        ('math:cos', 'math.cos', 'json'),
        ('math:cos', math.cos, 'python'),
        ('math.cos', 'math.cos', 'json'),
        ('math.cos', math.cos, 'python'),
        pytest.param(
            'os.path', 'posixpath', 'json', marks=pytest.mark.skipif(sys.platform == 'win32', reason='different output')
        ),
        pytest.param(
            'os.path', 'ntpath', 'json', marks=pytest.mark.skipif(sys.platform != 'win32', reason='different output')
        ),
        ('os.path', os.path, 'python'),
        ([1, 2, 3], [1, 2, 3], 'json'),
        ([1, 2, 3], [1, 2, 3], 'python'),
        ('math', 'math', 'json'),
        ('math', math, 'python'),
        ('builtins.list', 'builtins.list', 'json'),
        ('builtins.list', list, 'python'),
        (list, 'builtins.list', 'json'),
        (list, list, 'python'),
        (f'{__name__}.pytest', 'pytest', 'json'),
        (f'{__name__}.pytest', pytest, 'python'),
    ],
)
def test_string_import_any(value: Any, expected: Any, mode: Literal['json', 'python']):
    class PyObjectModel(BaseModel):
        thing: ImportString

    assert PyObjectModel(thing=value).model_dump(mode=mode) == {'thing': expected}


@pytest.mark.parametrize(
    ('value', 'validate_default', 'expected'),
    [
        (math.cos, True, math.cos),
        ('math:cos', True, math.cos),
        (math.cos, False, math.cos),
        ('math:cos', False, 'math:cos'),
    ],
)
def test_string_import_default_value(value: Any, validate_default: bool, expected: Any):
    class PyObjectModel(BaseModel):
        thing: ImportString = Field(default=value, validate_default=validate_default)

    assert PyObjectModel().thing == expected


@pytest.mark.parametrize('value', ['oss', 'os.os', f'{__name__}.x'])
def test_string_import_any_expected_failure(value: Any):
    """Ensure ImportString correctly fails to instantiate when it's supposed to"""

    class PyObjectModel(BaseModel):
        thing: ImportString

    with pytest.raises(ValidationError, match='type=import_error'):
        PyObjectModel(thing=value)


@pytest.mark.parametrize(
    'annotation',
    [
        ImportString[Annotated[float, annotated_types.Ge(3), annotated_types.Le(4)]],
        Annotated[float, annotated_types.Ge(3), annotated_types.Le(4), ImportString],
    ],
)
def test_string_import_constraints(annotation):
    class PyObjectModel(BaseModel):
        thing: annotation

    assert PyObjectModel(thing='math:pi').model_dump() == {'thing': pytest.approx(3.141592654)}
    with pytest.raises(ValidationError, match='type=greater_than_equal'):
        PyObjectModel(thing='math:e')


def test_string_import_examples():
    import collections

    adapter = TypeAdapter(ImportString)
    assert adapter.validate_python('collections') is collections
    assert adapter.validate_python('collections.abc') is collections.abc
    assert adapter.validate_python('collections.abc.Mapping') is collections.abc.Mapping
    assert adapter.validate_python('collections.abc:Mapping') is collections.abc.Mapping


@pytest.mark.parametrize(
    'import_string,errors',
    [
        (
            'collections.abc.def',
            [
                {
                    'ctx': {'error': "No module named 'collections.abc.def'"},
                    'input': 'collections.abc.def',
                    'loc': (),
                    'msg': "Invalid python path: No module named 'collections.abc.def'",
                    'type': 'import_error',
                }
            ],
        ),
        (
            'collections.abc:def',
            [
                {
                    'ctx': {'error': "cannot import name 'def' from 'collections.abc'"},
                    'input': 'collections.abc:def',
                    'loc': (),
                    'msg': "Invalid python path: cannot import name 'def' from 'collections.abc'",
                    'type': 'import_error',
                }
            ],
        ),
        (
            'collections:abc:Mapping',
            [
                {
                    'ctx': {'error': "Import strings should have at most one ':'; received 'collections:abc:Mapping'"},
                    'input': 'collections:abc:Mapping',
                    'loc': (),
                    'msg': "Invalid python path: Import strings should have at most one ':';"
                    " received 'collections:abc:Mapping'",
                    'type': 'import_error',
                }
            ],
        ),
        (
            '123_collections:Mapping',
            [
                {
                    'ctx': {'error': "No module named '123_collections'"},
                    'input': '123_collections:Mapping',
                    'loc': (),
                    'msg': "Invalid python path: No module named '123_collections'",
                    'type': 'import_error',
                }
            ],
        ),
        (
            ':Mapping',
            [
                {
                    'ctx': {'error': "Import strings should have a nonempty module name; received ':Mapping'"},
                    'input': ':Mapping',
                    'loc': (),
                    'msg': 'Invalid python path: Import strings should have a nonempty module '
                    "name; received ':Mapping'",
                    'type': 'import_error',
                }
            ],
        ),
    ],
)
def test_string_import_errors(import_string, errors):
    with pytest.raises(ValidationError) as exc_info:
        TypeAdapter(ImportString).validate_python(import_string)
    assert exc_info.value.errors() == errors


@pytest.mark.xfail(
    reason='This fails with pytest bc of the weirdness associated with importing modules in a test, but works in normal usage'
)
def test_import_string_sys_stdout() -> None:
    class ImportThings(BaseModel):
        obj: ImportString

    import_things = ImportThings(obj='sys.stdout')
    assert import_things.model_dump_json() == '{"obj":"sys.stdout"}'


def test_decimal():
    class Model(BaseModel):
        v: Decimal

    m = Model(v='1.234')
    assert m.v == Decimal('1.234')
    assert isinstance(m.v, Decimal)
    assert m.model_dump() == {'v': Decimal('1.234')}


def test_decimal_allow_inf():
    class MyModel(BaseModel):
        value: Annotated[Decimal, AllowInfNan(True)]

    m = MyModel(value='inf')
    assert m.value == Decimal('inf')

    m = MyModel(value=Decimal('inf'))
    assert m.value == Decimal('inf')


def test_decimal_dont_allow_inf():
    class MyModel(BaseModel):
        value: Decimal

    with pytest.raises(ValidationError, match=r'Input should be a finite number \[type=finite_number'):
        MyModel(value='inf')
    with pytest.raises(ValidationError, match=r'Input should be a finite number \[type=finite_number'):
        MyModel(value=Decimal('inf'))


def test_decimal_strict():
    class Model(BaseModel):
        v: Decimal

        model_config = ConfigDict(strict=True)

    with pytest.raises(ValidationError) as exc_info:
        Model(v=1.23)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'is_instance_of',
            'loc': ('v',),
            'msg': 'Input should be an instance of Decimal',
            'input': 1.23,
            'ctx': {'class': 'Decimal'},
        }
    ]

    v = Decimal(1.23)
    assert Model(v=v).v == v
    assert Model(v=v).model_dump() == {'v': v}

    assert Model.model_validate_json('{"v": "1.23"}').v == Decimal('1.23')


def test_decimal_precision() -> None:
    ta = TypeAdapter(Decimal)

    num = f'{1234567890 * 100}.{1234567890 * 100}'

    expected = Decimal(num)
    assert ta.validate_python(num) == expected
    assert ta.validate_json(f'"{num}"') == expected


def test_strict_date():
    class Model(BaseModel):
        v: Annotated[date, Field(strict=True)]

    assert Model(v=date(2017, 5, 5)).v == date(2017, 5, 5)

    with pytest.raises(ValidationError) as exc_info:
        Model(v=datetime(2017, 5, 5))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'date_type',
            'loc': ('v',),
            'msg': 'Input should be a valid date',
            'input': datetime(2017, 5, 5),
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        Model(v='2017-05-05')
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'date_type',
            'loc': ('v',),
            'msg': 'Input should be a valid date',
            'input': '2017-05-05',
        }
    ]


def test_strict_datetime():
    class Model(BaseModel):
        v: Annotated[datetime, Field(strict=True)]

    assert Model(v=datetime(2017, 5, 5, 10, 10, 10)).v == datetime(2017, 5, 5, 10, 10, 10)

    with pytest.raises(ValidationError) as exc_info:
        Model(v=date(2017, 5, 5))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'datetime_type',
            'loc': ('v',),
            'msg': 'Input should be a valid datetime',
            'input': date(2017, 5, 5),
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        Model(v='2017-05-05T10:10:10')
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'datetime_type',
            'loc': ('v',),
            'msg': 'Input should be a valid datetime',
            'input': '2017-05-05T10:10:10',
        }
    ]


def test_strict_time():
    class Model(BaseModel):
        v: Annotated[time, Field(strict=True)]

    assert Model(v=time(10, 10, 10)).v == time(10, 10, 10)

    with pytest.raises(ValidationError) as exc_info:
        Model(v='10:10:10')
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'time_type',
            'loc': ('v',),
            'msg': 'Input should be a valid time',
            'input': '10:10:10',
        }
    ]


def test_strict_timedelta():
    class Model(BaseModel):
        v: Annotated[timedelta, Field(strict=True)]

    assert Model(v=timedelta(days=1)).v == timedelta(days=1)

    with pytest.raises(ValidationError) as exc_info:
        Model(v='1 days')
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'time_delta_type',
            'loc': ('v',),
            'msg': 'Input should be a valid timedelta',
            'input': '1 days',
        }
    ]


@pytest.fixture(scope='session', name='CheckModel')
def check_model_fixture():
    class CheckModel(BaseModel):
        bool_check: bool = True
        str_check: constr(strip_whitespace=True, max_length=10) = 's'
        bytes_check: bytes = b's'
        int_check: int = 1
        float_check: float = 1.0
        uuid_check: UUID = UUID('7bd00d58-6485-4ca6-b889-3da6d8df3ee4')
        decimal_check: condecimal(allow_inf_nan=False) = Decimal('42.24')
        date_check: date = date(2017, 5, 5)
        datetime_check: datetime = datetime(2017, 5, 5, 10, 10, 10)
        time_check: time = time(10, 10, 10)
        timedelta_check: timedelta = timedelta(days=1)
        list_check: List[str] = ['1', '2']
        tuple_check: Tuple[str, ...] = ('1', '2')
        set_check: Set[str] = {'1', '2'}
        frozenset_check: FrozenSet[str] = frozenset(['1', '2'])

    return CheckModel


class BoolCastable:
    def __bool__(self) -> bool:
        return True


@pytest.mark.parametrize(
    'field,value,result',
    [
        ('bool_check', True, True),
        ('bool_check', 1, True),
        ('bool_check', 1.0, True),
        ('bool_check', Decimal(1), True),
        ('bool_check', 'y', True),
        ('bool_check', 'Y', True),
        ('bool_check', 'yes', True),
        ('bool_check', 'Yes', True),
        ('bool_check', 'YES', True),
        ('bool_check', 'true', True),
        ('bool_check', 'True', True),
        ('bool_check', 'TRUE', True),
        ('bool_check', 'on', True),
        ('bool_check', 'On', True),
        ('bool_check', 'ON', True),
        ('bool_check', '1', True),
        ('bool_check', 't', True),
        ('bool_check', 'T', True),
        ('bool_check', b'TRUE', True),
        ('bool_check', False, False),
        ('bool_check', 0, False),
        ('bool_check', 0.0, False),
        ('bool_check', Decimal(0), False),
        ('bool_check', 'n', False),
        ('bool_check', 'N', False),
        ('bool_check', 'no', False),
        ('bool_check', 'No', False),
        ('bool_check', 'NO', False),
        ('bool_check', 'false', False),
        ('bool_check', 'False', False),
        ('bool_check', 'FALSE', False),
        ('bool_check', 'off', False),
        ('bool_check', 'Off', False),
        ('bool_check', 'OFF', False),
        ('bool_check', '0', False),
        ('bool_check', 'f', False),
        ('bool_check', 'F', False),
        ('bool_check', b'FALSE', False),
        ('bool_check', None, ValidationError),
        ('bool_check', '', ValidationError),
        ('bool_check', [], ValidationError),
        ('bool_check', {}, ValidationError),
        ('bool_check', [1, 2, 3, 4], ValidationError),
        ('bool_check', {1: 2, 3: 4}, ValidationError),
        ('bool_check', b'2', ValidationError),
        ('bool_check', '2', ValidationError),
        ('bool_check', 2, ValidationError),
        ('bool_check', 2.0, ValidationError),
        ('bool_check', Decimal(2), ValidationError),
        ('bool_check', b'\x81', ValidationError),
        ('bool_check', BoolCastable(), ValidationError),
        ('str_check', 's', 's'),
        ('str_check', ' s ', 's'),
        ('str_check', ' leading', 'leading'),
        ('str_check', 'trailing ', 'trailing'),
        ('str_check', b's', 's'),
        ('str_check', b' s ', 's'),
        ('str_check', bytearray(b's' * 5), 'sssss'),
        ('str_check', 1, ValidationError),
        ('str_check', 'x' * 11, ValidationError),
        ('str_check', b'x' * 11, ValidationError),
        ('str_check', b'\x81', ValidationError),
        ('str_check', bytearray(b'\x81' * 5), ValidationError),
        ('bytes_check', 's', b's'),
        ('bytes_check', ' s ', b' s '),
        ('bytes_check', b's', b's'),
        ('bytes_check', 1, ValidationError),
        ('bytes_check', bytearray('xx', encoding='utf8'), b'xx'),
        ('bytes_check', True, ValidationError),
        ('bytes_check', False, ValidationError),
        ('bytes_check', {}, ValidationError),
        ('bytes_check', 'x' * 11, b'x' * 11),
        ('bytes_check', b'x' * 11, b'x' * 11),
        ('int_check', 1, 1),
        ('int_check', 1.0, 1),
        ('int_check', 1.9, ValidationError),
        ('int_check', Decimal(1), 1),
        ('int_check', Decimal(1.9), ValidationError),
        ('int_check', '1', 1),
        ('int_check', '1.9', ValidationError),
        ('int_check', b'1', 1),
        ('int_check', 12, 12),
        ('int_check', '12', 12),
        ('int_check', b'12', 12),
        ('float_check', 1, 1.0),
        ('float_check', 1.0, 1.0),
        ('float_check', Decimal(1.0), 1.0),
        ('float_check', '1.0', 1.0),
        ('float_check', '1', 1.0),
        ('float_check', b'1.0', 1.0),
        ('float_check', b'1', 1.0),
        ('float_check', True, 1.0),
        ('float_check', False, 0.0),
        ('float_check', 't', ValidationError),
        ('float_check', b't', ValidationError),
        ('uuid_check', 'ebcdab58-6eb8-46fb-a190-d07a33e9eac8', UUID('ebcdab58-6eb8-46fb-a190-d07a33e9eac8')),
        ('uuid_check', UUID('ebcdab58-6eb8-46fb-a190-d07a33e9eac8'), UUID('ebcdab58-6eb8-46fb-a190-d07a33e9eac8')),
        ('uuid_check', b'ebcdab58-6eb8-46fb-a190-d07a33e9eac8', UUID('ebcdab58-6eb8-46fb-a190-d07a33e9eac8')),
        ('uuid_check', b'\x12\x34\x56\x78' * 4, UUID('12345678-1234-5678-1234-567812345678')),
        ('uuid_check', 'ebcdab58-6eb8-46fb-a190-', ValidationError),
        ('uuid_check', 123, ValidationError),
        ('decimal_check', 42.24, Decimal('42.24')),
        ('decimal_check', '42.24', Decimal('42.24')),
        ('decimal_check', b'42.24', ValidationError),
        ('decimal_check', ' 42.24 ', Decimal('42.24')),
        ('decimal_check', Decimal('42.24'), Decimal('42.24')),
        ('decimal_check', 'not a valid decimal', ValidationError),
        ('decimal_check', 'NaN', ValidationError),
        ('date_check', date(2017, 5, 5), date(2017, 5, 5)),
        ('date_check', datetime(2017, 5, 5), date(2017, 5, 5)),
        ('date_check', '2017-05-05', date(2017, 5, 5)),
        ('date_check', b'2017-05-05', date(2017, 5, 5)),
        ('date_check', 1493942400000, date(2017, 5, 5)),
        ('date_check', 1493942400, date(2017, 5, 5)),
        ('date_check', 1493942400000.0, date(2017, 5, 5)),
        ('date_check', Decimal(1493942400000), date(2017, 5, 5)),
        ('date_check', datetime(2017, 5, 5, 10), ValidationError),
        ('date_check', '2017-5-5', ValidationError),
        ('date_check', b'2017-5-5', ValidationError),
        ('date_check', 1493942401000, ValidationError),
        ('date_check', 1493942401000.0, ValidationError),
        ('date_check', Decimal(1493942401000), ValidationError),
        ('datetime_check', datetime(2017, 5, 5, 10, 10, 10), datetime(2017, 5, 5, 10, 10, 10)),
        ('datetime_check', date(2017, 5, 5), datetime(2017, 5, 5, 0, 0, 0)),
        ('datetime_check', '2017-05-05T10:10:10.0002', datetime(2017, 5, 5, 10, 10, 10, microsecond=200)),
        ('datetime_check', '2017-05-05 10:10:10', datetime(2017, 5, 5, 10, 10, 10)),
        ('datetime_check', '2017-05-05 10:10:10+00:00', datetime(2017, 5, 5, 10, 10, 10, tzinfo=timezone.utc)),
        ('datetime_check', b'2017-05-05T10:10:10.0002', datetime(2017, 5, 5, 10, 10, 10, microsecond=200)),
        ('datetime_check', 1493979010000, datetime(2017, 5, 5, 10, 10, 10, tzinfo=timezone.utc)),
        ('datetime_check', 1493979010, datetime(2017, 5, 5, 10, 10, 10, tzinfo=timezone.utc)),
        ('datetime_check', 1493979010000.0, datetime(2017, 5, 5, 10, 10, 10, tzinfo=timezone.utc)),
        ('datetime_check', Decimal(1493979010), datetime(2017, 5, 5, 10, 10, 10, tzinfo=timezone.utc)),
        ('datetime_check', '2017-5-5T10:10:10', ValidationError),
        ('datetime_check', b'2017-5-5T10:10:10', ValidationError),
        ('time_check', time(10, 10, 10), time(10, 10, 10)),
        ('time_check', '10:10:10.0002', time(10, 10, 10, microsecond=200)),
        ('time_check', b'10:10:10.0002', time(10, 10, 10, microsecond=200)),
        ('time_check', 3720, time(1, 2, tzinfo=timezone.utc)),
        ('time_check', 3720.0002, time(1, 2, microsecond=200, tzinfo=timezone.utc)),
        ('time_check', Decimal(3720.0002), time(1, 2, microsecond=200, tzinfo=timezone.utc)),
        ('time_check', '1:1:1', ValidationError),
        ('time_check', b'1:1:1', ValidationError),
        ('time_check', -1, ValidationError),
        ('time_check', 86400, ValidationError),
        ('time_check', 86400.0, ValidationError),
        ('time_check', Decimal(86400), ValidationError),
        ('timedelta_check', timedelta(days=1), timedelta(days=1)),
        ('timedelta_check', '1 days 10:10', timedelta(days=1, seconds=36600)),
        ('timedelta_check', '1 d 10:10', timedelta(days=1, seconds=36600)),
        ('timedelta_check', b'1 days 10:10', timedelta(days=1, seconds=36600)),
        ('timedelta_check', 123_000, timedelta(days=1, seconds=36600)),
        ('timedelta_check', 123_000.0002, timedelta(days=1, seconds=36600, microseconds=200)),
        ('timedelta_check', Decimal(123_000.0002), timedelta(days=1, seconds=36600, microseconds=200)),
        ('timedelta_check', '1 10:10', ValidationError),
        ('timedelta_check', b'1 10:10', ValidationError),
        ('list_check', ['1', '2'], ['1', '2']),
        ('list_check', ('1', '2'), ['1', '2']),
        ('list_check', {'1': 1, '2': 2}.keys(), ['1', '2']),
        ('list_check', {'1': '1', '2': '2'}.values(), ['1', '2']),
        ('list_check', {'1', '2'}, dirty_equals.IsOneOf(['1', '2'], ['2', '1'])),
        ('list_check', frozenset(['1', '2']), dirty_equals.IsOneOf(['1', '2'], ['2', '1'])),
        ('list_check', {'1': 1, '2': 2}, ValidationError),
        ('tuple_check', ('1', '2'), ('1', '2')),
        ('tuple_check', ['1', '2'], ('1', '2')),
        ('tuple_check', {'1': 1, '2': 2}.keys(), ('1', '2')),
        ('tuple_check', {'1': '1', '2': '2'}.values(), ('1', '2')),
        ('tuple_check', {'1', '2'}, dirty_equals.IsOneOf(('1', '2'), ('2', '1'))),
        ('tuple_check', frozenset(['1', '2']), dirty_equals.IsOneOf(('1', '2'), ('2', '1'))),
        ('tuple_check', {'1': 1, '2': 2}, ValidationError),
        ('set_check', {'1', '2'}, {'1', '2'}),
        ('set_check', ['1', '2', '1', '2'], {'1', '2'}),
        ('set_check', ('1', '2', '1', '2'), {'1', '2'}),
        ('set_check', frozenset(['1', '2']), {'1', '2'}),
        ('set_check', {'1': 1, '2': 2}.keys(), {'1', '2'}),
        ('set_check', {'1': '1', '2': '2'}.values(), {'1', '2'}),
        ('set_check', {'1': 1, '2': 2}, ValidationError),
        ('frozenset_check', frozenset(['1', '2']), frozenset(['1', '2'])),
        ('frozenset_check', ['1', '2', '1', '2'], frozenset(['1', '2'])),
        ('frozenset_check', ('1', '2', '1', '2'), frozenset(['1', '2'])),
        ('frozenset_check', {'1', '2'}, frozenset(['1', '2'])),
        ('frozenset_check', {'1': 1, '2': 2}.keys(), frozenset(['1', '2'])),
        ('frozenset_check', {'1': '1', '2': '2'}.values(), frozenset(['1', '2'])),
        ('frozenset_check', {'1': 1, '2': 2}, ValidationError),
    ],
)
def test_default_validators(field, value, result, CheckModel):
    kwargs = {field: value}
    if result == ValidationError:
        with pytest.raises(ValidationError):
            CheckModel(**kwargs)
    else:
        assert CheckModel(**kwargs).model_dump()[field] == result


@pytest.fixture(scope='session', name='StrModel')
def str_model_fixture():
    class StrModel(BaseModel):
        str_check: Annotated[str, annotated_types.Len(5, 10)]

    return StrModel


def test_string_too_long(StrModel):
    with pytest.raises(ValidationError) as exc_info:
        StrModel(str_check='x' * 150)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'string_too_long',
            'loc': ('str_check',),
            'msg': 'String should have at most 10 characters',
            'input': 'x' * 150,
            'ctx': {'max_length': 10},
        }
    ]


def test_string_too_short(StrModel):
    with pytest.raises(ValidationError) as exc_info:
        StrModel(str_check='x')
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'string_too_short',
            'loc': ('str_check',),
            'msg': 'String should have at least 5 characters',
            'input': 'x',
            'ctx': {'min_length': 5},
        }
    ]


@pytest.fixture(scope='session', name='DatetimeModel')
def datetime_model_fixture():
    class DatetimeModel(BaseModel):
        dt: datetime
        date_: date
        time_: time
        duration: timedelta

    return DatetimeModel


def test_datetime_successful(DatetimeModel):
    m = DatetimeModel(dt='2017-10-05T19:47:07', date_=1493942400, time_='10:20:30.400', duration='00:15:30.0001')
    assert m.dt == datetime(2017, 10, 5, 19, 47, 7)
    assert m.date_ == date(2017, 5, 5)
    assert m.time_ == time(10, 20, 30, 400_000)
    assert m.duration == timedelta(minutes=15, seconds=30, microseconds=100)


def test_datetime_errors(DatetimeModel):
    with pytest.raises(ValueError) as exc_info:
        DatetimeModel(dt='2017-13-05T19:47:07', date_='XX1494012000', time_='25:20:30.400', duration='15:30.0001broken')
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'datetime_from_date_parsing',
            'loc': ('dt',),
            'msg': 'Input should be a valid datetime or date, month value is outside expected range of 1-12',
            'input': '2017-13-05T19:47:07',
            'ctx': {'error': 'month value is outside expected range of 1-12'},
        },
        {
            'type': 'date_from_datetime_parsing',
            'loc': ('date_',),
            'msg': 'Input should be a valid date or datetime, invalid character in year',
            'input': 'XX1494012000',
            'ctx': {'error': 'invalid character in year'},
        },
        {
            'type': 'time_parsing',
            'loc': ('time_',),
            'msg': 'Input should be in a valid time format, hour value is outside expected range of 0-23',
            'input': '25:20:30.400',
            'ctx': {'error': 'hour value is outside expected range of 0-23'},
        },
        {
            'type': 'time_delta_parsing',
            'loc': ('duration',),
            'msg': 'Input should be a valid timedelta, unexpected extra characters at the end of the input',
            'input': '15:30.0001broken',
            'ctx': {'error': 'unexpected extra characters at the end of the input'},
        },
    ]


@pytest.fixture(scope='session')
def cooking_model():
    class FruitEnum(str, Enum):
        pear = 'pear'
        banana = 'banana'

    class ToolEnum(IntEnum):
        spanner = 1
        wrench = 2

    class CookingModel(BaseModel):
        fruit: FruitEnum = FruitEnum.pear
        tool: ToolEnum = ToolEnum.spanner

    return FruitEnum, ToolEnum, CookingModel


def test_enum_successful(cooking_model):
    FruitEnum, ToolEnum, CookingModel = cooking_model
    m = CookingModel(tool=2)
    assert m.fruit == FruitEnum.pear
    assert m.tool == ToolEnum.wrench
    assert repr(m.tool) == '<ToolEnum.wrench: 2>'


def test_enum_fails(cooking_model):
    FruitEnum, ToolEnum, CookingModel = cooking_model
    with pytest.raises(ValueError) as exc_info:
        CookingModel(tool=3)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'expected': '1 or 2'},
            'input': 3,
            'loc': ('tool',),
            'msg': 'Input should be 1 or 2',
            'type': 'enum',
        }
    ]


def test_enum_fails_error_msg():
    class Number(IntEnum):
        one = 1
        two = 2
        three = 3

    class Model(BaseModel):
        num: Number

    with pytest.raises(ValueError) as exc_info:
        Model(num=4)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'enum',
            'loc': ('num',),
            'msg': 'Input should be 1, 2 or 3',
            'input': 4,
            'ctx': {'expected': '1, 2 or 3'},
        }
    ]


def test_int_enum_successful_for_str_int(cooking_model):
    FruitEnum, ToolEnum, CookingModel = cooking_model
    m = CookingModel(tool='2')
    assert m.tool == ToolEnum.wrench
    assert repr(m.tool) == '<ToolEnum.wrench: 2>'


def test_plain_enum_validate():
    class MyEnum(Enum):
        a = 1

    class Model(BaseModel):
        x: MyEnum

    m = Model(x=MyEnum.a)
    assert m.x is MyEnum.a

    assert TypeAdapter(MyEnum).validate_python(1) is MyEnum.a
    with pytest.raises(ValidationError) as exc_info:
        TypeAdapter(MyEnum).validate_python(1, strict=True)

    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'class': 'test_plain_enum_validate.<locals>.MyEnum'},
            'input': 1,
            'loc': (),
            'msg': IsStr(regex='Input should be an instance of test_plain_enum_validate.<locals>.MyEnum'),
            'type': 'is_instance_of',
        }
    ]

    assert TypeAdapter(MyEnum).validate_json('1') is MyEnum.a
    TypeAdapter(MyEnum).validate_json('1', strict=True)

    with pytest.raises(ValidationError) as exc_info:
        TypeAdapter(MyEnum).validate_json('"1"', strict=True)
    assert exc_info.value.errors(include_url=False) == [
        {'ctx': {'expected': '1'}, 'input': '1', 'loc': (), 'msg': 'Input should be 1', 'type': 'enum'}
    ]


def test_plain_enum_validate_json():
    class MyEnum(Enum):
        a = 1

    class Model(BaseModel):
        x: MyEnum

    m = Model.model_validate_json('{"x":1}')
    assert m.x is MyEnum.a


def test_enum_type():
    class Model(BaseModel):
        my_enum: Enum

    class MyEnum(Enum):
        a = 1

    m = Model(my_enum=MyEnum.a)
    assert m.my_enum == MyEnum.a
    assert m.model_dump() == {'my_enum': MyEnum.a}
    assert m.model_dump_json() == '{"my_enum":1}'

    with pytest.raises(ValidationError) as exc_info:
        Model(my_enum=1)
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'class': 'Enum'},
            'input': 1,
            'loc': ('my_enum',),
            'msg': 'Input should be an instance of Enum',
            'type': 'is_instance_of',
        }
    ]


def test_enum_missing_default():
    class MyEnum(Enum):
        a = 1

    ta = TypeAdapter(MyEnum)
    missing_value = re.search(r'missing: (\w+)', repr(ta.validator)).group(1)
    assert missing_value == 'None'

    assert ta.validate_python(1) is MyEnum.a
    with pytest.raises(ValidationError):
        ta.validate_python(2)


def test_enum_missing_custom():
    class MyEnum(Enum):
        a = 1

        @classmethod
        def _missing_(cls, value):
            return MyEnum.a

    ta = TypeAdapter(MyEnum)
    missing_value = re.search(r'missing: (\w+)', repr(ta.validator)).group(1)
    assert missing_value == 'Some'

    assert ta.validate_python(1) is MyEnum.a
    assert ta.validate_python(2) is MyEnum.a


def test_int_enum_type():
    class Model(BaseModel):
        my_enum: IntEnum

    class MyEnum(Enum):
        a = 1

    class MyIntEnum(IntEnum):
        b = 2

    m = Model(my_enum=MyIntEnum.b)
    assert m.my_enum == MyIntEnum.b
    assert m.model_dump() == {'my_enum': MyIntEnum.b}
    assert m.model_dump_json() == '{"my_enum":2}'

    with pytest.raises(ValidationError) as exc_info:
        Model(my_enum=MyEnum.a)
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'class': 'IntEnum'},
            'input': MyEnum.a,
            'loc': ('my_enum',),
            'msg': 'Input should be an instance of IntEnum',
            'type': 'is_instance_of',
        }
    ]


@pytest.mark.parametrize('enum_base,strict', [(Enum, False), (IntEnum, False), (IntEnum, True)])
def test_enum_from_json(enum_base, strict):
    class MyEnum(enum_base):
        a = 1
        b = 3

    class Model(BaseModel):
        my_enum: MyEnum

    m = Model.model_validate_json('{"my_enum":1}', strict=strict)
    assert m.my_enum is MyEnum.a

    with pytest.raises(ValidationError) as exc_info:
        Model.model_validate_json('{"my_enum":2}', strict=strict)

    MyEnum.__name__ if sys.version_info[:2] <= (3, 8) else MyEnum.__qualname__
    if strict:
        assert exc_info.value.errors(include_url=False) == [
            {
                'ctx': {'expected': '1 or 3'},
                'input': 2,
                'loc': ('my_enum',),
                'msg': 'Input should be 1 or 3',
                'type': 'enum',
            }
        ]
    else:
        assert exc_info.value.errors(include_url=False) == [
            {
                'ctx': {'expected': '1 or 3'},
                'input': 2,
                'loc': ('my_enum',),
                'msg': 'Input should be 1 or 3',
                'type': 'enum',
            }
        ]


def test_strict_enum() -> None:
    class Demo(Enum):
        A = 0
        B = 1

    class User(BaseModel):
        model_config = ConfigDict(strict=True)

        demo_strict: Demo
        demo_not_strict: Demo = Field(strict=False)

    user = User(demo_strict=Demo.A, demo_not_strict=1)

    assert isinstance(user.demo_strict, Demo)
    assert isinstance(user.demo_not_strict, Demo)
    assert user.demo_strict.value == 0
    assert user.demo_not_strict.value == 1

    with pytest.raises(ValidationError, match='Input should be an instance of test_strict_enum.<locals>.Demo'):
        User(demo_strict=0, demo_not_strict=1)


def test_enum_with_no_cases() -> None:
    class MyEnum(Enum):
        pass

    class MyModel(BaseModel):
        e: MyEnum

    json_schema = MyModel.model_json_schema()
    assert json_schema['properties']['e']['enum'] == []


@pytest.mark.parametrize(
    'kwargs,type_,a',
    [
        ({'pattern': '^foo$'}, int, 1),
        ({'gt': 0}, conlist(int, min_length=4), [1, 2, 3, 4, 5]),
        ({'gt': 0}, conset(int, min_length=4), {1, 2, 3, 4, 5}),
        ({'gt': 0}, confrozenset(int, min_length=4), frozenset({1, 2, 3, 4, 5})),
    ],
)
def test_invalid_schema_constraints(kwargs, type_, a):
    class Foo(BaseModel):
        a: type_ = Field('foo', title='A title', description='A description', **kwargs)

    constraint_name = list(kwargs.keys())[0]
    with pytest.raises(
        TypeError, match=re.escape(f"Unable to apply constraint '{constraint_name}' to supplied value {a}")
    ):
        Foo(a=a)


def test_invalid_decimal_constraint():
    class Foo(BaseModel):
        a: Decimal = Field('foo', title='A title', description='A description', max_length=5)

    with pytest.raises(TypeError, match="Unable to apply constraint 'max_length' to supplied value 1.0"):
        Foo(a=1.0)


@pytest.mark.skipif(not email_validator, reason='email_validator not installed')
def test_string_success():
    class MoreStringsModel(BaseModel):
        str_strip_enabled: constr(strip_whitespace=True)
        str_strip_disabled: constr(strip_whitespace=False)
        str_regex: constr(pattern=r'^xxx\d{3}$') = ...
        str_min_length: constr(min_length=5) = ...
        str_email: EmailStr = ...
        name_email: NameEmail = ...
        str_gt: Annotated[str, annotated_types.Gt('a')]

    m = MoreStringsModel(
        str_strip_enabled=' xxx123 ',
        str_strip_disabled=' xxx123 ',
        str_regex='xxx123',
        str_min_length='12345',
        str_email='foobar@example.com ',
        name_email='foo bar <foobaR@example.com>',
        str_gt='b',
    )
    assert m.str_strip_enabled == 'xxx123'
    assert m.str_strip_disabled == ' xxx123 '
    assert m.str_regex == 'xxx123'
    assert m.str_email == 'foobar@example.com'
    assert repr(m.name_email) == "NameEmail(name='foo bar', email='foobaR@example.com')"
    assert str(m.name_email) == 'foo bar <foobaR@example.com>'
    assert m.name_email.name == 'foo bar'
    assert m.name_email.email == 'foobaR@example.com'
    assert m.str_gt == 'b'


@pytest.mark.skipif(not email_validator, reason='email_validator not installed')
def test_string_fails():
    class MoreStringsModel(BaseModel):
        str_regex: constr(pattern=r'^xxx\d{3}$') = ...
        str_min_length: constr(min_length=5) = ...
        str_email: EmailStr = ...
        name_email: NameEmail = ...

    with pytest.raises(ValidationError) as exc_info:
        MoreStringsModel(
            str_regex='xxx123xxx',
            str_min_length='1234',
            str_email='foobar<@example.com',
            name_email='foobar @example.com',
        )
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'string_pattern_mismatch',
            'loc': ('str_regex',),
            'msg': "String should match pattern '^xxx\\d{3}$'",
            'input': 'xxx123xxx',
            'ctx': {'pattern': '^xxx\\d{3}$'},
        },
        {
            'type': 'string_too_short',
            'loc': ('str_min_length',),
            'msg': 'String should have at least 5 characters',
            'input': '1234',
            'ctx': {'min_length': 5},
        },
        {
            'type': 'value_error',
            'loc': ('str_email',),
            'msg': 'value is not a valid email address: An open angle bracket at the start of the email address has to be followed by a close angle bracket at the end.',
            'input': 'foobar<@example.com',
            'ctx': {
                'reason': 'An open angle bracket at the start of the email address has to be followed by a close angle bracket at the end.'
            },
        },
        {
            'type': 'value_error',
            'loc': ('name_email',),
            'msg': 'value is not a valid email address: The email address contains invalid characters before the @-sign: SPACE.',
            'input': 'foobar @example.com',
            'ctx': {'reason': 'The email address contains invalid characters before the @-sign: SPACE.'},
        },
    ]


@pytest.mark.skipif(email_validator, reason='email_validator is installed')
def test_email_validator_not_installed_email_str():
    with pytest.raises(ImportError):

        class Model(BaseModel):
            str_email: EmailStr = ...


@pytest.mark.skipif(email_validator, reason='email_validator is installed')
def test_email_validator_not_installed_name_email():
    with pytest.raises(ImportError):

        class Model(BaseModel):
            str_email: NameEmail = ...


def test_dict():
    class Model(BaseModel):
        v: dict

    assert Model(v={1: 10, 2: 20}).v == {1: 10, 2: 20}
    with pytest.raises(ValidationError) as exc_info:
        Model(v=[(1, 2), (3, 4)])
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'dict_type',
            'loc': ('v',),
            'msg': 'Input should be a valid dictionary',
            'input': [(1, 2), (3, 4)],
        }
    ]
    with pytest.raises(ValidationError) as exc_info:
        Model(v=[1, 2, 3])
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'dict_type', 'loc': ('v',), 'msg': 'Input should be a valid dictionary', 'input': [1, 2, 3]}
    ]


@pytest.mark.parametrize(
    'value,result',
    (
        ([1, 2, '3'], [1, 2, '3']),
        ((1, 2, '3'), [1, 2, '3']),
        ((i**2 for i in range(5)), [0, 1, 4, 9, 16]),
        (deque([1, 2, 3]), [1, 2, 3]),
        ({1, '2'}, IsOneOf([1, '2'], ['2', 1])),
    ),
)
def test_list_success(value, result):
    class Model(BaseModel):
        v: list

    assert Model(v=value).v == result


@pytest.mark.parametrize('value', (123, '123'))
def test_list_fails(value):
    class Model(BaseModel):
        v: list

    with pytest.raises(ValidationError) as exc_info:
        Model(v=value)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'list_type',
            'loc': ('v',),
            'msg': 'Input should be a valid list',
            'input': value,
        }
    ]


def test_ordered_dict():
    class Model(BaseModel):
        v: OrderedDict

    assert Model(v=OrderedDict([(1, 10), (2, 20)])).v == OrderedDict([(1, 10), (2, 20)])
    assert Model(v={1: 10, 2: 20}).v == OrderedDict([(1, 10), (2, 20)])
    with pytest.raises(ValidationError) as exc_info:
        Model(v=[1, 2, 3])
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'dict_type', 'loc': ('v',), 'msg': 'Input should be a valid dictionary', 'input': [1, 2, 3]}
    ]


@pytest.mark.parametrize(
    'value,result',
    (
        ([1, 2, '3'], (1, 2, '3')),
        ((1, 2, '3'), (1, 2, '3')),
        ((i**2 for i in range(5)), (0, 1, 4, 9, 16)),
        (deque([1, 2, 3]), (1, 2, 3)),
        ({1, '2'}, IsOneOf((1, '2'), ('2', 1))),
    ),
)
def test_tuple_success(value, result):
    class Model(BaseModel):
        v: tuple

    assert Model(v=value).v == result


@pytest.mark.parametrize('value', (123, '123'))
def test_tuple_fails(value):
    class Model(BaseModel):
        v: tuple

    with pytest.raises(ValidationError) as exc_info:
        Model(v=value)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'tuple_type', 'loc': ('v',), 'msg': 'Input should be a valid tuple', 'input': value}
    ]


@pytest.mark.parametrize(
    'value,cls,result',
    (
        ([1, 2, '3'], int, (1, 2, 3)),
        ((1, 2, '3'), int, (1, 2, 3)),
        ((i**2 for i in range(5)), int, (0, 1, 4, 9, 16)),
        (('a', 'b', 'c'), str, ('a', 'b', 'c')),
    ),
)
def test_tuple_variable_len_success(value, cls, result):
    class Model(BaseModel):
        v: Tuple[cls, ...]
    assert Model(v=value).v == result


@pytest.mark.parametrize(
    'value, cls, exc',
    [
        (
            ('a', 'b', [1, 2], 'c'),
            str,
            [
                {
                    'type': 'string_type',
                    'loc': ('v', 2),
                    'msg': 'Input should be a valid string',
                    'input': [1, 2],
                }
            ],
        ),
        (
            ('a', 'b', [1, 2], 'c', [3, 4]),
            str,
            [
                {
                    'type': 'string_type',
                    'loc': ('v', 2),
                    'msg': 'Input should be a valid string',
                    'input': [1, 2],
                },
                {
                    'type': 'string_type',
                    'loc': ('v', 4),
                    'msg': 'Input should be a valid string',
                    'input': [3, 4],
                },
            ],
        ),
    ],
)
def test_tuple_variable_len_fails(value, cls, exc):
    class Model(BaseModel):
        v: Tuple[cls, ...]

    with pytest.raises(ValidationError) as exc_info:
        Model(v=value)
    assert exc_info.value.errors(include_url=False) == exc


@pytest.mark.parametrize(
    'value,result',
    (
        ({1, 2, '3'}, {1, 2, '3'}),
        ((1, 2, 2, '3'), {1, 2, '3'}),
        ([1, 2, 2, '3'], {1, 2, '3'}),
        ({i**2 for i in range(5)}, {0, 1, 4, 9, 16}),
    ),
)
def test_set_success(value, result):
    class Model(BaseModel):
        v: set

    assert Model(v=value).v == result


@pytest.mark.parametrize('value', (123, '123'))
def test_set_fails(value):
    class Model(BaseModel):
        v: set

    with pytest.raises(ValidationError) as exc_info:
        Model(v=value)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'set_type', 'loc': ('v',), 'msg': 'Input should be a valid set', 'input': value}
    ]


def test_list_type_fails():
    class Model(BaseModel):
        v: List[int]

    with pytest.raises(ValidationError) as exc_info:
        Model(v='123')
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'list_type', 'loc': ('v',), 'msg': 'Input should be a valid list', 'input': '123'}
    ]


def test_set_type_fails():
    class Model(BaseModel):
        v: Set[int]

    with pytest.raises(ValidationError) as exc_info:
        Model(v='123')
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'set_type', 'loc': ('v',), 'msg': 'Input should be a valid set', 'input': '123'}
    ]


@pytest.mark.parametrize(
    'cls, value,result',
    (
        (int, [1, 2, 3], [1, 2, 3]),
        (int, (1, 2, 3), (1, 2, 3)),
        (int, range(5), [0, 1, 2, 3, 4]),
        (int, deque((1, 2, 3)), deque((1, 2, 3))),
        (Set[int], [{1, 2}, {3, 4}, {5, 6}], [{1, 2}, {3, 4}, {5, 6}]),
        (Tuple[int, str], ((1, 'a'), (2, 'b'), (3, 'c')), ((1, 'a'), (2, 'b'), (3, 'c'))),
    ),
)
def test_sequence_success(cls, value, result):
    class Model(BaseModel):
        v: Sequence[cls]

    assert Model(v=value).v == result


def int_iterable():
    i = 0
    while True:
        i += 1
        yield str(i)


def str_iterable():
    while True:
        yield from 'foobarbaz'


def test_infinite_iterable_int():
    class Model(BaseModel):
        it: Iterable[int]

    m = Model(it=int_iterable())

    assert repr(m.it) == 'ValidatorIterator(index=0, schema=Some(Int(IntValidator { strict: false })))'

    output = []
    for i in m.it:
        output.append(i)
        if i == 10:
            break

    assert output == [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]

    m = Model(it=[1, 2, 3])
    assert list(m.it) == [1, 2, 3]

    m = Model(it=str_iterable())
    with pytest.raises(ValidationError) as exc_info:
        next(m.it)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_parsing',
            'loc': (0,),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'f',
        }
    ]


@pytest.mark.parametrize('type_annotation', (Iterable[Any], Iterable))
def test_iterable_any(type_annotation):
    class Model(BaseModel):
        it: type_annotation

    m = Model(it=int_iterable())

    output = []
    for i in m.it:
        output.append(i)
        if int(i) == 10:
            break

    assert output == ['1', '2', '3', '4', '5', '6', '7', '8', '9', '10']

    m = Model(it=[1, '2', b'three'])
    assert list(m.it) == [1, '2', b'three']

    with pytest.raises(ValidationError) as exc_info:
        Model(it=3)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'iterable_type', 'loc': ('it',), 'msg': 'Input should be iterable', 'input': 3}
    ]


def test_invalid_iterable():
    class Model(BaseModel):
        it: Iterable[int]

    with pytest.raises(ValidationError) as exc_info:
        Model(it=3)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'iterable_type', 'loc': ('it',), 'msg': 'Input should be iterable', 'input': 3}
    ]


@pytest.mark.parametrize(
    'config,input_str',
    (
        ({}, 'type=iterable_type, input_value=5, input_type=int'),
        ({'hide_input_in_errors': False}, 'type=iterable_type, input_value=5, input_type=int'),
        ({'hide_input_in_errors': True}, 'type=iterable_type'),
    ),
)
def test_iterable_error_hide_input(config, input_str):
    class Model(BaseModel):
        it: Iterable[int]

        model_config = ConfigDict(**config)

    with pytest.raises(ValidationError, match=re.escape(f'Input should be iterable [{input_str}]')):
        Model(it=5)


def test_infinite_iterable_validate_first():
    class Model(BaseModel):
        it: Iterable[int]
        b: int

        @field_validator('it')
        @classmethod
        def infinite_first_int(cls, it):
            return itertools.chain([next(it)], it)

    m = Model(it=int_iterable(), b=3)

    assert m.b == 3
    assert m.it

    for i in m.it:
        assert i
        if i == 10:
            break

    with pytest.raises(ValidationError) as exc_info:
        Model(it=str_iterable(), b=3)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_parsing',
            'loc': ('it', 0),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'f',
        }
    ]


def test_sequence_generator_fails():
    class Model(BaseModel):
        v: Sequence[int]

    gen = (i for i in [1, 2, 3])
    with pytest.raises(ValidationError) as exc_info:
        Model(v=gen)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'is_instance_of',
            'loc': ('v',),
            'msg': 'Input should be an instance of Sequence',
            'input': gen,
            'ctx': {'class': 'Sequence'},
        }
    ]


@pytest.mark.parametrize(
    'cls,value,errors',
    (
        (
            int,
            [1, 'a', 3],
            [
                {
                    'type': 'int_parsing',
                    'loc': ('v', 1),
                    'msg': 'Input should be a valid integer, unable to parse string as an integer',
                    'input': 'a',
                },
            ],
        ),
        (
            int,
            (1, 2, 'a'),
            [
                {
                    'type': 'int_parsing',
                    'loc': ('v', 2),
                    'msg': 'Input should be a valid integer, unable to parse string as an integer',
                    'input': 'a',
                },
            ],
        ),
        (
            float,
            ('a', 2.2, 3.3),
            [
                {
                    'type': 'float_parsing',
                    'loc': ('v', 0),
                    'msg': 'Input should be a valid number, unable to parse string as a number',
                    'input': 'a',
                },
            ],
        ),
        (
            float,
            (1.1, 2.2, 'a'),
            [
                {
                    'type': 'float_parsing',
                    'loc': ('v', 2),
                    'msg': 'Input should be a valid number, unable to parse string as a number',
                    'input': 'a',
                },
            ],
        ),
        (
            float,
            {1.0, 2.0, 3.0},
            [
                {
                    'type': 'is_instance_of',
                    'loc': ('v',),
                    'msg': 'Input should be an instance of Sequence',
                    'input': {1.0, 2.0, 3.0},
                    'ctx': {'class': 'Sequence'},
                },
            ],
        ),
        (
            Set[int],
            [{1, 2}, {2, 3}, {'d'}],
            [
                {
                    'type': 'int_parsing',
                    'loc': ('v', 2, 0),
                    'msg': 'Input should be a valid integer, unable to parse string as an integer',
                    'input': 'd',
                }
            ],
        ),
        (
            Tuple[int, str],
            ((1, 'a'), ('a', 'a'), (3, 'c')),
            [
                {
                    'type': 'int_parsing',
                    'loc': ('v', 1, 0),
                    'msg': 'Input should be a valid integer, unable to parse string as an integer',
                    'input': 'a',
                }
            ],
        ),
        (
            List[int],
            [{'a': 1, 'b': 2}, [1, 2], [2, 3]],
            [
                {
                    'type': 'list_type',
                    'loc': ('v', 0),
                    'msg': 'Input should be a valid list',
                    'input': {'a': 1, 'b': 2},
                }
            ],
        ),
    ),
    ids=repr,
)
def test_sequence_fails(cls, value, errors):
    class Model(BaseModel):
        v: Sequence[cls]

    with pytest.raises(ValidationError) as exc_info:
        Model(v=value)
    assert exc_info.value.errors(include_url=False) == errors


def test_sequence_strict():
    assert TypeAdapter(Sequence[int]).validate_python((), strict=True) == ()


def test_list_strict() -> None:
    class LaxModel(BaseModel):
        v: List[int]
        model_config = ConfigDict(strict=False)

    class StrictModel(BaseModel):
        v: List[int]
        model_config = ConfigDict(strict=True)

    assert LaxModel(v=(1, 2)).v == [1, 2]
    assert LaxModel(v=('1', 2)).v == [1, 2]
    # Tuple should be rejected
    with pytest.raises(ValidationError) as exc_info:
        StrictModel(v=(1, 2))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'list_type', 'loc': ('v',), 'msg': 'Input should be a valid list', 'input': (1, 2)}
    ]
    # Strict in each list item
    with pytest.raises(ValidationError) as exc_info:
        StrictModel(v=['1', 2])
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'int_type', 'loc': ('v', 0), 'msg': 'Input should be a valid integer', 'input': '1'}
    ]


def test_set_strict() -> None:
    class LaxModel(BaseModel):
        v: Set[int]
        model_config = ConfigDict(strict=False)

    class StrictModel(BaseModel):
        v: Set[int]
        model_config = ConfigDict(strict=True)

    assert LaxModel(v=(1, 2)).v == {1, 2}
    assert LaxModel(v=('1', 2)).v == {1, 2}
    # Tuple should be rejected
    with pytest.raises(ValidationError) as exc_info:
        StrictModel(v=(1, 2))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'set_type',
            'loc': ('v',),
            'msg': 'Input should be a valid set',
            'input': (1, 2),
        }
    ]
    # Strict in each set item
    with pytest.raises(ValidationError) as exc_info:
        StrictModel(v={'1', 2})
    err_info = exc_info.value.errors(include_url=False)
    # Sets are not ordered
    del err_info[0]['loc']
    assert err_info == [{'type': 'int_type', 'msg': 'Input should be a valid integer', 'input': '1'}]


def test_frozenset_strict() -> None:
    class LaxModel(BaseModel):
        v: FrozenSet[int]
        model_config = ConfigDict(strict=False)

    class StrictModel(BaseModel):
        v: FrozenSet[int]
        model_config = ConfigDict(strict=True)

    assert LaxModel(v=(1, 2)).v == frozenset((1, 2))
    assert LaxModel(v=('1', 2)).v == frozenset((1, 2))
    # Tuple should be rejected
    with pytest.raises(ValidationError) as exc_info:
        StrictModel(v=(1, 2))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'frozen_set_type',
            'loc': ('v',),
            'msg': 'Input should be a valid frozenset',
            'input': (1, 2),
        }
    ]
    # Strict in each set item
    with pytest.raises(ValidationError) as exc_info:
        StrictModel(v=frozenset(('1', 2)))
    err_info = exc_info.value.errors(include_url=False)
    # Sets are not ordered
    del err_info[0]['loc']
    assert err_info == [{'type': 'int_type', 'msg': 'Input should be a valid integer', 'input': '1'}]


def test_tuple_strict() -> None:
    class LaxModel(BaseModel):
        v: Tuple[int, int]
        model_config = ConfigDict(strict=False)

    class StrictModel(BaseModel):
        v: Tuple[int, int]
        model_config = ConfigDict(strict=True)

    assert LaxModel(v=[1, 2]).v == (1, 2)
    assert LaxModel(v=['1', 2]).v == (1, 2)
    # List should be rejected
    with pytest.raises(ValidationError) as exc_info:
        StrictModel(v=[1, 2])
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'tuple_type', 'loc': ('v',), 'msg': 'Input should be a valid tuple', 'input': [1, 2]}
    ]
    # Strict in each list item
    with pytest.raises(ValidationError) as exc_info:
        StrictModel(v=('1', 2))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'int_type', 'loc': ('v', 0), 'msg': 'Input should be a valid integer', 'input': '1'}
    ]


def test_int_validation():
    class Model(BaseModel):
        a: PositiveInt = None
        b: NegativeInt = None
        c: NonNegativeInt = None
        d: NonPositiveInt = None
        e: conint(gt=4, lt=10) = None
        f: conint(ge=0, le=10) = None
        g: conint(multiple_of=5) = None

    m = Model(a=5, b=-5, c=0, d=0, e=5, f=0, g=25)
    assert m.model_dump() == {'a': 5, 'b': -5, 'c': 0, 'd': 0, 'e': 5, 'f': 0, 'g': 25}

    with pytest.raises(ValidationError) as exc_info:
        Model(a=-5, b=5, c=-5, d=5, e=-5, f=11, g=42)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'greater_than',
            'loc': ('a',),
            'msg': 'Input should be greater than 0',
            'input': -5,
            'ctx': {'gt': 0},
        },
        {
            'type': 'less_than',
            'loc': ('b',),
            'msg': 'Input should be less than 0',
            'input': 5,
            'ctx': {'lt': 0},
        },
        {
            'type': 'greater_than_equal',
            'loc': ('c',),
            'msg': 'Input should be greater than or equal to 0',
            'input': -5,
            'ctx': {'ge': 0},
        },
        {
            'type': 'less_than_equal',
            'loc': ('d',),
            'msg': 'Input should be less than or equal to 0',
            'input': 5,
            'ctx': {'le': 0},
        },
        {
            'type': 'greater_than',
            'loc': ('e',),
            'msg': 'Input should be greater than 4',
            'input': -5,
            'ctx': {'gt': 4},
        },
        {
            'type': 'less_than_equal',
            'loc': ('f',),
            'msg': 'Input should be less than or equal to 10',
            'input': 11,
            'ctx': {'le': 10},
        },
        {
            'type': 'multiple_of',
            'loc': ('g',),
            'msg': 'Input should be a multiple of 5',
            'input': 42,
            'ctx': {'multiple_of': 5},
        },
    ]


def test_float_validation():
    class Model(BaseModel):
        a: PositiveFloat = None
        b: NegativeFloat = None
        c: NonNegativeFloat = None
        d: NonPositiveFloat = None
        e: confloat(gt=4, lt=12.2) = None
        f: confloat(ge=0, le=9.9) = None
        g: confloat(multiple_of=0.5) = None
        h: confloat(allow_inf_nan=False) = None

    m = Model(a=5.1, b=-5.2, c=0, d=0, e=5.3, f=9.9, g=2.5, h=42)
    assert m.model_dump() == {'a': 5.1, 'b': -5.2, 'c': 0, 'd': 0, 'e': 5.3, 'f': 9.9, 'g': 2.5, 'h': 42}

    assert Model(a=float('inf')).a == float('inf')
    assert Model(b=float('-inf')).b == float('-inf')

    with pytest.raises(ValidationError) as exc_info:
        Model(a=-5.1, b=5.2, c=-5.1, d=5.1, e=-5.3, f=9.91, g=4.2, h=float('nan'))
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'greater_than',
            'loc': ('a',),
            'msg': 'Input should be greater than 0',
            'input': -5.1,
            'ctx': {'gt': 0.0},
        },
        {
            'type': 'less_than',
            'loc': ('b',),
            'msg': 'Input should be less than 0',
            'input': 5.2,
            'ctx': {'lt': 0.0},
        },
        {
            'type': 'greater_than_equal',
            'loc': ('c',),
            'msg': 'Input should be greater than or equal to 0',
            'input': -5.1,
            'ctx': {'ge': 0.0},
        },
        {
            'type': 'less_than_equal',
            'loc': ('d',),
            'msg': 'Input should be less than or equal to 0',
            'input': 5.1,
            'ctx': {'le': 0.0},
        },
        {
            'type': 'greater_than',
            'loc': ('e',),
            'msg': 'Input should be greater than 4',
            'input': -5.3,
            'ctx': {'gt': 4.0},
        },
        {
            'type': 'less_than_equal',
            'loc': ('f',),
            'msg': 'Input should be less than or equal to 9.9',
            'input': 9.91,
            'ctx': {'le': 9.9},
        },
        {
            'type': 'multiple_of',
            'loc': ('g',),
            'msg': 'Input should be a multiple of 0.5',
            'input': 4.2,
            'ctx': {'multiple_of': 0.5},
        },
        {
            'type': 'finite_number',
            'loc': ('h',),
            'msg': 'Input should be a finite number',
            'input': HasRepr('nan'),
        },
    ]


def test_infinite_float_validation():
    class Model(BaseModel):
        a: float = None

    assert Model(a=float('inf')).a == float('inf')
    assert Model(a=float('-inf')).a == float('-inf')
    assert math.isnan(Model(a=float('nan')).a)


@pytest.mark.parametrize(
    ('ser_json_inf_nan', 'input', 'output', 'python_roundtrip'),
    (
        ('null', float('inf'), 'null', None),
        ('null', float('-inf'), 'null', None),
        ('null', float('nan'), 'null', None),
        ('constants', float('inf'), 'Infinity', float('inf')),
        ('constants', float('-inf'), '-Infinity', float('-inf')),
        ('constants', float('nan'), 'NaN', IsFloatNan),
    ),
)
def test_infinite_float_json_serialization(ser_json_inf_nan, input, output, python_roundtrip):
    class Model(BaseModel):
        model_config = ConfigDict(ser_json_inf_nan=ser_json_inf_nan)
        a: float

    json_string = Model(a=input).model_dump_json()
    assert json_string == f'{{"a":{output}}}'
    assert json.loads(json_string) == {'a': python_roundtrip}


@pytest.mark.parametrize('value', [float('inf'), float('-inf'), float('nan')])
def test_finite_float_validation_error(value):
    class Model(BaseModel):
        a: FiniteFloat

    assert Model(a=42).a == 42
    with pytest.raises(ValidationError) as exc_info:
        Model(a=value)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'finite_number',
            'loc': ('a',),
            'msg': 'Input should be a finite number',
            'input': HasRepr(repr(value)),
        }
    ]


def test_finite_float_config():
    class Model(BaseModel):
        a: float
        model_config = ConfigDict(allow_inf_nan=False)

    assert Model(a=42).a == 42
    with pytest.raises(ValidationError) as exc_info:
        Model(a=float('nan'))
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'finite_number',
            'loc': ('a',),
            'msg': 'Input should be a finite number',
            'input': HasRepr('nan'),
        }
    ]


def test_strict_bytes():
    class Model(BaseModel):
        v: StrictBytes

    assert Model(v=b'foobar').v == b'foobar'

    with pytest.raises(ValidationError, match='Input should be a valid bytes'):
        Model(v=bytearray('foobar', 'utf-8'))
    with pytest.raises(ValidationError, match='Input should be a valid bytes'):
        Model(v='foostring')
    with pytest.raises(ValidationError, match='Input should be a valid bytes'):
        Model(v=42)
    with pytest.raises(ValidationError, match='Input should be a valid bytes'):
        Model(v=0.42)


def test_strict_bytes_max_length():
    class Model(BaseModel):
        u: StrictBytes = Field(max_length=5)

    assert Model(u=b'foo').u == b'foo'

    with pytest.raises(ValidationError, match=r'Input should be a valid bytes \[type=bytes_type'):
        Model(u=123)
    with pytest.raises(ValidationError, match=r'Data should have at most 5 bytes \[type=bytes_too_long,'):
        Model(u=b'1234567')


def test_strict_str():
    class FruitEnum(str, Enum):
        """A subclass of a string"""

        pear = 'pear'
        banana = 'banana'

    class Model(BaseModel):
        v: StrictStr

    assert Model(v='foobar').v == 'foobar'

    assert Model.model_validate({'v': FruitEnum.banana}) == Model.model_construct(v=FruitEnum.banana)

    with pytest.raises(ValidationError, match='Input should be a valid string'):
        Model(v=123)
    with pytest.raises(ValidationError, match='Input should be a valid string'):
        Model(v=b'foobar')


def test_strict_str_max_length():
    class Model(BaseModel):
        u: StrictStr = Field(max_length=5)

    assert Model(u='foo').u == 'foo'

    with pytest.raises(ValidationError, match='Input should be a valid string'):
        Model(u=123)
    with pytest.raises(ValidationError, match=r'String should have at most 5 characters \[type=string_too_long,'):
        Model(u='1234567')


def test_strict_bool():
    class Model(BaseModel):
        v: StrictBool

    assert Model(v=True).v is True
    assert Model(v=False).v is False

    with pytest.raises(ValidationError):
        Model(v=1)
    with pytest.raises(ValidationError):
        Model(v='1')
    with pytest.raises(ValidationError):
        Model(v=b'1')


def test_strict_int():
    class Model(BaseModel):
        v: StrictInt

    assert Model(v=123456).v == 123456

    with pytest.raises(ValidationError, match=r'Input should be a valid integer \[type=int_type,'):
        Model(v='123456')
    with pytest.raises(ValidationError, match=r'Input should be a valid integer \[type=int_type,'):
        Model(v=3.14159)
    with pytest.raises(ValidationError, match=r'Input should be a valid integer \[type=int_type,'):
        Model(v=True)


@pytest.mark.parametrize(
    ('input', 'expected_json'),
    (
        (9_223_372_036_854_775_807, b'9223372036854775807'),
        (-9_223_372_036_854_775_807, b'-9223372036854775807'),
        (1433352099889938534014333520998899385340, b'1433352099889938534014333520998899385340'),
        (-1433352099889938534014333520998899385340, b'-1433352099889938534014333520998899385340'),
    ),
)
def test_big_int_json(input, expected_json):
    v = TypeAdapter(int)
    dumped = v.dump_json(input)
    assert dumped == expected_json
    assert v.validate_json(dumped) == input


def test_strict_float():
    class Model(BaseModel):
        v: StrictFloat

    assert Model(v=3.14159).v == 3.14159
    assert Model(v=123456).v == 123456

    with pytest.raises(ValidationError, match=r'Input should be a valid number \[type=float_type,'):
        Model(v='3.14159')
    with pytest.raises(ValidationError, match=r'Input should be a valid number \[type=float_type,'):
        Model(v=True)


def test_bool_unhashable_fails():
    class Model(BaseModel):
        v: bool

    with pytest.raises(ValidationError) as exc_info:
        Model(v={})
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'bool_type', 'loc': ('v',), 'msg': 'Input should be a valid boolean', 'input': {}}
    ]


def test_uuid_error():
    v = TypeAdapter(UUID)
    valid = UUID('49fdfa1d856d4003a83e4b9236532ec6')

    # sanity check
    assert v.validate_python(valid) == valid
    assert v.validate_python(valid.hex) == valid

    with pytest.raises(ValidationError) as exc_info:
        v.validate_python('ebcdab58-6eb8-46fb-a190-d07a3')
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'loc': (),
            'msg': 'Input should be a valid UUID, invalid group length in group 4: expected 12, found 5',
            'input': 'ebcdab58-6eb8-46fb-a190-d07a3',
            'ctx': {'error': 'invalid group length in group 4: expected 12, found 5'},
            'type': 'uuid_parsing',
        }
    ]

    not_a_valid_input_type = object()
    with pytest.raises(ValidationError) as exc_info:
        v.validate_python(not_a_valid_input_type)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'input': not_a_valid_input_type,
            'loc': (),
            'msg': 'UUID input should be a string, bytes or UUID object',
            'type': 'uuid_type',
        },
    ]

    with pytest.raises(ValidationError) as exc_info:
        v.validate_python(valid.hex, strict=True)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'is_instance_of',
            'loc': (),
            'msg': 'Input should be an instance of UUID',
            'input': '49fdfa1d856d4003a83e4b9236532ec6',
            'ctx': {'class': 'UUID'},
        }
    ]

    assert v.validate_json(json.dumps(valid.hex), strict=True) == valid


def test_uuid_json():
    class Model(BaseModel):
        v: UUID
        v1: UUID1
        v3: UUID3
        v4: UUID4

    m = Model(v=uuid.uuid4(), v1=uuid.uuid1(), v3=uuid.uuid3(uuid.NAMESPACE_DNS, 'python.org'), v4=uuid.uuid4())
    assert m.model_dump_json() == f'{{"v":"{m.v}","v1":"{m.v1}","v3":"{m.v3}","v4":"{m.v4}"}}'


def test_uuid_validation():
    class UUIDModel(BaseModel):
        a: UUID1
        b: UUID3
        c: UUID4
        d: UUID5
        e: UUID

    a = uuid.uuid1()
    b = uuid.uuid3(uuid.NAMESPACE_DNS, 'python.org')
    c = uuid.uuid4()
    d = uuid.uuid5(uuid.NAMESPACE_DNS, 'python.org')
    e = UUID('{00000000-7fff-4000-7fff-000000000000}')

    m = UUIDModel(a=a, b=b, c=c, d=d, e=e)
    assert m.model_dump() == {'a': a, 'b': b, 'c': c, 'd': d, 'e': e}

    with pytest.raises(ValidationError) as exc_info:
        UUIDModel(a=d, b=c, c=b, d=a, e=e)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'uuid_version',
            'loc': ('a',),
            'msg': 'UUID version 1 expected',
            'input': d,
            'ctx': {'expected_version': 1},
        },
        {
            'type': 'uuid_version',
            'loc': ('b',),
            'msg': 'UUID version 3 expected',
            'input': c,
            'ctx': {'expected_version': 3},
        },
        {
            'type': 'uuid_version',
            'loc': ('c',),
            'msg': 'UUID version 4 expected',
            'input': b,
            'ctx': {'expected_version': 4},
        },
        {
            'type': 'uuid_version',
            'loc': ('d',),
            'msg': 'UUID version 5 expected',
            'input': a,
            'ctx': {'expected_version': 5},
        },
    ]

    with pytest.raises(ValidationError) as exc_info:
        UUIDModel(a=e, b=e, c=e, d=e, e=e)
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'uuid_version',
            'loc': ('a',),
            'msg': 'UUID version 1 expected',
            'input': e,
            'ctx': {'expected_version': 1},
        },
        {
            'type': 'uuid_version',
            'loc': ('b',),
            'msg': 'UUID version 3 expected',
            'input': e,
            'ctx': {'expected_version': 3},
        },
        {
            'type': 'uuid_version',
            'loc': ('c',),
            'msg': 'UUID version 4 expected',
            'input': e,
            'ctx': {'expected_version': 4},
        },
        {
            'type': 'uuid_version',
            'loc': ('d',),
            'msg': 'UUID version 5 expected',
            'input': e,
            'ctx': {'expected_version': 5},
        },
    ]


def test_uuid_strict() -> None:
    class StrictByConfig(BaseModel):
        a: UUID1
        b: UUID3
        c: UUID4
        d: UUID5
        e: uuid.UUID

        model_config = ConfigDict(strict=True)

    class StrictByField(BaseModel):
        a: UUID1 = Field(strict=True)
        b: UUID3 = Field(strict=True)
        c: UUID4 = Field(strict=True)
        d: UUID5 = Field(strict=True)
        e: uuid.UUID = Field(strict=True)

    a = uuid.UUID('7fb48116-ca6b-11ed-a439-3274d3adddac')  # uuid1
    b = uuid.UUID('6fa459ea-ee8a-3ca4-894e-db77e160355e')  # uuid3
    c = uuid.UUID('260d1600-3680-4f4f-a968-f6fa622ffd8d')  # uuid4
    d = uuid.UUID('886313e1-3b8a-5372-9b90-0c9aee199e5d')  # uuid5
    e = uuid.UUID('7fb48116-ca6b-11ed-a439-3274d3adddac')  # any uuid

    strict_errors = [
        {
            'type': 'is_instance_of',
            'loc': ('a',),
            'msg': 'Input should be an instance of UUID',
            'input': '7fb48116-ca6b-11ed-a439-3274d3adddac',
            'ctx': {'class': 'UUID'},
        },
        {
            'type': 'is_instance_of',
            'loc': ('b',),
            'msg': 'Input should be an instance of UUID',
            'input': '6fa459ea-ee8a-3ca4-894e-db77e160355e',
            'ctx': {'class': 'UUID'},
        },
        {
            'type':
'is_instance_of', 'loc': ('c',), 'msg': 'Input should be an instance of UUID', 'input': '260d1600-3680-4f4f-a968-f6fa622ffd8d', 'ctx': {'class': 'UUID'}, }, { 'type': 'is_instance_of', 'loc': ('d',), 'msg': 'Input should be an instance of UUID', 'input': '886313e1-3b8a-5372-9b90-0c9aee199e5d', 'ctx': {'class': 'UUID'}, }, { 'type': 'is_instance_of', 'loc': ('e',), 'msg': 'Input should be an instance of UUID', 'input': '7fb48116-ca6b-11ed-a439-3274d3adddac', 'ctx': {'class': 'UUID'}, }, ] for model in [StrictByConfig, StrictByField]: with pytest.raises(ValidationError) as exc_info: model(a=str(a), b=str(b), c=str(c), d=str(d), e=str(e)) assert exc_info.value.errors(include_url=False) == strict_errors m = model(a=a, b=b, c=c, d=d, e=e) assert isinstance(m.a, type(a)) and m.a == a assert isinstance(m.b, type(b)) and m.b == b assert isinstance(m.c, type(c)) and m.c == c assert isinstance(m.d, type(d)) and m.d == d assert isinstance(m.e, type(e)) and m.e == e @pytest.mark.parametrize( 'enabled,str_check,result_str_check', [ (True, ' 123 ', '123'), (True, ' 123\t\n', '123'), (False, ' 123 ', ' 123 '), ], ) def test_str_strip_whitespace(enabled, str_check, result_str_check): class Model(BaseModel): str_check: str model_config = ConfigDict(str_strip_whitespace=enabled) m = Model(str_check=str_check) assert m.str_check == result_str_check @pytest.mark.parametrize( 'enabled,str_check,result_str_check', [(True, 'ABCDefG', 'ABCDEFG'), (False, 'ABCDefG', 'ABCDefG')], ) def test_str_to_upper(enabled, str_check, result_str_check): class Model(BaseModel): str_check: str model_config = ConfigDict(str_to_upper=enabled) m = Model(str_check=str_check) assert m.str_check == result_str_check @pytest.mark.parametrize( 'enabled,str_check,result_str_check', [(True, 'ABCDefG', 'abcdefg'), (False, 'ABCDefG', 'ABCDefG')], ) def test_str_to_lower(enabled, str_check, result_str_check): class Model(BaseModel): str_check: str model_config = ConfigDict(str_to_lower=enabled) m = 
Model(str_check=str_check) assert m.str_check == result_str_check pos_int_values = 'Inf', '+Inf', 'Infinity', '+Infinity' neg_int_values = '-Inf', '-Infinity' nan_values = 'NaN', '-NaN', '+NaN', 'sNaN', '-sNaN', '+sNaN' non_finite_values = nan_values + pos_int_values + neg_int_values # dirty_equals.AnyThing() doesn't work with Decimal on PyPy, hence this hack ANY_THING = object() @pytest.mark.parametrize( 'type_args,value,result', [ (dict(gt=Decimal('42.24')), Decimal('43'), Decimal('43')), ( dict(gt=Decimal('42.24')), Decimal('42'), [ { 'type': 'greater_than', 'loc': ('foo',), 'msg': 'Input should be greater than 42.24', 'input': Decimal('42'), 'ctx': {'gt': Decimal('42.24')}, } ], ), (dict(lt=Decimal('42.24')), Decimal('42'), Decimal('42')), ( dict(lt=Decimal('42.24')), Decimal('43'), [ { 'type': 'less_than', 'loc': ('foo',), 'msg': 'Input should be less than 42.24', 'input': Decimal('43'), 'ctx': { 'lt': Decimal('42.24'), }, }, ], ), (dict(ge=Decimal('42.24')), Decimal('43'), Decimal('43')), (dict(ge=Decimal('42.24')), Decimal('42.24'), Decimal('42.24')), ( dict(ge=Decimal('42.24')), Decimal('42'), [ { 'type': 'greater_than_equal', 'loc': ('foo',), 'msg': 'Input should be greater than or equal to 42.24', 'input': Decimal('42'), 'ctx': { 'ge': Decimal('42.24'), }, } ], ), (dict(le=Decimal('42.24')), Decimal('42'), Decimal('42')), (dict(le=Decimal('42.24')), Decimal('42.24'), Decimal('42.24')), ( dict(le=Decimal('42.24')), Decimal('43'), [ { 'type': 'less_than_equal', 'loc': ('foo',), 'msg': 'Input should be less than or equal to 42.24', 'input': Decimal('43'), 'ctx': { 'le': Decimal('42.24'), }, } ], ), (dict(max_digits=2, decimal_places=2), Decimal('0.99'), Decimal('0.99')), pytest.param( dict(max_digits=2, decimal_places=1), Decimal('0.99'), [ { 'type': 'decimal_max_places', 'loc': ('foo',), 'msg': 'Decimal input should have no more than 1 decimal place', 'input': Decimal('0.99'), 'ctx': { 'decimal_places': 1, }, } ], ), ( dict(max_digits=3, decimal_places=1), 
Decimal('999'), [ { 'loc': ('foo',), 'msg': 'Decimal input should have no more than 2 digits before the decimal point', 'type': 'decimal_whole_digits', 'input': Decimal('999'), 'ctx': {'whole_digits': 2}, } ], ), (dict(max_digits=4, decimal_places=1), Decimal('999'), Decimal('999')), (dict(max_digits=20, decimal_places=2), Decimal('742403889818000000'), Decimal('742403889818000000')), (dict(max_digits=20, decimal_places=2), Decimal('7.42403889818E+17'), Decimal('7.42403889818E+17')), (dict(max_digits=6, decimal_places=2), Decimal('000000000001111.700000'), Decimal('000000000001111.700000')), ( dict(max_digits=6, decimal_places=2), Decimal('0000000000011111.700000'), [ { 'type': 'decimal_whole_digits', 'loc': ('foo',), 'msg': 'Decimal input should have no more than 4 digits before the decimal point', 'input': Decimal('11111.700000'), 'ctx': {'whole_digits': 4}, } ], ), ( dict(max_digits=20, decimal_places=2), Decimal('7424742403889818000000'), [ { 'type': 'decimal_max_digits', 'loc': ('foo',), 'msg': 'Decimal input should have no more than 20 digits in total', 'input': Decimal('7424742403889818000000'), 'ctx': { 'max_digits': 20, }, }, ], ), (dict(max_digits=5, decimal_places=2), Decimal('7304E-1'), Decimal('7304E-1')), ( dict(max_digits=5, decimal_places=2), Decimal('7304E-3'), [ { 'type': 'decimal_max_places', 'loc': ('foo',), 'msg': 'Decimal input should have no more than 2 decimal places', 'input': Decimal('7.304'), 'ctx': {'decimal_places': 2}, } ], ), (dict(max_digits=5, decimal_places=5), Decimal('70E-5'), Decimal('70E-5')), ( dict(max_digits=4, decimal_places=4), Decimal('70E-6'), [ { 'loc': ('foo',), 'msg': 'Decimal input should have no more than 4 digits in total', 'type': 'decimal_max_digits', 'input': Decimal('0.00007'), 'ctx': {'max_digits': 4}, } ], ), *[ ( dict(decimal_places=2, max_digits=10, allow_inf_nan=False), value, [ { 'loc': ('foo',), 'msg': 'Input should be a finite number', 'type': 'finite_number', 'input': value, } ], ) for value in 
non_finite_values ], *[ ( dict(decimal_places=2, max_digits=10, allow_inf_nan=False), Decimal(value), [ { 'loc': ('foo',), 'msg': 'Input should be a finite number', 'type': 'finite_number', 'input': ANY_THING, } ], ) for value in non_finite_values ], ( dict(multiple_of=Decimal('5')), Decimal('42'), [ { 'type': 'multiple_of', 'loc': ('foo',), 'msg': 'Input should be a multiple of 5', 'input': Decimal('42'), 'ctx': {'multiple_of': Decimal('5')}, } ], ), ], ) @pytest.mark.parametrize('mode', ['Field', 'condecimal', 'optional']) def test_decimal_validation(mode, type_args, value, result): if mode == 'Field': class Model(BaseModel): foo: Decimal = Field(**type_args) elif mode == 'optional': class Model(BaseModel): foo: Optional[Decimal] = Field(**type_args) else: class Model(BaseModel): foo: condecimal(**type_args) if not isinstance(result, Decimal): with pytest.raises(ValidationError) as exc_info: m = Model(foo=value) print(f'unexpected result: {m!r}') # debug(exc_info.value.errors(include_url=False)) # dirty_equals.AnyThing() doesn't work with Decimal on PyPy, hence this hack errors = exc_info.value.errors(include_url=False) if result[0].get('input') is ANY_THING: for e in errors: e['input'] = ANY_THING assert errors == result # assert exc_info.value.json().startswith('[') else: assert Model(foo=value).foo == result @pytest.fixture(scope='module', name='AllowInfModel') def fix_allow_inf_model(): class Model(BaseModel): v: condecimal(allow_inf_nan=True) return Model @pytest.mark.parametrize( 'value,result', [ (Decimal('42'), 'unchanged'), *[(v, 'is_nan') for v in nan_values], *[(v, 'is_pos_inf') for v in pos_int_values], *[(v, 'is_neg_inf') for v in neg_int_values], ], ) def test_decimal_not_finite(value, result, AllowInfModel): m = AllowInfModel(v=value) if result == 'unchanged': assert m.v == value elif result == 'is_nan': assert m.v.is_nan(), m.v elif result == 'is_pos_inf': assert m.v.is_infinite() and m.v > 0, m.v else: assert result == 'is_neg_inf' assert 
m.v.is_infinite() and m.v < 0, m.v def test_decimal_invalid(): with pytest.raises(SchemaError, match='allow_inf_nan=True cannot be used with max_digits or decimal_places'): class Model(BaseModel): v: condecimal(allow_inf_nan=True, max_digits=4) @pytest.mark.parametrize('value,result', (('/test/path', Path('/test/path')), (Path('/test/path'), Path('/test/path')))) def test_path_validation_success(value, result): class Model(BaseModel): foo: Path assert Model(foo=value).foo == result assert Model.model_validate_json(json.dumps({'foo': str(value)})).foo == result def test_path_validation_constrained(): ta = TypeAdapter(Annotated[Path, Field(min_length=9, max_length=20)]) with pytest.raises(ValidationError): ta.validate_python('/short') with pytest.raises(ValidationError): ta.validate_python('/' + 'long' * 100) assert ta.validate_python('/just/right/enough') == Path('/just/right/enough') def test_path_like(): class Model(BaseModel): foo: os.PathLike assert Model(foo='/foo/bar').foo == Path('/foo/bar') assert Model(foo=Path('/foo/bar')).foo == Path('/foo/bar') assert Model.model_validate_json('{"foo": "abc"}').foo == Path('abc') # insert_assert(Model.model_json_schema()) assert Model.model_json_schema() == { 'type': 'object', 'properties': {'foo': {'type': 'string', 'format': 'path', 'title': 'Foo'}}, 'required': ['foo'], 'title': 'Model', } @pytest.mark.skipif(sys.version_info < (3, 9), reason='requires python 3.9 or higher to parametrize os.PathLike') def test_path_like_extra_subtype(): class Model(BaseModel): str_type: os.PathLike[str] byte_type: os.PathLike[bytes] any_type: os.PathLike[Any] m = Model( str_type='/foo/bar', byte_type=b'/foo/bar', any_type='/foo/bar', ) assert m.str_type == Path('/foo/bar') assert m.byte_type == Path('/foo/bar') assert m.any_type == Path('/foo/bar') assert Model.model_json_schema() == { 'properties': { 'str_type': {'format': 'path', 'title': 'Str Type', 'type': 'string'}, 'byte_type': {'format': 'path', 'title': 'Byte Type', 'type': 
'string'}, 'any_type': {'format': 'path', 'title': 'Any Type', 'type': 'string'}, }, 'required': ['str_type', 'byte_type', 'any_type'], 'title': 'Model', 'type': 'object', } with pytest.raises(ValidationError) as exc_info: Model( str_type=b'/foo/bar', byte_type='/foo/bar', any_type=111, ) assert exc_info.value.errors(include_url=False) == [ { 'type': 'path_type', 'loc': ('str_type',), 'msg': "Input is not a valid path for <class 'str'>", 'input': b'/foo/bar', }, { 'type': 'path_type', 'loc': ('byte_type',), 'msg': "Input is not a valid path for <class 'bytes'>", 'input': '/foo/bar', }, { 'type': 'path_type', 'loc': ('any_type',), 'msg': "Input is not a valid path for <class 'str'>", 'input': 111, }, ] def test_path_like_strict(): class Model(BaseModel): model_config = dict(strict=True) foo: os.PathLike with pytest.raises(ValidationError, match='Input should be an instance of PathLike'): Model(foo='/foo/bar') assert Model(foo=Path('/foo/bar')).foo == Path('/foo/bar') assert Model.model_validate_json('{"foo": "abc"}').foo == Path('abc') # insert_assert(Model.model_json_schema()) assert Model.model_json_schema() == { 'type': 'object', 'properties': {'foo': {'type': 'string', 'format': 'path', 'title': 'Foo'}}, 'required': ['foo'], 'title': 'Model', } def test_path_strict_override(): class Model(BaseModel): model_config = ConfigDict(strict=True) x: Path = Field(strict=False) m = Model(x='/foo/bar') assert m.x == Path('/foo/bar') def test_path_validation_fails(): class Model(BaseModel): foo: Path with pytest.raises(ValidationError) as exc_info: Model(foo=123) # insert_assert(exc_info.value.errors(include_url=False))[0]['type'] assert exc_info.value.errors(include_url=False)[0]['type'] == 'path_type' with pytest.raises(ValidationError) as exc_info: Model(foo=None) # insert_assert(exc_info.value.errors(include_url=False))[0]['type'] assert exc_info.value.errors(include_url=False)[0]['type'] == 'path_type' def test_path_validation_strict(): class Model(BaseModel): foo: Path model_config = ConfigDict(strict=True)
with pytest.raises(ValidationError) as exc_info: Model(foo='/test/path') # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'is_instance_of', 'loc': ('foo',), 'msg': 'Input should be an instance of Path', 'input': '/test/path', 'ctx': {'class': 'Path'}, } ] assert Model(foo=Path('/test/path')).foo == Path('/test/path') @pytest.mark.parametrize( 'value,result', (('tests/test_types.py', Path('tests/test_types.py')), (Path('tests/test_types.py'), Path('tests/test_types.py'))), ) def test_file_path_validation_success(value, result): class Model(BaseModel): foo: FilePath assert Model(foo=value).foo == result @pytest.mark.parametrize('value', ['nonexistentfile', Path('nonexistentfile'), 'tests', Path('tests')]) def test_file_path_validation_fails(value): class Model(BaseModel): foo: FilePath with pytest.raises(ValidationError) as exc_info: Model(foo=value) assert exc_info.value.errors(include_url=False) == [ { 'type': 'path_not_file', 'loc': ('foo',), 'msg': 'Path does not point to a file', 'input': value, } ] @pytest.mark.parametrize('value,result', (('tests', Path('tests')), (Path('tests'), Path('tests')))) def test_directory_path_validation_success(value, result): class Model(BaseModel): foo: DirectoryPath assert Model(foo=value).foo == result @pytest.mark.parametrize( 'value', ['nonexistentdirectory', Path('nonexistentdirectory'), 'tests/test_t.py', Path('tests/test_ypestypes.py')] ) def test_directory_path_validation_fails(value): class Model(BaseModel): foo: DirectoryPath with pytest.raises(ValidationError) as exc_info: Model(foo=value) assert exc_info.value.errors(include_url=False) == [ { 'type': 'path_not_directory', 'loc': ('foo',), 'msg': 'Path does not point to a directory', 'input': value, } ] @pytest.mark.parametrize('value', ('tests/test_types.py', Path('tests/test_types.py'))) def test_new_path_validation_path_already_exists(value): class Model(BaseModel): foo: NewPath with 
pytest.raises(ValidationError) as exc_info: Model(foo=value) assert exc_info.value.errors(include_url=False) == [ { 'type': 'path_exists', 'loc': ('foo',), 'msg': 'Path already exists', 'input': value, } ] @pytest.mark.skipif( not sys.platform.startswith('linux'), reason='Test only works for linux systems. Windows excluded (unsupported). Mac excluded due to CI issues.', ) def test_socket_exists(tmp_path): import socket # Working around path length limits by reducing character count where possible. target = tmp_path / 's' with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as sock: sock.bind(str(target)) class Model(BaseModel): path: SocketPath assert Model(path=target).path == target def test_socket_not_exists(tmp_path): target = tmp_path / 's' class Model(BaseModel): path: SocketPath with pytest.raises(ValidationError) as exc_info: Model(path=target) assert exc_info.value.errors(include_url=False) == [ { 'type': 'path_not_socket', 'loc': ('path',), 'msg': 'Path does not point to a socket', 'input': target, } ] @pytest.mark.parametrize('value', ('/nonexistentdir/foo.py', Path('/nonexistentdir/foo.py'))) def test_new_path_validation_parent_does_not_exist(value): class Model(BaseModel): foo: NewPath with pytest.raises(ValidationError) as exc_info: Model(foo=value) assert exc_info.value.errors(include_url=False) == [ { 'type': 'parent_does_not_exist', 'loc': ('foo',), 'msg': 'Parent directory does not exist', 'input': value, } ] @pytest.mark.parametrize( 'value,result', (('tests/foo.py', Path('tests/foo.py')), (Path('tests/foo.py'), Path('tests/foo.py'))) ) def test_new_path_validation_success(value, result): class Model(BaseModel): foo: NewPath assert Model(foo=value).foo == result def test_number_gt(): class Model(BaseModel): a: conint(gt=-1) = 0 assert Model(a=0).model_dump() == {'a': 0} with pytest.raises(ValidationError) as exc_info: Model(a=-1) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 
'type': 'greater_than', 'loc': ('a',), 'msg': 'Input should be greater than -1', 'input': -1, 'ctx': {'gt': -1}, } ] def test_number_ge(): class Model(BaseModel): a: conint(ge=0) = 0 assert Model(a=0).model_dump() == {'a': 0} with pytest.raises(ValidationError) as exc_info: Model(a=-1) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'greater_than_equal', 'loc': ('a',), 'msg': 'Input should be greater than or equal to 0', 'input': -1, 'ctx': {'ge': 0}, } ] def test_number_lt(): class Model(BaseModel): a: conint(lt=5) = 0 assert Model(a=4).model_dump() == {'a': 4} with pytest.raises(ValidationError) as exc_info: Model(a=5) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'less_than', 'loc': ('a',), 'msg': 'Input should be less than 5', 'input': 5, 'ctx': {'lt': 5}, } ] def test_number_le(): class Model(BaseModel): a: conint(le=5) = 0 assert Model(a=5).model_dump() == {'a': 5} with pytest.raises(ValidationError) as exc_info: Model(a=6) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'less_than_equal', 'loc': ('a',), 'msg': 'Input should be less than or equal to 5', 'input': 6, 'ctx': {'le': 5}, } ] @pytest.mark.parametrize('value', (10, 100, 20)) def test_number_multiple_of_int_valid(value): class Model(BaseModel): a: conint(multiple_of=5) assert Model(a=value).model_dump() == {'a': value} @pytest.mark.parametrize('value', [1337, 23, 6, 14]) def test_number_multiple_of_int_invalid(value): class Model(BaseModel): a: conint(multiple_of=5) with pytest.raises(ValidationError) as exc_info: Model(a=value) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'multiple_of', 'loc': ('a',), 'msg': 'Input should be a multiple of 5', 'input': value, 'ctx': {'multiple_of': 5}, } ] 
@pytest.mark.parametrize('value', [0.2, 0.3, 0.4, 0.5, 1]) def test_number_multiple_of_float_valid(value): class Model(BaseModel): a: confloat(multiple_of=0.1) assert Model(a=value).model_dump() == {'a': value} @pytest.mark.parametrize('value', [0.07, 1.27, 1.003]) def test_number_multiple_of_float_invalid(value): class Model(BaseModel): a: confloat(multiple_of=0.1) with pytest.raises(ValidationError) as exc_info: Model(a=value) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'multiple_of', 'loc': ('a',), 'msg': 'Input should be a multiple of 0.1', 'input': value, 'ctx': {'multiple_of': 0.1}, } ] def test_new_type_success(): a_type = NewType('a_type', int) b_type = NewType('b_type', a_type) c_type = NewType('c_type', List[int]) class Model(BaseModel): a: a_type b: b_type c: c_type m = Model(a=42, b=24, c=[1, 2, 3]) assert m.model_dump() == {'a': 42, 'b': 24, 'c': [1, 2, 3]} def test_new_type_fails(): a_type = NewType('a_type', int) b_type = NewType('b_type', a_type) c_type = NewType('c_type', List[int]) class Model(BaseModel): a: a_type b: b_type c: c_type with pytest.raises(ValidationError) as exc_info: Model(a='foo', b='bar', c=['foo']) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('a',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'foo', }, { 'type': 'int_parsing', 'loc': ('b',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'bar', }, { 'type': 'int_parsing', 'loc': ('c', 0), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'foo', }, ] def test_valid_simple_json(): class JsonModel(BaseModel): json_obj: Json obj = '{"a": 1, "b": [2, 3]}' assert JsonModel(json_obj=obj).model_dump() == {'json_obj': {'a': 1, 'b': [2, 3]}} def test_valid_simple_json_any(): class 
JsonModel(BaseModel): json_obj: Json[Any] obj = '{"a": 1, "b": [2, 3]}' assert JsonModel(json_obj=obj).model_dump() == {'json_obj': {'a': 1, 'b': [2, 3]}} @pytest.mark.parametrize('gen_type', [lambda: Json, lambda: Json[Any]]) def test_invalid_simple_json(gen_type): t = gen_type() class JsonModel(BaseModel): json_obj: t obj = '{a: 1, b: [2, 3]}' with pytest.raises(ValidationError) as exc_info: JsonModel(json_obj=obj) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'json_invalid', 'loc': ('json_obj',), 'msg': 'Invalid JSON: key must be a string at line 1 column 2', 'input': '{a: 1, b: [2, 3]}', 'ctx': {'error': 'key must be a string at line 1 column 2'}, } ] def test_valid_simple_json_bytes(): class JsonModel(BaseModel): json_obj: Json obj = b'{"a": 1, "b": [2, 3]}' assert JsonModel(json_obj=obj).model_dump() == {'json_obj': {'a': 1, 'b': [2, 3]}} def test_valid_detailed_json(): class JsonDetailedModel(BaseModel): json_obj: Json[List[int]] obj = '[1, 2, 3]' assert JsonDetailedModel(json_obj=obj).model_dump() == {'json_obj': [1, 2, 3]} obj = b'[1, 2, 3]' assert JsonDetailedModel(json_obj=obj).model_dump() == {'json_obj': [1, 2, 3]} obj = '(1, 2, 3)' with pytest.raises(ValidationError) as exc_info: JsonDetailedModel(json_obj=obj) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'json_invalid', 'loc': ('json_obj',), 'msg': 'Invalid JSON: expected value at line 1 column 1', 'input': '(1, 2, 3)', 'ctx': {'error': 'expected value at line 1 column 1'}, } ] def test_valid_model_json(): class Model(BaseModel): a: int b: List[int] class JsonDetailedModel(BaseModel): json_obj: Json[Model] obj = '{"a": 1, "b": [2, 3]}' m = JsonDetailedModel(json_obj=obj) assert isinstance(m.json_obj, Model) assert m.json_obj.a == 1 assert m.model_dump() == {'json_obj': {'a': 1, 'b': [2, 3]}} def test_invalid_model_json(): class Model(BaseModel): a: int 
b: List[int] class JsonDetailedModel(BaseModel): json_obj: Json[Model] obj = '{"a": 1, "c": [2, 3]}' with pytest.raises(ValidationError) as exc_info: JsonDetailedModel(json_obj=obj) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ {'type': 'missing', 'loc': ('json_obj', 'b'), 'msg': 'Field required', 'input': {'a': 1, 'c': [2, 3]}} ] def test_invalid_detailed_json_type_error(): class JsonDetailedModel(BaseModel): json_obj: Json[List[int]] obj = '["a", "b", "c"]' with pytest.raises(ValidationError) as exc_info: JsonDetailedModel(json_obj=obj) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('json_obj', 0), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'a', }, { 'type': 'int_parsing', 'loc': ('json_obj', 1), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'b', }, { 'type': 'int_parsing', 'loc': ('json_obj', 2), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'c', }, ] def test_json_not_str(): class JsonDetailedModel(BaseModel): json_obj: Json[List[int]] obj = 12 with pytest.raises(ValidationError) as exc_info: JsonDetailedModel(json_obj=obj) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'json_type', 'loc': ('json_obj',), 'msg': 'JSON input should be string, bytes or bytearray', 'input': 12, } ] def test_json_before_validator(): call_count = 0 class JsonModel(BaseModel): json_obj: Json[str] @field_validator('json_obj', mode='before') @classmethod def check(cls, v): assert v == '"foobar"' nonlocal call_count call_count += 1 return v assert JsonModel(json_obj='"foobar"').model_dump() == {'json_obj': 'foobar'} assert call_count == 1 def test_json_optional_simple(): class JsonOptionalModel(BaseModel): json_obj: 
Optional[Json] assert JsonOptionalModel(json_obj=None).model_dump() == {'json_obj': None} assert JsonOptionalModel(json_obj='["x", "y", "z"]').model_dump() == {'json_obj': ['x', 'y', 'z']} def test_json_optional_complex(): class JsonOptionalModel(BaseModel): json_obj: Optional[Json[List[int]]] JsonOptionalModel(json_obj=None) good = JsonOptionalModel(json_obj='[1, 2, 3]') assert good.json_obj == [1, 2, 3] with pytest.raises(ValidationError) as exc_info: JsonOptionalModel(json_obj='["i should fail"]') # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('json_obj', 0), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'i should fail', } ] def test_json_required(): class JsonRequired(BaseModel): json_obj: Json assert JsonRequired(json_obj='["x", "y", "z"]').model_dump() == {'json_obj': ['x', 'y', 'z']} with pytest.raises(ValidationError, match=r'JSON input should be string, bytes or bytearray \[type=json_type,'): JsonRequired(json_obj=None) with pytest.raises(ValidationError, match=r'Field required \[type=missing,'): JsonRequired() @pytest.mark.parametrize( ('pattern_type', 'pattern_value', 'matching_value', 'non_matching_value'), [ pytest.param(re.Pattern, r'^whatev.r\d$', 'whatever1', ' whatever1', id='re.Pattern'), pytest.param(Pattern, r'^whatev.r\d$', 'whatever1', ' whatever1', id='Pattern'), pytest.param(Pattern[str], r'^whatev.r\d$', 'whatever1', ' whatever1', id='Pattern[str]'), pytest.param(Pattern[bytes], rb'^whatev.r\d$', b'whatever1', b' whatever1', id='Pattern[bytes]'), ], ) def test_pattern(pattern_type, pattern_value, matching_value, non_matching_value): class Foobar(BaseModel): pattern: pattern_type f = Foobar(pattern=pattern_value) assert f.pattern.__class__.__name__ == 'Pattern' # check it's really a proper pattern assert f.pattern.match(matching_value) assert not f.pattern.match(non_matching_value) # Check that 
pre-compiled patterns are accepted unchanged p = re.compile(pattern_value) f2 = Foobar(pattern=p) assert f2.pattern is p assert Foobar.model_json_schema() == { 'type': 'object', 'title': 'Foobar', 'properties': {'pattern': {'type': 'string', 'format': 'regex', 'title': 'Pattern'}}, 'required': ['pattern'], } @pytest.mark.parametrize( 'use_field', [pytest.param(True, id='Field'), pytest.param(False, id='constr')], ) def test_compiled_pattern_in_field(use_field): """ https://github.com/pydantic/pydantic/issues/9052 https://github.com/pydantic/pydantic/pull/9053 """ pattern_value = r'^whatev.r\d$' field_pattern = re.compile(pattern_value) if use_field: class Foobar(BaseModel): str_regex: str = Field(pattern=field_pattern) else: class Foobar(BaseModel): str_regex: constr(pattern=field_pattern) = ... field_general_metadata = Foobar.model_fields['str_regex'].metadata assert len(field_general_metadata) == 1 field_metadata_pattern = field_general_metadata[0].pattern assert field_metadata_pattern == field_pattern assert isinstance(field_metadata_pattern, re.Pattern) matching_value = 'whatever1' f = Foobar(str_regex=matching_value) assert f.str_regex == matching_value with pytest.raises( ValidationError, match=re.escape("String should match pattern '" + pattern_value + "'"), ): Foobar(str_regex=' whatever1') assert Foobar.model_json_schema() == { 'type': 'object', 'title': 'Foobar', 'properties': {'str_regex': {'pattern': pattern_value, 'title': 'Str Regex', 'type': 'string'}}, 'required': ['str_regex'], } def test_pattern_with_invalid_param(): with pytest.raises( PydanticSchemaGenerationError, match=re.escape('Unable to generate pydantic-core schema for typing.Pattern[int].'), ): class Foo(BaseModel): pattern: Pattern[int] @pytest.mark.parametrize( ('pattern_type', 'pattern_value', 'error_type', 'error_msg'), [ pytest.param( re.Pattern, '[xx', 'pattern_regex', 'Input should be a valid regular expression', id='re.Pattern-pattern_regex', ), pytest.param( Pattern, '[xx', 
'pattern_regex', 'Input should be a valid regular expression', id='typing.Pattern-pattern_regex' ), pytest.param( re.Pattern, (), 'pattern_type', 'Input should be a valid pattern', id='re.Pattern-pattern_type' ), pytest.param(Pattern, (), 'pattern_type', 'Input should be a valid pattern', id='typing.Pattern-pattern_type'), pytest.param( Pattern[str], re.compile(b''), 'pattern_str_type', 'Input should be a string pattern', id='typing.Pattern[str]-pattern_str_type-non_str', ), pytest.param( Pattern[str], b'', 'pattern_str_type', 'Input should be a string pattern', id='typing.Pattern[str]-pattern_str_type-bytes', ), pytest.param( Pattern[str], (), 'pattern_type', 'Input should be a valid pattern', id='typing.Pattern[str]-pattern_type' ), pytest.param( Pattern[bytes], re.compile(''), 'pattern_bytes_type', 'Input should be a bytes pattern', id='typing.Pattern[bytes]-pattern_bytes_type-non_bytes', ), pytest.param( Pattern[bytes], '', 'pattern_bytes_type', 'Input should be a bytes pattern', id='typing.Pattern[bytes]-pattern_bytes_type-str', ), pytest.param( Pattern[bytes], (), 'pattern_type', 'Input should be a valid pattern', id='typing.Pattern[bytes]-pattern_type', ), ], ) def test_pattern_error(pattern_type, pattern_value, error_type, error_msg): class Foobar(BaseModel): pattern: pattern_type with pytest.raises(ValidationError) as exc_info: Foobar(pattern=pattern_value) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ {'type': error_type, 'loc': ('pattern',), 'msg': error_msg, 'input': pattern_value} ] @pytest.mark.parametrize('validate_json', [True, False]) def test_secretstr(validate_json): class Foobar(BaseModel): password: SecretStr empty_password: SecretStr if validate_json: f = Foobar.model_validate_json('{"password": "1234", "empty_password": ""}') with pytest.raises(ValidationError) as exc_info: Foobar.model_validate_json('{"password": 1234, "empty_password": null}') else: f = Foobar(password='1234',
empty_password='') with pytest.raises(ValidationError) as exc_info: Foobar(password=1234, empty_password=None) assert exc_info.value.errors(include_url=False) == [ {'type': 'string_type', 'loc': ('password',), 'msg': 'Input should be a valid string', 'input': 1234}, {'type': 'string_type', 'loc': ('empty_password',), 'msg': 'Input should be a valid string', 'input': None}, ] # Assert correct types. assert f.password.__class__.__name__ == 'SecretStr' assert f.empty_password.__class__.__name__ == 'SecretStr' # Assert str and repr are correct. assert str(f.password) == '**********' assert str(f.empty_password) == '' assert repr(f.password) == "SecretStr('**********')" assert repr(f.empty_password) == "SecretStr('')" assert len(f.password) == 4 assert len(f.empty_password) == 0 # Assert retrieval of secret value is correct assert f.password.get_secret_value() == '1234' assert f.empty_password.get_secret_value() == '' def test_secretstr_subclass(): class DecryptableStr(SecretStr): """ Simulate a SecretStr with decryption capabilities. """ def decrypt_value(self) -> str: return f'MOCK DECRYPTED {self.get_secret_value()}' class Foobar(BaseModel): password: DecryptableStr empty_password: SecretStr # Initialize the model. f = Foobar(password='1234', empty_password='') # Assert correct types. assert f.password.__class__.__name__ == 'DecryptableStr' assert f.empty_password.__class__.__name__ == 'SecretStr' # Assert str and repr are correct. 
assert str(f.password) == '**********' assert str(f.empty_password) == '' assert repr(f.password) == "DecryptableStr('**********')" assert repr(f.empty_password) == "SecretStr('')" assert len(f.password) == 4 assert len(f.empty_password) == 0 # Assert retrieval of secret value is correct assert f.password.get_secret_value() == '1234' assert f.empty_password.get_secret_value() == '' def test_secretstr_equality(): assert SecretStr('abc') == SecretStr('abc') assert SecretStr('123') != SecretStr('321') assert SecretStr('123') != '123' assert SecretStr('123') is not SecretStr('123') def test_secretstr_idempotent(): class Foobar(BaseModel): password: SecretStr # Should not raise an exception m = Foobar(password=SecretStr('1234')) assert m.password.get_secret_value() == '1234' class SecretDate(Secret[date]): def _display(self) -> str: return '****/**/**' class SampleEnum(str, Enum): foo = 'foo' bar = 'bar' SecretEnum = Secret[SampleEnum] @pytest.mark.parametrize( 'value, result', [ # Valid inputs (1_493_942_400, date(2017, 5, 5)), (1_493_942_400_000, date(2017, 5, 5)), (0, date(1970, 1, 1)), ('2012-04-23', date(2012, 4, 23)), (b'2012-04-23', date(2012, 4, 23)), (date(2012, 4, 9), date(2012, 4, 9)), (datetime(2012, 4, 9, 0, 0), date(2012, 4, 9)), (1_549_238_400, date(2019, 2, 4)), # nowish in s (1_549_238_400_000, date(2019, 2, 4)), # nowish in ms (19_999_958_400, date(2603, 10, 11)), # just before watershed ], ) def test_secretdate(value, result): class Foobar(BaseModel): value: SecretDate f = Foobar(value=value) # Assert correct type. assert f.value.__class__.__name__ == 'SecretDate' # Assert str and repr are correct. 
assert str(f.value) == '****/**/**' assert repr(f.value) == "SecretDate('****/**/**')" # Assert retrieval of secret value is correct assert f.value.get_secret_value() == result def test_secretdate_json_serializable(): class _SecretDate(Secret[date]): def _display(self) -> str: return '****/**/**' SecretDate = Annotated[ _SecretDate, PlainSerializer(lambda v: v.get_secret_value().strftime('%Y-%m-%d'), when_used='json'), ] class Foobar(BaseModel): value: SecretDate f = Foobar(value='2017-01-01') assert '2017-01-01' in f.model_dump_json() def test_secretenum_json_serializable(): class SampleEnum(str, Enum): foo = 'foo' bar = 'bar' SecretEnum = Annotated[ Secret[SampleEnum], PlainSerializer(lambda v: v.get_secret_value(), when_used='json'), ] class Foobar(BaseModel): value: SecretEnum f = Foobar(value='foo') assert f.model_dump_json() == '{"value":"foo"}' @pytest.mark.parametrize( 'SecretField, value, error_msg', [ (SecretDate, 'not-a-date', r'Input should be a valid date'), (SecretStr, 0, r'Input should be a valid string \[type=string_type,'), (SecretBytes, 0, r'Input should be a valid bytes \[type=bytes_type,'), (SecretEnum, 0, r'Input should be an instance of SampleEnum'), ], ) def test_strict_secretfield_by_config(SecretField, value, error_msg): class Foobar(BaseModel): model_config = ConfigDict(strict=True) value: SecretField with pytest.raises(ValidationError, match=error_msg): Foobar(value=value) @pytest.mark.parametrize( 'field, value, error_msg', [ (date, 'not-a-date', r'Input should be a valid date'), (str, 0, r'Input should be a valid string \[type=string_type,'), (bytes, 0, r'Input should be a valid bytes \[type=bytes_type,'), (SampleEnum, 0, r'Input should be an instance of SampleEnum'), ], ) def test_strict_secretfield_annotated(field, value, error_msg): SecretField = Annotated[field, Strict()] class Foobar(BaseModel): value: Secret[SecretField] with pytest.raises(ValidationError, match=error_msg): Foobar(value=value) @pytest.mark.parametrize( 'value', [ 
datetime(2012, 4, 9, 12, 15), 'x20120423', '2012-04-56', 20000044800, # just after watershed 1_549_238_400_000_000, # nowish in μs 1_549_238_400_000_000_000, # nowish in ns 'infinity', float('inf'), int('1' + '0' * 100), 1e1000, float('-infinity'), float('nan'), ], ) def test_secretdate_parsing(value): class FooBar(BaseModel): d: SecretDate with pytest.raises(ValidationError): FooBar(d=value) def test_secretdate_equality(): assert SecretDate('2017-01-01') == SecretDate('2017-01-01') assert SecretDate('2017-01-01') != SecretDate('2018-01-01') assert SecretDate(date(2017, 1, 1)) != date(2017, 1, 1) assert SecretDate('2017-01-01') is not SecretDate('2017-01-01') def test_secretdate_idempotent(): class Foobar(BaseModel): value: SecretDate # Should not raise an exception m = Foobar(value=SecretDate(date(2017, 1, 1))) assert m.value.get_secret_value() == date(2017, 1, 1) def test_secret_union_serializable() -> None: class Base(BaseModel): x: Union[Secret[int], Secret[str]] model = Base(x=1) assert model.model_dump() == {'x': Secret[int](1)} assert model.model_dump_json() == '{"x":"**********"}' @pytest.mark.parametrize( 'pydantic_type', [ Strict, StrictBool, conint, PositiveInt, NegativeInt, NonPositiveInt, NonNegativeInt, StrictInt, confloat, PositiveFloat, NegativeFloat, NonPositiveFloat, NonNegativeFloat, StrictFloat, FiniteFloat, conbytes, Secret, SecretBytes, constr, StrictStr, SecretStr, ImportString, conset, confrozenset, conlist, condecimal, UUID1, UUID3, UUID4, UUID5, FilePath, DirectoryPath, NewPath, Json, ByteSize, condate, PastDate, FutureDate, PastDatetime, FutureDatetime, AwareDatetime, NaiveDatetime, ], ) def test_is_hashable(pydantic_type): assert type(hash(pydantic_type)) is int def test_model_contain_hashable_type(): class MyModel(BaseModel): v: Union[str, StrictStr] assert MyModel(v='test').v == 'test' def test_secretstr_error(): class Foobar(BaseModel): password: SecretStr with pytest.raises(ValidationError) as exc_info: Foobar(password=[6, 23, 
'abc']) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'string_type', 'loc': ('password',), 'msg': 'Input should be a valid string', 'input': [6, 23, 'abc'], } ] def test_secret_str_hashable(): assert type(hash(SecretStr('abs'))) is int def test_secret_bytes_hashable(): assert type(hash(SecretBytes(b'abs'))) is int def test_secret_str_min_max_length(): class Foobar(BaseModel): password: SecretStr = Field(min_length=6, max_length=10) with pytest.raises(ValidationError) as exc_info: Foobar(password='') # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'too_short', 'loc': ('password',), 'msg': 'Value should have at least 6 items after validation, not 0', 'input': '', 'ctx': {'field_type': 'Value', 'min_length': 6, 'actual_length': 0}, } ] with pytest.raises(ValidationError) as exc_info: Foobar(password='1' * 20) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'too_long', 'loc': ('password',), 'msg': 'Value should have at most 10 items after validation, not 20', 'input': '11111111111111111111', 'ctx': {'field_type': 'Value', 'max_length': 10, 'actual_length': 20}, } ] value = '1' * 8 assert Foobar(password=value).password.get_secret_value() == value def test_secretbytes_json(): class Foobar(BaseModel): password: SecretBytes assert Foobar(password='foo').model_dump_json() == '{"password":"**********"}' def test_secretbytes(): class Foobar(BaseModel): password: SecretBytes empty_password: SecretBytes # Initialize the model. # Use bytes that can't be decoded with UTF8 (https://github.com/pydantic/pydantic/issues/7971) password = b'\x89PNG\r\n\x1a\n' f = Foobar(password=password, empty_password=b'') # Assert correct types. 
assert f.password.__class__.__name__ == 'SecretBytes' assert f.empty_password.__class__.__name__ == 'SecretBytes' # Assert str and repr are correct. assert str(f.password) == "b'**********'" assert str(f.empty_password) == "b''" assert repr(f.password) == "SecretBytes(b'**********')" assert repr(f.empty_password) == "SecretBytes(b'')" # Assert retrieval of secret value is correct assert f.password.get_secret_value() == password assert f.empty_password.get_secret_value() == b'' # Assert that SecretBytes is equal to SecretBytes if the secret is the same. assert f == f.model_copy() copied_with_changes = f.model_copy() copied_with_changes.password = SecretBytes(b'4321') assert f != copied_with_changes def test_secretbytes_equality(): assert SecretBytes(b'abc') == SecretBytes(b'abc') assert SecretBytes(b'123') != SecretBytes(b'321') assert SecretBytes(b'123') != b'123' assert SecretBytes(b'123') is not SecretBytes(b'123') def test_secretbytes_idempotent(): class Foobar(BaseModel): password: SecretBytes # Should not raise an exception. 
_ = Foobar(password=SecretBytes(b'1234')) def test_secretbytes_error(): class Foobar(BaseModel): password: SecretBytes with pytest.raises(ValidationError) as exc_info: Foobar(password=[6, 23, 'abc']) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'bytes_type', 'loc': ('password',), 'msg': 'Input should be a valid bytes', 'input': [6, 23, 'abc'], } ] def test_secret_bytes_min_max_length(): class Foobar(BaseModel): password: SecretBytes = Field(min_length=6, max_length=10) with pytest.raises(ValidationError) as exc_info: Foobar(password=b'') # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'too_short', 'loc': ('password',), 'msg': 'Value should have at least 6 items after validation, not 0', 'input': b'', 'ctx': {'field_type': 'Value', 'min_length': 6, 'actual_length': 0}, } ] with pytest.raises(ValidationError) as exc_info: Foobar(password=b'1' * 20) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'too_long', 'loc': ('password',), 'msg': 'Value should have at most 10 items after validation, not 20', 'input': b'11111111111111111111', 'ctx': {'field_type': 'Value', 'max_length': 10, 'actual_length': 20}, } ] value = b'1' * 8 assert Foobar(password=value).password.get_secret_value() == value def test_generic_without_params(): class Model(BaseModel): generic_list: List generic_dict: Dict generic_tuple: Tuple m = Model(generic_list=[0, 'a'], generic_dict={0: 'a', 'a': 0}, generic_tuple=(1, 'q')) assert m.model_dump() == {'generic_list': [0, 'a'], 'generic_dict': {0: 'a', 'a': 0}, 'generic_tuple': (1, 'q')} def test_generic_without_params_error(): class Model(BaseModel): generic_list: List generic_dict: Dict generic_tuple: Tuple with pytest.raises(ValidationError) as exc_info: Model(generic_list=0, generic_dict=0, generic_tuple=0) # 
insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'list_type', 'loc': ('generic_list',), 'msg': 'Input should be a valid list', 'input': 0, }, { 'type': 'dict_type', 'loc': ('generic_dict',), 'msg': 'Input should be a valid dictionary', 'input': 0, }, {'type': 'tuple_type', 'loc': ('generic_tuple',), 'msg': 'Input should be a valid tuple', 'input': 0}, ] def test_literal_single(): class Model(BaseModel): a: Literal['a'] Model(a='a') with pytest.raises(ValidationError) as exc_info: Model(a='b') # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'literal_error', 'loc': ('a',), 'msg': "Input should be 'a'", 'input': 'b', 'ctx': {'expected': "'a'"}, } ] def test_literal_multiple(): class Model(BaseModel): a_or_b: Literal['a', 'b'] Model(a_or_b='a') Model(a_or_b='b') with pytest.raises(ValidationError) as exc_info: Model(a_or_b='c') # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'literal_error', 'loc': ('a_or_b',), 'msg': "Input should be 'a' or 'b'", 'input': 'c', 'ctx': {'expected': "'a' or 'b'"}, } ] def test_typing_mutable_set(): s1 = TypeAdapter(Set[int]).core_schema s1.pop('metadata', None) s2 = TypeAdapter(typing.MutableSet[int]).core_schema s2.pop('metadata', None) assert s1 == s2 def test_frozenset_field(): class FrozenSetModel(BaseModel): set: FrozenSet[int] test_set = frozenset({1, 2, 3}) object_under_test = FrozenSetModel(set=test_set) assert object_under_test.set == test_set @pytest.mark.parametrize( 'value,result', [ ([1, 2, 3], frozenset([1, 2, 3])), ({1, 2, 3}, frozenset([1, 2, 3])), ((1, 2, 3), frozenset([1, 2, 3])), (deque([1, 2, 3]), frozenset([1, 2, 3])), ], ) def test_frozenset_field_conversion(value, result): class FrozenSetModel(BaseModel): set: FrozenSet[int] object_under_test = FrozenSetModel(set=value) assert object_under_test.set == 
result def test_frozenset_field_not_convertible(): class FrozenSetModel(BaseModel): set: FrozenSet[int] with pytest.raises(ValidationError, match=r'frozenset'): FrozenSetModel(set=42) @pytest.mark.parametrize( 'input_value,output,human_bin,human_dec,human_sep', ( (1, 1, '1B', '1B', '1 B'), ('1', 1, '1B', '1B', '1 B'), ('1.0', 1, '1B', '1B', '1 B'), ('1b', 1, '1B', '1B', '1 B'), ('1.5 KB', int(1.5e3), '1.5KiB', '1.5KB', '1.5 KiB'), ('1.5 K', int(1.5e3), '1.5KiB', '1.5KB', '1.5 KiB'), ('1.5 MB', int(1.5e6), '1.4MiB', '1.5MB', '1.4 MiB'), ('1.5 M', int(1.5e6), '1.4MiB', '1.5MB', '1.4 MiB'), ('5.1kib', 5222, '5.1KiB', '5.2KB', '5.1 KiB'), ('6.2EiB', 7148113328562451456, '6.2EiB', '7.1EB', '6.2 EiB'), ('8bit', 1, '1B', '1B', '1 B'), ('1kbit', 125, '125B', '125B', '125 B'), ), ) def test_bytesize_conversions(input_value, output, human_bin, human_dec, human_sep): class Model(BaseModel): size: ByteSize m = Model(size=input_value) assert m.size == output assert m.size.human_readable() == human_bin assert m.size.human_readable(decimal=True) == human_dec assert m.size.human_readable(separator=' ') == human_sep def test_bytesize_to(): class Model(BaseModel): size: ByteSize m = Model(size='1GiB') assert m.size.to('MiB') == pytest.approx(1024) assert m.size.to('MB') == pytest.approx(1073.741824) assert m.size.to('TiB') == pytest.approx(0.0009765625) assert m.size.to('bit') == pytest.approx(8589934592) assert m.size.to('kbit') == pytest.approx(8589934.592) def test_bytesize_raises(): class Model(BaseModel): size: ByteSize with pytest.raises(ValidationError, match='parse value') as exc_info: Model(size='d1MB') assert exc_info.value.errors(include_url=False) == [ { 'input': 'd1MB', 'loc': ('size',), 'msg': 'could not parse value and unit from byte string', 'type': 'byte_size', } ] with pytest.raises(ValidationError, match='byte unit') as exc_info: Model(size='1LiB') assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'unit': 'LiB'}, 'input': '1LiB', 'loc': ('size',), 
'msg': 'could not interpret byte unit: LiB', 'type': 'byte_size_unit', } ] # 1Gi is not a valid unit unlike 1G with pytest.raises(ValidationError, match='byte unit'): Model(size='1Gi') m = Model(size='1MB') with pytest.raises(PydanticCustomError, match='byte unit'): m.size.to('bad_unit') with pytest.raises(PydanticCustomError, match='byte unit'): m.size.to('1ZiB') def test_deque_success(): class Model(BaseModel): v: deque assert Model(v=[1, 2, 3]).v == deque([1, 2, 3]) @pytest.mark.parametrize( 'cls,value,result', ( (int, [1, 2, 3], deque([1, 2, 3])), (int, (1, 2, 3), deque((1, 2, 3))), (int, deque((1, 2, 3)), deque((1, 2, 3))), (float, [1.0, 2.0, 3.0], deque([1.0, 2.0, 3.0])), (Set[int], [{1, 2}, {3, 4}, {5, 6}], deque([{1, 2}, {3, 4}, {5, 6}])), (Tuple[int, str], ((1, 'a'), (2, 'b'), (3, 'c')), deque(((1, 'a'), (2, 'b'), (3, 'c')))), (str, 'one two three'.split(), deque(['one', 'two', 'three'])), ( int, {1: 10, 2: 20, 3: 30}.keys(), deque([1, 2, 3]), ), ( int, {1: 10, 2: 20, 3: 30}.values(), deque([10, 20, 30]), ), ( Tuple[int, int], {1: 10, 2: 20, 3: 30}.items(), deque([(1, 10), (2, 20), (3, 30)]), ), ( float, {1, 2, 3}, deque([1, 2, 3]), ), ( float, frozenset((1, 2, 3)), deque([1, 2, 3]), ), ), ) def test_deque_generic_success(cls, value, result): class Model(BaseModel): v: Deque[cls] assert Model(v=value).v == result @pytest.mark.parametrize( 'cls,value,result', ( (int, deque((1, 2, 3)), deque((1, 2, 3))), (str, deque(('1', '2', '3')), deque(('1', '2', '3'))), ), ) def test_deque_generic_success_strict(cls, value: Any, result): class Model(BaseModel): v: Deque[cls] model_config = ConfigDict(strict=True) assert Model(v=value).v == result @pytest.mark.parametrize( 'cls,value,expected_error', ( ( int, [1, 'a', 3], { 'type': 'int_parsing', 'loc': ('v', 1), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'a', }, ), ( int, (1, 2, 'a'), { 'type': 'int_parsing', 'loc': ('v', 2), 'msg': 'Input should be a valid integer, unable 
to parse string as an integer', 'input': 'a', }, ), ( Tuple[int, str], ((1, 'a'), ('a', 'a'), (3, 'c')), { 'type': 'int_parsing', 'loc': ('v', 1, 0), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'a', }, ), ( List[int], [{'a': 1, 'b': 2}, [1, 2], [2, 3]], { 'type': 'list_type', 'loc': ('v', 0), 'msg': 'Input should be a valid list', 'input': { 'a': 1, 'b': 2, }, }, ), ), ) def test_deque_fails(cls, value, expected_error): class Model(BaseModel): v: Deque[cls] with pytest.raises(ValidationError) as exc_info: Model(v=value) # debug(exc_info.value.errors(include_url=False)) assert len(exc_info.value.errors(include_url=False)) == 1 assert expected_error == exc_info.value.errors(include_url=False)[0] def test_deque_model(): class Model2(BaseModel): x: int class Model(BaseModel): v: Deque[Model2] seq = [Model2(x=1), Model2(x=2)] assert Model(v=seq).v == deque(seq) def test_deque_json(): class Model(BaseModel): v: Deque[int] assert Model(v=deque((1, 2, 3))).model_dump_json() == '{"v":[1,2,3]}' def test_deque_any_maxlen(): class DequeModel1(BaseModel): field: deque assert DequeModel1(field=deque()).field.maxlen is None assert DequeModel1(field=deque(maxlen=8)).field.maxlen == 8 class DequeModel2(BaseModel): field: deque = deque() assert DequeModel2().field.maxlen is None assert DequeModel2(field=deque()).field.maxlen is None assert DequeModel2(field=deque(maxlen=8)).field.maxlen == 8 class DequeModel3(BaseModel): field: deque = deque(maxlen=5) assert DequeModel3().field.maxlen == 5 assert DequeModel3(field=deque()).field.maxlen is None assert DequeModel3(field=deque(maxlen=8)).field.maxlen == 8 def test_deque_typed_maxlen(): class DequeModel1(BaseModel): field: Deque[int] assert DequeModel1(field=deque()).field.maxlen is None assert DequeModel1(field=deque(maxlen=8)).field.maxlen == 8 class DequeModel2(BaseModel): field: Deque[int] = deque() assert DequeModel2().field.maxlen is None assert DequeModel2(field=deque()).field.maxlen 
is None assert DequeModel2(field=deque(maxlen=8)).field.maxlen == 8 class DequeModel3(BaseModel): field: Deque[int] = deque(maxlen=5) assert DequeModel3().field.maxlen == 5 assert DequeModel3(field=deque()).field.maxlen is None assert DequeModel3(field=deque(maxlen=8)).field.maxlen == 8 def test_deque_set_maxlen(): class DequeModel1(BaseModel): field: Annotated[Deque[int], Field(max_length=10)] assert DequeModel1(field=deque()).field.maxlen == 10 assert DequeModel1(field=deque(maxlen=8)).field.maxlen == 8 assert DequeModel1(field=deque(maxlen=15)).field.maxlen == 10 class DequeModel2(BaseModel): field: Annotated[Deque[int], Field(max_length=10)] = deque() assert DequeModel2().field.maxlen is None assert DequeModel2(field=deque()).field.maxlen == 10 assert DequeModel2(field=deque(maxlen=8)).field.maxlen == 8 assert DequeModel2(field=deque(maxlen=15)).field.maxlen == 10 class DequeModel3(DequeModel2): model_config = ConfigDict(validate_default=True) assert DequeModel3().field.maxlen == 10 class DequeModel4(BaseModel): field: Annotated[Deque[int], Field(max_length=10)] = deque(maxlen=5) assert DequeModel4().field.maxlen == 5 class DequeModel5(DequeModel4): model_config = ConfigDict(validate_default=True) assert DequeModel4().field.maxlen == 5 @pytest.mark.parametrize('value_type', (None, type(None), None.__class__)) def test_none(value_type): class Model(BaseModel): my_none: value_type my_none_list: List[value_type] my_none_dict: Dict[str, value_type] my_json_none: Json[value_type] Model( my_none=None, my_none_list=[None] * 3, my_none_dict={'a': None, 'b': None}, my_json_none='null', ) assert Model.model_json_schema() == { 'title': 'Model', 'type': 'object', 'properties': { 'my_none': {'type': 'null', 'title': 'My None'}, 'my_none_list': {'type': 'array', 'items': {'type': 'null'}, 'title': 'My None List'}, 'my_none_dict': {'type': 'object', 'additionalProperties': {'type': 'null'}, 'title': 'My None Dict'}, 'my_json_none': { 'contentMediaType': 'application/json', 
'contentSchema': {'type': 'null'}, 'title': 'My Json None', 'type': 'string', }, }, 'required': ['my_none', 'my_none_list', 'my_none_dict', 'my_json_none'], } with pytest.raises(ValidationError) as exc_info: Model( my_none='qwe', my_none_list=[1, None, 'qwe'], my_none_dict={'a': 1, 'b': None}, my_json_none='"a"', ) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ {'type': 'none_required', 'loc': ('my_none',), 'msg': 'Input should be None', 'input': 'qwe'}, {'type': 'none_required', 'loc': ('my_none_list', 0), 'msg': 'Input should be None', 'input': 1}, { 'type': 'none_required', 'loc': ('my_none_list', 2), 'msg': 'Input should be None', 'input': 'qwe', }, { 'type': 'none_required', 'loc': ('my_none_dict', 'a'), 'msg': 'Input should be None', 'input': 1, }, {'type': 'none_required', 'loc': ('my_json_none',), 'msg': 'Input should be None', 'input': 'a'}, ] def test_none_literal(): class Model(BaseModel): my_none: Literal[None] my_none_list: List[Literal[None]] my_none_dict: Dict[str, Literal[None]] my_json_none: Json[Literal[None]] Model( my_none=None, my_none_list=[None] * 3, my_none_dict={'a': None, 'b': None}, my_json_none='null', ) # insert_assert(Model.model_json_schema()) assert Model.model_json_schema() == { 'title': 'Model', 'type': 'object', 'properties': { 'my_none': {'const': None, 'title': 'My None', 'type': 'null'}, 'my_none_list': { 'items': {'const': None, 'type': 'null'}, 'title': 'My None List', 'type': 'array', }, 'my_none_dict': { 'additionalProperties': {'const': None, 'type': 'null'}, 'title': 'My None Dict', 'type': 'object', }, 'my_json_none': { 'contentMediaType': 'application/json', 'contentSchema': {'const': None, 'type': 'null'}, 'title': 'My Json None', 'type': 'string', }, }, 'required': ['my_none', 'my_none_list', 'my_none_dict', 'my_json_none'], } with pytest.raises(ValidationError) as exc_info: Model( my_none='qwe', my_none_list=[1, None, 'qwe'], my_none_dict={'a': 1, 'b': 
None}, my_json_none='"a"', ) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'literal_error', 'loc': ('my_none',), 'msg': 'Input should be None', 'input': 'qwe', 'ctx': {'expected': 'None'}, }, { 'type': 'literal_error', 'loc': ('my_none_list', 0), 'msg': 'Input should be None', 'input': 1, 'ctx': {'expected': 'None'}, }, { 'type': 'literal_error', 'loc': ('my_none_list', 2), 'msg': 'Input should be None', 'input': 'qwe', 'ctx': {'expected': 'None'}, }, { 'type': 'literal_error', 'loc': ('my_none_dict', 'a'), 'msg': 'Input should be None', 'input': 1, 'ctx': {'expected': 'None'}, }, { 'type': 'literal_error', 'loc': ('my_json_none',), 'msg': 'Input should be None', 'input': 'a', 'ctx': {'expected': 'None'}, }, ] def test_default_union_types(): class DefaultModel(BaseModel): v: Union[int, bool, str] # do it this way since `1 == True` assert repr(DefaultModel(v=True).v) == 'True' assert repr(DefaultModel(v=1).v) == '1' assert repr(DefaultModel(v='1').v) == "'1'" assert DefaultModel.model_json_schema() == { 'title': 'DefaultModel', 'type': 'object', 'properties': {'v': {'title': 'V', 'anyOf': [{'type': t} for t in ('integer', 'boolean', 'string')]}}, 'required': ['v'], } def test_default_union_types_left_to_right(): class DefaultModel(BaseModel): v: Annotated[Union[int, bool, str], Field(union_mode='left_to_right')] print(DefaultModel.__pydantic_core_schema__) # int will coerce everything in left-to-right mode assert repr(DefaultModel(v=True).v) == '1' assert repr(DefaultModel(v=1).v) == '1' assert repr(DefaultModel(v='1').v) == '1' assert DefaultModel.model_json_schema() == { 'title': 'DefaultModel', 'type': 'object', 'properties': {'v': {'title': 'V', 'anyOf': [{'type': t} for t in ('integer', 'boolean', 'string')]}}, 'required': ['v'], } def test_union_enum_int_left_to_right(): class BinaryEnum(IntEnum): ZERO = 0 ONE = 1 # int will win over enum in this case assert 
TypeAdapter(Union[BinaryEnum, int]).validate_python(0) is not BinaryEnum.ZERO # in left to right mode, enum will validate successfully and take precedence assert ( TypeAdapter(Annotated[Union[BinaryEnum, int], Field(union_mode='left_to_right')]).validate_python(0) is BinaryEnum.ZERO ) def test_union_uuid_str_left_to_right(): IdOrSlug = Union[UUID, str] # in smart mode JSON and python are currently validated differently in this # case, because in Python this is a str but in JSON a str is also a UUID assert TypeAdapter(IdOrSlug).validate_json('"f4fe10b4-e0c8-4232-ba26-4acd491c2414"') == UUID( 'f4fe10b4-e0c8-4232-ba26-4acd491c2414' ) assert ( TypeAdapter(IdOrSlug).validate_python('f4fe10b4-e0c8-4232-ba26-4acd491c2414') == 'f4fe10b4-e0c8-4232-ba26-4acd491c2414' ) IdOrSlugLTR = Annotated[Union[UUID, str], Field(union_mode='left_to_right')] # in left to right mode both JSON and python are validated as UUID assert TypeAdapter(IdOrSlugLTR).validate_json('"f4fe10b4-e0c8-4232-ba26-4acd491c2414"') == UUID( 'f4fe10b4-e0c8-4232-ba26-4acd491c2414' ) assert TypeAdapter(IdOrSlugLTR).validate_python('f4fe10b4-e0c8-4232-ba26-4acd491c2414') == UUID( 'f4fe10b4-e0c8-4232-ba26-4acd491c2414' ) def test_default_union_class(): class A(BaseModel): x: str class B(BaseModel): x: str class Model(BaseModel): y: Union[A, B] assert isinstance(Model(y=A(x='a')).y, A) assert isinstance(Model(y=B(x='b')).y, B) @pytest.mark.parametrize('max_length', [10, None]) def test_union_subclass(max_length: Union[int, None]): class MyStr(str): ... 
class Model(BaseModel): x: Union[int, Annotated[str, Field(max_length=max_length)]] v = Model(x=MyStr('1')).x assert type(v) is str assert v == '1' def test_union_compound_types(): class Model(BaseModel): values: Union[Dict[str, str], List[str], Dict[str, List[str]]] assert Model(values={'L': '1'}).model_dump() == {'values': {'L': '1'}} assert Model(values=['L1']).model_dump() == {'values': ['L1']} assert Model(values=('L1',)).model_dump() == {'values': ['L1']} assert Model(values={'x': ['pika']}) != {'values': {'x': ['pika']}} assert Model(values={'x': ('pika',)}).model_dump() == {'values': {'x': ['pika']}} with pytest.raises(ValidationError) as e: Model(values={'x': {'a': 'b'}}) # insert_assert(e.value.errors(include_url=False)) assert e.value.errors(include_url=False) == [ { 'type': 'string_type', 'loc': ('values', 'dict[str,str]', 'x'), 'msg': 'Input should be a valid string', 'input': {'a': 'b'}, }, { 'type': 'list_type', 'loc': ('values', 'list[str]'), 'msg': 'Input should be a valid list', 'input': {'x': {'a': 'b'}}, }, { 'type': 'list_type', 'loc': ('values', 'dict[str,list[str]]', 'x'), 'msg': 'Input should be a valid list', 'input': {'a': 'b'}, }, ] def test_smart_union_compounded_types_edge_case(): class Model(BaseModel): x: Union[List[str], List[int]] assert Model(x=[1, 2]).x == [1, 2] assert Model(x=['1', '2']).x == ['1', '2'] assert Model(x=[1, '2']).x == [1, 2] def test_union_typeddict(): class Dict1(TypedDict): foo: str class Dict2(TypedDict): bar: str class M(BaseModel): d: Union[Dict2, Dict1] assert M(d=dict(foo='baz')).d == {'foo': 'baz'} def test_custom_generic_containers(): T = TypeVar('T') class GenericList(List[T]): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: return core_schema.no_info_after_validator_function(GenericList, handler(List[get_args(source_type)[0]])) class Model(BaseModel): field: GenericList[int] model = Model(field=['1', '2']) assert model.field == [1, 2] 
assert isinstance(model.field, GenericList) with pytest.raises(ValidationError) as exc_info: Model(field=['a']) assert exc_info.value.errors(include_url=False) == [ { 'input': 'a', 'loc': ('field', 0), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', } ] @pytest.mark.parametrize( ('field_type', 'input_data', 'expected_value', 'serialized_data'), [ pytest.param(Base64Bytes, b'Zm9vIGJhcg==', b'foo bar', b'Zm9vIGJhcg==', id='Base64Bytes-bytes-input'), pytest.param(Base64Bytes, 'Zm9vIGJhcg==', b'foo bar', b'Zm9vIGJhcg==', id='Base64Bytes-str-input'), pytest.param( Base64Bytes, bytearray(b'Zm9vIGJhcg=='), b'foo bar', b'Zm9vIGJhcg==', id='Base64Bytes-bytearray-input' ), pytest.param(Base64Str, b'Zm9vIGJhcg==', 'foo bar', 'Zm9vIGJhcg==', id='Base64Str-bytes-input'), pytest.param(Base64Str, 'Zm9vIGJhcg==', 'foo bar', 'Zm9vIGJhcg==', id='Base64Str-str-input'), pytest.param(Base64Str, bytearray(b'Zm9vIGJhcg=='), 'foo bar', 'Zm9vIGJhcg==', id='Base64Str-bytearray-input'), pytest.param( Base64Bytes, b'BCq+6+1/Paun/Q==', b'\x04*\xbe\xeb\xed\x7f=\xab\xa7\xfd', b'BCq+6+1/Paun/Q==', id='Base64Bytes-bytes-alphabet-vanilla', ), ], ) def test_base64(field_type, input_data, expected_value, serialized_data): class Model(BaseModel): base64_value: field_type base64_value_or_none: Optional[field_type] = None m = Model(base64_value=input_data) assert m.base64_value == expected_value m = Model.model_construct(base64_value=expected_value) assert m.base64_value == expected_value assert m.model_dump() == { 'base64_value': serialized_data, 'base64_value_or_none': None, } assert Model.model_json_schema() == { 'properties': { 'base64_value': { 'format': 'base64', 'title': 'Base64 Value', 'type': 'string', }, 'base64_value_or_none': { 'anyOf': [{'type': 'string', 'format': 'base64'}, {'type': 'null'}], 'default': None, 'title': 'Base64 Value Or None', }, }, 'required': ['base64_value'], 'title': 'Model', 'type': 'object', } 
@pytest.mark.parametrize( ('field_type', 'input_data'), [ pytest.param(Base64Bytes, b'Zm9vIGJhcg', id='Base64Bytes-invalid-base64-bytes'), pytest.param(Base64Bytes, 'Zm9vIGJhcg', id='Base64Bytes-invalid-base64-str'), pytest.param(Base64Str, b'Zm9vIGJhcg', id='Base64Str-invalid-base64-bytes'), pytest.param(Base64Str, 'Zm9vIGJhcg', id='Base64Str-invalid-base64-str'), ], ) def test_base64_invalid(field_type, input_data): class Model(BaseModel): base64_value: field_type with pytest.raises(ValidationError) as e: Model(base64_value=input_data) assert e.value.errors(include_url=False) == [ { 'ctx': {'error': 'Incorrect padding'}, 'input': input_data, 'loc': ('base64_value',), 'msg': "Base64 decoding error: 'Incorrect padding'", 'type': 'base64_decode', }, ] @pytest.mark.parametrize( ('field_type', 'input_data', 'expected_value', 'serialized_data'), [ pytest.param(Base64UrlBytes, b'Zm9vIGJhcg==\n', b'foo bar', b'Zm9vIGJhcg==', id='Base64UrlBytes-reversible'), pytest.param(Base64UrlStr, 'Zm9vIGJhcg==\n', 'foo bar', 'Zm9vIGJhcg==', id='Base64UrlStr-reversible'), pytest.param(Base64UrlBytes, b'Zm9vIGJhcg==', b'foo bar', b'Zm9vIGJhcg==', id='Base64UrlBytes-bytes-input'), pytest.param(Base64UrlBytes, 'Zm9vIGJhcg==', b'foo bar', b'Zm9vIGJhcg==', id='Base64UrlBytes-str-input'), pytest.param( Base64UrlBytes, bytearray(b'Zm9vIGJhcg=='), b'foo bar', b'Zm9vIGJhcg==', id='Base64UrlBytes-bytearray-input' ), pytest.param(Base64UrlStr, b'Zm9vIGJhcg==', 'foo bar', 'Zm9vIGJhcg==', id='Base64UrlStr-bytes-input'), pytest.param(Base64UrlStr, 'Zm9vIGJhcg==', 'foo bar', 'Zm9vIGJhcg==', id='Base64UrlStr-str-input'), pytest.param( Base64UrlStr, bytearray(b'Zm9vIGJhcg=='), 'foo bar', 'Zm9vIGJhcg==', id='Base64UrlStr-bytearray-input' ), pytest.param( Base64UrlBytes, b'BCq-6-1_Paun_Q==', b'\x04*\xbe\xeb\xed\x7f=\xab\xa7\xfd', b'BCq-6-1_Paun_Q==', id='Base64UrlBytes-bytes-alphabet-url', ), pytest.param( Base64UrlBytes, b'BCq+6+1/Paun/Q==', b'\x04*\xbe\xeb\xed\x7f=\xab\xa7\xfd', b'BCq-6-1_Paun_Q==', 
id='Base64UrlBytes-bytes-alphabet-vanilla', ), ], ) def test_base64url(field_type, input_data, expected_value, serialized_data): class Model(BaseModel): base64url_value: field_type base64url_value_or_none: Optional[field_type] = None m = Model(base64url_value=input_data) assert m.base64url_value == expected_value m = Model.model_construct(base64url_value=expected_value) assert m.base64url_value == expected_value assert m.model_dump() == { 'base64url_value': serialized_data, 'base64url_value_or_none': None, } assert Model.model_json_schema() == { 'properties': { 'base64url_value': { 'format': 'base64url', 'title': 'Base64Url Value', 'type': 'string', }, 'base64url_value_or_none': { 'anyOf': [{'type': 'string', 'format': 'base64url'}, {'type': 'null'}], 'default': None, 'title': 'Base64Url Value Or None', }, }, 'required': ['base64url_value'], 'title': 'Model', 'type': 'object', } @pytest.mark.parametrize( ('field_type', 'input_data'), [ pytest.param(Base64UrlBytes, b'Zm9vIGJhcg', id='Base64UrlBytes-invalid-base64-bytes'), pytest.param(Base64UrlBytes, 'Zm9vIGJhcg', id='Base64UrlBytes-invalid-base64-str'), pytest.param(Base64UrlStr, b'Zm9vIGJhcg', id='Base64UrlStr-invalid-base64-bytes'), pytest.param(Base64UrlStr, 'Zm9vIGJhcg', id='Base64UrlStr-invalid-base64-str'), ], ) def test_base64url_invalid(field_type, input_data): class Model(BaseModel): base64url_value: field_type with pytest.raises(ValidationError) as e: Model(base64url_value=input_data) assert e.value.errors(include_url=False) == [ { 'ctx': {'error': 'Incorrect padding'}, 'input': input_data, 'loc': ('base64url_value',), 'msg': "Base64 decoding error: 'Incorrect padding'", 'type': 'base64_decode', }, ] def test_sequence_subclass_without_core_schema() -> None: class MyList(List[int]): # The point of this is that subclasses can do arbitrary things # This is the reason why we don't try to handle them automatically # TBD if we introspect `__init__` / `__new__` # (which is the main thing that would mess us up if 
modified in a subclass) # and automatically handle cases where the subclass doesn't override it. # There's still edge cases (again, arbitrary behavior...) # and it's harder to explain, but could lead to a better user experience in some cases # It will depend on how the complaints (which have and will happen in both directions) # balance out def __init__(self, *args: Any, required: int, **kwargs: Any) -> None: super().__init__(*args, **kwargs) with pytest.raises( PydanticSchemaGenerationError, match='implement `__get_pydantic_core_schema__` on your type to fully support it' ): class _(BaseModel): x: MyList def test_typing_coercion_defaultdict(): class Model(BaseModel): x: DefaultDict[int, str] d = defaultdict(str) d['1'] m = Model(x=d) assert isinstance(m.x, defaultdict) assert repr(m.x) == "defaultdict(<class 'str'>, {1: ''})" def test_typing_coercion_counter(): class Model(BaseModel): x: Counter[str] m = Model(x={'a': 10}) assert isinstance(m.x, Counter) assert repr(m.x) == "Counter({'a': 10})" def test_typing_counter_value_validation(): class Model(BaseModel): x: Counter[str] with pytest.raises(ValidationError) as exc_info: Model(x={'a': 'a'}) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('x', 'a'), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'a', } ] def test_mapping_subclass_without_core_schema() -> None: class MyDict(Dict[int, int]): # The point of this is that subclasses can do arbitrary things # This is the reason why we don't try to handle them automatically # TBD if we introspect `__init__` / `__new__` # (which is the main thing that would mess us up if modified in a subclass) # and automatically handle cases where the subclass doesn't override it. # There's still edge cases (again, arbitrary behavior...)
# and it's harder to explain, but could lead to a better user experience in some cases # It will depend on how the complaints (which have and will happen in both directions) # balance out def __init__(self, *args: Any, required: int, **kwargs: Any) -> None: super().__init__(*args, **kwargs) with pytest.raises( PydanticSchemaGenerationError, match='implement `__get_pydantic_core_schema__` on your type to fully support it' ): class _(BaseModel): x: MyDict def test_defaultdict_unknown_default_factory() -> None: """ https://github.com/pydantic/pydantic/issues/4687 """ with pytest.raises( PydanticSchemaGenerationError, match=r'Unable to infer a default factory for keys of type typing.DefaultDict\[int, int\]', ): class Model(BaseModel): d: DefaultDict[int, DefaultDict[int, int]] def test_defaultdict_infer_default_factory() -> None: class Model(BaseModel): a: DefaultDict[int, List[int]] b: DefaultDict[int, int] c: DefaultDict[int, set] m = Model(a={}, b={}, c={}) assert m.a.default_factory is not None assert m.a.default_factory() == [] assert m.b.default_factory is not None assert m.b.default_factory() == 0 assert m.c.default_factory is not None assert m.c.default_factory() == set() def test_defaultdict_explicit_default_factory() -> None: class MyList(List[int]): pass class Model(BaseModel): a: DefaultDict[int, Annotated[List[int], Field(default_factory=lambda: MyList())]] m = Model(a={}) assert m.a.default_factory is not None assert isinstance(m.a.default_factory(), MyList) def test_defaultdict_default_factory_preserved() -> None: class Model(BaseModel): a: DefaultDict[int, List[int]] class MyList(List[int]): pass m = Model(a=defaultdict(lambda: MyList())) assert m.a.default_factory is not None assert isinstance(m.a.default_factory(), MyList) def test_custom_default_dict() -> None: KT = TypeVar('KT') VT = TypeVar('VT') class CustomDefaultDict(DefaultDict[KT, VT]): @classmethod def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> 
CoreSchema: keys_type, values_type = get_args(source_type) return core_schema.no_info_after_validator_function( lambda x: cls(x.default_factory, x), handler(DefaultDict[keys_type, values_type]) ) ta = TypeAdapter(CustomDefaultDict[str, int]) assert ta.validate_python({'a': 1}) == CustomDefaultDict(int, {'a': 1}) @pytest.mark.parametrize('field_type', [typing.OrderedDict, collections.OrderedDict]) def test_ordered_dict_from_ordered_dict(field_type): class Model(BaseModel): od_field: field_type od_value = collections.OrderedDict([('a', 1), ('b', 2)]) m = Model(od_field=od_value) assert isinstance(m.od_field, collections.OrderedDict) assert m.od_field == od_value # we don't make any promises about preserving instances # at the moment we always copy them for consistency and predictability # so this is more so documenting the current behavior than a promise # we make to users assert m.od_field is not od_value assert m.model_json_schema() == { 'properties': {'od_field': {'title': 'Od Field', 'type': 'object'}}, 'required': ['od_field'], 'title': 'Model', 'type': 'object', } def test_ordered_dict_from_ordered_dict_typed(): class Model(BaseModel): od_field: typing.OrderedDict[str, int] od_value = collections.OrderedDict([('a', 1), ('b', 2)]) m = Model(od_field=od_value) assert isinstance(m.od_field, collections.OrderedDict) assert m.od_field == od_value assert m.model_json_schema() == { 'properties': { 'od_field': { 'additionalProperties': {'type': 'integer'}, 'title': 'Od Field', 'type': 'object', } }, 'required': ['od_field'], 'title': 'Model', 'type': 'object', } @pytest.mark.parametrize('field_type', [typing.OrderedDict, collections.OrderedDict]) def test_ordered_dict_from_dict(field_type): class Model(BaseModel): od_field: field_type od_value = {'a': 1, 'b': 2} m = Model(od_field=od_value) assert isinstance(m.od_field, collections.OrderedDict) assert m.od_field == collections.OrderedDict(od_value) assert m.model_json_schema() == { 'properties': {'od_field': {'title': 
'Od Field', 'type': 'object'}}, 'required': ['od_field'], 'title': 'Model', 'type': 'object', } def test_handle_3rd_party_custom_type_reusing_known_metadata() -> None: class PdDecimal(Decimal): def __repr__(self) -> str: return f'PdDecimal({super().__repr__()})' class PdDecimalMarker: def __get_pydantic_core_schema__(self, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: return core_schema.no_info_after_validator_function(PdDecimal, handler.generate_schema(Decimal)) class Model(BaseModel): x: Annotated[PdDecimal, PdDecimalMarker(), annotated_types.Gt(0)] assert isinstance(Model(x=1).x, PdDecimal) with pytest.raises(ValidationError) as exc_info: Model(x=-1) assert exc_info.value.errors(include_url=False) == [ { 'type': 'greater_than', 'loc': ('x',), 'msg': 'Input should be greater than 0', 'input': -1, 'ctx': {'gt': 0}, } ] @pytest.mark.parametrize('optional', [True, False]) def test_skip_validation(optional): type_hint = SkipValidation[int] if optional: type_hint = Optional[type_hint] @validate_call def my_function(y: type_hint): return repr(y) assert my_function('2') == "'2'" def test_skip_validation_model_reference(): class ModelA(BaseModel): x: int class ModelB(BaseModel): y: SkipValidation[ModelA] assert ModelB(y=123).y == 123 def test_skip_validation_serialization(): class A(BaseModel): x: SkipValidation[int] @field_serializer('x') def double_x(self, v): return v * 2 assert A(x=1).model_dump() == {'x': 2} assert A(x='abc').model_dump() == {'x': 'abcabc'} # no validation assert A(x='abc').model_dump_json() == '{"x":"abcabc"}' def test_skip_validation_json_schema(): class A(BaseModel): x: SkipValidation[int] assert A.model_json_schema() == { 'properties': {'x': {'title': 'X', 'type': 'integer'}}, 'required': ['x'], 'title': 'A', 'type': 'object', } def test_transform_schema(): ValidateStrAsInt = Annotated[str, GetPydanticSchema(lambda _s, h: core_schema.int_schema())] class Model(BaseModel): x: Optional[ValidateStrAsInt] assert Model(x=None).x is
None assert Model(x='1').x == 1 def test_transform_schema_for_first_party_class(): # Here, first party means you can define the `__get_pydantic_core_schema__` method on the class directly. class LowercaseStr(str): @classmethod def __get_pydantic_core_schema__( cls, source_type: Any, handler: GetCoreSchemaHandler, ) -> CoreSchema: schema = handler(str) return core_schema.no_info_after_validator_function(lambda v: v.lower(), schema) class Model(BaseModel): lower: LowercaseStr = Field(min_length=1) assert Model(lower='ABC').lower == 'abc' with pytest.raises(ValidationError) as exc_info: Model(lower='') # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'too_short', 'loc': ('lower',), 'msg': 'Value should have at least 1 item after validation, not 0', 'input': '', 'ctx': {'field_type': 'Value', 'min_length': 1, 'actual_length': 0}, } ] def test_constraint_dataclass() -> None: @dataclass(order=True) # need to make it inherit from int # because the PydanticKnownError requires it to be a number # but it's not really relevant to this test class MyDataclass(int): x: int ta = TypeAdapter(Annotated[MyDataclass, annotated_types.Gt(MyDataclass(0))]) assert ta.validate_python(MyDataclass(1)) == MyDataclass(1) with pytest.raises(ValidationError) as exc_info: ta.validate_python(MyDataclass(0)) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'greater_than', 'loc': (), 'msg': 'Input should be greater than 0', 'input': MyDataclass(0), 'ctx': {'gt': MyDataclass(0)}, } ] def test_transform_schema_for_third_party_class(): # Here, third party means you can't define methods on the class directly, so have to use annotations. class IntWrapper: # This is pretending to be a third-party class. This example is specifically inspired by pandas.Timestamp, # which can receive an item of type `datetime` as an input to its `__init__`.
# The important thing here is we are not defining any custom methods on this type directly. def __init__(self, t: int) -> None: self.t = t def __eq__(self, value: object) -> bool: if isinstance(value, IntWrapper): return self.t == value.t elif isinstance(value, int): return self.t == value return False def __gt__(self, value: object) -> bool: if isinstance(value, IntWrapper): return self.t > value.t elif isinstance(value, int): return self.t > value return NotImplemented class _IntWrapperAnnotation: # This is an auxiliary class that, when used as the first annotation for DatetimeWrapper, # ensures pydantic can produce a valid schema. @classmethod def __get_pydantic_core_schema__( cls, source_type: Any, handler: GetCoreSchemaHandler, ) -> CoreSchema: schema = handler.generate_schema(int) return core_schema.no_info_after_validator_function(IntWrapper, schema) # Giving a name to Annotated[IntWrapper, _IntWrapperAnnotation] makes it easier to use in code # where I want a field of type `IntWrapper` that works as desired with pydantic. PydanticDatetimeWrapper = Annotated[IntWrapper, _IntWrapperAnnotation] class Model(BaseModel): # The reason all of the above is necessary is specifically so that we get good behavior x: Annotated[PydanticDatetimeWrapper, annotated_types.Gt(123)] m = Model(x=1234) assert isinstance(m.x, IntWrapper) assert repr(m.x.t) == '1234' with pytest.raises(ValidationError) as exc_info: Model(x=1) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'greater_than', 'loc': ('x',), 'msg': 'Input should be greater than 123', 'input': 1, 'ctx': {'gt': 123}, } ] def test_iterable_arbitrary_type(): class CustomIterable(Iterable): def __init__(self, iterable): self.iterable = iterable def __iter__(self): return self def __next__(self): return next(self.iterable) with pytest.raises( PydanticSchemaGenerationError, match='Unable to generate pydantic-core schema for .*CustomIterable.*. 
Set `arbitrary_types_allowed=True`', ): class Model(BaseModel): x: CustomIterable def test_typing_extension_literal_field(): from typing_extensions import Literal class Model(BaseModel): foo: Literal['foo'] assert Model(foo='foo').foo == 'foo' def test_typing_literal_field(): from typing import Literal class Model(BaseModel): foo: Literal['foo'] assert Model(foo='foo').foo == 'foo' def test_instance_of_annotation(): class Model(BaseModel): # Note: the generic parameter gets ignored by runtime validation x: InstanceOf[Sequence[int]] class MyList(list): pass assert Model(x='abc').x == 'abc' assert type(Model(x=MyList([1, 2, 3])).x) is MyList with pytest.raises(ValidationError) as exc_info: Model(x=1) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'class': 'Sequence'}, 'input': 1, 'loc': ('x',), 'msg': 'Input should be an instance of Sequence', 'type': 'is_instance_of', } ] assert Model.model_validate_json('{"x": [1,2,3]}').x == [1, 2, 3] with pytest.raises(ValidationError) as exc_info: Model.model_validate_json('{"x": "abc"}') assert exc_info.value.errors(include_url=False) == [ {'input': 'abc', 'loc': ('x',), 'msg': 'Input should be a valid array', 'type': 'list_type'} ] def test_instanceof_invalid_core_schema(): class MyClass: pass class MyModel(BaseModel): a: InstanceOf[MyClass] b: Optional[InstanceOf[MyClass]] MyModel(a=MyClass(), b=None) MyModel(a=MyClass(), b=MyClass()) with pytest.raises(ValidationError) as exc_info: MyModel(a=1, b=1) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'class': 'test_instanceof_invalid_core_schema.<locals>.MyClass'}, 'input': 1, 'loc': ('a',), 'msg': 'Input should be an instance of ' 'test_instanceof_invalid_core_schema.<locals>.MyClass', 'type': 'is_instance_of', }, { 'ctx': {'class': 'test_instanceof_invalid_core_schema.<locals>.MyClass'}, 'input': 1, 'loc': ('b',), 'msg': 'Input should be an instance of ' 'test_instanceof_invalid_core_schema.<locals>.MyClass', 'type': 'is_instance_of', }, ] with pytest.raises(
PydanticInvalidForJsonSchema, match='Cannot generate a JsonSchema for core_schema.IsInstanceSchema' ): MyModel.model_json_schema() def test_instanceof_serialization(): class Inner(BaseModel): pass class SubInner(Inner): x: int class OuterStandard(BaseModel): inner: InstanceOf[Inner] assert OuterStandard(inner=SubInner(x=1)).model_dump() == {'inner': {}} class OuterAsAny(BaseModel): inner1: SerializeAsAny[InstanceOf[Inner]] inner2: InstanceOf[SerializeAsAny[Inner]] assert OuterAsAny(inner1=SubInner(x=2), inner2=SubInner(x=3)).model_dump() == { 'inner1': {'x': 2}, 'inner2': {'x': 3}, } def test_constraints_arbitrary_type() -> None: class CustomType: def __init__(self, v: Any) -> None: self.v = v def __eq__(self, o: object) -> bool: return self.v == o def __le__(self, o: object) -> bool: return self.v <= o def __lt__(self, o: object) -> bool: return self.v < o def __ge__(self, o: object) -> bool: return self.v >= o def __gt__(self, o: object) -> bool: return self.v > o def __mod__(self, o: Any) -> Any: return self.v % o def __len__(self) -> int: return len(self.v) def __repr__(self) -> str: return f'CustomType({self.v})' class Model(BaseModel): gt: Annotated[CustomType, annotated_types.Gt(0)] ge: Annotated[CustomType, annotated_types.Ge(0)] lt: Annotated[CustomType, annotated_types.Lt(0)] le: Annotated[CustomType, annotated_types.Le(0)] multiple_of: Annotated[CustomType, annotated_types.MultipleOf(2)] min_length: Annotated[CustomType, annotated_types.MinLen(1)] max_length: Annotated[CustomType, annotated_types.MaxLen(1)] predicate: Annotated[CustomType, annotated_types.Predicate(lambda x: x > 0)] not_multiple_of_3: Annotated[CustomType, annotated_types.Not(lambda x: x % 3 == 0)] model_config = ConfigDict(arbitrary_types_allowed=True) Model( gt=CustomType(1), ge=CustomType(0), lt=CustomType(-1), le=CustomType(0), min_length=CustomType([1, 2]), max_length=CustomType([1]), multiple_of=CustomType(4), predicate=CustomType(1), not_multiple_of_3=CustomType(4), ) with 
pytest.raises(ValidationError) as exc_info: Model( gt=CustomType(-1), ge=CustomType(-1), lt=CustomType(1), le=CustomType(1), min_length=CustomType([]), max_length=CustomType([1, 2, 3]), multiple_of=CustomType(3), predicate=CustomType(-1), not_multiple_of_3=CustomType(6), ) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'greater_than', 'loc': ('gt',), 'msg': 'Input should be greater than 0', 'input': CustomType(-1), 'ctx': {'gt': 0}, }, { 'type': 'greater_than_equal', 'loc': ('ge',), 'msg': 'Input should be greater than or equal to 0', 'input': CustomType(-1), 'ctx': {'ge': 0}, }, { 'type': 'less_than', 'loc': ('lt',), 'msg': 'Input should be less than 0', 'input': CustomType(1), 'ctx': {'lt': 0}, }, { 'type': 'less_than_equal', 'loc': ('le',), 'msg': 'Input should be less than or equal to 0', 'input': CustomType(1), 'ctx': {'le': 0}, }, { 'type': 'multiple_of', 'loc': ('multiple_of',), 'msg': 'Input should be a multiple of 2', 'input': CustomType(3), 'ctx': {'multiple_of': 2}, }, { 'type': 'too_short', 'loc': ('min_length',), 'msg': 'Value should have at least 1 item after validation, not 0', 'input': CustomType([]), 'ctx': {'field_type': 'Value', 'min_length': 1, 'actual_length': 0}, }, { 'type': 'too_long', 'loc': ('max_length',), 'msg': 'Value should have at most 1 item after validation, not 3', 'input': CustomType([1, 2, 3]), 'ctx': {'field_type': 'Value', 'max_length': 1, 'actual_length': 3}, }, { 'type': 'predicate_failed', 'loc': ('predicate',), 'msg': 'Predicate test_constraints_arbitrary_type.<locals>.Model.<lambda> failed', 'input': CustomType(-1), }, { 'type': 'not_operation_failed', 'loc': ('not_multiple_of_3',), 'msg': 'Not of test_constraints_arbitrary_type.<locals>.Model.<lambda> 
failed', 'input': CustomType(6), }, ] def test_annotated_default_value() -> None: t = TypeAdapter(Annotated[List[int], Field(default=['1', '2'])]) r = t.get_default_value() assert r is not None assert r.value == ['1', '2'] # insert_assert(t.json_schema()) assert t.json_schema() == {'type': 'array', 'items': {'type': 'integer'}, 'default': ['1', '2']} def test_annotated_default_value_validate_default() -> None: t = TypeAdapter(Annotated[List[int], Field(default=['1', '2'])], config=ConfigDict(validate_default=True)) r = t.get_default_value() assert r is not None assert r.value == [1, 2] # insert_assert(t.json_schema()) assert t.json_schema() == {'type': 'array', 'items': {'type': 'integer'}, 'default': ['1', '2']} def test_annotated_default_value_functional_validator() -> None: T = TypeVar('T') WithAfterValidator = Annotated[T, AfterValidator(lambda x: [v * 2 for v in x])] WithDefaultValue = Annotated[T, Field(default=['1', '2'])] # the order of the args should not matter, we always put the default value on the outside for tp in (WithDefaultValue[WithAfterValidator[List[int]]], WithAfterValidator[WithDefaultValue[List[int]]]): t = TypeAdapter(tp, config=ConfigDict(validate_default=True)) r = t.get_default_value() assert r is not None assert r.value == [2, 4] # insert_assert(t.json_schema()) assert t.json_schema() == {'type': 'array', 'items': {'type': 'integer'}, 'default': ['1', '2']} @pytest.mark.parametrize( 'pydantic_type,expected', ( (Json, 'Json'), (PastDate, 'PastDate'), (FutureDate, 'FutureDate'), (AwareDatetime, 'AwareDatetime'), (NaiveDatetime, 'NaiveDatetime'), (PastDatetime, 'PastDatetime'), (FutureDatetime, 'FutureDatetime'), (ImportString, 'ImportString'), ), ) def test_types_repr(pydantic_type, expected): assert repr(pydantic_type()) == expected def test_enum_custom_schema() -> None: class MyEnum(str, Enum): foo = 'FOO' bar = 'BAR' baz = 'BAZ' @classmethod def __get_pydantic_core_schema__( cls, source_type: Any, handler: GetCoreSchemaHandler, ) -> 
CoreSchema: # check that we can still call handler handler(source_type) # return a custom unrelated schema so we can test that # it gets used schema = core_schema.union_schema( [ core_schema.str_schema(), core_schema.is_instance_schema(cls), ] ) return core_schema.no_info_after_validator_function( function=lambda x: MyEnum(x.upper()) if isinstance(x, str) else x, schema=schema, serialization=core_schema.plain_serializer_function_ser_schema( lambda x: x.value, return_schema=core_schema.int_schema() ), ) ta = TypeAdapter(MyEnum) assert ta.validate_python('foo') == MyEnum.foo def test_get_pydantic_core_schema_marker_unrelated_type() -> None: """Test using handler.generate_schema() to generate a schema that ignores the current context of annotations and such """ @dataclass class Marker: num: int def __get_pydantic_core_schema__(self, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: schema = handler.resolve_ref_schema(handler.generate_schema(source_type)) return core_schema.no_info_after_validator_function(lambda x: x * self.num, schema) ta = TypeAdapter(Annotated[int, Marker(2), Marker(3)]) assert ta.validate_python('1') == 3 def test_string_constraints() -> None: ta = TypeAdapter( Annotated[str, StringConstraints(strip_whitespace=True, to_lower=True), AfterValidator(lambda x: x * 2)] ) assert ta.validate_python(' ABC ') == 'abcabc' def test_string_constraints_strict() -> None: ta = TypeAdapter(Annotated[str, StringConstraints(strict=False)]) assert ta.validate_python(b'123') == '123' ta = TypeAdapter(Annotated[str, StringConstraints(strict=True)]) with pytest.raises(ValidationError): ta.validate_python(b'123') def test_decimal_float_precision() -> None: """https://github.com/pydantic/pydantic/issues/6807""" ta = TypeAdapter(Decimal) assert ta.validate_json('1.1') == Decimal('1.1') assert ta.validate_python(1.1) == Decimal('1.1') assert ta.validate_json('"1.1"') == Decimal('1.1') assert ta.validate_python('1.1') == Decimal('1.1') assert 
ta.validate_json('1') == Decimal('1') assert ta.validate_python(1) == Decimal('1') def test_coerce_numbers_to_str_disabled_in_strict_mode() -> None: class Model(BaseModel): model_config = ConfigDict(strict=True, coerce_numbers_to_str=True) value: str with pytest.raises(ValidationError, match='value'): Model.model_validate({'value': 42}) with pytest.raises(ValidationError, match='value'): Model.model_validate_json('{"value": 42}') @pytest.mark.parametrize('value_param', [True, False]) def test_coerce_numbers_to_str_raises_for_bool(value_param: bool) -> None: class Model(BaseModel): model_config = ConfigDict(coerce_numbers_to_str=True) value: str with pytest.raises(ValidationError, match='value'): Model.model_validate({'value': value_param}) with pytest.raises(ValidationError, match='value'): if value_param is True: Model.model_validate_json('{"value": true}') elif value_param is False: Model.model_validate_json('{"value": false}') @pydantic_dataclass(config=ConfigDict(coerce_numbers_to_str=True)) class Model: value: str with pytest.raises(ValidationError, match='value'): Model(value=value_param) @pytest.mark.parametrize( ('number', 'expected_str'), [ pytest.param(42, '42', id='42'), pytest.param(42.0, '42.0', id='42.0'), pytest.param(Decimal('42.0'), '42.0', id="Decimal('42.0')"), ], ) def test_coerce_numbers_to_str(number: Number, expected_str: str) -> None: class Model(BaseModel): model_config = ConfigDict(coerce_numbers_to_str=True) value: str assert Model.model_validate({'value': number}).model_dump() == {'value': expected_str} @pytest.mark.parametrize( ('number', 'expected_str'), [ pytest.param('42', '42', id='42'), pytest.param('42.0', '42', id='42.0'), pytest.param('42.13', '42.13', id='42.13'), ], ) def test_coerce_numbers_to_str_from_json(number: str, expected_str: str) -> None: class Model(BaseModel): model_config = ConfigDict(coerce_numbers_to_str=True) value: str assert Model.model_validate_json(f'{{"value": {number}}}').model_dump() == {'value': 
expected_str} def test_union_tags_in_errors(): DoubledList = Annotated[List[int], AfterValidator(lambda x: x * 2)] StringsMap = Dict[str, str] adapter = TypeAdapter(Union[DoubledList, StringsMap]) with pytest.raises(ValidationError) as exc_info: adapter.validate_python(['a']) # yuck assert '2 validation errors for union[function-after[(), list[int]],dict[str,str]]' in str(exc_info) # the loc's are bad here: assert exc_info.value.errors(include_url=False) == [ { 'input': 'a', 'loc': ('function-after[(), list[int]]', 0), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', }, { 'input': ['a'], 'loc': ('dict[str,str]',), 'msg': 'Input should be a valid dictionary', 'type': 'dict_type', }, ] tag_adapter = TypeAdapter( Union[Annotated[DoubledList, Tag('DoubledList')], Annotated[StringsMap, Tag('StringsMap')]] ) with pytest.raises(ValidationError) as exc_info: tag_adapter.validate_python(['a']) assert '2 validation errors for union[DoubledList,StringsMap]' in str(exc_info) # nice # the loc's are good here: assert exc_info.value.errors(include_url=False) == [ { 'input': 'a', 'loc': ('DoubledList', 0), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', }, { 'input': ['a'], 'loc': ('StringsMap',), 'msg': 'Input should be a valid dictionary', 'type': 'dict_type', }, ] def test_json_value(): adapter = TypeAdapter(JsonValue) valid_json_data = {'a': {'b': {'c': 1, 'd': [2, None]}}} # would pass validation as a dict[str, Any] invalid_json_data = {'a': {'b': ...}} assert adapter.validate_python(valid_json_data) == valid_json_data assert adapter.validate_json(json.dumps(valid_json_data)) == valid_json_data with pytest.raises(ValidationError) as exc_info: adapter.validate_python(invalid_json_data) assert exc_info.value.errors() == [ { 'input': Ellipsis, 'loc': ('dict', 'a', 'dict', 'b'), 'msg': 'input was not a valid JSON value', 'type': 'invalid-json-value', } ] def 
test_json_value_with_subclassed_types(): class IntType(int): pass class FloatType(float): pass class StrType(str): pass class ListType(list): pass class DictType(dict): pass adapter = TypeAdapter(JsonValue) valid_json_data = {'int': IntType(), 'float': FloatType(), 'str': StrType(), 'list': ListType(), 'dict': DictType()} assert adapter.validate_python(valid_json_data) == valid_json_data def test_json_value_roundtrip() -> None: # see https://github.com/pydantic/pydantic/issues/8175 class MyModel(BaseModel): val: Union[str, JsonValue] round_trip_value = json.loads(MyModel(val=True).model_dump_json())['val'] assert round_trip_value is True, round_trip_value def test_on_error_omit() -> None: OmittableInt = OnErrorOmit[int] class MyTypedDict(TypedDict): a: NotRequired[OmittableInt] b: NotRequired[OmittableInt] class Model(BaseModel): a_list: List[OmittableInt] a_dict: Dict[OmittableInt, OmittableInt] a_typed_dict: MyTypedDict actual = Model( a_list=[1, 2, 'a', 3], a_dict={1: 1, 2: 2, 'a': 'a', 'b': 0, 3: 'c', 4: 4}, a_typed_dict=MyTypedDict(a=1, b='xyz'), # type: ignore ) expected = Model(a_list=[1, 2, 3], a_dict={1: 1, 2: 2, 4: 4}, a_typed_dict=MyTypedDict(a=1)) assert actual == expected def test_on_error_omit_top_level() -> None: ta = TypeAdapter(OnErrorOmit[int]) assert ta.validate_python(1) == 1 assert ta.validate_python('1') == 1 # we might want to just raise the OmitError or convert it to a ValidationError # if it hits the top level, but this documents the current behavior at least with pytest.raises(SchemaError, match='Uncaught Omit error'): ta.validate_python('a') def test_diff_enums_diff_configs() -> None: class MyEnum(str, Enum): A = 'a' class MyModel(BaseModel, use_enum_values=True): my_enum: MyEnum class OtherModel(BaseModel): my_enum: MyEnum class Model(BaseModel): my_model: MyModel other_model: OtherModel obj = Model.model_validate({'my_model': {'my_enum': 'a'}, 'other_model': {'my_enum': 'a'}}) assert not isinstance(obj.my_model.my_enum, MyEnum) assert 
isinstance(obj.other_model.my_enum, MyEnum) def test_can_serialize_deque_passed_to_sequence() -> None: ta = TypeAdapter(Sequence[int]) my_dec = ta.validate_python(deque([1, 2, 3])) assert my_dec == deque([1, 2, 3]) assert ta.dump_python(my_dec) == my_dec assert ta.dump_json(my_dec) == b'[1,2,3]' def test_strict_enum_with_use_enum_values() -> None: class SomeEnum(int, Enum): SOME_KEY = 1 class Foo(BaseModel): model_config = ConfigDict(strict=False, use_enum_values=True) foo: Annotated[SomeEnum, Strict(strict=True)] f = Foo(foo=SomeEnum.SOME_KEY) assert f.foo == 1 # validation error raised bc foo field uses strict mode with pytest.raises(ValidationError): Foo(foo='1') @pytest.mark.skipif( platform.python_implementation() == 'PyPy', reason='PyPy has a bug in complex string parsing. A fix is implemented but not yet released.', ) def test_complex_field(): class Model(BaseModel): number: complex m = Model(number=complex(1, 2)) assert repr(m) == 'Model(number=(1+2j))' assert m.model_dump() == {'number': complex(1, 2)} assert m.model_dump_json() == '{"number":"1+2j"}' # Complex numbers presented as strings are also acceptable m = Model(number='1+2j') assert repr(m) == 'Model(number=(1+2j))' assert m.model_dump() == {'number': complex(1, 2)} assert m.model_dump_json() == '{"number":"1+2j"}' # The part that is 0 can be omitted m = Model(number='1') assert repr(m) == 'Model(number=(1+0j))' assert m.model_dump() == {'number': complex(1, 0)} assert m.model_dump_json() == '{"number":"1+0j"}' m = Model(number='1j') assert repr(m) == 'Model(number=1j)' assert m.model_dump() == {'number': complex(0, 1)} assert m.model_dump_json() == '{"number":"1j"}' m = Model(number='0') assert repr(m) == 'Model(number=0j)' assert m.model_dump() == {'number': complex(0, 0)} assert m.model_dump_json() == '{"number":"0j"}' m = Model(number='infj') assert repr(m) == 'Model(number=infj)' assert m.model_dump() == {'number': complex(0, float('inf'))} assert m.model_dump_json() == '{"number":"infj"}' m = 
Model(number='-nanj') assert repr(m) == 'Model(number=nanj)' d = m.model_dump() assert d['number'].real == 0 assert math.isnan(d['number'].imag) assert m.model_dump_json() == '{"number":"NaNj"}' # strings with brackets and space characters are allowed as long as # they follow the rule m = Model(number='\t( -1.23+4.5J )\n') assert repr(m) == 'Model(number=(-1.23+4.5j))' assert m.model_dump() == {'number': complex(-1.23, 4.5)} assert m.model_dump_json() == '{"number":"-1.23+4.5j"}' # int and float are also accepted (with imaginary part == 0) m = Model(number=2) assert repr(m) == 'Model(number=(2+0j))' assert m.model_dump() == {'number': complex(2, 0)} assert m.model_dump_json() == '{"number":"2+0j"}' m = Model(number=1.5) assert repr(m) == 'Model(number=(1.5+0j))' assert m.model_dump() == {'number': complex(1.5, 0)} assert m.model_dump_json() == '{"number":"1.5+0j"}' # Empty strings are not allowed with pytest.raises(ValidationError): Model(number='') with pytest.raises(ValidationError): Model(number='foo') # Bracket missing with pytest.raises(ValidationError): Model(number='\t( -1.23+4.5J \n') # Space between numbers with pytest.raises(ValidationError): Model(number='\t( -1.23 +4.5J \n') def test_strict_complex_field(): class Model(BaseModel): # Only complex objects are accepted number: Annotated[complex, Field(strict=True)] m = Model(number=complex(1, 2)) assert repr(m) == 'Model(number=(1+2j))' assert m.model_dump() == {'number': complex(1, 2)} assert m.model_dump_json() == '{"number":"1+2j"}' with pytest.raises(ValidationError): m = Model(number='1+2j') with pytest.raises(ValidationError): m = Model(number=1.0) with pytest.raises(ValidationError): m = Model(number=5) def test_python_re_respects_flags() -> None: class Model(BaseModel): a: Annotated[str, StringConstraints(pattern=re.compile(r'[A-Z]+', re.IGNORECASE))] model_config = ConfigDict(regex_engine='python-re') # allows lowercase letters, even though the pattern is uppercase only due to the IGNORECASE flag 
    assert Model(a='abc').a == 'abc'


@pytest.mark.skipif(not email_validator, reason='email_validator not installed')
def test_constraints_on_str_like() -> None:
    """See https://github.com/pydantic/pydantic/issues/8577 for motivation."""

    class Foo(BaseModel):
        baz: Annotated[EmailStr, StringConstraints(to_lower=True, strip_whitespace=True)]

    assert Foo(baz=' uSeR@ExAmPlE.com ').baz == 'user@example.com'


@pytest.mark.parametrize(
    'tp',
    [
        pytest.param(List[int], id='list'),
        pytest.param(Tuple[int, ...], id='tuple'),
        pytest.param(Set[int], id='set'),
        pytest.param(FrozenSet[int], id='frozenset'),
    ],
)
@pytest.mark.parametrize(
    ['fail_fast', 'decl'],
    [
        pytest.param(True, FailFast(), id='fail-fast-default'),
        pytest.param(True, FailFast(True), id='fail-fast-true'),
        pytest.param(False, FailFast(False), id='fail-fast-false'),
        pytest.param(False, Field(), id='field-default'),
        pytest.param(True, Field(fail_fast=True), id='field-true'),
        pytest.param(False, Field(fail_fast=False), id='field-false'),
    ],
)
def test_fail_fast(tp, fail_fast, decl) -> None:
    class Foo(BaseModel):
        a: Annotated[tp, decl]

    with pytest.raises(ValidationError) as exc_info:
        Foo(a=[1, 'a', 'c'])

    errors = [
        {
            'input': 'a',
            'loc': ('a', 1),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'type': 'int_parsing',
        },
    ]

    if not fail_fast:
        errors.append(
            {
                'input': 'c',
                'loc': ('a', 2),
                'msg': 'Input should be a valid integer, unable to parse string as an integer',
                'type': 'int_parsing',
            },
        )

    assert exc_info.value.errors(include_url=False) == errors


def test_mutable_mapping() -> None:
    """Addresses https://github.com/pydantic/pydantic/issues/9549.

    Note - we still don't do a good job of handling subclasses, as we convert
    the input to a dict via the MappingValidator annotation's schema.
""" import collections.abc adapter = TypeAdapter(collections.abc.MutableMapping, config=ConfigDict(arbitrary_types_allowed=True, strict=True)) assert isinstance(adapter.validate_python(collections.UserDict()), collections.abc.MutableMapping) def test_ser_ip_with_union() -> None: bool_first = TypeAdapter(Union[bool, ipaddress.IPv4Address]) assert bool_first.dump_python(True, mode='json') is True assert bool_first.dump_json(True) == b'true' ip_first = TypeAdapter(Union[ipaddress.IPv4Address, bool]) assert ip_first.dump_python(True, mode='json') is True assert ip_first.dump_json(True) == b'true' def test_ser_ip_with_unexpected_value() -> None: ta = TypeAdapter(ipaddress.IPv4Address) with pytest.warns(UserWarning, match='serialized value may not be as expected.'): assert ta.dump_python(123) def test_ser_ip_python_and_json() -> None: ta = TypeAdapter(ipaddress.IPv4Address) ip = ta.validate_python('127.0.0.1') assert ta.dump_python(ip) == ip assert ta.dump_python(ip, mode='json') == '127.0.0.1' assert ta.dump_json(ip) == b'"127.0.0.1"' @pytest.mark.parametrize('input_data', ['1/3', 1.333, Fraction(1, 3), Decimal('1.333')]) def test_fraction_validation_lax(input_data) -> None: ta = TypeAdapter(Fraction) fraction = ta.validate_python(input_data) assert isinstance(fraction, Fraction) def test_fraction_validation_strict() -> None: ta = TypeAdapter(Fraction, config=ConfigDict(strict=True)) assert ta.validate_python(Fraction(1 / 3)) == Fraction(1 / 3) # only fractions accepted in strict mode for lax_fraction in ['1/3', 1.333, Decimal('1.333')]: with pytest.raises(ValidationError): ta.validate_python(lax_fraction) def test_fraction_serialization() -> None: ta = TypeAdapter(Fraction) assert ta.dump_python(Fraction(1, 3)) == '1/3' assert ta.dump_json(Fraction(1, 3)) == b'"1/3"' def test_fraction_json_schema() -> None: ta = TypeAdapter(Fraction) assert ta.json_schema() == {'type': 'string', 'format': 'fraction'} def test_annotated_metadata_any_order() -> None: def validator(v): if 
isinstance(v, (int, float)): return timedelta(days=v) return v class BeforeValidatorAfterLe(BaseModel): v: Annotated[timedelta, annotated_types.Le(timedelta(days=365)), BeforeValidator(validator)] class BeforeValidatorBeforeLe(BaseModel): v: Annotated[timedelta, BeforeValidator(validator), annotated_types.Le(timedelta(days=365))] try: BeforeValidatorAfterLe(v=366) except ValueError as ex: assert '365 days' in str(ex) # in this case, the Le constraint comes after the BeforeValidator, so we use functional validators # from pydantic._internal._validators.py (in this case, less_than_or_equal_validator) # which doesn't have access to fancy pydantic-core formatting for timedelta, so we get # the raw timedelta repr in the error `datetime.timedelta(days=365`` vs the above `365 days` try: BeforeValidatorBeforeLe(v=366) except ValueError as ex: assert 'datetime.timedelta(days=365)' in str(ex) @pytest.mark.parametrize('base64_type', [Base64Bytes, Base64Str, Base64UrlBytes, Base64UrlStr]) def test_base64_with_invalid_min_length(base64_type) -> None: """Check that an error is raised when the length of the base64 type's value is less or more than the min_length and max_length.""" class Model(BaseModel): base64_value: base64_type = Field(min_length=3, max_length=5) # type: ignore with pytest.raises(ValidationError): Model(**{'base64_value': b''}) with pytest.raises(ValidationError): Model(**{'base64_value': b'123456'}) def test_serialize_as_any_secret_types() -> None: ta_secret_str = TypeAdapter(SecretStr) secret_str = ta_secret_str.validate_python('secret') ta_any = TypeAdapter(Any) assert ta_any.dump_python(secret_str) == secret_str assert ta_any.dump_python(secret_str, mode='json') == '**********' assert ta_any.dump_json(secret_str) == b'"**********"' ta_secret_bytes = TypeAdapter(SecretBytes) secret_bytes = ta_secret_bytes.validate_python(b'secret') assert ta_any.dump_python(secret_bytes) == secret_bytes assert ta_any.dump_python(secret_bytes, mode='json') == '**********' 
assert ta_any.dump_json(secret_bytes) == b'"**********"' ta_secret_date = TypeAdapter(SecretDate) secret_date = ta_secret_date.validate_python('2024-01-01') assert ta_any.dump_python(secret_date) == secret_date assert ta_any.dump_python(secret_date, mode='json') == '****/**/**' assert ta_any.dump_json(secret_date) == b'"****/**/**"' def test_custom_serializer_override_secret_str() -> None: class User(BaseModel): name: str password: Annotated[SecretStr, PlainSerializer(lambda x: f'secret: {str(x)}')] u = User(name='sam', password='hi') assert u.model_dump()['password'] == 'secret: **********' pydantic-2.10.6/tests/test_types_namedtuple.py000066400000000000000000000166411474456633400216000ustar00rootroot00000000000000from collections import namedtuple from typing import Generic, NamedTuple, Optional, Tuple, TypeVar import pytest from typing_extensions import NamedTuple as TypingExtensionsNamedTuple from pydantic import BaseModel, ConfigDict, PositiveInt, TypeAdapter, ValidationError from pydantic.errors import PydanticSchemaGenerationError def test_namedtuple_simple(): Position = namedtuple('Pos', 'x y') class Model(BaseModel): pos: Position model = Model(pos=('1', 2)) assert isinstance(model.pos, Position) assert model.pos.x == '1' assert model.pos == Position('1', 2) model = Model(pos={'x': '1', 'y': 2}) assert model.pos == Position('1', 2) def test_namedtuple(): class Event(NamedTuple): a: int b: int c: int d: str class Model(BaseModel): # pos: Position event: Event model = Model(event=(b'1', '2', 3, 'qwe')) assert isinstance(model.event, Event) assert model.event == Event(1, 2, 3, 'qwe') assert repr(model) == "Model(event=Event(a=1, b=2, c=3, d='qwe'))" with pytest.raises(ValidationError) as exc_info: Model(pos=('1', 2), event=['qwe', '2', 3, 'qwe']) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('event', 0), 'msg': 'Input should be a valid integer, unable to parse 
string as an integer', 'input': 'qwe', } ] def test_namedtuple_schema(): class Position1(NamedTuple): x: int y: int Position2 = namedtuple('Position2', 'x y') class Model(BaseModel): pos1: Position1 pos2: Position2 pos3: Tuple[int, int] assert Model.model_json_schema() == { 'title': 'Model', 'type': 'object', '$defs': { 'Position1': { 'maxItems': 2, 'minItems': 2, 'prefixItems': [{'title': 'X', 'type': 'integer'}, {'title': 'Y', 'type': 'integer'}], 'type': 'array', }, 'Position2': { 'maxItems': 2, 'minItems': 2, 'prefixItems': [{'title': 'X'}, {'title': 'Y'}], 'type': 'array', }, }, 'properties': { 'pos1': {'$ref': '#/$defs/Position1'}, 'pos2': {'$ref': '#/$defs/Position2'}, 'pos3': { 'maxItems': 2, 'minItems': 2, 'prefixItems': [{'type': 'integer'}, {'type': 'integer'}], 'title': 'Pos3', 'type': 'array', }, }, 'required': ['pos1', 'pos2', 'pos3'], } def test_namedtuple_right_length(): class Point(NamedTuple): x: int y: int class Model(BaseModel): p: Point assert isinstance(Model(p=(1, 2)), Model) with pytest.raises(ValidationError) as exc_info: Model(p=(1, 2, 3)) # insert_assert(exc_info.value.errors(include_url=False)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'unexpected_positional_argument', 'loc': ('p', 2), 'msg': 'Unexpected positional argument', 'input': 3, } ] def test_namedtuple_postponed_annotation(): """ https://github.com/pydantic/pydantic/issues/2760 """ class Tup(NamedTuple): v: 'PositiveInt' class Model(BaseModel): t: Tup # The effect of issue #2760 is that this call raises a `PydanticUserError` even though the type declared on `Tup.v` # references a binding in this module's global scope. 
    with pytest.raises(ValidationError):
        Model.model_validate({'t': [-1]})


def test_namedtuple_different_module(create_module) -> None:
    """https://github.com/pydantic/pydantic/issues/10336"""

    @create_module
    def other_module():
        from typing import NamedTuple

        TestIntOtherModule = int

        class Tup(NamedTuple):
            f: 'TestIntOtherModule'

    class Model(BaseModel):
        tup: other_module.Tup

    assert Model(tup={'f': 1}).tup.f == 1


def test_namedtuple_arbitrary_type():
    class CustomClass:
        pass

    class Tup(NamedTuple):
        c: CustomClass

    class Model(BaseModel):
        x: Tup

        model_config = ConfigDict(arbitrary_types_allowed=True)

    data = {'x': Tup(c=CustomClass())}
    model = Model.model_validate(data)
    assert isinstance(model.x.c, CustomClass)

    with pytest.raises(PydanticSchemaGenerationError):

        class ModelNoArbitraryTypes(BaseModel):
            x: Tup


def test_recursive_namedtuple():
    class MyNamedTuple(NamedTuple):
        x: int
        y: Optional['MyNamedTuple']

    ta = TypeAdapter(MyNamedTuple)
    assert ta.validate_python({'x': 1, 'y': {'x': 2, 'y': None}}) == (1, (2, None))

    with pytest.raises(ValidationError) as exc_info:
        ta.validate_python({'x': 1, 'y': {'x': 2, 'y': {'x': 'a', 'y': None}}})
    assert exc_info.value.errors(include_url=False) == [
        {
            'input': 'a',
            'loc': ('y', 'y', 'x'),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'type': 'int_parsing',
        }
    ]


def test_recursive_generic_namedtuple():
    # Need to use TypingExtensionsNamedTuple to make it work with Python <3.11
    T = TypeVar('T')

    class MyNamedTuple(TypingExtensionsNamedTuple, Generic[T]):
        x: T
        y: Optional['MyNamedTuple[T]']

    ta = TypeAdapter(MyNamedTuple[int])
    assert ta.validate_python({'x': 1, 'y': {'x': 2, 'y': None}}) == (1, (2, None))

    with pytest.raises(ValidationError) as exc_info:
        ta.validate_python({'x': 1, 'y': {'x': 2, 'y': {'x': 'a', 'y': None}}})
    assert exc_info.value.errors(include_url=False) == [
        {
            'input': 'a',
            'loc': ('y', 'y', 'x'),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'type': 'int_parsing',
        }
    ]


def test_namedtuple_defaults():
    class NT(NamedTuple):
        x: int
        y: int = 33

    assert TypeAdapter(NT).validate_python([1]) == (1, 33)
    assert TypeAdapter(NT).validate_python({'x': 22}) == (22, 33)


def test_eval_type_backport():
    class MyNamedTuple(NamedTuple):
        foo: 'list[int | str]'

    class Model(BaseModel):
        t: MyNamedTuple

    assert Model(t=([1, '2'],)).model_dump() == {'t': ([1, '2'],)}

    with pytest.raises(ValidationError) as exc_info:
        Model(t=('not a list',))
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'list_type',
            'loc': ('t', 0),
            'msg': 'Input should be a valid list',
            'input': 'not a list',
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        Model(t=([{'not a str or int'}],))
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_type',
            'loc': ('t', 0, 0, 'int'),
            'msg': 'Input should be a valid integer',
            'input': {'not a str or int'},
        },
        {
            'type': 'string_type',
            'loc': ('t', 0, 0, 'str'),
            'msg': 'Input should be a valid string',
            'input': {'not a str or int'},
        },
    ]


# ===== pydantic-2.10.6/tests/test_types_payment_card_number.py =====

from collections import namedtuple
from typing import Any

import pytest
from pydantic_core import PydanticCustomError

from pydantic import BaseModel, ValidationError
from pydantic.types import PaymentCardBrand, PaymentCardNumber

pytestmark = pytest.mark.filterwarnings(
    'ignore:The `PaymentCardNumber` class is deprecated, use `pydantic_extra_types` instead.*:DeprecationWarning'
)

VALID_AMEX = '370000000000002'
VALID_MC = '5100000000000003'
VALID_VISA_13 = '4050000000001'
VALID_VISA_16 = '4050000000000001'
VALID_VISA_19 = '4050000000000000001'
VALID_OTHER = '2000000000000000008'
LUHN_INVALID = '4000000000000000'
LEN_INVALID = '40000000000000006'

# Mock PaymentCardNumber
PCN = namedtuple('PaymentCardNumber', ['card_number', 'brand'])
PCN.__len__ = lambda v: len(v.card_number)


@pytest.fixture(scope='session', name='PaymentCard')
def payment_card_model_fixture():
    class PaymentCard(BaseModel):
        card_number: PaymentCardNumber

    return PaymentCard


def test_validate_digits():
    digits = '12345'
    assert PaymentCardNumber.validate_digits(digits) is None

    with pytest.raises(PydanticCustomError, match='Card number is not all digits'):
        PaymentCardNumber.validate_digits('hello')


@pytest.mark.parametrize(
    'card_number, valid',
    [
        ('0', True),
        ('00', True),
        ('18', True),
        ('0000000000000000', True),
        ('4242424242424240', False),
        ('4242424242424241', False),
        ('4242424242424242', True),
        ('4242424242424243', False),
        ('4242424242424244', False),
        ('4242424242424245', False),
        ('4242424242424246', False),
        ('4242424242424247', False),
        ('4242424242424248', False),
        ('4242424242424249', False),
        ('42424242424242426', True),
        ('424242424242424267', True),
        ('4242424242424242675', True),
        ('5164581347216566', True),
        ('4345351087414150', True),
        ('343728738009846', True),
        ('5164581347216567', False),
        ('4345351087414151', False),
        ('343728738009847', False),
        ('000000018', True),
        ('99999999999999999999', True),
        ('99999999999999999999999999999999999999999999999999999999999999999997', True),
    ],
)
def test_validate_luhn_check_digit(card_number: str, valid: bool):
    if valid:
        assert PaymentCardNumber.validate_luhn_check_digit(card_number) == card_number
    else:
        with pytest.raises(PydanticCustomError, match='Card number is not luhn valid'):
            PaymentCardNumber.validate_luhn_check_digit(card_number)


@pytest.mark.parametrize(
    'card_number, brand, valid',
    [
        (VALID_VISA_13, PaymentCardBrand.visa, True),
        (VALID_VISA_16, PaymentCardBrand.visa, True),
        (VALID_VISA_19, PaymentCardBrand.visa, True),
        (VALID_MC, PaymentCardBrand.mastercard, True),
        (VALID_AMEX, PaymentCardBrand.amex, True),
        (VALID_OTHER, PaymentCardBrand.other, True),
        (LEN_INVALID, PaymentCardBrand.visa, False),
    ],
)
def test_length_for_brand(card_number: str, brand: PaymentCardBrand, valid: bool):
    # pcn = PCN(card_number, brand)
    if valid:
        assert PaymentCardNumber.validate_brand(card_number) == brand
    else:
        with pytest.raises(PydanticCustomError) as exc_info:
            PaymentCardNumber.validate_brand(card_number)
        assert exc_info.value.type == 'payment_card_number_brand'


@pytest.mark.parametrize(
    'card_number, brand',
    [
        (VALID_AMEX, PaymentCardBrand.amex),
        (VALID_MC, PaymentCardBrand.mastercard),
        (VALID_VISA_16, PaymentCardBrand.visa),
        (VALID_OTHER, PaymentCardBrand.other),
    ],
)
def test_get_brand(card_number: str, brand: PaymentCardBrand):
    assert PaymentCardNumber.validate_brand(card_number) == brand


def test_valid(PaymentCard):
    card = PaymentCard(card_number=VALID_VISA_16)
    assert str(card.card_number) == VALID_VISA_16
    assert card.card_number.masked == '405000******0001'


@pytest.mark.parametrize(
    'card_number, error_message',
    [
        (None, 'type=string_type'),
        ('1' * 11, 'type=string_too_short,'),
        ('1' * 20, 'type=string_too_long,'),
        ('h' * 16, 'type=payment_card_number_digits'),
        (LUHN_INVALID, 'type=payment_card_number_luhn,'),
        (LEN_INVALID, 'type=payment_card_number_brand,'),
    ],
)
def test_error_types(card_number: Any, error_message: str, PaymentCard):
    with pytest.raises(ValidationError, match=error_message):
        PaymentCard(card_number=card_number)


def test_payment_card_brand():
    b = PaymentCardBrand.visa
    assert str(b) == 'Visa'
    assert b is PaymentCardBrand.visa
    assert b == PaymentCardBrand.visa
    assert b in {PaymentCardBrand.visa, PaymentCardBrand.mastercard}

    b = 'Visa'
    assert b is not PaymentCardBrand.visa
    assert b == PaymentCardBrand.visa
    assert b in {PaymentCardBrand.visa, PaymentCardBrand.mastercard}

    b = PaymentCardBrand.amex
    assert b is not PaymentCardBrand.visa
    assert b != PaymentCardBrand.visa
    assert b not in {PaymentCardBrand.visa, PaymentCardBrand.mastercard}


# ===== pydantic-2.10.6/tests/test_types_self.py =====

import dataclasses
import re
import sys
import typing
from typing import List, Optional, Type, Union

import pytest
import typing_extensions
from typing_extensions import NamedTuple, TypedDict

from pydantic import BaseModel, Field, PydanticUserError, TypeAdapter, ValidationError, computed_field, validate_call


@pytest.fixture(
    name='Self',
    params=[
        pytest.param(typing, id='typing.Self'),
        pytest.param(typing_extensions, id='t_e.Self'),
    ],
)
def fixture_self_all(request):
    try:
        return request.param.Self
    except AttributeError:
        pytest.skip(f'Self is not available from {request.param}')


def test_recursive_model(Self):
    class SelfRef(BaseModel):
        data: int
        ref: typing.Optional[Self] = None

    assert SelfRef(data=1, ref={'data': 2}).model_dump() == {'data': 1, 'ref': {'data': 2, 'ref': None}}


def test_recursive_model_invalid(Self):
    class SelfRef(BaseModel):
        data: int
        ref: typing.Optional[Self] = None

    with pytest.raises(
        ValidationError,
        match=r'ref\.ref\s+Input should be a valid dictionary or instance of SelfRef \[type=model_type,',
    ):
        SelfRef(data=1, ref={'data': 2, 'ref': 3}).model_dump()


def test_recursive_model_with_subclass(Self):
    """Self refs should be valid and should reference the correct class in covariant direction"""

    class SelfRef(BaseModel):
        x: int
        ref: Self | None = None

    class SubSelfRef(SelfRef):
        y: int

    assert SubSelfRef(x=1, ref=SubSelfRef(x=3, y=4), y=2).model_dump() == {
        'x': 1,
        'ref': {'x': 3, 'ref': None, 'y': 4},  # SubSelfRef.ref: SubSelfRef
        'y': 2,
    }
    assert SelfRef(x=1, ref=SubSelfRef(x=2, y=3)).model_dump() == {
        'x': 1,
        'ref': {'x': 2, 'ref': None},
    }  # SelfRef.ref: SelfRef


def test_recursive_model_with_subclass_invalid(Self):
    """Self refs are invalid in contravariant direction"""

    class SelfRef(BaseModel):
        x: int
        ref: Self | None = None

    class SubSelfRef(SelfRef):
        y: int

    with pytest.raises(
        ValidationError,
        match=r'ref\s+Input should be a valid dictionary or instance of SubSelfRef \[type=model_type,',
    ):
        SubSelfRef(x=1, ref=SelfRef(x=2), y=3).model_dump()


def test_recursive_model_with_subclass_override(Self):
    """Self refs should be overridable"""

    class SelfRef(BaseModel):
        x: int
        ref: Self | None = None

    class SubSelfRef(SelfRef):
        y: int
        ref: Optional[Union[SelfRef, Self]] = None

    assert SubSelfRef(x=1, ref=SubSelfRef(x=3, y=4), y=2).model_dump() == {
        'x': 1,
        'ref': {'x': 3, 'ref': None, 'y': 4},
        'y': 2,
    }
    assert SubSelfRef(x=1, ref=SelfRef(x=3, y=4), y=2).model_dump() == {
        'x': 1,
        'ref': {'x': 3, 'ref': None},
        'y': 2,
    }


def test_self_type_with_field(Self):
    class SelfRef(BaseModel):
        x: int
        refs: typing.List[Self] = Field(gt=0)

    with pytest.raises(TypeError, match=re.escape("Unable to apply constraint 'gt' to supplied value []")):
        SelfRef(x=1, refs=[SelfRef(x=2, refs=[])])


def test_self_type_json_schema(Self):
    class SelfRef(BaseModel):
        x: int
        refs: Optional[List[Self]] = []

    assert SelfRef.model_json_schema() == {
        '$defs': {
            'SelfRef': {
                'properties': {
                    'x': {'title': 'X', 'type': 'integer'},
                    'refs': {
                        'anyOf': [{'items': {'$ref': '#/$defs/SelfRef'}, 'type': 'array'}, {'type': 'null'}],
                        'default': [],
                        'title': 'Refs',
                    },
                },
                'required': ['x'],
                'title': 'SelfRef',
                'type': 'object',
            }
        },
        '$ref': '#/$defs/SelfRef',
    }


def test_self_type_in_named_tuple(Self):
    class SelfRefNamedTuple(NamedTuple):
        x: int
        ref: Self | None

    ta = TypeAdapter(SelfRefNamedTuple)
    assert ta.validate_python({'x': 1, 'ref': {'x': 2, 'ref': None}}) == (1, (2, None))


def test_self_type_in_typed_dict(Self):
    class SelfRefTypedDict(TypedDict):
        x: int
        ref: Self | None

    ta = TypeAdapter(SelfRefTypedDict)
    assert ta.validate_python({'x': 1, 'ref': {'x': 2, 'ref': None}}) == {'x': 1, 'ref': {'x': 2, 'ref': None}}


def test_self_type_in_dataclass(Self):
    @dataclasses.dataclass(frozen=True)
    class SelfRef:
        x: int
        ref: Self | None

    class Model(BaseModel):
        item: SelfRef

    m = Model.model_validate({'item': {'x': 1, 'ref': {'x': 2, 'ref': None}}})
    assert m.item.x == 1
    assert m.item.ref.x == 2

    with pytest.raises(dataclasses.FrozenInstanceError):
        m.item.ref.x = 3


def test_invalid_validate_call(Self):
    with pytest.raises(PydanticUserError, match='`typing.Self` is invalid in this context'):

        @validate_call
        def foo(self: Self):
            pass


def test_invalid_validate_call_of_method(Self):
    with pytest.raises(PydanticUserError, match='`typing.Self` is invalid in this context'):

        class A(BaseModel):
            @validate_call
            def foo(self: Self):
                pass


def test_type_of_self(Self):
    class A(BaseModel):
        self_type: Type[Self]

        @computed_field
        def self_types1(self) -> List[Type[Self]]:
            return [type(self), self.self_type]

        # make sure forward refs are supported:
        @computed_field
        def self_types2(self) -> List[Type['Self']]:
            return [type(self), self.self_type]

        @computed_field
        def self_types3(self) -> 'List[Type[Self]]':
            return [type(self), self.self_type]

        if sys.version_info >= (3, 9):
            # standard container types are supported in 3.9+
            @computed_field
            def self_types4(self) -> 'list[type[Self]]':
                return [type(self), self.self_type]

            @computed_field
            def self_types5(self) -> list['type[Self]']:
                return [type(self), self.self_type]

    class B(A): ...

    A(self_type=A)
    A(self_type=B)
    B(self_type=B)

    a = A(self_type=B)
    for prop in (a.self_types1, a.self_types2, a.self_types3):
        assert prop == [A, B]

    for invalid_type in (type, int, A, object):
        with pytest.raises(ValidationError) as exc_info:
            B(self_type=invalid_type)

        assert exc_info.value.errors(include_url=False) == [
            {
                'type': 'is_subclass_of',
                'loc': ('self_type',),
                'msg': f'Input should be a subclass of {B.__qualname__}',
                'input': invalid_type,
                'ctx': {'class': B.__qualname__},
            }
        ]


# ===== pydantic-2.10.6/tests/test_types_typeddict.py =====

"""
Tests for TypedDict
"""

import sys
import typing
from typing import Any, Dict, Generic, List, Optional, TypeVar

import pytest
import typing_extensions
from annotated_types import Lt
from pydantic_core import core_schema
from typing_extensions import Annotated, TypedDict

from pydantic import (
    BaseModel,
    ConfigDict,
    Field,
    GetCoreSchemaHandler,
    PositiveInt,
    PydanticUserError,
    ValidationError,
    with_config,
)
from pydantic._internal._decorators import get_attribute_from_bases
from pydantic.functional_serializers import field_serializer, model_serializer
from pydantic.functional_validators import field_validator, model_validator
from pydantic.type_adapter import TypeAdapter

from .conftest import Err


@pytest.fixture(
    name='TypedDictAll',
    params=[
        pytest.param(typing, id='typing.TypedDict'),
        pytest.param(typing_extensions, id='t_e.TypedDict'),
    ],
)
def fixture_typed_dict_all(request):
    try:
        return request.param.TypedDict
    except AttributeError:
        pytest.skip(f'TypedDict is not available from {request.param}')


@pytest.fixture(name='TypedDict')
def fixture_typed_dict(TypedDictAll):
    class TestTypedDict(TypedDictAll):
        foo: str

    if sys.version_info < (3, 12) and TypedDictAll.__module__ == 'typing':
        pytest.skip('typing.TypedDict does not support all pydantic features in Python < 3.12')

    if hasattr(TestTypedDict, '__required_keys__'):
        return TypedDictAll
    else:
        pytest.skip('TypedDict does not include __required_keys__')


@pytest.fixture(
    name='req_no_req',
    params=[
        pytest.param(typing, id='typing.Required'),
        pytest.param(typing_extensions, id='t_e.Required'),
    ],
)
def fixture_req_no_req(request):
    try:
        return request.param.Required, request.param.NotRequired
    except AttributeError:
        pytest.skip(f'Required and NotRequired are not available from {request.param}')


def test_typeddict_all(TypedDictAll):
    class MyDict(TypedDictAll):
        foo: str

    try:

        class M(BaseModel):
            d: MyDict

    except PydanticUserError as e:
        assert e.message == 'Please use `typing_extensions.TypedDict` instead of `typing.TypedDict` on Python < 3.12.'
    else:
        assert M(d=dict(foo='baz')).d == {'foo': 'baz'}


def test_typeddict_annotated_simple(TypedDict, req_no_req):
    Required, NotRequired = req_no_req

    class MyDict(TypedDict):
        foo: str
        bar: Annotated[int, Lt(10)]
        spam: NotRequired[float]

    class M(BaseModel):
        d: MyDict

    assert M(d=dict(foo='baz', bar='8')).d == {'foo': 'baz', 'bar': 8}
    assert M(d=dict(foo='baz', bar='8', spam='44.4')).d == {'foo': 'baz', 'bar': 8, 'spam': 44.4}

    with pytest.raises(ValidationError, match=r'd\.bar\s+Field required \[type=missing,'):
        M(d=dict(foo='baz'))

    with pytest.raises(ValidationError, match=r'd\.bar\s+Input should be less than 10 \[type=less_than,'):
        M(d=dict(foo='baz', bar='11'))


def test_typeddict_total_false(TypedDict, req_no_req):
    Required, NotRequired = req_no_req

    class MyDict(TypedDict, total=False):
        foo: Required[str]
        bar: int

    class M(BaseModel):
        d: MyDict

    assert M(d=dict(foo='baz', bar='8')).d == {'foo': 'baz', 'bar': 8}
    assert M(d=dict(foo='baz')).d == {'foo': 'baz'}

    with pytest.raises(ValidationError, match=r'd\.foo\s+Field required \[type=missing,'):
        M(d={})


def test_typeddict(TypedDict):
    class TD(TypedDict):
        a: int
        b: int
        c: int
        d: str

    class Model(BaseModel):
        td: TD

    m = Model(td={'a': '3', 'b': b'1', 'c': 4, 'd': 'qwe'})
    assert m.td == {'a': 3, 'b': 1, 'c': 4, 'd': 'qwe'}

    with pytest.raises(ValidationError) as exc_info:
        Model(td={'a': [1], 'b': 2, 'c': 3, 'd': 'qwe'})
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'int_type', 'loc': ('td', 'a'), 'msg': 'Input should be a valid integer', 'input': [1]}
    ]


def test_typeddict_non_total(TypedDict):
    class FullMovie(TypedDict, total=True):
        name: str
        year: int

    class Model(BaseModel):
        movie: FullMovie

    with pytest.raises(ValidationError) as exc_info:
        Model(movie={'year': '2002'})
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'missing', 'loc': ('movie', 'name'), 'msg': 'Field required', 'input': {'year': '2002'}}
    ]

    class PartialMovie(TypedDict, total=False):
        name: str
        year: int

    class Model(BaseModel):
        movie: PartialMovie

    m = Model(movie={'year': '2002'})
    assert m.movie == {'year': 2002}


def test_partial_new_typeddict(TypedDict):
    class OptionalUser(TypedDict, total=False):
        name: str

    class User(OptionalUser):
        id: int

    class Model(BaseModel):
        user: User

    assert Model(user={'id': 1, 'name': 'foobar'}).user == {'id': 1, 'name': 'foobar'}
    assert Model(user={'id': 1}).user == {'id': 1}


def test_typeddict_extra_default(TypedDict):
    class User(TypedDict):
        name: str
        age: int

    ta = TypeAdapter(User)

    assert ta.validate_python({'name': 'pika', 'age': 7, 'rank': 1}) == {'name': 'pika', 'age': 7}

    class UserExtraAllow(User):
        __pydantic_config__ = ConfigDict(extra='allow')

    ta = TypeAdapter(UserExtraAllow)

    assert ta.validate_python({'name': 'pika', 'age': 7, 'rank': 1}) == {'name': 'pika', 'age': 7, 'rank': 1}

    class UserExtraForbid(User):
        __pydantic_config__ = ConfigDict(extra='forbid')

    ta = TypeAdapter(UserExtraForbid)

    with pytest.raises(ValidationError) as exc_info:
        ta.validate_python({'name': 'pika', 'age': 7, 'rank': 1})
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'extra_forbidden', 'loc': ('rank',), 'msg': 'Extra inputs are not permitted', 'input': 1}
    ]


def test_typeddict_schema(TypedDict):
    class Data(BaseModel):
        a: int

    class DataTD(TypedDict):
        a: int

    class CustomTD(TypedDict):
        b: int

        @classmethod
        def __get_pydantic_core_schema__(
            cls, source_type: Any, handler: GetCoreSchemaHandler
        ) -> core_schema.CoreSchema:
            schema = handler(source_type)
            schema = handler.resolve_ref_schema(schema)
            assert schema['type'] == 'typed-dict'
            b = schema['fields']['b']['schema']
            assert b['type'] == 'int'
            b['gt'] = 0  # type: ignore
            return schema

    class Model(BaseModel):
        data: Data
        data_td: DataTD
        custom_td: CustomTD

    # insert_assert(Model.model_json_schema(mode='validation'))
    assert Model.model_json_schema(mode='validation') == {
        'type': 'object',
        'properties': {
            'data': {'$ref': '#/$defs/Data'},
            'data_td': {'$ref': '#/$defs/DataTD'},
            'custom_td': {'$ref': '#/$defs/CustomTD'},
        },
        'required': ['data', 'data_td', 'custom_td'],
        'title': 'Model',
        '$defs': {
            'DataTD': {
                'type': 'object',
                'properties': {'a': {'type': 'integer', 'title': 'A'}},
                'required': ['a'],
                'title': 'DataTD',
            },
            'CustomTD': {
                'type': 'object',
                'properties': {'b': {'type': 'integer', 'exclusiveMinimum': 0, 'title': 'B'}},
                'required': ['b'],
                'title': 'CustomTD',
            },
            'Data': {
                'type': 'object',
                'properties': {'a': {'type': 'integer', 'title': 'A'}},
                'required': ['a'],
                'title': 'Data',
            },
        },
    }
    # insert_assert(Model.model_json_schema(mode='serialization'))
    assert Model.model_json_schema(mode='serialization') == {
        'type': 'object',
        'properties': {
            'data': {'$ref': '#/$defs/Data'},
            'data_td': {'$ref': '#/$defs/DataTD'},
            'custom_td': {'$ref': '#/$defs/CustomTD'},
        },
        'required': ['data', 'data_td', 'custom_td'],
        'title': 'Model',
        '$defs': {
            'DataTD': {
                'type': 'object',
                'properties': {'a': {'type': 'integer', 'title': 'A'}},
                'required': ['a'],
                'title': 'DataTD',
            },
            'CustomTD': {
                'type': 'object',
                'properties': {'b': {'type': 'integer', 'exclusiveMinimum': 0, 'title': 'B'}},
                'required': ['b'],
                'title': 'CustomTD',
            },
            'Data': {
                'type': 'object',
                'properties': {'a': {'type': 'integer', 'title': 'A'}},
                'required': ['a'],
                'title': 'Data',
            },
        },
    }


def test_typeddict_postponed_annotation(TypedDict):
    class DataTD(TypedDict):
        v: 'PositiveInt'

    class Model(BaseModel):
        t: DataTD

    with pytest.raises(ValidationError):
        Model.model_validate({'t': {'v': -1}})


def test_typeddict_required(TypedDict, req_no_req):
    Required, _ = req_no_req

    class DataTD(TypedDict, total=False):
        a: int
        b: Required[str]

    class Model(BaseModel):
        t: DataTD

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {'t': {'$ref': '#/$defs/DataTD'}},
        'required': ['t'],
        '$defs': {
            'DataTD': {
                'title': 'DataTD',
                'type': 'object',
                'properties': {
                    'a': {'title': 'A', 'type': 'integer'},
                    'b': {'title': 'B', 'type': 'string'},
                },
                'required': ['b'],
            }
        },
    }


def test_typeddict_from_attributes():
    class UserCls:
        def __init__(self, name: str, age: int):
            self.name = name
            self.age = age

    class User(TypedDict):
        name: str
        age: int

    class FromAttributesCls:
        def __init__(self, u: User):
            self.u = u

    class Model(BaseModel):
        u: Annotated[User, Field(strict=False)]

    class FromAttributesModel(BaseModel, from_attributes=True):
        u: Annotated[User, Field(strict=False)]

    # You can validate the TypedDict from attributes from a type that has a field with an appropriate attribute
    assert FromAttributesModel.model_validate(FromAttributesCls(u={'name': 'foo', 'age': 15}))

    # The normal case: you can't populate a TypedDict from attributes with the relevant config setting disabled
    with pytest.raises(ValidationError, match='Input should be a valid dictionary'):
        Model(u=UserCls('foo', 15))

    # Going further: even with from_attributes allowed, it won't attempt to populate a TypedDict from attributes
    with pytest.raises(ValidationError, match='Input should be a valid dictionary'):
        FromAttributesModel(u=UserCls('foo', 15))


def test_typeddict_not_required_schema(TypedDict, req_no_req):
    Required, NotRequired = req_no_req

    class DataTD(TypedDict, total=True):
        a: NotRequired[int]
        b: str

    class Model(BaseModel):
        t: DataTD

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {'t': {'$ref': '#/$defs/DataTD'}},
        'required': ['t'],
        '$defs': {
            'DataTD': {
                'title': 'DataTD',
                'type': 'object',
                'properties': {
                    'a': {'title': 'A', 'type': 'integer'},
                    'b': {'title': 'B', 'type': 'string'},
                },
                'required': ['b'],
            }
        },
    }


def test_typed_dict_inheritance_schema(TypedDict, req_no_req):
    Required, NotRequired = req_no_req

    class DataTDBase(TypedDict, total=True):
        a: NotRequired[int]
        b: str

    class DataTD(DataTDBase, total=False):
        c: Required[int]
        d: str

    class Model(BaseModel):
        t: DataTD

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {'t': {'$ref': '#/$defs/DataTD'}},
        'required': ['t'],
        '$defs': {
            'DataTD': {
                'title': 'DataTD',
                'type': 'object',
                'properties': {
                    'a': {'title': 'A', 'type': 'integer'},
                    'b': {'title': 'B', 'type': 'string'},
                    'c': {'title': 'C', 'type': 'integer'},
                    'd': {'title': 'D', 'type': 'string'},
                },
                'required': ['b', 'c'],
            }
        },
    }


def test_typeddict_annotated_nonoptional_schema(TypedDict):
    class DataTD(TypedDict):
        a: Optional[int]
        b: Annotated[Optional[int], Field(42)]
        c: Annotated[Optional[int], Field(description='Test')]

    class Model(BaseModel):
        data_td: DataTD

    assert Model.model_json_schema() == {
        'title': 'Model',
        'type': 'object',
        'properties': {'data_td': {'$ref': '#/$defs/DataTD'}},
        'required': ['data_td'],
        '$defs': {
            'DataTD': {
                'type': 'object',
                'title': 'DataTD',
                'properties': {
                    'a': {'anyOf': [{'type': 'integer'}, {'type': 'null'}], 'title': 'A'},
                    'b': {'anyOf': [{'type': 'integer'}, {'type': 'null'}], 'default': 42, 'title': 'B'},
                    'c': {'anyOf': [{'type': 'integer'}, {'type': 'null'}], 'description': 'Test', 'title': 'C'},
                },
                'required': ['a', 'c'],
            },
        },
    }


@pytest.mark.parametrize(
    'input_value,expected',
    [
        ({'a': '1', 'b': 2, 'c': 3}, {'a': 1, 'b': 2, 'c': 3}),
        ({'a': None, 'b': 2, 'c': 3}, {'a': None, 'b': 2, 'c': 3}),
        ({'a': None, 'c': 3}, {'a': None, 'b': 42, 'c': 3}),
        # ({}, None),
        # ({'data_td': []}, None),
        # ({'data_td': {'a': 1, 'b': 2, 'd': 4}}, None),
    ],
    ids=repr,
)
def test_typeddict_annotated(TypedDict, input_value, expected):
    class DataTD(TypedDict):
        a: Optional[int]
        b: Annotated[Optional[int], Field(42)]
        c: Annotated[Optional[int], Field(description='Test', lt=4)]

    class Model(BaseModel):
        d: DataTD

    if isinstance(expected, Err):
        with pytest.raises(ValidationError, match=expected.message_escaped()):
            Model(d=input_value)
    else:
        assert Model(d=input_value).d == expected


def test_recursive_typeddict():
    from typing import Optional

    from typing_extensions import TypedDict

    from pydantic import BaseModel

    class RecursiveTypedDict(TypedDict):
        foo: Optional['RecursiveTypedDict']

    class RecursiveTypedDictModel(BaseModel):
        rec: RecursiveTypedDict

    assert RecursiveTypedDictModel(rec={'foo': {'foo': None}}).rec == {'foo': {'foo': None}}

    with pytest.raises(ValidationError) as exc_info:
        RecursiveTypedDictModel(rec={'foo': {'foo': {'foo': 1}}})
    assert exc_info.value.errors(include_url=False) == [
        {
            'input': 1,
            'loc': ('rec', 'foo', 'foo', 'foo'),
            'msg': 'Input should be a valid dictionary',
            'type': 'dict_type',
        }
    ]


T = TypeVar('T')


def test_generic_typeddict_in_concrete_model():
    T = TypeVar('T')

    class GenericTypedDict(typing_extensions.TypedDict, Generic[T]):
        x: T

    class Model(BaseModel):
        y: GenericTypedDict[int]

    Model(y={'x': 1})
    with pytest.raises(ValidationError) as exc_info:
        Model(y={'x': 'a'})
    assert exc_info.value.errors(include_url=False) == [
        {
            'input': 'a',
            'loc': ('y', 'x'),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'type': 'int_parsing',
        }
    ]


def test_generic_typeddict_in_generic_model():
    T = TypeVar('T')

    class GenericTypedDict(typing_extensions.TypedDict, Generic[T]):
        x: T

    class Model(BaseModel, Generic[T]):
        y: GenericTypedDict[T]

    Model[int](y={'x': 1})
    with pytest.raises(ValidationError) as exc_info:
        Model[int](y={'x': 'a'})
    assert exc_info.value.errors(include_url=False) == [
        {
            'input': 'a',
            'loc': ('y', 'x'),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'type': 'int_parsing',
        }
    ]


def test_recursive_generic_typeddict_in_module(create_module):
    @create_module
    def module():
        from typing import Generic, List, Optional, TypeVar

        from typing_extensions import TypedDict

        from pydantic import BaseModel

        T = TypeVar('T')

        class RecursiveGenTypedDictModel(BaseModel, Generic[T]):
            rec: 'RecursiveGenTypedDict[T]'

        class RecursiveGenTypedDict(TypedDict, Generic[T]):
            foo: Optional['RecursiveGenTypedDict[T]']
            ls: List[T]

    int_data: module.RecursiveGenTypedDict[int] = {'foo': {'foo': None, 'ls': [1]}, 'ls': [1]}
    assert module.RecursiveGenTypedDictModel[int](rec=int_data).rec == int_data
    str_data:
module.RecursiveGenTypedDict[str] = {'foo': {'foo': None, 'ls': ['a']}, 'ls': ['a']} with pytest.raises(ValidationError) as exc_info: module.RecursiveGenTypedDictModel[int](rec=str_data) assert exc_info.value.errors(include_url=False) == [ { 'input': 'a', 'loc': ('rec', 'foo', 'ls', 0), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', }, { 'input': 'a', 'loc': ('rec', 'ls', 0), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', }, ] def test_recursive_generic_typeddict_in_function_1(): T = TypeVar('T') # First ordering: typed dict first class RecursiveGenTypedDict(TypedDict, Generic[T]): foo: Optional['RecursiveGenTypedDict[T]'] ls: List[T] class RecursiveGenTypedDictModel(BaseModel, Generic[T]): rec: 'RecursiveGenTypedDict[T]' # Note: no model_rebuild() necessary here # RecursiveGenTypedDictModel.model_rebuild() int_data: RecursiveGenTypedDict[int] = {'foo': {'foo': None, 'ls': [1]}, 'ls': [1]} assert RecursiveGenTypedDictModel[int](rec=int_data).rec == int_data str_data: RecursiveGenTypedDict[str] = {'foo': {'foo': None, 'ls': ['a']}, 'ls': ['a']} with pytest.raises(ValidationError) as exc_info: RecursiveGenTypedDictModel[int](rec=str_data) assert exc_info.value.errors(include_url=False) == [ { 'input': 'a', 'loc': ('rec', 'foo', 'ls', 0), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', }, { 'input': 'a', 'loc': ('rec', 'ls', 0), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', }, ] def test_recursive_generic_typeddict_in_function_2(): T = TypeVar('T') # Second ordering: model first class RecursiveGenTypedDictModel(BaseModel, Generic[T]): rec: 'RecursiveGenTypedDict[T]' class RecursiveGenTypedDict(TypedDict, Generic[T]): foo: Optional['RecursiveGenTypedDict[T]'] ls: List[T] int_data: RecursiveGenTypedDict[int] = {'foo': {'foo': None, 'ls': 
[1]}, 'ls': [1]} assert RecursiveGenTypedDictModel[int](rec=int_data).rec == int_data str_data: RecursiveGenTypedDict[str] = {'foo': {'foo': None, 'ls': ['a']}, 'ls': ['a']} with pytest.raises(ValidationError) as exc_info: RecursiveGenTypedDictModel[int](rec=str_data) assert exc_info.value.errors(include_url=False) == [ { 'input': 'a', 'loc': ('rec', 'foo', 'ls', 0), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', }, { 'input': 'a', 'loc': ('rec', 'ls', 0), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', }, ] def test_recursive_generic_typeddict_in_function_3(): T = TypeVar('T') class RecursiveGenTypedDictModel(BaseModel, Generic[T]): rec: 'RecursiveGenTypedDict[T]' IntModel = RecursiveGenTypedDictModel[int] class RecursiveGenTypedDict(TypedDict, Generic[T]): foo: Optional['RecursiveGenTypedDict[T]'] ls: List[T] int_data: RecursiveGenTypedDict[int] = {'foo': {'foo': None, 'ls': [1]}, 'ls': [1]} assert IntModel(rec=int_data).rec == int_data str_data: RecursiveGenTypedDict[str] = {'foo': {'foo': None, 'ls': ['a']}, 'ls': ['a']} with pytest.raises(ValidationError) as exc_info: IntModel(rec=str_data) assert exc_info.value.errors(include_url=False) == [ { 'input': 'a', 'loc': ('rec', 'foo', 'ls', 0), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', }, { 'input': 'a', 'loc': ('rec', 'ls', 0), 'msg': 'Input should be a valid integer, unable to parse string as an ' 'integer', 'type': 'int_parsing', }, ] def test_typeddict_alias_generator(TypedDict): def alias_generator(name: str) -> str: return 'alias_' + name class MyDict(TypedDict): __pydantic_config__ = ConfigDict(alias_generator=alias_generator, extra='forbid') foo: str class Model(BaseModel): d: MyDict ta = TypeAdapter(MyDict) model = ta.validate_python({'alias_foo': 'bar'}) assert model['foo'] == 'bar' with pytest.raises(ValidationError) as 
exc_info: ta.validate_python({'foo': 'bar'}) assert exc_info.value.errors(include_url=False) == [ {'type': 'missing', 'loc': ('alias_foo',), 'msg': 'Field required', 'input': {'foo': 'bar'}}, {'input': 'bar', 'loc': ('foo',), 'msg': 'Extra inputs are not permitted', 'type': 'extra_forbidden'}, ] def test_typeddict_inheritance(TypedDict: Any) -> None: class Parent(TypedDict): x: int class Child(Parent): y: float ta = TypeAdapter(Child) assert ta.validate_python({'x': '1', 'y': '1.0'}) == {'x': 1, 'y': 1.0} def test_typeddict_field_validator(TypedDict: Any) -> None: class Parent(TypedDict): a: List[str] @field_validator('a') @classmethod def parent_val_before(cls, v: List[str]): v.append('parent before') return v @field_validator('a') @classmethod def val(cls, v: List[str]): v.append('parent') return v @field_validator('a') @classmethod def parent_val_after(cls, v: List[str]): v.append('parent after') return v class Child(Parent): @field_validator('a') @classmethod def child_val_before(cls, v: List[str]): v.append('child before') return v @field_validator('a') @classmethod def val(cls, v: List[str]): v.append('child') return v @field_validator('a') @classmethod def child_val_after(cls, v: List[str]): v.append('child after') return v parent_ta = TypeAdapter(Parent) child_ta = TypeAdapter(Child) assert parent_ta.validate_python({'a': []})['a'] == ['parent before', 'parent', 'parent after'] assert child_ta.validate_python({'a': []})['a'] == [ 'parent before', 'child', 'parent after', 'child before', 'child after', ] def test_typeddict_model_validator(TypedDict) -> None: class Model(TypedDict): x: int y: float @model_validator(mode='before') @classmethod def val_model_before(cls, value: Dict[str, Any]) -> Dict[str, Any]: return dict(x=value['x'] + 1, y=value['y'] + 2) @model_validator(mode='after') def val_model_after(self) -> 'Model': return Model(x=self['x'] * 2, y=self['y'] * 3) ta = TypeAdapter(Model) assert ta.validate_python({'x': 1, 'y': 2.5}) == {'x': 4, 'y': 
13.5} def test_typeddict_field_serializer(TypedDict: Any) -> None: class Parent(TypedDict): a: List[str] @field_serializer('a') @classmethod def ser(cls, v: List[str]): v.append('parent') return v class Child(Parent): @field_serializer('a') @classmethod def ser(cls, v: List[str]): v.append('child') return v parent_ta = TypeAdapter(Parent) child_ta = TypeAdapter(Child) assert parent_ta.dump_python(Parent({'a': []}))['a'] == ['parent'] assert child_ta.dump_python(Child({'a': []}))['a'] == ['child'] def test_typeddict_model_serializer(TypedDict) -> None: class Model(TypedDict): x: int y: float @model_serializer(mode='plain') def ser_model(self) -> Dict[str, Any]: return {'x': self['x'] * 2, 'y': self['y'] * 3} ta = TypeAdapter(Model) assert ta.dump_python(Model({'x': 1, 'y': 2.5})) == {'x': 2, 'y': 7.5} def test_model_config() -> None: class Model(TypedDict): x: str __pydantic_config__ = ConfigDict(str_to_lower=True) # type: ignore ta = TypeAdapter(Model) assert ta.validate_python({'x': 'ABC'}) == {'x': 'abc'} def test_model_config_inherited() -> None: class Base(TypedDict): __pydantic_config__ = ConfigDict(str_to_lower=True) # type: ignore class Model(Base): x: str ta = TypeAdapter(Model) assert ta.validate_python({'x': 'ABC'}) == {'x': 'abc'} def test_grandparent_config(): class MyTypedDict(TypedDict): __pydantic_config__ = ConfigDict(str_to_lower=True) x: str class MyMiddleTypedDict(MyTypedDict): y: str class MySubTypedDict(MyMiddleTypedDict): z: str validated_data = TypeAdapter(MySubTypedDict).validate_python({'x': 'ABC', 'y': 'DEF', 'z': 'GHI'}) assert validated_data == {'x': 'abc', 'y': 'def', 'z': 'ghi'} def test_typeddict_mro(): class A(TypedDict): x = 1 class B(A): x = 2 class C(B): pass assert get_attribute_from_bases(C, 'x') == 2 def test_typeddict_with_config_decorator(): @with_config(ConfigDict(str_to_lower=True)) class Model(TypedDict): x: str ta = TypeAdapter(Model) assert ta.validate_python({'x': 'ABC'}) == {'x': 'abc'} def 
test_config_pushdown_typed_dict() -> None:
    class ArbitraryType:
        pass

    class TD(TypedDict):
        a: ArbitraryType

    class Model(BaseModel):
        model_config = ConfigDict(arbitrary_types_allowed=True)

        td: TD


# --- pydantic-2.10.6/tests/test_types_zoneinfo.py ---
from datetime import timezone
from typing import Union

import pytest

from pydantic import BaseModel, ConfigDict, TypeAdapter, ValidationError

zoneinfo = pytest.importorskip('zoneinfo', reason='zoneinfo requires >=3.9')


class ZoneInfoModel(BaseModel):
    tz: zoneinfo.ZoneInfo


@pytest.mark.parametrize(
    'tz',
    [
        pytest.param(zoneinfo.ZoneInfo('America/Los_Angeles'), id='ZoneInfoObject'),
        pytest.param('America/Los_Angeles', id='IanaTimezoneStr'),
    ],
)
def test_zoneinfo_valid_inputs(tz):
    model = ZoneInfoModel(tz=tz)
    assert model.tz == zoneinfo.ZoneInfo('America/Los_Angeles')


def test_zoneinfo_serialization():
    model = ZoneInfoModel(tz=zoneinfo.ZoneInfo('America/Los_Angeles'))
    assert model.model_dump_json() == '{"tz":"America/Los_Angeles"}'


def test_zoneinfo_parsing_fails_for_invalid_iana_tz_strs():
    with pytest.raises(ValidationError) as ex_info:
        ZoneInfoModel(tz='Zone/That_Does_Not_Exist')
    assert ex_info.value.errors() == [
        {
            'type': 'zoneinfo_str',
            'loc': ('tz',),
            'msg': 'invalid timezone: Zone/That_Does_Not_Exist',
            'input': 'Zone/That_Does_Not_Exist',
            'ctx': {'value': 'Zone/That_Does_Not_Exist'},
        }
    ]


def test_zoneinfo_json_schema():
    assert ZoneInfoModel.model_json_schema() == {
        'type': 'object',
        'title': 'ZoneInfoModel',
        'properties': {'tz': {'type': 'string', 'format': 'zoneinfo', 'title': 'Tz'}},
        'required': ['tz'],
    }


def test_zoneinfo_union() -> None:
    ta = TypeAdapter(Union[zoneinfo.ZoneInfo, timezone], config=ConfigDict(arbitrary_types_allowed=True))
    assert ta.validate_python(timezone.utc) is timezone.utc


# --- pydantic-2.10.6/tests/test_typing.py ---
import sys
import typing
from collections import namedtuple
from typing import Callable, ClassVar, ForwardRef, NamedTuple

import pytest
from typing_extensions import Annotated, Literal, get_origin

from pydantic import BaseModel, Field  # noqa: F401
from pydantic._internal._typing_extra import (
    NoneType,
    eval_type,
    get_function_type_hints,
    is_classvar_annotation,
    is_literal,
    is_namedtuple,
    is_none_type,
    origin_is_union,
    parent_frame_namespace,
)

try:
    from typing import TypedDict as typing_TypedDict
except ImportError:
    typing_TypedDict = None

try:
    from typing_extensions import TypedDict as typing_extensions_TypedDict
except ImportError:
    typing_extensions_TypedDict = None

ALL_TYPEDDICT_KINDS = (typing_TypedDict, typing_extensions_TypedDict)


def test_is_namedtuple():
    class Employee(NamedTuple):
        name: str
        id: int = 3

    assert is_namedtuple(namedtuple('Point', 'x y')) is True
    assert is_namedtuple(Employee) is True
    assert is_namedtuple(NamedTuple('Employee', [('name', str), ('id', int)])) is True

    class Other(tuple):
        name: str
        id: int

    assert is_namedtuple(Other) is False


def test_is_none_type():
    assert is_none_type(Literal[None]) is True
    assert is_none_type(None) is True
    assert is_none_type(type(None)) is True
    assert is_none_type(6) is False
    assert is_none_type({}) is False
    # WARNING: It's important to test `typing.Callable` not
    # `collections.abc.Callable` (even with python >= 3.9) as they behave
    # differently
    assert is_none_type(Callable) is False


@pytest.mark.parametrize(
    'union',
    [
        typing.Union[int, str],
        eval_type('int | str'),
        *([int | str] if sys.version_info >= (3, 10) else []),
    ],
)
def test_is_union(union):
    origin = get_origin(union)
    assert origin_is_union(origin)


def test_is_literal_with_typing_extension_literal():
    from typing_extensions import Literal

    assert is_literal(Literal) is False
    assert is_literal(Literal['foo']) is True


def test_is_literal_with_typing_literal():
    from typing import Literal

    assert is_literal(Literal) is False
    assert is_literal(Literal['foo']) is True


@pytest.mark.parametrize(
    ['ann_type', 'expected'],
    (
        (None, False),
        (ForwardRef('Other[int]'), False),
        (ForwardRef('Other[ClassVar[int]]'), False),
        (ForwardRef('ClassVar[int]'), True),
        (ForwardRef('t.ClassVar[int]'), True),
        (ForwardRef('typing.ClassVar[int]'), True),
        (ForwardRef('Annotated[ClassVar[int], ...]'), True),
        (ForwardRef('Annotated[t.ClassVar[int], ...]'), True),
        (ForwardRef('t.Annotated[t.ClassVar[int], ...]'), True),
        (ClassVar[int], True),
        (Annotated[ClassVar[int], ...], True),
    ),
)
def test_is_classvar_annotation(ann_type, expected):
    assert is_classvar_annotation(ann_type) is expected


def test_parent_frame_namespace(mocker):
    assert parent_frame_namespace() is not None

    from dataclasses import dataclass

    @dataclass
    class MockedFrame:
        f_back = None
        f_locals = {}

    mocker.patch('sys._getframe', return_value=MockedFrame())
    assert parent_frame_namespace() is None


def test_get_function_type_hints_none_type():
    def f(x: int, y: None) -> int:
        return x

    assert get_function_type_hints(f) == {'return': int, 'x': int, 'y': NoneType}


@pytest.mark.skipif(sys.version_info[:2] > (3, 9), reason='testing using a feature not supported by older Python')
def test_eval_type_backport_not_installed():
    sys.modules['eval_type_backport'] = None
    try:
        with pytest.raises(TypeError) as exc_info:

            class _Model(BaseModel):
                foo: 'int | str'

        assert str(exc_info.value) == (
            "Unable to evaluate type annotation 'int | str'. If you are making use "
            'of the new typing syntax (unions using `|` since Python 3.10 or builtins subscripting '
            'since Python 3.9), you should either replace the use of new syntax with the existing '
            '`typing` constructs or install the `eval_type_backport` package.'
        )
    finally:
        del sys.modules['eval_type_backport']


def test_func_ns_excludes_default_globals() -> None:
    foo = 'foo'
    func_ns = parent_frame_namespace(parent_depth=1)
    assert func_ns is not None
    assert func_ns['foo'] == foo
    # there are more default global variables, but these are examples of well known ones
    for default_global_var in ['__name__', '__doc__', '__package__', '__builtins__']:
        assert default_global_var not in func_ns


module_foo = 'global_foo'
module_ns = parent_frame_namespace(parent_depth=1)


def test_module_ns_is_none() -> None:
    """Module namespace should be none because we skip fetching data from the top module level."""
    assert module_ns is None


def test_exotic_localns() -> None:
    __foo_annotation__ = str

    class Model(BaseModel):
        foo: __foo_annotation__

    assert Model.model_fields['foo'].annotation == str


# --- pydantic-2.10.6/tests/test_utils.py ---
import collections.abc
import json
import os
import pickle
import sys
import time
from copy import copy, deepcopy
from typing import Callable, Dict, Generic, List, NewType, Tuple, TypeVar, Union

import pytest
from dirty_equals import IsList
from pydantic_core import PydanticCustomError, PydanticUndefined, core_schema
from typing_extensions import Annotated, Literal

from pydantic import BaseModel
from pydantic._internal import _repr
from pydantic._internal._core_utils import _WalkCoreSchema, pretty_print_core_schema
from pydantic._internal._typing_extra import get_origin, is_new_type, literal_values
from pydantic._internal._utils import (
    BUILTIN_COLLECTIONS,
    LazyClassAttribute,
    ValueItems,
    all_identical,
    deep_update,
    lenient_issubclass,
    smart_deepcopy,
    unique_list,
)
from pydantic._internal._validators import import_string
from pydantic.alias_generators import to_camel, to_pascal, to_snake
from pydantic.color import Color

try:
    import devtools
except ImportError:
    devtools = None


def test_import_module():
    assert import_string('os.path') == os.path
def test_import_module_invalid(): with pytest.raises(PydanticCustomError, match="Invalid python path: No module named 'xx'"): import_string('xx') def test_import_no_attr(): with pytest.raises(PydanticCustomError, match="cannot import name 'foobar' from 'os'"): import_string('os:foobar') def foobar(a, b, c=4): pass T = TypeVar('T') class LoggedVar(Generic[T]): def get(self) -> T: ... @pytest.mark.parametrize( 'value,expected', [ (str, 'str'), ('foobar', 'str'), ('SomeForwardRefString', 'str'), # included to document current behavior; could be changed (List['SomeForwardRef'], "List[ForwardRef('SomeForwardRef')]"), # noqa: F821 (Union[str, int], 'Union[str, int]'), (list, 'list'), (List, 'List'), ([1, 2, 3], 'list'), (List[Dict[str, int]], 'List[Dict[str, int]]'), (Tuple[str, int, float], 'Tuple[str, int, float]'), (Tuple[str, ...], 'Tuple[str, ...]'), (Union[int, List[str], Tuple[str, int]], 'Union[int, List[str], Tuple[str, int]]'), (foobar, 'foobar'), (time.time_ns, 'time_ns'), (LoggedVar, 'LoggedVar'), (LoggedVar(), 'LoggedVar'), ], ) def test_display_as_type(value, expected): assert _repr.display_as_type(value) == expected @pytest.mark.skipif(sys.version_info < (3, 10), reason='requires python 3.10 or higher') @pytest.mark.parametrize( 'value_gen,expected', [ (lambda: str, 'str'), (lambda: 'SomeForwardRefString', 'str'), # included to document current behavior; could be changed (lambda: List['SomeForwardRef'], "List[ForwardRef('SomeForwardRef')]"), # noqa: F821 (lambda: str | int, 'Union[str, int]'), (lambda: list, 'list'), (lambda: List, 'List'), (lambda: list[int], 'list[int]'), (lambda: List[int], 'List[int]'), (lambda: list[dict[str, int]], 'list[dict[str, int]]'), (lambda: list[Union[str, int]], 'list[Union[str, int]]'), (lambda: list[str | int], 'list[Union[str, int]]'), (lambda: LoggedVar[int], 'LoggedVar[int]'), (lambda: LoggedVar[Dict[int, str]], 'LoggedVar[Dict[int, str]]'), ], ) def test_display_as_type_310(value_gen, expected): value = value_gen() 
assert _repr.display_as_type(value) == expected def test_lenient_issubclass(): class A(str): pass assert lenient_issubclass(A, str) is True @pytest.mark.skipif(sys.version_info < (3, 9), reason='generic aliases are not available in python < 3.9') def test_lenient_issubclass_with_generic_aliases(): from collections.abc import Mapping # should not raise an error here: assert lenient_issubclass(list[str], Mapping) is False def test_lenient_issubclass_is_lenient(): assert lenient_issubclass('a', 'a') is False @pytest.mark.parametrize( 'input_value,output', [ ([], []), ([1, 1, 1, 2, 1, 2, 3, 2, 3, 1, 4, 2, 3, 1], [1, 2, 3, 4]), (['a', 'a', 'b', 'a', 'b', 'c', 'b', 'c', 'a'], ['a', 'b', 'c']), ], ) def test_unique_list(input_value, output): assert unique_list(input_value) == output assert unique_list(unique_list(input_value)) == unique_list(input_value) def test_value_items(): v = ['a', 'b', 'c'] vi = ValueItems(v, {0, -1}) assert vi.is_excluded(2) assert [v_ for i, v_ in enumerate(v) if not vi.is_excluded(i)] == ['b'] assert vi.is_included(2) assert [v_ for i, v_ in enumerate(v) if vi.is_included(i)] == ['a', 'c'] v2 = {'a': v, 'b': {'a': 1, 'b': (1, 2)}, 'c': 1} vi = ValueItems(v2, {'a': {0, -1}, 'b': {'a': ..., 'b': -1}}) assert not vi.is_excluded('a') assert vi.is_included('a') assert not vi.is_excluded('c') assert not vi.is_included('c') assert str(vi) == "{'a': {0, -1}, 'b': {'a': Ellipsis, 'b': -1}}" assert repr(vi) == "ValueItems({'a': {0, -1}, 'b': {'a': Ellipsis, 'b': -1}})" excluded = {k_: v_ for k_, v_ in v2.items() if not vi.is_excluded(k_)} assert excluded == {'a': v, 'b': {'a': 1, 'b': (1, 2)}, 'c': 1} included = {k_: v_ for k_, v_ in v2.items() if vi.is_included(k_)} assert included == {'a': v, 'b': {'a': 1, 'b': (1, 2)}} sub_v = included['a'] sub_vi = ValueItems(sub_v, vi.for_element('a')) assert repr(sub_vi) == 'ValueItems({0: Ellipsis, 2: Ellipsis})' assert sub_vi.is_excluded(2) assert [v_ for i, v_ in enumerate(sub_v) if not sub_vi.is_excluded(i)] == 
['b'] assert sub_vi.is_included(2) assert [v_ for i, v_ in enumerate(sub_v) if sub_vi.is_included(i)] == ['a', 'c'] vi = ValueItems([], {'__all__': {}}) assert vi._items == {} with pytest.raises(TypeError, match='Unexpected type of exclude value for index "a" '): ValueItems(['a', 'b'], {'a': None}) m = ( 'Excluding fields from a sequence of sub-models or dicts must be performed index-wise: ' 'expected integer keys or keyword "__all__"' ) with pytest.raises(TypeError, match=m): ValueItems(['a', 'b'], {'a': {}}) vi = ValueItems([1, 2, 3, 4], {'__all__': True}) assert repr(vi) == 'ValueItems({0: Ellipsis, 1: Ellipsis, 2: Ellipsis, 3: Ellipsis})' vi = ValueItems([1, 2], {'__all__': {1, 2}}) assert repr(vi) == 'ValueItems({0: {1: Ellipsis, 2: Ellipsis}, 1: {1: Ellipsis, 2: Ellipsis}})' @pytest.mark.parametrize( 'base,override,intersect,expected', [ # Check in default (union) mode (..., ..., False, ...), (None, None, False, None), ({}, {}, False, {}), (..., None, False, ...), (None, ..., False, ...), (None, {}, False, {}), ({}, None, False, {}), (..., {}, False, {}), ({}, ..., False, ...), ({'a': None}, {'a': None}, False, {}), ({'a'}, ..., False, ...), ({'a'}, {}, False, {'a': ...}), ({'a'}, {'b'}, False, {'a': ..., 'b': ...}), ({'a': ...}, {'b': {'c'}}, False, {'a': ..., 'b': {'c': ...}}), ({'a': ...}, {'a': {'c'}}, False, {'a': {'c': ...}}), ({'a': {'c': ...}, 'b': {'d'}}, {'a': ...}, False, {'a': ..., 'b': {'d': ...}}), # Check in intersection mode (..., ..., True, ...), (None, None, True, None), ({}, {}, True, {}), (..., None, True, ...), (None, ..., True, ...), (None, {}, True, {}), ({}, None, True, {}), (..., {}, True, {}), ({}, ..., True, {}), ({'a': None}, {'a': None}, True, {}), ({'a'}, ..., True, {'a': ...}), ({'a'}, {}, True, {}), ({'a'}, {'b'}, True, {}), ({'a': ...}, {'b': {'c'}}, True, {}), ({'a': ...}, {'a': {'c'}}, True, {'a': {'c': ...}}), ({'a': {'c': ...}, 'b': {'d'}}, {'a': ...}, True, {'a': {'c': ...}}), # Check usage of `True` instead of `...` 
(..., True, False, True), (True, ..., False, ...), (True, None, False, True), ({'a': {'c': True}, 'b': {'d'}}, {'a': True}, False, {'a': True, 'b': {'d': ...}}), ], ) def test_value_items_merge(base, override, intersect, expected): actual = ValueItems.merge(base, override, intersect=intersect) assert actual == expected def test_value_items_error(): with pytest.raises(TypeError) as e: ValueItems(1, (1, 2, 3)) assert str(e.value) == "Unexpected type of exclude value " def test_is_new_type(): new_type = NewType('new_type', str) new_new_type = NewType('new_new_type', new_type) assert is_new_type(new_type) assert is_new_type(new_new_type) assert not is_new_type(str) def test_pretty(): class MyTestModel(BaseModel): a: int = 1 b: List[int] = [1, 2, 3] m = MyTestModel() assert m.__repr_name__() == 'MyTestModel' assert str(m) == 'a=1 b=[1, 2, 3]' assert repr(m) == 'MyTestModel(a=1, b=[1, 2, 3])' assert list(m.__pretty__(lambda x: f'fmt: {x!r}')) == [ 'MyTestModel(', 1, 'a=', 'fmt: 1', ',', 0, 'b=', 'fmt: [1, 2, 3]', ',', 0, -1, ')', ] @pytest.mark.filterwarnings('ignore::DeprecationWarning') def test_pretty_color(): c = Color('red') assert str(c) == 'red' assert repr(c) == "Color('red', rgb=(255, 0, 0))" assert list(c.__pretty__(lambda x: f'fmt: {x!r}')) == [ 'Color(', 1, "fmt: 'red'", ',', 0, 'rgb=', 'fmt: (255, 0, 0)', ',', 0, -1, ')', ] @pytest.mark.skipif(not devtools, reason='devtools not installed') def test_devtools_output(): class MyTestModel(BaseModel): a: int = 1 b: List[int] = [1, 2, 3] assert devtools.pformat(MyTestModel()) == 'MyTestModel(\n a=1,\n b=[1, 2, 3],\n)' @pytest.mark.parametrize( 'mapping, updating_mapping, expected_mapping, msg', [ ( {'key': {'inner_key': 0}}, {'other_key': 1}, {'key': {'inner_key': 0}, 'other_key': 1}, 'extra keys are inserted', ), ( {'key': {'inner_key': 0}, 'other_key': 1}, {'key': [1, 2, 3]}, {'key': [1, 2, 3], 'other_key': 1}, 'values that can not be merged are updated', ), ( {'key': {'inner_key': 0}}, {'key': {'other_key': 
1}}, {'key': {'inner_key': 0, 'other_key': 1}}, 'values that have corresponding keys are merged', ), ( {'key': {'inner_key': {'deep_key': 0}}}, {'key': {'inner_key': {'other_deep_key': 1}}}, {'key': {'inner_key': {'deep_key': 0, 'other_deep_key': 1}}}, 'deeply nested values that have corresponding keys are merged', ), ], ) def test_deep_update(mapping, updating_mapping, expected_mapping, msg): assert deep_update(mapping, updating_mapping) == expected_mapping, msg def test_deep_update_is_not_mutating(): mapping = {'key': {'inner_key': {'deep_key': 1}}} updated_mapping = deep_update(mapping, {'key': {'inner_key': {'other_deep_key': 1}}}) assert updated_mapping == {'key': {'inner_key': {'deep_key': 1, 'other_deep_key': 1}}} assert mapping == {'key': {'inner_key': {'deep_key': 1}}} def test_undefined_repr(): assert repr(PydanticUndefined) == 'PydanticUndefined' def test_undefined_copy(): assert copy(PydanticUndefined) is PydanticUndefined assert deepcopy(PydanticUndefined) is PydanticUndefined def test_class_attribute(): class Foo: attr = LazyClassAttribute('attr', lambda: 'foo') assert Foo.attr == 'foo' with pytest.raises(AttributeError, match="'attr' attribute of 'Foo' is class-only"): Foo().attr f = Foo() f.attr = 'not foo' assert f.attr == 'not foo' def test_literal_values(): L1 = Literal['1'] assert literal_values(L1) == ['1'] L2 = Literal['2'] L12 = Literal[L1, L2] assert literal_values(L12) == IsList('1', '2', check_order=False) L312 = Literal['3', Literal[L1, L2]] assert literal_values(L312) == IsList('3', '1', '2', check_order=False) @pytest.mark.parametrize( 'obj', (1, 1.0, '1', b'1', int, None, test_literal_values, len, test_literal_values.__code__, lambda: ..., ...), ) def test_smart_deepcopy_immutable_non_sequence(obj, mocker): # make sure deepcopy is not used # (other option will be to use obj.copy(), but this will produce error as none of given objects have this method) mocker.patch('pydantic._internal._utils.deepcopy', side_effect=RuntimeError) assert 
smart_deepcopy(obj) is deepcopy(obj) is obj @pytest.mark.parametrize('empty_collection', (collection() for collection in BUILTIN_COLLECTIONS)) def test_smart_deepcopy_empty_collection(empty_collection, mocker): mocker.patch('pydantic._internal._utils.deepcopy', side_effect=RuntimeError) # make sure deepcopy is not used if not isinstance(empty_collection, (tuple, frozenset)): # empty tuple or frozenset are always the same object assert smart_deepcopy(empty_collection) is not empty_collection @pytest.mark.parametrize( 'collection', (c.fromkeys((1,)) if issubclass(c, dict) else c((1,)) for c in BUILTIN_COLLECTIONS) ) def test_smart_deepcopy_collection(collection, mocker): expected_value = object() mocker.patch('pydantic._internal._utils.deepcopy', return_value=expected_value) assert smart_deepcopy(collection) is expected_value @pytest.mark.parametrize('error', [TypeError, ValueError, RuntimeError]) def test_smart_deepcopy_error(error, mocker): class RaiseOnBooleanOperation(str): def __bool__(self): raise error('raised error') obj = RaiseOnBooleanOperation() expected_value = deepcopy(obj) assert smart_deepcopy(obj) == expected_value T = TypeVar('T') @pytest.mark.parametrize( 'input_value,output_value', [ (Annotated[int, 10] if Annotated else None, Annotated), (Callable[[], T][int], collections.abc.Callable), (Dict[str, int], dict), (List[str], list), (Union[int, str], Union), (int, None), ], ) def test_get_origin(input_value, output_value): if input_value is None: pytest.skip('Skipping undefined hint for this python version') assert get_origin(input_value) is output_value def test_all_identical(): a, b = object(), object() c = [b] assert all_identical([a, b], [a, b]) is True assert all_identical([a, b], [a, b]) is True assert all_identical([a, b, b], [a, b, b]) is True assert all_identical([a, c, b], [a, c, b]) is True assert all_identical([], [a]) is False, 'Expected iterables with different lengths to evaluate to `False`' assert all_identical([a], []) is False, 
'Expected iterables with different lengths to evaluate to `False`' assert ( all_identical([a, [b], b], [a, [b], b]) is False ), 'New list objects are different objects and should therefore not be identical.' def test_undefined_pickle(): undefined2 = pickle.loads(pickle.dumps(PydanticUndefined)) assert undefined2 is PydanticUndefined def test_on_lower_camel_zero_length(): assert to_camel('') == '' def test_on_lower_camel_one_length(): assert to_camel('a') == 'a' def test_on_lower_camel_many_length(): assert to_camel('i_like_turtles') == 'iLikeTurtles' @pytest.mark.parametrize( 'value,result', [ ('snake_to_camel', 'snakeToCamel'), ('snake_2_camel', 'snake2Camel'), ('snake2camel', 'snake2Camel'), ('_snake_to_camel', '_snakeToCamel'), ('snake_to_camel_', 'snakeToCamel_'), ('__snake_to_camel__', '__snakeToCamel__'), ('snake_2', 'snake2'), ('_snake_2', '_snake2'), ('snake_2_', 'snake2_'), ], ) def test_snake2camel_start_lower(value: str, result: str) -> None: assert to_camel(value) == result @pytest.mark.parametrize( 'value,result', [ ('snake_to_camel', 'SnakeToCamel'), ('snake_2_camel', 'Snake2Camel'), ('snake2camel', 'Snake2Camel'), ('_snake_to_camel', '_SnakeToCamel'), ('snake_to_camel_', 'SnakeToCamel_'), ('__snake_to_camel__', '__SnakeToCamel__'), ('snake_2', 'Snake2'), ('_snake_2', '_Snake2'), ('snake_2_', 'Snake2_'), ], ) def test_snake2pascal(value: str, result: str) -> None: assert to_pascal(value) == result @pytest.mark.parametrize( 'value,result', [ ('camel_to_snake', 'camel_to_snake'), ('camelToSnake', 'camel_to_snake'), ('camel2Snake', 'camel_2_snake'), ('_camelToSnake', '_camel_to_snake'), ('camelToSnake_', 'camel_to_snake_'), ('__camelToSnake__', '__camel_to_snake__'), ('CamelToSnake', 'camel_to_snake'), ('Camel2Snake', 'camel_2_snake'), ('_CamelToSnake', '_camel_to_snake'), ('CamelToSnake_', 'camel_to_snake_'), ('CAMELToSnake', 'camel_to_snake'), ('__CamelToSnake__', '__camel_to_snake__'), ('Camel2', 'camel_2'), ('Camel2_', 'camel_2_'), ('_Camel2', 
         '_camel_2'),
        ('camel2', 'camel_2'),
        ('camel2_', 'camel_2_'),
        ('_camel2', '_camel_2'),
        ('kebab-to-snake', 'kebab_to_snake'),
        ('kebab-Snake', 'kebab_snake'),
        ('Kebab-Snake', 'kebab_snake'),
        ('PascalToSnake', 'pascal_to_snake'),
        ('snake_to_snake', 'snake_to_snake'),
        ('snakeV2', 'snake_v2'),
    ],
)
def test_to_snake(value: str, result: str) -> None:
    assert to_snake(value) == result


def test_to_camel_from_camel() -> None:
    assert to_camel('alreadyCamel') == 'alreadyCamel'


def test_handle_tuple_schema():
    schema = core_schema.tuple_schema([core_schema.float_schema(), core_schema.int_schema()])

    def walk(s, recurse):
        # change extra_schema['type'] to 'str'
        if s['type'] == 'float':
            s['type'] = 'str'
        return s

    schema = _WalkCoreSchema().handle_tuple_schema(schema, walk)
    assert schema == {
        'items_schema': [{'type': 'str'}, {'type': 'int'}],
        'type': 'tuple',
    }


@pytest.mark.parametrize(
    'params,expected_extra_schema',
    (
        pytest.param({}, {}, id='Model fields without extra_validator'),
        pytest.param(
            {'extras_schema': core_schema.float_schema()},
            {'extras_schema': {'type': 'str'}},
            id='Model fields with extra_validator',
        ),
    ),
)
def test_handle_model_fields_schema(params, expected_extra_schema):
    schema = core_schema.model_fields_schema(
        {
            'foo': core_schema.model_field(core_schema.int_schema()),
        },
        **params,
    )

    def walk(s, recurse):
        # change extra_schema['type'] to 'str'
        if s['type'] == 'float':
            s['type'] = 'str'
        return s

    schema = _WalkCoreSchema().handle_model_fields_schema(schema, walk)
    assert schema == {
        **expected_extra_schema,
        'type': 'model-fields',
        'fields': {'foo': {'type': 'model-field', 'schema': {'type': 'int'}}},
    }


@pytest.mark.parametrize(
    'params,expected_extra_schema',
    (
        pytest.param({}, {}, id='Typeddict without extra_validator'),
        pytest.param(
            {'extras_schema': core_schema.float_schema()},
            {'extras_schema': {'type': 'str'}},
            id='Typeddict with extra_validator',
        ),
    ),
)
def test_handle_typed_dict_schema(params, expected_extra_schema):
    schema = core_schema.typed_dict_schema(
        {
            'foo': core_schema.model_field(core_schema.int_schema()),
        },
        **params,
    )

    def walk(s, recurse):
        # change extra_validator['type'] to 'str'
        if s['type'] == 'float':
            s['type'] = 'str'
        return s

    schema = _WalkCoreSchema().handle_typed_dict_schema(schema, walk)
    assert schema == {
        **expected_extra_schema,
        'type': 'typed-dict',
        'fields': {'foo': {'type': 'model-field', 'schema': {'type': 'int'}}},
    }


def test_handle_call_schema():
    param_a = core_schema.arguments_parameter(name='a', schema=core_schema.str_schema(), mode='positional_only')
    args_schema = core_schema.arguments_schema([param_a])

    schema = core_schema.call_schema(
        arguments=args_schema,
        function=lambda a: int(a),
        return_schema=core_schema.str_schema(),
    )

    def walk(s, recurse):
        # change return schema
        if 'return_schema' in schema:
            schema['return_schema']['type'] = 'int'
        return s

    schema = _WalkCoreSchema().handle_call_schema(schema, walk)
    assert schema['return_schema'] == {'type': 'int'}


class TestModel:
    __slots__ = (
        '__dict__',
        '__pydantic_fields_set__',
        '__pydantic_extra__',
        '__pydantic_private__',
    )


@pytest.mark.parametrize(
    'include_metadata, schema, expected',
    [
        # including metadata with a simple any schema
        (
            True,
            core_schema.AnySchema(
                type='any',
                ref='meta_schema',
                metadata={'schema_type': 'any', 'test_id': '42'},
                serialization=core_schema.simple_ser_schema('bool'),
            ),
            {
                'type': 'any',
                'ref': 'meta_schema',
                'metadata': {'schema_type': 'any', 'test_id': '42'},
                'serialization': {'type': 'bool'},
            },
        ),
        # excluding metadata with a model_fields_schema
        (
            False,
            core_schema.model_fields_schema(
                ref='meta_schema',
                metadata={'schema_type': 'model', 'test_id': '43'},
                computed_fields=[
                    core_schema.computed_field(
                        property_name='TestModel',
                        return_schema=core_schema.model_fields_schema(
                            fields={'a': core_schema.model_field(core_schema.str_schema())},
                        ),
                        alias='comp_field_1',
                        metadata={'comp_field_key': 'comp_field_data'},
                    )
                ],
                fields={'a': core_schema.model_field(core_schema.str_schema())},
            ),
            {
                'type': 'model-fields',
                'fields': {'a': {'type': 'model-field', 'schema': {'type': 'str'}}},
                'computed_fields': [
                    {
                        'type': 'computed-field',
                        'property_name': 'TestModel',
                        'return_schema': {
                            'type': 'model-fields',
                            'fields': {'a': {'type': 'model-field', 'schema': {'type': 'str'}}},
                        },
                        'alias': 'comp_field_1',
                        'metadata': {'comp_field_key': 'comp_field_data'},
                    }
                ],
                'ref': 'meta_schema',
            },
        ),
        # exclude metadata with a model_schema
        (
            False,
            core_schema.model_schema(
                ref='meta_schema',
                metadata={'schema_type': 'model', 'test_id': '43'},
                custom_init=False,
                root_model=False,
                cls=TestModel,
                config=core_schema.CoreConfig(str_max_length=5),
                schema=core_schema.model_fields_schema(
                    fields={'a': core_schema.model_field(core_schema.str_schema())},
                ),
            ),
            {
                'type': 'model',
                'schema': {'type': 'model-fields', 'fields': {'a': {'type': 'model-field', 'schema': {'type': 'str'}}}},
                'config': {'str_max_length': 5},
                'ref': 'meta_schema',
            },
        ),
    ],
)
def test_pretty_print(include_metadata, schema, expected, capfd, monkeypatch):
    """Verify basic functionality of pretty_print_core_schema, which is used as a utility for debugging.

    Given varied output, this test verifies that the content of the output is as expected,
    rather than doing robust formatting testing.
    """
    # This can break the test by adding color to the output streams
    monkeypatch.delenv('FORCE_COLOR', raising=False)
    pretty_print_core_schema(schema=schema, include_metadata=include_metadata)
    content = capfd.readouterr()

    # Remove cls due to string formatting (for case 3 above)
    cls_substring = "'cls': ,"
    new_content_out = content.out.replace(cls_substring, '')

    content_as_json = json.loads(new_content_out.replace("'", '"'))

    assert content_as_json == expected


pydantic-2.10.6/tests/test_v1.py

import warnings

from pydantic import VERSION
from pydantic import BaseModel as V2BaseModel
from pydantic.v1 import VERSION as V1_VERSION
from pydantic.v1 import BaseModel as V1BaseModel
from pydantic.v1 import root_validator as v1_root_validator


def test_version():
    assert V1_VERSION.startswith('1.')
    assert V1_VERSION != VERSION


def test_root_validator():
    class Model(V1BaseModel):
        v: str

        @v1_root_validator(pre=True)
        @classmethod
        def root_validator(cls, values):
            values['v'] += '-v1'
            return values

    model = Model(v='value')
    assert model.v == 'value-v1'


def test_isinstance_does_not_raise_deprecation_warnings():
    class V1Model(V1BaseModel):
        v: int

    class V2Model(V2BaseModel):
        v: int

    v1_obj = V1Model(v=1)
    v2_obj = V2Model(v=2)

    with warnings.catch_warnings():
        warnings.simplefilter('error')
        assert isinstance(v1_obj, V1BaseModel)
        assert not isinstance(v1_obj, V2BaseModel)
        assert not isinstance(v2_obj, V1BaseModel)
        assert isinstance(v2_obj, V2BaseModel)


pydantic-2.10.6/tests/test_validate_call.py

import asyncio
import inspect
import re
import sys
from datetime import datetime, timezone
from functools import partial
from typing import Any, List, Literal, Tuple, Union

import pytest
from pydantic_core import ArgsKwargs
from typing_extensions import Annotated, Required, TypedDict, Unpack

from pydantic import (
    AfterValidator,
    BaseModel,
    BeforeValidator,
    Field,
    PydanticInvalidForJsonSchema,
    PydanticUserError,
    Strict,
    TypeAdapter,
    ValidationError,
    validate_call,
    with_config,
)


def test_wrap() -> None:
    @validate_call
    def foo_bar(a: int, b: int):
        """This is the foo_bar method."""
        return f'{a}, {b}'

    assert foo_bar.__doc__ == 'This is the foo_bar method.'
    assert foo_bar.__name__ == 'foo_bar'
    assert foo_bar.__module__ == 'tests.test_validate_call'
    assert foo_bar.__qualname__ == 'test_wrap.<locals>.foo_bar'
    assert callable(foo_bar.raw_function)
    assert repr(inspect.signature(foo_bar)) == '<Signature (a: int, b: int)>'


def test_func_type() -> None:
    def f(x: int): ...

    class A:
        def m(self, x: int): ...

    for func in (f, lambda x: None, A.m, A().m):
        assert validate_call(func).__name__ == func.__name__
        assert validate_call(func).__qualname__ == func.__qualname__
        assert validate_call(partial(func)).__name__ == f'partial({func.__name__})'
        assert validate_call(partial(func)).__qualname__ == f'partial({func.__qualname__})'

    with pytest.raises(
        PydanticUserError,
        match=(f'Partial of `{list}` is invalid because the type of `{list}` is not supported by `validate_call`'),
    ):
        validate_call(partial(list))

    with pytest.raises(
        PydanticUserError,
        match=('`validate_call` should be applied to one of the following: function, method, partial, or lambda'),
    ):
        validate_call([])


def validate_bare_none() -> None:
    @validate_call
    def func(f: None):
        return f

    assert func(f=None) is None


def test_validate_class() -> None:
    class A:
        @validate_call
        def __new__(cls, x: int):
            return super().__new__(cls)

        @validate_call
        def __init__(self, x: int) -> None:
            self.x = x

    class M(type): ...

    for cls in (A, int, type, Exception, M):
        with pytest.raises(
            PydanticUserError,
            match=re.escape(
                '`validate_call` should be applied to functions, not classes (put `@validate_call` on top of `__init__` or `__new__` instead)'
            ),
        ):
            validate_call(cls)

    assert A('5').x == 5


def test_validate_custom_callable() -> None:
    class A:
        def __call__(self, x: int) -> int:
            return x

    with pytest.raises(
        PydanticUserError,
        match=re.escape(
            '`validate_call` should be applied to functions, not instances or other callables. Use `validate_call` explicitly on `__call__` instead.'
        ),
    ):
        validate_call(A())

    a = A()
    assert validate_call(a.__call__)('5') == 5

    # Note: dunder methods cannot be overridden at instance level
    class B:
        @validate_call
        def __call__(self, x: int) -> int:
            return x

    assert B()('5') == 5


def test_invalid_signature() -> None:
    # Builtins functions not supported:
    with pytest.raises(PydanticUserError, match=(f'Input built-in function `{breakpoint}` is not supported')):
        validate_call(breakpoint)

    class A:
        def f(): ...

    # A method requires at least one positional arg (i.e. `self`), so the signature is invalid
    func = A().f
    with pytest.raises(PydanticUserError, match=(f"Input function `{func}` doesn't have a valid signature")):
        validate_call(func)


@pytest.mark.parametrize('decorator', [staticmethod, classmethod])
def test_classmethod_order_error(decorator) -> None:
    name = decorator.__name__
    with pytest.raises(
        PydanticUserError,
        match=re.escape(f'The `@{name}` decorator should be applied after `@validate_call` (put `@{name}` on top)'),
    ):

        class A:
            @validate_call
            @decorator
            def method(self, x: int):
                pass


def test_args() -> None:
    @validate_call
    def foo(a: int, b: int):
        return f'{a}, {b}'

    assert foo(1, 2) == '1, 2'
    assert foo(*[1, 2]) == '1, 2'
    assert foo(*(1, 2)) == '1, 2'
    assert foo(*[1], 2) == '1, 2'
    assert foo(a=1, b=2) == '1, 2'
    assert foo(1, b=2) == '1, 2'
    assert foo(b=2, a=1) == '1, 2'

    with pytest.raises(ValidationError) as exc_info:
        foo()
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'missing_argument', 'loc': ('a',), 'msg': 'Missing required argument', 'input': ArgsKwargs(())},
        {'type': 'missing_argument', 'loc': ('b',), 'msg': 'Missing required argument', 'input': ArgsKwargs(())},
    ]

    with pytest.raises(ValidationError) as exc_info:
        foo(1, 'x')
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_parsing',
            'loc': (1,),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'x',
        }
    ]

    with pytest.raises(ValidationError, match=r'2\s+Unexpected positional argument'):
        foo(1, 2, 3)

    with pytest.raises(ValidationError, match=r'apple\s+Unexpected keyword argument'):
        foo(1, 2, apple=3)

    with pytest.raises(ValidationError, match=r'a\s+Got multiple values for argument'):
        foo(1, 2, a=3)

    with pytest.raises(ValidationError) as exc_info:
        foo(1, 2, a=3, b=4)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'multiple_argument_values', 'loc': ('a',), 'msg': 'Got multiple values for argument', 'input': 3},
        {'type': 'multiple_argument_values', 'loc': ('b',), 'msg': 'Got multiple values for argument', 'input': 4},
    ]


def test_optional():
    @validate_call
    def foo_bar(a: int = None):
        return f'a={a}'

    assert foo_bar() == 'a=None'
    assert foo_bar(1) == 'a=1'

    with pytest.raises(ValidationError) as exc_info:
        foo_bar(None)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'int_type', 'loc': (0,), 'msg': 'Input should be a valid integer', 'input': None}
    ]


def test_kwargs():
    @validate_call
    def foo(*, a: int, b: int):
        return a + b

    assert foo(a=1, b=3) == 4

    with pytest.raises(ValidationError) as exc_info:
        foo(a=1, b='x')
    assert exc_info.value.errors(include_url=False) == [
        {
            'input': 'x',
            'loc': ('b',),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'type': 'int_parsing',
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        foo(1, 'x')
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'missing_keyword_only_argument',
            'loc': ('a',),
            'msg': 'Missing required keyword only argument',
            'input': ArgsKwargs((1, 'x')),
        },
        {
            'type': 'missing_keyword_only_argument',
            'loc': ('b',),
            'msg': 'Missing required keyword only argument',
            'input': ArgsKwargs((1, 'x')),
        },
        {'type': 'unexpected_positional_argument', 'loc': (0,), 'msg': 'Unexpected positional argument', 'input': 1},
        {'type': 'unexpected_positional_argument', 'loc': (1,), 'msg': 'Unexpected positional argument', 'input': 'x'},
    ]


def test_untyped():
    @validate_call
    def foo(a, b, c='x', *, d='y'):
        return ', '.join(str(arg) for arg in [a, b, c, d])

    assert foo(1, 2) == '1, 2, x, y'
    assert foo(1, {'x': 2}, c='3', d='4') == "1, {'x': 2}, 3, 4"


@pytest.mark.parametrize('validated', (True, False))
def test_var_args_kwargs(validated):
    def foo(a, b, *args, d=3, **kwargs):
        return f'a={a!r}, b={b!r}, args={args!r}, d={d!r}, kwargs={kwargs!r}'

    if validated:
        foo = validate_call(foo)

    assert foo(1, 2) == 'a=1, b=2, args=(), d=3, kwargs={}'
    assert foo(1, 2, 3, d=4) == 'a=1, b=2, args=(3,), d=4, kwargs={}'
    assert foo(*[1, 2, 3], d=4) == 'a=1, b=2, args=(3,), d=4, kwargs={}'
    assert foo(1, 2, args=(10, 11)) == "a=1, b=2, args=(), d=3, kwargs={'args': (10, 11)}"
    assert foo(1, 2, 3, args=(10, 11)) == "a=1, b=2, args=(3,), d=3, kwargs={'args': (10, 11)}"
    assert foo(1, 2, 3, e=10) == "a=1, b=2, args=(3,), d=3, kwargs={'e': 10}"
    assert foo(1, 2, kwargs=4) == "a=1, b=2, args=(), d=3, kwargs={'kwargs': 4}"
    assert foo(1, 2, kwargs=4, e=5) == "a=1, b=2, args=(), d=3, kwargs={'kwargs': 4, 'e': 5}"


def test_unpacked_typed_dict_kwargs_invalid_type() -> None:
    with pytest.raises(PydanticUserError) as exc:

        @validate_call
        def foo(**kwargs: Unpack[int]):
            pass

    assert exc.value.code == 'unpack-typed-dict'


def test_unpacked_typed_dict_kwargs_overlaps() -> None:
    class TD(TypedDict, total=False):
        a: int
        b: int
        c: int

    with pytest.raises(PydanticUserError) as exc:

        @validate_call
        def foo(a: int, b: int, **kwargs: Unpack[TD]):
            pass

    assert exc.value.code == 'overlapping-unpack-typed-dict'
    assert exc.value.message == "Typed dictionary 'TD' overlaps with parameters 'a', 'b'"

    # Works for a pos-only argument
    @validate_call
    def foo(a: int, /, **kwargs: Unpack[TD]):
        pass

    foo(1, a=1)


def test_unpacked_typed_dict_kwargs() -> None:
    @with_config({'strict': True})
    class TD(TypedDict, total=False):
        a: int
        b: Required[str]

    @validate_call
    def foo1(**kwargs: Unpack[TD]):
        pass

    @validate_call
    def foo2(**kwargs: 'Unpack[TD]'):
        pass

    for foo in (foo1, foo2):
        foo(a=1, b='test')
        foo(b='test')

        with pytest.raises(ValidationError) as exc:
            foo(a='1')

        assert exc.value.errors()[0]['type'] == 'int_type'
        assert exc.value.errors()[0]['loc'] == ('a',)
        assert exc.value.errors()[1]['type'] == 'missing'
        assert exc.value.errors()[1]['loc'] == ('b',)

        # Make sure that when called without any arguments,
        # empty kwargs are still validated against the typed dict:
        with pytest.raises(ValidationError) as exc:
            foo()

        assert exc.value.errors()[0]['type'] == 'missing'
        assert exc.value.errors()[0]['loc'] == ('b',)


def test_unpacked_typed_dict_kwargs_functional_syntax() -> None:
    TD = TypedDict('TD', {'in': int, 'x-y': int})

    @validate_call
    def foo(**kwargs: Unpack[TD]):
        pass

    foo(**{'in': 1, 'x-y': 2})

    with pytest.raises(ValidationError) as exc:
        foo(**{'in': 'not_an_int', 'x-y': 1})

    assert exc.value.errors()[0]['type'] == 'int_parsing'
    assert exc.value.errors()[0]['loc'] == ('in',)


def test_field_can_provide_factory() -> None:
    @validate_call
    def foo(a: int, b: int = Field(default_factory=lambda: 99), *args: int) -> int:
        """mypy is happy with this"""
        return a + b + sum(args)

    assert foo(3) == 102
    assert foo(1, 2, 3) == 6


def test_annotated_field_can_provide_factory() -> None:
    @validate_call
    def foo2(a: int, b: 'Annotated[int, Field(default_factory=lambda: 99)]', *args: int) -> int:
        """mypy reports Incompatible default for argument "b" if we don't supply ANY as default"""
        return a + b + sum(args)

    assert foo2(1) == 100


def test_positional_only(create_module):
    module = create_module(
        # language=Python
        """
from pydantic import validate_call


@validate_call
def foo(a, b, /, c=None):
    return f'{a}, {b}, {c}'
"""
    )
    assert module.foo(1, 2) == '1, 2, None'
    assert module.foo(1, 2, 44) == '1, 2, 44'
    assert module.foo(1, 2, c=44) == '1, 2, 44'

    with pytest.raises(ValidationError) as exc_info:
        module.foo(1, b=2)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'missing_positional_only_argument',
            'loc': (1,),
            'msg': 'Missing required positional only argument',
            'input': ArgsKwargs((1,), {'b': 2}),
        },
        {'type': 'unexpected_keyword_argument', 'loc': ('b',), 'msg': 'Unexpected keyword argument', 'input': 2},
    ]

    with pytest.raises(ValidationError) as exc_info:
        module.foo(a=1, b=2)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'missing_positional_only_argument',
            'loc': (0,),
            'msg': 'Missing required positional only argument',
            'input': ArgsKwargs((), {'a': 1, 'b': 2}),
        },
        {
            'type': 'missing_positional_only_argument',
            'loc': (1,),
            'msg': 'Missing required positional only argument',
            'input': ArgsKwargs((), {'a': 1, 'b': 2}),
        },
        {'type': 'unexpected_keyword_argument', 'loc': ('a',), 'msg': 'Unexpected keyword argument', 'input': 1},
        {'type': 'unexpected_keyword_argument', 'loc': ('b',), 'msg': 'Unexpected keyword argument', 'input': 2},
    ]


def test_args_name():
    @validate_call
    def foo(args: int, kwargs: int):
        return f'args={args!r}, kwargs={kwargs!r}'

    assert foo(1, 2) == 'args=1, kwargs=2'

    with pytest.raises(ValidationError, match=r'apple\s+Unexpected keyword argument'):
        foo(1, 2, apple=4)

    with pytest.raises(ValidationError) as exc_info:
        foo(1, 2, apple=4, banana=5)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'unexpected_keyword_argument', 'loc': ('apple',), 'msg': 'Unexpected keyword argument', 'input': 4},
        {'type': 'unexpected_keyword_argument', 'loc': ('banana',), 'msg': 'Unexpected keyword argument', 'input': 5},
    ]

    with pytest.raises(ValidationError) as exc_info:
        foo(1, 2, 3)
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'unexpected_positional_argument', 'loc': (2,), 'msg': 'Unexpected positional argument', 'input': 3}
    ]


def test_async():
    @validate_call
    async def foo(a, b):
        return f'a={a} b={b}'

    async def run():
        v = await foo(1, 2)
        assert v == 'a=1 b=2'

    # insert_assert(inspect.iscoroutinefunction(foo) is True)
    assert inspect.iscoroutinefunction(foo) is True
    asyncio.run(run())

    with pytest.raises(ValidationError) as exc_info:
        asyncio.run(foo('x'))
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'missing_argument', 'loc': ('b',), 'msg': 'Missing required argument', 'input': ArgsKwargs(('x',))}
    ]


def test_string_annotation():
    @validate_call
    def foo(a: 'List[int]', b: 'float'):
        return f'a={a!r} b={b!r}'

    assert foo([1, 2, 3], 22) == 'a=[1, 2, 3] b=22.0'

    with pytest.raises(ValidationError) as exc_info:
        foo(['x'])
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_parsing',
            'loc': (0, 0),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'x',
        },
        {'type': 'missing_argument', 'loc': ('b',), 'msg': 'Missing required argument', 'input': ArgsKwargs((['x'],))},
    ]


def test_local_annotation():
    ListInt = List[int]

    @validate_call
    def foo(a: ListInt):
        return f'a={a!r}'

    assert foo([1, 2, 3]) == 'a=[1, 2, 3]'

    with pytest.raises(ValidationError) as exc_info:
        foo(['x'])
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_parsing',
            'loc': (0, 0),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'x',
        },
    ]


def test_item_method():
    class X:
        def __init__(self, v):
            self.v = v

        @validate_call
        def foo(self, a: int, b: int):
            assert self.v == a
            return f'{a}, {b}'

    x = X(4)
    assert x.foo(4, 2) == '4, 2'
    assert x.foo(*[4, 2]) == '4, 2'

    with pytest.raises(ValidationError) as exc_info:
        x.foo()
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'missing_argument', 'loc': ('a',), 'msg': 'Missing required argument', 'input': ArgsKwargs((x,))},
        {'type': 'missing_argument', 'loc': ('b',), 'msg': 'Missing required argument', 'input': ArgsKwargs((x,))},
    ]


def test_class_method():
    class X:
        @classmethod
        @validate_call
        def foo(cls, a: int, b: int):
            assert cls == X
            return f'{a}, {b}'

    x = X()
    assert x.foo(4, 2) == '4, 2'
    assert x.foo(*[4, 2]) == '4, 2'

    with \
        pytest.raises(ValidationError) as exc_info:
        x.foo()
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'missing_argument', 'loc': ('a',), 'msg': 'Missing required argument', 'input': ArgsKwargs((X,))},
        {'type': 'missing_argument', 'loc': ('b',), 'msg': 'Missing required argument', 'input': ArgsKwargs((X,))},
    ]


def test_json_schema():
    @validate_call
    def foo(a: int, b: int = None):
        return f'{a}, {b}'

    assert foo(1, 2) == '1, 2'
    assert foo(1, b=2) == '1, 2'
    assert foo(1) == '1, None'
    assert TypeAdapter(foo).json_schema() == {
        'type': 'object',
        'properties': {'a': {'title': 'A', 'type': 'integer'}, 'b': {'default': None, 'title': 'B', 'type': 'integer'}},
        'required': ['a'],
        'additionalProperties': False,
    }

    @validate_call
    def foo(a: int, /, b: int):
        return f'{a}, {b}'

    assert foo(1, 2) == '1, 2'
    assert TypeAdapter(foo).json_schema() == {
        'maxItems': 2,
        'minItems': 2,
        'prefixItems': [{'title': 'A', 'type': 'integer'}, {'title': 'B', 'type': 'integer'}],
        'type': 'array',
    }

    @validate_call
    def foo(a: int, /, *, b: int, c: int):
        return f'{a}, {b}, {c}'

    assert foo(1, b=2, c=3) == '1, 2, 3'
    with pytest.raises(
        PydanticInvalidForJsonSchema,
        match=('Unable to generate JSON schema for arguments validator with positional-only and keyword-only arguments'),
    ):
        TypeAdapter(foo).json_schema()

    @validate_call
    def foo(*numbers: int) -> int:
        return sum(numbers)

    assert foo(1, 2, 3) == 6
    assert TypeAdapter(foo).json_schema() == {'items': {'type': 'integer'}, 'type': 'array'}

    @validate_call
    def foo(a: int, *numbers: int) -> int:
        return a + sum(numbers)

    assert foo(1, 2, 3) == 6
    assert TypeAdapter(foo).json_schema() == {
        'items': {'type': 'integer'},
        'prefixItems': [{'title': 'A', 'type': 'integer'}],
        'minItems': 1,
        'type': 'array',
    }

    @validate_call
    def foo(**scores: int) -> str:
        return ', '.join(f'{k}={v}' for k, v in sorted(scores.items()))

    assert foo(a=1, b=2) == 'a=1, b=2'
    assert TypeAdapter(foo).json_schema() == {
        'additionalProperties': {'type': 'integer'},
        'properties': {},
        'type': 'object',
    }

    @validate_call
    def foo(a: Annotated[int, Field(alias='A')]):
        return a

    assert foo(1) == 1
    assert TypeAdapter(foo).json_schema() == {
        'additionalProperties': False,
        'properties': {'A': {'title': 'A', 'type': 'integer'}},
        'required': ['A'],
        'type': 'object',
    }


def test_alias_generator():
    @validate_call(config=dict(alias_generator=lambda x: x * 2))
    def foo(a: int, b: int):
        return f'{a}, {b}'

    assert foo(1, 2) == '1, 2'
    assert foo(aa=1, bb=2) == '1, 2'


def test_config_arbitrary_types_allowed():
    class EggBox:
        def __str__(self) -> str:
            return 'EggBox()'

    @validate_call(config=dict(arbitrary_types_allowed=True))
    def foo(a: int, b: EggBox):
        return f'{a}, {b}'

    assert foo(1, EggBox()) == '1, EggBox()'
    with pytest.raises(ValidationError) as exc_info:
        assert foo(1, 2) == '1, 2'
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'is_instance_of',
            'loc': (1,),
            'msg': 'Input should be an instance of test_config_arbitrary_types_allowed.<locals>.EggBox',
            'input': 2,
            'ctx': {'class': 'test_config_arbitrary_types_allowed.<locals>.EggBox'},
        }
    ]


def test_config_strict():
    @validate_call(config=dict(strict=True))
    def foo(a: int, b: List[str]):
        return f'{a}, {b[0]}'

    assert foo(1, ['bar', 'foobar']) == '1, bar'
    with pytest.raises(ValidationError) as exc_info:
        foo('foo', ('bar', 'foobar'))
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'int_type', 'loc': (0,), 'msg': 'Input should be a valid integer', 'input': 'foo'},
        {'type': 'list_type', 'loc': (1,), 'msg': 'Input should be a valid list', 'input': ('bar', 'foobar')},
    ]


def test_annotated_num():
    @validate_call
    def f(a: Annotated[int, Field(gt=0), Field(lt=10)]):
        return a

    assert f(5) == 5

    with pytest.raises(ValidationError) as exc_info:
        f(0)
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'greater_than', 'loc': (0,), 'msg': 'Input should be greater than 0', 'input': 0, 'ctx': {'gt': 0}}
    ]

    with pytest.raises(ValidationError) as exc_info:
        f(10)
    assert exc_info.value.errors(include_url=False) == [
        {'type': 'less_than', 'loc': (0,), 'msg': 'Input should be less than 10', 'input': 10, 'ctx': {'lt': 10}}
    ]


def test_annotated_discriminator():
    class Cat(BaseModel):
        type: Literal['cat'] = 'cat'
        food: str
        meow: int

    class Dog(BaseModel):
        type: Literal['dog'] = 'dog'
        food: str
        bark: int

    Pet = Annotated[Union[Cat, Dog], Field(discriminator='type')]

    @validate_call
    def f(pet: Pet):
        return pet

    with pytest.raises(ValidationError) as exc_info:
        f({'food': 'fish'})

    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'union_tag_not_found',
            'loc': (0,),
            'msg': "Unable to extract tag using discriminator 'type'",
            'input': {'food': 'fish'},
            'ctx': {'discriminator': "'type'"},
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        f({'type': 'dog', 'food': 'fish'})

    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'missing',
            'loc': (0, 'dog', 'bark'),
            'msg': 'Field required',
            'input': {'type': 'dog', 'food': 'fish'},
        }
    ]


def test_annotated_validator():
    @validate_call
    def f(x: Annotated[int, BeforeValidator(lambda x: x + '2'), AfterValidator(lambda x: x + 1)]):
        return x

    assert f('1') == 13


def test_annotated_strict():
    @validate_call
    def f1(x: Annotated[int, Strict()]):
        return x

    @validate_call
    def f2(x: 'Annotated[int, Strict()]'):
        return x

    for f in (f1, f2):
        assert f(1) == 1
        with pytest.raises(ValidationError) as exc_info:
            f('1')
        assert exc_info.value.errors(include_url=False) == [
            {'type': 'int_type', 'loc': (0,), 'msg': 'Input should be a valid integer', 'input': '1'}
        ]


def test_annotated_use_of_alias():
    @validate_call
    def foo(a: Annotated[int, Field(alias='b')], c: Annotated[int, Field()], d: Annotated[int, Field(alias='')]):
        return a + c + d

    assert foo(**{'b': 10, 'c': 12, '': 1}) == 23

    with pytest.raises(ValidationError) as exc_info:
        assert foo(a=10, c=12, d=1) == 10
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'missing_argument',
            'loc': ('b',),
            'msg': 'Missing required argument',
            'input': ArgsKwargs((), {'a': 10, 'c': 12, 'd': 1}),
        },
        {
            'type': 'missing_argument',
            'loc': ('',),
            'msg': 'Missing required argument',
            'input': ArgsKwargs((), {'a': 10, 'c': 12, 'd': 1}),
        },
        {'type': 'unexpected_keyword_argument', 'loc': ('a',), 'msg': 'Unexpected keyword argument', 'input': 10},
        {'type': 'unexpected_keyword_argument', 'loc': ('d',), 'msg': 'Unexpected keyword argument', 'input': 1},
    ]


def test_use_of_alias():
    @validate_call
    def foo(c: int = Field(default_factory=lambda: 20), a: int = Field(default_factory=lambda: 10, alias='b')):
        return a + c

    assert foo(b=10) == 30


def test_populate_by_name():
    @validate_call(config=dict(populate_by_name=True))
    def foo(a: Annotated[int, Field(alias='b')], c: Annotated[int, Field(alias='d')]):
        return a + c

    assert foo(b=10, d=1) == 11
    assert foo(a=10, d=1) == 11
    assert foo(b=10, c=1) == 11
    assert foo(a=10, c=1) == 11


def test_validate_return():
    @validate_call(config=dict(validate_return=True))
    def foo(a: int, b: int) -> int:
        return a + b

    assert foo(1, 2) == 3


def test_validate_all():
    @validate_call(config=dict(validate_default=True))
    def foo(dt: datetime = Field(default_factory=lambda: 946684800)):
        return dt

    assert foo() == datetime(2000, 1, 1, tzinfo=timezone.utc)
    assert foo(0) == datetime(1970, 1, 1, tzinfo=timezone.utc)


def test_validate_all_positional(create_module):
    module = create_module(
        # language=Python
        """
from datetime import datetime

from pydantic import Field, validate_call


@validate_call(config=dict(validate_default=True))
def foo(dt: datetime = Field(default_factory=lambda: 946684800), /):
    return dt
"""
    )
    assert module.foo() == datetime(2000, 1, 1, tzinfo=timezone.utc)
    assert module.foo(0) == datetime(1970, 1, 1, tzinfo=timezone.utc)


def test_partial():
    def my_wrapped_function(a: int, b: int, c: int):
        return a + b + c

    my_partial_function = partial(my_wrapped_function, c=3)
    f = \
        validate_call(my_partial_function)
    assert f(1, 2) == 6


def test_validator_init():
    class Foo:
        @validate_call
        def __init__(self, a: int, b: int):
            self.v = a + b

    assert Foo(1, 2).v == 3
    assert Foo(1, '2').v == 3
    with pytest.raises(ValidationError, match="type=int_parsing, input_value='x', input_type=str"):
        Foo(1, 'x')


def test_positional_and_keyword_with_same_name(create_module):
    module = create_module(
        # language=Python
        """
from pydantic import validate_call


@validate_call
def f(a: int, /, **kwargs):
    return a, kwargs
"""
    )
    assert module.f(1, a=2) == (1, {'a': 2})


def test_model_as_arg() -> None:
    class Model1(TypedDict):
        x: int

    class Model2(BaseModel):
        y: int

    @validate_call(validate_return=True)
    def f1(m1: Model1, m2: Model2) -> Tuple[Model1, Model2]:
        return (m1, m2.model_dump())  # type: ignore

    res = f1({'x': '1'}, {'y': '2'})  # type: ignore
    assert res == ({'x': 1}, Model2(y=2))


def test_do_not_call_repr_on_validate_call() -> None:
    class Class:
        @validate_call
        def __init__(self, number: int) -> None: ...

        def __repr__(self) -> str:
            assert False

    Class(50)


def test_methods_are_not_rebound():
    class Thing:
        def __init__(self, x: int):
            self.x = x

        def a(self, x: int):
            return x + self.x

        c = validate_call(a)

    thing = Thing(1)
    assert thing.a == thing.a
    assert thing.c == thing.c
    assert Thing.c == Thing.c

    # Ensure validation is still happening
    assert Thing.c(thing, '2') == 3
    assert Thing(2).c('3') == 5


def test_basemodel_method():
    class Foo(BaseModel):
        @classmethod
        @validate_call
        def test(cls, x: int):
            return cls, x

    assert Foo.test('1') == (Foo, 1)

    class Bar(BaseModel):
        @validate_call
        def test(self, x: int):
            return self, x

    bar = Bar()
    assert bar.test('1') == (bar, 1)


def test_dynamic_method_decoration():
    class Foo:
        def bar(self, value: str) -> str:
            return f'bar-{value}'

    Foo.bar = validate_call(Foo.bar)
    assert Foo.bar

    foo = Foo()
    assert foo.bar('test') == 'bar-test'


def test_async_func() -> None:
    @validate_call(validate_return=True)
    async def foo(a: Any) -> int:
        return a

    res = asyncio.run(foo(1))
    assert res == 1

    with pytest.raises(ValidationError) as exc_info:
        asyncio.run(foo('x'))
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_parsing',
            'loc': (),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'x',
        }
    ]


def test_validate_call_with_slots() -> None:
    class ClassWithSlots:
        __slots__ = {}

        @validate_call(validate_return=True)
        def some_instance_method(self, x: str) -> str:
            return x

        @classmethod
        @validate_call(validate_return=True)
        def some_class_method(cls, x: str) -> str:
            return x

        @staticmethod
        @validate_call(validate_return=True)
        def some_static_method(x: str) -> str:
            return x

    c = ClassWithSlots()
    assert c.some_instance_method(x='potato') == 'potato'
    assert c.some_class_method(x='pepper') == 'pepper'
    assert c.some_static_method(x='onion') == 'onion'

    # verify that equality still holds for instance methods
    assert c.some_instance_method == c.some_instance_method
    assert c.some_class_method == c.some_class_method
    assert c.some_static_method == c.some_static_method


def test_eval_type_backport():
    @validate_call
    def foo(bar: 'list[int | str]') -> 'list[int | str]':
        return bar

    assert foo([1, '2']) == [1, '2']
    with pytest.raises(ValidationError) as exc_info:
        foo('not a list')  # type: ignore
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'list_type',
            'loc': (0,),
            'msg': 'Input should be a valid list',
            'input': 'not a list',
        }
    ]
    with pytest.raises(ValidationError) as exc_info:
        foo([{'not a str or int'}])  # type: ignore
    # insert_assert(exc_info.value.errors(include_url=False))
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_type',
            'loc': (0, 0, 'int'),
            'msg': 'Input should be a valid integer',
            'input': {'not a str or int'},
        },
        {
            'type': 'string_type',
            'loc': (0, 0, 'str'),
            'msg': 'Input should be a valid string',
            'input': {'not a str or int'},
        },
    ]


def test_eval_namespace_basic(create_module):
    module = create_module(
        """
from __future__ import annotations
from typing import TypeVar
from pydantic import validate_call

T = TypeVar('T', bound=int)

@validate_call
def f(x: T): ...

def g():
    MyList = list

    @validate_call
    def h(x: MyList[int]): ...

    return h
"""
    )
    f = module.f
    f(1)
    with pytest.raises(ValidationError) as exc_info:
        f('x')
    assert exc_info.value.errors(include_url=False) == [
        {
            'input': 'x',
            'loc': (0,),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'type': 'int_parsing',
        }
    ]

    h = module.g()
    with pytest.raises(ValidationError) as exc_info:
        h('not a list')
    assert exc_info.value.errors(include_url=False) == [
        {'input': 'not a list', 'loc': (0,), 'msg': 'Input should be a valid list', 'type': 'list_type'}
    ]


@pytest.mark.skipif(sys.version_info < (3, 12), reason='requires Python 3.12+ for PEP 695 syntax with generics')
def test_validate_call_with_pep_695_syntax(create_module) -> None:
    """Note: validate_call still doesn't work properly with generics, see https://github.com/pydantic/pydantic/issues/7796.

    This test is just to ensure that the syntax is accepted and doesn't raise a NameError."""
    module = create_module(
        """
from typing import Iterable
from pydantic import validate_call


@validate_call
def find_max_no_validate_return[T](args: 'Iterable[T]') -> T:
    return sorted(args, reverse=True)[0]


@validate_call(validate_return=True)
def find_max_validate_return[T](args: 'Iterable[T]') -> T:
    return sorted(args, reverse=True)[0]
"""
    )
    functions = [module.find_max_no_validate_return, module.find_max_validate_return]
    for find_max in functions:
        assert len(find_max.__type_params__) == 1
        assert find_max([1, 2, 10, 5]) == 10

        with pytest.raises(ValidationError):
            find_max(1)


@pytest.mark.skipif(sys.version_info < (3, 12), reason='requires Python 3.12+ for PEP 695 syntax with generics')
def test_pep695_with_class(create_module):
    """Primarily to ensure that the syntax is accepted and doesn't raise a `NameError` with `T`.
The validation is not expected to work properly when parameterized at this point.""" for import_annotations in ('from __future__ import annotations', ''): module = create_module( f""" {import_annotations} from pydantic import validate_call class A[T]: @validate_call(validate_return=True) def f(self, a: T) -> T: return str(a) """ ) A = module.A a = A[int]() # these two are undesired behavior, but it's what happens now assert a.f(1) == '1' assert a.f('1') == '1' @pytest.mark.skipif(sys.version_info < (3, 12), reason='requires Python 3.12+ for PEP 695 syntax with generics') def test_pep695_with_nested_scopes(create_module): """Nested scopes generally cannot be caught by `parent_frame_namespace`, so currently this test is expected to fail. """ module = create_module( """ from __future__ import annotations from pydantic import validate_call class A[T]: def g(self): @validate_call(validate_return=True) def inner(a: T) -> T: ... def h[S](self): @validate_call(validate_return=True) def inner(a: T) -> S: ... """ ) A = module.A a = A[int]() with pytest.raises(NameError): a.g() with pytest.raises(NameError): a.h() with pytest.raises(NameError): create_module( """ from __future__ import annotations from pydantic import validate_call class A[T]: class B: @validate_call(validate_return=True) def f(a: T) -> T: ... class C[S]: @validate_call(validate_return=True) def f(a: T) -> S: ... 
""" ) class M0(BaseModel): z: int M = M0 def test_uses_local_ns(): class M1(BaseModel): y: int M = M1 # noqa: F841 def foo(): class M2(BaseModel): z: int M = M2 # noqa: F841 @validate_call(validate_return=True) def bar(m: 'M') -> 'M': return m assert bar({'z': 1}) == M2(z=1) foo() pydantic-2.10.6/tests/test_validators.py000066400000000000000000002471341474456633400203710ustar00rootroot00000000000000import contextlib import re from collections import deque from dataclasses import dataclass from datetime import date, datetime from enum import Enum from functools import partial, partialmethod from itertools import product from os.path import normcase from typing import Any, Callable, Deque, Dict, FrozenSet, List, NamedTuple, Optional, Tuple, Union from unittest.mock import MagicMock import pytest from dirty_equals import HasRepr, IsInstance from pydantic_core import core_schema from typing_extensions import Annotated, Literal, TypedDict from pydantic import ( BaseModel, ConfigDict, Field, GetCoreSchemaHandler, PlainSerializer, PydanticDeprecatedSince20, PydanticUserError, TypeAdapter, ValidationError, ValidationInfo, ValidatorFunctionWrapHandler, errors, field_validator, model_validator, root_validator, validate_call, validator, ) from pydantic.dataclasses import dataclass as pydantic_dataclass from pydantic.functional_validators import AfterValidator, BeforeValidator, PlainValidator, WrapValidator V1_VALIDATOR_DEPRECATION_MATCH = r'Pydantic V1 style `@validator` validators are deprecated' def test_annotated_validator_after() -> None: MyInt = Annotated[int, AfterValidator(lambda x, _info: x if x != -1 else 0)] class Model(BaseModel): x: MyInt assert Model(x=0).x == 0 assert Model(x=-1).x == 0 assert Model(x=-2).x == -2 assert Model(x=1).x == 1 assert Model(x='-1').x == 0 def test_annotated_validator_before() -> None: FloatMaybeInf = Annotated[float, BeforeValidator(lambda x, _info: x if x != 'zero' else 0.0)] class Model(BaseModel): x: FloatMaybeInf assert 
Model(x='zero').x == 0.0 assert Model(x=1.0).x == 1.0 assert Model(x='1.0').x == 1.0 def test_annotated_validator_builtin() -> None: """https://github.com/pydantic/pydantic/issues/6752""" TruncatedFloat = Annotated[float, BeforeValidator(int)] DateTimeFromIsoFormat = Annotated[datetime, BeforeValidator(datetime.fromisoformat)] class Model(BaseModel): x: TruncatedFloat y: DateTimeFromIsoFormat m = Model(x=1.234, y='2011-11-04T00:05:23') assert m.x == 1 assert m.y == datetime(2011, 11, 4, 0, 5, 23) def test_annotated_validator_plain() -> None: MyInt = Annotated[int, PlainValidator(lambda x, _info: x if x != -1 else 0)] class Model(BaseModel): x: MyInt assert Model(x=0).x == 0 assert Model(x=-1).x == 0 assert Model(x=-2).x == -2 def test_annotated_validator_wrap() -> None: def sixties_validator(val: Any, handler: ValidatorFunctionWrapHandler, info: ValidationInfo) -> date: if val == 'epoch': return date.fromtimestamp(0) newval = handler(val) if not date.fromisoformat('1960-01-01') <= newval < date.fromisoformat('1970-01-01'): raise ValueError(f'{val} is not in the sixties!') return newval SixtiesDateTime = Annotated[date, WrapValidator(sixties_validator)] class Model(BaseModel): x: SixtiesDateTime assert Model(x='epoch').x == date.fromtimestamp(0) assert Model(x='1962-01-13').x == date(year=1962, month=1, day=13) assert Model(x=datetime(year=1962, month=1, day=13)).x == date(year=1962, month=1, day=13) with pytest.raises(ValidationError) as exc_info: Model(x=date(year=1970, month=4, day=17)) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'error': HasRepr(repr(ValueError('1970-04-17 is not in the sixties!')))}, 'input': date(1970, 4, 17), 'loc': ('x',), 'msg': 'Value error, 1970-04-17 is not in the sixties!', 'type': 'value_error', } ] def test_annotated_validator_nested() -> None: MyInt = Annotated[int, AfterValidator(lambda x: x if x != -1 else 0)] def non_decreasing_list(data: List[int]) -> List[int]: for prev, cur in zip(data, data[1:]): assert cur 
>= prev return data class Model(BaseModel): x: Annotated[List[MyInt], AfterValidator(non_decreasing_list)] assert Model(x=[0, -1, 2]).x == [0, 0, 2] with pytest.raises(ValidationError) as exc_info: Model(x=[0, -1, -2]) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'error': HasRepr(repr(AssertionError('assert -2 >= 0')))}, 'input': [0, -1, -2], 'loc': ('x',), 'msg': 'Assertion failed, assert -2 >= 0', 'type': 'assertion_error', } ] def test_annotated_validator_runs_before_field_validators() -> None: MyInt = Annotated[int, AfterValidator(lambda x: x if x != -1 else 0)] class Model(BaseModel): x: MyInt @field_validator('x') def val_x(cls, v: int) -> int: assert v != -1 return v assert Model(x=-1).x == 0 @pytest.mark.parametrize( 'validator, func', [ (PlainValidator, lambda x: x if x != -1 else 0), (WrapValidator, lambda x, nxt: x if x != -1 else 0), (BeforeValidator, lambda x: x if x != -1 else 0), (AfterValidator, lambda x: x if x != -1 else 0), ], ) def test_annotated_validator_typing_cache(validator, func): FancyInt = Annotated[int, validator(func)] class FancyIntModel(BaseModel): x: Optional[FancyInt] assert FancyIntModel(x=1234).x == 1234 assert FancyIntModel(x=-1).x == 0 assert FancyIntModel(x=0).x == 0 def test_simple(): class Model(BaseModel): a: str @field_validator('a') @classmethod def check_a(cls, v: Any): if 'foobar' not in v: raise ValueError('"foobar" not found in a') return v assert Model(a='this is foobar good').a == 'this is foobar good' with pytest.raises(ValidationError) as exc_info: Model(a='snap') assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'error': HasRepr(repr(ValueError('"foobar" not found in a')))}, 'input': 'snap', 'loc': ('a',), 'msg': 'Value error, "foobar" not found in a', 'type': 'value_error', } ] def test_int_validation(): class Model(BaseModel): a: int with pytest.raises(ValidationError) as exc_info: Model(a='snap') assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': 
('a',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'snap', } ] assert Model(a=3).a == 3 assert Model(a=True).a == 1 assert Model(a=False).a == 0 with pytest.raises(ValidationError) as exc_info: Model(a=4.5) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_from_float', 'loc': ('a',), 'msg': 'Input should be a valid integer, got a number with a fractional part', 'input': 4.5, } ] # Doesn't raise ValidationError for number > (2 ^ 63) - 1 assert Model(a=(2**63) + 100).a == (2**63) + 100 @pytest.mark.parametrize('value', [2.2250738585072011e308, float('nan'), float('inf')]) def test_int_overflow_validation(value): class Model(BaseModel): a: int with pytest.raises(ValidationError) as exc_info: Model(a=value) assert exc_info.value.errors(include_url=False) == [ {'type': 'finite_number', 'loc': ('a',), 'msg': 'Input should be a finite number', 'input': value} ] def test_frozenset_validation(): class Model(BaseModel): a: FrozenSet[int] with pytest.raises(ValidationError) as exc_info: Model(a='snap') assert exc_info.value.errors(include_url=False) == [ {'type': 'frozen_set_type', 'loc': ('a',), 'msg': 'Input should be a valid frozenset', 'input': 'snap'} ] assert Model(a={1, 2, 3}).a == frozenset({1, 2, 3}) assert Model(a=frozenset({1, 2, 3})).a == frozenset({1, 2, 3}) assert Model(a=[4, 5]).a == frozenset({4, 5}) assert Model(a=(6,)).a == frozenset({6}) assert Model(a={'1', '2', '3'}).a == frozenset({1, 2, 3}) def test_deque_validation(): class Model(BaseModel): a: Deque[int] with pytest.raises(ValidationError) as exc_info: Model(a='snap') assert exc_info.value.errors(include_url=False) == [ {'type': 'list_type', 'loc': ('a',), 'msg': 'Input should be a valid list', 'input': 'snap'} ] with pytest.raises(ValidationError) as exc_info: Model(a=['a']) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('a', 0), 'msg': 'Input should be a valid integer, unable to parse string as an 
integer', 'input': 'a', } ] with pytest.raises(ValidationError) as exc_info: Model(a=('a',)) assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('a', 0), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'a', } ] assert Model(a={'1'}).a == deque([1]) assert Model(a=[4, 5]).a == deque([4, 5]) assert Model(a=(6,)).a == deque([6]) def test_validate_whole(): class Model(BaseModel): a: List[int] @field_validator('a', mode='before') @classmethod def check_a1(cls, v: List[Any]) -> List[Any]: v.append('123') return v @field_validator('a') @classmethod def check_a2(cls, v: List[int]) -> List[Any]: v.append(456) return v assert Model(a=[1, 2]).a == [1, 2, 123, 456] def test_validate_pre_error(): calls = [] class Model(BaseModel): a: List[int] @field_validator('a', mode='before') @classmethod def check_a1(cls, v: Any): calls.append(f'check_a1 {v}') if 1 in v: raise ValueError('a1 broken') v[0] += 1 return v @field_validator('a') @classmethod def check_a2(cls, v: Any): calls.append(f'check_a2 {v}') if 10 in v: raise ValueError('a2 broken') return v assert Model(a=[3, 8]).a == [4, 8] assert calls == ['check_a1 [3, 8]', 'check_a2 [4, 8]'] calls = [] with pytest.raises(ValidationError) as exc_info: Model(a=[1, 3]) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'error': HasRepr(repr(ValueError('a1 broken')))}, 'input': [1, 3], 'loc': ('a',), 'msg': 'Value error, a1 broken', 'type': 'value_error', } ] assert calls == ['check_a1 [1, 3]'] calls = [] with pytest.raises(ValidationError) as exc_info: Model(a=[5, 10]) assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'error': HasRepr(repr(ValueError('a2 broken')))}, 'input': [6, 10], 'loc': ('a',), 'msg': 'Value error, a2 broken', 'type': 'value_error', } ] assert calls == ['check_a1 [5, 10]', 'check_a2 [6, 10]'] @pytest.fixture(scope='session', name='ValidateAssignmentModel') def validate_assignment_model_fixture(): class 
ValidateAssignmentModel(BaseModel): a: int = 4 b: str = ... c: int = 0 @field_validator('b') @classmethod def b_length(cls, v, info): values = info.data if 'a' in values and len(v) < values['a']: raise ValueError('b too short') return v @field_validator('c') @classmethod def double_c(cls, v: Any): return v * 2 model_config = ConfigDict(validate_assignment=True, extra='allow') return ValidateAssignmentModel def test_validating_assignment_ok(ValidateAssignmentModel): p = ValidateAssignmentModel(b='hello') assert p.b == 'hello' def test_validating_assignment_fail(ValidateAssignmentModel): with pytest.raises(ValidationError): ValidateAssignmentModel(a=10, b='hello') p = ValidateAssignmentModel(b='hello') with pytest.raises(ValidationError): p.b = 'x' def test_validating_assignment_value_change(ValidateAssignmentModel): p = ValidateAssignmentModel(b='hello', c=2) assert p.c == 4 p = ValidateAssignmentModel(b='hello') assert p.c == 0 p.c = 3 assert p.c == 6 assert p.model_dump()['c'] == 6 def test_validating_assignment_extra(ValidateAssignmentModel): p = ValidateAssignmentModel(b='hello', extra_field=1.23) assert p.extra_field == 1.23 p = ValidateAssignmentModel(b='hello') p.extra_field = 1.23 assert p.extra_field == 1.23 p.extra_field = 'bye' assert p.extra_field == 'bye' assert p.model_dump()['extra_field'] == 'bye' def test_validating_assignment_dict(ValidateAssignmentModel): with pytest.raises(ValidationError) as exc_info: ValidateAssignmentModel(a='x', b='xx') assert exc_info.value.errors(include_url=False) == [ { 'type': 'int_parsing', 'loc': ('a',), 'msg': 'Input should be a valid integer, unable to parse string as an integer', 'input': 'x', } ] def test_validating_assignment_values_dict(): class ModelOne(BaseModel): a: int class ModelTwo(BaseModel): m: ModelOne b: int @field_validator('b') @classmethod def validate_b(cls, b, info: ValidationInfo): if 'm' in info.data: return b + info.data['m'].a # this fails if info.data['m'] is a dict else: return b model_config 
= ConfigDict(validate_assignment=True) model = ModelTwo(m=ModelOne(a=1), b=2) assert model.b == 3 model.b = 3 assert model.b == 4 def test_validate_multiple(): class Model(BaseModel): a: str b: str @field_validator('a', 'b') @classmethod def check_a_and_b(cls, v: Any, info: ValidationInfo) -> Any: if len(v) < 4: field = cls.model_fields[info.field_name] raise AssertionError(f'{field.alias or info.field_name} is too short') return v + 'x' assert Model(a='1234', b='5678').model_dump() == {'a': '1234x', 'b': '5678x'} with pytest.raises(ValidationError) as exc_info: Model(a='x', b='x') assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'error': HasRepr(repr(AssertionError('a is too short')))}, 'input': 'x', 'loc': ('a',), 'msg': 'Assertion failed, a is too short', 'type': 'assertion_error', }, { 'ctx': {'error': HasRepr(repr(AssertionError('b is too short')))}, 'input': 'x', 'loc': ('b',), 'msg': 'Assertion failed, b is too short', 'type': 'assertion_error', }, ] def test_classmethod(): class Model(BaseModel): a: str @field_validator('a') @classmethod def check_a(cls, v: Any): assert cls is Model return v m = Model(a='this is foobar good') assert m.a == 'this is foobar good' m.check_a('x') def test_use_bare(): with pytest.raises(TypeError, match='`@validator` should be used with fields'): class Model(BaseModel): a: str with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH): @validator def checker(cls, v): return v def test_use_bare_field_validator(): with pytest.raises(TypeError, match='`@field_validator` should be used with fields'): class Model(BaseModel): a: str @field_validator def checker(cls, v): return v def test_use_no_fields(): with pytest.raises(TypeError, match=re.escape("validator() missing 1 required positional argument: '__field'")): class Model(BaseModel): a: str @validator() def checker(cls, v): return v def test_use_no_fields_field_validator(): with pytest.raises(TypeError, match=re.escape("field_validator() 
missing 1 required positional argument: 'field'")): class Model(BaseModel): a: str @field_validator() def checker(cls, v): return v def test_validator_bad_fields_throws_configerror(): """ Attempts to create a validator with fields set as a list of strings, rather than just multiple string args. Expects ConfigError to be raised. """ with pytest.raises(TypeError, match='`@validator` fields should be passed as separate string args.'): class Model(BaseModel): a: str b: str with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH): @validator(['a', 'b']) def check_fields(cls, v): return v def test_field_validator_bad_fields_throws_configerror(): """ Attempts to create a validator with fields set as a list of strings, rather than just multiple string args. Expects ConfigError to be raised. """ with pytest.raises(TypeError, match='`@field_validator` fields should be passed as separate string args.'): class Model(BaseModel): a: str b: str @field_validator(['a', 'b']) def check_fields(cls, v): return v def test_validate_always(): check_calls = 0 class Model(BaseModel): a: str = None with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH): @validator('a', pre=True, always=True) @classmethod def check_a(cls, v: Any): nonlocal check_calls check_calls += 1 return v or 'xxx' assert Model().a == 'xxx' assert check_calls == 1 assert Model(a='y').a == 'y' assert check_calls == 2 def test_field_validator_validate_default(): check_calls = 0 class Model(BaseModel): a: str = Field(None, validate_default=True) @field_validator('a', mode='before') @classmethod def check_a(cls, v: Any): nonlocal check_calls check_calls += 1 return v or 'xxx' assert Model().a == 'xxx' assert check_calls == 1 assert Model(a='y').a == 'y' assert check_calls == 2 def test_validate_always_on_inheritance(): check_calls = 0 class ParentModel(BaseModel): a: str = None class Model(ParentModel): with pytest.warns(PydanticDeprecatedSince20, 
match=V1_VALIDATOR_DEPRECATION_MATCH): @validator('a', pre=True, always=True) @classmethod def check_a(cls, v: Any): nonlocal check_calls check_calls += 1 return v or 'xxx' assert Model().a == 'xxx' assert check_calls == 1 assert Model(a='y').a == 'y' assert check_calls == 2 def test_field_validator_validate_default_on_inheritance(): check_calls = 0 class ParentModel(BaseModel): a: str = Field(None, validate_default=True) class Model(ParentModel): @field_validator('a', mode='before') @classmethod def check_a(cls, v: Any): nonlocal check_calls check_calls += 1 return v or 'xxx' assert Model().a == 'xxx' assert check_calls == 1 assert Model(a='y').a == 'y' assert check_calls == 2 def test_validate_not_always(): check_calls = 0 class Model(BaseModel): a: Optional[str] = None @field_validator('a', mode='before') @classmethod def check_a(cls, v: Any): nonlocal check_calls check_calls += 1 return v or 'xxx' assert Model().a is None assert check_calls == 0 assert Model(a='y').a == 'y' assert check_calls == 1 @pytest.mark.parametrize( 'decorator, pytest_warns', [ (validator, pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH)), (field_validator, contextlib.nullcontext()), ], ) def test_wildcard_validators(decorator, pytest_warns): calls: list[tuple[str, Any]] = [] with pytest_warns: class Model(BaseModel): a: str b: int @decorator('a') def check_a(cls, v: Any) -> Any: calls.append(('check_a', v)) return v @decorator('*') def check_all(cls, v: Any) -> Any: calls.append(('check_all', v)) return v @decorator('*', 'a') def check_all_a(cls, v: Any) -> Any: calls.append(('check_all_a', v)) return v assert Model(a='abc', b='123').model_dump() == dict(a='abc', b=123) assert calls == [ ('check_a', 'abc'), ('check_all', 'abc'), ('check_all_a', 'abc'), ('check_all', 123), ('check_all_a', 123), ] @pytest.mark.parametrize( 'decorator, pytest_warns', [ (validator, pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH)), (field_validator, 
contextlib.nullcontext()), ], ) def test_wildcard_validator_error(decorator, pytest_warns): with pytest_warns: class Model(BaseModel): a: str b: str @decorator('*') def check_all(cls, v: Any) -> Any: if 'foobar' not in v: raise ValueError('"foobar" not found in a') return v assert Model(a='foobar a', b='foobar b').b == 'foobar b' with pytest.raises(ValidationError) as exc_info: Model(a='snap') assert exc_info.value.errors(include_url=False) == [ { 'ctx': {'error': HasRepr(repr(ValueError('"foobar" not found in a')))}, 'input': 'snap', 'loc': ('a',), 'msg': 'Value error, "foobar" not found in a', 'type': 'value_error', }, {'type': 'missing', 'loc': ('b',), 'msg': 'Field required', 'input': {'a': 'snap'}}, ] def test_invalid_field(): msg = ( r'Decorators defined with incorrect fields:' r' tests.test_validators.test_invalid_field..Model:\d+.check_b' r" \(use check_fields=False if you're inheriting from the model and intended this\)" ) with pytest.raises(errors.PydanticUserError, match=msg): class Model(BaseModel): a: str @field_validator('b') def check_b(cls, v: Any): return v def test_validate_child(): class Parent(BaseModel): a: str class Child(Parent): @field_validator('a') @classmethod def check_a(cls, v: Any): if 'foobar' not in v: raise ValueError('"foobar" not found in a') return v assert Parent(a='this is not a child').a == 'this is not a child' assert Child(a='this is foobar good').a == 'this is foobar good' with pytest.raises(ValidationError): Child(a='snap') def test_validate_child_extra(): class Parent(BaseModel): a: str @field_validator('a') @classmethod def check_a_one(cls, v: Any): if 'foobar' not in v: raise ValueError('"foobar" not found in a') return v class Child(Parent): @field_validator('a') @classmethod def check_a_two(cls, v: Any): return v.upper() assert Parent(a='this is foobar good').a == 'this is foobar good' assert Child(a='this is foobar good').a == 'THIS IS FOOBAR GOOD' with pytest.raises(ValidationError): Child(a='snap') def 
test_validate_all(): class MyModel(BaseModel): x: int @field_validator('*') @classmethod def validate_all(cls, v: Any): return v * 2 assert MyModel(x=10).x == 20 def test_validate_child_all(): with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH): class Parent(BaseModel): a: str class Child(Parent): @validator('*') @classmethod def check_a(cls, v: Any): if 'foobar' not in v: raise ValueError('"foobar" not found in a') return v assert Parent(a='this is not a child').a == 'this is not a child' assert Child(a='this is foobar good').a == 'this is foobar good' with pytest.raises(ValidationError): Child(a='snap') class Parent(BaseModel): a: str class Child(Parent): @field_validator('*') @classmethod def check_a(cls, v: Any): if 'foobar' not in v: raise ValueError('"foobar" not found in a') return v assert Parent(a='this is not a child').a == 'this is not a child' assert Child(a='this is foobar good').a == 'this is foobar good' with pytest.raises(ValidationError): Child(a='snap') def test_validate_parent(): class Parent(BaseModel): a: str @field_validator('a') @classmethod def check_a(cls, v: Any): if 'foobar' not in v: raise ValueError('"foobar" not found in a') return v class Child(Parent): pass assert Parent(a='this is foobar good').a == 'this is foobar good' assert Child(a='this is foobar good').a == 'this is foobar good' with pytest.raises(ValidationError): Parent(a='snap') with pytest.raises(ValidationError): Child(a='snap') def test_validate_parent_all(): with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH): class Parent(BaseModel): a: str @validator('*') @classmethod def check_a(cls, v: Any): if 'foobar' not in v: raise ValueError('"foobar" not found in a') return v class Child(Parent): pass assert Parent(a='this is foobar good').a == 'this is foobar good' assert Child(a='this is foobar good').a == 'this is foobar good' with pytest.raises(ValidationError): Parent(a='snap') with 
pytest.raises(ValidationError): Child(a='snap') class Parent(BaseModel): a: str @field_validator('*') @classmethod def check_a(cls, v: Any): if 'foobar' not in v: raise ValueError('"foobar" not found in a') return v class Child(Parent): pass assert Parent(a='this is foobar good').a == 'this is foobar good' assert Child(a='this is foobar good').a == 'this is foobar good' with pytest.raises(ValidationError): Parent(a='snap') with pytest.raises(ValidationError): Child(a='snap') def test_inheritance_keep(): class Parent(BaseModel): a: int @field_validator('a') @classmethod def add_to_a(cls, v: Any): return v + 1 class Child(Parent): pass assert Child(a=0).a == 1 def test_inheritance_replace(): """We promise that if you add a validator with the same _function_ name as an existing validator it replaces the existing validator and is run instead of it. """ class Parent(BaseModel): a: List[str] @field_validator('a') @classmethod def parent_val_before(cls, v: List[str]): v.append('parent before') return v @field_validator('a') @classmethod def val(cls, v: List[str]): v.append('parent') return v @field_validator('a') @classmethod def parent_val_after(cls, v: List[str]): v.append('parent after') return v class Child(Parent): @field_validator('a') @classmethod def child_val_before(cls, v: List[str]): v.append('child before') return v @field_validator('a') @classmethod def val(cls, v: List[str]): v.append('child') return v @field_validator('a') @classmethod def child_val_after(cls, v: List[str]): v.append('child after') return v assert Parent(a=[]).a == ['parent before', 'parent', 'parent after'] assert Child(a=[]).a == ['parent before', 'child', 'parent after', 'child before', 'child after'] def test_inheritance_replace_root_validator(): """ We promise that if you add a validator with the same _function_ name as an existing validator it replaces the existing validator and is run instead of it. 
""" with pytest.warns(PydanticDeprecatedSince20): class Parent(BaseModel): a: List[str] @root_validator(skip_on_failure=True) def parent_val_before(cls, values: Dict[str, Any]): values['a'].append('parent before') return values @root_validator(skip_on_failure=True) def val(cls, values: Dict[str, Any]): values['a'].append('parent') return values @root_validator(skip_on_failure=True) def parent_val_after(cls, values: Dict[str, Any]): values['a'].append('parent after') return values class Child(Parent): @root_validator(skip_on_failure=True) def child_val_before(cls, values: Dict[str, Any]): values['a'].append('child before') return values @root_validator(skip_on_failure=True) def val(cls, values: Dict[str, Any]): values['a'].append('child') return values @root_validator(skip_on_failure=True) def child_val_after(cls, values: Dict[str, Any]): values['a'].append('child after') return values assert Parent(a=[]).a == ['parent before', 'parent', 'parent after'] assert Child(a=[]).a == ['parent before', 'child', 'parent after', 'child before', 'child after'] def test_validation_each_item(): with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH): class Model(BaseModel): foobar: Dict[int, int] @validator('foobar', each_item=True) @classmethod def check_foobar(cls, v: Any): return v + 1 assert Model(foobar={1: 1}).foobar == {1: 2} def test_validation_each_item_invalid_type(): with pytest.raises( TypeError, match=re.escape('@validator(..., each_item=True)` cannot be applied to fields with a schema of int') ): with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH): class Model(BaseModel): foobar: int @validator('foobar', each_item=True) @classmethod def check_foobar(cls, v: Any): ... 
def test_validation_each_item_nullable(): with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH): class Model(BaseModel): foobar: Optional[List[int]] @validator('foobar', each_item=True) @classmethod def check_foobar(cls, v: Any): return v + 1 assert Model(foobar=[1]).foobar == [2] def test_validation_each_item_one_sublevel(): with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH): class Model(BaseModel): foobar: List[Tuple[int, int]] @validator('foobar', each_item=True) @classmethod def check_foobar(cls, v: Tuple[int, int]) -> Tuple[int, int]: v1, v2 = v assert v1 == v2 return v assert Model(foobar=[(1, 1), (2, 2)]).foobar == [(1, 1), (2, 2)] def test_key_validation(): class Model(BaseModel): foobar: Dict[int, int] @field_validator('foobar') @classmethod def check_foobar(cls, value): return {k + 1: v + 1 for k, v in value.items()} assert Model(foobar={1: 1}).foobar == {2: 2} def test_validator_always_optional(): check_calls = 0 class Model(BaseModel): a: Optional[str] = None with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH): @validator('a', pre=True, always=True) @classmethod def check_a(cls, v: Any): nonlocal check_calls check_calls += 1 return v or 'default value' assert Model(a='y').a == 'y' assert check_calls == 1 assert Model().a == 'default value' assert check_calls == 2 def test_field_validator_validate_default_optional(): check_calls = 0 class Model(BaseModel): a: Optional[str] = Field(None, validate_default=True) @field_validator('a', mode='before') @classmethod def check_a(cls, v: Any): nonlocal check_calls check_calls += 1 return v or 'default value' assert Model(a='y').a == 'y' assert check_calls == 1 assert Model().a == 'default value' assert check_calls == 2 def test_validator_always_pre(): check_calls = 0 class Model(BaseModel): a: str = None with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH): @validator('a', pre=True, 
always=True)
            @classmethod
            def check_a(cls, v: Any):
                nonlocal check_calls
                check_calls += 1
                return v or 'default value'

    assert Model(a='y').a == 'y'
    assert Model().a == 'default value'
    assert check_calls == 2


def test_field_validator_validate_default_pre():
    check_calls = 0

    class Model(BaseModel):
        a: str = Field(None, validate_default=True)

        @field_validator('a', mode='before')
        @classmethod
        def check_a(cls, v: Any):
            nonlocal check_calls
            check_calls += 1
            return v or 'default value'

    assert Model(a='y').a == 'y'
    assert Model().a == 'default value'
    assert check_calls == 2


def test_validator_always_post():
    class Model(BaseModel):
        # NOTE: Unlike in v1, you can't replicate the behavior of only applying defined validators and not standard
        # field validation. This is why I've set the default to '' instead of None.
        # But, I think this is a good thing, and I don't think we should try to support this.
        a: str = ''

        with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH):

            @validator('a', always=True)
            @classmethod
            def check_a(cls, v: Any):
                return v or 'default value'

    assert Model(a='y').a == 'y'
    assert Model().a == 'default value'


def test_field_validator_validate_default_post():
    class Model(BaseModel):
        a: str = Field('', validate_default=True)

        @field_validator('a')
        @classmethod
        def check_a(cls, v: Any):
            return v or 'default value'

    assert Model(a='y').a == 'y'
    assert Model().a == 'default value'


def test_validator_always_post_optional():
    class Model(BaseModel):
        a: Optional[str] = None

        with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH):

            @validator('a', pre=True, always=True)
            @classmethod
            def check_a(cls, v: Any):
                return 'default value' if v is None else v

    assert Model(a='y').a == 'y'
    assert Model().a == 'default value'


def test_field_validator_validate_default_post_optional():
    class Model(BaseModel):
        a: Optional[str] = Field(None, validate_default=True)

        @field_validator('a', mode='before')
        @classmethod
        def check_a(cls, v: Any):
            return v or 'default value'

    assert Model(a='y').a == 'y'
    assert Model().a == 'default value'


def test_datetime_validator():
    check_calls = 0

    class Model(BaseModel):
        d: datetime = None

        with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH):

            @validator('d', pre=True, always=True)
            @classmethod
            def check_d(cls, v: Any):
                nonlocal check_calls
                check_calls += 1
                return v or datetime(2032, 1, 1)

    assert Model(d='2023-01-01T00:00:00').d == datetime(2023, 1, 1)
    assert check_calls == 1
    assert Model().d == datetime(2032, 1, 1)
    assert check_calls == 2
    assert Model(d=datetime(2023, 1, 1)).d == datetime(2023, 1, 1)
    assert check_calls == 3


def test_datetime_field_validator():
    check_calls = 0

    class Model(BaseModel):
        d: datetime = Field(None, validate_default=True)

        @field_validator('d', mode='before')
        @classmethod
        def check_d(cls, v: Any):
            nonlocal check_calls
            check_calls += 1
            return v or datetime(2032, 1, 1)

    assert Model(d='2023-01-01T00:00:00').d == datetime(2023, 1, 1)
    assert check_calls == 1
    assert Model().d == datetime(2032, 1, 1)
    assert check_calls == 2
    assert Model(d=datetime(2023, 1, 1)).d == datetime(2023, 1, 1)
    assert check_calls == 3


def test_pre_called_once():
    check_calls = 0

    class Model(BaseModel):
        a: Tuple[int, int, int]

        @field_validator('a', mode='before')
        @classmethod
        def check_a(cls, v: Any):
            nonlocal check_calls
            check_calls += 1
            return v

    assert Model(a=['1', '2', '3']).a == (1, 2, 3)
    assert check_calls == 1


def test_assert_raises_validation_error():
    class Model(BaseModel):
        a: str

        @field_validator('a')
        @classmethod
        def check_a(cls, v: Any):
            if v != 'a':
                raise AssertionError('invalid a')
            return v

    Model(a='a')

    with pytest.raises(ValidationError) as exc_info:
        Model(a='snap')
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'error': HasRepr(repr(AssertionError('invalid a')))},
            'input': 'snap',
            'loc': ('a',),
            'msg': 'Assertion failed, invalid a',
            'type': 'assertion_error',
        }
    ]


def test_root_validator():
    root_val_values: List[Dict[str, Any]] = []

    class Model(BaseModel):
        a: int = 1
        b: str
        c: str

        @field_validator('b')
        @classmethod
        def repeat_b(cls, v: Any):
            return v * 2

        with pytest.warns(PydanticDeprecatedSince20):

            @root_validator(skip_on_failure=True)
            def example_root_validator(cls, values: Dict[str, Any]) -> Dict[str, Any]:
                root_val_values.append(values)
                if 'snap' in values.get('b', ''):
                    raise ValueError('foobar')
                return dict(values, b='changed')

            @root_validator(skip_on_failure=True)
            def example_root_validator2(cls, values: Dict[str, Any]) -> Dict[str, Any]:
                root_val_values.append(values)
                if 'snap' in values.get('c', ''):
                    raise ValueError('foobar2')
                return dict(values, c='changed')

    assert Model(a='123', b='bar', c='baz').model_dump() == {'a': 123, 'b': 'changed', 'c': 'changed'}

    with pytest.raises(ValidationError) as exc_info:
        Model(b='snap dragon', c='snap dragon2')
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'error': HasRepr(repr(ValueError('foobar')))},
            'input': {'b': 'snap dragon', 'c': 'snap dragon2'},
            'loc': (),
            'msg': 'Value error, foobar',
            'type': 'value_error',
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        Model(a='broken', b='bar', c='baz')
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_parsing',
            'loc': ('a',),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'broken',
        }
    ]

    assert root_val_values == [
        {'a': 123, 'b': 'barbar', 'c': 'baz'},
        {'a': 123, 'b': 'changed', 'c': 'baz'},
        {'a': 1, 'b': 'snap dragonsnap dragon', 'c': 'snap dragon2'},
    ]


def test_root_validator_subclass():
    """
    https://github.com/pydantic/pydantic/issues/5388
    """

    class Parent(BaseModel):
        x: int
        expected: Any

        with pytest.warns(PydanticDeprecatedSince20):

            @root_validator(skip_on_failure=True)
            @classmethod
            def root_val(cls, values: Dict[str, Any]) -> Dict[str, Any]:
                assert cls is values['expected']
                return values

    class Child1(Parent):
        pass

    class Child2(Parent):
        with pytest.warns(PydanticDeprecatedSince20):

            @root_validator(skip_on_failure=True)
            @classmethod
            def root_val(cls, values: Dict[str, Any]) -> Dict[str, Any]:
                assert cls is Child2
                values['x'] = values['x'] * 2
                return values

    class Child3(Parent):
        @classmethod
        def root_val(cls, values: Dict[str, Any]) -> Dict[str, Any]:
            assert cls is Child3
            values['x'] = values['x'] * 3
            return values

    assert Parent(x=1, expected=Parent).x == 1
    assert Child1(x=1, expected=Child1).x == 1
    assert Child2(x=1, expected=Child2).x == 2
    assert Child3(x=1, expected=Child3).x == 3


def test_root_validator_pre():
    root_val_values: List[Dict[str, Any]] = []

    class Model(BaseModel):
        a: int = 1
        b: str

        @field_validator('b')
        @classmethod
        def repeat_b(cls, v: Any):
            return v * 2

        with pytest.warns(PydanticDeprecatedSince20):

            @root_validator(pre=True)
            def root_validator(cls, values: Dict[str, Any]) -> Dict[str, Any]:
                root_val_values.append(values)
                if 'snap' in values.get('b', ''):
                    raise ValueError('foobar')
                return {'a': 42, 'b': 'changed'}

    assert Model(a='123', b='bar').model_dump() == {'a': 42, 'b': 'changedchanged'}

    with pytest.raises(ValidationError) as exc_info:
        Model(b='snap dragon')
    assert root_val_values == [{'a': '123', 'b': 'bar'}, {'b': 'snap dragon'}]
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'error': HasRepr(repr(ValueError('foobar')))},
            'input': {'b': 'snap dragon'},
            'loc': (),
            'msg': 'Value error, foobar',
            'type': 'value_error',
        }
    ]


def test_root_validator_types():
    root_val_values = None

    class Model(BaseModel):
        a: int = 1
        b: str

        with pytest.warns(PydanticDeprecatedSince20):

            @root_validator(skip_on_failure=True)
            def root_validator(cls, values: Dict[str, Any]) -> Dict[str, Any]:
                nonlocal root_val_values
                root_val_values = cls, repr(values)
                return values

        model_config = ConfigDict(extra='allow')

    assert Model(b='bar', c='wobble').model_dump() == {'a': 1, 'b': 'bar', 'c': 'wobble'}

    assert root_val_values == (Model, "{'a': 1, 'b': 'bar', 'c': 'wobble'}")


def test_root_validator_returns_none_exception():
    class Model(BaseModel):
        a: int = 1

        with pytest.warns(PydanticDeprecatedSince20):

            @root_validator(skip_on_failure=True)
            def root_validator_repeated(cls, values):
                return None

    with pytest.raises(
        TypeError,
        match=r"(:?__dict__ must be set to a dictionary, not a 'NoneType')|(:?setting dictionary to a non-dict)",
    ):
        Model()


def test_model_validator_returns_ignore():
    class Model(BaseModel):
        a: int = 1

        @model_validator(mode='after')  # type: ignore
        def model_validator_return_none(self) -> None:
            return None

    with pytest.warns(UserWarning, match='A custom validator is returning a value other than `self`'):
        m = Model(a=2)
    assert m.model_dump() == {'a': 2}


def reusable_validator(num: int) -> int:
    return num * 2


def test_reuse_global_validators():
    class Model(BaseModel):
        x: int
        y: int

        double_x = field_validator('x')(reusable_validator)
        double_y = field_validator('y')(reusable_validator)

    assert dict(Model(x=1, y=1)) == {'x': 2, 'y': 2}


@pytest.mark.parametrize('validator_classmethod,root_validator_classmethod', product(*[[True, False]] * 2))
def test_root_validator_classmethod(validator_classmethod, root_validator_classmethod):
    root_val_values = []

    class Model(BaseModel):
        a: int = 1
        b: str

        def repeat_b(cls, v: Any):
            return v * 2

        if validator_classmethod:
            repeat_b = classmethod(repeat_b)
        repeat_b = field_validator('b')(repeat_b)

        def example_root_validator(cls, values):
            root_val_values.append(values)
            if 'snap' in values.get('b', ''):
                raise ValueError('foobar')
            return dict(values, b='changed')

        if root_validator_classmethod:
            example_root_validator = classmethod(example_root_validator)
        with pytest.warns(PydanticDeprecatedSince20):
            example_root_validator = root_validator(skip_on_failure=True)(example_root_validator)

    assert Model(a='123', b='bar').model_dump() == {'a': 123, 'b': 'changed'}

    with pytest.raises(ValidationError) as exc_info:
        Model(b='snap dragon')
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'error': HasRepr(repr(ValueError('foobar')))},
            'input': {'b': 'snap dragon'},
            'loc': (),
            'msg': 'Value error, foobar',
            'type': 'value_error',
        }
    ]

    with pytest.raises(ValidationError) as exc_info:
        Model(a='broken', b='bar')
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'int_parsing',
            'loc': ('a',),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'broken',
        }
    ]

    assert root_val_values == [{'a': 123, 'b': 'barbar'}, {'a': 1, 'b': 'snap dragonsnap dragon'}]


def test_assignment_validator_cls():
    validator_calls = 0

    class Model(BaseModel):
        name: str

        model_config = ConfigDict(validate_assignment=True)

        @field_validator('name')
        @classmethod
        def check_foo(cls, value):
            nonlocal validator_calls
            validator_calls += 1
            assert cls == Model
            return value

    m = Model(name='hello')
    m.name = 'goodbye'
    assert validator_calls == 2


def test_literal_validator():
    class Model(BaseModel):
        a: Literal['foo']

    Model(a='foo')

    with pytest.raises(ValidationError) as exc_info:
        Model(a='nope')
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'literal_error',
            'loc': ('a',),
            'msg': "Input should be 'foo'",
            'input': 'nope',
            'ctx': {'expected': "'foo'"},
        }
    ]


def test_literal_validator_str_enum():
    class Bar(str, Enum):
        FIZ = 'fiz'
        FUZ = 'fuz'

    class Foo(BaseModel):
        bar: Bar
        barfiz: Literal[Bar.FIZ]
        fizfuz: Literal[Bar.FIZ, Bar.FUZ]

    my_foo = Foo.model_validate({'bar': 'fiz', 'barfiz': 'fiz', 'fizfuz': 'fiz'})
    assert my_foo.bar is Bar.FIZ
    assert my_foo.barfiz is Bar.FIZ
    assert my_foo.fizfuz is Bar.FIZ

    my_foo = Foo.model_validate({'bar': 'fiz', 'barfiz': 'fiz', 'fizfuz': 'fuz'})
    assert my_foo.bar is Bar.FIZ
    assert my_foo.barfiz is Bar.FIZ
    assert my_foo.fizfuz is Bar.FUZ


def test_nested_literal_validator():
    L1 = Literal['foo']
    L2 = Literal['bar']

    class Model(BaseModel):
        a: Literal[L1, L2]

    Model(a='foo')

    with pytest.raises(ValidationError) as exc_info:
        Model(a='nope')
    assert exc_info.value.errors(include_url=False) == [
        {
            'type': 'literal_error',
            'loc': ('a',),
            'msg': "Input should be 'foo' or 'bar'",
            'input': 'nope',
            'ctx': {'expected': "'foo' or 'bar'"},
        }
    ]


def test_union_literal_with_constraints():
    class Model(BaseModel, validate_assignment=True):
        x: Union[Literal[42], Literal['pika']] = Field(frozen=True)

    m = Model(x=42)
    with pytest.raises(ValidationError) as exc_info:
        m.x += 1
    assert exc_info.value.errors(include_url=False) == [
        {'input': 43, 'loc': ('x',), 'msg': 'Field is frozen', 'type': 'frozen_field'}
    ]


def test_field_that_is_being_validated_is_excluded_from_validator_values():
    check_values = MagicMock()

    class Model(BaseModel):
        foo: str
        bar: str = Field(alias='pika')
        baz: str

        model_config = ConfigDict(validate_assignment=True)

        @field_validator('foo')
        @classmethod
        def validate_foo(cls, v: Any, info: ValidationInfo) -> Any:
            check_values({**info.data})
            return v

        @field_validator('bar')
        @classmethod
        def validate_bar(cls, v: Any, info: ValidationInfo) -> Any:
            check_values({**info.data})
            return v

    model = Model(foo='foo_value', pika='bar_value', baz='baz_value')
    check_values.reset_mock()

    assert list(dict(model).items()) == [('foo', 'foo_value'), ('bar', 'bar_value'), ('baz', 'baz_value')]

    model.foo = 'new_foo_value'
    check_values.assert_called_once_with({'bar': 'bar_value', 'baz': 'baz_value'})
    check_values.reset_mock()

    model.bar = 'new_bar_value'
    check_values.assert_called_once_with({'foo': 'new_foo_value', 'baz': 'baz_value'})

    # ensure field order is the same
    assert list(dict(model).items()) == [('foo', 'new_foo_value'), ('bar', 'new_bar_value'), ('baz', 'baz_value')]


def test_exceptions_in_field_validators_restore_original_field_value():
    class Model(BaseModel):
        foo: str

        model_config = ConfigDict(validate_assignment=True)

        @field_validator('foo')
        @classmethod
        def validate_foo(cls, v: Any):
            if v == 'raise_exception':
                raise RuntimeError('test error')
            return v

    model = Model(foo='foo')
    with pytest.raises(RuntimeError, match='test error'):
        model.foo = 'raise_exception'
    assert model.foo == 'foo'


def test_overridden_root_validators():
    validate_stub = MagicMock()

    class A(BaseModel):
        x: str

        @model_validator(mode='before')
        @classmethod
        def pre_root(cls, values: Dict[str, Any]) -> Dict[str, Any]:
            validate_stub('A', 'pre')
            return values

        @model_validator(mode='after')
        def post_root(self) -> 'A':
            validate_stub('A', 'post')
            return self

    class B(A):
        @model_validator(mode='before')
        @classmethod
        def pre_root(cls, values: Dict[str, Any]) -> Dict[str, Any]:
            validate_stub('B', 'pre')
            return values

        @model_validator(mode='after')
        def post_root(self) -> 'B':
            validate_stub('B', 'post')
            return self

    A(x='pika')
    assert validate_stub.call_args_list == [[('A', 'pre'), {}], [('A', 'post'), {}]]

    validate_stub.reset_mock()

    B(x='pika')
    assert validate_stub.call_args_list == [[('B', 'pre'), {}], [('B', 'post'), {}]]


def test_validating_assignment_pre_root_validator_fail():
    class Model(BaseModel):
        current_value: float = Field(alias='current')
        max_value: float

        model_config = ConfigDict(validate_assignment=True)

        with pytest.warns(PydanticDeprecatedSince20):

            @root_validator(pre=True)
            def values_are_not_string(cls, values: Dict[str, Any]) -> Dict[str, Any]:
                if any(isinstance(x, str) for x in values.values()):
                    raise ValueError('values cannot be a string')
                return values

    m = Model(current=100, max_value=200)
    with pytest.raises(ValidationError) as exc_info:
        m.current_value = '100'
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'error': HasRepr(repr(ValueError('values cannot be a string')))},
            'input': {'current_value': '100', 'max_value': 200.0},
            'loc': (),
            'msg': 'Value error, values cannot be a string',
            'type': 'value_error',
        }
    ]


def test_validating_assignment_model_validator_before_fail():
    class Model(BaseModel):
        current_value: float = Field(alias='current')
        max_value: float

        model_config = ConfigDict(validate_assignment=True)

        @model_validator(mode='before')
        @classmethod
        def values_are_not_string(cls, values: Dict[str, Any]) -> Dict[str, Any]:
            assert isinstance(values, dict)
            if any(isinstance(x, str) for x in values.values()):
                raise ValueError('values cannot be a string')
            return values

    m = Model(current=100, max_value=200)
    with pytest.raises(ValidationError) as exc_info:
        m.current_value = '100'
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'error': HasRepr(repr(ValueError('values cannot be a string')))},
            'input': {'current_value': '100', 'max_value': 200.0},
            'loc': (),
            'msg': 'Value error, values cannot be a string',
            'type': 'value_error',
        }
    ]


@pytest.mark.parametrize(
    'kwargs',
    [
        {'skip_on_failure': False},
        {'skip_on_failure': False, 'pre': False},
        {'pre': False},
    ],
)
def test_root_validator_skip_on_failure_invalid(kwargs: Dict[str, Any]):
    with pytest.raises(TypeError, match='MUST specify `skip_on_failure=True`'):
        with pytest.warns(
            PydanticDeprecatedSince20, match='Pydantic V1 style `@root_validator` validators are deprecated.'
        ):

            class Model(BaseModel):
                @root_validator(**kwargs)
                def root_val(cls, values: Dict[str, Any]) -> Dict[str, Any]:
                    return values


@pytest.mark.parametrize(
    'kwargs',
    [
        {'skip_on_failure': True},
        {'skip_on_failure': True, 'pre': False},
        {'skip_on_failure': False, 'pre': True},
        {'pre': True},
    ],
)
def test_root_validator_skip_on_failure_valid(kwargs: Dict[str, Any]):
    with pytest.warns(
        PydanticDeprecatedSince20, match='Pydantic V1 style `@root_validator` validators are deprecated.'
    ):

        class Model(BaseModel):
            @root_validator(**kwargs)
            def root_val(cls, values: Dict[str, Any]) -> Dict[str, Any]:
                return values


def test_model_validator_many_values_change():
    """It should run root_validator on assignment and update ALL concerned fields"""

    class Rectangle(BaseModel):
        width: float
        height: float
        area: Optional[float] = None

        model_config = ConfigDict(validate_assignment=True)

        @model_validator(mode='after')
        def set_area(self) -> 'Rectangle':
            self.__dict__['area'] = self.width * self.height
            return self

    r = Rectangle(width=1, height=1)
    assert r.area == 1
    r.height = 5
    assert r.area == 5


def _get_source_line(filename: str, lineno: int) -> str:
    with open(filename) as f:
        for _ in range(lineno - 1):
            f.readline()
        return f.readline()


def test_v1_validator_deprecated():
    with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH) as w:

        class Point(BaseModel):
            y: int
            x: int

            @validator('x')
            @classmethod
            def check_x(cls, x: int, values: Dict[str, Any]) -> int:
                assert x * 2 == values['y']
                return x

    assert Point(x=1, y=2).model_dump() == {'x': 1, 'y': 2}

    warnings = w.list
    assert len(warnings) == 1
    w = warnings[0]

    # check that we got stacklevel correct
    # if this fails you need to edit the stacklevel
    # parameter to warnings.warn in _decorators.py
    assert normcase(w.filename) == normcase(__file__)
    source = _get_source_line(w.filename, w.lineno)
    assert "@validator('x')" in source


def test_info_field_name_data_before():
    """
    Test accessing info.field_name and info.data
    We only test the `before` validator because they all share the same implementation.
    """

    class Model(BaseModel):
        a: str
        b: str

        @field_validator('b', mode='before')
        @classmethod
        def check_a(cls, v: Any, info: ValidationInfo) -> Any:
            assert v == b'but my barbaz is better'
            assert info.field_name == 'b'
            assert info.data == {'a': 'your foobar is good'}
            return 'just kidding!'

    assert Model(a=b'your foobar is good', b=b'but my barbaz is better').b == 'just kidding!'

def test_decorator_proxy():
    """
    Test that our validator decorator allows calling the wrapped methods/functions.
    """

    def val(v: int) -> int:
        return v + 1

    class Model(BaseModel):
        x: int

        @field_validator('x')
        @staticmethod
        def val1(v: int) -> int:
            return v + 1

        @field_validator('x')
        @classmethod
        def val2(cls, v: int) -> int:
            return v + 1

        val3 = field_validator('x')(val)

    assert Model.val1(1) == 2
    assert Model.val2(1) == 2
    assert Model.val3(1) == 2


def test_root_validator_self():
    with pytest.raises(TypeError, match=r'`@root_validator` cannot be applied to instance methods'):
        with pytest.warns(PydanticDeprecatedSince20):

            class Model(BaseModel):
                a: int = 1

                @root_validator(skip_on_failure=True)
                def root_validator(self, values: Any) -> Any:
                    return values


def test_validator_self():
    with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH):
        with pytest.raises(TypeError, match=r'`@validator` cannot be applied to instance methods'):

            class Model(BaseModel):
                a: int = 1

                @validator('a')
                def check_a(self, values: Any) -> Any:
                    return values


def test_field_validator_self():
    with pytest.raises(TypeError, match=r'`@field_validator` cannot be applied to instance methods'):

        class Model(BaseModel):
            a: int = 1

            @field_validator('a')
            def check_a(self, values: Any) -> Any:
                return values


def test_v1_validator_signature_kwargs_not_allowed() -> None:
    with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH):
        with pytest.raises(TypeError, match=r'Unsupported signature for V1 style validator'):

            class Model(BaseModel):
                a: int

                @validator('a')
                def check_a(cls, value: Any, foo: Any) -> Any: ...


def test_v1_validator_signature_kwargs1() -> None:
    with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH):

        class Model(BaseModel):
            a: int
            b: int

            @validator('b')
            def check_b(cls, value: Any, **kwargs: Any) -> Any:
                assert kwargs == {'values': {'a': 1}}
                assert value == 2
                return value + 1

    assert Model(a=1, b=2).model_dump() == {'a': 1, 'b': 3}


def test_v1_validator_signature_kwargs2() -> None:
    with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH):

        class Model(BaseModel):
            a: int
            b: int

            @validator('b')
            def check_b(cls, value: Any, values: Dict[str, Any], **kwargs: Any) -> Any:
                assert kwargs == {}
                assert values == {'a': 1}
                assert value == 2
                return value + 1

    assert Model(a=1, b=2).model_dump() == {'a': 1, 'b': 3}


def test_v1_validator_signature_with_values() -> None:
    with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH):

        class Model(BaseModel):
            a: int
            b: int

            @validator('b')
            def check_b(cls, value: Any, values: Dict[str, Any]) -> Any:
                assert values == {'a': 1}
                assert value == 2
                return value + 1

    assert Model(a=1, b=2).model_dump() == {'a': 1, 'b': 3}


def test_v1_validator_signature_with_values_kw_only() -> None:
    with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH):

        class Model(BaseModel):
            a: int
            b: int

            @validator('b')
            def check_b(cls, value: Any, *, values: Dict[str, Any]) -> Any:
                assert values == {'a': 1}
                assert value == 2
                return value + 1

    assert Model(a=1, b=2).model_dump() == {'a': 1, 'b': 3}


def test_v1_validator_signature_with_field() -> None:
    with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH):
        with pytest.raises(TypeError, match=r'The `field` and `config` parameters are not available in Pydantic V2'):

            class Model(BaseModel):
                a: int
                b: int

                @validator('b')
                def check_b(cls, value: Any, field: Any) -> Any: ...


def test_v1_validator_signature_with_config() -> None:
    with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH):
        with pytest.raises(TypeError, match=r'The `field` and `config` parameters are not available in Pydantic V2'):

            class Model(BaseModel):
                a: int
                b: int

                @validator('b')
                def check_b(cls, value: Any, config: Any) -> Any: ...


def test_model_config_validate_default():
    class Model(BaseModel):
        x: int = -1

        @field_validator('x')
        @classmethod
        def force_x_positive(cls, v):
            assert v > 0
            return v

    assert Model().x == -1

    class ValidatingModel(Model):
        model_config = ConfigDict(validate_default=True)

    with pytest.raises(ValidationError) as exc_info:
        ValidatingModel()
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'error': HasRepr(repr(AssertionError('assert -1 > 0')))},
            'input': -1,
            'loc': ('x',),
            'msg': 'Assertion failed, assert -1 > 0',
            'type': 'assertion_error',
        }
    ]


def partial_val_func1(
    value: int,
    allowed: int,
) -> int:
    assert value == allowed
    return value


def partial_val_func2(
    value: int,
    *,
    allowed: int,
) -> int:
    assert value == allowed
    return value


def partial_values_val_func1(
    value: int,
    values: Dict[str, Any],
    *,
    allowed: int,
) -> int:
    assert isinstance(values, dict)
    assert value == allowed
    return value


def partial_values_val_func2(
    value: int,
    *,
    values: Dict[str, Any],
    allowed: int,
) -> int:
    assert isinstance(values, dict)
    assert value == allowed
    return value


def partial_info_val_func(
    value: int,
    info: ValidationInfo,
    *,
    allowed: int,
) -> int:
    assert isinstance(info.data, dict)
    assert value == allowed
    return value


def partial_cls_val_func1(
    cls: Any,
    value: int,
    allowed: int,
    expected_cls: Any,
) -> int:
    assert cls.__name__ == expected_cls
    assert value == allowed
    return value


def partial_cls_val_func2(
    cls: Any,
    value: int,
    *,
    allowed: int,
    expected_cls: Any,
) -> int:
    assert cls.__name__ == expected_cls
    assert value == allowed
    return value


def partial_cls_values_val_func1(
    cls: Any,
    value: int,
    values: Dict[str, Any],
    *,
    allowed: int,
    expected_cls: Any,
) -> int:
    assert cls.__name__ == expected_cls
    assert isinstance(values, dict)
    assert value == allowed
    return value


def partial_cls_values_val_func2(
    cls: Any,
    value: int,
    *,
    values: Dict[str, Any],
    allowed: int,
    expected_cls: Any,
) -> int:
    assert cls.__name__ == expected_cls
    assert isinstance(values, dict)
    assert value == allowed
    return value


def partial_cls_info_val_func(
    cls: Any,
    value: int,
    info: ValidationInfo,
    *,
    allowed: int,
    expected_cls: Any,
) -> int:
    assert cls.__name__ == expected_cls
    assert isinstance(info.data, dict)
    assert value == allowed
    return value


@pytest.mark.parametrize(
    'func',
    [
        partial_val_func1,
        partial_val_func2,
        partial_info_val_func,
    ],
)
def test_functools_partial_validator_v2(
    func: Callable[..., Any],
) -> None:
    class Model(BaseModel):
        x: int

        val = field_validator('x')(partial(func, allowed=42))

    Model(x=42)
    with pytest.raises(ValidationError):
        Model(x=123)


@pytest.mark.parametrize(
    'func',
    [
        partial_val_func1,
        partial_val_func2,
        partial_info_val_func,
    ],
)
def test_functools_partialmethod_validator_v2(
    func: Callable[..., Any],
) -> None:
    class Model(BaseModel):
        x: int

        val = field_validator('x')(partialmethod(func, allowed=42))

    Model(x=42)
    with pytest.raises(ValidationError):
        Model(x=123)


@pytest.mark.parametrize(
    'func',
    [
        partial_cls_val_func1,
        partial_cls_val_func2,
        partial_cls_info_val_func,
    ],
)
def test_functools_partialmethod_validator_v2_cls_method(
    func: Callable[..., Any],
) -> None:
    class Model(BaseModel):
        x: int

        # note that you _have_ to wrap your function with classmethod
        # it's partialmethod not us that requires it
        # otherwise it creates a bound instance method
        val = field_validator('x')(partialmethod(classmethod(func), allowed=42, expected_cls='Model'))

    Model(x=42)
    with pytest.raises(ValidationError):
        Model(x=123)


@pytest.mark.parametrize(
    'func',
    [
        partial_val_func1,
        partial_val_func2,
        partial_values_val_func1,
        partial_values_val_func2,
    ],
)
def test_functools_partial_validator_v1(
    func: Callable[..., Any],
) -> None:
    with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH):

        class Model(BaseModel):
            x: int

            val = validator('x')(partial(func, allowed=42))

    Model(x=42)
    with pytest.raises(ValidationError):
        Model(x=123)


@pytest.mark.parametrize(
    'func',
    [
        partial_val_func1,
        partial_val_func2,
        partial_values_val_func1,
        partial_values_val_func2,
    ],
)
def test_functools_partialmethod_validator_v1(
    func: Callable[..., Any],
) -> None:
    with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH):

        class Model(BaseModel):
            x: int

            val = validator('x')(partialmethod(func, allowed=42))

    Model(x=42)
    with pytest.raises(ValidationError):
        Model(x=123)


@pytest.mark.parametrize(
    'func',
    [
        partial_cls_val_func1,
        partial_cls_val_func2,
        partial_cls_values_val_func1,
        partial_cls_values_val_func2,
    ],
)
def test_functools_partialmethod_validator_v1_cls_method(
    func: Callable[..., Any],
) -> None:
    with pytest.warns(PydanticDeprecatedSince20, match=V1_VALIDATOR_DEPRECATION_MATCH):

        class Model(BaseModel):
            x: int

            # note that you _have_ to wrap your function with classmethod
            # it's partialmethod not us that requires it
            # otherwise it creates a bound instance method
            val = validator('x')(partialmethod(classmethod(func), allowed=42, expected_cls='Model'))

    Model(x=42)
    with pytest.raises(ValidationError):
        Model(x=123)


def test_validator_allow_reuse_inheritance():
    class Parent(BaseModel):
        x: int

        @field_validator('x')
        def val(cls, v: int) -> int:
            return v + 1

    class Child(Parent):
        @field_validator('x')
        def val(cls, v: int) -> int:
            assert v == 1
            v = super().val(v)
            assert v == 2
            return 4

    assert Parent(x=1).model_dump() == {'x': 2}
    assert Child(x=1).model_dump() == {'x': 4}


def test_validator_allow_reuse_same_field():
    with pytest.warns(UserWarning, match='`val_x` overrides an existing Pydantic `@field_validator` decorator'):

        class Model(BaseModel):
            x: int

            @field_validator('x')
            def val_x(cls, v: int) -> int:
                return v + 1

            @field_validator('x')
            def val_x(cls, v: int) -> int:
                return v + 2

    assert Model(x=1).model_dump() == {'x': 3}


def test_validator_allow_reuse_different_field_1():
    with pytest.warns(UserWarning, match='`val` overrides an existing Pydantic `@field_validator` decorator'):

        class Model(BaseModel):
            x: int
            y: int

            @field_validator('x')
            def val(cls, v: int) -> int:
                return v + 1

            @field_validator('y')
            def val(cls, v: int) -> int:
                return v + 2

    assert Model(x=1, y=2).model_dump() == {'x': 1, 'y': 4}


def test_validator_allow_reuse_different_field_2():
    with pytest.warns(UserWarning, match='`val_x` overrides an existing Pydantic `@field_validator` decorator'):

        def val(cls: Any, v: int) -> int:
            return v + 2

        class Model(BaseModel):
            x: int
            y: int

            @field_validator('x')
            def val_x(cls, v: int) -> int:
                return v + 1

            val_x = field_validator('y')(val)

    assert Model(x=1, y=2).model_dump() == {'x': 1, 'y': 4}


def test_validator_allow_reuse_different_field_3():
    with pytest.warns(UserWarning, match='`val_x` overrides an existing Pydantic `@field_validator` decorator'):

        def val1(v: int) -> int:
            return v + 1

        def val2(v: int) -> int:
            return v + 2

        class Model(BaseModel):
            x: int
            y: int

            val_x = field_validator('x')(val1)
            val_x = field_validator('y')(val2)

    assert Model(x=1, y=2).model_dump() == {'x': 1, 'y': 4}


def test_validator_allow_reuse_different_field_4():
    def val(v: int) -> int:
        return v + 1

    class Model(BaseModel):
        x: int
        y: int

        val_x = field_validator('x')(val)
        not_val_x = field_validator('y')(val)

    assert Model(x=1, y=2).model_dump() == {'x': 2, 'y': 3}


@pytest.mark.filterwarnings(
    'ignore:Pydantic V1 style `@root_validator` validators are deprecated.*:pydantic.warnings.PydanticDeprecatedSince20'
)
def test_root_validator_allow_reuse_same_field():
    with pytest.warns(UserWarning, match='`root_val` overrides an existing Pydantic `@root_validator` decorator'):

        class Model(BaseModel):
            x: int

            @root_validator(skip_on_failure=True)
            def root_val(cls, v: Dict[str, Any]) -> Dict[str, Any]:
                v['x'] += 1
                return v

            @root_validator(skip_on_failure=True)
            def root_val(cls, v: Dict[str, Any]) -> Dict[str, Any]:
                v['x'] += 2
                return v

    assert Model(x=1).model_dump() == {'x': 3}


def test_root_validator_allow_reuse_inheritance():
    with pytest.warns(PydanticDeprecatedSince20):

        class Parent(BaseModel):
            x: int

            @root_validator(skip_on_failure=True)
            def root_val(cls, v: Dict[str, Any]) -> Dict[str, Any]:
                v['x'] += 1
                return v

    with pytest.warns(PydanticDeprecatedSince20):

        class Child(Parent):
            @root_validator(skip_on_failure=True)
            def root_val(cls, v: Dict[str, Any]) -> Dict[str, Any]:
                assert v == {'x': 1}
                v = super().root_val(v)
                assert v == {'x': 2}
                return {'x': 4}

    assert Parent(x=1).model_dump() == {'x': 2}
    assert Child(x=1).model_dump() == {'x': 4}


def test_bare_root_validator():
    with pytest.raises(
        PydanticUserError,
        match=re.escape(
            'If you use `@root_validator` with pre=False (the default) you MUST specify `skip_on_failure=True`.'
            ' Note that `@root_validator` is deprecated and should be replaced with `@model_validator`.'
        ),
    ):
        with pytest.warns(
            PydanticDeprecatedSince20, match='Pydantic V1 style `@root_validator` validators are deprecated.'
        ):

            class Model(BaseModel):
                @root_validator
                @classmethod
                def validate_values(cls, values):
                    return values


def test_validator_with_underscore_name() -> None:
    """
    https://github.com/pydantic/pydantic/issues/5252
    """

    def f(name: str) -> str:
        return name.lower()

    class Model(BaseModel):
        name: str
        _normalize_name = field_validator('name')(f)

    assert Model(name='Adrian').name == 'adrian'


@pytest.mark.parametrize(
    'mode,config,input_str',
    (
        ('before', {}, "type=value_error, input_value='123', input_type=str"),
        ('before', {'hide_input_in_errors': False}, "type=value_error, input_value='123', input_type=str"),
        ('before', {'hide_input_in_errors': True}, 'type=value_error'),
        ('after', {}, "type=value_error, input_value='123', input_type=str"),
        ('after', {'hide_input_in_errors': False}, "type=value_error, input_value='123', input_type=str"),
        ('after', {'hide_input_in_errors': True}, 'type=value_error'),
        ('plain', {}, "type=value_error, input_value='123', input_type=str"),
        ('plain', {'hide_input_in_errors': False}, "type=value_error, input_value='123', input_type=str"),
        ('plain', {'hide_input_in_errors': True}, 'type=value_error'),
    ),
)
def test_validator_function_error_hide_input(mode, config, input_str):
    class Model(BaseModel):
        x: str
        model_config = ConfigDict(**config)

        @field_validator('x', mode=mode)
        @classmethod
        def check_a1(cls, v: str) -> str:
            raise ValueError('foo')

    with pytest.raises(ValidationError, match=re.escape(f'Value error, foo [{input_str}]')):
        Model(x='123')


def foobar_validate(value: Any, info: core_schema.ValidationInfo):
    data = info.data
    if isinstance(data, dict):
        data = data.copy()
    return {'value': value, 'field_name': info.field_name, 'data': data}


class Foobar:
    @classmethod
    def __get_pydantic_core_schema__(cls, source_type: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema:
        return core_schema.with_info_plain_validator_function(foobar_validate, field_name=handler.field_name)


def test_custom_type_field_name_model():
    class MyModel(BaseModel):
        foobar: Foobar

    m = MyModel(foobar=1, tuple_nesting=(1, 2))
    # insert_assert(m.foobar)
    assert m.foobar == {'value': 1, 'field_name': 'foobar', 'data': {}}


def test_custom_type_field_name_model_nested():
    class MyModel(BaseModel):
        x: int
        tuple_nested: Tuple[int, Foobar]

    m = MyModel(x='123', tuple_nested=(1, 2))
    # insert_assert(m.tuple_nested[1])
    assert m.tuple_nested[1] == {'value': 2, 'field_name': 'tuple_nested', 'data': {'x': 123}}


def test_custom_type_field_name_typed_dict():
    class MyDict(TypedDict):
        x: int
        foobar: Foobar

    ta = TypeAdapter(MyDict)
    m = ta.validate_python({'x': '123', 'foobar': 1})
    # insert_assert(m['foobar'])
    assert m['foobar'] == {'value': 1, 'field_name': 'foobar', 'data': {'x': 123}}


def test_custom_type_field_name_dataclass():
    @dataclass
    class MyDc:
        x: int
        foobar: Foobar

    ta = TypeAdapter(MyDc)
    m = ta.validate_python({'x': '123', 'foobar': 1})
    # insert_assert(m.foobar)
    assert m.foobar == {'value': 1, 'field_name': 'foobar', 'data': {'x': 123}}


def test_custom_type_field_name_named_tuple():
    class MyNamedTuple(NamedTuple):
        x: int
        foobar: Foobar

    ta = TypeAdapter(MyNamedTuple)
    m = ta.validate_python({'x': '123', 'foobar': 1})
    # insert_assert(m.foobar)
    assert m.foobar == {'value': 1, 'field_name': 'foobar', 'data': None}


def test_custom_type_field_name_validate_call():
    @validate_call
    def foobar(x: int, y: Foobar):
        return x, y

    # insert_assert(foobar(1, 2))
    assert foobar(1, 2) == (1, {'value': 2, 'field_name': 'y', 'data': None})


def test_after_validator_field_name():
    class MyModel(BaseModel):
        x: int
        foobar: Annotated[int, AfterValidator(foobar_validate)]

    m = MyModel(x='123', foobar='1')
    # insert_assert(m.foobar)
    assert m.foobar == {'value': 1, 'field_name': 'foobar', 'data': {'x': 123}}


def test_before_validator_field_name():
    class MyModel(BaseModel):
        x: int
        foobar: Annotated[Dict[Any, Any], BeforeValidator(foobar_validate)]

    m = MyModel(x='123', foobar='1')
    # insert_assert(m.foobar)
    assert m.foobar == {'value': '1', 'field_name': 'foobar', 'data': {'x': 123}}


def test_plain_validator_field_name():
    class MyModel(BaseModel):
        x: int
        foobar: Annotated[int, PlainValidator(foobar_validate)]

    m = MyModel(x='123', foobar='1')
    # insert_assert(m.foobar)
    assert m.foobar == {'value': '1', 'field_name': 'foobar', 'data': {'x': 123}}


def validate_wrap(value: Any, handler: core_schema.ValidatorFunctionWrapHandler, info: core_schema.ValidationInfo):
    data = info.data
    if isinstance(data, dict):
        data = data.copy()
    return {'value': handler(value), 'field_name': info.field_name, 'data': data}


def test_wrap_validator_field_name():
    class MyModel(BaseModel):
        x: int
        foobar: Annotated[int, WrapValidator(validate_wrap)]

    m = MyModel(x='123', foobar='1')
    # insert_assert(m.foobar)
    assert m.foobar == {'value': 1, 'field_name': 'foobar', 'data': {'x': 123}}


def test_validate_default_raises_for_basemodel() -> None:
    class Model(BaseModel):
        value_0: str
        value_a: Annotated[Optional[str], Field(None, validate_default=True)]
        value_b: Annotated[Optional[str], Field(None, validate_default=True)]

        @field_validator('value_a', mode='after')
        def value_a_validator(cls, value):
            raise AssertionError

        @field_validator('value_b', mode='after')
        def value_b_validator(cls, value):
            raise AssertionError

    with pytest.raises(ValidationError) as exc_info:
        Model()

    assert exc_info.value.errors(include_url=False) == [
        {'type': 'missing', 'loc': ('value_0',), 'msg': 'Field required', 'input': {}},
        {
            'type': 'assertion_error',
            'loc': ('value_a',),
            'msg': 'Assertion failed, ',
            'input': None,
            'ctx': {'error': IsInstance(AssertionError)},
        },
        {
            'type': 'assertion_error',
            'loc': ('value_b',),
            'msg': 'Assertion failed, ',
            'input': None,
            'ctx': {'error': IsInstance(AssertionError)},
        },
    ]


def test_validate_default_raises_for_dataclasses() -> None:
    @pydantic_dataclass
    class Model:
        value_0: str
        value_a: Annotated[Optional[str], Field(None, validate_default=True)]
        value_b: Annotated[Optional[str], Field(None, validate_default=True)]

        @field_validator('value_a', mode='after')
        def value_a_validator(cls, value):
            raise AssertionError

        @field_validator('value_b', mode='after')
        def value_b_validator(cls, value):
            raise AssertionError

    with pytest.raises(ValidationError) as exc_info:
        Model()

    assert exc_info.value.errors(include_url=False) == [
        {'type': 'missing', 'loc': ('value_0',), 'msg': 'Field required', 'input': HasRepr('ArgsKwargs(())')},
        {
            'type': 'assertion_error',
            'loc': ('value_a',),
            'msg': 'Assertion failed, ',
            'input': None,
            'ctx': {'error': IsInstance(AssertionError)},
        },
        {
            'type': 'assertion_error',
            'loc': ('value_b',),
            'msg': 'Assertion failed, ',
            'input': None,
            'ctx': {'error': IsInstance(AssertionError)},
        },
    ]


def test_plain_validator_plain_serializer() -> None:
    """https://github.com/pydantic/pydantic/issues/8512"""
    ser_type = str
    serializer = PlainSerializer(lambda x: ser_type(int(x)), return_type=ser_type)
    validator = PlainValidator(lambda x: bool(int(x)))

    class Blah(BaseModel):
        foo: Annotated[bool, validator, serializer]
        bar: Annotated[bool, serializer, validator]

    blah = Blah(foo='0', bar='1')
    data = blah.model_dump()
    assert isinstance(data['foo'], ser_type)
    assert isinstance(data['bar'], ser_type)


def test_plain_validator_plain_serializer_single_ser_call() -> None:
    """https://github.com/pydantic/pydantic/issues/10385"""
    ser_count = 0

    def ser(v):
        nonlocal ser_count
        ser_count += 1
        return v

    class Model(BaseModel):
        foo: Annotated[bool, PlainSerializer(ser), PlainValidator(lambda v: v)]

    model = Model(foo=True)
    data = model.model_dump()
    assert data == {'foo': True}
    assert ser_count == 1


@pytest.mark.xfail(reason='https://github.com/pydantic/pydantic/issues/10428')
def test_plain_validator_with_filter_dict_schema() -> None:
    class MyDict:
        @classmethod
        def __get_pydantic_core_schema__(cls, source, handler):
            return core_schema.dict_schema(
                keys_schema=handler.generate_schema(str),
                values_schema=handler.generate_schema(int),
                serialization=core_schema.filter_dict_schema(
                    include={'a'},
                ),
            )

    class Model(BaseModel):
        f: Annotated[MyDict, 
PlainValidator(lambda v: v)] assert Model(f={'a': 1, 'b': 1}).model_dump() == {'f': {'a': 1}} def test_plain_validator_with_unsupported_type() -> None: class UnsupportedClass: pass PreviouslySupportedType = Annotated[ UnsupportedClass, PlainValidator(lambda _: UnsupportedClass()), ] type_adapter = TypeAdapter(PreviouslySupportedType) model = type_adapter.validate_python('abcdefg') assert isinstance(model, UnsupportedClass) assert isinstance(type_adapter.dump_python(model), UnsupportedClass) def test_validator_with_default_values() -> None: def validate_x(v: int, unrelated_arg: int = 1, other_unrelated_arg: int = 2) -> int: assert v != -1 return v class Model(BaseModel): x: int val_x = field_validator('x')(validate_x) with pytest.raises(ValidationError): Model(x=-1) def test_field_validator_input_type_invalid_mode() -> None: with pytest.raises( PydanticUserError, match=re.escape("`json_schema_input_type` can't be used when mode is set to 'after'") ): class Model(BaseModel): a: int @field_validator('a', mode='after', json_schema_input_type=Union[int, str]) # pyright: ignore @classmethod def validate_a(cls, value: Any) -> Any: ... 
def test_non_self_return_val_warns() -> None:
    class Child(BaseModel):
        name: str

        @model_validator(mode='after')  # type: ignore
        def validate_model(self) -> 'Child':
            return Child.model_construct(name='different')

    with pytest.warns(UserWarning, match='A custom validator is returning a value other than `self`'):
        c = Child(name='name')

    # confirmation of behavior: non-self return value is ignored
    assert c.name == 'name'


# ===== pydantic-2.10.6/tests/test_validators_dataclass.py =====

from dataclasses import asdict, is_dataclass
from typing import Any, List

import pytest
from dirty_equals import HasRepr

from pydantic import ValidationError, field_validator, model_validator
from pydantic.dataclasses import dataclass


def test_simple():
    @dataclass
    class MyDataclass:
        a: str

        @field_validator('a')
        @classmethod
        def change_a(cls, v):
            return v + ' changed'

    assert MyDataclass(a='this is foobar good').a == 'this is foobar good changed'


def test_validate_before():
    @dataclass
    class MyDataclass:
        a: List[int]

        @field_validator('a', mode='before')
        @classmethod
        def check_a1(cls, v: List[Any]) -> List[Any]:
            v.append('123')
            return v

        @field_validator('a')
        @classmethod
        def check_a2(cls, v: List[int]) -> List[int]:
            v.append(456)
            return v

    assert MyDataclass(a=[1, 2]).a == [1, 2, 123, 456]


def test_validate_multiple():
    @dataclass
    class MyDataclass:
        a: str
        b: str

        @field_validator('a', 'b')
        @classmethod
        def check_a_and_b(cls, v, info):
            if len(v) < 4:
                raise ValueError(f'{info.field_name} is too short')
            return v + 'x'

    assert asdict(MyDataclass(a='1234', b='5678')) == {'a': '1234x', 'b': '5678x'}

    with pytest.raises(ValidationError) as exc_info:
        MyDataclass(a='x', b='x')
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'error': HasRepr(repr(ValueError('a is too short')))},
            'input': 'x',
            'loc': ('a',),
            'msg': 'Value error, a is too short',
            'type': 'value_error',
        },
        {
            'ctx': {'error': HasRepr(repr(ValueError('b is too short')))},
            'input': 'x',
            'loc': ('b',),
            'msg': 'Value error, b is too short',
            'type': 'value_error',
        },
    ]


def test_type_error():
    @dataclass
    class MyDataclass:
        a: str
        b: str

        @field_validator('a', 'b')
        @classmethod
        def check_a_and_b(cls, v, info):
            if len(v) < 4:
                raise TypeError(f'{info.field_name} is too short')
            return v + 'x'

    assert asdict(MyDataclass(a='1234', b='5678')) == {'a': '1234x', 'b': '5678x'}

    with pytest.raises(TypeError, match='a is too short'):
        MyDataclass(a='x', b='x')


def test_classmethod():
    @dataclass
    class MyDataclass:
        a: str

        @field_validator('a')
        @classmethod
        def check_a(cls, v):
            assert cls is MyDataclass and is_dataclass(MyDataclass)
            return v

    m = MyDataclass(a='this is foobar good')
    assert m.a == 'this is foobar good'
    m.check_a('x')


def test_validate_parent():
    @dataclass
    class Parent:
        a: str

        @field_validator('a')
        @classmethod
        def change_a(cls, v):
            return v + ' changed'

    @dataclass
    class Child(Parent):
        pass

    assert Parent(a='this is foobar good').a == 'this is foobar good changed'
    assert Child(a='this is foobar good').a == 'this is foobar good changed'


def test_inheritance_replace():
    @dataclass
    class Parent:
        a: int

        @field_validator('a')
        @classmethod
        def add_to_a(cls, v):
            return v + 1

    @dataclass
    class Child(Parent):
        @field_validator('a')
        @classmethod
        def add_to_a(cls, v):
            return v + 5

    assert Child(a=0).a == 5


def test_model_validator():
    root_val_values: list[Any] = []

    @dataclass
    class MyDataclass:
        a: int
        b: str

        @field_validator('b')
        @classmethod
        def repeat_b(cls, v: str) -> str:
            return v * 2

        @model_validator(mode='after')
        def root_validator(self) -> 'MyDataclass':
            root_val_values.append(asdict(self))
            if 'snap' in self.b:
                raise ValueError('foobar')
            self.b = 'changed'
            return self

    assert asdict(MyDataclass(a='123', b='bar')) == {'a': 123, 'b': 'changed'}

    with pytest.raises(ValidationError) as exc_info:
        MyDataclass(1, b='snap dragon')
    assert root_val_values == [{'a': 123, 'b': 'barbar'}, {'a': 1, 'b': 'snap dragonsnap dragon'}]
    assert exc_info.value.errors(include_url=False) == [
        {
            'ctx': {'error': HasRepr(repr(ValueError('foobar')))},
            'input': HasRepr("ArgsKwargs((1,), {'b': 'snap dragon'})"),
            'loc': (),
            'msg': 'Value error, foobar',
            'type': 'value_error',
        }
    ]


# ===== pydantic-2.10.6/tests/test_version.py =====

from unittest.mock import patch

import pytest
from packaging.version import parse as parse_version

import pydantic
from pydantic.version import version_info, version_short


def test_version_info():
    version_info_fields = [
        'pydantic version',
        'pydantic-core version',
        'pydantic-core build',
        'install path',
        'python version',
        'platform',
        'related packages',
        'commit',
    ]

    s = version_info()
    assert all(f'{field}:' in s for field in version_info_fields)
    assert s.count('\n') == 7


def test_standard_version():
    v = parse_version(pydantic.VERSION)
    assert str(v) == pydantic.VERSION


def test_version_attribute_is_present():
    assert hasattr(pydantic, '__version__')


def test_version_attribute_is_a_string():
    assert isinstance(pydantic.__version__, str)


@pytest.mark.parametrize('version,expected', (('2.1', '2.1'), ('2.1.0', '2.1')))
def test_version_short(version, expected):
    with patch('pydantic.version.VERSION', version):
        assert version_short() == expected


# ===== pydantic-2.10.6/tests/test_warnings.py =====

from pydantic import PydanticDeprecatedSince20, PydanticDeprecationWarning
from pydantic.version import version_short


def test_pydantic_deprecation_warning():
    warning = PydanticDeprecationWarning('Warning message', 'Arbitrary argument', since=(2, 1), expected_removal=(4, 0))
    assert str(warning) == 'Warning message. Deprecated in Pydantic V2.1 to be removed in V4.0.'
    assert warning.args[0] == 'Warning message'
    assert warning.args[1] == 'Arbitrary argument'


def test_pydantic_deprecation_warning_tailing_dot_in_message():
    warning = PydanticDeprecationWarning('Warning message.', since=(2, 1), expected_removal=(4, 0))
    assert str(warning) == 'Warning message. Deprecated in Pydantic V2.1 to be removed in V4.0.'
    assert warning.args[0] == 'Warning message.'


def test_pydantic_deprecation_warning_calculated_expected_removal():
    warning = PydanticDeprecationWarning('Warning message', since=(2, 1))
    assert str(warning) == 'Warning message. Deprecated in Pydantic V2.1 to be removed in V3.0.'


def test_pydantic_deprecation_warning_2_0_migration_guide_link():
    warning = PydanticDeprecationWarning('Warning message', since=(2, 0))
    assert (
        str(warning)
        == f'Warning message. Deprecated in Pydantic V2.0 to be removed in V3.0. See Pydantic V2 Migration Guide at https://errors.pydantic.dev/{version_short()}/migration/'
    )


def test_pydantic_deprecated_since_2_0_warning():
    warning = PydanticDeprecatedSince20('Warning message')
    assert isinstance(warning, PydanticDeprecationWarning)
    assert warning.message == 'Warning message'
    assert warning.since == (2, 0)
    assert warning.expected_removal == (3, 0)


<!-- pydantic-2.10.6/tests/typechecking/README.md -->

# Type checking test suite

This test suite is meant to assert the correct behavior of the type hints we use in the Pydantic code.

In CI, we run both Mypy and Pyright on these files, using the [`pyproject.toml`](./pyproject.toml) configuration file. Note that these tests do not relate to the Mypy plugin, which is tested under the [`mypy/`](../mypy/) folder.
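A point worth keeping in mind when reading these files: `assert_type` is a no-op at runtime and only has meaning to static type checkers. A minimal stdlib-only illustration (the fallback shim is ours, for Pythons older than 3.11 where `typing.assert_type` does not exist):

```python
try:
    from typing import assert_type  # available since Python 3.11
except ImportError:  # fallback shim for older Pythons (illustrative only)
    def assert_type(val, typ):
        return val

# At runtime this just returns its first argument unchanged;
# only Mypy/Pyright verify that the expression is of the given type.
x = assert_type(1 + 1, int)
assert x == 2
```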
## Assertions

Use [`assert_type`](https://docs.python.org/3/library/typing.html#typing.assert_type) to make assertions:

```python
from typing_extensions import assert_type

from pydantic import TypeAdapter

ta1 = TypeAdapter(int)
assert_type(ta1, TypeAdapter[int])
```

To assert on invalid cases, add a `type: ignore` (for Mypy, must go first) and/or a `pyright: ignore` (for Pyright) comment:

```python
from pydantic import BaseModel


class Model(BaseModel):
    a: int


Model()  # type: ignore[call-arg]  # pyright: ignore[reportCallIssue]
```

# ===== pydantic-2.10.6/tests/typechecking/base_model.py =====

"""
This file is used to test pyright's ability to check Pydantic's `BaseModel` related code.
"""

from typing_extensions import assert_type

from pydantic import BaseModel, Field
from pydantic.fields import ComputedFieldInfo, FieldInfo


class MyModel(BaseModel):
    x: str
    y: list[int]
    z: int = 1


m1 = MyModel(x='hello', y=[1, 2, 3])

m2 = MyModel(x='hello')  # type: ignore[call-arg]  # pyright: ignore[reportCallIssue]

m3 = MyModel(x='hello', y=[1, '2', b'3'])  # type: ignore[list-item]  # pyright: ignore[reportArgumentType]

m1.z + 'not an int'  # type: ignore[operator]  # pyright: ignore[reportOperatorIssue]

m1.foobar  # type: ignore[attr-defined]  # pyright: ignore[reportAttributeAccessIssue]


class Knight(BaseModel):
    title: str = Field(default='Sir Lancelot')  # this is okay
    age: int = Field(23)  # this works fine at runtime but will cause an error for pyright


k = Knight()  # type: ignore[call-arg]  # pyright: ignore[reportCallIssue]

assert_type(Knight.model_fields, dict[str, FieldInfo])
assert_type(Knight.model_computed_fields, dict[str, ComputedFieldInfo])
assert_type(k.model_fields, dict[str, FieldInfo])
assert_type(k.model_computed_fields, dict[str, ComputedFieldInfo])


# ===== pydantic-2.10.6/tests/typechecking/computed_field.py =====

from functools import cached_property

from pydantic import BaseModel, computed_field


class Square(BaseModel):
    side: float

    # mypy limitation, see:
    # https://mypy.readthedocs.io/en/stable/error_code_list.html#decorator-preceding-property-not-supported-prop-decorator
    @computed_field  # type: ignore[prop-decorator]
    @property
    def area(self) -> float:
        return self.side**2

    @computed_field  # type: ignore[prop-decorator]
    @cached_property
    def area_cached(self) -> float:
        return self.side**2


sq = Square(side=10)
y = 12.4 + sq.area
z = 'x' + sq.area  # type: ignore[operator]  # pyright: ignore[reportOperatorIssue]

y_cached = 12.4 + sq.area_cached
z_cached = 'x' + sq.area_cached  # type: ignore[operator]  # pyright: ignore[reportOperatorIssue]


# ===== pydantic-2.10.6/tests/typechecking/decorators.py =====

"""
This file is used to test pyright's ability to check Pydantic decorators used in `BaseModel`.
"""

from functools import partial, partialmethod
from typing import Any

from pydantic_core.core_schema import ValidatorFunctionWrapHandler
from typing_extensions import Self, assert_type

from pydantic import (
    BaseModel,
    FieldSerializationInfo,
    SerializationInfo,
    SerializerFunctionWrapHandler,
    ValidationInfo,
    field_serializer,
    field_validator,
    model_serializer,
    model_validator,
)
from pydantic.functional_validators import ModelWrapValidatorHandler


def validate_before_func(value: Any) -> Any: ...


class BeforeModelValidator(BaseModel):
    @model_validator(mode='before')
    def valid_method(self, value: Any) -> Any:
        """TODO This shouldn't be valid. At runtime, `self` is the actual value
        and `value` is the `ValidationInfo` instance."""

    @model_validator(mode='before')
    def valid_method_info(self, value: Any, info: ValidationInfo) -> Any: ...

    @model_validator(mode='before')
    @classmethod
    def valid_classmethod(cls, value: Any) -> Any: ...

    @model_validator(mode='before')
    @staticmethod
    def valid_staticmethod(value: Any) -> Any: ...
    valid_function = model_validator(mode='before')(validate_before_func)


class WrapModelValidator(BaseModel):
    # mypy randomly does not catch the type error here (https://github.com/python/mypy/issues/18125)
    # so we also ignore the `unused-ignore` code:
    @model_validator(mode='wrap')  # type: ignore[arg-type, unused-ignore]  # pyright: ignore[reportArgumentType]
    def no_classmethod(cls, value: Any, handler: ModelWrapValidatorHandler[Self]) -> Self: ...

    @model_validator(mode='wrap')  # type: ignore[arg-type]  # pyright: ignore[reportArgumentType]
    @classmethod
    def no_handler(cls, value: Any) -> Self: ...

    # Mypy somehow reports "Cannot infer function type argument" here:
    @model_validator(mode='wrap')  # type: ignore[misc]  # pyright: ignore[reportArgumentType]
    @classmethod
    def incompatible_type_var(cls, value: Any, handler: ModelWrapValidatorHandler[int]) -> int:
        """
        Type checkers will infer `cls` as being `type[Self]`. When binding the `incompatible_type_var`
        callable to `ModelWrapValidator.__call__`, the `_ModelType` type var will thus bind to `Self`.
        It is then expected to have `handler: ModelWrapValidatorHandler[_ModelType]` and the return
        type as `-> _ModelType`.
        """
        ...

    @model_validator(mode='wrap')
    @classmethod
    def valid_no_info(cls, value: Any, handler: ModelWrapValidatorHandler[Self]) -> Self:
        rv = handler(value)
        assert_type(rv, Self)
        return rv

    @model_validator(mode='wrap')
    @classmethod
    def valid_info(cls, value: Any, handler: ModelWrapValidatorHandler[Self], info: ValidationInfo) -> Self:
        rv = handler(value, 1)
        assert_type(rv, Self)
        return rv


class AfterModelValidator(BaseModel):
    # Mypy somehow reports "Cannot infer function type argument" here:
    @model_validator(mode='after')  # type: ignore[misc]  # pyright: ignore[reportArgumentType]
    def missing_return_value(self) -> None: ...

    @model_validator(mode='after')
    def valid_method_no_info(self) -> Self: ...

    @model_validator(mode='after')
    def valid_method_info(self, info: ValidationInfo) -> Self: ...


class BeforeFieldValidator(BaseModel):
    """Same tests should apply to `mode='plain'`."""

    @field_validator('foo', mode='before')
    def no_classmethod(self, value: Any) -> Any:
        """TODO this shouldn't be valid, the decorator should only work on classmethods.
        We might want to do the same type checking as wrap model validators.
        """

    @field_validator('foo', mode='before')
    @classmethod
    def valid_classmethod(cls, value: Any) -> Any: ...

    @field_validator('foo', mode='before')  # type: ignore[type-var]  # pyright: ignore[reportArgumentType]
    @classmethod
    def invalid_with_info(cls, value: Any, info: int) -> Any: ...

    @field_validator('foo', mode='before', json_schema_input_type=int)  # `json_schema_input_type` allowed here.
    @classmethod
    def valid_with_info(cls, value: Any, info: ValidationInfo) -> Any: ...


class AfterFieldValidator(BaseModel):
    @field_validator('foo', mode='after')
    @classmethod
    def valid_classmethod(cls, value: Any) -> Any: ...

    @field_validator('foo', mode='after', json_schema_input_type=int)  # type: ignore[call-overload]  # pyright: ignore[reportCallIssue, reportArgumentType]
    @classmethod
    def invalid_input_type_not_allowed(cls, value: Any) -> Any: ...


class WrapFieldValidator(BaseModel):
    @field_validator('foo', mode='wrap')
    @classmethod
    def invalid_missing_handler(cls, value: Any) -> Any:
        """TODO This shouldn't be valid. At runtime, `check_decorator_fields_exist` raises an error,
        as the `handler` argument is missing. However, there's no type checking error as the
        provided signature matches `pydantic_core.core_schema.NoInfoWrapValidatorFunction`.
        """

    @field_validator('foo', mode='wrap')  # type: ignore[type-var]  # pyright: ignore[reportArgumentType]
    @classmethod
    def invalid_handler(cls, value: Any, handler: int) -> Any: ...

    @field_validator('foo', mode='wrap')
    @classmethod
    def valid_no_info(cls, value: Any, handler: ValidatorFunctionWrapHandler) -> Any: ...

    @field_validator('foo', mode='wrap', json_schema_input_type=int)  # `json_schema_input_type` allowed here.
    @classmethod
    def valid_with_info(cls, value: Any, handler: ValidatorFunctionWrapHandler, info: ValidationInfo) -> Any: ...


class PlainModelSerializer(BaseModel):
    @model_serializer  # type: ignore[type-var]  # pyright: ignore[reportArgumentType]
    def too_many_arguments(self, info: SerializationInfo, unrelated: Any) -> Any: ...

    @model_serializer
    def valid_plain_serializer_1(self) -> Any: ...

    @model_serializer(mode='plain')
    def valid_plain_serializer_2(self) -> Any: ...

    @model_serializer(mode='plain')
    def valid_plain_serializer_info(self, info: SerializationInfo) -> Any: ...


class WrapModelSerializer(BaseModel):
    @model_serializer(mode='wrap')  # type: ignore[type-var]  # pyright: ignore[reportArgumentType]
    def no_handler(self) -> Any: ...

    @model_serializer(mode='wrap')
    def valid_no_info(self, handler: SerializerFunctionWrapHandler) -> Any:
        value = handler(self)
        return value

    @model_serializer(mode='wrap')
    def valid_info(self, handler: SerializerFunctionWrapHandler, info: SerializationInfo) -> Any:
        value = handler(self)
        return value


class PlainFieldSerializer(BaseModel):
    a: int = 1

    @field_serializer('a')
    def valid_method_no_info_1(self, value: Any) -> Any: ...

    @field_serializer('a', mode='plain')
    def valid_method_no_info_2(self, value: Any) -> Any: ...

    @field_serializer('a', mode='plain')  # type: ignore[type-var]  # pyright: ignore[reportArgumentType]
    def invalid_method_info_1(self, value: Any, info: int) -> Any: ...

    @field_serializer('a', mode='plain')
    def invalid_method_info_2(self, value: Any, info: SerializationInfo) -> Any:
        """TODO This shouldn't be valid. With field serializers, `info` is `FieldSerializationInfo`.
        However, the `AnyFieldPlainSerializer` type alias is too broad as it seems to include
        model serializer functions as well. This isn't trivial to solve, as we allow regular
        method and staticmethod/functions to be passed to `field_serializer`, so there's some
        overlaps in the signatures (because of the `self` argument).
        """

    @field_serializer('a', mode='plain')
    def valid_method_info(self, value: Any, info: FieldSerializationInfo) -> Any: ...

    @field_serializer('a', mode='plain')
    @staticmethod
    def valid_staticmethod_no_info(value: Any) -> Any: ...

    @field_serializer('a', mode='plain')
    @staticmethod
    def valid_staticmethod_info(value: Any, info: FieldSerializationInfo) -> Any: ...

    @field_serializer('a', mode='plain')
    @classmethod
    def valid_classmethod_no_info(cls, value: Any) -> Any: ...

    @field_serializer('a', mode='plain')
    @classmethod
    def valid_classmethod_info(cls, value: Any, info: FieldSerializationInfo) -> Any: ...

    partial_ = field_serializer('a', mode='plain')(partial(lambda v, x: v, x=1))

    def partial_method(self, value: Any, x: Any) -> Any: ...

    partial_method_ = field_serializer('a', mode='plain')(partialmethod(partial_method))


class WrapFieldSerializer(BaseModel):
    a: int = 1

    @field_serializer('a', mode='wrap')
    def no_handler(self, value: Any) -> Any:
        """TODO This shouldn't be valid. At runtime, `inspect_field_serializer` raises an error,
        as the `handler` argument is missing. However, there's no type checking error as the
        provided signature matches `pydantic_core.core_schema.GeneralWrapNoInfoSerializerFunction`.
        """

    @field_serializer('a', mode='wrap')  # type: ignore[type-var]  # pyright: ignore[reportArgumentType]
    @staticmethod
    def staticmethod_no_handler(value: Any) -> Any: ...

    @field_serializer('a', mode='wrap')
    def valid_no_info(self, value: Any, handler: SerializerFunctionWrapHandler) -> Any: ...

    @field_serializer('a', mode='wrap')
    def valid_info(self, value: Any, handler: SerializerFunctionWrapHandler, info: FieldSerializationInfo) -> Any: ...
# ===== pydantic-2.10.6/tests/typechecking/fields.py =====

from pydantic import BaseModel, Field, PrivateAttr


# private attributes should be excluded from
# the synthesized `__init__` method:
class ModelWithPrivateAttr(BaseModel):
    _private_field: str = PrivateAttr()


m = ModelWithPrivateAttr()


def new_list() -> list[int]:
    return []


class Model(BaseModel):
    # `default` and `default_factory` are mutually exclusive:
    f1: int = Field(default=1, default_factory=int)  # type: ignore[call-overload]  # pyright: ignore[reportCallIssue]

    # `default` and `default_factory` matches the annotation:
    f2: int = Field(default='1')  # type: ignore[assignment]  # pyright: ignore[reportAssignmentType]
    f3: int = Field(default_factory=str)  # type: ignore[assignment]  # pyright: ignore[reportAssignmentType]
    f4: int = PrivateAttr(default='1')  # type: ignore[assignment]  # pyright: ignore[reportAssignmentType]
    f5: int = PrivateAttr(default_factory=str)  # type: ignore[assignment]  # pyright: ignore[reportAssignmentType]

    f6: list[str] = Field(default_factory=list)
    f7: list[int] = Field(default_factory=new_list)
    f8: list[str] = Field(default_factory=lambda: list())
    f9: dict[str, str] = Field(default_factory=dict)
    f10: int = Field(default_factory=lambda: 123)

    # Note: mypy may require a different error code for `f12` (see https://github.com/python/mypy/issues/17986).
    # Seems like this is not the case anymore. But could pop up at any time.
    f11: list[str] = Field(default_factory=new_list)  # type: ignore[arg-type]  # pyright: ignore[reportAssignmentType]
    f12: int = Field(default_factory=list)  # type: ignore[arg-type]  # pyright: ignore[reportAssignmentType]

    # Do not error on the ellipsis:
    f13: int = Field(...)
# ===== pydantic-2.10.6/tests/typechecking/json_schema_examples.py =====

from pydantic.json_schema import Examples

e_good = Examples([])
e_deprecated = Examples({})  # pyright: ignore[reportDeprecated]


# ===== pydantic-2.10.6/tests/typechecking/misc.py =====

from pydantic import BaseModel


class Sub(BaseModel):
    a: int
    b: int


class Model(BaseModel):
    subs: list[Sub]


def func(model: Model) -> None:
    model.model_dump(
        include={'a': {1: True}},
    )
    model.model_dump(
        include={'a': {'__all__': True}},
    )
    model.model_dump(
        include={'a': {1: {'a'}}},
    )
    model.model_dump(
        include={'a': {1, 2}},
    )

    # Invalid cases, should fail but the `IncEx` alias uses `bool` due to mypy limitations:
    model.model_dump(
        include={'a': {1: False}},
    )


# ===== pydantic-2.10.6/tests/typechecking/pipeline_api.py =====

import datetime
from typing import Annotated

from pydantic.experimental.pipeline import validate_as

# TODO: since Pyright 1.1.384, support for PEP 746 was disabled.
# `a1` and `a2` should have a `pyright: ignore[reportInvalidTypeArguments]` comment.
a1 = Annotated[str, validate_as(int)]
a2 = Annotated[str, validate_as(str).transform(lambda x: int(x))]
a3 = Annotated[float, validate_as(float).gt(0)]  # should be able to compare float to int
a4 = Annotated[datetime.datetime, validate_as(datetime.datetime).datetime_tz_naive()]
a5 = Annotated[datetime.datetime, validate_as(str).datetime_tz_naive()]  # pyright: ignore[reportAttributeAccessIssue]
a6 = Annotated[
    datetime.datetime,
    (
        validate_as(str).transform(str.strip).validate_as(datetime.datetime).datetime_tz_naive()
        | validate_as(int).transform(datetime.datetime.fromtimestamp).datetime_tz_aware()
    ),
]


# ===== pydantic-2.10.6/tests/typechecking/pyproject.toml =====

[tool.pyright]
extraPaths = ['../..']
pythonVersion = '3.10'
enableExperimentalFeatures = true
enableTypeIgnoreComments = false
reportUnnecessaryTypeIgnoreComment = true
reportDeprecated = true
reportUnusedExpression = false

[tool.mypy]
python_version = '3.10'
disable_error_code = ['empty-body']
warn_unused_ignores = true


# ===== pydantic-2.10.6/tests/typechecking/root_model.py =====

from typing_extensions import assert_type

from pydantic import RootModel

IntRootModel = RootModel[int]

int_root_model = IntRootModel(1)
bad_root_model = IntRootModel('1')  # type: ignore[arg-type]  # pyright: ignore[reportArgumentType]

assert_type(int_root_model.root, int)


class StrRootModel(RootModel[str]):
    pass


str_root_model = StrRootModel(root='a')
assert_type(str_root_model.root, str)


# ===== pydantic-2.10.6/tests/typechecking/type_adapter.py =====

# to be removed with PEP 747:
# mypy: disable_error_code=var-annotated
from typing import Annotated

from typing_extensions import assert_type

from pydantic import TypeAdapter

ta1 = TypeAdapter(int)
assert_type(ta1, TypeAdapter[int])

assert_type(ta1.validate_python('1'), int)
ta1.dump_python(1)
ta1.dump_python('1')  # type: ignore[arg-type]  # pyright: ignore[reportArgumentType]
ta1.dump_json(1)
ta1.dump_json('1')  # type: ignore[arg-type]  # pyright: ignore[reportArgumentType]

# The following use cases require PEP 747: TypeExpr:

ta2 = TypeAdapter(Annotated[int, ...])
assert_type(ta2, TypeAdapter[int])  # type: ignore[assert-type]  # pyright: ignore[reportAssertTypeFailure]

ta3: TypeAdapter[int] = TypeAdapter(Annotated[int, ...])
assert_type(ta3, TypeAdapter[int])

ta4 = TypeAdapter(int | str)
assert_type(ta4, TypeAdapter[int | str])  # type: ignore[assert-type]  # pyright: ignore[reportAssertTypeFailure]


# ===== pydantic-2.10.6/tests/typechecking/validate_call.py =====

from typing_extensions import assert_type

from pydantic import validate_call


@validate_call
def foo(a: int, *, c: str = 'x') -> str:
    return c * a


a = foo(1, c='a')
assert_type(a, str)

foo('', c=1)  # type: ignore[arg-type]  # pyright: ignore[reportArgumentType]

# Not possible to type check currently (see https://github.com/pydantic/pydantic/issues/9883):
foo.raw_function(1, c='a')  # type: ignore[attr-defined]  # pyright: ignore[reportFunctionMemberAccess]


# Should work even when not used as a bare decorator:
@validate_call(config={'arbitrary_types_allowed': True})
def bar(a: int) -> int:
    return a


b = bar(1)
assert_type(b, int)


# ===== pydantic-2.10.6/tests/typechecking/with_config_decorator.py =====

from typing import TypedDict

from typing_extensions import assert_type

from pydantic import ConfigDict, with_config


@with_config(ConfigDict(str_to_lower=True))
class Model(TypedDict):
    a: str


assert_type(Model, type[Model])

model = Model(a='ABC')
assert_type(model, Model)


# ===== pydantic-2.10.6/update_v1.sh =====

#!/usr/bin/env bash

set -x
set -e

echo "cloning pydantic V1"
git clone -b 1.10.X-fixes https://github.com/pydantic/pydantic.git pydantic-v1
pushd "$(dirname $0)/pydantic-v1"

# Find latest tag in v1
latest_tag=$(git describe --tags --abbrev=0)
echo "latest tag in V1 is '${latest_tag}'"
git checkout "${latest_tag}"

# Remove current V1
rm -rf ../pydantic/v1

# Copy new V1 into pydantic/v1
cp -r pydantic ../pydantic/v1

# Remove the v1 sub directory from v1, it's not needed in the v2 codebase
rm -rf ../pydantic/v1/v1

# Update imports in pydantic/v1 to use pydantic.v1
find "../pydantic/v1" -name "*.py" -exec sed -i '' -E 's/from pydantic(\.[a-zA-Z0-9_]*)? import/from pydantic.v1\1 import/g' {} \;

popd

# Remove V1 clone
rm -rf pydantic-v1


# ===== pydantic-2.10.6/uv.lock =====

version = 1
requires-python = ">=3.8"
resolution-markers = [
    "(python_full_version < '3.9' and platform_system == 'Windows') or (python_full_version < '3.12' and platform_system != 'Windows' and sys_platform != 'win32') or (python_full_version < '3.9' and sys_platform == 'win32')",
    "python_full_version >= '3.9' and python_full_version < '3.12' and platform_system == 'Windows' and sys_platform != 'win32'",
    "python_full_version >= '3.12' and sys_platform != 'win32'",
    "python_full_version >= '3.9' and python_full_version < '3.12' and sys_platform == 'win32'",
    "python_full_version >= '3.12' and sys_platform == 'win32'",
]

[[package]]
name = "annotated-types"
version = "0.7.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "typing-extensions", marker = "python_full_version < '3.9'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643 },
]

[[package]]
name = "ansi2html"
version = "1.9.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/4b/d5/e3546dcd5e4a9566f4ed8708df5853e83ca627461a5b048a861c6f8e7a26/ansi2html-1.9.2.tar.gz", hash = "sha256:3453bf87535d37b827b05245faaa756dbab4ec3d69925e352b6319c3c955c0a5", size = 44300 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/bd/71/aee71b836e9ee2741d5694b80d74bfc7c8cd5dbdf7a9f3035fcf80d792b1/ansi2html-1.9.2-py3-none-any.whl", hash = "sha256:dccb75aa95fb018e5d299be2b45f802952377abfdce0504c17a6ee6ef0a420c5", size = 17614 },
]

[[package]]
name = "asttokens"
version = "2.4.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "six" },
]
sdist = { url = "https://files.pythonhosted.org/packages/45/1d/f03bcb60c4a3212e15f99a56085d93093a497718adf828d050b9d675da81/asttokens-2.4.1.tar.gz", hash = "sha256:b03869718ba9a6eb027e134bfdf69f38a236d681c83c160d510768af11254ba0", size = 62284 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/45/86/4736ac618d82a20d87d2f92ae19441ebc7ac9e7a581d7e58bbe79233b24a/asttokens-2.4.1-py2.py3-none-any.whl", hash = "sha256:051ed49c3dcae8913ea7cd08e46a606dba30b79993209636c4875bc1d637bc24", size = 27764 },
]

[[package]]
name = "astunparse"
version = "1.6.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "six", marker = "(python_full_version < '3.9' and platform_system == 'Windows') or (python_full_version < '3.12' and platform_system != 'Windows' and sys_platform != 'win32') or (python_full_version < '3.9' and sys_platform == 'win32')" },
    { name = "wheel", marker = "(python_full_version < '3.9' and platform_system == 'Windows') or (python_full_version < '3.12' and platform_system != 'Windows' and sys_platform != 'win32') or (python_full_version < '3.9' and sys_platform == 'win32')" },
]
sdist = { url = "https://files.pythonhosted.org/packages/f3/af/4182184d3c338792894f34a62672919db7ca008c89abee9b564dd34d8029/astunparse-1.6.3.tar.gz", hash = "sha256:5ad93a8456f0d084c3456d059fd9a92cce667963232cbf763eac3bc5b7940872", size = 18290 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/2b/03/13dde6512ad7b4557eb792fbcf0c653af6076b81e5941d36ec61f7ce6028/astunparse-1.6.3-py2.py3-none-any.whl", hash = "sha256:c2652417f2c8b5bb325c885ae329bdf3f86424075c4fd1a128674bc6fba4b8e8", size = 12732 },
]

[[package]]
name = "attrs"
version = "24.2.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/fc/0f/aafca9af9315aee06a89ffde799a10a582fe8de76c563ee80bbcdc08b3fb/attrs-24.2.0.tar.gz", hash = "sha256:5cfb1b9148b5b086569baec03f20d7b6bf3bcacc9a42bebf87ffaaca362f6346", size = 792678 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/6a/21/5b6702a7f963e95456c0de2d495f67bf5fd62840ac655dc451586d23d39a/attrs-24.2.0-py3-none-any.whl", hash = "sha256:81921eb96de3191c8258c199618104dd27ac608d9366f5e35d011eae1867ede2", size = 63001 },
]

[[package]]
name = "autoflake"
version = "2.3.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "pyflakes" },
    { name = "tomli", marker = "python_full_version < '3.11'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/2a/cb/486f912d6171bc5748c311a2984a301f4e2d054833a1da78485866c71522/autoflake-2.3.1.tar.gz", hash = "sha256:c98b75dc5b0a86459c4f01a1d32ac7eb4338ec4317a4469515ff1e687ecd909e", size = 27642 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/a2/ee/3fd29bf416eb4f1c5579cf12bf393ae954099258abd7bde03c4f9716ef6b/autoflake-2.3.1-py3-none-any.whl", hash = "sha256:3ae7495db9084b7b32818b4140e6dc4fc280b712fb414f5b8fe57b0a8e85a840", size = 32483 },
]

[[package]]
name = "babel"
version = "2.16.0"
source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "pytz", marker = "python_full_version < '3.9'" }, ] sdist = { url = "https://files.pythonhosted.org/packages/2a/74/f1bc80f23eeba13393b7222b11d95ca3af2c1e28edca18af487137eefed9/babel-2.16.0.tar.gz", hash = "sha256:d1f3554ca26605fe173f3de0c65f750f5a42f924499bf134de6423582298e316", size = 9348104 } wheels = [ { url = "https://files.pythonhosted.org/packages/ed/20/bc79bc575ba2e2a7f70e8a1155618bb1301eaa5132a8271373a6903f73f8/babel-2.16.0-py3-none-any.whl", hash = "sha256:368b5b98b37c06b7daf6696391c3240c938b37767d4584413e8438c5c435fa8b", size = 9587599 }, ] [[package]] name = "black" version = "24.8.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "click" }, { name = "mypy-extensions" }, { name = "packaging" }, { name = "pathspec" }, { name = "platformdirs" }, { name = "tomli", marker = "python_full_version < '3.11'" }, { name = "typing-extensions", marker = "python_full_version < '3.11'" }, ] sdist = { url = "https://files.pythonhosted.org/packages/04/b0/46fb0d4e00372f4a86a6f8efa3cb193c9f64863615e39010b1477e010578/black-24.8.0.tar.gz", hash = "sha256:2500945420b6784c38b9ee885af039f5e7471ef284ab03fa35ecdde4688cd83f", size = 644810 } wheels = [ { url = "https://files.pythonhosted.org/packages/47/6e/74e29edf1fba3887ed7066930a87f698ffdcd52c5dbc263eabb06061672d/black-24.8.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:09cdeb74d494ec023ded657f7092ba518e8cf78fa8386155e4a03fdcc44679e6", size = 1632092 }, { url = "https://files.pythonhosted.org/packages/ab/49/575cb6c3faee690b05c9d11ee2e8dba8fbd6d6c134496e644c1feb1b47da/black-24.8.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:81c6742da39f33b08e791da38410f32e27d632260e599df7245cccee2064afeb", size = 1457529 }, { url = 
"https://files.pythonhosted.org/packages/7a/b4/d34099e95c437b53d01c4aa37cf93944b233066eb034ccf7897fa4e5f286/black-24.8.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:707a1ca89221bc8a1a64fb5e15ef39cd755633daa672a9db7498d1c19de66a42", size = 1757443 }, { url = "https://files.pythonhosted.org/packages/87/a0/6d2e4175ef364b8c4b64f8441ba041ed65c63ea1db2720d61494ac711c15/black-24.8.0-cp310-cp310-win_amd64.whl", hash = "sha256:d6417535d99c37cee4091a2f24eb2b6d5ec42b144d50f1f2e436d9fe1916fe1a", size = 1418012 }, { url = "https://files.pythonhosted.org/packages/08/a6/0a3aa89de9c283556146dc6dbda20cd63a9c94160a6fbdebaf0918e4a3e1/black-24.8.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:fb6e2c0b86bbd43dee042e48059c9ad7830abd5c94b0bc518c0eeec57c3eddc1", size = 1615080 }, { url = "https://files.pythonhosted.org/packages/db/94/b803d810e14588bb297e565821a947c108390a079e21dbdcb9ab6956cd7a/black-24.8.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:837fd281f1908d0076844bc2b801ad2d369c78c45cf800cad7b61686051041af", size = 1438143 }, { url = "https://files.pythonhosted.org/packages/a5/b5/f485e1bbe31f768e2e5210f52ea3f432256201289fd1a3c0afda693776b0/black-24.8.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:62e8730977f0b77998029da7971fa896ceefa2c4c4933fcd593fa599ecbf97a4", size = 1738774 }, { url = "https://files.pythonhosted.org/packages/a8/69/a000fc3736f89d1bdc7f4a879f8aaf516fb03613bb51a0154070383d95d9/black-24.8.0-cp311-cp311-win_amd64.whl", hash = "sha256:72901b4913cbac8972ad911dc4098d5753704d1f3c56e44ae8dce99eecb0e3af", size = 1427503 }, { url = "https://files.pythonhosted.org/packages/a2/a8/05fb14195cfef32b7c8d4585a44b7499c2a4b205e1662c427b941ed87054/black-24.8.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:7c046c1d1eeb7aea9335da62472481d3bbf3fd986e093cffd35f4385c94ae368", size = 1646132 }, { url = 
"https://files.pythonhosted.org/packages/41/77/8d9ce42673e5cb9988f6df73c1c5c1d4e9e788053cccd7f5fb14ef100982/black-24.8.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:649f6d84ccbae73ab767e206772cc2d7a393a001070a4c814a546afd0d423aed", size = 1448665 }, { url = "https://files.pythonhosted.org/packages/cc/94/eff1ddad2ce1d3cc26c162b3693043c6b6b575f538f602f26fe846dfdc75/black-24.8.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2b59b250fdba5f9a9cd9d0ece6e6d993d91ce877d121d161e4698af3eb9c1018", size = 1762458 }, { url = "https://files.pythonhosted.org/packages/28/ea/18b8d86a9ca19a6942e4e16759b2fa5fc02bbc0eb33c1b866fcd387640ab/black-24.8.0-cp312-cp312-win_amd64.whl", hash = "sha256:6e55d30d44bed36593c3163b9bc63bf58b3b30e4611e4d88a0c3c239930ed5b2", size = 1436109 }, { url = "https://files.pythonhosted.org/packages/9f/d4/ae03761ddecc1a37d7e743b89cccbcf3317479ff4b88cfd8818079f890d0/black-24.8.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:505289f17ceda596658ae81b61ebbe2d9b25aa78067035184ed0a9d855d18afd", size = 1617322 }, { url = "https://files.pythonhosted.org/packages/14/4b/4dfe67eed7f9b1ddca2ec8e4418ea74f0d1dc84d36ea874d618ffa1af7d4/black-24.8.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:b19c9ad992c7883ad84c9b22aaa73562a16b819c1d8db7a1a1a49fb7ec13c7d2", size = 1442108 }, { url = "https://files.pythonhosted.org/packages/97/14/95b3f91f857034686cae0e73006b8391d76a8142d339b42970eaaf0416ea/black-24.8.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1f13f7f386f86f8121d76599114bb8c17b69d962137fc70efe56137727c7047e", size = 1745786 }, { url = "https://files.pythonhosted.org/packages/95/54/68b8883c8aa258a6dde958cd5bdfada8382bec47c5162f4a01e66d839af1/black-24.8.0-cp38-cp38-win_amd64.whl", hash = "sha256:f490dbd59680d809ca31efdae20e634f3fae27fba3ce0ba3208333b713bc3920", size = 1426754 }, { url = 
"https://files.pythonhosted.org/packages/13/b2/b3f24fdbb46f0e7ef6238e131f13572ee8279b70f237f221dd168a9dba1a/black-24.8.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:eab4dd44ce80dea27dc69db40dab62d4ca96112f87996bca68cd75639aeb2e4c", size = 1631706 }, { url = "https://files.pythonhosted.org/packages/d9/35/31010981e4a05202a84a3116423970fd1a59d2eda4ac0b3570fbb7029ddc/black-24.8.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:3c4285573d4897a7610054af5a890bde7c65cb466040c5f0c8b732812d7f0e5e", size = 1457429 }, { url = "https://files.pythonhosted.org/packages/27/25/3f706b4f044dd569a20a4835c3b733dedea38d83d2ee0beb8178a6d44945/black-24.8.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9e84e33b37be070ba135176c123ae52a51f82306def9f7d063ee302ecab2cf47", size = 1756488 }, { url = "https://files.pythonhosted.org/packages/63/72/79375cd8277cbf1c5670914e6bd4c1b15dea2c8f8e906dc21c448d0535f0/black-24.8.0-cp39-cp39-win_amd64.whl", hash = "sha256:73bbf84ed136e45d451a260c6b73ed674652f90a2b3211d6a35e78054563a9bb", size = 1417721 }, { url = "https://files.pythonhosted.org/packages/27/1e/83fa8a787180e1632c3d831f7e58994d7aaf23a0961320d21e84f922f919/black-24.8.0-py3-none-any.whl", hash = "sha256:972085c618ee94f402da1af548a4f218c754ea7e5dc70acb168bfaca4c2542ed", size = 206504 }, ] [[package]] name = "cairocffi" version = "1.7.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "cffi" }, ] sdist = { url = "https://files.pythonhosted.org/packages/70/c5/1a4dc131459e68a173cbdab5fad6b524f53f9c1ef7861b7698e998b837cc/cairocffi-1.7.1.tar.gz", hash = "sha256:2e48ee864884ec4a3a34bfa8c9ab9999f688286eb714a15a43ec9d068c36557b", size = 88096 } wheels = [ { url = "https://files.pythonhosted.org/packages/93/d8/ba13451aa6b745c49536e87b6bf8f629b950e84bd0e8308f7dc6883b67e2/cairocffi-1.7.1-py3-none-any.whl", hash = "sha256:9803a0e11f6c962f3b0ae2ec8ba6ae45e957a146a004697a1ac1bbf16b073b3f", size = 75611 }, ] [[package]] name = 
"cairosvg" version = "2.7.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "cairocffi" }, { name = "cssselect2" }, { name = "defusedxml" }, { name = "pillow" }, { name = "tinycss2" }, ] sdist = { url = "https://files.pythonhosted.org/packages/d5/e6/ec5900b724e3c44af7f6f51f719919137284e5da4aabe96508baec8a1b40/CairoSVG-2.7.1.tar.gz", hash = "sha256:432531d72347291b9a9ebfb6777026b607563fd8719c46ee742db0aef7271ba0", size = 8399085 } wheels = [ { url = "https://files.pythonhosted.org/packages/01/a5/1866b42151f50453f1a0d28fc4c39f5be5f412a2e914f33449c42daafdf1/CairoSVG-2.7.1-py3-none-any.whl", hash = "sha256:8a5222d4e6c3f86f1f7046b63246877a63b49923a1cd202184c3a634ef546b3b", size = 43235 }, ] [[package]] name = "certifi" version = "2024.8.30" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/b0/ee/9b19140fe824b367c04c5e1b369942dd754c4c5462d5674002f75c4dedc1/certifi-2024.8.30.tar.gz", hash = "sha256:bec941d2aa8195e248a60b31ff9f0558284cf01a52591ceda73ea9afffd69fd9", size = 168507 } wheels = [ { url = "https://files.pythonhosted.org/packages/12/90/3c9ff0512038035f59d279fddeb79f5f1eccd8859f06d6163c58798b9487/certifi-2024.8.30-py3-none-any.whl", hash = "sha256:922820b53db7a7257ffbda3f597266d435245903d80737e34f8a45ff3e3230d8", size = 167321 }, ] [[package]] name = "cffi" version = "1.17.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "pycparser" }, ] sdist = { url = "https://files.pythonhosted.org/packages/fc/97/c783634659c2920c3fc70419e3af40972dbaf758daa229a7d6ea6135c90d/cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824", size = 516621 } wheels = [ { url = "https://files.pythonhosted.org/packages/90/07/f44ca684db4e4f08a3fdc6eeb9a0d15dc6883efc7b8c90357fdbf74e186c/cffi-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:df8b1c11f177bc2313ec4b2d46baec87a5f3e71fc8b45dab2ee7cae86d9aba14", size = 182191 }, { 
url = "https://files.pythonhosted.org/packages/08/fd/cc2fedbd887223f9f5d170c96e57cbf655df9831a6546c1727ae13fa977a/cffi-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8f2cdc858323644ab277e9bb925ad72ae0e67f69e804f4898c070998d50b1a67", size = 178592 }, { url = "https://files.pythonhosted.org/packages/de/cc/4635c320081c78d6ffc2cab0a76025b691a91204f4aa317d568ff9280a2d/cffi-1.17.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:edae79245293e15384b51f88b00613ba9f7198016a5948b5dddf4917d4d26382", size = 426024 }, { url = "https://files.pythonhosted.org/packages/b6/7b/3b2b250f3aab91abe5f8a51ada1b717935fdaec53f790ad4100fe2ec64d1/cffi-1.17.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45398b671ac6d70e67da8e4224a065cec6a93541bb7aebe1b198a61b58c7b702", size = 448188 }, { url = "https://files.pythonhosted.org/packages/d3/48/1b9283ebbf0ec065148d8de05d647a986c5f22586b18120020452fff8f5d/cffi-1.17.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad9413ccdeda48c5afdae7e4fa2192157e991ff761e7ab8fdd8926f40b160cc3", size = 455571 }, { url = "https://files.pythonhosted.org/packages/40/87/3b8452525437b40f39ca7ff70276679772ee7e8b394934ff60e63b7b090c/cffi-1.17.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5da5719280082ac6bd9aa7becb3938dc9f9cbd57fac7d2871717b1feb0902ab6", size = 436687 }, { url = "https://files.pythonhosted.org/packages/8d/fb/4da72871d177d63649ac449aec2e8a29efe0274035880c7af59101ca2232/cffi-1.17.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bb1a08b8008b281856e5971307cc386a8e9c5b625ac297e853d36da6efe9c17", size = 446211 }, { url = "https://files.pythonhosted.org/packages/ab/a0/62f00bcb411332106c02b663b26f3545a9ef136f80d5df746c05878f8c4b/cffi-1.17.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:045d61c734659cc045141be4bae381a41d89b741f795af1dd018bfb532fd0df8", size = 
461325 }, { url = "https://files.pythonhosted.org/packages/36/83/76127035ed2e7e27b0787604d99da630ac3123bfb02d8e80c633f218a11d/cffi-1.17.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6883e737d7d9e4899a8a695e00ec36bd4e5e4f18fabe0aca0efe0a4b44cdb13e", size = 438784 }, { url = "https://files.pythonhosted.org/packages/21/81/a6cd025db2f08ac88b901b745c163d884641909641f9b826e8cb87645942/cffi-1.17.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:6b8b4a92e1c65048ff98cfe1f735ef8f1ceb72e3d5f0c25fdb12087a23da22be", size = 461564 }, { url = "https://files.pythonhosted.org/packages/f8/fe/4d41c2f200c4a457933dbd98d3cf4e911870877bd94d9656cc0fcb390681/cffi-1.17.1-cp310-cp310-win32.whl", hash = "sha256:c9c3d058ebabb74db66e431095118094d06abf53284d9c81f27300d0e0d8bc7c", size = 171804 }, { url = "https://files.pythonhosted.org/packages/d1/b6/0b0f5ab93b0df4acc49cae758c81fe4e5ef26c3ae2e10cc69249dfd8b3ab/cffi-1.17.1-cp310-cp310-win_amd64.whl", hash = "sha256:0f048dcf80db46f0098ccac01132761580d28e28bc0f78ae0d58048063317e15", size = 181299 }, { url = "https://files.pythonhosted.org/packages/6b/f4/927e3a8899e52a27fa57a48607ff7dc91a9ebe97399b357b85a0c7892e00/cffi-1.17.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a45e3c6913c5b87b3ff120dcdc03f6131fa0065027d0ed7ee6190736a74cd401", size = 182264 }, { url = "https://files.pythonhosted.org/packages/6c/f5/6c3a8efe5f503175aaddcbea6ad0d2c96dad6f5abb205750d1b3df44ef29/cffi-1.17.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:30c5e0cb5ae493c04c8b42916e52ca38079f1b235c2f8ae5f4527b963c401caf", size = 178651 }, { url = "https://files.pythonhosted.org/packages/94/dd/a3f0118e688d1b1a57553da23b16bdade96d2f9bcda4d32e7d2838047ff7/cffi-1.17.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f75c7ab1f9e4aca5414ed4d8e5c0e303a34f4421f8a0d47a4d019ceff0ab6af4", size = 445259 }, { url = 
"https://files.pythonhosted.org/packages/2e/ea/70ce63780f096e16ce8588efe039d3c4f91deb1dc01e9c73a287939c79a6/cffi-1.17.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a1ed2dd2972641495a3ec98445e09766f077aee98a1c896dcb4ad0d303628e41", size = 469200 }, { url = "https://files.pythonhosted.org/packages/1c/a0/a4fa9f4f781bda074c3ddd57a572b060fa0df7655d2a4247bbe277200146/cffi-1.17.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:46bf43160c1a35f7ec506d254e5c890f3c03648a4dbac12d624e4490a7046cd1", size = 477235 }, { url = "https://files.pythonhosted.org/packages/62/12/ce8710b5b8affbcdd5c6e367217c242524ad17a02fe5beec3ee339f69f85/cffi-1.17.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a24ed04c8ffd54b0729c07cee15a81d964e6fee0e3d4d342a27b020d22959dc6", size = 459721 }, { url = "https://files.pythonhosted.org/packages/ff/6b/d45873c5e0242196f042d555526f92aa9e0c32355a1be1ff8c27f077fd37/cffi-1.17.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:610faea79c43e44c71e1ec53a554553fa22321b65fae24889706c0a84d4ad86d", size = 467242 }, { url = "https://files.pythonhosted.org/packages/1a/52/d9a0e523a572fbccf2955f5abe883cfa8bcc570d7faeee06336fbd50c9fc/cffi-1.17.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a9b15d491f3ad5d692e11f6b71f7857e7835eb677955c00cc0aefcd0669adaf6", size = 477999 }, { url = "https://files.pythonhosted.org/packages/44/74/f2a2460684a1a2d00ca799ad880d54652841a780c4c97b87754f660c7603/cffi-1.17.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:de2ea4b5833625383e464549fec1bc395c1bdeeb5f25c4a3a82b5a8c756ec22f", size = 454242 }, { url = "https://files.pythonhosted.org/packages/f8/4a/34599cac7dfcd888ff54e801afe06a19c17787dfd94495ab0c8d35fe99fb/cffi-1.17.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:fc48c783f9c87e60831201f2cce7f3b2e4846bf4d8728eabe54d60700b318a0b", size = 478604 }, { url = 
"https://files.pythonhosted.org/packages/34/33/e1b8a1ba29025adbdcda5fb3a36f94c03d771c1b7b12f726ff7fef2ebe36/cffi-1.17.1-cp311-cp311-win32.whl", hash = "sha256:85a950a4ac9c359340d5963966e3e0a94a676bd6245a4b55bc43949eee26a655", size = 171727 }, { url = "https://files.pythonhosted.org/packages/3d/97/50228be003bb2802627d28ec0627837ac0bf35c90cf769812056f235b2d1/cffi-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:caaf0640ef5f5517f49bc275eca1406b0ffa6aa184892812030f04c2abf589a0", size = 181400 }, { url = "https://files.pythonhosted.org/packages/5a/84/e94227139ee5fb4d600a7a4927f322e1d4aea6fdc50bd3fca8493caba23f/cffi-1.17.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:805b4371bf7197c329fcb3ead37e710d1bca9da5d583f5073b799d5c5bd1eee4", size = 183178 }, { url = "https://files.pythonhosted.org/packages/da/ee/fb72c2b48656111c4ef27f0f91da355e130a923473bf5ee75c5643d00cca/cffi-1.17.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:733e99bc2df47476e3848417c5a4540522f234dfd4ef3ab7fafdf555b082ec0c", size = 178840 }, { url = "https://files.pythonhosted.org/packages/cc/b6/db007700f67d151abadf508cbfd6a1884f57eab90b1bb985c4c8c02b0f28/cffi-1.17.1-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1257bdabf294dceb59f5e70c64a3e2f462c30c7ad68092d01bbbfb1c16b1ba36", size = 454803 }, { url = "https://files.pythonhosted.org/packages/1a/df/f8d151540d8c200eb1c6fba8cd0dfd40904f1b0682ea705c36e6c2e97ab3/cffi-1.17.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da95af8214998d77a98cc14e3a3bd00aa191526343078b530ceb0bd710fb48a5", size = 478850 }, { url = "https://files.pythonhosted.org/packages/28/c0/b31116332a547fd2677ae5b78a2ef662dfc8023d67f41b2a83f7c2aa78b1/cffi-1.17.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d63afe322132c194cf832bfec0dc69a99fb9bb6bbd550f161a49e9e855cc78ff", size = 485729 }, { url = 
"https://files.pythonhosted.org/packages/91/2b/9a1ddfa5c7f13cab007a2c9cc295b70fbbda7cb10a286aa6810338e60ea1/cffi-1.17.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f79fc4fc25f1c8698ff97788206bb3c2598949bfe0fef03d299eb1b5356ada99", size = 471256 }, { url = "https://files.pythonhosted.org/packages/b2/d5/da47df7004cb17e4955df6a43d14b3b4ae77737dff8bf7f8f333196717bf/cffi-1.17.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b62ce867176a75d03a665bad002af8e6d54644fad99a3c70905c543130e39d93", size = 479424 }, { url = "https://files.pythonhosted.org/packages/0b/ac/2a28bcf513e93a219c8a4e8e125534f4f6db03e3179ba1c45e949b76212c/cffi-1.17.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:386c8bf53c502fff58903061338ce4f4950cbdcb23e2902d86c0f722b786bbe3", size = 484568 }, { url = "https://files.pythonhosted.org/packages/d4/38/ca8a4f639065f14ae0f1d9751e70447a261f1a30fa7547a828ae08142465/cffi-1.17.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ceb10419a9adf4460ea14cfd6bc43d08701f0835e979bf821052f1805850fe8", size = 488736 }, { url = "https://files.pythonhosted.org/packages/86/c5/28b2d6f799ec0bdecf44dced2ec5ed43e0eb63097b0f58c293583b406582/cffi-1.17.1-cp312-cp312-win32.whl", hash = "sha256:a08d7e755f8ed21095a310a693525137cfe756ce62d066e53f502a83dc550f65", size = 172448 }, { url = "https://files.pythonhosted.org/packages/50/b9/db34c4755a7bd1cb2d1603ac3863f22bcecbd1ba29e5ee841a4bc510b294/cffi-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:51392eae71afec0d0c8fb1a53b204dbb3bcabcb3c9b807eedf3e1e6ccf2de903", size = 181976 }, { url = "https://files.pythonhosted.org/packages/8d/f8/dd6c246b148639254dad4d6803eb6a54e8c85c6e11ec9df2cffa87571dbe/cffi-1.17.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f3a2b4222ce6b60e2e8b337bb9596923045681d71e5a082783484d845390938e", size = 182989 }, { url = 
"https://files.pythonhosted.org/packages/8b/f1/672d303ddf17c24fc83afd712316fda78dc6fce1cd53011b839483e1ecc8/cffi-1.17.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0984a4925a435b1da406122d4d7968dd861c1385afe3b45ba82b750f229811e2", size = 178802 }, { url = "https://files.pythonhosted.org/packages/0e/2d/eab2e858a91fdff70533cab61dcff4a1f55ec60425832ddfdc9cd36bc8af/cffi-1.17.1-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d01b12eeeb4427d3110de311e1774046ad344f5b1a7403101878976ecd7a10f3", size = 454792 }, { url = "https://files.pythonhosted.org/packages/75/b2/fbaec7c4455c604e29388d55599b99ebcc250a60050610fadde58932b7ee/cffi-1.17.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:706510fe141c86a69c8ddc029c7910003a17353970cff3b904ff0686a5927683", size = 478893 }, { url = "https://files.pythonhosted.org/packages/4f/b7/6e4a2162178bf1935c336d4da8a9352cccab4d3a5d7914065490f08c0690/cffi-1.17.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:de55b766c7aa2e2a3092c51e0483d700341182f08e67c63630d5b6f200bb28e5", size = 485810 }, { url = "https://files.pythonhosted.org/packages/c7/8a/1d0e4a9c26e54746dc08c2c6c037889124d4f59dffd853a659fa545f1b40/cffi-1.17.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c59d6e989d07460165cc5ad3c61f9fd8f1b4796eacbd81cee78957842b834af4", size = 471200 }, { url = "https://files.pythonhosted.org/packages/26/9f/1aab65a6c0db35f43c4d1b4f580e8df53914310afc10ae0397d29d697af4/cffi-1.17.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd398dbc6773384a17fe0d3e7eeb8d1a21c2200473ee6806bb5e6a8e62bb73dd", size = 479447 }, { url = "https://files.pythonhosted.org/packages/5f/e4/fb8b3dd8dc0e98edf1135ff067ae070bb32ef9d509d6cb0f538cd6f7483f/cffi-1.17.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:3edc8d958eb099c634dace3c7e16560ae474aa3803a5df240542b305d14e14ed", size = 484358 }, 
{ url = "https://files.pythonhosted.org/packages/f1/47/d7145bf2dc04684935d57d67dff9d6d795b2ba2796806bb109864be3a151/cffi-1.17.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:72e72408cad3d5419375fc87d289076ee319835bdfa2caad331e377589aebba9", size = 488469 }, { url = "https://files.pythonhosted.org/packages/bf/ee/f94057fa6426481d663b88637a9a10e859e492c73d0384514a17d78ee205/cffi-1.17.1-cp313-cp313-win32.whl", hash = "sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d", size = 172475 }, { url = "https://files.pythonhosted.org/packages/7c/fc/6a8cb64e5f0324877d503c854da15d76c1e50eb722e320b15345c4d0c6de/cffi-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a", size = 182009 }, { url = "https://files.pythonhosted.org/packages/48/08/15bf6b43ae9bd06f6b00ad8a91f5a8fe1069d4c9fab550a866755402724e/cffi-1.17.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:636062ea65bd0195bc012fea9321aca499c0504409f413dc88af450b57ffd03b", size = 182457 }, { url = "https://files.pythonhosted.org/packages/c2/5b/f1523dd545f92f7df468e5f653ffa4df30ac222f3c884e51e139878f1cb5/cffi-1.17.1-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c7eac2ef9b63c79431bc4b25f1cd649d7f061a28808cbc6c47b534bd789ef964", size = 425932 }, { url = "https://files.pythonhosted.org/packages/53/93/7e547ab4105969cc8c93b38a667b82a835dd2cc78f3a7dad6130cfd41e1d/cffi-1.17.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e221cf152cff04059d011ee126477f0d9588303eb57e88923578ace7baad17f9", size = 448585 }, { url = "https://files.pythonhosted.org/packages/56/c4/a308f2c332006206bb511de219efeff090e9d63529ba0a77aae72e82248b/cffi-1.17.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:31000ec67d4221a71bd3f67df918b1f88f676f1c3b535a7eb473255fdc0b83fc", size = 456268 }, { url = 
"https://files.pythonhosted.org/packages/ca/5b/b63681518265f2f4060d2b60755c1c77ec89e5e045fc3773b72735ddaad5/cffi-1.17.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6f17be4345073b0a7b8ea599688f692ac3ef23ce28e5df79c04de519dbc4912c", size = 436592 }, { url = "https://files.pythonhosted.org/packages/bb/19/b51af9f4a4faa4a8ac5a0e5d5c2522dcd9703d07fac69da34a36c4d960d3/cffi-1.17.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0e2b1fac190ae3ebfe37b979cc1ce69c81f4e4fe5746bb401dca63a9062cdaf1", size = 446512 }, { url = "https://files.pythonhosted.org/packages/e2/63/2bed8323890cb613bbecda807688a31ed11a7fe7afe31f8faaae0206a9a3/cffi-1.17.1-cp38-cp38-win32.whl", hash = "sha256:7596d6620d3fa590f677e9ee430df2958d2d6d6de2feeae5b20e82c00b76fbf8", size = 171576 }, { url = "https://files.pythonhosted.org/packages/2f/70/80c33b044ebc79527447fd4fbc5455d514c3bb840dede4455de97da39b4d/cffi-1.17.1-cp38-cp38-win_amd64.whl", hash = "sha256:78122be759c3f8a014ce010908ae03364d00a1f81ab5c7f4a7a5120607ea56e1", size = 181229 }, { url = "https://files.pythonhosted.org/packages/b9/ea/8bb50596b8ffbc49ddd7a1ad305035daa770202a6b782fc164647c2673ad/cffi-1.17.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:b2ab587605f4ba0bf81dc0cb08a41bd1c0a5906bd59243d56bad7668a6fc6c16", size = 182220 }, { url = "https://files.pythonhosted.org/packages/ae/11/e77c8cd24f58285a82c23af484cf5b124a376b32644e445960d1a4654c3a/cffi-1.17.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:28b16024becceed8c6dfbc75629e27788d8a3f9030691a1dbf9821a128b22c36", size = 178605 }, { url = "https://files.pythonhosted.org/packages/ed/65/25a8dc32c53bf5b7b6c2686b42ae2ad58743f7ff644844af7cdb29b49361/cffi-1.17.1-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1d599671f396c4723d016dbddb72fe8e0397082b0a77a4fab8028923bec050e8", size = 424910 }, { url = 
"https://files.pythonhosted.org/packages/42/7a/9d086fab7c66bd7c4d0f27c57a1b6b068ced810afc498cc8c49e0088661c/cffi-1.17.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ca74b8dbe6e8e8263c0ffd60277de77dcee6c837a3d0881d8c1ead7268c9e576", size = 447200 },
    { url = "https://files.pythonhosted.org/packages/da/63/1785ced118ce92a993b0ec9e0d0ac8dc3e5dbfbcaa81135be56c69cabbb6/cffi-1.17.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f7f5baafcc48261359e14bcd6d9bff6d4b28d9103847c9e136694cb0501aef87", size = 454565 },
    { url = "https://files.pythonhosted.org/packages/74/06/90b8a44abf3556599cdec107f7290277ae8901a58f75e6fe8f970cd72418/cffi-1.17.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:98e3969bcff97cae1b2def8ba499ea3d6f31ddfdb7635374834cf89a1a08ecf0", size = 435635 },
    { url = "https://files.pythonhosted.org/packages/bd/62/a1f468e5708a70b1d86ead5bab5520861d9c7eacce4a885ded9faa7729c3/cffi-1.17.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cdf5ce3acdfd1661132f2a9c19cac174758dc2352bfe37d98aa7512c6b7178b3", size = 445218 },
    { url = "https://files.pythonhosted.org/packages/5b/95/b34462f3ccb09c2594aa782d90a90b045de4ff1f70148ee79c69d37a0a5a/cffi-1.17.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:9755e4345d1ec879e3849e62222a18c7174d65a6a92d5b346b1863912168b595", size = 460486 },
    { url = "https://files.pythonhosted.org/packages/fc/fc/a1e4bebd8d680febd29cf6c8a40067182b64f00c7d105f8f26b5bc54317b/cffi-1.17.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:f1e22e8c4419538cb197e4dd60acc919d7696e5ef98ee4da4e01d3f8cfa4cc5a", size = 437911 },
    { url = "https://files.pythonhosted.org/packages/e6/c3/21cab7a6154b6a5ea330ae80de386e7665254835b9e98ecc1340b3a7de9a/cffi-1.17.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:c03e868a0b3bc35839ba98e74211ed2b05d2119be4e8a0f224fba9384f1fe02e", size = 460632 },
    { url = "https://files.pythonhosted.org/packages/cb/b5/fd9f8b5a84010ca169ee49f4e4ad6f8c05f4e3545b72ee041dbbcb159882/cffi-1.17.1-cp39-cp39-win32.whl", hash = "sha256:e31ae45bc2e29f6b2abd0de1cc3b9d5205aa847cafaecb8af1476a609a2f6eb7", size = 171820 },
    { url = "https://files.pythonhosted.org/packages/8c/52/b08750ce0bce45c143e1b5d7357ee8c55341b52bdef4b0f081af1eb248c2/cffi-1.17.1-cp39-cp39-win_amd64.whl", hash = "sha256:d016c76bdd850f3c626af19b0542c9677ba156e4ee4fccfdd7848803533ef662", size = 181290 },
]

[[package]]
name = "charset-normalizer"
version = "3.4.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f2/4f/e1808dc01273379acc506d18f1504eb2d299bd4131743b9fc54d7be4df1e/charset_normalizer-3.4.0.tar.gz", hash = "sha256:223217c3d4f82c3ac5e29032b3f1c2eb0fb591b72161f86d93f5719079dae93e", size = 106620 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/69/8b/825cc84cf13a28bfbcba7c416ec22bf85a9584971be15b21dd8300c65b7f/charset_normalizer-3.4.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:4f9fc98dad6c2eaa32fc3af1417d95b5e3d08aff968df0cd320066def971f9a6", size = 196363 },
    { url = "https://files.pythonhosted.org/packages/23/81/d7eef6a99e42c77f444fdd7bc894b0ceca6c3a95c51239e74a722039521c/charset_normalizer-3.4.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0de7b687289d3c1b3e8660d0741874abe7888100efe14bd0f9fd7141bcbda92b", size = 125639 },
    { url = "https://files.pythonhosted.org/packages/21/67/b4564d81f48042f520c948abac7079356e94b30cb8ffb22e747532cf469d/charset_normalizer-3.4.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5ed2e36c3e9b4f21dd9422f6893dec0abf2cca553af509b10cd630f878d3eb99", size = 120451 },
    { url = "https://files.pythonhosted.org/packages/c2/72/12a7f0943dd71fb5b4e7b55c41327ac0a1663046a868ee4d0d8e9c369b85/charset_normalizer-3.4.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:40d3ff7fc90b98c637bda91c89d51264a3dcf210cade3a2c6f838c7268d7a4ca", size = 140041 },
    { url = "https://files.pythonhosted.org/packages/67/56/fa28c2c3e31217c4c52158537a2cf5d98a6c1e89d31faf476c89391cd16b/charset_normalizer-3.4.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1110e22af8ca26b90bd6364fe4c763329b0ebf1ee213ba32b68c73de5752323d", size = 150333 },
    { url = "https://files.pythonhosted.org/packages/f9/d2/466a9be1f32d89eb1554cf84073a5ed9262047acee1ab39cbaefc19635d2/charset_normalizer-3.4.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:86f4e8cca779080f66ff4f191a685ced73d2f72d50216f7112185dc02b90b9b7", size = 142921 },
    { url = "https://files.pythonhosted.org/packages/f8/01/344ec40cf5d85c1da3c1f57566c59e0c9b56bcc5566c08804a95a6cc8257/charset_normalizer-3.4.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7f683ddc7eedd742e2889d2bfb96d69573fde1d92fcb811979cdb7165bb9c7d3", size = 144785 },
    { url = "https://files.pythonhosted.org/packages/73/8b/2102692cb6d7e9f03b9a33a710e0164cadfce312872e3efc7cfe22ed26b4/charset_normalizer-3.4.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:27623ba66c183eca01bf9ff833875b459cad267aeeb044477fedac35e19ba907", size = 146631 },
    { url = "https://files.pythonhosted.org/packages/d8/96/cc2c1b5d994119ce9f088a9a0c3ebd489d360a2eb058e2c8049f27092847/charset_normalizer-3.4.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:f606a1881d2663630ea5b8ce2efe2111740df4b687bd78b34a8131baa007f79b", size = 140867 },
    { url = "https://files.pythonhosted.org/packages/c9/27/cde291783715b8ec30a61c810d0120411844bc4c23b50189b81188b273db/charset_normalizer-3.4.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:0b309d1747110feb25d7ed6b01afdec269c647d382c857ef4663bbe6ad95a912", size = 149273 },
    { url = "https://files.pythonhosted.org/packages/3a/a4/8633b0fc1a2d1834d5393dafecce4a1cc56727bfd82b4dc18fc92f0d3cc3/charset_normalizer-3.4.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:136815f06a3ae311fae551c3df1f998a1ebd01ddd424aa5603a4336997629e95", size = 152437 },
    { url = "https://files.pythonhosted.org/packages/64/ea/69af161062166b5975ccbb0961fd2384853190c70786f288684490913bf5/charset_normalizer-3.4.0-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:14215b71a762336254351b00ec720a8e85cada43b987da5a042e4ce3e82bd68e", size = 150087 },
    { url = "https://files.pythonhosted.org/packages/3b/fd/e60a9d9fd967f4ad5a92810138192f825d77b4fa2a557990fd575a47695b/charset_normalizer-3.4.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:79983512b108e4a164b9c8d34de3992f76d48cadc9554c9e60b43f308988aabe", size = 145142 },
    { url = "https://files.pythonhosted.org/packages/6d/02/8cb0988a1e49ac9ce2eed1e07b77ff118f2923e9ebd0ede41ba85f2dcb04/charset_normalizer-3.4.0-cp310-cp310-win32.whl", hash = "sha256:c94057af19bc953643a33581844649a7fdab902624d2eb739738a30e2b3e60fc", size = 94701 },
    { url = "https://files.pythonhosted.org/packages/d6/20/f1d4670a8a723c46be695dff449d86d6092916f9e99c53051954ee33a1bc/charset_normalizer-3.4.0-cp310-cp310-win_amd64.whl", hash = "sha256:55f56e2ebd4e3bc50442fbc0888c9d8c94e4e06a933804e2af3e89e2f9c1c749", size = 102191 },
    { url = "https://files.pythonhosted.org/packages/9c/61/73589dcc7a719582bf56aae309b6103d2762b526bffe189d635a7fcfd998/charset_normalizer-3.4.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:0d99dd8ff461990f12d6e42c7347fd9ab2532fb70e9621ba520f9e8637161d7c", size = 193339 },
    { url = "https://files.pythonhosted.org/packages/77/d5/8c982d58144de49f59571f940e329ad6e8615e1e82ef84584c5eeb5e1d72/charset_normalizer-3.4.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c57516e58fd17d03ebe67e181a4e4e2ccab1168f8c2976c6a334d4f819fe5944", size = 124366 },
    { url = "https://files.pythonhosted.org/packages/bf/19/411a64f01ee971bed3231111b69eb56f9331a769072de479eae7de52296d/charset_normalizer-3.4.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:6dba5d19c4dfab08e58d5b36304b3f92f3bd5d42c1a3fa37b5ba5cdf6dfcbcee", size = 118874 },
    { url = "https://files.pythonhosted.org/packages/4c/92/97509850f0d00e9f14a46bc751daabd0ad7765cff29cdfb66c68b6dad57f/charset_normalizer-3.4.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bf4475b82be41b07cc5e5ff94810e6a01f276e37c2d55571e3fe175e467a1a1c", size = 138243 },
    { url = "https://files.pythonhosted.org/packages/e2/29/d227805bff72ed6d6cb1ce08eec707f7cfbd9868044893617eb331f16295/charset_normalizer-3.4.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ce031db0408e487fd2775d745ce30a7cd2923667cf3b69d48d219f1d8f5ddeb6", size = 148676 },
    { url = "https://files.pythonhosted.org/packages/13/bc/87c2c9f2c144bedfa62f894c3007cd4530ba4b5351acb10dc786428a50f0/charset_normalizer-3.4.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8ff4e7cdfdb1ab5698e675ca622e72d58a6fa2a8aa58195de0c0061288e6e3ea", size = 141289 },
    { url = "https://files.pythonhosted.org/packages/eb/5b/6f10bad0f6461fa272bfbbdf5d0023b5fb9bc6217c92bf068fa5a99820f5/charset_normalizer-3.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3710a9751938947e6327ea9f3ea6332a09bf0ba0c09cae9cb1f250bd1f1549bc", size = 142585 },
    { url = "https://files.pythonhosted.org/packages/3b/a0/a68980ab8a1f45a36d9745d35049c1af57d27255eff8c907e3add84cf68f/charset_normalizer-3.4.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:82357d85de703176b5587dbe6ade8ff67f9f69a41c0733cf2425378b49954de5", size = 144408 },
    { url = "https://files.pythonhosted.org/packages/d7/a1/493919799446464ed0299c8eef3c3fad0daf1c3cd48bff9263c731b0d9e2/charset_normalizer-3.4.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:47334db71978b23ebcf3c0f9f5ee98b8d65992b65c9c4f2d34c2eaf5bcaf0594", size = 139076 },
    { url = "https://files.pythonhosted.org/packages/fb/9d/9c13753a5a6e0db4a0a6edb1cef7aee39859177b64e1a1e748a6e3ba62c2/charset_normalizer-3.4.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:8ce7fd6767a1cc5a92a639b391891bf1c268b03ec7e021c7d6d902285259685c", size = 146874 },
    { url = "https://files.pythonhosted.org/packages/75/d2/0ab54463d3410709c09266dfb416d032a08f97fd7d60e94b8c6ef54ae14b/charset_normalizer-3.4.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:f1a2f519ae173b5b6a2c9d5fa3116ce16e48b3462c8b96dfdded11055e3d6365", size = 150871 },
    { url = "https://files.pythonhosted.org/packages/8d/c9/27e41d481557be53d51e60750b85aa40eaf52b841946b3cdeff363105737/charset_normalizer-3.4.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:63bc5c4ae26e4bc6be6469943b8253c0fd4e4186c43ad46e713ea61a0ba49129", size = 148546 },
    { url = "https://files.pythonhosted.org/packages/ee/44/4f62042ca8cdc0cabf87c0fc00ae27cd8b53ab68be3605ba6d071f742ad3/charset_normalizer-3.4.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:bcb4f8ea87d03bc51ad04add8ceaf9b0f085ac045ab4d74e73bbc2dc033f0236", size = 143048 },
    { url = "https://files.pythonhosted.org/packages/01/f8/38842422988b795220eb8038745d27a675ce066e2ada79516c118f291f07/charset_normalizer-3.4.0-cp311-cp311-win32.whl", hash = "sha256:9ae4ef0b3f6b41bad6366fb0ea4fc1d7ed051528e113a60fa2a65a9abb5b1d99", size = 94389 },
    { url = "https://files.pythonhosted.org/packages/0b/6e/b13bd47fa9023b3699e94abf565b5a2f0b0be6e9ddac9812182596ee62e4/charset_normalizer-3.4.0-cp311-cp311-win_amd64.whl", hash = "sha256:cee4373f4d3ad28f1ab6290684d8e2ebdb9e7a1b74fdc39e4c211995f77bec27", size = 101752 },
    { url = "https://files.pythonhosted.org/packages/d3/0b/4b7a70987abf9b8196845806198975b6aab4ce016632f817ad758a5aa056/charset_normalizer-3.4.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0713f3adb9d03d49d365b70b84775d0a0d18e4ab08d12bc46baa6132ba78aaf6", size = 194445 },
    { url = "https://files.pythonhosted.org/packages/50/89/354cc56cf4dd2449715bc9a0f54f3aef3dc700d2d62d1fa5bbea53b13426/charset_normalizer-3.4.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:de7376c29d95d6719048c194a9cf1a1b0393fbe8488a22008610b0361d834ecf", size = 125275 },
    { url = "https://files.pythonhosted.org/packages/fa/44/b730e2a2580110ced837ac083d8ad222343c96bb6b66e9e4e706e4d0b6df/charset_normalizer-3.4.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:4a51b48f42d9358460b78725283f04bddaf44a9358197b889657deba38f329db", size = 119020 },
    { url = "https://files.pythonhosted.org/packages/9d/e4/9263b8240ed9472a2ae7ddc3e516e71ef46617fe40eaa51221ccd4ad9a27/charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b295729485b06c1a0683af02a9e42d2caa9db04a373dc38a6a58cdd1e8abddf1", size = 139128 },
    { url = "https://files.pythonhosted.org/packages/6b/e3/9f73e779315a54334240353eaea75854a9a690f3f580e4bd85d977cb2204/charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ee803480535c44e7f5ad00788526da7d85525cfefaf8acf8ab9a310000be4b03", size = 149277 },
    { url = "https://files.pythonhosted.org/packages/1a/cf/f1f50c2f295312edb8a548d3fa56a5c923b146cd3f24114d5adb7e7be558/charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3d59d125ffbd6d552765510e3f31ed75ebac2c7470c7274195b9161a32350284", size = 142174 },
    { url = "https://files.pythonhosted.org/packages/16/92/92a76dc2ff3a12e69ba94e7e05168d37d0345fa08c87e1fe24d0c2a42223/charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8cda06946eac330cbe6598f77bb54e690b4ca93f593dee1568ad22b04f347c15", size = 143838 },
    { url = "https://files.pythonhosted.org/packages/a4/01/2117ff2b1dfc61695daf2babe4a874bca328489afa85952440b59819e9d7/charset_normalizer-3.4.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:07afec21bbbbf8a5cc3651aa96b980afe2526e7f048fdfb7f1014d84acc8b6d8", size = 146149 },
    { url = "https://files.pythonhosted.org/packages/f6/9b/93a332b8d25b347f6839ca0a61b7f0287b0930216994e8bf67a75d050255/charset_normalizer-3.4.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:6b40e8d38afe634559e398cc32b1472f376a4099c75fe6299ae607e404c033b2", size = 140043 },
    { url = "https://files.pythonhosted.org/packages/ab/f6/7ac4a01adcdecbc7a7587767c776d53d369b8b971382b91211489535acf0/charset_normalizer-3.4.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b8dcd239c743aa2f9c22ce674a145e0a25cb1566c495928440a181ca1ccf6719", size = 148229 },
    { url = "https://files.pythonhosted.org/packages/9d/be/5708ad18161dee7dc6a0f7e6cf3a88ea6279c3e8484844c0590e50e803ef/charset_normalizer-3.4.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:84450ba661fb96e9fd67629b93d2941c871ca86fc38d835d19d4225ff946a631", size = 151556 },
    { url = "https://files.pythonhosted.org/packages/5a/bb/3d8bc22bacb9eb89785e83e6723f9888265f3a0de3b9ce724d66bd49884e/charset_normalizer-3.4.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:44aeb140295a2f0659e113b31cfe92c9061622cadbc9e2a2f7b8ef6b1e29ef4b", size = 149772 },
    { url = "https://files.pythonhosted.org/packages/f7/fa/d3fc622de05a86f30beea5fc4e9ac46aead4731e73fd9055496732bcc0a4/charset_normalizer-3.4.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:1db4e7fefefd0f548d73e2e2e041f9df5c59e178b4c72fbac4cc6f535cfb1565", size = 144800 },
    { url = "https://files.pythonhosted.org/packages/9a/65/bdb9bc496d7d190d725e96816e20e2ae3a6fa42a5cac99c3c3d6ff884118/charset_normalizer-3.4.0-cp312-cp312-win32.whl", hash = "sha256:5726cf76c982532c1863fb64d8c6dd0e4c90b6ece9feb06c9f202417a31f7dd7", size = 94836 },
    { url = "https://files.pythonhosted.org/packages/3e/67/7b72b69d25b89c0b3cea583ee372c43aa24df15f0e0f8d3982c57804984b/charset_normalizer-3.4.0-cp312-cp312-win_amd64.whl", hash = "sha256:b197e7094f232959f8f20541ead1d9862ac5ebea1d58e9849c1bf979255dfac9", size = 102187 },
    { url = "https://files.pythonhosted.org/packages/f3/89/68a4c86f1a0002810a27f12e9a7b22feb198c59b2f05231349fbce5c06f4/charset_normalizer-3.4.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:dd4eda173a9fcccb5f2e2bd2a9f423d180194b1bf17cf59e3269899235b2a114", size = 194617 },
    { url = "https://files.pythonhosted.org/packages/4f/cd/8947fe425e2ab0aa57aceb7807af13a0e4162cd21eee42ef5b053447edf5/charset_normalizer-3.4.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e9e3c4c9e1ed40ea53acf11e2a386383c3304212c965773704e4603d589343ed", size = 125310 },
    { url = "https://files.pythonhosted.org/packages/5b/f0/b5263e8668a4ee9becc2b451ed909e9c27058337fda5b8c49588183c267a/charset_normalizer-3.4.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:92a7e36b000bf022ef3dbb9c46bfe2d52c047d5e3f3343f43204263c5addc250", size = 119126 },
    { url = "https://files.pythonhosted.org/packages/ff/6e/e445afe4f7fda27a533f3234b627b3e515a1b9429bc981c9a5e2aa5d97b6/charset_normalizer-3.4.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:54b6a92d009cbe2fb11054ba694bc9e284dad30a26757b1e372a1fdddaf21920", size = 139342 },
    { url = "https://files.pythonhosted.org/packages/a1/b2/4af9993b532d93270538ad4926c8e37dc29f2111c36f9c629840c57cd9b3/charset_normalizer-3.4.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ffd9493de4c922f2a38c2bf62b831dcec90ac673ed1ca182fe11b4d8e9f2a64", size = 149383 },
    { url = "https://files.pythonhosted.org/packages/fb/6f/4e78c3b97686b871db9be6f31d64e9264e889f8c9d7ab33c771f847f79b7/charset_normalizer-3.4.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:35c404d74c2926d0287fbd63ed5d27eb911eb9e4a3bb2c6d294f3cfd4a9e0c23", size = 142214 },
    { url = "https://files.pythonhosted.org/packages/2b/c9/1c8fe3ce05d30c87eff498592c89015b19fade13df42850aafae09e94f35/charset_normalizer-3.4.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4796efc4faf6b53a18e3d46343535caed491776a22af773f366534056c4e1fbc", size = 144104 },
    { url = "https://files.pythonhosted.org/packages/ee/68/efad5dcb306bf37db7db338338e7bb8ebd8cf38ee5bbd5ceaaaa46f257e6/charset_normalizer-3.4.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e7fdd52961feb4c96507aa649550ec2a0d527c086d284749b2f582f2d40a2e0d", size = 146255 },
    { url = "https://files.pythonhosted.org/packages/0c/75/1ed813c3ffd200b1f3e71121c95da3f79e6d2a96120163443b3ad1057505/charset_normalizer-3.4.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:92db3c28b5b2a273346bebb24857fda45601aef6ae1c011c0a997106581e8a88", size = 140251 },
    { url = "https://files.pythonhosted.org/packages/7d/0d/6f32255c1979653b448d3c709583557a4d24ff97ac4f3a5be156b2e6a210/charset_normalizer-3.4.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ab973df98fc99ab39080bfb0eb3a925181454d7c3ac8a1e695fddfae696d9e90", size = 148474 },
    { url = "https://files.pythonhosted.org/packages/ac/a0/c1b5298de4670d997101fef95b97ac440e8c8d8b4efa5a4d1ef44af82f0d/charset_normalizer-3.4.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:4b67fdab07fdd3c10bb21edab3cbfe8cf5696f453afce75d815d9d7223fbe88b", size = 151849 },
    { url = "https://files.pythonhosted.org/packages/04/4f/b3961ba0c664989ba63e30595a3ed0875d6790ff26671e2aae2fdc28a399/charset_normalizer-3.4.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:aa41e526a5d4a9dfcfbab0716c7e8a1b215abd3f3df5a45cf18a12721d31cb5d", size = 149781 },
    { url = "https://files.pythonhosted.org/packages/d8/90/6af4cd042066a4adad58ae25648a12c09c879efa4849c705719ba1b23d8c/charset_normalizer-3.4.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ffc519621dce0c767e96b9c53f09c5d215578e10b02c285809f76509a3931482", size = 144970 },
    { url = "https://files.pythonhosted.org/packages/cc/67/e5e7e0cbfefc4ca79025238b43cdf8a2037854195b37d6417f3d0895c4c2/charset_normalizer-3.4.0-cp313-cp313-win32.whl", hash = "sha256:f19c1585933c82098c2a520f8ec1227f20e339e33aca8fa6f956f6691b784e67", size = 94973 },
    { url = "https://files.pythonhosted.org/packages/65/97/fc9bbc54ee13d33dc54a7fcf17b26368b18505500fc01e228c27b5222d80/charset_normalizer-3.4.0-cp313-cp313-win_amd64.whl", hash = "sha256:707b82d19e65c9bd28b81dde95249b07bf9f5b90ebe1ef17d9b57473f8a64b7b", size = 102308 },
    { url = "https://files.pythonhosted.org/packages/86/f4/ccab93e631e7293cca82f9f7ba39783c967f823a0000df2d8dd743cad74f/charset_normalizer-3.4.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:af73657b7a68211996527dbfeffbb0864e043d270580c5aef06dc4b659a4b578", size = 193961 },
    { url = "https://files.pythonhosted.org/packages/94/d4/2b21cb277bac9605026d2d91a4a8872bc82199ed11072d035dc674c27223/charset_normalizer-3.4.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:cab5d0b79d987c67f3b9e9c53f54a61360422a5a0bc075f43cab5621d530c3b6", size = 124507 },
    { url = "https://files.pythonhosted.org/packages/9a/e0/a7c1fcdff20d9c667342e0391cfeb33ab01468d7d276b2c7914b371667cc/charset_normalizer-3.4.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:9289fd5dddcf57bab41d044f1756550f9e7cf0c8e373b8cdf0ce8773dc4bd417", size = 119298 },
    { url = "https://files.pythonhosted.org/packages/70/de/1538bb2f84ac9940f7fa39945a5dd1d22b295a89c98240b262fc4b9fcfe0/charset_normalizer-3.4.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6b493a043635eb376e50eedf7818f2f322eabbaa974e948bd8bdd29eb7ef2a51", size = 139328 },
    { url = "https://files.pythonhosted.org/packages/e9/ca/288bb1a6bc2b74fb3990bdc515012b47c4bc5925c8304fc915d03f94b027/charset_normalizer-3.4.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9fa2566ca27d67c86569e8c85297aaf413ffab85a8960500f12ea34ff98e4c41", size = 149368 },
    { url = "https://files.pythonhosted.org/packages/aa/75/58374fdaaf8406f373e508dab3486a31091f760f99f832d3951ee93313e8/charset_normalizer-3.4.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a8e538f46104c815be19c975572d74afb53f29650ea2025bbfaef359d2de2f7f", size = 141944 },
    { url = "https://files.pythonhosted.org/packages/32/c8/0bc558f7260db6ffca991ed7166494a7da4fda5983ee0b0bfc8ed2ac6ff9/charset_normalizer-3.4.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6fd30dc99682dc2c603c2b315bded2799019cea829f8bf57dc6b61efde6611c8", size = 143326 },
    { url = "https://files.pythonhosted.org/packages/0e/dd/7f6fec09a1686446cee713f38cf7d5e0669e0bcc8288c8e2924e998cf87d/charset_normalizer-3.4.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2006769bd1640bdf4d5641c69a3d63b71b81445473cac5ded39740a226fa88ab", size = 146171 },
    { url = "https://files.pythonhosted.org/packages/4c/a8/440f1926d6d8740c34d3ca388fbd718191ec97d3d457a0677eb3aa718fce/charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:dc15e99b2d8a656f8e666854404f1ba54765871104e50c8e9813af8a7db07f12", size = 139711 },
    { url = "https://files.pythonhosted.org/packages/e9/7f/4b71e350a3377ddd70b980bea1e2cc0983faf45ba43032b24b2578c14314/charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:ab2e5bef076f5a235c3774b4f4028a680432cded7cad37bba0fd90d64b187d19", size = 148348 },
    { url = "https://files.pythonhosted.org/packages/1e/70/17b1b9202531a33ed7ef41885f0d2575ae42a1e330c67fddda5d99ad1208/charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:4ec9dd88a5b71abfc74e9df5ebe7921c35cbb3b641181a531ca65cdb5e8e4dea", size = 151290 },
    { url = "https://files.pythonhosted.org/packages/44/30/574b5b5933d77ecb015550aafe1c7d14a8cd41e7e6c4dcea5ae9e8d496c3/charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:43193c5cda5d612f247172016c4bb71251c784d7a4d9314677186a838ad34858", size = 149114 },
    { url = "https://files.pythonhosted.org/packages/0b/11/ca7786f7e13708687443082af20d8341c02e01024275a28bc75032c5ce5d/charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:aa693779a8b50cd97570e5a0f343538a8dbd3e496fa5dcb87e29406ad0299654", size = 143856 },
    { url = "https://files.pythonhosted.org/packages/f9/c2/1727c1438256c71ed32753b23ec2e6fe7b6dff66a598f6566cfe8139305e/charset_normalizer-3.4.0-cp38-cp38-win32.whl", hash = "sha256:7706f5850360ac01d80c89bcef1640683cc12ed87f42579dab6c5d3ed6888613", size = 94333 },
    { url = "https://files.pythonhosted.org/packages/09/c8/0e17270496a05839f8b500c1166e3261d1226e39b698a735805ec206967b/charset_normalizer-3.4.0-cp38-cp38-win_amd64.whl", hash = "sha256:c3e446d253bd88f6377260d07c895816ebf33ffffd56c1c792b13bff9c3e1ade", size = 101454 },
    { url = "https://files.pythonhosted.org/packages/54/2f/28659eee7f5d003e0f5a3b572765bf76d6e0fe6601ab1f1b1dd4cba7e4f1/charset_normalizer-3.4.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:980b4f289d1d90ca5efcf07958d3eb38ed9c0b7676bf2831a54d4f66f9c27dfa", size = 196326 },
    { url = "https://files.pythonhosted.org/packages/d1/18/92869d5c0057baa973a3ee2af71573be7b084b3c3d428fe6463ce71167f8/charset_normalizer-3.4.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:f28f891ccd15c514a0981f3b9db9aa23d62fe1a99997512b0491d2ed323d229a", size = 125614 },
    { url = "https://files.pythonhosted.org/packages/d6/27/327904c5a54a7796bb9f36810ec4173d2df5d88b401d2b95ef53111d214e/charset_normalizer-3.4.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a8aacce6e2e1edcb6ac625fb0f8c3a9570ccc7bfba1f63419b3769ccf6a00ed0", size = 120450 },
    { url = "https://files.pythonhosted.org/packages/a4/23/65af317914a0308495133b2d654cf67b11bbd6ca16637c4e8a38f80a5a69/charset_normalizer-3.4.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bd7af3717683bea4c87acd8c0d3d5b44d56120b26fd3f8a692bdd2d5260c620a", size = 140135 },
    { url = "https://files.pythonhosted.org/packages/f2/41/6190102ad521a8aa888519bb014a74251ac4586cde9b38e790901684f9ab/charset_normalizer-3.4.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5ff2ed8194587faf56555927b3aa10e6fb69d931e33953943bc4f837dfee2242", size = 150413 },
    { url = "https://files.pythonhosted.org/packages/7b/ab/f47b0159a69eab9bd915591106859f49670c75f9a19082505ff16f50efc0/charset_normalizer-3.4.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e91f541a85298cf35433bf66f3fab2a4a2cff05c127eeca4af174f6d497f0d4b", size = 142992 },
    { url = "https://files.pythonhosted.org/packages/28/89/60f51ad71f63aaaa7e51a2a2ad37919985a341a1d267070f212cdf6c2d22/charset_normalizer-3.4.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:309a7de0a0ff3040acaebb35ec45d18db4b28232f21998851cfa709eeff49d62", size = 144871 },
    { url = "https://files.pythonhosted.org/packages/0c/48/0050550275fea585a6e24460b42465020b53375017d8596c96be57bfabca/charset_normalizer-3.4.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:285e96d9d53422efc0d7a17c60e59f37fbf3dfa942073f666db4ac71e8d726d0", size = 146756 },
    { url = "https://files.pythonhosted.org/packages/dc/b5/47f8ee91455946f745e6c9ddbb0f8f50314d2416dd922b213e7d5551ad09/charset_normalizer-3.4.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:5d447056e2ca60382d460a604b6302d8db69476fd2015c81e7c35417cfabe4cd", size = 141034 },
    { url = "https://files.pythonhosted.org/packages/84/79/5c731059ebab43e80bf61fa51666b9b18167974b82004f18c76378ed31a3/charset_normalizer-3.4.0-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:20587d20f557fe189b7947d8e7ec5afa110ccf72a3128d61a2a387c3313f46be", size = 149434 },
    { url = "https://files.pythonhosted.org/packages/ca/f3/0719cd09fc4dc42066f239cb3c48ced17fc3316afca3e2a30a4756fe49ab/charset_normalizer-3.4.0-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:130272c698667a982a5d0e626851ceff662565379baf0ff2cc58067b81d4f11d", size = 152443 },
    { url = "https://files.pythonhosted.org/packages/f7/0e/c6357297f1157c8e8227ff337e93fd0a90e498e3d6ab96b2782204ecae48/charset_normalizer-3.4.0-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:ab22fbd9765e6954bc0bcff24c25ff71dcbfdb185fcdaca49e81bac68fe724d3", size = 150294 },
    { url = "https://files.pythonhosted.org/packages/54/9a/acfa96dc4ea8c928040b15822b59d0863d6e1757fba8bd7de3dc4f761c13/charset_normalizer-3.4.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:7782afc9b6b42200f7362858f9e73b1f8316afb276d316336c0ec3bd73312742", size = 145314 },
    { url = "https://files.pythonhosted.org/packages/73/1c/b10a63032eaebb8d7bcb8544f12f063f41f5f463778ac61da15d9985e8b6/charset_normalizer-3.4.0-cp39-cp39-win32.whl", hash = "sha256:2de62e8801ddfff069cd5c504ce3bc9672b23266597d4e4f50eda28846c322f2", size = 94724 },
    { url = "https://files.pythonhosted.org/packages/c5/77/3a78bf28bfaa0863f9cfef278dbeadf55efe064eafff8c7c424ae3c4c1bf/charset_normalizer-3.4.0-cp39-cp39-win_amd64.whl", hash = "sha256:95c3c157765b031331dd4db3c775e58deaee050a3042fcad72cbc4189d7c8dca", size = 102159 },
    { url = "https://files.pythonhosted.org/packages/bf/9b/08c0432272d77b04803958a4598a51e2a4b51c06640af8b8f0f908c18bf2/charset_normalizer-3.4.0-py3-none-any.whl", hash = "sha256:fe9f97feb71aa9896b81973a7bbada8c49501dc73e58a10fcef6663af95e5079", size = 49446 },
]

[[package]]
name = "click"
version = "8.1.7"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "colorama", marker = "platform_system == 'Windows'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/96/d3/f04c7bfcf5c1862a2a5b845c6b2b360488cf47af55dfa79c98f6a6bf98b5/click-8.1.7.tar.gz", hash = "sha256:ca9853ad459e787e2192211578cc907e7594e294c7ccc834310722b41b9ca6de", size = 336121 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/00/2e/d53fa4befbf2cfa713304affc7ca780ce4fc1fd8710527771b58311a3229/click-8.1.7-py3-none-any.whl", hash = "sha256:ae74fb96c20a0277a1d615f1e4d73c8414f5a98db8b799a7931d1582f3390c28", size = 97941 },
]

[[package]]
name = "cloudpickle"
version = "3.1.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/97/c7/f746cadd08c4c08129215cf1b984b632f9e579fc781301e63da9e85c76c1/cloudpickle-3.1.0.tar.gz", hash = "sha256:81a929b6e3c7335c863c771d673d105f02efdb89dfaba0c90495d1c64796601b", size = 66155 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/48/41/e1d85ca3cab0b674e277c8c4f678cf66a91cd2cecf93df94353a606fe0db/cloudpickle-3.1.0-py3-none-any.whl", hash = "sha256:fe11acda67f61aaaec473e3afe030feb131d78a43461b718185363384f1ba12e", size = 22021 },
]

[[package]]
name = "colorama"
version = "0.4.6"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335 },
]

[[package]]
name = "coverage"
version = "7.6.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f7/08/7e37f82e4d1aead42a7443ff06a1e406aabf7302c4f00a546e4b320b994c/coverage-7.6.1.tar.gz", hash = "sha256:953510dfb7b12ab69d20135a0662397f077c59b1e6379a768e97c59d852ee51d", size = 798791 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/7e/61/eb7ce5ed62bacf21beca4937a90fe32545c91a3c8a42a30c6616d48fc70d/coverage-7.6.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:b06079abebbc0e89e6163b8e8f0e16270124c154dc6e4a47b413dd538859af16", size = 206690 },
    { url = "https://files.pythonhosted.org/packages/7d/73/041928e434442bd3afde5584bdc3f932fb4562b1597629f537387cec6f3d/coverage-7.6.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:cf4b19715bccd7ee27b6b120e7e9dd56037b9c0681dcc1adc9ba9db3d417fa36", size = 207127 },
    { url = "https://files.pythonhosted.org/packages/c7/c8/6ca52b5147828e45ad0242388477fdb90df2c6cbb9a441701a12b3c71bc8/coverage-7.6.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e61c0abb4c85b095a784ef23fdd4aede7a2628478e7baba7c5e3deba61070a02", size = 235654 },
    { url = "https://files.pythonhosted.org/packages/d5/da/9ac2b62557f4340270942011d6efeab9833648380109e897d48ab7c1035d/coverage-7.6.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fd21f6ae3f08b41004dfb433fa895d858f3f5979e7762d052b12aef444e29afc", size = 233598 },
    { url = "https://files.pythonhosted.org/packages/53/23/9e2c114d0178abc42b6d8d5281f651a8e6519abfa0ef460a00a91f80879d/coverage-7.6.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f59d57baca39b32db42b83b2a7ba6f47ad9c394ec2076b084c3f029b7afca23", size = 234732 },
    { url = "https://files.pythonhosted.org/packages/0f/7e/a0230756fb133343a52716e8b855045f13342b70e48e8ad41d8a0d60ab98/coverage-7.6.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:a1ac0ae2b8bd743b88ed0502544847c3053d7171a3cff9228af618a068ed9c34", size = 233816 },
    { url = "https://files.pythonhosted.org/packages/28/7c/3753c8b40d232b1e5eeaed798c875537cf3cb183fb5041017c1fdb7ec14e/coverage-7.6.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e6a08c0be454c3b3beb105c0596ebdc2371fab6bb90c0c0297f4e58fd7e1012c", size = 232325 },
    { url = "https://files.pythonhosted.org/packages/57/e3/818a2b2af5b7573b4b82cf3e9f137ab158c90ea750a8f053716a32f20f06/coverage-7.6.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:f5796e664fe802da4f57a168c85359a8fbf3eab5e55cd4e4569fbacecc903959", size = 233418 },
    { url = "https://files.pythonhosted.org/packages/c8/fb/4532b0b0cefb3f06d201648715e03b0feb822907edab3935112b61b885e2/coverage-7.6.1-cp310-cp310-win32.whl", hash = "sha256:7bb65125fcbef8d989fa1dd0e8a060999497629ca5b0efbca209588a73356232", size = 209343 },
    { url = "https://files.pythonhosted.org/packages/5a/25/af337cc7421eca1c187cc9c315f0a755d48e755d2853715bfe8c418a45fa/coverage-7.6.1-cp310-cp310-win_amd64.whl", hash = "sha256:3115a95daa9bdba70aea750db7b96b37259a81a709223c8448fa97727d546fe0", size = 210136 },
    { url = "https://files.pythonhosted.org/packages/ad/5f/67af7d60d7e8ce61a4e2ddcd1bd5fb787180c8d0ae0fbd073f903b3dd95d/coverage-7.6.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:7dea0889685db8550f839fa202744652e87c60015029ce3f60e006f8c4462c93", size = 206796 },
    { url = "https://files.pythonhosted.org/packages/e1/0e/e52332389e057daa2e03be1fbfef25bb4d626b37d12ed42ae6281d0a274c/coverage-7.6.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:ed37bd3c3b063412f7620464a9ac1314d33100329f39799255fb8d3027da50d3", size = 207244 },
    { url = "https://files.pythonhosted.org/packages/aa/cd/766b45fb6e090f20f8927d9c7cb34237d41c73a939358bc881883fd3a40d/coverage-7.6.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d85f5e9a5f8b73e2350097c3756ef7e785f55bd71205defa0bfdaf96c31616ff", size = 239279 },
    { url = "https://files.pythonhosted.org/packages/70/6c/a9ccd6fe50ddaf13442a1e2dd519ca805cbe0f1fcd377fba6d8339b98ccb/coverage-7.6.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9bc572be474cafb617672c43fe989d6e48d3c83af02ce8de73fff1c6bb3c198d", size = 236859 },
    { url = "https://files.pythonhosted.org/packages/14/6f/8351b465febb4dbc1ca9929505202db909c5a635c6fdf33e089bbc3d7d85/coverage-7.6.1-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0c0420b573964c760df9e9e86d1a9a622d0d27f417e1a949a8a66dd7bcee7bc6", size = 238549 },
    { url = "https://files.pythonhosted.org/packages/68/3c/289b81fa18ad72138e6d78c4c11a82b5378a312c0e467e2f6b495c260907/coverage-7.6.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:1f4aa8219db826ce6be7099d559f8ec311549bfc4046f7f9fe9b5cea5c581c56", size = 237477 },
    { url = "https://files.pythonhosted.org/packages/ed/1c/aa1efa6459d822bd72c4abc0b9418cf268de3f60eeccd65dc4988553bd8d/coverage-7.6.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:fc5a77d0c516700ebad189b587de289a20a78324bc54baee03dd486f0855d234", size = 236134 },
    { url = "https://files.pythonhosted.org/packages/fb/c8/521c698f2d2796565fe9c789c2ee1ccdae610b3aa20b9b2ef980cc253640/coverage-7.6.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:b48f312cca9621272ae49008c7f613337c53fadca647d6384cc129d2996d1133", size = 236910 },
    { url = "https://files.pythonhosted.org/packages/7d/30/033e663399ff17dca90d793ee8a2ea2890e7fdf085da58d82468b4220bf7/coverage-7.6.1-cp311-cp311-win32.whl", hash = "sha256:1125ca0e5fd475cbbba3bb67ae20bd2c23a98fac4e32412883f9bcbaa81c314c", size = 209348 },
    { url = "https://files.pythonhosted.org/packages/20/05/0d1ccbb52727ccdadaa3ff37e4d2dc1cd4d47f0c3df9eb58d9ec8508ca88/coverage-7.6.1-cp311-cp311-win_amd64.whl", hash = "sha256:8ae539519c4c040c5ffd0632784e21b2f03fc1340752af711f33e5be83a9d6c6", size = 210230 },
    { url = "https://files.pythonhosted.org/packages/7e/d4/300fc921dff243cd518c7db3a4c614b7e4b2431b0d1145c1e274fd99bd70/coverage-7.6.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:95cae0efeb032af8458fc27d191f85d1717b1d4e49f7cb226cf526ff28179778", size = 206983 },
    { url = "https://files.pythonhosted.org/packages/e1/ab/6bf00de5327ecb8db205f9ae596885417a31535eeda6e7b99463108782e1/coverage-7.6.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:5621a9175cf9d0b0c84c2ef2b12e9f5f5071357c4d2ea6ca1cf01814f45d2391", size = 207221 },
    { url = "https://files.pythonhosted.org/packages/92/8f/2ead05e735022d1a7f3a0a683ac7f737de14850395a826192f0288703472/coverage-7.6.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:260933720fdcd75340e7dbe9060655aff3af1f0c5d20f46b57f262ab6c86a5e8", size = 240342 },
    { url = "https://files.pythonhosted.org/packages/0f/ef/94043e478201ffa85b8ae2d2c79b4081e5a1b73438aafafccf3e9bafb6b5/coverage-7.6.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:07e2ca0ad381b91350c0ed49d52699b625aab2b44b65e1b4e02fa9df0e92ad2d", size = 237371 },
    { url = "https://files.pythonhosted.org/packages/1f/0f/c890339dd605f3ebc269543247bdd43b703cce6825b5ed42ff5f2d6122c7/coverage-7.6.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c44fee9975f04b33331cb8eb272827111efc8930cfd582e0320613263ca849ca", size = 239455 },
    { url = "https://files.pythonhosted.org/packages/d1/04/7fd7b39ec7372a04efb0f70c70e35857a99b6a9188b5205efb4c77d6a57a/coverage-7.6.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:877abb17e6339d96bf08e7a622d05095e72b71f8afd8a9fefc82cf30ed944163", size = 238924 },
    { url = "https://files.pythonhosted.org/packages/ed/bf/73ce346a9d32a09cf369f14d2a06651329c984e106f5992c89579d25b27e/coverage-7.6.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:3e0cadcf6733c09154b461f1ca72d5416635e5e4ec4e536192180d34ec160f8a", size = 237252 },
    { url = "https://files.pythonhosted.org/packages/86/74/1dc7a20969725e917b1e07fe71a955eb34bc606b938316bcc799f228374b/coverage-7.6.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:c3c02d12f837d9683e5ab2f3d9844dc57655b92c74e286c262e0fc54213c216d", size = 
238897 }, { url = "https://files.pythonhosted.org/packages/b6/e9/d9cc3deceb361c491b81005c668578b0dfa51eed02cd081620e9a62f24ec/coverage-7.6.1-cp312-cp312-win32.whl", hash = "sha256:e05882b70b87a18d937ca6768ff33cc3f72847cbc4de4491c8e73880766718e5", size = 209606 }, { url = "https://files.pythonhosted.org/packages/47/c8/5a2e41922ea6740f77d555c4d47544acd7dc3f251fe14199c09c0f5958d3/coverage-7.6.1-cp312-cp312-win_amd64.whl", hash = "sha256:b5d7b556859dd85f3a541db6a4e0167b86e7273e1cdc973e5b175166bb634fdb", size = 210373 }, { url = "https://files.pythonhosted.org/packages/8c/f9/9aa4dfb751cb01c949c990d136a0f92027fbcc5781c6e921df1cb1563f20/coverage-7.6.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:a4acd025ecc06185ba2b801f2de85546e0b8ac787cf9d3b06e7e2a69f925b106", size = 207007 }, { url = "https://files.pythonhosted.org/packages/b9/67/e1413d5a8591622a46dd04ff80873b04c849268831ed5c304c16433e7e30/coverage-7.6.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a6d3adcf24b624a7b778533480e32434a39ad8fa30c315208f6d3e5542aeb6e9", size = 207269 }, { url = "https://files.pythonhosted.org/packages/14/5b/9dec847b305e44a5634d0fb8498d135ab1d88330482b74065fcec0622224/coverage-7.6.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d0c212c49b6c10e6951362f7c6df3329f04c2b1c28499563d4035d964ab8e08c", size = 239886 }, { url = "https://files.pythonhosted.org/packages/7b/b7/35760a67c168e29f454928f51f970342d23cf75a2bb0323e0f07334c85f3/coverage-7.6.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6e81d7a3e58882450ec4186ca59a3f20a5d4440f25b1cff6f0902ad890e6748a", size = 237037 }, { url = "https://files.pythonhosted.org/packages/f7/95/d2fd31f1d638df806cae59d7daea5abf2b15b5234016a5ebb502c2f3f7ee/coverage-7.6.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:78b260de9790fd81e69401c2dc8b17da47c8038176a79092a89cb2b7d945d060", size = 239038 
}, { url = "https://files.pythonhosted.org/packages/6e/bd/110689ff5752b67924efd5e2aedf5190cbbe245fc81b8dec1abaffba619d/coverage-7.6.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a78d169acd38300060b28d600344a803628c3fd585c912cacc9ea8790fe96862", size = 238690 }, { url = "https://files.pythonhosted.org/packages/d3/a8/08d7b38e6ff8df52331c83130d0ab92d9c9a8b5462f9e99c9f051a4ae206/coverage-7.6.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:2c09f4ce52cb99dd7505cd0fc8e0e37c77b87f46bc9c1eb03fe3bc9991085388", size = 236765 }, { url = "https://files.pythonhosted.org/packages/d6/6a/9cf96839d3147d55ae713eb2d877f4d777e7dc5ba2bce227167d0118dfe8/coverage-7.6.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6878ef48d4227aace338d88c48738a4258213cd7b74fd9a3d4d7582bb1d8a155", size = 238611 }, { url = "https://files.pythonhosted.org/packages/74/e4/7ff20d6a0b59eeaab40b3140a71e38cf52547ba21dbcf1d79c5a32bba61b/coverage-7.6.1-cp313-cp313-win32.whl", hash = "sha256:44df346d5215a8c0e360307d46ffaabe0f5d3502c8a1cefd700b34baf31d411a", size = 209671 }, { url = "https://files.pythonhosted.org/packages/35/59/1812f08a85b57c9fdb6d0b383d779e47b6f643bc278ed682859512517e83/coverage-7.6.1-cp313-cp313-win_amd64.whl", hash = "sha256:8284cf8c0dd272a247bc154eb6c95548722dce90d098c17a883ed36e67cdb129", size = 210368 }, { url = "https://files.pythonhosted.org/packages/9c/15/08913be1c59d7562a3e39fce20661a98c0a3f59d5754312899acc6cb8a2d/coverage-7.6.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:d3296782ca4eab572a1a4eca686d8bfb00226300dcefdf43faa25b5242ab8a3e", size = 207758 }, { url = "https://files.pythonhosted.org/packages/c4/ae/b5d58dff26cade02ada6ca612a76447acd69dccdbb3a478e9e088eb3d4b9/coverage-7.6.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:502753043567491d3ff6d08629270127e0c31d4184c4c8d98f92c26f65019962", size = 208035 }, { url = 
"https://files.pythonhosted.org/packages/b8/d7/62095e355ec0613b08dfb19206ce3033a0eedb6f4a67af5ed267a8800642/coverage-7.6.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6a89ecca80709d4076b95f89f308544ec8f7b4727e8a547913a35f16717856cb", size = 250839 }, { url = "https://files.pythonhosted.org/packages/7c/1e/c2967cb7991b112ba3766df0d9c21de46b476d103e32bb401b1b2adf3380/coverage-7.6.1-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a318d68e92e80af8b00fa99609796fdbcdfef3629c77c6283566c6f02c6d6704", size = 246569 }, { url = "https://files.pythonhosted.org/packages/8b/61/a7a6a55dd266007ed3b1df7a3386a0d760d014542d72f7c2c6938483b7bd/coverage-7.6.1-cp313-cp313t-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:13b0a73a0896988f053e4fbb7de6d93388e6dd292b0d87ee51d106f2c11b465b", size = 248927 }, { url = "https://files.pythonhosted.org/packages/c8/fa/13a6f56d72b429f56ef612eb3bc5ce1b75b7ee12864b3bd12526ab794847/coverage-7.6.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:4421712dbfc5562150f7554f13dde997a2e932a6b5f352edcce948a815efee6f", size = 248401 }, { url = "https://files.pythonhosted.org/packages/75/06/0429c652aa0fb761fc60e8c6b291338c9173c6aa0f4e40e1902345b42830/coverage-7.6.1-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:166811d20dfea725e2e4baa71fffd6c968a958577848d2131f39b60043400223", size = 246301 }, { url = "https://files.pythonhosted.org/packages/52/76/1766bb8b803a88f93c3a2d07e30ffa359467810e5cbc68e375ebe6906efb/coverage-7.6.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:225667980479a17db1048cb2bf8bfb39b8e5be8f164b8f6628b64f78a72cf9d3", size = 247598 }, { url = "https://files.pythonhosted.org/packages/66/8b/f54f8db2ae17188be9566e8166ac6df105c1c611e25da755738025708d54/coverage-7.6.1-cp313-cp313t-win32.whl", hash = "sha256:170d444ab405852903b7d04ea9ae9b98f98ab6d7e63e1115e82620807519797f", size = 
210307 }, { url = "https://files.pythonhosted.org/packages/9f/b0/e0dca6da9170aefc07515cce067b97178cefafb512d00a87a1c717d2efd5/coverage-7.6.1-cp313-cp313t-win_amd64.whl", hash = "sha256:b9f222de8cded79c49bf184bdbc06630d4c58eec9459b939b4a690c82ed05657", size = 211453 }, { url = "https://files.pythonhosted.org/packages/81/d0/d9e3d554e38beea5a2e22178ddb16587dbcbe9a1ef3211f55733924bf7fa/coverage-7.6.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6db04803b6c7291985a761004e9060b2bca08da6d04f26a7f2294b8623a0c1a0", size = 206674 }, { url = "https://files.pythonhosted.org/packages/38/ea/cab2dc248d9f45b2b7f9f1f596a4d75a435cb364437c61b51d2eb33ceb0e/coverage-7.6.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:f1adfc8ac319e1a348af294106bc6a8458a0f1633cc62a1446aebc30c5fa186a", size = 207101 }, { url = "https://files.pythonhosted.org/packages/ca/6f/f82f9a500c7c5722368978a5390c418d2a4d083ef955309a8748ecaa8920/coverage-7.6.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a95324a9de9650a729239daea117df21f4b9868ce32e63f8b650ebe6cef5595b", size = 236554 }, { url = "https://files.pythonhosted.org/packages/a6/94/d3055aa33d4e7e733d8fa309d9adf147b4b06a82c1346366fc15a2b1d5fa/coverage-7.6.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b43c03669dc4618ec25270b06ecd3ee4fa94c7f9b3c14bae6571ca00ef98b0d3", size = 234440 }, { url = "https://files.pythonhosted.org/packages/e4/6e/885bcd787d9dd674de4a7d8ec83faf729534c63d05d51d45d4fa168f7102/coverage-7.6.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8929543a7192c13d177b770008bc4e8119f2e1f881d563fc6b6305d2d0ebe9de", size = 235889 }, { url = "https://files.pythonhosted.org/packages/f4/63/df50120a7744492710854860783d6819ff23e482dee15462c9a833cc428a/coverage-7.6.1-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:a09ece4a69cf399510c8ab25e0950d9cf2b42f7b3cb0374f95d2e2ff594478a6", size = 
235142 }, { url = "https://files.pythonhosted.org/packages/3a/5d/9d0acfcded2b3e9ce1c7923ca52ccc00c78a74e112fc2aee661125b7843b/coverage-7.6.1-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:9054a0754de38d9dbd01a46621636689124d666bad1936d76c0341f7d71bf569", size = 233805 }, { url = "https://files.pythonhosted.org/packages/c4/56/50abf070cb3cd9b1dd32f2c88f083aab561ecbffbcd783275cb51c17f11d/coverage-7.6.1-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:0dbde0f4aa9a16fa4d754356a8f2e36296ff4d83994b2c9d8398aa32f222f989", size = 234655 }, { url = "https://files.pythonhosted.org/packages/25/ee/b4c246048b8485f85a2426ef4abab88e48c6e80c74e964bea5cd4cd4b115/coverage-7.6.1-cp38-cp38-win32.whl", hash = "sha256:da511e6ad4f7323ee5702e6633085fb76c2f893aaf8ce4c51a0ba4fc07580ea7", size = 209296 }, { url = "https://files.pythonhosted.org/packages/5c/1c/96cf86b70b69ea2b12924cdf7cabb8ad10e6130eab8d767a1099fbd2a44f/coverage-7.6.1-cp38-cp38-win_amd64.whl", hash = "sha256:3f1156e3e8f2872197af3840d8ad307a9dd18e615dc64d9ee41696f287c57ad8", size = 210137 }, { url = "https://files.pythonhosted.org/packages/19/d3/d54c5aa83268779d54c86deb39c1c4566e5d45c155369ca152765f8db413/coverage-7.6.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:abd5fd0db5f4dc9289408aaf34908072f805ff7792632250dcb36dc591d24255", size = 206688 }, { url = "https://files.pythonhosted.org/packages/a5/fe/137d5dca72e4a258b1bc17bb04f2e0196898fe495843402ce826a7419fe3/coverage-7.6.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:547f45fa1a93154bd82050a7f3cddbc1a7a4dd2a9bf5cb7d06f4ae29fe94eaf8", size = 207120 }, { url = "https://files.pythonhosted.org/packages/78/5b/a0a796983f3201ff5485323b225d7c8b74ce30c11f456017e23d8e8d1945/coverage-7.6.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:645786266c8f18a931b65bfcefdbf6952dd0dea98feee39bd188607a9d307ed2", size = 235249 }, { url = 
"https://files.pythonhosted.org/packages/4e/e1/76089d6a5ef9d68f018f65411fcdaaeb0141b504587b901d74e8587606ad/coverage-7.6.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9e0b2df163b8ed01d515807af24f63de04bebcecbd6c3bfeff88385789fdf75a", size = 233237 }, { url = "https://files.pythonhosted.org/packages/9a/6f/eef79b779a540326fee9520e5542a8b428cc3bfa8b7c8f1022c1ee4fc66c/coverage-7.6.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:609b06f178fe8e9f89ef676532760ec0b4deea15e9969bf754b37f7c40326dbc", size = 234311 }, { url = "https://files.pythonhosted.org/packages/75/e1/656d65fb126c29a494ef964005702b012f3498db1a30dd562958e85a4049/coverage-7.6.1-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:702855feff378050ae4f741045e19a32d57d19f3e0676d589df0575008ea5004", size = 233453 }, { url = "https://files.pythonhosted.org/packages/68/6a/45f108f137941a4a1238c85f28fd9d048cc46b5466d6b8dda3aba1bb9d4f/coverage-7.6.1-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:2bdb062ea438f22d99cba0d7829c2ef0af1d768d1e4a4f528087224c90b132cb", size = 231958 }, { url = "https://files.pythonhosted.org/packages/9b/e7/47b809099168b8b8c72ae311efc3e88c8d8a1162b3ba4b8da3cfcdb85743/coverage-7.6.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:9c56863d44bd1c4fe2abb8a4d6f5371d197f1ac0ebdee542f07f35895fc07f36", size = 232938 }, { url = "https://files.pythonhosted.org/packages/52/80/052222ba7058071f905435bad0ba392cc12006380731c37afaf3fe749b88/coverage-7.6.1-cp39-cp39-win32.whl", hash = "sha256:6e2cd258d7d927d09493c8df1ce9174ad01b381d4729a9d8d4e38670ca24774c", size = 209352 }, { url = "https://files.pythonhosted.org/packages/b8/d8/1b92e0b3adcf384e98770a00ca095da1b5f7b483e6563ae4eb5e935d24a1/coverage-7.6.1-cp39-cp39-win_amd64.whl", hash = "sha256:06a737c882bd26d0d6ee7269b20b12f14a8704807a01056c80bb881a4b2ce6ca", size = 210153 }, { url = 
"https://files.pythonhosted.org/packages/a5/2b/0354ed096bca64dc8e32a7cbcae28b34cb5ad0b1fe2125d6d99583313ac0/coverage-7.6.1-pp38.pp39.pp310-none-any.whl", hash = "sha256:e9a6e0eb86070e8ccaedfbd9d38fec54864f3125ab95419970575b42af7541df", size = 198926 }, ] [package.optional-dependencies] toml = [ { name = "tomli", marker = "python_full_version <= '3.11'" }, ] [[package]] name = "cssselect2" version = "0.7.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "tinycss2" }, { name = "webencodings" }, ] sdist = { url = "https://files.pythonhosted.org/packages/e7/fc/326cb6f988905998f09bb54a3f5d98d4462ba119363c0dfad29750d48c09/cssselect2-0.7.0.tar.gz", hash = "sha256:1ccd984dab89fc68955043aca4e1b03e0cf29cad9880f6e28e3ba7a74b14aa5a", size = 35888 } wheels = [ { url = "https://files.pythonhosted.org/packages/9d/3a/e39436efe51894243ff145a37c4f9a030839b97779ebcc4f13b3ba21c54e/cssselect2-0.7.0-py3-none-any.whl", hash = "sha256:fd23a65bfd444595913f02fc71f6b286c29261e354c41d722ca7a261a49b5969", size = 15586 }, ] [[package]] name = "defusedxml" version = "0.7.1" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/0f/d5/c66da9b79e5bdb124974bfe172b4daf3c984ebd9c2a06e2b8a4dc7331c72/defusedxml-0.7.1.tar.gz", hash = "sha256:1bb3032db185915b62d7c6209c5a8792be6a32ab2fedacc84e01b52c51aa3e69", size = 75520 } wheels = [ { url = "https://files.pythonhosted.org/packages/07/6c/aa3f2f849e01cb6a001cd8554a88d4c77c5c1a31c95bdf1cf9301e6d9ef4/defusedxml-0.7.1-py2.py3-none-any.whl", hash = "sha256:a352e7e428770286cc899e2542b6cdaedb2b4953ff269a210103ec58f6198a61", size = 25604 }, ] [[package]] name = "devtools" version = "0.12.2" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "asttokens" }, { name = "executing" }, { name = "pygments" }, ] sdist = { url = "https://files.pythonhosted.org/packages/84/75/b78198620640d394bc435c17bb49db18419afdd6cfa3ed8bcfe14034ec80/devtools-0.12.2.tar.gz", hash = 
"sha256:efceab184cb35e3a11fa8e602cc4fadacaa2e859e920fc6f87bf130b69885507", size = 75005 } wheels = [ { url = "https://files.pythonhosted.org/packages/d1/ae/afb1487556e2dc827a17097aac8158a25b433a345386f0e249f6d2694ccb/devtools-0.12.2-py3-none-any.whl", hash = "sha256:c366e3de1df4cdd635f1ad8cbcd3af01a384d7abda71900e68d43b04eb6aaca7", size = 19411 }, ] [[package]] name = "dirty-equals" version = "0.8.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "pytz", marker = "python_full_version < '3.9'" }, ] sdist = { url = "https://files.pythonhosted.org/packages/17/4c/39c428b7c900b21c8116d89a56b73f6dc14a2455767961b54adfe7c224fe/dirty_equals-0.8.0.tar.gz", hash = "sha256:798db3b9481b9a5024c0e520946507676ed2f0c65317d3e95bdce1a01657cf60", size = 50294 } wheels = [ { url = "https://files.pythonhosted.org/packages/f3/cd/8c3ce82cc6b18e149bff3cf8dd50a75316ca093ae706f0c1c4df87f2b88f/dirty_equals-0.8.0-py3-none-any.whl", hash = "sha256:0ef998ba3c395e03cf5eb3cd1c13c26a9a992efa18c0d59c22ba27344519cee1", size = 28217 }, ] [[package]] name = "dnspython" version = "2.6.1" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/37/7d/c871f55054e403fdfd6b8f65fd6d1c4e147ed100d3e9f9ba1fe695403939/dnspython-2.6.1.tar.gz", hash = "sha256:e8f0f9c23a7b7cb99ded64e6c3a6f3e701d78f50c55e002b839dea7225cff7cc", size = 332727 } wheels = [ { url = "https://files.pythonhosted.org/packages/87/a1/8c5287991ddb8d3e4662f71356d9656d91ab3a36618c3dd11b280df0d255/dnspython-2.6.1-py3-none-any.whl", hash = "sha256:5ef3b9680161f6fa89daf8ad451b5f1a33b18ae8a1c6778cdf4b43f08c0a6e50", size = 307696 }, ] [[package]] name = "email-validator" version = "2.2.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "dnspython" }, { name = "idna" }, ] sdist = { url = "https://files.pythonhosted.org/packages/48/ce/13508a1ec3f8bb981ae4ca79ea40384becc868bfae97fd1c942bb3a001b1/email_validator-2.2.0.tar.gz", hash = 
"sha256:cb690f344c617a714f22e66ae771445a1ceb46821152df8e165c5f9a364582b7", size = 48967 } wheels = [ { url = "https://files.pythonhosted.org/packages/d7/ee/bf0adb559ad3c786f12bcbc9296b3f5675f529199bef03e2df281fa1fadb/email_validator-2.2.0-py3-none-any.whl", hash = "sha256:561977c2d73ce3611850a06fa56b414621e0c8faa9d66f2611407d87465da631", size = 33521 }, ] [[package]] name = "eval-type-backport" version = "0.2.0" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/23/ca/1601a9fa588867fe2ab6c19ed4c936929160d08a86597adf61bbd443fe57/eval_type_backport-0.2.0.tar.gz", hash = "sha256:68796cfbc7371ebf923f03bdf7bef415f3ec098aeced24e054b253a0e78f7b37", size = 8977 } wheels = [ { url = "https://files.pythonhosted.org/packages/ac/ac/aa3d8e0acbcd71140420bc752d7c9779cf3a2a3bb1d7ef30944e38b2cd39/eval_type_backport-0.2.0-py3-none-any.whl", hash = "sha256:ac2f73d30d40c5a30a80b8739a789d6bb5e49fdffa66d7912667e2015d9c9933", size = 5855 }, ] [[package]] name = "exceptiongroup" version = "1.2.2" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/09/35/2495c4ac46b980e4ca1f6ad6db102322ef3ad2410b79fdde159a4b0f3b92/exceptiongroup-1.2.2.tar.gz", hash = "sha256:47c2edf7c6738fafb49fd34290706d1a1a2f4d1c6df275526b62cbb4aa5393cc", size = 28883 } wheels = [ { url = "https://files.pythonhosted.org/packages/02/cc/b7e31358aac6ed1ef2bb790a9746ac2c69bcb3c8588b41616914eb106eaf/exceptiongroup-1.2.2-py3-none-any.whl", hash = "sha256:3111b9d131c238bec2f8f516e123e14ba243563fb135d3fe885990585aa7795b", size = 16453 }, ] [[package]] name = "executing" version = "2.1.0" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/8c/e3/7d45f492c2c4a0e8e0fad57d081a7c8a0286cdd86372b070cca1ec0caa1e/executing-2.1.0.tar.gz", hash = "sha256:8ea27ddd260da8150fa5a708269c4a10e76161e2496ec3e587da9e3c0fe4b9ab", size = 977485 } wheels = [ { url = 
"https://files.pythonhosted.org/packages/b5/fd/afcd0496feca3276f509df3dbd5dae726fcc756f1a08d9e25abe1733f962/executing-2.1.0-py2.py3-none-any.whl", hash = "sha256:8d63781349375b5ebccc3142f4b30350c0cd9c79f921cde38be2be4637e98eaf", size = 25805 }, ] [[package]] name = "faker" version = "30.8.2" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "python-dateutil" }, { name = "typing-extensions" }, ] sdist = { url = "https://files.pythonhosted.org/packages/c1/df/7574c0d13f0bbab725e52bec4b00783aaa14163fe9093dde11a928a4c638/faker-30.8.2.tar.gz", hash = "sha256:aa31b52cdae3673d6a78b4857c7bcdc0e98f201a5cb77d7827fa9e6b5876da94", size = 1808329 } wheels = [ { url = "https://files.pythonhosted.org/packages/64/82/f7d0c0a4ab512fd1572a315eec903d50a578c75d5aa894cf3f5cc04025e5/Faker-30.8.2-py3-none-any.whl", hash = "sha256:4a82b2908cd19f3bba1a4da2060cc4eb18a40410ccdf9350d071d79dc92fe3ce", size = 1846458 }, ] [[package]] name = "filelock" version = "3.16.1" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/9d/db/3ef5bb276dae18d6ec2124224403d1d67bccdbefc17af4cc8f553e341ab1/filelock-3.16.1.tar.gz", hash = "sha256:c249fbfcd5db47e5e2d6d62198e565475ee65e4831e2561c8e313fa7eb961435", size = 18037 } wheels = [ { url = "https://files.pythonhosted.org/packages/b9/f8/feced7779d755758a52d1f6635d990b8d98dc0a29fa568bbe0625f18fdf3/filelock-3.16.1-py3-none-any.whl", hash = "sha256:2082e5703d51fbf98ea75855d9d5527e33d8ff23099bec374a134febee6946b0", size = 16163 }, ] [[package]] name = "ghp-import" version = "2.1.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "python-dateutil" }, ] sdist = { url = "https://files.pythonhosted.org/packages/d9/29/d40217cbe2f6b1359e00c6c307bb3fc876ba74068cbab3dde77f03ca0dc4/ghp-import-2.1.0.tar.gz", hash = "sha256:9c535c4c61193c2df8871222567d7fd7e5014d835f97dc7b7439069e2413d343", size = 10943 } wheels = [ { url = 
"https://files.pythonhosted.org/packages/f7/ec/67fbef5d497f86283db54c22eec6f6140243aae73265799baaaa19cd17fb/ghp_import-2.1.0-py3-none-any.whl", hash = "sha256:8337dd7b50877f163d4c0289bc1f1c7f127550241988d568c1db512c4324a619", size = 11034 }, ] [[package]] name = "greenlet" version = "3.1.1" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/2f/ff/df5fede753cc10f6a5be0931204ea30c35fa2f2ea7a35b25bdaf4fe40e46/greenlet-3.1.1.tar.gz", hash = "sha256:4ce3ac6cdb6adf7946475d7ef31777c26d94bccc377e070a7986bd2d5c515467", size = 186022 } wheels = [ { url = "https://files.pythonhosted.org/packages/25/90/5234a78dc0ef6496a6eb97b67a42a8e96742a56f7dc808cb954a85390448/greenlet-3.1.1-cp310-cp310-macosx_11_0_universal2.whl", hash = "sha256:0bbae94a29c9e5c7e4a2b7f0aae5c17e8e90acbfd3bf6270eeba60c39fce3563", size = 271235 }, { url = "https://files.pythonhosted.org/packages/7c/16/cd631fa0ab7d06ef06387135b7549fdcc77d8d859ed770a0d28e47b20972/greenlet-3.1.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0fde093fb93f35ca72a556cf72c92ea3ebfda3d79fc35bb19fbe685853869a83", size = 637168 }, { url = "https://files.pythonhosted.org/packages/2f/b1/aed39043a6fec33c284a2c9abd63ce191f4f1a07319340ffc04d2ed3256f/greenlet-3.1.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:36b89d13c49216cadb828db8dfa6ce86bbbc476a82d3a6c397f0efae0525bdd0", size = 648826 }, { url = "https://files.pythonhosted.org/packages/76/25/40e0112f7f3ebe54e8e8ed91b2b9f970805143efef16d043dfc15e70f44b/greenlet-3.1.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:94b6150a85e1b33b40b1464a3f9988dcc5251d6ed06842abff82e42632fac120", size = 644443 }, { url = "https://files.pythonhosted.org/packages/fb/2f/3850b867a9af519794784a7eeed1dd5bc68ffbcc5b28cef703711025fd0a/greenlet-3.1.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:93147c513fac16385d1036b7e5b102c7fbbdb163d556b791f0f11eada7ba65dc", size = 643295 }, { url = "https://files.pythonhosted.org/packages/cf/69/79e4d63b9387b48939096e25115b8af7cd8a90397a304f92436bcb21f5b2/greenlet-3.1.1-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:da7a9bff22ce038e19bf62c4dd1ec8391062878710ded0a845bcf47cc0200617", size = 599544 }, { url = "https://files.pythonhosted.org/packages/46/1d/44dbcb0e6c323bd6f71b8c2f4233766a5faf4b8948873225d34a0b7efa71/greenlet-3.1.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:b2795058c23988728eec1f36a4e5e4ebad22f8320c85f3587b539b9ac84128d7", size = 1125456 }, { url = "https://files.pythonhosted.org/packages/e0/1d/a305dce121838d0278cee39d5bb268c657f10a5363ae4b726848f833f1bb/greenlet-3.1.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:ed10eac5830befbdd0c32f83e8aa6288361597550ba669b04c48f0f9a2c843c6", size = 1149111 }, { url = "https://files.pythonhosted.org/packages/96/28/d62835fb33fb5652f2e98d34c44ad1a0feacc8b1d3f1aecab035f51f267d/greenlet-3.1.1-cp310-cp310-win_amd64.whl", hash = "sha256:77c386de38a60d1dfb8e55b8c1101d68c79dfdd25c7095d51fec2dd800892b80", size = 298392 }, { url = "https://files.pythonhosted.org/packages/28/62/1c2665558618553c42922ed47a4e6d6527e2fa3516a8256c2f431c5d0441/greenlet-3.1.1-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:e4d333e558953648ca09d64f13e6d8f0523fa705f51cae3f03b5983489958c70", size = 272479 }, { url = "https://files.pythonhosted.org/packages/76/9d/421e2d5f07285b6e4e3a676b016ca781f63cfe4a0cd8eaecf3fd6f7a71ae/greenlet-3.1.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:09fc016b73c94e98e29af67ab7b9a879c307c6731a2c9da0db5a7d9b7edd1159", size = 640404 }, { url = "https://files.pythonhosted.org/packages/e5/de/6e05f5c59262a584e502dd3d261bbdd2c97ab5416cc9c0b91ea38932a901/greenlet-3.1.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = 
"sha256:d5e975ca70269d66d17dd995dafc06f1b06e8cb1ec1e9ed54c1d1e4a7c4cf26e", size = 652813 }, { url = "https://files.pythonhosted.org/packages/49/93/d5f93c84241acdea15a8fd329362c2c71c79e1a507c3f142a5d67ea435ae/greenlet-3.1.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3b2813dc3de8c1ee3f924e4d4227999285fd335d1bcc0d2be6dc3f1f6a318ec1", size = 648517 }, { url = "https://files.pythonhosted.org/packages/15/85/72f77fc02d00470c86a5c982b8daafdf65d38aefbbe441cebff3bf7037fc/greenlet-3.1.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e347b3bfcf985a05e8c0b7d462ba6f15b1ee1c909e2dcad795e49e91b152c383", size = 647831 }, { url = "https://files.pythonhosted.org/packages/f7/4b/1c9695aa24f808e156c8f4813f685d975ca73c000c2a5056c514c64980f6/greenlet-3.1.1-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9e8f8c9cb53cdac7ba9793c276acd90168f416b9ce36799b9b885790f8ad6c0a", size = 602413 }, { url = "https://files.pythonhosted.org/packages/76/70/ad6e5b31ef330f03b12559d19fda2606a522d3849cde46b24f223d6d1619/greenlet-3.1.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:62ee94988d6b4722ce0028644418d93a52429e977d742ca2ccbe1c4f4a792511", size = 1129619 }, { url = "https://files.pythonhosted.org/packages/f4/fb/201e1b932e584066e0f0658b538e73c459b34d44b4bd4034f682423bc801/greenlet-3.1.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:1776fd7f989fc6b8d8c8cb8da1f6b82c5814957264d1f6cf818d475ec2bf6395", size = 1155198 }, { url = "https://files.pythonhosted.org/packages/12/da/b9ed5e310bb8b89661b80cbcd4db5a067903bbcd7fc854923f5ebb4144f0/greenlet-3.1.1-cp311-cp311-win_amd64.whl", hash = "sha256:48ca08c771c268a768087b408658e216133aecd835c0ded47ce955381105ba39", size = 298930 }, { url = "https://files.pythonhosted.org/packages/7d/ec/bad1ac26764d26aa1353216fcbfa4670050f66d445448aafa227f8b16e80/greenlet-3.1.1-cp312-cp312-macosx_11_0_universal2.whl", hash = 
"sha256:4afe7ea89de619adc868e087b4d2359282058479d7cfb94970adf4b55284574d", size = 274260 }, { url = "https://files.pythonhosted.org/packages/66/d4/c8c04958870f482459ab5956c2942c4ec35cac7fe245527f1039837c17a9/greenlet-3.1.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f406b22b7c9a9b4f8aa9d2ab13d6ae0ac3e85c9a809bd590ad53fed2bf70dc79", size = 649064 }, { url = "https://files.pythonhosted.org/packages/51/41/467b12a8c7c1303d20abcca145db2be4e6cd50a951fa30af48b6ec607581/greenlet-3.1.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c3a701fe5a9695b238503ce5bbe8218e03c3bcccf7e204e455e7462d770268aa", size = 663420 }, { url = "https://files.pythonhosted.org/packages/27/8f/2a93cd9b1e7107d5c7b3b7816eeadcac2ebcaf6d6513df9abaf0334777f6/greenlet-3.1.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2846930c65b47d70b9d178e89c7e1a69c95c1f68ea5aa0a58646b7a96df12441", size = 658035 }, { url = "https://files.pythonhosted.org/packages/57/5c/7c6f50cb12be092e1dccb2599be5a942c3416dbcfb76efcf54b3f8be4d8d/greenlet-3.1.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:99cfaa2110534e2cf3ba31a7abcac9d328d1d9f1b95beede58294a60348fba36", size = 660105 }, { url = "https://files.pythonhosted.org/packages/f1/66/033e58a50fd9ec9df00a8671c74f1f3a320564c6415a4ed82a1c651654ba/greenlet-3.1.1-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1443279c19fca463fc33e65ef2a935a5b09bb90f978beab37729e1c3c6c25fe9", size = 613077 }, { url = "https://files.pythonhosted.org/packages/19/c5/36384a06f748044d06bdd8776e231fadf92fc896bd12cb1c9f5a1bda9578/greenlet-3.1.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:b7cede291382a78f7bb5f04a529cb18e068dd29e0fb27376074b6d0317bf4dd0", size = 1135975 }, { url = "https://files.pythonhosted.org/packages/38/f9/c0a0eb61bdf808d23266ecf1d63309f0e1471f284300ce6dac0ae1231881/greenlet-3.1.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = 
"sha256:23f20bb60ae298d7d8656c6ec6db134bca379ecefadb0b19ce6f19d1f232a942", size = 1163955 }, { url = "https://files.pythonhosted.org/packages/43/21/a5d9df1d21514883333fc86584c07c2b49ba7c602e670b174bd73cfc9c7f/greenlet-3.1.1-cp312-cp312-win_amd64.whl", hash = "sha256:7124e16b4c55d417577c2077be379514321916d5790fa287c9ed6f23bd2ffd01", size = 299655 }, { url = "https://files.pythonhosted.org/packages/f3/57/0db4940cd7bb461365ca8d6fd53e68254c9dbbcc2b452e69d0d41f10a85e/greenlet-3.1.1-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:05175c27cb459dcfc05d026c4232f9de8913ed006d42713cb8a5137bd49375f1", size = 272990 }, { url = "https://files.pythonhosted.org/packages/1c/ec/423d113c9f74e5e402e175b157203e9102feeb7088cee844d735b28ef963/greenlet-3.1.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:935e943ec47c4afab8965954bf49bfa639c05d4ccf9ef6e924188f762145c0ff", size = 649175 }, { url = "https://files.pythonhosted.org/packages/a9/46/ddbd2db9ff209186b7b7c621d1432e2f21714adc988703dbdd0e65155c77/greenlet-3.1.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:667a9706c970cb552ede35aee17339a18e8f2a87a51fba2ed39ceeeb1004798a", size = 663425 }, { url = "https://files.pythonhosted.org/packages/bc/f9/9c82d6b2b04aa37e38e74f0c429aece5eeb02bab6e3b98e7db89b23d94c6/greenlet-3.1.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b8a678974d1f3aa55f6cc34dc480169d58f2e6d8958895d68845fa4ab566509e", size = 657736 }, { url = "https://files.pythonhosted.org/packages/d9/42/b87bc2a81e3a62c3de2b0d550bf91a86939442b7ff85abb94eec3fc0e6aa/greenlet-3.1.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:efc0f674aa41b92da8c49e0346318c6075d734994c3c4e4430b1c3f853e498e4", size = 660347 }, { url = "https://files.pythonhosted.org/packages/37/fa/71599c3fd06336cdc3eac52e6871cfebab4d9d70674a9a9e7a482c318e99/greenlet-3.1.1-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:0153404a4bb921f0ff1abeb5ce8a5131da56b953eda6e14b88dc6bbc04d2049e", size = 615583 }, { url = "https://files.pythonhosted.org/packages/4e/96/e9ef85de031703ee7a4483489b40cf307f93c1824a02e903106f2ea315fe/greenlet-3.1.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:275f72decf9932639c1c6dd1013a1bc266438eb32710016a1c742df5da6e60a1", size = 1133039 }, { url = "https://files.pythonhosted.org/packages/87/76/b2b6362accd69f2d1889db61a18c94bc743e961e3cab344c2effaa4b4a25/greenlet-3.1.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:c4aab7f6381f38a4b42f269057aee279ab0fc7bf2e929e3d4abfae97b682a12c", size = 1160716 }, { url = "https://files.pythonhosted.org/packages/1f/1b/54336d876186920e185066d8c3024ad55f21d7cc3683c856127ddb7b13ce/greenlet-3.1.1-cp313-cp313-win_amd64.whl", hash = "sha256:b42703b1cf69f2aa1df7d1030b9d77d3e584a70755674d60e710f0af570f3761", size = 299490 }, { url = "https://files.pythonhosted.org/packages/5f/17/bea55bf36990e1638a2af5ba10c1640273ef20f627962cf97107f1e5d637/greenlet-3.1.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f1695e76146579f8c06c1509c7ce4dfe0706f49c6831a817ac04eebb2fd02011", size = 643731 }, { url = "https://files.pythonhosted.org/packages/78/d2/aa3d2157f9ab742a08e0fd8f77d4699f37c22adfbfeb0c610a186b5f75e0/greenlet-3.1.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7876452af029456b3f3549b696bb36a06db7c90747740c5302f74a9e9fa14b13", size = 649304 }, { url = "https://files.pythonhosted.org/packages/f1/8e/d0aeffe69e53ccff5a28fa86f07ad1d2d2d6537a9506229431a2a02e2f15/greenlet-3.1.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4ead44c85f8ab905852d3de8d86f6f8baf77109f9da589cb4fa142bd3b57b475", size = 646537 }, { url = "https://files.pythonhosted.org/packages/05/79/e15408220bbb989469c8871062c97c6c9136770657ba779711b90870d867/greenlet-3.1.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:8320f64b777d00dd7ccdade271eaf0cad6636343293a25074cc5566160e4de7b", size = 642506 }, { url = "https://files.pythonhosted.org/packages/18/87/470e01a940307796f1d25f8167b551a968540fbe0551c0ebb853cb527dd6/greenlet-3.1.1-cp313-cp313t-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6510bf84a6b643dabba74d3049ead221257603a253d0a9873f55f6a59a65f822", size = 602753 }, { url = "https://files.pythonhosted.org/packages/e2/72/576815ba674eddc3c25028238f74d7b8068902b3968cbe456771b166455e/greenlet-3.1.1-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:04b013dc07c96f83134b1e99888e7a79979f1a247e2a9f59697fa14b5862ed01", size = 1122731 }, { url = "https://files.pythonhosted.org/packages/ac/38/08cc303ddddc4b3d7c628c3039a61a3aae36c241ed01393d00c2fd663473/greenlet-3.1.1-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:411f015496fec93c1c8cd4e5238da364e1da7a124bcb293f085bf2860c32c6f6", size = 1142112 }, { url = "https://files.pythonhosted.org/packages/97/83/bdf5f69fcf304065ec7cf8fc7c08248479cfed9bcca02bf0001c07e000aa/greenlet-3.1.1-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:346bed03fe47414091be4ad44786d1bd8bef0c3fcad6ed3dee074a032ab408a9", size = 271017 }, { url = "https://files.pythonhosted.org/packages/31/4a/2d4443adcb38e1e90e50c653a26b2be39998ea78ca1a4cf414dfdeb2e98b/greenlet-3.1.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dfc59d69fc48664bc693842bd57acfdd490acafda1ab52c7836e3fc75c90a111", size = 642888 }, { url = "https://files.pythonhosted.org/packages/5a/c9/b5d9ac1b932aa772dd1eb90a8a2b30dbd7ad5569dcb7fdac543810d206b4/greenlet-3.1.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d21e10da6ec19b457b82636209cbe2331ff4306b54d06fa04b7c138ba18c8a81", size = 655451 }, { url = "https://files.pythonhosted.org/packages/a8/18/218e21caf7caba5b2236370196eaebc00987d4a2b2d3bf63cc4d4dd5a69f/greenlet-3.1.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:37b9de5a96111fc15418819ab4c4432e4f3c2ede61e660b1e33971eba26ef9ba", size = 651409 }, { url = "https://files.pythonhosted.org/packages/a7/25/de419a2b22fa6e18ce3b2a5adb01d33ec7b2784530f76fa36ba43d8f0fac/greenlet-3.1.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6ef9ea3f137e5711f0dbe5f9263e8c009b7069d8a1acea822bd5e9dae0ae49c8", size = 650661 }, { url = "https://files.pythonhosted.org/packages/d8/88/0ce16c0afb2d71d85562a7bcd9b092fec80a7767ab5b5f7e1bbbca8200f8/greenlet-3.1.1-cp38-cp38-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:85f3ff71e2e60bd4b4932a043fbbe0f499e263c628390b285cb599154a3b03b1", size = 605959 }, { url = "https://files.pythonhosted.org/packages/5a/10/39a417ad0afb0b7e5b150f1582cdeb9416f41f2e1df76018434dfac4a6cc/greenlet-3.1.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:95ffcf719966dd7c453f908e208e14cde192e09fde6c7186c8f1896ef778d8cd", size = 1132341 }, { url = "https://files.pythonhosted.org/packages/9f/f5/e9b151ddd2ed0508b7a47bef7857e46218dbc3fd10e564617a3865abfaac/greenlet-3.1.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:03a088b9de532cbfe2ba2034b2b85e82df37874681e8c470d6fb2f8c04d7e4b7", size = 1159409 }, { url = "https://files.pythonhosted.org/packages/86/97/2c86989ca4e0f089fbcdc9229c972a01ef53abdafd5ae89e0f3dcdcd4adb/greenlet-3.1.1-cp38-cp38-win32.whl", hash = "sha256:8b8b36671f10ba80e159378df9c4f15c14098c4fd73a36b9ad715f057272fbef", size = 281126 }, { url = "https://files.pythonhosted.org/packages/d3/50/7b7a3e10ed82c760c1fd8d3167a7c95508e9fdfc0b0604f05ed1a9a9efdc/greenlet-3.1.1-cp38-cp38-win_amd64.whl", hash = "sha256:7017b2be767b9d43cc31416aba48aab0d2309ee31b4dbf10a1d38fb7972bdf9d", size = 298285 }, { url = "https://files.pythonhosted.org/packages/8c/82/8051e82af6d6b5150aacb6789a657a8afd48f0a44d8e91cb72aaaf28553a/greenlet-3.1.1-cp39-cp39-macosx_11_0_universal2.whl", hash = "sha256:396979749bd95f018296af156201d6211240e7a23090f50a8d5d18c370084dc3", size = 270027 }, { url = 
"https://files.pythonhosted.org/packages/f9/74/f66de2785880293780eebd18a2958aeea7cbe7814af1ccef634f4701f846/greenlet-3.1.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ca9d0ff5ad43e785350894d97e13633a66e2b50000e8a183a50a88d834752d42", size = 634822 }, { url = "https://files.pythonhosted.org/packages/68/23/acd9ca6bc412b02b8aa755e47b16aafbe642dde0ad2f929f836e57a7949c/greenlet-3.1.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f6ff3b14f2df4c41660a7dec01045a045653998784bf8cfcb5a525bdffffbc8f", size = 646866 }, { url = "https://files.pythonhosted.org/packages/a9/ab/562beaf8a53dc9f6b2459f200e7bc226bb07e51862a66351d8b7817e3efd/greenlet-3.1.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:94ebba31df2aa506d7b14866fed00ac141a867e63143fe5bca82a8e503b36437", size = 641985 }, { url = "https://files.pythonhosted.org/packages/03/d3/1006543621f16689f6dc75f6bcf06e3c23e044c26fe391c16c253623313e/greenlet-3.1.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:73aaad12ac0ff500f62cebed98d8789198ea0e6f233421059fa68a5aa7220145", size = 641268 }, { url = "https://files.pythonhosted.org/packages/2f/c1/ad71ce1b5f61f900593377b3f77b39408bce5dc96754790311b49869e146/greenlet-3.1.1-cp39-cp39-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:63e4844797b975b9af3a3fb8f7866ff08775f5426925e1e0bbcfe7932059a12c", size = 597376 }, { url = "https://files.pythonhosted.org/packages/f7/ff/183226685b478544d61d74804445589e069d00deb8ddef042699733950c7/greenlet-3.1.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:7939aa3ca7d2a1593596e7ac6d59391ff30281ef280d8632fa03d81f7c5f955e", size = 1123359 }, { url = "https://files.pythonhosted.org/packages/c0/8b/9b3b85a89c22f55f315908b94cd75ab5fed5973f7393bbef000ca8b2c5c1/greenlet-3.1.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:d0028e725ee18175c6e422797c407874da24381ce0690d6b9396c204c7f7276e", size = 1147458 }, { url = 
"https://files.pythonhosted.org/packages/b8/1c/248fadcecd1790b0ba793ff81fa2375c9ad6442f4c748bf2cc2e6563346a/greenlet-3.1.1-cp39-cp39-win32.whl", hash = "sha256:5e06afd14cbaf9e00899fae69b24a32f2196c19de08fcb9f4779dd4f004e5e7c", size = 281131 }, { url = "https://files.pythonhosted.org/packages/ae/02/e7d0aef2354a38709b764df50b2b83608f0621493e47f47694eb80922822/greenlet-3.1.1-cp39-cp39-win_amd64.whl", hash = "sha256:3319aa75e0e0639bc15ff54ca327e8dc7a6fe404003496e3c6925cd3142e0e22", size = 298306 }, ] [[package]] name = "griffe" version = "1.4.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "astunparse", marker = "python_full_version < '3.9'" }, { name = "colorama" }, ] sdist = { url = "https://files.pythonhosted.org/packages/05/e9/b2c86ad9d69053e497a24ceb25d661094fb321ab4ed39a8b71793dcbae82/griffe-1.4.0.tar.gz", hash = "sha256:8fccc585896d13f1221035d32c50dec65830c87d23f9adb9b1e6f3d63574f7f5", size = 381028 } wheels = [ { url = "https://files.pythonhosted.org/packages/22/7c/e9e66869c2e4c9b378474e49c993128ec0131ef4721038b6d06e50538caf/griffe-1.4.0-py3-none-any.whl", hash = "sha256:e589de8b8c137e99a46ec45f9598fc0ac5b6868ce824b24db09c02d117b89bc5", size = 127015 }, ] [[package]] name = "idna" version = "3.10" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490 } wheels = [ { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442 }, ] [[package]] name = "importlib-metadata" version = "8.5.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "zipp" }, ] sdist = { url = 
"https://files.pythonhosted.org/packages/cd/12/33e59336dca5be0c398a7482335911a33aa0e20776128f038019f1a95f1b/importlib_metadata-8.5.0.tar.gz", hash = "sha256:71522656f0abace1d072b9e5481a48f07c138e00f079c38c8f883823f9c26bd7", size = 55304 } wheels = [ { url = "https://files.pythonhosted.org/packages/a0/d9/a1e041c5e7caa9a05c925f4bdbdfb7f006d1f74996af53467bc394c97be7/importlib_metadata-8.5.0-py3-none-any.whl", hash = "sha256:45e54197d28b7a7f1559e60b95e7c567032b602131fbd588f1497f47880aa68b", size = 26514 }, ] [[package]] name = "importlib-resources" version = "6.4.5" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "zipp", marker = "python_full_version < '3.10'" }, ] sdist = { url = "https://files.pythonhosted.org/packages/98/be/f3e8c6081b684f176b761e6a2fef02a0be939740ed6f54109a2951d806f3/importlib_resources-6.4.5.tar.gz", hash = "sha256:980862a1d16c9e147a59603677fa2aa5fd82b87f223b6cb870695bcfce830065", size = 43372 } wheels = [ { url = "https://files.pythonhosted.org/packages/e1/6a/4604f9ae2fa62ef47b9de2fa5ad599589d28c9fd1d335f32759813dfa91e/importlib_resources-6.4.5-py3-none-any.whl", hash = "sha256:ac29d5f956f01d5e4bb63102a5a19957f1b9175e45649977264a1416783bb717", size = 36115 }, ] [[package]] name = "iniconfig" version = "2.0.0" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/d7/4b/cbd8e699e64a6f16ca3a8220661b5f83792b3017d0f79807cb8708d33913/iniconfig-2.0.0.tar.gz", hash = "sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3", size = 4646 } wheels = [ { url = "https://files.pythonhosted.org/packages/ef/a6/62565a6e1cf69e10f5727360368e451d4b7f58beeac6173dc9db836a5b46/iniconfig-2.0.0-py3-none-any.whl", hash = "sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374", size = 5892 }, ] [[package]] name = "jinja2" version = "3.1.4" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "markupsafe" }, ] sdist = { url = 
"https://files.pythonhosted.org/packages/ed/55/39036716d19cab0747a5020fc7e907f362fbf48c984b14e62127f7e68e5d/jinja2-3.1.4.tar.gz", hash = "sha256:4a3aee7acbbe7303aede8e9648d13b8bf88a429282aa6122a993f0ac800cb369", size = 240245 } wheels = [ { url = "https://files.pythonhosted.org/packages/31/80/3a54838c3fb461f6fec263ebf3a3a41771bd05190238de3486aae8540c36/jinja2-3.1.4-py3-none-any.whl", hash = "sha256:bc5dd2abb727a5319567b7a813e6a2e7318c39f4f487cfe6c89c6f9c7d25197d", size = 133271 }, ] [[package]] name = "jsonschema" version = "4.23.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "attrs" }, { name = "importlib-resources", marker = "python_full_version < '3.9'" }, { name = "jsonschema-specifications" }, { name = "pkgutil-resolve-name", marker = "python_full_version < '3.9'" }, { name = "referencing" }, { name = "rpds-py" }, ] sdist = { url = "https://files.pythonhosted.org/packages/38/2e/03362ee4034a4c917f697890ccd4aec0800ccf9ded7f511971c75451deec/jsonschema-4.23.0.tar.gz", hash = "sha256:d71497fef26351a33265337fa77ffeb82423f3ea21283cd9467bb03999266bc4", size = 325778 } wheels = [ { url = "https://files.pythonhosted.org/packages/69/4a/4f9dbeb84e8850557c02365a0eee0649abe5eb1d84af92a25731c6c0f922/jsonschema-4.23.0-py3-none-any.whl", hash = "sha256:fbadb6f8b144a8f8cf9f0b89ba94501d143e50411a1278633f56a7acf7fd5566", size = 88462 }, ] [[package]] name = "jsonschema-specifications" version = "2023.12.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "importlib-resources", marker = "python_full_version < '3.9'" }, { name = "referencing" }, ] sdist = { url = "https://files.pythonhosted.org/packages/f8/b9/cc0cc592e7c195fb8a650c1d5990b10175cf13b4c97465c72ec841de9e4b/jsonschema_specifications-2023.12.1.tar.gz", hash = "sha256:48a76787b3e70f5ed53f1160d2b81f586e4ca6d1548c5de7085d1682674764cc", size = 13983 } wheels = [ { url = 
"https://files.pythonhosted.org/packages/ee/07/44bd408781594c4d0a027666ef27fab1e441b109dc3b76b4f836f8fd04fe/jsonschema_specifications-2023.12.1-py3-none-any.whl", hash = "sha256:87e4fdf3a94858b8a2ba2778d9ba57d8a9cafca7c7489c46ba0d30a8bc6a9c3c", size = 18482 }, ] [[package]] name = "linkify-it-py" version = "2.0.3" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "uc-micro-py", marker = "python_full_version < '3.9' or python_full_version >= '3.12' or platform_system != 'Windows' or sys_platform == 'win32'" }, ] sdist = { url = "https://files.pythonhosted.org/packages/2a/ae/bb56c6828e4797ba5a4821eec7c43b8bf40f69cda4d4f5f8c8a2810ec96a/linkify-it-py-2.0.3.tar.gz", hash = "sha256:68cda27e162e9215c17d786649d1da0021a451bdc436ef9e0fa0ba5234b9b048", size = 27946 } wheels = [ { url = "https://files.pythonhosted.org/packages/04/1e/b832de447dee8b582cac175871d2f6c3d5077cc56d5575cadba1fd1cccfa/linkify_it_py-2.0.3-py3-none-any.whl", hash = "sha256:6bcbc417b0ac14323382aef5c5192c0075bf8a9d6b41820a2b66371eac6b6d79", size = 19820 }, ] [[package]] name = "markdown" version = "3.7" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "importlib-metadata", marker = "python_full_version < '3.10'" }, ] sdist = { url = "https://files.pythonhosted.org/packages/54/28/3af612670f82f4c056911fbbbb42760255801b3068c48de792d354ff4472/markdown-3.7.tar.gz", hash = "sha256:2ae2471477cfd02dbbf038d5d9bc226d40def84b4fe2986e49b59b6b472bbed2", size = 357086 } wheels = [ { url = "https://files.pythonhosted.org/packages/3f/08/83871f3c50fc983b88547c196d11cf8c3340e37c32d2e9d6152abe2c61f7/Markdown-3.7-py3-none-any.whl", hash = "sha256:7eb6df5690b81a1d7942992c97fad2938e956e79df20cbc6186e9c3a77b1c803", size = 106349 }, ] [[package]] name = "markdown-it-py" version = "3.0.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "mdurl" }, ] sdist = { url = 
"https://files.pythonhosted.org/packages/38/71/3b932df36c1a044d397a1f92d1cf91ee0a503d91e470cbd670aa66b07ed0/markdown-it-py-3.0.0.tar.gz", hash = "sha256:e3f60a94fa066dc52ec76661e37c851cb232d92f9886b15cb560aaada2df8feb", size = 74596 } wheels = [ { url = "https://files.pythonhosted.org/packages/42/d7/1ec15b46af6af88f19b8e5ffea08fa375d433c998b8a7639e76935c14f1f/markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1", size = 87528 }, ] [package.optional-dependencies] linkify = [ { name = "linkify-it-py", marker = "python_full_version < '3.9' or python_full_version >= '3.12' or platform_system != 'Windows' or sys_platform == 'win32'" }, ] plugins = [ { name = "mdit-py-plugins", marker = "python_full_version < '3.9' or python_full_version >= '3.12' or platform_system != 'Windows' or sys_platform == 'win32'" }, ] [[package]] name = "markupsafe" version = "2.1.5" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/87/5b/aae44c6655f3801e81aa3eef09dbbf012431987ba564d7231722f68df02d/MarkupSafe-2.1.5.tar.gz", hash = "sha256:d283d37a890ba4c1ae73ffadf8046435c76e7bc2247bbb63c00bd1a709c6544b", size = 19384 } wheels = [ { url = "https://files.pythonhosted.org/packages/e4/54/ad5eb37bf9d51800010a74e4665425831a9db4e7c4e0fde4352e391e808e/MarkupSafe-2.1.5-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:a17a92de5231666cfbe003f0e4b9b3a7ae3afb1ec2845aadc2bacc93ff85febc", size = 18206 }, { url = "https://files.pythonhosted.org/packages/6a/4a/a4d49415e600bacae038c67f9fecc1d5433b9d3c71a4de6f33537b89654c/MarkupSafe-2.1.5-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:72b6be590cc35924b02c78ef34b467da4ba07e4e0f0454a2c5907f473fc50ce5", size = 14079 }, { url = "https://files.pythonhosted.org/packages/0a/7b/85681ae3c33c385b10ac0f8dd025c30af83c78cec1c37a6aa3b55e67f5ec/MarkupSafe-2.1.5-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:e61659ba32cf2cf1481e575d0462554625196a1f2fc06a1c777d3f48e8865d46", size = 26620 }, { url = "https://files.pythonhosted.org/packages/7c/52/2b1b570f6b8b803cef5ac28fdf78c0da318916c7d2fe9402a84d591b394c/MarkupSafe-2.1.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2174c595a0d73a3080ca3257b40096db99799265e1c27cc5a610743acd86d62f", size = 25818 }, { url = "https://files.pythonhosted.org/packages/29/fe/a36ba8c7ca55621620b2d7c585313efd10729e63ef81e4e61f52330da781/MarkupSafe-2.1.5-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ae2ad8ae6ebee9d2d94b17fb62763125f3f374c25618198f40cbb8b525411900", size = 25493 }, { url = "https://files.pythonhosted.org/packages/60/ae/9c60231cdfda003434e8bd27282b1f4e197ad5a710c14bee8bea8a9ca4f0/MarkupSafe-2.1.5-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:075202fa5b72c86ad32dc7d0b56024ebdbcf2048c0ba09f1cde31bfdd57bcfff", size = 30630 }, { url = "https://files.pythonhosted.org/packages/65/dc/1510be4d179869f5dafe071aecb3f1f41b45d37c02329dfba01ff59e5ac5/MarkupSafe-2.1.5-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:598e3276b64aff0e7b3451b72e94fa3c238d452e7ddcd893c3ab324717456bad", size = 29745 }, { url = "https://files.pythonhosted.org/packages/30/39/8d845dd7d0b0613d86e0ef89549bfb5f61ed781f59af45fc96496e897f3a/MarkupSafe-2.1.5-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fce659a462a1be54d2ffcacea5e3ba2d74daa74f30f5f143fe0c58636e355fdd", size = 30021 }, { url = "https://files.pythonhosted.org/packages/c7/5c/356a6f62e4f3c5fbf2602b4771376af22a3b16efa74eb8716fb4e328e01e/MarkupSafe-2.1.5-cp310-cp310-win32.whl", hash = "sha256:d9fad5155d72433c921b782e58892377c44bd6252b5af2f67f16b194987338a4", size = 16659 }, { url = "https://files.pythonhosted.org/packages/69/48/acbf292615c65f0604a0c6fc402ce6d8c991276e16c80c46a8f758fbd30c/MarkupSafe-2.1.5-cp310-cp310-win_amd64.whl", hash = 
"sha256:bf50cd79a75d181c9181df03572cdce0fbb75cc353bc350712073108cba98de5", size = 17213 }, { url = "https://files.pythonhosted.org/packages/11/e7/291e55127bb2ae67c64d66cef01432b5933859dfb7d6949daa721b89d0b3/MarkupSafe-2.1.5-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:629ddd2ca402ae6dbedfceeba9c46d5f7b2a61d9749597d4307f943ef198fc1f", size = 18219 }, { url = "https://files.pythonhosted.org/packages/6b/cb/aed7a284c00dfa7c0682d14df85ad4955a350a21d2e3b06d8240497359bf/MarkupSafe-2.1.5-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:5b7b716f97b52c5a14bffdf688f971b2d5ef4029127f1ad7a513973cfd818df2", size = 14098 }, { url = "https://files.pythonhosted.org/packages/1c/cf/35fe557e53709e93feb65575c93927942087e9b97213eabc3fe9d5b25a55/MarkupSafe-2.1.5-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6ec585f69cec0aa07d945b20805be741395e28ac1627333b1c5b0105962ffced", size = 29014 }, { url = "https://files.pythonhosted.org/packages/97/18/c30da5e7a0e7f4603abfc6780574131221d9148f323752c2755d48abad30/MarkupSafe-2.1.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b91c037585eba9095565a3556f611e3cbfaa42ca1e865f7b8015fe5c7336d5a5", size = 28220 }, { url = "https://files.pythonhosted.org/packages/0c/40/2e73e7d532d030b1e41180807a80d564eda53babaf04d65e15c1cf897e40/MarkupSafe-2.1.5-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7502934a33b54030eaf1194c21c692a534196063db72176b0c4028e140f8f32c", size = 27756 }, { url = "https://files.pythonhosted.org/packages/18/46/5dca760547e8c59c5311b332f70605d24c99d1303dd9a6e1fc3ed0d73561/MarkupSafe-2.1.5-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:0e397ac966fdf721b2c528cf028494e86172b4feba51d65f81ffd65c63798f3f", size = 33988 }, { url = "https://files.pythonhosted.org/packages/6d/c5/27febe918ac36397919cd4a67d5579cbbfa8da027fa1238af6285bb368ea/MarkupSafe-2.1.5-cp311-cp311-musllinux_1_1_i686.whl", hash = 
"sha256:c061bb86a71b42465156a3ee7bd58c8c2ceacdbeb95d05a99893e08b8467359a", size = 32718 }, { url = "https://files.pythonhosted.org/packages/f8/81/56e567126a2c2bc2684d6391332e357589a96a76cb9f8e5052d85cb0ead8/MarkupSafe-2.1.5-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:3a57fdd7ce31c7ff06cdfbf31dafa96cc533c21e443d57f5b1ecc6cdc668ec7f", size = 33317 }, { url = "https://files.pythonhosted.org/packages/00/0b/23f4b2470accb53285c613a3ab9ec19dc944eaf53592cb6d9e2af8aa24cc/MarkupSafe-2.1.5-cp311-cp311-win32.whl", hash = "sha256:397081c1a0bfb5124355710fe79478cdbeb39626492b15d399526ae53422b906", size = 16670 }, { url = "https://files.pythonhosted.org/packages/b7/a2/c78a06a9ec6d04b3445a949615c4c7ed86a0b2eb68e44e7541b9d57067cc/MarkupSafe-2.1.5-cp311-cp311-win_amd64.whl", hash = "sha256:2b7c57a4dfc4f16f7142221afe5ba4e093e09e728ca65c51f5620c9aaeb9a617", size = 17224 }, { url = "https://files.pythonhosted.org/packages/53/bd/583bf3e4c8d6a321938c13f49d44024dbe5ed63e0a7ba127e454a66da974/MarkupSafe-2.1.5-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:8dec4936e9c3100156f8a2dc89c4b88d5c435175ff03413b443469c7c8c5f4d1", size = 18215 }, { url = "https://files.pythonhosted.org/packages/48/d6/e7cd795fc710292c3af3a06d80868ce4b02bfbbf370b7cee11d282815a2a/MarkupSafe-2.1.5-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:3c6b973f22eb18a789b1460b4b91bf04ae3f0c4234a0a6aa6b0a92f6f7b951d4", size = 14069 }, { url = "https://files.pythonhosted.org/packages/51/b5/5d8ec796e2a08fc814a2c7d2584b55f889a55cf17dd1a90f2beb70744e5c/MarkupSafe-2.1.5-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ac07bad82163452a6884fe8fa0963fb98c2346ba78d779ec06bd7a6262132aee", size = 29452 }, { url = "https://files.pythonhosted.org/packages/0a/0d/2454f072fae3b5a137c119abf15465d1771319dfe9e4acbb31722a0fff91/MarkupSafe-2.1.5-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f5dfb42c4604dddc8e4305050aa6deb084540643ed5804d7455b5df8fe16f5e5", size = 28462 
}, { url = "https://files.pythonhosted.org/packages/2d/75/fd6cb2e68780f72d47e6671840ca517bda5ef663d30ada7616b0462ad1e3/MarkupSafe-2.1.5-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ea3d8a3d18833cf4304cd2fc9cbb1efe188ca9b5efef2bdac7adc20594a0e46b", size = 27869 }, { url = "https://files.pythonhosted.org/packages/b0/81/147c477391c2750e8fc7705829f7351cf1cd3be64406edcf900dc633feb2/MarkupSafe-2.1.5-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:d050b3361367a06d752db6ead6e7edeb0009be66bc3bae0ee9d97fb326badc2a", size = 33906 }, { url = "https://files.pythonhosted.org/packages/8b/ff/9a52b71839d7a256b563e85d11050e307121000dcebc97df120176b3ad93/MarkupSafe-2.1.5-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:bec0a414d016ac1a18862a519e54b2fd0fc8bbfd6890376898a6c0891dd82e9f", size = 32296 }, { url = "https://files.pythonhosted.org/packages/88/07/2dc76aa51b481eb96a4c3198894f38b480490e834479611a4053fbf08623/MarkupSafe-2.1.5-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:58c98fee265677f63a4385256a6d7683ab1832f3ddd1e66fe948d5880c21a169", size = 33038 }, { url = "https://files.pythonhosted.org/packages/96/0c/620c1fb3661858c0e37eb3cbffd8c6f732a67cd97296f725789679801b31/MarkupSafe-2.1.5-cp312-cp312-win32.whl", hash = "sha256:8590b4ae07a35970728874632fed7bd57b26b0102df2d2b233b6d9d82f6c62ad", size = 16572 }, { url = "https://files.pythonhosted.org/packages/3f/14/c3554d512d5f9100a95e737502f4a2323a1959f6d0d01e0d0997b35f7b10/MarkupSafe-2.1.5-cp312-cp312-win_amd64.whl", hash = "sha256:823b65d8706e32ad2df51ed89496147a42a2a6e01c13cfb6ffb8b1e92bc910bb", size = 17127 }, { url = "https://files.pythonhosted.org/packages/f8/ff/2c942a82c35a49df5de3a630ce0a8456ac2969691b230e530ac12314364c/MarkupSafe-2.1.5-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:656f7526c69fac7f600bd1f400991cc282b417d17539a1b228617081106feb4a", size = 18192 }, { url = 
"https://files.pythonhosted.org/packages/4f/14/6f294b9c4f969d0c801a4615e221c1e084722ea6114ab2114189c5b8cbe0/MarkupSafe-2.1.5-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:97cafb1f3cbcd3fd2b6fbfb99ae11cdb14deea0736fc2b0952ee177f2b813a46", size = 14072 }, { url = "https://files.pythonhosted.org/packages/81/d4/fd74714ed30a1dedd0b82427c02fa4deec64f173831ec716da11c51a50aa/MarkupSafe-2.1.5-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f3fbcb7ef1f16e48246f704ab79d79da8a46891e2da03f8783a5b6fa41a9532", size = 26928 }, { url = "https://files.pythonhosted.org/packages/c7/bd/50319665ce81bb10e90d1cf76f9e1aa269ea6f7fa30ab4521f14d122a3df/MarkupSafe-2.1.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa9db3f79de01457b03d4f01b34cf91bc0048eb2c3846ff26f66687c2f6d16ab", size = 26106 }, { url = "https://files.pythonhosted.org/packages/4c/6f/f2b0f675635b05f6afd5ea03c094557bdb8622fa8e673387444fe8d8e787/MarkupSafe-2.1.5-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ffee1f21e5ef0d712f9033568f8344d5da8cc2869dbd08d87c84656e6a2d2f68", size = 25781 }, { url = "https://files.pythonhosted.org/packages/51/e0/393467cf899b34a9d3678e78961c2c8cdf49fb902a959ba54ece01273fb1/MarkupSafe-2.1.5-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:5dedb4db619ba5a2787a94d877bc8ffc0566f92a01c0ef214865e54ecc9ee5e0", size = 30518 }, { url = "https://files.pythonhosted.org/packages/f6/02/5437e2ad33047290dafced9df741d9efc3e716b75583bbd73a9984f1b6f7/MarkupSafe-2.1.5-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:30b600cf0a7ac9234b2638fbc0fb6158ba5bdcdf46aeb631ead21248b9affbc4", size = 29669 }, { url = "https://files.pythonhosted.org/packages/0e/7d/968284145ffd9d726183ed6237c77938c021abacde4e073020f920e060b2/MarkupSafe-2.1.5-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:8dd717634f5a044f860435c1d8c16a270ddf0ef8588d4887037c5028b859b0c3", size = 29933 }, { url = 
"https://files.pythonhosted.org/packages/bf/f3/ecb00fc8ab02b7beae8699f34db9357ae49d9f21d4d3de6f305f34fa949e/MarkupSafe-2.1.5-cp38-cp38-win32.whl", hash = "sha256:daa4ee5a243f0f20d528d939d06670a298dd39b1ad5f8a72a4275124a7819eff", size = 16656 }, { url = "https://files.pythonhosted.org/packages/92/21/357205f03514a49b293e214ac39de01fadd0970a6e05e4bf1ddd0ffd0881/MarkupSafe-2.1.5-cp38-cp38-win_amd64.whl", hash = "sha256:619bc166c4f2de5caa5a633b8b7326fbe98e0ccbfacabd87268a2b15ff73a029", size = 17206 }, { url = "https://files.pythonhosted.org/packages/0f/31/780bb297db036ba7b7bbede5e1d7f1e14d704ad4beb3ce53fb495d22bc62/MarkupSafe-2.1.5-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:7a68b554d356a91cce1236aa7682dc01df0edba8d043fd1ce607c49dd3c1edcf", size = 18193 }, { url = "https://files.pythonhosted.org/packages/6c/77/d77701bbef72892affe060cdacb7a2ed7fd68dae3b477a8642f15ad3b132/MarkupSafe-2.1.5-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:db0b55e0f3cc0be60c1f19efdde9a637c32740486004f20d1cff53c3c0ece4d2", size = 14073 }, { url = "https://files.pythonhosted.org/packages/d9/a7/1e558b4f78454c8a3a0199292d96159eb4d091f983bc35ef258314fe7269/MarkupSafe-2.1.5-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3e53af139f8579a6d5f7b76549125f0d94d7e630761a2111bc431fd820e163b8", size = 26486 }, { url = "https://files.pythonhosted.org/packages/5f/5a/360da85076688755ea0cceb92472923086993e86b5613bbae9fbc14136b0/MarkupSafe-2.1.5-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:17b950fccb810b3293638215058e432159d2b71005c74371d784862b7e4683f3", size = 25685 }, { url = "https://files.pythonhosted.org/packages/6a/18/ae5a258e3401f9b8312f92b028c54d7026a97ec3ab20bfaddbdfa7d8cce8/MarkupSafe-2.1.5-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4c31f53cdae6ecfa91a77820e8b151dba54ab528ba65dfd235c80b086d68a465", size = 25338 }, { url = 
"https://files.pythonhosted.org/packages/0b/cc/48206bd61c5b9d0129f4d75243b156929b04c94c09041321456fd06a876d/MarkupSafe-2.1.5-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:bff1b4290a66b490a2f4719358c0cdcd9bafb6b8f061e45c7a2460866bf50c2e", size = 30439 }, { url = "https://files.pythonhosted.org/packages/d1/06/a41c112ab9ffdeeb5f77bc3e331fdadf97fa65e52e44ba31880f4e7f983c/MarkupSafe-2.1.5-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:bc1667f8b83f48511b94671e0e441401371dfd0f0a795c7daa4a3cd1dde55bea", size = 29531 }, { url = "https://files.pythonhosted.org/packages/02/8c/ab9a463301a50dab04d5472e998acbd4080597abc048166ded5c7aa768c8/MarkupSafe-2.1.5-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:5049256f536511ee3f7e1b3f87d1d1209d327e818e6ae1365e8653d7e3abb6a6", size = 29823 }, { url = "https://files.pythonhosted.org/packages/bc/29/9bc18da763496b055d8e98ce476c8e718dcfd78157e17f555ce6dd7d0895/MarkupSafe-2.1.5-cp39-cp39-win32.whl", hash = "sha256:00e046b6dd71aa03a41079792f8473dc494d564611a8f89bbbd7cb93295ebdcf", size = 16658 }, { url = "https://files.pythonhosted.org/packages/f6/f8/4da07de16f10551ca1f640c92b5f316f9394088b183c6a57183df6de5ae4/MarkupSafe-2.1.5-cp39-cp39-win_amd64.whl", hash = "sha256:fa173ec60341d6bb97a89f5ea19c85c5643c1e7dedebc22f5181eb73573142c5", size = 17211 }, ] [[package]] name = "mdit-py-plugins" version = "0.4.2" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "markdown-it-py", marker = "python_full_version < '3.9' or python_full_version >= '3.12' or platform_system != 'Windows' or sys_platform == 'win32'" }, ] sdist = { url = "https://files.pythonhosted.org/packages/19/03/a2ecab526543b152300717cf232bb4bb8605b6edb946c845016fa9c9c9fd/mdit_py_plugins-0.4.2.tar.gz", hash = "sha256:5f2cd1fdb606ddf152d37ec30e46101a60512bc0e5fa1a7002c36647b09e26b5", size = 43542 } wheels = [ { url = 
"https://files.pythonhosted.org/packages/a7/f7/7782a043553ee469c1ff49cfa1cdace2d6bf99a1f333cf38676b3ddf30da/mdit_py_plugins-0.4.2-py3-none-any.whl", hash = "sha256:0c673c3f889399a33b95e88d2f0d111b4447bdfea7f237dab2d488f459835636", size = 55316 }, ] [[package]] name = "mdurl" version = "0.1.2" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729 } wheels = [ { url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979 }, ] [[package]] name = "memray" version = "1.14.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "jinja2", marker = "python_full_version < '3.9' or python_full_version >= '3.12' or platform_system != 'Windows' or sys_platform == 'win32'" }, { name = "rich", marker = "python_full_version < '3.9' or python_full_version >= '3.12' or platform_system != 'Windows' or sys_platform == 'win32'" }, { name = "textual", marker = "python_full_version < '3.9' or python_full_version >= '3.12' or platform_system != 'Windows' or sys_platform == 'win32'" }, ] sdist = { url = "https://files.pythonhosted.org/packages/f8/e8/725ac54c543d43479c32be1d7d49ea75b5f46fbe5f77262332cd51e301dc/memray-1.14.0.tar.gz", hash = "sha256:b5d8874b7b215551f0ae9fa8aef3f2f52321a6460dc0141aaf9374709e6b0eb7", size = 1024807 } wheels = [ { url = "https://files.pythonhosted.org/packages/4d/43/9ba9156a69225706e3c52f84410d5f4339de2374ff53a4d047125a3d89ec/memray-1.14.0-cp310-cp310-macosx_10_14_x86_64.whl", hash = "sha256:745d9014cb662065501441a7b534c29914fe2b68398b37385aba9f4a1c51c723", size = 922486 }, { url = 
"https://files.pythonhosted.org/packages/e5/bf/90411ec60b705c4416a6f90dc7621129713c4cf59e60d3fab5526187e7d3/memray-1.14.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f62a402ca1a7126f749544c3d6493672d6330ffd37d59ba230bc73e5143b3bc2", size = 897793 }, { url = "https://files.pythonhosted.org/packages/31/0e/3f9fa83bb5673fe015586d2e93b1f8a3143773e20ed8b48cb95cd7e1c394/memray-1.14.0-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:36840f39277b1871ecb5a9592dd1aa517a17b9f855f4e3ff08aa328a9d305e69", size = 8252322 }, { url = "https://files.pythonhosted.org/packages/d4/0e/c667294a4385b2f6f320ad123c9a15e2f46af6fde14b4240bb9cd1db429f/memray-1.14.0-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:3c7933ca70c0d59d0ce9b1064a6eda86231248759b46ed6dabedf489039d1aa1", size = 8320971 }, { url = "https://files.pythonhosted.org/packages/38/80/969dee8842217e0b8b9a35d9fcf9835858781b455c63b184a8f0e6e6f980/memray-1.14.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:75a5907345ff845652e709ddce3171a9ba2d65c62e8bd49a99131066e2a7ce3b", size = 7942760 }, { url = "https://files.pythonhosted.org/packages/b3/12/873ec138c635bc702012f0303010b428aeeca970191d246a485b23fb9453/memray-1.14.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:88c89c3797834eec177a89ad528699c75b94e2ed08c00754141eae69c520b894", size = 8281919 }, { url = "https://files.pythonhosted.org/packages/ed/c2/7df995f7b022de366e76a1c35f1fcce18a76feb08379d43f068473e79b3f/memray-1.14.0-cp311-cp311-macosx_10_14_x86_64.whl", hash = "sha256:d6087f291fd68acdf0a833efb57bc0f192c98ae89b4377c690c28313e78d029c", size = 926546 }, { url = "https://files.pythonhosted.org/packages/96/0c/19f14555812374bdfe09a0338be9c2c0477d388a1e062d516ddf64664a6d/memray-1.14.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e6ba7bff9dfa37bf3b80a5b83b50eadf20afb1f0e8de4a0139019154086d6bed", size = 900882 }, { url = 
"https://files.pythonhosted.org/packages/4b/1e/3d69f23c60653196967aa5ba1cc97202b66216a7cd493d676efa4926ff3e/memray-1.14.0-cp311-cp311-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:9bb0cfe1b755a860435cd52047b2e3f4f7b0c3887e0c1bf98da7127948284a91", size = 8439375 }, { url = "https://files.pythonhosted.org/packages/99/70/299416caa5aebabc5dfb2c13d24ebfea50908f5b9c47b8dc43cc18b76213/memray-1.14.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:638ba74e1735a40b6595fee7f37b426b9a95d244091a1f5df3dc5d98df1cbd4b", size = 8061494 }, { url = "https://files.pythonhosted.org/packages/1c/c7/02609171bfe977fe5e0f452deac6dfd6b80595c7dbb8ef9e591bdf89a96b/memray-1.14.0-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7227ebf024cb0688a68ed91ed3e05c61a13751a9e875807195076b827bfde464", size = 8159467 }, { url = "https://files.pythonhosted.org/packages/07/1b/0dab2597832024dcb8e4ae505f822bad26d8e11a1e3b16a74331c59d5563/memray-1.14.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:248dea8cfb5a615345e28b7e25c94377a8d198da3b6957ee443afa6f4ff1b733", size = 8442474 }, { url = "https://files.pythonhosted.org/packages/92/fb/f7ae3f29e6e2eff78d6050da8b9086f90a0a9da9a629c36858ba1fdb97ab/memray-1.14.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:7d03f6be66aa259df7fa50082876fbe6461108d77d46c1f720c46067d60685d4", size = 8405651 }, { url = "https://files.pythonhosted.org/packages/59/f8/97cdfaf46bb7d5ad1bd6f2b206f944f1eaf09d29dd1adc8906e2b08cc35a/memray-1.14.0-cp312-cp312-macosx_10_14_x86_64.whl", hash = "sha256:9af9d30b1e484fd8591c9a7f322fd50b9192a2bce660be92385a01555af9968b", size = 926815 }, { url = "https://files.pythonhosted.org/packages/be/c8/ec5997fe3e98f5df2d51152b75094988a6e34431891f3045cd606be5ff89/memray-1.14.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:c4088b391c04796c888ac751b5d387f6e8212b3515d4c53ba540c65a6efe4bda", size = 898403 }, { url = 
"https://files.pythonhosted.org/packages/11/f0/0edec41903e5e54063aa443d9dccf68b6bb44a9789238119adf6458f2e5a/memray-1.14.0-cp312-cp312-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:af8aee7e8e5cac1e4130f1184b3e03b6bb08264e4ba1696551791ed3f8fb824e", size = 8419310 }, { url = "https://files.pythonhosted.org/packages/c1/71/2dfc324a3365ee0914decde34be2f4bb28938892883f0a9414325f2df18f/memray-1.14.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4352f9e85957f2cbe45a9e1c87dfc19d2df77e93dcd8a558794a683eeee57b7b", size = 8013939 }, { url = "https://files.pythonhosted.org/packages/a2/10/70389d7f2357f8d785f7280bea520bcf2fb664fbd1854f73fad261b94140/memray-1.14.0-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5953f0d2aa31b23d4cce20236a03d90b7c71903709a57b456d6494bfe6f470b7", size = 8131908 }, { url = "https://files.pythonhosted.org/packages/8a/b0/0f10448116f95ac8747cef1556872f2874cbb2b199f19ccb7a6ab64b6647/memray-1.14.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2e4ccaca04365efcda51036fe2add980030e33cfc4f3a194a01f530a5c923c65", size = 8402549 }, { url = "https://files.pythonhosted.org/packages/53/37/699b78d255f2d211ed5940ba0eb7622809c25279efee9a28f820a16f18b9/memray-1.14.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:f85a27eb8a65c810161bb992116a66d328546f78a4a4c7c1868949651b917c08", size = 8363797 }, { url = "https://files.pythonhosted.org/packages/6f/dc/924f9c09cf4ea87ade11f45407fe6ed106fd76ff31c0ba30cbf8c4b03917/memray-1.14.0-cp313-cp313-macosx_10_14_x86_64.whl", hash = "sha256:958d57f7149b8fa4831785394f2a7ace93dbc2be6c49a1c07987a8972986474a", size = 922182 }, { url = "https://files.pythonhosted.org/packages/96/27/baeeef7a041b02b3f5149192a2065cba2deb1724357dd4798db56fc8894c/memray-1.14.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:287a01953bc44dd0a32549a23bdacb5f9734e345ca289fa3923867c637715056", size = 893844 }, { url = 
"https://files.pythonhosted.org/packages/be/b5/3f6b5a0dfb1296450263c550cc860d11d8ecfb1165403790520cde568694/memray-1.14.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dfc17cba35d98e3d2ca20ab995f17eec3edba7138b062cbc1aa36d53d9d2d955", size = 8005692 }, { url = "https://files.pythonhosted.org/packages/a0/3b/1e0310539a0f501197a005701d5fc1e3343b42ddf55039e2913fad4ade29/memray-1.14.0-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c82342cead930ca50235f59740ca238808f9c33ef31d994712972966beb6226e", size = 8125205 }, { url = "https://files.pythonhosted.org/packages/d3/d1/967f90923c3027e1bea68f66f19b5f567555d1b0ecbc3627a50bd51480b9/memray-1.14.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a22a826b4047e839310514f4889c24e45a66ea222fca19ac0ae7b2f89bbb0281", size = 8396122 }, { url = "https://files.pythonhosted.org/packages/52/47/92c3002fcf334a6961875320a3104792a4adf12ac5c322d0aa3a96b94140/memray-1.14.0-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:344f3c73b97ffc8f1666b404deafbc31a19e6b2881341b706aa7ec20afb0e8b1", size = 8345742 }, { url = "https://files.pythonhosted.org/packages/bc/03/bd95e1ef113e9fb6dd4775bd2afecc7320e6e16389612970b3b66bc311da/memray-1.14.0-cp38-cp38-macosx_10_14_x86_64.whl", hash = "sha256:443885a96ab9f67d46288240e2593b5c3ecb2c507ddb4e3b10695e104403d001", size = 944828 }, { url = "https://files.pythonhosted.org/packages/6d/2e/3aa9d596d38b5ce85802353bf9a91894616ca452b282b80e6deb8927c8b6/memray-1.14.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:52a45d96ed717d8efb645e99646a92dd21a2ca38bdb823fe22e38c429cba9513", size = 917163 }, { url = "https://files.pythonhosted.org/packages/c1/ad/979e23f804cb917fdd77609a7ae706c520a5d016a23f60f6572f03d1bda6/memray-1.14.0-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:72febec7b287667e8ea9ee3e879a4da19a4318bc47e211da815be74acd961994", size = 8383975 }, { url = 
"https://files.pythonhosted.org/packages/13/4e/166dc9603a3ae5899eb325a4410dabaa971dde92dbe1ac50a1fdf1ae4633/memray-1.14.0-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:4e07bdc3a4979b335c2b6b93a81b807d5aacd8dbbea56c41c6899a8bc0d2beb3", size = 8478926 }, { url = "https://files.pythonhosted.org/packages/71/06/40dc0654004eec8b1e3b55adf907fce4799809eca535d65de6ff022b705e/memray-1.14.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3b5e729d03caf426dc45a258270537a603794ecc067ccfd92f9c67ba9332e788", size = 8032255 }, { url = "https://files.pythonhosted.org/packages/f6/95/50cc6e2eee89d1f849ea4c23a347317c939849045d3af393ba2318052adc/memray-1.14.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:1d0a1397b5387b75dc9d9759beb022cb360948584840e850474d7d39ad267f85", size = 8429853 }, { url = "https://files.pythonhosted.org/packages/13/d0/7a8f86faa015170e05361c051d553c391e2ea53189e117aad0dffe4e1734/memray-1.14.0-cp39-cp39-macosx_10_14_x86_64.whl", hash = "sha256:c119b600e7c665e0713f09e25f9ee09733a98035688ecc1ec8fd168fa37a77f6", size = 923790 }, { url = "https://files.pythonhosted.org/packages/f5/09/9f24dd0d423c0e2af72e39de9d1036494e3703426e144ef0b95988a29363/memray-1.14.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:29a2e7d84d1652ef4664bcceb155630979b4797180b70da35525d963a4ca707f", size = 899142 }, { url = "https://files.pythonhosted.org/packages/4e/ef/278e03685db5d4d58aedf59392de1201df8a298236bc7514c69571905f03/memray-1.14.0-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:b3b8d46b6447cdecba3ba100d47c62e78cdad58b00b2d6ba004d6bad318c8668", size = 8255744 }, { url = "https://files.pythonhosted.org/packages/ac/b3/b22bedf7145cbf1acc329b18d3648593443c8a83b1d0425d2a5913a415fe/memray-1.14.0-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:57f9bf3f1c648f1ea877a84c21c449fdafd8cc105763ada6023e36bae9b45eb8", size = 8320931 }, { url = 
"https://files.pythonhosted.org/packages/4b/a3/b5639ac770d518b48735b68cf790a2013d1fe20da68df04e18a7d0a62469/memray-1.14.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5b7a59346d242fc39041d87a71cb6cf45baf492ffbb69da9690de49346be64a8", size = 7941173 }, { url = "https://files.pythonhosted.org/packages/5d/31/4948a063bc1f69685b2808671b965f069709a2349ef6419a860a5505ea75/memray-1.14.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:11fb00105572b70f2aca8b787ce9748b0c94672fbb6334f1604f7f813ca3dca6", size = 8285122 }, ] [[package]] name = "mergedeep" version = "1.3.4" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/3a/41/580bb4006e3ed0361b8151a01d324fb03f420815446c7def45d02f74c270/mergedeep-1.3.4.tar.gz", hash = "sha256:0096d52e9dad9939c3d975a774666af186eda617e6ca84df4c94dec30004f2a8", size = 4661 } wheels = [ { url = "https://files.pythonhosted.org/packages/2c/19/04f9b178c2d8a15b076c8b5140708fa6ffc5601fb6f1e975537072df5b2a/mergedeep-1.3.4-py3-none-any.whl", hash = "sha256:70775750742b25c0d8f36c55aed03d24c3384d17c951b3175d898bd778ef0307", size = 6354 }, ] [[package]] name = "mike" version = "2.1.3" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "importlib-metadata" }, { name = "importlib-resources" }, { name = "jinja2" }, { name = "mkdocs" }, { name = "pyparsing" }, { name = "pyyaml" }, { name = "pyyaml-env-tag" }, { name = "verspec" }, ] sdist = { url = "https://files.pythonhosted.org/packages/ab/f7/2933f1a1fb0e0f077d5d6a92c6c7f8a54e6128241f116dff4df8b6050bbf/mike-2.1.3.tar.gz", hash = "sha256:abd79b8ea483fb0275b7972825d3082e5ae67a41820f8d8a0dc7a3f49944e810", size = 38119 } wheels = [ { url = "https://files.pythonhosted.org/packages/fd/1a/31b7cd6e4e7a02df4e076162e9783620777592bea9e4bb036389389af99d/mike-2.1.3-py3-none-any.whl", hash = "sha256:d90c64077e84f06272437b464735130d380703a76a5738b152932884c60c062a", size = 33754 }, ] [[package]] name = 
"mkdocs" version = "1.6.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "click" }, { name = "colorama", marker = "platform_system == 'Windows'" }, { name = "ghp-import" }, { name = "importlib-metadata", marker = "python_full_version < '3.10'" }, { name = "jinja2" }, { name = "markdown" }, { name = "markupsafe" }, { name = "mergedeep" }, { name = "mkdocs-get-deps" }, { name = "packaging" }, { name = "pathspec" }, { name = "pyyaml" }, { name = "pyyaml-env-tag" }, { name = "watchdog" }, ] sdist = { url = "https://files.pythonhosted.org/packages/bc/c6/bbd4f061bd16b378247f12953ffcb04786a618ce5e904b8c5a01a0309061/mkdocs-1.6.1.tar.gz", hash = "sha256:7b432f01d928c084353ab39c57282f29f92136665bdd6abf7c1ec8d822ef86f2", size = 3889159 } wheels = [ { url = "https://files.pythonhosted.org/packages/22/5b/dbc6a8cddc9cfa9c4971d59fb12bb8d42e161b7e7f8cc89e49137c5b279c/mkdocs-1.6.1-py3-none-any.whl", hash = "sha256:db91759624d1647f3f34aa0c3f327dd2601beae39a366d6e064c03468d35c20e", size = 3864451 }, ] [[package]] name = "mkdocs-autorefs" version = "1.2.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "markdown" }, { name = "markupsafe" }, { name = "mkdocs" }, ] sdist = { url = "https://files.pythonhosted.org/packages/fb/ae/0f1154c614d6a8b8a36fff084e5b82af3a15f7d2060cf0dcdb1c53297a71/mkdocs_autorefs-1.2.0.tar.gz", hash = "sha256:a86b93abff653521bda71cf3fc5596342b7a23982093915cb74273f67522190f", size = 40262 } wheels = [ { url = "https://files.pythonhosted.org/packages/71/26/4d39d52ea2219604053a4d05b98e90d6a335511cc01806436ec4886b1028/mkdocs_autorefs-1.2.0-py3-none-any.whl", hash = "sha256:d588754ae89bd0ced0c70c06f58566a4ee43471eeeee5202427da7de9ef85a2f", size = 16522 }, ] [[package]] name = "mkdocs-exclude" version = "1.0.2" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "mkdocs" }, ] sdist = { url = 
"https://files.pythonhosted.org/packages/54/b5/3a8e289282c9e8d7003f8a2f53d673d4fdaa81d493dc6966092d9985b6fc/mkdocs-exclude-1.0.2.tar.gz", hash = "sha256:ba6fab3c80ddbe3fd31d3e579861fd3124513708271180a5f81846da8c7e2a51", size = 6751 } [[package]] name = "mkdocs-get-deps" version = "0.2.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "importlib-metadata", marker = "python_full_version < '3.10'" }, { name = "mergedeep" }, { name = "platformdirs" }, { name = "pyyaml" }, ] sdist = { url = "https://files.pythonhosted.org/packages/98/f5/ed29cd50067784976f25ed0ed6fcd3c2ce9eb90650aa3b2796ddf7b6870b/mkdocs_get_deps-0.2.0.tar.gz", hash = "sha256:162b3d129c7fad9b19abfdcb9c1458a651628e4b1dea628ac68790fb3061c60c", size = 10239 } wheels = [ { url = "https://files.pythonhosted.org/packages/9f/d4/029f984e8d3f3b6b726bd33cafc473b75e9e44c0f7e80a5b29abc466bdea/mkdocs_get_deps-0.2.0-py3-none-any.whl", hash = "sha256:2bf11d0b133e77a0dd036abeeb06dec8775e46efa526dc70667d8863eefc6134", size = 9521 }, ] [[package]] name = "mkdocs-material" version = "9.5.44" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "babel" }, { name = "colorama" }, { name = "jinja2" }, { name = "markdown" }, { name = "mkdocs" }, { name = "mkdocs-material-extensions" }, { name = "paginate" }, { name = "pygments" }, { name = "pymdown-extensions" }, { name = "regex" }, { name = "requests" }, ] sdist = { url = "https://files.pythonhosted.org/packages/f7/56/182d8121db9ab553cdf9bc58d5972b89833f60b63272f693c1f2b849b640/mkdocs_material-9.5.44.tar.gz", hash = "sha256:f3a6c968e524166b3f3ed1fb97d3ed3e0091183b0545cedf7156a2a6804c56c0", size = 3964306 } wheels = [ { url = "https://files.pythonhosted.org/packages/ed/eb/a801d00e0e210d82184aacce596906ec065422c78a7319244ba0771c4ded/mkdocs_material-9.5.44-py3-none-any.whl", hash = "sha256:47015f9c167d58a5ff5e682da37441fc4d66a1c79334bfc08d774763cacf69ca", size = 8674509 }, ] [package.optional-dependencies] imaging = [ { name = 
"cairosvg" }, { name = "pillow" }, ] [[package]] name = "mkdocs-material-extensions" version = "1.3.1" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/79/9b/9b4c96d6593b2a541e1cb8b34899a6d021d208bb357042823d4d2cabdbe7/mkdocs_material_extensions-1.3.1.tar.gz", hash = "sha256:10c9511cea88f568257f960358a467d12b970e1f7b2c0e5fb2bb48cab1928443", size = 11847 } wheels = [ { url = "https://files.pythonhosted.org/packages/5b/54/662a4743aa81d9582ee9339d4ffa3c8fd40a4965e033d77b9da9774d3960/mkdocs_material_extensions-1.3.1-py3-none-any.whl", hash = "sha256:adff8b62700b25cb77b53358dad940f3ef973dd6db797907c49e3c2ef3ab4e31", size = 8728 }, ] [[package]] name = "mkdocs-redirects" version = "1.2.2" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "mkdocs" }, ] sdist = { url = "https://files.pythonhosted.org/packages/f1/a8/6d44a6cf07e969c7420cb36ab287b0669da636a2044de38a7d2208d5a758/mkdocs_redirects-1.2.2.tar.gz", hash = "sha256:3094981b42ffab29313c2c1b8ac3969861109f58b2dd58c45fc81cd44bfa0095", size = 7162 } wheels = [ { url = "https://files.pythonhosted.org/packages/c4/ec/38443b1f2a3821bbcb24e46cd8ba979154417794d54baf949fefde1c2146/mkdocs_redirects-1.2.2-py3-none-any.whl", hash = "sha256:7dbfa5647b79a3589da4401403d69494bd1f4ad03b9c15136720367e1f340ed5", size = 6142 }, ] [[package]] name = "mkdocstrings" version = "0.26.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "click" }, { name = "importlib-metadata", marker = "python_full_version < '3.10'" }, { name = "jinja2" }, { name = "markdown" }, { name = "markupsafe" }, { name = "mkdocs" }, { name = "mkdocs-autorefs" }, { name = "platformdirs" }, { name = "pymdown-extensions" }, { name = "typing-extensions", marker = "python_full_version < '3.10'" }, ] sdist = { url = "https://files.pythonhosted.org/packages/e6/bf/170ff04de72227f715d67da32950c7b8434449f3805b2ec3dd1085db4d7c/mkdocstrings-0.26.1.tar.gz", hash = 
"sha256:bb8b8854d6713d5348ad05b069a09f3b79edbc6a0f33a34c6821141adb03fe33", size = 92677 } wheels = [ { url = "https://files.pythonhosted.org/packages/23/cc/8ba127aaee5d1e9046b0d33fa5b3d17da95a9d705d44902792e0569257fd/mkdocstrings-0.26.1-py3-none-any.whl", hash = "sha256:29738bfb72b4608e8e55cc50fb8a54f325dc7ebd2014e4e3881a49892d5983cf", size = 29643 }, ] [[package]] name = "mkdocstrings-python" version = "1.11.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "griffe" }, { name = "mkdocs-autorefs" }, { name = "mkdocstrings" }, ] sdist = { url = "https://files.pythonhosted.org/packages/fc/ba/534c934cd0a809f51c91332d6ed278782ee4126b8ba8db02c2003f162b47/mkdocstrings_python-1.11.1.tar.gz", hash = "sha256:8824b115c5359304ab0b5378a91f6202324a849e1da907a3485b59208b797322", size = 166890 } wheels = [ { url = "https://files.pythonhosted.org/packages/2f/f2/2a2c48fda645ac6bbe73bcc974587a579092b6868e6ff8bc6d177f4db38a/mkdocstrings_python-1.11.1-py3-none-any.whl", hash = "sha256:a21a1c05acef129a618517bb5aae3e33114f569b11588b1e7af3e9d4061a71af", size = 109297 }, ] [[package]] name = "mypy" version = "1.12.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "mypy-extensions" }, { name = "tomli", marker = "python_full_version < '3.11'" }, { name = "typing-extensions" }, ] sdist = { url = "https://files.pythonhosted.org/packages/17/03/744330105a74dc004578f47ec27e1bf66b1dd5664ea444d18423e41343bd/mypy-1.12.1.tar.gz", hash = "sha256:f5b3936f7a6d0e8280c9bdef94c7ce4847f5cdfc258fbb2c29a8c1711e8bb96d", size = 3150767 } wheels = [ { url = "https://files.pythonhosted.org/packages/16/90/3a83d3bcff2eb85151723f116336bd545995b5260a49d3e0d95213fcc2d7/mypy-1.12.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:3d7d4371829184e22fda4015278fbfdef0327a4b955a483012bd2d423a788801", size = 11017908 }, { url = 
"https://files.pythonhosted.org/packages/e4/5c/d6b32ddde2460fc63168ca0f7bf44f38474353547f7c0304a30023c40aa0/mypy-1.12.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f59f1dfbf497d473201356966e353ef09d4daec48caeacc0254db8ef633a28a5", size = 10184164 }, { url = "https://files.pythonhosted.org/packages/42/5e/680aa37c938e6db23bd7e6dd4d38d7e609998491721e453b32ec10d31e7f/mypy-1.12.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b947097fae68004b8328c55161ac9db7d3566abfef72d9d41b47a021c2fba6b1", size = 12587852 }, { url = "https://files.pythonhosted.org/packages/9e/0f/9cafea1c3aaf852cfa1d4a387f33923b6d9714b5c16eb0469da67c5c31e4/mypy-1.12.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:96af62050971c5241afb4701c15189ea9507db89ad07794a4ee7b4e092dc0627", size = 13106489 }, { url = "https://files.pythonhosted.org/packages/ea/c3/7f56d5d87a81e665de8dfa424120ab3a6954ae5854946cec0a46f78f6168/mypy-1.12.1-cp310-cp310-win_amd64.whl", hash = "sha256:d90da248f4c2dba6c44ddcfea94bb361e491962f05f41990ff24dbd09969ce20", size = 9634753 }, { url = "https://files.pythonhosted.org/packages/18/0a/70de7c97a86cb85535077ab5cef1cbc4e2812fd2e9cc21d78eb561a6b80f/mypy-1.12.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1230048fec1380faf240be6385e709c8570604d2d27ec6ca7e573e3bc09c3735", size = 10940998 }, { url = "https://files.pythonhosted.org/packages/c0/97/9ed6d4834d7549936ab88533b302184fb568a0940c4000d2aaee6dc07112/mypy-1.12.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:02dcfe270c6ea13338210908f8cadc8d31af0f04cee8ca996438fe6a97b4ec66", size = 10108523 }, { url = "https://files.pythonhosted.org/packages/48/41/1686f37d09c915dfc5b683e20cc99dabac199900b5ca6d22747b99ddcb50/mypy-1.12.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a5a437c9102a6a252d9e3a63edc191a3aed5f2fcb786d614722ee3f4472e33f6", size = 12505553 }, { url = 
"https://files.pythonhosted.org/packages/8d/2b/2dbcaa7e97b23f27ced77493256ee878f4a140ac750e198630ff1b9b60c6/mypy-1.12.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:186e0c8346efc027ee1f9acf5ca734425fc4f7dc2b60144f0fbe27cc19dc7931", size = 12988634 }, { url = "https://files.pythonhosted.org/packages/54/55/710d082e91a2ccaea21214229b11f9215a9d22446f949491b5457655e82b/mypy-1.12.1-cp311-cp311-win_amd64.whl", hash = "sha256:673ba1140a478b50e6d265c03391702fa11a5c5aff3f54d69a62a48da32cb811", size = 9630747 }, { url = "https://files.pythonhosted.org/packages/8a/74/b9e0e4f06e951e277058f878302faa154d282ca11274c59fe08353f52949/mypy-1.12.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:9fb83a7be97c498176fb7486cafbb81decccaef1ac339d837c377b0ce3743a7f", size = 11079902 }, { url = "https://files.pythonhosted.org/packages/9f/62/fcad290769db3eb0de265094cef5c94d6075c70bc1e42b67eee4ca192dcc/mypy-1.12.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:389e307e333879c571029d5b93932cf838b811d3f5395ed1ad05086b52148fb0", size = 10072373 }, { url = "https://files.pythonhosted.org/packages/cb/27/9ac78349c2952e4446288ec1174675ab9e0160ed18c2cb1154fa456c54e8/mypy-1.12.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:94b2048a95a21f7a9ebc9fbd075a4fcd310410d078aa0228dbbad7f71335e042", size = 12589779 }, { url = "https://files.pythonhosted.org/packages/7c/4a/58cebd122cf1cba95680ac51303fbeb508392413ca64e3e711aa7d4877aa/mypy-1.12.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ee5932370ccf7ebf83f79d1c157a5929d7ea36313027b0d70a488493dc1b179", size = 13044459 }, { url = "https://files.pythonhosted.org/packages/5b/c7/672935e2a3f9bcc07b1b870395a653f665657bef3cdaa504ad99f56eadf0/mypy-1.12.1-cp312-cp312-win_amd64.whl", hash = "sha256:19bf51f87a295e7ab2894f1d8167622b063492d754e69c3c2fed6563268cb42a", size = 9731919 }, { url = 
"https://files.pythonhosted.org/packages/bb/b0/092be5094840a401940c95224f63bb2a8f09bce9251ac1df180ec523830c/mypy-1.12.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:d34167d43613ffb1d6c6cdc0cc043bb106cac0aa5d6a4171f77ab92a3c758bcc", size = 11068611 }, { url = "https://files.pythonhosted.org/packages/9a/86/f20f53b8f062876c39602243d7a59b5cabd6b24315d8de511d607fa4de6a/mypy-1.12.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:427878aa54f2e2c5d8db31fa9010c599ed9f994b3b49e64ae9cd9990c40bd635", size = 10068036 }, { url = "https://files.pythonhosted.org/packages/84/c7/1dbd6575785522da1d4c1ac2c419505fcf23bee74811880cac447a4a77ab/mypy-1.12.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5fcde63ea2c9f69d6be859a1e6dd35955e87fa81de95bc240143cf00de1f7f81", size = 12585671 }, { url = "https://files.pythonhosted.org/packages/46/8a/f6ae18b446eb2bccce54c4bd94065bcfe417d6c67021dcc032bf1e720aff/mypy-1.12.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:d54d840f6c052929f4a3d2aab2066af0f45a020b085fe0e40d4583db52aab4e4", size = 13036083 }, { url = "https://files.pythonhosted.org/packages/59/e6/fc65fde3dc7156fce8d49ba21c7b1f5d866ad50467bf196ca94a7f6d2c9e/mypy-1.12.1-cp313-cp313-win_amd64.whl", hash = "sha256:20db6eb1ca3d1de8ece00033b12f793f1ea9da767334b7e8c626a4872090cf02", size = 9735467 }, { url = "https://files.pythonhosted.org/packages/bf/b2/69daed06c59a8136af49e743408daa088c30c1cc95d7d366ebf3aacada75/mypy-1.12.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:b16fe09f9c741d85a2e3b14a5257a27a4f4886c171d562bc5a5e90d8591906b8", size = 10954793 }, { url = "https://files.pythonhosted.org/packages/cb/49/60b91ab7cae846a71eeb017e2a401dbdb735c9f6be5883ad7ad109487189/mypy-1.12.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:0dcc1e843d58f444fce19da4cce5bd35c282d4bde232acdeca8279523087088a", size = 10139304 }, { url = 
"https://files.pythonhosted.org/packages/a4/d0/bf417eae3103727e00ee6ec5b43d01cf7741a2ef021d9b4d369293612068/mypy-1.12.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e10ba7de5c616e44ad21005fa13450cd0de7caaa303a626147d45307492e4f2d", size = 12542494 }, { url = "https://files.pythonhosted.org/packages/36/37/b9ea809d0ed484aacf26615929f337d38833108e6543fa58ee7cd73aa565/mypy-1.12.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:0e6fe449223fa59fbee351db32283838a8fee8059e0028e9e6494a03802b4004", size = 13051241 }, { url = "https://files.pythonhosted.org/packages/69/8b/20212fb38adb3c6b7f8a84d8faf2f480cb8f8235544a695877181ffba2d7/mypy-1.12.1-cp38-cp38-win_amd64.whl", hash = "sha256:dc6e2a2195a290a7fd5bac3e60b586d77fc88e986eba7feced8b778c373f9afe", size = 9616931 }, { url = "https://files.pythonhosted.org/packages/ee/dc/98c84202135943428963858bbd4d469b44f0fe0659c885f2e790fc77a9e3/mypy-1.12.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:de5b2a8988b4e1269a98beaf0e7cc71b510d050dce80c343b53b4955fff45f19", size = 11014712 }, { url = "https://files.pythonhosted.org/packages/20/6a/7fd68f58f457efb6d30206c5c454ef26cd71ed6f7dbc87dc9cee68e1d805/mypy-1.12.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:843826966f1d65925e8b50d2b483065c51fc16dc5d72647e0236aae51dc8d77e", size = 10179252 }, { url = "https://files.pythonhosted.org/packages/bd/31/4948ea5e9331d1fec202fdeb2a3184b53752f7e914533d884500dba3233f/mypy-1.12.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9fe20f89da41a95e14c34b1ddb09c80262edcc295ad891f22cc4b60013e8f78d", size = 12585725 }, { url = "https://files.pythonhosted.org/packages/6d/6d/76c52b69799a0338e9a938eee3171a9794c4b9b7eba80080aee20355eb31/mypy-1.12.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:8135ffec02121a75f75dc97c81af7c14aa4ae0dda277132cfcd6abcd21551bfd", size = 13103153 }, { url = 
"https://files.pythonhosted.org/packages/5c/b1/e77a79a4895e1a4009f07bc1b8639f251bb5dd3029d7110a3c07d76f021b/mypy-1.12.1-cp39-cp39-win_amd64.whl", hash = "sha256:a7b76fa83260824300cc4834a3ab93180db19876bce59af921467fd03e692810", size = 9631585 }, { url = "https://files.pythonhosted.org/packages/84/6b/1db9de4e0764778251fb2d64cb7455cf6db75dc99c9f72c8b7e74b6a8a17/mypy-1.12.1-py3-none-any.whl", hash = "sha256:ce561a09e3bb9863ab77edf29ae3a50e65685ad74bba1431278185b7e5d5486e", size = 2646060 }, ] [[package]] name = "mypy-extensions" version = "1.0.0" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/98/a4/1ab47638b92648243faf97a5aeb6ea83059cc3624972ab6b8d2316078d3f/mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782", size = 4433 } wheels = [ { url = "https://files.pythonhosted.org/packages/2a/e2/5d3f6ada4297caebe1a2add3b126fe800c96f56dbe5d1988a2cbe0b267aa/mypy_extensions-1.0.0-py3-none-any.whl", hash = "sha256:4392f6c0eb8a5668a69e23d168ffa70f0be9ccfd32b5cc2d26a34ae5b844552d", size = 4695 }, ] [[package]] name = "nodeenv" version = "1.9.1" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/43/16/fc88b08840de0e0a72a2f9d8c6bae36be573e475a6326ae854bcc549fc45/nodeenv-1.9.1.tar.gz", hash = "sha256:6ec12890a2dab7946721edbfbcd91f3319c6ccc9aec47be7c7e6b7011ee6645f", size = 47437 } wheels = [ { url = "https://files.pythonhosted.org/packages/d2/1d/1b658dbd2b9fa9c4c9f32accbfc0205d532c8c6194dc0f2a4c0428e7128a/nodeenv-1.9.1-py2.py3-none-any.whl", hash = "sha256:ba11c9782d29c27c70ffbdda2d7415098754709be8a7056d79a737cd901155c9", size = 22314 }, ] [[package]] name = "packaging" version = "24.2" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/d0/63/68dbb6eb2de9cb10ee4c9c14a0148804425e13c4fb20d61cce69f53106da/packaging-24.2.tar.gz", hash = 
"sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f", size = 163950 } wheels = [ { url = "https://files.pythonhosted.org/packages/88/ef/eb23f262cca3c0c4eb7ab1933c3b1f03d021f2c48f54763065b6f0e321be/packaging-24.2-py3-none-any.whl", hash = "sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759", size = 65451 }, ] [[package]] name = "paginate" version = "0.5.7" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/ec/46/68dde5b6bc00c1296ec6466ab27dddede6aec9af1b99090e1107091b3b84/paginate-0.5.7.tar.gz", hash = "sha256:22bd083ab41e1a8b4f3690544afb2c60c25e5c9a63a30fa2f483f6c60c8e5945", size = 19252 } wheels = [ { url = "https://files.pythonhosted.org/packages/90/96/04b8e52da071d28f5e21a805b19cb9390aa17a47462ac87f5e2696b9566d/paginate-0.5.7-py2.py3-none-any.whl", hash = "sha256:b885e2af73abcf01d9559fd5216b57ef722f8c42affbb63942377668e35c7591", size = 13746 }, ] [[package]] name = "pathspec" version = "0.12.1" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/ca/bc/f35b8446f4531a7cb215605d100cd88b7ac6f44ab3fc94870c120ab3adbf/pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712", size = 51043 } wheels = [ { url = "https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08", size = 31191 }, ] [[package]] name = "pillow" version = "10.4.0" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/cd/74/ad3d526f3bf7b6d3f408b73fde271ec69dfac8b81341a318ce825f2b3812/pillow-10.4.0.tar.gz", hash = "sha256:166c1cd4d24309b30d61f79f4a9114b7b2313d7450912277855ff5dfd7cd4a06", size = 46555059 } wheels = [ { url = 
"https://files.pythonhosted.org/packages/0e/69/a31cccd538ca0b5272be2a38347f8839b97a14be104ea08b0db92f749c74/pillow-10.4.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:4d9667937cfa347525b319ae34375c37b9ee6b525440f3ef48542fcf66f2731e", size = 3509271 }, { url = "https://files.pythonhosted.org/packages/9a/9e/4143b907be8ea0bce215f2ae4f7480027473f8b61fcedfda9d851082a5d2/pillow-10.4.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:543f3dc61c18dafb755773efc89aae60d06b6596a63914107f75459cf984164d", size = 3375658 }, { url = "https://files.pythonhosted.org/packages/8a/25/1fc45761955f9359b1169aa75e241551e74ac01a09f487adaaf4c3472d11/pillow-10.4.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7928ecbf1ece13956b95d9cbcfc77137652b02763ba384d9ab508099a2eca856", size = 4332075 }, { url = "https://files.pythonhosted.org/packages/5e/dd/425b95d0151e1d6c951f45051112394f130df3da67363b6bc75dc4c27aba/pillow-10.4.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e4d49b85c4348ea0b31ea63bc75a9f3857869174e2bf17e7aba02945cd218e6f", size = 4444808 }, { url = "https://files.pythonhosted.org/packages/b1/84/9a15cc5726cbbfe7f9f90bfb11f5d028586595907cd093815ca6644932e3/pillow-10.4.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:6c762a5b0997f5659a5ef2266abc1d8851ad7749ad9a6a5506eb23d314e4f46b", size = 4356290 }, { url = "https://files.pythonhosted.org/packages/b5/5b/6651c288b08df3b8c1e2f8c1152201e0b25d240e22ddade0f1e242fc9fa0/pillow-10.4.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:a985e028fc183bf12a77a8bbf36318db4238a3ded7fa9df1b9a133f1cb79f8fc", size = 4525163 }, { url = "https://files.pythonhosted.org/packages/07/8b/34854bf11a83c248505c8cb0fcf8d3d0b459a2246c8809b967963b6b12ae/pillow-10.4.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:812f7342b0eee081eaec84d91423d1b4650bb9828eb53d8511bcef8ce5aecf1e", size = 4463100 }, { url = 
"https://files.pythonhosted.org/packages/78/63/0632aee4e82476d9cbe5200c0cdf9ba41ee04ed77887432845264d81116d/pillow-10.4.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:ac1452d2fbe4978c2eec89fb5a23b8387aba707ac72810d9490118817d9c0b46", size = 4592880 }, { url = "https://files.pythonhosted.org/packages/df/56/b8663d7520671b4398b9d97e1ed9f583d4afcbefbda3c6188325e8c297bd/pillow-10.4.0-cp310-cp310-win32.whl", hash = "sha256:bcd5e41a859bf2e84fdc42f4edb7d9aba0a13d29a2abadccafad99de3feff984", size = 2235218 }, { url = "https://files.pythonhosted.org/packages/f4/72/0203e94a91ddb4a9d5238434ae6c1ca10e610e8487036132ea9bf806ca2a/pillow-10.4.0-cp310-cp310-win_amd64.whl", hash = "sha256:ecd85a8d3e79cd7158dec1c9e5808e821feea088e2f69a974db5edf84dc53141", size = 2554487 }, { url = "https://files.pythonhosted.org/packages/bd/52/7e7e93d7a6e4290543f17dc6f7d3af4bd0b3dd9926e2e8a35ac2282bc5f4/pillow-10.4.0-cp310-cp310-win_arm64.whl", hash = "sha256:ff337c552345e95702c5fde3158acb0625111017d0e5f24bf3acdb9cc16b90d1", size = 2243219 }, { url = "https://files.pythonhosted.org/packages/a7/62/c9449f9c3043c37f73e7487ec4ef0c03eb9c9afc91a92b977a67b3c0bbc5/pillow-10.4.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:0a9ec697746f268507404647e531e92889890a087e03681a3606d9b920fbee3c", size = 3509265 }, { url = "https://files.pythonhosted.org/packages/f4/5f/491dafc7bbf5a3cc1845dc0430872e8096eb9e2b6f8161509d124594ec2d/pillow-10.4.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:dfe91cb65544a1321e631e696759491ae04a2ea11d36715eca01ce07284738be", size = 3375655 }, { url = "https://files.pythonhosted.org/packages/73/d5/c4011a76f4207a3c151134cd22a1415741e42fa5ddecec7c0182887deb3d/pillow-10.4.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5dc6761a6efc781e6a1544206f22c80c3af4c8cf461206d46a1e6006e4429ff3", size = 4340304 }, { url = 
"https://files.pythonhosted.org/packages/ac/10/c67e20445a707f7a610699bba4fe050583b688d8cd2d202572b257f46600/pillow-10.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5e84b6cc6a4a3d76c153a6b19270b3526a5a8ed6b09501d3af891daa2a9de7d6", size = 4452804 }, { url = "https://files.pythonhosted.org/packages/a9/83/6523837906d1da2b269dee787e31df3b0acb12e3d08f024965a3e7f64665/pillow-10.4.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:bbc527b519bd3aa9d7f429d152fea69f9ad37c95f0b02aebddff592688998abe", size = 4365126 }, { url = "https://files.pythonhosted.org/packages/ba/e5/8c68ff608a4203085158cff5cc2a3c534ec384536d9438c405ed6370d080/pillow-10.4.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:76a911dfe51a36041f2e756b00f96ed84677cdeb75d25c767f296c1c1eda1319", size = 4533541 }, { url = "https://files.pythonhosted.org/packages/f4/7c/01b8dbdca5bc6785573f4cee96e2358b0918b7b2c7b60d8b6f3abf87a070/pillow-10.4.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:59291fb29317122398786c2d44427bbd1a6d7ff54017075b22be9d21aa59bd8d", size = 4471616 }, { url = "https://files.pythonhosted.org/packages/c8/57/2899b82394a35a0fbfd352e290945440e3b3785655a03365c0ca8279f351/pillow-10.4.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:416d3a5d0e8cfe4f27f574362435bc9bae57f679a7158e0096ad2beb427b8696", size = 4600802 }, { url = "https://files.pythonhosted.org/packages/4d/d7/a44f193d4c26e58ee5d2d9db3d4854b2cfb5b5e08d360a5e03fe987c0086/pillow-10.4.0-cp311-cp311-win32.whl", hash = "sha256:7086cc1d5eebb91ad24ded9f58bec6c688e9f0ed7eb3dbbf1e4800280a896496", size = 2235213 }, { url = "https://files.pythonhosted.org/packages/c1/d0/5866318eec2b801cdb8c82abf190c8343d8a1cd8bf5a0c17444a6f268291/pillow-10.4.0-cp311-cp311-win_amd64.whl", hash = "sha256:cbed61494057c0f83b83eb3a310f0bf774b09513307c434d4366ed64f4128a91", size = 2554498 }, { url = 
"https://files.pythonhosted.org/packages/d4/c8/310ac16ac2b97e902d9eb438688de0d961660a87703ad1561fd3dfbd2aa0/pillow-10.4.0-cp311-cp311-win_arm64.whl", hash = "sha256:f5f0c3e969c8f12dd2bb7e0b15d5c468b51e5017e01e2e867335c81903046a22", size = 2243219 }, { url = "https://files.pythonhosted.org/packages/05/cb/0353013dc30c02a8be34eb91d25e4e4cf594b59e5a55ea1128fde1e5f8ea/pillow-10.4.0-cp312-cp312-macosx_10_10_x86_64.whl", hash = "sha256:673655af3eadf4df6b5457033f086e90299fdd7a47983a13827acf7459c15d94", size = 3509350 }, { url = "https://files.pythonhosted.org/packages/e7/cf/5c558a0f247e0bf9cec92bff9b46ae6474dd736f6d906315e60e4075f737/pillow-10.4.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:866b6942a92f56300012f5fbac71f2d610312ee65e22f1aa2609e491284e5597", size = 3374980 }, { url = "https://files.pythonhosted.org/packages/84/48/6e394b86369a4eb68b8a1382c78dc092245af517385c086c5094e3b34428/pillow-10.4.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:29dbdc4207642ea6aad70fbde1a9338753d33fb23ed6956e706936706f52dd80", size = 4343799 }, { url = "https://files.pythonhosted.org/packages/3b/f3/a8c6c11fa84b59b9df0cd5694492da8c039a24cd159f0f6918690105c3be/pillow-10.4.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf2342ac639c4cf38799a44950bbc2dfcb685f052b9e262f446482afaf4bffca", size = 4459973 }, { url = "https://files.pythonhosted.org/packages/7d/1b/c14b4197b80150fb64453585247e6fb2e1d93761fa0fa9cf63b102fde822/pillow-10.4.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:f5b92f4d70791b4a67157321c4e8225d60b119c5cc9aee8ecf153aace4aad4ef", size = 4370054 }, { url = "https://files.pythonhosted.org/packages/55/77/40daddf677897a923d5d33329acd52a2144d54a9644f2a5422c028c6bf2d/pillow-10.4.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:86dcb5a1eb778d8b25659d5e4341269e8590ad6b4e8b44d9f4b07f8d136c414a", size = 4539484 }, { url = 
"https://files.pythonhosted.org/packages/40/54/90de3e4256b1207300fb2b1d7168dd912a2fb4b2401e439ba23c2b2cabde/pillow-10.4.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:780c072c2e11c9b2c7ca37f9a2ee8ba66f44367ac3e5c7832afcfe5104fd6d1b", size = 4477375 }, { url = "https://files.pythonhosted.org/packages/13/24/1bfba52f44193860918ff7c93d03d95e3f8748ca1de3ceaf11157a14cf16/pillow-10.4.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:37fb69d905be665f68f28a8bba3c6d3223c8efe1edf14cc4cfa06c241f8c81d9", size = 4608773 }, { url = "https://files.pythonhosted.org/packages/55/04/5e6de6e6120451ec0c24516c41dbaf80cce1b6451f96561235ef2429da2e/pillow-10.4.0-cp312-cp312-win32.whl", hash = "sha256:7dfecdbad5c301d7b5bde160150b4db4c659cee2b69589705b6f8a0c509d9f42", size = 2235690 }, { url = "https://files.pythonhosted.org/packages/74/0a/d4ce3c44bca8635bd29a2eab5aa181b654a734a29b263ca8efe013beea98/pillow-10.4.0-cp312-cp312-win_amd64.whl", hash = "sha256:1d846aea995ad352d4bdcc847535bd56e0fd88d36829d2c90be880ef1ee4668a", size = 2554951 }, { url = "https://files.pythonhosted.org/packages/b5/ca/184349ee40f2e92439be9b3502ae6cfc43ac4b50bc4fc6b3de7957563894/pillow-10.4.0-cp312-cp312-win_arm64.whl", hash = "sha256:e553cad5179a66ba15bb18b353a19020e73a7921296a7979c4a2b7f6a5cd57f9", size = 2243427 }, { url = "https://files.pythonhosted.org/packages/c3/00/706cebe7c2c12a6318aabe5d354836f54adff7156fd9e1bd6c89f4ba0e98/pillow-10.4.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8bc1a764ed8c957a2e9cacf97c8b2b053b70307cf2996aafd70e91a082e70df3", size = 3525685 }, { url = "https://files.pythonhosted.org/packages/cf/76/f658cbfa49405e5ecbfb9ba42d07074ad9792031267e782d409fd8fe7c69/pillow-10.4.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:6209bb41dc692ddfee4942517c19ee81b86c864b626dbfca272ec0f7cff5d9fb", size = 3374883 }, { url = 
"https://files.pythonhosted.org/packages/46/2b/99c28c4379a85e65378211971c0b430d9c7234b1ec4d59b2668f6299e011/pillow-10.4.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bee197b30783295d2eb680b311af15a20a8b24024a19c3a26431ff83eb8d1f70", size = 4339837 }, { url = "https://files.pythonhosted.org/packages/f1/74/b1ec314f624c0c43711fdf0d8076f82d9d802afd58f1d62c2a86878e8615/pillow-10.4.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1ef61f5dd14c300786318482456481463b9d6b91ebe5ef12f405afbba77ed0be", size = 4455562 }, { url = "https://files.pythonhosted.org/packages/4a/2a/4b04157cb7b9c74372fa867096a1607e6fedad93a44deeff553ccd307868/pillow-10.4.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:297e388da6e248c98bc4a02e018966af0c5f92dfacf5a5ca22fa01cb3179bca0", size = 4366761 }, { url = "https://files.pythonhosted.org/packages/ac/7b/8f1d815c1a6a268fe90481232c98dd0e5fa8c75e341a75f060037bd5ceae/pillow-10.4.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:e4db64794ccdf6cb83a59d73405f63adbe2a1887012e308828596100a0b2f6cc", size = 4536767 }, { url = "https://files.pythonhosted.org/packages/e5/77/05fa64d1f45d12c22c314e7b97398ffb28ef2813a485465017b7978b3ce7/pillow-10.4.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:bd2880a07482090a3bcb01f4265f1936a903d70bc740bfcb1fd4e8a2ffe5cf5a", size = 4477989 }, { url = "https://files.pythonhosted.org/packages/12/63/b0397cfc2caae05c3fb2f4ed1b4fc4fc878f0243510a7a6034ca59726494/pillow-10.4.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4b35b21b819ac1dbd1233317adeecd63495f6babf21b7b2512d244ff6c6ce309", size = 4610255 }, { url = "https://files.pythonhosted.org/packages/7b/f9/cfaa5082ca9bc4a6de66ffe1c12c2d90bf09c309a5f52b27759a596900e7/pillow-10.4.0-cp313-cp313-win32.whl", hash = "sha256:551d3fd6e9dc15e4c1eb6fc4ba2b39c0c7933fa113b220057a34f4bb3268a060", size = 2235603 }, { url = 
"https://files.pythonhosted.org/packages/01/6a/30ff0eef6e0c0e71e55ded56a38d4859bf9d3634a94a88743897b5f96936/pillow-10.4.0-cp313-cp313-win_amd64.whl", hash = "sha256:030abdbe43ee02e0de642aee345efa443740aa4d828bfe8e2eb11922ea6a21ea", size = 2554972 }, { url = "https://files.pythonhosted.org/packages/48/2c/2e0a52890f269435eee38b21c8218e102c621fe8d8df8b9dd06fabf879ba/pillow-10.4.0-cp313-cp313-win_arm64.whl", hash = "sha256:5b001114dd152cfd6b23befeb28d7aee43553e2402c9f159807bf55f33af8a8d", size = 2243375 }, { url = "https://files.pythonhosted.org/packages/56/70/f40009702a477ce87d8d9faaa4de51d6562b3445d7a314accd06e4ffb01d/pillow-10.4.0-cp38-cp38-macosx_10_10_x86_64.whl", hash = "sha256:8d4d5063501b6dd4024b8ac2f04962d661222d120381272deea52e3fc52d3736", size = 3509213 }, { url = "https://files.pythonhosted.org/packages/10/43/105823d233c5e5d31cea13428f4474ded9d961652307800979a59d6a4276/pillow-10.4.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7c1ee6f42250df403c5f103cbd2768a28fe1a0ea1f0f03fe151c8741e1469c8b", size = 3375883 }, { url = "https://files.pythonhosted.org/packages/3c/ad/7850c10bac468a20c918f6a5dbba9ecd106ea1cdc5db3c35e33a60570408/pillow-10.4.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b15e02e9bb4c21e39876698abf233c8c579127986f8207200bc8a8f6bb27acf2", size = 4330810 }, { url = "https://files.pythonhosted.org/packages/84/4c/69bbed9e436ac22f9ed193a2b64f64d68fcfbc9f4106249dc7ed4889907b/pillow-10.4.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7a8d4bade9952ea9a77d0c3e49cbd8b2890a399422258a77f357b9cc9be8d680", size = 4444341 }, { url = "https://files.pythonhosted.org/packages/8f/4f/c183c63828a3f37bf09644ce94cbf72d4929b033b109160a5379c2885932/pillow-10.4.0-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:43efea75eb06b95d1631cb784aa40156177bf9dd5b4b03ff38979e048258bc6b", size = 4356005 }, { url = 
"https://files.pythonhosted.org/packages/fb/ad/435fe29865f98a8fbdc64add8875a6e4f8c97749a93577a8919ec6f32c64/pillow-10.4.0-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:950be4d8ba92aca4b2bb0741285a46bfae3ca699ef913ec8416c1b78eadd64cd", size = 4525201 }, { url = "https://files.pythonhosted.org/packages/80/74/be8bf8acdfd70e91f905a12ae13cfb2e17c0f1da745c40141e26d0971ff5/pillow-10.4.0-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:d7480af14364494365e89d6fddc510a13e5a2c3584cb19ef65415ca57252fb84", size = 4460635 }, { url = "https://files.pythonhosted.org/packages/e4/90/763616e66dc9ad59c9b7fb58f863755e7934ef122e52349f62c7742b82d3/pillow-10.4.0-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:73664fe514b34c8f02452ffb73b7a92c6774e39a647087f83d67f010eb9a0cf0", size = 4590283 }, { url = "https://files.pythonhosted.org/packages/69/66/03002cb5b2c27bb519cba63b9f9aa3709c6f7a5d3b285406c01f03fb77e5/pillow-10.4.0-cp38-cp38-win32.whl", hash = "sha256:e88d5e6ad0d026fba7bdab8c3f225a69f063f116462c49892b0149e21b6c0a0e", size = 2235185 }, { url = "https://files.pythonhosted.org/packages/f2/75/3cb820b2812405fc7feb3d0deb701ef0c3de93dc02597115e00704591bc9/pillow-10.4.0-cp38-cp38-win_amd64.whl", hash = "sha256:5161eef006d335e46895297f642341111945e2c1c899eb406882a6c61a4357ab", size = 2554594 }, { url = "https://files.pythonhosted.org/packages/31/85/955fa5400fa8039921f630372cfe5056eed6e1b8e0430ee4507d7de48832/pillow-10.4.0-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:0ae24a547e8b711ccaaf99c9ae3cd975470e1a30caa80a6aaee9a2f19c05701d", size = 3509283 }, { url = "https://files.pythonhosted.org/packages/23/9c/343827267eb28d41cd82b4180d33b10d868af9077abcec0af9793aa77d2d/pillow-10.4.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:298478fe4f77a4408895605f3482b6cc6222c018b2ce565c2b6b9c354ac3229b", size = 3375691 }, { url = 
"https://files.pythonhosted.org/packages/60/a3/7ebbeabcd341eab722896d1a5b59a3df98c4b4d26cf4b0385f8aa94296f7/pillow-10.4.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:134ace6dc392116566980ee7436477d844520a26a4b1bd4053f6f47d096997fd", size = 4328295 }, { url = "https://files.pythonhosted.org/packages/32/3f/c02268d0c6fb6b3958bdda673c17b315c821d97df29ae6969f20fb49388a/pillow-10.4.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:930044bb7679ab003b14023138b50181899da3f25de50e9dbee23b61b4de2126", size = 4440810 }, { url = "https://files.pythonhosted.org/packages/67/5d/1c93c8cc35f2fdd3d6cc7e4ad72d203902859a2867de6ad957d9b708eb8d/pillow-10.4.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:c76e5786951e72ed3686e122d14c5d7012f16c8303a674d18cdcd6d89557fc5b", size = 4352283 }, { url = "https://files.pythonhosted.org/packages/bc/a8/8655557c9c7202b8abbd001f61ff36711cefaf750debcaa1c24d154ef602/pillow-10.4.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:b2724fdb354a868ddf9a880cb84d102da914e99119211ef7ecbdc613b8c96b3c", size = 4521800 }, { url = "https://files.pythonhosted.org/packages/58/78/6f95797af64d137124f68af1bdaa13b5332da282b86031f6fa70cf368261/pillow-10.4.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:dbc6ae66518ab3c5847659e9988c3b60dc94ffb48ef9168656e0019a93dbf8a1", size = 4459177 }, { url = "https://files.pythonhosted.org/packages/8a/6d/2b3ce34f1c4266d79a78c9a51d1289a33c3c02833fe294ef0dcbb9cba4ed/pillow-10.4.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:06b2f7898047ae93fad74467ec3d28fe84f7831370e3c258afa533f81ef7f3df", size = 4589079 }, { url = "https://files.pythonhosted.org/packages/e3/e0/456258c74da1ff5bf8ef1eab06a95ca994d8b9ed44c01d45c3f8cbd1db7e/pillow-10.4.0-cp39-cp39-win32.whl", hash = "sha256:7970285ab628a3779aecc35823296a7869f889b8329c16ad5a71e4901a3dc4ef", size = 2235247 }, { url = 
"https://files.pythonhosted.org/packages/37/f8/bef952bdb32aa53741f58bf21798642209e994edc3f6598f337f23d5400a/pillow-10.4.0-cp39-cp39-win_amd64.whl", hash = "sha256:961a7293b2457b405967af9c77dcaa43cc1a8cd50d23c532e62d48ab6cdd56f5", size = 2554479 }, { url = "https://files.pythonhosted.org/packages/bb/8e/805201619cad6651eef5fc1fdef913804baf00053461522fabbc5588ea12/pillow-10.4.0-cp39-cp39-win_arm64.whl", hash = "sha256:32cda9e3d601a52baccb2856b8ea1fc213c90b340c542dcef77140dfa3278a9e", size = 2243226 }, { url = "https://files.pythonhosted.org/packages/38/30/095d4f55f3a053392f75e2eae45eba3228452783bab3d9a920b951ac495c/pillow-10.4.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:5b4815f2e65b30f5fbae9dfffa8636d992d49705723fe86a3661806e069352d4", size = 3493889 }, { url = "https://files.pythonhosted.org/packages/f3/e8/4ff79788803a5fcd5dc35efdc9386af153569853767bff74540725b45863/pillow-10.4.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:8f0aef4ef59694b12cadee839e2ba6afeab89c0f39a3adc02ed51d109117b8da", size = 3346160 }, { url = "https://files.pythonhosted.org/packages/d7/ac/4184edd511b14f760c73f5bb8a5d6fd85c591c8aff7c2229677a355c4179/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9f4727572e2918acaa9077c919cbbeb73bd2b3ebcfe033b72f858fc9fbef0026", size = 3435020 }, { url = "https://files.pythonhosted.org/packages/da/21/1749cd09160149c0a246a81d646e05f35041619ce76f6493d6a96e8d1103/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ff25afb18123cea58a591ea0244b92eb1e61a1fd497bf6d6384f09bc3262ec3e", size = 3490539 }, { url = "https://files.pythonhosted.org/packages/b6/f5/f71fe1888b96083b3f6dfa0709101f61fc9e972c0c8d04e9d93ccef2a045/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:dc3e2db6ba09ffd7d02ae9141cfa0ae23393ee7687248d46a7507b75d610f4f5", size = 3476125 }, { url = 
"https://files.pythonhosted.org/packages/96/b9/c0362c54290a31866c3526848583a2f45a535aa9d725fd31e25d318c805f/pillow-10.4.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:02a2be69f9c9b8c1e97cf2713e789d4e398c751ecfd9967c18d0ce304efbf885", size = 3579373 }, { url = "https://files.pythonhosted.org/packages/52/3b/ce7a01026a7cf46e5452afa86f97a5e88ca97f562cafa76570178ab56d8d/pillow-10.4.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:0755ffd4a0c6f267cccbae2e9903d95477ca2f77c4fcf3a3a09570001856c8a5", size = 2554661 }, { url = "https://files.pythonhosted.org/packages/e1/1f/5a9fcd6ced51633c22481417e11b1b47d723f64fb536dfd67c015eb7f0ab/pillow-10.4.0-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:a02364621fe369e06200d4a16558e056fe2805d3468350df3aef21e00d26214b", size = 3493850 }, { url = "https://files.pythonhosted.org/packages/cb/e6/3ea4755ed5320cb62aa6be2f6de47b058c6550f752dd050e86f694c59798/pillow-10.4.0-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:1b5dea9831a90e9d0721ec417a80d4cbd7022093ac38a568db2dd78363b00908", size = 3346118 }, { url = "https://files.pythonhosted.org/packages/0a/22/492f9f61e4648422b6ca39268ec8139277a5b34648d28f400faac14e0f48/pillow-10.4.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9b885f89040bb8c4a1573566bbb2f44f5c505ef6e74cec7ab9068c900047f04b", size = 3434958 }, { url = "https://files.pythonhosted.org/packages/f9/19/559a48ad4045704bb0547965b9a9345f5cd461347d977a56d178db28819e/pillow-10.4.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:87dd88ded2e6d74d31e1e0a99a726a6765cda32d00ba72dc37f0651f306daaa8", size = 3490340 }, { url = "https://files.pythonhosted.org/packages/d9/de/cebaca6fb79905b3a1aa0281d238769df3fb2ede34fd7c0caa286575915a/pillow-10.4.0-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:2db98790afc70118bd0255c2eeb465e9767ecf1f3c25f9a1abb8ffc8cfd1fe0a", size = 3476048 }, { url = 
"https://files.pythonhosted.org/packages/71/f0/86d5b2f04693b0116a01d75302b0a307800a90d6c351a8aa4f8ae76cd499/pillow-10.4.0-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:f7baece4ce06bade126fb84b8af1c33439a76d8a6fd818970215e0560ca28c27", size = 3579366 }, { url = "https://files.pythonhosted.org/packages/37/ae/2dbfc38cc4fd14aceea14bc440d5151b21f64c4c3ba3f6f4191610b7ee5d/pillow-10.4.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:cfdd747216947628af7b259d274771d84db2268ca062dd5faf373639d00113a3", size = 2554652 }, ] [[package]] name = "pkgutil-resolve-name" version = "1.3.10" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/70/f2/f2891a9dc37398696ddd945012b90ef8d0a034f0012e3f83c3f7a70b0f79/pkgutil_resolve_name-1.3.10.tar.gz", hash = "sha256:357d6c9e6a755653cfd78893817c0853af365dd51ec97f3d358a819373bbd174", size = 5054 } wheels = [ { url = "https://files.pythonhosted.org/packages/c9/5c/3d4882ba113fd55bdba9326c1e4c62a15e674a2501de4869e6bd6301f87e/pkgutil_resolve_name-1.3.10-py3-none-any.whl", hash = "sha256:ca27cc078d25c5ad71a9de0a7a330146c4e014c2462d9af19c6b828280649c5e", size = 4734 }, ] [[package]] name = "platformdirs" version = "4.3.6" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/13/fc/128cc9cb8f03208bdbf93d3aa862e16d376844a14f9a0ce5cf4507372de4/platformdirs-4.3.6.tar.gz", hash = "sha256:357fb2acbc885b0419afd3ce3ed34564c13c9b95c89360cd9563f73aa5e2b907", size = 21302 } wheels = [ { url = "https://files.pythonhosted.org/packages/3c/a6/bc1012356d8ece4d66dd75c4b9fc6c1f6650ddd5991e421177d9f8f671be/platformdirs-4.3.6-py3-none-any.whl", hash = "sha256:73e575e1408ab8103900836b97580d5307456908a03e92031bab39e4554cc3fb", size = 18439 }, ] [[package]] name = "pluggy" version = "1.5.0" source = { registry = "https://pypi.org/simple" } sdist = { url = 
"https://files.pythonhosted.org/packages/96/2d/02d4312c973c6050a18b314a5ad0b3210edb65a906f868e31c111dede4a6/pluggy-1.5.0.tar.gz", hash = "sha256:2cffa88e94fdc978c4c574f15f9e59b7f4201d439195c3715ca9e2486f1d0cf1", size = 67955 } wheels = [ { url = "https://files.pythonhosted.org/packages/88/5f/e351af9a41f866ac3f1fac4ca0613908d9a41741cfcf2228f4ad853b697d/pluggy-1.5.0-py3-none-any.whl", hash = "sha256:44e1ad92c8ca002de6377e165f3e0f1be63266ab4d554740532335b9d75ea669", size = 20556 }, ] [[package]] name = "py-cpuinfo" version = "9.0.0" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/37/a8/d832f7293ebb21690860d2e01d8115e5ff6f2ae8bbdc953f0eb0fa4bd2c7/py-cpuinfo-9.0.0.tar.gz", hash = "sha256:3cdbbf3fac90dc6f118bfd64384f309edeadd902d7c8fb17f02ffa1fc3f49690", size = 104716 } wheels = [ { url = "https://files.pythonhosted.org/packages/e0/a9/023730ba63db1e494a271cb018dcd361bd2c917ba7004c3e49d5daf795a2/py_cpuinfo-9.0.0-py3-none-any.whl", hash = "sha256:859625bc251f64e21f077d099d4162689c762b5d6a4c3c97553d56241c9674d5", size = 22335 }, ] [[package]] name = "pycparser" version = "2.22" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/1d/b2/31537cf4b1ca988837256c910a668b553fceb8f069bedc4b1c826024b52c/pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6", size = 172736 } wheels = [ { url = "https://files.pythonhosted.org/packages/13/a3/a812df4e2dd5696d1f351d58b8fe16a405b234ad2886a0dab9183fb78109/pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc", size = 117552 }, ] [[package]] name = "pydantic" version = "2.10.5" source = { editable = "." 
} dependencies = [ { name = "annotated-types" }, { name = "pydantic-core" }, { name = "typing-extensions" }, ] [package.optional-dependencies] email = [ { name = "email-validator" }, ] timezone = [ { name = "tzdata", marker = "python_full_version >= '3.9' and platform_system == 'Windows'" }, ] [package.dev-dependencies] all = [ { name = "ansi2html" }, { name = "autoflake" }, { name = "cloudpickle" }, { name = "coverage", extra = ["toml"] }, { name = "devtools" }, { name = "dirty-equals" }, { name = "eval-type-backport" }, { name = "faker" }, { name = "greenlet", marker = "python_full_version < '3.13'" }, { name = "jsonschema" }, { name = "mike" }, { name = "mkdocs" }, { name = "mkdocs-exclude" }, { name = "mkdocs-material", extra = ["imaging"] }, { name = "mkdocs-redirects" }, { name = "mkdocstrings-python" }, { name = "mypy" }, { name = "packaging" }, { name = "pydantic-extra-types" }, { name = "pydantic-settings" }, { name = "pyright" }, { name = "pytest" }, { name = "pytest-benchmark" }, { name = "pytest-codspeed" }, { name = "pytest-examples" }, { name = "pytest-memray", marker = "platform_python_implementation == 'CPython' and platform_system != 'Windows'" }, { name = "pytest-mock" }, { name = "pytest-pretty" }, { name = "pytz" }, { name = "pyupgrade" }, { name = "requests" }, { name = "ruff" }, { name = "sqlalchemy" }, { name = "tomli" }, ] dev = [ { name = "coverage", extra = ["toml"] }, { name = "dirty-equals" }, { name = "eval-type-backport" }, { name = "faker" }, { name = "jsonschema" }, { name = "packaging" }, { name = "pytest" }, { name = "pytest-benchmark" }, { name = "pytest-codspeed" }, { name = "pytest-examples" }, { name = "pytest-memray", marker = "platform_python_implementation == 'CPython' and platform_system != 'Windows'" }, { name = "pytest-mock" }, { name = "pytest-pretty" }, { name = "pytz" }, ] docs = [ { name = "autoflake" }, { name = "mike" }, { name = "mkdocs" }, { name = "mkdocs-exclude" }, { name = "mkdocs-material", extra = 
["imaging"] }, { name = "mkdocs-redirects" }, { name = "mkdocstrings-python" }, { name = "pydantic-extra-types" }, { name = "pydantic-settings" }, { name = "pyupgrade" }, { name = "requests" }, { name = "tomli" }, ] linting = [ { name = "eval-type-backport" }, { name = "pyright" }, { name = "ruff" }, ] testing-extra = [ { name = "ansi2html" }, { name = "cloudpickle" }, { name = "devtools" }, { name = "greenlet", marker = "python_full_version < '3.13'" }, { name = "sqlalchemy" }, ] typechecking = [ { name = "mypy" }, { name = "pydantic-settings" }, { name = "pyright" }, ] [package.metadata] requires-dist = [ { name = "annotated-types", specifier = ">=0.6.0" }, { name = "email-validator", marker = "extra == 'email'", specifier = ">=2.0.0" }, { name = "pydantic-core", specifier = "==2.27.2" }, { name = "typing-extensions", specifier = ">=4.12.2" }, { name = "tzdata", marker = "python_full_version >= '3.9' and platform_system == 'Windows' and extra == 'timezone'" }, ] [package.metadata.requires-dev] all = [ { name = "ansi2html" }, { name = "autoflake" }, { name = "cloudpickle" }, { name = "coverage", extras = ["toml"] }, { name = "devtools" }, { name = "dirty-equals" }, { name = "eval-type-backport" }, { name = "faker" }, { name = "greenlet", marker = "python_full_version < '3.13'" }, { name = "jsonschema" }, { name = "mike" }, { name = "mkdocs" }, { name = "mkdocs-exclude" }, { name = "mkdocs-material", extras = ["imaging"] }, { name = "mkdocs-redirects" }, { name = "mkdocstrings-python" }, { name = "mypy" }, { name = "packaging" }, { name = "pydantic-extra-types", git = "https://github.com/pydantic/pydantic-extra-types.git?rev=main" }, { name = "pydantic-settings" }, { name = "pyright" }, { name = "pytest" }, { name = "pytest-benchmark" }, { name = "pytest-codspeed" }, { name = "pytest-examples" }, { name = "pytest-memray", marker = "platform_python_implementation == 'CPython' and platform_system != 'Windows'" }, { name = "pytest-mock" }, { name = "pytest-pretty" }, 
{ name = "pytz" }, { name = "pyupgrade" }, { name = "requests" }, { name = "ruff" }, { name = "sqlalchemy" }, { name = "tomli" }, ] dev = [ { name = "coverage", extras = ["toml"] }, { name = "dirty-equals" }, { name = "eval-type-backport" }, { name = "faker" }, { name = "jsonschema" }, { name = "packaging" }, { name = "pytest" }, { name = "pytest-benchmark" }, { name = "pytest-codspeed" }, { name = "pytest-examples" }, { name = "pytest-memray", marker = "platform_python_implementation == 'CPython' and platform_system != 'Windows'" }, { name = "pytest-mock" }, { name = "pytest-pretty" }, { name = "pytz" }, ] docs = [ { name = "autoflake" }, { name = "mike" }, { name = "mkdocs" }, { name = "mkdocs-exclude" }, { name = "mkdocs-material", extras = ["imaging"] }, { name = "mkdocs-redirects" }, { name = "mkdocstrings-python" }, { name = "pydantic-extra-types", git = "https://github.com/pydantic/pydantic-extra-types.git?rev=main" }, { name = "pydantic-settings" }, { name = "pyupgrade" }, { name = "requests" }, { name = "tomli" }, ] linting = [ { name = "eval-type-backport" }, { name = "pyright" }, { name = "ruff" }, ] testing-extra = [ { name = "ansi2html" }, { name = "cloudpickle" }, { name = "devtools" }, { name = "greenlet", marker = "python_full_version < '3.13'" }, { name = "sqlalchemy" }, ] typechecking = [ { name = "mypy" }, { name = "pydantic-settings" }, { name = "pyright" }, ] [[package]] name = "pydantic-core" version = "2.27.2" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "typing-extensions" }, ] sdist = { url = "https://files.pythonhosted.org/packages/fc/01/f3e5ac5e7c25833db5eb555f7b7ab24cd6f8c322d3a3ad2d67a952dc0abc/pydantic_core-2.27.2.tar.gz", hash = "sha256:eb026e5a4c1fee05726072337ff51d1efb6f59090b7da90d30ea58625b1ffb39", size = 413443 } wheels = [ { url = "https://files.pythonhosted.org/packages/3a/bc/fed5f74b5d802cf9a03e83f60f18864e90e3aed7223adaca5ffb7a8d8d64/pydantic_core-2.27.2-cp310-cp310-macosx_10_12_x86_64.whl", 
hash = "sha256:2d367ca20b2f14095a8f4fa1210f5a7b78b8a20009ecced6b12818f455b1e9fa", size = 1895938 }, { url = "https://files.pythonhosted.org/packages/71/2a/185aff24ce844e39abb8dd680f4e959f0006944f4a8a0ea372d9f9ae2e53/pydantic_core-2.27.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:491a2b73db93fab69731eaee494f320faa4e093dbed776be1a829c2eb222c34c", size = 1815684 }, { url = "https://files.pythonhosted.org/packages/c3/43/fafabd3d94d159d4f1ed62e383e264f146a17dd4d48453319fd782e7979e/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7969e133a6f183be60e9f6f56bfae753585680f3b7307a8e555a948d443cc05a", size = 1829169 }, { url = "https://files.pythonhosted.org/packages/a2/d1/f2dfe1a2a637ce6800b799aa086d079998959f6f1215eb4497966efd2274/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3de9961f2a346257caf0aa508a4da705467f53778e9ef6fe744c038119737ef5", size = 1867227 }, { url = "https://files.pythonhosted.org/packages/7d/39/e06fcbcc1c785daa3160ccf6c1c38fea31f5754b756e34b65f74e99780b5/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e2bb4d3e5873c37bb3dd58714d4cd0b0e6238cebc4177ac8fe878f8b3aa8e74c", size = 2037695 }, { url = "https://files.pythonhosted.org/packages/7a/67/61291ee98e07f0650eb756d44998214231f50751ba7e13f4f325d95249ab/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:280d219beebb0752699480fe8f1dc61ab6615c2046d76b7ab7ee38858de0a4e7", size = 2741662 }, { url = "https://files.pythonhosted.org/packages/32/90/3b15e31b88ca39e9e626630b4c4a1f5a0dfd09076366f4219429e6786076/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47956ae78b6422cbd46f772f1746799cbb862de838fd8d1fbd34a82e05b0983a", size = 1993370 }, { url = 
"https://files.pythonhosted.org/packages/ff/83/c06d333ee3a67e2e13e07794995c1535565132940715931c1c43bfc85b11/pydantic_core-2.27.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:14d4a5c49d2f009d62a2a7140d3064f686d17a5d1a268bc641954ba181880236", size = 1996813 }, { url = "https://files.pythonhosted.org/packages/7c/f7/89be1c8deb6e22618a74f0ca0d933fdcb8baa254753b26b25ad3acff8f74/pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:337b443af21d488716f8d0b6164de833e788aa6bd7e3a39c005febc1284f4962", size = 2005287 }, { url = "https://files.pythonhosted.org/packages/b7/7d/8eb3e23206c00ef7feee17b83a4ffa0a623eb1a9d382e56e4aa46fd15ff2/pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:03d0f86ea3184a12f41a2d23f7ccb79cdb5a18e06993f8a45baa8dfec746f0e9", size = 2128414 }, { url = "https://files.pythonhosted.org/packages/4e/99/fe80f3ff8dd71a3ea15763878d464476e6cb0a2db95ff1c5c554133b6b83/pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:7041c36f5680c6e0f08d922aed302e98b3745d97fe1589db0a3eebf6624523af", size = 2155301 }, { url = "https://files.pythonhosted.org/packages/2b/a3/e50460b9a5789ca1451b70d4f52546fa9e2b420ba3bfa6100105c0559238/pydantic_core-2.27.2-cp310-cp310-win32.whl", hash = "sha256:50a68f3e3819077be2c98110c1f9dcb3817e93f267ba80a2c05bb4f8799e2ff4", size = 1816685 }, { url = "https://files.pythonhosted.org/packages/57/4c/a8838731cb0f2c2a39d3535376466de6049034d7b239c0202a64aaa05533/pydantic_core-2.27.2-cp310-cp310-win_amd64.whl", hash = "sha256:e0fd26b16394ead34a424eecf8a31a1f5137094cabe84a1bcb10fa6ba39d3d31", size = 1982876 }, { url = "https://files.pythonhosted.org/packages/c2/89/f3450af9d09d44eea1f2c369f49e8f181d742f28220f88cc4dfaae91ea6e/pydantic_core-2.27.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:8e10c99ef58cfdf2a66fc15d66b16c4a04f62bca39db589ae8cba08bc55331bc", size = 1893421 }, { url = 
"https://files.pythonhosted.org/packages/9e/e3/71fe85af2021f3f386da42d291412e5baf6ce7716bd7101ea49c810eda90/pydantic_core-2.27.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:26f32e0adf166a84d0cb63be85c562ca8a6fa8de28e5f0d92250c6b7e9e2aff7", size = 1814998 }, { url = "https://files.pythonhosted.org/packages/a6/3c/724039e0d848fd69dbf5806894e26479577316c6f0f112bacaf67aa889ac/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8c19d1ea0673cd13cc2f872f6c9ab42acc4e4f492a7ca9d3795ce2b112dd7e15", size = 1826167 }, { url = "https://files.pythonhosted.org/packages/2b/5b/1b29e8c1fb5f3199a9a57c1452004ff39f494bbe9bdbe9a81e18172e40d3/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5e68c4446fe0810e959cdff46ab0a41ce2f2c86d227d96dc3847af0ba7def306", size = 1865071 }, { url = "https://files.pythonhosted.org/packages/89/6c/3985203863d76bb7d7266e36970d7e3b6385148c18a68cc8915fd8c84d57/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d9640b0059ff4f14d1f37321b94061c6db164fbe49b334b31643e0528d100d99", size = 2036244 }, { url = "https://files.pythonhosted.org/packages/0e/41/f15316858a246b5d723f7d7f599f79e37493b2e84bfc789e58d88c209f8a/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:40d02e7d45c9f8af700f3452f329ead92da4c5f4317ca9b896de7ce7199ea459", size = 2737470 }, { url = "https://files.pythonhosted.org/packages/a8/7c/b860618c25678bbd6d1d99dbdfdf0510ccb50790099b963ff78a124b754f/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1c1fd185014191700554795c99b347d64f2bb637966c4cfc16998a0ca700d048", size = 1992291 }, { url = "https://files.pythonhosted.org/packages/bf/73/42c3742a391eccbeab39f15213ecda3104ae8682ba3c0c28069fbcb8c10d/pydantic_core-2.27.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = 
"sha256:d81d2068e1c1228a565af076598f9e7451712700b673de8f502f0334f281387d", size = 1994613 }, { url = "https://files.pythonhosted.org/packages/94/7a/941e89096d1175d56f59340f3a8ebaf20762fef222c298ea96d36a6328c5/pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:1a4207639fb02ec2dbb76227d7c751a20b1a6b4bc52850568e52260cae64ca3b", size = 2002355 }, { url = "https://files.pythonhosted.org/packages/6e/95/2359937a73d49e336a5a19848713555605d4d8d6940c3ec6c6c0ca4dcf25/pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:3de3ce3c9ddc8bbd88f6e0e304dea0e66d843ec9de1b0042b0911c1663ffd474", size = 2126661 }, { url = "https://files.pythonhosted.org/packages/2b/4c/ca02b7bdb6012a1adef21a50625b14f43ed4d11f1fc237f9d7490aa5078c/pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:30c5f68ded0c36466acede341551106821043e9afaad516adfb6e8fa80a4e6a6", size = 2153261 }, { url = "https://files.pythonhosted.org/packages/72/9d/a241db83f973049a1092a079272ffe2e3e82e98561ef6214ab53fe53b1c7/pydantic_core-2.27.2-cp311-cp311-win32.whl", hash = "sha256:c70c26d2c99f78b125a3459f8afe1aed4d9687c24fd677c6a4436bc042e50d6c", size = 1812361 }, { url = "https://files.pythonhosted.org/packages/e8/ef/013f07248041b74abd48a385e2110aa3a9bbfef0fbd97d4e6d07d2f5b89a/pydantic_core-2.27.2-cp311-cp311-win_amd64.whl", hash = "sha256:08e125dbdc505fa69ca7d9c499639ab6407cfa909214d500897d02afb816e7cc", size = 1982484 }, { url = "https://files.pythonhosted.org/packages/10/1c/16b3a3e3398fd29dca77cea0a1d998d6bde3902fa2706985191e2313cc76/pydantic_core-2.27.2-cp311-cp311-win_arm64.whl", hash = "sha256:26f0d68d4b235a2bae0c3fc585c585b4ecc51382db0e3ba402a22cbc440915e4", size = 1867102 }, { url = "https://files.pythonhosted.org/packages/d6/74/51c8a5482ca447871c93e142d9d4a92ead74de6c8dc5e66733e22c9bba89/pydantic_core-2.27.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:9e0c8cfefa0ef83b4da9588448b6d8d2a2bf1a53c3f1ae5fca39eb3061e2f0b0", size = 1893127 }, { url = 
"https://files.pythonhosted.org/packages/d3/f3/c97e80721735868313c58b89d2de85fa80fe8dfeeed84dc51598b92a135e/pydantic_core-2.27.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:83097677b8e3bd7eaa6775720ec8e0405f1575015a463285a92bfdfe254529ef", size = 1811340 }, { url = "https://files.pythonhosted.org/packages/9e/91/840ec1375e686dbae1bd80a9e46c26a1e0083e1186abc610efa3d9a36180/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:172fce187655fece0c90d90a678424b013f8fbb0ca8b036ac266749c09438cb7", size = 1822900 }, { url = "https://files.pythonhosted.org/packages/f6/31/4240bc96025035500c18adc149aa6ffdf1a0062a4b525c932065ceb4d868/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:519f29f5213271eeeeb3093f662ba2fd512b91c5f188f3bb7b27bc5973816934", size = 1869177 }, { url = "https://files.pythonhosted.org/packages/fa/20/02fbaadb7808be578317015c462655c317a77a7c8f0ef274bc016a784c54/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:05e3a55d124407fffba0dd6b0c0cd056d10e983ceb4e5dbd10dda135c31071d6", size = 2038046 }, { url = "https://files.pythonhosted.org/packages/06/86/7f306b904e6c9eccf0668248b3f272090e49c275bc488a7b88b0823444a4/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9c3ed807c7b91de05e63930188f19e921d1fe90de6b4f5cd43ee7fcc3525cb8c", size = 2685386 }, { url = "https://files.pythonhosted.org/packages/8d/f0/49129b27c43396581a635d8710dae54a791b17dfc50c70164866bbf865e3/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6fb4aadc0b9a0c063206846d603b92030eb6f03069151a625667f982887153e2", size = 1997060 }, { url = "https://files.pythonhosted.org/packages/0d/0f/943b4af7cd416c477fd40b187036c4f89b416a33d3cc0ab7b82708a667aa/pydantic_core-2.27.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = 
"sha256:28ccb213807e037460326424ceb8b5245acb88f32f3d2777427476e1b32c48c4", size = 2004870 }, { url = "https://files.pythonhosted.org/packages/35/40/aea70b5b1a63911c53a4c8117c0a828d6790483f858041f47bab0b779f44/pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:de3cd1899e2c279b140adde9357c4495ed9d47131b4a4eaff9052f23398076b3", size = 1999822 }, { url = "https://files.pythonhosted.org/packages/f2/b3/807b94fd337d58effc5498fd1a7a4d9d59af4133e83e32ae39a96fddec9d/pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:220f892729375e2d736b97d0e51466252ad84c51857d4d15f5e9692f9ef12be4", size = 2130364 }, { url = "https://files.pythonhosted.org/packages/fc/df/791c827cd4ee6efd59248dca9369fb35e80a9484462c33c6649a8d02b565/pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:a0fcd29cd6b4e74fe8ddd2c90330fd8edf2e30cb52acda47f06dd615ae72da57", size = 2158303 }, { url = "https://files.pythonhosted.org/packages/9b/67/4e197c300976af185b7cef4c02203e175fb127e414125916bf1128b639a9/pydantic_core-2.27.2-cp312-cp312-win32.whl", hash = "sha256:1e2cb691ed9834cd6a8be61228471d0a503731abfb42f82458ff27be7b2186fc", size = 1834064 }, { url = "https://files.pythonhosted.org/packages/1f/ea/cd7209a889163b8dcca139fe32b9687dd05249161a3edda62860430457a5/pydantic_core-2.27.2-cp312-cp312-win_amd64.whl", hash = "sha256:cc3f1a99a4f4f9dd1de4fe0312c114e740b5ddead65bb4102884b384c15d8bc9", size = 1989046 }, { url = "https://files.pythonhosted.org/packages/bc/49/c54baab2f4658c26ac633d798dab66b4c3a9bbf47cff5284e9c182f4137a/pydantic_core-2.27.2-cp312-cp312-win_arm64.whl", hash = "sha256:3911ac9284cd8a1792d3cb26a2da18f3ca26c6908cc434a18f730dc0db7bfa3b", size = 1885092 }, { url = "https://files.pythonhosted.org/packages/41/b1/9bc383f48f8002f99104e3acff6cba1231b29ef76cfa45d1506a5cad1f84/pydantic_core-2.27.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:7d14bd329640e63852364c306f4d23eb744e0f8193148d4044dd3dacdaacbd8b", size = 1892709 }, { url = 
"https://files.pythonhosted.org/packages/10/6c/e62b8657b834f3eb2961b49ec8e301eb99946245e70bf42c8817350cbefc/pydantic_core-2.27.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:82f91663004eb8ed30ff478d77c4d1179b3563df6cdb15c0817cd1cdaf34d154", size = 1811273 }, { url = "https://files.pythonhosted.org/packages/ba/15/52cfe49c8c986e081b863b102d6b859d9defc63446b642ccbbb3742bf371/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:71b24c7d61131bb83df10cc7e687433609963a944ccf45190cfc21e0887b08c9", size = 1823027 }, { url = "https://files.pythonhosted.org/packages/b1/1c/b6f402cfc18ec0024120602bdbcebc7bdd5b856528c013bd4d13865ca473/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:fa8e459d4954f608fa26116118bb67f56b93b209c39b008277ace29937453dc9", size = 1868888 }, { url = "https://files.pythonhosted.org/packages/bd/7b/8cb75b66ac37bc2975a3b7de99f3c6f355fcc4d89820b61dffa8f1e81677/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ce8918cbebc8da707ba805b7fd0b382816858728ae7fe19a942080c24e5b7cd1", size = 2037738 }, { url = "https://files.pythonhosted.org/packages/c8/f1/786d8fe78970a06f61df22cba58e365ce304bf9b9f46cc71c8c424e0c334/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:eda3f5c2a021bbc5d976107bb302e0131351c2ba54343f8a496dc8783d3d3a6a", size = 2685138 }, { url = "https://files.pythonhosted.org/packages/a6/74/d12b2cd841d8724dc8ffb13fc5cef86566a53ed358103150209ecd5d1999/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bd8086fa684c4775c27f03f062cbb9eaa6e17f064307e86b21b9e0abc9c0f02e", size = 1997025 }, { url = "https://files.pythonhosted.org/packages/a0/6e/940bcd631bc4d9a06c9539b51f070b66e8f370ed0933f392db6ff350d873/pydantic_core-2.27.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = 
"sha256:8d9b3388db186ba0c099a6d20f0604a44eabdeef1777ddd94786cdae158729e4", size = 2004633 }, { url = "https://files.pythonhosted.org/packages/50/cc/a46b34f1708d82498c227d5d80ce615b2dd502ddcfd8376fc14a36655af1/pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:7a66efda2387de898c8f38c0cf7f14fca0b51a8ef0b24bfea5849f1b3c95af27", size = 1999404 }, { url = "https://files.pythonhosted.org/packages/ca/2d/c365cfa930ed23bc58c41463bae347d1005537dc8db79e998af8ba28d35e/pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:18a101c168e4e092ab40dbc2503bdc0f62010e95d292b27827871dc85450d7ee", size = 2130130 }, { url = "https://files.pythonhosted.org/packages/f4/d7/eb64d015c350b7cdb371145b54d96c919d4db516817f31cd1c650cae3b21/pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ba5dd002f88b78a4215ed2f8ddbdf85e8513382820ba15ad5ad8955ce0ca19a1", size = 2157946 }, { url = "https://files.pythonhosted.org/packages/a4/99/bddde3ddde76c03b65dfd5a66ab436c4e58ffc42927d4ff1198ffbf96f5f/pydantic_core-2.27.2-cp313-cp313-win32.whl", hash = "sha256:1ebaf1d0481914d004a573394f4be3a7616334be70261007e47c2a6fe7e50130", size = 1834387 }, { url = "https://files.pythonhosted.org/packages/71/47/82b5e846e01b26ac6f1893d3c5f9f3a2eb6ba79be26eef0b759b4fe72946/pydantic_core-2.27.2-cp313-cp313-win_amd64.whl", hash = "sha256:953101387ecf2f5652883208769a79e48db18c6df442568a0b5ccd8c2723abee", size = 1990453 }, { url = "https://files.pythonhosted.org/packages/51/b2/b2b50d5ecf21acf870190ae5d093602d95f66c9c31f9d5de6062eb329ad1/pydantic_core-2.27.2-cp313-cp313-win_arm64.whl", hash = "sha256:ac4dbfd1691affb8f48c2c13241a2e3b60ff23247cbcf981759c768b6633cf8b", size = 1885186 }, { url = "https://files.pythonhosted.org/packages/43/53/13e9917fc69c0a4aea06fd63ed6a8d6cda9cf140ca9584d49c1650b0ef5e/pydantic_core-2.27.2-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:d3e8d504bdd3f10835468f29008d72fc8359d95c9c415ce6e767203db6127506", size = 1899595 }, { url = 
"https://files.pythonhosted.org/packages/f4/20/26c549249769ed84877f862f7bb93f89a6ee08b4bee1ed8781616b7fbb5e/pydantic_core-2.27.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:521eb9b7f036c9b6187f0b47318ab0d7ca14bd87f776240b90b21c1f4f149320", size = 1775010 }, { url = "https://files.pythonhosted.org/packages/35/eb/8234e05452d92d2b102ffa1b56d801c3567e628fdc63f02080fdfc68fd5e/pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:85210c4d99a0114f5a9481b44560d7d1e35e32cc5634c656bc48e590b669b145", size = 1830727 }, { url = "https://files.pythonhosted.org/packages/8f/df/59f915c8b929d5f61e5a46accf748a87110ba145156f9326d1a7d28912b2/pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d716e2e30c6f140d7560ef1538953a5cd1a87264c737643d481f2779fc247fe1", size = 1868393 }, { url = "https://files.pythonhosted.org/packages/d5/52/81cf4071dca654d485c277c581db368b0c95b2b883f4d7b736ab54f72ddf/pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f66d89ba397d92f840f8654756196d93804278457b5fbede59598a1f9f90b228", size = 2040300 }, { url = "https://files.pythonhosted.org/packages/9c/00/05197ce1614f5c08d7a06e1d39d5d8e704dc81971b2719af134b844e2eaf/pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:669e193c1c576a58f132e3158f9dfa9662969edb1a250c54d8fa52590045f046", size = 2738785 }, { url = "https://files.pythonhosted.org/packages/f7/a3/5f19bc495793546825ab160e530330c2afcee2281c02b5ffafd0b32ac05e/pydantic_core-2.27.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdbe7629b996647b99c01b37f11170a57ae675375b14b8c13b8518b8320ced5", size = 1996493 }, { url = "https://files.pythonhosted.org/packages/ed/e8/e0102c2ec153dc3eed88aea03990e1b06cfbca532916b8a48173245afe60/pydantic_core-2.27.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = 
"sha256:d262606bf386a5ba0b0af3b97f37c83d7011439e3dc1a9298f21efb292e42f1a", size = 1998544 }, { url = "https://files.pythonhosted.org/packages/fb/a3/4be70845b555bd80aaee9f9812a7cf3df81550bce6dadb3cfee9c5d8421d/pydantic_core-2.27.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:cabb9bcb7e0d97f74df8646f34fc76fbf793b7f6dc2438517d7a9e50eee4f14d", size = 2007449 }, { url = "https://files.pythonhosted.org/packages/e3/9f/b779ed2480ba355c054e6d7ea77792467631d674b13d8257085a4bc7dcda/pydantic_core-2.27.2-cp38-cp38-musllinux_1_1_armv7l.whl", hash = "sha256:d2d63f1215638d28221f664596b1ccb3944f6e25dd18cd3b86b0a4c408d5ebb9", size = 2129460 }, { url = "https://files.pythonhosted.org/packages/a0/f0/a6ab0681f6e95260c7fbf552874af7302f2ea37b459f9b7f00698f875492/pydantic_core-2.27.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:bca101c00bff0adb45a833f8451b9105d9df18accb8743b08107d7ada14bd7da", size = 2159609 }, { url = "https://files.pythonhosted.org/packages/8a/2b/e1059506795104349712fbca647b18b3f4a7fd541c099e6259717441e1e0/pydantic_core-2.27.2-cp38-cp38-win32.whl", hash = "sha256:f6f8e111843bbb0dee4cb6594cdc73e79b3329b526037ec242a3e49012495b3b", size = 1819886 }, { url = "https://files.pythonhosted.org/packages/aa/6d/df49c17f024dfc58db0bacc7b03610058018dd2ea2eaf748ccbada4c3d06/pydantic_core-2.27.2-cp38-cp38-win_amd64.whl", hash = "sha256:fd1aea04935a508f62e0d0ef1f5ae968774a32afc306fb8545e06f5ff5cdf3ad", size = 1980773 }, { url = "https://files.pythonhosted.org/packages/27/97/3aef1ddb65c5ccd6eda9050036c956ff6ecbfe66cb7eb40f280f121a5bb0/pydantic_core-2.27.2-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:c10eb4f1659290b523af58fa7cffb452a61ad6ae5613404519aee4bfbf1df993", size = 1896475 }, { url = "https://files.pythonhosted.org/packages/ad/d3/5668da70e373c9904ed2f372cb52c0b996426f302e0dee2e65634c92007d/pydantic_core-2.27.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:ef592d4bad47296fb11f96cd7dc898b92e795032b4894dfb4076cfccd43a9308", size = 1772279 }, { url = 
"https://files.pythonhosted.org/packages/8a/9e/e44b8cb0edf04a2f0a1f6425a65ee089c1d6f9c4c2dcab0209127b6fdfc2/pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c61709a844acc6bf0b7dce7daae75195a10aac96a596ea1b776996414791ede4", size = 1829112 }, { url = "https://files.pythonhosted.org/packages/1c/90/1160d7ac700102effe11616e8119e268770f2a2aa5afb935f3ee6832987d/pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:42c5f762659e47fdb7b16956c71598292f60a03aa92f8b6351504359dbdba6cf", size = 1866780 }, { url = "https://files.pythonhosted.org/packages/ee/33/13983426df09a36d22c15980008f8d9c77674fc319351813b5a2739b70f3/pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4c9775e339e42e79ec99c441d9730fccf07414af63eac2f0e48e08fd38a64d76", size = 2037943 }, { url = "https://files.pythonhosted.org/packages/01/d7/ced164e376f6747e9158c89988c293cd524ab8d215ae4e185e9929655d5c/pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:57762139821c31847cfb2df63c12f725788bd9f04bc2fb392790959b8f70f118", size = 2740492 }, { url = "https://files.pythonhosted.org/packages/8b/1f/3dc6e769d5b7461040778816aab2b00422427bcaa4b56cc89e9c653b2605/pydantic_core-2.27.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0d1e85068e818c73e048fe28cfc769040bb1f475524f4745a5dc621f75ac7630", size = 1995714 }, { url = "https://files.pythonhosted.org/packages/07/d7/a0bd09bc39283530b3f7c27033a814ef254ba3bd0b5cfd040b7abf1fe5da/pydantic_core-2.27.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:097830ed52fd9e427942ff3b9bc17fab52913b2f50f2880dc4a5611446606a54", size = 1997163 }, { url = "https://files.pythonhosted.org/packages/2d/bb/2db4ad1762e1c5699d9b857eeb41959191980de6feb054e70f93085e1bcd/pydantic_core-2.27.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = 
"sha256:044a50963a614ecfae59bb1eaf7ea7efc4bc62f49ed594e18fa1e5d953c40e9f", size = 2005217 }, { url = "https://files.pythonhosted.org/packages/53/5f/23a5a3e7b8403f8dd8fc8a6f8b49f6b55c7d715b77dcf1f8ae919eeb5628/pydantic_core-2.27.2-cp39-cp39-musllinux_1_1_armv7l.whl", hash = "sha256:4e0b4220ba5b40d727c7f879eac379b822eee5d8fff418e9d3381ee45b3b0362", size = 2127899 }, { url = "https://files.pythonhosted.org/packages/c2/ae/aa38bb8dd3d89c2f1d8362dd890ee8f3b967330821d03bbe08fa01ce3766/pydantic_core-2.27.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:5e4f4bb20d75e9325cc9696c6802657b58bc1dbbe3022f32cc2b2b632c3fbb96", size = 2155726 }, { url = "https://files.pythonhosted.org/packages/98/61/4f784608cc9e98f70839187117ce840480f768fed5d386f924074bf6213c/pydantic_core-2.27.2-cp39-cp39-win32.whl", hash = "sha256:cca63613e90d001b9f2f9a9ceb276c308bfa2a43fafb75c8031c4f66039e8c6e", size = 1817219 }, { url = "https://files.pythonhosted.org/packages/57/82/bb16a68e4a1a858bb3768c2c8f1ff8d8978014e16598f001ea29a25bf1d1/pydantic_core-2.27.2-cp39-cp39-win_amd64.whl", hash = "sha256:77d1bca19b0f7021b3a982e6f903dcd5b2b06076def36a652e3907f596e29f67", size = 1985382 }, { url = "https://files.pythonhosted.org/packages/46/72/af70981a341500419e67d5cb45abe552a7c74b66326ac8877588488da1ac/pydantic_core-2.27.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:2bf14caea37e91198329b828eae1618c068dfb8ef17bb33287a7ad4b61ac314e", size = 1891159 }, { url = "https://files.pythonhosted.org/packages/ad/3d/c5913cccdef93e0a6a95c2d057d2c2cba347815c845cda79ddd3c0f5e17d/pydantic_core-2.27.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:b0cb791f5b45307caae8810c2023a184c74605ec3bcbb67d13846c28ff731ff8", size = 1768331 }, { url = "https://files.pythonhosted.org/packages/f6/f0/a3ae8fbee269e4934f14e2e0e00928f9346c5943174f2811193113e58252/pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:688d3fd9fcb71f41c4c015c023d12a79d1c4c0732ec9eb35d96e3388a120dcf3", size = 1822467 }, { url = "https://files.pythonhosted.org/packages/d7/7a/7bbf241a04e9f9ea24cd5874354a83526d639b02674648af3f350554276c/pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d591580c34f4d731592f0e9fe40f9cc1b430d297eecc70b962e93c5c668f15f", size = 1979797 }, { url = "https://files.pythonhosted.org/packages/4f/5f/4784c6107731f89e0005a92ecb8a2efeafdb55eb992b8e9d0a2be5199335/pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:82f986faf4e644ffc189a7f1aafc86e46ef70372bb153e7001e8afccc6e54133", size = 1987839 }, { url = "https://files.pythonhosted.org/packages/6d/a7/61246562b651dff00de86a5f01b6e4befb518df314c54dec187a78d81c84/pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:bec317a27290e2537f922639cafd54990551725fc844249e64c523301d0822fc", size = 1998861 }, { url = "https://files.pythonhosted.org/packages/86/aa/837821ecf0c022bbb74ca132e117c358321e72e7f9702d1b6a03758545e2/pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:0296abcb83a797db256b773f45773da397da75a08f5fcaef41f2044adec05f50", size = 2116582 }, { url = "https://files.pythonhosted.org/packages/81/b0/5e74656e95623cbaa0a6278d16cf15e10a51f6002e3ec126541e95c29ea3/pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:0d75070718e369e452075a6017fbf187f788e17ed67a3abd47fa934d001863d9", size = 2151985 }, { url = "https://files.pythonhosted.org/packages/63/37/3e32eeb2a451fddaa3898e2163746b0cffbbdbb4740d38372db0490d67f3/pydantic_core-2.27.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:7e17b560be3c98a8e3aa66ce828bdebb9e9ac6ad5466fba92eb74c4c95cb1151", size = 2004715 }, { url = "https://files.pythonhosted.org/packages/29/0e/dcaea00c9dbd0348b723cae82b0e0c122e0fa2b43fa933e1622fd237a3ee/pydantic_core-2.27.2-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", 
hash = "sha256:c33939a82924da9ed65dab5a65d427205a73181d8098e79b6b426bdf8ad4e656", size = 1891733 }, { url = "https://files.pythonhosted.org/packages/86/d3/e797bba8860ce650272bda6383a9d8cad1d1c9a75a640c9d0e848076f85e/pydantic_core-2.27.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:00bad2484fa6bda1e216e7345a798bd37c68fb2d97558edd584942aa41b7d278", size = 1768375 }, { url = "https://files.pythonhosted.org/packages/41/f7/f847b15fb14978ca2b30262548f5fc4872b2724e90f116393eb69008299d/pydantic_core-2.27.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c817e2b40aba42bac6f457498dacabc568c3b7a986fc9ba7c8d9d260b71485fb", size = 1822307 }, { url = "https://files.pythonhosted.org/packages/9c/63/ed80ec8255b587b2f108e514dc03eed1546cd00f0af281e699797f373f38/pydantic_core-2.27.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:251136cdad0cb722e93732cb45ca5299fb56e1344a833640bf93b2803f8d1bfd", size = 1979971 }, { url = "https://files.pythonhosted.org/packages/a9/6d/6d18308a45454a0de0e975d70171cadaf454bc7a0bf86b9c7688e313f0bb/pydantic_core-2.27.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d2088237af596f0a524d3afc39ab3b036e8adb054ee57cbb1dcf8e09da5b29cc", size = 1987616 }, { url = "https://files.pythonhosted.org/packages/82/8a/05f8780f2c1081b800a7ca54c1971e291c2d07d1a50fb23c7e4aef4ed403/pydantic_core-2.27.2-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:d4041c0b966a84b4ae7a09832eb691a35aec90910cd2dbe7a208de59be77965b", size = 1998943 }, { url = "https://files.pythonhosted.org/packages/5e/3e/fe5b6613d9e4c0038434396b46c5303f5ade871166900b357ada4766c5b7/pydantic_core-2.27.2-pp39-pypy39_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:8083d4e875ebe0b864ffef72a4304827015cff328a1be6e22cc850753bfb122b", size = 2116654 }, { url = 
"https://files.pythonhosted.org/packages/db/ad/28869f58938fad8cc84739c4e592989730bfb69b7c90a8fff138dff18e1e/pydantic_core-2.27.2-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f141ee28a0ad2123b6611b6ceff018039df17f32ada8b534e6aa039545a3efb2", size = 2152292 }, { url = "https://files.pythonhosted.org/packages/a1/0c/c5c5cd3689c32ed1fe8c5d234b079c12c281c051759770c05b8bed6412b5/pydantic_core-2.27.2-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:7d0c8399fcc1848491f00e0314bd59fb34a9c008761bcb422a057670c3f65e35", size = 2004961 }, ] [[package]] name = "pydantic-extra-types" version = "2.10.0" source = { git = "https://github.com/pydantic/pydantic-extra-types.git?rev=main#f27a1325de5112ccd86b192645bfbb6b633ffcae" } dependencies = [ { name = "pydantic" }, { name = "typing-extensions" }, ] [[package]] name = "pydantic-settings" version = "2.6.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "pydantic" }, { name = "python-dotenv" }, ] sdist = { url = "https://files.pythonhosted.org/packages/b5/d4/9dfbe238f45ad8b168f5c96ee49a3df0598ce18a0795a983b419949ce65b/pydantic_settings-2.6.1.tar.gz", hash = "sha256:e0f92546d8a9923cb8941689abf85d6601a8c19a23e97a34b2964a2e3f813ca0", size = 75646 } wheels = [ { url = "https://files.pythonhosted.org/packages/5e/f9/ff95fd7d760af42f647ea87f9b8a383d891cdb5e5dbd4613edaeb094252a/pydantic_settings-2.6.1-py3-none-any.whl", hash = "sha256:7fb0637c786a558d3103436278a7c4f1cfd29ba8973238a50c5bb9a55387da87", size = 28595 }, ] [[package]] name = "pyflakes" version = "3.2.0" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/57/f9/669d8c9c86613c9d568757c7f5824bd3197d7b1c6c27553bc5618a27cce2/pyflakes-3.2.0.tar.gz", hash = "sha256:1c61603ff154621fb2a9172037d84dca3500def8c8b630657d1701f026f8af3f", size = 63788 } wheels = [ { url = 
"https://files.pythonhosted.org/packages/d4/d7/f1b7db88d8e4417c5d47adad627a93547f44bdc9028372dbd2313f34a855/pyflakes-3.2.0-py2.py3-none-any.whl", hash = "sha256:84b5be138a2dfbb40689ca07e2152deb896a65c3a3e24c251c5c62489568074a", size = 62725 }, ] [[package]] name = "pygments" version = "2.18.0" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/8e/62/8336eff65bcbc8e4cb5d05b55faf041285951b6e80f33e2bff2024788f31/pygments-2.18.0.tar.gz", hash = "sha256:786ff802f32e91311bff3889f6e9a86e81505fe99f2735bb6d60ae0c5004f199", size = 4891905 } wheels = [ { url = "https://files.pythonhosted.org/packages/f7/3f/01c8b82017c199075f8f788d0d906b9ffbbc5a47dc9918a945e13d5a2bda/pygments-2.18.0-py3-none-any.whl", hash = "sha256:b8e6aca0523f3ab76fee51799c488e38782ac06eafcf95e7ba832985c8e7b13a", size = 1205513 }, ] [[package]] name = "pymdown-extensions" version = "10.12" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "markdown" }, { name = "pyyaml" }, ] sdist = { url = "https://files.pythonhosted.org/packages/d8/0b/32f05854cfd432e9286bb41a870e0d1a926b72df5f5cdb6dec962b2e369e/pymdown_extensions-10.12.tar.gz", hash = "sha256:b0ee1e0b2bef1071a47891ab17003bfe5bf824a398e13f49f8ed653b699369a7", size = 840790 } wheels = [ { url = "https://files.pythonhosted.org/packages/53/32/95a164ddf533bd676cbbe878e36e89b4ade3efde8dd61d0148c90cbbe57e/pymdown_extensions-10.12-py3-none-any.whl", hash = "sha256:49f81412242d3527b8b4967b990df395c89563043bc51a3d2d7d500e52123b77", size = 263448 }, ] [[package]] name = "pyparsing" version = "3.1.4" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/83/08/13f3bce01b2061f2bbd582c9df82723de943784cf719a35ac886c652043a/pyparsing-3.1.4.tar.gz", hash = "sha256:f86ec8d1a83f11977c9a6ea7598e8c27fc5cddfa5b07ea2241edbbde1d7bc032", size = 900231 } wheels = [ { url = 
"https://files.pythonhosted.org/packages/e5/0c/0e3c05b1c87bb6a1c76d281b0f35e78d2d80ac91b5f8f524cebf77f51049/pyparsing-3.1.4-py3-none-any.whl", hash = "sha256:a6a7ee4235a3f944aa1fa2249307708f893fe5717dc603503c6c7969c070fb7c", size = 104100 }, ] [[package]] name = "pyright" version = "1.1.384" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "nodeenv" }, { name = "typing-extensions" }, ] sdist = { url = "https://files.pythonhosted.org/packages/84/00/a23114619f9d005f4b0f35e037c76cee029174d090a6f73a355749c74f4a/pyright-1.1.384.tar.gz", hash = "sha256:25e54d61f55cbb45f1195ff89c488832d7a45d59f3e132f178fdf9ef6cafc706", size = 21956 } wheels = [ { url = "https://files.pythonhosted.org/packages/6d/4a/e7f4d71d194ba675f3577d11eebe4e17a592c4d1c3f9986d4b321ba3c809/pyright-1.1.384-py3-none-any.whl", hash = "sha256:f0b6f4db2da38f27aeb7035c26192f034587875f751b847e9ad42ed0c704ac9e", size = 18578 }, ] [[package]] name = "pytest" version = "8.3.3" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "colorama", marker = "sys_platform == 'win32'" }, { name = "exceptiongroup", marker = "python_full_version < '3.11'" }, { name = "iniconfig" }, { name = "packaging" }, { name = "pluggy" }, { name = "tomli", marker = "python_full_version < '3.11'" }, ] sdist = { url = "https://files.pythonhosted.org/packages/8b/6c/62bbd536103af674e227c41a8f3dcd022d591f6eed5facb5a0f31ee33bbc/pytest-8.3.3.tar.gz", hash = "sha256:70b98107bd648308a7952b06e6ca9a50bc660be218d53c257cc1fc94fda10181", size = 1442487 } wheels = [ { url = "https://files.pythonhosted.org/packages/6b/77/7440a06a8ead44c7757a64362dd22df5760f9b12dc5f11b6188cd2fc27a0/pytest-8.3.3-py3-none-any.whl", hash = "sha256:a6853c7375b2663155079443d2e45de913a911a11d669df02a50814944db57b2", size = 342341 }, ] [[package]] name = "pytest-benchmark" version = "4.0.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "py-cpuinfo" }, { name = "pytest" }, ] sdist = { url = 
"https://files.pythonhosted.org/packages/28/08/e6b0067efa9a1f2a1eb3043ecd8a0c48bfeb60d3255006dcc829d72d5da2/pytest-benchmark-4.0.0.tar.gz", hash = "sha256:fb0785b83efe599a6a956361c0691ae1dbb5318018561af10f3e915caa0048d1", size = 334641 } wheels = [ { url = "https://files.pythonhosted.org/packages/4d/a1/3b70862b5b3f830f0422844f25a823d0470739d994466be9dbbbb414d85a/pytest_benchmark-4.0.0-py3-none-any.whl", hash = "sha256:fdb7db64e31c8b277dff9850d2a2556d8b60bcb0ea6524e36e28ffd7c87f71d6", size = 43951 }, ] [[package]] name = "pytest-codspeed" version = "2.2.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "cffi" }, { name = "filelock" }, { name = "pytest" }, { name = "setuptools", marker = "python_full_version >= '3.12'" }, ] sdist = { url = "https://files.pythonhosted.org/packages/40/6a/ba0b067dba286443a04c9d32ff75ef3bc169617ee347d1a97972ada62665/pytest_codspeed-2.2.1.tar.gz", hash = "sha256:0adc24baf01c64a6ca0a0b83b3cd704351708997e09ec086b7776c32227d4e0a", size = 9163 } wheels = [ { url = "https://files.pythonhosted.org/packages/03/0b/e3541064afcf24ed54bcabfa6eb5f8083eb335d5c58c7b5b95bc31127f86/pytest_codspeed-2.2.1-py3-none-any.whl", hash = "sha256:aad08033015f3e6c8c14c8bf0eca475921a9b088e92c98b626bf8af8f516471e", size = 10126 }, ] [[package]] name = "pytest-examples" version = "0.0.15" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "black" }, { name = "pytest" }, { name = "ruff" }, ] sdist = { url = "https://files.pythonhosted.org/packages/c3/82/fa4fb866d52934de97b90c84c89896faebfb984436804fc3baa55d48d511/pytest_examples-0.0.15.tar.gz", hash = "sha256:2d6ced2d1f0d59863f81a4d2f193737464b8004a7670907c3bedef6306a5d660", size = 20771 } wheels = [ { url = "https://files.pythonhosted.org/packages/ba/81/3f727a7d2f9c1ff36f581453949a4af314629108642d5140298476e90902/pytest_examples-0.0.15-py3-none-any.whl", hash = "sha256:6e4adc522bf2e3f93cae3b37a4add76fcc2c1ada29d8988b2ea15b236233ec0f", size = 17922 }, ] [[package]] 
name = "pytest-memray" version = "1.7.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "memray", marker = "python_full_version < '3.9' or python_full_version >= '3.12' or platform_system != 'Windows' or sys_platform == 'win32'" }, { name = "pytest", marker = "python_full_version < '3.9' or python_full_version >= '3.12' or platform_system != 'Windows' or sys_platform == 'win32'" }, ] sdist = { url = "https://files.pythonhosted.org/packages/4b/33/31536fa35fae6b040f7bb31375c6b95d025eb38e16416c23c0daa36bcb1f/pytest_memray-1.7.0.tar.gz", hash = "sha256:c18fa907d2210b42f4096c093e2d3416dfc002dcaa450ef3f9ba819bc3dd8f5f", size = 240564 } wheels = [ { url = "https://files.pythonhosted.org/packages/24/1b/fe19affdc41e522aabc4e5df78edb0cd8f59cb6ae2fb151dec1797593a42/pytest_memray-1.7.0-py3-none-any.whl", hash = "sha256:b896718c1adf6d0cd339dfaaaa5620f035c9919e1199a79b3453804a1254306f", size = 17679 }, ] [[package]] name = "pytest-mock" version = "3.14.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "pytest" }, ] sdist = { url = "https://files.pythonhosted.org/packages/c6/90/a955c3ab35ccd41ad4de556596fa86685bf4fc5ffcc62d22d856cfd4e29a/pytest-mock-3.14.0.tar.gz", hash = "sha256:2719255a1efeceadbc056d6bf3df3d1c5015530fb40cf347c0f9afac88410bd0", size = 32814 } wheels = [ { url = "https://files.pythonhosted.org/packages/f2/3b/b26f90f74e2986a82df6e7ac7e319b8ea7ccece1caec9f8ab6104dc70603/pytest_mock-3.14.0-py3-none-any.whl", hash = "sha256:0b72c38033392a5f4621342fe11e9219ac11ec9d375f8e2a0c164539e0d70f6f", size = 9863 }, ] [[package]] name = "pytest-pretty" version = "1.2.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "pytest" }, { name = "rich" }, ] sdist = { url = "https://files.pythonhosted.org/packages/a5/18/30ad0408295f3157f7a4913f0eaa51a0a377ebad0ffa51ff239e833c6c72/pytest_pretty-1.2.0.tar.gz", hash = "sha256:105a355f128e392860ad2c478ae173ff96d2f03044692f9818ff3d49205d3a60", size = 6542 } 
wheels = [ { url = "https://files.pythonhosted.org/packages/bf/fe/d44d391312c1b8abee2af58ee70fabb1c00b6577ac4e0bdf25b70c1caffb/pytest_pretty-1.2.0-py3-none-any.whl", hash = "sha256:6f79122bf53864ae2951b6c9e94d7a06a87ef753476acd4588aeac018f062036", size = 6180 }, ] [[package]] name = "python-dateutil" version = "2.9.0.post0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "six" }, ] sdist = { url = "https://files.pythonhosted.org/packages/66/c0/0c8b6ad9f17a802ee498c46e004a0eb49bc148f2fd230864601a86dcf6db/python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3", size = 342432 } wheels = [ { url = "https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427", size = 229892 }, ] [[package]] name = "python-dotenv" version = "1.0.1" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/bc/57/e84d88dfe0aec03b7a2d4327012c1627ab5f03652216c63d49846d7a6c58/python-dotenv-1.0.1.tar.gz", hash = "sha256:e324ee90a023d808f1959c46bcbc04446a10ced277783dc6ee09987c37ec10ca", size = 39115 } wheels = [ { url = "https://files.pythonhosted.org/packages/6a/3e/b68c118422ec867fa7ab88444e1274aa40681c606d59ac27de5a5588f082/python_dotenv-1.0.1-py3-none-any.whl", hash = "sha256:f7b63ef50f1b690dddf550d03497b66d609393b40b564ed0d674909a68ebf16a", size = 19863 }, ] [[package]] name = "pytz" version = "2024.2" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/3a/31/3c70bf7603cc2dca0f19bdc53b4537a797747a58875b552c8c413d963a3f/pytz-2024.2.tar.gz", hash = "sha256:2aa355083c50a0f93fa581709deac0c9ad65cca8a9e9beac660adcbd493c798a", size = 319692 } wheels = [ { url = 
"https://files.pythonhosted.org/packages/11/c3/005fcca25ce078d2cc29fd559379817424e94885510568bc1bc53d7d5846/pytz-2024.2-py2.py3-none-any.whl", hash = "sha256:31c7c1817eb7fae7ca4b8c7ee50c72f93aa2dd863de768e1ef4245d426aa0725", size = 508002 }, ] [[package]] name = "pyupgrade" version = "3.8.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "tokenize-rt" }, ] sdist = { url = "https://files.pythonhosted.org/packages/fc/de/b89bb5af97a3712407db232b8a43eaf8d19657cdb5740cde22650bb182d9/pyupgrade-3.8.0.tar.gz", hash = "sha256:1facb0b8407cca468dfcc1d13717e3a85aa37b9e6e7338664ad5bfe5ef50c867", size = 42768 } wheels = [ { url = "https://files.pythonhosted.org/packages/5e/bd/793d8a359e534ed77326858c2c7182037a2a8fd446828688b9c48c9983d4/pyupgrade-3.8.0-py2.py3-none-any.whl", hash = "sha256:08d0e6129f5e9da7e7a581bdbea689e0d49c3c93eeaf156a07ae2fd794f52660", size = 58284 }, ] [[package]] name = "pyyaml" version = "6.0.2" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/54/ed/79a089b6be93607fa5cdaedf301d7dfb23af5f25c398d5ead2525b063e17/pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e", size = 130631 } wheels = [ { url = "https://files.pythonhosted.org/packages/9b/95/a3fac87cb7158e231b5a6012e438c647e1a87f09f8e0d123acec8ab8bf71/PyYAML-6.0.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0a9a2848a5b7feac301353437eb7d5957887edbf81d56e903999a75a3d743086", size = 184199 }, { url = "https://files.pythonhosted.org/packages/c7/7a/68bd47624dab8fd4afbfd3c48e3b79efe09098ae941de5b58abcbadff5cb/PyYAML-6.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:29717114e51c84ddfba879543fb232a6ed60086602313ca38cce623c1d62cfbf", size = 171758 }, { url = "https://files.pythonhosted.org/packages/49/ee/14c54df452143b9ee9f0f29074d7ca5516a36edb0b4cc40c3f280131656f/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:8824b5a04a04a047e72eea5cec3bc266db09e35de6bdfe34c9436ac5ee27d237", size = 718463 }, { url = "https://files.pythonhosted.org/packages/4d/61/de363a97476e766574650d742205be468921a7b532aa2499fcd886b62530/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7c36280e6fb8385e520936c3cb3b8042851904eba0e58d277dca80a5cfed590b", size = 719280 }, { url = "https://files.pythonhosted.org/packages/6b/4e/1523cb902fd98355e2e9ea5e5eb237cbc5f3ad5f3075fa65087aa0ecb669/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ec031d5d2feb36d1d1a24380e4db6d43695f3748343d99434e6f5f9156aaa2ed", size = 751239 }, { url = "https://files.pythonhosted.org/packages/b7/33/5504b3a9a4464893c32f118a9cc045190a91637b119a9c881da1cf6b7a72/PyYAML-6.0.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:936d68689298c36b53b29f23c6dbb74de12b4ac12ca6cfe0e047bedceea56180", size = 695802 }, { url = "https://files.pythonhosted.org/packages/5c/20/8347dcabd41ef3a3cdc4f7b7a2aff3d06598c8779faa189cdbf878b626a4/PyYAML-6.0.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:23502f431948090f597378482b4812b0caae32c22213aecf3b55325e049a6c68", size = 720527 }, { url = "https://files.pythonhosted.org/packages/be/aa/5afe99233fb360d0ff37377145a949ae258aaab831bde4792b32650a4378/PyYAML-6.0.2-cp310-cp310-win32.whl", hash = "sha256:2e99c6826ffa974fe6e27cdb5ed0021786b03fc98e5ee3c5bfe1fd5015f42b99", size = 144052 }, { url = "https://files.pythonhosted.org/packages/b5/84/0fa4b06f6d6c958d207620fc60005e241ecedceee58931bb20138e1e5776/PyYAML-6.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:a4d3091415f010369ae4ed1fc6b79def9416358877534caf6a0fdd2146c87a3e", size = 161774 }, { url = "https://files.pythonhosted.org/packages/f8/aa/7af4e81f7acba21a4c6be026da38fd2b872ca46226673c89a758ebdc4fd2/PyYAML-6.0.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:cc1c1159b3d456576af7a3e4d1ba7e6924cb39de8f67111c735f6fc832082774", size = 184612 }, { url = 
"https://files.pythonhosted.org/packages/8b/62/b9faa998fd185f65c1371643678e4d58254add437edb764a08c5a98fb986/PyYAML-6.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1e2120ef853f59c7419231f3bf4e7021f1b936f6ebd222406c3b60212205d2ee", size = 172040 }, { url = "https://files.pythonhosted.org/packages/ad/0c/c804f5f922a9a6563bab712d8dcc70251e8af811fce4524d57c2c0fd49a4/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d225db5a45f21e78dd9358e58a98702a0302f2659a3c6cd320564b75b86f47c", size = 736829 }, { url = "https://files.pythonhosted.org/packages/51/16/6af8d6a6b210c8e54f1406a6b9481febf9c64a3109c541567e35a49aa2e7/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ac9328ec4831237bec75defaf839f7d4564be1e6b25ac710bd1a96321cc8317", size = 764167 }, { url = "https://files.pythonhosted.org/packages/75/e4/2c27590dfc9992f73aabbeb9241ae20220bd9452df27483b6e56d3975cc5/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ad2a3decf9aaba3d29c8f537ac4b243e36bef957511b4766cb0057d32b0be85", size = 762952 }, { url = "https://files.pythonhosted.org/packages/9b/97/ecc1abf4a823f5ac61941a9c00fe501b02ac3ab0e373c3857f7d4b83e2b6/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ff3824dc5261f50c9b0dfb3be22b4567a6f938ccce4587b38952d85fd9e9afe4", size = 735301 }, { url = "https://files.pythonhosted.org/packages/45/73/0f49dacd6e82c9430e46f4a027baa4ca205e8b0a9dce1397f44edc23559d/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:797b4f722ffa07cc8d62053e4cff1486fa6dc094105d13fea7b1de7d8bf71c9e", size = 756638 }, { url = "https://files.pythonhosted.org/packages/22/5f/956f0f9fc65223a58fbc14459bf34b4cc48dec52e00535c79b8db361aabd/PyYAML-6.0.2-cp311-cp311-win32.whl", hash = "sha256:11d8f3dd2b9c1207dcaf2ee0bbbfd5991f571186ec9cc78427ba5bd32afae4b5", size = 143850 }, { url = 
"https://files.pythonhosted.org/packages/ed/23/8da0bbe2ab9dcdd11f4f4557ccaf95c10b9811b13ecced089d43ce59c3c8/PyYAML-6.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:e10ce637b18caea04431ce14fabcf5c64a1c61ec9c56b071a4b7ca131ca52d44", size = 161980 }, { url = "https://files.pythonhosted.org/packages/86/0c/c581167fc46d6d6d7ddcfb8c843a4de25bdd27e4466938109ca68492292c/PyYAML-6.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab", size = 183873 }, { url = "https://files.pythonhosted.org/packages/a8/0c/38374f5bb272c051e2a69281d71cba6fdb983413e6758b84482905e29a5d/PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725", size = 173302 }, { url = "https://files.pythonhosted.org/packages/c3/93/9916574aa8c00aa06bbac729972eb1071d002b8e158bd0e83a3b9a20a1f7/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5", size = 739154 }, { url = "https://files.pythonhosted.org/packages/95/0f/b8938f1cbd09739c6da569d172531567dbcc9789e0029aa070856f123984/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425", size = 766223 }, { url = "https://files.pythonhosted.org/packages/b9/2b/614b4752f2e127db5cc206abc23a8c19678e92b23c3db30fc86ab731d3bd/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476", size = 767542 }, { url = "https://files.pythonhosted.org/packages/d4/00/dd137d5bcc7efea1836d6264f049359861cf548469d18da90cd8216cf05f/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48", size = 731164 }, { url = 
"https://files.pythonhosted.org/packages/c9/1f/4f998c900485e5c0ef43838363ba4a9723ac0ad73a9dc42068b12aaba4e4/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b", size = 756611 }, { url = "https://files.pythonhosted.org/packages/df/d1/f5a275fdb252768b7a11ec63585bc38d0e87c9e05668a139fea92b80634c/PyYAML-6.0.2-cp312-cp312-win32.whl", hash = "sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4", size = 140591 }, { url = "https://files.pythonhosted.org/packages/0c/e8/4f648c598b17c3d06e8753d7d13d57542b30d56e6c2dedf9c331ae56312e/PyYAML-6.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8", size = 156338 }, { url = "https://files.pythonhosted.org/packages/ef/e3/3af305b830494fa85d95f6d95ef7fa73f2ee1cc8ef5b495c7c3269fb835f/PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba", size = 181309 }, { url = "https://files.pythonhosted.org/packages/45/9f/3b1c20a0b7a3200524eb0076cc027a970d320bd3a6592873c85c92a08731/PyYAML-6.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1", size = 171679 }, { url = "https://files.pythonhosted.org/packages/7c/9a/337322f27005c33bcb656c655fa78325b730324c78620e8328ae28b64d0c/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133", size = 733428 }, { url = "https://files.pythonhosted.org/packages/a3/69/864fbe19e6c18ea3cc196cbe5d392175b4cf3d5d0ac1403ec3f2d237ebb5/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484", size = 763361 }, { url = 
"https://files.pythonhosted.org/packages/04/24/b7721e4845c2f162d26f50521b825fb061bc0a5afcf9a386840f23ea19fa/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5", size = 759523 }, { url = "https://files.pythonhosted.org/packages/2b/b2/e3234f59ba06559c6ff63c4e10baea10e5e7df868092bf9ab40e5b9c56b6/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc", size = 726660 }, { url = "https://files.pythonhosted.org/packages/fe/0f/25911a9f080464c59fab9027482f822b86bf0608957a5fcc6eaac85aa515/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652", size = 751597 }, { url = "https://files.pythonhosted.org/packages/14/0d/e2c3b43bbce3cf6bd97c840b46088a3031085179e596d4929729d8d68270/PyYAML-6.0.2-cp313-cp313-win32.whl", hash = "sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183", size = 140527 }, { url = "https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563", size = 156446 }, { url = "https://files.pythonhosted.org/packages/74/d9/323a59d506f12f498c2097488d80d16f4cf965cee1791eab58b56b19f47a/PyYAML-6.0.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:24471b829b3bf607e04e88d79542a9d48bb037c2267d7927a874e6c205ca7e9a", size = 183218 }, { url = "https://files.pythonhosted.org/packages/74/cc/20c34d00f04d785f2028737e2e2a8254e1425102e730fee1d6396f832577/PyYAML-6.0.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d7fded462629cfa4b685c5416b949ebad6cec74af5e2d42905d41e257e0869f5", size = 728067 }, { url = 
"https://files.pythonhosted.org/packages/20/52/551c69ca1501d21c0de51ddafa8c23a0191ef296ff098e98358f69080577/PyYAML-6.0.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d84a1718ee396f54f3a086ea0a66d8e552b2ab2017ef8b420e92edbc841c352d", size = 757812 }, { url = "https://files.pythonhosted.org/packages/fd/7f/2c3697bba5d4aa5cc2afe81826d73dfae5f049458e44732c7a0938baa673/PyYAML-6.0.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9056c1ecd25795207ad294bcf39f2db3d845767be0ea6e6a34d856f006006083", size = 746531 }, { url = "https://files.pythonhosted.org/packages/8c/ab/6226d3df99900e580091bb44258fde77a8433511a86883bd4681ea19a858/PyYAML-6.0.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:82d09873e40955485746739bcb8b4586983670466c23382c19cffecbf1fd8706", size = 800820 }, { url = "https://files.pythonhosted.org/packages/a0/99/a9eb0f3e710c06c5d922026f6736e920d431812ace24aae38228d0d64b04/PyYAML-6.0.2-cp38-cp38-win32.whl", hash = "sha256:43fa96a3ca0d6b1812e01ced1044a003533c47f6ee8aca31724f78e93ccc089a", size = 145514 }, { url = "https://files.pythonhosted.org/packages/75/8a/ee831ad5fafa4431099aa4e078d4c8efd43cd5e48fbc774641d233b683a9/PyYAML-6.0.2-cp38-cp38-win_amd64.whl", hash = "sha256:01179a4a8559ab5de078078f37e5c1a30d76bb88519906844fd7bdea1b7729ff", size = 162702 }, { url = "https://files.pythonhosted.org/packages/65/d8/b7a1db13636d7fb7d4ff431593c510c8b8fca920ade06ca8ef20015493c5/PyYAML-6.0.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:688ba32a1cffef67fd2e9398a2efebaea461578b0923624778664cc1c914db5d", size = 184777 }, { url = "https://files.pythonhosted.org/packages/0a/02/6ec546cd45143fdf9840b2c6be8d875116a64076218b61d68e12548e5839/PyYAML-6.0.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a8786accb172bd8afb8be14490a16625cbc387036876ab6ba70912730faf8e1f", size = 172318 }, { url = 
"https://files.pythonhosted.org/packages/0e/9a/8cc68be846c972bda34f6c2a93abb644fb2476f4dcc924d52175786932c9/PyYAML-6.0.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d8e03406cac8513435335dbab54c0d385e4a49e4945d2909a581c83647ca0290", size = 720891 }, { url = "https://files.pythonhosted.org/packages/e9/6c/6e1b7f40181bc4805e2e07f4abc10a88ce4648e7e95ff1abe4ae4014a9b2/PyYAML-6.0.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f753120cb8181e736c57ef7636e83f31b9c0d1722c516f7e86cf15b7aa57ff12", size = 722614 }, { url = "https://files.pythonhosted.org/packages/3d/32/e7bd8535d22ea2874cef6a81021ba019474ace0d13a4819c2a4bce79bd6a/PyYAML-6.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3b1fdb9dc17f5a7677423d508ab4f243a726dea51fa5e70992e59a7411c89d19", size = 737360 }, { url = "https://files.pythonhosted.org/packages/d7/12/7322c1e30b9be969670b672573d45479edef72c9a0deac3bb2868f5d7469/PyYAML-6.0.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:0b69e4ce7a131fe56b7e4d770c67429700908fc0752af059838b1cfb41960e4e", size = 699006 }, { url = "https://files.pythonhosted.org/packages/82/72/04fcad41ca56491995076630c3ec1e834be241664c0c09a64c9a2589b507/PyYAML-6.0.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:a9f8c2e67970f13b16084e04f134610fd1d374bf477b17ec1599185cf611d725", size = 723577 }, { url = "https://files.pythonhosted.org/packages/ed/5e/46168b1f2757f1fcd442bc3029cd8767d88a98c9c05770d8b420948743bb/PyYAML-6.0.2-cp39-cp39-win32.whl", hash = "sha256:6395c297d42274772abc367baaa79683958044e5d3835486c16da75d2a694631", size = 144593 }, { url = "https://files.pythonhosted.org/packages/19/87/5124b1c1f2412bb95c59ec481eaf936cd32f0fe2a7b16b97b81c4c017a6a/PyYAML-6.0.2-cp39-cp39-win_amd64.whl", hash = "sha256:39693e1f8320ae4f43943590b49779ffb98acb81f788220ea932a6b6c51004d8", size = 162312 }, ] [[package]] name = "pyyaml-env-tag" version = "0.1" source = { registry = "https://pypi.org/simple" } 
dependencies = [ { name = "pyyaml" }, ] sdist = { url = "https://files.pythonhosted.org/packages/fb/8e/da1c6c58f751b70f8ceb1eb25bc25d524e8f14fe16edcce3f4e3ba08629c/pyyaml_env_tag-0.1.tar.gz", hash = "sha256:70092675bda14fdec33b31ba77e7543de9ddc88f2e5b99160396572d11525bdb", size = 5631 } wheels = [ { url = "https://files.pythonhosted.org/packages/5a/66/bbb1dd374f5c870f59c5bb1db0e18cbe7fa739415a24cbd95b2d1f5ae0c4/pyyaml_env_tag-0.1-py3-none-any.whl", hash = "sha256:af31106dec8a4d68c60207c1886031cbf839b68aa7abccdb19868200532c2069", size = 3911 }, ] [[package]] name = "referencing" version = "0.35.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "attrs" }, { name = "rpds-py" }, ] sdist = { url = "https://files.pythonhosted.org/packages/99/5b/73ca1f8e72fff6fa52119dbd185f73a907b1989428917b24cff660129b6d/referencing-0.35.1.tar.gz", hash = "sha256:25b42124a6c8b632a425174f24087783efb348a6f1e0008e63cd4466fedf703c", size = 62991 } wheels = [ { url = "https://files.pythonhosted.org/packages/b7/59/2056f61236782a2c86b33906c025d4f4a0b17be0161b63b70fd9e8775d36/referencing-0.35.1-py3-none-any.whl", hash = "sha256:eda6d3234d62814d1c64e305c1331c9a3a6132da475ab6382eaa997b21ee75de", size = 26684 }, ] [[package]] name = "regex" version = "2024.11.6" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/8e/5f/bd69653fbfb76cf8604468d3b4ec4c403197144c7bfe0e6a5fc9e02a07cb/regex-2024.11.6.tar.gz", hash = "sha256:7ab159b063c52a0333c884e4679f8d7a85112ee3078fe3d9004b2dd875585519", size = 399494 } wheels = [ { url = "https://files.pythonhosted.org/packages/95/3c/4651f6b130c6842a8f3df82461a8950f923925db8b6961063e82744bddcc/regex-2024.11.6-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:ff590880083d60acc0433f9c3f713c51f7ac6ebb9adf889c79a261ecf541aa91", size = 482674 }, { url = 
"https://files.pythonhosted.org/packages/15/51/9f35d12da8434b489c7b7bffc205c474a0a9432a889457026e9bc06a297a/regex-2024.11.6-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:658f90550f38270639e83ce492f27d2c8d2cd63805c65a13a14d36ca126753f0", size = 287684 }, { url = "https://files.pythonhosted.org/packages/bd/18/b731f5510d1b8fb63c6b6d3484bfa9a59b84cc578ac8b5172970e05ae07c/regex-2024.11.6-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:164d8b7b3b4bcb2068b97428060b2a53be050085ef94eca7f240e7947f1b080e", size = 284589 }, { url = "https://files.pythonhosted.org/packages/78/a2/6dd36e16341ab95e4c6073426561b9bfdeb1a9c9b63ab1b579c2e96cb105/regex-2024.11.6-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d3660c82f209655a06b587d55e723f0b813d3a7db2e32e5e7dc64ac2a9e86fde", size = 782511 }, { url = "https://files.pythonhosted.org/packages/1b/2b/323e72d5d2fd8de0d9baa443e1ed70363ed7e7b2fb526f5950c5cb99c364/regex-2024.11.6-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d22326fcdef5e08c154280b71163ced384b428343ae16a5ab2b3354aed12436e", size = 821149 }, { url = "https://files.pythonhosted.org/packages/90/30/63373b9ea468fbef8a907fd273e5c329b8c9535fee36fc8dba5fecac475d/regex-2024.11.6-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f1ac758ef6aebfc8943560194e9fd0fa18bcb34d89fd8bd2af18183afd8da3a2", size = 809707 }, { url = "https://files.pythonhosted.org/packages/f2/98/26d3830875b53071f1f0ae6d547f1d98e964dd29ad35cbf94439120bb67a/regex-2024.11.6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:997d6a487ff00807ba810e0f8332c18b4eb8d29463cfb7c820dc4b6e7562d0cf", size = 781702 }, { url = "https://files.pythonhosted.org/packages/87/55/eb2a068334274db86208ab9d5599ffa63631b9f0f67ed70ea7c82a69bbc8/regex-2024.11.6-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = 
"sha256:02a02d2bb04fec86ad61f3ea7f49c015a0681bf76abb9857f945d26159d2968c", size = 771976 }, { url = "https://files.pythonhosted.org/packages/74/c0/be707bcfe98254d8f9d2cff55d216e946f4ea48ad2fd8cf1428f8c5332ba/regex-2024.11.6-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f02f93b92358ee3f78660e43b4b0091229260c5d5c408d17d60bf26b6c900e86", size = 697397 }, { url = "https://files.pythonhosted.org/packages/49/dc/bb45572ceb49e0f6509f7596e4ba7031f6819ecb26bc7610979af5a77f45/regex-2024.11.6-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:06eb1be98df10e81ebaded73fcd51989dcf534e3c753466e4b60c4697a003b67", size = 768726 }, { url = "https://files.pythonhosted.org/packages/5a/db/f43fd75dc4c0c2d96d0881967897926942e935d700863666f3c844a72ce6/regex-2024.11.6-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:040df6fe1a5504eb0f04f048e6d09cd7c7110fef851d7c567a6b6e09942feb7d", size = 775098 }, { url = "https://files.pythonhosted.org/packages/99/d7/f94154db29ab5a89d69ff893159b19ada89e76b915c1293e98603d39838c/regex-2024.11.6-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:fdabbfc59f2c6edba2a6622c647b716e34e8e3867e0ab975412c5c2f79b82da2", size = 839325 }, { url = "https://files.pythonhosted.org/packages/f7/17/3cbfab1f23356fbbf07708220ab438a7efa1e0f34195bf857433f79f1788/regex-2024.11.6-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:8447d2d39b5abe381419319f942de20b7ecd60ce86f16a23b0698f22e1b70008", size = 843277 }, { url = "https://files.pythonhosted.org/packages/7e/f2/48b393b51900456155de3ad001900f94298965e1cad1c772b87f9cfea011/regex-2024.11.6-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:da8f5fc57d1933de22a9e23eec290a0d8a5927a5370d24bda9a6abe50683fe62", size = 773197 }, { url = "https://files.pythonhosted.org/packages/45/3f/ef9589aba93e084cd3f8471fded352826dcae8489b650d0b9b27bc5bba8a/regex-2024.11.6-cp310-cp310-win32.whl", hash = 
"sha256:b489578720afb782f6ccf2840920f3a32e31ba28a4b162e13900c3e6bd3f930e", size = 261714 }, { url = "https://files.pythonhosted.org/packages/42/7e/5f1b92c8468290c465fd50c5318da64319133231415a8aa6ea5ab995a815/regex-2024.11.6-cp310-cp310-win_amd64.whl", hash = "sha256:5071b2093e793357c9d8b2929dfc13ac5f0a6c650559503bb81189d0a3814519", size = 274042 }, { url = "https://files.pythonhosted.org/packages/58/58/7e4d9493a66c88a7da6d205768119f51af0f684fe7be7bac8328e217a52c/regex-2024.11.6-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:5478c6962ad548b54a591778e93cd7c456a7a29f8eca9c49e4f9a806dcc5d638", size = 482669 }, { url = "https://files.pythonhosted.org/packages/34/4c/8f8e631fcdc2ff978609eaeef1d6994bf2f028b59d9ac67640ed051f1218/regex-2024.11.6-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:2c89a8cc122b25ce6945f0423dc1352cb9593c68abd19223eebbd4e56612c5b7", size = 287684 }, { url = "https://files.pythonhosted.org/packages/c5/1b/f0e4d13e6adf866ce9b069e191f303a30ab1277e037037a365c3aad5cc9c/regex-2024.11.6-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:94d87b689cdd831934fa3ce16cc15cd65748e6d689f5d2b8f4f4df2065c9fa20", size = 284589 }, { url = "https://files.pythonhosted.org/packages/25/4d/ab21047f446693887f25510887e6820b93f791992994f6498b0318904d4a/regex-2024.11.6-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1062b39a0a2b75a9c694f7a08e7183a80c63c0d62b301418ffd9c35f55aaa114", size = 792121 }, { url = "https://files.pythonhosted.org/packages/45/ee/c867e15cd894985cb32b731d89576c41a4642a57850c162490ea34b78c3b/regex-2024.11.6-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:167ed4852351d8a750da48712c3930b031f6efdaa0f22fa1933716bfcd6bf4a3", size = 831275 }, { url = "https://files.pythonhosted.org/packages/b3/12/b0f480726cf1c60f6536fa5e1c95275a77624f3ac8fdccf79e6727499e28/regex-2024.11.6-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:2d548dafee61f06ebdb584080621f3e0c23fff312f0de1afc776e2a2ba99a74f", size = 818257 }, { url = "https://files.pythonhosted.org/packages/bf/ce/0d0e61429f603bac433910d99ef1a02ce45a8967ffbe3cbee48599e62d88/regex-2024.11.6-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f2a19f302cd1ce5dd01a9099aaa19cae6173306d1302a43b627f62e21cf18ac0", size = 792727 }, { url = "https://files.pythonhosted.org/packages/e4/c1/243c83c53d4a419c1556f43777ccb552bccdf79d08fda3980e4e77dd9137/regex-2024.11.6-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bec9931dfb61ddd8ef2ebc05646293812cb6b16b60cf7c9511a832b6f1854b55", size = 780667 }, { url = "https://files.pythonhosted.org/packages/c5/f4/75eb0dd4ce4b37f04928987f1d22547ddaf6c4bae697623c1b05da67a8aa/regex-2024.11.6-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:9714398225f299aa85267fd222f7142fcb5c769e73d7733344efc46f2ef5cf89", size = 776963 }, { url = "https://files.pythonhosted.org/packages/16/5d/95c568574e630e141a69ff8a254c2f188b4398e813c40d49228c9bbd9875/regex-2024.11.6-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:202eb32e89f60fc147a41e55cb086db2a3f8cb82f9a9a88440dcfc5d37faae8d", size = 784700 }, { url = "https://files.pythonhosted.org/packages/8e/b5/f8495c7917f15cc6fee1e7f395e324ec3e00ab3c665a7dc9d27562fd5290/regex-2024.11.6-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:4181b814e56078e9b00427ca358ec44333765f5ca1b45597ec7446d3a1ef6e34", size = 848592 }, { url = "https://files.pythonhosted.org/packages/1c/80/6dd7118e8cb212c3c60b191b932dc57db93fb2e36fb9e0e92f72a5909af9/regex-2024.11.6-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:068376da5a7e4da51968ce4c122a7cd31afaaec4fccc7856c92f63876e57b51d", size = 852929 }, { url = "https://files.pythonhosted.org/packages/11/9b/5a05d2040297d2d254baf95eeeb6df83554e5e1df03bc1a6687fc4ba1f66/regex-2024.11.6-cp311-cp311-musllinux_1_2_x86_64.whl", hash = 
"sha256:ac10f2c4184420d881a3475fb2c6f4d95d53a8d50209a2500723d831036f7c45", size = 781213 }, { url = "https://files.pythonhosted.org/packages/26/b7/b14e2440156ab39e0177506c08c18accaf2b8932e39fb092074de733d868/regex-2024.11.6-cp311-cp311-win32.whl", hash = "sha256:c36f9b6f5f8649bb251a5f3f66564438977b7ef8386a52460ae77e6070d309d9", size = 261734 }, { url = "https://files.pythonhosted.org/packages/80/32/763a6cc01d21fb3819227a1cc3f60fd251c13c37c27a73b8ff4315433a8e/regex-2024.11.6-cp311-cp311-win_amd64.whl", hash = "sha256:02e28184be537f0e75c1f9b2f8847dc51e08e6e171c6bde130b2687e0c33cf60", size = 274052 }, { url = "https://files.pythonhosted.org/packages/ba/30/9a87ce8336b172cc232a0db89a3af97929d06c11ceaa19d97d84fa90a8f8/regex-2024.11.6-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:52fb28f528778f184f870b7cf8f225f5eef0a8f6e3778529bdd40c7b3920796a", size = 483781 }, { url = "https://files.pythonhosted.org/packages/01/e8/00008ad4ff4be8b1844786ba6636035f7ef926db5686e4c0f98093612add/regex-2024.11.6-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:fdd6028445d2460f33136c55eeb1f601ab06d74cb3347132e1c24250187500d9", size = 288455 }, { url = "https://files.pythonhosted.org/packages/60/85/cebcc0aff603ea0a201667b203f13ba75d9fc8668fab917ac5b2de3967bc/regex-2024.11.6-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:805e6b60c54bf766b251e94526ebad60b7de0c70f70a4e6210ee2891acb70bf2", size = 284759 }, { url = "https://files.pythonhosted.org/packages/94/2b/701a4b0585cb05472a4da28ee28fdfe155f3638f5e1ec92306d924e5faf0/regex-2024.11.6-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b85c2530be953a890eaffde05485238f07029600e8f098cdf1848d414a8b45e4", size = 794976 }, { url = "https://files.pythonhosted.org/packages/4b/bf/fa87e563bf5fee75db8915f7352e1887b1249126a1be4813837f5dbec965/regex-2024.11.6-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bb26437975da7dc36b7efad18aa9dd4ea569d2357ae6b783bf1118dabd9ea577", size = 833077 
}, { url = "https://files.pythonhosted.org/packages/a1/56/7295e6bad94b047f4d0834e4779491b81216583c00c288252ef625c01d23/regex-2024.11.6-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:abfa5080c374a76a251ba60683242bc17eeb2c9818d0d30117b4486be10c59d3", size = 823160 }, { url = "https://files.pythonhosted.org/packages/fb/13/e3b075031a738c9598c51cfbc4c7879e26729c53aa9cca59211c44235314/regex-2024.11.6-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b7fa6606c2881c1db9479b0eaa11ed5dfa11c8d60a474ff0e095099f39d98e", size = 796896 }, { url = "https://files.pythonhosted.org/packages/24/56/0b3f1b66d592be6efec23a795b37732682520b47c53da5a32c33ed7d84e3/regex-2024.11.6-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0c32f75920cf99fe6b6c539c399a4a128452eaf1af27f39bce8909c9a3fd8cbe", size = 783997 }, { url = "https://files.pythonhosted.org/packages/f9/a1/eb378dada8b91c0e4c5f08ffb56f25fcae47bf52ad18f9b2f33b83e6d498/regex-2024.11.6-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:982e6d21414e78e1f51cf595d7f321dcd14de1f2881c5dc6a6e23bbbbd68435e", size = 781725 }, { url = "https://files.pythonhosted.org/packages/83/f2/033e7dec0cfd6dda93390089864732a3409246ffe8b042e9554afa9bff4e/regex-2024.11.6-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:a7c2155f790e2fb448faed6dd241386719802296ec588a8b9051c1f5c481bc29", size = 789481 }, { url = "https://files.pythonhosted.org/packages/83/23/15d4552ea28990a74e7696780c438aadd73a20318c47e527b47a4a5a596d/regex-2024.11.6-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:149f5008d286636e48cd0b1dd65018548944e495b0265b45e1bffecce1ef7f39", size = 852896 }, { url = "https://files.pythonhosted.org/packages/e3/39/ed4416bc90deedbfdada2568b2cb0bc1fdb98efe11f5378d9892b2a88f8f/regex-2024.11.6-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:e5364a4502efca094731680e80009632ad6624084aff9a23ce8c8c6820de3e51", size = 860138 }, { url = 
"https://files.pythonhosted.org/packages/93/2d/dd56bb76bd8e95bbce684326302f287455b56242a4f9c61f1bc76e28360e/regex-2024.11.6-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:0a86e7eeca091c09e021db8eb72d54751e527fa47b8d5787caf96d9831bd02ad", size = 787692 }, { url = "https://files.pythonhosted.org/packages/0b/55/31877a249ab7a5156758246b9c59539abbeba22461b7d8adc9e8475ff73e/regex-2024.11.6-cp312-cp312-win32.whl", hash = "sha256:32f9a4c643baad4efa81d549c2aadefaeba12249b2adc5af541759237eee1c54", size = 262135 }, { url = "https://files.pythonhosted.org/packages/38/ec/ad2d7de49a600cdb8dd78434a1aeffe28b9d6fc42eb36afab4a27ad23384/regex-2024.11.6-cp312-cp312-win_amd64.whl", hash = "sha256:a93c194e2df18f7d264092dc8539b8ffb86b45b899ab976aa15d48214138e81b", size = 273567 }, { url = "https://files.pythonhosted.org/packages/90/73/bcb0e36614601016552fa9344544a3a2ae1809dc1401b100eab02e772e1f/regex-2024.11.6-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a6ba92c0bcdf96cbf43a12c717eae4bc98325ca3730f6b130ffa2e3c3c723d84", size = 483525 }, { url = "https://files.pythonhosted.org/packages/0f/3f/f1a082a46b31e25291d830b369b6b0c5576a6f7fb89d3053a354c24b8a83/regex-2024.11.6-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:525eab0b789891ac3be914d36893bdf972d483fe66551f79d3e27146191a37d4", size = 288324 }, { url = "https://files.pythonhosted.org/packages/09/c9/4e68181a4a652fb3ef5099e077faf4fd2a694ea6e0f806a7737aff9e758a/regex-2024.11.6-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:086a27a0b4ca227941700e0b31425e7a28ef1ae8e5e05a33826e17e47fbfdba0", size = 284617 }, { url = "https://files.pythonhosted.org/packages/fc/fd/37868b75eaf63843165f1d2122ca6cb94bfc0271e4428cf58c0616786dce/regex-2024.11.6-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bde01f35767c4a7899b7eb6e823b125a64de314a8ee9791367c9a34d56af18d0", size = 795023 }, { url = 
"https://files.pythonhosted.org/packages/c4/7c/d4cd9c528502a3dedb5c13c146e7a7a539a3853dc20209c8e75d9ba9d1b2/regex-2024.11.6-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b583904576650166b3d920d2bcce13971f6f9e9a396c673187f49811b2769dc7", size = 833072 }, { url = "https://files.pythonhosted.org/packages/4f/db/46f563a08f969159c5a0f0e722260568425363bea43bb7ae370becb66a67/regex-2024.11.6-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1c4de13f06a0d54fa0d5ab1b7138bfa0d883220965a29616e3ea61b35d5f5fc7", size = 823130 }, { url = "https://files.pythonhosted.org/packages/db/60/1eeca2074f5b87df394fccaa432ae3fc06c9c9bfa97c5051aed70e6e00c2/regex-2024.11.6-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3cde6e9f2580eb1665965ce9bf17ff4952f34f5b126beb509fee8f4e994f143c", size = 796857 }, { url = "https://files.pythonhosted.org/packages/10/db/ac718a08fcee981554d2f7bb8402f1faa7e868c1345c16ab1ebec54b0d7b/regex-2024.11.6-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0d7f453dca13f40a02b79636a339c5b62b670141e63efd511d3f8f73fba162b3", size = 784006 }, { url = "https://files.pythonhosted.org/packages/c2/41/7da3fe70216cea93144bf12da2b87367590bcf07db97604edeea55dac9ad/regex-2024.11.6-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:59dfe1ed21aea057a65c6b586afd2a945de04fc7db3de0a6e3ed5397ad491b07", size = 781650 }, { url = "https://files.pythonhosted.org/packages/a7/d5/880921ee4eec393a4752e6ab9f0fe28009435417c3102fc413f3fe81c4e5/regex-2024.11.6-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:b97c1e0bd37c5cd7902e65f410779d39eeda155800b65fc4d04cc432efa9bc6e", size = 789545 }, { url = "https://files.pythonhosted.org/packages/dc/96/53770115e507081122beca8899ab7f5ae28ae790bfcc82b5e38976df6a77/regex-2024.11.6-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:f9d1e379028e0fc2ae3654bac3cbbef81bf3fd571272a42d56c24007979bafb6", size = 853045 
}, { url = "https://files.pythonhosted.org/packages/31/d3/1372add5251cc2d44b451bd94f43b2ec78e15a6e82bff6a290ef9fd8f00a/regex-2024.11.6-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:13291b39131e2d002a7940fb176e120bec5145f3aeb7621be6534e46251912c4", size = 860182 }, { url = "https://files.pythonhosted.org/packages/ed/e3/c446a64984ea9f69982ba1a69d4658d5014bc7a0ea468a07e1a1265db6e2/regex-2024.11.6-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4f51f88c126370dcec4908576c5a627220da6c09d0bff31cfa89f2523843316d", size = 787733 }, { url = "https://files.pythonhosted.org/packages/2b/f1/e40c8373e3480e4f29f2692bd21b3e05f296d3afebc7e5dcf21b9756ca1c/regex-2024.11.6-cp313-cp313-win32.whl", hash = "sha256:63b13cfd72e9601125027202cad74995ab26921d8cd935c25f09c630436348ff", size = 262122 }, { url = "https://files.pythonhosted.org/packages/45/94/bc295babb3062a731f52621cdc992d123111282e291abaf23faa413443ea/regex-2024.11.6-cp313-cp313-win_amd64.whl", hash = "sha256:2b3361af3198667e99927da8b84c1b010752fa4b1115ee30beaa332cabc3ef1a", size = 273545 }, { url = "https://files.pythonhosted.org/packages/44/0f/207b37e6e08d548fac0aa00bf0b7464126315d58ab5161216b8cb3abb2aa/regex-2024.11.6-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:3a51ccc315653ba012774efca4f23d1d2a8a8f278a6072e29c7147eee7da446b", size = 482777 }, { url = "https://files.pythonhosted.org/packages/5a/5a/586bafa294c5d2451265d3685815606c61e620f469cac3b946fff0a4aa48/regex-2024.11.6-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:ad182d02e40de7459b73155deb8996bbd8e96852267879396fb274e8700190e3", size = 287751 }, { url = "https://files.pythonhosted.org/packages/08/92/9df786fad8a4e0766bfc9a2e334c5f0757356070c9639b2ec776b8cdef3d/regex-2024.11.6-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:ba9b72e5643641b7d41fa1f6d5abda2c9a263ae835b917348fc3c928182ad467", size = 284552 }, { url = 
"https://files.pythonhosted.org/packages/0a/27/0b3cf7d9fbe43301aa3473d54406019a7380abe4e3c9ae250bac13c4fdb3/regex-2024.11.6-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:40291b1b89ca6ad8d3f2b82782cc33807f1406cf68c8d440861da6304d8ffbbd", size = 783587 }, { url = "https://files.pythonhosted.org/packages/89/38/499b32cbb61163af60a5c5ff26aacea7836fe7e3d821e76af216e996088c/regex-2024.11.6-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cdf58d0e516ee426a48f7b2c03a332a4114420716d55769ff7108c37a09951bf", size = 822904 }, { url = "https://files.pythonhosted.org/packages/3f/a4/e3b11c643e5ae1059a08aeef971973f0c803d2a9ae2e7a86f97c68146a6c/regex-2024.11.6-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a36fdf2af13c2b14738f6e973aba563623cb77d753bbbd8d414d18bfaa3105dd", size = 809900 }, { url = "https://files.pythonhosted.org/packages/5a/c8/dc7153ceb5bcc344f5c4f0291ea45925a5f00009afa3849e91561ac2e847/regex-2024.11.6-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d1cee317bfc014c2419a76bcc87f071405e3966da434e03e13beb45f8aced1a6", size = 785105 }, { url = "https://files.pythonhosted.org/packages/2a/29/841489ea52013062b22625fbaf49b0916aeb62bae2e56425ac30f9dead46/regex-2024.11.6-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:50153825ee016b91549962f970d6a4442fa106832e14c918acd1c8e479916c4f", size = 773033 }, { url = "https://files.pythonhosted.org/packages/3e/4e/4a0da5e87f7c2dc73a8505785d5af2b1a19c66f4645b93caa50b7eb08242/regex-2024.11.6-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:ea1bfda2f7162605f6e8178223576856b3d791109f15ea99a9f95c16a7636fb5", size = 702374 }, { url = "https://files.pythonhosted.org/packages/94/6e/444e66346600d11e8a0f4bb31611973cffa772d5033ba1cf1f15de8a0d52/regex-2024.11.6-cp38-cp38-musllinux_1_2_aarch64.whl", hash = 
"sha256:df951c5f4a1b1910f1a99ff42c473ff60f8225baa1cdd3539fe2819d9543e9df", size = 769990 }, { url = "https://files.pythonhosted.org/packages/da/28/95c3ed6cd51b27f54e59940400e2a3ddd3f8bbbc3aaf947e57a67104ecbd/regex-2024.11.6-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:072623554418a9911446278f16ecb398fb3b540147a7828c06e2011fa531e773", size = 775345 }, { url = "https://files.pythonhosted.org/packages/07/5d/0cd19cf44d96a7aa31526611c24235d21d27c23b65201cb2c5cac508dd42/regex-2024.11.6-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:f654882311409afb1d780b940234208a252322c24a93b442ca714d119e68086c", size = 840379 }, { url = "https://files.pythonhosted.org/packages/2a/13/ec3f8d85b789ee1c6ffbdfd4092fd901416716317ee17bf51aa2890bac96/regex-2024.11.6-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:89d75e7293d2b3e674db7d4d9b1bee7f8f3d1609428e293771d1a962617150cc", size = 845842 }, { url = "https://files.pythonhosted.org/packages/50/cb/7170247e65afea2bf9204bcb2682f292b0a3a57d112478da199b84d59792/regex-2024.11.6-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:f65557897fc977a44ab205ea871b690adaef6b9da6afda4790a2484b04293a5f", size = 775026 }, { url = "https://files.pythonhosted.org/packages/cc/06/c817c9201f09b7d9dd033039ba90d8197c91e9fe2984141f2d1de270c159/regex-2024.11.6-cp38-cp38-win32.whl", hash = "sha256:6f44ec28b1f858c98d3036ad5d7d0bfc568bdd7a74f9c24e25f41ef1ebfd81a4", size = 261738 }, { url = "https://files.pythonhosted.org/packages/cf/69/c39e16320400842eb4358c982ef5fc680800866f35ebfd4dd38a22967ce0/regex-2024.11.6-cp38-cp38-win_amd64.whl", hash = "sha256:bb8f74f2f10dbf13a0be8de623ba4f9491faf58c24064f32b65679b021ed0001", size = 274094 }, { url = "https://files.pythonhosted.org/packages/89/23/c4a86df398e57e26f93b13ae63acce58771e04bdde86092502496fa57f9c/regex-2024.11.6-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:5704e174f8ccab2026bd2f1ab6c510345ae8eac818b613d7d73e785f1310f839", size = 482682 }, { url = 
"https://files.pythonhosted.org/packages/3c/8b/45c24ab7a51a1658441b961b86209c43e6bb9d39caf1e63f46ce6ea03bc7/regex-2024.11.6-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:220902c3c5cc6af55d4fe19ead504de80eb91f786dc102fbd74894b1551f095e", size = 287679 }, { url = "https://files.pythonhosted.org/packages/7a/d1/598de10b17fdafc452d11f7dada11c3be4e379a8671393e4e3da3c4070df/regex-2024.11.6-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:5e7e351589da0850c125f1600a4c4ba3c722efefe16b297de54300f08d734fbf", size = 284578 }, { url = "https://files.pythonhosted.org/packages/49/70/c7eaa219efa67a215846766fde18d92d54cb590b6a04ffe43cef30057622/regex-2024.11.6-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5056b185ca113c88e18223183aa1a50e66507769c9640a6ff75859619d73957b", size = 782012 }, { url = "https://files.pythonhosted.org/packages/89/e5/ef52c7eb117dd20ff1697968219971d052138965a4d3d9b95e92e549f505/regex-2024.11.6-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2e34b51b650b23ed3354b5a07aab37034d9f923db2a40519139af34f485f77d0", size = 820580 }, { url = "https://files.pythonhosted.org/packages/5f/3f/9f5da81aff1d4167ac52711acf789df13e789fe6ac9545552e49138e3282/regex-2024.11.6-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5670bce7b200273eee1840ef307bfa07cda90b38ae56e9a6ebcc9f50da9c469b", size = 809110 }, { url = "https://files.pythonhosted.org/packages/86/44/2101cc0890c3621b90365c9ee8d7291a597c0722ad66eccd6ffa7f1bcc09/regex-2024.11.6-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:08986dce1339bc932923e7d1232ce9881499a0e02925f7402fb7c982515419ef", size = 780919 }, { url = "https://files.pythonhosted.org/packages/ce/2e/3e0668d8d1c7c3c0d397bf54d92fc182575b3a26939aed5000d3cc78760f/regex-2024.11.6-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:93c0b12d3d3bc25af4ebbf38f9ee780a487e8bf6954c115b9f015822d3bb8e48", size = 
771515 }, { url = "https://files.pythonhosted.org/packages/a6/49/1bc4584254355e3dba930a3a2fd7ad26ccba3ebbab7d9100db0aff2eedb0/regex-2024.11.6-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:764e71f22ab3b305e7f4c21f1a97e1526a25ebdd22513e251cf376760213da13", size = 696957 }, { url = "https://files.pythonhosted.org/packages/c8/dd/42879c1fc8a37a887cd08e358af3d3ba9e23038cd77c7fe044a86d9450ba/regex-2024.11.6-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:f056bf21105c2515c32372bbc057f43eb02aae2fda61052e2f7622c801f0b4e2", size = 768088 }, { url = "https://files.pythonhosted.org/packages/89/96/c05a0fe173cd2acd29d5e13c1adad8b706bcaa71b169e1ee57dcf2e74584/regex-2024.11.6-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:69ab78f848845569401469da20df3e081e6b5a11cb086de3eed1d48f5ed57c95", size = 774752 }, { url = "https://files.pythonhosted.org/packages/b5/f3/a757748066255f97f14506483436c5f6aded7af9e37bca04ec30c90ca683/regex-2024.11.6-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:86fddba590aad9208e2fa8b43b4c098bb0ec74f15718bb6a704e3c63e2cef3e9", size = 838862 }, { url = "https://files.pythonhosted.org/packages/5c/93/c6d2092fd479dcaeea40fc8fa673822829181ded77d294a7f950f1dda6e2/regex-2024.11.6-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:684d7a212682996d21ca12ef3c17353c021fe9de6049e19ac8481ec35574a70f", size = 842622 }, { url = "https://files.pythonhosted.org/packages/ff/9c/daa99532c72f25051a90ef90e1413a8d54413a9e64614d9095b0c1c154d0/regex-2024.11.6-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:a03e02f48cd1abbd9f3b7e3586d97c8f7a9721c436f51a5245b3b9483044480b", size = 772713 }, { url = "https://files.pythonhosted.org/packages/13/5d/61a533ccb8c231b474ac8e3a7d70155b00dfc61af6cafdccd1947df6d735/regex-2024.11.6-cp39-cp39-win32.whl", hash = "sha256:41758407fc32d5c3c5de163888068cfee69cb4c2be844e7ac517a52770f9af57", size = 261756 }, { url = 
"https://files.pythonhosted.org/packages/dc/7b/e59b7f7c91ae110d154370c24133f947262525b5d6406df65f23422acc17/regex-2024.11.6-cp39-cp39-win_amd64.whl", hash = "sha256:b2837718570f95dd41675328e111345f9b7095d821bac435aac173ac80b19983", size = 274110 },
]

[[package]]
name = "requests"
version = "2.32.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "certifi" },
    { name = "charset-normalizer" },
    { name = "idna" },
    { name = "urllib3" },
]
sdist = { url = "https://files.pythonhosted.org/packages/63/70/2bf7780ad2d390a8d301ad0b550f1581eadbd9a20f896afe06353c2a2913/requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760", size = 131218 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6", size = 64928 },
]

[[package]]
name = "rich"
version = "13.9.4"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "markdown-it-py" },
    { name = "pygments" },
    { name = "typing-extensions", marker = "python_full_version < '3.11'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/ab/3a/0316b28d0761c6734d6bc14e770d85506c986c85ffb239e688eeaab2c2bc/rich-13.9.4.tar.gz", hash = "sha256:439594978a49a09530cff7ebc4b5c7103ef57baf48d5ea3184f21d9a2befa098", size = 223149 }
wheels = [
    { url = "https://files.pythonhosted.org/packages/19/71/39c7c0d87f8d4e6c020a393182060eaefeeae6c01dab6a84ec346f2567df/rich-13.9.4-py3-none-any.whl", hash = "sha256:6049d5e6ec054bf2779ab3358186963bac2ea89175919d699e378b99738c2a90", size = 242424 },
]

[[package]]
name = "rpds-py"
version = "0.20.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/25/cb/8e919951f55d109d658f81c9b49d0cc3b48637c50792c5d2e77032b8c5da/rpds_py-0.20.1.tar.gz", hash =
"sha256:e1791c4aabd117653530dccd24108fa03cc6baf21f58b950d0a73c3b3b29a350", size = 25931 } wheels = [ { url = "https://files.pythonhosted.org/packages/ae/0e/d7e7e9280988a7bc56fd326042baca27f4f55fad27dc8aa64e5e0e894e5d/rpds_py-0.20.1-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:a649dfd735fff086e8a9d0503a9f0c7d01b7912a333c7ae77e1515c08c146dad", size = 327335 }, { url = "https://files.pythonhosted.org/packages/4c/72/027185f213d53ae66765c575229829b202fbacf3d55fe2bd9ff4e29bb157/rpds_py-0.20.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f16bc1334853e91ddaaa1217045dd7be166170beec337576818461268a3de67f", size = 318250 }, { url = "https://files.pythonhosted.org/packages/2b/e7/b4eb3e6ff541c83d3b46f45f855547e412ab60c45bef64520fafb00b9b42/rpds_py-0.20.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:14511a539afee6f9ab492b543060c7491c99924314977a55c98bfa2ee29ce78c", size = 361206 }, { url = "https://files.pythonhosted.org/packages/e7/80/cb9a4b4cad31bcaa37f38dae7a8be861f767eb2ca4f07a146b5ffcfbee09/rpds_py-0.20.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3ccb8ac2d3c71cda472b75af42818981bdacf48d2e21c36331b50b4f16930163", size = 369921 }, { url = "https://files.pythonhosted.org/packages/95/1b/463b11e7039e18f9e778568dbf7338c29bbc1f8996381115201c668eb8c8/rpds_py-0.20.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c142b88039b92e7e0cb2552e8967077e3179b22359e945574f5e2764c3953dcf", size = 403673 }, { url = "https://files.pythonhosted.org/packages/86/98/1ef4028e9d5b76470bf7f8f2459be07ac5c9621270a2a5e093f8d8a8cc2c/rpds_py-0.20.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f19169781dddae7478a32301b499b2858bc52fc45a112955e798ee307e294977", size = 430267 }, { url = "https://files.pythonhosted.org/packages/25/8e/41d7e3e6d3a4a6c94375020477705a3fbb6515717901ab8f94821cf0a0d9/rpds_py-0.20.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:13c56de6518e14b9bf6edde23c4c39dac5b48dcf04160ea7bce8fca8397cdf86", size = 360569 }, { url = "https://files.pythonhosted.org/packages/4f/6a/8839340464d4e1bbfaf0482e9d9165a2309c2c17427e4dcb72ce3e5cc5d6/rpds_py-0.20.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:925d176a549f4832c6f69fa6026071294ab5910e82a0fe6c6228fce17b0706bd", size = 382584 }, { url = "https://files.pythonhosted.org/packages/64/96/7a7f938d3796a6a3ec08ed0e8a5ecd436fbd516a3684ab1fa22d46d6f6cc/rpds_py-0.20.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:78f0b6877bfce7a3d1ff150391354a410c55d3cdce386f862926a4958ad5ab7e", size = 546560 }, { url = "https://files.pythonhosted.org/packages/15/c7/19fb4f1247a3c90a99eca62909bf76ee988f9b663e47878a673d9854ec5c/rpds_py-0.20.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:3dd645e2b0dcb0fd05bf58e2e54c13875847687d0b71941ad2e757e5d89d4356", size = 549359 }, { url = "https://files.pythonhosted.org/packages/d2/4c/445eb597a39a883368ea2f341dd6e48a9d9681b12ebf32f38a827b30529b/rpds_py-0.20.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:4f676e21db2f8c72ff0936f895271e7a700aa1f8d31b40e4e43442ba94973899", size = 527567 }, { url = "https://files.pythonhosted.org/packages/4f/71/4c44643bffbcb37311fc7fe221bcf139c8d660bc78f746dd3a05741372c8/rpds_py-0.20.1-cp310-none-win32.whl", hash = "sha256:648386ddd1e19b4a6abab69139b002bc49ebf065b596119f8f37c38e9ecee8ff", size = 200412 }, { url = "https://files.pythonhosted.org/packages/f4/33/9d0529d74099e090ec9ab15eb0a049c56cca599eaaca71bfedbdbca656a9/rpds_py-0.20.1-cp310-none-win_amd64.whl", hash = "sha256:d9ecb51120de61e4604650666d1f2b68444d46ae18fd492245a08f53ad2b7711", size = 218563 }, { url = "https://files.pythonhosted.org/packages/a0/2e/a6ded84019a05b8f23e0fe6a632f62ae438a8c5e5932d3dfc90c73418414/rpds_py-0.20.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:762703bdd2b30983c1d9e62b4c88664df4a8a4d5ec0e9253b0231171f18f6d75", size = 327194 }, { url = 
"https://files.pythonhosted.org/packages/68/11/d3f84c69de2b2086be3d6bd5e9d172825c096b13842ab7e5f8f39f06035b/rpds_py-0.20.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:0b581f47257a9fce535c4567782a8976002d6b8afa2c39ff616edf87cbeff712", size = 318126 }, { url = "https://files.pythonhosted.org/packages/18/c0/13f1bce9c901511e5e4c0b77a99dbb946bb9a177ca88c6b480e9cb53e304/rpds_py-0.20.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:842c19a6ce894493563c3bd00d81d5100e8e57d70209e84d5491940fdb8b9e3a", size = 361119 }, { url = "https://files.pythonhosted.org/packages/06/31/3bd721575671f22a37476c2d7b9e34bfa5185bdcee09f7fedde3b29f3adb/rpds_py-0.20.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:42cbde7789f5c0bcd6816cb29808e36c01b960fb5d29f11e052215aa85497c93", size = 369532 }, { url = "https://files.pythonhosted.org/packages/20/22/3eeb0385f33251b4fd0f728e6a3801dc8acc05e714eb7867cefe635bf4ab/rpds_py-0.20.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6c8e9340ce5a52f95fa7d3b552b35c7e8f3874d74a03a8a69279fd5fca5dc751", size = 403703 }, { url = "https://files.pythonhosted.org/packages/10/e1/8dde6174e7ac5b9acd3269afca2e17719bc7e5088c68f44874d2ad9e4560/rpds_py-0.20.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8ba6f89cac95c0900d932c9efb7f0fb6ca47f6687feec41abcb1bd5e2bd45535", size = 429868 }, { url = "https://files.pythonhosted.org/packages/19/51/a3cc1a5238acfc2582033e8934d034301f9d4931b9bf7c7ccfabc4ca0880/rpds_py-0.20.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4a916087371afd9648e1962e67403c53f9c49ca47b9680adbeef79da3a7811b0", size = 360539 }, { url = "https://files.pythonhosted.org/packages/cd/8c/3c87471a44bd4114e2b0aec90f298f6caaac4e8db6af904d5dd2279f5c61/rpds_py-0.20.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:200a23239781f46149e6a415f1e870c5ef1e712939fe8fa63035cd053ac2638e", size = 382467 }, { 
url = "https://files.pythonhosted.org/packages/d0/9b/95073fe3e0f130e6d561e106818b6568ef1f2df3352e7f162ab912da837c/rpds_py-0.20.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:58b1d5dd591973d426cbb2da5e27ba0339209832b2f3315928c9790e13f159e8", size = 546669 }, { url = "https://files.pythonhosted.org/packages/de/4c/7ab3669e02bb06fedebcfd64d361b7168ba39dfdf385e4109440f2e7927b/rpds_py-0.20.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:6b73c67850ca7cae0f6c56f71e356d7e9fa25958d3e18a64927c2d930859b8e4", size = 549304 }, { url = "https://files.pythonhosted.org/packages/f1/e8/ad5da336cd42adbdafe0ecd40dcecdae01fd3d703c621c7637615a008d3a/rpds_py-0.20.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:d8761c3c891cc51e90bc9926d6d2f59b27beaf86c74622c8979380a29cc23ac3", size = 527637 }, { url = "https://files.pythonhosted.org/packages/02/f1/1b47b9e5b941c2659c9b7e4ef41b6f07385a6500c638fa10c066e4616ecb/rpds_py-0.20.1-cp311-none-win32.whl", hash = "sha256:cd945871335a639275eee904caef90041568ce3b42f402c6959b460d25ae8732", size = 200488 }, { url = "https://files.pythonhosted.org/packages/85/f6/c751c1adfa31610055acfa1cc667cf2c2d7011a73070679c448cf5856905/rpds_py-0.20.1-cp311-none-win_amd64.whl", hash = "sha256:7e21b7031e17c6b0e445f42ccc77f79a97e2687023c5746bfb7a9e45e0921b84", size = 218475 }, { url = "https://files.pythonhosted.org/packages/e7/10/4e8dcc08b58a548098dbcee67a4888751a25be7a6dde0a83d4300df48bfa/rpds_py-0.20.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:36785be22066966a27348444b40389f8444671630063edfb1a2eb04318721e17", size = 329749 }, { url = "https://files.pythonhosted.org/packages/d2/e4/61144f3790e12fd89e6153d77f7915ad26779735fef8ee9c099cba6dfb4a/rpds_py-0.20.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:142c0a5124d9bd0e2976089484af5c74f47bd3298f2ed651ef54ea728d2ea42c", size = 321032 }, { url = 
"https://files.pythonhosted.org/packages/fa/e0/99205aabbf3be29ef6c58ef9b08feed51ba6532fdd47461245cb58dd9897/rpds_py-0.20.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dbddc10776ca7ebf2a299c41a4dde8ea0d8e3547bfd731cb87af2e8f5bf8962d", size = 363931 }, { url = "https://files.pythonhosted.org/packages/ac/bd/bce2dddb518b13a7e77eed4be234c9af0c9c6d403d01c5e6ae8eb447ab62/rpds_py-0.20.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:15a842bb369e00295392e7ce192de9dcbf136954614124a667f9f9f17d6a216f", size = 373343 }, { url = "https://files.pythonhosted.org/packages/43/15/112b7c553066cb91264691ba7fb119579c440a0ae889da222fa6fc0d411a/rpds_py-0.20.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:be5ef2f1fc586a7372bfc355986226484e06d1dc4f9402539872c8bb99e34b01", size = 406304 }, { url = "https://files.pythonhosted.org/packages/af/8d/2da52aef8ae5494a382b0c0025ba5b68f2952db0f2a4c7534580e8ca83cc/rpds_py-0.20.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dbcf360c9e3399b056a238523146ea77eeb2a596ce263b8814c900263e46031a", size = 423022 }, { url = "https://files.pythonhosted.org/packages/c8/1b/f23015cb293927c93bdb4b94a48bfe77ad9d57359c75db51f0ff0cf482ff/rpds_py-0.20.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ecd27a66740ffd621d20b9a2f2b5ee4129a56e27bfb9458a3bcc2e45794c96cb", size = 364937 }, { url = "https://files.pythonhosted.org/packages/7b/8b/6da8636b2ea2e2f709e56656e663b6a71ecd9a9f9d9dc21488aade122026/rpds_py-0.20.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d0b937b2a1988f184a3e9e577adaa8aede21ec0b38320d6009e02bd026db04fa", size = 386301 }, { url = "https://files.pythonhosted.org/packages/20/af/2ae192797bffd0d6d558145b5a36e7245346ff3e44f6ddcb82f0eb8512d4/rpds_py-0.20.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:6889469bfdc1eddf489729b471303739bf04555bb151fe8875931f8564309afc", size = 549452 
}, { url = "https://files.pythonhosted.org/packages/07/dd/9f6520712a5108cd7d407c9db44a3d59011b385c58e320d58ebf67757a9e/rpds_py-0.20.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:19b73643c802f4eaf13d97f7855d0fb527fbc92ab7013c4ad0e13a6ae0ed23bd", size = 554370 }, { url = "https://files.pythonhosted.org/packages/5e/0e/b1bdc7ea0db0946d640ab8965146099093391bb5d265832994c47461e3c5/rpds_py-0.20.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3c6afcf2338e7f374e8edc765c79fbcb4061d02b15dd5f8f314a4af2bdc7feb5", size = 530940 }, { url = "https://files.pythonhosted.org/packages/ae/d3/ffe907084299484fab60a7955f7c0e8a295c04249090218c59437010f9f4/rpds_py-0.20.1-cp312-none-win32.whl", hash = "sha256:dc73505153798c6f74854aba69cc75953888cf9866465196889c7cdd351e720c", size = 203164 }, { url = "https://files.pythonhosted.org/packages/1f/ba/9cbb57423c4bfbd81c473913bebaed151ad4158ee2590a4e4b3e70238b48/rpds_py-0.20.1-cp312-none-win_amd64.whl", hash = "sha256:8bbe951244a838a51289ee53a6bae3a07f26d4e179b96fc7ddd3301caf0518eb", size = 220750 }, { url = "https://files.pythonhosted.org/packages/b5/01/fee2e1d1274c92fff04aa47d805a28d62c2aa971d1f49f5baea1c6e670d9/rpds_py-0.20.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:6ca91093a4a8da4afae7fe6a222c3b53ee4eef433ebfee4d54978a103435159e", size = 329359 }, { url = "https://files.pythonhosted.org/packages/b0/cf/4aeffb02b7090029d7aeecbffb9a10e1c80f6f56d7e9a30e15481dc4099c/rpds_py-0.20.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:b9c2fe36d1f758b28121bef29ed1dee9b7a2453e997528e7d1ac99b94892527c", size = 320543 }, { url = "https://files.pythonhosted.org/packages/17/69/85cf3429e9ccda684ba63ff36b5866d5f9451e921cc99819341e19880334/rpds_py-0.20.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f009c69bc8c53db5dfab72ac760895dc1f2bc1b62ab7408b253c8d1ec52459fc", size = 363107 }, { url = 
"https://files.pythonhosted.org/packages/ef/de/7df88dea9c3eeb832196d23b41f0f6fc5f9a2ee9b2080bbb1db8731ead9c/rpds_py-0.20.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6740a3e8d43a32629bb9b009017ea5b9e713b7210ba48ac8d4cb6d99d86c8ee8", size = 372027 }, { url = "https://files.pythonhosted.org/packages/d1/b8/88675399d2038580743c570a809c43a900e7090edc6553f8ffb66b23c965/rpds_py-0.20.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:32b922e13d4c0080d03e7b62991ad7f5007d9cd74e239c4b16bc85ae8b70252d", size = 405031 }, { url = "https://files.pythonhosted.org/packages/e1/aa/cca639f6d17caf00bab51bdc70fcc0bdda3063e5662665c4fdf60443c474/rpds_py-0.20.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:fe00a9057d100e69b4ae4a094203a708d65b0f345ed546fdef86498bf5390982", size = 422271 }, { url = "https://files.pythonhosted.org/packages/c4/07/bf8a949d2ec4626c285579c9d6b356c692325f1a4126e947736b416e1fc4/rpds_py-0.20.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:49fe9b04b6fa685bd39237d45fad89ba19e9163a1ccaa16611a812e682913496", size = 363625 }, { url = "https://files.pythonhosted.org/packages/11/f0/06675c6a58d6ce34547879138810eb9aab0c10e5607ea6c2e4dc56b703c8/rpds_py-0.20.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:aa7ac11e294304e615b43f8c441fee5d40094275ed7311f3420d805fde9b07b4", size = 385906 }, { url = "https://files.pythonhosted.org/packages/bf/ac/2d1f50374eb8e41030fad4e87f81751e1c39e3b5d4bee8c5618830d8a6ac/rpds_py-0.20.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:6aa97af1558a9bef4025f8f5d8c60d712e0a3b13a2fe875511defc6ee77a1ab7", size = 549021 }, { url = "https://files.pythonhosted.org/packages/f7/d4/a7d70a7cc71df772eeadf4bce05e32e780a9fe44a511a5b091c7a85cb767/rpds_py-0.20.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:483b29f6f7ffa6af845107d4efe2e3fa8fb2693de8657bc1849f674296ff6a5a", size = 553800 }, { url = 
"https://files.pythonhosted.org/packages/87/81/dc30bc449ccba63ad23a0f6633486d4e0e6955f45f3715a130dacabd6ad0/rpds_py-0.20.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:37fe0f12aebb6a0e3e17bb4cd356b1286d2d18d2e93b2d39fe647138458b4bcb", size = 531076 }, { url = "https://files.pythonhosted.org/packages/50/80/fb62ab48f3b5cfe704ead6ad372da1922ddaa76397055e02eb507054c979/rpds_py-0.20.1-cp313-none-win32.whl", hash = "sha256:a624cc00ef2158e04188df5e3016385b9353638139a06fb77057b3498f794782", size = 202804 }, { url = "https://files.pythonhosted.org/packages/d9/30/a3391e76d0b3313f33bdedd394a519decae3a953d2943e3dabf80ae32447/rpds_py-0.20.1-cp313-none-win_amd64.whl", hash = "sha256:b71b8666eeea69d6363248822078c075bac6ed135faa9216aa85f295ff009b1e", size = 220502 }, { url = "https://files.pythonhosted.org/packages/53/ef/b1883734ea0cd9996de793cdc38c32a28143b04911d1e570090acd8a9162/rpds_py-0.20.1-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:5b48e790e0355865197ad0aca8cde3d8ede347831e1959e158369eb3493d2191", size = 327757 }, { url = "https://files.pythonhosted.org/packages/54/63/47d34dc4ddb3da73e78e10c9009dcf8edc42d355a221351c05c822c2a50b/rpds_py-0.20.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:3e310838a5801795207c66c73ea903deda321e6146d6f282e85fa7e3e4854804", size = 318785 }, { url = "https://files.pythonhosted.org/packages/f7/e1/d6323be4afbe3013f28725553b7bfa80b3f013f91678af258f579f8ea8f9/rpds_py-0.20.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2249280b870e6a42c0d972339e9cc22ee98730a99cd7f2f727549af80dd5a963", size = 361511 }, { url = "https://files.pythonhosted.org/packages/ab/d3/c40e4d9ecd571f0f50fe69bc53fe608d7b2c49b30738b480044990260838/rpds_py-0.20.1-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e79059d67bea28b53d255c1437b25391653263f0e69cd7dec170d778fdbca95e", size = 370201 }, { url = 
"https://files.pythonhosted.org/packages/f1/b6/96a4a9977a8a06c2c49d90aa571346aff1642abf15066a39a0b4817bf049/rpds_py-0.20.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2b431c777c9653e569986ecf69ff4a5dba281cded16043d348bf9ba505486f36", size = 403866 }, { url = "https://files.pythonhosted.org/packages/cd/8f/702b52287949314b498a311f92b5ee0ba30c702a27e0e6b560e2da43b8d5/rpds_py-0.20.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:da584ff96ec95e97925174eb8237e32f626e7a1a97888cdd27ee2f1f24dd0ad8", size = 430163 }, { url = "https://files.pythonhosted.org/packages/c4/ce/af016c81fda833bf125b20d1677d816f230cad2ab189f46bcbfea3c7a375/rpds_py-0.20.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:02a0629ec053fc013808a85178524e3cb63a61dbc35b22499870194a63578fb9", size = 360776 }, { url = "https://files.pythonhosted.org/packages/08/a7/988e179c9bef55821abe41762228d65077e0570ca75c9efbcd1bc6e263b4/rpds_py-0.20.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fbf15aff64a163db29a91ed0868af181d6f68ec1a3a7d5afcfe4501252840bad", size = 383008 }, { url = "https://files.pythonhosted.org/packages/96/b0/e4077f7f1b9622112ae83254aedfb691490278793299bc06dcf54ec8c8e4/rpds_py-0.20.1-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:07924c1b938798797d60c6308fa8ad3b3f0201802f82e4a2c41bb3fafb44cc28", size = 546371 }, { url = "https://files.pythonhosted.org/packages/e4/5e/1d4dd08ec0352cfe516ea93ea1993c2f656f893c87dafcd9312bd07f65f7/rpds_py-0.20.1-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:4a5a844f68776a7715ecb30843b453f07ac89bad393431efbf7accca3ef599c1", size = 549809 }, { url = "https://files.pythonhosted.org/packages/57/ac/a716b4729ff23ec034b7d2ff76a86e6f0753c4098401bdfdf55b2efe90e6/rpds_py-0.20.1-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:518d2ca43c358929bf08f9079b617f1c2ca6e8848f83c1225c88caeac46e6cbc", size = 528492 }, { url = 
"https://files.pythonhosted.org/packages/e0/ed/a0b58a9ecef79918169eacdabd14eb4c5c86ce71184ed56b80c6eb425828/rpds_py-0.20.1-cp38-none-win32.whl", hash = "sha256:3aea7eed3e55119635a74bbeb80b35e776bafccb70d97e8ff838816c124539f1", size = 200512 }, { url = "https://files.pythonhosted.org/packages/5f/c3/222e25124283afc76c473fcd2c547e82ec57683fa31cb4d6c6eb44e5d57a/rpds_py-0.20.1-cp38-none-win_amd64.whl", hash = "sha256:7dca7081e9a0c3b6490a145593f6fe3173a94197f2cb9891183ef75e9d64c425", size = 218627 }, { url = "https://files.pythonhosted.org/packages/d6/87/e7e0fcbfdc0d0e261534bcc885f6ae6253095b972e32f8b8b1278c78a2a9/rpds_py-0.20.1-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:b41b6321805c472f66990c2849e152aff7bc359eb92f781e3f606609eac877ad", size = 327867 }, { url = "https://files.pythonhosted.org/packages/93/a0/17836b7961fc82586e9b818abdee2a27e2e605a602bb8c0d43f02092f8c2/rpds_py-0.20.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0a90c373ea2975519b58dece25853dbcb9779b05cc46b4819cb1917e3b3215b6", size = 318893 }, { url = "https://files.pythonhosted.org/packages/dc/03/deb81d8ea3a8b974e7b03cfe8c8c26616ef8f4980dd430d8dd0a2f1b4d8e/rpds_py-0.20.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:16d4477bcb9fbbd7b5b0e4a5d9b493e42026c0bf1f06f723a9353f5153e75d30", size = 361664 }, { url = "https://files.pythonhosted.org/packages/16/49/d9938603731745c7b6babff97ca61ff3eb4619e7128b5ab0111ad4e91d6d/rpds_py-0.20.1-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:84b8382a90539910b53a6307f7c35697bc7e6ffb25d9c1d4e998a13e842a5e83", size = 369796 }, { url = "https://files.pythonhosted.org/packages/87/d2/480b36c69cdc373853401b6aab6a281cf60f6d72b1545d82c0d23d9dd77c/rpds_py-0.20.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4888e117dd41b9d34194d9e31631af70d3d526efc363085e3089ab1a62c32ed1", size = 403860 }, { url = 
"https://files.pythonhosted.org/packages/31/7c/f6d909cb57761293308dbef14f1663d84376f2e231892a10aafc57b42037/rpds_py-0.20.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5265505b3d61a0f56618c9b941dc54dc334dc6e660f1592d112cd103d914a6db", size = 430793 }, { url = "https://files.pythonhosted.org/packages/d4/62/c9bd294c4b5f84d9cc2c387b548ae53096ad7e71ac5b02b6310e9dc85aa4/rpds_py-0.20.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e75ba609dba23f2c95b776efb9dd3f0b78a76a151e96f96cc5b6b1b0004de66f", size = 360927 }, { url = "https://files.pythonhosted.org/packages/c1/a7/15d927d83a44da8307a432b1cac06284b6657706d099a98cc99fec34ad51/rpds_py-0.20.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:1791ff70bc975b098fe6ecf04356a10e9e2bd7dc21fa7351c1742fdeb9b4966f", size = 382660 }, { url = "https://files.pythonhosted.org/packages/4c/28/0630719c18456238bb07d59c4302fed50a13aa8035ec23dbfa80d116f9bc/rpds_py-0.20.1-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:d126b52e4a473d40232ec2052a8b232270ed1f8c9571aaf33f73a14cc298c24f", size = 546888 }, { url = "https://files.pythonhosted.org/packages/b9/75/3c9bda11b9c15d680b315f898af23825159314d4b56568f24b53ace8afcd/rpds_py-0.20.1-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:c14937af98c4cc362a1d4374806204dd51b1e12dded1ae30645c298e5a5c4cb1", size = 550088 }, { url = "https://files.pythonhosted.org/packages/70/f1/8fe7d04c194218171220a412057429defa9e2da785de0777c4d39309337e/rpds_py-0.20.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:3d089d0b88996df627693639d123c8158cff41c0651f646cd8fd292c7da90eaf", size = 528270 }, { url = "https://files.pythonhosted.org/packages/d6/62/41b0020f4b00af042b008e679dbe25a2f5bce655139a81f8b812f9068e52/rpds_py-0.20.1-cp39-none-win32.whl", hash = "sha256:653647b8838cf83b2e7e6a0364f49af96deec64d2a6578324db58380cff82aca", size = 200658 }, { url = 
"https://files.pythonhosted.org/packages/05/01/e64bb8889f2dcc951e53de33d8b8070456397ae4e10edc35e6cb9908f5c8/rpds_py-0.20.1-cp39-none-win_amd64.whl", hash = "sha256:fa41a64ac5b08b292906e248549ab48b69c5428f3987b09689ab2441f267d04d", size = 218883 }, { url = "https://files.pythonhosted.org/packages/b6/fa/7959429e69569d0f6e7d27f80451402da0409349dd2b07f6bcbdd5fad2d3/rpds_py-0.20.1-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:7a07ced2b22f0cf0b55a6a510078174c31b6d8544f3bc00c2bcee52b3d613f74", size = 328209 }, { url = "https://files.pythonhosted.org/packages/25/97/5dfdb091c30267ff404d2fd9e70c7a6d6ffc65ca77fffe9456e13b719066/rpds_py-0.20.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:68cb0a499f2c4a088fd2f521453e22ed3527154136a855c62e148b7883b99f9a", size = 319499 }, { url = "https://files.pythonhosted.org/packages/7c/98/cf2608722400f5f9bb4c82aa5ac09026f3ac2ebea9d4059d3533589ed0b6/rpds_py-0.20.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fa3060d885657abc549b2a0f8e1b79699290e5d83845141717c6c90c2df38311", size = 361795 }, { url = "https://files.pythonhosted.org/packages/89/de/0e13dd43c785c60e63933e96fbddda0b019df6862f4d3019bb49c3861131/rpds_py-0.20.1-pp310-pypy310_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:95f3b65d2392e1c5cec27cff08fdc0080270d5a1a4b2ea1d51d5f4a2620ff08d", size = 370604 }, { url = "https://files.pythonhosted.org/packages/8a/fc/fe3c83c77f82b8059eeec4e998064913d66212b69b3653df48f58ad33d3d/rpds_py-0.20.1-pp310-pypy310_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2cc3712a4b0b76a1d45a9302dd2f53ff339614b1c29603a911318f2357b04dd2", size = 404177 }, { url = "https://files.pythonhosted.org/packages/94/30/5189518bfb80a41f664daf32b46645c7fbdcc89028a0f1bfa82e806e0fbb/rpds_py-0.20.1-pp310-pypy310_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5d4eea0761e37485c9b81400437adb11c40e13ef513375bbd6973e34100aeb06", size = 430108 }, { url = 
"https://files.pythonhosted.org/packages/67/0e/6f069feaff5c298375cd8c55e00ecd9bd79c792ce0893d39448dc0097857/rpds_py-0.20.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7f5179583d7a6cdb981151dd349786cbc318bab54963a192692d945dd3f6435d", size = 361184 }, { url = "https://files.pythonhosted.org/packages/27/9f/ce3e2ae36f392c3ef1988c06e9e0b4c74f64267dad7c223003c34da11adb/rpds_py-0.20.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2fbb0ffc754490aff6dabbf28064be47f0f9ca0b9755976f945214965b3ace7e", size = 384140 }, { url = "https://files.pythonhosted.org/packages/5f/d5/89d44504d0bc7a1135062cb520a17903ff002f458371b8d9160af3b71e52/rpds_py-0.20.1-pp310-pypy310_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:a94e52537a0e0a85429eda9e49f272ada715506d3b2431f64b8a3e34eb5f3e75", size = 546589 }, { url = "https://files.pythonhosted.org/packages/8f/8f/e1c2db4fcca3947d9a28ec9553700b4dc8038f0eff575f579e75885b0661/rpds_py-0.20.1-pp310-pypy310_pp73-musllinux_1_2_i686.whl", hash = "sha256:92b68b79c0da2a980b1c4197e56ac3dd0c8a149b4603747c4378914a68706979", size = 550059 }, { url = "https://files.pythonhosted.org/packages/67/29/00a9e986df36721b5def82fff60995c1ee8827a7d909a6ec8929fb4cc668/rpds_py-0.20.1-pp310-pypy310_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:93da1d3db08a827eda74356f9f58884adb254e59b6664f64cc04cdff2cc19b0d", size = 529131 }, { url = "https://files.pythonhosted.org/packages/a3/32/95364440560ec476b19c6a2704259e710c223bf767632ebaa72cc2a1760f/rpds_py-0.20.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:754bbed1a4ca48479e9d4182a561d001bbf81543876cdded6f695ec3d465846b", size = 219677 }, { url = "https://files.pythonhosted.org/packages/ed/bf/ad8492e972c90a3d48a38e2b5095c51a8399d5b57e83f2d5d1649490f72b/rpds_py-0.20.1-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:ca449520e7484534a2a44faf629362cae62b660601432d04c482283c47eaebab", size = 328046 }, { url = 
"https://files.pythonhosted.org/packages/75/fd/84f42386165d6d555acb76c6d39c90b10c9dcf25116daf4f48a0a9d6867a/rpds_py-0.20.1-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:9c4cb04a16b0f199a8c9bf807269b2f63b7b5b11425e4a6bd44bd6961d28282c", size = 319306 }, { url = "https://files.pythonhosted.org/packages/6c/8a/abcd5119a0573f9588ad71a3fde3c07ddd1d1393cfee15a6ba7495c256f1/rpds_py-0.20.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bb63804105143c7e24cee7db89e37cb3f3941f8e80c4379a0b355c52a52b6780", size = 362558 }, { url = "https://files.pythonhosted.org/packages/9d/65/1c2bb10afd4bd32800227a658ae9097bc1d08a4e5048a57a9bd2efdf6306/rpds_py-0.20.1-pp39-pypy39_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:55cd1fa4ecfa6d9f14fbd97ac24803e6f73e897c738f771a9fe038f2f11ff07c", size = 370811 }, { url = "https://files.pythonhosted.org/packages/6c/ee/f4bab2b9e51ced30351cfd210647885391463ae682028c79760e7db28e4e/rpds_py-0.20.1-pp39-pypy39_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0f8f741b6292c86059ed175d80eefa80997125b7c478fb8769fd9ac8943a16c0", size = 404660 }, { url = "https://files.pythonhosted.org/packages/48/0f/9d04d0939682f0c97be827fc51ff986555ffb573e6781bd5132441f0ce25/rpds_py-0.20.1-pp39-pypy39_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fc212779bf8411667234b3cdd34d53de6c2b8b8b958e1e12cb473a5f367c338", size = 430490 }, { url = "https://files.pythonhosted.org/packages/0d/f2/e9b90fd8416d59941b6a12f2c2e1d898b63fd092f5a7a6f98236cb865764/rpds_py-0.20.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0ad56edabcdb428c2e33bbf24f255fe2b43253b7d13a2cdbf05de955217313e6", size = 361448 }, { url = "https://files.pythonhosted.org/packages/0b/83/1cc776dce7bedb17d6f4ea62eafccee8a57a4678f4fac414ab69fb9b6b0b/rpds_py-0.20.1-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = 
"sha256:0a3a1e9ee9728b2c1734f65d6a1d376c6f2f6fdcc13bb007a08cc4b1ff576dc5", size = 383681 }, { url = "https://files.pythonhosted.org/packages/17/5c/e0cdd6b0a8373fdef3667af2778dd9ff3abf1bbb9c7bd92c603c91440eb0/rpds_py-0.20.1-pp39-pypy39_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:e13de156137b7095442b288e72f33503a469aa1980ed856b43c353ac86390519", size = 546203 }, { url = "https://files.pythonhosted.org/packages/1b/a8/81fc9cbc01e7ef6d10652aedc1de4e8473934773e2808ba49094e03575df/rpds_py-0.20.1-pp39-pypy39_pp73-musllinux_1_2_i686.whl", hash = "sha256:07f59760ef99f31422c49038964b31c4dfcfeb5d2384ebfc71058a7c9adae2d2", size = 549855 }, { url = "https://files.pythonhosted.org/packages/b3/87/99648693d3c1bbce088119bc61ecaab62e5f9c713894edc604ffeca5ae88/rpds_py-0.20.1-pp39-pypy39_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:59240685e7da61fb78f65a9f07f8108e36a83317c53f7b276b4175dc44151684", size = 528625 }, { url = "https://files.pythonhosted.org/packages/05/c3/10c68a08849f1fa45d205e54141fa75d316013e3d701ef01770ee1220bb8/rpds_py-0.20.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:83cba698cfb3c2c5a7c3c6bac12fe6c6a51aae69513726be6411076185a8b24a", size = 219991 }, ] [[package]] name = "ruff" version = "0.7.3" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/4b/06/09d1276df977eece383d0ed66052fc24ec4550a61f8fbc0a11200e690496/ruff-0.7.3.tar.gz", hash = "sha256:e1d1ba2e40b6e71a61b063354d04be669ab0d39c352461f3d789cac68b54a313", size = 3243664 } wheels = [ { url = "https://files.pythonhosted.org/packages/c0/56/933d433c2489e4642487b835f53dd9ff015fb3d8fa459b09bb2ce42d7c4b/ruff-0.7.3-py3-none-linux_armv6l.whl", hash = "sha256:34f2339dc22687ec7e7002792d1f50712bf84a13d5152e75712ac08be565d344", size = 10372090 }, { url = "https://files.pythonhosted.org/packages/20/ea/1f0a22a6bcdd3fc26c73f63a025d05bd565901b729d56bcb093c722a6c4c/ruff-0.7.3-py3-none-macosx_10_12_x86_64.whl", hash = 
"sha256:fb397332a1879b9764a3455a0bb1087bda876c2db8aca3a3cbb67b3dbce8cda0", size = 10190037 }, { url = "https://files.pythonhosted.org/packages/16/74/aca75666e0d481fe394e76a8647c44ea919087748024924baa1a17371e3e/ruff-0.7.3-py3-none-macosx_11_0_arm64.whl", hash = "sha256:37d0b619546103274e7f62643d14e1adcbccb242efda4e4bdb9544d7764782e9", size = 9811998 }, { url = "https://files.pythonhosted.org/packages/20/a1/cf446a0d7f78ea1f0bd2b9171c11dfe746585c0c4a734b25966121eb4f5d/ruff-0.7.3-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d59f0c3ee4d1a6787614e7135b72e21024875266101142a09a61439cb6e38a5", size = 10620626 }, { url = "https://files.pythonhosted.org/packages/cd/c1/82b27d09286ae855f5d03b1ad37cf243f21eb0081732d4d7b0d658d439cb/ruff-0.7.3-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:44eb93c2499a169d49fafd07bc62ac89b1bc800b197e50ff4633aed212569299", size = 10177598 }, { url = "https://files.pythonhosted.org/packages/b9/42/c0acac22753bf74013d035a5ef6c5c4c40ad4d6686bfb3fda7c6f37d9b37/ruff-0.7.3-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6d0242ce53f3a576c35ee32d907475a8d569944c0407f91d207c8af5be5dae4e", size = 11171963 }, { url = "https://files.pythonhosted.org/packages/43/18/bb0befb7fb9121dd9009e6a72eb98e24f1bacb07c6f3ecb55f032ba98aed/ruff-0.7.3-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:6b6224af8b5e09772c2ecb8dc9f3f344c1aa48201c7f07e7315367f6dd90ac29", size = 11856157 }, { url = "https://files.pythonhosted.org/packages/5e/91/04e98d7d6e32eca9d1372be595f9abc7b7f048795e32eb2edbd8794d50bd/ruff-0.7.3-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c50f95a82b94421c964fae4c27c0242890a20fe67d203d127e84fbb8013855f5", size = 11440331 }, { url = "https://files.pythonhosted.org/packages/f5/dc/3fe99f2ce10b76d389041a1b9f99e7066332e479435d4bebcceea16caff5/ruff-0.7.3-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:7f3eff9961b5d2644bcf1616c606e93baa2d6b349e8aa8b035f654df252c8c67", size = 12725354 }, { url = "https://files.pythonhosted.org/packages/43/7b/1daa712de1c5bc6cbbf9fa60e9c41cc48cda962dc6d2c4f2a224d2c3007e/ruff-0.7.3-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b8963cab06d130c4df2fd52c84e9f10d297826d2e8169ae0c798b6221be1d1d2", size = 11010091 }, { url = "https://files.pythonhosted.org/packages/b6/db/1227a903587432eb569e57a95b15a4f191a71fe315cde4c0312df7bc85da/ruff-0.7.3-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:61b46049d6edc0e4317fb14b33bd693245281a3007288b68a3f5b74a22a0746d", size = 10610687 }, { url = "https://files.pythonhosted.org/packages/db/e2/dc41ee90c3085aadad4da614d310d834f641aaafddf3dfbba08210c616ce/ruff-0.7.3-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:10ebce7696afe4644e8c1a23b3cf8c0f2193a310c18387c06e583ae9ef284de2", size = 10254843 }, { url = "https://files.pythonhosted.org/packages/6f/09/5f6cac1c91542bc5bd33d40b4c13b637bf64d7bb29e091dadb01b62527fe/ruff-0.7.3-py3-none-musllinux_1_2_i686.whl", hash = "sha256:3f36d56326b3aef8eeee150b700e519880d1aab92f471eefdef656fd57492aa2", size = 10730962 }, { url = "https://files.pythonhosted.org/packages/d3/42/89a4b9a24ef7d00269e24086c417a006f9a3ffeac2c80f2629eb5ce140ee/ruff-0.7.3-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:5d024301109a0007b78d57ab0ba190087b43dce852e552734ebf0b0b85e4fb16", size = 11101907 }, { url = "https://files.pythonhosted.org/packages/b0/5c/efdb4777686683a8edce94ffd812783bddcd3d2454d38c5ac193fef7c500/ruff-0.7.3-py3-none-win32.whl", hash = "sha256:4ba81a5f0c5478aa61674c5a2194de8b02652f17addf8dfc40c8937e6e7d79fc", size = 8611095 }, { url = "https://files.pythonhosted.org/packages/bb/b8/28fbc6a4efa50178f973972d1c84b2d0a33cdc731588522ab751ac3da2f5/ruff-0.7.3-py3-none-win_amd64.whl", hash = "sha256:588a9ff2fecf01025ed065fe28809cd5a53b43505f48b69a1ac7707b1b7e4088", size = 9418283 }, { url = 
"https://files.pythonhosted.org/packages/3f/77/b587cba6febd5e2003374f37eb89633f79f161e71084f94057c8653b7fb3/ruff-0.7.3-py3-none-win_arm64.whl", hash = "sha256:1713e2c5545863cdbfe2cbce21f69ffaf37b813bfd1fb3b90dc9a6f1963f5a8c", size = 8725228 }, ] [[package]] name = "setuptools" version = "75.4.0" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/e2/73/c1ccf3e057ef6331cc6861412905dc218203bde46dfe8262c1631aa7fb11/setuptools-75.4.0.tar.gz", hash = "sha256:1dc484f5cf56fd3fe7216d7b8df820802e7246cfb534a1db2aa64f14fcb9cdcb", size = 1336593 } wheels = [ { url = "https://files.pythonhosted.org/packages/21/df/7c6bb83dcb45b35dc35b310d752f254211cde0bcd2a35290ea6e2862b2a9/setuptools-75.4.0-py3-none-any.whl", hash = "sha256:b3c5d862f98500b06ffdf7cc4499b48c46c317d8d56cb30b5c8bce4d88f5c216", size = 1223131 }, ] [[package]] name = "six" version = "1.16.0" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/71/39/171f1c67cd00715f190ba0b100d606d440a28c93c7714febeca8b79af85e/six-1.16.0.tar.gz", hash = "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926", size = 34041 } wheels = [ { url = "https://files.pythonhosted.org/packages/d9/5a/e7c31adbe875f2abbb91bd84cf2dc52d792b5a01506781dbcf25c91daf11/six-1.16.0-py2.py3-none-any.whl", hash = "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254", size = 11053 }, ] [[package]] name = "sqlalchemy" version = "2.0.36" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "greenlet", marker = "(python_full_version < '3.13' and platform_machine == 'AMD64') or (python_full_version < '3.13' and platform_machine == 'WIN32') or (python_full_version < '3.13' and platform_machine == 'aarch64') or (python_full_version < '3.13' and platform_machine == 'amd64') or (python_full_version < '3.13' and platform_machine == 'ppc64le') or (python_full_version < '3.13' and platform_machine == 
'win32') or (python_full_version < '3.13' and platform_machine == 'x86_64')" }, { name = "typing-extensions" }, ] sdist = { url = "https://files.pythonhosted.org/packages/50/65/9cbc9c4c3287bed2499e05033e207473504dc4df999ce49385fb1f8b058a/sqlalchemy-2.0.36.tar.gz", hash = "sha256:7f2767680b6d2398aea7082e45a774b2b0767b5c8d8ffb9c8b683088ea9b29c5", size = 9574485 } wheels = [ { url = "https://files.pythonhosted.org/packages/db/72/14ab694b8b3f0e35ef5beb74a8fea2811aa791ba1611c44dc90cdf46af17/SQLAlchemy-2.0.36-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:59b8f3adb3971929a3e660337f5dacc5942c2cdb760afcabb2614ffbda9f9f72", size = 2092604 }, { url = "https://files.pythonhosted.org/packages/1e/59/333fcbca58b79f5b8b61853d6137530198823392151fa8fd9425f367519e/SQLAlchemy-2.0.36-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:37350015056a553e442ff672c2d20e6f4b6d0b2495691fa239d8aa18bb3bc908", size = 2083796 }, { url = "https://files.pythonhosted.org/packages/6c/a0/ec3c188d2b0c1bc742262e76408d44104598d7247c23f5b06bb97ee21bfa/SQLAlchemy-2.0.36-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8318f4776c85abc3f40ab185e388bee7a6ea99e7fa3a30686580b209eaa35c08", size = 3066165 }, { url = "https://files.pythonhosted.org/packages/07/15/68ef91de5b8b7f80fb2d2b3b31ed42180c6227fe0a701aed9d01d34f98ec/SQLAlchemy-2.0.36-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c245b1fbade9c35e5bd3b64270ab49ce990369018289ecfde3f9c318411aaa07", size = 3074428 }, { url = "https://files.pythonhosted.org/packages/e2/4c/9dfea5e63b87325eef6d9cdaac913459aa6a157a05a05ea6ff20004aee8e/SQLAlchemy-2.0.36-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:69f93723edbca7342624d09f6704e7126b152eaed3cdbb634cb657a54332a3c5", size = 3030477 }, { url = "https://files.pythonhosted.org/packages/16/a5/fcfde8e74ea5f683b24add22463bfc21e431d4a5531c8a5b55bc6fbea164/SQLAlchemy-2.0.36-cp310-cp310-musllinux_1_2_x86_64.whl", hash = 
"sha256:f9511d8dd4a6e9271d07d150fb2f81874a3c8c95e11ff9af3a2dfc35fe42ee44", size = 3055942 }, { url = "https://files.pythonhosted.org/packages/3c/ee/c22c415a771d791ae99146d72ffdb20e43625acd24835ea7fc157436d59f/SQLAlchemy-2.0.36-cp310-cp310-win32.whl", hash = "sha256:c3f3631693003d8e585d4200730616b78fafd5a01ef8b698f6967da5c605b3fa", size = 2064960 }, { url = "https://files.pythonhosted.org/packages/aa/af/ad9c25cadc79bd851bdb9d82b68af9bdb91ff05f56d0da2f8a654825974f/SQLAlchemy-2.0.36-cp310-cp310-win_amd64.whl", hash = "sha256:a86bfab2ef46d63300c0f06936bd6e6c0105faa11d509083ba8f2f9d237fb5b5", size = 2089078 }, { url = "https://files.pythonhosted.org/packages/00/4e/5a67963fd7cbc1beb8bd2152e907419f4c940ef04600b10151a751fe9e06/SQLAlchemy-2.0.36-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:fd3a55deef00f689ce931d4d1b23fa9f04c880a48ee97af488fd215cf24e2a6c", size = 2093782 }, { url = "https://files.pythonhosted.org/packages/b3/24/30e33b6389ebb5a17df2a4243b091bc709fb3dfc9a48c8d72f8e037c943d/SQLAlchemy-2.0.36-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:4f5e9cd989b45b73bd359f693b935364f7e1f79486e29015813c338450aa5a71", size = 2084180 }, { url = "https://files.pythonhosted.org/packages/10/1e/70e9ed2143a27065246be40f78637ad5160ea0f5fd32f8cab819a31ff54d/SQLAlchemy-2.0.36-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d0ddd9db6e59c44875211bc4c7953a9f6638b937b0a88ae6d09eb46cced54eff", size = 3202469 }, { url = "https://files.pythonhosted.org/packages/b4/5f/95e0ed74093ac3c0db6acfa944d4d8ac6284ef5e1136b878a327ea1f975a/SQLAlchemy-2.0.36-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2519f3a5d0517fc159afab1015e54bb81b4406c278749779be57a569d8d1bb0d", size = 3202464 }, { url = "https://files.pythonhosted.org/packages/91/95/2cf9b85a6bc2ee660e40594dffe04e777e7b8617fd0c6d77a0f782ea96c9/SQLAlchemy-2.0.36-cp311-cp311-musllinux_1_2_aarch64.whl", hash = 
"sha256:59b1ee96617135f6e1d6f275bbe988f419c5178016f3d41d3c0abb0c819f75bb", size = 3139508 }, { url = "https://files.pythonhosted.org/packages/92/ea/f0c01bc646456e4345c0fb5a3ddef457326285c2dc60435b0eb96b61bf31/SQLAlchemy-2.0.36-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:39769a115f730d683b0eb7b694db9789267bcd027326cccc3125e862eb03bfd8", size = 3159837 }, { url = "https://files.pythonhosted.org/packages/a6/93/c8edbf153ee38fe529773240877bf1332ed95328aceef6254288f446994e/SQLAlchemy-2.0.36-cp311-cp311-win32.whl", hash = "sha256:66bffbad8d6271bb1cc2f9a4ea4f86f80fe5e2e3e501a5ae2a3dc6a76e604e6f", size = 2064529 }, { url = "https://files.pythonhosted.org/packages/b1/03/d12b7c1d36fd80150c1d52e121614cf9377dac99e5497af8d8f5b2a8db64/SQLAlchemy-2.0.36-cp311-cp311-win_amd64.whl", hash = "sha256:23623166bfefe1487d81b698c423f8678e80df8b54614c2bf4b4cfcd7c711959", size = 2089874 }, { url = "https://files.pythonhosted.org/packages/b8/bf/005dc47f0e57556e14512d5542f3f183b94fde46e15ff1588ec58ca89555/SQLAlchemy-2.0.36-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:f7b64e6ec3f02c35647be6b4851008b26cff592a95ecb13b6788a54ef80bbdd4", size = 2092378 }, { url = "https://files.pythonhosted.org/packages/94/65/f109d5720779a08e6e324ec89a744f5f92c48bd8005edc814bf72fbb24e5/SQLAlchemy-2.0.36-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:46331b00096a6db1fdc052d55b101dbbfc99155a548e20a0e4a8e5e4d1362855", size = 2082778 }, { url = "https://files.pythonhosted.org/packages/60/f6/d9aa8c49c44f9b8c9b9dada1f12fa78df3d4c42aa2de437164b83ee1123c/SQLAlchemy-2.0.36-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fdf3386a801ea5aba17c6410dd1dc8d39cf454ca2565541b5ac42a84e1e28f53", size = 3232191 }, { url = "https://files.pythonhosted.org/packages/8a/ab/81d4514527c068670cb1d7ab62a81a185df53a7c379bd2a5636e83d09ede/SQLAlchemy-2.0.36-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:ac9dfa18ff2a67b09b372d5db8743c27966abf0e5344c555d86cc7199f7ad83a", size = 3243044 }, { url = "https://files.pythonhosted.org/packages/35/b4/f87c014ecf5167dc669199cafdb20a7358ff4b1d49ce3622cc48571f811c/SQLAlchemy-2.0.36-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:90812a8933df713fdf748b355527e3af257a11e415b613dd794512461eb8a686", size = 3178511 }, { url = "https://files.pythonhosted.org/packages/ea/09/badfc9293bc3ccba6ede05e5f2b44a760aa47d84da1fc5a326e963e3d4d9/SQLAlchemy-2.0.36-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:1bc330d9d29c7f06f003ab10e1eaced295e87940405afe1b110f2eb93a233588", size = 3205147 }, { url = "https://files.pythonhosted.org/packages/c8/60/70e681de02a13c4b27979b7b78da3058c49bacc9858c89ba672e030f03f2/SQLAlchemy-2.0.36-cp312-cp312-win32.whl", hash = "sha256:79d2e78abc26d871875b419e1fd3c0bca31a1cb0043277d0d850014599626c2e", size = 2062709 }, { url = "https://files.pythonhosted.org/packages/b7/ed/f6cd9395e41bfe47dd253d74d2dfc3cab34980d4e20c8878cb1117306085/SQLAlchemy-2.0.36-cp312-cp312-win_amd64.whl", hash = "sha256:b544ad1935a8541d177cb402948b94e871067656b3a0b9e91dbec136b06a2ff5", size = 2088433 }, { url = "https://files.pythonhosted.org/packages/78/5c/236398ae3678b3237726819b484f15f5c038a9549da01703a771f05a00d6/SQLAlchemy-2.0.36-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:b5cc79df7f4bc3d11e4b542596c03826063092611e481fcf1c9dfee3c94355ef", size = 2087651 }, { url = "https://files.pythonhosted.org/packages/a8/14/55c47420c0d23fb67a35af8be4719199b81c59f3084c28d131a7767b0b0b/SQLAlchemy-2.0.36-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:3c01117dd36800f2ecaa238c65365b7b16497adc1522bf84906e5710ee9ba0e8", size = 2078132 }, { url = "https://files.pythonhosted.org/packages/3d/97/1e843b36abff8c4a7aa2e37f9bea364f90d021754c2de94d792c2d91405b/SQLAlchemy-2.0.36-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9bc633f4ee4b4c46e7adcb3a9b5ec083bf1d9a97c1d3854b92749d935de40b9b", size = 3164559 }, 
{ url = "https://files.pythonhosted.org/packages/7b/c5/07f18a897b997f6d6b234fab2bf31dccf66d5d16a79fe329aefc95cd7461/SQLAlchemy-2.0.36-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9e46ed38affdfc95d2c958de328d037d87801cfcbea6d421000859e9789e61c2", size = 3177897 }, { url = "https://files.pythonhosted.org/packages/b3/cd/e16f3cbefd82b5c40b33732da634ec67a5f33b587744c7ab41699789d492/SQLAlchemy-2.0.36-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:b2985c0b06e989c043f1dc09d4fe89e1616aadd35392aea2844f0458a989eacf", size = 3111289 }, { url = "https://files.pythonhosted.org/packages/15/85/5b8a3b0bc29c9928aa62b5c91fcc8335f57c1de0a6343873b5f372e3672b/SQLAlchemy-2.0.36-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4a121d62ebe7d26fec9155f83f8be5189ef1405f5973ea4874a26fab9f1e262c", size = 3139491 }, { url = "https://files.pythonhosted.org/packages/a1/95/81babb6089938680dfe2cd3f88cd3fd39cccd1543b7cb603b21ad881bff1/SQLAlchemy-2.0.36-cp313-cp313-win32.whl", hash = "sha256:0572f4bd6f94752167adfd7c1bed84f4b240ee6203a95e05d1e208d488d0d436", size = 2060439 }, { url = "https://files.pythonhosted.org/packages/c1/ce/5f7428df55660d6879d0522adc73a3364970b5ef33ec17fa125c5dbcac1d/SQLAlchemy-2.0.36-cp313-cp313-win_amd64.whl", hash = "sha256:8c78ac40bde930c60e0f78b3cd184c580f89456dd87fc08f9e3ee3ce8765ce88", size = 2084574 }, { url = "https://files.pythonhosted.org/packages/db/da/443679a0b9e0a009f00d1542595c8a4d582ece1809704e703c4843f18768/SQLAlchemy-2.0.36-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:3d6718667da04294d7df1670d70eeddd414f313738d20a6f1d1f379e3139a545", size = 2097524 }, { url = "https://files.pythonhosted.org/packages/60/88/08249dc5651d976b64c257250ade16ec02444257b44e404e258f2862c201/SQLAlchemy-2.0.36-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:72c28b84b174ce8af8504ca28ae9347d317f9dba3999e5981a3cd441f3712e24", size = 2088207 }, { url = 
"https://files.pythonhosted.org/packages/19/41/feb0216cced91211c9dd045f08fa020a3bd2e188110217d24b6b2a90b6a2/SQLAlchemy-2.0.36-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b11d0cfdd2b095e7b0686cf5fabeb9c67fae5b06d265d8180715b8cfa86522e3", size = 3090690 }, { url = "https://files.pythonhosted.org/packages/80/fe/0055147b71de2007e716ddc686438aefb390b03fd2e382ff4a8588b78b58/SQLAlchemy-2.0.36-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e32092c47011d113dc01ab3e1d3ce9f006a47223b18422c5c0d150af13a00687", size = 3097567 }, { url = "https://files.pythonhosted.org/packages/4e/55/52eaef72f071b89e2e965decc264d16b6031a39365a6593067053ba8bb97/SQLAlchemy-2.0.36-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:6a440293d802d3011028e14e4226da1434b373cbaf4a4bbb63f845761a708346", size = 3044686 }, { url = "https://files.pythonhosted.org/packages/2d/c8/6fb6179ad2eb9baab61132786488fa10e5b588ef2b008f57f560ae7f39d1/SQLAlchemy-2.0.36-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:c54a1e53a0c308a8e8a7dffb59097bff7facda27c70c286f005327f21b2bd6b1", size = 3067814 }, { url = "https://files.pythonhosted.org/packages/52/84/be7227a06c24418f131a877159eb962c62447883069390c868d1a782f01d/SQLAlchemy-2.0.36-cp38-cp38-win32.whl", hash = "sha256:1e0d612a17581b6616ff03c8e3d5eff7452f34655c901f75d62bd86449d9750e", size = 2067569 }, { url = "https://files.pythonhosted.org/packages/df/4f/9c70c53a5bd6de5957db78578cb0491732da636032566d3e8748659513cb/SQLAlchemy-2.0.36-cp38-cp38-win_amd64.whl", hash = "sha256:8958b10490125124463095bbdadda5aa22ec799f91958e410438ad6c97a7b793", size = 2092329 }, { url = "https://files.pythonhosted.org/packages/43/10/c1c865afeb50270677942cda17ed78b55b0a0068e426d22284a625d7341f/SQLAlchemy-2.0.36-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:dc022184d3e5cacc9579e41805a681187650e170eb2fd70e28b86192a479dcaa", size = 2095474 }, { url = 
"https://files.pythonhosted.org/packages/25/cb/78d7663ad1c82ca8b5cbc7532b8e3c9f80a53f1bdaafd8f5314525700a01/SQLAlchemy-2.0.36-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:b817d41d692bf286abc181f8af476c4fbef3fd05e798777492618378448ee689", size = 2086708 }, { url = "https://files.pythonhosted.org/packages/5c/5b/f9b5cf759865b0dd8b20579b3d920ed87b6160fce75e2b7ed697ddbf0008/SQLAlchemy-2.0.36-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a4e46a888b54be23d03a89be510f24a7652fe6ff660787b96cd0e57a4ebcb46d", size = 3080607 }, { url = "https://files.pythonhosted.org/packages/18/f6/afaef83a3fbeff40b9289508b985c5630c0e8303d08106a0117447c680d9/SQLAlchemy-2.0.36-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c4ae3005ed83f5967f961fd091f2f8c5329161f69ce8480aa8168b2d7fe37f06", size = 3088410 }, { url = "https://files.pythonhosted.org/packages/62/60/ec2b8c14b3c15b4a915ae821b455823fbafa6f38c4011b27c0a76f94928a/SQLAlchemy-2.0.36-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:03e08af7a5f9386a43919eda9de33ffda16b44eb11f3b313e6822243770e9763", size = 3047623 }, { url = "https://files.pythonhosted.org/packages/40/a2/9f748bdaf769eceb780c438b3dd7a37b8b8cbc6573e2a3748b0d5c2e9d80/SQLAlchemy-2.0.36-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:3dbb986bad3ed5ceaf090200eba750b5245150bd97d3e67343a3cfed06feecf7", size = 3074096 }, { url = "https://files.pythonhosted.org/packages/01/f7/290d7193c81d1ff0f751bd9430f3762bee0f53efd0273aba7ba18eb10520/SQLAlchemy-2.0.36-cp39-cp39-win32.whl", hash = "sha256:9fe53b404f24789b5ea9003fc25b9a3988feddebd7e7b369c8fac27ad6f52f28", size = 2067304 }, { url = "https://files.pythonhosted.org/packages/6f/a0/dc1a808d6ac466b190ca570f7ce52a1761308279eab4a09367ccf2cd6bd7/SQLAlchemy-2.0.36-cp39-cp39-win_amd64.whl", hash = "sha256:af148a33ff0349f53512a049c6406923e4e02bf2f26c5fb285f143faf4f0e46a", size = 2091520 }, { url = 
"https://files.pythonhosted.org/packages/b8/49/21633706dd6feb14cd3f7935fc00b60870ea057686035e1a99ae6d9d9d53/SQLAlchemy-2.0.36-py3-none-any.whl", hash = "sha256:fddbe92b4760c6f5d48162aef14824add991aeda8ddadb3c31d56eb15ca69f8e", size = 1883787 }, ] [[package]] name = "textual" version = "0.73.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "markdown-it-py", extra = ["linkify", "plugins"], marker = "python_full_version < '3.9' or python_full_version >= '3.12' or platform_system != 'Windows' or sys_platform == 'win32'" }, { name = "rich", marker = "python_full_version < '3.9' or python_full_version >= '3.12' or platform_system != 'Windows' or sys_platform == 'win32'" }, { name = "typing-extensions", marker = "python_full_version < '3.9' or python_full_version >= '3.12' or platform_system != 'Windows' or sys_platform == 'win32'" }, ] sdist = { url = "https://files.pythonhosted.org/packages/d8/e9/4939bf72d4a7d1a37aa5d55ad4438594a9d5e59875195dd89e9d8c14a9a9/textual-0.73.0.tar.gz", hash = "sha256:ccd1e873370577f557dfdf2b3411f2a4f68b57d4365f9d83a00d084afb15f5a6", size = 1291992 } wheels = [ { url = "https://files.pythonhosted.org/packages/a4/f3/62ec72b437647787ac7305699e7e00318fd25827212a6b5b7fbb278ec17d/textual-0.73.0-py3-none-any.whl", hash = "sha256:4d93d80d203f7fb7ba51828a546e8777019700d529a1b405ceee313dea2edfc2", size = 564394 }, ] [[package]] name = "tinycss2" version = "1.4.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "webencodings" }, ] sdist = { url = "https://files.pythonhosted.org/packages/7a/fd/7a5ee21fd08ff70d3d33a5781c255cbe779659bd03278feb98b19ee550f4/tinycss2-1.4.0.tar.gz", hash = "sha256:10c0972f6fc0fbee87c3edb76549357415e94548c1ae10ebccdea16fb404a9b7", size = 87085 } wheels = [ { url = "https://files.pythonhosted.org/packages/e6/34/ebdc18bae6aa14fbee1a08b63c015c72b64868ff7dae68808ab500c492e2/tinycss2-1.4.0-py3-none-any.whl", hash = 
"sha256:3a49cf47b7675da0b15d0c6e1df8df4ebd96e9394bb905a5775adb0d884c5289", size = 26610 }, ] [[package]] name = "tokenize-rt" version = "6.0.0" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/7d/09/6257dabdeab5097d72c5d874f29b33cd667ec411af6667922d84f85b79b5/tokenize_rt-6.0.0.tar.gz", hash = "sha256:b9711bdfc51210211137499b5e355d3de5ec88a85d2025c520cbb921b5194367", size = 5360 } wheels = [ { url = "https://files.pythonhosted.org/packages/5c/c2/44486862562c6902778ccf88001ad5ea3f8da5c030c638cac8be72f65b40/tokenize_rt-6.0.0-py2.py3-none-any.whl", hash = "sha256:d4ff7ded2873512938b4f8cbb98c9b07118f01d30ac585a30d7a88353ca36d22", size = 5869 }, ] [[package]] name = "tomli" version = "2.1.0" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/1e/e4/1b6cbcc82d8832dd0ce34767d5c560df8a3547ad8cbc427f34601415930a/tomli-2.1.0.tar.gz", hash = "sha256:3f646cae2aec94e17d04973e4249548320197cfabdf130015d023de4b74d8ab8", size = 16622 } wheels = [ { url = "https://files.pythonhosted.org/packages/de/f7/4da0ffe1892122c9ea096c57f64c2753ae5dd3ce85488802d11b0992cc6d/tomli-2.1.0-py3-none-any.whl", hash = "sha256:a5c57c3d1c56f5ccdf89f6523458f60ef716e210fc47c4cfb188c5ba473e0391", size = 13750 }, ] [[package]] name = "typing-extensions" version = "4.12.2" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/df/db/f35a00659bc03fec321ba8bce9420de607a1d37f8342eee1863174c69557/typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8", size = 85321 } wheels = [ { url = "https://files.pythonhosted.org/packages/26/9f/ad63fc0248c5379346306f8668cda6e2e2e9c95e01216d2b8ffd9ff037d0/typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d", size = 37438 }, ] [[package]] name = "tzdata" version = "2024.2" source = { 
registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/e1/34/943888654477a574a86a98e9896bae89c7aa15078ec29f490fef2f1e5384/tzdata-2024.2.tar.gz", hash = "sha256:7d85cc416e9382e69095b7bdf4afd9e3880418a2413feec7069d533d6b4e31cc", size = 193282 } wheels = [ { url = "https://files.pythonhosted.org/packages/a6/ab/7e5f53c3b9d14972843a647d8d7a853969a58aecc7559cb3267302c94774/tzdata-2024.2-py2.py3-none-any.whl", hash = "sha256:a48093786cdcde33cad18c2555e8532f34422074448fbc874186f0abd79565cd", size = 346586 }, ] [[package]] name = "uc-micro-py" version = "1.0.3" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/91/7a/146a99696aee0609e3712f2b44c6274566bc368dfe8375191278045186b8/uc-micro-py-1.0.3.tar.gz", hash = "sha256:d321b92cff673ec58027c04015fcaa8bb1e005478643ff4a500882eaab88c48a", size = 6043 } wheels = [ { url = "https://files.pythonhosted.org/packages/37/87/1f677586e8ac487e29672e4b17455758fce261de06a0d086167bb760361a/uc_micro_py-1.0.3-py3-none-any.whl", hash = "sha256:db1dffff340817673d7b466ec86114a9dc0e9d4d9b5ba229d9d60e5c12600cd5", size = 6229 }, ] [[package]] name = "urllib3" version = "2.2.3" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/ed/63/22ba4ebfe7430b76388e7cd448d5478814d3032121827c12a2cc287e2260/urllib3-2.2.3.tar.gz", hash = "sha256:e7d814a81dad81e6caf2ec9fdedb284ecc9c73076b62654547cc64ccdcae26e9", size = 300677 } wheels = [ { url = "https://files.pythonhosted.org/packages/ce/d9/5f4c13cecde62396b0d3fe530a50ccea91e7dfc1ccf0e09c228841bb5ba8/urllib3-2.2.3-py3-none-any.whl", hash = "sha256:ca899ca043dcb1bafa3e262d73aa25c465bfb49e0bd9dd5d59f1d0acba2f8fac", size = 126338 }, ] [[package]] name = "verspec" version = "0.1.0" source = { registry = "https://pypi.org/simple" } sdist = { url = 
"https://files.pythonhosted.org/packages/e7/44/8126f9f0c44319b2efc65feaad589cadef4d77ece200ae3c9133d58464d0/verspec-0.1.0.tar.gz", hash = "sha256:c4504ca697b2056cdb4bfa7121461f5a0e81809255b41c03dda4ba823637c01e", size = 27123 } wheels = [ { url = "https://files.pythonhosted.org/packages/a4/ce/3b6fee91c85626eaf769d617f1be9d2e15c1cca027bbdeb2e0d751469355/verspec-0.1.0-py3-none-any.whl", hash = "sha256:741877d5633cc9464c45a469ae2a31e801e6dbbaa85b9675d481cda100f11c31", size = 19640 }, ] [[package]] name = "watchdog" version = "4.0.2" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/4f/38/764baaa25eb5e35c9a043d4c4588f9836edfe52a708950f4b6d5f714fd42/watchdog-4.0.2.tar.gz", hash = "sha256:b4dfbb6c49221be4535623ea4474a4d6ee0a9cef4a80b20c28db4d858b64e270", size = 126587 } wheels = [ { url = "https://files.pythonhosted.org/packages/46/b0/219893d41c16d74d0793363bf86df07d50357b81f64bba4cb94fe76e7af4/watchdog-4.0.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:ede7f010f2239b97cc79e6cb3c249e72962404ae3865860855d5cbe708b0fd22", size = 100257 }, { url = "https://files.pythonhosted.org/packages/6d/c6/8e90c65693e87d98310b2e1e5fd7e313266990853b489e85ce8396cc26e3/watchdog-4.0.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:a2cffa171445b0efa0726c561eca9a27d00a1f2b83846dbd5a4f639c4f8ca8e1", size = 92249 }, { url = "https://files.pythonhosted.org/packages/6f/cd/2e306756364a934532ff8388d90eb2dc8bb21fe575cd2b33d791ce05a02f/watchdog-4.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c50f148b31b03fbadd6d0b5980e38b558046b127dc483e5e4505fcef250f9503", size = 92888 }, { url = "https://files.pythonhosted.org/packages/de/78/027ad372d62f97642349a16015394a7680530460b1c70c368c506cb60c09/watchdog-4.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:7c7d4bf585ad501c5f6c980e7be9c4f15604c7cc150e942d82083b31a7548930", size = 100256 }, { url = 
"https://files.pythonhosted.org/packages/59/a9/412b808568c1814d693b4ff1cec0055dc791780b9dc947807978fab86bc1/watchdog-4.0.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:914285126ad0b6eb2258bbbcb7b288d9dfd655ae88fa28945be05a7b475a800b", size = 92252 }, { url = "https://files.pythonhosted.org/packages/04/57/179d76076cff264982bc335dd4c7da6d636bd3e9860bbc896a665c3447b6/watchdog-4.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:984306dc4720da5498b16fc037b36ac443816125a3705dfde4fd90652d8028ef", size = 92888 }, { url = "https://files.pythonhosted.org/packages/92/f5/ea22b095340545faea37ad9a42353b265ca751f543da3fb43f5d00cdcd21/watchdog-4.0.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:1cdcfd8142f604630deef34722d695fb455d04ab7cfe9963055df1fc69e6727a", size = 100342 }, { url = "https://files.pythonhosted.org/packages/cb/d2/8ce97dff5e465db1222951434e3115189ae54a9863aef99c6987890cc9ef/watchdog-4.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:d7ab624ff2f663f98cd03c8b7eedc09375a911794dfea6bf2a359fcc266bff29", size = 92306 }, { url = "https://files.pythonhosted.org/packages/49/c4/1aeba2c31b25f79b03b15918155bc8c0b08101054fc727900f1a577d0d54/watchdog-4.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:132937547a716027bd5714383dfc40dc66c26769f1ce8a72a859d6a48f371f3a", size = 92915 }, { url = "https://files.pythonhosted.org/packages/79/63/eb8994a182672c042d85a33507475c50c2ee930577524dd97aea05251527/watchdog-4.0.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:cd67c7df93eb58f360c43802acc945fa8da70c675b6fa37a241e17ca698ca49b", size = 100343 }, { url = "https://files.pythonhosted.org/packages/ce/82/027c0c65c2245769580605bcd20a1dc7dfd6c6683c8c4e2ef43920e38d27/watchdog-4.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:bcfd02377be80ef3b6bc4ce481ef3959640458d6feaae0bd43dd90a43da90a7d", size = 92313 }, { url = 
"https://files.pythonhosted.org/packages/2a/89/ad4715cbbd3440cb0d336b78970aba243a33a24b1a79d66f8d16b4590d6a/watchdog-4.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:980b71510f59c884d684b3663d46e7a14b457c9611c481e5cef08f4dd022eed7", size = 92919 }, { url = "https://files.pythonhosted.org/packages/55/08/1a9086a3380e8828f65b0c835b86baf29ebb85e5e94a2811a2eb4f889cfd/watchdog-4.0.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:aa160781cafff2719b663c8a506156e9289d111d80f3387cf3af49cedee1f040", size = 100255 }, { url = "https://files.pythonhosted.org/packages/6c/3e/064974628cf305831f3f78264800bd03b3358ec181e3e9380a36ff156b93/watchdog-4.0.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:f6ee8dedd255087bc7fe82adf046f0b75479b989185fb0bdf9a98b612170eac7", size = 92257 }, { url = "https://files.pythonhosted.org/packages/23/69/1d2ad9c12d93bc1e445baa40db46bc74757f3ffc3a3be592ba8dbc51b6e5/watchdog-4.0.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:0b4359067d30d5b864e09c8597b112fe0a0a59321a0f331498b013fb097406b4", size = 92886 }, { url = "https://files.pythonhosted.org/packages/68/eb/34d3173eceab490d4d1815ba9a821e10abe1da7a7264a224e30689b1450c/watchdog-4.0.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:770eef5372f146997638d737c9a3c597a3b41037cfbc5c41538fc27c09c3a3f9", size = 100254 }, { url = "https://files.pythonhosted.org/packages/18/a1/4bbafe7ace414904c2cc9bd93e472133e8ec11eab0b4625017f0e34caad8/watchdog-4.0.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:eeea812f38536a0aa859972d50c76e37f4456474b02bd93674d1947cf1e39578", size = 92249 }, { url = "https://files.pythonhosted.org/packages/f3/11/ec5684e0ca692950826af0de862e5db167523c30c9cbf9b3f4ce7ec9cc05/watchdog-4.0.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:b2c45f6e1e57ebb4687690c05bc3a2c1fb6ab260550c4290b8abb1335e0fd08b", size = 92891 }, { url = 
"https://files.pythonhosted.org/packages/3b/9a/6f30f023324de7bad8a3eb02b0afb06bd0726003a3550e9964321315df5a/watchdog-4.0.2-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:10b6683df70d340ac3279eff0b2766813f00f35a1d37515d2c99959ada8f05fa", size = 91775 }, { url = "https://files.pythonhosted.org/packages/87/62/8be55e605d378a154037b9ba484e00a5478e627b69c53d0f63e3ef413ba6/watchdog-4.0.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:f7c739888c20f99824f7aa9d31ac8a97353e22d0c0e54703a547a218f6637eb3", size = 92255 }, { url = "https://files.pythonhosted.org/packages/6b/59/12e03e675d28f450bade6da6bc79ad6616080b317c472b9ae688d2495a03/watchdog-4.0.2-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:c100d09ac72a8a08ddbf0629ddfa0b8ee41740f9051429baa8e31bb903ad7508", size = 91682 }, { url = "https://files.pythonhosted.org/packages/ef/69/241998de9b8e024f5c2fbdf4324ea628b4231925305011ca8b7e1c3329f6/watchdog-4.0.2-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:f5315a8c8dd6dd9425b974515081fc0aadca1d1d61e078d2246509fd756141ee", size = 92249 }, { url = "https://files.pythonhosted.org/packages/70/3f/2173b4d9581bc9b5df4d7f2041b6c58b5e5448407856f68d4be9981000d0/watchdog-4.0.2-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:2d468028a77b42cc685ed694a7a550a8d1771bb05193ba7b24006b8241a571a1", size = 91773 }, { url = "https://files.pythonhosted.org/packages/f0/de/6fff29161d5789048f06ef24d94d3ddcc25795f347202b7ea503c3356acb/watchdog-4.0.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:f15edcae3830ff20e55d1f4e743e92970c847bcddc8b7509bcd172aa04de506e", size = 92250 }, { url = "https://files.pythonhosted.org/packages/8a/b1/25acf6767af6f7e44e0086309825bd8c098e301eed5868dc5350642124b9/watchdog-4.0.2-py3-none-manylinux2014_aarch64.whl", hash = "sha256:936acba76d636f70db8f3c66e76aa6cb5136a936fc2a5088b9ce1c7a3508fc83", size = 82947 }, { url = 
"https://files.pythonhosted.org/packages/e8/90/aebac95d6f954bd4901f5d46dcd83d68e682bfd21798fd125a95ae1c9dbf/watchdog-4.0.2-py3-none-manylinux2014_armv7l.whl", hash = "sha256:e252f8ca942a870f38cf785aef420285431311652d871409a64e2a0a52a2174c", size = 82942 }, { url = "https://files.pythonhosted.org/packages/15/3a/a4bd8f3b9381824995787488b9282aff1ed4667e1110f31a87b871ea851c/watchdog-4.0.2-py3-none-manylinux2014_i686.whl", hash = "sha256:0e83619a2d5d436a7e58a1aea957a3c1ccbf9782c43c0b4fed80580e5e4acd1a", size = 82947 }, { url = "https://files.pythonhosted.org/packages/09/cc/238998fc08e292a4a18a852ed8274159019ee7a66be14441325bcd811dfd/watchdog-4.0.2-py3-none-manylinux2014_ppc64.whl", hash = "sha256:88456d65f207b39f1981bf772e473799fcdc10801062c36fd5ad9f9d1d463a73", size = 82946 }, { url = "https://files.pythonhosted.org/packages/80/f1/d4b915160c9d677174aa5fae4537ae1f5acb23b3745ab0873071ef671f0a/watchdog-4.0.2-py3-none-manylinux2014_ppc64le.whl", hash = "sha256:32be97f3b75693a93c683787a87a0dc8db98bb84701539954eef991fb35f5fbc", size = 82947 }, { url = "https://files.pythonhosted.org/packages/db/02/56ebe2cf33b352fe3309588eb03f020d4d1c061563d9858a9216ba004259/watchdog-4.0.2-py3-none-manylinux2014_s390x.whl", hash = "sha256:c82253cfc9be68e3e49282831afad2c1f6593af80c0daf1287f6a92657986757", size = 82944 }, { url = "https://files.pythonhosted.org/packages/01/d2/c8931ff840a7e5bd5dcb93f2bb2a1fd18faf8312e9f7f53ff1cf76ecc8ed/watchdog-4.0.2-py3-none-manylinux2014_x86_64.whl", hash = "sha256:c0b14488bd336c5b1845cee83d3e631a1f8b4e9c5091ec539406e4a324f882d8", size = 82947 }, { url = "https://files.pythonhosted.org/packages/d0/d8/cdb0c21a4a988669d7c210c75c6a2c9a0e16a3b08d9f7e633df0d9a16ad8/watchdog-4.0.2-py3-none-win32.whl", hash = "sha256:0d8a7e523ef03757a5aa29f591437d64d0d894635f8a50f370fe37f913ce4e19", size = 82935 }, { url = "https://files.pythonhosted.org/packages/99/2e/b69dfaae7a83ea64ce36538cc103a3065e12c447963797793d5c0a1d5130/watchdog-4.0.2-py3-none-win_amd64.whl", hash = 
"sha256:c344453ef3bf875a535b0488e3ad28e341adbd5a9ffb0f7d62cefacc8824ef2b", size = 82934 }, { url = "https://files.pythonhosted.org/packages/b0/0b/43b96a9ecdd65ff5545b1b13b687ca486da5c6249475b1a45f24d63a1858/watchdog-4.0.2-py3-none-win_ia64.whl", hash = "sha256:baececaa8edff42cd16558a639a9b0ddf425f93d892e8392a56bf904f5eff22c", size = 82933 }, ] [[package]] name = "webencodings" version = "0.5.1" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/0b/02/ae6ceac1baeda530866a85075641cec12989bd8d31af6d5ab4a3e8c92f47/webencodings-0.5.1.tar.gz", hash = "sha256:b36a1c245f2d304965eb4e0a82848379241dc04b865afcc4aab16748587e1923", size = 9721 } wheels = [ { url = "https://files.pythonhosted.org/packages/f4/24/2a3e3df732393fed8b3ebf2ec078f05546de641fe1b667ee316ec1dcf3b7/webencodings-0.5.1-py2.py3-none-any.whl", hash = "sha256:a0af1213f3c2226497a97e2b3aa01a7e4bee4f403f95be16fc9acd2947514a78", size = 11774 }, ] [[package]] name = "wheel" version = "0.45.0" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/e7/52/fd4516fb8f7d11a08e3f9cd69eb1558f098ab67e79f32d920c4974ee550f/wheel-0.45.0.tar.gz", hash = "sha256:a57353941a3183b3d5365346b567a260a0602a0f8a635926a7dede41b94c674a", size = 107426 } wheels = [ { url = "https://files.pythonhosted.org/packages/92/81/65ae90d584a73ca976d8f1eb83e2f58447a4055a9fb3ae69b28721070bdf/wheel-0.45.0-py3-none-any.whl", hash = "sha256:52f0baa5e6522155090a09c6bd95718cc46956d1b51d537ea5454249edb671c7", size = 72497 }, ] [[package]] name = "zipp" version = "3.20.2" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/54/bf/5c0000c44ebc80123ecbdddba1f5dcd94a5ada602a9c225d84b5aaa55e86/zipp-3.20.2.tar.gz", hash = "sha256:bc9eb26f4506fda01b81bcde0ca78103b6e62f991b381fec825435c836edbc29", size = 24199 } wheels = [ { url = 
"https://files.pythonhosted.org/packages/62/8b/5ba542fa83c90e09eac972fc9baca7a88e7e7ca4b221a89251954019308b/zipp-3.20.2-py3-none-any.whl", hash = "sha256:a817ac80d6cf4b23bf7f2828b7cabf326f15a001bea8b1f9b49631780ba28350", size = 9200 }, ]