././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7246683 pytoolconfig-1.3.1/LICENSE0000644000000000000000000001674414550013131012170 0ustar00 GNU LESSER GENERAL PUBLIC LICENSE Version 3, 29 June 2007 Copyright (C) 2007 Free Software Foundation, Inc. Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed. This version of the GNU Lesser General Public License incorporates the terms and conditions of version 3 of the GNU General Public License, supplemented by the additional permissions listed below. 0. Additional Definitions. As used herein, "this License" refers to version 3 of the GNU Lesser General Public License, and the "GNU GPL" refers to version 3 of the GNU General Public License. "The Library" refers to a covered work governed by this License, other than an Application or a Combined Work as defined below. An "Application" is any work that makes use of an interface provided by the Library, but which is not otherwise based on the Library. Defining a subclass of a class defined by the Library is deemed a mode of using an interface provided by the Library. A "Combined Work" is a work produced by combining or linking an Application with the Library. The particular version of the Library with which the Combined Work was made is also called the "Linked Version". The "Minimal Corresponding Source" for a Combined Work means the Corresponding Source for the Combined Work, excluding any source code for portions of the Combined Work that, considered in isolation, are based on the Application, and not on the Linked Version. The "Corresponding Application Code" for a Combined Work means the object code and/or source code for the Application, including any data and utility programs needed for reproducing the Combined Work from the Application, but excluding the System Libraries of the Combined Work. 1. Exception to Section 3 of the GNU GPL. You may convey a covered work under sections 3 and 4 of this License without being bound by section 3 of the GNU GPL. 2. Conveying Modified Versions. If you modify a copy of the Library, and, in your modifications, a facility refers to a function or data to be supplied by an Application that uses the facility (other than as an argument passed when the facility is invoked), then you may convey a copy of the modified version: a) under this License, provided that you make a good faith effort to ensure that, in the event an Application does not supply the function or data, the facility still operates, and performs whatever part of its purpose remains meaningful, or b) under the GNU GPL, with none of the additional permissions of this License applicable to that copy. 3. Object Code Incorporating Material from Library Header Files. The object code form of an Application may incorporate material from a header file that is part of the Library. You may convey such object code under terms of your choice, provided that, if the incorporated material is not limited to numerical parameters, data structure layouts and accessors, or small macros, inline functions and templates (ten or fewer lines in length), you do both of the following: a) Give prominent notice with each copy of the object code that the Library is used in it and that the Library and its use are covered by this License. b) Accompany the object code with a copy of the GNU GPL and this license document. 4. Combined Works. 
You may convey a Combined Work under terms of your choice that, taken together, effectively do not restrict modification of the portions of the Library contained in the Combined Work and reverse engineering for debugging such modifications, if you also do each of the following: a) Give prominent notice with each copy of the Combined Work that the Library is used in it and that the Library and its use are covered by this License. b) Accompany the Combined Work with a copy of the GNU GPL and this license document. c) For a Combined Work that displays copyright notices during execution, include the copyright notice for the Library among these notices, as well as a reference directing the user to the copies of the GNU GPL and this license document. d) Do one of the following: 0) Convey the Minimal Corresponding Source under the terms of this License, and the Corresponding Application Code in a form suitable for, and under terms that permit, the user to recombine or relink the Application with a modified version of the Linked Version to produce a modified Combined Work, in the manner specified by section 6 of the GNU GPL for conveying Corresponding Source. 1) Use a suitable shared library mechanism for linking with the Library. A suitable mechanism is one that (a) uses at run time a copy of the Library already present on the user's computer system, and (b) will operate properly with a modified version of the Library that is interface-compatible with the Linked Version. e) Provide Installation Information, but only if you would otherwise be required to provide such information under section 6 of the GNU GPL, and only to the extent that such information is necessary to install and execute a modified version of the Combined Work produced by recombining or relinking the Application with a modified version of the Linked Version. (If you use option 4d0, the Installation Information must accompany the Minimal Corresponding Source and Corresponding Application Code. If you use option 4d1, you must provide the Installation Information in the manner specified by section 6 of the GNU GPL for conveying Corresponding Source.) 5. Combined Libraries. You may place library facilities that are a work based on the Library side by side in a single library together with other library facilities that are not Applications and are not covered by this License, and convey such a combined library under terms of your choice, if you do both of the following: a) Accompany the combined library with a copy of the same work based on the Library, uncombined with any other library facilities, conveyed under the terms of this License. b) Give prominent notice with the combined library that part of it is a work based on the Library, and explaining where to find the accompanying uncombined form of the same work. 6. Revised Versions of the GNU Lesser General Public License. The Free Software Foundation may publish revised and/or new versions of the GNU Lesser General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Library as you received it specifies that a certain numbered version of the GNU Lesser General Public License "or any later version" applies to it, you have the option of following the terms and conditions either of that published version or of any later version published by the Free Software Foundation. 
If the Library as you received it does not specify a version number of the GNU Lesser General Public License, you may choose any version of the GNU Lesser General Public License ever published by the Free Software Foundation. If the Library as you received it specifies that a proxy can decide whether future versions of the GNU Lesser General Public License shall apply, that proxy's public statement of acceptance of any version is permanent authorization for you to choose that version for the Library.

pytoolconfig-1.3.1/README.md

# Pytoolconfig

**Py**thon **Tool** **Config**uration

The goal of this project is to manage configuration for Python tools, such as rope, and to add support for a pyproject.toml configuration file.

[Documentation](https://pytoolconfig.readthedocs.io/en/latest/)

This library only supports Python 3.8 to 3.12. Python 3.13+ may work, but isn't tested.

## Development

`pdm install --dev -G :all` to set up the development environment

`pdm run tox` to run tests

## Alternatives

[Maison](https://dbatten5.github.io/maison/) is a similar library

pytoolconfig-1.3.1/pyproject.toml

[project]
name = "pytoolconfig"
dynamic = []
description = "Python tool configuration"
dependencies = [
    "tomli>=2.0.1; python_version < \"3.11\"",
    "packaging>=23.2",
]
requires-python = ">=3.8"
readme = "README.md"
authors = [
    { name = "bageljr", email = "bageljr897@protonmail.com" },
]
version = "1.3.1"

[project.license]
text = "LGPL-3.0-or-later"

[project.urls]
Homepage = "https://github.com/bageljrkhanofemus/pytoolconfig"

[project.optional-dependencies]
validation = [
    "pydantic>=2.5.3",
]
global = [
    "platformdirs>=3.11.0",
]
doc = [
    "tabulate>=0.9.0",
    "sphinx>=7.1.2",
]
gendocs = [
    "sphinx>=7.1.2",
    "sphinx-autodoc-typehints>=1.25.2",
    "sphinx-rtd-theme>=2.0.0",
    "pytoolconfig[doc]",
]

[tool.pdm.version]
source = "scm"

[tool.pdm.dev-dependencies]
dev = [
    "pytest>=7.4.4",
    "mypy>=1.8.0",
    "types-tabulate>=0.9.0.20240106",
    "tox>=4.11.4",
    "tox-pdm>=0.7.2",
    "types-docutils>=0.20.0.20240106",
    "tox-gh>=1.3.1",
    "pytest-emoji>=0.2.0",
    "pytest-md>=0.2.0",
    "pydantic>=2.5.3",
]

[tool.pytoolconfig]
formatter = "black"

[tool.pytest.ini_options]
testpaths = [
    "tests",
]

[tool.isort]
profile = "black"

[tool.ruff]
select = [
    "ALL",
]
ignore = [
    "FBT",
    "D211",
    "ANN101",
    "ANN102",
    "ANN401",
    "S101",
    "D212",
    "D213",
    "TCH001",
    "TCH002",
    "TCH003",
    "SLF001",
    "FA100",
]
target-version = "py38"
force-exclude = true

[tool.ruff.per-file-ignores]
"tests/**" = [
    "D",
    "ANN201",
    "ANN001",
]
"docs/conf.py" = [
    "INP001",
]

[tool.ruff.pydocstyle]
convention = "google"

[tool.ruff.flake8-bugbear]
extend-immutable-calls = [
    "pytoolconfig.field",
]

[tool.tox]
legacy_tox_ini = "[tox]\nmin_version = 4.0\nenvlist = py38, py39, py310, py311, py312\nisolated_build = True\n\n[gh-actions]\npython =\n 3.8: py38\n 3.9: py39\n 3.10: py310\n 3.11: py311\n 3.12: py312\n[testenv]\ngroups = dev, doc, global\ncommands = pytest\n"

[tool.pylint.format]
max-line-length = "88"

[tool.mypy]
plugins = "pydantic.mypy"

[build-system]
requires = [
    "pdm-backend>=1.0.5",
]
build-backend = "pdm.backend"
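The `[tool.pytoolconfig]` table above (with `formatter = "black"`) is exactly the kind of per-tool table this library reads. The sketch below shows the intended usage; the tool name `mytool` and the `MyToolConfig` model are hypothetical, but the calls mirror `src/pytoolconfig/_version.py` further down.

from dataclasses import dataclass
from pathlib import Path

from pytoolconfig import PyToolConfig, field


@dataclass
class MyToolConfig:
    # Hypothetical option; the default applies when no [tool.mytool] table is found.
    formatter: str = field(default="black", description="Formatter used by mytool")


# Reads [tool.mytool] from the nearest pyproject.toml, falling back to the defaults above.
config = PyToolConfig("mytool", Path.cwd(), MyToolConfig).parse()
print(config.formatter)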
pytoolconfig-1.3.1/src/pytoolconfig/__init__.py0000644000000000000000000000044614550013131016567 0ustar00"""Python Tool Configuration.""" from __future__ import annotations from dataclasses import dataclass # Backwards compatibility from .fields import field from .pytoolconfig import PyToolConfig from .types import UniversalKey __all__ = ["PyToolConfig", "field", "UniversalKey", "dataclass"] ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7246683 pytoolconfig-1.3.1/src/pytoolconfig/_version.py0000644000000000000000000000062014550013131016646 0ustar00"""Version of pytoolconfig.""" from __future__ import annotations from dataclasses import dataclass from pathlib import Path from pytoolconfig import PyToolConfig, UniversalKey, field @dataclass class Version: version: str = field("UNKNOWN", universal_config=UniversalKey.version) config = PyToolConfig("pytoolconfig", Path(__file__).parent, model=Version).parse() version = config.version ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7246683 pytoolconfig-1.3.1/src/pytoolconfig/documentation.py0000644000000000000000000001322014550013131017673 0ustar00"""Program to generate documentation for a given PyToolConfig object.""" from __future__ import annotations from dataclasses import is_dataclass from typing import TYPE_CHECKING, Any, Generator if TYPE_CHECKING: from _typeshed import DataclassInstance from docutils.statemachine import StringList from sphinx.application import Sphinx from .pytoolconfig import PyToolConfig from .types import ConfigField from typing import get_origin from sphinx.ext.autodoc import ClassDocumenter from tabulate import tabulate from .fields import _gather_config_fields from .sources import Source from .universal_config import UniversalConfig def _type_to_str(type_to_print: type[Any]) -> str | None: if type_to_print is None: return None if get_origin(type_to_print) is None: try: return type_to_print.__name__ except AttributeError: return str(type_to_print) return str(type_to_print).replace("typing.", "") def _subtables(model: type[DataclassInstance]) -> dict[str, type[DataclassInstance]]: result = {} for name, field in _gather_config_fields(model).items(): if is_dataclass(field._type): result[name] = field._type return result def _generate_table( model: type[DataclassInstance], tablefmt: str = "rst", prefix: str = "", ) -> Generator[str, None, None]: header = ["name", "description", "type", "default"] model_fields: dict[str, ConfigField] = _gather_config_fields(model) command_line = any(field.command_line for field in model_fields.values()) if command_line: header.append("command line flag") table = [] for name, field in model_fields.items(): if not is_dataclass(field._type): row = [ f"{name}" if not prefix else f"{prefix}.{name}", field.description.replace("\n", " ") if field.description else None, _type_to_str(field._type), field._default, ] if field.universal_config: key = field.universal_config assert is_dataclass(UniversalConfig) universal_key = _gather_config_fields(UniversalConfig)[key.name] row[1] = universal_key.description row[3] = universal_key._default if command_line: cli_doc = field.command_line if cli_doc is not None: row.append(", ".join(cli_doc)) else: row.append(None) table.append(row) yield from tabulate(table, tablefmt=tablefmt, headers=header).split("\n") class PyToolConfigAutoDocumenter(ClassDocumenter): """Sphinx autodocumenter for pytoolconfig models.""" objtype = "pytoolconfigtable" content_indent = "" 
titles_allowed = True @classmethod def can_document_member( cls, member: Any, membername: str, # noqa: ARG003 isattr: bool, # noqa: ARG003 parent: Any, # noqa: ARG003 ) -> bool: """Check if member is dataclass.""" return is_dataclass(member) def add_directive_header(self, sig: str) -> None: """Remove directive headers.""" def add_content( self, more_content: StringList | None, # noqa: ARG002 no_docstring: bool = False, # noqa: ARG002 ) -> None: """Create simple table to document configuration options.""" source = self.get_sourcename() config = self.object for line in _generate_table(config): self.add_line(line, source) class PyToolConfigSourceDocumenter(ClassDocumenter): """Expiremental documenter for docmenting a source for pytoolconfig.""" objtype = "pytoolconfigsources" content_indent = "" titles_allowed = True @classmethod def can_document_member( cls, member: Any, membername: str, # noqa: ARG003 isattr: bool, # noqa: ARG003 parent: Any, # noqa: ARG003 ) -> bool: """Check if member is dataclass.""" return isinstance(member, Source) def add_directive_header(self, sig: str) -> None: """Remove directive headers.""" def add_content( self, more_content: StringList | None, # noqa: ARG002 no_docstring: bool = False, # noqa: ARG002 ) -> None: """Create simple table to document configuration options.""" source = self.get_sourcename() config = self.object for line in _generate_table(config): self.add_line(line, source) def setup(app: Sphinx) -> None: """Register automatic documenter.""" app.setup_extension("sphinx.ext.autodoc") app.add_autodocumenter(PyToolConfigAutoDocumenter) def _generate_documentation(config: PyToolConfig) -> Generator[str, None, None]: """Generate Markdown documentation for a given config model. This currently Beta at best. Do not use. """ yield "# Configuration\n" if len(config.sources) > 1: yield f"{config.tool} supports the following sources:\n" for idx, source in enumerate(config.sources): yield f" {idx}. 
{source.name}\n" else: name = config.sources[0].name yield f"{config.tool} supports the {name} format\n" yield "\n" for source in config.sources: if source.description: yield f"## {source.name} \n" yield source.description yield "\n" yield "## Options\n" yield from _generate_table(config.model, "github") yield "\n" for prefix, subtable in _subtables(config.model).items(): yield from _generate_table(subtable, "github", prefix) yield "\n" yield "\n" ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7246683 pytoolconfig-1.3.1/src/pytoolconfig/fields.py0000644000000000000000000000665214550013131016303 0ustar00"""Abstractions over dataclass fields.""" from __future__ import annotations import dataclasses import enum from dataclasses import fields from typing import TYPE_CHECKING, Callable, TypeVar, overload from .types import ConfigField, UniversalKey if TYPE_CHECKING: from _typeshed import DataclassInstance _METADATA_KEY = "pytoolconfig" T = TypeVar("T") class _MISSINGTYPE(enum.Enum): MISSING = enum.auto() MISSING = _MISSINGTYPE.MISSING @overload def field( default: T, description: str | None = None, command_line: tuple[str] | None = None, universal_config: UniversalKey | None = None, default_factory: _MISSINGTYPE = _MISSINGTYPE.MISSING, init: bool = True, ) -> T: pass @overload def field( *, default_factory: Callable[[], T], description: str | None = None, command_line: tuple[str] | None = None, universal_config: UniversalKey | None = None, init: bool = True, ) -> T: pass def field( # noqa: PLR0913 default: T | _MISSINGTYPE = _MISSINGTYPE.MISSING, description: str | None = None, command_line: tuple[str] | None = None, universal_config: UniversalKey | None = None, default_factory: Callable[[], T] | _MISSINGTYPE = _MISSINGTYPE.MISSING, init: bool = True, ) -> T: """Create a dataclass field with metadata.""" metadata = { _METADATA_KEY: ConfigField( description=description, universal_config=universal_config, command_line=command_line, _default=default, ), } if default_factory is not MISSING: metadata[_METADATA_KEY]._default = default_factory() return dataclasses.field( default_factory=default_factory, metadata=metadata, init=init, ) assert default is not MISSING return dataclasses.field(default=default, metadata=metadata, init=init) def _gather_config_fields( model: type[DataclassInstance] | DataclassInstance, ) -> dict[str, ConfigField]: # First try PyToolConfig Annotated Fields result = {} for dataclass_field in fields(model): if dataclass_field.init: if _METADATA_KEY in dataclass_field.metadata: result[dataclass_field.name] = dataclass_field.metadata[_METADATA_KEY] else: result[dataclass_field.name] = ConfigField( _default=dataclass_field.default, ) result[dataclass_field.name]._type = dataclass_field.type # Then use pydantic annotated fields if hasattr(model, "__pydantic_model__"): for pydantic_field in model.__pydantic_model__.__fields__.values(): if pydantic_field.init: result[pydantic_field.name] = ConfigField( description=pydantic_field.field_info.description, _type=pydantic_field.type_, _default=pydantic_field.default, ) if "universal_config" in pydantic_field.field_info.extra: result[ pydantic_field.name ].universal_config = pydantic_field.field_info.extra[ "universal_config" ] if "command_line" in pydantic_field.field_info.extra: result[ pydantic_field.name ].command_line = pydantic_field.field_info.extra["command_line"] return result ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7246683 
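A sketch of how `field()` metadata is typically declared on a configuration model. The `ExampleModel` below is illustrative only (it mirrors the `NestedModel` used in `tests/test_pytoolconfig.py`) and is not part of the package.

from dataclasses import dataclass
from typing import Optional, Tuple

from pytoolconfig import UniversalKey, field


@dataclass
class SubOptions:
    # Plain description metadata; surfaced by the documentation generator.
    name: str = field(default="demo", description="Name of the sub-tool")


@dataclass
class ExampleModel:
    # Nested tables map to nested dataclasses; use default_factory for them.
    sub: SubOptions = field(default_factory=lambda: SubOptions())
    # command_line flags are registered on an ArgumentParser passed to PyToolConfig.
    output: Optional[str] = field(
        default=None,
        description="Output path",
        command_line=("--output", "-o"),
    )
    # universal_config keys are filled from pyproject.toml [project] metadata.
    min_py: Tuple[int, int] = field(
        default=(3, 8),
        description="Minimum Python version",
        universal_config=UniversalKey.min_py_version,
    )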
pytoolconfig-1.3.1/src/pytoolconfig/py.typed0000644000000000000000000000000014550013131016137 0ustar00././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7246683 pytoolconfig-1.3.1/src/pytoolconfig/pytoolconfig.py0000644000000000000000000001110414550013131017535 0ustar00"""Tool to configure Python tools.""" from __future__ import annotations from argparse import SUPPRESS, ArgumentParser from dataclasses import is_dataclass from pathlib import Path from typing import TYPE_CHECKING, Any, Generic, Sequence, TypeVar if TYPE_CHECKING: from _typeshed import DataclassInstance DataclassT = TypeVar("DataclassT", bound=DataclassInstance) else: DataclassT = TypeVar("DataclassT") from pytoolconfig.fields import _gather_config_fields from pytoolconfig.sources import PyProject, PyTool, Source from pytoolconfig.types import ConfigField from pytoolconfig.universal_config import UniversalConfig from pytoolconfig.utils import _dict_to_dataclass, _recursive_merge class PyToolConfig(Generic[DataclassT]): """Python Tool Configuration Aggregator.""" sources: list[Source] tool: str working_directory: Path model: type[DataclassT] fall_through: bool = False arg_parser: ArgumentParser | None = None _config_fields: dict[str, ConfigField] def __init__( # noqa: PLR0913 self, tool: str, working_directory: Path, model: type[DataclassT], arg_parser: ArgumentParser | None = None, custom_sources: Sequence[Source] | None = None, global_config: bool = False, global_sources: Sequence[Source] | None = None, fall_through: bool = False, *args: Any, **kwargs: Any, ) -> None: """Initialize the configuration object. :param tool: name of the tool to use. :param working_directory: working directory in use. :param model: Model of configuration. :param arg_parser: Arugument Parser. :param custom_sources: Custom sources :param global_config: Enable global configuration :param global_sources: Custom global sources :param fall_through: Configuration options should fall through between sources. :param args: Passed to constructor for PyProject :param kwargs: Passed to constructor for PyProject """ assert is_dataclass(model) self.model = model self._config_fields = _gather_config_fields(model) self.tool = tool self.sources = [PyProject(working_directory, tool, *args, **kwargs)] if custom_sources: self.sources.extend(custom_sources) if global_config: self.sources.append(PyTool(tool)) if global_sources: self.sources.extend(global_sources) self.arg_parser = arg_parser self.fall_through = fall_through self._setup_arg_parser() def parse(self, args: list[str] | None = None) -> DataclassT: """Parse the configuration. :param args: any additional command line overwrites. 
""" configuration = self._parse_sources() assert isinstance(self.sources[0], PyProject) universal: UniversalConfig = self.sources[0].universalconfig() if self.arg_parser: if args is None: args = [] parsed = self.arg_parser.parse_args(args) for name, value in parsed._get_kwargs(): setattr(configuration, name, value) for name, field in self._config_fields.items(): if field.universal_config: universal_value = vars(universal)[field.universal_config.name] if universal_value is not None: setattr( configuration, name, universal_value, ) return configuration def _setup_arg_parser(self) -> None: if self.arg_parser: for name, field in self._config_fields.items(): if field.command_line: flags = field.command_line self.arg_parser.add_argument( *flags, type=field._type, help=field.description, default=SUPPRESS, metavar=name, dest=name, ) def _parse_sources(self) -> DataclassT: configuration = self.model() if self.fall_through: for source in reversed(self.sources): parsed = source.parse() if parsed is not None: configuration = _recursive_merge(configuration, parsed) else: for source in self.sources: parsed = source.parse() if parsed: return _dict_to_dataclass(self.model, parsed) return configuration ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7246683 pytoolconfig-1.3.1/src/pytoolconfig/sources/__init__.py0000644000000000000000000000045114550013131020246 0ustar00"""Sources for configuration files.""" from __future__ import annotations from .ini import IniConfig from .pyproject import PyProject from .pytool import PyTool from .setup_cfg import SetupConfig from .source import Source __all__ = ["PyProject", "PyTool", "IniConfig", "SetupConfig", "Source"] ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7246683 pytoolconfig-1.3.1/src/pytoolconfig/sources/ini.py0000644000000000000000000000453214550013131017272 0ustar00"""Source for INI configuration files via configparser.""" from __future__ import annotations from configparser import ConfigParser, SectionProxy from pathlib import Path from typing import Dict from pytoolconfig.sources.source import Source from pytoolconfig.types import Key from pytoolconfig.utils import find_config_file def _add_split_to_dict( dest: dict[str, Key], table_to_add: list[str], table: SectionProxy, ) -> None: if len(table_to_add) == 0: for table_key in table: dest[table_key] = table[table_key] else: first = table_to_add[0] dest.setdefault(first, {}) assert isinstance(dest[first], Dict) _add_split_to_dict(dest[first], table_to_add[1:], table) class IniConfig(Source): """Source for INI configuration files via configparser.""" _config: ConfigParser name: str description: str | None def __init__( self, working_directory: Path, filename: str, base_table: str, description: str | None = None, ) -> None: """Initialize the Ini Configuration. :param working_directory: the working directory to search. :param filename: the filename to search for. :param base_table: The table to search for. The file will only be used if this is present. The base_table will not be included in the parsed output. :param description: The description used in documentation. 
""" self.file = find_config_file(working_directory, filename) self.base_table = base_table self.name = filename self.description = description self._config = ConfigParser() def _read(self) -> bool: if self.file is None: return False self._config.read_string(self.file.read_text()) for table in self._config: split = table.split(".") if split[0] == self.base_table: return True return False def parse(self) -> dict[str, Key] | None: """Parse the INI file.""" if not self._read(): return None output: dict[str, Key] = {} for table in self._config: split = table.split(".") if split[0] == self.base_table: _add_split_to_dict(output, split[1:], self._config[table]) return output ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7246683 pytoolconfig-1.3.1/src/pytoolconfig/sources/pyproject.py0000644000000000000000000000737514550013131020542 0ustar00"""Source for pyproject.toml files or more generally toml files.""" from __future__ import annotations import sys from pathlib import Path from pytoolconfig.sources.source import Source from pytoolconfig.types import Key from pytoolconfig.universal_config import UniversalConfig from pytoolconfig.utils import ( _dict_to_dataclass, find_config_file, max_py_version, min_py_version, parse_dependencies, ) if sys.version_info < (3, 11, 0): import tomli as tomllib else: import tomllib class PyProject(Source): """Source for pyproject.toml and pytool.toml files. Can be extended to other toml files. """ tool: str toml_dict: dict | None = None name: str = "pyproject.toml" description: str = """ PEP 518 defines pyproject.toml as a configuration file to store build system requirements for Python projects. With the help of tools like Poetry or Flit it can fully replace the need for setup.py and setup.cfg files. """ # taken from black. file: Path | None def __init__( self, working_directory: Path, tool: str, bases: list[str] | None = None, recursive: bool = True, ) -> None: """Initialize the TOML configuration. :param working_directory: Working Directory :param tool: name of your tool. Will read configuration from [tool.yourtool] :param bases: Base files/folders to look for (besides pyproject.toml) :param recursive: search recursively up the directory tree for the file. """ if recursive: self.file = find_config_file(working_directory, "pyproject.toml", bases) else: self.file = working_directory / "pyproject.toml" self.tool = tool def _read(self) -> bool: if not self.file or not self.file.exists(): return False self.toml_dict = tomllib.loads(self.file.read_text()) if self.toml_dict is None: return False if "tool" not in self.toml_dict: return False return self.tool in self.toml_dict["tool"] def parse(self) -> dict[str, Key] | None: """Parse the TOML file.""" if not self._read(): return None assert self.toml_dict return self.toml_dict["tool"][self.tool] def universalconfig(self) -> UniversalConfig: """Parse the file for the universal config object's fields. Only implement the relevant fields such as minimum python version. Pre: file was read but tool isn't necessarily in file. 
""" if not self.toml_dict: return UniversalConfig() config: UniversalConfig config = ( _dict_to_dataclass(UniversalConfig, self.toml_dict["tool"]["pytoolconfig"]) if "pytoolconfig" in self.toml_dict.get("tool", {}) else UniversalConfig() ) if "project" in self.toml_dict: project = self.toml_dict["project"] if "requires-python" in project: raw_python_ver = project["requires-python"] config.min_py_version = min_py_version(raw_python_ver) config.max_py_version = max_py_version(raw_python_ver) if "dependencies" in project: dependencies = parse_dependencies(project["dependencies"]) config.dependencies = list(dependencies) if "optional-dependencies" in project: optional_deps = {} for group, deps in project["optional-dependencies"].items(): optional_deps[group] = list(parse_dependencies(deps)) config.optional_dependencies = optional_deps if "version" in project: config.version = project["version"] return config ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7246683 pytoolconfig-1.3.1/src/pytoolconfig/sources/pytool.py0000644000000000000000000000223514550013131020037 0ustar00"""Source for pytool.toml files.""" from __future__ import annotations from pathlib import Path from .pyproject import PyProject class PyTool(PyProject): """Source for pytool.toml files. Uses platformdirs to find configuration directories. """ description: str tool: str name: str file: Path def __init__(self, tool: str) -> None: """Initialize the TOML configuration. :param tool: name of your tool. Will read configuration from [tool.yourtool] """ import platformdirs self.file = Path(platformdirs.user_config_dir()) / "pytool.toml" self.name = "pytool.toml" self.tool = tool self.description = rf""" The pytool.toml file is found at Mac OS X: ~/Library/Application Support/pytool.toml Unix: ~/.config/pytool.toml # or in $XDG_CONFIG_HOME, if defined Win *: C:\Users\\AppData\Local\pytool.toml It is configured in the same fashion as your pyproject.toml. Configuration for {tool} is found in the [tool.{tool}] table. """ ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7246683 pytoolconfig-1.3.1/src/pytoolconfig/sources/setup_cfg.py0000644000000000000000000000154314550013131020471 0ustar00"""Source for setup.cfg configuration files via ini config.""" from __future__ import annotations from pathlib import Path from .ini import IniConfig class SetupConfig(IniConfig): """Source for setup.cfg configuration files via ini config.""" name: str = "setup.cfg" description = """ Setuptools allows using configuration files (usually setup.cfg) to define a package`s metadata and other options that are normally supplied to the setup() function (declarative config).""" def __init__(self, working_directory: Path, base_table: str) -> None: """Initialize the setup.cfg file as a special INI file. Args: working_directory: working directory to find the file recursively. base_table: base table to read from. 
""" super().__init__(working_directory, "setup.cfg", base_table) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7246683 pytoolconfig-1.3.1/src/pytoolconfig/sources/source.py0000644000000000000000000000156214550013131020013 0ustar00"""Base class for defining custom sources.""" from __future__ import annotations from abc import ABC, abstractmethod from pytoolconfig.types import Key class Source(ABC): """Base class for defining custom sources.""" name: str # The name of the tool for documentation description: str | None # The description, written as markdown. @abstractmethod def _read(self) -> bool: """Read the file. If file does not exist or the tool does not exist in the file, return False. Can be called multiple times and overwrite the existing cached configuration. """ @abstractmethod def parse(self) -> dict[str, Key] | None: """Parse the file for each property as a nested dict. Return None if tool is not configured in file. Otherwise, returns configuration pertaining to the tool. """ ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7246683 pytoolconfig-1.3.1/src/pytoolconfig/types.py0000644000000000000000000000215014550013131016166 0ustar00"""PyToolConfig internal definitions and functions.""" from __future__ import annotations from dataclasses import dataclass from datetime import date, datetime, time from enum import Enum, auto from typing import Any, Dict, List, Union _BaseType = Union[str, int, float, datetime, date, time, bool] _BaseTypeWithList = Union[_BaseType, List[_BaseType]] Key = Union[Dict[str, _BaseTypeWithList], _BaseTypeWithList] # We have a circular dependency preventing us from generating universal keys from # universal_config. Universal Config requires field, which requires Universal Key. class UniversalKey(Enum): """See universal config documentation.""" formatter = auto() max_line_length = auto() min_py_version = auto() max_py_version = auto() dependencies = auto() optional_dependencies = auto() version = auto() @dataclass class ConfigField: """Dataclass store and validate fields in a configuration model.""" description: str | None = None universal_config: UniversalKey | None = None command_line: tuple[str] | None = None _type: Any = None _default: Any = None ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7246683 pytoolconfig-1.3.1/src/pytoolconfig/universal_config.py0000644000000000000000000000243114550013131020361 0ustar00"""Universal Configuration base model.""" from __future__ import annotations from dataclasses import dataclass from packaging.requirements import Requirement from pytoolconfig import field @dataclass class UniversalConfig: """Universal Configuration base model.""" formatter: str | None = field(None, "Formatter used to format this File") max_line_length: int | None = field(None, description="Maximum line length") min_py_version: tuple[int, int] | None = field( None, """Minimum target python version. Requires PEP 621. Taken from project.requires-python""", ) max_py_version: tuple[int, int] | None = field( None, """Maximum target python version. Requires PEP 621. Taken from project.requires-python""", ) dependencies: list[Requirement] | None = field( None, """Dependencies of project. Requires PEP 621. Taken from project.dependencies. """, ) optional_dependencies: dict[str, list[Requirement]] | None = field( None, """Optional dependencies of project. Requires PEP 621. 
Taken from project.optional_dependencies.""", ) version: str | None = field( None, "Version of the project. Requires PEP 621. Taken from project.version.", ) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7246683 pytoolconfig-1.3.1/src/pytoolconfig/utils.py0000644000000000000000000001077714550013131016200 0ustar00"""Utility functions and classes.""" from __future__ import annotations import sys import warnings from dataclasses import Field, fields, is_dataclass, replace from enum import EnumMeta from pathlib import Path from typing import TYPE_CHECKING, Any, Generator, Mapping, TypeVar from packaging.requirements import Requirement from packaging.specifiers import SpecifierSet from .types import Key if TYPE_CHECKING: from _typeshed import DataclassInstance def find_config_file( working_directory: Path, filename: str, bases: list[str] | None = None, ) -> Path | None: """Find a configuration file given a working directory. Args: working_directory: Working directory to start from filename: Filename to look for bases: Bases to stop at Returns: Path of config file """ if bases is None: bases = [".git", ".hg"] """Recursively find the configuration file.""" target = working_directory / filename if target.exists(): return target for base in bases: if (working_directory / base).exists(): return None if working_directory == working_directory.parent: return None return find_config_file(working_directory.parent, filename, bases) def min_py_version(specifier: str) -> tuple[int, int]: """Return the minimum python 3 version. Between 3.4 and interpreter version. """ parsed = SpecifierSet(specifier) for i in range(4, sys.version_info.minor): if parsed.contains(f"3.{i}"): return (3, i) return (3, sys.version_info.minor) def max_py_version(specifier: str) -> tuple[int, int]: """Return the maximum python 3 version. Between 3.4 and interpreter version. """ parsed = SpecifierSet(specifier) for i in range(sys.version_info.minor, 4, -1): if parsed.contains(f"3.{i}"): return (3, i) return (3, 4) # Please don't cap your project at python3.4 def parse_dependencies(dependencies: list[str]) -> Generator[Requirement, None, None]: """Parse the dependencies from TOML using packaging.""" for dependency in dependencies: yield Requirement(dependency) T = TypeVar("T", bound="DataclassInstance") def _subtables(dataclass_fields: dict[str, Field]) -> dict[str, type[Any]]: return { name: field.type for name, field in dataclass_fields.items() if is_dataclass(field.type) } def _fields(dataclass: DataclassInstance | type[DataclassInstance]) -> dict[str, Field]: return {field.name: field for field in fields(dataclass) if field.init} def _format_enum(option: Any) -> str: if isinstance(option, str): return f'"{option}"' return str(option) def _dict_to_dataclass( dataclass: type[T], dictionary: Mapping[str, Key], ) -> T: filtered_arg_dict: dict[str, Any] = {} dataclass_fields = _fields(dataclass) sub_tables = _subtables(dataclass_fields) for key_name, value in dictionary.items(): if key_name in sub_tables: sub_table = sub_tables[key_name] assert isinstance(value, Mapping) filtered_arg_dict[key_name] = _dict_to_dataclass(sub_table, value) elif key_name in dataclass_fields: keytype = dataclass_fields[key_name].type if isinstance(keytype, EnumMeta): try: filtered_arg_dict[key_name] = keytype(value) except ValueError: valid = set(keytype._value2member_map_.keys()) warnings.warn( f"{value} is not a valid option for {key_name}, skipping." 
f"Valid options are: {','.join(map(_format_enum, valid))}.", stacklevel=1, ) else: filtered_arg_dict[key_name] = value return dataclass(**filtered_arg_dict) def _recursive_merge(dataclass: T, dictionary: Mapping[str, Key]) -> T: """Overwrite every value specified in dictionary on the dataclass.""" filtered_arg_dict: dict[str, Any] = {} dataclass_fields = _fields(dataclass) sub_tables = _subtables(dataclass_fields) for key_name, value in dictionary.items(): if key_name in sub_tables: sub_table = getattr(dataclass, key_name) assert isinstance(value, Mapping) filtered_arg_dict[key_name] = _recursive_merge(sub_table, value) elif key_name in dataclass_fields: filtered_arg_dict[key_name] = value return replace(dataclass, **filtered_arg_dict) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7246683 pytoolconfig-1.3.1/tests/__init__.py0000644000000000000000000000000014550013131014410 0ustar00././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7286682 pytoolconfig-1.3.1/tests/configfiles/pyproject.toml0000644000000000000000000001143214550013131017516 0ustar00[build-system] requires = ["flit_core>=3.2.0,<4"] build-backend = "flit_core.buildapi" [project] name = "tomli" version = "2.0.1" # DO NOT EDIT THIS LINE MANUALLY. LET bump2version UTILITY DO IT description = "A lil' TOML parser" authors = [ { name = "Taneli Hukkinen", email = "hukkin@users.noreply.github.com" }, ] license = { file = "LICENSE" } requires-python = ">=3.7" readme = "README.md" classifiers = [ "License :: OSI Approved :: MIT License", "Operating System :: MacOS", "Operating System :: Microsoft :: Windows", "Operating System :: POSIX :: Linux", "Programming Language :: Python :: 3 :: Only", "Programming Language :: Python :: 3.7", "Programming Language :: Python :: 3.8", "Programming Language :: Python :: 3.9", "Programming Language :: Python :: 3.10", "Programming Language :: Python :: Implementation :: CPython", "Programming Language :: Python :: Implementation :: PyPy", "Topic :: Software Development :: Libraries :: Python Modules", "Typing :: Typed", ] keywords = ["toml"] [project.urls] "Homepage" = "https://github.com/hukkin/tomli" "Changelog" = "https://github.com/hukkin/tomli/blob/master/CHANGELOG.md" [tool.pytoolconfig] formatter = "black" [tool.pytoolconfig2] option2 = true option1 = false option3 = "alternate" [tool.fall_through] foo_other = "ba" [tool.isort] # Force imports to be sorted by module, independent of import type force_sort_within_sections = true # Group first party and local folder imports together no_lines_before = ["LOCALFOLDER"] # Configure isort to work without access to site-packages known_first_party = ["tomli", "tests"] # Settings for Black compatibility profile = "black" [project.optional-dependencies] validation = ["pydantic>=1.7.4"] global = ["appdirs>=1.4.4"] doc = ["tabulate>=0.8.9", "sphinx>=4.5.0"] [tool.tox] legacy_tox_ini = ''' [tox] # Only run unittest envs when no args given to tox envlist = py{37,38,39,310} isolated_build = True [testenv:py{37,38,39,310}] description = run tests against a built package commands = python -m unittest {posargs} [testenv:profile] description = run profiler (use e.g. 
`firefox .tox/prof/output.svg` to open) deps = -r profiler/requirements.txt allowlist_externals = mkdir dot commands = mkdir -p "{toxworkdir}/prof" python -m cProfile -o "{toxworkdir}/prof/output.pstats" profiler/profiler_script.py gprof2dot -f pstats -o "{toxworkdir}/prof/output.dot" "{toxworkdir}/prof/output.pstats" dot -Tsvg -o "{toxworkdir}/prof/output.svg" "{toxworkdir}/prof/output.dot" python -c 'import pathlib; print("profiler svg output under file://\{0\}".format(pathlib.Path(r"{toxworkdir}") / "prof" / "output.svg"))' [testenv:pre-commit] description = run linters skip_install = True deps = pre-commit commands = pre-commit run {posargs:--all} [testenv:benchmark] description = run the benchmark script against a local Tomli version deps = -r benchmark/requirements.txt commands = python -c 'import datetime; print(datetime.date.today())' python --version python benchmark/run.py [testenv:benchmark-pypi] description = run the benchmark script against the latest Tomli in PyPI skip_install = True deps = tomli -r benchmark/requirements.txt commands = python -c 'import datetime; print(datetime.date.today())' python --version python benchmark/run.py [testenv:fuzz] description = run the fuzzer against a local Tomli version (needs "apt install clang") deps = -r fuzzer/requirements.txt allowlist_externals = mkdir cp commands = # Create a folder for persistent corpus and use benchmark data as initial seed mkdir -p {toxworkdir}/fuzzer-corpus cp -n benchmark/data.toml {toxworkdir}/fuzzer-corpus/data.toml # Run fuzzer python fuzzer/fuzz.py {toxworkdir}/fuzzer-corpus {posargs:-len_control=10000} ''' [tool.coverage.run] branch = true source = ['tomli'] [tool.coverage.report] # Regexes for lines to exclude from consideration exclude_lines = [ # Re-enable the standard pragma (with extra strictness) '# pragma: no cover\b', # Code for static type checkers 'if TYPE_CHECKING:', # Scripts 'if __name__ == .__main__.:', ] [tool.mypy] show_error_codes = true warn_unreachable = true warn_unused_ignores = true warn_redundant_casts = true warn_unused_configs = true # Disabling incremental mode is required for `warn_unused_configs = true` to work incremental = false disallow_untyped_defs = true check_untyped_defs = true strict_equality = true implicit_reexport = false no_implicit_optional = true [[tool.mypy.overrides]] module = "tests.*" disallow_untyped_defs = false [[tool.mypy.overrides]] # This matches `benchmark/run.py`. Since benchmark/ is # not a package, we use the module name here. module = "run" ignore_errors = true [[tool.mypy.overrides]] # This matches `fuzzer/fuzz.py`. 
module = "fuzz" ignore_errors = true ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7286682 pytoolconfig-1.3.1/tests/configfiles/pytool.toml0000644000000000000000000000004114550013131017017 0ustar00[tool.bogus.subtool] foo = "ajf" ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7286682 pytoolconfig-1.3.1/tests/configfiles/setup.cfg0000644000000000000000000000012214550013131016415 0ustar00[metadata] name = dingus [options] python_requires=">=3.8" [pytoolconfig] aw = "" ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7286682 pytoolconfig-1.3.1/tests/configfiles/test_config.ini0000644000000000000000000000017514550013131017611 0ustar00[pytoolconfig] formatter = yapf [bogus.subtool] foo=barr [bogus] foo_other=whatever # Should be ignored in fall_through mode ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7286682 pytoolconfig-1.3.1/tests/conftest.py0000644000000000000000000000050614550013131014511 0ustar00"""Setup pytest items.""" from __future__ import annotations from pathlib import Path import pytest @pytest.fixture() def cwd() -> Path: """Changes initial working directory for further tests. Returns: ------- The directory with config files. """ return Path(__file__).parent / "configfiles" ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7286682 pytoolconfig-1.3.1/tests/test_config.py0000644000000000000000000000217314550013131015172 0ustar00from __future__ import annotations import sys from pytoolconfig.sources import IniConfig, PyProject, SetupConfig def test_base_pyproject(cwd): pyproject = PyProject(cwd, "pytoolconfig", []) assert pyproject.parse()["formatter"] == "black" universal = pyproject.universalconfig() assert universal.min_py_version == (3, 7) assert universal.max_py_version == (sys.version_info.major, sys.version_info.minor) assert universal.formatter == "black" assert "sphinx" in [dep.name for dep in universal.optional_dependencies["doc"]] def test_empty_pyproject(tmp_path): pyproject_toml = tmp_path / "pyproject.toml" with pyproject_toml.open(mode="w") as f: f.write("[spam]") pyproject = PyProject(tmp_path, "pytoolconfig", []) pyproject.parse() pyproject.universalconfig() with pyproject_toml.open(mode="w") as f: pass pyproject.parse() def test_base_ini(cwd): config = IniConfig(cwd, "test_config.ini", "pytoolconfig").parse() assert config["formatter"] == "yapf" def test_setup_cfg(cwd): setup_cfg = SetupConfig(cwd, "pytoolconfig") assert setup_cfg.parse() ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7286682 pytoolconfig-1.3.1/tests/test_documentation.py0000644000000000000000000000213414550013131016573 0ustar00from dataclasses import dataclass from typing import Optional, Tuple from pytoolconfig import UniversalKey, field from pytoolconfig.documentation import _generate_table, _type_to_str @dataclass class SubTool: foo: str = field(description="foobar", default="lo") @dataclass class NestedModel: subtool: SubTool = field(default_factory=lambda: SubTool()) foo_other: Optional[str] = field( description="Tool One", default="no", command_line=("--foo", "-f"), ) min_py_ver: Tuple[int, int] = field( default=None, description="sauf", universal_config=UniversalKey.min_py_version, ) test_truth: bool = False def test_type_to_str(): assert _type_to_str(bool) == "bool" assert _type_to_str(int) == "int" assert 
_type_to_str(Tuple[int, int]) == "Tuple[int, int]" def test_documentation(): lines = list(_generate_table(NestedModel)) assert "description" in lines[1] assert "foo_other" in lines[3] assert "Tool One" in lines[3] assert "no" in lines[3] assert "str" in lines[3] assert "bool" in lines[6] ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7286682 pytoolconfig-1.3.1/tests/test_enum.py0000644000000000000000000000105414550013131014666 0ustar00from dataclasses import dataclass from enum import Enum from pytoolconfig.pytoolconfig import PyToolConfig class Demo(Enum): DISABLED = False ENABLED = True ALT = "alternate" @dataclass class EnumModel: option1: Demo = Demo.DISABLED option2: Demo = Demo.DISABLED option3: Demo = Demo.DISABLED def test_simple(cwd): config = PyToolConfig("pytoolconfig2", cwd, EnumModel) result = config.parse() assert result.option1 == Demo.DISABLED assert result.option2 == Demo.ENABLED assert result.option3 == Demo.ALT ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1704990296.7286682 pytoolconfig-1.3.1/tests/test_pytoolconfig.py0000644000000000000000000000506214550013131016441 0ustar00import os import sys from argparse import ArgumentParser from dataclasses import dataclass, fields from typing import Tuple import pytest from pytoolconfig import PyToolConfig, UniversalKey, field from pytoolconfig.sources import IniConfig from pytoolconfig.universal_config import UniversalConfig @dataclass class SimpleModel: formatter: str = "NOT THIS" @dataclass class EmptyModel: pass @dataclass class SubTool: foo: str = field(description="foobar", default="lo") @dataclass class NestedModel: subtool: SubTool = field(default_factory=lambda: SubTool()) foo_other: str = field(description="w", default="no", command_line=("--foo", "-f")) target: Tuple[int, int] = field( description="Minimum python version", default=(3, 1), universal_config=UniversalKey.min_py_version, ) def test_simple(cwd): config = PyToolConfig("pytoolconfig", cwd, SimpleModel) result = config.parse() assert result.formatter == "black" def test_invalid_key(cwd): config = PyToolConfig("pytoolconfig", cwd, EmptyModel) result = config.parse() with pytest.raises(AttributeError): assert result.formatter def test_nested(cwd): config = PyToolConfig( "bogus", cwd, NestedModel, custom_sources=[IniConfig(cwd, "test_config.ini", "bogus")], ) result = config.parse() assert result.subtool.foo == "barr" config = PyToolConfig( "bogus", cwd, NestedModel, ) result = config.parse() # Default argument assert result.subtool.foo == "lo" assert result.target == (3, 7) def test_cli(cwd): config = PyToolConfig("bogus", cwd, NestedModel, arg_parser=ArgumentParser()) result = config.parse() assert result.subtool.foo == "lo" result = config.parse(["--foo", "bar"]) assert result.foo_other == "bar" def test_global(cwd): if sys.platform != "linux": pytest.skip() os.environ["XDG_CONFIG_HOME"] = cwd.as_posix() config = PyToolConfig("bogus", cwd, NestedModel, global_config=True) result = config.parse() assert result.subtool.foo == "ajf" def test_fall_through(cwd): config = PyToolConfig( "fall_through", cwd, NestedModel, custom_sources=[IniConfig(cwd, "test_config.ini", "bogus")], fall_through=True, ) result = config.parse() assert result.subtool.foo == "barr" assert result.foo_other == "ba" def test_universal_key(): assert [field.name for field in fields(UniversalConfig)] == list( UniversalKey.__members__, ) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 
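The tests above wire in extra configuration backends through `custom_sources`. Writing one only requires subclassing `Source` (see `src/pytoolconfig/sources/source.py`) and implementing `_read` and `parse`; the in-memory `DictSource` below is a hypothetical example, not something shipped with the package.

from __future__ import annotations

from pytoolconfig.sources import Source


class DictSource(Source):
    """Hypothetical source that serves configuration from a plain dict."""

    name = "dict"
    description = "In-memory configuration, useful for tests."

    def __init__(self, data: dict) -> None:
        self._data = data

    def _read(self) -> bool:
        # "Reading" always succeeds; an empty dict means the tool is not configured.
        return bool(self._data)

    def parse(self) -> dict | None:
        return self._data if self._read() else None


# Could be passed to PyToolConfig via custom_sources=[DictSource({"formatter": "black"})].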
pytoolconfig-1.3.1/tests/test_utils.py

from pytoolconfig.utils import find_config_file, parse_dependencies


def test_find_pyproject(cwd):
    result = find_config_file(cwd.parent, "pyproject.toml", [".git"])
    assert result
    assert result == cwd.parent.parent / "pyproject.toml"


def test_parse_deps():
    deps = [
        'tomli>=2.0; python_version < "3.11"',
        "packaging>=21.3",
        'typing-extensions; python_version < "3.9"',
    ]
    assert [dep.name for dep in parse_dependencies(deps)] == [
        "tomli",
        "packaging",
        "typing-extensions",
    ]

pytoolconfig-1.3.1/PKG-INFO

Metadata-Version: 2.1
Name: pytoolconfig
Version: 1.3.1
Summary: Python tool configuration
Author-Email: bageljr <bageljr897@protonmail.com>
License: LGPL-3.0-or-later
Project-URL: Homepage, https://github.com/bageljrkhanofemus/pytoolconfig
Requires-Python: >=3.8
Requires-Dist: tomli>=2.0.1; python_version < "3.11"
Requires-Dist: packaging>=23.2
Requires-Dist: pydantic>=2.5.3; extra == "validation"
Requires-Dist: platformdirs>=3.11.0; extra == "global"
Requires-Dist: tabulate>=0.9.0; extra == "doc"
Requires-Dist: sphinx>=7.1.2; extra == "doc"
Requires-Dist: sphinx>=7.1.2; extra == "gendocs"
Requires-Dist: sphinx-autodoc-typehints>=1.25.2; extra == "gendocs"
Requires-Dist: sphinx-rtd-theme>=2.0.0; extra == "gendocs"
Requires-Dist: pytoolconfig[doc]; extra == "gendocs"
Provides-Extra: validation
Provides-Extra: global
Provides-Extra: doc
Provides-Extra: gendocs
Description-Content-Type: text/markdown

# Pytoolconfig

**Py**thon **Tool** **Config**uration

The goal of this project is to manage configuration for Python tools, such as rope, and to add support for a pyproject.toml configuration file.

[Documentation](https://pytoolconfig.readthedocs.io/en/latest/)

This library only supports Python 3.8 to 3.12. Python 3.13+ may work, but isn't tested.

## Development

`pdm install --dev -G :all` to set up the development environment

`pdm run tox` to run tests

## Alternatives

[Maison](https://dbatten5.github.io/maison/) is a similar library