python-gjson-1.0.0/.github/workflows/run-tox.yaml

name: Run tox
on:
  push:
    branches:
      - main
  pull_request:
jobs:
  run-tox:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python: ["3.9", "3.10"]
    steps:
      - uses: actions/checkout@v3
      - name: Setup Python
        uses: actions/setup-python@v3
        with:
          python-version: ${{ matrix.python }}
      - name: Install tox and any other packages
        run: pip install tox
      - name: Run tox
        run: tox
python-gjson-1.0.0/.gitignore

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# Distribution / packaging
/build/
/dist/
/.eggs/
/*.egg-info/
# Unit test / coverage reports
/.tox/
/.coverage
/.coverage.*
/.pytest_cache/
# Sphinx documentation
/doc/build/
# pyenv
/.python-version
# Environments
/.venv/
# mypy
/.mypy_cache/
# Vim
*.sw?
python-gjson-1.0.0/CHANGELOG.rst

Release Notes
=============
`v1.0.0`_ (2023-01-24)
^^^^^^^^^^^^^^^^^^^^^^
With this release ``gjson-py`` has reached essentially feature parity with `GJSON`_, with only some minor inherent
caveats, differences and limitations due to the different programming languages and technologies used. ``gjson-py``
can now also be considered stable and production-ready.
Minor improvements
""""""""""""""""""
* Modifiers: add support for missing upstream GJSON modifiers: ``@tostr``, ``@fromstr``, ``@group`` and ``@join``.
* gjson: make ``__version__`` more reliable:
* Migrate the reading of the version from ``pkg_resources`` to ``importlib`` as it's more efficient and avoids
an additional dependency.
* If the version is not detectable, fall back to reading the ``SETUPTOOLS_SCM_PRETEND_VERSION`` environment
variable, as documented in ``setuptools_scm`` (a minimal sketch of this fallback follows below).
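A minimal sketch of such a fallback (for illustration only, this is not the actual gjson-py code and the function
name is hypothetical)::

    import os
    from importlib import metadata

    def detect_version(package: str = 'gjson') -> str:
        """Return the installed package version, falling back to setuptools_scm's environment variable."""
        try:
            return metadata.version(package)
        except metadata.PackageNotFoundError:
            # Hypothetical default used when neither source is available.
            return os.environ.get('SETUPTOOLS_SCM_PRETEND_VERSION', '0.0.0')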
Bug fixes
"""""""""
* I/O: allow control characters in all JSON inputs, loading the data with ``strict=False`` both for the JSON input
and when parsing any JSON fragments present in the query.
* gjson: fix some corner cases:
* In some corner cases gjson was not correctly parsing the query string.
* Introduce a new ``GJSONInvalidSyntaxError`` exception to distinguish some special parsing errors.
* Extract the check for sequence objects into a private static method.
Miscellanea
"""""""""""
* setup.py: add ``python_requires`` to prevent the package from being installed on an unsupported version of Python.
* setup.py: update classifiers.
* setup.py: mark the package as typed according to PEP 561.
* setup.py: mark Python 3.11 as officially supported.
* tox.ini: add Python 3.11 in the test matrix of tox.
* documentation: reduce the ``maxdepth`` property of the release notes page to avoid showing all the releases on the
documentation home page.
* documentation: clarify the limits on multipath keys, specifying that in gjson-py, if a key of a multipath object is
given, it must be specified as a JSON string (e.g. ``"key":query``) and that bare words (e.g. ``key:query``) are not
accepted, although they are in GJSON.
`v0.4.0`_ (2022-11-12)
^^^^^^^^^^^^^^^^^^^^^^
New features
""""""""""""
* Query parsing: add support for the `GJSON Literals`_ feature.
* Queries: add support for nested queries. There is no limit on the number of levels queries can be nested.
Bug fixes
"""""""""
* Queries: fix bug on first element query without operator:
* When performing a query for the first element ``#(...)`` and there is no match, an exception should be raised.
* In the case of a key existence check when there is no operator (e.g. ``friends.#(nonexistent)``) an empty array was
erroneously returned instead of raising an exception.
Miscellanea
"""""""""""
* documentation: add missing method docstring.
* documentation: add note to modifiers specifying that the ``@keys`` and ``@values`` modifiers are valid only if applied
to a JSON object (mapping).
* Query parsing: simplify internal method to find a matching parentheses for queries and multipaths.
`v0.3.0`_ (2022-11-10)
^^^^^^^^^^^^^^^^^^^^^^
New features
""""""""""""
* Query parsing: add `GJSON multipaths`_ support.
Bug fixes
"""""""""
* Query parsing: fix integer mapping keys:
* When an integer key follows a hash ``#`` or a query that returns all items ``#(...)#``, return from each of the
resulting items any key that matches the integer key as a string, as per the JSON specification.
`v0.2.1`_ (2022-10-25)
^^^^^^^^^^^^^^^^^^^^^^
Bug fixes
"""""""""
* Query parsing: fix modifier options:
* Fix a bug that failed to properly extract the modifier name when options were present and the modifier was not the
first path element in the query.
`v0.2.0`_ (2022-10-24)
^^^^^^^^^^^^^^^^^^^^^^
New features
""""""""""""
* Query parsing: fully rewrite of the query parser:
* Until now the parsing mostly relied on a couple of regular expressions with lookbehind assertions to take into
account escaped characters. Although they worked fine and considerably sped up the initial development of gjson-py,
they had two major limitations:
* They could not handle all corner cases.
* They prevented the implementation of the GJSON features still missing in gjson-py.
* The parsing has been completely refactored using a more standard parser approach, which allows much finer control
over the parsing to cover all corner cases and also enables the development of the GJSON features still missing.
* There shouldn't be any difference for normal queries, but some corner cases might now return a proper error.
* Introduced a new ``GJSONParseError`` exception for parser-specific errors, which inherits from ``GJSONError`` and
also provides a graphical indication of where the parsing error occurred. Example output::

    GJSONParseError: Invalid or unsupported query part `invalid`.
    Query: name.last.invalid
    -----------------^
Minor improvements
""""""""""""""""""
* Refactor code splitting it into multiple files:
* Restructure the gjson code to split it into multiple files for ease of development and maintenance.
* Keep all the split modules as private except the exceptions one, and re-export everything from the gjson module
itself, both to keep backward compatibility and also for simplicity of use by the clients.
* Custom modifiers:
* Prevent registering custom modifiers with names that contain characters used by the GJSON grammar, raising a
``GJSONError`` exception.
Miscellanea
"""""""""""
* README: clarify naming for nested queries, based on feedback from `issue #2`_. Also fix a typo.
`v0.1.0`_ (2022-10-03)
^^^^^^^^^^^^^^^^^^^^^^
Minor improvements
""""""""""""""""""
* Modifiers: add ``@top_n`` modifier (not present in GJSON):
* Add a ``@top_n`` modifier that optionally accepts as options the number of top common items to return, e.g.
``@top_n:{"n": 5}``.
* If no options are provided, all items are returned.
* It requires a list of items as input and returns a dictionary with the unique items as keys and their counts as
values (see the usage sketch after this list).
* Modifiers: add ``@sum_n`` modifier (not present in GJSON):
* Add a ``@sum_n`` modifier that works on a sequence of objects, grouping the items that have the same value for a
given grouping key and summing the values of a sum key for each group.
* The options are mandatory and must specify the key to use for grouping and the key to use for summing:
``{"group": "somekey", "sum": "anotherkey"}``. An optional ``n`` parameter limits the result to the top N items
based on the summed value: ``{"group": "somekey", "sum": "anotherkey", "n": 5}``.
* It requires a list of objects as input and returns a dictionary with the unique grouping values as keys and the
sums of their values as values.
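A short usage sketch based on the descriptions above (the data is made up and the outputs shown are the expected
ones per those descriptions, not authoritative)::

    >>> import gjson
    >>> gjson.get(['a', 'b', 'a', 'c', 'a', 'b'], '@top_n:{"n": 2}')
    {'a': 3, 'b': 2}
    >>> orders = [{'customer': 'Tom', 'total': 10}, {'customer': 'Ann', 'total': 5}, {'customer': 'Tom', 'total': 7}]
    >>> gjson.get(orders, '@sum_n:{"group": "customer", "sum": "total"}')
    {'Tom': 17, 'Ann': 5}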
Bug fixes
"""""""""
* Output: fix unicode handling:
* Fix the default behaviour ensuring non-ASCII characters are returned as-is.
* Add a new modifier ``@ascii``, that when set will escape all non-ASCII characters.
* CLI: fix encoding handling:
* Use Python's ``surrogateescape`` error handler when reading the input and again when printing the output, to
prevent failures when parsing the input and to reduce data loss.
Miscellanea
"""""""""""
* documentation: add a mention of Debian packaging and the availability of Debian packages for the project.
* Type hints: use native types when possible. Instead of importing from ``typing``, use the native types directly
when they support the ``[]`` syntax added in Python 3.9.
* documentation: refactor the modifiers documentation to clearly split the GJSON modifiers supported by gjson-py and
the additional modifiers specific to gjson-py with more detailed explanation and example usage for the additional
ones.
* setup.py: mark project as Beta for this ``v0.1.0`` release and add an additional keyword for PyPI indexing.
`v0.0.5`_ (2022-08-05)
^^^^^^^^^^^^^^^^^^^^^^
New features
""""""""""""
* Queries: add support for the tilde operator:
* When performing queries on arrays, add support for the Go GJSON tilde operator to perform truthy-ness comparison.
* The comparison is based on Python's definition of truthy-ness, hence the actual results might differ from the ones
in the Go package.
Minor improvements
""""""""""""""""""
* documentation: add man page for the gjson binary.
`v0.0.4`_ (2022-06-11)
^^^^^^^^^^^^^^^^^^^^^^
New features
""""""""""""
* CLI: improve the JSON Lines support, allowing the ``-l/--lines`` CLI argument and the special query prefix ``..``
syntax to be used together to encapsulate each parsed line in an array and enable filtering using the Queries
capabilities.
Minor improvements
""""""""""""""""""
* CLI: the input file CLI argument is now optional, defaulting to reading from stdin, the equivalent of passing ``-``.
* Modifiers: add support for the upstream Go GJSON modifier ``@this``, that just returns the current object.
Miscellanea
"""""""""""
* Documentation: add a section with examples on how to use the CLI.
* CLI: add a link to the online documentation at the bottom of the CLI help message.
`v0.0.3`_ (2022-06-11)
^^^^^^^^^^^^^^^^^^^^^^
New features
""""""""""""
* Add CLI support for JSON Lines:
* Add a ``-l/--lines`` CLI argument to specify that the input file/stream contains one JSON document per line.
* When used, gjson applies the same query to each line.
* Depending on the verbosity level, failing lines are either completely ignored, reported with an error message
printed to stderr, or cause the execution to be interrupted at the first error, printing the full traceback.
* Add CLI support for GJSON JSON Lines queries:
* Add support for the GJSON queries that encapsulate a JSON Lines input in an array when the query starts with
``..``, so that the data can be queried in the CLI as if it were an array of objects.
* Add support for custom modifiers:
* Add a ``ModifierProtocol`` to describe the interface that custom modifier callables need to implement.
* Add a ``register_modifier()`` method in the ``GJSON`` class to register custom modifiers.
* Allow passing a dictionary of custom modifiers to the low-level ``GJSONObj`` class constructor.
* Add a ``GJSONObj.builtin_modifiers()`` static method that returns a set with the names of the built-in modifiers.
* It is not possible to register a custom modifier with the same name as a built-in modifier.
* Clarify in the documentation that only JSON objects are accepted as modifier arguments.
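A rough sketch of how a custom modifier might be registered (for illustration only: the ``shout`` modifier is
hypothetical, and both its call signature and the argument order of ``register_modifier()`` are assumptions, the
authoritative interface being the ``ModifierProtocol``)::

    >>> import gjson
    >>> def shout(options, obj, *, last):  # signature assumed, check ModifierProtocol
    ...     """Hypothetical custom modifier that upper-cases string values."""
    ...     return obj.upper() if isinstance(obj, str) else obj
    ...
    >>> source = gjson.GJSON({'name': {'first': 'Tom', 'last': 'Anderson'}})
    >>> source.register_modifier('shout', shout)
    >>> source.get('name.first.@shout')  # expected result
    'TOM'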
Bug fixes
"""""""""
* Query parsing: when using the GJSON queries syntax ``#(...)`` and ``#(...)#``, fix the return value when a
key-matching query doesn't match any element.
* Query parsing fixes/improvements found with the Python fuzzing engine Atheris:
* If any query part between delimiters is empty, error out with a specific message instead of hitting a generic
``IndexError``.
* When a query has an integer index on a mapping object and the element is not present, raise a ``GJSONError``
instead of a ``KeyError``.
* When the query has a wildcard matching, ensure that it's applied to a mapping object. Fail with a ``GJSONError``
otherwise.
* Explicitly catch malformed modifier options and raise a ``GJSONError`` instead.
* If the last part of the query is a ``#``, check that the object is actually a sequence-like object and fail with
a specific message if not.
* Ensure all the conditions are valid before attempting to extract the inner element of a sequence-like object.
Ignore both non-mapping-like objects inside the sequence and mapping-like objects that don't have the specified key.
* When parsing the query value as JSON, catch any decoding error and encapsulate it into a ``GJSONError``.
* When using the GJSON queries syntax ``#(...)`` and ``#(...)#``, also accept an empty query, following the same
behaviour as the upstream Go GJSON.
* When using the GJSON queries syntax ``#(...)`` and ``#(...)#``, closely follow the upstream Go GJSON behaviour
for all-items queries ``#(...)#`` with regex matching.
* When using the GJSON queries syntax ``#(...)`` and ``#(...)#``, fix the wildcard matching regular expression when
using pattern matching.
* Fix the regex used to match keys in presence of wildcards, escaping only the non-wildcard characters and ensuring
not to double-escape any already escaped wildcard.
* When using the GJSON queries syntax ``#(...)`` and ``#(...)#``, ensure any exception raised while comparing
incompatible objects is caught and re-raised as a ``GJSONError``.
Miscellanea
"""""""""""
* tests: when matching exception messages, always escape the string or use raw strings to avoid false matches.
* pylint: remove unnecessary comments.
`v0.0.2`_ (2022-05-31)
^^^^^^^^^^^^^^^^^^^^^^
Bug fixes
"""""""""
* ``@sort`` modifier: fix the actual sorting.
* tests: ensure that mapping-like objects are compared also by the order of their keys.
Miscellanea
"""""""""""
* GitHub actions: add workflow to run tox.
* GitHub actions: fix the branch name for pushes.
* documentation: also include the ``@sort`` modifier, which is not present in the GJSON project.
* documentation: fix link to PyPI package.
* documentation: add link to the generated docs.
* documentation: fix section hierarchy and build.
`v0.0.1`_ (2022-05-22)
^^^^^^^^^^^^^^^^^^^^^^
* Initial version.
.. _`GJSON`: https://github.com/tidwall/gjson/
.. _`GJSON Literals`: https://github.com/tidwall/gjson/blob/master/SYNTAX.md#literals
.. _`GJSON Multipaths`: https://github.com/tidwall/gjson/blob/master/SYNTAX.md#multipath
.. _`issue #2`: https://github.com/volans-/gjson-py/issues/2
.. _`v0.0.1`: https://github.com/volans-/gjson-py/releases/tag/v0.0.1
.. _`v0.0.2`: https://github.com/volans-/gjson-py/releases/tag/v0.0.2
.. _`v0.0.3`: https://github.com/volans-/gjson-py/releases/tag/v0.0.3
.. _`v0.0.4`: https://github.com/volans-/gjson-py/releases/tag/v0.0.4
.. _`v0.0.5`: https://github.com/volans-/gjson-py/releases/tag/v0.0.5
.. _`v0.1.0`: https://github.com/volans-/gjson-py/releases/tag/v0.1.0
.. _`v0.2.0`: https://github.com/volans-/gjson-py/releases/tag/v0.2.0
.. _`v0.2.1`: https://github.com/volans-/gjson-py/releases/tag/v0.2.1
.. _`v0.3.0`: https://github.com/volans-/gjson-py/releases/tag/v0.3.0
.. _`v0.4.0`: https://github.com/volans-/gjson-py/releases/tag/v0.4.0
.. _`v1.0.0`: https://github.com/volans-/gjson-py/releases/tag/v1.0.0
python-gjson-1.0.0/LICENSE

GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The GNU General Public License is a free, copyleft license for
software and other kinds of works.
The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.
To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.
Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.
For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.
Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.
Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.
The precise terms and conditions for copying, distribution and
modification follow.
TERMS AND CONDITIONS
0. Definitions.
"This License" refers to version 3 of the GNU General Public License.
"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.
"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.
To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.
A "covered work" means either the unmodified Program or a work based
on the Program.
To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.
To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.
An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.
1. Source Code.
The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.
A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.
The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.
The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.
The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.
The Corresponding Source for a work in source code form is that
same work.
2. Basic Permissions.
All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.
You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.
Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.
When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.
4. Conveying Verbatim Copies.
You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.
You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.
5. Conveying Modified Source Versions.
You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:
a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.
b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".
c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.
d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.
A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
6. Conveying Non-Source Forms.
You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:
a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.
b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.
c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.
d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.
e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.
A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.
A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.
"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.
If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).
The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.
Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
7. Additional Terms.
"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.
When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.
Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:
a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or
b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or
c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or
d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or
e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or
f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.
All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.
If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.
Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.
8. Termination.
You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).
However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.
Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.
Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.
9. Acceptance Not Required for Having Copies.
You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.
10. Automatic Licensing of Downstream Recipients.
Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.
An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.
You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.
A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".
A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.
Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.
In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.
If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.
A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.
13. Use with the GNU Affero General Public License.
Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.
14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.
If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.
Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year>  <name of author>
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail.
If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:
<program>  Copyright (C) <year>  <name of author>
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, your program's commands
might be different; for a GUI interface, you would use an "about box".
You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU GPL, see
<https://www.gnu.org/licenses/>.
The GNU General Public License does not permit incorporating your program
into proprietary programs. If your program is a subroutine library, you
may consider it more useful to permit linking proprietary applications with
the library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License. But first, please read
<https://www.gnu.org/licenses/why-not-lgpl.html>.
python-gjson-1.0.0/README.rst

.. image:: https://github.com/volans-/gjson-py/actions/workflows/run-tox.yaml/badge.svg
   :alt: CI results
   :target: https://github.com/volans-/gjson-py/actions/workflows/run-tox.yaml
Introduction
============
gjson-py is a Python package that provides a simple way to filter and extract data from JSON-like objects or JSON
files, using the `GJSON`_ syntax.
Allowing for the language differences and with some limitations, it is the Python equivalent of the Go
`GJSON`_ package.
The main difference from GJSON is that gjson-py doesn't work directly with JSON strings but instead with
JSON-like Python objects, which can either be the object resulting from calling ``json.load()`` or ``json.loads()``,
or any Python object that is JSON-serializable.
A detailed list of the GJSON features supported by gjson-py is provided below.
See also the full `gjson-py documentation`_.
Installation
------------
gjson-py is available on the `Python Package Index`_ (PyPI) and can be easily installed with::

    pip install gjson
A ``debian`` branch is also present with all the configuration to build the package for Debian-based systems.
A ``.deb`` package for the current stable and unstable Debian versions is also available for download on the
`releases page on GitHub`_.
How to use the library
----------------------
gjson-py provides different ways to perform queries on JSON-like objects.
``gjson.get()``
^^^^^^^^^^^^^^^
A quick accessor to GJSON functionalities exposed for simplicity of use. Particularly useful to perform a single
query on a given object::

    >>> import gjson
    >>> data = {'name': {'first': 'Tom', 'last': 'Anderson'}, 'age': 37}
    >>> gjson.get(data, 'name.first')
    'Tom'
It's also possible to make it return a JSON-encoded string and to decide whether, on failure, it should raise an
exception or return ``None``, as sketched below. See the full API documentation for more details.
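For example (a sketch only: the ``as_str`` and ``quiet`` keyword argument names used here are assumptions, refer to
the API documentation for the actual parameters)::

    >>> gjson.get(data, 'name.first', as_str=True)  # ask for a JSON-encoded string
    '"Tom"'
    >>> gjson.get(data, 'nonexistent', quiet=True) is None  # return None on failure instead of raising
    True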
``GJSON`` class
^^^^^^^^^^^^^^^
The ``GJSON`` class provides full access to the gjson-py API, allowing multiple queries to be performed on the same
object::

    >>> import gjson
    >>> data = {'name': {'first': 'Tom', 'last': 'Anderson'}, 'age': 37}
    >>> source = gjson.GJSON(data)
    >>> source.get('name.first')
    'Tom'
    >>> str(source)
    '{"name": {"first": "Tom", "last": "Anderson"}, "age": 37}'
    >>> source.getj('name.first')
    '"Tom"'
    >>> name = source.get_gjson('name')
    >>> name.get('first')
    'Tom'
    >>> name
See the full API documentation for more details.
How to use the CLI
------------------
gjson-py also provides a command line interface (CLI) for ease of use:

.. code-block:: console

    $ echo '{"name": {"first": "Tom", "last": "Anderson"}, "age": 37}' > test.json
    $ cat test.json | gjson 'name.first'  # Read from stdin
    "Tom"
    $ gjson test.json 'age'  # Read from a file
    37
    $ cat test.json | gjson - 'name.first'  # Explicitly read from stdin
    "Tom"
JSON Lines
^^^^^^^^^^
JSON Lines support in the CLI allows for different use cases. All the examples in this section operate on a
``test.json`` file generated with:

.. code-block:: console

    $ echo -e '{"name": "Gilbert", "age": 61}\n{"name": "Alexa", "age": 34}\n{"name": "May", "age": 57}' > test.json
Apply the same query to each line
"""""""""""""""""""""""""""""""""
Using the ``-l/--lines`` CLI argument, gjson-py applies the query to each input line and filters the data accordingly.
Lines are read one by one, so there is no memory overhead for the processing. It can be used, for example, while
tailing log files in JSON format.

.. code-block:: console

    $ gjson --lines test.json 'age'
    61
    34
    57
    $ tail -f log.json | gjson --lines 'bytes_sent'  # Dummy example
Encapsulate all lines in an array, then apply the query
"""""""""""""""""""""""""""""""""""""""""""""""""""""""
Using the special query prefix syntax ``..``, as described in GJSON's documentation for `JSON Lines`_, gjson-py will
read all lines from the input and encapsulate them into an array. This approach has of course the memory overhead of
loading the whole input to perform the query.
.. code-block:: console

    $ gjson test.json '..#.name'
    ["Gilbert", "Alexa", "May"]
Filter lines based on their values
""""""""""""""""""""""""""""""""""
Combining the ``-l/--lines`` CLI argument with the special query prefix ``..`` described above, it's possible to filter
input lines based on their values. In this case gjson-py encapsulates each line in an array so that it is possible to
use the `Queries`_ GJSON syntax to filter them. As the encapsulation is performed line by line, there is no memory
overhead. Technically, a filtered line is a line whose query had no match, so if any line is filtered the final exit
code will be ``1``.

.. code-block:: console

    $ gjson --lines test.json '..#(age>40).name'
    "Gilbert"
    "May"
Filter lines and apply query to the result
""""""""""""""""""""""""""""""""""""""""""
Combining the methods above, it is possible, for example, to filter/extract data from the lines first and then apply a
query to the aggregated result. The memory overhead in this case depends on the amount of data resulting from the first
filtering/extraction.

.. code-block:: console

    $ gjson --lines test.json 'age' | gjson '..@sort'
    [34, 57, 61]
    $ gjson --lines test.json '..#(age>40).age' | gjson '..@sort'
    [57, 61]
Query syntax
------------
For the generic query syntax refer to the original `GJSON Path Syntax`_ documentation.
Supported GJSON features
^^^^^^^^^^^^^^^^^^^^^^^^
This is the list of GJSON features and how they are supported by gjson-py:
+------------------------+------------------------+------------------------------------------------------+
| GJSON feature | Supported by gjson-py | Notes |
+========================+========================+======================================================+
| `Path Structure`_ | YES | |
+------------------------+------------------------+------------------------------------------------------+
| `Basic`_ | YES | |
+------------------------+------------------------+------------------------------------------------------+
| `Wildcards`_ | YES | |
+------------------------+------------------------+------------------------------------------------------+
| `Escape Character`_ | YES | |
+------------------------+------------------------+------------------------------------------------------+
| `Arrays`_ | YES | |
+------------------------+------------------------+------------------------------------------------------+
| `Queries`_ | YES | Using Python's operators [#]_ [#]_ |
+------------------------+------------------------+------------------------------------------------------+
| `Dot vs Pipe`_ | YES | |
+------------------------+------------------------+------------------------------------------------------+
| `Modifiers`_ | YES | See the table below for all the details |
+------------------------+------------------------+------------------------------------------------------+
| `Modifier arguments`_ | YES | Only a JSON object is accepted as argument |
+------------------------+------------------------+------------------------------------------------------+
| `Custom modifiers`_ | YES | Only a JSON object is accepted as argument [#]_ |
+------------------------+------------------------+------------------------------------------------------+
| `Multipaths`_ | YES | Object keys, if specified, must be JSON strings [#]_ |
+------------------------+------------------------+------------------------------------------------------+
| `Literals`_ | YES | Including infinite and NaN values [#]_ |
+------------------------+------------------------+------------------------------------------------------+
| `JSON Lines`_ | YES | CLI support [#]_ [#]_ |
+------------------------+------------------------+------------------------------------------------------+
.. [#] The query matching is based on Python's operators and as such the results might differ from those of the Go
GJSON package, in particular for the ``~`` operator that checks the truthiness of objects.
.. [#] When using nested queries, only the outermost one controls whether to return only the first item or all items.
.. [#] Custom modifier names cannot contain reserved characters used by the GJSON grammar.
.. [#] For example ``{"years":age}`` is valid while ``{years:age}`` is not, although that's valid in GJSON.
.. [#] Those special cases are handled according to `Python's JSON documentation`_.
.. [#] Both for applying the same query to each line using the ``-l/--lines`` argument and for automatically
encapsulating the input lines in a list and applying the query to that list using the ``..`` special query prefix
described in `JSON Lines`_.
.. [#] Library support is not currently present because gjson-py accepts only Python objects, making it impossible to
pass JSON Lines directly. The client is free to choose whether to call gjson-py once per line or to encapsulate the
lines in a list before calling gjson-py, as shown in the sketch below.
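When using the library directly, a minimal sketch of this client-side handling could look like the following (the
input data and the query are purely illustrative):
.. code-block:: python
import json
import gjson
# Hypothetical JSON Lines input, pre-parsed by the client into a list of Python objects.
lines = ['{"age": 61}', '{"age": 34}', '{"age": 57}']
data = [json.loads(line) for line in lines]
print(gjson.get(data, '#.age'))  # [61, 34, 57]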
This is the list of modifiers present in GJSON and how they are supported by gjson-py:
+----------------+-----------------------+------------------------------------------+
| GJSON Modifier | Supported by gjson-py | Notes |
+================+=======================+==========================================+
| ``@reverse`` | YES | |
+----------------+-----------------------+------------------------------------------+
| ``@ugly`` | YES | |
+----------------+-----------------------+------------------------------------------+
| ``@pretty`` | PARTIALLY | The ``width`` argument is not supported |
+----------------+-----------------------+------------------------------------------+
| ``@this`` | YES | |
+----------------+-----------------------+------------------------------------------+
| ``@valid`` | YES | |
+----------------+-----------------------+------------------------------------------+
| ``@flatten`` | YES | |
+----------------+-----------------------+------------------------------------------+
| ``@join`` | PARTIALLY | Preserving duplicate keys not supported |
+----------------+-----------------------+------------------------------------------+
| ``@keys`` | YES | Valid only on JSON objects (mappings) |
+----------------+-----------------------+------------------------------------------+
| ``@values`` | YES | Valid only on JSON objects (mappings) |
+----------------+-----------------------+------------------------------------------+
| ``@tostr`` | YES | |
+----------------+-----------------------+------------------------------------------+
| ``@fromstr`` | YES | |
+----------------+-----------------------+------------------------------------------+
| ``@group`` | YES | |
+----------------+-----------------------+------------------------------------------+
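As an illustration of modifier arguments, the ``@pretty`` options supported by gjson-py can be passed as a JSON
object. A minimal sketch using the Python API (the data is arbitrary):
.. code-block:: python
import gjson
# Hypothetical data; 'sortKeys' and 'indent' are @pretty options supported by gjson-py.
data = {'b': 1, 'a': 2}
print(gjson.get(data, '@pretty:{"sortKeys": true, "indent": 4}', as_str=True))
# {
#     "a": 2,
#     "b": 1
# }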
Additional features
^^^^^^^^^^^^^^^^^^^
Additional modifiers
""""""""""""""""""""
This is the list of additional modifiers specific to gjson-py not present in GJSON:
* ``@ascii``: escapes all non-ASCII characters when printing/returning the string representation of the object,
ensuring that the output is made only of ASCII characters. It's implemented using the ``ensure_ascii`` argument of
Python's ``json`` module. This modifier doesn't accept any arguments.
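For instance, using the Python API (an illustrative sketch; the sample value is arbitrary):
.. code-block:: python
import gjson
# Hypothetical data containing a non-ASCII character; the @ascii modifier escapes it in the output.
print(gjson.get({'name': 'héllo'}, 'name.@ascii', as_str=True))  # "h\u00e9llo"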
* ``@sort``: sorts a mapping object by its keys or a sequence object by its values. This modifier doesn't accept any
arguments.
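A quick illustration with the Python API (again, the sample data is arbitrary):
.. code-block:: python
import gjson
# Hypothetical mapping; @sort returns it with the keys in sorted order.
print(gjson.get({'b': 2, 'a': 1}, '@sort', as_str=True))  # {"a": 1, "b": 2}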
* ``@top_n``: given a sequence object, groups the items in the sequence counting how many occurrences of each value are
present. It returns a mapping object where the keys are the distinct values of the list and the values are the number
of times the key was present in the list, ordered from the most common to the least common item. The items in the
original sequence object must be Python hashable. This modifier accepts an optional argument ``n`` to return just the
N items with the highest counts. When the ``n`` argument is not provided, all items are returned. Example usage:
.. code-block:: console
$ echo '["a", "b", "c", "b", "c", "c"]' | gjson '@top_n'
{"c": 3, "b": 2, "a": 1}
$ echo '["a", "b", "c", "b", "c", "c"]' | gjson '@top_n:{"n":2}'
{"c": 3, "b": 2}
* ``@sum_n``: given a sequence of objects, groups the items in the sequence using a grouping key and sums the values of
a provided sum key. It returns a mapping object where the keys are the distinct values of the grouping key and the
values are the sums of all the values of the sum key for each distinct grouped key, ordered from the highest sum to
the lowest. The values of the grouping key must be Python hashable. The values of the sum key must be integers or
floats. This modifier requires two mandatory arguments, ``group`` and ``sum``, whose values are the respective keys
in the objects of the sequence. An optional ``n`` argument is also accepted to return just the top N items with the
highest sum. Example usage:
.. code-block:: console
$ echo '[{"key": "a", "time": 1}, {"key": "b", "time": 2}, {"key": "c", "time": 3}, {"key": "a", "time": 4}]' > test.json
$ gjson test.json '@sum_n:{"group": "key", "sum": "time"}'
{"a": 5, "c": 3, "b": 2}
$ gjson test.json '@sum_n:{"group": "key", "sum": "time", "n": 2}'
{"a": 5, "c": 3}
.. _`GJSON`: https://github.com/tidwall/gjson
.. _`Python Package Index`: https://pypi.org/project/gjson/
.. _`GJSON Path Syntax`: https://github.com/tidwall/gjson/blob/master/SYNTAX.md
.. _`gjson-py documentation`: https://volans-.github.io/gjson-py/index.html
.. _`releases page on GitHub`: https://github.com/volans-/gjson-py/releases
.. _`Python's JSON documentation`: https://docs.python.org/3/library/json.html#infinite-and-nan-number-values
.. _`Path Structure`: https://github.com/tidwall/gjson/blob/master/SYNTAX.md#path-structure
.. _`Basic`: https://github.com/tidwall/gjson/blob/master/SYNTAX.md#basic
.. _`Wildcards`: https://github.com/tidwall/gjson/blob/master/SYNTAX.md#wildcards
.. _`Escape Character`: https://github.com/tidwall/gjson/blob/master/SYNTAX.md#escape-character
.. _`Arrays`: https://github.com/tidwall/gjson/blob/master/SYNTAX.md#arrays
.. _`Queries`: https://github.com/tidwall/gjson/blob/master/SYNTAX.md#queries
.. _`Dot vs Pipe`: https://github.com/tidwall/gjson/blob/master/SYNTAX.md#dot-vs-pipe
.. _`Modifiers`: https://github.com/tidwall/gjson/blob/master/SYNTAX.md#modifiers
.. _`Modifier arguments`: https://github.com/tidwall/gjson/blob/master/SYNTAX.md#modifiers
.. _`Custom modifiers`: https://github.com/tidwall/gjson/blob/master/SYNTAX.md#custom-modifiers
.. _`Multipaths`: https://github.com/tidwall/gjson/blob/master/SYNTAX.md#multipaths
.. _`Literals`: https://github.com/tidwall/gjson/blob/master/SYNTAX.md#literals
.. _`JSON Lines`: https://github.com/tidwall/gjson#json-lines
python-gjson-1.0.0/doc/ 0000775 0000000 0000000 00000000000 14363731677 0014735 5 ustar 00root root 0000000 0000000 python-gjson-1.0.0/doc/Makefile 0000664 0000000 0000000 00000001206 14363731677 0016374 0 ustar 00root root 0000000 0000000 # Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR = doc/source
BUILDDIR = doc/build
# Put it first so that "make" without argument is like "make help".
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
.PHONY: help Makefile
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
python-gjson-1.0.0/doc/source/ 0000775 0000000 0000000 00000000000 14363731677 0016235 5 ustar 00root root 0000000 0000000 python-gjson-1.0.0/doc/source/api.rst 0000664 0000000 0000000 00000000037 14363731677 0017540 0 ustar 00root root 0000000 0000000 API
===
.. automodule:: gjson
python-gjson-1.0.0/doc/source/cli.rst 0000664 0000000 0000000 00000000143 14363731677 0017534 0 ustar 00root root 0000000 0000000 CLI
===
.. argparse::
:module: gjson._cli
:func: get_parser
:prog: gjson
:nodefault:
python-gjson-1.0.0/doc/source/conf.py 0000664 0000000 0000000 00000007327 14363731677 0017545 0 ustar 00root root 0000000 0000000 # Configuration file for the Sphinx documentation builder.
#
# This file only contains a selection of the most common options. For a full
# list see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html
import sys
from pathlib import Path
import sphinx_rtd_theme
from pkg_resources import get_distribution
# -- Path setup --------------------------------------------------------------
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))
sys.path.insert(0, Path(__file__).resolve().parent.parent)
# -- Project information -----------------------------------------------------
project = 'gjson'
title = f'{project} Documentation'
copyright = '2022, Riccardo Coccioli'
author = 'Riccardo Coccioli'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The full version, including alpha/beta/rc tags.
release = get_distribution('gjson').version
# The short X.Y version.
version = release
# -- General configuration ---------------------------------------------------
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
'sphinx.ext.autodoc',
'sphinx.ext.napoleon',
'sphinx.ext.intersphinx',
'sphinx.ext.todo',
'sphinx.ext.coverage',
'sphinx.ext.viewcode',
'sphinxarg.ext',
'sphinx_autodoc_typehints',
]
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This pattern also affects html_static_path and html_extra_path.
exclude_patterns = []
# -- Options for manual page output ---------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
('cli', 'gjson', 'filter and extract data from JSON-like files', [author], 1),
]
# -- Options for HTML output -------------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
html_theme = 'sphinx_rtd_theme'
html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = []
intersphinx_mapping = {
'python': ('https://docs.python.org/3/', None),
}
# Napoleon settings
napoleon_google_docstring = True
napoleon_numpy_docstring = False
napoleon_include_init_with_doc = False
napoleon_include_private_with_doc = False
napoleon_include_special_with_doc = False
napoleon_use_admonition_for_examples = False
napoleon_use_admonition_for_notes = False
napoleon_use_admonition_for_references = False
napoleon_use_ivar = False
napoleon_use_param = True
napoleon_use_keyword = True
napoleon_use_rtype = True
napoleon_type_aliases = None
napoleon_attr_annotations = True
# Autodoc settings
autodoc_default_options = {
# Using None as value instead of True to support the version of Sphinx used in Buster
'members': None,
'member-order': 'bysource',
'show-inheritance': None,
'special-members': '__str__,__version__,__call__',
}
autoclass_content = 'both'
python-gjson-1.0.0/doc/source/index.rst 0000664 0000000 0000000 00000000252 14363731677 0020075 0 ustar 00root root 0000000 0000000 gjson-py |release| documentation
================================
.. toctree::
:maxdepth: 3
introduction
cli
api
.. toctree::
:maxdepth: 1
release
python-gjson-1.0.0/doc/source/introduction.rst 0000664 0000000 0000000 00000000036 14363731677 0021507 0 ustar 00root root 0000000 0000000 .. include:: ../../README.rst
python-gjson-1.0.0/doc/source/release.rst 0000664 0000000 0000000 00000000041 14363731677 0020402 0 ustar 00root root 0000000 0000000 .. include:: ../../CHANGELOG.rst
python-gjson-1.0.0/gjson/ 0000775 0000000 0000000 00000000000 14363731677 0015310 5 ustar 00root root 0000000 0000000 python-gjson-1.0.0/gjson/__init__.py 0000664 0000000 0000000 00000020271 14363731677 0017423 0 ustar 00root root 0000000 0000000 """GJSON module."""
import json
import os
import re
from importlib.metadata import PackageNotFoundError, version
from typing import Any
from gjson._gjson import GJSONObj, MODIFIER_NAME_RESERVED_CHARS
from gjson._protocols import ModifierProtocol
from gjson.exceptions import GJSONError, GJSONParseError
# Explicit export of modules for the import * syntax, custom order to force the documentation order
__all__ = ['get', 'GJSON', 'GJSONError', 'GJSONParseError', 'ModifierProtocol', 'GJSONObj', '__version__']
# TODO: use a proper type hint for obj once https://github.com/python/typing/issues/182 will be fixed
def get(obj: Any, query: str, *, as_str: bool = False, quiet: bool = False) -> Any:
"""Quick accessor to GJSON functionalities exposed for simplicity of use.
Examples:
Import and directly use this quick helper for the simpler usage::
>>> import gjson
>>> data = {'items': [{'name': 'a', 'size': 1}, {'name': 'b', 'size': 2}]}
>>> gjson.get(data, 'items.#.size')
[1, 2]
Arguments:
obj: the object to query. It must be accessible in JSON-like fashion so it must be an object that can be
converted to JSON.
query: the query string to evaluate to extract the data from the object.
as_str: if set to :py:data:`True` returns a JSON-encoded string, a Python object otherwise.
quiet: if set to :py:data:`True`, returns :py:data:`None` on error instead of raising a
:py:class:`gjson.GJSONError` exception.
Return:
the resulting object.
"""
gjson_obj = GJSON(obj)
if as_str:
return gjson_obj.getj(query, quiet=quiet)
return gjson_obj.get(query, quiet=quiet)
class GJSON:
"""The GJSON class to operate on JSON-like objects."""
def __init__(self, obj: Any):
"""Initialize the instance with the given object.
Examples:
Use the :py:class:`gjson.GJSON` class for more complex usage or to perform multiple queries on the same
object::
>>> import gjson
>>> data = {'items': [{'name': 'a', 'size': 1}, {'name': 'b', 'size': 2}]}
>>> gjson_obj = gjson.GJSON(data)
Arguments:
obj: the object to query.
"""
self._obj = obj
self._custom_modifiers: dict[str, ModifierProtocol] = {}
def __str__(self) -> str:
"""Return the current object as a JSON-encoded string.
Examples:
Converting to string a :py:class:`gjson.GJSON` object returns it as a JSON-encoded string::
>>> str(gjson_obj)
'{"items": [{"name": "a", "size": 1}, {"name": "b", "size": 2}]}'
Returns:
the JSON-encoded string representing the instantiated object.
"""
return json.dumps(self._obj, ensure_ascii=False)
def get(self, query: str, *, quiet: bool = False) -> Any:
"""Perform a query on the instantiated object and return the resulting object.
Examples:
Perform a query and get the resulting object::
>>> gjson_obj.get('items.#.size')
[1, 2]
Arguments:
query: the GJSON query to apply to the object.
quiet: whether to raise a :py:class:`gjson.GJSONError` exception on error or just return :py:data:`None` in
case of error.
Raises:
gjson.GJSONError: on error if the quiet parameter is not :py:data:`True`.
Returns:
the resulting object or :py:data:`None` if the ``quiet`` parameter is :py:data:`True` and there was an
error.
"""
try:
return GJSONObj(self._obj, query, custom_modifiers=self._custom_modifiers).get()
except GJSONError:
if quiet:
return None
raise
def getj(self, query: str, *, quiet: bool = False) -> str:
"""Perform a query on the instantiated object and return the resulting object as JSON-encoded string.
Examples:
Perform a query and get the resulting object as a JSON-encoded string::
>>> gjson_obj.getj('items.#.size')
'[1, 2]'
Arguments:
query: the GJSON query to apply to the object.
quiet: whether to raise a :py:class:`gjson.GJSONError` exception on error or just return an empty string in
case of error.
Raises:
gjson.GJSONError: on error if the quiet parameter is not :py:data:`True`.
Returns:
the JSON-encoded string representing the resulting object or an empty string if the ``quiet`` parameter is
:py:data:`True` and there was an error.
"""
try:
return str(GJSONObj(self._obj, query, custom_modifiers=self._custom_modifiers))
except GJSONError:
if quiet:
return ''
raise
def get_gjson(self, query: str, *, quiet: bool = False) -> 'GJSON':
"""Perform a query on the instantiated object and return the resulting object as a GJSON instance.
Examples:
Perform a query and get the resulting object already encapsulated into a :py:class:`gjson.GJSON` object::
>>> sizes = gjson_obj.get_gjson('items.#.size')
>>> str(sizes)
'[1, 2]'
>>> sizes.get('0')
1
Arguments:
query: the GJSON query to apply to the object.
quiet: whether to raise a :py:class:`gjson.GJSONError` exception on error or just encapsulate :py:data:`None`
in case of error.
Raises:
gjson.GJSONError: on error if the quiet parameter is not :py:data:`True`.
Returns:
the resulting object encapsulated as a :py:class:`gjson.GJSON` object; if the ``quiet`` parameter is
:py:data:`True` and there was an error, the encapsulated object is :py:data:`None`.
"""
return GJSON(self.get(query, quiet=quiet))
def register_modifier(self, name: str, func: ModifierProtocol) -> None:
"""Register a custom modifier.
Examples:
Register a custom modifier that sums all the numbers in a list:
>>> def custom_sum(options, obj, *, last):
... # insert sanity checks code here
... return sum(obj)
...
>>> gjson_obj.register_modifier('sum', custom_sum)
>>> gjson_obj.get('items.#.size.@sum')
3
Arguments:
name: the modifier name. It will be called where ``@name`` is used in the query. If two custom modifiers
are registered with the same name the last one will be used.
func: the modifier code in the form of a callable object that adhere to the
:py:class:`gjson.ModifierProtocol`.
Raises:
gjson.GJSONError: if the modifier name contains reserved characters, clashes with a built-in modifier, or the
provided callable doesn't adhere to the :py:class:`gjson.ModifierProtocol`.
"""
# Escape the ] as they are inside a [...] block
not_allowed_regex = ''.join(MODIFIER_NAME_RESERVED_CHARS).replace(']', r'\]')
if re.search(fr'[{not_allowed_regex}]', name):
not_allowed_string = ', '.join(f'`{i}`' for i in MODIFIER_NAME_RESERVED_CHARS)
raise GJSONError(f'Unable to register modifier `{name}`, contains at least one not allowed character: '
f'{not_allowed_string}')
if name in GJSONObj.builtin_modifiers():
raise GJSONError(f'Unable to register a modifier with the same name of the built-in modifier: @{name}.')
if not isinstance(func, ModifierProtocol):
raise GJSONError(f'The given func "{func}" for the custom modifier @{name} does not adhere '
'to the gjson.ModifierProtocol.')
self._custom_modifiers[name] = func
try:
__version__: str = version('gjson')
"""str: the version of the current gjson module."""
except PackageNotFoundError: # pragma: no cover - this should never happen during tests
# Read the override from the environment, if present (like inside Debian build system)
if 'SETUPTOOLS_SCM_PRETEND_VERSION' in os.environ:
__version__ = os.environ['SETUPTOOLS_SCM_PRETEND_VERSION']
python-gjson-1.0.0/gjson/_cli.py 0000664 0000000 0000000 00000010765 14363731677 0016601 0 ustar 00root root 0000000 0000000 """GJSON module."""
import argparse
import json
import sys
from collections.abc import Sequence
from typing import Any, IO, Optional
from gjson import get, GJSONError
def cli(argv: Optional[Sequence[str]] = None) -> int: # noqa: MC0001
"""Command line entry point to run gjson as a CLI tool.
Arguments:
argv: a sequence of CLI arguments to parse. If not set they will be read from sys.argv.
Returns:
The CLI exit code to use.
Raises:
OSError: for system-related error, including I/O failures.
json.JSONDecodeError: when the input data is not a valid JSON.
gjson.GJSONError: for any query-related error in gjson.
"""
parser = get_parser()
args = parser.parse_args(argv)
encapsulate = False
if args.query.startswith('..'):
args.query = args.query[2:]
encapsulate = True
# Use argparse.FileType here instead of putting it as type in the --file argument parsing, to allow to handle the
# verbosity in case of error and make sure the file is always closed in case other arguments fail the validation.
try:
args.file = argparse.FileType(encoding='utf-8', errors='surrogateescape')(args.file)
except (OSError, argparse.ArgumentTypeError) as ex:
if args.verbose == 1:
print(f'{ex.__class__.__name__}: {ex}', file=sys.stderr)
elif args.verbose >= 2:
raise
return 1
# Reconfigure __stdin__ and __stdout__ instead of stdin and stdout because the latter are TextIO and could lack
# the reconfigure() method if re-assigned, while reconfigure() is part of TextIOWrapper.
# See also: https://github.com/python/typeshed/pull/8171
sys.__stdin__.reconfigure(errors='surrogateescape')
sys.__stdout__.reconfigure(errors='surrogateescape')
def _execute(line: str, file_obj: Optional[IO[Any]]) -> int:
try:
if encapsulate:
if line:
input_data = [json.loads(line, strict=False)]
elif file_obj is not None:
input_data = []
for input_line in file_obj:
if input_line.strip():
input_data.append(json.loads(input_line, strict=False))
else:
if line:
input_data = json.loads(line, strict=False)
elif file_obj is not None:
input_data = json.load(file_obj, strict=False)
result = get(input_data, args.query, as_str=True)
exit_code = 0
except (json.JSONDecodeError, GJSONError) as ex:
result = ''
exit_code = 1
if args.verbose == 1:
print(f'{ex.__class__.__name__}: {ex}', file=sys.stderr)
elif args.verbose >= 2:
raise
if result:
print(result)
return exit_code
if args.lines:
exit_code = 0
for line in args.file:
line = line.strip()
if not line:
continue
ret = _execute(line, None)
if ret > exit_code:
exit_code = ret
else:
exit_code = _execute('', args.file)
return exit_code
def get_parser() -> argparse.ArgumentParser:
"""Get the CLI argument parser.
Returns:
the argument parser for the CLI.
"""
parser = argparse.ArgumentParser(
prog='gjson',
description=('A simple way to filter and extract data from JSON-like data structures. Python porting of the '
'Go GJSON package.'),
epilog='See also the full documentation available at https://volans-.github.io/gjson-py/index.html',
formatter_class=argparse.ArgumentDefaultsHelpFormatter,
)
parser.add_argument('-v', '--verbose', action='count', default=0,
help=('Verbosity level. By default on error no output will be printed. Use -v to get the '
'error message to stderr and -vv to get the full traceback.'))
parser.add_argument('-l', '--lines', action='store_true',
help='Treat the input as JSON Lines, parse each line and apply the query to each line.')
# argparse.FileType is used later to parse this argument.
parser.add_argument('file', default='-', nargs='?',
help='Input JSON file to query. Reads from stdin if the argument is missing or set to "-".')
parser.add_argument('query', help='A GJSON query to apply to the input data.')
return parser
python-gjson-1.0.0/gjson/_gjson.py 0000664 0000000 0000000 00000170345 14363731677 0017153 0 ustar 00root root 0000000 0000000 """GJSON module."""
import json
import operator
import re
from collections import Counter
from collections.abc import Callable, Mapping, Sequence
from dataclasses import dataclass
from itertools import zip_longest
from typing import Any, Optional, Union
from gjson._protocols import ModifierProtocol
from gjson.exceptions import GJSONError, GJSONInvalidSyntaxError, GJSONParseError
ESCAPE_CHARACTER = '\\'
"""str: The grammar escape character."""
DOT_DELIMITER = '.'
"""str: One of the available delimiters in the query grammar."""
PIPE_DELIMITER = '|'
"""str: One of the available delimiters in the query grammar."""
DELIMITERS = (DOT_DELIMITER, PIPE_DELIMITER)
"""tuple: All the available delimiters in the query grammar."""
MULTIPATHS_DELIMITERS = DELIMITERS + (']', '}', ',')
"""tuple: All the available delimiters in the query grammar."""
# Single character operators goes last to avoid mis-detection.
QUERIES_OPERATORS = ('==~', '==', '!=', '<=', '>=', '!%', '=', '<', '>', '%')
"""tuple: The list of supported operators inside queries."""
MODIFIER_NAME_RESERVED_CHARS = ('"', ',', '.', '|', ':', '@', '{', '}', '[', ']', '(', ')')
"""tuple: The list of reserver characters not usable in a modifier's name."""
PARENTHESES_PAIRS = {'(': ')', ')': '(', '[': ']', ']': '[', '{': '}', '}': '{'}
class NoResult:
"""A no result type to be passed around and be checked."""
@dataclass
class BaseQueryPart:
"""Base dataclass class to represent a query part."""
start: int
end: int
part: str
delimiter: str
previous: Optional['BaseQueryPart']
is_last: bool
def __str__(self) -> str:
"""String representation of the part.
Returns:
The part property of the instance.
"""
return self.part
class FieldQueryPart(BaseQueryPart):
"""Basic field path query part."""
class ArrayLenghtQueryPart(BaseQueryPart):
"""Hash query part, to get the size of an array."""
class ArrayIndexQueryPart(BaseQueryPart):
"""Integer query part to get an array index."""
@property
def index(self) -> int:
"""Return the integer representation of the query part.
Returns:
the index as integer.
"""
return int(self.part)
@dataclass
class ArrayQueryQueryPart(BaseQueryPart):
"""Query part for array queries, with additional fields."""
field: str
operator: str
value: Union[str, 'ArrayQueryQueryPart']
first_only: bool
@dataclass
class ModifierQueryPart(BaseQueryPart):
"""Modifier query part."""
name: str
options: dict[Any, Any]
@dataclass
class MultipathsItem:
"""Single multipaths query item."""
key: str
values: list[BaseQueryPart]
@dataclass
class MultipathsObjectQueryPart(BaseQueryPart):
"""JSON object multipaths query part."""
parts: list[MultipathsItem]
@dataclass
class MultipathsArrayQueryPart(BaseQueryPart):
"""JSON object multipaths query part."""
parts: list[list[BaseQueryPart]]
class LiteralQueryPart(BaseQueryPart):
"""Literal query part."""
class GJSONObj:
"""A low-level class to perform the GJSON query on a JSON-like object."""
# pylint: disable=too-many-branches,too-many-statements,too-many-locals
def __init__(self, obj: Any, query: str, *, custom_modifiers: Optional[dict[str, ModifierProtocol]] = None):
"""Initialize the instance with the starting object and query.
Examples:
Client code should not need to instantiate this low-level class in normal circumstances::
>>> import gjson
>>> data = {'items': [{'name': 'a', 'size': 1}, {'name': 'b', 'size': 2}]}
>>> gjson_obj = gjson.GJSONObj(data, 'items.#.size')
Arguments:
obj: the JSON-like object to query.
query: the GJSON query to apply to the object.
custom_modifiers: an optional dictionary with the custom modifiers to load. The dictionary keys are the
names of the modifiers and the values are the callables with the modifier code that adhere to the
:py:class:`gjson.ModifierProtocol` protocol.
Raises:
gjson.GJSONError: if any provided custom modifier overrides a built-in one or is not callable.
"""
self._obj = obj
self._query = query
if custom_modifiers is not None:
if (intersection := self.builtin_modifiers().intersection(set(custom_modifiers.keys()))):
raise GJSONError(f'Some provided custom_modifiers have the same name of built-in ones: {intersection}.')
for name, modifier in custom_modifiers.items():
if not isinstance(modifier, ModifierProtocol):
raise GJSONError(f'The given func "{modifier}" for the custom modifier @{name} does not adhere '
'to the gjson.ModifierProtocol.')
self._custom_modifiers = custom_modifiers if custom_modifiers else {}
self._dump_params: dict[str, Any] = {'ensure_ascii': False}
self._after_hash = False
self._after_query_all = False
@classmethod
def builtin_modifiers(cls) -> set[str]:
"""Return the names of the built-in modifiers.
Returns:
the names of the built-in modifiers.
"""
prefix = '_apply_modifier_'
return {modifier[len(prefix):] for modifier in dir(cls) if modifier.startswith(prefix)}
def get(self) -> Any:
"""Perform the query and return the resulting object.
Examples:
Returns the resulting object::
>>> gjson_obj.get()
[1, 2]
Raises:
gjson.GJSONError: on error.
Returns:
the resulting object.
"""
# Reset internal parameters
self._dump_params = {'ensure_ascii': False}
self._after_hash = False
self._after_query_all = False
if not self._query:
raise GJSONError('Empty query.')
obj = self._obj
for part in self._parse(start=0, end=len(self._query) - 1):
obj = self._parse_part(part, obj)
return obj
def __str__(self) -> str:
"""Return the JSON string representation of the object, based on the query parameters.
Examples:
Returns the resulting object as a JSON-encoded string::
>>> str(gjson_obj)
'[1, 2]'
Raises:
gjson.GJSONError: on error.
Returns:
the JSON encoded string.
"""
obj = self.get()
prefix = self._dump_params.pop('prefix', '')
json_string = json.dumps(obj, **self._dump_params)
if prefix:
json_string = '\n'.join(f'{prefix}{line}' for line in json_string.splitlines())
return json_string
def _parse(self, *, start: int, end: int, max_end: int = 0, delimiter: str = '',
in_multipaths: bool = False) -> list[BaseQueryPart]:
"""Main parser of the query that will delegate to more specific parsers for each different feature.
Arguments:
start: the start position in the query.
end: the end position in the query.
max_end: an optional last position up to where a closing parentheses can be searched.
delimiter: the optional delimiter before the query, if this is called on a multipaths.
in_multipaths: whether the part to be parsed is inside a multipaths.
Raises:
gjson.GJSONParseError: on error.
Returns:
the resulting object.
"""
current: list[str] = []
current_start = -1
parts: list[BaseQueryPart] = []
previous: Optional[BaseQueryPart] = None
require_delimiter = False
i = start
while True:
part: Optional[BaseQueryPart] = None
# Get current and next character in the query
if i == end:
next_char = None
elif i >= end:
if parts and not current:
parts[-1].is_last = True
break
else:
next_char = self._query[i + 1]
char = self._query[i]
if char in DELIMITERS:
if i == start:
raise GJSONParseError('Invalid query starting with a path delimiter.',
query=self._query, position=i)
if next_char in DELIMITERS:
raise GJSONParseError('Invalid query with two consecutive path delimiters.',
query=self._query, position=i)
if current:
part = FieldQueryPart(start=current_start, end=i - 1, part=''.join(current),
delimiter=delimiter, previous=previous, is_last=False)
parts.append(part)
previous = part
current = []
current_start = -1
delimiter = char
require_delimiter = False
if next_char is None:
raise GJSONParseError('Delimiter at the end of the query.', query=self._query, position=i)
i += 1
continue
if char == '@':
part = self._parse_modifier_query_part(i, delimiter, max_end=max_end, in_multipaths=in_multipaths)
elif char == '#' and (next_char in DELIMITERS or next_char is None):
part = ArrayLenghtQueryPart(start=i, end=i, part=char, delimiter=delimiter, is_last=next_char is None,
previous=previous)
elif char == '#' and next_char == '(':
part = self._parse_array_query_query_part(i, delimiter, max_end=max_end)
elif re.match(r'[0-9]', char) and not current:
part = self._parse_array_index_query_part(i, delimiter, in_multipaths=in_multipaths)
elif char == '{':
part = self._parse_object_multipaths_query_part(i, delimiter, max_end=max_end)
require_delimiter = True
elif char == '[':
part = self._parse_array_multipaths_query_part(i, delimiter, max_end=max_end)
require_delimiter = True
elif char == '!':
part = self._parse_literal_query_part(i, delimiter, max_end=max_end, in_multipaths=in_multipaths)
elif in_multipaths and char == ',':
i -= 1
break
elif in_multipaths and require_delimiter:
raise GJSONInvalidSyntaxError('Missing separator after multipath.', query=self._query, position=i)
if part:
part.previous = previous
parts.append(part)
previous = part
else: # Normal path, no special grammar
if not current:
current_start = i
current.append(char)
if char == ESCAPE_CHARACTER:
i += 1 # Skip the escaped character
if next_char is None:
raise GJSONParseError('Escape character at the end of the query.',
query=self._query, position=i)
current.append(next_char)
if part:
i = part.end + 1
else:
i += 1
if part is None and current:
part = FieldQueryPart(start=current_start, end=i, part=''.join(current),
delimiter=delimiter, previous=previous, is_last=True)
parts.append(part)
return parts
@staticmethod
def _is_sequence(obj: Any) -> bool:
"""Check if an object is a sequence but not a string or bytes object.
Arguments:
obj: the object to test.
Returns:
:py:data:`True` if the object is a sequence but not a string or bytes, :py:data:`False` otherwise.
"""
return isinstance(obj, Sequence) and not isinstance(obj, (str, bytes))
def _find_closing_parentheses(self, *, start: int, opening: str, suffix: str = '', max_end: int = 0) -> int:
"""Find the matching parentheses that closes the opening one looking for unbalance of the given character.
Arguments:
start: the index of the opening parentheses in the query.
opening: the opening parentheses to look for imbalances.
suffix: an optional suffix that can be present after the closing parentheses before reaching a delimiter or
the end of the query.
max_end: an optional last position up to where the parentheses can be found.
Raises:
gjson.GJSONParseError: if unable to find the closing parentheses or the parentheses are not balanced.
Returns:
the position of the closing parentheses if there is no suffix or the one of the last character of the
suffix if present.
"""
closing = PARENTHESES_PAIRS[opening]
opened = 0
end = -1
escaped = False
in_string = False
query = self._query[start:max_end + 1] if max_end else self._query[start:]
for i, char in enumerate(query):
if char == ESCAPE_CHARACTER:
escaped = True
continue
if escaped:
escaped = False
continue
if char == '"':
if in_string:
in_string = False
else:
in_string = True
continue
if in_string:
continue
if char == opening:
opened += 1
elif char == closing:
if opened:
opened -= 1
if not opened:
end = i
break
if opened or end < 0:
raise GJSONParseError(f'Unbalanced parentheses `{opening}`, {opened} still opened.',
query=self._query, position=start)
if suffix and end + len(suffix) < len(query) and query[end + 1:end + len(suffix) + 1] == suffix:
end += len(suffix)
if end + 1 < len(query):
delimiters = list(MULTIPATHS_DELIMITERS) if max_end else list(DELIMITERS)
if opening == '(' and suffix == '#': # Nested queries
delimiters.append(')')
if (max_end and query[end + 1] not in delimiters) or (not max_end and query[end + 1] not in DELIMITERS):
raise GJSONParseError('Expected delimiter or end of query after closing parenthesis.',
query=self._query, position=start + end)
return start + end
def _parse_modifier_query_part(self, start: int, delimiter: str, max_end: int = 0,
in_multipaths: bool = False) -> ModifierQueryPart:
"""Find the modifier end position in the query starting from a given point.
Arguments:
start: the index of the ``@`` symbol that starts a modifier in the query.
delimiter: the delimiter before the modifier.
max_end: an optional last position up to where the last character can be found.
in_multipaths: whether the part to be parsed is inside a multipaths.
Raises:
gjson.GJSONParseError: on invalid modifier.
Returns:
the modifier query part object.
"""
end = start
escaped = False
delimiters = MULTIPATHS_DELIMITERS if in_multipaths else DELIMITERS
options: dict[Any, Any] = {}
query = self._query[start:max_end + 1] if max_end else self._query[start:]
for i, char in enumerate(query):
if char == ESCAPE_CHARACTER and not escaped:
escaped = True
continue
if escaped:
escaped = False
continue
if char == ':':
name = self._query[start + 1:start + i]
options_len, options = self._parse_modifier_options(start + i + 1)
end = start + i + options_len
break
if char in delimiters:
end = start + i - 1
name = self._query[start + 1:start + i]
break
else: # End of query
end = start + i
name = self._query[start + 1:start + i + 1]
name = name.replace(ESCAPE_CHARACTER, '')
if not name:
raise GJSONParseError('Got empty modifier name.', query=self._query, position=start)
for char in MODIFIER_NAME_RESERVED_CHARS:
if char in name:
raise GJSONParseError(f'Invalid modifier name @{name}, the following characters are not allowed: '
f'{MODIFIER_NAME_RESERVED_CHARS}', query=self._query, position=start)
return ModifierQueryPart(start=start, end=end, part=self._query[start:end + 1], delimiter=delimiter,
name=name, options=options, is_last=False, previous=None)
def _parse_modifier_options(self, start: int) -> tuple[int, dict[Any, Any]]:
"""Find the modifier options end position in the query starting from a given point.
Arguments:
start: the index of the ``:`` symbol that starts a modifier options.
Raises:
gjson.GJSONParseError: on invalid modifier options.
Returns:
the modifier options last character index in the query and the parsed options.
"""
if start >= len(self._query):
raise GJSONParseError('Modifier with options separator `:` without any option.',
query=self._query, position=start)
if self._query[start] != '{':
raise GJSONParseError('Expected JSON object `{...}` as modifier options.',
query=self._query, position=start)
query_parts = re.split(r'(? ArrayQueryQueryPart:
"""Parse an array query part starting from the given point.
Arguments:
start: the index of the ``#`` symbol that starts a ``#(...)`` or ``#(...)#`` query.
delimiter: the delimiter before the modifier.
max_end: an optional last position up to where the closing parentheses can be found.
Raises:
gjson.GJSONParseError: on invalid query.
Returns:
the array query part object.
"""
end = self._find_closing_parentheses(start=start, opening='(', suffix='#', max_end=max_end)
part = self._query[start:end + 1]
if part[-1] == '#':
content_end = -2
first_only = False
else:
content_end = -1
first_only = True
content = part[2: content_end]
query_operator = ''
key = ''
value: Union[str, ArrayQueryQueryPart] = ''
pattern = '|'.join(re.escape(op) for op in QUERIES_OPERATORS)
match = re.search(fr'(? Optional[ArrayIndexQueryPart]:
"""Parse an array index query part.
Arguments:
start: the index of the start of the path in the query.
delimiter: the delimiter before the query part.
in_multipaths: whether the part to be parsed is inside a multipaths.
Returns:
the array index query object if the integer path is found, :py:const:`None` otherwise.
"""
subquery = self._query[start:]
delimiters = MULTIPATHS_DELIMITERS if in_multipaths else DELIMITERS
delimiters_match = '|'.join([re.escape(i) for i in delimiters])
match = re.search(fr'^([1-9][0-9]*|0)({delimiters_match}|$)', subquery)
if not match:
return None
end = start + len(match.groups()[0]) - 1
part = self._query[start:end + 1]
return ArrayIndexQueryPart(start=start, end=end, part=part, delimiter=delimiter, is_last=False, previous=None)
def _parse_object_multipaths_query_part(
self, start: int, delimiter: str, max_end: int = 0) -> MultipathsObjectQueryPart:
"""Parse a multipaths object query part.
Arguments:
start: the index of the start of the path in the query.
delimiter: the delimiter before the query part.
max_end: an optional last position up to where the multipaths can extend.
Returns:
the multipaths object query part.
Raises:
gjson.GJSONParseError: on invalid query.
"""
end = self._find_closing_parentheses(start=start, opening='{', max_end=max_end)
part = self._query[start:end + 1]
parts = []
def _get_key(current_key: Optional[str], value: Optional[BaseQueryPart]) -> str:
"""Return the current key or the default value if not set. Allow for empty key as valid key.
Arguments:
current_key: the current key to evaluate.
value: the current value from where to extract a key name if missing.
"""
if current_key is not None:
return current_key
if value and isinstance(value, (FieldQueryPart, ArrayIndexQueryPart, ModifierQueryPart)):
return value.part
return '_'
new_item = True
escaped = False
key: Optional[str] = None
key_start = 0
value_start = 0
skip_until = 0
for i, char in enumerate(part[1:-1], start=1):
if skip_until and i <= skip_until:
if i == skip_until:
skip_until = 0
new_item = True
continue
if new_item:
value_start = 0
if char == '"':
key_start = i
new_item = False
continue
if char != ',':
value_start = i
new_item = False
if key_start:
if char == ESCAPE_CHARACTER and not escaped:
escaped = True
continue
if escaped:
escaped = False
continue
if char == '"':
try:
key = json.loads(part[key_start:i + 1], strict=False)
except json.JSONDecodeError as ex:
raise GJSONParseError(f'Failed to parse multipaths key {part[key_start:i + 1]}.',
query=self._query, position=key_start) from ex
key_start = 0
continue
if key is not None and not key_start and not value_start:
if char == ':':
value_start = i + 1
continue
raise GJSONParseError(f'Expected colon after multipaths item with key "{key}".',
query=self._query, position=i)
if value_start:
try:
values = self._parse(
start=start + value_start,
end=end - 1,
max_end=max_end - 1 if max_end else end - 1,
delimiter=delimiter,
in_multipaths=True)
except GJSONInvalidSyntaxError:
raise
except GJSONParseError: # In multipaths, paths that fails are silently suppressed
values = []
if values:
parts.append(MultipathsItem(key=_get_key(key, values[-1]), values=values))
skip_until = values[-1].end - start + 1
else:
skip_until = end - start
new_item = True
key = None
key_start = 0
value_start = 0
continue
return MultipathsObjectQueryPart(start=start, end=end, part=part, delimiter=delimiter, previous=None,
is_last=False, parts=parts)
def _parse_array_multipaths_query_part(
self, start: int, delimiter: str, max_end: int = 0) -> MultipathsArrayQueryPart:
"""Parse a multipaths object query part.
Arguments:
start: the index of the start of the path in the query.
delimiter: the delimiter before the query part.
max_end: an optional last position up to where the multipaths can extend.
Returns:
the multipaths array query part.
Raises:
gjson.GJSONParseError: on invalid query.
"""
end = self._find_closing_parentheses(start=start, opening='[', max_end=max_end)
part = self._query[start:end + 1]
parts = []
skip_until = 0
for i, _ in enumerate(part[1:-1], start=1):
if skip_until and i <= skip_until:
if i == skip_until:
skip_until = 0
continue
try:
values = self._parse(
start=start + i,
end=end - 1,
max_end=max_end - 1 if max_end else end - 1,
delimiter=delimiter,
in_multipaths=True)
except GJSONInvalidSyntaxError:
raise
except GJSONParseError: # In multipaths, paths that fails are silently suppressed
values = []
if values:
parts.append(values)
skip_until = values[-1].end - start + 1
else:
skip_until = end - start
return MultipathsArrayQueryPart(start=start, end=end, part=part, delimiter=delimiter, previous=None,
is_last=False, parts=parts)
def _parse_literal_query_part(self, start: int, delimiter: str, max_end: int = 0,
in_multipaths: bool = False) -> LiteralQueryPart:
"""Parse a literal query part.
Arguments:
start: the index of the start of the path in the query.
delimiter: the delimiter before the query part.
max_end: an optional last position up to where the multipaths can extend.
in_multipaths: whether the part to be parsed is inside a multipaths.
Returns:
the literal query part.
Raises:
gjson.GJSONParseError: on invalid query.
"""
end = -1
begin = self._query[start + 1:start + 2]
if begin in ('{', '['):
end = self._find_closing_parentheses(start=start + 1, opening=begin, max_end=max_end)
elif begin == '"':
query = self._query[start + 2:max_end + 1] if max_end else self._query[start + 2:]
match = re.search(r'(? Any:
"""Parse the given part of the full query.
Arguments:
part: the query part as already parsed.
obj: the current object.
in_multipaths: whether the part to be parsed is inside a multipaths.
Raises:
gjson.GJSONParseError: on invalid query.
Returns:
the result of the query.
"""
in_hash = False
in_query_all = False
ret: Any
if isinstance(obj, NoResult):
return obj
if isinstance(part, ArrayLenghtQueryPart):
in_hash = True
if part.is_last:
if part.delimiter == DOT_DELIMITER and (self._after_hash or self._after_query_all):
ret = []
elif part.delimiter == PIPE_DELIMITER and isinstance(part.previous, ArrayLenghtQueryPart):
raise GJSONParseError('The pipe delimiter cannot immediately follow the # element.',
query=self._query, position=part.start)
elif self._is_sequence(obj):
ret = len(obj)
else:
raise GJSONParseError('Expected a sequence like object for query part # at the end of the query, '
f'got {type(obj)}.', query=self._query, position=part.start)
else:
ret = obj
elif isinstance(part, ArrayQueryQueryPart):
if not self._is_sequence(obj):
raise GJSONParseError(f'Queries are supported only for sequence like objects, got {type(obj)}.',
query=self._query, position=part.start)
in_query_all = not part.first_only
ret = self._parse_query(part, obj)
elif isinstance(part, ModifierQueryPart):
ret = self._apply_modifier(part, obj)
elif isinstance(part, ArrayIndexQueryPart):
if isinstance(obj, Mapping): # Integer object keys not supported by JSON
if not in_multipaths and part.part not in obj:
raise GJSONParseError(f'Mapping object does not have key `{part}`.',
query=self._query, position=part.start)
ret = obj.get(part.part, NoResult())
elif self._is_sequence(obj):
if (self._after_hash or self._after_query_all) and part.delimiter == DOT_DELIMITER:
# Skip non mapping items and items without the given key
ret = [i[part.part] for i in obj if isinstance(i, Mapping) and part.part in i]
elif (self._after_hash and part.delimiter == PIPE_DELIMITER
and isinstance(part.previous, ArrayLenghtQueryPart)):
raise GJSONParseError('Integer query part after a pipe delimiter on a sequence like object.',
query=self._query, position=part.start)
else:
num = len(obj)
if part.index >= num:
raise GJSONParseError(f'Index `{part}` out of range for sequence object with {num} items in '
'query.', query=self._query, position=part.start)
ret = obj[part.index]
else:
raise GJSONParseError(f'Integer query part on unsupported object type {type(obj)}, expected a mapping '
'or sequence like object.', query=self._query, position=part.start)
elif isinstance(part, FieldQueryPart):
if re.search(r'(? Any:
"""Evaluate the return value of an inline query #(...) / #(...)# depending on first match or all matches.
Arguments:
query: the query part.
obj: the current object.
Raises:
gjson.GJSONParseError: if the query is for the first element and there are no matching items.
Returns:
the result of the query.
"""
if query.first_only:
if obj:
return obj[0]
raise GJSONParseError('Query for first element does not match anything.',
query=self._query, position=query.start)
return obj
def _parse_query(self, query: ArrayQueryQueryPart, obj: Any) -> Any:
"""Parse an inline query #(...) / #(...)#.
Arguments:
query: the query part.
obj: the current object.
Raises:
gjson.GJSONParseError: on invalid query.
Returns:
the result of the query.
"""
if isinstance(query.value, ArrayQueryQueryPart):
ret = []
for i in obj:
nested_obj = None
if query.field:
if isinstance(i, Mapping) and query.field in i:
nested_obj = i[query.field]
elif self._is_sequence(i):
nested_obj = i
if nested_obj is not None and self._parse_query(query.value, nested_obj):
ret.append(i)
return self._evaluate_query_return_value(query, ret)
if not query.operator:
return self._evaluate_query_return_value(query, [i for i in obj if query.field in i])
key = query.field.replace('\\', '')
try:
value = json.loads(query.value, strict=False)
except json.JSONDecodeError as ex:
position = query.start + len(query.field) + len(query.operator)
raise GJSONParseError(f'Invalid value `{query.value}` for the query key `{key}`.',
query=self._query, position=position) from ex
if not key and query.first_only and obj and isinstance(obj[0], Mapping):
raise GJSONParseError('Query on mapping like objects require a key before the operator.',
query=self._query, position=query.start)
oper: Callable[[Any, Any], bool]
if query.operator == '==~':
if value not in (True, False):
if query.first_only:
raise GJSONParseError(f'Queries ==~ operator requires a boolean value, got {type(value)} instead: '
f'`{value}`.', query=self._query, position=query.start + len(query.field))
return []
def truthy_op(obj_a: Any, obj_b: bool) -> bool:
truthy = operator.truth(obj_a)
if obj_b:
return truthy
return not truthy
oper = truthy_op
elif query.operator in ('==', '='):
oper = operator.eq
elif query.operator == '!=':
oper = operator.ne
elif query.operator == '<':
oper = operator.lt
elif query.operator == '<=':
oper = operator.le
elif query.operator == '>':
oper = operator.gt
elif query.operator == '>=':
oper = operator.ge
elif query.operator in ('%', '!%'):
value = str(value).replace('*', '.*').replace('?', '.')
value = f'^{value}$'
if query.operator == '%':
def match_op(obj_a: Any, obj_b: Any) -> bool:
if not isinstance(obj_a, str):
return False
return re.match(obj_b, obj_a) is not None
oper = match_op
else:
def not_match_op(obj_a: Any, obj_b: Any) -> bool:
if not isinstance(obj_a, str):
return False
return re.match(obj_b, obj_a) is None
oper = not_match_op
try:
if key:
if query.operator == '==~': # Consider missing keys as falsy according to GJSON docs.
ret = [i for i in obj if oper(i.get(key), value)]
else:
ret = [i for i in obj if key in i and oper(i[key], value)]
else: # Query on an array of non-objects, match them directly
ret = [i for i in obj if oper(i, value)]
except TypeError:
ret = []
return self._evaluate_query_return_value(query, ret)
def _apply_modifier(self, modifier: ModifierQueryPart, obj: Any) -> Any:
"""Apply a modifier.
Arguments:
modifier: the modifier query part to apply.
obj: the current object before applying the modifier.
Raises:
gjson.GJSONError: when the modifier raises an exception.
gjson.GJSONParseError: on unknown modifier.
Returns:
the object modified according to the modifier.
"""
try:
modifier_func = getattr(self, f'_apply_modifier_{modifier.name}')
except AttributeError:
modifier_func = self._custom_modifiers.get(modifier.name)
if modifier_func is None:
raise GJSONParseError(f'Unknown modifier @{modifier.name}.',
query=self._query, position=modifier.start) from None
try:
return modifier_func(modifier.options, obj, last=modifier.is_last)
except GJSONError:
raise
except Exception as ex:
raise GJSONError(f'Modifier @{modifier.name} raised an exception.') from ex
def _apply_modifier_reverse(self, _options: dict[str, Any], obj: Any, *, last: bool) -> Any:
"""Apply the @reverse modifier.
Arguments:
options: the eventual options for the modifier, currently unused.
obj: the current object to reverse.
last: whether this is the final part of the query.
Returns:
the reversed object. If the object cannot be reversed it is returned untouched.
"""
del last # for pylint, unused argument
if isinstance(obj, Mapping):
return {k: obj[k] for k in reversed(obj.keys())}
if self._is_sequence(obj):
return obj[::-1]
return obj
def _apply_modifier_keys(self, _options: dict[str, Any], obj: Any, *, last: bool) -> Any:
"""Apply the @keys modifier.
Arguments:
options: the eventual options for the modifier, currently unused.
obj: the current object to get the keys from.
last: whether this is the final part of the query.
Raises:
gjson.GJSONError: if the current object does not have a keys() method.
Returns:
the current object keys as list.
"""
del last # for pylint, unused argument
try:
return list(obj.keys())
except AttributeError as ex:
raise GJSONError('The current object does not have a keys() method.') from ex
def _apply_modifier_values(self, _options: dict[str, Any], obj: Any, *, last: bool) -> Any:
"""Apply the @values modifier.
Arguments:
options: the eventual options for the modifier, currently unused.
obj: the current object to get the values from.
last: whether this is the final part of the query.
Raises:
gjson.GJSONError: if the current object does not have a values() method.
Returns:
the current object values as list.
"""
del last # for pylint, unused argument
try:
return list(obj.values())
except AttributeError as ex:
raise GJSONError('The current object does not have a values() method.') from ex
def _apply_modifier_ugly(self, _options: dict[str, Any], obj: Any, *, last: bool) -> Any:
"""Apply the @ugly modifier to condense the output.
Arguments:
options: the eventual options for the modifier, currently unused.
obj: the current object to uglify.
last: whether this is the final part of the query.
Returns:
the current object, unmodified.
"""
del last # for pylint, unused argument
self._dump_params['separators'] = (',', ':')
self._dump_params['indent'] = None
return obj
def _apply_modifier_pretty(self, options: dict[str, Any], obj: Any, *, last: bool) -> Any:
"""Apply the @pretty modifier to pretty-print the output.
Arguments:
options: the eventual options for the modifier.
obj: the current object to prettify.
last: whether this is the final part of the query.
Returns:
the current object, unmodified.
"""
del last # for pylint, unused argument
self._dump_params['indent'] = options.get('indent', 2)
self._dump_params['sort_keys'] = options.get('sortKeys', False)
self._dump_params['prefix'] = options.get('prefix', '')
return obj
def _apply_modifier_ascii(self, _options: dict[str, Any], obj: Any, *, last: bool) -> Any:
"""Apply the @ascii modifier to have all non-ASCII characters escaped when dumping the object.
Arguments:
options: the eventual options for the modifier, currently unused.
obj: the current object to escape when dumped.
last: whether this is the final part of the query.
Returns:
the current object, unmodified.
"""
del last # for pylint, unused argument
self._dump_params['ensure_ascii'] = True
return obj
def _apply_modifier_sort(self, _options: dict[str, Any], obj: Any, *, last: bool) -> Any:
"""Apply the @sort modifier, sorts mapping and sequences.
Arguments:
options: the eventual options for the modifier, currently unused.
obj: the current object to sort.
last: whether this is the final part of the query.
Raises:
gjson.GJSONError: if the current object is not sortable.
Returns:
the sorted object.
"""
del last # for pylint, unused argument
if isinstance(obj, Mapping):
return {k: obj[k] for k in sorted(obj.keys())}
if self._is_sequence(obj):
return sorted(obj)
raise GJSONError(f'@sort modifier not supported for object of type {type(obj)}. '
'Expected a mapping or sequence like object.')
def _apply_modifier_valid(self, _options: dict[str, Any], obj: Any, *, last: bool) -> Any:
"""Apply the @valid modifier, checking that the current object can be converted to JSON.
Arguments:
options: the eventual options for the modifier, currently unused.
obj: the current element to validate.
last: whether this is the final part of the query.
Raises:
gjson.GJSONError: if the current object cannot be converted to JSON.
Returns:
the current object, unmodified.
"""
del last # for pylint, unused argument
try:
json.dumps(obj, **self._dump_params)
except Exception as ex:
raise GJSONError('The current object cannot be converted to JSON.') from ex
return obj
def _apply_modifier_this(self, _options: dict[str, Any], obj: Any, *, last: bool) -> Any:
"""Apply the @this modifier, that returns the current object.
Arguments:
options: the eventual options for the modifier, currently unused.
obj: the current element to return.
last: whether this is the final part of the query.
Returns:
the current object, unmodified.
"""
del last # for pylint, unused argument
return obj
def _apply_modifier_fromstr(self, _options: dict[str, Any], obj: Any, *, last: bool) -> Any:
"""Apply the @fromstr modifier, converting a string to JSON, if valid.
Arguments:
options: the eventual options for the modifier, currently unused.
obj: the current element from where to extract the JSON.
last: whether this is the final part of the query.
Raises:
gjson.GJSONError: if the current object cannot be converted to JSON.
Returns:
the parsed JSON.
"""
del last # for pylint, unused argument
if not isinstance(obj, (str, bytes)):
raise GJSONError(f'Modifier @fromstr got object of type {type(obj)} as input, expected string or bytes.')
try:
return json.loads(obj, strict=False)
except Exception as ex:
raise GJSONError('The current @fromstr input object cannot be converted to JSON.') from ex
def _apply_modifier_tostr(self, _options: dict[str, Any], obj: Any, *, last: bool) -> Any:
"""Apply the @tostr modifier, converting the current object to a JSON-encoded string, if valid.
Arguments:
options: the eventual options for the modifier, currently unused.
obj: the current element to convert to a JSON-encoded string.
last: whether this is the final part of the query.
Raises:
gjson.GJSONError: if the current object cannot be converted to a JSON-encoded string.
Returns:
the JSON-encoded string.
"""
del last # for pylint, unused argument
try:
return json.dumps(obj, ensure_ascii=False)
except Exception as ex:
raise GJSONError('The current object cannot be converted to a JSON-encoded string for @tostr.') from ex
def _apply_modifier_group(self, _options: dict[str, Any], obj: Any, *, last: bool) -> Any:
"""Apply the @group modifier, that groups a dictionary of lists in a list of dictionaries.
Example input::
{"id": ["123", "456", "789"], "val": [2, 1]}
Example output::
[{"id": "123", "val": 2}, {"id": "456", "val": 1}, {"id": "789"}]
Arguments:
options: the eventual options for the modifier, currently unused.
obj: the current element to group.
last: whether this is the final part of the query.
Raises:
gjson.GJSONError: if the current object is not a dictionary.
Returns:
a list with the grouped objects or an empty list if the input has no lists as values.
"""
del last # for pylint, unused argument
if not isinstance(obj, Mapping):
raise GJSONError(f'Modifier @group got object of type {type(obj)} as input, expected dictionary.')
# Skip all values that aren't lists:
obj = {k: v for k, v in obj.items() if self._is_sequence(v)}
# Fill missing values with NoResult to remove them afterwards
obj = [dict(zip_longest(obj.keys(), values)) for values in zip_longest(*obj.values(), fillvalue=NoResult())]
# Skip keys with value NoResult in each dictionary
return [{k: v for k, v in i.items() if not isinstance(v, NoResult)} for i in obj]
def _apply_modifier_join(self, _options: dict[str, Any], obj: Any, *, last: bool) -> Any:
"""Apply the @join modifier, that joins a list of dictionaries into a single dictionary.
Items in the sequence that are not dictionaries are skipped.
Differently from GJSON, there is no support for duplicated keys as they can't exist in Python dictionaries.
Hence this modifier doesn't accept any option.
Example input::
[{"first": "Tom", "age": 37}, {"age": 41}]
Example output::
{"first": "Tom", "age":41}
Arguments:
options: the eventual options for the modifier, currently unused.
obj: the current element to join.
last: whether this is the final part of the query.
Returns:
the object untouched if the object is not a sequence, a dictionary with joined objects otherwise.
"""
del last # for pylint, unused argument
if not self._is_sequence(obj):
return obj
ret: dict[Any, Any] = {}
for item in obj:
if isinstance(item, Mapping):
ret.update(item)
return ret
def _apply_modifier_top_n(self, options: dict[str, Any], obj: Any, *, last: bool) -> Any:
"""Apply the @top_n modifier to find the most common values of a given field.
Arguments:
options: the eventual modifier options. If not specified, all items are returned. If specified, it must
contain an 'n' key with the number of top N items to return.
obj: the current object to extract the N most common items.
last: whether this is the final part of the query.
Raises:
gjson.GJSONError: if the current object is not a sequence.
Returns:
dict: a dictionary of unique items as keys and the count as value.
"""
del last # for pylint, unused argument
if not self._is_sequence(obj):
raise GJSONError(f'@top_n modifier not supported for object of type {type(obj)}. '
'Expected a sequence like object.')
return dict(Counter(obj).most_common(options.get('n')))
def _apply_modifier_sum_n(self, options: dict[str, Any], obj: Any, *, last: bool) -> Any:
"""Apply the @sum_n modifier that groups the values of a given key while summing the values of another key.
The key used to sum must have numeric values.
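Examples:
An illustrative doctest-style sketch with a small ad-hoc input (the behaviour mirrors this package's unit tests):
>>> import gjson
>>> data = [{'k': 'a', 'v': 1}, {'k': 'b', 'v': 2}, {'k': 'a', 'v': 3}]
>>> gjson.get(data, '@sum_n:{"group": "k", "sum": "v"}')
{'a': 4, 'b': 2}
>>> gjson.get(data, '@sum_n:{"group": "k", "sum": "v", "n": 1}')
{'a': 4}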
Arguments:
options: the modifier options. It must contain a 'group' key with the name of the field to use to group the
items as value and a 'sum' key with the name of the field to use to sum the values for each unique
grouped identifier. If an 'n' key is also provided, only the top N results are returned. If not
specified, all items are returned.
obj: the current object to group and sum the top N values.
last: whether this is the final part of the query.
Raises:
gjson.GJSONError: if the current object is not a sequence.
Returns:
dict: a dictionary of unique items as keys and the sum as value.
"""
del last # for pylint, unused argument
if not self._is_sequence(obj):
raise GJSONError(f'@sum_n modifier not supported for object of type {type(obj)}. '
'Expected a sequence like object.')
results: Counter[Any] = Counter()
for item in obj:
results[item[options['group']]] += item[options['sum']]
return dict(results.most_common(options.get('n')))
def _apply_modifier_flatten(self, options: dict[str, Any], obj: Any, *, last: bool) -> Any:
"""Apply the @flatten modifier.
Arguments:
options: the eventual modifier options.
obj: the current object to flatten.
last: whether this is the final part of the query.
Returns:
the modified object.
"""
del last # for pylint, unused argument
if not self._is_sequence(obj):
return obj
return list(self._flatten_sequence(obj, deep=options.get('deep', False)))
def _flatten_sequence(self, obj: Any, deep: bool = False) -> Any:
"""Flatten nested sequences in the given object.
Arguments:
obj: the current object to flatten.
deep: if :py:data:`True` recursively flatten nested sequences. By default only the first level is
processed.
Returns:
the flattened object if it was a flattenable sequence, the given object itself otherwise.
"""
for elem in obj:
if self._is_sequence(elem):
if deep:
yield from self._flatten_sequence(elem, deep=deep)
else:
yield from elem
else:
yield elem
python-gjson-1.0.0/gjson/_protocols.py 0000664 0000000 0000000 00000003405 14363731677 0020047 0 ustar 00root root 0000000 0000000 """Typing protocols used by the gjson package."""
from typing import Any, Protocol, runtime_checkable
@runtime_checkable
class ModifierProtocol(Protocol):
"""Callback protocol for the custom modifiers."""
def __call__(self, options: dict[str, Any], obj: Any, *, last: bool) -> Any:
"""To register a custom modifier a callable that adhere to this protocol must be provided.
Examples:
Register a custom modifier that sums all the numbers in a list:
>>> import gjson
>>> data = [1, 2, 3, 4, 5]
>>> def custom_sum(options, obj, *, last):
... # insert sanity checks code here
... return sum(obj)
...
>>> gjson_obj = gjson.GJSON(data)
>>> gjson_obj.register_modifier('sum', custom_sum)
>>> gjson_obj.get('@sum')
15
Arguments:
options: a dictionary of options. If no options are present in the query the callable will be called with
an empty dictionary as options. The modifier can support any number of options, or none.
obj: the current object already modified by any previous parts of the query.
last: :py:data:`True` if the modifier is the last element in the query or :py:data:`False` otherwise.
Raises:
Exception: any exception that might be raised by the callable is caught by gjson and re-raised as a
:py:class:`gjson.GJSONError` exception to ensure that the normal gjson behaviour is respected according
to the selected verbosity (CLI) or ``quiet`` parameter (Python library).
Returns:
the resulting object after applying the modifier.
"""
python-gjson-1.0.0/gjson/exceptions.py 0000664 0000000 0000000 00000002441 14363731677 0020044 0 ustar 00root root 0000000 0000000 """gjson custom exceptions module."""
from typing import Any
class GJSONError(Exception):
"""Raised by the gjson module on error while performing queries or converting to JSON."""
class GJSONParseError(GJSONError):
"""Raised when there is an error parsing the query string, with nicer representation of the error."""
def __init__(self, *args: Any, query: str, position: int):
"""Initialize the exception with the additional data of the query part.
Arguments:
*args: all positional arguments like any regular exception.
query: the full query that generated the parse error.
position: the position in the query string where the parse error occurred.
"""
super().__init__(*args)
self.query = query
self.position = position
def __str__(self) -> str:
"""Return a custom representation of the error.
Returns:
the whole query string with a clear indication on where the error occurred.
"""
default = super().__str__()
line = '-' * (self.position + 7) # 7 is for the length of 'Query: '
return f'{default}\nQuery: {self.query}\n{line}^'
class GJSONInvalidSyntaxError(GJSONParseError):
"""Raised when there is a query with an invalid syntax."""
python-gjson-1.0.0/gjson/py.typed 0000664 0000000 0000000 00000000000 14363731677 0016775 0 ustar 00root root 0000000 0000000 python-gjson-1.0.0/prospector.yaml 0000664 0000000 0000000 00000002020 14363731677 0017246 0 ustar 00root root 0000000 0000000 strictness: high
inherits:
- strictness_high
doc-warnings: true
member-warnings: true
test-warnings: true
autodetect: false
output-format: grouped
ignore-paths:
- doc/source/conf.py
- build/
- dist/
- logs/
pycodestyle:
full: true
options:
max-line-length: 120
pydocstyle:
explain: true
source: true
disable:
- D203 # 1 blank line required before class docstring, D211 (after) is enforced instead
- D213 # Multi-line docstring summary should start at the second line, D212 (first line) is enforced instead
- D406 # Section name should end with a newline, incompatible with Google Style Python Docstrings
- D407 # Missing dashed underline after section, incompatible with Google Style Python Docstrings
pylint:
disable:
- pointless-string-statement # used as documentation for class attributes
options:
max-line-length: 120
max-module-lines: 2000
include-naming-hint: true
pyroma:
run: true
vulture:
run: true
mccabe:
disable:
- MC0001 # too complex method
python-gjson-1.0.0/setup.cfg 0000664 0000000 0000000 00000001050 14363731677 0016005 0 ustar 00root root 0000000 0000000 [aliases]
test = pytest
[build_sphinx]
project = gjson-py
source-dir = doc/source
build-dir = doc/build
[mypy]
disallow_untyped_calls = True
disallow_untyped_defs = True
disallow_incomplete_defs = True
disallow_untyped_decorators = True
no_implicit_optional = True
warn_redundant_casts = True
warn_unused_ignores = True
warn_return_any = True
warn_unreachable = True
strict_equality = True
strict = True
show_error_context = True
show_column_numbers = True
show_error_codes = True
pretty = True
warn_incomplete_stub = True
warn_unused_configs = True
python-gjson-1.0.0/setup.py 0000664 0000000 0000000 00000004614 14363731677 0015707 0 ustar 00root root 0000000 0000000 #!/usr/bin/env python
"""Package configuration."""
from setuptools import find_packages, setup
with open('README.rst', 'r') as readme:
long_description = readme.read()
# Extra dependencies
extras_require = {
# Test dependencies
'tests': [
'bandit',
'flake8',
'flake8-import-order',
'mypy',
'pytest-cov',
'pytest-xdist',
'pytest',
'sphinx_rtd_theme',
'sphinx-argparse',
'sphinx-autodoc-typehints',
'Sphinx',
'types-pkg_resources',
],
'prospector': [
'prospector[with_everything]',
'pytest',
],
}
setup_requires = [
'pytest-runner',
'setuptools_scm',
]
setup(
author='Riccardo Coccioli',
author_email='volans-@users.noreply.github.com',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Console',
'Intended Audience :: Developers',
'Intended Audience :: End Users/Desktop',
'Intended Audience :: Information Technology',
'Intended Audience :: Science/Research',
'Intended Audience :: System Administrators',
'License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)',
'Operating System :: OS Independent',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3 :: Only',
'Topic :: Software Development :: Libraries :: Python Modules',
'Topic :: Text Processing',
'Topic :: Utilities',
'Typing :: Typed',
],
description=('gjson-py is a Python package that provides a simple way to filter and extract data from JSON-like '
'objects or JSON files, using the GJSON syntax.'),
entry_points={
'console_scripts': [
'gjson = gjson._cli:cli',
],
},
extras_require=extras_require,
install_requires=[],
keywords=['gjson', 'json'],
license='GPLv3+',
long_description=long_description,
long_description_content_type='text/x-rst',
name='gjson',
package_data={'gjson': ['py.typed']},
packages=find_packages(),
platforms=['GNU/Linux', 'BSD', 'MacOSX'],
python_requires='>=3.9',
setup_requires=setup_requires,
url='https://github.com/volans-/gjson-py',
use_scm_version=True,
zip_safe=False,
)
python-gjson-1.0.0/tests/ 0000775 0000000 0000000 00000000000 14363731677 0015332 5 ustar 00root root 0000000 0000000 python-gjson-1.0.0/tests/unit/ 0000775 0000000 0000000 00000000000 14363731677 0016311 5 ustar 00root root 0000000 0000000 python-gjson-1.0.0/tests/unit/__init__.py 0000664 0000000 0000000 00000000030 14363731677 0020413 0 ustar 00root root 0000000 0000000 """gjson unit tests."""
python-gjson-1.0.0/tests/unit/test__cli.py 0000664 0000000 0000000 00000016157 14363731677 0020642 0 ustar 00root root 0000000 0000000 """CLI test module."""
import argparse
import io
import json
import pytest
from gjson._cli import cli
from gjson.exceptions import GJSONParseError
from .test_init import INPUT_JSON, INPUT_LINES, INPUT_LINES_WITH_ERRORS
def test_cli_stdin(monkeypatch, capsys):
"""It should read the data from stdin and query it."""
monkeypatch.setattr('sys.stdin', io.StringIO(INPUT_JSON))
ret = cli(['-', 'name.first'])
assert ret == 0
captured = capsys.readouterr()
assert captured.out == '"Tom"\n'
assert not captured.err
def test_cli_file(tmp_path, capsys):
"""It should read the data from the provided file and query it."""
data_file = tmp_path / 'input.json'
data_file.write_text(INPUT_JSON)
ret = cli([str(data_file), 'name.first'])
assert ret == 0
captured = capsys.readouterr()
assert captured.out == '"Tom"\n'
assert not captured.err
@pytest.mark.parametrize('query', (
'a\tkey',
'..0.a\tkey',
))
def test_cli_file_with_control_chars(query, tmp_path, capsys):
"""It should read the data from the provided file and query it."""
data_file = tmp_path / 'input.json'
data_file.write_text('{"a\tkey": "a\tvalue"}')
ret = cli(['-vvv', str(data_file), query])
assert ret == 0
captured = capsys.readouterr()
assert captured.out == '"a\\tvalue"\n'
assert not captured.err
@pytest.mark.parametrize('query', (
'a\tkey',
'..0.a\tkey',
))
def test_cli_stdin_with_control_chars(query, monkeypatch, capsys):
"""It should read the data from stdin and query it."""
monkeypatch.setattr('sys.stdin', io.StringIO('{"a\tkey": "a\tvalue"}'))
ret = cli(['-vvv', '-', query])
assert ret == 0
captured = capsys.readouterr()
assert captured.out == '"a\\tvalue"\n'
assert not captured.err
def test_cli_nonexistent_file(tmp_path, capsys):
"""It should exit with a failure exit code and no output."""
ret = cli([str(tmp_path / 'nonexistent.json'), 'name.first'])
assert ret == 1
captured = capsys.readouterr()
assert not captured.out
assert not captured.err
def test_cli_nonexistent_file_verbosity_1(tmp_path, capsys):
"""It should exit with a failure exit code and print the error message."""
ret = cli(['-v', str(tmp_path / 'nonexistent.json'), 'name.first'])
assert ret == 1
captured = capsys.readouterr()
assert not captured.out
assert captured.err.startswith("ArgumentTypeError: can't open")
assert 'nonexistent.json' in captured.err
def test_cli_nonexistent_file_verbosity_2(tmp_path):
"""It should raise the exception and print the full traceback."""
with pytest.raises(
argparse.ArgumentTypeError, match=r"can't open .*/nonexistent.json.* No such file or directory"):
cli(['-vv', str(tmp_path / 'nonexistent.json'), 'name.first'])
def test_cli_stdin_query_verbosity_1(monkeypatch, capsys):
"""It should exit with a failure exit code and print the error message."""
monkeypatch.setattr('sys.stdin', io.StringIO(INPUT_JSON))
ret = cli(['-v', '-', 'nonexistent'])
assert ret == 1
captured = capsys.readouterr()
assert not captured.out
assert captured.err == ('GJSONParseError: Mapping object does not have key `nonexistent`.\n'
'Query: nonexistent\n-------^\n')
def test_cli_stdin_query_verbosity_2(monkeypatch):
"""It should exit with a failure exit code and print the full traceback."""
monkeypatch.setattr('sys.stdin', io.StringIO(INPUT_JSON))
with pytest.raises(GJSONParseError, match=r'Mapping object does not have key `nonexistent`.'):
cli(['-vv', '-', 'nonexistent'])
def test_cli_lines_ok(monkeypatch, capsys):
"""It should apply the same query to each line."""
monkeypatch.setattr('sys.stdin', io.StringIO(INPUT_LINES))
ret = cli(['--lines', '-', 'name'])
assert ret == 0
captured = capsys.readouterr()
assert captured.out == '"Gilbert"\n"Alexa"\n"May"\n"Deloise"\n'
assert not captured.err
def test_cli_lines_failed_lines_verbosity_0(monkeypatch, capsys):
"""It should keep going with the other lines and just skip the failed line."""
monkeypatch.setattr('sys.stdin', io.StringIO(INPUT_LINES_WITH_ERRORS))
ret = cli(['--lines', '-', 'name'])
assert ret == 1
captured = capsys.readouterr()
assert captured.out == '"Gilbert"\n"Deloise"\n'
assert not captured.err
def test_cli_lines_failed_lines_verbosity_1(monkeypatch, capsys):
"""It should keep going with the other lines printing an error for the failed lines."""
monkeypatch.setattr('sys.stdin', io.StringIO(INPUT_LINES_WITH_ERRORS))
ret = cli(['-v', '--lines', '-', 'name'])
assert ret == 1
captured = capsys.readouterr()
assert captured.out == '"Gilbert"\n"Deloise"\n'
assert captured.err.count('JSONDecodeError') == 2
def test_cli_lines_failed_lines_verbosity_2(monkeypatch):
"""It should interrupt the processing and print the full traceback."""
monkeypatch.setattr('sys.stdin', io.StringIO(INPUT_LINES_WITH_ERRORS))
with pytest.raises(
json.decoder.JSONDecodeError, match=r'Expecting property name enclosed in double quotes'):
cli(['-vv', '--lines', '-', 'name'])
def test_cli_lines_double_dot_query(monkeypatch, capsys):
"""It should encapsulate each line in an array to allow queries."""
monkeypatch.setattr('sys.stdin', io.StringIO(INPUT_LINES))
ret = cli(['--lines', '..#(age>45).name'])
assert ret == 1
captured = capsys.readouterr()
assert captured.out == '"Gilbert"\n"May"\n'
assert not captured.err
def test_cli_double_dot_query_ok(monkeypatch, capsys):
"""It should encapsulate the input in an array and apply the query to the array."""
monkeypatch.setattr('sys.stdin', io.StringIO(INPUT_LINES))
ret = cli(['-', '..#.name'])
assert ret == 0
captured = capsys.readouterr()
assert captured.out == '["Gilbert", "Alexa", "May", "Deloise"]\n'
assert not captured.err
def test_cli_double_dot_query_failed_lines_verbosity_0(monkeypatch, capsys):
"""It should encapsulate the input in an array skipping failing lines."""
monkeypatch.setattr('sys.stdin', io.StringIO(INPUT_LINES_WITH_ERRORS))
ret = cli(['-', '..#.name'])
assert ret == 1
captured = capsys.readouterr()
assert not captured.out
assert not captured.err
def test_cli_double_dot_query_failed_lines_verbosity_1(monkeypatch, capsys):
"""It should encapsulate the input in an array skipping failing lines and printing an error for each failure."""
monkeypatch.setattr('sys.stdin', io.StringIO(INPUT_LINES_WITH_ERRORS))
ret = cli(['-v', '-', '..#.name'])
assert ret == 1
captured = capsys.readouterr()
assert not captured.out
assert captured.err.startswith('JSONDecodeError: Expecting property name enclosed in double quotes')
def test_cli_double_dot_query_failed_lines_verbosity_2(monkeypatch):
"""It should interrupt the execution at the first invalid line and exit printing the traceback."""
monkeypatch.setattr('sys.stdin', io.StringIO(INPUT_LINES_WITH_ERRORS))
with pytest.raises(json.decoder.JSONDecodeError, match=r'Expecting property name enclosed in double quotes'):
cli(['-vv', '-', '..#.name'])
python-gjson-1.0.0/tests/unit/test_init.py 0000664 0000000 0000000 00000120565 14363731677 0020676 0 ustar 00root root 0000000 0000000 """GJSON test module."""
# pylint: disable=attribute-defined-outside-init
import json
import re
from collections.abc import Mapping
from math import isnan
import pytest
import gjson
from gjson._gjson import MODIFIER_NAME_RESERVED_CHARS
INPUT_JSON = """
{
"name": {"first": "Tom", "last": "Anderson"},
"age":37,
"children": ["Sara","Alex","Jack"],
"fav.movie": "Deer Hunter",
"friends": [
{"first": "Dale", "last": "Murphy", "age": 44, "nets": ["ig", "fb", "tw"]},
{"first": "Roger", "last": "Craig", "age": 68, "nets": ["fb", "tw"]},
{"first": "Jane", "last": "Murphy", "age": 47, "nets": ["ig", "tw"]}
]
}
"""
INPUT_OBJECT = json.loads(INPUT_JSON)
INPUT_LIST = json.loads("""
[
{"first": "Dale"},
{"first": "Jane"},
{"last": "Murphy"}
]
""")
INPUT_ESCAPE = json.loads("""
{
"test": {
"*":"valZ",
"*v":"val0",
"keyv*":"val1",
"key*v":"val2",
"keyv?":"val3",
"key?v":"val4",
"keyv.":"val5",
"key.v":"val6",
"keyk*":{"key?":"val7"},
"1key":"val8"
}
}
""")
# This json block is poorly formed on purpose.
INPUT_BASIC = json.loads("""
{"age":100, "name2":{"here":"B\\\\\\"R"},
"noop":{"what is a wren?":"a bird"},
"happy":true,"immortal":false,
"items":[1,2,3,{"tags":[1,2,3],"points":[[1,2],[3,4]]},4,5,6,7],
"arr":["1",2,"3",{"hello":"world"},"4",5],
"vals":[1,2,3],"name":{"first":"tom","last":null},
"created":"2014-05-16T08:28:06.989Z",
"loggy":{
"programmers": [
{
"firstName": "Brett",
"lastName": "McLaughlin",
"email": "aaaa",
"tag": "good"
},
{
"firstName": "Jason",
"lastName": "Hunter",
"email": "bbbb",
"tag": "bad"
},
{
"firstName": "Elliotte",
"lastName": "Harold",
"email": "cccc",
"tag": "good"
},
{
"firstName": 1002.3,
"age": 101
}
]
},
"lastly":{"end...ing":"soon","yay":"final"}
}
""")
INPUT_LINES = """
{"name": "Gilbert", "age": 61}
{"name": "Alexa", "age": 34}
{"name": "May", "age": 57}
{"name": "Deloise", "age": 44}
"""
INPUT_LINES_WITH_ERRORS = """
{"name": "Gilbert", "age": 61}
{invalid
{invalid
{"name": "Deloise", "age": 44}
"""
INPUT_TRUTHINESS = json.loads("""
{
"vals": [
{ "a": 1, "b": true },
{ "a": 2, "b": true },
{ "a": 3, "b": false },
{ "a": 4, "b": "0" },
{ "a": 5, "b": 0 },
{ "a": 6, "b": "1" },
{ "a": 7, "b": 1 },
{ "a": 8, "b": "true" },
{ "a": 9, "b": false },
{ "a": 10, "b": null },
{ "a": 11 }
]
}
""")
INPUT_SUM_N = json.loads("""
[
{"key": "a", "value": 1, "other": "value"},
{"key": "b", "value": 2},
{"key": "c", "value": 3, "other": "value"},
{"key": "a", "value": 7},
{"key": "b", "value": 1.5},
{"key": "d", "value": 4},
{"key": "c", "value": 9}
]
""")
INPUT_NESTED_QUERIES = json.loads("""
{
"key": [
{"level1": [{"level2": [{"level3": [1, 2]}]}]},
{"level1": [{"level2": [{"level3": [2, 3]}]}]},
[[{"level3": [1, 2]}], [{"level3": [2, 3]}]],
[[{"level3": [2, 3]}], [{"level3": [3, 4]}]],
{"another": [{"level2": [{"level3": [2, 3]}]}]},
{"level1": [{"another": [{"level3": [2, 3]}]}]},
{"level1": [{"level2": [{"another": [2, 3]}]}]},
[[{"another": [2, 3]}], [{"another": [3, 4]}]],
"spurious",
12.34,
{"mixed": [[{"level4": [1, 2]}]]}
]
}
""")
def compare_values(result, expected):
"""Compare results with the expected values ensuring same-order of keys for dictionaries."""
if isinstance(expected, float):
if isnan(expected):
assert isnan(result)
return
assert result == expected
if isinstance(expected, Mapping):
assert list(result.keys()) == list(expected.keys())
class TestObject:
"""Testing gjson with a basic input object."""
def setup_method(self):
"""Initialize the test instance."""
def upper(options, obj, *, last):
"""Custom modifier to return a string upper case."""
del options
del last
if isinstance(obj, list):
return [i.upper() for i in obj]
return obj.upper()
self.object = gjson.GJSON(INPUT_OBJECT)
self.object.register_modifier('upper', upper)
@pytest.mark.parametrize('query, expected', (
# Basic
('name.last', 'Anderson'),
('name.first', 'Tom'),
('age', 37),
('children', ['Sara', 'Alex', 'Jack']),
('children.0', 'Sara'),
('children.1', 'Alex'),
('friends.1', {'first': 'Roger', 'last': 'Craig', 'age': 68, 'nets': ['fb', 'tw']}),
('friends.1.first', 'Roger'),
# Wildcards
('*.first', 'Tom'),
('?a??.first', 'Tom'),
('child*.2', 'Jack'),
('c?ildren.0', 'Sara'),
# Escape characters
(r'fav\.movie', 'Deer Hunter'),
# Arrays
('friends.#', 3),
('friends.#.age', [44, 68, 47]),
('friends.#.first', ['Dale', 'Roger', 'Jane']),
# Queries
('friends.#(last=="Murphy").first', 'Dale'),
('friends.#(last=="Murphy")#.first', ['Dale', 'Jane']),
('friends.#(=="Murphy")#', []),
('friends.#(=="Mu)(phy")#', []),
('friends.#(=="Mur\tphy")#', []),
('friends.#(age\\===44)#', []),
('friends.#(age>47)#.last', ['Craig']),
('friends.#(age>=47)#.last', ['Craig', 'Murphy']),
('friends.#(age<47)#.last', ['Murphy']),
('friends.#(age<=47)#.last', ['Murphy', 'Murphy']),
('friends.#(age==44)#.last', ['Murphy']),
('friends.#(age!=44)#.last', ['Craig', 'Murphy']),
('friends.#(first%"D*").last', 'Murphy'),
('friends.#(first!%"D*").last', 'Craig'),
('friends.#(first!%"D???").last', 'Craig'),
('friends.#(%0)#', []),
('friends.#(>40)#', []),
('children.#(!%"*a*")', 'Alex'),
('children.#(%"*a*")#', ['Sara', 'Jack']),
# Nested queries (TODO)
# ('friends.#(nets.#(=="fb"))#.first', ['Dale', 'Roger']),
# Tilde in queries (TODO)
# ('vals.#(b==~true)#.a')
# Modifiers
('children.@reverse', ['Jack', 'Alex', 'Sara']),
('children.@reverse.0', 'Jack'),
('name.@reverse', {'last': 'Anderson', 'first': 'Tom'}),
('age.@reverse', 37),
('@keys', ['name', 'age', 'children', 'fav.movie', 'friends']),
('name.@values', ['Tom', 'Anderson']),
('age.@flatten', 37),
('@pretty:{"indent": 4}', INPUT_OBJECT),
(r'fav\.movie.@pretty:{"indent": 4}', 'Deer Hunter'),
('name.@tostr', '{"first": "Tom", "last": "Anderson"}'),
('name.@join', {'first': 'Tom', 'last': 'Anderson'}),
('age.@join', 37),
('children.@join', {}),
('children.0.@join', 'Sara'),
('friends.@join', {'first': 'Jane', 'last': 'Murphy', 'age': 47, 'nets': ['ig', 'tw']}),
# Dot vs Pipe
('friends.0.first', 'Dale'),
('friends|0.first', 'Dale'),
('friends.0|first', 'Dale'),
('friends|0|first', 'Dale'),
('friends|#', 3),
('friends.#', 3),
('friends.#(last="Murphy")#',
[{'first': 'Dale', 'last': 'Murphy', 'age': 44, 'nets': ['ig', 'fb', 'tw']},
{'first': 'Jane', 'last': 'Murphy', 'age': 47, 'nets': ['ig', 'tw']}]),
('friends.#(last="Murphy")#.first', ['Dale', 'Jane']),
('friends.#(last="Murphy")#.0', []),
('friends.#(last="Murphy")#|0', {'first': 'Dale', 'last': 'Murphy', 'age': 44, 'nets': ['ig', 'fb', 'tw']}),
('friends.#(last="Murphy")#.#', []),
('friends.#(last="Murphy")#|#', 2),
# Multipaths objects
('{}', {}),
('{.}', {}),
('{.invalid}', {}),
('{.invalid,}', {}),
('{age}', {'age': 37}),
(r'{a\ge}', {r'a\ge': 37}),
(r'{"a\\ge":age}', {r'a\ge': 37}),
('{"a\tb":age}', {'a\tb': 37}),
('{"key":age}', {'key': 37}),
('{age,age}', {'age': 37}),
('{age,"years":age}', {'age': 37, 'years': 37}),
('{"years":age,age}', {'years': 37, 'age': 37}),
('{age,name.first}', {'age': 37, 'first': 'Tom'}),
('{invalid,invalid.invalid,age}', {'age': 37}),
('{name.first,age,name.last}', {'first': 'Tom', 'age': 37, 'last': 'Anderson'}),
('{{age}}', {'_': {'age': 37}}),
('{{age},age}', {'_': {'age': 37}, 'age': 37}),
('friends.0.{age,nets.#(="ig")}', {'age': 44, "_": 'ig'}),
('friends.0.{age,nets.#(="ig"),invalid}', {'age': 44, "_": 'ig'}),
('friends.0.{age,nets.#(="ig")#}', {'age': 44, "_": ['ig']}),
('friends.#.{age,"key":first}',
[{'age': 44, 'key': 'Dale'}, {'age': 68, 'key': 'Roger'}, {'age': 47, 'key': 'Jane'}]),
('friends.#(age>44)#.{age,"key":first}', [{'age': 68, 'key': 'Roger'}, {'age': 47, 'key': 'Jane'}]),
('friends.#(age>44)#.{age,"key":first,invalid}', [{'age': 68, 'key': 'Roger'}, {'age': 47, 'key': 'Jane'}]),
(r'{age,name.first,fav\.movie}', {'age': 37, 'first': 'Tom', r'fav\.movie': 'Deer Hunter'}),
('{age,name.{"name":first,"surname":last},children.@sort}',
{'age': 37, '_': {'name': 'Tom', 'surname': 'Anderson'}, '@sort': ['Alex', 'Jack', 'Sara']}),
('friends.{0.first,1.last,2.age}.@values', ['Dale', 'Craig', 47]),
('{friends.{"a":0.{nets.{0}}}}', {'_': {'a': {'_': {'0': 'ig'}}}}),
('{friends.{"a":0.{nets.{0,1}}}}', {'_': {'a': {'_': {'0': 'ig', '1': 'fb'}}}}),
('friends.#.{age,first|@upper}',
[{"age": 44, "@upper": "DALE"}, {"age": 68, "@upper": "ROGER"}, {"age": 47, "@upper": "JANE"}]),
('{friends.#.{age,"first":first|@upper}|0.first}', {"first": "DALE"}),
('{"children":children|@upper,"name":name.first,"age":age}',
{"children": ["SARA", "ALEX", "JACK"], "name": "Tom", "age": 37}),
('friends.#.{age,"first":first.invalid}', [{'age': 44}, {'age': 68}, {'age': 47}]),
# Multipaths arrays
('[]', []),
('[.]', []),
('[.invalid]', []),
('[.invalid,]', []),
('[age]', [37]),
(r'[a\ge]', [37]),
('[age,age]', [37, 37]),
('[age,name.first]', [37, 'Tom']),
('[name.first,age,invalid,invalid.invalid,name.last]', ['Tom', 37, 'Anderson']),
('[[age]]', [[37]]),
('[[age],age]', [[37], 37]),
('friends.0.[age,nets.#(="ig")]', [44, 'ig']),
('friends.0.[age,nets.#(="ig"),invalid]', [44, 'ig']),
('friends.0.[age,nets.#(="ig")#]', [44, ['ig']]),
('friends.#.[age,first]', [[44, 'Dale'], [68, 'Roger'], [47, 'Jane']]),
('friends.#(age>44)#.[age,first]', [[68, 'Roger'], [47, 'Jane']]),
('friends.#(age>44)#.[age,invalid,invalid.invalid,first]', [[68, 'Roger'], [47, 'Jane']]),
(r'[age,name.first,fav\.movie]', [37, 'Tom', 'Deer Hunter']),
('[age,name.[first,last],children.@sort]', [37, ['Tom', 'Anderson'], ['Alex', 'Jack', 'Sara']]),
('friends.[0.first,1.last,2.age]', ['Dale', 'Craig', 47]),
('[friends.[0.[nets.[0]]]]', [[[['ig']]]]),
('[friends.[0.[nets.[0,1]]]]', [[[['ig', 'fb']]]]),
# Multipaths mixed
('[{}]', [{}]),
('{[]}', {'_': []}),
('[{},[],{}]', [{}, [], {}]),
('{"a":[]}', {'a': []}),
('[{age},{name.first}]', [{'age': 37}, {'first': 'Tom'}]),
('{friends.0.[age,nets.#(="ig")]}', {'_': [44, 'ig']}),
('{friends.0.[age,nets.#(="ig")],age}', {'_': [44, 'ig'], 'age': 37}),
('{friends.0.[invalid,nets.#(="ig")],age,invalid}', {'_': ['ig'], 'age': 37}),
# Literals
('!true', True),
('!false', False),
('!null', None),
('!NaN', float('nan')),
('!Infinity', float('inf')),
('!-Infinity', float('-inf')),
('!"key"', 'key'),
('!"line \\"quotes\\""', 'line "quotes"'),
('!"a\tb"', 'a\tb'),
('!0', 0),
('!12', 12),
('!-12', -12),
('!12.34', 12.34),
('!12.34E2', 1234),
('!12.34E+2', 1234),
('!12.34e-2', 0.1234),
('!-12.34e-2', -0.1234),
('friends.#.!"value"', ['value', 'value', 'value']),
('friends.#.!invalid', []),
('friends.#|!"value"', 'value'),
('friends.#(age>45)#.!"value"', ['value', 'value']),
('name|!"value"', 'value'),
('!{}', {}),
('![]', []),
('!{"name":{"first":"Tom"}}.{name.first}.first', 'Tom'),
('{name.last,"key":!"value"}', {'last': 'Anderson', 'key': 'value'}),
('{name.last,"key":!{"a":"b"},"invalid"}', {'last': 'Anderson', 'key': {'a': 'b'}}),
('{name.last,"key":!{"c":"d"},!"valid"}', {'last': 'Anderson', 'key': {'c': 'd'}, '_': 'valid'}),
('[!true,!false,!null,!Infinity,!invalid,{"name":!"andy",name.last},+Infinity,!["value1","value2"]]',
[True, False, None, float('inf'), {'name': 'andy', 'last': 'Anderson'}, ['value1', 'value2']]),
('[!12.34,!-12.34e-2,!true]', [12.34, -0.1234, True]),
))
def test_get_ok(self, query, expected):
"""It should query the JSON object and return the expected result."""
compare_values(self.object.get(query), expected)
@pytest.mark.parametrize('query, error', (
# Basic
('.', 'Invalid query starting with a path delimiter.'),
('|', 'Invalid query starting with a path delimiter.'),
('.name', 'Invalid query starting with a path delimiter.'),
('|age', 'Invalid query starting with a path delimiter.'),
('name..first', 'Invalid query with two consecutive path delimiters.'),
('name||first', 'Invalid query with two consecutive path delimiters.'),
('name.|first', 'Invalid query with two consecutive path delimiters.'),
('name|.first', 'Invalid query with two consecutive path delimiters.'),
('age.0', "Integer query part on unsupported object type "),
('friends.99', 'Index `99` out of range for sequence object with 3 items in query.'),
('name.nonexistent', 'Mapping object does not have key `nonexistent`.'),
('name.1', 'Mapping object does not have key `1`.'),
('children.invalid', 'Invalid or unsupported query part `invalid`.'),
('children.', 'Delimiter at the end of the query.'),
('children\\', 'Escape character at the end of the query.'),
# Wildcards
('x*', 'No key matching pattern with wildcard `x*`'),
('??????????', 'No key matching pattern with wildcard `??????????`'),
('children.x*', "Wildcard matching key `x*` requires a mapping object, got instead."),
('(-?', 'No key matching pattern with wildcard `(-?`'),
# Queries
('#', "Expected a sequence like object for query part # at the end of the query, got ."),
('#.invalid', 'Invalid or unsupported query part `invalid`.'),
('friends.#(=="Murphy")', 'Query on mapping like objects require a key before the operator.'),
('friends.#(last=={1: 2})', 'Invalid value `{1: 2}` for the query key `last`'),
('friends.#(invalid', 'Unbalanced parentheses `(`, 1 still opened.'),
('#(first)', 'Queries are supported only for sequence like objects'),
('friends.#(invalid)', 'Query for first element does not match anything.'),
('friends.#(last=="invalid")', 'Query for first element does not match anything.'),
('friends.#(first%"D?")', 'Query for first element does not match anything.'),
('friends.#(last=="Murphy")invalid', 'Expected delimiter or end of query after closing parenthesis.'),
('children.#()', 'Empty or invalid query.'),
('children.#()#', 'Empty or invalid query.'),
('friends.#.invalid.#()', 'Empty or invalid query.'),
('friends.#.invalid.#()#', 'Empty or invalid query.'),
# Dot vs Pipe
('friends.#(last="Murphy")#|first', 'Invalid or unsupported query'),
# Modifiers
('@', 'Got empty modifier name.'),
('friends.@', 'Got empty modifier name.'),
('friends.@pretty:', 'Modifier with options separator `:` without any option.'),
('friends.@pretty:{invalid', 'Unable to load modifier options.'),
('friends.@pretty:["invalid"]', "Expected JSON object `{...}` as modifier options."),
('friends.@invalid', 'Unknown modifier @invalid.'),
('friends.@in"valid', 'Invalid modifier name @in"valid, the following characters are not allowed'),
# JSON Lines
('..name', 'Invalid query starting with a path delimiter.'),
# Multipaths
(r'{"a\ge":age}', r'Failed to parse multipaths key "a\ge"'),
('{"age",age}', 'Expected colon after multipaths item with key "age".'),
('{]', 'Unbalanced parentheses `{`, 1 still opened.'),
('{', 'Unbalanced parentheses `{`, 1 still opened.'),
('{}@pretty', 'Expected delimiter or end of query after closing parenthesis.'),
('[{age}}]', 'Missing separator after multipath.'),
('{[age]]}', 'Missing separator after multipath.'),
('[{age,name.first]},age]', 'Expected delimiter or end of query after closing parenthesis.'),
# Literals
('!', 'Unable to load literal JSON'),
('name.!', 'Unable to load literal JSON'),
('!invalid', 'Unable to load literal JSON'),
(r'!in\valid', 'Unable to load literal JSON'),
('!0.a', 'Invalid or unsupported query part `a`.'),
('!0.1ea', 'Invalid or unsupported query part `ea`.'),
('!-12.', 'Delimiter at the end of the query.'),
('!-12.e', 'Invalid or unsupported query part `e`.'),
('name.!invalid', 'Unable to load literal JSON'),
('!"invalid', 'Unable to find end of literal string.'),
('friends.#|!invalid', 'Unable to load literal JSON'),
('!{true,', 'Unbalanced parentheses `{`, 1 still opened.'),
('![true,', 'Unbalanced parentheses `[`, 1 still opened.'),
('!"value".invalid', 'Invalid or unsupported query part `invalid`.'),
('name.!"value"', 'Unable to load literal JSON: literal afer a dot delimiter.'),
))
def test_get_parser_raise(self, query, error):
"""It should raise a GJSONParseError error with the expected message."""
with pytest.raises(gjson.GJSONParseError, match=re.escape(error)):
self.object.get(query)
@pytest.mark.parametrize('query, error', (
# Basic
('', 'Empty query.'),
# Modifiers
('children.@keys', 'The current object does not have a keys() method.'),
('children.@values', 'The current object does not have a values() method.'),
('age.@group', "Modifier @group got object of type as input, expected dictionary."),
('children.@group', "Modifier @group got object of type as input, expected dictionary."),
))
def test_get_raise(self, query, error):
"""It should raise a GJSONError error with the expected message."""
with pytest.raises(gjson.GJSONError, match=re.escape(error)):
self.object.get(query)
class TestEscape:
"""Test gjson for all the escape sequences."""
def setup_method(self):
"""Initialize the test instance."""
self.escape = gjson.GJSON(INPUT_ESCAPE)
@pytest.mark.parametrize('query, expected', (
(r'test.\*', 'valZ'),
(r'test.\*v', 'val0'),
(r'test.keyv\*', 'val1'),
(r'test.key\*v', 'val2'),
(r'test.keyv\?', 'val3'),
(r'test.key\?v', 'val4'),
(r'test.keyv\.', 'val5'),
(r'test.key\.v', 'val6'),
(r'test.keyk\*.key\?', 'val7'),
('test.1key', 'val8'),
))
def test_get_ok(self, query, expected):
"""It should query the escape test JSON and return the expected result."""
assert self.escape.get(query, quiet=True) == expected
class TestBasic:
"""Test gjson for basic queries."""
def setup_method(self):
"""Initialize the test instance."""
self.basic = gjson.GJSON(INPUT_BASIC)
@pytest.mark.parametrize('query, expected', (
('loggy.programmers.#(age=101).firstName', 1002.3),
('loggy.programmers.#(firstName != "Brett").firstName', 'Jason'),
('loggy.programmers.#(firstName % "Bre*").email', 'aaaa'),
('loggy.programmers.#(firstName !% "Bre*").email', 'bbbb'),
('loggy.programmers.#(firstName == "Brett").email', 'aaaa'),
('loggy.programmers.#.firstName', ['Brett', 'Jason', 'Elliotte', 1002.3]),
('loggy.programmers.#.asd', []),
('items.3.tags.#', 3),
('items.3.points.1.#', 2),
('items.#', 8),
('vals.#', 3),
('name2.here', r'B\"R'),
('arr.#', 6),
('arr.3.hello', 'world'),
('name.first', 'tom'),
('name.last', None),
('age', 100),
('happy', True),
('immortal', False),
('noop', {'what is a wren?': 'a bird'}),
# Modifiers
('arr.@join', {'hello': 'world'}),
))
def test_get_ok(self, query, expected):
"""It should query the basic test JSON and return the expected result."""
assert self.basic.get(query) == expected
class TestList:
"""Test gjson queries on a list object."""
def setup_method(self):
"""Initialize the test instance."""
self.list = gjson.GJSON(INPUT_LIST)
@pytest.mark.parametrize('query, expected', (
# Dot vs Pipe
('#.first', ['Dale', 'Jane']),
('#.first.#', []),
('#.first|#', 2),
('#.0', []),
('#.#', []),
# Queries
('#(first)#', [{'first': 'Dale'}, {'first': 'Jane'}]),
('#(first)', {'first': 'Dale'}),
('#(last)#', [{'last': 'Murphy'}]),
('#(last)', {'last': 'Murphy'}),
# Modifiers
('@join', {'first': 'Jane', 'last': 'Murphy'}),
# Multipaths
('#.{first.@reverse}', [{'@reverse': 'Dale'}, {'@reverse': 'Jane'}, {}]),
))
def test_get_ok(self, query, expected):
"""It should query the list test JSON and return the expected result."""
assert self.list.get(query, quiet=False) == expected
@pytest.mark.parametrize('query, error', (
# Dot vs Pipe
('#|first', 'Invalid or unsupported query part `first`.'),
('#|0', 'Integer query part after a pipe delimiter on an sequence like object.'),
('#|#', 'The pipe delimiter cannot immediately follow the # element.'),
))
def test_get_raise(self, query, error):
"""It should raise a GJSONError error with the expected message."""
with pytest.raises(gjson.GJSONParseError, match=re.escape(error)):
self.list.get(query)
class TestFlattenModifier:
"""Test gjson @flatten modifier."""
def setup_method(self):
"""Initialize the test instance."""
self.list = gjson.GJSON(json.loads('[1, [2], [3, 4], [5, [6, 7]], [8, [9, [10, 11]]]]'))
@pytest.mark.parametrize('query, expected', (
('@flatten', [1, 2, 3, 4, 5, [6, 7], 8, [9, [10, 11]]]),
('@flatten:{"deep":true}', [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]),
))
def test_get(self, query, expected):
"""It should correctly flatten the given object."""
assert self.list.get(query, quiet=True) == expected
class TestTruthiness:
"""Testing gjson with an input object with truthy/falsy objects."""
def setup_method(self):
"""Initialize the test instance."""
self.object = gjson.GJSON(INPUT_TRUTHINESS)
@pytest.mark.parametrize('query, expected', (
('vals.#(b==~true).a', 1),
('vals.#(b==~true)#.a', [1, 2, 4, 6, 7, 8]),
('vals.#(b==~false).a', 3),
('vals.#(b==~false)#.a', [3, 5, 9, 10, 11]),
('vals.#(b==~"invalid")#', []),
))
def test_get_ok(self, query, expected):
"""It should query the JSON object and return the expected result."""
compare_values(self.object.get(query), expected)
@pytest.mark.parametrize('query, error', (
('vals.#(b==~"invalid")',
"Queries ==~ operator requires a boolean value, got instead: `invalid`"),
))
def test_get_raise(self, query, error):
"""It should raise a GJSONError error with the expected message."""
with pytest.raises(gjson.GJSONError, match=re.escape(error)):
self.object.get(query)
class TestNestedQueries:
"""Testing gjson nested queries."""
def setup_method(self):
"""Initialize the test instance."""
self.object = gjson.GJSON(INPUT_NESTED_QUERIES)
@pytest.mark.parametrize('query, expected', (
# Arrays of objects
('key.#(level1.#(level2.#(level3)))', INPUT_NESTED_QUERIES['key'][0]),
('key.#(level1.#(level2.#(level3)))#', INPUT_NESTED_QUERIES['key'][0:2]),
('key.#(level1.#(level2.#(level3.#(==0))))#', []),
('key.#(level1.#(level2.#(level3.#(=1))))', INPUT_NESTED_QUERIES['key'][0]),
('key.#(level1.#(level2.#(level3.#(=1)#)#)#)', INPUT_NESTED_QUERIES['key'][0]),
('key.#(level1.#(level2.#(level3.#(==1))))#', [INPUT_NESTED_QUERIES['key'][0]]),
('key.#(level1.#(level2.#(level3.#(==2))))', INPUT_NESTED_QUERIES['key'][0]),
('key.#(level1.#(level2.#(level3.#(=2))))#', INPUT_NESTED_QUERIES['key'][0:2]),
# Arrays of arrays
('key.#(#(#(level3)))', INPUT_NESTED_QUERIES['key'][2]),
('key.#(#(#(level3)))#', INPUT_NESTED_QUERIES['key'][2:4]),
('key.#(#(#(level3.#(==0))))#', []),
('key.#(#(#(level3.#(==1))))', INPUT_NESTED_QUERIES['key'][2]),
('key.#(#(#(level3.#(==1)#)#)#)', INPUT_NESTED_QUERIES['key'][2]),
('key.#(#(#(level3.#(==1))))#', [INPUT_NESTED_QUERIES['key'][2]]),
('key.#(#(#(level3.#(==2))))', INPUT_NESTED_QUERIES['key'][2]),
('key.#(#(#(level3.#(==2))))#', INPUT_NESTED_QUERIES['key'][2:4]),
('key.#(#(#(level3.#(>=4))))', INPUT_NESTED_QUERIES['key'][3]),
('key.#(#(#(level3.#(>=4))))#', [INPUT_NESTED_QUERIES['key'][3]]),
# Mixed
('key.#(mixed.#(#(level4)))', INPUT_NESTED_QUERIES['key'][-1]),
('key.#(mixed.#(#(level4)))#', [INPUT_NESTED_QUERIES['key'][-1]]),
))
def test_get_ok(self, query, expected):
"""It should query the JSON object and return the expected result."""
compare_values(self.object.get(query), expected)
@pytest.mark.parametrize('query, error', (
('key.#(level1.#(level2.#(level3.#(==0))))', 'Query for first element does not match anything.'),
('key.#(#(#(level3.#(==0))))', 'Query for first element does not match anything.'),
))
def test_get_raise(self, query, error):
"""It should raise a GJSONError error with the expected message."""
with pytest.raises(gjson.GJSONError, match=re.escape(error)):
self.object.get(query)
@pytest.mark.parametrize('query, expected', (
('0.0', 'zero'),
('0|0', 'zero'),
('#.0', ['zero']),
('#.1', ['one', 'one']),
('#.9', []),
('#(0="zero")#|0', {'0': 'zero', '1': 'one'}),
('#(0="zero")#.1', ['one']),
('#(0="zero")#.9', []),
('#(0="invalid")#.1', []),
))
def test_get_integer_mapping_keys_ok(query, expected):
"""It should return the expected result."""
obj = gjson.GJSON([{'0': 'zero', '1': 'one'}, {'1': 'one'}])
assert obj.get(query, quiet=True) == expected
@pytest.mark.parametrize('query, error', (
('0.1', 'Mapping object does not have key `1`.'),
('#|0', 'Integer query part after a pipe delimiter on an sequence like object.'),
('#|9', 'Integer query part after a pipe delimiter on an sequence like object.'),
('#(0="zero")#|1', 'Index `1` out of range for sequence object with 1 items in query.'),
))
def test_get_integer_mapping_keys_raise(query, error):
"""It should return the expected result."""
with pytest.raises(gjson.GJSONError, match=re.escape(error)):
gjson.GJSON([{'0': 'zero'}]).get(query)
@pytest.mark.parametrize('modifier', ('@valid', '@this'))
def test_get_modifier_unmodified_ok(modifier):
"""It should return the same object."""
obj = gjson.GJSON(INPUT_OBJECT)
assert obj.get(modifier, quiet=True) == INPUT_OBJECT
def test_get_modifier_valid_raise():
"""It should return None if the object is invalid JSON and quiet is True."""
obj = gjson.GJSON({'invalid': {1, 2}})
assert obj.get('@valid', quiet=True) is None
@pytest.mark.parametrize('data, expected', (
('[3, 1, 5, 8, 2]', [1, 2, 3, 5, 8]),
('{"b": 2, "d": 4, "c": 3, "a": 1}', {"a": 1, "b": 2, "c": 3, "d": 4}),
('"a string"', None),
))
def test_get_modifier_sort(data, expected):
"""It should return the object sorted."""
obj = gjson.GJSON(json.loads(data))
compare_values(obj.get('@sort', quiet=True), expected)
@pytest.mark.parametrize('data, query, expected', (
({'a': '{"b": 25}'}, 'a.@fromstr', {'b': 25}),
({'a': '{"b": 25}'}, 'a.@fromstr.b', 25),
))
def test_get_modifier_fromstr_ok(data, query, expected):
"""It should load the JSON-encoded string."""
obj = gjson.GJSON(data)
assert obj.get(query, quiet=True) == expected
@pytest.mark.parametrize('query, error', (
('a.@fromstr', 'The current @fromstr input object cannot be converted to JSON.'),
('b.@fromstr', "Modifier @fromstr got object of type as input, expected string or bytes."),
))
def test_get_modifier_fromstr_raise(query, error):
"""It should raise a GJSONError if the JSON-encoded string has invalid JSON."""
obj = gjson.GJSON({'a': '{"invalid: json"', 'b': {'not': 'a string'}})
with pytest.raises(gjson.GJSONError, match=re.escape(error)):
obj.get(query)
def test_get_modifier_tostr_raise():
"""It should raise a GJSONError if the object cannot be JSON-encoded."""
obj = gjson.GJSON({'a': {1, 2, 3}}) # Python sets cannot be JSON-encoded
match = re.escape('The current object cannot be converted to a JSON-encoded string for @tostr.')
with pytest.raises(gjson.GJSONError, match=match):
obj.get('a.@tostr')
def test_get_modifier_group_ok():
"""It should group the dict of lists into a list of dicts."""
obj = gjson.GJSON({
'invalid1': 5,
'id': ['123', '456', '789'],
'val': [2, 1],
'invalid2': 'invalid',
'unit': ['ms', 's', 's', 'ms'],
})
assert obj.get('@group') == [
{'id': '123', 'val': 2, 'unit': 'ms'},
{'id': '456', 'val': 1, 'unit': 's'},
{'id': '789', 'unit': 's'},
{'unit': 'ms'},
]
def test_get_modifier_group_empty():
"""It should return an empty list if no values are lists or are empty."""
obj = gjson.GJSON({'invalid1': 5, 'invalid2': 'invalid', 'invalid3': {'a': 5}, 'id': []})
assert obj.get('@group') == []
def test_get_integer_index_on_mapping():
"""It should access the integer as string key correctly."""
obj = gjson.GJSON(json.loads('{"1": 5, "11": 7}'))
assert obj.get('1') == 5
assert obj.get('11') == 7
def test_module_get():
"""It should return the queried object."""
assert gjson.get({'key': 'value'}, 'key') == 'value'
def test_gjson_get_gjson():
"""It should return the queried object as a GJSON object."""
ret = gjson.GJSON(INPUT_OBJECT).get_gjson('children')
assert isinstance(ret, gjson.GJSON)
assert str(ret) == '["Sara", "Alex", "Jack"]'
@pytest.mark.parametrize('data, num, expected', (
# Valid data
('[1, 2, 3, 4, 5]', None, {1: 1, 2: 1, 3: 1, 4: 1, 5: 1}),
('[1, 2, 3, 4, 5]', 0, {}),
('[1, 2, 3, 4, 5]', 2, {1: 1, 2: 1}),
('[1, 1, 1, 1, 1]', None, {1: 5}),
('[1, 1, 1, 1, 1]', 1, {1: 5}),
('[1, 1, 1, 2, 2, 3]', None, {1: 3, 2: 2, 3: 1}),
('[1, 1, 1, 2, 2, 3, 3, 3, 3]', None, {3: 4, 1: 3, 2: 2}),
('[1, 1, 1, 2, 2, 3, 3, 3, 3]', 2, {3: 4, 1: 3}),
# Invalid data
('{"key": "value"}', None, None),
('1', None, None),
('"value"', None, None),
))
def test_get_modifier_top_n(data, num, expected):
"""It should return the top N common items."""
obj = gjson.GJSON(json.loads(data))
if num is not None:
compare_values(obj.get(f'@top_n:{{"n": {num}}}', quiet=True), expected)
else:
compare_values(obj.get('@top_n', quiet=True), expected)
@pytest.mark.parametrize('num, expected', (
(0, {}),
(1, {"c": 12}),
(2, {"c": 12, "a": 8}),
(3, {"c": 12, "a": 8, "d": 4}),
(4, {"c": 12, "a": 8, "d": 4, "b": 3.5}),
(None, {"c": 12, "a": 8, "d": 4, "b": 3.5}),
))
def test_get_modifier_sum_n_valid(num, expected):
"""It should group and sum and return the top N items."""
obj = gjson.GJSON(INPUT_SUM_N)
if num is not None:
compare_values(obj.get(f'@sum_n:{{"group": "key", "sum": "value", "n": {num}}}', quiet=True), expected)
else:
compare_values(obj.get('@sum_n:{"group": "key", "sum": "value"}', quiet=True), expected)
@pytest.mark.parametrize('data', (
'{"an": "object"}',
'"a string"',
'1',
))
def test_get_modifier_sum_n_invalid_data(data):
"""It should raise a GJSONError if the input is invalid."""
obj = gjson.GJSON(json.loads(data))
with pytest.raises(gjson.GJSONError, match="@sum_n modifier not supported for object of type"):
obj.get('@sum_n:{"group": "key", "sum": "value"}')
@pytest.mark.parametrize('options', (
'',
':{}',
':{"group": "invalid", "sum": "value"}',
':{"group": "key", "sum": "invalid"}',
':{"group": "key", "sum": "other"}',
':{"group": "other", "sum": "value"}',
))
def test_get_modifier_sum_n_invalid_options(options):
"""It should raise a GJSONError if the options are invalid."""
obj = gjson.GJSON(INPUT_SUM_N)
with pytest.raises(gjson.GJSONError, match="Modifier @sum_n raised an exception"):
obj.get(f'@sum_n{options}')
class TestJSONOutput:
"""Test class for all JSON output functionalities."""
def setup_method(self):
"""Initialize the test instance."""
self.obj = {'key': 'value', 'hello world': '\u3053\u3093\u306b\u3061\u306f\u4e16\u754c'}
self.query = 'key'
self.value = '"value"'
self.gjson = gjson.GJSON(self.obj)
def test_module_get_as_str(self):
"""It should return the queried object as a JSON string."""
assert gjson.get(self.obj, self.query, as_str=True) == self.value
assert gjson.get(self.obj, '', as_str=True, quiet=True) == ''
def test_gjson_getj(self):
"""It should return the queried object as a JSON string."""
assert self.gjson.getj(self.query) == self.value
assert self.gjson.getj('', quiet=True) == ''
def test_module_get_as_str_raise(self):
"""It should raise a GJSONError with the proper message on failure."""
with pytest.raises(gjson.GJSONError, match='Empty query.'):
gjson.get(self.obj, '', as_str=True)
def test_gjson_get_as_str_raise(self):
"""It should raise a GJSONError with the proper message on failure."""
with pytest.raises(gjson.GJSONError, match='Empty query.'):
self.gjson.getj('')
@pytest.mark.parametrize('query, expected', (
('@pretty', '{\n "key": "value",\n "hello world": "\u3053\u3093\u306b\u3061\u306f\u4e16\u754c"\n}'),
('@pretty:{"indent": 4}',
'{\n "key": "value",\n "hello world": "\u3053\u3093\u306b\u3061\u306f\u4e16\u754c"\n}'),
('@pretty:{"indent": "\t"}',
'{\n\t"key": "value",\n\t"hello world": "\u3053\u3093\u306b\u3061\u306f\u4e16\u754c"\n}'),
# Multipaths
('{key,"another":key}.@pretty', '{\n "key": "value",\n "another": "value"\n}'),
))
def test_modifier_pretty(self, query, expected):
"""It should prettyfy the JSON string based on the parameters."""
assert self.gjson.getj(query) == expected
def test_modifier_pretty_sort_keys_prefix(self):
"""It should prettyfy the JSON string and sort the keys."""
output = gjson.GJSON({'key2': 'value2', 'key1': 'value1'}).getj('@pretty:{"sortKeys": true, "prefix": "## "}')
assert output == '## {\n## "key1": "value1",\n## "key2": "value2"\n## }'
def test_modifier_ugly(self):
"""It should uglyfy the JSON string."""
assert gjson.get(self.obj, '@ugly', as_str=True) == (
'{"key":"value","hello world":"\u3053\u3093\u306b\u3061\u306f\u4e16\u754c"}')
def test_output_unicode(self):
"""It should return unicode characters as-is."""
assert gjson.get(self.obj, 'hello world', as_str=True) == '"\u3053\u3093\u306b\u3061\u306f\u4e16\u754c"'
def test_modifier_ascii(self):
"""It should escape all non-ASCII characters."""
assert gjson.get(self.obj, 'hello world.@ascii', as_str=True) == (
'"\\u3053\\u3093\\u306b\\u3061\\u306f\\u4e16\\u754c"')
def custom_sum(options, obj, *, last):
"""Custom modifier function."""
assert last is True
assert options == {}
if not isinstance(obj, list):
raise RuntimeError('@sum can be used only on lists')
return sum(obj)
class TestCustomModifiers:
"""Test class for custom modifiers."""
def setup_method(self):
"""Initialize the test instance."""
self.valid_obj = [1, 2, 3, 4, 5]
self.invalid_obj = 'invalid'
self.query = '@sum'
def test_gjson_register_modifier_ok(self):
"""It should register a valid modifier."""
obj = gjson.GJSON(self.valid_obj)
obj.register_modifier('sum', custom_sum)
assert obj.get(self.query) == 15
def test_gjson_register_modifier_with_escape_ok(self):
"""It should register a valid modifier with escaped characters in the name."""
obj = gjson.GJSON(self.valid_obj)
obj.register_modifier('sum\\=', custom_sum)
assert obj.get('@sum\\=') == 15
@pytest.mark.parametrize('char', MODIFIER_NAME_RESERVED_CHARS)
def test_gjson_register_modifier_invalid_name(self, char):
"""It should raise a GJSONError if trying to register a modifier with a name with not allowed characters."""
obj = gjson.GJSON(self.valid_obj)
name = fr'a{char}b'
with pytest.raises(
gjson.GJSONError,
match=fr'Unable to register modifier `{re.escape(name)}`, contains at least one not allowed'):
obj.register_modifier(name, custom_sum)
def test_gjson_register_modifier_override_builtin(self):
"""It should raise a GJSONError if trying to register a modifier with the same name of a built-in one."""
obj = gjson.GJSON(self.valid_obj)
with pytest.raises(gjson.GJSONError,
match=r'Unable to register a modifier with the same name of the built-in modifier: @valid'):
obj.register_modifier('valid', custom_sum)
def test_gjson_register_modifier_not_callable(self):
"""It should raise a GJSONError if trying to register a modifier that is not callable."""
obj = gjson.GJSON(self.valid_obj)
with pytest.raises(gjson.GJSONError, match=r'The given func "not-callable" for the custom modifier @sum does'):
obj.register_modifier('sum', 'not-callable')
def test_gjsonobj_custom_modifiers_ok(self):
"""It should register a valid modifier."""
obj = gjson.GJSONObj(self.valid_obj, self.query, custom_modifiers={'sum': custom_sum})
assert obj.get() == 15
def test_gjsonobj_custom_modifiers_raise(self):
"""It should encapsulate the modifier raised exception in a GJSONError."""
with pytest.raises(gjson.GJSONError,
match=r'Modifier @sum raised an exception'):
gjson.GJSONObj(self.invalid_obj, self.query, custom_modifiers={'sum': custom_sum}).get()
def test_gjsonobj_custom_modifiers_override_builtin(self):
"""It should raise a GJSONError if passing custom modifiers that have the same name of a built-in one."""
with pytest.raises(gjson.GJSONError,
match=r"Some provided custom_modifiers have the same name of built-in ones: {'valid'}"):
gjson.GJSONObj(self.valid_obj, self.query, custom_modifiers={'valid': custom_sum})
def test_gjsonobj_custom_modifiers_not_callable(self):
"""It should raise a GJSONError if passing custom modifiers that are not callable."""
with pytest.raises(gjson.GJSONError, match=r'The given func "not-callable" for the custom modifier @sum does'):
gjson.GJSONObj(self.valid_obj, self.query, custom_modifiers={'sum': 'not-callable'})
def test_gjsonobj_builtin_modifiers(self):
"""It should return a set with the names of the built-in modifiers."""
expected = {'ascii', 'flatten', 'fromstr', 'group', 'join', 'keys', 'pretty', 'reverse', 'sort', 'sum_n',
'this', 'top_n', 'tostr', 'valid', 'values', 'ugly'}
assert gjson.GJSONObj.builtin_modifiers() == expected
python-gjson-1.0.0/tox.ini 0000664 0000000 0000000 00000003260 14363731677 0015504 0 ustar 00root root 0000000 0000000 [tox]
minversion = 3.10.0
envlist = py{39,310,311}-{flake8,unit,bandit,mypy,prospector,sphinx}
skip_missing_interpreters = True
[testenv]
usedevelop = True
download = True
whitelist_externals = sed
description =
flake8: Style consistency checker
unit: Run unit tests
bandit: Security-oriented static analyzer
mypy: Static analyzer for type annotations
prospector: Static analysis multi-tool
sphinx: Build documentation and manpages
py39: (Python 3.9)
py310: (Python 3.10)
py311: (Python 3.11)
envdir =
prospector: {toxworkdir}/{envname}
py39-!prospector: {toxworkdir}/py39-tests
py310-!prospector: {toxworkdir}/py310-tests
py311-!prospector: {toxworkdir}/py311-tests
commands =
flake8: flake8 setup.py gjson tests
unit: py.test --strict-markers --cov-report=term-missing --cov=gjson tests/unit {posargs}
# Skip some bandit issues:
# - assert_used (B101) in tests
bandit: bandit -l -i -r gjson/
bandit: bandit -l -i -r --skip B101 tests/
mypy: mypy --show-error-codes gjson/
prospector: prospector --profile '{toxinidir}/prospector.yaml' {posargs} {toxinidir}
sphinx: python setup.py build_sphinx -W -b html
sphinx: python setup.py build_sphinx -W -b man
# Fix missing space after bold blocks in man page: https://github.com/ribozz/sphinx-argparse/issues/80
sphinx: sed -i='' -e 's/^\.B/.B /' '{toxinidir}/doc/build/man/gjson.1'
deps =
# Use install_requires and the additional extras_require[tests/prospector] from setup.py
prospector: .[prospector]
!prospector: .[tests]
[flake8]
max-line-length = 120
statistics = True
ignore = W503
import-order-style = edited
application-import-names = gjson