rpcq-3.11.0/LICENSE

                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. Definitions.

"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.

"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.

"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.

"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.

"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.

"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.

"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).

"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.

"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."

"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.

2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.

3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.

4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:

(a) You must give any other recipients of the Work or Derivative Works a copy of this License; and

(b) You must cause any modified files to carry prominent notices stating that You changed the files; and

(c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and

(d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License.

You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.

5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.

6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.

7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.

8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.

9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.

END OF TERMS AND CONDITIONS

APPENDIX: How to apply the Apache License to your work.

To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "[]" replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives.

   Copyright [yyyy] [name of copyright owner]

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.

rpcq-3.11.0/MANIFEST.in

include LICENSE
include README.md
include requirements.txt
include VERSION.txt

rpcq-3.11.0/PKG-INFO

Metadata-Version: 2.1
Name: rpcq
Version: 3.11.0
Summary: The RPC framework and message specification for Rigetti QCS.
Home-page: https://github.com/rigetticomputing/rpcq.git
Author: Rigetti Computing
Author-email: info@rigetti.com
License: Apache-2.0
Description:

rpcq
====

[![pypi version](https://img.shields.io/pypi/v/rpcq.svg)](https://pypi.org/project/rpcq/)
[![conda-forge version](https://img.shields.io/conda/vn/conda-forge/rpcq.svg)](https://anaconda.org/conda-forge/rpcq)
[![docker pulls](https://img.shields.io/docker/pulls/rigetti/rpcq.svg)](https://hub.docker.com/r/rigetti/rpcq)

The asynchronous RPC client-server framework and message specification for [Rigetti Quantum Cloud Services (QCS)](https://www.rigetti.com/). Implements an efficient transport protocol by using [ZeroMQ](http://zeromq.org/) (ZMQ) sockets and [MessagePack](https://msgpack.org/index.html) (`msgpack`) serialization.
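The wire format tags each serialized payload with its class name so the receiving side can reconstruct the original message object — this is what `_default` and `_object_hook` in `rpcq/_base.py` (included later in this archive) implement for msgpack and JSON. A stdlib-only sketch of that round-trip pattern, using `json` in place of msgpack and a hypothetical `RPCRequest` dataclass for illustration:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class RPCRequest:
    id: str
    method: str
    params: list

# Registry mapping type names back to classes, analogous to Message.types() in rpcq
TYPES = {"RPCRequest": RPCRequest}

def default(obj):
    # On the way out: tag the payload with its class name, as rpcq's _default does
    d = asdict(obj)
    d["_type"] = type(obj).__name__
    return d

def object_hook(obj):
    # On the way in: look up the tag and reconstruct the tagged class
    cls = TYPES.get(obj.pop("_type", None))
    return cls(**obj) if cls else obj

wire = json.dumps(RPCRequest(id="1", method="test", params=[]), default=default)
msg = json.loads(wire, object_hook=object_hook)  # -> RPCRequest(id='1', method='test', params=[])
```

rpcq applies the same idea with `msgpack.dumps(..., default=...)` and `msgpack.loads(..., object_hook=...)`, which is what makes the Python and Lisp ends interoperable.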
Not intended to be a full-featured replacement for other frameworks like [gRPC](https://grpc.io/) or [Apache Thrift](https://thrift.apache.org/).

Python Installation
-------------------

To install directly from the source, run `pip install -e .` from within the top-level directory of the `rpcq` repository. To additionally install the requirements for testing, make sure to run `pip install -r requirements.txt`.

To instead install the latest released version of `rpcq` from the Python package manager PyPI, run `pip install rpcq`.

**NOTE**: We strongly encourage users of `rpcq` to install the software within a (Python) virtual environment (read up on [`virtualenv`](https://github.com/pypa/virtualenv), [`pyenv`](https://github.com/pyenv/pyenv), or [`conda`](https://github.com/conda/conda) for more info).

Lisp Installation
-----------------

Installation is easier with QuickLisp. After placing the source for RPCQ within your local Lisp projects directory (cf. `ql:*local-project-directories*`), run `(ql:quickload :rpcq)` and QuickLisp will download the necessary Lisp dependencies.

In addition to the Lisp dependencies, RPCQ depends on ZeroMQ. Be sure to install both the library *and* its development headers (which are necessary for the Lisp foreign-function interface to get its bearings).

Using the Client-Server Framework
---------------------------------

The following two code samples (first in Python, then in Lisp) demonstrate how to create a server, add a test handler, and spin it up.

```python
from rpcq import Server

server = Server()

@server.rpc_handler
def test():
    return 'test'

server.run('tcp://*:5555')
```

```lisp
(defun test ()
  "test")

(let ((dt (rpcq:make-dispatch-table)))
  (rpcq:dispatch-table-add-handler dt 'test)
  (rpcq:start-server :dispatch-table dt
                     :listen-addresses '("tcp://*:5555")))
```

In another window, we can (again first in Python, then in Lisp) create a client that points to the same socket, and call the test method.
```python
from rpcq import Client

client = Client('tcp://localhost:5555')
client.call('test')
```

```lisp
(rpcq:with-rpc-client (client "tcp://localhost:5555")
  (rpcq:rpc-call client "test"))
```

In all cases (including interoperating a client/server pair written in different languages), this will return the string `'test'`.

Using the Message Spec
----------------------

The message spec as defined in `src/messages.lisp` (which in turn produces `rpcq/messages.py`) is meant to be used with the [Rigetti QCS](https://www.rigetti.com/qcs) platform. Therefore, these messages are used in [`pyquil`](https://github.com/rigetticomputing/pyquil), in order to allow users to communicate with the Rigetti Quil compiler and quantum processing units (QPUs). PyQuil provides utilities for users to interact with the QCS API and write programs in [Quil](https://arxiv.org/abs/1608.03355), the quantum instruction language developed at Rigetti.

Thus, most users will not interact with `rpcq.messages` directly. However, for those interested in building their own implementation of the QCS API utilities in pyQuil, becoming acquainted with the client-server framework, the available messages in the message spec, and how they are used in the `pyquil.api` module would be a good place to start!

Updating the Python Message Bindings
------------------------------------

Currently only Python bindings are available for the message spec, but more language bindings are in the works. To update the Python message bindings after editing `src/messages.lisp`, open `rlwrap sbcl` and run:

```lisp
(ql:quickload :rpcq)
(with-open-file (f "rpcq/messages.py" :direction :output :if-exists :supersede)
  (rpcq:python-message-spec f))
```

**NOTE**: Requires pre-installed [`sbcl`](http://www.sbcl.org/), [`quicklisp`](https://www.quicklisp.org/beta/), and (optionally) [`rlwrap`](https://github.com/hanslub42/rlwrap).

We can also use the rpcq docker container to update the message spec without having to install the requirements.
```bash
./docker_update_python_spec.sh
```

Running the Unit Tests
----------------------

The `rpcq` repository is configured with GitLab CI to automatically run the unit tests. The tests run within a container based off of the [`rigetti/lisp`](https://hub.docker.com/r/rigetti/lisp) Docker image, which is pinned to a specific tag. If you need a more recent version of the image, update the tag in the `.gitlab-ci.yml`.

The Python unit tests can be executed locally by running `pytest` from the top-level directory of the repository (assuming you have installed the test requirements).

The Lisp unit tests can be run locally by doing the following from within `rlwrap sbcl`:

```lisp
(ql:quickload :rpcq)
(asdf:test-system :rpcq)
```

There may be some instances of `STYLE-WARNING`, but if the tests run successfully, there should be something near the bottom of the output that looks like:

```
RPCQ-TESTS (Suite)
  TEST-DEFMESSAGE  [ OK ]
```

Automated Packaging with Docker
-------------------------------

The CI pipeline for `rpcq` produces a Docker image, available at [`rigetti/rpcq`](https://hub.docker.com/r/rigetti/rpcq). To get the latest stable version of `rpcq`, run `docker pull rigetti/rpcq`.

The image is built from the [`rigetti/lisp`](https://hub.docker.com/r/rigetti/lisp) Docker image, which is pinned to a specific tag. If you need a more recent version of the image, update the tag in the `Dockerfile`. To learn more about the `rigetti/lisp` Docker image, check out the [`docker-lisp`](https://github.com/rigetti/docker-lisp) repository.

Release Process
---------------

1. Update `VERSION.txt` and dependency versions (if applicable) and push the commit to `master`.
2. Push a git tag `vX.Y.Z` that contains the same version number as in `VERSION.txt`.
3. Verify that the resulting build (triggered by pushing the tag) completes successfully.
4. Push the tagged commit to `pypi` and verify it appears [here](https://pypi.org/project/rpcq/).
5. Publish a [release](https://github.com/rigetti/rpcq/releases) using the tag as the name.
6. Close the [milestone](https://github.com/rigetti/rpcq/milestones) associated with this release, and migrate incomplete issues to the next one.

Authors
-------

Developed at [Rigetti Computing](https://github.com/rigetticomputing) by [Nikolas Tezak](https://github.com/ntezak), [Steven Heidel](https://github.com/stevenheidel), [Eric Peterson](https://github.com/ecp-rigetti), [Colm Ryan](https://github.com/caryan), [Peter Karalekas](https://github.com/karalekas), [Guen Prawiroatmodjo](https://github.com/guenp), [Erik Davis](https://github.com/kilimanjaro), and [Robert Smith](https://github.com/tarballs-are-good).

Keywords: quantum rpc qcs
Platform: UNKNOWN
Requires-Python: >=3.6
Description-Content-Type: text/markdown

rpcq-3.11.0/README.md (identical to the Description section of PKG-INFO above)

rpcq-3.11.0/VERSION.txt

3.11.0

rpcq-3.11.0/requirements.txt

# If you change the requirements here, don't forget to make the
# corresponding change in the conda-forge rpcq-feedstock here:
#
#   https://github.com/conda-forge/rpcq-feedstock/blob/master/recipe/meta.yaml
msgpack >=0.6,<2.0
python-rapidjson
pyzmq>=17
ruamel.yaml

# testing
numpy
pytest>=5.4.0
pytest-asyncio
pytest-cov

rpcq-3.11.0/rpcq/__init__.py

from rpcq._client import Client, ClientAuthConfig
from rpcq._server import Server, ServerAuthConfig

# These are imported so that the corresponding data classes are
# registered whenever
# rpcq is imported. Without which one would have
# to import the messages and core_messages modules directly before
# using, e.g., from_json / to_json.
from rpcq import messages
from rpcq import core_messages
from rpcq.version import __version__

rpcq-3.11.0/rpcq/_base.py

##############################################################################
# Copyright 2018 Rigetti Computing
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
##############################################################################
import functools
import inspect
import sys

import msgpack
import rapidjson
from ruamel import yaml

if sys.version_info < (3, 7):
    from rpcq.external.dataclasses import astuple, replace, fields
else:
    from dataclasses import astuple, replace, fields

MAX_BIN_LEN = MAX_STR_LEN = 2 ** 31 - 1  # less than 2 GB

REPR_LIST_TRUNCATION = 10
"Number of list elements to print when calling repr on a Message with a list field."


def repr_value(value):
    """
    Represent a value in human readable form. For long lists this truncates the printed
    representation.

    :param value: The value to represent.
    :return: A string representation.
    :rtype: basestring
    """
    if isinstance(value, list) and len(value) > REPR_LIST_TRUNCATION:
        return "[{},...]".format(", ".join(map(repr, value[:REPR_LIST_TRUNCATION])))
    else:
        return repr(value)


class UnknownMessageType(Exception):
    """Raised when trying to decode an unknown message type."""


class Message:
    """
    Base class for messages.
    """

    def asdict(self):
        """
        Create a dictionary ``{fieldname1: fieldvalue1, ...}`` of the Message object.

        :return: A dictionary representation of the message.
        :rtype: Dict[str,Any]
        """
        return self.__dict__.copy()

    def astuple(self):
        """
        Create a tuple ``(fieldvalue1, ...)`` of the Message object.

        :return: A tuple representation of the message.
        :rtype: Tuple[Any]
        """
        return tuple(getattr(self, f.name) for f in fields(self))

    def replace(self, **kwargs):
        """
        Return a copy of the message object where the fields given in kwargs are replaced.

        :param kwargs: The replaced fields.
        :return: A copy of self.
        """
        return replace(self, **kwargs)

    def _extend_by_deprecated_fields(self, d):
        pass

    copy = replace

    def items(self):
        return self.__dict__.items()

    def get(self, key, default):
        return self.__dict__.get(key, default)

    def __getitem__(self, item):
        return self.__dict__[item]

    def __repr__(self):
        return "{}({})".format(
            self.__class__.__name__,
            ", ".join("{}={}".format(k, repr_value(v))
                      for k, v in sorted(self.asdict().items(), key=lambda kv: kv[0])))

    def __eq__(self, other):
        return type(self) == type(other) and astuple(self) == astuple(other)

    def __hash__(self):
        return hash((self.__class__, astuple(self)))

    @staticmethod
    @functools.lru_cache()
    def types():
        """
        Return a mapping ``{type_name: (message_type, args)}`` for all defined Message's, where
        ``args`` is a list of kwarg names that message_type's __init__ function accepts.

        :return: A dictionary of ``Message`` types.
        :rtype: Dict[str,type]
        """
        types = {}
        classes_to_process = [Message]
        while classes_to_process:
            atom = classes_to_process.pop()
            classes_to_process += atom.__subclasses__()
            types[atom.__name__] = (atom, inspect.getfullargspec(atom.__init__).args)
        return types


def _default(obj):
    if isinstance(obj, Message):
        d = obj.__dict__
        obj._extend_by_deprecated_fields(d)
        d["_type"] = obj.__class__.__name__
        return d
    else:
        raise TypeError('Object of type {} is not JSON serializable'.format(obj.__class__.__name__))


def _object_hook(obj):
    if '_type' in obj:
        try:
            class_dict = Message.types()
            msg_type, allowed_args = class_dict[obj['_type']]
        except KeyError:  # pragma no coverage
            raise UnknownMessageType("The message type {} is unknown".format(obj["_type"]))

        itms = {k: v for k, v in obj.items() if k in allowed_args}
        return msg_type(**itms)
    else:
        return obj


def to_msgpack(obj):
    """
    Convert Python objects (including rpcq objects) to a msgpack byte array

    :rtype: bytes
    """
    # Docs for `use_bin_type` parameter are somewhat hard to find so they're copied here:
    #   Use bin type introduced in msgpack spec 2.0 for bytes. It also enables str8 type for
    #   unicode.
    return msgpack.dumps(obj, default=_default, use_bin_type=True)


def from_msgpack(b, *, max_bin_len=MAX_BIN_LEN, max_str_len=MAX_STR_LEN):
    """
    Convert a msgpack byte array into Python objects (including rpcq objects)
    """
    # Docs for raw parameter are somewhat hard to find so they're copied here:
    #   If true, unpack msgpack raw to Python bytes (default). Otherwise, unpack to Python str
    #   (or unicode on Python 2) by decoding with UTF-8 encoding (recommended).
# In msgpack >= 0.6, max_xxx_len is reduced from 2 GB to 1 MB, so we set the relevant ones # to 2 GB so as not to run into issues with the size of the values returned from rpcq return msgpack.loads( b, object_hook=_object_hook, raw=False, max_bin_len=max_bin_len, max_str_len=max_str_len, strict_map_key=False, ) def to_json(obj): """ Convert Python objects (including rpcq objects) to a JSON string. :rtype: str """ return rapidjson.dumps(obj, default=_default) def from_json(s): """ Convert a JSON string into Python objects (including rpcq objects). """ return rapidjson.loads(s, object_hook=_object_hook) def to_yaml_file(obj, f): """ Convert Python objects (including rpcq messages) to yaml and write it to `f`. """ yaml.dump(rapidjson.loads(to_json(obj)), f) def from_yaml_file(f): """ Read a yaml file and convert to Python objects (including rpcq messages). """ return from_json(to_json(yaml.load(f, Loader=yaml.Loader)))

rpcq-3.11.0/rpcq/_client.py

############################################################################## # Copyright 2018 Rigetti Computing # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License.
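The `_default` and `_object_hook` helpers in `_base.py` above implement a generic type-tagging scheme: the encoder stashes the class name under a `_type` key, and the decoder looks that key up to re-instantiate the matching class with only its accepted kwargs. A minimal sketch of the same pattern using the stdlib `json` module (the `Point` class and helper names here are hypothetical stand-ins, not part of rpcq):

```python
import json

class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

def default(obj):
    # Mirrors rpcq's _default: tag the instance dict with the class name.
    if isinstance(obj, Point):
        return {"_type": "Point", **obj.__dict__}
    raise TypeError("Object of type {} is not JSON serializable".format(type(obj).__name__))

def object_hook(d):
    # Mirrors rpcq's _object_hook: dispatch on the _type tag to rebuild the object.
    if d.get("_type") == "Point":
        return Point(d["x"], d["y"])
    return d

s = json.dumps(Point(1, 2), default=default)
p = json.loads(s, object_hook=object_hook)
assert (p.x, p.y) == (1, 2)
```

rpcq applies exactly this shape to msgpack (`msgpack.dumps(..., default=_default)` / `msgpack.loads(..., object_hook=_object_hook)`) and rapidjson alike, which is why the same messages round-trip through either wire format.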
############################################################################## import asyncio import logging import sys import time from typing import Dict, Optional, Union from warnings import warn import zmq import zmq.asyncio from rpcq._base import to_msgpack, from_msgpack import rpcq._utils as utils from rpcq.messages import RPCError, RPCReply if sys.version_info < (3, 7): from rpcq.external.dataclasses import dataclass else: from dataclasses import dataclass _log = logging.getLogger(__name__) # Required values for ZeroMQ curve authentication, in lieu of a TypedDict @dataclass class ClientAuthConfig: client_secret_key: bytes client_public_key: bytes server_public_key: bytes class Client: """ Client that executes methods on a remote server by sending JSON RPC requests to a socket. """ def __init__(self, endpoint: str, timeout: Optional[float] = None, auth_config: Optional[ClientAuthConfig] = None): """ Create a client that connects to a server at ``endpoint``. :param str endpoint: Socket endpoint, e.g. "tcp://localhost:1234" :param float timeout: Timeout in seconds for Server response, set to None to disable the timeout :param auth_config: The configuration values necessary to enable Curve ZeroMQ authentication. These must be provided at instantiation, so they are available when the socket is created. """ # TODO: leaving self.timeout for backwards compatibility; we should move towards using rpc_timeout only self.timeout = timeout self.rpc_timeout = timeout self.endpoint = endpoint self._auth_config = auth_config self._socket = self._connect_to_socket(zmq.Context(), endpoint) # The async socket can't be created yet because it's possible that the current event loop during Client creation # is different from the one used later to call a method, so we need to create the socket after the first call and # then cache it self._async_socket_cache = None # Mapping from request id to an event used to wake up the call that's waiting on that request.
# This is necessary to support parallel, asynchronous calls where we don't know which # receive task will receive which reply. self._events: Dict[str, asyncio.Event] = {} # Cache of replies so that different tasks can share results with each other self._replies: Dict[str, Union[RPCReply, RPCError]] = {} def __setattr__(self, key, value): """ Ensure the rpc_timeout attribute gets updated along with timeout. Currently keeping self.timeout and self.rpc_timeout for backwards compatibility. We should move towards using rpc_timeout only. :param key: attribute key :param value: attribute value :return: """ if key == 'timeout': self.rpc_timeout = value super().__setattr__(key, value) async def call_async(self, method_name: str, *args, rpc_timeout: float = None, **kwargs): """ Send JSON RPC request to a backend socket and receive reply (asynchronously) :param method_name: Method name :param args: Args that will be passed to the remote function :param float rpc_timeout: Timeout in seconds for Server response, set to None to disable the timeout :param kwargs: Keyword args that will be passed to the remote function """ # if an rpc_timeout override is not specified, use the one set in the Client attributes if rpc_timeout is None: rpc_timeout = self.rpc_timeout if rpc_timeout: # Implementation note: this simply wraps the call in a timeout and converts to the built-in TimeoutError try: return await asyncio.wait_for(self._call_async(method_name, *args, **kwargs), timeout=rpc_timeout) except asyncio.TimeoutError: raise TimeoutError(f"Timeout on client {self.endpoint}, method name {method_name}, class info: {self}") else: return await self._call_async(method_name, *args, **kwargs) async def _call_async(self, method_name: str, *args, **kwargs): """ Sends a request to the socket and then waits for the reply.
To deal with multiple, asynchronous requests we do not expect that the receive reply task scheduled from this call is the one that receives this call's reply and instead rely on Events to signal across multiple _call_async/_recv_reply tasks. """ request = utils.rpc_request(method_name, *args, **kwargs) _log.debug("Sending request: %s", request) # set up an event to notify us when the reply is received (potentially by a task scheduled by # another call to _call_async). we do this before we send the request to catch the case # where the reply comes back before we re-enter this thread self._events[request.id] = asyncio.Event() # schedule a receive task before sending, so some task is always ready to pick up the reply asyncio.ensure_future(self._recv_reply()) await self._async_socket.send_multipart([to_msgpack(request)]) await self._events[request.id].wait() reply = self._replies.pop(request.id) if isinstance(reply, RPCError): raise utils.RPCError(reply.error) else: return reply.result async def _recv_reply(self): """ Helper task to receive a reply, store the result, and trigger the associated event. """ raw_reply, = await self._async_socket.recv_multipart() reply = from_msgpack(raw_reply) _log.debug("Received reply: %s", reply) self._replies[reply.id] = reply self._events.pop(reply.id).set() def call(self, method_name: str, *args, rpc_timeout: float = None, **kwargs): """ Send JSON RPC request to a backend socket and receive reply. Note that this uses the default event loop to run in a blocking manner.
If you would rather run in an async fashion or provide your own event loop then use .call_async instead :param method_name: Method name :param args: Args that will be passed to the remote function :param float rpc_timeout: Timeout in seconds for Server response, set to None to disable the timeout :param kwargs: Keyword args that will be passed to the remote function """ # if an rpc_timeout override is not specified, use the one set in the Client attributes if rpc_timeout is None: rpc_timeout = self.rpc_timeout request = utils.rpc_request(method_name, *args, **kwargs) # Rather than change the utils.rpc_request interface in a # non-BC way, install the timeout here. This timeout is # communicated to the server, so that the server can terminate # (if it so chooses) requests that will not be received by the # client. request.client_timeout = rpc_timeout _log.debug("Sending request: %s", request) self._socket.send_multipart([to_msgpack(request)]) start_time = time.time() while True: # Need to keep track of timeout manually in case this loop runs more than once. We subtract off already # elapsed time from the timeout. The call to max is to make sure we don't send a negative value # which would throw an error. timeout = max((start_time + rpc_timeout - time.time()) * 1000, 0) if rpc_timeout is not None else None if self._socket.poll(timeout) == 0: raise TimeoutError(f"Timeout on client {self.endpoint}, method name {method_name}, class info: {self}") raw_reply, = self._socket.recv_multipart() reply = from_msgpack(raw_reply) _log.debug("Received reply: %s", reply) # there's a possibility that the socket will have some leftover replies from a previous # request on it if that .call() was cancelled or timed out. Therefore, we need to discard replies that # don't match the request just like in the call_async case.
if reply.id == request.id: break else: _log.debug('Discarding reply: %s', reply) for warning in reply.warnings: warn(f"{warning.kind}: {warning.body}") if isinstance(reply, RPCError): raise utils.RPCError(reply.error) else: return reply.result def close(self): """ Close the sockets """ self._socket.close() if self._async_socket_cache: self._async_socket_cache.close() self._async_socket_cache = None def _connect_to_socket(self, context: zmq.Context, endpoint: str): """ Connect to a DEALER socket at endpoint and turn off lingering. :param context: ZMQ Context to use (potentially async) :param endpoint: Endpoint :return: Connected socket """ socket = context.socket(zmq.DEALER) self.enable_auth(socket) socket.connect(endpoint) socket.setsockopt(zmq.LINGER, 0) _log.debug("Client connected to endpoint %s", self.endpoint) return socket @property def _async_socket(self): """ Creates a new async socket if one doesn't already exist for this Client """ if not self._async_socket_cache: self._async_socket_cache = self._connect_to_socket(zmq.asyncio.Context(), self.endpoint) return self._async_socket_cache @property def auth_configured(self) -> bool: return self._auth_config is not None def enable_auth(self, socket=None) -> bool: """ Enables Curve ZeroMQ Authentication if the necessary configuration is present """ if not self.auth_configured: return False socket.curve_secretkey = self._auth_config.client_secret_key socket.curve_publickey = self._auth_config.client_public_key socket.curve_serverkey = self._auth_config.server_public_key return True

rpcq-3.11.0/rpcq/_server.py

############################################################################## # Copyright 2018 Rigetti Computing # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in
compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. ############################################################################## """ Server that accepts JSON RPC requests and returns JSON RPC replies/errors. """ import asyncio import logging import sys from asyncio import AbstractEventLoop from typing import Callable, List, Optional, Tuple from datetime import datetime import zmq.asyncio from zmq.auth.asyncio import AsyncioAuthenticator from rpcq._base import to_msgpack, from_msgpack from rpcq._spec import RPCSpec from rpcq.messages import RPCRequest if sys.version_info < (3, 7): from rpcq.external.dataclasses import dataclass else: from dataclasses import dataclass _log = logging.getLogger(__name__) # Required values for ZeroMQ curve authentication, in lieu of a TypedDict @dataclass class ServerAuthConfig: server_secret_key: bytes server_public_key: bytes client_keys_directory: str = "" class Server: """ Server that accepts JSON RPC calls through a socket. """ def __init__(self, rpc_spec: RPCSpec = None, announce_timing: bool = False, serialize_exceptions: bool = True, auth_config: Optional[ServerAuthConfig] = None): """ Create a server that will be linked to a socket :param rpc_spec: JSON RPC spec :param announce_timing: :param serialize_exceptions: If set to True, this Server will catch all exceptions occurring internally to it and, when possible, communicate them to the interrogating Client. 
If set to False, this Server will re-raise any exceptions it encounters (including, but not limited to, those which might occur through method calls to rpc_spec) for Server's local owner to handle. IMPORTANT NOTE: When set to False, this *almost definitely* means an unrecoverable crash, and the Server should then be _shutdown(). :param auth_config: The configuration values necessary to enable Curve ZeroMQ authentication. These must be provided at instantiation, so they are available between the creation of the context and socket. """ self.announce_timing = announce_timing self.serialize_exceptions = serialize_exceptions self.rpc_spec = rpc_spec if rpc_spec else RPCSpec(serialize_exceptions=serialize_exceptions) self._exit_handlers = [] self._socket = None self._auth_config = auth_config self._authenticator = None self._preloaded_keys = None def rpc_handler(self, f: Callable): """ Add a function to the server. It will respond to JSON RPC requests with the corresponding method name. This can be used as both a side-effecting function or as a decorator. :param f: Function to add :return: Function wrapper (so it can be used as a decorator) """ return self.rpc_spec.add_handler(f) def exit_handler(self, f: Callable): """ Add an exit handler - a function which will be called when the server shuts down. :param f: Function to add """ self._exit_handlers.append(f) async def recv_multipart(self): if self.auth_enabled: return await self.recv_multipart_with_auth() else: # If auth is not enabled, then the client "User-Id" will not be retrieved from # the frames received, and we return None for that value. return (*await self._socket.recv_multipart(), None) async def recv_multipart_with_auth(self) -> Tuple[bytes, list, bytes]: """ Code taken from pyzmq itself: https://github.com/zeromq/pyzmq/blob/master/zmq/sugar/socket.py#L449 and then adapted to allow us to access the information in the frames. 
When copy=True, only the contents of the messages are returned, rather than the messages themselves. The message is necessary to be able to fetch the "User-Id", which is the public key the client used to connect to this socket. When using auth, knowing which client sent which message is important for authentication, and so we reimplement recv_multipart here, and return the client key as the final member of a tuple """ copy = False # Given a ROUTER socket, the first frame will be the sender's identity. # While, per the docs, this _should_ be retrievable from any frame with # frame.get('Identity'), in practice this value was always blank. # If we don't record the identity value, messages cannot be returned to # the correct client. identity_frame = await self._socket.recv(0, copy=copy, track=False) identity = identity_frame.bytes # The client_id is the public key the client used to establish this connection # It can be retrieved from all frames after the first. Here, we assume it # is the same among all frames, and set it to the value from the first frame client_key = None # After the identity frame, we assemble all further frames in a single buffer. parts = bytearray(b'') while self._socket.getsockopt(zmq.RCVMORE): part = await self._socket.recv(0, copy=copy, track=False) data = part.bytes if client_key is None: client_key = part.get('User-Id') if not isinstance(client_key, bytes) and client_key is not None: client_key = client_key.encode('utf-8') parts += data _log.debug(f'Received authenticated request from client_key {client_key}') return (identity, parts, client_key) async def run_async(self, endpoint: str): """ Run server main task (asynchronously). :param endpoint: Socket endpoint to listen to, e.g. 
"tcp://*:1234" """ self._connect(endpoint) # spawn an initial listen task listen_task = asyncio.ensure_future(self.recv_multipart()) task_list = [listen_task] while True: dones, pendings = await asyncio.wait(task_list, return_when=asyncio.FIRST_COMPLETED) # grab one "done" task to handle task_list, done_list = list(pendings), list(dones) done = done_list.pop() task_list += done_list if done == listen_task: try: # empty_frame may either be: # 1. a single null frame if the client is a REQ socket # 2. an empty list (ie. no frames) if the client is a DEALER socket identity, *empty_frame, msg, client_key = done.result() request = from_msgpack(msg) request.params['client_key'] = client_key # spawn a processing task task_list.append(asyncio.ensure_future( self._process_request(identity, empty_frame, request))) except Exception as e: if self.serialize_exceptions: _log.exception('Exception thrown in Server run loop during request ' 'reception: {}'.format(repr(e))) else: raise e finally: # spawn a new listen task listen_task = asyncio.ensure_future(self.recv_multipart()) task_list.append(listen_task) else: # if there's been an exception during processing, consider reraising it try: done.result() except Exception as e: if self.serialize_exceptions: _log.exception('Exception thrown in Server run loop during request ' 'dispatch: {}'.format(repr(e))) else: raise e def run(self, endpoint: str, loop: AbstractEventLoop = None): """ Run server main task. :param endpoint: Socket endpoint to listen to, e.g. "tcp://*:1234" :param loop: Event loop to run server in (alternatively just use run_async method) """ if not loop: loop = asyncio.get_event_loop() try: loop.run_until_complete(self.run_async(endpoint)) except KeyboardInterrupt: self._shutdown() def stop(self): """ DEPRECATED """ pass def _shutdown(self): """ Shut down the server. 
""" for exit_handler in self._exit_handlers: exit_handler() if self._socket: self._socket.close() self._socket = None def _connect(self, endpoint: str): """ Connect the server to an endpoint. Creates a ZMQ ROUTER socket for the given endpoint. :param endpoint: Socket endpoint, e.g. "tcp://*:1234" """ if self._socket: raise RuntimeError('Cannot run multiple Servers on the same socket') context = zmq.asyncio.Context() self._socket = context.socket(zmq.ROUTER) self.start_auth(context) self._socket.bind(endpoint) _log.info("Starting server, listening on endpoint {}".format(endpoint)) async def _process_request(self, identity: bytes, empty_frame: list, request: RPCRequest): """ Executes the method specified in a JSON RPC request and then sends the reply to the socket. :param identity: Client identity provided by ZeroMQ :param empty_frame: Either an empty list or a single null frame depending on the client type :param request: JSON RPC request """ try: _log.debug("Client %s sent request: %s", identity, request) start_time = datetime.now() reply = await self.rpc_spec.run_handler(request) if self.announce_timing: _log.info("Request {} for {} lasted {} seconds".format( request.id, request.method, (datetime.now() - start_time).total_seconds())) _log.debug("Sending client %s reply: %s", identity, reply) await self._socket.send_multipart([identity, *empty_frame, to_msgpack(reply)]) except Exception as e: if self.serialize_exceptions: _log.exception('Exception thrown in _process_request') else: raise e @property def auth_configured(self) -> bool: return (self._auth_config is not None) and isinstance(self._auth_config.server_secret_key, bytes) and isinstance(self._auth_config.server_public_key, bytes) @property def auth_enabled(self) -> bool: return bool(self._socket and self._socket.curve_server) def start_auth(self, context: zmq.Context) -> bool: """ Starts the ZMQ auth service thread, enabling authorization on all sockets within this context """ if not self.auth_configured: 
return False self._socket.curve_secretkey = self._auth_config.server_secret_key self._socket.curve_publickey = self._auth_config.server_public_key self._socket.curve_server = True self._authenticator = AsyncioAuthenticator(context) if self._preloaded_keys: self.set_client_keys(self._preloaded_keys) else: self.load_client_keys_from_directory() self._authenticator.start() return True def stop_auth(self) -> bool: """ Stops the ZMQ auth service thread, allowing NULL authenticated clients (only) to connect to all threads within its context """ if self._authenticator: self._socket.curve_server = False self._authenticator.stop() return True else: return False def load_client_keys_from_directory(self, directory: Optional[str] = None) -> bool: """ Reset authorized public key list to those present in the specified directory """ # The directory must either be specified at class creation or on each method call if directory is None: if self._auth_config.client_keys_directory: directory = self._auth_config.client_keys_directory if not directory or not self.auth_configured: return False self._authenticator.configure_curve(domain='*', location=directory) return True def set_client_keys(self, client_keys: List[bytes]): """ Reset authorized public key list to this set. Avoids the disk read required by configure_curve, and allows keys to be managed externally. In some cases, keys may be preloaded before the authenticator is started. 
In this case, we cache those preloaded keys """ if self._authenticator: _log.debug(f"Authorizer: Setting client keys to {client_keys}") self._authenticator.certs['*'] = {key: True for key in client_keys} else: _log.debug(f"Authorizer: Preloading client keys to {client_keys}") self._preloaded_keys = client_keys

rpcq-3.11.0/rpcq/_spec.py

############################################################################## # Copyright 2018 Rigetti Computing # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. ############################################################################## """ Class with json_rpc_call decorator for asynchronous JSON RPC calls """ import asyncio import logging import traceback from typing import Union from rpcq._utils import rpc_reply, rpc_error, RPCMethodError, get_input, get_safe_input, \ catch_warnings from rpcq.messages import RPCRequest, RPCReply, RPCError _log = logging.getLogger(__name__) class RPCSpec(object): """ Class for keeping track of class methods that are exposed to the JSON RPC interface """ def __init__(self, *, provide_tracebacks: bool = True, serialize_exceptions: bool = True): """ Create a JsonRpcSpec object.
Usage: jr = JsonRpcSpec() class MyClass(object): def __init__(self): self.num = 5 @jr.add_handler def add(obj, *args): return sum(args) + obj.num obj = MyClass() request = { "jsonrpc": "2.0", "id": "0", "method": "add", "params": [1, 2] } reply = jr.call(request, obj) :param provide_tracebacks: If set to True, unhandled exceptions which occur during RPC call implementations will have their tracebacks forwarded to the calling client as part of the generated RPCError reply object. If set to False, the generated RPCError reply will omit this information (but the traceback will still get written to the logfile). :param serialize_exceptions: If set to True, unhandled exceptions which occur during RPC call implementations will be serialized into RPCError messages (which the Server instance will then probably send to the corresponding Client). If set to False, the exception is re-raised and left for the local caller to handle further. """ self._json_rpc_methods = {} self.provide_tracebacks = provide_tracebacks self.serialize_exceptions = serialize_exceptions def add_handler(self, f): """ Adds the function f to a dictionary of JSON RPC methods.
:param callable f: Method to be exposed :return: """ if f.__name__.startswith('rpc_'): raise ValueError("Server method names cannot start with rpc_.") self._json_rpc_methods[f.__name__] = f return f def get_handler(self, request): """ Get callable from JSON RPC request :param RPCRequest request: JSON RPC request :return: Method :rtype: callable """ try: f = self._json_rpc_methods[request.method] except (AttributeError, KeyError): # pragma no coverage raise RPCMethodError("Received invalid method '{}'".format(request.method)) return f async def run_handler(self, request: RPCRequest) -> Union[RPCReply, RPCError]: """ Process a JSON RPC request :param RPCRequest request: JSON RPC request :return: JSON RPC reply """ with catch_warnings(record=True) as warnings: try: rpc_handler = self.get_handler(request) except RPCMethodError as e: return rpc_error(request.id, repr(e), warnings=warnings) try: # Run RPC and get result args, kwargs = get_safe_input(request.params, rpc_handler) result = rpc_handler(*args, **kwargs) if asyncio.iscoroutine(result): result = await result except Exception as e: if self.serialize_exceptions: _traceback = traceback.format_exc() _log.error(_traceback) if self.provide_tracebacks: return rpc_error(request.id, "{}\n{}".format(repr(e), _traceback), warnings=warnings) else: return rpc_error(request.id, repr(e), warnings=warnings) else: raise e return rpc_reply(request.id, result, warnings=warnings)

rpcq-3.11.0/rpcq/_utils.py

############################################################################## # Copyright 2018 Rigetti Computing # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License.
# You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. ############################################################################## """Utils for message passing""" import uuid import warnings from inspect import signature from typing import Callable, Optional, Tuple, Union, List, Any import rpcq.messages def rpc_warning(warning: warnings.WarningMessage) -> rpcq.messages.RPCWarning: return rpcq.messages.RPCWarning(body=str(warning.message), kind=str(warning.category.__name__)) def rpc_request(method_name: str, *args, **kwargs) -> rpcq.messages.RPCRequest: """ Create RPC request :param method_name: Method name :param args: Positional arguments :param kwargs: Keyword arguments :return: JSON RPC formatted dict """ if args: kwargs['*args'] = args return rpcq.messages.RPCRequest( jsonrpc='2.0', id=str(uuid.uuid4()), method=method_name, params=kwargs ) def rpc_reply(id: Union[str, int], result: Optional[object], warnings: Optional[List[Warning]] = None) -> rpcq.messages.RPCReply: """ Create RPC reply :param str|int id: Request ID :param result: Result :param warnings: List of warnings to attach to the message :return: JSON RPC formatted dict """ warnings = warnings or [] return rpcq.messages.RPCReply( jsonrpc='2.0', id=id, result=result, warnings=[rpc_warning(warning) for warning in warnings] ) def rpc_error(id: Union[str, int], error_msg: str, warnings: List[Any] = []) -> rpcq.messages.RPCError: """ Create RPC error :param id: Request ID :param error_msg: Error message :param warnings: List of warnings to attach to the message :return: JSON RPC formatted dict """ return rpcq.messages.RPCError( jsonrpc='2.0', id=id,
error=error_msg, warnings=[rpc_warning(warning) for warning in warnings]) def get_input(params: Union[dict, list]) -> Tuple[list, dict]: """ Get positional or keyword arguments from JSON RPC params :param params: Parameters passed through JSON RPC :return: args, kwargs """ # Backwards compatibility for old clients that send params as a list if isinstance(params, list): args = params kwargs = {} elif isinstance(params, dict): args = params.pop('*args', []) kwargs = params else: # pragma no coverage raise TypeError( 'Unknown type {} of params, must be list or dict'.format(type(params))) return args, kwargs def get_safe_input(params: Union[dict, list], handler: Callable) -> Tuple[list, dict]: """ Get positional or keyword arguments from JSON RPC params, filtering out kwargs that aren't in the function signature :param params: Parameters passed through JSON RPC :param handler: RPC handler function :return: args, kwargs """ args, kwargs = get_input(params) handler_signature = signature(handler) kwargs = { k: v for k, v in kwargs.items() if k in handler_signature.parameters } return args, kwargs class RPCErrorError(IOError): """JSON RPC error that is raised by a Client when it receives an RPCError message""" def __init__(self, *args, **kwargs): if type(self) is RPCErrorError: warnings.warn("`RPCErrorError` is deprecated in favor of the " "less-loquacious `RPCError`.", DeprecationWarning) super().__init__(*args, **kwargs) class RPCError(RPCErrorError): """JSON RPC error that is raised by a Client when it receives an RPCError message""" class RPCMethodError(AttributeError): """JSON RPC error that is raised by JSON RPC spec for nonexistent methods""" class catch_warnings(warnings.catch_warnings): """This variant of warnings.catch_warnings both logs *and* re-emits warnings.""" def __enter__(self): super().__enter__() # the super() method does most of the work. 
what follows below is actually # also cut out of the super() method, but the relevant line there is # # self._module._showwarnmsg_impl = log.append # # we, on the other hand, want to both append *and* call the saved parent # log-displayer, so we wrap both inside of new_logger and store that instead. if self._record: log = [] def new_logger(msg): nonlocal log, self log.append(msg) self._showwarnmsg_impl(msg) self._module._showwarnmsg_impl = new_logger return log else: return None

rpcq-3.11.0/rpcq/core_messages.py

#!/usr/bin/env python """ WARNING: This file is auto-generated, do not edit by hand. See README.md. """ import sys from warnings import warn from rpcq._base import Message from typing import Any, List, Dict, Optional if sys.version_info < (3, 7): from rpcq.external.dataclasses import dataclass, field, InitVar else: from dataclasses import dataclass, field, InitVar from rpcq.messages import * @dataclass(eq=False, repr=False) class Frame(Message): """ A frame encapsulates any rotating frame relative to which control or readout waveforms may be defined. """ direction: str """'rx' or 'tx'""" sample_rate: float """The sample rate [Hz] of the associated AWG/ADC""" frequency: float """The frame frequency [Hz]""" @dataclass(eq=False, repr=False) class Resources(Message): """ The resources required by a job """ qubits: List[str] = field(default_factory=list) """A list of qubits blocked/required by a job.""" frames: Dict[str, Frame] = field(default_factory=dict) """RF/UHF frames by label.""" frames_to_controls: Dict[str, str] = field(default_factory=dict) """Mapping of frames to control channels by labels.""" @dataclass(eq=False, repr=False) class AbstractWaveform(Message): """ A waveform envelope defined for a specific frame.
This abstract class is made concrete by either a `Waveform` or a templated waveform such as `GaussianWaveform` """ frame: str """The label of the associated tx-frame.""" @dataclass(eq=False, repr=False) class Waveform(AbstractWaveform): """ A waveform envelope defined by specific IQ values for a specific frame. """ iqs: List[float] = field(default_factory=list) """The raw waveform envelope samples, alternating I and Q values.""" @dataclass(eq=False, repr=False) class TemplateWaveform(AbstractWaveform): """ A waveform envelope defined for a specific frame. A templated waveform is defined by a parameterized pulse shape rather than explicit IQ values. The message specification does not enforce that the duration is implementable on the hardware. """ duration: float """Length of the pulse in seconds""" scale: float = 1.e+0 """Scale to apply to waveform envelope""" phase: float = 0.0e+0 """Phase [units of tau=2pi] to rotate the complex waveform envelope.""" detuning: float = 0.0e+0 """Modulation to apply to the waveform in Hz""" @dataclass(eq=False, repr=False) class GaussianWaveform(TemplateWaveform): """ A Gaussian shaped waveform envelope defined for a specific frame. """ fwhm: Optional[float] = None """Full Width Half Max shape parameter in seconds""" t0: Optional[float] = None """Center time coordinate of the shape in seconds. Defaults to mid-point of pulse.""" @dataclass(eq=False, repr=False) class DragGaussianWaveform(TemplateWaveform): """ A DRAG Gaussian shaped waveform envelope defined for a specific frame. """ fwhm: Optional[float] = None """Full Width Half Max shape parameter in seconds""" t0: Optional[float] = None """Center time coordinate of the shape in seconds. Defaults to mid-point of pulse.""" anh: float = -2.1e+8 """Anharmonicity of the qubit, f01-f12 in Hz""" alpha: float = 0.0e+0 """Dimensionless DRAG parameter""" @dataclass(eq=False, repr=False) class HermiteGaussianWaveform(TemplateWaveform): """ Hermite-Gaussian shaped pulse.
Reference: Effects of arbitrary laser or NMR pulse shapes on population inversion and coherence Warren S. Warren. 81, (1984); doi: 10.1063/1.447644 """ fwhm: Optional[float] = None """Full Width Half Max shape parameter in seconds""" t0: float = 0.0e+0 """Center time coordinate of the shape in seconds. Defaults to mid-point of pulse.""" anh: float = -2.1e+8 """Anharmonicity of the qubit, f01-f12 in Hz""" alpha: float = 0.0e+0 """Dimensionless DRAG parameter""" second_order_hrm_coeff: float = 9.56e-1 """Second order coefficient (see paper)""" @dataclass(eq=False, repr=False) class ErfSquareWaveform(TemplateWaveform): """ Pulse with a flat top and rounded shoulders given by error functions """ risetime: float = 1.e-9 """The width of the rise and fall sections in seconds.""" pad_left: float = 0.0e+0 """Length of zero-amplitude padding before the pulse in seconds.""" pad_right: float = 0.0e+0 """Length of zero-amplitude padding after the pulse in seconds.""" @dataclass(eq=False, repr=False) class FlatWaveform(TemplateWaveform): """ Flat pulse. """ iq: List[float] = field(default_factory=list) """Individual IQ point to hold constant""" @dataclass(eq=False, repr=False) class AbstractKernel(Message): """ An integration kernel defined for a specific frame. This abstract class is made concrete by either a `FilterKernel` or `TemplateKernel` """ frame: str """The label of the associated rx-frame.""" @dataclass(eq=False, repr=False) class FilterKernel(AbstractKernel): """ A filter kernel to produce scalar readout features from acquired readout waveforms. """ iqs: List[float] = field(default_factory=list) """The raw kernel coefficients, alternating real and imaginary parts.""" bias: float = 0.0e+0 """The kernel is offset by this real value. Can be used to ensure the decision threshold lies at 0.0.""" @dataclass(eq=False, repr=False) class TemplateKernel(AbstractKernel): """ An integration kernel defined for a specific frame.
""" duration: float """Length of the boxcar kernel in seconds""" bias: float = 0.0e+0 """The kernel is offset by this real value. Can be used to ensure the decision threshold lies at 0.0.""" scale: float = 1.e+0 """Scale to apply to boxcar kernel""" phase: float = 0.0e+0 """Phase [units of tau=2pi] to rotate the kernel by.""" detuning: float = 0.0e+0 """Modulation to apply to the filter kernel in Hz""" @dataclass(eq=False, repr=False) class FlatKernel(TemplateKernel): """ An unnormalized flat or boxcar integration kernel. """ @dataclass(eq=False, repr=False) class BoxcarAveragerKernel(TemplateKernel): """ A normalized flat or boxcar integration kernel. """ @dataclass(eq=False, repr=False) class FilterNode(Message): """ A node in the filter pipeline. """ module: str """Absolute python module import path in which the filter class is defined.""" filter_type: str """The type (class name) of the filter.""" source: str """Filter node label of the input to this node.""" publish: bool """If True, return the output of this node with the job results (and publish a stream for it).""" params: Dict[str, float] = field(default_factory=dict) """Additional filter parameters.""" @dataclass(eq=False, repr=False) class DataAxis(Message): """ A data axis allows to label element(s) of a stream. """ name: str """Label for the axis, e.g., 'time' or 'shot_index'.""" points: List[float] = field(default_factory=list) """The sequence of values along the axis.""" @dataclass(eq=False, repr=False) class Receiver(Message): """ The receiver settings generated by the low-level translator. 
""" instrument: str """The instrument name""" channel: str """The instrument channel (label)""" stream: str """Name of the associated (raw) output stream that should be published.""" publish: bool """Whether to publish the raw output stream.""" data_axes: List[DataAxis] = field(default_factory=list) """Ordered list of DataAxis objects that together uniquely label each element in the stream.""" @dataclass(eq=False, repr=False) class ParameterExpression(Message): """ A parametric expression. """ operator: str """The operator '+', '-', '*'. The operands can be constant floating point numbers or strings referencing a dynamic program parameter or a ParameterAref to index into an array or itself a ParameterExpression.""" a: Any """The first operand""" b: Any """The second operand""" @dataclass(eq=False, repr=False) class Instruction(Message): """ An instruction superclass. """ time: float """The time at which the instruction is emitted [in seconds].""" @dataclass(eq=False, repr=False) class DebugMessage(Instruction): """ Instructs the target to emit a specified debug message. """ frame: str """The frame label that owns this debug message.""" message: int """The 2-byte wide debug message to emit.""" @dataclass(eq=False, repr=False) class Pulse(Instruction): """ Instruction to play a pulse with some (modified) waveform envelope at a specific time on a specific frame. 
""" frame: str """The tx-frame label on which the pulse is played.""" waveform: str """The waveform label""" scale: Optional[float] = 1.e+0 """Dimensionless (re-)scaling factor which is applied to the envelope.""" phase: float = 0.0e+0 """Static phase angle [units of tau=2pi] by which the envelope quadratures are rotated.""" detuning: float = 0.0e+0 """Detuning [Hz] with which the pulse envelope should be modulated relative to the frame frequency.""" @dataclass(eq=False, repr=False) class FlatPulse(Instruction): """ Instruction to play a pulse with a constant amplitude (except for phase modulation) at a specific time on a specific frame. """ frame: str """The tx-frame label on which the pulse is played.""" iq: List[float] """The I and Q value of the constant pulse.""" duration: float """The duration of the pulse in [seconds], should be a multiple of the associated tx-frame's inverse sample rate.""" phase: float = 0.0e+0 """Static phase angle [units of tau=2pi] by which the envelope quadratures are rotated.""" detuning: float = 0.0e+0 """Detuning [Hz] with which the pulse envelope should be modulated relative to the frame frequency.""" scale: Optional[float] = 1.e+0 """Dimensionless (re-)scaling factor which is applied to the envelope.""" @dataclass(eq=False, repr=False) class SetPhase(Instruction): """ Set the phase of a frame to an absolute value at a specific time. """ frame: str """The frame label for which to set the phase.""" phase: float = 0.0e+0 """Phase angle [units of tau=2pi] to update the frame phase to.""" @dataclass(eq=False, repr=False) class ShiftPhase(Instruction): """ Shift the phase of a frame by a relative value at a specific time. """ frame: str """The frame label for which to set the phase.""" delta: Any = None """Phase angle [units of tau=2pi] by which to shift the frame phase. 
Can be a numerical value, a ParameterExpression or a ParameterAref.""" @dataclass(eq=False, repr=False) class SwapPhases(Instruction): """ Swap the phases of two tx-frames at a specific time. """ frame_a: str """The first frame's label.""" frame_b: str """The second frame's label.""" @dataclass(eq=False, repr=False) class SetFrequency(Instruction): """ Set the frequency of a tx-frame to a specific value at a specific time. """ frame: str """The frame label for which to set the frequency.""" frequency: float = 0.0e+0 """The frequency [Hz] to set the frame frequency to.""" @dataclass(eq=False, repr=False) class ShiftFrequency(Instruction): """ Shift the frequency of a tx-frame by a specific amount at a specific time. """ frame: str """The frame label for which to set the frequency.""" delta: float = 0.0e+0 """Frequency shift (new-old) [Hz] to apply to the frame frequency.""" @dataclass(eq=False, repr=False) class SetScale(Instruction): """ Set the scale of a tx-frame to a value at a specific time. """ frame: str """The frame label for which to set the scale.""" scale: float = 1.e+0 """Scale (unitless) to apply to waveforms generated on the frame.""" @dataclass(eq=False, repr=False) class Capture(Instruction): """ Specify an acquisition on an rx-frame as well as the filters to apply. """ frame: str """The rx-frame label on which to trigger the acquisition.""" duration: float """The duration of the acquisition in [seconds]""" filters: List[str] = field(default_factory=list) """An ordered list of labels of filter kernels to apply to the captured waveform.""" send_to_host: bool = True """Transmit the readout bit back to Lodgepole. 
(Unnecessary for fully calibrated active reset captures).""" phase: float = 0.0e+0 """Static phase angle [units of tau=2pi] by which the envelope quadratures are rotated.""" detuning: float = 0.0e+0 """Detuning [Hz] with which the pulse envelope should be modulated relative to the frame frequency.""" @dataclass(eq=False, repr=False) class MNIOConnection(Message): """ Description of one side of an MNIO connection between two Tsunamis. """ port: int """The physical Tsunami MNIO port, indexed from 0, where this connection originates.""" destination: str """The Tsunami where this connection terminates.""" @dataclass(eq=False, repr=False) class Program(Message): """ The dynamic aspects (waveforms, readout kernels, scheduled instructions and parameters) of a job. """ waveforms: Dict[str, AbstractWaveform] = field(default_factory=dict) """The waveforms appearing in the program by waveform label.""" filters: Dict[str, AbstractKernel] = field(default_factory=dict) """The readout filter kernels appearing in the program by feature label.""" scheduled_instructions: List[Instruction] = field(default_factory=list) """The ordered sequence of scheduled instruction objects.""" parameters: Dict[str, ParameterSpec] = field(default_factory=dict) """A mapping of dynamic parameter names to their type specification.""" @dataclass(eq=False, repr=False) class ScheduleIRJob(Message): """ The unit of work to be executed. """ num_shots: int """How many repetitions the job should be executed for.""" resources: Resources """The resources required by the job.""" program: Program """The actual program to be executed.""" operating_point: Dict[str, Dict] = field(default_factory=dict) """Operating points or static instrument channel settings (mapping control_name (instrument name) -> instrument channel settings (instrument settings) dictionary).""" filter_pipeline: Dict[str, FilterNode] = field(default_factory=dict) """The filter pipeline. 
Mapping of node labels to FilterNodes.""" job_id: InitVar[Optional[str]] = None """A unique ID to help the submitter track the job.""" def _extend_by_deprecated_fields(self, d): super()._extend_by_deprecated_fields(d) def __post_init__(self, job_id): if job_id is not None: warn('job_id is deprecated; please don\'t set it anymore') @dataclass(eq=False, repr=False) class RackMeta(Message): """ Meta information about a rack configuration. """ rack_id: Optional[str] = None """A unique identifier for the rack.""" rack_version: Optional[int] = None """A version of the rack configuration.""" schema_version: int = 0 """A version of the rack configuration.""" @dataclass(eq=False, repr=False) class QPU(Message): """ Configuration info for the QPU """ chip_label: str """The fabrication label for the QPU chip.""" qubits: List[str] = field(default_factory=list) """A list of qubit labels.""" controls: Dict[str, List] = field(default_factory=dict) """A mapping of control labels to tuples (instrument label, channel label).""" controls_by_qubit: Dict[str, List] = field(default_factory=dict) """A map of qubit label to list of controls that should be considered blocked when the qubit is part of a job execution.""" @dataclass(eq=False, repr=False) class Instrument(Message): """ Instrument settings.
""" address: str """The full address of a QPU.""" module: str """Full python import path for the module that includes the instrument driver.""" instrument_type: str """Instrument type (driver class name).""" mnio_connections: Dict[str, MNIOConnection] = field(default_factory=dict) """MNIO network connections between Tsunami instruments""" channels: Dict[str, Any] = field(default_factory=dict) """Mapping of channel labels to channel settings""" virtual: bool = False """Whether the instrument is virtual.""" setup: Dict[str, Any] = field(default_factory=dict) """Any additional information used by the instrument for one-time-setup""" @dataclass(eq=False, repr=False) class DeployedRack(Message): """ The rack configuration for lodgepole. """ rack_meta: RackMeta """Meta information about the deployed rack.""" qpu: QPU """Information about the QPU.""" instruments: Dict[str, Instrument] = field(default_factory=dict) """Mapping of instrument name to instrument settings.""" @dataclass(eq=False, repr=False) class AWGChannel(Message): """ Configuration of a single RF channel. """ sample_rate: float """The sampling rate [Hz] of the associated DAC/ADC component.""" direction: str """'rx' or 'tx'""" lo_frequency: Optional[float] = None """The local oscillator frequency [Hz] of the channel.""" gain: Optional[float] = None """If there is an amplifier, the amplifier gain [dB].""" delay: float = 0.0e+0 """Delay [seconds] to account for inter-channel skew.""" @dataclass(eq=False, repr=False) class QFDChannel(Message): """ Configuration for a single QFD Channel. """ channel_index: int """The channel index on the QFD, zero indexed from the lowest channel, as installed in the box.""" direction: Optional[str] = "tx" """The QFD is a device that transmits pulses.""" nco_frequency: Optional[float] = 0.0e+0 """The DAC NCO frequency [Hz].""" gain: Optional[float] = 0.0e+0 """The output gain on the DAC in [dB]. 
Note that this should be in the range -45dB to 0dB and is rounded to the nearest 3dB step.""" delay: float = 0.0e+0 """Delay [seconds] to account for inter-channel skew.""" flux_current: Optional[float] = None """Flux current [Amps].""" relay_closed: Optional[bool] = None """Set the state of the Flux relay. True - Relay closed, allows flux current to flow. False - Relay open, no flux current can flow.""" @dataclass(eq=False, repr=False) class QGSChannel(Message): """ Configuration for a single QGS Channel. """ direction: Optional[str] = "tx" """The QGS is a device that transmits pulses.""" nco_frequency: Optional[float] = 2.e+9 """The DAC NCO frequency [Hz].""" gain: Optional[float] = 0.0e+0 """The output gain on the DAC in [dB]. Note that this should be in the range -45dB to 0dB and is rounded to the nearest 3dB step.""" channel_index: Optional[int] = 0 """The channel index on the QGS, zero indexed from the lowest channel, as installed in the box.""" delay: float = 0.0e+0 """Delay [seconds] to account for inter-channel skew.""" @dataclass(eq=False, repr=False) class QRTChannel(Message): """ Configuration for a single QRT Channel. """ direction: Optional[str] = "tx" """The QRT is a device that transmits readout pulses.""" nco_frequency: Optional[float] = 1.25e+9 """The DAC NCO frequency [Hz].""" gain: Optional[float] = 0.0e+0 """The output gain on the DAC in [dB]. Note that this should be in the range -45dB to 0dB and is rounded to the nearest 3dB step.""" channel_index: Optional[int] = 0 """The channel index on the QRT, zero indexed from the lowest channel, as installed in the box.""" delay: float = 0.0e+0 """Delay [seconds] to account for inter-channel skew.""" @dataclass(eq=False, repr=False) class QRRChannel(Message): """ Configuration for a single QRR Channel. 
""" channel_index: int """The channel index on the QRR, zero indexed from the lowest channel, as installed in the box.""" direction: Optional[str] = "rx" """The QRR is a device that receives readout pulses.""" nco_frequency: Optional[float] = 0.0e+0 """The ADC NCO frequency [Hz].""" gain: Optional[float] = 0.0e+0 """The input gain on the ADC in [dB]. Note that this should be in the range -45dB to 0dB and is rounded to the nearest 3dB step.""" delay: float = 0.0e+0 """Delay [seconds] to account for inter-channel skew.""" @dataclass(eq=False, repr=False) class CWChannel(Message): """ Configuration for a single CW Generator Channel. """ channel_index: int = 0 """The zero-indexed channel of the generator's output.""" rf_output_frequency: Optional[int] = 1000000000 """The CW generator's output frequency [Hz].""" rf_output_power: Optional[float] = 0.0e+0 """The power of CW generator's output [dBm].""" rf_output_enabled: Optional[bool] = True """The state (on/off) of CW generator's output.""" @dataclass(eq=False, repr=False) class QDOSlowFluxChannel(Message): """ Configuration for a single QDO Slow Flux Channel. """ channel_index: int """The channel index on the QDO, zero indexed from the lowest channel, as installed in the box. Flux index typically starts at 4.""" flux_current: Optional[float] = None """Flux current [Amps].""" relay_closed: Optional[bool] = False """Set the state of the Flux relay. True - Relay closed, allows flux current to flow. False - Relay open, no flux current can flow.""" @dataclass(eq=False, repr=False) class QDOFastFluxChannel(Message): """ Configuration for a single QDO Fast Flux Channel. 
""" channel_index: int """The channel index on the QDO, zero indexed from the lowest channel, as installed in the box.""" direction: Optional[str] = "tx" """The QDO is a device that transmits pulses.""" delay: float = 0.0e+0 """Delay [seconds] to account for inter-channel skew.""" flux_current: Optional[float] = None """Flux current [Amps].""" @dataclass(eq=False, repr=False) class YokogawaGS200Channel(Message): """ Configuration for a single Yokogawa GS200 Channel. """ @dataclass(eq=False, repr=False) class LegacyUSRPSequencer(Message): """ Configuration for a Legacy USRP Sequencer """ tx_channel: Optional[str] = None """The label of the associated tx channel.""" rx_channel: Optional[str] = None """The label of the associated rx channel.""" @dataclass(eq=False, repr=False) class QFDSequencer(Message): """ Configuration for a single QFD Sequencer. """ tx_channel: str """The label of the associated channel.""" sequencer_index: int """The sequencer index of this sequencer.""" @dataclass(eq=False, repr=False) class QDOSequencer(Message): """ Configuration for a single QDO Sequencer. """ tx_channel: str """The label of the associated channel.""" sequencer_index: int """The sequencer index of this sequencer.""" @dataclass(eq=False, repr=False) class QFDx2Sequencer(Message): """ Configuration for a single QFDx2 Sequencer. """ tx_channel: str """The label of the associated channel.""" sequencer_index: int """The sequencer index of this sequencer.""" @dataclass(eq=False, repr=False) class QGSSequencer(Message): """ Configuration for a single QGS Sequencer. """ tx_channel: str """The label of the associated channel.""" sequencer_index: int """The sequencer index of this sequencer.""" @dataclass(eq=False, repr=False) class QGSx2Sequencer(Message): """ Configuration for a single QGSx2 Sequencer. 
""" tx_channel: str """The label of the associated channel.""" sequencer_index: int """The sequencer index of this sequencer.""" @dataclass(eq=False, repr=False) class QRTSequencer(Message): """ Configuration for a single readout transmit (QRT) sequencer. """ tx_channel: str """The label of the associated tx channel.""" sequencer_index: int """The sequencer index (0-7) of this sequencer.""" low_freq_range: Optional[bool] = False """Used to signal if this sequencer is in the low frequency configuration.""" @dataclass(eq=False, repr=False) class QRTx2Sequencer(Message): """ Configuration for a dual readout transmit (QRTx2) sequencer. """ tx_channel: str """The label of the associated tx channel.""" sequencer_index: int """The sequencer index (0-15) of this sequencer.""" low_freq_range: Optional[bool] = False """Used to signal if this sequencer is in the low frequency configuration.""" @dataclass(eq=False, repr=False) class QRRSequencer(Message): """ Configuration for a single readout receive (QRR) sequencer. """ rx_channel: str """The label of the associated rx channel.""" sequencer_index: int """The sequencer index (0-15) to assign. Note that only sequencer 0 can return raw readout measurements.""" @dataclass(eq=False, repr=False) class USICardSequencer(Message): """ Configuration for the card which interfaces with the USI Target on the USRP. """ tx_channel: str """The label of the associated channel.""" @dataclass(eq=False, repr=False) class USITargetSequencer(Message): """ Configuration for a single USITarget Sequencer. """ tx_channel: str """The label of the associated intial tx channel.""" rx_channel: str """The label of the associated initial rx channel.""" sequencer_index: int """The sequencer index (0-7) to assign. Note that only sequencer 0 has the ability to use the NCO or capture raw readout streams.""" @dataclass(eq=False, repr=False) class CWFrequencySweep(Message): """ Configuration of a continuous wave frequency sweep. 
""" start: float """Start frequency of the sweep, in Hz""" stop: float """Stop frequency of the sweep, in Hz""" num_pts: int """Number of frequency points to sample, cast to int.""" source: int """Source port number""" measure: int """Measure port number""" @dataclass(eq=False, repr=False) class VNASettings(Message): """ Configuration of VNA settings for a continuous wave sweep. """ e_delay: float """Electrical delay in seconds from source to measure port""" phase_offset: float """Phase offset in degrees from measured to reported phase""" bandwidth: float """Bandwidth of the sweep, in Hz""" power: float """Source power in dBm""" freq_sweep: CWFrequencySweep """Frequency sweep settings""" averaging: int = 1 """Sets the number of points to combine into an averaged trace""" @dataclass(eq=False, repr=False) class TimeBomb(Message): """ Payload used to match a job with a particular execution target. """ deadline: str """Deadline, specified in the format '%Y-%m-%dT%H:%M:%S.000Z', after which this job becomes unexecutable.""" chip_label: str """Label string for the chip on which this job is meant to execute.""" @dataclass(eq=False, repr=False) class MicrowaveSourceSettings(Message): """ Configuration of Microwave Source settings for operating amplifiers. """ frequency: float """Frequency setting for microwave source (Hz).""" power: float """Power setting for microwave source (dBm).""" output: bool """Output setting for microwave source. 
If true, the source will be turned on.""" @dataclass(eq=False, repr=False) class ExecutorJob(Message): """ Job which is sent directly to the executor """ instrument_settings: Dict[str, Any] """Dict mapping instrument names to arbitrary instrument settings.""" filter_pipeline: Dict[str, FilterNode] """The filter pipeline to process measured data.""" receivers: Dict[str, Receiver] """Dict mapping stream names to receiver settings.""" duration: Optional[float] = None """The total duration of the program execution in seconds.""" timebomb: Optional[TimeBomb] = None """An optional payload used to match this job with a particular execution target.""" @dataclass(eq=False, repr=False) class PatchableBinary(Message): """ Tsunami binary with patching metadata for classical parameter modification. """ base_binary: Any """Raw Tsunami binary object.""" patch_table: Dict[str, PatchTarget] """Dictionary mapping patch names to their memory descriptors.""" @dataclass(eq=False, repr=False) class ActiveReset(Message): """ An active reset control sequence consisting of a repeated sequence of a measurement block and a feedback block conditional on the outcome of a specific measurement bit. Regardless of the measurement outcomes the total duration of the control sequence is [attempts x (measurement_duration + feedback_duration)]. The total measurement_duration must be chosen to allow for enough time after any Capture commands for the measurement bit to propagate back to the gate cards that are actuating the feedback. """ time: float """Time at which the ActiveReset begins in [seconds].""" measurement_duration: float """The duration of measurement block in [seconds]. The measurement bit is expected to have arrived on the QGS after this time relative to the overall start of the ActiveReset block.""" feedback_duration: float """The duration of feedback block in [seconds]""" measurement_bit: int """The address of the readout bit to condition the feedback on. 
The bit is first accessed after measurement_duration has elapsed.""" attempts: int = 3 """The number of times to repeat the active reset sequence.""" measurement_instructions: List[Dict] = field(default_factory=list) """The ordered sequence of scheduled measurement instructions.""" apply_feedback_when: bool = True """Apply the feedback when the measurement_bit equals the value of this flag.""" feedback_instructions: List[Dict] = field(default_factory=list) """The ordered sequence of scheduled feedback instructions."""

rpcq-3.11.0/rpcq/external/__init__.py

rpcq-3.11.0/rpcq/external/dataclasses.py

import re
import sys
import copy
import types
import inspect
import keyword

__all__ = ['dataclass',
           'field',
           'Field',
           'FrozenInstanceError',
           'InitVar',
           'MISSING',

           # Helper functions.
           'fields',
           'asdict',
           'astuple',
           'make_dataclass',
           'replace',
           'is_dataclass',
           ]

# Conditions for adding methods.  The boxes indicate what action the
# dataclass decorator takes.  For all of these tables, when I talk
# about init=, repr=, eq=, order=, unsafe_hash=, or frozen=, I'm
# referring to the arguments to the @dataclass decorator.  When
# checking if a dunder method already exists, I mean check for an
# entry in the class's __dict__.  I never check to see if an attribute
# is defined in a base class.
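A quick illustration of the rule just described, using the standard-library `dataclasses` module (whose behavior this vendored backport mirrors): a dunder method defined in the class body is found in `__dict__` and left untouched, while a class without one gets the generated method. The class names here are illustrative.

```python
from dataclasses import dataclass

@dataclass
class Auto:
    # No __repr__ in the class body, so the decorator generates one.
    x: int = 0

@dataclass
class Manual:
    # An explicit __repr__ lives in Manual.__dict__, so no method is added.
    x: int = 0

    def __repr__(self):
        return '<manual>'

print(repr(Auto()))    # Auto(x=0)
print(repr(Manual()))  # <manual>
```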
# Key:
# +=========+=========================================+
# + Value   | Meaning                                 |
# +=========+=========================================+
# |         | No action: no method is added.          |
# +---------+-----------------------------------------+
# | add     | Generated method is added.              |
# +---------+-----------------------------------------+
# | raise   | TypeError is raised.                    |
# +---------+-----------------------------------------+
# | None    | Attribute is set to None.               |
# +=========+=========================================+

# __init__
#
#  +--- init= parameter
#  |
#  v     |       |       |
#        |  no   |  yes  |  <--- class has __init__ in __dict__?
# +=======+=======+=======+
# | False |       |       |
# +-------+-------+-------+
# | True  | add   |       |  <- the default
# +=======+=======+=======+

# __repr__
#
#  +--- repr= parameter
#  |
#  v     |       |       |
#        |  no   |  yes  |  <--- class has __repr__ in __dict__?
# +=======+=======+=======+
# | False |       |       |
# +-------+-------+-------+
# | True  | add   |       |  <- the default
# +=======+=======+=======+

# __setattr__
# __delattr__
#
#  +--- frozen= parameter
#  |
#  v     |       |       |
#        |  no   |  yes  |  <--- class has __setattr__ or __delattr__ in __dict__?
# +=======+=======+=======+
# | False |       |       |  <- the default
# +-------+-------+-------+
# | True  | add   | raise |
# +=======+=======+=======+
# Raise because not adding these methods would break the "frozen-ness"
# of the class.

# __eq__
#
#  +--- eq= parameter
#  |
#  v     |       |       |
#        |  no   |  yes  |  <--- class has __eq__ in __dict__?
# +=======+=======+=======+
# | False |       |       |
# +-------+-------+-------+
# | True  | add   |       |  <- the default
# +=======+=======+=======+

# __lt__
# __le__
# __gt__
# __ge__
#
#  +--- order= parameter
#  |
#  v     |       |       |
#        |  no   |  yes  |  <--- class has any comparison method in __dict__?
# +=======+=======+=======+
# | False |       |       |  <- the default
# +-------+-------+-------+
# | True  | add   | raise |
# +=======+=======+=======+
# Raise because to allow this case would interfere with using
# functools.total_ordering.
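The `raise` cell of the ordering table above can be demonstrated with the standard-library `dataclasses` module (same semantics as this backport): requesting `order=True` on a class that already defines a comparison method is rejected with a `TypeError`. The class name is illustrative.

```python
from dataclasses import dataclass

order_clash_raises = False
try:
    @dataclass(order=True)
    class Clashing:
        x: int = 0

        # An explicit __lt__ already sits in the class __dict__,
        # so order=True cannot add its generated comparison methods.
        def __lt__(self, other):
            return True
except TypeError as exc:
    order_clash_raises = True
    print(exc)
```

As the table's note says, the error exists to avoid silently clobbering a class that was meant to be completed with `functools.total_ordering`.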
# __hash__

#    +------------------- unsafe_hash= parameter
#    |       +----------- eq= parameter
#    |       |       +--- frozen= parameter
#    |       |       |
#    v       v       v    |        |        |
#                         |   no   |  yes   |  <--- class has explicitly defined __hash__
# +=======+=======+=======+========+========+
# | False | False | False |        |        | No __eq__, use the base class __hash__
# +-------+-------+-------+--------+--------+
# | False | False | True  |        |        | No __eq__, use the base class __hash__
# +-------+-------+-------+--------+--------+
# | False | True  | False | None   |        | <-- the default, not hashable
# +-------+-------+-------+--------+--------+
# | False | True  | True  | add    |        | Frozen, so hashable, allows override
# +-------+-------+-------+--------+--------+
# | True  | False | False | add    | raise  | Has no __eq__, but hashable
# +-------+-------+-------+--------+--------+
# | True  | False | True  | add    | raise  | Has no __eq__, but hashable
# +-------+-------+-------+--------+--------+
# | True  | True  | False | add    | raise  | Not frozen, but hashable
# +-------+-------+-------+--------+--------+
# | True  | True  | True  | add    | raise  | Frozen, so hashable
# +=======+=======+=======+========+========+
# For boxes that are blank, __hash__ is untouched and therefore
# inherited from the base class.  If the base is object, then
# id-based hashing is used.
#
# Note that a class may already have __hash__=None if it specified an
# __eq__ method in the class body (not one that was created by
# @dataclass).
#
# See _hash_action (below) for a coded version of this table.


# Raised when an attempt is made to modify a frozen class.
class FrozenInstanceError(AttributeError): pass

# A sentinel object for default values to signal that a default
# factory will be used.  This is given a nice repr() which will appear
# in the function signature of dataclasses' constructors.
class _HAS_DEFAULT_FACTORY_CLASS:
    def __repr__(self):
        return '<factory>'
_HAS_DEFAULT_FACTORY = _HAS_DEFAULT_FACTORY_CLASS()

# A sentinel object to detect if a parameter is supplied or not.  Use
# a class to give it a better repr.
class _MISSING_TYPE:
    pass
MISSING = _MISSING_TYPE()

# Since most per-field metadata will be unused, create an empty
# read-only proxy that can be shared among all fields.
_EMPTY_METADATA = types.MappingProxyType({})

# Markers for the various kinds of fields and pseudo-fields.
class _FIELD_BASE:
    def __init__(self, name):
        self.name = name
    def __repr__(self):
        return self.name
_FIELD = _FIELD_BASE('_FIELD')
_FIELD_CLASSVAR = _FIELD_BASE('_FIELD_CLASSVAR')
_FIELD_INITVAR = _FIELD_BASE('_FIELD_INITVAR')

# The name of an attribute on the class where we store the Field
# objects.  Also used to check if a class is a Data Class.
_FIELDS = '__dataclass_fields__'

# The name of an attribute on the class that stores the parameters to
# @dataclass.
_PARAMS = '__dataclass_params__'

# The name of the function, that if it exists, is called at the end of
# __init__.
_POST_INIT_NAME = '__post_init__'

# String regex that string annotations for ClassVar or InitVar must match.
# Allows "identifier.identifier[" or "identifier[".
# https://bugs.python.org/issue33453 for details.
_MODULE_IDENTIFIER_RE = re.compile(r'^(?:\s*(\w+)\s*\.)?\s*(\w+)')

class _InitVarMeta(type):
    def __getitem__(self, params):
        return self

class InitVar(metaclass=_InitVarMeta):
    pass

# Instances of Field are only ever created from within this module,
# and only from the field() function, although Field instances are
# exposed externally as (conceptually) read-only objects.
#
# name and type are filled in after the fact, not in __init__.
# They're not known at the time this class is instantiated, but it's
# convenient if they're available later.
#
# When cls._FIELDS is filled in with a list of Field objects, the name
# and type fields will have been populated.
class Field:
    __slots__ = ('name',
                 'type',
                 'default',
                 'default_factory',
                 'repr',
                 'hash',
                 'init',
                 'compare',
                 'metadata',
                 '_field_type',  # Private: not to be used by user code.
                 )

    def __init__(self, default, default_factory, init, repr, hash, compare,
                 metadata):
        self.name = None
        self.type = None
        self.default = default
        self.default_factory = default_factory
        self.init = init
        self.repr = repr
        self.hash = hash
        self.compare = compare
        self.metadata = (_EMPTY_METADATA
                         if metadata is None or len(metadata) == 0
                         else types.MappingProxyType(metadata))
        self._field_type = None

    def __repr__(self):
        return ('Field('
                f'name={self.name!r},'
                f'type={self.type!r},'
                f'default={self.default!r},'
                f'default_factory={self.default_factory!r},'
                f'init={self.init!r},'
                f'repr={self.repr!r},'
                f'hash={self.hash!r},'
                f'compare={self.compare!r},'
                f'metadata={self.metadata!r},'
                f'_field_type={self._field_type}'
                ')')

    # This is used to support the PEP 487 __set_name__ protocol in the
    # case where we're using a field that contains a descriptor as a
    # default value.  For details on __set_name__, see
    # https://www.python.org/dev/peps/pep-0487/#implementation-details.
    #
    # Note that in _process_class, this Field object is overwritten
    # with the default value, so the end result is a descriptor that
    # had __set_name__ called on it at the right time.
    def __set_name__(self, owner, name):
        func = getattr(type(self.default), '__set_name__', None)
        if func:
            # There is a __set_name__ method on the descriptor, call
            # it.
func(self.default, owner, name) class _DataclassParams: __slots__ = ('init', 'repr', 'eq', 'order', 'unsafe_hash', 'frozen', ) def __init__(self, init, repr, eq, order, unsafe_hash, frozen): self.init = init self.repr = repr self.eq = eq self.order = order self.unsafe_hash = unsafe_hash self.frozen = frozen def __repr__(self): return ('_DataclassParams(' f'init={self.init!r},' f'repr={self.repr!r},' f'eq={self.eq!r},' f'order={self.order!r},' f'unsafe_hash={self.unsafe_hash!r},' f'frozen={self.frozen!r}' ')') # This function is used instead of exposing Field creation directly, # so that a type checker can be told (via overloads) that this is a # function whose type depends on its parameters. def field(*, default=MISSING, default_factory=MISSING, init=True, repr=True, hash=None, compare=True, metadata=None): """Return an object to identify dataclass fields. default is the default value of the field. default_factory is a 0-argument function called to initialize a field's value. If init is True, the field will be a parameter to the class's __init__() function. If repr is True, the field will be included in the object's repr(). If hash is True, the field will be included in the object's hash(). If compare is True, the field will be used in comparison functions. metadata, if specified, must be a mapping which is stored but not otherwise examined by dataclass. It is an error to specify both default and default_factory. """ if default is not MISSING and default_factory is not MISSING: raise ValueError('cannot specify both default and default_factory') return Field(default, default_factory, init, repr, hash, compare, metadata) def _tuple_str(obj_name, fields): # Return a string representing each field of obj_name as a tuple # member. So, if fields is ['x', 'y'] and obj_name is "self", # return "(self.x,self.y)". # Special case for the 0-tuple. if not fields: return '()' # Note the trailing comma, needed if this turns out to be a 1-tuple. 
return f'({",".join([f"{obj_name}.{f.name}" for f in fields])},)' def _create_fn(name, args, body, *, globals=None, locals=None, return_type=MISSING): # Note that we mutate locals when exec() is called. Caller # beware! The only callers are internal to this module, so no # worries about external callers. if locals is None: locals = {} return_annotation = '' if return_type is not MISSING: locals['_return_type'] = return_type return_annotation = '->_return_type' args = ','.join(args) body = '\n'.join(f' {b}' for b in body) # Compute the text of the entire function. txt = f'def {name}({args}){return_annotation}:\n{body}' exec(txt, globals, locals) return locals[name] def _field_assign(frozen, name, value, self_name): # If we're a frozen class, then assign to our fields in __init__ # via object.__setattr__. Otherwise, just use a simple # assignment. # # self_name is what "self" is called in this function: don't # hard-code "self", since that might be a field name. if frozen: return f'object.__setattr__({self_name},{name!r},{value})' return f'{self_name}.{name}={value}' def _field_init(f, frozen, globals, self_name): # Return the text of the line in the body of __init__ that will # initialize this field. default_name = f'_dflt_{f.name}' if f.default_factory is not MISSING: if f.init: # This field has a default factory. If a parameter is # given, use it. If not, call the factory. globals[default_name] = f.default_factory value = (f'{default_name}() ' f'if {f.name} is _HAS_DEFAULT_FACTORY ' f'else {f.name}') else: # This is a field that's not in the __init__ params, but # has a default factory function. It needs to be # initialized here by calling the factory function, # because there's no other way to initialize it. # For a field initialized with a default=defaultvalue, the # class dict just has the default value # (cls.fieldname=defaultvalue). But that won't work for a # default factory, the factory must be called in __init__ # and we must assign that to self.fieldname. 
We can't # fall back to the class dict's value, both because it's # not set, and because it might be different per-class # (which, after all, is why we have a factory function!). globals[default_name] = f.default_factory value = f'{default_name}()' else: # No default factory. if f.init: if f.default is MISSING: # There's no default, just do an assignment. value = f.name elif f.default is not MISSING: globals[default_name] = f.default value = f.name else: # This field does not need initialization. Signify that # to the caller by returning None. return None # Only test this now, so that we can create variables for the # default. However, return None to signify that we're not going # to actually do the assignment statement for InitVars. if f._field_type == _FIELD_INITVAR: return None # Now, actually generate the field assignment. return _field_assign(frozen, f.name, value, self_name) def _init_param(f): # Return the __init__ parameter string for this field. For # example, the equivalent of 'x:int=3' (except instead of 'int', # reference a variable set to int, and instead of '3', reference a # variable set to 3). if f.default is MISSING and f.default_factory is MISSING: # There's no default, and no default_factory, just output the # variable name and type. default = '' elif f.default is not MISSING: # There's a default, this will be the name that's used to look # it up. default = f'=_dflt_{f.name}' elif f.default_factory is not MISSING: # There's a factory function. Set a marker. default = '=_HAS_DEFAULT_FACTORY' return f'{f.name}:_type_{f.name}{default}' def _init_fn(fields, frozen, has_post_init, self_name): # fields contains both real fields and InitVar pseudo-fields. # Make sure we don't have fields without defaults following fields # with defaults. This actually would be caught when exec-ing the # function source code, but catching it here gives a better error # message, and future-proofs us in case we build up the function # using ast. 
seen_default = False for f in fields: # Only consider fields in the __init__ call. if f.init: if not (f.default is MISSING and f.default_factory is MISSING): seen_default = True elif seen_default: raise TypeError(f'non-default argument {f.name!r} ' 'follows default argument') globals = {'MISSING': MISSING, '_HAS_DEFAULT_FACTORY': _HAS_DEFAULT_FACTORY} body_lines = [] for f in fields: line = _field_init(f, frozen, globals, self_name) # line is None means that this field doesn't require # initialization (it's a pseudo-field). Just skip it. if line: body_lines.append(line) # Does this class have a post-init function? if has_post_init: params_str = ','.join(f.name for f in fields if f._field_type is _FIELD_INITVAR) body_lines.append(f'{self_name}.{_POST_INIT_NAME}({params_str})') # If no body lines, use 'pass'. if not body_lines: body_lines = ['pass'] locals = {f'_type_{f.name}': f.type for f in fields} return _create_fn('__init__', [self_name] + [_init_param(f) for f in fields if f.init], body_lines, locals=locals, globals=globals, return_type=None) def _repr_fn(fields): return _create_fn('__repr__', ('self',), ['return self.__class__.__qualname__ + f"(' + ', '.join([f"{f.name}={{self.{f.name}!r}}" for f in fields]) + ')"']) def _frozen_get_del_attr(cls, fields): # XXX: globals is modified on the first call to _create_fn, then # the modified version is used in the second call. Is this okay? globals = {'cls': cls, 'FrozenInstanceError': FrozenInstanceError} if fields: fields_str = '(' + ','.join(repr(f.name) for f in fields) + ',)' else: # Special case for the zero-length tuple. 
fields_str = '()' return (_create_fn('__setattr__', ('self', 'name', 'value'), (f'if type(self) is cls or name in {fields_str}:', ' raise FrozenInstanceError(f"cannot assign to field {name!r}")', f'super(cls, self).__setattr__(name, value)'), globals=globals), _create_fn('__delattr__', ('self', 'name'), (f'if type(self) is cls or name in {fields_str}:', ' raise FrozenInstanceError(f"cannot delete field {name!r}")', f'super(cls, self).__delattr__(name)'), globals=globals), ) def _cmp_fn(name, op, self_tuple, other_tuple): # Create a comparison function. If the fields in the object are # named 'x' and 'y', then self_tuple is the string # '(self.x,self.y)' and other_tuple is the string # '(other.x,other.y)'. return _create_fn(name, ('self', 'other'), [ 'if other.__class__ is self.__class__:', f' return {self_tuple}{op}{other_tuple}', 'return NotImplemented']) def _hash_fn(fields): self_tuple = _tuple_str('self', fields) return _create_fn('__hash__', ('self',), [f'return hash({self_tuple})']) def _is_classvar(a_type, typing): # This test uses a typing internal class, but it's the best way to # test if this is a ClassVar. return type(a_type) is typing._ClassVar def _is_initvar(a_type, dataclasses): # The module we're checking against is the module we're # currently in (dataclasses.py). return a_type is dataclasses.InitVar def _is_type(annotation, cls, a_module, a_type, is_type_predicate): # Given a type annotation string, does it refer to a_type in # a_module? For example, when checking that annotation denotes a # ClassVar, then a_module is typing, and a_type is # typing.ClassVar. # It's possible to look up a_module given a_type, but it involves # looking in sys.modules (again!), and seems like a waste since # the caller already knows a_module. 
    # - annotation is a string type annotation
    # - cls is the class that this annotation was found in
    # - a_module is the module we want to match
    # - a_type is the type in that module we want to match
    # - is_type_predicate is a function called with (obj, a_module)
    #   that determines if obj is of the desired type.

    # Since this test does not do a local namespace lookup (and
    # instead only a module (global) lookup), there are some things it
    # gets wrong.

    # With string annotations, cv0 will be detected as a ClassVar:
    #   CV = ClassVar
    #   @dataclass
    #   class C0:
    #     cv0: CV

    # But in this example cv1 will not be detected as a ClassVar:
    #   @dataclass
    #   class C1:
    #     CV = ClassVar
    #     cv1: CV

    # In C1, the code in this function (_is_type) will look up "CV" in
    # the module and not find it, so it will not consider cv1 as a
    # ClassVar.  This is a fairly obscure corner case, and the best
    # way to fix it would be to eval() the string "CV" with the
    # correct global and local namespaces.  However, that would involve
    # an eval() penalty for every single field of every dataclass
    # that's defined.  It was judged not worth it.

    match = _MODULE_IDENTIFIER_RE.match(annotation)
    if match:
        ns = None
        module_name = match.group(1)
        if not module_name:
            # No module name, assume the class's module did
            # "from dataclasses import InitVar".
            ns = sys.modules.get(cls.__module__).__dict__
        else:
            # Look up module_name in the class's module.
            module = sys.modules.get(cls.__module__)
            if module and module.__dict__.get(module_name) is a_module:
                ns = sys.modules.get(a_type.__module__).__dict__
        if ns and is_type_predicate(ns.get(match.group(2)), a_module):
            return True
    return False


def _get_field(cls, a_name, a_type):
    # Return a Field object for this field name and type.  ClassVars
    # and InitVars are also returned, but marked as such (see
    # f._field_type).

    # If the default value isn't derived from Field, then it's only a
    # normal default value.  Convert it to a Field().
    default = getattr(cls, a_name, MISSING)
    if isinstance(default, Field):
        f = default
    else:
        if isinstance(default, types.MemberDescriptorType):
            # This is a field in __slots__, so it has no default value.
            default = MISSING
        f = field(default=default)

    # Only at this point do we know the name and the type.  Set them.
    f.name = a_name
    f.type = a_type

    # Assume it's a normal field until proven otherwise.  We're next
    # going to decide if it's a ClassVar or InitVar, everything else
    # is just a normal field.
    f._field_type = _FIELD

    # In addition to checking for actual types here, also check for
    # string annotations.  get_type_hints() won't always work for us
    # (see https://github.com/python/typing/issues/508 for example),
    # plus it's expensive and would require an eval for every string
    # annotation.  So, make a best effort to see if this is a ClassVar
    # or InitVar using regex's and checking that the thing referenced
    # is actually of the correct type.

    # For the complete discussion, see https://bugs.python.org/issue33453

    # If typing has not been imported, then it's impossible for any
    # annotation to be a ClassVar.  So, only look for ClassVar if
    # typing has been imported by any module (not necessarily cls's
    # module).
    typing = sys.modules.get('typing')
    if typing:
        if (_is_classvar(a_type, typing)
            or (isinstance(f.type, str)
                and _is_type(f.type, cls, typing, typing.ClassVar,
                             _is_classvar))):
            f._field_type = _FIELD_CLASSVAR

    # If the type is InitVar, or if it's a matching string annotation,
    # then it's an InitVar.
    if f._field_type is _FIELD:
        # The module we're checking against is the module we're
        # currently in (dataclasses.py).
        dataclasses = sys.modules[__name__]
        if (_is_initvar(a_type, dataclasses)
            or (isinstance(f.type, str)
                and _is_type(f.type, cls, dataclasses, dataclasses.InitVar,
                             _is_initvar))):
            f._field_type = _FIELD_INITVAR

    # Validations for individual fields.
This is delayed until now,
    # instead of in the Field() constructor, since only here do we
    # know the field name, which allows for better error reporting.

    # Special restrictions for ClassVar and InitVar.
    if f._field_type in (_FIELD_CLASSVAR, _FIELD_INITVAR):
        if f.default_factory is not MISSING:
            raise TypeError(f'field {f.name} cannot have a '
                            'default factory')
        # Should I check for other field settings? default_factory
        # seems the most serious to check for.  Maybe add others.  For
        # example, how about init=False (or really,
        # init=<not-the-default-init-value>)?  It makes no sense for
        # ClassVar and InitVar to specify init=<anything>.

    # For real fields, disallow mutable defaults for known types.
    if f._field_type is _FIELD and isinstance(f.default, (list, dict, set)):
        raise ValueError(f'mutable default {type(f.default)} for field '
                         f'{f.name} is not allowed: use default_factory')

    return f


def _set_new_attribute(cls, name, value):
    # Never overwrites an existing attribute.  Returns True if the
    # attribute already exists.
    if name in cls.__dict__:
        return True
    setattr(cls, name, value)
    return False


# Decide if/how we're going to create a hash function.  Key is
# (unsafe_hash, eq, frozen, does-hash-exist).  Value is the action to
# take.  The common case is to do nothing, so instead of providing a
# function that is a no-op, use None to signify that.

def _hash_set_none(cls, fields):
    return None

def _hash_add(cls, fields):
    flds = [f for f in fields if (f.compare if f.hash is None else f.hash)]
    return _hash_fn(flds)

def _hash_exception(cls, fields):
    # Raise an exception.
    raise TypeError(f'Cannot overwrite attribute __hash__ '
                    f'in class {cls.__name__}')

#
# +-------------------------------------- unsafe_hash?
# |      +------------------------------- eq?
# |      |      +------------------------ frozen?
# |      |      |      +----------------  has-explicit-hash?
# | | | | # | | | | +------- action # | | | | | # v v v v v _hash_action = {(False, False, False, False): None, (False, False, False, True ): None, (False, False, True, False): None, (False, False, True, True ): None, (False, True, False, False): _hash_set_none, (False, True, False, True ): None, (False, True, True, False): _hash_add, (False, True, True, True ): None, (True, False, False, False): _hash_add, (True, False, False, True ): _hash_exception, (True, False, True, False): _hash_add, (True, False, True, True ): _hash_exception, (True, True, False, False): _hash_add, (True, True, False, True ): _hash_exception, (True, True, True, False): _hash_add, (True, True, True, True ): _hash_exception, } # See https://bugs.python.org/issue32929#msg312829 for an if-statement # version of this table. def _process_class(cls, init, repr, eq, order, unsafe_hash, frozen): # Now that dicts retain insertion order, there's no reason to use # an ordered dict. I am leveraging that ordering here, because # derived class fields overwrite base class fields, but the order # is defined by the base class, which is found first. fields = {} setattr(cls, _PARAMS, _DataclassParams(init, repr, eq, order, unsafe_hash, frozen)) # Find our base classes in reverse MRO order, and exclude # ourselves. In reversed order so that more derived classes # override earlier field definitions in base classes. As long as # we're iterating over them, see if any are frozen. any_frozen_base = False has_dataclass_bases = False for b in cls.__mro__[-1:0:-1]: # Only process classes that have been processed by our # decorator. That is, they have a _FIELDS attribute. base_fields = getattr(b, _FIELDS, None) if base_fields: has_dataclass_bases = True for f in base_fields.values(): fields[f.name] = f if getattr(b, _PARAMS).frozen: any_frozen_base = True # Annotations that are defined in this class (not in base # classes). If __annotations__ isn't present, then this class # adds no new annotations. 
We use this to compute fields that are # added by this class. # # Fields are found from cls_annotations, which is guaranteed to be # ordered. Default values are from class attributes, if a field # has a default. If the default value is a Field(), then it # contains additional info beyond (and possibly including) the # actual default value. Pseudo-fields ClassVars and InitVars are # included, despite the fact that they're not real fields. That's # dealt with later. cls_annotations = cls.__dict__.get('__annotations__', {}) # Now find fields in our class. While doing so, validate some # things, and set the default values (as class attributes) where # we can. cls_fields = [_get_field(cls, name, type) for name, type in cls_annotations.items()] for f in cls_fields: fields[f.name] = f # If the class attribute (which is the default value for this # field) exists and is of type 'Field', replace it with the # real default. This is so that normal class introspection # sees a real default value, not a Field. if isinstance(getattr(cls, f.name, None), Field): if f.default is MISSING: # If there's no default, delete the class attribute. # This happens if we specify field(repr=False), for # example (that is, we specified a field object, but # no default value). Also if we're using a default # factory. The class attribute should not be set at # all in the post-processed class. delattr(cls, f.name) else: setattr(cls, f.name, f.default) # Do we have any Field members that don't also have annotations? for name, value in cls.__dict__.items(): if isinstance(value, Field) and not name in cls_annotations: raise TypeError(f'{name!r} is a field but has no type annotation') # Check rules that apply if we are derived from any dataclasses. if has_dataclass_bases: # Raise an exception if any of our bases are frozen, but we're not. 
        if any_frozen_base and not frozen:
            raise TypeError('cannot inherit non-frozen dataclass from a '
                            'frozen one')

        # Raise an exception if we're frozen, but none of our bases are.
        if not any_frozen_base and frozen:
            raise TypeError('cannot inherit frozen dataclass from a '
                            'non-frozen one')

    # Remember all of the fields on our class (including bases).  This
    # also marks this class as being a dataclass.
    setattr(cls, _FIELDS, fields)

    # Was this class defined with an explicit __hash__?  Note that if
    # __eq__ is defined in this class, then python will automatically
    # set __hash__ to None.  This is a heuristic, as it's possible
    # that such a __hash__ == None was not auto-generated, but it's
    # close enough.
    class_hash = cls.__dict__.get('__hash__', MISSING)
    has_explicit_hash = not (class_hash is MISSING or
                             (class_hash is None and '__eq__' in cls.__dict__))

    # If we're generating ordering methods, we must be generating the
    # eq methods.
    if order and not eq:
        raise ValueError('eq must be true if order is true')

    if init:
        # Does this class have a post-init function?
        has_post_init = hasattr(cls, _POST_INIT_NAME)

        # Include InitVars and regular fields (so, not ClassVars).
        flds = [f for f in fields.values()
                if f._field_type in (_FIELD, _FIELD_INITVAR)]
        _set_new_attribute(cls, '__init__',
                           _init_fn(flds,
                                    frozen,
                                    has_post_init,
                                    # The name to use for the "self"
                                    # param in __init__.  Use "self"
                                    # if possible.
                                    '__dataclass_self__' if 'self' in fields
                                        else 'self',
                                    ))

    # Get the fields as a list, and include only real fields.  This is
    # used in all of the following methods.
    field_list = [f for f in fields.values() if f._field_type is _FIELD]

    if repr:
        flds = [f for f in field_list if f.repr]
        _set_new_attribute(cls, '__repr__', _repr_fn(flds))

    if eq:
        # Create __eq__ method.  There's no need for a __ne__ method,
        # since python will call __eq__ and negate it.
flds = [f for f in field_list if f.compare] self_tuple = _tuple_str('self', flds) other_tuple = _tuple_str('other', flds) _set_new_attribute(cls, '__eq__', _cmp_fn('__eq__', '==', self_tuple, other_tuple)) if order: # Create and set the ordering methods. flds = [f for f in field_list if f.compare] self_tuple = _tuple_str('self', flds) other_tuple = _tuple_str('other', flds) for name, op in [('__lt__', '<'), ('__le__', '<='), ('__gt__', '>'), ('__ge__', '>='), ]: if _set_new_attribute(cls, name, _cmp_fn(name, op, self_tuple, other_tuple)): raise TypeError(f'Cannot overwrite attribute {name} ' f'in class {cls.__name__}. Consider using ' 'functools.total_ordering') if frozen: for fn in _frozen_get_del_attr(cls, field_list): if _set_new_attribute(cls, fn.__name__, fn): raise TypeError(f'Cannot overwrite attribute {fn.__name__} ' f'in class {cls.__name__}') # Decide if/how we're going to create a hash function. hash_action = _hash_action[bool(unsafe_hash), bool(eq), bool(frozen), has_explicit_hash] if hash_action: # No need to call _set_new_attribute here, since by the time # we're here the overwriting is unconditional. cls.__hash__ = hash_action(cls, field_list) if not getattr(cls, '__doc__'): # Create a class doc-string. cls.__doc__ = (cls.__name__ + str(inspect.signature(cls)).replace(' -> None', '')) return cls # _cls should never be specified by keyword, so start it with an # underscore. The presence of _cls is used to detect if this # decorator is being called with parameters or not. def dataclass(_cls=None, *, init=True, repr=True, eq=True, order=False, unsafe_hash=False, frozen=False): """Returns the same class as was passed in, with dunder methods added based on the fields defined in the class. Examines PEP 526 __annotations__ to determine fields. If init is true, an __init__() method is added to the class. If repr is true, a __repr__() method is added. If order is true, rich comparison dunder methods are added. 
If unsafe_hash is true, a __hash__() method function is added. If frozen is true, fields may not be assigned to after instance creation. """ def wrap(cls): return _process_class(cls, init, repr, eq, order, unsafe_hash, frozen) # See if we're being called as @dataclass or @dataclass(). if _cls is None: # We're called with parens. return wrap # We're called as @dataclass without parens. return wrap(_cls) def fields(class_or_instance): """Return a tuple describing the fields of this dataclass. Accepts a dataclass or an instance of one. Tuple elements are of type Field. """ # Might it be worth caching this, per class? try: fields = getattr(class_or_instance, _FIELDS) except AttributeError: raise TypeError('must be called with a dataclass type or instance') # Exclude pseudo-fields. Note that fields is sorted by insertion # order, so the order of the tuple is as the fields were defined. return tuple(f for f in fields.values() if f._field_type is _FIELD) def _is_dataclass_instance(obj): """Returns True if obj is an instance of a dataclass.""" return not isinstance(obj, type) and hasattr(obj, _FIELDS) def is_dataclass(obj): """Returns True if obj is a dataclass or an instance of a dataclass.""" return hasattr(obj, _FIELDS) def asdict(obj, *, dict_factory=dict): """Return the fields of a dataclass instance as a new dictionary mapping field names to field values. Example usage: @dataclass class C: x: int y: int c = C(1, 2) assert asdict(c) == {'x': 1, 'y': 2} If given, 'dict_factory' will be used instead of built-in dict. The function applies recursively to field values that are dataclass instances. This will also look into built-in containers: tuples, lists, and dicts. 
""" if not _is_dataclass_instance(obj): raise TypeError("asdict() should be called on dataclass instances") return _asdict_inner(obj, dict_factory) def _asdict_inner(obj, dict_factory): if _is_dataclass_instance(obj): result = [] for f in fields(obj): value = _asdict_inner(getattr(obj, f.name), dict_factory) result.append((f.name, value)) return dict_factory(result) elif isinstance(obj, (list, tuple)): return type(obj)(_asdict_inner(v, dict_factory) for v in obj) elif isinstance(obj, dict): return type(obj)((_asdict_inner(k, dict_factory), _asdict_inner(v, dict_factory)) for k, v in obj.items()) else: return copy.deepcopy(obj) def astuple(obj, *, tuple_factory=tuple): """Return the fields of a dataclass instance as a new tuple of field values. Example usage:: @dataclass class C: x: int y: int c = C(1, 2) assert astuple(c) == (1, 2) If given, 'tuple_factory' will be used instead of built-in tuple. The function applies recursively to field values that are dataclass instances. This will also look into built-in containers: tuples, lists, and dicts. """ if not _is_dataclass_instance(obj): raise TypeError("astuple() should be called on dataclass instances") return _astuple_inner(obj, tuple_factory) def _astuple_inner(obj, tuple_factory): if _is_dataclass_instance(obj): result = [] for f in fields(obj): value = _astuple_inner(getattr(obj, f.name), tuple_factory) result.append(value) return tuple_factory(result) elif isinstance(obj, (list, tuple)): return type(obj)(_astuple_inner(v, tuple_factory) for v in obj) elif isinstance(obj, dict): return type(obj)((_astuple_inner(k, tuple_factory), _astuple_inner(v, tuple_factory)) for k, v in obj.items()) else: return copy.deepcopy(obj) def make_dataclass(cls_name, fields, *, bases=(), namespace=None, init=True, repr=True, eq=True, order=False, unsafe_hash=False, frozen=False): """Return a new dynamically created dataclass. The dataclass name will be 'cls_name'. 
    'fields' is an iterable of either (name), (name, type) or
    (name, type, Field) objects.  If type is omitted, use the string
    'typing.Any'.  Field objects are created by the equivalent of
    calling 'field(name, type [, Field-info])'.

      C = make_dataclass('C', ['x', ('y', int), ('z', int, field(init=False))], bases=(Base,))

    is equivalent to:

      @dataclass
      class C(Base):
          x: 'typing.Any'
          y: int
          z: int = field(init=False)

    For the bases and namespace parameters, see the builtin type() function.

    The parameters init, repr, eq, order, unsafe_hash, and frozen are passed to
    dataclass().
    """

    if namespace is None:
        namespace = {}
    else:
        # Copy namespace since we're going to mutate it.
        namespace = namespace.copy()

    # While we're looking through the field names, validate that they
    # are identifiers, are not keywords, and not duplicates.
    seen = set()
    anns = {}
    for item in fields:
        if isinstance(item, str):
            name = item
            tp = 'typing.Any'
        elif len(item) == 2:
            name, tp, = item
        elif len(item) == 3:
            name, tp, spec = item
            namespace[name] = spec
        else:
            raise TypeError(f'Invalid field: {item!r}')

        if not isinstance(name, str) or not name.isidentifier():
            raise TypeError(f'Field names must be valid identifiers: {name!r}')
        if keyword.iskeyword(name):
            raise TypeError(f'Field names must not be keywords: {name!r}')
        if name in seen:
            raise TypeError(f'Field name duplicated: {name!r}')

        seen.add(name)
        anns[name] = tp

    namespace['__annotations__'] = anns
    # We use `types.new_class()` instead of simply `type()` to allow dynamic creation
    # of generic dataclasses.
    cls = types.new_class(cls_name, bases, {}, lambda ns: ns.update(namespace))
    return dataclass(cls, init=init, repr=repr, eq=eq, order=order,
                     unsafe_hash=unsafe_hash, frozen=frozen)


def replace(obj, **changes):
    """Return a new object replacing specified fields with new values.

    This is especially useful for frozen classes.
    Example usage:

      @dataclass(frozen=True)
      class C:
          x: int
          y: int

      c = C(1, 2)
      c1 = replace(c, x=3)
      assert c1.x == 3 and c1.y == 2
      """

    # We're going to mutate 'changes', but that's okay because it's a
    # new dict, even if called with 'replace(obj, **my_changes)'.

    if not _is_dataclass_instance(obj):
        raise TypeError("replace() should be called on dataclass instances")

    # It's an error to have init=False fields in 'changes'.
    # If a field is not in 'changes', read its value from the provided obj.
    for f in getattr(obj, _FIELDS).values():
        if not f.init:
            # Error if this field is specified in changes.
            if f.name in changes:
                raise ValueError(f'field {f.name} is declared with '
                                 'init=False, it cannot be specified with '
                                 'replace()')
            continue

        if f.name not in changes:
            changes[f.name] = getattr(obj, f.name)

    # Create the new object, which calls __init__() and
    # __post_init__() (if defined), using all of the init fields we've
    # added and/or left in 'changes'.  If there are values supplied in
    # changes that aren't fields, this will correctly raise a
    # TypeError.
    return obj.__class__(**changes)


# ---- rpcq-3.11.0/rpcq/messages.py ----

#!/usr/bin/env python
"""
WARNING: This file is auto-generated, do not edit by hand. See README.md.
"""
import sys
from warnings import warn
from rpcq._base import Message
from typing import Any, List, Dict, Optional

if sys.version_info < (3, 7):
    from rpcq.external.dataclasses import dataclass, field, InitVar
else:
    from dataclasses import dataclass, field, InitVar


@dataclass(eq=False, repr=False)
class ParameterSpec(Message):
    """
    Specification of a dynamic parameter type and array-length.
""" type: str = "" """The parameter type, e.g., one of 'INTEGER', or 'FLOAT'.""" length: int = 1 """If this is not 1, the parameter is an array of this length.""" @dataclass(eq=False, repr=False) class ParameterAref(Message): """ A parametric expression. """ name: str """The parameter name""" index: int """The array index.""" @dataclass(eq=False, repr=False) class PatchTarget(Message): """ Patchable memory location descriptor. """ patch_type: ParameterSpec """Data type at this address.""" patch_offset: int """Memory address of the patch.""" @dataclass(eq=False, repr=False) class RPCRequest(Message): """ A single request object according to the JSONRPC standard. """ method: str """The RPC function name.""" params: Any """The RPC function arguments.""" id: str """RPC request id (used to verify that request and response belong together).""" jsonrpc: str = "2.0" """The JSONRPC version.""" client_timeout: Optional[float] = None """The client-side timeout for the request. The server itself may be configured with a timeout that is greater than the client-side timeout, in which case the server can choose to terminate any processing of the request.""" client_key: Optional[str] = None """The ZeroMQ CURVE public key used to make the request, as received by the server. Empty if no key is used.""" @dataclass(eq=False, repr=False) class RPCWarning(Message): """ An individual warning emitted in the course of RPC processing. """ body: str """The warning string.""" kind: Optional[str] = None """The type of the warning raised.""" @dataclass(eq=False, repr=False) class RPCReply(Message): """ The reply for a JSONRPC request. """ id: str """The RPC request id.""" jsonrpc: str = "2.0" """The JSONRPC version.""" result: Optional[Any] = None """The RPC result.""" warnings: List[RPCWarning] = field(default_factory=list) """A list of warnings that occurred during request processing.""" @dataclass(eq=False, repr=False) class RPCError(Message): """ A error message for JSONRPC requests. 
""" error: str """The error message.""" id: str """The RPC request id.""" jsonrpc: str = "2.0" """The JSONRPC version.""" warnings: List[RPCWarning] = field(default_factory=list) """A list of warnings that occurred during request processing.""" @dataclass(eq=False, repr=False) class TargetDevice(Message): """ ISA and specs for a particular device. """ isa: Dict[str, Dict] """Instruction-set architecture for this device.""" specs: Dict[str, Dict] """Fidelities and coherence times for this device.""" @dataclass(eq=False, repr=False) class RandomizedBenchmarkingRequest(Message): """ RPC request payload for generating a randomized benchmarking sequence. """ depth: int """Depth of the benchmarking sequence.""" qubits: int """Number of qubits involved in the benchmarking sequence.""" gateset: List[str] """List of Quil programs, each describing a Clifford.""" seed: Optional[int] = None """PRNG seed. Set this to guarantee repeatable results.""" interleaver: Optional[str] = None """Fixed Clifford, specified as a Quil string, to interleave through an RB sequence.""" @dataclass(eq=False, repr=False) class RandomizedBenchmarkingResponse(Message): """ RPC reply payload for a randomly generated benchmarking sequence. """ sequence: List[List[int]] """List of Cliffords, each expressed as a list of generator indices.""" @dataclass(eq=False, repr=False) class PauliTerm(Message): """ Specification of a single Pauli term as a tensor product of Pauli factors. """ indices: List[int] """Qubit indices onto which the factors of a Pauli term are applied.""" symbols: List[str] """Ordered factors of a Pauli term.""" @dataclass(eq=False, repr=False) class ConjugateByCliffordRequest(Message): """ RPC request payload for conjugating a Pauli element by a Clifford element. 
""" pauli: PauliTerm """Specification of a Pauli element.""" clifford: str """Specification of a Clifford element.""" @dataclass(eq=False, repr=False) class ConjugateByCliffordResponse(Message): """ RPC reply payload for a Pauli element as conjugated by a Clifford element. """ phase: int """Encoded global phase factor on the emitted Pauli.""" pauli: str """Description of the encoded Pauli.""" @dataclass(eq=False, repr=False) class NativeQuilRequest(Message): """ Quil and the device metadata necessary for quilc. """ quil: str """Arbitrary Quil to be sent to quilc.""" target_device: TargetDevice """Specifications for the device to target with quilc.""" @dataclass(eq=False, repr=False) class NativeQuilMetadata(Message): """ Metadata for a native quil program. """ final_rewiring: List[int] = field(default_factory=list) """Output qubit index relabeling due to SWAP insertion.""" gate_depth: Optional[int] = None """Maximum number of successive gates in the native quil program.""" gate_volume: Optional[int] = None """Total number of gates in the native quil program.""" multiqubit_gate_depth: Optional[int] = None """Maximum number of successive two-qubit gates in the native quil program.""" program_duration: Optional[float] = None """Rough estimate of native quil program length in nanoseconds.""" program_fidelity: Optional[float] = None """Rough estimate of the fidelity of the full native quil program, uses specs.""" topological_swaps: Optional[int] = None """Total number of SWAPs in the native quil program.""" qpu_runtime_estimation: Optional[float] = None """The estimated runtime (milliseconds) on a Rigetti QPU for a protoquil program.""" @dataclass(eq=False, repr=False) class NativeQuilResponse(Message): """ Native Quil and associated metadata returned from quilc. 
""" quil: str """Native Quil returned from quilc.""" metadata: Optional[NativeQuilMetadata] = None """Metadata for the returned Native Quil.""" @dataclass(eq=False, repr=False) class RewriteArithmeticRequest(Message): """ A request type to handle compiling arithmetic out of gate parameters. """ quil: str """Native Quil for which to rewrite arithmetic parameters.""" @dataclass(eq=False, repr=False) class RewriteArithmeticResponse(Message): """ The data needed to run programs with gate arithmetic on the hardware. """ quil: str """Native Quil rewritten with no arithmetic in gate parameters.""" original_memory_descriptors: Dict[str, ParameterSpec] = field(default_factory=dict) """The declared memory descriptors in the Quil of the related request.""" recalculation_table: Dict[ParameterAref, str] = field(default_factory=dict) """A mapping from memory references to the original gate arithmetic.""" @dataclass(eq=False, repr=False) class BinaryExecutableRequest(Message): """ Native Quil and the information needed to create binary executables. """ quil: str """Native Quil to be translated into an executable program.""" num_shots: int """The number of times to repeat the program.""" @dataclass(eq=False, repr=False) class BinaryExecutableResponse(Message): """ Program to run on the QPU. """ program: str """Execution settings and sequencer binaries.""" memory_descriptors: Dict[str, ParameterSpec] = field(default_factory=dict) """Internal field for constructing patch tables.""" ro_sources: List[Any] = field(default_factory=list) """Internal field for reshaping returned buffers.""" @dataclass(eq=False, repr=False) class QuiltBinaryExecutableRequest(Message): """ Native Quilt and the information needed to create binary executables. """ quilt: str """Native Quilt to be translated into an executable program.""" num_shots: int """The number of times to repeat the program.""" @dataclass(eq=False, repr=False) class QuiltBinaryExecutableResponse(Message): """ Program to run on the QPU. 
""" program: str """Execution settings and sequencer binaries.""" debug: Dict[str, Any] """Debug information associated with the translation process.""" memory_descriptors: Dict[str, ParameterSpec] = field(default_factory=dict) """Internal field for constructing patch tables.""" ro_sources: List[Any] = field(default_factory=list) """Internal field for reshaping returned buffers.""" @dataclass(eq=False, repr=False) class PyQuilExecutableResponse(Message): """ rpcQ-serializable form of a pyQuil Program object. """ program: str """String representation of a Quil program.""" attributes: Dict[str, Any] """Miscellaneous attributes to be unpacked onto the pyQuil Program object.""" @dataclass(eq=False, repr=False) class QPURequest(Message): """ Program and patch values to send to the QPU for execution. """ program: Any """Execution settings and sequencer binaries.""" patch_values: Dict[str, List[Any]] """Dictionary mapping data names to data values for patching the binary.""" id: str """QPU request ID.""" @dataclass(eq=False, repr=False) class QuiltCalibrationsRequest(Message): """ A request for up-to-date Quilt calibrations. """ target_device: TargetDevice """Specifications for the device to get calibrations for.""" @dataclass(eq=False, repr=False) class QuiltCalibrationsResponse(Message): """ Up-to-date Quilt calibrations. """ quilt: str """Quilt code with definitions for frames, waveforms, and calibrations.""" @dataclass(eq=False, repr=False) class GetExecutionResultsResponse(Message): """ Results of a completed ExecutorJob execution. 
""" buffers: Dict[str, Dict[str, Any]] """Result buffers for a completed ExecutorJob.""" execution_duration_microseconds: int """Duration (in microseconds) ExecutorJob held exclusive access to quantum hardware.""" ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1570747843.0 rpcq-3.11.0/rpcq/py.typed0000644000076700000240000000000000000000000016000 0ustar00ksnyderstaff00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1679613412.0 rpcq-3.11.0/rpcq/version.py0000644000076700000240000000002700000000000016351 0ustar00ksnyderstaff00000000000000__version__ = '3.11.0' ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1679613412.8031359 rpcq-3.11.0/rpcq.egg-info/0000755000076700000240000000000000000000000016005 5ustar00ksnyderstaff00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1679613412.0 rpcq-3.11.0/rpcq.egg-info/PKG-INFO0000644000076700000240000002210300000000000017100 0ustar00ksnyderstaff00000000000000Metadata-Version: 2.1 Name: rpcq Version: 3.11.0 Summary: The RPC framework and message specification for Rigetti QCS. Home-page: https://github.com/rigetticomputing/rpcq.git Author: Rigetti Computing Author-email: info@rigetti.com License: Apache-2.0 Description: rpcq ==== [![pypi version](https://img.shields.io/pypi/v/rpcq.svg)](https://pypi.org/project/rpcq/) [![conda-forge version](https://img.shields.io/conda/vn/conda-forge/rpcq.svg)](https://anaconda.org/conda-forge/rpcq) [![docker pulls](https://img.shields.io/docker/pulls/rigetti/rpcq.svg)](https://hub.docker.com/r/rigetti/rpcq) The asynchronous RPC client-server framework and message specification for [Rigetti Quantum Cloud Services (QCS)](https://www.rigetti.com/). 
Implements an efficient transport protocol by using [ZeroMQ](http://zeromq.org/) (ZMQ)
sockets and [MessagePack](https://msgpack.org/index.html) (`msgpack`) serialization.
It is not intended to be a full-featured replacement for other frameworks like
[gRPC](https://grpc.io/) or [Apache Thrift](https://thrift.apache.org/).

Python Installation
-------------------

To install directly from the source, run `pip install -e .` from within the
top-level directory of the `rpcq` repository. To additionally install the
requirements for testing, make sure to run `pip install -r requirements.txt`.

To instead install the latest released version of `rpcq` from the Python package
manager PyPI, run `pip install rpcq`.

**NOTE**: We strongly encourage users of `rpcq` to install the software within a
(Python) virtual environment (read up on
[`virtualenv`](https://github.com/pypa/virtualenv),
[`pyenv`](https://github.com/pyenv/pyenv), or
[`conda`](https://github.com/conda/conda) for more info).

Lisp Installation
-----------------

Installation is easier with QuickLisp. After placing the source for RPCQ within
your local Lisp projects directory (cf. `ql:*local-project-directories*`), run
`(ql:quickload :rpcq)` and QuickLisp will download the necessary Lisp
dependencies.

In addition to the Lisp dependencies, RPCQ depends on ZeroMQ. Be sure to install
both the library *and* its development headers (which are necessary for the Lisp
foreign-function interface to get its bearings).

Using the Client-Server Framework
---------------------------------

The following two code samples (first in Python, then in Lisp) demonstrate how
to create a server, add a test handler, and spin it up.
```python
from rpcq import Server

server = Server()

@server.rpc_handler
def test():
    return 'test'

server.run('tcp://*:5555')
```

```lisp
(defun test ()
  "test")

(let ((dt (rpcq:make-dispatch-table)))
  (rpcq:dispatch-table-add-handler dt 'test)
  (rpcq:start-server :dispatch-table dt
                     :listen-addresses '("tcp://*:5555")))
```

In another window, we can (again first in Python, then in Lisp) create a client
that points to the same socket, and call the test method.

```python
from rpcq import Client

client = Client('tcp://localhost:5555')

client.call('test')
```

```lisp
(rpcq:with-rpc-client (client "tcp://localhost:5555")
  (rpcq:rpc-call client "test"))
```

In all cases (including interoperating a client/server pair written in different
languages), this will return the string `'test'`.

Using the Message Spec
----------------------

The message spec as defined in `src/messages.lisp` (which in turn produces
`rpcq/messages.py`) is meant to be used with the
[Rigetti QCS](https://www.rigetti.com/qcs) platform. Therefore, these messages
are used in [`pyquil`](https://github.com/rigetticomputing/pyquil), in order to
allow users to communicate with the Rigetti Quil compiler and quantum processing
units (QPUs). PyQuil provides utilities for users to interact with the QCS API
and write programs in [Quil](https://arxiv.org/abs/1608.03355), the quantum
instruction language developed at Rigetti. Thus, most users will not interact
with `rpcq.messages` directly. However, for those interested in building their
own implementation of the QCS API utilities in pyQuil, becoming acquainted with
the client-server framework, the available messages in the message spec, and how
they are used in the `pyquil.api` module would be a good place to start!

Updating the Python Message Bindings
------------------------------------

Currently only Python bindings are available for the message spec, but more
language bindings are in the works.
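Each generated Python binding is an ordinary dataclass whose fields mirror the slots declared in `src/messages.lisp`. The following stdlib-only sketch illustrates the pattern the generated classes follow; the `ToyRequest` class and its fields are made up for illustration and are not part of `rpcq.messages`:

```python
from dataclasses import dataclass, field
from typing import Any, List

# Illustrative stand-in for a generated message class (not part of rpcq).
@dataclass(eq=False, repr=False)
class ToyRequest:
    method: str                     # required field: the RPC function name
    params: Any                     # required field: the RPC arguments
    id: str                        # pairs a request with its reply
    jsonrpc: str = "2.0"           # defaulted scalar field
    warnings: List[str] = field(default_factory=list)  # mutable default needs a factory

req = ToyRequest(method="test", params=[], id="1")
print(req.jsonrpc)   # 2.0
print(req.warnings)  # []
```

Note the `default_factory=list` idiom, which the generated classes also use: a bare `warnings: List[str] = []` would share one list across all instances.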
To update the Python message bindings after editing `src/messages.lisp`, open
`rlwrap sbcl` and run:

```lisp
(ql:quickload :rpcq)
(with-open-file (f "rpcq/messages.py" :direction :output :if-exists :supersede)
  (rpcq:python-message-spec f))
```

**NOTE**: Requires pre-installed [`sbcl`](http://www.sbcl.org/),
[`quicklisp`](https://www.quicklisp.org/beta/), and (optionally)
[`rlwrap`](https://github.com/hanslub42/rlwrap).

We can also use the rpcq docker container to update the message spec without
having to install the requirements.

```bash
./docker_update_python_spec.sh
```

Running the Unit Tests
----------------------

The `rpcq` repository is configured with GitLab CI to automatically run the unit
tests. The tests run within a container based off of the
[`rigetti/lisp`](https://hub.docker.com/r/rigetti/lisp) Docker image, which is
pinned to a specific tag. If you need a more recent version of the image, update
the tag in the `.gitlab-ci.yml`.

The Python unit tests can be executed locally by running `pytest` from the
top-level directory of the repository (assuming you have installed the test
requirements).

The Lisp unit tests can be run locally by doing the following from within
`rlwrap sbcl`:

```lisp
(ql:quickload :rpcq)
(asdf:test-system :rpcq)
```

There may be some instances of `STYLE-WARNING`, but if the tests run
successfully, there should be something near the bottom of the output that looks
like:

```
RPCQ-TESTS (Suite)
  TEST-DEFMESSAGE                       [ OK ]
```

Automated Packaging with Docker
-------------------------------

The CI pipeline for `rpcq` produces a Docker image, available at
[`rigetti/rpcq`](https://hub.docker.com/r/rigetti/rpcq). To get the latest
stable version of `rpcq`, run `docker pull rigetti/rpcq`.

The image is built from the [`rigetti/lisp`](https://hub.docker.com/r/rigetti/lisp)
Docker image, which is pinned to a specific tag. If you need a more recent
version of the image, update the tag in the `Dockerfile`.
To learn more about the `rigetti/lisp` Docker image, check out the
[`docker-lisp`](https://github.com/rigetti/docker-lisp) repository.

Release Process
---------------

1. Update `VERSION.txt` and dependency versions (if applicable) and push the
   commit to `master`.
2. Push a git tag `vX.Y.Z` that contains the same version number as in
   `VERSION.txt`.
3. Verify that the resulting build (triggered by pushing the tag) completes
   successfully.
4. Push the tagged commit to `pypi` and verify it appears
   [here](https://pypi.org/project/rpcq/).
5. Publish a [release](https://github.com/rigetti/rpcq/releases) using the tag
   as the name.
6. Close the [milestone](https://github.com/rigetti/rpcq/milestones) associated
   with this release, and migrate incomplete issues to the next one.

Authors
-------

Developed at [Rigetti Computing](https://github.com/rigetticomputing) by
[Nikolas Tezak](https://github.com/ntezak),
[Steven Heidel](https://github.com/stevenheidel),
[Eric Peterson](https://github.com/ecp-rigetti),
[Colm Ryan](https://github.com/caryan),
[Peter Karalekas](https://github.com/karalekas),
[Guen Prawiroatmodjo](https://github.com/guenp),
[Erik Davis](https://github.com/kilimanjaro), and
[Robert Smith](https://github.com/tarballs-are-good).
Keywords: quantum rpc qcs
Platform: UNKNOWN
Requires-Python: >=3.6
Description-Content-Type: text/markdown

rpcq-3.11.0/rpcq.egg-info/SOURCES.txt

LICENSE
MANIFEST.in
README.md
VERSION.txt
requirements.txt
setup.py
rpcq/__init__.py
rpcq/_base.py
rpcq/_client.py
rpcq/_server.py
rpcq/_spec.py
rpcq/_utils.py
rpcq/core_messages.py
rpcq/messages.py
rpcq/py.typed
rpcq/version.py
rpcq.egg-info/PKG-INFO
rpcq.egg-info/SOURCES.txt
rpcq.egg-info/dependency_links.txt
rpcq.egg-info/not-zip-safe
rpcq.egg-info/requires.txt
rpcq.egg-info/top_level.txt
rpcq/external/__init__.py
rpcq/external/dataclasses.py

rpcq-3.11.0/rpcq.egg-info/requires.txt

msgpack<2.0,>=0.6
python-rapidjson
pyzmq>=17
ruamel.yaml

rpcq-3.11.0/rpcq.egg-info/top_level.txt

rpcq

rpcq-3.11.0/setup.cfg
[egg_info]
tag_build =
tag_date = 0

rpcq-3.11.0/setup.py

##############################################################################
# Copyright 2018 Rigetti Computing
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
##############################################################################
import sys

from setuptools import setup

if sys.version_info < (3, 6):
    raise ImportError('The rpcq library requires Python 3.6 or above.')

with open('VERSION.txt', 'r') as f:
    __version__ = f.read().strip()

with open('README.md', 'r') as f:
    long_description = f.read()

# save the source code in version.py
with open('rpcq/version.py', 'r') as f:
    version_file_source = f.read()

# overwrite version.py in the source distribution
with open('rpcq/version.py', 'w') as f:
    f.write(f'__version__ = \'{__version__}\'\n')

setup(
    name='rpcq',
    version=__version__,
    author='Rigetti Computing',
    author_email='info@rigetti.com',
    license='Apache-2.0',
    packages=[
        'rpcq',
        'rpcq.external'
    ],
    package_data={'rpcq': ['py.typed']},
    url='https://github.com/rigetticomputing/rpcq.git',
    description='''The RPC framework and message specification for Rigetti QCS.''',
    long_description=long_description,
    long_description_content_type='text/markdown',
    install_requires=[
        "msgpack>=0.6,<2.0",
"python-rapidjson", "pyzmq>=17", "ruamel.yaml", ], keywords='quantum rpc qcs', python_requires='>=3.6', # zip_safe must be disabled in order for mypy to find the installed py.typed file, according to # https://mypy.readthedocs.io/en/latest/installed_packages.html#making-pep-561-compatible-packages zip_safe=False ) # restore version.py to its previous state with open('rpcq/version.py', 'w') as f: f.write(version_file_source)