././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1714496532.0019214 google-api-core-2.19.0/0002755000000000017530000000000014614222024012425 5ustar00root././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/LICENSE0000664000000000017530000002613614614221643013450 0ustar00root Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: (a) You must give any other recipients of the Work or Derivative Works a copy of this License; and (b) You must cause any modified files to carry prominent notices stating that You changed the files; and (c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and (d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS APPENDIX: How to apply the Apache License to your work. To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "[]" replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives. Copyright [yyyy] [name of copyright owner] Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/MANIFEST.in0000664000000000017530000000153414614221643014174 0ustar00root# -*- coding: utf-8 -*- # # Copyright 2023 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # https://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # Generated by synthtool. DO NOT EDIT! include README.rst LICENSE recursive-include google *.json *.proto py.typed recursive-include tests * global-exclude *.py[co] global-exclude __pycache__ # Exclude scripts for samples readmegen prune scripts/readme-gen ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1714496532.0019214 google-api-core-2.19.0/PKG-INFO0000644000000000017530000000522414614222024013523 0ustar00rootMetadata-Version: 2.1 Name: google-api-core Version: 2.19.0 Summary: Google API client core library Home-page: https://github.com/googleapis/python-api-core Author: Google LLC Author-email: googleapis-packages@google.com License: Apache 2.0 Platform: Posix; MacOS X; Windows Classifier: Development Status :: 5 - Production/Stable Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: Apache Software License Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3.8 Classifier: Programming Language :: Python :: 3.9 Classifier: Programming Language :: Python :: 3.10 Classifier: Programming Language :: Python :: 3.11 Classifier: Programming Language :: Python :: 3.12 Classifier: Operating System :: OS Independent Classifier: Topic :: Internet Requires-Python: >=3.7 License-File: LICENSE Requires-Dist: googleapis-common-protos<2.0.dev0,>=1.56.2 Requires-Dist: protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0.dev0,>=3.19.5 Requires-Dist: proto-plus<2.0.0dev,>=1.22.3 Requires-Dist: google-auth<3.0.dev0,>=2.14.1 Requires-Dist: requests<3.0.0.dev0,>=2.18.0 Provides-Extra: grpc Requires-Dist: grpcio<2.0dev,>=1.33.2; extra == "grpc" Requires-Dist: grpcio<2.0dev,>=1.49.1; python_version >= "3.11" and extra == "grpc" Requires-Dist: grpcio-status<2.0.dev0,>=1.33.2; extra == "grpc" Requires-Dist: grpcio-status<2.0.dev0,>=1.49.1; python_version >= "3.11" and extra == "grpc" Provides-Extra: grpcgcp Requires-Dist: grpcio-gcp<1.0.dev0,>=0.2.2; extra == "grpcgcp" Provides-Extra: grpcio-gcp Requires-Dist: grpcio-gcp<1.0.dev0,>=0.2.2; extra == "grpcio-gcp" Core Library for Google Client Libraries ======================================== |pypi| |versions| This library is not meant to stand-alone. Instead it defines common helpers used by all Google API clients. For more information, see the `documentation`_. .. |pypi| image:: https://img.shields.io/pypi/v/google-api_core.svg :target: https://pypi.org/project/google-api_core/ .. |versions| image:: https://img.shields.io/pypi/pyversions/google-api_core.svg :target: https://pypi.org/project/google-api_core/ .. _documentation: https://googleapis.dev/python/google-api-core/latest Supported Python Versions ------------------------- Python >= 3.7 Unsupported Python Versions --------------------------- Python == 2.7, Python == 3.5, Python == 3.6. The last version of this library compatible with Python 2.7 and 3.5 is `google-api-core==1.31.1`. The last version of this library compatible with Python 3.6 is `google-api-core==2.8.2`. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/README.rst0000664000000000017530000000166514614221643014132 0ustar00rootCore Library for Google Client Libraries ======================================== |pypi| |versions| This library is not meant to stand-alone. Instead it defines common helpers used by all Google API clients. For more information, see the `documentation`_. .. |pypi| image:: https://img.shields.io/pypi/v/google-api_core.svg :target: https://pypi.org/project/google-api_core/ .. |versions| image:: https://img.shields.io/pypi/pyversions/google-api_core.svg :target: https://pypi.org/project/google-api_core/ .. _documentation: https://googleapis.dev/python/google-api-core/latest Supported Python Versions ------------------------- Python >= 3.7 Unsupported Python Versions --------------------------- Python == 2.7, Python == 3.5, Python == 3.6. The last version of this library compatible with Python 2.7 and 3.5 is `google-api-core==1.31.1`. The last version of this library compatible with Python 3.6 is `google-api-core==2.8.2`. ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1714496531.9699059 google-api-core-2.19.0/google/0002755000000000017530000000000014614222024013701 5ustar00root././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1714496531.9779098 google-api-core-2.19.0/google/api_core/0002755000000000017530000000000014614222024015462 5ustar00root././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/__init__.py0000664000000000017530000000141614614221643017603 0ustar00root# Copyright 2017 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Google API Core. This package contains common code and utilities used by Google client libraries. """ from google.api_core import version as api_core_version __version__ = api_core_version.__version__ ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/bidi.py0000664000000000017530000006644414614221643016767 0ustar00root# Copyright 2017, Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # https://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Bi-directional streaming RPC helpers.""" import collections import datetime import logging import queue as queue_module import threading import time from google.api_core import exceptions _LOGGER = logging.getLogger(__name__) _BIDIRECTIONAL_CONSUMER_NAME = "Thread-ConsumeBidirectionalStream" class _RequestQueueGenerator(object): """A helper for sending requests to a gRPC stream from a Queue. This generator takes requests off a given queue and yields them to gRPC. This helper is useful when you have an indeterminate, indefinite, or otherwise open-ended set of requests to send through a request-streaming (or bidirectional) RPC. The reason this is necessary is because gRPC takes an iterator as the request for request-streaming RPCs. gRPC consumes this iterator in another thread to allow it to block while generating requests for the stream. However, if the generator blocks indefinitely gRPC will not be able to clean up the thread as it'll be blocked on `next(iterator)` and not be able to check the channel status to stop iterating. This helper mitigates that by waiting on the queue with a timeout and checking the RPC state before yielding. Finally, it allows for retrying without swapping queues because if it does pull an item off the queue when the RPC is inactive, it'll immediately put it back and then exit. This is necessary because yielding the item in this case will cause gRPC to discard it. In practice, this means that the order of messages is not guaranteed. If such a thing is necessary it would be easy to use a priority queue. Example:: requests = request_queue_generator(q) call = stub.StreamingRequest(iter(requests)) requests.call = call for response in call: print(response) q.put(...) Note that it is possible to accomplish this behavior without "spinning" (using a queue timeout). One possible way would be to use more threads to multiplex the grpc end event with the queue, another possible way is to use selectors and a custom event/queue object. Both of these approaches are significant from an engineering perspective for small benefit - the CPU consumed by spinning is pretty minuscule. Args: queue (queue_module.Queue): The request queue. period (float): The number of seconds to wait for items from the queue before checking if the RPC is cancelled. In practice, this determines the maximum amount of time the request consumption thread will live after the RPC is cancelled. initial_request (Union[protobuf.Message, Callable[None, protobuf.Message]]): The initial request to yield. This is done independently of the request queue to allow fo easily restarting streams that require some initial configuration request. """ def __init__(self, queue, period=1, initial_request=None): self._queue = queue self._period = period self._initial_request = initial_request self.call = None def _is_active(self): # Note: there is a possibility that this starts *before* the call # property is set. So we have to check if self.call is set before # seeing if it's active. We need to return True if self.call is None. # See https://github.com/googleapis/python-api-core/issues/560. return self.call is None or self.call.is_active() def __iter__(self): if self._initial_request is not None: if callable(self._initial_request): yield self._initial_request() else: yield self._initial_request while True: try: item = self._queue.get(timeout=self._period) except queue_module.Empty: if not self._is_active(): _LOGGER.debug( "Empty queue and inactive call, exiting request " "generator." ) return else: # call is still active, keep waiting for queue items. continue # The consumer explicitly sent "None", indicating that the request # should end. if item is None: _LOGGER.debug("Cleanly exiting request generator.") return if not self._is_active(): # We have an item, but the call is closed. We should put the # item back on the queue so that the next call can consume it. self._queue.put(item) _LOGGER.debug( "Inactive call, replacing item on queue and exiting " "request generator." ) return yield item class _Throttle(object): """A context manager limiting the total entries in a sliding time window. If more than ``access_limit`` attempts are made to enter the context manager instance in the last ``time window`` interval, the exceeding requests block until enough time elapses. The context manager instances are thread-safe and can be shared between multiple threads. If multiple requests are blocked and waiting to enter, the exact order in which they are allowed to proceed is not determined. Example:: max_three_per_second = _Throttle( access_limit=3, time_window=datetime.timedelta(seconds=1) ) for i in range(5): with max_three_per_second as time_waited: print("{}: Waited {} seconds to enter".format(i, time_waited)) Args: access_limit (int): the maximum number of entries allowed in the time window time_window (datetime.timedelta): the width of the sliding time window """ def __init__(self, access_limit, time_window): if access_limit < 1: raise ValueError("access_limit argument must be positive") if time_window <= datetime.timedelta(0): raise ValueError("time_window argument must be a positive timedelta") self._time_window = time_window self._access_limit = access_limit self._past_entries = collections.deque( maxlen=access_limit ) # least recent first self._entry_lock = threading.Lock() def __enter__(self): with self._entry_lock: cutoff_time = datetime.datetime.now() - self._time_window # drop the entries that are too old, as they are no longer relevant while self._past_entries and self._past_entries[0] < cutoff_time: self._past_entries.popleft() if len(self._past_entries) < self._access_limit: self._past_entries.append(datetime.datetime.now()) return 0.0 # no waiting was needed to_wait = (self._past_entries[0] - cutoff_time).total_seconds() time.sleep(to_wait) self._past_entries.append(datetime.datetime.now()) return to_wait def __exit__(self, *_): pass def __repr__(self): return "{}(access_limit={}, time_window={})".format( self.__class__.__name__, self._access_limit, repr(self._time_window) ) class BidiRpc(object): """A helper for consuming a bi-directional streaming RPC. This maps gRPC's built-in interface which uses a request iterator and a response iterator into a socket-like :func:`send` and :func:`recv`. This is a more useful pattern for long-running or asymmetric streams (streams where there is not a direct correlation between the requests and responses). Example:: initial_request = example_pb2.StreamingRpcRequest( setting='example') rpc = BidiRpc( stub.StreamingRpc, initial_request=initial_request, metadata=[('name', 'value')] ) rpc.open() while rpc.is_active(): print(rpc.recv()) rpc.send(example_pb2.StreamingRpcRequest( data='example')) This does *not* retry the stream on errors. See :class:`ResumableBidiRpc`. Args: start_rpc (grpc.StreamStreamMultiCallable): The gRPC method used to start the RPC. initial_request (Union[protobuf.Message, Callable[None, protobuf.Message]]): The initial request to yield. This is useful if an initial request is needed to start the stream. metadata (Sequence[Tuple(str, str)]): RPC metadata to include in the request. """ def __init__(self, start_rpc, initial_request=None, metadata=None): self._start_rpc = start_rpc self._initial_request = initial_request self._rpc_metadata = metadata self._request_queue = queue_module.Queue() self._request_generator = None self._is_active = False self._callbacks = [] self.call = None def add_done_callback(self, callback): """Adds a callback that will be called when the RPC terminates. This occurs when the RPC errors or is successfully terminated. Args: callback (Callable[[grpc.Future], None]): The callback to execute. It will be provided with the same gRPC future as the underlying stream which will also be a :class:`grpc.Call`. """ self._callbacks.append(callback) def _on_call_done(self, future): # This occurs when the RPC errors or is successfully terminated. # Note that grpc's "future" here can also be a grpc.RpcError. # See note in https://github.com/grpc/grpc/issues/10885#issuecomment-302651331 # that `grpc.RpcError` is also `grpc.call`. for callback in self._callbacks: callback(future) def open(self): """Opens the stream.""" if self.is_active: raise ValueError("Can not open an already open stream.") request_generator = _RequestQueueGenerator( self._request_queue, initial_request=self._initial_request ) try: call = self._start_rpc(iter(request_generator), metadata=self._rpc_metadata) except exceptions.GoogleAPICallError as exc: # The original `grpc.RpcError` (which is usually also a `grpc.Call`) is # available from the ``response`` property on the mapped exception. self._on_call_done(exc.response) raise request_generator.call = call # TODO: api_core should expose the future interface for wrapped # callables as well. if hasattr(call, "_wrapped"): # pragma: NO COVER call._wrapped.add_done_callback(self._on_call_done) else: call.add_done_callback(self._on_call_done) self._request_generator = request_generator self.call = call def close(self): """Closes the stream.""" if self.call is None: return self._request_queue.put(None) self.call.cancel() self._request_generator = None # Don't set self.call to None. Keep it around so that send/recv can # raise the error. def send(self, request): """Queue a message to be sent on the stream. Send is non-blocking. If the underlying RPC has been closed, this will raise. Args: request (protobuf.Message): The request to send. """ if self.call is None: raise ValueError("Can not send() on an RPC that has never been open()ed.") # Don't use self.is_active(), as ResumableBidiRpc will overload it # to mean something semantically different. if self.call.is_active(): self._request_queue.put(request) else: # calling next should cause the call to raise. next(self.call) def recv(self): """Wait for a message to be returned from the stream. Recv is blocking. If the underlying RPC has been closed, this will raise. Returns: protobuf.Message: The received message. """ if self.call is None: raise ValueError("Can not recv() on an RPC that has never been open()ed.") return next(self.call) @property def is_active(self): """bool: True if this stream is currently open and active.""" return self.call is not None and self.call.is_active() @property def pending_requests(self): """int: Returns an estimate of the number of queued requests.""" return self._request_queue.qsize() def _never_terminate(future_or_error): """By default, no errors cause BiDi termination.""" return False class ResumableBidiRpc(BidiRpc): """A :class:`BidiRpc` that can automatically resume the stream on errors. It uses the ``should_recover`` arg to determine if it should re-establish the stream on error. Example:: def should_recover(exc): return ( isinstance(exc, grpc.RpcError) and exc.code() == grpc.StatusCode.UNAVAILABLE) initial_request = example_pb2.StreamingRpcRequest( setting='example') metadata = [('header_name', 'value')] rpc = ResumableBidiRpc( stub.StreamingRpc, should_recover=should_recover, initial_request=initial_request, metadata=metadata ) rpc.open() while rpc.is_active(): print(rpc.recv()) rpc.send(example_pb2.StreamingRpcRequest( data='example')) Args: start_rpc (grpc.StreamStreamMultiCallable): The gRPC method used to start the RPC. initial_request (Union[protobuf.Message, Callable[None, protobuf.Message]]): The initial request to yield. This is useful if an initial request is needed to start the stream. should_recover (Callable[[Exception], bool]): A function that returns True if the stream should be recovered. This will be called whenever an error is encountered on the stream. should_terminate (Callable[[Exception], bool]): A function that returns True if the stream should be terminated. This will be called whenever an error is encountered on the stream. metadata Sequence[Tuple(str, str)]: RPC metadata to include in the request. throttle_reopen (bool): If ``True``, throttling will be applied to stream reopen calls. Defaults to ``False``. """ def __init__( self, start_rpc, should_recover, should_terminate=_never_terminate, initial_request=None, metadata=None, throttle_reopen=False, ): super(ResumableBidiRpc, self).__init__(start_rpc, initial_request, metadata) self._should_recover = should_recover self._should_terminate = should_terminate self._operational_lock = threading.RLock() self._finalized = False self._finalize_lock = threading.Lock() if throttle_reopen: self._reopen_throttle = _Throttle( access_limit=5, time_window=datetime.timedelta(seconds=10) ) else: self._reopen_throttle = None def _finalize(self, result): with self._finalize_lock: if self._finalized: return for callback in self._callbacks: callback(result) self._finalized = True def _on_call_done(self, future): # Unlike the base class, we only execute the callbacks on a terminal # error, not for errors that we can recover from. Note that grpc's # "future" here is also a grpc.RpcError. with self._operational_lock: if self._should_terminate(future): self._finalize(future) elif not self._should_recover(future): self._finalize(future) else: _LOGGER.debug("Re-opening stream from gRPC callback.") self._reopen() def _reopen(self): with self._operational_lock: # Another thread already managed to re-open this stream. if self.call is not None and self.call.is_active(): _LOGGER.debug("Stream was already re-established.") return self.call = None # Request generator should exit cleanly since the RPC its bound to # has exited. self._request_generator = None # Note: we do not currently do any sort of backoff here. The # assumption is that re-establishing the stream under normal # circumstances will happen in intervals greater than 60s. # However, it is possible in a degenerative case that the server # closes the stream rapidly which would lead to thrashing here, # but hopefully in those cases the server would return a non- # retryable error. try: if self._reopen_throttle: with self._reopen_throttle: self.open() else: self.open() # If re-opening or re-calling the method fails for any reason, # consider it a terminal error and finalize the stream. except Exception as exc: _LOGGER.debug("Failed to re-open stream due to %s", exc) self._finalize(exc) raise _LOGGER.info("Re-established stream") def _recoverable(self, method, *args, **kwargs): """Wraps a method to recover the stream and retry on error. If a retryable error occurs while making the call, then the stream will be re-opened and the method will be retried. This happens indefinitely so long as the error is a retryable one. If an error occurs while re-opening the stream, then this method will raise immediately and trigger finalization of this object. Args: method (Callable[..., Any]): The method to call. args: The args to pass to the method. kwargs: The kwargs to pass to the method. """ while True: try: return method(*args, **kwargs) except Exception as exc: with self._operational_lock: _LOGGER.debug("Call to retryable %r caused %s.", method, exc) if self._should_terminate(exc): self.close() _LOGGER.debug("Terminating %r due to %s.", method, exc) self._finalize(exc) break if not self._should_recover(exc): self.close() _LOGGER.debug("Not retrying %r due to %s.", method, exc) self._finalize(exc) raise exc _LOGGER.debug("Re-opening stream from retryable %r.", method) self._reopen() def _send(self, request): # Grab a reference to the RPC call. Because another thread (notably # the gRPC error thread) can modify self.call (by invoking reopen), # we should ensure our reference can not change underneath us. # If self.call is modified (such as replaced with a new RPC call) then # this will use the "old" RPC, which should result in the same # exception passed into gRPC's error handler being raised here, which # will be handled by the usual error handling in retryable. with self._operational_lock: call = self.call if call is None: raise ValueError("Can not send() on an RPC that has never been open()ed.") # Don't use self.is_active(), as ResumableBidiRpc will overload it # to mean something semantically different. if call.is_active(): self._request_queue.put(request) pass else: # calling next should cause the call to raise. next(call) def send(self, request): return self._recoverable(self._send, request) def _recv(self): with self._operational_lock: call = self.call if call is None: raise ValueError("Can not recv() on an RPC that has never been open()ed.") return next(call) def recv(self): return self._recoverable(self._recv) def close(self): self._finalize(None) super(ResumableBidiRpc, self).close() @property def is_active(self): """bool: True if this stream is currently open and active.""" # Use the operational lock. It's entirely possible for something # to check the active state *while* the RPC is being retried. # Also, use finalized to track the actual terminal state here. # This is because if the stream is re-established by the gRPC thread # it's technically possible to check this between when gRPC marks the # RPC as inactive and when gRPC executes our callback that re-opens # the stream. with self._operational_lock: return self.call is not None and not self._finalized class BackgroundConsumer(object): """A bi-directional stream consumer that runs in a separate thread. This maps the consumption of a stream into a callback-based model. It also provides :func:`pause` and :func:`resume` to allow for flow-control. Example:: def should_recover(exc): return ( isinstance(exc, grpc.RpcError) and exc.code() == grpc.StatusCode.UNAVAILABLE) initial_request = example_pb2.StreamingRpcRequest( setting='example') rpc = ResumeableBidiRpc( stub.StreamingRpc, initial_request=initial_request, should_recover=should_recover) def on_response(response): print(response) consumer = BackgroundConsumer(rpc, on_response) consumer.start() Note that error handling *must* be done by using the provided ``bidi_rpc``'s ``add_done_callback``. This helper will automatically exit whenever the RPC itself exits and will not provide any error details. Args: bidi_rpc (BidiRpc): The RPC to consume. Should not have been ``open()``ed yet. on_response (Callable[[protobuf.Message], None]): The callback to be called for every response on the stream. """ def __init__(self, bidi_rpc, on_response): self._bidi_rpc = bidi_rpc self._on_response = on_response self._paused = False self._wake = threading.Condition() self._thread = None self._operational_lock = threading.Lock() def _on_call_done(self, future): # Resume the thread if it's paused, this prevents blocking forever # when the RPC has terminated. self.resume() def _thread_main(self, ready): try: ready.set() self._bidi_rpc.add_done_callback(self._on_call_done) self._bidi_rpc.open() while self._bidi_rpc.is_active: # Do not allow the paused status to change at all during this # section. There is a condition where we could be resumed # between checking if we are paused and calling wake.wait(), # which means that we will miss the notification to wake up # (oops!) and wait for a notification that will never come. # Keeping the lock throughout avoids that. # In the future, we could use `Condition.wait_for` if we drop # Python 2.7. # See: https://github.com/googleapis/python-api-core/issues/211 with self._wake: while self._paused: _LOGGER.debug("paused, waiting for waking.") self._wake.wait() _LOGGER.debug("woken.") _LOGGER.debug("waiting for recv.") response = self._bidi_rpc.recv() _LOGGER.debug("recved response.") self._on_response(response) except exceptions.GoogleAPICallError as exc: _LOGGER.debug( "%s caught error %s and will exit. Generally this is due to " "the RPC itself being cancelled and the error will be " "surfaced to the calling code.", _BIDIRECTIONAL_CONSUMER_NAME, exc, exc_info=True, ) except Exception as exc: _LOGGER.exception( "%s caught unexpected exception %s and will exit.", _BIDIRECTIONAL_CONSUMER_NAME, exc, ) _LOGGER.info("%s exiting", _BIDIRECTIONAL_CONSUMER_NAME) def start(self): """Start the background thread and begin consuming the thread.""" with self._operational_lock: ready = threading.Event() thread = threading.Thread( name=_BIDIRECTIONAL_CONSUMER_NAME, target=self._thread_main, args=(ready,), ) thread.daemon = True thread.start() # Other parts of the code rely on `thread.is_alive` which # isn't sufficient to know if a thread is active, just that it may # soon be active. This can cause races. Further protect # against races by using a ready event and wait on it to be set. ready.wait() self._thread = thread _LOGGER.debug("Started helper thread %s", thread.name) def stop(self): """Stop consuming the stream and shutdown the background thread.""" with self._operational_lock: self._bidi_rpc.close() if self._thread is not None: # Resume the thread to wake it up in case it is sleeping. self.resume() # The daemonized thread may itself block, so don't wait # for it longer than a second. self._thread.join(1.0) if self._thread.is_alive(): # pragma: NO COVER _LOGGER.warning("Background thread did not exit.") self._thread = None @property def is_active(self): """bool: True if the background thread is active.""" return self._thread is not None and self._thread.is_alive() def pause(self): """Pauses the response stream. This does *not* pause the request stream. """ with self._wake: self._paused = True def resume(self): """Resumes the response stream.""" with self._wake: self._paused = False self._wake.notify_all() @property def is_paused(self): """bool: True if the response stream is paused.""" return self._paused ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/client_info.py0000664000000000017530000000724014614221643020336 0ustar00root# Copyright 2017 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Helpers for providing client information. Client information is used to send information about the calling client, such as the library and Python version, to API services. """ import platform from typing import Union from google.api_core import version as api_core_version _PY_VERSION = platform.python_version() _API_CORE_VERSION = api_core_version.__version__ _GRPC_VERSION: Union[str, None] try: import grpc _GRPC_VERSION = grpc.__version__ except ImportError: # pragma: NO COVER _GRPC_VERSION = None class ClientInfo(object): """Client information used to generate a user-agent for API calls. This user-agent information is sent along with API calls to allow the receiving service to do analytics on which versions of Python and Google libraries are being used. Args: python_version (str): The Python interpreter version, for example, ``'3.9.6'``. grpc_version (Optional[str]): The gRPC library version. api_core_version (str): The google-api-core library version. gapic_version (Optional[str]): The version of gapic-generated client library, if the library was generated by gapic. client_library_version (Optional[str]): The version of the client library, generally used if the client library was not generated by gapic or if additional functionality was built on top of a gapic client library. user_agent (Optional[str]): Prefix to the user agent header. This is used to supply information such as application name or partner tool. Recommended format: ``application-or-tool-ID/major.minor.version``. rest_version (Optional[str]): The requests library version. """ def __init__( self, python_version=_PY_VERSION, grpc_version=_GRPC_VERSION, api_core_version=_API_CORE_VERSION, gapic_version=None, client_library_version=None, user_agent=None, rest_version=None, ): self.python_version = python_version self.grpc_version = grpc_version self.api_core_version = api_core_version self.gapic_version = gapic_version self.client_library_version = client_library_version self.user_agent = user_agent self.rest_version = rest_version def to_user_agent(self): """Returns the user-agent string for this client info.""" # Note: the order here is important as the internal metrics system # expects these items to be in specific locations. ua = "" if self.user_agent is not None: ua += "{user_agent} " ua += "gl-python/{python_version} " if self.grpc_version is not None: ua += "grpc/{grpc_version} " if self.rest_version is not None: ua += "rest/{rest_version} " ua += "gax/{api_core_version} " if self.gapic_version is not None: ua += "gapic/{gapic_version} " if self.client_library_version is not None: ua += "gccl/{client_library_version} " return ua.format(**self.__dict__).strip() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/client_options.py0000664000000000017530000001251114614221643021073 0ustar00root# Copyright 2019 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Client options class. Client options provide a consistent interface for user options to be defined across clients. You can pass a client options object to a client. .. code-block:: python from google.api_core.client_options import ClientOptions from google.cloud.vision_v1 import ImageAnnotatorClient def get_client_cert(): # code to load client certificate and private key. return client_cert_bytes, client_private_key_bytes options = ClientOptions(api_endpoint="foo.googleapis.com", client_cert_source=get_client_cert) client = ImageAnnotatorClient(client_options=options) You can also pass a mapping object. .. code-block:: python from google.cloud.vision_v1 import ImageAnnotatorClient client = ImageAnnotatorClient( client_options={ "api_endpoint": "foo.googleapis.com", "client_cert_source" : get_client_cert }) """ class ClientOptions(object): """Client Options used to set options on clients. Args: api_endpoint (Optional[str]): The desired API endpoint, e.g., compute.googleapis.com client_cert_source (Optional[Callable[[], (bytes, bytes)]]): A callback which returns client certificate bytes and private key bytes both in PEM format. ``client_cert_source`` and ``client_encrypted_cert_source`` are mutually exclusive. client_encrypted_cert_source (Optional[Callable[[], (str, str, bytes)]]): A callback which returns client certificate file path, encrypted private key file path, and the passphrase bytes.``client_cert_source`` and ``client_encrypted_cert_source`` are mutually exclusive. quota_project_id (Optional[str]): A project name that a client's quota belongs to. credentials_file (Optional[str]): A path to a file storing credentials. ``credentials_file` and ``api_key`` are mutually exclusive. scopes (Optional[Sequence[str]]): OAuth access token override scopes. api_key (Optional[str]): Google API key. ``credentials_file`` and ``api_key`` are mutually exclusive. api_audience (Optional[str]): The intended audience for the API calls to the service that will be set when using certain 3rd party authentication flows. Audience is typically a resource identifier. If not set, the service endpoint value will be used as a default. An example of a valid ``api_audience`` is: "https://language.googleapis.com". universe_domain (Optional[str]): The desired universe domain. This must match the one in credentials. If not set, the default universe domain is `googleapis.com`. If both `api_endpoint` and `universe_domain` are set, then `api_endpoint` is used as the service endpoint. If `api_endpoint` is not specified, the format will be `{service}.{universe_domain}`. Raises: ValueError: If both ``client_cert_source`` and ``client_encrypted_cert_source`` are provided, or both ``credentials_file`` and ``api_key`` are provided. """ def __init__( self, api_endpoint=None, client_cert_source=None, client_encrypted_cert_source=None, quota_project_id=None, credentials_file=None, scopes=None, api_key=None, api_audience=None, universe_domain=None, ): if client_cert_source and client_encrypted_cert_source: raise ValueError( "client_cert_source and client_encrypted_cert_source are mutually exclusive" ) if api_key and credentials_file: raise ValueError("api_key and credentials_file are mutually exclusive") self.api_endpoint = api_endpoint self.client_cert_source = client_cert_source self.client_encrypted_cert_source = client_encrypted_cert_source self.quota_project_id = quota_project_id self.credentials_file = credentials_file self.scopes = scopes self.api_key = api_key self.api_audience = api_audience self.universe_domain = universe_domain def __repr__(self): return "ClientOptions: " + repr(self.__dict__) def from_dict(options): """Construct a client options object from a mapping object. Args: options (collections.abc.Mapping): A mapping object with client options. See the docstring for ClientOptions for details on valid arguments. """ client_options = ClientOptions() for key, value in options.items(): if hasattr(client_options, key): setattr(client_options, key, value) else: raise ValueError("ClientOptions does not accept an option '" + key + "'") return client_options ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/datetime_helpers.py0000664000000000017530000002151214614221643021361 0ustar00root# Copyright 2017 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Helpers for :mod:`datetime`.""" import calendar import datetime import re from google.protobuf import timestamp_pb2 _UTC_EPOCH = datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc) _RFC3339_MICROS = "%Y-%m-%dT%H:%M:%S.%fZ" _RFC3339_NO_FRACTION = "%Y-%m-%dT%H:%M:%S" # datetime.strptime cannot handle nanosecond precision: parse w/ regex _RFC3339_NANOS = re.compile( r""" (?P \d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2} # YYYY-MM-DDTHH:MM:SS ) ( # Optional decimal part \. # decimal point (?P\d{1,9}) # nanoseconds, maybe truncated )? Z # Zulu """, re.VERBOSE, ) def utcnow(): """A :meth:`datetime.datetime.utcnow()` alias to allow mocking in tests.""" return datetime.datetime.now(tz=datetime.timezone.utc).replace(tzinfo=None) def to_milliseconds(value): """Convert a zone-aware datetime to milliseconds since the unix epoch. Args: value (datetime.datetime): The datetime to covert. Returns: int: Milliseconds since the unix epoch. """ micros = to_microseconds(value) return micros // 1000 def from_microseconds(value): """Convert timestamp in microseconds since the unix epoch to datetime. Args: value (float): The timestamp to convert, in microseconds. Returns: datetime.datetime: The datetime object equivalent to the timestamp in UTC. """ return _UTC_EPOCH + datetime.timedelta(microseconds=value) def to_microseconds(value): """Convert a datetime to microseconds since the unix epoch. Args: value (datetime.datetime): The datetime to covert. Returns: int: Microseconds since the unix epoch. """ if not value.tzinfo: value = value.replace(tzinfo=datetime.timezone.utc) # Regardless of what timezone is on the value, convert it to UTC. value = value.astimezone(datetime.timezone.utc) # Convert the datetime to a microsecond timestamp. return int(calendar.timegm(value.timetuple()) * 1e6) + value.microsecond def from_iso8601_date(value): """Convert a ISO8601 date string to a date. Args: value (str): The ISO8601 date string. Returns: datetime.date: A date equivalent to the date string. """ return datetime.datetime.strptime(value, "%Y-%m-%d").date() def from_iso8601_time(value): """Convert a zoneless ISO8601 time string to a time. Args: value (str): The ISO8601 time string. Returns: datetime.time: A time equivalent to the time string. """ return datetime.datetime.strptime(value, "%H:%M:%S").time() def from_rfc3339(value): """Convert an RFC3339-format timestamp to a native datetime. Supported formats include those without fractional seconds, or with any fraction up to nanosecond precision. .. note:: Python datetimes do not support nanosecond precision; this function therefore truncates such values to microseconds. Args: value (str): The RFC3339 string to convert. Returns: datetime.datetime: The datetime object equivalent to the timestamp in UTC. Raises: ValueError: If the timestamp does not match the RFC3339 regular expression. """ with_nanos = _RFC3339_NANOS.match(value) if with_nanos is None: raise ValueError( "Timestamp: {!r}, does not match pattern: {!r}".format( value, _RFC3339_NANOS.pattern ) ) bare_seconds = datetime.datetime.strptime( with_nanos.group("no_fraction"), _RFC3339_NO_FRACTION ) fraction = with_nanos.group("nanos") if fraction is None: micros = 0 else: scale = 9 - len(fraction) nanos = int(fraction) * (10**scale) micros = nanos // 1000 return bare_seconds.replace(microsecond=micros, tzinfo=datetime.timezone.utc) from_rfc3339_nanos = from_rfc3339 # from_rfc3339_nanos method was deprecated. def to_rfc3339(value, ignore_zone=True): """Convert a datetime to an RFC3339 timestamp string. Args: value (datetime.datetime): The datetime object to be converted to a string. ignore_zone (bool): If True, then the timezone (if any) of the datetime object is ignored and the datetime is treated as UTC. Returns: str: The RFC3339 formatted string representing the datetime. """ if not ignore_zone and value.tzinfo is not None: # Convert to UTC and remove the time zone info. value = value.replace(tzinfo=None) - value.utcoffset() return value.strftime(_RFC3339_MICROS) class DatetimeWithNanoseconds(datetime.datetime): """Track nanosecond in addition to normal datetime attrs. Nanosecond can be passed only as a keyword argument. """ __slots__ = ("_nanosecond",) # pylint: disable=arguments-differ def __new__(cls, *args, **kw): nanos = kw.pop("nanosecond", 0) if nanos > 0: if "microsecond" in kw: raise TypeError("Specify only one of 'microsecond' or 'nanosecond'") kw["microsecond"] = nanos // 1000 inst = datetime.datetime.__new__(cls, *args, **kw) inst._nanosecond = nanos or 0 return inst # pylint: disable=arguments-differ @property def nanosecond(self): """Read-only: nanosecond precision.""" return self._nanosecond def rfc3339(self): """Return an RFC3339-compliant timestamp. Returns: (str): Timestamp string according to RFC3339 spec. """ if self._nanosecond == 0: return to_rfc3339(self) nanos = str(self._nanosecond).rjust(9, "0").rstrip("0") return "{}.{}Z".format(self.strftime(_RFC3339_NO_FRACTION), nanos) @classmethod def from_rfc3339(cls, stamp): """Parse RFC3339-compliant timestamp, preserving nanoseconds. Args: stamp (str): RFC3339 stamp, with up to nanosecond precision Returns: :class:`DatetimeWithNanoseconds`: an instance matching the timestamp string Raises: ValueError: if `stamp` does not match the expected format """ with_nanos = _RFC3339_NANOS.match(stamp) if with_nanos is None: raise ValueError( "Timestamp: {}, does not match pattern: {}".format( stamp, _RFC3339_NANOS.pattern ) ) bare = datetime.datetime.strptime( with_nanos.group("no_fraction"), _RFC3339_NO_FRACTION ) fraction = with_nanos.group("nanos") if fraction is None: nanos = 0 else: scale = 9 - len(fraction) nanos = int(fraction) * (10**scale) return cls( bare.year, bare.month, bare.day, bare.hour, bare.minute, bare.second, nanosecond=nanos, tzinfo=datetime.timezone.utc, ) def timestamp_pb(self): """Return a timestamp message. Returns: (:class:`~google.protobuf.timestamp_pb2.Timestamp`): Timestamp message """ inst = ( self if self.tzinfo is not None else self.replace(tzinfo=datetime.timezone.utc) ) delta = inst - _UTC_EPOCH seconds = int(delta.total_seconds()) nanos = self._nanosecond or self.microsecond * 1000 return timestamp_pb2.Timestamp(seconds=seconds, nanos=nanos) @classmethod def from_timestamp_pb(cls, stamp): """Parse RFC3339-compliant timestamp, preserving nanoseconds. Args: stamp (:class:`~google.protobuf.timestamp_pb2.Timestamp`): timestamp message Returns: :class:`DatetimeWithNanoseconds`: an instance matching the timestamp message """ microseconds = int(stamp.seconds * 1e6) bare = from_microseconds(microseconds) return cls( bare.year, bare.month, bare.day, bare.hour, bare.minute, bare.second, nanosecond=stamp.nanos, tzinfo=datetime.timezone.utc, ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/exceptions.py0000664000000000017530000004557614614221643020244 0ustar00root# Copyright 2014 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Exceptions raised by Google API core & clients. This module provides base classes for all errors raised by libraries based on :mod:`google.api_core`, including both HTTP and gRPC clients. """ from __future__ import absolute_import from __future__ import unicode_literals import http.client from typing import Dict from typing import Union import warnings from google.rpc import error_details_pb2 try: import grpc try: from grpc_status import rpc_status except ImportError: # pragma: NO COVER warnings.warn( "Please install grpcio-status to obtain helpful grpc error messages.", ImportWarning, ) rpc_status = None except ImportError: # pragma: NO COVER grpc = None # Lookup tables for mapping exceptions from HTTP and gRPC transports. # Populated by _GoogleAPICallErrorMeta _HTTP_CODE_TO_EXCEPTION: Dict[int, Exception] = {} _GRPC_CODE_TO_EXCEPTION: Dict[int, Exception] = {} # Additional lookup table to map integer status codes to grpc status code # grpc does not currently support initializing enums from ints # i.e., grpc.StatusCode(5) raises an error _INT_TO_GRPC_CODE = {} if grpc is not None: # pragma: no branch for x in grpc.StatusCode: _INT_TO_GRPC_CODE[x.value[0]] = x class GoogleAPIError(Exception): """Base class for all exceptions raised by Google API Clients.""" pass class DuplicateCredentialArgs(GoogleAPIError): """Raised when multiple credentials are passed.""" pass class RetryError(GoogleAPIError): """Raised when a function has exhausted all of its available retries. Args: message (str): The exception message. cause (Exception): The last exception raised when retrying the function. """ def __init__(self, message, cause): super(RetryError, self).__init__(message) self.message = message self._cause = cause @property def cause(self): """The last exception raised when retrying the function.""" return self._cause def __str__(self): return "{}, last exception: {}".format(self.message, self.cause) class _GoogleAPICallErrorMeta(type): """Metaclass for registering GoogleAPICallError subclasses.""" def __new__(mcs, name, bases, class_dict): cls = type.__new__(mcs, name, bases, class_dict) if cls.code is not None: _HTTP_CODE_TO_EXCEPTION.setdefault(cls.code, cls) if cls.grpc_status_code is not None: _GRPC_CODE_TO_EXCEPTION.setdefault(cls.grpc_status_code, cls) return cls class GoogleAPICallError(GoogleAPIError, metaclass=_GoogleAPICallErrorMeta): """Base class for exceptions raised by calling API methods. Args: message (str): The exception message. errors (Sequence[Any]): An optional list of error details. details (Sequence[Any]): An optional list of objects defined in google.rpc.error_details. response (Union[requests.Request, grpc.Call]): The response or gRPC call metadata. error_info (Union[error_details_pb2.ErrorInfo, None]): An optional object containing error info (google.rpc.error_details.ErrorInfo). """ code: Union[int, None] = None """Optional[int]: The HTTP status code associated with this error. This may be ``None`` if the exception does not have a direct mapping to an HTTP error. See http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html """ grpc_status_code = None """Optional[grpc.StatusCode]: The gRPC status code associated with this error. This may be ``None`` if the exception does not match up to a gRPC error. """ def __init__(self, message, errors=(), details=(), response=None, error_info=None): super(GoogleAPICallError, self).__init__(message) self.message = message """str: The exception message.""" self._errors = errors self._details = details self._response = response self._error_info = error_info def __str__(self): error_msg = "{} {}".format(self.code, self.message) if self.details: error_msg = "{} {}".format(error_msg, self.details) # Note: This else condition can be removed once proposal A from # b/284179390 is implemented. else: if self.errors: errors = [ f"{error.code}: {error.message}" for error in self.errors if hasattr(error, "code") and hasattr(error, "message") ] if errors: error_msg = "{} {}".format(error_msg, "\n".join(errors)) return error_msg @property def reason(self): """The reason of the error. Reference: https://github.com/googleapis/googleapis/blob/master/google/rpc/error_details.proto#L112 Returns: Union[str, None]: An optional string containing reason of the error. """ return self._error_info.reason if self._error_info else None @property def domain(self): """The logical grouping to which the "reason" belongs. Reference: https://github.com/googleapis/googleapis/blob/master/google/rpc/error_details.proto#L112 Returns: Union[str, None]: An optional string containing a logical grouping to which the "reason" belongs. """ return self._error_info.domain if self._error_info else None @property def metadata(self): """Additional structured details about this error. Reference: https://github.com/googleapis/googleapis/blob/master/google/rpc/error_details.proto#L112 Returns: Union[Dict[str, str], None]: An optional object containing structured details about the error. """ return self._error_info.metadata if self._error_info else None @property def errors(self): """Detailed error information. Returns: Sequence[Any]: A list of additional error details. """ return list(self._errors) @property def details(self): """Information contained in google.rpc.status.details. Reference: https://github.com/googleapis/googleapis/blob/master/google/rpc/status.proto https://github.com/googleapis/googleapis/blob/master/google/rpc/error_details.proto Returns: Sequence[Any]: A list of structured objects from error_details.proto """ return list(self._details) @property def response(self): """Optional[Union[requests.Request, grpc.Call]]: The response or gRPC call metadata.""" return self._response class Redirection(GoogleAPICallError): """Base class for for all redirection (HTTP 3xx) responses.""" class MovedPermanently(Redirection): """Exception mapping a ``301 Moved Permanently`` response.""" code = http.client.MOVED_PERMANENTLY class NotModified(Redirection): """Exception mapping a ``304 Not Modified`` response.""" code = http.client.NOT_MODIFIED class TemporaryRedirect(Redirection): """Exception mapping a ``307 Temporary Redirect`` response.""" code = http.client.TEMPORARY_REDIRECT class ResumeIncomplete(Redirection): """Exception mapping a ``308 Resume Incomplete`` response. .. note:: :attr:`http.client.PERMANENT_REDIRECT` is ``308``, but Google APIs differ in their use of this status code. """ code = 308 class ClientError(GoogleAPICallError): """Base class for all client error (HTTP 4xx) responses.""" class BadRequest(ClientError): """Exception mapping a ``400 Bad Request`` response.""" code = http.client.BAD_REQUEST class InvalidArgument(BadRequest): """Exception mapping a :attr:`grpc.StatusCode.INVALID_ARGUMENT` error.""" grpc_status_code = grpc.StatusCode.INVALID_ARGUMENT if grpc is not None else None class FailedPrecondition(BadRequest): """Exception mapping a :attr:`grpc.StatusCode.FAILED_PRECONDITION` error.""" grpc_status_code = grpc.StatusCode.FAILED_PRECONDITION if grpc is not None else None class OutOfRange(BadRequest): """Exception mapping a :attr:`grpc.StatusCode.OUT_OF_RANGE` error.""" grpc_status_code = grpc.StatusCode.OUT_OF_RANGE if grpc is not None else None class Unauthorized(ClientError): """Exception mapping a ``401 Unauthorized`` response.""" code = http.client.UNAUTHORIZED class Unauthenticated(Unauthorized): """Exception mapping a :attr:`grpc.StatusCode.UNAUTHENTICATED` error.""" grpc_status_code = grpc.StatusCode.UNAUTHENTICATED if grpc is not None else None class Forbidden(ClientError): """Exception mapping a ``403 Forbidden`` response.""" code = http.client.FORBIDDEN class PermissionDenied(Forbidden): """Exception mapping a :attr:`grpc.StatusCode.PERMISSION_DENIED` error.""" grpc_status_code = grpc.StatusCode.PERMISSION_DENIED if grpc is not None else None class NotFound(ClientError): """Exception mapping a ``404 Not Found`` response or a :attr:`grpc.StatusCode.NOT_FOUND` error.""" code = http.client.NOT_FOUND grpc_status_code = grpc.StatusCode.NOT_FOUND if grpc is not None else None class MethodNotAllowed(ClientError): """Exception mapping a ``405 Method Not Allowed`` response.""" code = http.client.METHOD_NOT_ALLOWED class Conflict(ClientError): """Exception mapping a ``409 Conflict`` response.""" code = http.client.CONFLICT class AlreadyExists(Conflict): """Exception mapping a :attr:`grpc.StatusCode.ALREADY_EXISTS` error.""" grpc_status_code = grpc.StatusCode.ALREADY_EXISTS if grpc is not None else None class Aborted(Conflict): """Exception mapping a :attr:`grpc.StatusCode.ABORTED` error.""" grpc_status_code = grpc.StatusCode.ABORTED if grpc is not None else None class LengthRequired(ClientError): """Exception mapping a ``411 Length Required`` response.""" code = http.client.LENGTH_REQUIRED class PreconditionFailed(ClientError): """Exception mapping a ``412 Precondition Failed`` response.""" code = http.client.PRECONDITION_FAILED class RequestRangeNotSatisfiable(ClientError): """Exception mapping a ``416 Request Range Not Satisfiable`` response.""" code = http.client.REQUESTED_RANGE_NOT_SATISFIABLE class TooManyRequests(ClientError): """Exception mapping a ``429 Too Many Requests`` response.""" code = http.client.TOO_MANY_REQUESTS class ResourceExhausted(TooManyRequests): """Exception mapping a :attr:`grpc.StatusCode.RESOURCE_EXHAUSTED` error.""" grpc_status_code = grpc.StatusCode.RESOURCE_EXHAUSTED if grpc is not None else None class Cancelled(ClientError): """Exception mapping a :attr:`grpc.StatusCode.CANCELLED` error.""" # This maps to HTTP status code 499. See # https://github.com/googleapis/googleapis/blob/master/google/rpc/code.proto code = 499 grpc_status_code = grpc.StatusCode.CANCELLED if grpc is not None else None class ServerError(GoogleAPICallError): """Base for 5xx responses.""" class InternalServerError(ServerError): """Exception mapping a ``500 Internal Server Error`` response. or a :attr:`grpc.StatusCode.INTERNAL` error.""" code = http.client.INTERNAL_SERVER_ERROR grpc_status_code = grpc.StatusCode.INTERNAL if grpc is not None else None class Unknown(ServerError): """Exception mapping a :attr:`grpc.StatusCode.UNKNOWN` error.""" grpc_status_code = grpc.StatusCode.UNKNOWN if grpc is not None else None class DataLoss(ServerError): """Exception mapping a :attr:`grpc.StatusCode.DATA_LOSS` error.""" grpc_status_code = grpc.StatusCode.DATA_LOSS if grpc is not None else None class MethodNotImplemented(ServerError): """Exception mapping a ``501 Not Implemented`` response or a :attr:`grpc.StatusCode.UNIMPLEMENTED` error.""" code = http.client.NOT_IMPLEMENTED grpc_status_code = grpc.StatusCode.UNIMPLEMENTED if grpc is not None else None class BadGateway(ServerError): """Exception mapping a ``502 Bad Gateway`` response.""" code = http.client.BAD_GATEWAY class ServiceUnavailable(ServerError): """Exception mapping a ``503 Service Unavailable`` response or a :attr:`grpc.StatusCode.UNAVAILABLE` error.""" code = http.client.SERVICE_UNAVAILABLE grpc_status_code = grpc.StatusCode.UNAVAILABLE if grpc is not None else None class GatewayTimeout(ServerError): """Exception mapping a ``504 Gateway Timeout`` response.""" code = http.client.GATEWAY_TIMEOUT class DeadlineExceeded(GatewayTimeout): """Exception mapping a :attr:`grpc.StatusCode.DEADLINE_EXCEEDED` error.""" grpc_status_code = grpc.StatusCode.DEADLINE_EXCEEDED if grpc is not None else None def exception_class_for_http_status(status_code): """Return the exception class for a specific HTTP status code. Args: status_code (int): The HTTP status code. Returns: :func:`type`: the appropriate subclass of :class:`GoogleAPICallError`. """ return _HTTP_CODE_TO_EXCEPTION.get(status_code, GoogleAPICallError) def from_http_status(status_code, message, **kwargs): """Create a :class:`GoogleAPICallError` from an HTTP status code. Args: status_code (int): The HTTP status code. message (str): The exception message. kwargs: Additional arguments passed to the :class:`GoogleAPICallError` constructor. Returns: GoogleAPICallError: An instance of the appropriate subclass of :class:`GoogleAPICallError`. """ error_class = exception_class_for_http_status(status_code) error = error_class(message, **kwargs) if error.code is None: error.code = status_code return error def from_http_response(response): """Create a :class:`GoogleAPICallError` from a :class:`requests.Response`. Args: response (requests.Response): The HTTP response. Returns: GoogleAPICallError: An instance of the appropriate subclass of :class:`GoogleAPICallError`, with the message and errors populated from the response. """ try: payload = response.json() except ValueError: payload = {"error": {"message": response.text or "unknown error"}} error_message = payload.get("error", {}).get("message", "unknown error") errors = payload.get("error", {}).get("errors", ()) # In JSON, details are already formatted in developer-friendly way. details = payload.get("error", {}).get("details", ()) error_info = list( filter( lambda detail: detail.get("@type", "") == "type.googleapis.com/google.rpc.ErrorInfo", details, ) ) error_info = error_info[0] if error_info else None message = "{method} {url}: {error}".format( method=response.request.method, url=response.request.url, error=error_message, ) exception = from_http_status( response.status_code, message, errors=errors, details=details, response=response, error_info=error_info, ) return exception def exception_class_for_grpc_status(status_code): """Return the exception class for a specific :class:`grpc.StatusCode`. Args: status_code (grpc.StatusCode): The gRPC status code. Returns: :func:`type`: the appropriate subclass of :class:`GoogleAPICallError`. """ return _GRPC_CODE_TO_EXCEPTION.get(status_code, GoogleAPICallError) def from_grpc_status(status_code, message, **kwargs): """Create a :class:`GoogleAPICallError` from a :class:`grpc.StatusCode`. Args: status_code (Union[grpc.StatusCode, int]): The gRPC status code. message (str): The exception message. kwargs: Additional arguments passed to the :class:`GoogleAPICallError` constructor. Returns: GoogleAPICallError: An instance of the appropriate subclass of :class:`GoogleAPICallError`. """ if isinstance(status_code, int): status_code = _INT_TO_GRPC_CODE.get(status_code, status_code) error_class = exception_class_for_grpc_status(status_code) error = error_class(message, **kwargs) if error.grpc_status_code is None: error.grpc_status_code = status_code return error def _is_informative_grpc_error(rpc_exc): return hasattr(rpc_exc, "code") and hasattr(rpc_exc, "details") def _parse_grpc_error_details(rpc_exc): try: status = rpc_status.from_call(rpc_exc) except NotImplementedError: # workaround return [], None if not status: return [], None possible_errors = [ error_details_pb2.BadRequest, error_details_pb2.PreconditionFailure, error_details_pb2.QuotaFailure, error_details_pb2.ErrorInfo, error_details_pb2.RetryInfo, error_details_pb2.ResourceInfo, error_details_pb2.RequestInfo, error_details_pb2.DebugInfo, error_details_pb2.Help, error_details_pb2.LocalizedMessage, ] error_info = None error_details = [] for detail in status.details: matched_detail_cls = list( filter(lambda x: detail.Is(x.DESCRIPTOR), possible_errors) ) # If nothing matched, use detail directly. if len(matched_detail_cls) == 0: info = detail else: info = matched_detail_cls[0]() detail.Unpack(info) error_details.append(info) if isinstance(info, error_details_pb2.ErrorInfo): error_info = info return error_details, error_info def from_grpc_error(rpc_exc): """Create a :class:`GoogleAPICallError` from a :class:`grpc.RpcError`. Args: rpc_exc (grpc.RpcError): The gRPC error. Returns: GoogleAPICallError: An instance of the appropriate subclass of :class:`GoogleAPICallError`. """ # NOTE(lidiz) All gRPC error shares the parent class grpc.RpcError. # However, check for grpc.RpcError breaks backward compatibility. if ( grpc is not None and isinstance(rpc_exc, grpc.Call) ) or _is_informative_grpc_error(rpc_exc): details, err_info = _parse_grpc_error_details(rpc_exc) return from_grpc_status( rpc_exc.code(), rpc_exc.details(), errors=(rpc_exc,), details=details, response=rpc_exc, error_info=err_info, ) else: return GoogleAPICallError(str(rpc_exc), errors=(rpc_exc,), response=rpc_exc) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/extended_operation.py0000664000000000017530000002067014614221643021727 0ustar00root# Copyright 2022 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Futures for extended long-running operations returned from Google Cloud APIs. These futures can be used to synchronously wait for the result of a long-running operations using :meth:`ExtendedOperation.result`: .. code-block:: python extended_operation = my_api_client.long_running_method() extended_operation.result() Or asynchronously using callbacks and :meth:`Operation.add_done_callback`: .. code-block:: python extended_operation = my_api_client.long_running_method() def my_callback(ex_op): print(f"Operation {ex_op.name} completed") extended_operation.add_done_callback(my_callback) """ import threading from google.api_core import exceptions from google.api_core.future import polling class ExtendedOperation(polling.PollingFuture): """An ExtendedOperation future for interacting with a Google API Long-Running Operation. Args: extended_operation (proto.Message): The initial operation. refresh (Callable[[], type(extended_operation)]): A callable that returns the latest state of the operation. cancel (Callable[[], None]): A callable that tries to cancel the operation. polling Optional(google.api_core.retry.Retry): The configuration used for polling. This can be used to control how often :meth:`done` is polled. If the ``timeout`` argument to :meth:`result` is specified it will override the ``polling.timeout`` property. retry Optional(google.api_core.retry.Retry): DEPRECATED use ``polling`` instead. If specified it will override ``polling`` parameter to maintain backward compatibility. Note: Most long-running API methods use google.api_core.operation.Operation This class is a wrapper for a subset of methods that use alternative Long-Running Operation (LRO) semantics. Note: there is not a concrete type the extended operation must be. It MUST have fields that correspond to the following, POSSIBLY WITH DIFFERENT NAMES: * name: str * status: Union[str, bool, enum.Enum] * error_code: int * error_message: str """ def __init__( self, extended_operation, refresh, cancel, polling=polling.DEFAULT_POLLING, **kwargs, ): super().__init__(polling=polling, **kwargs) self._extended_operation = extended_operation self._refresh = refresh self._cancel = cancel # Note: the extended operation does not give a good way to indicate cancellation. # We make do with manually tracking cancellation and checking for doneness. self._cancelled = False self._completion_lock = threading.Lock() # Invoke in case the operation came back already complete. self._handle_refreshed_operation() # Note: the following four properties MUST be overridden in a subclass # if, and only if, the fields in the corresponding extended operation message # have different names. # # E.g. we have an extended operation class that looks like # # class MyOperation(proto.Message): # moniker = proto.Field(proto.STRING, number=1) # status_msg = proto.Field(proto.STRING, number=2) # optional http_error_code = proto.Field(proto.INT32, number=3) # optional http_error_msg = proto.Field(proto.STRING, number=4) # # the ExtendedOperation subclass would provide property overrides that map # to these (poorly named) fields. @property def name(self): return self._extended_operation.name @property def status(self): return self._extended_operation.status @property def error_code(self): return self._extended_operation.error_code @property def error_message(self): return self._extended_operation.error_message def __getattr__(self, name): return getattr(self._extended_operation, name) def done(self, retry=None): self._refresh_and_update(retry) return self._extended_operation.done def cancel(self): if self.done(): return False self._cancel() self._cancelled = True return True def cancelled(self): # TODO(dovs): there is not currently a good way to determine whether the # operation has been cancelled. # The best we can do is manually keep track of cancellation # and check for doneness. if not self._cancelled: return False self._refresh_and_update() return self._extended_operation.done def _refresh_and_update(self, retry=None): if not self._extended_operation.done: self._extended_operation = ( self._refresh(retry=retry) if retry else self._refresh() ) self._handle_refreshed_operation() def _handle_refreshed_operation(self): with self._completion_lock: if not self._extended_operation.done: return if self.error_code and self.error_message: # Note: `errors` can be removed once proposal A from # b/284179390 is implemented. errors = [] if hasattr(self, "error") and hasattr(self.error, "errors"): errors = self.error.errors exception = exceptions.from_http_status( status_code=self.error_code, message=self.error_message, response=self._extended_operation, errors=errors, ) self.set_exception(exception) elif self.error_code or self.error_message: exception = exceptions.GoogleAPICallError( f"Unexpected error {self.error_code}: {self.error_message}" ) self.set_exception(exception) else: # Extended operations have no payload. self.set_result(None) @classmethod def make(cls, refresh, cancel, extended_operation, **kwargs): """ Return an instantiated ExtendedOperation (or child) that wraps * a refresh callable * a cancel callable (can be a no-op) * an initial result .. note:: It is the caller's responsibility to set up refresh and cancel with their correct request argument. The reason for this is that the services that use Extended Operations have rpcs that look something like the following: // service.proto service MyLongService { rpc StartLongTask(StartLongTaskRequest) returns (ExtendedOperation) { option (google.cloud.operation_service) = "CustomOperationService"; } } service CustomOperationService { rpc Get(GetOperationRequest) returns (ExtendedOperation) { option (google.cloud.operation_polling_method) = true; } } Any info needed for the poll, e.g. a name, path params, etc. is held in the request, which the initial client method is in a much better position to make made because the caller made the initial request. TL;DR: the caller sets up closures for refresh and cancel that carry the properly configured requests. Args: refresh (Callable[Optional[Retry]][type(extended_operation)]): A callable that returns the latest state of the operation. cancel (Callable[][Any]): A callable that tries to cancel the operation on a best effort basis. extended_operation (Any): The initial response of the long running method. See the docstring for ExtendedOperation.__init__ for requirements on the type and fields of extended_operation """ return cls(extended_operation, refresh, cancel, **kwargs) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1714496531.9819117 google-api-core-2.19.0/google/api_core/future/0002755000000000017530000000000014614222024016774 5ustar00root././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/future/__init__.py0000664000000000017530000000127614614221643021121 0ustar00root# Copyright 2017, Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Futures for dealing with asynchronous operations.""" from google.api_core.future.base import Future __all__ = ["Future"] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/future/_helpers.py0000664000000000017530000000234014614221643021154 0ustar00root# Copyright 2017, Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Private helpers for futures.""" import logging import threading _LOGGER = logging.getLogger(__name__) def start_daemon_thread(*args, **kwargs): """Starts a thread and marks it as a daemon thread.""" thread = threading.Thread(*args, **kwargs) thread.daemon = True thread.start() return thread def safe_invoke_callback(callback, *args, **kwargs): """Invoke a callback, swallowing and logging any exceptions.""" # pylint: disable=bare-except # We intentionally want to swallow all exceptions. try: return callback(*args, **kwargs) except Exception: _LOGGER.exception("Error while executing Future callback.") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/future/async_future.py0000664000000000017530000001235314614221643022067 0ustar00root# Copyright 2020, Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """AsyncIO implementation of the abstract base Future class.""" import asyncio from google.api_core import exceptions from google.api_core import retry from google.api_core import retry_async from google.api_core.future import base class _OperationNotComplete(Exception): """Private exception used for polling via retry.""" pass RETRY_PREDICATE = retry.if_exception_type( _OperationNotComplete, exceptions.TooManyRequests, exceptions.InternalServerError, exceptions.BadGateway, ) DEFAULT_RETRY = retry_async.AsyncRetry(predicate=RETRY_PREDICATE) class AsyncFuture(base.Future): """A Future that polls peer service to self-update. The :meth:`done` method should be implemented by subclasses. The polling behavior will repeatedly call ``done`` until it returns True. .. note:: Privacy here is intended to prevent the final class from overexposing, not to prevent subclasses from accessing methods. Args: retry (google.api_core.retry.Retry): The retry configuration used when polling. This can be used to control how often :meth:`done` is polled. Regardless of the retry's ``deadline``, it will be overridden by the ``timeout`` argument to :meth:`result`. """ def __init__(self, retry=DEFAULT_RETRY): super().__init__() self._retry = retry self._future = asyncio.get_event_loop().create_future() self._background_task = None async def done(self, retry=DEFAULT_RETRY): """Checks to see if the operation is complete. Args: retry (google.api_core.retry.Retry): (Optional) How to retry the RPC. Returns: bool: True if the operation is complete, False otherwise. """ # pylint: disable=redundant-returns-doc, missing-raises-doc raise NotImplementedError() async def _done_or_raise(self): """Check if the future is done and raise if it's not.""" result = await self.done() if not result: raise _OperationNotComplete() async def running(self): """True if the operation is currently running.""" result = await self.done() return not result async def _blocking_poll(self, timeout=None): """Poll and await for the Future to be resolved. Args: timeout (int): How long (in seconds) to wait for the operation to complete. If None, wait indefinitely. """ if self._future.done(): return retry_ = self._retry.with_timeout(timeout) try: await retry_(self._done_or_raise)() except exceptions.RetryError: raise asyncio.TimeoutError( "Operation did not complete within the designated " "timeout." ) async def result(self, timeout=None): """Get the result of the operation. Args: timeout (int): How long (in seconds) to wait for the operation to complete. If None, wait indefinitely. Returns: google.protobuf.Message: The Operation's result. Raises: google.api_core.GoogleAPICallError: If the operation errors or if the timeout is reached before the operation completes. """ await self._blocking_poll(timeout=timeout) return self._future.result() async def exception(self, timeout=None): """Get the exception from the operation. Args: timeout (int): How long to wait for the operation to complete. If None, wait indefinitely. Returns: Optional[google.api_core.GoogleAPICallError]: The operation's error. """ await self._blocking_poll(timeout=timeout) return self._future.exception() def add_done_callback(self, fn): """Add a callback to be executed when the operation is complete. If the operation is completed, the callback will be scheduled onto the event loop. Otherwise, the callback will be stored and invoked when the future is done. Args: fn (Callable[Future]): The callback to execute when the operation is complete. """ if self._background_task is None: self._background_task = asyncio.get_event_loop().create_task( self._blocking_poll() ) self._future.add_done_callback(fn) def set_result(self, result): """Set the Future's result.""" self._future.set_result(result) def set_exception(self, exception): """Set the Future's exception.""" self._future.set_exception(exception) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/future/base.py0000664000000000017530000000334314614221643020271 0ustar00root# Copyright 2017, Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Abstract and helper bases for Future implementations.""" import abc class Future(object, metaclass=abc.ABCMeta): # pylint: disable=missing-docstring # We inherit the interfaces here from concurrent.futures. """Future interface. This interface is based on :class:`concurrent.futures.Future`. """ @abc.abstractmethod def cancel(self): raise NotImplementedError() @abc.abstractmethod def cancelled(self): raise NotImplementedError() @abc.abstractmethod def running(self): raise NotImplementedError() @abc.abstractmethod def done(self): raise NotImplementedError() @abc.abstractmethod def result(self, timeout=None): raise NotImplementedError() @abc.abstractmethod def exception(self, timeout=None): raise NotImplementedError() @abc.abstractmethod def add_done_callback(self, fn): # pylint: disable=invalid-name raise NotImplementedError() @abc.abstractmethod def set_result(self, result): raise NotImplementedError() @abc.abstractmethod def set_exception(self, exception): raise NotImplementedError() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/future/polling.py0000664000000000017530000003401514614221643021023 0ustar00root# Copyright 2017, Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Abstract and helper bases for Future implementations.""" import abc import concurrent.futures from google.api_core import exceptions from google.api_core import retry as retries from google.api_core.future import _helpers from google.api_core.future import base class _OperationNotComplete(Exception): """Private exception used for polling via retry.""" pass # DEPRECATED as it conflates RPC retry and polling concepts into one. # Use POLLING_PREDICATE instead to configure polling. RETRY_PREDICATE = retries.if_exception_type( _OperationNotComplete, exceptions.TooManyRequests, exceptions.InternalServerError, exceptions.BadGateway, exceptions.ServiceUnavailable, ) # DEPRECATED: use DEFAULT_POLLING to configure LRO polling logic. Construct # Retry object using its default values as a baseline for any custom retry logic # (not to be confused with polling logic). DEFAULT_RETRY = retries.Retry(predicate=RETRY_PREDICATE) # POLLING_PREDICATE is supposed to poll only on _OperationNotComplete. # Any RPC-specific errors (like ServiceUnavailable) will be handled # by retry logic (not to be confused with polling logic) which is triggered for # every polling RPC independently of polling logic but within its context. POLLING_PREDICATE = retries.if_exception_type( _OperationNotComplete, ) # Default polling configuration DEFAULT_POLLING = retries.Retry( predicate=POLLING_PREDICATE, initial=1.0, # seconds maximum=20.0, # seconds multiplier=1.5, timeout=900, # seconds ) class PollingFuture(base.Future): """A Future that needs to poll some service to check its status. The :meth:`done` method should be implemented by subclasses. The polling behavior will repeatedly call ``done`` until it returns True. The actual polling logic is encapsulated in :meth:`result` method. See documentation for that method for details on how polling works. .. note:: Privacy here is intended to prevent the final class from overexposing, not to prevent subclasses from accessing methods. Args: polling (google.api_core.retry.Retry): The configuration used for polling. This parameter controls how often :meth:`done` is polled. If the ``timeout`` argument is specified in :meth:`result` method it will override the ``polling.timeout`` property. retry (google.api_core.retry.Retry): DEPRECATED use ``polling`` instead. If set, it will override ``polling`` parameter for backward compatibility. """ _DEFAULT_VALUE = object() def __init__(self, polling=DEFAULT_POLLING, **kwargs): super(PollingFuture, self).__init__() self._polling = kwargs.get("retry", polling) self._result = None self._exception = None self._result_set = False """bool: Set to True when the result has been set via set_result or set_exception.""" self._polling_thread = None self._done_callbacks = [] @abc.abstractmethod def done(self, retry=None): """Checks to see if the operation is complete. Args: retry (google.api_core.retry.Retry): (Optional) How to retry the polling RPC (to not be confused with polling configuration. See the documentation for :meth:`result` for details). Returns: bool: True if the operation is complete, False otherwise. """ # pylint: disable=redundant-returns-doc, missing-raises-doc raise NotImplementedError() def _done_or_raise(self, retry=None): """Check if the future is done and raise if it's not.""" if not self.done(retry=retry): raise _OperationNotComplete() def running(self): """True if the operation is currently running.""" return not self.done() def _blocking_poll(self, timeout=_DEFAULT_VALUE, retry=None, polling=None): """Poll and wait for the Future to be resolved.""" if self._result_set: return polling = polling or self._polling if timeout is not PollingFuture._DEFAULT_VALUE: polling = polling.with_timeout(timeout) try: polling(self._done_or_raise)(retry=retry) except exceptions.RetryError: raise concurrent.futures.TimeoutError( f"Operation did not complete within the designated timeout of " f"{polling.timeout} seconds." ) def result(self, timeout=_DEFAULT_VALUE, retry=None, polling=None): """Get the result of the operation. This method will poll for operation status periodically, blocking if necessary. If you just want to make sure that this method does not block for more than X seconds and you do not care about the nitty-gritty of how this method operates, just call it with ``result(timeout=X)``. The other parameters are for advanced use only. Every call to this method is controlled by the following three parameters, each of which has a specific, distinct role, even though all three may look very similar: ``timeout``, ``retry`` and ``polling``. In most cases users do not need to specify any custom values for any of these parameters and may simply rely on default ones instead. If you choose to specify custom parameters, please make sure you've read the documentation below carefully. First, please check :class:`google.api_core.retry.Retry` class documentation for the proper definition of timeout and deadline terms and for the definition the three different types of timeouts. This class operates in terms of Retry Timeout and Polling Timeout. It does not let customizing RPC timeout and the user is expected to rely on default behavior for it. The roles of each argument of this method are as follows: ``timeout`` (int): (Optional) The Polling Timeout as defined in :class:`google.api_core.retry.Retry`. If the operation does not complete within this timeout an exception will be thrown. This parameter affects neither Retry Timeout nor RPC Timeout. ``retry`` (google.api_core.retry.Retry): (Optional) How to retry the polling RPC. The ``retry.timeout`` property of this parameter is the Retry Timeout as defined in :class:`google.api_core.retry.Retry`. This parameter defines ONLY how the polling RPC call is retried (i.e. what to do if the RPC we used for polling returned an error). It does NOT define how the polling is done (i.e. how frequently and for how long to call the polling RPC); use the ``polling`` parameter for that. If a polling RPC throws and error and retrying it fails, the whole future fails with the corresponding exception. If you want to tune which server response error codes are not fatal for operation polling, use this parameter to control that (``retry.predicate`` in particular). ``polling`` (google.api_core.retry.Retry): (Optional) How often and for how long to call the polling RPC periodically (i.e. what to do if a polling rpc returned successfully but its returned result indicates that the long running operation is not completed yet, so we need to check it again at some point in future). This parameter does NOT define how to retry each individual polling RPC in case of an error; use the ``retry`` parameter for that. The ``polling.timeout`` of this parameter is Polling Timeout as defined in as defined in :class:`google.api_core.retry.Retry`. For each of the arguments, there are also default values in place, which will be used if a user does not specify their own. The default values for the three parameters are not to be confused with the default values for the corresponding arguments in this method (those serve as "not set" markers for the resolution logic). If ``timeout`` is provided (i.e.``timeout is not _DEFAULT VALUE``; note the ``None`` value means "infinite timeout"), it will be used to control the actual Polling Timeout. Otherwise, the ``polling.timeout`` value will be used instead (see below for how the ``polling`` config itself gets resolved). In other words, this parameter effectively overrides the ``polling.timeout`` value if specified. This is so to preserve backward compatibility. If ``retry`` is provided (i.e. ``retry is not None``) it will be used to control retry behavior for the polling RPC and the ``retry.timeout`` will determine the Retry Timeout. If not provided, the polling RPC will be called with whichever default retry config was specified for the polling RPC at the moment of the construction of the polling RPC's client. For example, if the polling RPC is ``operations_client.get_operation()``, the ``retry`` parameter will be controlling its retry behavior (not polling behavior) and, if not specified, that specific method (``operations_client.get_operation()``) will be retried according to the default retry config provided during creation of ``operations_client`` client instead. This argument exists mainly for backward compatibility; users are very unlikely to ever need to set this parameter explicitly. If ``polling`` is provided (i.e. ``polling is not None``), it will be used to control the overall polling behavior and ``polling.timeout`` will control Polling Timeout unless it is overridden by ``timeout`` parameter as described above. If not provided, the``polling`` parameter specified during construction of this future (the ``polling`` argument in the constructor) will be used instead. Note: since the ``timeout`` argument may override ``polling.timeout`` value, this parameter should be viewed as coupled with the ``timeout`` parameter as described above. Args: timeout (int): (Optional) How long (in seconds) to wait for the operation to complete. If None, wait indefinitely. retry (google.api_core.retry.Retry): (Optional) How to retry the polling RPC. This defines ONLY how the polling RPC call is retried (i.e. what to do if the RPC we used for polling returned an error). It does NOT define how the polling is done (i.e. how frequently and for how long to call the polling RPC). polling (google.api_core.retry.Retry): (Optional) How often and for how long to call polling RPC periodically. This parameter does NOT define how to retry each individual polling RPC call (use the ``retry`` parameter for that). Returns: google.protobuf.Message: The Operation's result. Raises: google.api_core.GoogleAPICallError: If the operation errors or if the timeout is reached before the operation completes. """ self._blocking_poll(timeout=timeout, retry=retry, polling=polling) if self._exception is not None: # pylint: disable=raising-bad-type # Pylint doesn't recognize that this is valid in this case. raise self._exception return self._result def exception(self, timeout=_DEFAULT_VALUE): """Get the exception from the operation, blocking if necessary. See the documentation for the :meth:`result` method for details on how this method operates, as both ``result`` and this method rely on the exact same polling logic. The only difference is that this method does not accept ``retry`` and ``polling`` arguments but relies on the default ones instead. Args: timeout (int): How long to wait for the operation to complete. If None, wait indefinitely. Returns: Optional[google.api_core.GoogleAPICallError]: The operation's error. """ self._blocking_poll(timeout=timeout) return self._exception def add_done_callback(self, fn): """Add a callback to be executed when the operation is complete. If the operation is not already complete, this will start a helper thread to poll for the status of the operation in the background. Args: fn (Callable[Future]): The callback to execute when the operation is complete. """ if self._result_set: _helpers.safe_invoke_callback(fn, self) return self._done_callbacks.append(fn) if self._polling_thread is None: # The polling thread will exit on its own as soon as the operation # is done. self._polling_thread = _helpers.start_daemon_thread( target=self._blocking_poll ) def _invoke_callbacks(self, *args, **kwargs): """Invoke all done callbacks.""" for callback in self._done_callbacks: _helpers.safe_invoke_callback(callback, *args, **kwargs) def set_result(self, result): """Set the Future's result.""" self._result = result self._result_set = True self._invoke_callbacks(self) def set_exception(self, exception): """Set the Future's exception.""" self._exception = exception self._result_set = True self._invoke_callbacks(self) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1714496531.9819117 google-api-core-2.19.0/google/api_core/gapic_v1/0002755000000000017530000000000014614222024017153 5ustar00root././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/gapic_v1/__init__.py0000664000000000017530000000173414614221643021277 0ustar00root# Copyright 2017 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. from google.api_core.gapic_v1 import client_info from google.api_core.gapic_v1 import config from google.api_core.gapic_v1 import config_async from google.api_core.gapic_v1 import method from google.api_core.gapic_v1 import method_async from google.api_core.gapic_v1 import routing_header __all__ = [ "client_info", "config", "config_async", "method", "method_async", "routing_header", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/gapic_v1/client_info.py0000664000000000017530000000424714614221643022033 0ustar00root# Copyright 2017 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Helpers for providing client information. Client information is used to send information about the calling client, such as the library and Python version, to API services. """ from google.api_core import client_info METRICS_METADATA_KEY = "x-goog-api-client" class ClientInfo(client_info.ClientInfo): """Client information used to generate a user-agent for API calls. This user-agent information is sent along with API calls to allow the receiving service to do analytics on which versions of Python and Google libraries are being used. Args: python_version (str): The Python interpreter version, for example, ``'3.9.6'``. grpc_version (Optional[str]): The gRPC library version. api_core_version (str): The google-api-core library version. gapic_version (Optional[str]): The version of gapic-generated client library, if the library was generated by gapic. client_library_version (Optional[str]): The version of the client library, generally used if the client library was not generated by gapic or if additional functionality was built on top of a gapic client library. user_agent (Optional[str]): Prefix to the user agent header. This is used to supply information such as application name or partner tool. Recommended format: ``application-or-tool-ID/major.minor.version``. """ def to_grpc_metadata(self): """Returns the gRPC metadata for this client info.""" return (METRICS_METADATA_KEY, self.to_user_agent()) DEFAULT_CLIENT_INFO = ClientInfo() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/gapic_v1/config.py0000664000000000017530000001423414614221643021004 0ustar00root# Copyright 2017 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Helpers for loading gapic configuration data. The Google API generator creates supplementary configuration for each RPC method to tell the client library how to deal with retries and timeouts. """ import collections import grpc from google.api_core import exceptions from google.api_core import retry from google.api_core import timeout _MILLIS_PER_SECOND = 1000.0 def _exception_class_for_grpc_status_name(name): """Returns the Google API exception class for a gRPC error code name. DEPRECATED: use ``exceptions.exception_class_for_grpc_status`` method directly instead. Args: name (str): The name of the gRPC status code, for example, ``UNAVAILABLE``. Returns: :func:`type`: The appropriate subclass of :class:`google.api_core.exceptions.GoogleAPICallError`. """ return exceptions.exception_class_for_grpc_status(getattr(grpc.StatusCode, name)) def _retry_from_retry_config(retry_params, retry_codes, retry_impl=retry.Retry): """Creates a Retry object given a gapic retry configuration. DEPRECATED: instantiate retry and timeout classes directly instead. Args: retry_params (dict): The retry parameter values, for example:: { "initial_retry_delay_millis": 1000, "retry_delay_multiplier": 2.5, "max_retry_delay_millis": 120000, "initial_rpc_timeout_millis": 120000, "rpc_timeout_multiplier": 1.0, "max_rpc_timeout_millis": 120000, "total_timeout_millis": 600000 } retry_codes (sequence[str]): The list of retryable gRPC error code names. Returns: google.api_core.retry.Retry: The default retry object for the method. """ exception_classes = [ _exception_class_for_grpc_status_name(code) for code in retry_codes ] return retry_impl( retry.if_exception_type(*exception_classes), initial=(retry_params["initial_retry_delay_millis"] / _MILLIS_PER_SECOND), maximum=(retry_params["max_retry_delay_millis"] / _MILLIS_PER_SECOND), multiplier=retry_params["retry_delay_multiplier"], deadline=retry_params["total_timeout_millis"] / _MILLIS_PER_SECOND, ) def _timeout_from_retry_config(retry_params): """Creates a ExponentialTimeout object given a gapic retry configuration. DEPRECATED: instantiate retry and timeout classes directly instead. Args: retry_params (dict): The retry parameter values, for example:: { "initial_retry_delay_millis": 1000, "retry_delay_multiplier": 2.5, "max_retry_delay_millis": 120000, "initial_rpc_timeout_millis": 120000, "rpc_timeout_multiplier": 1.0, "max_rpc_timeout_millis": 120000, "total_timeout_millis": 600000 } Returns: google.api_core.retry.ExponentialTimeout: The default time object for the method. """ return timeout.ExponentialTimeout( initial=(retry_params["initial_rpc_timeout_millis"] / _MILLIS_PER_SECOND), maximum=(retry_params["max_rpc_timeout_millis"] / _MILLIS_PER_SECOND), multiplier=retry_params["rpc_timeout_multiplier"], deadline=(retry_params["total_timeout_millis"] / _MILLIS_PER_SECOND), ) MethodConfig = collections.namedtuple("MethodConfig", ["retry", "timeout"]) def parse_method_configs(interface_config, retry_impl=retry.Retry): """Creates default retry and timeout objects for each method in a gapic interface config. DEPRECATED: instantiate retry and timeout classes directly instead. Args: interface_config (Mapping): The interface config section of the full gapic library config. For example, If the full configuration has an interface named ``google.example.v1.ExampleService`` you would pass in just that interface's configuration, for example ``gapic_config['interfaces']['google.example.v1.ExampleService']``. retry_impl (Callable): The constructor that creates a retry decorator that will be applied to the method based on method configs. Returns: Mapping[str, MethodConfig]: A mapping of RPC method names to their configuration. """ # Grab all the retry codes retry_codes_map = { name: retry_codes for name, retry_codes in interface_config.get("retry_codes", {}).items() } # Grab all of the retry params retry_params_map = { name: retry_params for name, retry_params in interface_config.get("retry_params", {}).items() } # Iterate through all the API methods and create a flat MethodConfig # instance for each one. method_configs = {} for method_name, method_params in interface_config.get("methods", {}).items(): retry_params_name = method_params.get("retry_params_name") if retry_params_name is not None: retry_params = retry_params_map[retry_params_name] retry_ = _retry_from_retry_config( retry_params, retry_codes_map[method_params["retry_codes_name"]], retry_impl, ) timeout_ = _timeout_from_retry_config(retry_params) # No retry config, so this is a non-retryable method. else: retry_ = None timeout_ = timeout.ConstantTimeout( method_params["timeout_millis"] / _MILLIS_PER_SECOND ) method_configs[method_name] = MethodConfig(retry=retry_, timeout=timeout_) return method_configs ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/gapic_v1/config_async.py0000664000000000017530000000330014614221643022171 0ustar00root# Copyright 2020 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """AsyncIO helpers for loading gapic configuration data. The Google API generator creates supplementary configuration for each RPC method to tell the client library how to deal with retries and timeouts. """ from google.api_core import retry_async from google.api_core.gapic_v1 import config from google.api_core.gapic_v1.config import MethodConfig # noqa: F401 def parse_method_configs(interface_config): """Creates default retry and timeout objects for each method in a gapic interface config with AsyncIO semantics. Args: interface_config (Mapping): The interface config section of the full gapic library config. For example, If the full configuration has an interface named ``google.example.v1.ExampleService`` you would pass in just that interface's configuration, for example ``gapic_config['interfaces']['google.example.v1.ExampleService']``. Returns: Mapping[str, MethodConfig]: A mapping of RPC method names to their configuration. """ return config.parse_method_configs( interface_config, retry_impl=retry_async.AsyncRetry ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/gapic_v1/method.py0000664000000000017530000002242614614221643021021 0ustar00root# Copyright 2017 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Helpers for wrapping low-level gRPC methods with common functionality. This is used by gapic clients to provide common error mapping, retry, timeout, compression, pagination, and long-running operations to gRPC methods. """ import enum import functools from google.api_core import grpc_helpers from google.api_core.gapic_v1 import client_info from google.api_core.timeout import TimeToDeadlineTimeout USE_DEFAULT_METADATA = object() class _MethodDefault(enum.Enum): # Uses enum so that pytype/mypy knows that this is the only possible value. # https://stackoverflow.com/a/60605919/101923 # # Literal[_DEFAULT_VALUE] is an alternative, but only added in Python 3.8. # https://docs.python.org/3/library/typing.html#typing.Literal _DEFAULT_VALUE = object() DEFAULT = _MethodDefault._DEFAULT_VALUE """Sentinel value indicating that a retry, timeout, or compression argument was unspecified, so the default should be used.""" def _is_not_none_or_false(value): return value is not None and value is not False def _apply_decorators(func, decorators): """Apply a list of decorators to a given function. ``decorators`` may contain items that are ``None`` or ``False`` which will be ignored. """ filtered_decorators = filter(_is_not_none_or_false, reversed(decorators)) for decorator in filtered_decorators: func = decorator(func) return func class _GapicCallable(object): """Callable that applies retry, timeout, and metadata logic. Args: target (Callable): The low-level RPC method. retry (google.api_core.retry.Retry): The default retry for the callable. If ``None``, this callable will not retry by default timeout (google.api_core.timeout.Timeout): The default timeout for the callable (i.e. duration of time within which an RPC must terminate after its start, not to be confused with deadline). If ``None``, this callable will not specify a timeout argument to the low-level RPC method. compression (grpc.Compression): The default compression for the callable. If ``None``, this callable will not specify a compression argument to the low-level RPC method. metadata (Sequence[Tuple[str, str]]): Additional metadata that is provided to the RPC method on every invocation. This is merged with any metadata specified during invocation. If ``None``, no additional metadata will be passed to the RPC method. """ def __init__( self, target, retry, timeout, compression, metadata=None, ): self._target = target self._retry = retry self._timeout = timeout self._compression = compression self._metadata = metadata def __call__( self, *args, timeout=DEFAULT, retry=DEFAULT, compression=DEFAULT, **kwargs ): """Invoke the low-level RPC with retry, timeout, compression, and metadata.""" if retry is DEFAULT: retry = self._retry if timeout is DEFAULT: timeout = self._timeout if compression is DEFAULT: compression = self._compression if isinstance(timeout, (int, float)): timeout = TimeToDeadlineTimeout(timeout=timeout) # Apply all applicable decorators. wrapped_func = _apply_decorators(self._target, [retry, timeout]) # Add the user agent metadata to the call. if self._metadata is not None: metadata = kwargs.get("metadata", []) # Due to the nature of invocation, None should be treated the same # as not specified. if metadata is None: metadata = [] metadata = list(metadata) metadata.extend(self._metadata) kwargs["metadata"] = metadata if self._compression is not None: kwargs["compression"] = compression return wrapped_func(*args, **kwargs) def wrap_method( func, default_retry=None, default_timeout=None, default_compression=None, client_info=client_info.DEFAULT_CLIENT_INFO, *, with_call=False, ): """Wrap an RPC method with common behavior. This applies common error wrapping, retry, timeout, and compression behavior to a function. The wrapped function will take optional ``retry``, ``timeout``, and ``compression`` arguments. For example:: import google.api_core.gapic_v1.method from google.api_core import retry from google.api_core import timeout from grpc import Compression # The original RPC method. def get_topic(name, timeout=None): request = publisher_v2.GetTopicRequest(name=name) return publisher_stub.GetTopic(request, timeout=timeout) default_retry = retry.Retry(deadline=60) default_timeout = timeout.Timeout(deadline=60) default_compression = Compression.NoCompression wrapped_get_topic = google.api_core.gapic_v1.method.wrap_method( get_topic, default_retry) # Execute get_topic with default retry and timeout: response = wrapped_get_topic() # Execute get_topic without doing any retying but with the default # timeout: response = wrapped_get_topic(retry=None) # Execute get_topic but only retry on 5xx errors: my_retry = retry.Retry(retry.if_exception_type( exceptions.InternalServerError)) response = wrapped_get_topic(retry=my_retry) The way this works is by late-wrapping the given function with the retry and timeout decorators. Essentially, when ``wrapped_get_topic()`` is called: * ``get_topic()`` is first wrapped with the ``timeout`` into ``get_topic_with_timeout``. * ``get_topic_with_timeout`` is wrapped with the ``retry`` into ``get_topic_with_timeout_and_retry()``. * The final ``get_topic_with_timeout_and_retry`` is called passing through the ``args`` and ``kwargs``. The callstack is therefore:: method.__call__() -> Retry.__call__() -> Timeout.__call__() -> wrap_errors() -> get_topic() Note that if ``timeout`` or ``retry`` is ``None``, then they are not applied to the function. For example, ``wrapped_get_topic(timeout=None, retry=None)`` is more or less equivalent to just calling ``get_topic`` but with error re-mapping. Args: func (Callable[Any]): The function to wrap. It should accept an optional ``timeout`` argument. If ``metadata`` is not ``None``, it should accept a ``metadata`` argument. default_retry (Optional[google.api_core.Retry]): The default retry strategy. If ``None``, the method will not retry by default. default_timeout (Optional[google.api_core.Timeout]): The default timeout strategy. Can also be specified as an int or float. If ``None``, the method will not have timeout specified by default. default_compression (Optional[grpc.Compression]): The default grpc.Compression. If ``None``, the method will not have compression specified by default. client_info (Optional[google.api_core.gapic_v1.client_info.ClientInfo]): Client information used to create a user-agent string that's passed as gRPC metadata to the method. If unspecified, then a sane default will be used. If ``None``, then no user agent metadata will be provided to the RPC method. with_call (bool): If True, wrapped grpc.UnaryUnaryMulticallables will return a tuple of (response, grpc.Call) instead of just the response. This is useful for extracting trailing metadata from unary calls. Defaults to False. Returns: Callable: A new callable that takes optional ``retry``, ``timeout``, and ``compression`` arguments and applies the common error mapping, retry, timeout, compression, and metadata behavior to the low-level RPC method. """ if with_call: try: func = func.with_call except AttributeError as exc: raise ValueError( "with_call=True is only supported for unary calls." ) from exc func = grpc_helpers.wrap_errors(func) if client_info is not None: user_agent_metadata = [client_info.to_grpc_metadata()] else: user_agent_metadata = None return functools.wraps(func)( _GapicCallable( func, default_retry, default_timeout, default_compression, metadata=user_agent_metadata, ) ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/gapic_v1/method_async.py0000664000000000017530000000364014614221643022213 0ustar00root# Copyright 2020 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """AsyncIO helpers for wrapping gRPC methods with common functionality. This is used by gapic clients to provide common error mapping, retry, timeout, compression, pagination, and long-running operations to gRPC methods. """ import functools from google.api_core import grpc_helpers_async from google.api_core.gapic_v1 import client_info from google.api_core.gapic_v1.method import _GapicCallable from google.api_core.gapic_v1.method import DEFAULT # noqa: F401 from google.api_core.gapic_v1.method import USE_DEFAULT_METADATA # noqa: F401 def wrap_method( func, default_retry=None, default_timeout=None, default_compression=None, client_info=client_info.DEFAULT_CLIENT_INFO, ): """Wrap an async RPC method with common behavior. Returns: Callable: A new callable that takes optional ``retry``, ``timeout``, and ``compression`` arguments and applies the common error mapping, retry, timeout, metadata, and compression behavior to the low-level RPC method. """ func = grpc_helpers_async.wrap_errors(func) metadata = [client_info.to_grpc_metadata()] if client_info is not None else None return functools.wraps(func)( _GapicCallable( func, default_retry, default_timeout, default_compression, metadata=metadata, ) ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/gapic_v1/routing_header.py0000664000000000017530000000602514614221643022535 0ustar00root# Copyright 2017 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Helpers for constructing routing headers. These headers are used by Google infrastructure to determine how to route requests, especially for services that are regional. Generally, these headers are specified as gRPC metadata. """ import functools from enum import Enum from urllib.parse import urlencode ROUTING_METADATA_KEY = "x-goog-request-params" # This is the value for the `maxsize` argument of @functools.lru_cache # https://docs.python.org/3/library/functools.html#functools.lru_cache # This represents the number of recent function calls to store. ROUTING_PARAM_CACHE_SIZE = 32 def to_routing_header(params, qualified_enums=True): """Returns a routing header string for the given request parameters. Args: params (Mapping[str, str | bytes | Enum]): A dictionary containing the request parameters used for routing. qualified_enums (bool): Whether to represent enum values as their type-qualified symbol names instead of as their unqualified symbol names. Returns: str: The routing header string. """ tuples = params.items() if isinstance(params, dict) else params if not qualified_enums: tuples = [(x[0], x[1].name) if isinstance(x[1], Enum) else x for x in tuples] return "&".join([_urlencode_param(*t) for t in tuples]) def to_grpc_metadata(params, qualified_enums=True): """Returns the gRPC metadata containing the routing headers for the given request parameters. Args: params (Mapping[str, str | bytes | Enum]): A dictionary containing the request parameters used for routing. qualified_enums (bool): Whether to represent enum values as their type-qualified symbol names instead of as their unqualified symbol names. Returns: Tuple(str, str): The gRPC metadata containing the routing header key and value. """ return (ROUTING_METADATA_KEY, to_routing_header(params, qualified_enums)) # use caching to avoid repeated computation @functools.lru_cache(maxsize=ROUTING_PARAM_CACHE_SIZE) def _urlencode_param(key, value): """Cacheable wrapper over urlencode Args: key (str): The key of the parameter to encode. value (str | bytes | Enum): The value of the parameter to encode. Returns: str: The encoded parameter. """ return urlencode( {key: value}, # Per Google API policy (go/api-url-encoding), / is not encoded. safe="/", ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/general_helpers.py0000664000000000017530000000125114614221643021200 0ustar00root# Copyright 2017 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # This import for backward compatibility only. from functools import wraps # noqa: F401 pragma: NO COVER ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/grpc_helpers.py0000664000000000017530000005537314614221643020534 0ustar00root# Copyright 2017 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Helpers for :mod:`grpc`.""" from typing import Generic, Iterator, Optional, TypeVar import collections import functools import warnings import grpc from google.api_core import exceptions import google.auth import google.auth.credentials import google.auth.transport.grpc import google.auth.transport.requests import google.protobuf PROTOBUF_VERSION = google.protobuf.__version__ # The grpcio-gcp package only has support for protobuf < 4 if PROTOBUF_VERSION[0:2] == "3.": # pragma: NO COVER try: import grpc_gcp warnings.warn( """Support for grpcio-gcp is deprecated. This feature will be removed from `google-api-core` after January 1, 2024. If you need to continue to use this feature, please pin to a specific version of `google-api-core`.""", DeprecationWarning, ) HAS_GRPC_GCP = True except ImportError: HAS_GRPC_GCP = False else: HAS_GRPC_GCP = False # The list of gRPC Callable interfaces that return iterators. _STREAM_WRAP_CLASSES = (grpc.UnaryStreamMultiCallable, grpc.StreamStreamMultiCallable) # denotes the proto response type for grpc calls P = TypeVar("P") def _patch_callable_name(callable_): """Fix-up gRPC callable attributes. gRPC callable lack the ``__name__`` attribute which causes :func:`functools.wraps` to error. This adds the attribute if needed. """ if not hasattr(callable_, "__name__"): callable_.__name__ = callable_.__class__.__name__ def _wrap_unary_errors(callable_): """Map errors for Unary-Unary and Stream-Unary gRPC callables.""" _patch_callable_name(callable_) @functools.wraps(callable_) def error_remapped_callable(*args, **kwargs): try: return callable_(*args, **kwargs) except grpc.RpcError as exc: raise exceptions.from_grpc_error(exc) from exc return error_remapped_callable class _StreamingResponseIterator(Generic[P], grpc.Call): def __init__(self, wrapped, prefetch_first_result=True): self._wrapped = wrapped # This iterator is used in a retry context, and returned outside after init. # gRPC will not throw an exception until the stream is consumed, so we need # to retrieve the first result, in order to fail, in order to trigger a retry. try: if prefetch_first_result: self._stored_first_result = next(self._wrapped) except TypeError: # It is possible the wrapped method isn't an iterable (a grpc.Call # for instance). If this happens don't store the first result. pass except StopIteration: # ignore stop iteration at this time. This should be handled outside of retry. pass def __iter__(self) -> Iterator[P]: """This iterator is also an iterable that returns itself.""" return self def __next__(self) -> P: """Get the next response from the stream. Returns: protobuf.Message: A single response from the stream. """ try: if hasattr(self, "_stored_first_result"): result = self._stored_first_result del self._stored_first_result return result return next(self._wrapped) except grpc.RpcError as exc: # If the stream has already returned data, we cannot recover here. raise exceptions.from_grpc_error(exc) from exc # grpc.Call & grpc.RpcContext interface def add_callback(self, callback): return self._wrapped.add_callback(callback) def cancel(self): return self._wrapped.cancel() def code(self): return self._wrapped.code() def details(self): return self._wrapped.details() def initial_metadata(self): return self._wrapped.initial_metadata() def is_active(self): return self._wrapped.is_active() def time_remaining(self): return self._wrapped.time_remaining() def trailing_metadata(self): return self._wrapped.trailing_metadata() # public type alias denoting the return type of streaming gapic calls GrpcStream = _StreamingResponseIterator[P] def _wrap_stream_errors(callable_): """Wrap errors for Unary-Stream and Stream-Stream gRPC callables. The callables that return iterators require a bit more logic to re-map errors when iterating. This wraps both the initial invocation and the iterator of the return value to re-map errors. """ _patch_callable_name(callable_) @functools.wraps(callable_) def error_remapped_callable(*args, **kwargs): try: result = callable_(*args, **kwargs) # Auto-fetching the first result causes PubSub client's streaming pull # to hang when re-opening the stream, thus we need examine the hacky # hidden flag to see if pre-fetching is disabled. # https://github.com/googleapis/python-pubsub/issues/93#issuecomment-630762257 prefetch_first = getattr(callable_, "_prefetch_first_result_", True) return _StreamingResponseIterator( result, prefetch_first_result=prefetch_first ) except grpc.RpcError as exc: raise exceptions.from_grpc_error(exc) from exc return error_remapped_callable def wrap_errors(callable_): """Wrap a gRPC callable and map :class:`grpc.RpcErrors` to friendly error classes. Errors raised by the gRPC callable are mapped to the appropriate :class:`google.api_core.exceptions.GoogleAPICallError` subclasses. The original `grpc.RpcError` (which is usually also a `grpc.Call`) is available from the ``response`` property on the mapped exception. This is useful for extracting metadata from the original error. Args: callable_ (Callable): A gRPC callable. Returns: Callable: The wrapped gRPC callable. """ if isinstance(callable_, _STREAM_WRAP_CLASSES): return _wrap_stream_errors(callable_) else: return _wrap_unary_errors(callable_) def _create_composite_credentials( credentials=None, credentials_file=None, default_scopes=None, scopes=None, ssl_credentials=None, quota_project_id=None, default_host=None, ): """Create the composite credentials for secure channels. Args: credentials (google.auth.credentials.Credentials): The credentials. If not specified, then this function will attempt to ascertain the credentials from the environment using :func:`google.auth.default`. credentials_file (str): A file with credentials that can be loaded with :func:`google.auth.load_credentials_from_file`. This argument is mutually exclusive with credentials. default_scopes (Sequence[str]): A optional list of scopes needed for this service. These are only used when credentials are not specified and are passed to :func:`google.auth.default`. scopes (Sequence[str]): A optional list of scopes needed for this service. These are only used when credentials are not specified and are passed to :func:`google.auth.default`. ssl_credentials (grpc.ChannelCredentials): Optional SSL channel credentials. This can be used to specify different certificates. quota_project_id (str): An optional project to use for billing and quota. default_host (str): The default endpoint. e.g., "pubsub.googleapis.com". Returns: grpc.ChannelCredentials: The composed channel credentials object. Raises: google.api_core.DuplicateCredentialArgs: If both a credentials object and credentials_file are passed. """ if credentials and credentials_file: raise exceptions.DuplicateCredentialArgs( "'credentials' and 'credentials_file' are mutually exclusive." ) if credentials_file: credentials, _ = google.auth.load_credentials_from_file( credentials_file, scopes=scopes, default_scopes=default_scopes ) elif credentials: credentials = google.auth.credentials.with_scopes_if_required( credentials, scopes=scopes, default_scopes=default_scopes ) else: credentials, _ = google.auth.default( scopes=scopes, default_scopes=default_scopes ) if quota_project_id and isinstance( credentials, google.auth.credentials.CredentialsWithQuotaProject ): credentials = credentials.with_quota_project(quota_project_id) request = google.auth.transport.requests.Request() # Create the metadata plugin for inserting the authorization header. metadata_plugin = google.auth.transport.grpc.AuthMetadataPlugin( credentials, request, default_host=default_host, ) # Create a set of grpc.CallCredentials using the metadata plugin. google_auth_credentials = grpc.metadata_call_credentials(metadata_plugin) # if `ssl_credentials` is set, use `grpc.composite_channel_credentials` instead of # `grpc.compute_engine_channel_credentials` as the former supports passing # `ssl_credentials` via `channel_credentials` which is needed for mTLS. if ssl_credentials: # Combine the ssl credentials and the authorization credentials. # See https://grpc.github.io/grpc/python/grpc.html#grpc.composite_channel_credentials return grpc.composite_channel_credentials( ssl_credentials, google_auth_credentials ) else: # Use grpc.compute_engine_channel_credentials in order to support Direct Path. # See https://grpc.github.io/grpc/python/grpc.html#grpc.compute_engine_channel_credentials # TODO(https://github.com/googleapis/python-api-core/issues/598): # Although `grpc.compute_engine_channel_credentials` returns channel credentials # outside of a Google Compute Engine environment (GCE), we should determine if # there is a way to reliably detect a GCE environment so that # `grpc.compute_engine_channel_credentials` is not called outside of GCE. return grpc.compute_engine_channel_credentials(google_auth_credentials) def create_channel( target, credentials=None, scopes=None, ssl_credentials=None, credentials_file=None, quota_project_id=None, default_scopes=None, default_host=None, compression=None, attempt_direct_path: Optional[bool] = False, **kwargs, ): """Create a secure channel with credentials. Args: target (str): The target service address in the format 'hostname:port'. credentials (google.auth.credentials.Credentials): The credentials. If not specified, then this function will attempt to ascertain the credentials from the environment using :func:`google.auth.default`. scopes (Sequence[str]): A optional list of scopes needed for this service. These are only used when credentials are not specified and are passed to :func:`google.auth.default`. ssl_credentials (grpc.ChannelCredentials): Optional SSL channel credentials. This can be used to specify different certificates. credentials_file (str): A file with credentials that can be loaded with :func:`google.auth.load_credentials_from_file`. This argument is mutually exclusive with credentials. quota_project_id (str): An optional project to use for billing and quota. default_scopes (Sequence[str]): Default scopes passed by a Google client library. Use 'scopes' for user-defined scopes. default_host (str): The default endpoint. e.g., "pubsub.googleapis.com". compression (grpc.Compression): An optional value indicating the compression method to be used over the lifetime of the channel. attempt_direct_path (Optional[bool]): If set, Direct Path will be attempted when the request is made. Direct Path is only available within a Google Compute Engine (GCE) environment and provides a proxyless connection which increases the available throughput, reduces latency, and increases reliability. Note: - This argument should only be set in a GCE environment and for Services that are known to support Direct Path. - If this argument is set outside of GCE, then this request will fail unless the back-end service happens to have configured fall-back to DNS. - If the request causes a `ServiceUnavailable` response, it is recommended that the client repeat the request with `attempt_direct_path` set to `False` as the Service may not support Direct Path. - Using `ssl_credentials` with `attempt_direct_path` set to `True` will result in `ValueError` as this combination is not yet supported. kwargs: Additional key-word args passed to :func:`grpc_gcp.secure_channel` or :func:`grpc.secure_channel`. Note: `grpc_gcp` is only supported in environments with protobuf < 4.0.0. Returns: grpc.Channel: The created channel. Raises: google.api_core.DuplicateCredentialArgs: If both a credentials object and credentials_file are passed. ValueError: If `ssl_credentials` is set and `attempt_direct_path` is set to `True`. """ # If `ssl_credentials` is set and `attempt_direct_path` is set to `True`, # raise ValueError as this is not yet supported. # See https://github.com/googleapis/python-api-core/issues/590 if ssl_credentials and attempt_direct_path: raise ValueError("Using ssl_credentials with Direct Path is not supported") composite_credentials = _create_composite_credentials( credentials=credentials, credentials_file=credentials_file, default_scopes=default_scopes, scopes=scopes, ssl_credentials=ssl_credentials, quota_project_id=quota_project_id, default_host=default_host, ) # Note that grpcio-gcp is deprecated if HAS_GRPC_GCP: # pragma: NO COVER if compression is not None and compression != grpc.Compression.NoCompression: warnings.warn( "The `compression` argument is ignored for grpc_gcp.secure_channel creation.", DeprecationWarning, ) if attempt_direct_path: warnings.warn( """The `attempt_direct_path` argument is ignored for grpc_gcp.secure_channel creation.""", DeprecationWarning, ) return grpc_gcp.secure_channel(target, composite_credentials, **kwargs) if attempt_direct_path: target = _modify_target_for_direct_path(target) return grpc.secure_channel( target, composite_credentials, compression=compression, **kwargs ) def _modify_target_for_direct_path(target: str) -> str: """ Given a target, return a modified version which is compatible with Direct Path. Args: target (str): The target service address in the format 'hostname[:port]' or 'dns://hostname[:port]'. Returns: target (str): The target service address which is converted into a format compatible with Direct Path. If the target contains `dns:///` or does not contain `:///`, the target will be converted in a format compatible with Direct Path; otherwise the original target will be returned as the original target may already denote Direct Path. """ # A DNS prefix may be included with the target to indicate the endpoint is living in the Internet, # outside of Google Cloud Platform. dns_prefix = "dns:///" # Remove "dns:///" if `attempt_direct_path` is set to True as # the Direct Path prefix `google-c2p:///` will be used instead. target = target.replace(dns_prefix, "") direct_path_separator = ":///" if direct_path_separator not in target: target_without_port = target.split(":")[0] # Modify the target to use Direct Path by adding the `google-c2p:///` prefix target = f"google-c2p{direct_path_separator}{target_without_port}" return target _MethodCall = collections.namedtuple( "_MethodCall", ("request", "timeout", "metadata", "credentials", "compression") ) _ChannelRequest = collections.namedtuple("_ChannelRequest", ("method", "request")) class _CallableStub(object): """Stub for the grpc.*MultiCallable interfaces.""" def __init__(self, method, channel): self._method = method self._channel = channel self.response = None """Union[protobuf.Message, Callable[protobuf.Message], exception]: The response to give when invoking this callable. If this is a callable, it will be invoked with the request protobuf. If it's an exception, the exception will be raised when this is invoked. """ self.responses = None """Iterator[ Union[protobuf.Message, Callable[protobuf.Message], exception]]: An iterator of responses. If specified, self.response will be populated on each invocation by calling ``next(self.responses)``.""" self.requests = [] """List[protobuf.Message]: All requests sent to this callable.""" self.calls = [] """List[Tuple]: All invocations of this callable. Each tuple is the request, timeout, metadata, compression, and credentials.""" def __call__( self, request, timeout=None, metadata=None, credentials=None, compression=None ): self._channel.requests.append(_ChannelRequest(self._method, request)) self.calls.append( _MethodCall(request, timeout, metadata, credentials, compression) ) self.requests.append(request) response = self.response if self.responses is not None: if response is None: response = next(self.responses) else: raise ValueError( "{method}.response and {method}.responses are mutually " "exclusive.".format(method=self._method) ) if callable(response): return response(request) if isinstance(response, Exception): raise response if response is not None: return response raise ValueError('Method stub for "{}" has no response.'.format(self._method)) def _simplify_method_name(method): """Simplifies a gRPC method name. When gRPC invokes the channel to create a callable, it gives a full method name like "/google.pubsub.v1.Publisher/CreateTopic". This returns just the name of the method, in this case "CreateTopic". Args: method (str): The name of the method. Returns: str: The simplified name of the method. """ return method.rsplit("/", 1).pop() class ChannelStub(grpc.Channel): """A testing stub for the grpc.Channel interface. This can be used to test any client that eventually uses a gRPC channel to communicate. By passing in a channel stub, you can configure which responses are returned and track which requests are made. For example: .. code-block:: python channel_stub = grpc_helpers.ChannelStub() client = FooClient(channel=channel_stub) channel_stub.GetFoo.response = foo_pb2.Foo(name='bar') foo = client.get_foo(labels=['baz']) assert foo.name == 'bar' assert channel_stub.GetFoo.requests[0].labels = ['baz'] Each method on the stub can be accessed and configured on the channel. Here's some examples of various configurations: .. code-block:: python # Return a basic response: channel_stub.GetFoo.response = foo_pb2.Foo(name='bar') assert client.get_foo().name == 'bar' # Raise an exception: channel_stub.GetFoo.response = NotFound('...') with pytest.raises(NotFound): client.get_foo() # Use a sequence of responses: channel_stub.GetFoo.responses = iter([ foo_pb2.Foo(name='bar'), foo_pb2.Foo(name='baz'), ]) assert client.get_foo().name == 'bar' assert client.get_foo().name == 'baz' # Use a callable def on_get_foo(request): return foo_pb2.Foo(name='bar' + request.id) channel_stub.GetFoo.response = on_get_foo assert client.get_foo(id='123').name == 'bar123' """ def __init__(self, responses=[]): self.requests = [] """Sequence[Tuple[str, protobuf.Message]]: A list of all requests made on this channel in order. The tuple is of method name, request message.""" self._method_stubs = {} def _stub_for_method(self, method): method = _simplify_method_name(method) self._method_stubs[method] = _CallableStub(method, self) return self._method_stubs[method] def __getattr__(self, key): try: return self._method_stubs[key] except KeyError: raise AttributeError def unary_unary( self, method, request_serializer=None, response_deserializer=None, _registered_method=False, ): """grpc.Channel.unary_unary implementation.""" return self._stub_for_method(method) def unary_stream( self, method, request_serializer=None, response_deserializer=None, _registered_method=False, ): """grpc.Channel.unary_stream implementation.""" return self._stub_for_method(method) def stream_unary( self, method, request_serializer=None, response_deserializer=None, _registered_method=False, ): """grpc.Channel.stream_unary implementation.""" return self._stub_for_method(method) def stream_stream( self, method, request_serializer=None, response_deserializer=None, _registered_method=False, ): """grpc.Channel.stream_stream implementation.""" return self._stub_for_method(method) def subscribe(self, callback, try_to_connect=False): """grpc.Channel.subscribe implementation.""" pass def unsubscribe(self, callback): """grpc.Channel.unsubscribe implementation.""" pass def close(self): """grpc.Channel.close implementation.""" pass ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/grpc_helpers_async.py0000664000000000017530000003007014614221643021714 0ustar00root# Copyright 2020 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """AsyncIO helpers for :mod:`grpc` supporting 3.7+. Please combine more detailed docstring in grpc_helpers.py to use following functions. This module is implementing the same surface with AsyncIO semantics. """ import asyncio import functools from typing import AsyncGenerator, Generic, Iterator, Optional, TypeVar import grpc from grpc import aio from google.api_core import exceptions, grpc_helpers # denotes the proto response type for grpc calls P = TypeVar("P") # NOTE(lidiz) Alternatively, we can hack "__getattribute__" to perform # automatic patching for us. But that means the overhead of creating an # extra Python function spreads to every single send and receive. class _WrappedCall(aio.Call): def __init__(self): self._call = None def with_call(self, call): """Supplies the call object separately to keep __init__ clean.""" self._call = call return self async def initial_metadata(self): return await self._call.initial_metadata() async def trailing_metadata(self): return await self._call.trailing_metadata() async def code(self): return await self._call.code() async def details(self): return await self._call.details() def cancelled(self): return self._call.cancelled() def done(self): return self._call.done() def time_remaining(self): return self._call.time_remaining() def cancel(self): return self._call.cancel() def add_done_callback(self, callback): self._call.add_done_callback(callback) async def wait_for_connection(self): try: await self._call.wait_for_connection() except grpc.RpcError as rpc_error: raise exceptions.from_grpc_error(rpc_error) from rpc_error class _WrappedUnaryResponseMixin(Generic[P], _WrappedCall): def __await__(self) -> Iterator[P]: try: response = yield from self._call.__await__() return response except grpc.RpcError as rpc_error: raise exceptions.from_grpc_error(rpc_error) from rpc_error class _WrappedStreamResponseMixin(Generic[P], _WrappedCall): def __init__(self): self._wrapped_async_generator = None async def read(self) -> P: try: return await self._call.read() except grpc.RpcError as rpc_error: raise exceptions.from_grpc_error(rpc_error) from rpc_error async def _wrapped_aiter(self) -> AsyncGenerator[P, None]: try: # NOTE(lidiz) coverage doesn't understand the exception raised from # __anext__ method. It is covered by test case: # test_wrap_stream_errors_aiter_non_rpc_error async for response in self._call: # pragma: no branch yield response except grpc.RpcError as rpc_error: raise exceptions.from_grpc_error(rpc_error) from rpc_error def __aiter__(self) -> AsyncGenerator[P, None]: if not self._wrapped_async_generator: self._wrapped_async_generator = self._wrapped_aiter() return self._wrapped_async_generator class _WrappedStreamRequestMixin(_WrappedCall): async def write(self, request): try: await self._call.write(request) except grpc.RpcError as rpc_error: raise exceptions.from_grpc_error(rpc_error) from rpc_error async def done_writing(self): try: await self._call.done_writing() except grpc.RpcError as rpc_error: raise exceptions.from_grpc_error(rpc_error) from rpc_error # NOTE(lidiz) Implementing each individual class separately, so we don't # expose any API that should not be seen. E.g., __aiter__ in unary-unary # RPC, or __await__ in stream-stream RPC. class _WrappedUnaryUnaryCall(_WrappedUnaryResponseMixin[P], aio.UnaryUnaryCall): """Wrapped UnaryUnaryCall to map exceptions.""" class _WrappedUnaryStreamCall(_WrappedStreamResponseMixin[P], aio.UnaryStreamCall): """Wrapped UnaryStreamCall to map exceptions.""" class _WrappedStreamUnaryCall( _WrappedUnaryResponseMixin[P], _WrappedStreamRequestMixin, aio.StreamUnaryCall ): """Wrapped StreamUnaryCall to map exceptions.""" class _WrappedStreamStreamCall( _WrappedStreamRequestMixin, _WrappedStreamResponseMixin[P], aio.StreamStreamCall ): """Wrapped StreamStreamCall to map exceptions.""" # public type alias denoting the return type of async streaming gapic calls GrpcAsyncStream = _WrappedStreamResponseMixin[P] # public type alias denoting the return type of unary gapic calls AwaitableGrpcCall = _WrappedUnaryResponseMixin[P] def _wrap_unary_errors(callable_): """Map errors for Unary-Unary async callables.""" grpc_helpers._patch_callable_name(callable_) @functools.wraps(callable_) def error_remapped_callable(*args, **kwargs): call = callable_(*args, **kwargs) return _WrappedUnaryUnaryCall().with_call(call) return error_remapped_callable def _wrap_stream_errors(callable_): """Map errors for streaming RPC async callables.""" grpc_helpers._patch_callable_name(callable_) @functools.wraps(callable_) async def error_remapped_callable(*args, **kwargs): call = callable_(*args, **kwargs) if isinstance(call, aio.UnaryStreamCall): call = _WrappedUnaryStreamCall().with_call(call) elif isinstance(call, aio.StreamUnaryCall): call = _WrappedStreamUnaryCall().with_call(call) elif isinstance(call, aio.StreamStreamCall): call = _WrappedStreamStreamCall().with_call(call) else: raise TypeError("Unexpected type of call %s" % type(call)) await call.wait_for_connection() return call return error_remapped_callable def wrap_errors(callable_): """Wrap a gRPC async callable and map :class:`grpc.RpcErrors` to friendly error classes. Errors raised by the gRPC callable are mapped to the appropriate :class:`google.api_core.exceptions.GoogleAPICallError` subclasses. The original `grpc.RpcError` (which is usually also a `grpc.Call`) is available from the ``response`` property on the mapped exception. This is useful for extracting metadata from the original error. Args: callable_ (Callable): A gRPC callable. Returns: Callable: The wrapped gRPC callable. """ if isinstance(callable_, aio.UnaryUnaryMultiCallable): return _wrap_unary_errors(callable_) else: return _wrap_stream_errors(callable_) def create_channel( target, credentials=None, scopes=None, ssl_credentials=None, credentials_file=None, quota_project_id=None, default_scopes=None, default_host=None, compression=None, attempt_direct_path: Optional[bool] = False, **kwargs ): """Create an AsyncIO secure channel with credentials. Args: target (str): The target service address in the format 'hostname:port'. credentials (google.auth.credentials.Credentials): The credentials. If not specified, then this function will attempt to ascertain the credentials from the environment using :func:`google.auth.default`. scopes (Sequence[str]): A optional list of scopes needed for this service. These are only used when credentials are not specified and are passed to :func:`google.auth.default`. ssl_credentials (grpc.ChannelCredentials): Optional SSL channel credentials. This can be used to specify different certificates. credentials_file (str): A file with credentials that can be loaded with :func:`google.auth.load_credentials_from_file`. This argument is mutually exclusive with credentials. quota_project_id (str): An optional project to use for billing and quota. default_scopes (Sequence[str]): Default scopes passed by a Google client library. Use 'scopes' for user-defined scopes. default_host (str): The default endpoint. e.g., "pubsub.googleapis.com". compression (grpc.Compression): An optional value indicating the compression method to be used over the lifetime of the channel. attempt_direct_path (Optional[bool]): If set, Direct Path will be attempted when the request is made. Direct Path is only available within a Google Compute Engine (GCE) environment and provides a proxyless connection which increases the available throughput, reduces latency, and increases reliability. Note: - This argument should only be set in a GCE environment and for Services that are known to support Direct Path. - If this argument is set outside of GCE, then this request will fail unless the back-end service happens to have configured fall-back to DNS. - If the request causes a `ServiceUnavailable` response, it is recommended that the client repeat the request with `attempt_direct_path` set to `False` as the Service may not support Direct Path. - Using `ssl_credentials` with `attempt_direct_path` set to `True` will result in `ValueError` as this combination is not yet supported. kwargs: Additional key-word args passed to :func:`aio.secure_channel`. Returns: aio.Channel: The created channel. Raises: google.api_core.DuplicateCredentialArgs: If both a credentials object and credentials_file are passed. ValueError: If `ssl_credentials` is set and `attempt_direct_path` is set to `True`. """ # If `ssl_credentials` is set and `attempt_direct_path` is set to `True`, # raise ValueError as this is not yet supported. # See https://github.com/googleapis/python-api-core/issues/590 if ssl_credentials and attempt_direct_path: raise ValueError("Using ssl_credentials with Direct Path is not supported") composite_credentials = grpc_helpers._create_composite_credentials( credentials=credentials, credentials_file=credentials_file, scopes=scopes, default_scopes=default_scopes, ssl_credentials=ssl_credentials, quota_project_id=quota_project_id, default_host=default_host, ) if attempt_direct_path: target = grpc_helpers._modify_target_for_direct_path(target) return aio.secure_channel( target, composite_credentials, compression=compression, **kwargs ) class FakeUnaryUnaryCall(_WrappedUnaryUnaryCall): """Fake implementation for unary-unary RPCs. It is a dummy object for response message. Supply the intended response upon the initialization, and the coroutine will return the exact response message. """ def __init__(self, response=object()): self.response = response self._future = asyncio.get_event_loop().create_future() self._future.set_result(self.response) def __await__(self): response = yield from self._future.__await__() return response class FakeStreamUnaryCall(_WrappedStreamUnaryCall): """Fake implementation for stream-unary RPCs. It is a dummy object for response message. Supply the intended response upon the initialization, and the coroutine will return the exact response message. """ def __init__(self, response=object()): self.response = response self._future = asyncio.get_event_loop().create_future() self._future.set_result(self.response) def __await__(self): response = yield from self._future.__await__() return response async def wait_for_connection(self): pass ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/iam.py0000664000000000017530000003163514614221643016620 0ustar00root# Copyright 2017 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Non-API-specific IAM policy definitions For allowed roles / permissions, see: https://cloud.google.com/iam/docs/understanding-roles Example usage: .. code-block:: python # ``get_iam_policy`` returns a :class:'~google.api_core.iam.Policy`. policy = resource.get_iam_policy(requested_policy_version=3) phred = "user:phred@example.com" admin_group = "group:admins@groups.example.com" account = "serviceAccount:account-1234@accounts.example.com" policy.version = 3 policy.bindings = [ { "role": "roles/owner", "members": {phred, admin_group, account} }, { "role": "roles/editor", "members": {"allAuthenticatedUsers"} }, { "role": "roles/viewer", "members": {"allUsers"} "condition": { "title": "request_time", "description": "Requests made before 2021-01-01T00:00:00Z", "expression": "request.time < timestamp(\"2021-01-01T00:00:00Z\")" } } ] resource.set_iam_policy(policy) """ import collections import collections.abc import operator import warnings # Generic IAM roles OWNER_ROLE = "roles/owner" """Generic role implying all rights to an object.""" EDITOR_ROLE = "roles/editor" """Generic role implying rights to modify an object.""" VIEWER_ROLE = "roles/viewer" """Generic role implying rights to access an object.""" _ASSIGNMENT_DEPRECATED_MSG = """\ Assigning to '{}' is deprecated. Use the `policy.bindings` property to modify bindings instead.""" _DICT_ACCESS_MSG = """\ Dict access is not supported on policies with version > 1 or with conditional bindings.""" class InvalidOperationException(Exception): """Raised when trying to use Policy class as a dict.""" pass class Policy(collections.abc.MutableMapping): """IAM Policy Args: etag (Optional[str]): ETag used to identify a unique of the policy version (Optional[int]): The syntax schema version of the policy. Note: Using conditions in bindings requires the policy's version to be set to `3` or greater, depending on the versions that are currently supported. Accessing the policy using dict operations will raise InvalidOperationException when the policy's version is set to 3. Use the policy.bindings getter/setter to retrieve and modify the policy's bindings. See: IAM Policy https://cloud.google.com/iam/reference/rest/v1/Policy Policy versions https://cloud.google.com/iam/docs/policies#versions Conditions overview https://cloud.google.com/iam/docs/conditions-overview. """ _OWNER_ROLES = (OWNER_ROLE,) """Roles mapped onto our ``owners`` attribute.""" _EDITOR_ROLES = (EDITOR_ROLE,) """Roles mapped onto our ``editors`` attribute.""" _VIEWER_ROLES = (VIEWER_ROLE,) """Roles mapped onto our ``viewers`` attribute.""" def __init__(self, etag=None, version=None): self.etag = etag self.version = version self._bindings = [] def __iter__(self): self.__check_version__() # Exclude bindings with no members return (binding["role"] for binding in self._bindings if binding["members"]) def __len__(self): self.__check_version__() # Exclude bindings with no members return len(list(self.__iter__())) def __getitem__(self, key): self.__check_version__() for b in self._bindings: if b["role"] == key: return b["members"] # If the binding does not yet exist, create one # NOTE: This will create bindings with no members # which are ignored by __iter__ and __len__ new_binding = {"role": key, "members": set()} self._bindings.append(new_binding) return new_binding["members"] def __setitem__(self, key, value): self.__check_version__() value = set(value) for binding in self._bindings: if binding["role"] == key: binding["members"] = value return self._bindings.append({"role": key, "members": value}) def __delitem__(self, key): self.__check_version__() for b in self._bindings: if b["role"] == key: self._bindings.remove(b) return raise KeyError(key) def __check_version__(self): """Raise InvalidOperationException if version is greater than 1 or policy contains conditions.""" raise_version = self.version is not None and self.version > 1 if raise_version or self._contains_conditions(): raise InvalidOperationException(_DICT_ACCESS_MSG) def _contains_conditions(self): for b in self._bindings: if b.get("condition") is not None: return True return False @property def bindings(self): """The policy's list of bindings. A binding is specified by a dictionary with keys: * role (str): Role that is assigned to `members`. * members (:obj:`set` of str): Specifies the identities associated to this binding. * condition (:obj:`dict` of str:str): Specifies a condition under which this binding will apply. * title (str): Title for the condition. * description (:obj:str, optional): Description of the condition. * expression: A CEL expression. Type: :obj:`list` of :obj:`dict` See: Policy versions https://cloud.google.com/iam/docs/policies#versions Conditions overview https://cloud.google.com/iam/docs/conditions-overview. Example: .. code-block:: python USER = "user:phred@example.com" ADMIN_GROUP = "group:admins@groups.example.com" SERVICE_ACCOUNT = "serviceAccount:account-1234@accounts.example.com" CONDITION = { "title": "request_time", "description": "Requests made before 2021-01-01T00:00:00Z", # Optional "expression": "request.time < timestamp(\"2021-01-01T00:00:00Z\")" } # Set policy's version to 3 before setting bindings containing conditions. policy.version = 3 policy.bindings = [ { "role": "roles/viewer", "members": {USER, ADMIN_GROUP, SERVICE_ACCOUNT}, "condition": CONDITION }, ... ] """ return self._bindings @bindings.setter def bindings(self, bindings): self._bindings = bindings @property def owners(self): """Legacy access to owner role. Raise InvalidOperationException if version is greater than 1 or policy contains conditions. DEPRECATED: use `policy.bindings` to access bindings instead. """ result = set() for role in self._OWNER_ROLES: for member in self.get(role, ()): result.add(member) return frozenset(result) @owners.setter def owners(self, value): """Update owners. Raise InvalidOperationException if version is greater than 1 or policy contains conditions. DEPRECATED: use `policy.bindings` to access bindings instead. """ warnings.warn( _ASSIGNMENT_DEPRECATED_MSG.format("owners", OWNER_ROLE), DeprecationWarning ) self[OWNER_ROLE] = value @property def editors(self): """Legacy access to editor role. Raise InvalidOperationException if version is greater than 1 or policy contains conditions. DEPRECATED: use `policy.bindings` to access bindings instead. """ result = set() for role in self._EDITOR_ROLES: for member in self.get(role, ()): result.add(member) return frozenset(result) @editors.setter def editors(self, value): """Update editors. Raise InvalidOperationException if version is greater than 1 or policy contains conditions. DEPRECATED: use `policy.bindings` to modify bindings instead. """ warnings.warn( _ASSIGNMENT_DEPRECATED_MSG.format("editors", EDITOR_ROLE), DeprecationWarning, ) self[EDITOR_ROLE] = value @property def viewers(self): """Legacy access to viewer role. Raise InvalidOperationException if version is greater than 1 or policy contains conditions. DEPRECATED: use `policy.bindings` to modify bindings instead. """ result = set() for role in self._VIEWER_ROLES: for member in self.get(role, ()): result.add(member) return frozenset(result) @viewers.setter def viewers(self, value): """Update viewers. Raise InvalidOperationException if version is greater than 1 or policy contains conditions. DEPRECATED: use `policy.bindings` to modify bindings instead. """ warnings.warn( _ASSIGNMENT_DEPRECATED_MSG.format("viewers", VIEWER_ROLE), DeprecationWarning, ) self[VIEWER_ROLE] = value @staticmethod def user(email): """Factory method for a user member. Args: email (str): E-mail for this particular user. Returns: str: A member string corresponding to the given user. """ return "user:%s" % (email,) @staticmethod def service_account(email): """Factory method for a service account member. Args: email (str): E-mail for this particular service account. Returns: str: A member string corresponding to the given service account. """ return "serviceAccount:%s" % (email,) @staticmethod def group(email): """Factory method for a group member. Args: email (str): An id or e-mail for this particular group. Returns: str: A member string corresponding to the given group. """ return "group:%s" % (email,) @staticmethod def domain(domain): """Factory method for a domain member. Args: domain (str): The domain for this member. Returns: str: A member string corresponding to the given domain. """ return "domain:%s" % (domain,) @staticmethod def all_users(): """Factory method for a member representing all users. Returns: str: A member string representing all users. """ return "allUsers" @staticmethod def authenticated_users(): """Factory method for a member representing all authenticated users. Returns: str: A member string representing all authenticated users. """ return "allAuthenticatedUsers" @classmethod def from_api_repr(cls, resource): """Factory: create a policy from a JSON resource. Args: resource (dict): policy resource returned by ``getIamPolicy`` API. Returns: :class:`Policy`: the parsed policy """ version = resource.get("version") etag = resource.get("etag") policy = cls(etag, version) policy.bindings = resource.get("bindings", []) for binding in policy.bindings: binding["members"] = set(binding.get("members", ())) return policy def to_api_repr(self): """Render a JSON policy resource. Returns: dict: a resource to be passed to the ``setIamPolicy`` API. """ resource = {} if self.etag is not None: resource["etag"] = self.etag if self.version is not None: resource["version"] = self.version if self._bindings and len(self._bindings) > 0: bindings = [] for binding in self._bindings: members = binding.get("members") if members: new_binding = {"role": binding["role"], "members": sorted(members)} condition = binding.get("condition") if condition: new_binding["condition"] = condition bindings.append(new_binding) if bindings: # Sort bindings by role key = operator.itemgetter("role") resource["bindings"] = sorted(bindings, key=key) return resource ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/operation.py0000664000000000017530000003161614614221643020051 0ustar00root# Copyright 2016 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Futures for long-running operations returned from Google Cloud APIs. These futures can be used to synchronously wait for the result of a long-running operation using :meth:`Operation.result`: .. code-block:: python operation = my_api_client.long_running_method() result = operation.result() Or asynchronously using callbacks and :meth:`Operation.add_done_callback`: .. code-block:: python operation = my_api_client.long_running_method() def my_callback(future): result = future.result() operation.add_done_callback(my_callback) """ import functools import threading from google.api_core import exceptions from google.api_core import protobuf_helpers from google.api_core.future import polling from google.longrunning import operations_pb2 from google.protobuf import json_format from google.rpc import code_pb2 class Operation(polling.PollingFuture): """A Future for interacting with a Google API Long-Running Operation. Args: operation (google.longrunning.operations_pb2.Operation): The initial operation. refresh (Callable[[], ~.api_core.operation.Operation]): A callable that returns the latest state of the operation. cancel (Callable[[], None]): A callable that tries to cancel the operation. result_type (func:`type`): The protobuf type for the operation's result. metadata_type (func:`type`): The protobuf type for the operation's metadata. polling (google.api_core.retry.Retry): The configuration used for polling. This parameter controls how often :meth:`done` is polled. If the ``timeout`` argument is specified in the :meth:`result` method, it will override the ``polling.timeout`` property. retry (google.api_core.retry.Retry): DEPRECATED: use ``polling`` instead. If specified it will override ``polling`` parameter to maintain backward compatibility. """ def __init__( self, operation, refresh, cancel, result_type, metadata_type=None, polling=polling.DEFAULT_POLLING, **kwargs ): super(Operation, self).__init__(polling=polling, **kwargs) self._operation = operation self._refresh = refresh self._cancel = cancel self._result_type = result_type self._metadata_type = metadata_type self._completion_lock = threading.Lock() # Invoke this in case the operation came back already complete. self._set_result_from_operation() @property def operation(self): """google.longrunning.Operation: The current long-running operation.""" return self._operation @property def metadata(self): """google.protobuf.Message: the current operation metadata.""" if not self._operation.HasField("metadata"): return None return protobuf_helpers.from_any_pb( self._metadata_type, self._operation.metadata ) @classmethod def deserialize(self, payload): """Deserialize a ``google.longrunning.Operation`` protocol buffer. Args: payload (bytes): A serialized operation protocol buffer. Returns: ~.operations_pb2.Operation: An Operation protobuf object. """ return operations_pb2.Operation.FromString(payload) def _set_result_from_operation(self): """Set the result or exception from the operation if it is complete.""" # This must be done in a lock to prevent the polling thread # and main thread from both executing the completion logic # at the same time. with self._completion_lock: # If the operation isn't complete or if the result has already been # set, do not call set_result/set_exception again. # Note: self._result_set is set to True in set_result and # set_exception, in case those methods are invoked directly. if not self._operation.done or self._result_set: return if self._operation.HasField("response"): response = protobuf_helpers.from_any_pb( self._result_type, self._operation.response ) self.set_result(response) elif self._operation.HasField("error"): exception = exceptions.from_grpc_status( status_code=self._operation.error.code, message=self._operation.error.message, errors=(self._operation.error,), response=self._operation, ) self.set_exception(exception) else: exception = exceptions.GoogleAPICallError( "Unexpected state: Long-running operation had neither " "response nor error set." ) self.set_exception(exception) def _refresh_and_update(self, retry=None): """Refresh the operation and update the result if needed. Args: retry (google.api_core.retry.Retry): (Optional) How to retry the RPC. """ # If the currently cached operation is done, no need to make another # RPC as it will not change once done. if not self._operation.done: self._operation = self._refresh(retry=retry) if retry else self._refresh() self._set_result_from_operation() def done(self, retry=None): """Checks to see if the operation is complete. Args: retry (google.api_core.retry.Retry): (Optional) How to retry the RPC. Returns: bool: True if the operation is complete, False otherwise. """ self._refresh_and_update(retry) return self._operation.done def cancel(self): """Attempt to cancel the operation. Returns: bool: True if the cancel RPC was made, False if the operation is already complete. """ if self.done(): return False self._cancel() return True def cancelled(self): """True if the operation was cancelled.""" self._refresh_and_update() return ( self._operation.HasField("error") and self._operation.error.code == code_pb2.CANCELLED ) def _refresh_http(api_request, operation_name, retry=None): """Refresh an operation using a JSON/HTTP client. Args: api_request (Callable): A callable used to make an API request. This should generally be :meth:`google.cloud._http.Connection.api_request`. operation_name (str): The name of the operation. retry (google.api_core.retry.Retry): (Optional) retry policy Returns: google.longrunning.operations_pb2.Operation: The operation. """ path = "operations/{}".format(operation_name) if retry is not None: api_request = retry(api_request) api_response = api_request(method="GET", path=path) return json_format.ParseDict(api_response, operations_pb2.Operation()) def _cancel_http(api_request, operation_name): """Cancel an operation using a JSON/HTTP client. Args: api_request (Callable): A callable used to make an API request. This should generally be :meth:`google.cloud._http.Connection.api_request`. operation_name (str): The name of the operation. """ path = "operations/{}:cancel".format(operation_name) api_request(method="POST", path=path) def from_http_json(operation, api_request, result_type, **kwargs): """Create an operation future using a HTTP/JSON client. This interacts with the long-running operations `service`_ (specific to a given API) via `HTTP/JSON`_. .. _HTTP/JSON: https://cloud.google.com/speech/reference/rest/\ v1beta1/operations#Operation Args: operation (dict): Operation as a dictionary. api_request (Callable): A callable used to make an API request. This should generally be :meth:`google.cloud._http.Connection.api_request`. result_type (:func:`type`): The protobuf result type. kwargs: Keyword args passed into the :class:`Operation` constructor. Returns: ~.api_core.operation.Operation: The operation future to track the given operation. """ operation_proto = json_format.ParseDict(operation, operations_pb2.Operation()) refresh = functools.partial(_refresh_http, api_request, operation_proto.name) cancel = functools.partial(_cancel_http, api_request, operation_proto.name) return Operation(operation_proto, refresh, cancel, result_type, **kwargs) def _refresh_grpc(operations_stub, operation_name, retry=None): """Refresh an operation using a gRPC client. Args: operations_stub (google.longrunning.operations_pb2.OperationsStub): The gRPC operations stub. operation_name (str): The name of the operation. retry (google.api_core.retry.Retry): (Optional) retry policy Returns: google.longrunning.operations_pb2.Operation: The operation. """ request_pb = operations_pb2.GetOperationRequest(name=operation_name) rpc = operations_stub.GetOperation if retry is not None: rpc = retry(rpc) return rpc(request_pb) def _cancel_grpc(operations_stub, operation_name): """Cancel an operation using a gRPC client. Args: operations_stub (google.longrunning.operations_pb2.OperationsStub): The gRPC operations stub. operation_name (str): The name of the operation. """ request_pb = operations_pb2.CancelOperationRequest(name=operation_name) operations_stub.CancelOperation(request_pb) def from_grpc(operation, operations_stub, result_type, grpc_metadata=None, **kwargs): """Create an operation future using a gRPC client. This interacts with the long-running operations `service`_ (specific to a given API) via gRPC. .. _service: https://github.com/googleapis/googleapis/blob/\ 050400df0fdb16f63b63e9dee53819044bffc857/\ google/longrunning/operations.proto#L38 Args: operation (google.longrunning.operations_pb2.Operation): The operation. operations_stub (google.longrunning.operations_pb2.OperationsStub): The operations stub. result_type (:func:`type`): The protobuf result type. grpc_metadata (Optional[List[Tuple[str, str]]]): Additional metadata to pass to the rpc. kwargs: Keyword args passed into the :class:`Operation` constructor. Returns: ~.api_core.operation.Operation: The operation future to track the given operation. """ refresh = functools.partial( _refresh_grpc, operations_stub, operation.name, metadata=grpc_metadata, ) cancel = functools.partial( _cancel_grpc, operations_stub, operation.name, metadata=grpc_metadata, ) return Operation(operation, refresh, cancel, result_type, **kwargs) def from_gapic(operation, operations_client, result_type, grpc_metadata=None, **kwargs): """Create an operation future from a gapic client. This interacts with the long-running operations `service`_ (specific to a given API) via a gapic client. .. _service: https://github.com/googleapis/googleapis/blob/\ 050400df0fdb16f63b63e9dee53819044bffc857/\ google/longrunning/operations.proto#L38 Args: operation (google.longrunning.operations_pb2.Operation): The operation. operations_client (google.api_core.operations_v1.OperationsClient): The operations client. result_type (:func:`type`): The protobuf result type. grpc_metadata (Optional[List[Tuple[str, str]]]): Additional metadata to pass to the rpc. kwargs: Keyword args passed into the :class:`Operation` constructor. Returns: ~.api_core.operation.Operation: The operation future to track the given operation. """ refresh = functools.partial( operations_client.get_operation, operation.name, metadata=grpc_metadata, ) cancel = functools.partial( operations_client.cancel_operation, operation.name, metadata=grpc_metadata, ) return Operation(operation, refresh, cancel, result_type, **kwargs) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/operation_async.py0000664000000000017530000001755614614221643021255 0ustar00root# Copyright 2020 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """AsyncIO futures for long-running operations returned from Google Cloud APIs. These futures can be used to await for the result of a long-running operation using :meth:`AsyncOperation.result`: .. code-block:: python operation = my_api_client.long_running_method() result = await operation.result() Or asynchronously using callbacks and :meth:`Operation.add_done_callback`: .. code-block:: python operation = my_api_client.long_running_method() def my_callback(future): result = await future.result() operation.add_done_callback(my_callback) """ import functools import threading from google.api_core import exceptions from google.api_core import protobuf_helpers from google.api_core.future import async_future from google.longrunning import operations_pb2 from google.rpc import code_pb2 class AsyncOperation(async_future.AsyncFuture): """A Future for interacting with a Google API Long-Running Operation. Args: operation (google.longrunning.operations_pb2.Operation): The initial operation. refresh (Callable[[], ~.api_core.operation.Operation]): A callable that returns the latest state of the operation. cancel (Callable[[], None]): A callable that tries to cancel the operation. result_type (func:`type`): The protobuf type for the operation's result. metadata_type (func:`type`): The protobuf type for the operation's metadata. retry (google.api_core.retry.Retry): The retry configuration used when polling. This can be used to control how often :meth:`done` is polled. Regardless of the retry's ``deadline``, it will be overridden by the ``timeout`` argument to :meth:`result`. """ def __init__( self, operation, refresh, cancel, result_type, metadata_type=None, retry=async_future.DEFAULT_RETRY, ): super().__init__(retry=retry) self._operation = operation self._refresh = refresh self._cancel = cancel self._result_type = result_type self._metadata_type = metadata_type self._completion_lock = threading.Lock() # Invoke this in case the operation came back already complete. self._set_result_from_operation() @property def operation(self): """google.longrunning.Operation: The current long-running operation.""" return self._operation @property def metadata(self): """google.protobuf.Message: the current operation metadata.""" if not self._operation.HasField("metadata"): return None return protobuf_helpers.from_any_pb( self._metadata_type, self._operation.metadata ) @classmethod def deserialize(cls, payload): """Deserialize a ``google.longrunning.Operation`` protocol buffer. Args: payload (bytes): A serialized operation protocol buffer. Returns: ~.operations_pb2.Operation: An Operation protobuf object. """ return operations_pb2.Operation.FromString(payload) def _set_result_from_operation(self): """Set the result or exception from the operation if it is complete.""" # This must be done in a lock to prevent the async_future thread # and main thread from both executing the completion logic # at the same time. with self._completion_lock: # If the operation isn't complete or if the result has already been # set, do not call set_result/set_exception again. if not self._operation.done or self._future.done(): return if self._operation.HasField("response"): response = protobuf_helpers.from_any_pb( self._result_type, self._operation.response ) self.set_result(response) elif self._operation.HasField("error"): exception = exceptions.GoogleAPICallError( self._operation.error.message, errors=(self._operation.error,), response=self._operation, ) self.set_exception(exception) else: exception = exceptions.GoogleAPICallError( "Unexpected state: Long-running operation had neither " "response nor error set." ) self.set_exception(exception) async def _refresh_and_update(self, retry=async_future.DEFAULT_RETRY): """Refresh the operation and update the result if needed. Args: retry (google.api_core.retry.Retry): (Optional) How to retry the RPC. """ # If the currently cached operation is done, no need to make another # RPC as it will not change once done. if not self._operation.done: self._operation = await self._refresh(retry=retry) self._set_result_from_operation() async def done(self, retry=async_future.DEFAULT_RETRY): """Checks to see if the operation is complete. Args: retry (google.api_core.retry.Retry): (Optional) How to retry the RPC. Returns: bool: True if the operation is complete, False otherwise. """ await self._refresh_and_update(retry) return self._operation.done async def cancel(self): """Attempt to cancel the operation. Returns: bool: True if the cancel RPC was made, False if the operation is already complete. """ result = await self.done() if result: return False else: await self._cancel() return True async def cancelled(self): """True if the operation was cancelled.""" await self._refresh_and_update() return ( self._operation.HasField("error") and self._operation.error.code == code_pb2.CANCELLED ) def from_gapic(operation, operations_client, result_type, grpc_metadata=None, **kwargs): """Create an operation future from a gapic client. This interacts with the long-running operations `service`_ (specific to a given API) via a gapic client. .. _service: https://github.com/googleapis/googleapis/blob/\ 050400df0fdb16f63b63e9dee53819044bffc857/\ google/longrunning/operations.proto#L38 Args: operation (google.longrunning.operations_pb2.Operation): The operation. operations_client (google.api_core.operations_v1.OperationsClient): The operations client. result_type (:func:`type`): The protobuf result type. grpc_metadata (Optional[List[Tuple[str, str]]]): Additional metadata to pass to the rpc. kwargs: Keyword args passed into the :class:`Operation` constructor. Returns: ~.api_core.operation.Operation: The operation future to track the given operation. """ refresh = functools.partial( operations_client.get_operation, operation.name, metadata=grpc_metadata, ) cancel = functools.partial( operations_client.cancel_operation, operation.name, metadata=grpc_metadata, ) return AsyncOperation(operation, refresh, cancel, result_type, **kwargs) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1714496531.9819117 google-api-core-2.19.0/google/api_core/operations_v1/0002755000000000017530000000000014614222024020253 5ustar00root././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/operations_v1/__init__.py0000664000000000017530000000214614614221643022375 0ustar00root# Copyright 2017 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Package for interacting with the google.longrunning.operations meta-API.""" from google.api_core.operations_v1.abstract_operations_client import AbstractOperationsClient from google.api_core.operations_v1.operations_async_client import OperationsAsyncClient from google.api_core.operations_v1.operations_client import OperationsClient from google.api_core.operations_v1.transports.rest import OperationsRestTransport __all__ = [ "AbstractOperationsClient", "OperationsAsyncClient", "OperationsClient", "OperationsRestTransport" ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/operations_v1/abstract_operations_client.py0000664000000000017530000005774714614221643026263 0ustar00root# -*- coding: utf-8 -*- # Copyright 2020 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # from collections import OrderedDict import os import re from typing import Dict, Optional, Sequence, Tuple, Type, Union from google.api_core import client_options as client_options_lib # type: ignore from google.api_core import gapic_v1 # type: ignore from google.api_core import retry as retries # type: ignore from google.api_core.operations_v1 import pagers from google.api_core.operations_v1.transports.base import ( DEFAULT_CLIENT_INFO, OperationsTransport, ) from google.api_core.operations_v1.transports.rest import OperationsRestTransport from google.auth import credentials as ga_credentials # type: ignore from google.auth.exceptions import MutualTLSChannelError # type: ignore from google.auth.transport import mtls # type: ignore from google.longrunning import operations_pb2 from google.oauth2 import service_account # type: ignore import grpc OptionalRetry = Union[retries.Retry, object] class AbstractOperationsClientMeta(type): """Metaclass for the Operations client. This provides class-level methods for building and retrieving support objects (e.g. transport) without polluting the client instance objects. """ _transport_registry = OrderedDict() # type: Dict[str, Type[OperationsTransport]] _transport_registry["rest"] = OperationsRestTransport def get_transport_class( cls, label: Optional[str] = None, ) -> Type[OperationsTransport]: """Returns an appropriate transport class. Args: label: The name of the desired transport. If none is provided, then the first transport in the registry is used. Returns: The transport class to use. """ # If a specific transport is requested, return that one. if label: return cls._transport_registry[label] # No transport is requested; return the default (that is, the first one # in the dictionary). return next(iter(cls._transport_registry.values())) class AbstractOperationsClient(metaclass=AbstractOperationsClientMeta): """Manages long-running operations with an API service. When an API method normally takes long time to complete, it can be designed to return [Operation][google.api_core.operations_v1.Operation] to the client, and the client can use this interface to receive the real response asynchronously by polling the operation resource, or pass the operation resource to another API (such as Google Cloud Pub/Sub API) to receive the response. Any API service that returns long-running operations should implement the ``Operations`` interface so developers can have a consistent client experience. """ @staticmethod def _get_default_mtls_endpoint(api_endpoint): """Converts api endpoint to mTLS endpoint. Convert "*.sandbox.googleapis.com" and "*.googleapis.com" to "*.mtls.sandbox.googleapis.com" and "*.mtls.googleapis.com" respectively. Args: api_endpoint (Optional[str]): the api endpoint to convert. Returns: str: converted mTLS api endpoint. """ if not api_endpoint: return api_endpoint mtls_endpoint_re = re.compile( r"(?P[^.]+)(?P\.mtls)?(?P\.sandbox)?(?P\.googleapis\.com)?" ) m = mtls_endpoint_re.match(api_endpoint) name, mtls, sandbox, googledomain = m.groups() if mtls or not googledomain: return api_endpoint if sandbox: return api_endpoint.replace( "sandbox.googleapis.com", "mtls.sandbox.googleapis.com" ) return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com") DEFAULT_ENDPOINT = "longrunning.googleapis.com" DEFAULT_MTLS_ENDPOINT = _get_default_mtls_endpoint.__func__( # type: ignore DEFAULT_ENDPOINT ) @classmethod def from_service_account_info(cls, info: dict, *args, **kwargs): """Creates an instance of this client using the provided credentials info. Args: info (dict): The service account private key info. args: Additional arguments to pass to the constructor. kwargs: Additional arguments to pass to the constructor. Returns: AbstractOperationsClient: The constructed client. """ credentials = service_account.Credentials.from_service_account_info(info) kwargs["credentials"] = credentials return cls(*args, **kwargs) @classmethod def from_service_account_file(cls, filename: str, *args, **kwargs): """Creates an instance of this client using the provided credentials file. Args: filename (str): The path to the service account private key json file. args: Additional arguments to pass to the constructor. kwargs: Additional arguments to pass to the constructor. Returns: AbstractOperationsClient: The constructed client. """ credentials = service_account.Credentials.from_service_account_file(filename) kwargs["credentials"] = credentials return cls(*args, **kwargs) from_service_account_json = from_service_account_file @property def transport(self) -> OperationsTransport: """Returns the transport used by the client instance. Returns: OperationsTransport: The transport used by the client instance. """ return self._transport @staticmethod def common_billing_account_path( billing_account: str, ) -> str: """Returns a fully-qualified billing_account string.""" return "billingAccounts/{billing_account}".format( billing_account=billing_account, ) @staticmethod def parse_common_billing_account_path(path: str) -> Dict[str, str]: """Parse a billing_account path into its component segments.""" m = re.match(r"^billingAccounts/(?P.+?)$", path) return m.groupdict() if m else {} @staticmethod def common_folder_path( folder: str, ) -> str: """Returns a fully-qualified folder string.""" return "folders/{folder}".format( folder=folder, ) @staticmethod def parse_common_folder_path(path: str) -> Dict[str, str]: """Parse a folder path into its component segments.""" m = re.match(r"^folders/(?P.+?)$", path) return m.groupdict() if m else {} @staticmethod def common_organization_path( organization: str, ) -> str: """Returns a fully-qualified organization string.""" return "organizations/{organization}".format( organization=organization, ) @staticmethod def parse_common_organization_path(path: str) -> Dict[str, str]: """Parse a organization path into its component segments.""" m = re.match(r"^organizations/(?P.+?)$", path) return m.groupdict() if m else {} @staticmethod def common_project_path( project: str, ) -> str: """Returns a fully-qualified project string.""" return "projects/{project}".format( project=project, ) @staticmethod def parse_common_project_path(path: str) -> Dict[str, str]: """Parse a project path into its component segments.""" m = re.match(r"^projects/(?P.+?)$", path) return m.groupdict() if m else {} @staticmethod def common_location_path( project: str, location: str, ) -> str: """Returns a fully-qualified location string.""" return "projects/{project}/locations/{location}".format( project=project, location=location, ) @staticmethod def parse_common_location_path(path: str) -> Dict[str, str]: """Parse a location path into its component segments.""" m = re.match(r"^projects/(?P.+?)/locations/(?P.+?)$", path) return m.groupdict() if m else {} def __init__( self, *, credentials: Optional[ga_credentials.Credentials] = None, transport: Union[str, OperationsTransport, None] = None, client_options: Optional[client_options_lib.ClientOptions] = None, client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO, ) -> None: """Instantiates the operations client. Args: credentials (Optional[google.auth.credentials.Credentials]): The authorization credentials to attach to requests. These credentials identify the application to the service; if none are specified, the client will attempt to ascertain the credentials from the environment. transport (Union[str, OperationsTransport]): The transport to use. If set to None, a transport is chosen automatically. client_options (google.api_core.client_options.ClientOptions): Custom options for the client. It won't take effect if a ``transport`` instance is provided. (1) The ``api_endpoint`` property can be used to override the default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT environment variable can also be used to override the endpoint: "always" (always use the default mTLS endpoint), "never" (always use the default regular endpoint) and "auto" (auto switch to the default mTLS endpoint if client certificate is present, this is the default value). However, the ``api_endpoint`` property takes precedence if provided. (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable is "true", then the ``client_cert_source`` property can be used to provide client certificate for mutual TLS transport. If not provided, the default SSL client certificate will be used if present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not set, no client certificate will be used. client_info (google.api_core.gapic_v1.client_info.ClientInfo): The client info used to send a user-agent string along with API requests. If ``None``, then default info will be used. Generally, you only need to set this if you're developing your own client library. Raises: google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport creation failed for any reason. """ if isinstance(client_options, dict): client_options = client_options_lib.from_dict(client_options) if client_options is None: client_options = client_options_lib.ClientOptions() # Create SSL credentials for mutual TLS if needed. use_client_cert = os.getenv( "GOOGLE_API_USE_CLIENT_CERTIFICATE", "false" ).lower() if use_client_cert not in ("true", "false"): raise ValueError( "Environment variable `GOOGLE_API_USE_CLIENT_CERTIFICATE` must be either `true` or `false`" ) client_cert_source_func = None is_mtls = False if use_client_cert == "true": if client_options.client_cert_source: is_mtls = True client_cert_source_func = client_options.client_cert_source else: is_mtls = mtls.has_default_client_cert_source() if is_mtls: client_cert_source_func = mtls.default_client_cert_source() else: client_cert_source_func = None # Figure out which api endpoint to use. if client_options.api_endpoint is not None: api_endpoint = client_options.api_endpoint else: use_mtls_env = os.getenv("GOOGLE_API_USE_MTLS_ENDPOINT", "auto") if use_mtls_env == "never": api_endpoint = self.DEFAULT_ENDPOINT elif use_mtls_env == "always": api_endpoint = self.DEFAULT_MTLS_ENDPOINT elif use_mtls_env == "auto": if is_mtls: api_endpoint = self.DEFAULT_MTLS_ENDPOINT else: api_endpoint = self.DEFAULT_ENDPOINT else: raise MutualTLSChannelError( "Unsupported GOOGLE_API_USE_MTLS_ENDPOINT value. Accepted " "values: never, auto, always" ) # Save or instantiate the transport. # Ordinarily, we provide the transport, but allowing a custom transport # instance provides an extensibility point for unusual situations. if isinstance(transport, OperationsTransport): # transport is a OperationsTransport instance. if credentials or client_options.credentials_file: raise ValueError( "When providing a transport instance, " "provide its credentials directly." ) if client_options.scopes: raise ValueError( "When providing a transport instance, provide its scopes " "directly." ) self._transport = transport else: Transport = type(self).get_transport_class(transport) self._transport = Transport( credentials=credentials, credentials_file=client_options.credentials_file, host=api_endpoint, scopes=client_options.scopes, client_cert_source_for_mtls=client_cert_source_func, quota_project_id=client_options.quota_project_id, client_info=client_info, always_use_jwt_access=True, ) def list_operations( self, name: str, filter_: Optional[str] = None, *, page_size: Optional[int] = None, page_token: Optional[str] = None, retry: OptionalRetry = gapic_v1.method.DEFAULT, timeout: Optional[float] = None, compression: Optional[grpc.Compression] = gapic_v1.method.DEFAULT, metadata: Sequence[Tuple[str, str]] = (), ) -> pagers.ListOperationsPager: r"""Lists operations that match the specified filter in the request. If the server doesn't support this method, it returns ``UNIMPLEMENTED``. NOTE: the ``name`` binding allows API services to override the binding to use different resource name schemes, such as ``users/*/operations``. To override the binding, API services can add a binding such as ``"/v1/{name=users/*}/operations"`` to their service configuration. For backwards compatibility, the default name includes the operations collection id, however overriding users must ensure the name binding is the parent resource, without the operations collection id. Args: name (str): The name of the operation's parent resource. filter_ (str): The standard list filter. This corresponds to the ``filter`` field on the ``request`` instance; if ``request`` is provided, this should not be set. retry (google.api_core.retry.Retry): Designation of what errors, if any, should be retried. timeout (float): The timeout for this request. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. Returns: google.api_core.operations_v1.pagers.ListOperationsPager: The response message for [Operations.ListOperations][google.api_core.operations_v1.Operations.ListOperations]. Iterating over this object will yield results and resolve additional pages automatically. """ # Create a protobuf request object. request = operations_pb2.ListOperationsRequest(name=name, filter=filter_) if page_size is not None: request.page_size = page_size if page_token is not None: request.page_token = page_token # Wrap the RPC method; this adds retry and timeout information, # and friendly error handling. rpc = self._transport._wrapped_methods[self._transport.list_operations] # Certain fields should be provided within the metadata header; # add these here. metadata = tuple(metadata or ()) + ( gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), ) # Send the request. response = rpc( request, retry=retry, timeout=timeout, compression=compression, metadata=metadata, ) # This method is paged; wrap the response in a pager, which provides # an `__iter__` convenience method. response = pagers.ListOperationsPager( method=rpc, request=request, response=response, metadata=metadata, ) # Done; return the response. return response def get_operation( self, name: str, *, retry: OptionalRetry = gapic_v1.method.DEFAULT, timeout: Optional[float] = None, compression: Optional[grpc.Compression] = gapic_v1.method.DEFAULT, metadata: Sequence[Tuple[str, str]] = (), ) -> operations_pb2.Operation: r"""Gets the latest state of a long-running operation. Clients can use this method to poll the operation result at intervals as recommended by the API service. Args: name (str): The name of the operation resource. retry (google.api_core.retry.Retry): Designation of what errors, if any, should be retried. timeout (float): The timeout for this request. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. Returns: google.longrunning.operations_pb2.Operation: This resource represents a long- running operation that is the result of a network API call. """ request = operations_pb2.GetOperationRequest(name=name) # Wrap the RPC method; this adds retry and timeout information, # and friendly error handling. rpc = self._transport._wrapped_methods[self._transport.get_operation] # Certain fields should be provided within the metadata header; # add these here. metadata = tuple(metadata or ()) + ( gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), ) # Send the request. response = rpc( request, retry=retry, timeout=timeout, compression=compression, metadata=metadata, ) # Done; return the response. return response def delete_operation( self, name: str, *, retry: OptionalRetry = gapic_v1.method.DEFAULT, timeout: Optional[float] = None, compression: Optional[grpc.Compression] = gapic_v1.method.DEFAULT, metadata: Sequence[Tuple[str, str]] = (), ) -> None: r"""Deletes a long-running operation. This method indicates that the client is no longer interested in the operation result. It does not cancel the operation. If the server doesn't support this method, it returns ``google.rpc.Code.UNIMPLEMENTED``. Args: name (str): The name of the operation resource to be deleted. This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. retry (google.api_core.retry.Retry): Designation of what errors, if any, should be retried. timeout (float): The timeout for this request. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. """ # Create the request object. request = operations_pb2.DeleteOperationRequest(name=name) # Wrap the RPC method; this adds retry and timeout information, # and friendly error handling. rpc = self._transport._wrapped_methods[self._transport.delete_operation] # Certain fields should be provided within the metadata header; # add these here. metadata = tuple(metadata or ()) + ( gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), ) # Send the request. rpc( request, retry=retry, timeout=timeout, compression=compression, metadata=metadata, ) def cancel_operation( self, name: Optional[str] = None, *, retry: OptionalRetry = gapic_v1.method.DEFAULT, timeout: Optional[float] = None, compression: Optional[grpc.Compression] = gapic_v1.method.DEFAULT, metadata: Sequence[Tuple[str, str]] = (), ) -> None: r"""Starts asynchronous cancellation on a long-running operation. The server makes a best effort to cancel the operation, but success is not guaranteed. If the server doesn't support this method, it returns ``google.rpc.Code.UNIMPLEMENTED``. Clients can use [Operations.GetOperation][google.api_core.operations_v1.Operations.GetOperation] or other methods to check whether the cancellation succeeded or whether the operation completed despite cancellation. On successful cancellation, the operation is not deleted; instead, it becomes an operation with an [Operation.error][google.api_core.operations_v1.Operation.error] value with a [google.rpc.Status.code][google.rpc.Status.code] of 1, corresponding to ``Code.CANCELLED``. Args: name (str): The name of the operation resource to be cancelled. This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. retry (google.api_core.retry.Retry): Designation of what errors, if any, should be retried. timeout (float): The timeout for this request. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. """ # Create the request object. request = operations_pb2.CancelOperationRequest(name=name) # Wrap the RPC method; this adds retry and timeout information, # and friendly error handling. rpc = self._transport._wrapped_methods[self._transport.cancel_operation] # Certain fields should be provided within the metadata header; # add these here. metadata = tuple(metadata or ()) + ( gapic_v1.routing_header.to_grpc_metadata((("name", request.name),)), ) # Send the request. rpc( request, retry=retry, timeout=timeout, compression=compression, metadata=metadata, ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/operations_v1/operations_async_client.py0000664000000000017530000003471214614221643025560 0ustar00root# Copyright 2020 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """An async client for the google.longrunning.operations meta-API. .. _Google API Style Guide: https://cloud.google.com/apis/design/design_pattern s#long_running_operations .. _google/longrunning/operations.proto: https://github.com/googleapis/googleapis/blob/master/google/longrunning /operations.proto """ import functools from google.api_core import exceptions as core_exceptions from google.api_core import gapic_v1, page_iterator_async from google.api_core import retry_async as retries from google.api_core import timeout as timeouts from google.longrunning import operations_pb2 from grpc import Compression class OperationsAsyncClient: """Async client for interacting with long-running operations. Args: channel (aio.Channel): The gRPC AsyncIO channel associated with the service that implements the ``google.longrunning.operations`` interface. client_config (dict): A dictionary of call options for each method. If not specified the default configuration is used. """ def __init__(self, channel, client_config=None): # Create the gRPC client stub with gRPC AsyncIO channel. self.operations_stub = operations_pb2.OperationsStub(channel) default_retry = retries.AsyncRetry( initial=0.1, # seconds maximum=60.0, # seconds multiplier=1.3, predicate=retries.if_exception_type( core_exceptions.DeadlineExceeded, core_exceptions.ServiceUnavailable, ), timeout=600.0, # seconds ) default_timeout = timeouts.TimeToDeadlineTimeout(timeout=600.0) default_compression = Compression.NoCompression self._get_operation = gapic_v1.method_async.wrap_method( self.operations_stub.GetOperation, default_retry=default_retry, default_timeout=default_timeout, default_compression=default_compression, ) self._list_operations = gapic_v1.method_async.wrap_method( self.operations_stub.ListOperations, default_retry=default_retry, default_timeout=default_timeout, default_compression=default_compression, ) self._cancel_operation = gapic_v1.method_async.wrap_method( self.operations_stub.CancelOperation, default_retry=default_retry, default_timeout=default_timeout, default_compression=default_compression, ) self._delete_operation = gapic_v1.method_async.wrap_method( self.operations_stub.DeleteOperation, default_retry=default_retry, default_timeout=default_timeout, default_compression=default_compression, ) async def get_operation( self, name, retry=gapic_v1.method_async.DEFAULT, timeout=gapic_v1.method_async.DEFAULT, compression=gapic_v1.method_async.DEFAULT, metadata=None, ): """Gets the latest state of a long-running operation. Clients can use this method to poll the operation result at intervals as recommended by the API service. Example: >>> from google.api_core import operations_v1 >>> api = operations_v1.OperationsClient() >>> name = '' >>> response = await api.get_operation(name) Args: name (str): The name of the operation resource. retry (google.api_core.retry.Retry): The retry strategy to use when invoking the RPC. If unspecified, the default retry from the client configuration will be used. If ``None``, then this method will not retry the RPC at all. timeout (float): The amount of time in seconds to wait for the RPC to complete. Note that if ``retry`` is used, this timeout applies to each individual attempt and the overall time it takes for this method to complete may be longer. If unspecified, the the default timeout in the client configuration is used. If ``None``, then the RPC method will not time out. compression (grpc.Compression): An element of grpc.compression e.g. grpc.compression.Gzip. metadata (Optional[List[Tuple[str, str]]]): Additional gRPC metadata. Returns: google.longrunning.operations_pb2.Operation: The state of the operation. Raises: google.api_core.exceptions.GoogleAPICallError: If an error occurred while invoking the RPC, the appropriate ``GoogleAPICallError`` subclass will be raised. """ request = operations_pb2.GetOperationRequest(name=name) # Add routing header metadata = metadata or [] metadata.append(gapic_v1.routing_header.to_grpc_metadata({"name": name})) return await self._get_operation( request, retry=retry, timeout=timeout, compression=compression, metadata=metadata, ) async def list_operations( self, name, filter_, retry=gapic_v1.method_async.DEFAULT, timeout=gapic_v1.method_async.DEFAULT, compression=gapic_v1.method_async.DEFAULT, metadata=None, ): """ Lists operations that match the specified filter in the request. Example: >>> from google.api_core import operations_v1 >>> api = operations_v1.OperationsClient() >>> name = '' >>> >>> # Iterate over all results >>> for operation in await api.list_operations(name): >>> # process operation >>> pass >>> >>> # Or iterate over results one page at a time >>> iter = await api.list_operations(name) >>> for page in iter.pages: >>> for operation in page: >>> # process operation >>> pass Args: name (str): The name of the operation collection. filter_ (str): The standard list filter. retry (google.api_core.retry.Retry): The retry strategy to use when invoking the RPC. If unspecified, the default retry from the client configuration will be used. If ``None``, then this method will not retry the RPC at all. timeout (float): The amount of time in seconds to wait for the RPC to complete. Note that if ``retry`` is used, this timeout applies to each individual attempt and the overall time it takes for this method to complete may be longer. If unspecified, the the default timeout in the client configuration is used. If ``None``, then the RPC method will not time out. compression (grpc.Compression): An element of grpc.compression e.g. grpc.compression.Gzip. metadata (Optional[List[Tuple[str, str]]]): Additional gRPC metadata. Returns: google.api_core.page_iterator.Iterator: An iterator that yields :class:`google.longrunning.operations_pb2.Operation` instances. Raises: google.api_core.exceptions.MethodNotImplemented: If the server does not support this method. Services are not required to implement this method. google.api_core.exceptions.GoogleAPICallError: If an error occurred while invoking the RPC, the appropriate ``GoogleAPICallError`` subclass will be raised. """ # Create the request object. request = operations_pb2.ListOperationsRequest(name=name, filter=filter_) # Add routing header metadata = metadata or [] metadata.append(gapic_v1.routing_header.to_grpc_metadata({"name": name})) # Create the method used to fetch pages method = functools.partial( self._list_operations, retry=retry, timeout=timeout, compression=compression, metadata=metadata, ) iterator = page_iterator_async.AsyncGRPCIterator( client=None, method=method, request=request, items_field="operations", request_token_field="page_token", response_token_field="next_page_token", ) return iterator async def cancel_operation( self, name, retry=gapic_v1.method_async.DEFAULT, timeout=gapic_v1.method_async.DEFAULT, compression=gapic_v1.method_async.DEFAULT, metadata=None, ): """Starts asynchronous cancellation on a long-running operation. The server makes a best effort to cancel the operation, but success is not guaranteed. Clients can use :meth:`get_operation` or service- specific methods to check whether the cancellation succeeded or whether the operation completed despite cancellation. On successful cancellation, the operation is not deleted; instead, it becomes an operation with an ``Operation.error`` value with a ``google.rpc.Status.code`` of ``1``, corresponding to ``Code.CANCELLED``. Example: >>> from google.api_core import operations_v1 >>> api = operations_v1.OperationsClient() >>> name = '' >>> api.cancel_operation(name) Args: name (str): The name of the operation resource to be cancelled. retry (google.api_core.retry.Retry): The retry strategy to use when invoking the RPC. If unspecified, the default retry from the client configuration will be used. If ``None``, then this method will not retry the RPC at all. timeout (float): The amount of time in seconds to wait for the RPC to complete. Note that if ``retry`` is used, this timeout applies to each individual attempt and the overall time it takes for this method to complete may be longer. If unspecified, the the default timeout in the client configuration is used. If ``None``, then the RPC method will not time out. Raises: google.api_core.exceptions.MethodNotImplemented: If the server does not support this method. Services are not required to implement this method. google.api_core.exceptions.GoogleAPICallError: If an error occurred while invoking the RPC, the appropriate ``GoogleAPICallError`` subclass will be raised. compression (grpc.Compression): An element of grpc.compression e.g. grpc.compression.Gzip. metadata (Optional[List[Tuple[str, str]]]): Additional gRPC metadata. """ # Create the request object. request = operations_pb2.CancelOperationRequest(name=name) # Add routing header metadata = metadata or [] metadata.append(gapic_v1.routing_header.to_grpc_metadata({"name": name})) await self._cancel_operation( request, retry=retry, timeout=timeout, compression=compression, metadata=metadata, ) async def delete_operation( self, name, retry=gapic_v1.method_async.DEFAULT, timeout=gapic_v1.method_async.DEFAULT, compression=gapic_v1.method_async.DEFAULT, metadata=None, ): """Deletes a long-running operation. This method indicates that the client is no longer interested in the operation result. It does not cancel the operation. Example: >>> from google.api_core import operations_v1 >>> api = operations_v1.OperationsClient() >>> name = '' >>> api.delete_operation(name) Args: name (str): The name of the operation resource to be deleted. retry (google.api_core.retry.Retry): The retry strategy to use when invoking the RPC. If unspecified, the default retry from the client configuration will be used. If ``None``, then this method will not retry the RPC at all. timeout (float): The amount of time in seconds to wait for the RPC to complete. Note that if ``retry`` is used, this timeout applies to each individual attempt and the overall time it takes for this method to complete may be longer. If unspecified, the the default timeout in the client configuration is used. If ``None``, then the RPC method will not time out. compression (grpc.Compression): An element of grpc.compression e.g. grpc.compression.Gzip. metadata (Optional[List[Tuple[str, str]]]): Additional gRPC metadata. Raises: google.api_core.exceptions.MethodNotImplemented: If the server does not support this method. Services are not required to implement this method. google.api_core.exceptions.GoogleAPICallError: If an error occurred while invoking the RPC, the appropriate ``GoogleAPICallError`` subclass will be raised. """ # Create the request object. request = operations_pb2.DeleteOperationRequest(name=name) # Add routing header metadata = metadata or [] metadata.append(gapic_v1.routing_header.to_grpc_metadata({"name": name})) await self._delete_operation( request, retry=retry, timeout=timeout, compression=compression, metadata=metadata, ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/operations_v1/operations_client.py0000664000000000017530000003565214614221643024367 0ustar00root# Copyright 2017 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """A client for the google.longrunning.operations meta-API. This is a client that deals with long-running operations that follow the pattern outlined by the `Google API Style Guide`_. When an API method normally takes long time to complete, it can be designed to return ``Operation`` to the client, and the client can use this interface to receive the real response asynchronously by polling the operation resource to receive the response. It is not a separate service, but rather an interface implemented by a larger service. The protocol-level definition is available at `google/longrunning/operations.proto`_. Typically, this will be constructed automatically by another client class to deal with operations. .. _Google API Style Guide: https://cloud.google.com/apis/design/design_pattern s#long_running_operations .. _google/longrunning/operations.proto: https://github.com/googleapis/googleapis/blob/master/google/longrunning /operations.proto """ import functools from google.api_core import exceptions as core_exceptions from google.api_core import gapic_v1 from google.api_core import page_iterator from google.api_core import retry as retries from google.api_core import timeout as timeouts from google.longrunning import operations_pb2 from grpc import Compression class OperationsClient(object): """Client for interacting with long-running operations within a service. Args: channel (grpc.Channel): The gRPC channel associated with the service that implements the ``google.longrunning.operations`` interface. client_config (dict): A dictionary of call options for each method. If not specified the default configuration is used. """ def __init__(self, channel, client_config=None): # Create the gRPC client stub. self.operations_stub = operations_pb2.OperationsStub(channel) default_retry = retries.Retry( initial=0.1, # seconds maximum=60.0, # seconds multiplier=1.3, predicate=retries.if_exception_type( core_exceptions.DeadlineExceeded, core_exceptions.ServiceUnavailable, ), timeout=600.0, # seconds ) default_timeout = timeouts.TimeToDeadlineTimeout(timeout=600.0) default_compression = Compression.NoCompression self._get_operation = gapic_v1.method.wrap_method( self.operations_stub.GetOperation, default_retry=default_retry, default_timeout=default_timeout, default_compression=default_compression, ) self._list_operations = gapic_v1.method.wrap_method( self.operations_stub.ListOperations, default_retry=default_retry, default_timeout=default_timeout, default_compression=default_compression, ) self._cancel_operation = gapic_v1.method.wrap_method( self.operations_stub.CancelOperation, default_retry=default_retry, default_timeout=default_timeout, default_compression=default_compression, ) self._delete_operation = gapic_v1.method.wrap_method( self.operations_stub.DeleteOperation, default_retry=default_retry, default_timeout=default_timeout, default_compression=default_compression, ) # Service calls def get_operation( self, name, retry=gapic_v1.method.DEFAULT, timeout=gapic_v1.method.DEFAULT, compression=gapic_v1.method.DEFAULT, metadata=None, ): """Gets the latest state of a long-running operation. Clients can use this method to poll the operation result at intervals as recommended by the API service. Example: >>> from google.api_core import operations_v1 >>> api = operations_v1.OperationsClient() >>> name = '' >>> response = api.get_operation(name) Args: name (str): The name of the operation resource. retry (google.api_core.retry.Retry): The retry strategy to use when invoking the RPC. If unspecified, the default retry from the client configuration will be used. If ``None``, then this method will not retry the RPC at all. timeout (float): The amount of time in seconds to wait for the RPC to complete. Note that if ``retry`` is used, this timeout applies to each individual attempt and the overall time it takes for this method to complete may be longer. If unspecified, the the default timeout in the client configuration is used. If ``None``, then the RPC method will not time out. compression (grpc.Compression): An element of grpc.compression e.g. grpc.compression.Gzip. metadata (Optional[List[Tuple[str, str]]]): Additional gRPC metadata. Returns: google.longrunning.operations_pb2.Operation: The state of the operation. Raises: google.api_core.exceptions.GoogleAPICallError: If an error occurred while invoking the RPC, the appropriate ``GoogleAPICallError`` subclass will be raised. """ request = operations_pb2.GetOperationRequest(name=name) # Add routing header metadata = metadata or [] metadata.append(gapic_v1.routing_header.to_grpc_metadata({"name": name})) return self._get_operation( request, retry=retry, timeout=timeout, compression=compression, metadata=metadata, ) def list_operations( self, name, filter_, retry=gapic_v1.method.DEFAULT, timeout=gapic_v1.method.DEFAULT, compression=gapic_v1.method.DEFAULT, metadata=None, ): """ Lists operations that match the specified filter in the request. Example: >>> from google.api_core import operations_v1 >>> api = operations_v1.OperationsClient() >>> name = '' >>> >>> # Iterate over all results >>> for operation in api.list_operations(name): >>> # process operation >>> pass >>> >>> # Or iterate over results one page at a time >>> iter = api.list_operations(name) >>> for page in iter.pages: >>> for operation in page: >>> # process operation >>> pass Args: name (str): The name of the operation collection. filter_ (str): The standard list filter. retry (google.api_core.retry.Retry): The retry strategy to use when invoking the RPC. If unspecified, the default retry from the client configuration will be used. If ``None``, then this method will not retry the RPC at all. timeout (float): The amount of time in seconds to wait for the RPC to complete. Note that if ``retry`` is used, this timeout applies to each individual attempt and the overall time it takes for this method to complete may be longer. If unspecified, the the default timeout in the client configuration is used. If ``None``, then the RPC method will not time out. compression (grpc.Compression): An element of grpc.compression e.g. grpc.compression.Gzip. metadata (Optional[List[Tuple[str, str]]]): Additional gRPC metadata. Returns: google.api_core.page_iterator.Iterator: An iterator that yields :class:`google.longrunning.operations_pb2.Operation` instances. Raises: google.api_core.exceptions.MethodNotImplemented: If the server does not support this method. Services are not required to implement this method. google.api_core.exceptions.GoogleAPICallError: If an error occurred while invoking the RPC, the appropriate ``GoogleAPICallError`` subclass will be raised. """ # Create the request object. request = operations_pb2.ListOperationsRequest(name=name, filter=filter_) # Add routing header metadata = metadata or [] metadata.append(gapic_v1.routing_header.to_grpc_metadata({"name": name})) # Create the method used to fetch pages method = functools.partial( self._list_operations, retry=retry, timeout=timeout, compression=compression, metadata=metadata, ) iterator = page_iterator.GRPCIterator( client=None, method=method, request=request, items_field="operations", request_token_field="page_token", response_token_field="next_page_token", ) return iterator def cancel_operation( self, name, retry=gapic_v1.method.DEFAULT, timeout=gapic_v1.method.DEFAULT, compression=gapic_v1.method.DEFAULT, metadata=None, ): """Starts asynchronous cancellation on a long-running operation. The server makes a best effort to cancel the operation, but success is not guaranteed. Clients can use :meth:`get_operation` or service- specific methods to check whether the cancellation succeeded or whether the operation completed despite cancellation. On successful cancellation, the operation is not deleted; instead, it becomes an operation with an ``Operation.error`` value with a ``google.rpc.Status.code`` of ``1``, corresponding to ``Code.CANCELLED``. Example: >>> from google.api_core import operations_v1 >>> api = operations_v1.OperationsClient() >>> name = '' >>> api.cancel_operation(name) Args: name (str): The name of the operation resource to be cancelled. retry (google.api_core.retry.Retry): The retry strategy to use when invoking the RPC. If unspecified, the default retry from the client configuration will be used. If ``None``, then this method will not retry the RPC at all. timeout (float): The amount of time in seconds to wait for the RPC to complete. Note that if ``retry`` is used, this timeout applies to each individual attempt and the overall time it takes for this method to complete may be longer. If unspecified, the the default timeout in the client configuration is used. If ``None``, then the RPC method will not time out. compression (grpc.Compression): An element of grpc.compression e.g. grpc.compression.Gzip. metadata (Optional[List[Tuple[str, str]]]): Additional gRPC metadata. Raises: google.api_core.exceptions.MethodNotImplemented: If the server does not support this method. Services are not required to implement this method. google.api_core.exceptions.GoogleAPICallError: If an error occurred while invoking the RPC, the appropriate ``GoogleAPICallError`` subclass will be raised. """ # Create the request object. request = operations_pb2.CancelOperationRequest(name=name) # Add routing header metadata = metadata or [] metadata.append(gapic_v1.routing_header.to_grpc_metadata({"name": name})) self._cancel_operation( request, retry=retry, timeout=timeout, compression=compression, metadata=metadata, ) def delete_operation( self, name, retry=gapic_v1.method.DEFAULT, timeout=gapic_v1.method.DEFAULT, compression=gapic_v1.method.DEFAULT, metadata=None, ): """Deletes a long-running operation. This method indicates that the client is no longer interested in the operation result. It does not cancel the operation. Example: >>> from google.api_core import operations_v1 >>> api = operations_v1.OperationsClient() >>> name = '' >>> api.delete_operation(name) Args: name (str): The name of the operation resource to be deleted. retry (google.api_core.retry.Retry): The retry strategy to use when invoking the RPC. If unspecified, the default retry from the client configuration will be used. If ``None``, then this method will not retry the RPC at all. timeout (float): The amount of time in seconds to wait for the RPC to complete. Note that if ``retry`` is used, this timeout applies to each individual attempt and the overall time it takes for this method to complete may be longer. If unspecified, the the default timeout in the client configuration is used. If ``None``, then the RPC method will not time out. compression (grpc.Compression): An element of grpc.compression e.g. grpc.compression.Gzip. metadata (Optional[List[Tuple[str, str]]]): Additional gRPC metadata. Raises: google.api_core.exceptions.MethodNotImplemented: If the server does not support this method. Services are not required to implement this method. google.api_core.exceptions.GoogleAPICallError: If an error occurred while invoking the RPC, the appropriate ``GoogleAPICallError`` subclass will be raised. """ # Create the request object. request = operations_pb2.DeleteOperationRequest(name=name) # Add routing header metadata = metadata or [] metadata.append(gapic_v1.routing_header.to_grpc_metadata({"name": name})) self._delete_operation( request, retry=retry, timeout=timeout, compression=compression, metadata=metadata, ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/operations_v1/operations_client_config.py0000664000000000017530000000435514614221643025710 0ustar00root# Copyright 2017 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """gapic configuration for the google.longrunning.operations client.""" # DEPRECATED: retry and timeout classes are instantiated directly config = { "interfaces": { "google.longrunning.Operations": { "retry_codes": { "idempotent": ["DEADLINE_EXCEEDED", "UNAVAILABLE"], "non_idempotent": [], }, "retry_params": { "default": { "initial_retry_delay_millis": 100, "retry_delay_multiplier": 1.3, "max_retry_delay_millis": 60000, "initial_rpc_timeout_millis": 20000, "rpc_timeout_multiplier": 1.0, "max_rpc_timeout_millis": 600000, "total_timeout_millis": 600000, } }, "methods": { "GetOperation": { "timeout_millis": 60000, "retry_codes_name": "idempotent", "retry_params_name": "default", }, "ListOperations": { "timeout_millis": 60000, "retry_codes_name": "idempotent", "retry_params_name": "default", }, "CancelOperation": { "timeout_millis": 60000, "retry_codes_name": "idempotent", "retry_params_name": "default", }, "DeleteOperation": { "timeout_millis": 60000, "retry_codes_name": "idempotent", "retry_params_name": "default", }, }, } } } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/operations_v1/pagers.py0000664000000000017530000000610714614221643022120 0ustar00root# -*- coding: utf-8 -*- # Copyright 2020 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # from typing import ( Any, Callable, Iterator, Sequence, Tuple, ) from google.longrunning import operations_pb2 class ListOperationsPager: """A pager for iterating through ``list_operations`` requests. This class thinly wraps an initial :class:`google.longrunning.operations_pb2.ListOperationsResponse` object, and provides an ``__iter__`` method to iterate through its ``operations`` field. If there are more pages, the ``__iter__`` method will make additional ``ListOperations`` requests and continue to iterate through the ``operations`` field on the corresponding responses. All the usual :class:`google.longrunning.operations_pb2.ListOperationsResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ def __init__( self, method: Callable[..., operations_pb2.ListOperationsResponse], request: operations_pb2.ListOperationsRequest, response: operations_pb2.ListOperationsResponse, *, metadata: Sequence[Tuple[str, str]] = () ): """Instantiate the pager. Args: method (Callable): The method that was originally called, and which instantiated this pager. request (google.longrunning.operations_pb2.ListOperationsRequest): The initial request object. response (google.longrunning.operations_pb2.ListOperationsResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. """ self._method = method self._request = request self._response = response self._metadata = metadata def __getattr__(self, name: str) -> Any: return getattr(self._response, name) @property def pages(self) -> Iterator[operations_pb2.ListOperationsResponse]: yield self._response while self._response.next_page_token: self._request.page_token = self._response.next_page_token self._response = self._method(self._request, metadata=self._metadata) yield self._response def __iter__(self) -> Iterator[operations_pb2.Operation]: for page in self.pages: yield from page.operations def __repr__(self) -> str: return "{0}<{1!r}>".format(self.__class__.__name__, self._response) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1714496531.9859135 google-api-core-2.19.0/google/api_core/operations_v1/transports/0002755000000000017530000000000014614222024022472 5ustar00root././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/operations_v1/transports/__init__.py0000664000000000017530000000162614614221643024616 0ustar00root# -*- coding: utf-8 -*- # Copyright 2020 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # from collections import OrderedDict from .base import OperationsTransport from .rest import OperationsRestTransport # Compile a registry of transports. _transport_registry = OrderedDict() _transport_registry["rest"] = OperationsRestTransport __all__ = ( "OperationsTransport", "OperationsRestTransport", ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/operations_v1/transports/base.py0000664000000000017530000002062214614221643023766 0ustar00root# -*- coding: utf-8 -*- # Copyright 2020 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # import abc from typing import Awaitable, Callable, Optional, Sequence, Union import google.api_core # type: ignore from google.api_core import exceptions as core_exceptions # type: ignore from google.api_core import gapic_v1 # type: ignore from google.api_core import retry as retries # type: ignore from google.api_core import version import google.auth # type: ignore from google.auth import credentials as ga_credentials # type: ignore from google.longrunning import operations_pb2 from google.oauth2 import service_account # type: ignore from google.protobuf import empty_pb2 # type: ignore from grpc import Compression DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo( gapic_version=version.__version__, ) class OperationsTransport(abc.ABC): """Abstract transport class for Operations.""" AUTH_SCOPES = () DEFAULT_HOST: str = "longrunning.googleapis.com" def __init__( self, *, host: str = DEFAULT_HOST, credentials: ga_credentials.Credentials = None, credentials_file: Optional[str] = None, scopes: Optional[Sequence[str]] = None, quota_project_id: Optional[str] = None, client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO, always_use_jwt_access: Optional[bool] = False, **kwargs, ) -> None: """Instantiate the transport. Args: host (Optional[str]): The hostname to connect to. credentials (Optional[google.auth.credentials.Credentials]): The authorization credentials to attach to requests. These credentials identify the application to the service; if none are specified, the client will attempt to ascertain the credentials from the environment. credentials_file (Optional[str]): A file with credentials that can be loaded with :func:`google.auth.load_credentials_from_file`. This argument is mutually exclusive with credentials. scopes (Optional[Sequence[str]]): A list of scopes. quota_project_id (Optional[str]): An optional project to use for billing and quota. client_info (google.api_core.gapic_v1.client_info.ClientInfo): The client info used to send a user-agent string along with API requests. If ``None``, then default info will be used. Generally, you only need to set this if you're developing your own client library. always_use_jwt_access (Optional[bool]): Whether self signed JWT should be used for service account credentials. """ # Save the hostname. Default to port 443 (HTTPS) if none is specified. if ":" not in host: host += ":443" self._host = host scopes_kwargs = {"scopes": scopes, "default_scopes": self.AUTH_SCOPES} # Save the scopes. self._scopes = scopes # If no credentials are provided, then determine the appropriate # defaults. if credentials and credentials_file: raise core_exceptions.DuplicateCredentialArgs( "'credentials_file' and 'credentials' are mutually exclusive" ) if credentials_file is not None: credentials, _ = google.auth.load_credentials_from_file( credentials_file, **scopes_kwargs, quota_project_id=quota_project_id ) elif credentials is None: credentials, _ = google.auth.default( **scopes_kwargs, quota_project_id=quota_project_id ) # If the credentials are service account credentials, then always try to use self signed JWT. if ( always_use_jwt_access and isinstance(credentials, service_account.Credentials) and hasattr(service_account.Credentials, "with_always_use_jwt_access") ): credentials = credentials.with_always_use_jwt_access(True) # Save the credentials. self._credentials = credentials def _prep_wrapped_messages(self, client_info): # Precompute the wrapped methods. self._wrapped_methods = { self.list_operations: gapic_v1.method.wrap_method( self.list_operations, default_retry=retries.Retry( initial=0.5, maximum=10.0, multiplier=2.0, predicate=retries.if_exception_type( core_exceptions.ServiceUnavailable, ), deadline=10.0, ), default_timeout=10.0, default_compression=Compression.NoCompression, client_info=client_info, ), self.get_operation: gapic_v1.method.wrap_method( self.get_operation, default_retry=retries.Retry( initial=0.5, maximum=10.0, multiplier=2.0, predicate=retries.if_exception_type( core_exceptions.ServiceUnavailable, ), deadline=10.0, ), default_timeout=10.0, default_compression=Compression.NoCompression, client_info=client_info, ), self.delete_operation: gapic_v1.method.wrap_method( self.delete_operation, default_retry=retries.Retry( initial=0.5, maximum=10.0, multiplier=2.0, predicate=retries.if_exception_type( core_exceptions.ServiceUnavailable, ), deadline=10.0, ), default_timeout=10.0, default_compression=Compression.NoCompression, client_info=client_info, ), self.cancel_operation: gapic_v1.method.wrap_method( self.cancel_operation, default_retry=retries.Retry( initial=0.5, maximum=10.0, multiplier=2.0, predicate=retries.if_exception_type( core_exceptions.ServiceUnavailable, ), deadline=10.0, ), default_timeout=10.0, default_compression=Compression.NoCompression, client_info=client_info, ), } def close(self): """Closes resources associated with the transport. .. warning:: Only call this method if the transport is NOT shared with other clients - this may cause errors in other clients! """ raise NotImplementedError() @property def list_operations( self, ) -> Callable[ [operations_pb2.ListOperationsRequest], Union[ operations_pb2.ListOperationsResponse, Awaitable[operations_pb2.ListOperationsResponse], ], ]: raise NotImplementedError() @property def get_operation( self, ) -> Callable[ [operations_pb2.GetOperationRequest], Union[operations_pb2.Operation, Awaitable[operations_pb2.Operation]], ]: raise NotImplementedError() @property def delete_operation( self, ) -> Callable[ [operations_pb2.DeleteOperationRequest], Union[empty_pb2.Empty, Awaitable[empty_pb2.Empty]], ]: raise NotImplementedError() @property def cancel_operation( self, ) -> Callable[ [operations_pb2.CancelOperationRequest], Union[empty_pb2.Empty, Awaitable[empty_pb2.Empty]], ]: raise NotImplementedError() __all__ = ("OperationsTransport",) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/operations_v1/transports/rest.py0000664000000000017530000004620014614221643024031 0ustar00root# -*- coding: utf-8 -*- # Copyright 2020 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # import re from typing import Callable, Dict, Optional, Sequence, Tuple, Union from requests import __version__ as requests_version from google.api_core import exceptions as core_exceptions # type: ignore from google.api_core import gapic_v1 # type: ignore from google.api_core import path_template # type: ignore from google.api_core import rest_helpers # type: ignore from google.api_core import retry as retries # type: ignore from google.auth import credentials as ga_credentials # type: ignore from google.auth.transport.requests import AuthorizedSession # type: ignore from google.longrunning import operations_pb2 # type: ignore from google.protobuf import empty_pb2 # type: ignore from google.protobuf import json_format # type: ignore import grpc from .base import DEFAULT_CLIENT_INFO as BASE_DEFAULT_CLIENT_INFO, OperationsTransport OptionalRetry = Union[retries.Retry, object] DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo( gapic_version=BASE_DEFAULT_CLIENT_INFO.gapic_version, grpc_version=None, rest_version=requests_version, ) class OperationsRestTransport(OperationsTransport): """REST backend transport for Operations. Manages long-running operations with an API service. When an API method normally takes long time to complete, it can be designed to return [Operation][google.api_core.operations_v1.Operation] to the client, and the client can use this interface to receive the real response asynchronously by polling the operation resource, or pass the operation resource to another API (such as Google Cloud Pub/Sub API) to receive the response. Any API service that returns long-running operations should implement the ``Operations`` interface so developers can have a consistent client experience. This class defines the same methods as the primary client, so the primary client can load the underlying transport implementation and call it. It sends JSON representations of protocol buffers over HTTP/1.1 """ def __init__( self, *, host: str = "longrunning.googleapis.com", credentials: ga_credentials.Credentials = None, credentials_file: Optional[str] = None, scopes: Optional[Sequence[str]] = None, client_cert_source_for_mtls: Optional[Callable[[], Tuple[bytes, bytes]]] = None, quota_project_id: Optional[str] = None, client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO, always_use_jwt_access: Optional[bool] = False, url_scheme: str = "https", http_options: Optional[Dict] = None, path_prefix: str = "v1", ) -> None: """Instantiate the transport. Args: host (Optional[str]): The hostname to connect to. credentials (Optional[google.auth.credentials.Credentials]): The authorization credentials to attach to requests. These credentials identify the application to the service; if none are specified, the client will attempt to ascertain the credentials from the environment. credentials_file (Optional[str]): A file with credentials that can be loaded with :func:`google.auth.load_credentials_from_file`. This argument is ignored if ``channel`` is provided. scopes (Optional(Sequence[str])): A list of scopes. This argument is ignored if ``channel`` is provided. client_cert_source_for_mtls (Callable[[], Tuple[bytes, bytes]]): Client certificate to configure mutual TLS HTTP channel. It is ignored if ``channel`` is provided. quota_project_id (Optional[str]): An optional project to use for billing and quota. client_info (google.api_core.gapic_v1.client_info.ClientInfo): The client info used to send a user-agent string along with API requests. If ``None``, then default info will be used. Generally, you only need to set this if you're developing your own client library. always_use_jwt_access (Optional[bool]): Whether self signed JWT should be used for service account credentials. url_scheme: the protocol scheme for the API endpoint. Normally "https", but for testing or local servers, "http" can be specified. http_options: a dictionary of http_options for transcoding, to override the defaults from operations.proto. Each method has an entry with the corresponding http rules as value. path_prefix: path prefix (usually represents API version). Set to "v1" by default. """ # Run the base constructor # TODO(yon-mg): resolve other ctor params i.e. scopes, quota, etc. # TODO: When custom host (api_endpoint) is set, `scopes` must *also* be set on the # credentials object maybe_url_match = re.match("^(?Phttp(?:s)?://)?(?P.*)$", host) if maybe_url_match is None: raise ValueError( f"Unexpected hostname structure: {host}" ) # pragma: NO COVER url_match_items = maybe_url_match.groupdict() host = f"{url_scheme}://{host}" if not url_match_items["scheme"] else host super().__init__( host=host, credentials=credentials, client_info=client_info, always_use_jwt_access=always_use_jwt_access, ) self._session = AuthorizedSession( self._credentials, default_host=self.DEFAULT_HOST ) if client_cert_source_for_mtls: self._session.configure_mtls_channel(client_cert_source_for_mtls) self._prep_wrapped_messages(client_info) self._http_options = http_options or {} self._path_prefix = path_prefix def _list_operations( self, request: operations_pb2.ListOperationsRequest, *, retry: OptionalRetry = gapic_v1.method.DEFAULT, timeout: Optional[float] = None, compression: Optional[grpc.Compression] = gapic_v1.method.DEFAULT, metadata: Sequence[Tuple[str, str]] = (), ) -> operations_pb2.ListOperationsResponse: r"""Call the list operations method over HTTP. Args: request (~.operations_pb2.ListOperationsRequest): The request object. The request message for [Operations.ListOperations][google.api_core.operations_v1.Operations.ListOperations]. retry (google.api_core.retry.Retry): Designation of what errors, if any, should be retried. timeout (float): The timeout for this request. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. Returns: ~.operations_pb2.ListOperationsResponse: The response message for [Operations.ListOperations][google.api_core.operations_v1.Operations.ListOperations]. """ http_options = [ { "method": "get", "uri": "/{}/{{name=**}}/operations".format(self._path_prefix), }, ] if "google.longrunning.Operations.ListOperations" in self._http_options: http_options = self._http_options[ "google.longrunning.Operations.ListOperations" ] request_kwargs = json_format.MessageToDict( request, preserving_proto_field_name=True, including_default_value_fields=True, ) transcoded_request = path_template.transcode(http_options, **request_kwargs) uri = transcoded_request["uri"] method = transcoded_request["method"] # Jsonify the query params query_params_request = operations_pb2.ListOperationsRequest() json_format.ParseDict(transcoded_request["query_params"], query_params_request) query_params = json_format.MessageToDict( query_params_request, including_default_value_fields=False, preserving_proto_field_name=False, use_integers_for_enums=False, ) # Send the request headers = dict(metadata) headers["Content-Type"] = "application/json" response = getattr(self._session, method)( "{host}{uri}".format(host=self._host, uri=uri), timeout=timeout, headers=headers, params=rest_helpers.flatten_query_params(query_params), ) # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception # subclass. if response.status_code >= 400: raise core_exceptions.from_http_response(response) # Return the response api_response = operations_pb2.ListOperationsResponse() json_format.Parse(response.content, api_response, ignore_unknown_fields=False) return api_response def _get_operation( self, request: operations_pb2.GetOperationRequest, *, retry: OptionalRetry = gapic_v1.method.DEFAULT, timeout: Optional[float] = None, compression: Optional[grpc.Compression] = gapic_v1.method.DEFAULT, metadata: Sequence[Tuple[str, str]] = (), ) -> operations_pb2.Operation: r"""Call the get operation method over HTTP. Args: request (~.operations_pb2.GetOperationRequest): The request object. The request message for [Operations.GetOperation][google.api_core.operations_v1.Operations.GetOperation]. retry (google.api_core.retry.Retry): Designation of what errors, if any, should be retried. timeout (float): The timeout for this request. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. Returns: ~.operations_pb2.Operation: This resource represents a long- running operation that is the result of a network API call. """ http_options = [ { "method": "get", "uri": "/{}/{{name=**/operations/*}}".format(self._path_prefix), }, ] if "google.longrunning.Operations.GetOperation" in self._http_options: http_options = self._http_options[ "google.longrunning.Operations.GetOperation" ] request_kwargs = json_format.MessageToDict( request, preserving_proto_field_name=True, including_default_value_fields=True, ) transcoded_request = path_template.transcode(http_options, **request_kwargs) uri = transcoded_request["uri"] method = transcoded_request["method"] # Jsonify the query params query_params_request = operations_pb2.GetOperationRequest() json_format.ParseDict(transcoded_request["query_params"], query_params_request) query_params = json_format.MessageToDict( query_params_request, including_default_value_fields=False, preserving_proto_field_name=False, use_integers_for_enums=False, ) # Send the request headers = dict(metadata) headers["Content-Type"] = "application/json" response = getattr(self._session, method)( "{host}{uri}".format(host=self._host, uri=uri), timeout=timeout, headers=headers, params=rest_helpers.flatten_query_params(query_params), ) # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception # subclass. if response.status_code >= 400: raise core_exceptions.from_http_response(response) # Return the response api_response = operations_pb2.Operation() json_format.Parse(response.content, api_response, ignore_unknown_fields=False) return api_response def _delete_operation( self, request: operations_pb2.DeleteOperationRequest, *, retry: OptionalRetry = gapic_v1.method.DEFAULT, timeout: Optional[float] = None, compression: Optional[grpc.Compression] = gapic_v1.method.DEFAULT, metadata: Sequence[Tuple[str, str]] = (), ) -> empty_pb2.Empty: r"""Call the delete operation method over HTTP. Args: request (~.operations_pb2.DeleteOperationRequest): The request object. The request message for [Operations.DeleteOperation][google.api_core.operations_v1.Operations.DeleteOperation]. retry (google.api_core.retry.Retry): Designation of what errors, if any, should be retried. timeout (float): The timeout for this request. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. """ http_options = [ { "method": "delete", "uri": "/{}/{{name=**/operations/*}}".format(self._path_prefix), }, ] if "google.longrunning.Operations.DeleteOperation" in self._http_options: http_options = self._http_options[ "google.longrunning.Operations.DeleteOperation" ] request_kwargs = json_format.MessageToDict( request, preserving_proto_field_name=True, including_default_value_fields=True, ) transcoded_request = path_template.transcode(http_options, **request_kwargs) uri = transcoded_request["uri"] method = transcoded_request["method"] # Jsonify the query params query_params_request = operations_pb2.DeleteOperationRequest() json_format.ParseDict(transcoded_request["query_params"], query_params_request) query_params = json_format.MessageToDict( query_params_request, including_default_value_fields=False, preserving_proto_field_name=False, use_integers_for_enums=False, ) # Send the request headers = dict(metadata) headers["Content-Type"] = "application/json" response = getattr(self._session, method)( "{host}{uri}".format(host=self._host, uri=uri), timeout=timeout, headers=headers, params=rest_helpers.flatten_query_params(query_params), ) # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception # subclass. if response.status_code >= 400: raise core_exceptions.from_http_response(response) return empty_pb2.Empty() def _cancel_operation( self, request: operations_pb2.CancelOperationRequest, *, retry: OptionalRetry = gapic_v1.method.DEFAULT, timeout: Optional[float] = None, compression: Optional[grpc.Compression] = gapic_v1.method.DEFAULT, metadata: Sequence[Tuple[str, str]] = (), ) -> empty_pb2.Empty: r"""Call the cancel operation method over HTTP. Args: request (~.operations_pb2.CancelOperationRequest): The request object. The request message for [Operations.CancelOperation][google.api_core.operations_v1.Operations.CancelOperation]. retry (google.api_core.retry.Retry): Designation of what errors, if any, should be retried. timeout (float): The timeout for this request. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. """ http_options = [ { "method": "post", "uri": "/{}/{{name=**/operations/*}}:cancel".format(self._path_prefix), "body": "*", }, ] if "google.longrunning.Operations.CancelOperation" in self._http_options: http_options = self._http_options[ "google.longrunning.Operations.CancelOperation" ] request_kwargs = json_format.MessageToDict( request, preserving_proto_field_name=True, including_default_value_fields=True, ) transcoded_request = path_template.transcode(http_options, **request_kwargs) # Jsonify the request body body_request = operations_pb2.CancelOperationRequest() json_format.ParseDict(transcoded_request["body"], body_request) body = json_format.MessageToDict( body_request, including_default_value_fields=False, preserving_proto_field_name=False, use_integers_for_enums=False, ) uri = transcoded_request["uri"] method = transcoded_request["method"] # Jsonify the query params query_params_request = operations_pb2.CancelOperationRequest() json_format.ParseDict(transcoded_request["query_params"], query_params_request) query_params = json_format.MessageToDict( query_params_request, including_default_value_fields=False, preserving_proto_field_name=False, use_integers_for_enums=False, ) # Send the request headers = dict(metadata) headers["Content-Type"] = "application/json" response = getattr(self._session, method)( "{host}{uri}".format(host=self._host, uri=uri), timeout=timeout, headers=headers, params=rest_helpers.flatten_query_params(query_params), data=body, ) # In case of error, raise the appropriate core_exceptions.GoogleAPICallError exception # subclass. if response.status_code >= 400: raise core_exceptions.from_http_response(response) return empty_pb2.Empty() @property def list_operations( self, ) -> Callable[ [operations_pb2.ListOperationsRequest], operations_pb2.ListOperationsResponse ]: return self._list_operations @property def get_operation( self, ) -> Callable[[operations_pb2.GetOperationRequest], operations_pb2.Operation]: return self._get_operation @property def delete_operation( self, ) -> Callable[[operations_pb2.DeleteOperationRequest], empty_pb2.Empty]: return self._delete_operation @property def cancel_operation( self, ) -> Callable[[operations_pb2.CancelOperationRequest], empty_pb2.Empty]: return self._cancel_operation __all__ = ("OperationsRestTransport",) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/page_iterator.py0000664000000000017530000004755214614221643020704 0ustar00root# Copyright 2015 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Iterators for paging through paged API methods. These iterators simplify the process of paging through API responses where the request takes a page token and the response is a list of results with a token for the next page. See `list pagination`_ in the Google API Style Guide for more details. .. _list pagination: https://cloud.google.com/apis/design/design_patterns#list_pagination API clients that have methods that follow the list pagination pattern can return an :class:`.Iterator`. You can use this iterator to get **all** of the results across all pages:: >>> results_iterator = client.list_resources() >>> list(results_iterator) # Convert to a list (consumes all values). Or you can walk your way through items and call off the search early if you find what you're looking for (resulting in possibly fewer requests):: >>> for resource in results_iterator: ... print(resource.name) ... if not resource.is_valid: ... break At any point, you may check the number of items consumed by referencing the ``num_results`` property of the iterator:: >>> for my_item in results_iterator: ... if results_iterator.num_results >= 10: ... break When iterating, not every new item will send a request to the server. To iterate based on each page of items (where a page corresponds to a request):: >>> for page in results_iterator.pages: ... print('=' * 20) ... print(' Page number: {:d}'.format(iterator.page_number)) ... print(' Items in page: {:d}'.format(page.num_items)) ... print(' First item: {!r}'.format(next(page))) ... print('Items remaining: {:d}'.format(page.remaining)) ... print('Next page token: {}'.format(iterator.next_page_token)) ==================== Page number: 1 Items in page: 1 First item: Items remaining: 0 Next page token: eav1OzQB0OM8rLdGXOEsyQWSG ==================== Page number: 2 Items in page: 19 First item: Items remaining: 18 Next page token: None Then, for each page you can get all the resources on that page by iterating through it or using :func:`list`:: >>> list(page) [ , , , ] """ import abc class Page(object): """Single page of results in an iterator. Args: parent (google.api_core.page_iterator.Iterator): The iterator that owns the current page. items (Sequence[Any]): An iterable (that also defines __len__) of items from a raw API response. item_to_value (Callable[google.api_core.page_iterator.Iterator, Any]): Callable to convert an item from the type in the raw API response into the native object. Will be called with the iterator and a single item. raw_page Optional[google.protobuf.message.Message]: The raw page response. """ def __init__(self, parent, items, item_to_value, raw_page=None): self._parent = parent self._num_items = len(items) self._remaining = self._num_items self._item_iter = iter(items) self._item_to_value = item_to_value self._raw_page = raw_page @property def raw_page(self): """google.protobuf.message.Message""" return self._raw_page @property def num_items(self): """int: Total items in the page.""" return self._num_items @property def remaining(self): """int: Remaining items in the page.""" return self._remaining def __iter__(self): """The :class:`Page` is an iterator of items.""" return self def __next__(self): """Get the next value in the page.""" item = next(self._item_iter) result = self._item_to_value(self._parent, item) # Since we've successfully got the next value from the # iterator, we update the number of remaining. self._remaining -= 1 return result def _item_to_value_identity(iterator, item): """An item to value transformer that returns the item un-changed.""" # pylint: disable=unused-argument # We are conforming to the interface defined by Iterator. return item class Iterator(object, metaclass=abc.ABCMeta): """A generic class for iterating through API list responses. Args: client(google.cloud.client.Client): The API client. item_to_value (Callable[google.api_core.page_iterator.Iterator, Any]): Callable to convert an item from the type in the raw API response into the native object. Will be called with the iterator and a single item. page_token (str): A token identifying a page in a result set to start fetching results from. max_results (int): The maximum number of results to fetch. """ def __init__( self, client, item_to_value=_item_to_value_identity, page_token=None, max_results=None, ): self._started = False self.__active_iterator = None self.client = client """Optional[Any]: The client that created this iterator.""" self.item_to_value = item_to_value """Callable[Iterator, Any]: Callable to convert an item from the type in the raw API response into the native object. Will be called with the iterator and a single item. """ self.max_results = max_results """int: The maximum number of results to fetch""" # The attributes below will change over the life of the iterator. self.page_number = 0 """int: The current page of results.""" self.next_page_token = page_token """str: The token for the next page of results. If this is set before the iterator starts, it effectively offsets the iterator to a specific starting point.""" self.num_results = 0 """int: The total number of results fetched so far.""" @property def pages(self): """Iterator of pages in the response. returns: types.GeneratorType[google.api_core.page_iterator.Page]: A generator of page instances. raises: ValueError: If the iterator has already been started. """ if self._started: raise ValueError("Iterator has already started", self) self._started = True return self._page_iter(increment=True) def _items_iter(self): """Iterator for each item returned.""" for page in self._page_iter(increment=False): for item in page: self.num_results += 1 yield item def __iter__(self): """Iterator for each item returned. Returns: types.GeneratorType[Any]: A generator of items from the API. Raises: ValueError: If the iterator has already been started. """ if self._started: raise ValueError("Iterator has already started", self) self._started = True return self._items_iter() def __next__(self): if self.__active_iterator is None: self.__active_iterator = iter(self) return next(self.__active_iterator) def _page_iter(self, increment): """Generator of pages of API responses. Args: increment (bool): Flag indicating if the total number of results should be incremented on each page. This is useful since a page iterator will want to increment by results per page while an items iterator will want to increment per item. Yields: Page: each page of items from the API. """ page = self._next_page() while page is not None: self.page_number += 1 if increment: self.num_results += page.num_items yield page page = self._next_page() @abc.abstractmethod def _next_page(self): """Get the next page in the iterator. This does nothing and is intended to be over-ridden by subclasses to return the next :class:`Page`. Raises: NotImplementedError: Always, this method is abstract. """ raise NotImplementedError def _do_nothing_page_start(iterator, page, response): """Helper to provide custom behavior after a :class:`Page` is started. This is a do-nothing stand-in as the default value. Args: iterator (Iterator): An iterator that holds some request info. page (Page): The page that was just created. response (Any): The API response for a page. """ # pylint: disable=unused-argument pass class HTTPIterator(Iterator): """A generic class for iterating through HTTP/JSON API list responses. To make an iterator work, you'll need to provide a way to convert a JSON item returned from the API into the object of your choice (via ``item_to_value``). You also may need to specify a custom ``items_key`` so that a given response (containing a page of results) can be parsed into an iterable page of the actual objects you want. Args: client (google.cloud.client.Client): The API client. api_request (Callable): The function to use to make API requests. Generally, this will be :meth:`google.cloud._http.JSONConnection.api_request`. path (str): The method path to query for the list of items. item_to_value (Callable[google.api_core.page_iterator.Iterator, Any]): Callable to convert an item from the type in the JSON response into a native object. Will be called with the iterator and a single item. items_key (str): The key in the API response where the list of items can be found. page_token (str): A token identifying a page in a result set to start fetching results from. page_size (int): The maximum number of results to fetch per page max_results (int): The maximum number of results to fetch extra_params (dict): Extra query string parameters for the API call. page_start (Callable[ google.api_core.page_iterator.Iterator, google.api_core.page_iterator.Page, dict]): Callable to provide any special behavior after a new page has been created. Assumed signature takes the :class:`.Iterator` that started the page, the :class:`.Page` that was started and the dictionary containing the page response. next_token (str): The name of the field used in the response for page tokens. .. autoattribute:: pages """ _DEFAULT_ITEMS_KEY = "items" _PAGE_TOKEN = "pageToken" _MAX_RESULTS = "maxResults" _NEXT_TOKEN = "nextPageToken" _RESERVED_PARAMS = frozenset([_PAGE_TOKEN]) _HTTP_METHOD = "GET" def __init__( self, client, api_request, path, item_to_value, items_key=_DEFAULT_ITEMS_KEY, page_token=None, page_size=None, max_results=None, extra_params=None, page_start=_do_nothing_page_start, next_token=_NEXT_TOKEN, ): super(HTTPIterator, self).__init__( client, item_to_value, page_token=page_token, max_results=max_results ) self.api_request = api_request self.path = path self._items_key = items_key self.extra_params = extra_params self._page_size = page_size self._page_start = page_start self._next_token = next_token # Verify inputs / provide defaults. if self.extra_params is None: self.extra_params = {} self._verify_params() def _verify_params(self): """Verifies the parameters don't use any reserved parameter. Raises: ValueError: If a reserved parameter is used. """ reserved_in_use = self._RESERVED_PARAMS.intersection(self.extra_params) if reserved_in_use: raise ValueError("Using a reserved parameter", reserved_in_use) def _next_page(self): """Get the next page in the iterator. Returns: Optional[Page]: The next page in the iterator or :data:`None` if there are no pages left. """ if self._has_next_page(): response = self._get_next_page_response() items = response.get(self._items_key, ()) page = Page(self, items, self.item_to_value, raw_page=response) self._page_start(self, page, response) self.next_page_token = response.get(self._next_token) return page else: return None def _has_next_page(self): """Determines whether or not there are more pages with results. Returns: bool: Whether the iterator has more pages. """ if self.page_number == 0: return True if self.max_results is not None: if self.num_results >= self.max_results: return False return self.next_page_token is not None def _get_query_params(self): """Getter for query parameters for the next request. Returns: dict: A dictionary of query parameters. """ result = {} if self.next_page_token is not None: result[self._PAGE_TOKEN] = self.next_page_token page_size = None if self.max_results is not None: page_size = self.max_results - self.num_results if self._page_size is not None: page_size = min(page_size, self._page_size) elif self._page_size is not None: page_size = self._page_size if page_size is not None: result[self._MAX_RESULTS] = page_size result.update(self.extra_params) return result def _get_next_page_response(self): """Requests the next page from the path provided. Returns: dict: The parsed JSON response of the next page's contents. Raises: ValueError: If the HTTP method is not ``GET`` or ``POST``. """ params = self._get_query_params() if self._HTTP_METHOD == "GET": return self.api_request( method=self._HTTP_METHOD, path=self.path, query_params=params ) elif self._HTTP_METHOD == "POST": return self.api_request( method=self._HTTP_METHOD, path=self.path, data=params ) else: raise ValueError("Unexpected HTTP method", self._HTTP_METHOD) class _GAXIterator(Iterator): """A generic class for iterating through Cloud gRPC APIs list responses. Any: client (google.cloud.client.Client): The API client. page_iter (google.gax.PageIterator): A GAX page iterator to be wrapped to conform to the :class:`Iterator` interface. item_to_value (Callable[Iterator, Any]): Callable to convert an item from the protobuf response into a native object. Will be called with the iterator and a single item. max_results (int): The maximum number of results to fetch. .. autoattribute:: pages """ def __init__(self, client, page_iter, item_to_value, max_results=None): super(_GAXIterator, self).__init__( client, item_to_value, page_token=page_iter.page_token, max_results=max_results, ) self._gax_page_iter = page_iter def _next_page(self): """Get the next page in the iterator. Wraps the response from the :class:`~google.gax.PageIterator` in a :class:`Page` instance and captures some state at each page. Returns: Optional[Page]: The next page in the iterator or :data:`None` if there are no pages left. """ try: items = next(self._gax_page_iter) page = Page(self, items, self.item_to_value) self.next_page_token = self._gax_page_iter.page_token or None return page except StopIteration: return None class GRPCIterator(Iterator): """A generic class for iterating through gRPC list responses. .. note:: The class does not take a ``page_token`` argument because it can just be specified in the ``request``. Args: client (google.cloud.client.Client): The API client. This unused by this class, but kept to satisfy the :class:`Iterator` interface. method (Callable[protobuf.Message]): A bound gRPC method that should take a single message for the request. request (protobuf.Message): The request message. items_field (str): The field in the response message that has the items for the page. item_to_value (Callable[GRPCIterator, Any]): Callable to convert an item from the type in the JSON response into a native object. Will be called with the iterator and a single item. request_token_field (str): The field in the request message used to specify the page token. response_token_field (str): The field in the response message that has the token for the next page. max_results (int): The maximum number of results to fetch. .. autoattribute:: pages """ _DEFAULT_REQUEST_TOKEN_FIELD = "page_token" _DEFAULT_RESPONSE_TOKEN_FIELD = "next_page_token" def __init__( self, client, method, request, items_field, item_to_value=_item_to_value_identity, request_token_field=_DEFAULT_REQUEST_TOKEN_FIELD, response_token_field=_DEFAULT_RESPONSE_TOKEN_FIELD, max_results=None, ): super(GRPCIterator, self).__init__( client, item_to_value, max_results=max_results ) self._method = method self._request = request self._items_field = items_field self._request_token_field = request_token_field self._response_token_field = response_token_field def _next_page(self): """Get the next page in the iterator. Returns: Page: The next page in the iterator or :data:`None` if there are no pages left. """ if not self._has_next_page(): return None if self.next_page_token is not None: setattr(self._request, self._request_token_field, self.next_page_token) response = self._method(self._request) self.next_page_token = getattr(response, self._response_token_field) items = getattr(response, self._items_field) page = Page(self, items, self.item_to_value, raw_page=response) return page def _has_next_page(self): """Determines whether or not there are more pages with results. Returns: bool: Whether the iterator has more pages. """ if self.page_number == 0: return True if self.max_results is not None: if self.num_results >= self.max_results: return False # Note: intentionally a falsy check instead of a None check. The RPC # can return an empty string indicating no more pages. return True if self.next_page_token else False ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/page_iterator_async.py0000664000000000017530000002406614614221643022074 0ustar00root# Copyright 2020 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """AsyncIO iterators for paging through paged API methods. These iterators simplify the process of paging through API responses where the request takes a page token and the response is a list of results with a token for the next page. See `list pagination`_ in the Google API Style Guide for more details. .. _list pagination: https://cloud.google.com/apis/design/design_patterns#list_pagination API clients that have methods that follow the list pagination pattern can return an :class:`.AsyncIterator`: >>> results_iterator = await client.list_resources() Or you can walk your way through items and call off the search early if you find what you're looking for (resulting in possibly fewer requests):: >>> async for resource in results_iterator: ... print(resource.name) ... if not resource.is_valid: ... break At any point, you may check the number of items consumed by referencing the ``num_results`` property of the iterator:: >>> async for my_item in results_iterator: ... if results_iterator.num_results >= 10: ... break When iterating, not every new item will send a request to the server. To iterate based on each page of items (where a page corresponds to a request):: >>> async for page in results_iterator.pages: ... print('=' * 20) ... print(' Page number: {:d}'.format(iterator.page_number)) ... print(' Items in page: {:d}'.format(page.num_items)) ... print(' First item: {!r}'.format(next(page))) ... print('Items remaining: {:d}'.format(page.remaining)) ... print('Next page token: {}'.format(iterator.next_page_token)) ==================== Page number: 1 Items in page: 1 First item: Items remaining: 0 Next page token: eav1OzQB0OM8rLdGXOEsyQWSG ==================== Page number: 2 Items in page: 19 First item: Items remaining: 18 Next page token: None """ import abc from google.api_core.page_iterator import Page def _item_to_value_identity(iterator, item): """An item to value transformer that returns the item un-changed.""" # pylint: disable=unused-argument # We are conforming to the interface defined by Iterator. return item class AsyncIterator(abc.ABC): """A generic class for iterating through API list responses. Args: client(google.cloud.client.Client): The API client. item_to_value (Callable[google.api_core.page_iterator_async.AsyncIterator, Any]): Callable to convert an item from the type in the raw API response into the native object. Will be called with the iterator and a single item. page_token (str): A token identifying a page in a result set to start fetching results from. max_results (int): The maximum number of results to fetch. """ def __init__( self, client, item_to_value=_item_to_value_identity, page_token=None, max_results=None, ): self._started = False self.__active_aiterator = None self.client = client """Optional[Any]: The client that created this iterator.""" self.item_to_value = item_to_value """Callable[Iterator, Any]: Callable to convert an item from the type in the raw API response into the native object. Will be called with the iterator and a single item. """ self.max_results = max_results """int: The maximum number of results to fetch.""" # The attributes below will change over the life of the iterator. self.page_number = 0 """int: The current page of results.""" self.next_page_token = page_token """str: The token for the next page of results. If this is set before the iterator starts, it effectively offsets the iterator to a specific starting point.""" self.num_results = 0 """int: The total number of results fetched so far.""" @property def pages(self): """Iterator of pages in the response. returns: types.GeneratorType[google.api_core.page_iterator.Page]: A generator of page instances. raises: ValueError: If the iterator has already been started. """ if self._started: raise ValueError("Iterator has already started", self) self._started = True return self._page_aiter(increment=True) async def _items_aiter(self): """Iterator for each item returned.""" async for page in self._page_aiter(increment=False): for item in page: self.num_results += 1 yield item def __aiter__(self): """Iterator for each item returned. Returns: types.GeneratorType[Any]: A generator of items from the API. Raises: ValueError: If the iterator has already been started. """ if self._started: raise ValueError("Iterator has already started", self) self._started = True return self._items_aiter() async def __anext__(self): if self.__active_aiterator is None: self.__active_aiterator = self.__aiter__() return await self.__active_aiterator.__anext__() async def _page_aiter(self, increment): """Generator of pages of API responses. Args: increment (bool): Flag indicating if the total number of results should be incremented on each page. This is useful since a page iterator will want to increment by results per page while an items iterator will want to increment per item. Yields: Page: each page of items from the API. """ page = await self._next_page() while page is not None: self.page_number += 1 if increment: self.num_results += page.num_items yield page page = await self._next_page() @abc.abstractmethod async def _next_page(self): """Get the next page in the iterator. This does nothing and is intended to be over-ridden by subclasses to return the next :class:`Page`. Raises: NotImplementedError: Always, this method is abstract. """ raise NotImplementedError class AsyncGRPCIterator(AsyncIterator): """A generic class for iterating through gRPC list responses. .. note:: The class does not take a ``page_token`` argument because it can just be specified in the ``request``. Args: client (google.cloud.client.Client): The API client. This unused by this class, but kept to satisfy the :class:`Iterator` interface. method (Callable[protobuf.Message]): A bound gRPC method that should take a single message for the request. request (protobuf.Message): The request message. items_field (str): The field in the response message that has the items for the page. item_to_value (Callable[GRPCIterator, Any]): Callable to convert an item from the type in the JSON response into a native object. Will be called with the iterator and a single item. request_token_field (str): The field in the request message used to specify the page token. response_token_field (str): The field in the response message that has the token for the next page. max_results (int): The maximum number of results to fetch. .. autoattribute:: pages """ _DEFAULT_REQUEST_TOKEN_FIELD = "page_token" _DEFAULT_RESPONSE_TOKEN_FIELD = "next_page_token" def __init__( self, client, method, request, items_field, item_to_value=_item_to_value_identity, request_token_field=_DEFAULT_REQUEST_TOKEN_FIELD, response_token_field=_DEFAULT_RESPONSE_TOKEN_FIELD, max_results=None, ): super().__init__(client, item_to_value, max_results=max_results) self._method = method self._request = request self._items_field = items_field self._request_token_field = request_token_field self._response_token_field = response_token_field async def _next_page(self): """Get the next page in the iterator. Returns: Page: The next page in the iterator or :data:`None` if there are no pages left. """ if not self._has_next_page(): return None if self.next_page_token is not None: setattr(self._request, self._request_token_field, self.next_page_token) response = await self._method(self._request) self.next_page_token = getattr(response, self._response_token_field) items = getattr(response, self._items_field) page = Page(self, items, self.item_to_value, raw_page=response) return page def _has_next_page(self): """Determines whether or not there are more pages with results. Returns: bool: Whether the iterator has more pages. """ if self.page_number == 0: return True # Note: intentionally a falsy check instead of a None check. The RPC # can return an empty string indicating no more pages. if self.max_results is not None: if self.num_results >= self.max_results: return False return True if self.next_page_token else False ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1714496419.0 google-api-core-2.19.0/google/api_core/path_template.py0000664000000000017530000002664514614221643020706 0ustar00root# Copyright 2017 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Expand and validate URL path templates. This module provides the :func:`expand` and :func:`validate` functions for interacting with Google-style URL `path templates`_ which are commonly used in Google APIs for `resource names`_. .. _path templates: https://github.com/googleapis/googleapis/blob /57e2d376ac7ef48681554204a3ba78a414f2c533/google/api/http.proto#L212 .. _resource names: https://cloud.google.com/apis/design/resource_names """ from __future__ import unicode_literals from collections import deque import copy import functools import re # Regular expression for extracting variable parts from a path template. # The variables can be expressed as: # # - "*": a single-segment positional variable, for example: "books/*" # - "**": a multi-segment positional variable, for example: "shelf/**/book/*" # - "{name}": a single-segment wildcard named variable, for example # "books/{name}" # - "{name=*}: same as above. # - "{name=**}": a multi-segment wildcard named variable, for example # "shelf/{name=**}" # - "{name=/path/*/**}": a multi-segment named variable with a sub-template. _VARIABLE_RE = re.compile( r""" ( # Capture the entire variable expression (?P\*\*?) # Match & capture * and ** positional variables. | # Match & capture named variables {name} { (?P[^/]+?) # Optionally match and capture the named variable's template. (?:=(?P