grpclib-0.4.8rc2/LICENSE.txt

Copyright (c) 2019, Vladimir Magamedov
All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

* Redistributions of source code must retain the above copyright notice, this
  list of conditions and the following disclaimer.

* Redistributions in binary form must reproduce the above copyright notice,
  this list of conditions and the following disclaimer in the documentation
  and/or other materials provided with the distribution.

* Neither the name of grpclib nor the names of its contributors may be used
  to endorse or promote products derived from this software without specific
  prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

grpclib-0.4.8rc2/PKG-INFO

Metadata-Version: 2.1
Name: grpclib
Version: 0.4.8rc2
Summary: Pure-Python gRPC implementation for asyncio
Home-page: https://github.com/vmagamedov/grpclib
Author: Vladimir Magamedov
Author-email: vladimir@magamedov.com
License: BSD-3-Clause
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Topic :: Internet :: WWW/HTTP :: HTTP Servers
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3.8
Description-Content-Type: text/x-rst
Provides-Extra: protobuf

grpclib-0.4.8rc2/README.rst

Pure-Python gRPC implementation for asyncio
===========================================

.. image:: https://raw.githubusercontent.com/vshymanskyy/StandWithUkraine/7e1631d13476f1e870af0d5605b643fc14471a6d/banner-direct-single.svg
   :target: https://standforukraine.com

|project|_ |documentation|_ |version|_ |tag|_ |downloads|_ |license|_

This project is based on `hyper-h2`_ and **requires Python >= 3.8**.

.. contents::
   :local:

Example
~~~~~~~

See the `examples`_ directory in the project's repository for all available
examples.

Client
------

.. code-block:: python3

    import asyncio

    from grpclib.client import Channel

    # generated by protoc
    from .helloworld_pb2 import HelloRequest, HelloReply
    from .helloworld_grpc import GreeterStub


    async def main():
        async with Channel('127.0.0.1', 50051) as channel:
            greeter = GreeterStub(channel)

            reply = await greeter.SayHello(HelloRequest(name='Dr. Strange'))
            print(reply.message)


    if __name__ == '__main__':
        asyncio.run(main())

Server
------

.. code-block:: python3

    import asyncio

    from grpclib.utils import graceful_exit
    from grpclib.server import Server

    # generated by protoc
    from .helloworld_pb2 import HelloReply
    from .helloworld_grpc import GreeterBase


    class Greeter(GreeterBase):

        async def SayHello(self, stream):
            request = await stream.recv_message()
            message = f'Hello, {request.name}!'
            await stream.send_message(HelloReply(message=message))


    async def main(*, host='127.0.0.1', port=50051):
        server = Server([Greeter()])
        # Note: graceful_exit isn't supported on Windows
        with graceful_exit([server]):
            await server.start(host, port)
            print(f'Serving on {host}:{port}')
            await server.wait_closed()


    if __name__ == '__main__':
        asyncio.run(main())

Installation
~~~~~~~~~~~~

.. code-block:: console

    $ pip3 install "grpclib[protobuf]"

Bug fixes and new features are frequently published via release candidates:

.. code-block:: console

    $ pip3 install --upgrade --pre "grpclib[protobuf]"

For the code generation you will also need a ``protoc`` compiler, which can
be installed with the ``protobuf`` system package:

.. code-block:: console

    $ brew install protobuf  # example for macOS users
    $ protoc --version
    libprotoc ...

**Or** you can use the ``protoc`` compiler from the ``grpcio-tools`` Python
package:

.. code-block:: console

    $ pip3 install grpcio-tools
    $ python3 -m grpc_tools.protoc --version
    libprotoc ...
**Note:** the ``grpcio`` and ``grpcio-tools`` packages are **not required at
runtime**; the ``grpcio-tools`` package is used only during code generation.

``protoc`` plugin
~~~~~~~~~~~~~~~~~

In order to use this library you will have to generate special stub files
using the provided plugin, which can be used like this:

.. code-block:: console

    $ python3 -m grpc_tools.protoc -I. --python_out=. --grpclib_python_out=. helloworld/helloworld.proto
                                                        ^----- note -----^

This command will generate ``helloworld_pb2.py`` and ``helloworld_grpc.py``
files.

The plugin implementing the ``--grpclib_python_out`` option should be
available to the ``protoc`` compiler as the ``protoc-gen-grpclib_python``
executable, which ``pip`` installs into your ``$PATH`` when you install the
``grpclib`` library.

Changed in v0.3.2: the ``--python_grpc_out`` option was renamed to
``--grpclib_python_out``.

Contributing
~~~~~~~~~~~~

* Please submit an issue before working on a Pull Request
* Do not merge/squash/rebase your development branch while you work on a
  Pull Request, use rebase if this is really necessary
* You may use Tox_ in order to test and lint your changes, but it is OK to
  rely on CI for this matter

.. _gRPC: http://www.grpc.io
.. _hyper-h2: https://github.com/python-hyper/hyper-h2
.. _grpcio: https://pypi.org/project/grpcio/
.. _Tox: https://tox.readthedocs.io/
.. _examples: https://github.com/vmagamedov/grpclib/tree/master/examples
.. |version| image:: https://img.shields.io/pypi/v/grpclib.svg?label=stable&color=blue
.. _version: https://pypi.org/project/grpclib/
.. |license| image:: https://img.shields.io/pypi/l/grpclib.svg?color=blue
.. _license: https://github.com/vmagamedov/grpclib/blob/master/LICENSE.txt
.. |tag| image:: https://img.shields.io/github/tag/vmagamedov/grpclib.svg?label=latest&color=blue
.. _tag: https://pypi.org/project/grpclib/#history
.. |project| image:: https://img.shields.io/badge/vmagamedov%2Fgrpclib-blueviolet.svg?logo=github&color=blue
.. _project: https://github.com/vmagamedov/grpclib
.. |documentation| image:: https://img.shields.io/badge/docs-grpclib.rtfd.io-blue.svg
.. _documentation: https://grpclib.readthedocs.io/en/latest/
.. |downloads| image:: https://static.pepy.tech/badge/grpclib/month
.. _downloads: https://pepy.tech/project/grpclib
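For orientation, here is a rough sketch of what the generated
``helloworld_grpc.py`` might look like, modeled on the
``grpclib/channelz/v1/channelz_grpc.py`` module bundled in this package; the
actual plugin output is authoritative, and the ``/helloworld.Greeter/SayHello``
method name assumes the ``helloworld`` proto package used in the examples:

.. code-block:: python3

    import abc
    import typing

    import grpclib.const
    import grpclib.client

    from .helloworld_pb2 import HelloRequest, HelloReply


    class GreeterBase(abc.ABC):

        @abc.abstractmethod
        async def SayHello(self, stream):
            pass

        def __mapping__(self) -> typing.Dict[str, grpclib.const.Handler]:
            return {
                '/helloworld.Greeter/SayHello': grpclib.const.Handler(
                    self.SayHello,
                    grpclib.const.Cardinality.UNARY_UNARY,
                    HelloRequest,
                    HelloReply,
                ),
            }


    class GreeterStub:

        def __init__(self, channel: grpclib.client.Channel) -> None:
            self.SayHello = grpclib.client.UnaryUnaryMethod(
                channel,
                '/helloworld.Greeter/SayHello',
                HelloRequest,
                HelloReply,
            )

The ``__mapping__`` method is what lets ``grpclib.server.Server`` route an
incoming ``/helloworld.Greeter/SayHello`` request to your ``SayHello`` handler.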
grpclib-0.4.8rc2/grpclib/__init__.py

from .const import Status
from .exceptions import GRPCError

__version__ = '0.4.8rc2'

__all__ = (
    'Status',
    'GRPCError',
)

grpclib-0.4.8rc2/grpclib/_registry.py

import typing
import weakref

if typing.TYPE_CHECKING:
    from .server import Server
    from .client import Channel

servers: 'weakref.WeakSet[Server]' = weakref.WeakSet()
channels: 'weakref.WeakSet[Channel]' = weakref.WeakSet()

grpclib-0.4.8rc2/grpclib/_typing.py

from typing import Mapping, Any

from typing_extensions import Protocol

from . import const
from . import server


class IServable(Protocol):

    def __mapping__(self) -> Mapping[str, const.Handler]: ...


class ICheckable(Protocol):

    def __mapping__(self) -> Mapping[str, Any]: ...


class IClosable(Protocol):

    def close(self) -> None: ...


class IProtoMessage(Protocol):

    @classmethod
    def FromString(cls, s: bytes) -> 'IProtoMessage': ...

    def SerializeToString(self) -> bytes: ...


class IEventsTarget(Protocol):
    __dispatch__: Any  # FIXME: should be events._Dispatch


class IServerMethodFunc(Protocol):

    async def __call__(self, stream: 'server.Stream[Any, Any]') -> None: ...


class IReleaseStream(Protocol):

    def __call__(self) -> None: ...
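The classes in ``_typing.py`` are structural (PEP 544) protocols: grpclib
checks shapes, not base classes. A minimal sketch, not part of grpclib, of a
hand-written message type that satisfies ``IProtoMessage`` simply by providing
the right method pair (the ``RawBytesMessage`` name is hypothetical, and the
private-module import is for illustration only):

from grpclib._typing import IProtoMessage  # private module, illustration only


class RawBytesMessage:
    """Hypothetical message type; no protobuf or inheritance involved."""

    def __init__(self, payload: bytes) -> None:
        self.payload = payload

    @classmethod
    def FromString(cls, s: bytes) -> 'RawBytesMessage':
        # Deserialize: here the wire format is just the raw bytes themselves.
        return cls(s)

    def SerializeToString(self) -> bytes:
        # Serialize back to bytes, mirroring the protobuf message API.
        return self.payload


# A static type checker accepts this assignment structurally:
msg: IProtoMessage = RawBytesMessage(b'\x00')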
grpclib-0.4.8rc2/grpclib/channelz/__init__.py

grpclib-0.4.8rc2/grpclib/channelz/service.py

from ..const import Status
from ..server import Stream
from ..exceptions import GRPCError
from .v1.channelz_pb2 import GetTopChannelsRequest, GetTopChannelsResponse
from .v1.channelz_pb2 import GetServersRequest, GetServersResponse
from .v1.channelz_pb2 import GetServerRequest, GetServerResponse
from .v1.channelz_pb2 import GetServerSocketsRequest, GetServerSocketsResponse
from .v1.channelz_pb2 import GetChannelRequest, GetChannelResponse
from .v1.channelz_pb2 import GetSubchannelRequest, GetSubchannelResponse
from .v1.channelz_pb2 import GetSocketRequest, GetSocketResponse
from .v1.channelz_grpc import ChannelzBase


class Channelz(ChannelzBase):

    async def GetTopChannels(
        self,
        stream: 'Stream[GetTopChannelsRequest, GetTopChannelsResponse]',
    ) -> None:
        raise GRPCError(Status.UNIMPLEMENTED)

    async def GetServers(
        self,
        stream: 'Stream[GetServersRequest, GetServersResponse]',
    ) -> None:
        raise GRPCError(Status.UNIMPLEMENTED)

    async def GetServer(
        self,
        stream: 'Stream[GetServerRequest, GetServerResponse]',
    ) -> None:
        raise GRPCError(Status.UNIMPLEMENTED)

    async def GetServerSockets(
        self,
        stream: 'Stream[GetServerSocketsRequest, GetServerSocketsResponse]',
    ) -> None:
        raise GRPCError(Status.UNIMPLEMENTED)

    async def GetChannel(
        self,
        stream: 'Stream[GetChannelRequest, GetChannelResponse]',
    ) -> None:
        raise GRPCError(Status.UNIMPLEMENTED)

    async def GetSubchannel(
        self,
        stream: 'Stream[GetSubchannelRequest, GetSubchannelResponse]',
    ) -> None:
        raise GRPCError(Status.UNIMPLEMENTED)

    async def GetSocket(
        self,
        stream: 'Stream[GetSocketRequest, GetSocketResponse]',
    ) -> None:
        raise GRPCError(Status.UNIMPLEMENTED)
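Because ``Channelz`` inherits ``__mapping__`` from ``ChannelzBase``, it can be
served like any other grpclib service and queried with the generated stub
defined below. A minimal sketch, assuming a free local port 50051 and this
stub service, in which every method is expected to fail with UNIMPLEMENTED:

import asyncio

from grpclib.client import Channel
from grpclib.exceptions import GRPCError
from grpclib.server import Server
from grpclib.channelz.service import Channelz
from grpclib.channelz.v1.channelz_pb2 import GetServersRequest
from grpclib.channelz.v1.channelz_grpc import ChannelzStub


async def main() -> None:
    # Application services would be listed alongside Channelz() here.
    server = Server([Channelz()])
    await server.start('127.0.0.1', 50051)

    async with Channel('127.0.0.1', 50051) as channel:
        stub = ChannelzStub(channel)
        try:
            reply = await stub.GetServers(GetServersRequest())
            print(reply)
        except GRPCError as error:
            # Status.UNIMPLEMENTED with this stub implementation
            print(error.status)

    server.close()
    await server.wait_closed()


if __name__ == '__main__':
    asyncio.run(main())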
grpclib-0.4.8rc2/grpclib/channelz/v1/__init__.py

grpclib-0.4.8rc2/grpclib/channelz/v1/channelz_grpc.py

# Generated by the Protocol Buffers compiler. DO NOT EDIT!
# source: grpclib/channelz/v1/channelz.proto
# plugin: grpclib.plugin.main
import abc
import typing

import grpclib.const
import grpclib.client

if typing.TYPE_CHECKING:
    import grpclib.server

import google.protobuf.any_pb2
import google.protobuf.duration_pb2
import google.protobuf.timestamp_pb2
import google.protobuf.wrappers_pb2
import grpclib.channelz.v1.channelz_pb2


class ChannelzBase(abc.ABC):

    @abc.abstractmethod
    async def GetTopChannels(self, stream: 'grpclib.server.Stream[grpclib.channelz.v1.channelz_pb2.GetTopChannelsRequest, grpclib.channelz.v1.channelz_pb2.GetTopChannelsResponse]') -> None:
        pass

    @abc.abstractmethod
    async def GetServers(self, stream: 'grpclib.server.Stream[grpclib.channelz.v1.channelz_pb2.GetServersRequest, grpclib.channelz.v1.channelz_pb2.GetServersResponse]') -> None:
        pass

    @abc.abstractmethod
    async def GetServer(self, stream: 'grpclib.server.Stream[grpclib.channelz.v1.channelz_pb2.GetServerRequest, grpclib.channelz.v1.channelz_pb2.GetServerResponse]') -> None:
        pass

    @abc.abstractmethod
    async def GetServerSockets(self, stream: 'grpclib.server.Stream[grpclib.channelz.v1.channelz_pb2.GetServerSocketsRequest, grpclib.channelz.v1.channelz_pb2.GetServerSocketsResponse]') -> None:
        pass

    @abc.abstractmethod
    async def GetChannel(self, stream: 'grpclib.server.Stream[grpclib.channelz.v1.channelz_pb2.GetChannelRequest, grpclib.channelz.v1.channelz_pb2.GetChannelResponse]') -> None:
        pass

    @abc.abstractmethod
    async def GetSubchannel(self, stream: 'grpclib.server.Stream[grpclib.channelz.v1.channelz_pb2.GetSubchannelRequest, grpclib.channelz.v1.channelz_pb2.GetSubchannelResponse]') -> None:
        pass

    @abc.abstractmethod
    async def GetSocket(self, stream: 'grpclib.server.Stream[grpclib.channelz.v1.channelz_pb2.GetSocketRequest, grpclib.channelz.v1.channelz_pb2.GetSocketResponse]') -> None:
        pass

    def __mapping__(self) -> typing.Dict[str, grpclib.const.Handler]:
        return {
            '/grpc.channelz.v1.Channelz/GetTopChannels': grpclib.const.Handler(
                self.GetTopChannels,
                grpclib.const.Cardinality.UNARY_UNARY,
                grpclib.channelz.v1.channelz_pb2.GetTopChannelsRequest,
                grpclib.channelz.v1.channelz_pb2.GetTopChannelsResponse,
            ),
            '/grpc.channelz.v1.Channelz/GetServers': grpclib.const.Handler(
                self.GetServers,
                grpclib.const.Cardinality.UNARY_UNARY,
                grpclib.channelz.v1.channelz_pb2.GetServersRequest,
                grpclib.channelz.v1.channelz_pb2.GetServersResponse,
            ),
            '/grpc.channelz.v1.Channelz/GetServer': grpclib.const.Handler(
                self.GetServer,
                grpclib.const.Cardinality.UNARY_UNARY,
                grpclib.channelz.v1.channelz_pb2.GetServerRequest,
                grpclib.channelz.v1.channelz_pb2.GetServerResponse,
            ),
            '/grpc.channelz.v1.Channelz/GetServerSockets': grpclib.const.Handler(
                self.GetServerSockets,
                grpclib.const.Cardinality.UNARY_UNARY,
                grpclib.channelz.v1.channelz_pb2.GetServerSocketsRequest,
                grpclib.channelz.v1.channelz_pb2.GetServerSocketsResponse,
            ),
            '/grpc.channelz.v1.Channelz/GetChannel': grpclib.const.Handler(
                self.GetChannel,
                grpclib.const.Cardinality.UNARY_UNARY,
                grpclib.channelz.v1.channelz_pb2.GetChannelRequest,
                grpclib.channelz.v1.channelz_pb2.GetChannelResponse,
            ),
            '/grpc.channelz.v1.Channelz/GetSubchannel': grpclib.const.Handler(
                self.GetSubchannel,
                grpclib.const.Cardinality.UNARY_UNARY,
                grpclib.channelz.v1.channelz_pb2.GetSubchannelRequest,
                grpclib.channelz.v1.channelz_pb2.GetSubchannelResponse,
            ),
            '/grpc.channelz.v1.Channelz/GetSocket': grpclib.const.Handler(
                self.GetSocket,
                grpclib.const.Cardinality.UNARY_UNARY,
                grpclib.channelz.v1.channelz_pb2.GetSocketRequest,
                grpclib.channelz.v1.channelz_pb2.GetSocketResponse,
            ),
        }
class ChannelzStub:

    def __init__(self, channel: grpclib.client.Channel) -> None:
        self.GetTopChannels = grpclib.client.UnaryUnaryMethod(
            channel,
            '/grpc.channelz.v1.Channelz/GetTopChannels',
            grpclib.channelz.v1.channelz_pb2.GetTopChannelsRequest,
            grpclib.channelz.v1.channelz_pb2.GetTopChannelsResponse,
        )
        self.GetServers = grpclib.client.UnaryUnaryMethod(
            channel,
            '/grpc.channelz.v1.Channelz/GetServers',
            grpclib.channelz.v1.channelz_pb2.GetServersRequest,
            grpclib.channelz.v1.channelz_pb2.GetServersResponse,
        )
        self.GetServer = grpclib.client.UnaryUnaryMethod(
            channel,
            '/grpc.channelz.v1.Channelz/GetServer',
            grpclib.channelz.v1.channelz_pb2.GetServerRequest,
            grpclib.channelz.v1.channelz_pb2.GetServerResponse,
        )
        self.GetServerSockets = grpclib.client.UnaryUnaryMethod(
            channel,
            '/grpc.channelz.v1.Channelz/GetServerSockets',
            grpclib.channelz.v1.channelz_pb2.GetServerSocketsRequest,
            grpclib.channelz.v1.channelz_pb2.GetServerSocketsResponse,
        )
        self.GetChannel = grpclib.client.UnaryUnaryMethod(
            channel,
            '/grpc.channelz.v1.Channelz/GetChannel',
            grpclib.channelz.v1.channelz_pb2.GetChannelRequest,
            grpclib.channelz.v1.channelz_pb2.GetChannelResponse,
        )
        self.GetSubchannel = grpclib.client.UnaryUnaryMethod(
            channel,
            '/grpc.channelz.v1.Channelz/GetSubchannel',
            grpclib.channelz.v1.channelz_pb2.GetSubchannelRequest,
            grpclib.channelz.v1.channelz_pb2.GetSubchannelResponse,
        )
        self.GetSocket = grpclib.client.UnaryUnaryMethod(
            channel,
            '/grpc.channelz.v1.Channelz/GetSocket',
            grpclib.channelz.v1.channelz_pb2.GetSocketRequest,
            grpclib.channelz.v1.channelz_pb2.GetSocketResponse,
        )

grpclib-0.4.8rc2/grpclib/channelz/v1/channelz_pb2.py

# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: grpclib/channelz/v1/channelz.proto # Protobuf Python Version: 4.25.1 """Generated protocol buffer code.""" from google.protobuf import descriptor as _descriptor from google.protobuf import descriptor_pool as _descriptor_pool from google.protobuf import symbol_database as _symbol_database from google.protobuf.internal import builder as _builder # @@protoc_insertion_point(imports) _sym_db = _symbol_database.Default() from google.protobuf import any_pb2 as google_dot_protobuf_dot_any__pb2 from google.protobuf import duration_pb2 as google_dot_protobuf_dot_duration__pb2 from google.protobuf import timestamp_pb2 as google_dot_protobuf_dot_timestamp__pb2 from google.protobuf import wrappers_pb2 as google_dot_protobuf_dot_wrappers__pb2 DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\"grpclib/channelz/v1/channelz.proto\x12\x10grpc.channelz.v1\x1a\x19google/protobuf/any.proto\x1a\x1egoogle/protobuf/duration.proto\x1a\x1fgoogle/protobuf/timestamp.proto\x1a\x1egoogle/protobuf/wrappers.proto\"\xfe\x01\n\x07\x43hannel\x12)\n\x03ref\x18\x01 \x01(\x0b\x32\x1c.grpc.channelz.v1.ChannelRef\x12+\n\x04\x64\x61ta\x18\x02 \x01(\x0b\x32\x1d.grpc.channelz.v1.ChannelData\x12\x31\n\x0b\x63hannel_ref\x18\x03 \x03(\x0b\x32\x1c.grpc.channelz.v1.ChannelRef\x12\x37\n\x0esubchannel_ref\x18\x04 \x03(\x0b\x32\x1f.grpc.channelz.v1.SubchannelRef\x12/\n\nsocket_ref\x18\x05 \x03(\x0b\x32\x1b.grpc.channelz.v1.SocketRef\"\x84\x02\n\nSubchannel\x12,\n\x03ref\x18\x01 \x01(\x0b\x32\x1f.grpc.channelz.v1.SubchannelRef\x12+\n\x04\x64\x61ta\x18\x02 \x01(\x0b\x32\x1d.grpc.channelz.v1.ChannelData\x12\x31\n\x0b\x63hannel_ref\x18\x03 \x03(\x0b\x32\x1c.grpc.channelz.v1.ChannelRef\x12\x37\n\x0esubchannel_ref\x18\x04 \x03(\x0b\x32\x1f.grpc.channelz.v1.SubchannelRef\x12/\n\nsocket_ref\x18\x05 \x03(\x0b\x32\x1b.grpc.channelz.v1.SocketRef\"\xbb\x01\n\x18\x43hannelConnectivityState\x12?\n\x05state\x18\x01 \x01(\x0e\x32\x30.grpc.channelz.v1.ChannelConnectivityState.State\"^\n\x05State\x12\x0b\n\x07UNKNOWN\x10\x00\x12\x08\n\x04IDLE\x10\x01\x12\x0e\n\nCONNECTING\x10\x02\x12\t\n\x05READY\x10\x03\x12\x15\n\x11TRANSIENT_FAILURE\x10\x04\x12\x0c\n\x08SHUTDOWN\x10\x05\"\x8e\x02\n\x0b\x43hannelData\x12\x39\n\x05state\x18\x01 \x01(\x0b\x32*.grpc.channelz.v1.ChannelConnectivityState\x12\x0e\n\x06target\x18\x02 \x01(\t\x12-\n\x05trace\x18\x03 \x01(\x0b\x32\x1e.grpc.channelz.v1.ChannelTrace\x12\x15\n\rcalls_started\x18\x04 \x01(\x03\x12\x17\n\x0f\x63\x61lls_succeeded\x18\x05 \x01(\x03\x12\x14\n\x0c\x63\x61lls_failed\x18\x06 \x01(\x03\x12?\n\x1blast_call_started_timestamp\x18\x07 \x01(\x0b\x32\x1a.google.protobuf.Timestamp\"\xdb\x02\n\x11\x43hannelTraceEvent\x12\x13\n\x0b\x64\x65scription\x18\x01 \x01(\t\x12>\n\x08severity\x18\x02 \x01(\x0e\x32,.grpc.channelz.v1.ChannelTraceEvent.Severity\x12-\n\ttimestamp\x18\x03 \x01(\x0b\x32\x1a.google.protobuf.Timestamp\x12\x33\n\x0b\x63hannel_ref\x18\x04 \x01(\x0b\x32\x1c.grpc.channelz.v1.ChannelRefH\x00\x12\x39\n\x0esubchannel_ref\x18\x05 \x01(\x0b\x32\x1f.grpc.channelz.v1.SubchannelRefH\x00\"E\n\x08Severity\x12\x0e\n\nCT_UNKNOWN\x10\x00\x12\x0b\n\x07\x43T_INFO\x10\x01\x12\x0e\n\nCT_WARNING\x10\x02\x12\x0c\n\x08\x43T_ERROR\x10\x03\x42\x0b\n\tchild_ref\"\x96\x01\n\x0c\x43hannelTrace\x12\x19\n\x11num_events_logged\x18\x01 \x01(\x03\x12\x36\n\x12\x63reation_timestamp\x18\x02 \x01(\x0b\x32\x1a.google.protobuf.Timestamp\x12\x33\n\x06\x65vents\x18\x03 \x03(\x0b\x32#.grpc.channelz.v1.ChannelTraceEvent\"R\n\nChannelRef\x12\x12\n\nchannel_id\x18\x01 \x01(\x03\x12\x0c\n\x04name\x18\x02 
\x01(\tJ\x04\x08\x03\x10\x04J\x04\x08\x04\x10\x05J\x04\x08\x05\x10\x06J\x04\x08\x06\x10\x07J\x04\x08\x07\x10\x08J\x04\x08\x08\x10\t\"X\n\rSubchannelRef\x12\x15\n\rsubchannel_id\x18\x07 \x01(\x03\x12\x0c\n\x04name\x18\x08 \x01(\tJ\x04\x08\x01\x10\x02J\x04\x08\x02\x10\x03J\x04\x08\x03\x10\x04J\x04\x08\x04\x10\x05J\x04\x08\x05\x10\x06J\x04\x08\x06\x10\x07\"P\n\tSocketRef\x12\x11\n\tsocket_id\x18\x03 \x01(\x03\x12\x0c\n\x04name\x18\x04 \x01(\tJ\x04\x08\x01\x10\x02J\x04\x08\x02\x10\x03J\x04\x08\x05\x10\x06J\x04\x08\x06\x10\x07J\x04\x08\x07\x10\x08J\x04\x08\x08\x10\t\"P\n\tServerRef\x12\x11\n\tserver_id\x18\x05 \x01(\x03\x12\x0c\n\x04name\x18\x06 \x01(\tJ\x04\x08\x01\x10\x02J\x04\x08\x02\x10\x03J\x04\x08\x03\x10\x04J\x04\x08\x04\x10\x05J\x04\x08\x07\x10\x08J\x04\x08\x08\x10\t\"\x92\x01\n\x06Server\x12(\n\x03ref\x18\x01 \x01(\x0b\x32\x1b.grpc.channelz.v1.ServerRef\x12*\n\x04\x64\x61ta\x18\x02 \x01(\x0b\x32\x1c.grpc.channelz.v1.ServerData\x12\x32\n\rlisten_socket\x18\x03 \x03(\x0b\x32\x1b.grpc.channelz.v1.SocketRef\"\xc2\x01\n\nServerData\x12-\n\x05trace\x18\x01 \x01(\x0b\x32\x1e.grpc.channelz.v1.ChannelTrace\x12\x15\n\rcalls_started\x18\x02 \x01(\x03\x12\x17\n\x0f\x63\x61lls_succeeded\x18\x03 \x01(\x03\x12\x14\n\x0c\x63\x61lls_failed\x18\x04 \x01(\x03\x12?\n\x1blast_call_started_timestamp\x18\x05 \x01(\x0b\x32\x1a.google.protobuf.Timestamp\"\xf6\x01\n\x06Socket\x12(\n\x03ref\x18\x01 \x01(\x0b\x32\x1b.grpc.channelz.v1.SocketRef\x12*\n\x04\x64\x61ta\x18\x02 \x01(\x0b\x32\x1c.grpc.channelz.v1.SocketData\x12(\n\x05local\x18\x03 \x01(\x0b\x32\x19.grpc.channelz.v1.Address\x12)\n\x06remote\x18\x04 \x01(\x0b\x32\x19.grpc.channelz.v1.Address\x12,\n\x08security\x18\x05 \x01(\x0b\x32\x1a.grpc.channelz.v1.Security\x12\x13\n\x0bremote_name\x18\x06 \x01(\t\"\xee\x04\n\nSocketData\x12\x17\n\x0fstreams_started\x18\x01 \x01(\x03\x12\x19\n\x11streams_succeeded\x18\x02 \x01(\x03\x12\x16\n\x0estreams_failed\x18\x03 \x01(\x03\x12\x15\n\rmessages_sent\x18\x04 \x01(\x03\x12\x19\n\x11messages_received\x18\x05 \x01(\x03\x12\x18\n\x10keep_alives_sent\x18\x06 \x01(\x03\x12G\n#last_local_stream_created_timestamp\x18\x07 \x01(\x0b\x32\x1a.google.protobuf.Timestamp\x12H\n$last_remote_stream_created_timestamp\x18\x08 \x01(\x0b\x32\x1a.google.protobuf.Timestamp\x12?\n\x1blast_message_sent_timestamp\x18\t \x01(\x0b\x32\x1a.google.protobuf.Timestamp\x12\x43\n\x1flast_message_received_timestamp\x18\n \x01(\x0b\x32\x1a.google.protobuf.Timestamp\x12>\n\x19local_flow_control_window\x18\x0b \x01(\x0b\x32\x1b.google.protobuf.Int64Value\x12?\n\x1aremote_flow_control_window\x18\x0c \x01(\x0b\x32\x1b.google.protobuf.Int64Value\x12.\n\x06option\x18\r \x03(\x0b\x32\x1e.grpc.channelz.v1.SocketOption\"\xe8\x02\n\x07\x41\x64\x64ress\x12?\n\rtcpip_address\x18\x01 \x01(\x0b\x32&.grpc.channelz.v1.Address.TcpIpAddressH\x00\x12;\n\x0buds_address\x18\x02 \x01(\x0b\x32$.grpc.channelz.v1.Address.UdsAddressH\x00\x12?\n\rother_address\x18\x03 \x01(\x0b\x32&.grpc.channelz.v1.Address.OtherAddressH\x00\x1a\x30\n\x0cTcpIpAddress\x12\x12\n\nip_address\x18\x01 \x01(\x0c\x12\x0c\n\x04port\x18\x02 \x01(\x05\x1a\x1e\n\nUdsAddress\x12\x10\n\x08\x66ilename\x18\x01 \x01(\t\x1a\x41\n\x0cOtherAddress\x12\x0c\n\x04name\x18\x01 \x01(\t\x12#\n\x05value\x18\x02 \x01(\x0b\x32\x14.google.protobuf.AnyB\t\n\x07\x61\x64\x64ress\"\xbe\x02\n\x08Security\x12-\n\x03tls\x18\x01 \x01(\x0b\x32\x1e.grpc.channelz.v1.Security.TlsH\x00\x12\x39\n\x05other\x18\x02 \x01(\x0b\x32(.grpc.channelz.v1.Security.OtherSecurityH\x00\x1a{\n\x03Tls\x12\x17\n\rstandard_name\x18\x01 
\x01(\tH\x00\x12\x14\n\nother_name\x18\x02 \x01(\tH\x00\x12\x19\n\x11local_certificate\x18\x03 \x01(\x0c\x12\x1a\n\x12remote_certificate\x18\x04 \x01(\x0c\x42\x0e\n\x0c\x63ipher_suite\x1a\x42\n\rOtherSecurity\x12\x0c\n\x04name\x18\x01 \x01(\t\x12#\n\x05value\x18\x02 \x01(\x0b\x32\x14.google.protobuf.AnyB\x07\n\x05model\"U\n\x0cSocketOption\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\r\n\x05value\x18\x02 \x01(\t\x12(\n\nadditional\x18\x03 \x01(\x0b\x32\x14.google.protobuf.Any\"B\n\x13SocketOptionTimeout\x12+\n\x08\x64uration\x18\x01 \x01(\x0b\x32\x19.google.protobuf.Duration\"Q\n\x12SocketOptionLinger\x12\x0e\n\x06\x61\x63tive\x18\x01 \x01(\x08\x12+\n\x08\x64uration\x18\x02 \x01(\x0b\x32\x19.google.protobuf.Duration\"\xae\x05\n\x13SocketOptionTcpInfo\x12\x12\n\ntcpi_state\x18\x01 \x01(\r\x12\x15\n\rtcpi_ca_state\x18\x02 \x01(\r\x12\x18\n\x10tcpi_retransmits\x18\x03 \x01(\r\x12\x13\n\x0btcpi_probes\x18\x04 \x01(\r\x12\x14\n\x0ctcpi_backoff\x18\x05 \x01(\r\x12\x14\n\x0ctcpi_options\x18\x06 \x01(\r\x12\x17\n\x0ftcpi_snd_wscale\x18\x07 \x01(\r\x12\x17\n\x0ftcpi_rcv_wscale\x18\x08 \x01(\r\x12\x10\n\x08tcpi_rto\x18\t \x01(\r\x12\x10\n\x08tcpi_ato\x18\n \x01(\r\x12\x14\n\x0ctcpi_snd_mss\x18\x0b \x01(\r\x12\x14\n\x0ctcpi_rcv_mss\x18\x0c \x01(\r\x12\x14\n\x0ctcpi_unacked\x18\r \x01(\r\x12\x13\n\x0btcpi_sacked\x18\x0e \x01(\r\x12\x11\n\ttcpi_lost\x18\x0f \x01(\r\x12\x14\n\x0ctcpi_retrans\x18\x10 \x01(\r\x12\x14\n\x0ctcpi_fackets\x18\x11 \x01(\r\x12\x1b\n\x13tcpi_last_data_sent\x18\x12 \x01(\r\x12\x1a\n\x12tcpi_last_ack_sent\x18\x13 \x01(\r\x12\x1b\n\x13tcpi_last_data_recv\x18\x14 \x01(\r\x12\x1a\n\x12tcpi_last_ack_recv\x18\x15 \x01(\r\x12\x11\n\ttcpi_pmtu\x18\x16 \x01(\r\x12\x19\n\x11tcpi_rcv_ssthresh\x18\x17 \x01(\r\x12\x10\n\x08tcpi_rtt\x18\x18 \x01(\r\x12\x13\n\x0btcpi_rttvar\x18\x19 \x01(\r\x12\x19\n\x11tcpi_snd_ssthresh\x18\x1a \x01(\r\x12\x15\n\rtcpi_snd_cwnd\x18\x1b \x01(\r\x12\x13\n\x0btcpi_advmss\x18\x1c \x01(\r\x12\x17\n\x0ftcpi_reordering\x18\x1d \x01(\r\"F\n\x15GetTopChannelsRequest\x12\x18\n\x10start_channel_id\x18\x01 \x01(\x03\x12\x13\n\x0bmax_results\x18\x02 \x01(\x03\"Q\n\x16GetTopChannelsResponse\x12*\n\x07\x63hannel\x18\x01 \x03(\x0b\x32\x19.grpc.channelz.v1.Channel\x12\x0b\n\x03\x65nd\x18\x02 \x01(\x08\"A\n\x11GetServersRequest\x12\x17\n\x0fstart_server_id\x18\x01 \x01(\x03\x12\x13\n\x0bmax_results\x18\x02 \x01(\x03\"K\n\x12GetServersResponse\x12(\n\x06server\x18\x01 \x03(\x0b\x32\x18.grpc.channelz.v1.Server\x12\x0b\n\x03\x65nd\x18\x02 \x01(\x08\"%\n\x10GetServerRequest\x12\x11\n\tserver_id\x18\x01 \x01(\x03\"=\n\x11GetServerResponse\x12(\n\x06server\x18\x01 \x01(\x0b\x32\x18.grpc.channelz.v1.Server\"Z\n\x17GetServerSocketsRequest\x12\x11\n\tserver_id\x18\x01 \x01(\x03\x12\x17\n\x0fstart_socket_id\x18\x02 \x01(\x03\x12\x13\n\x0bmax_results\x18\x03 \x01(\x03\"X\n\x18GetServerSocketsResponse\x12/\n\nsocket_ref\x18\x01 \x03(\x0b\x32\x1b.grpc.channelz.v1.SocketRef\x12\x0b\n\x03\x65nd\x18\x02 \x01(\x08\"\'\n\x11GetChannelRequest\x12\x12\n\nchannel_id\x18\x01 \x01(\x03\"@\n\x12GetChannelResponse\x12*\n\x07\x63hannel\x18\x01 \x01(\x0b\x32\x19.grpc.channelz.v1.Channel\"-\n\x14GetSubchannelRequest\x12\x15\n\rsubchannel_id\x18\x01 \x01(\x03\"I\n\x15GetSubchannelResponse\x12\x30\n\nsubchannel\x18\x01 \x01(\x0b\x32\x1c.grpc.channelz.v1.Subchannel\"6\n\x10GetSocketRequest\x12\x11\n\tsocket_id\x18\x01 \x01(\x03\x12\x0f\n\x07summary\x18\x02 \x01(\x08\"=\n\x11GetSocketResponse\x12(\n\x06socket\x18\x01 
\x01(\x0b\x32\x18.grpc.channelz.v1.Socket2\x9a\x05\n\x08\x43hannelz\x12\x63\n\x0eGetTopChannels\x12\'.grpc.channelz.v1.GetTopChannelsRequest\x1a(.grpc.channelz.v1.GetTopChannelsResponse\x12W\n\nGetServers\x12#.grpc.channelz.v1.GetServersRequest\x1a$.grpc.channelz.v1.GetServersResponse\x12T\n\tGetServer\x12\".grpc.channelz.v1.GetServerRequest\x1a#.grpc.channelz.v1.GetServerResponse\x12i\n\x10GetServerSockets\x12).grpc.channelz.v1.GetServerSocketsRequest\x1a*.grpc.channelz.v1.GetServerSocketsResponse\x12W\n\nGetChannel\x12#.grpc.channelz.v1.GetChannelRequest\x1a$.grpc.channelz.v1.GetChannelResponse\x12`\n\rGetSubchannel\x12&.grpc.channelz.v1.GetSubchannelRequest\x1a\'.grpc.channelz.v1.GetSubchannelResponse\x12T\n\tGetSocket\x12\".grpc.channelz.v1.GetSocketRequest\x1a#.grpc.channelz.v1.GetSocketResponseBX\n\x13io.grpc.channelz.v1B\rChannelzProtoP\x01Z0google.golang.org/grpc/channelz/grpc_channelz_v1b\x06proto3') _globals = globals() _builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals) _builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'grpclib.channelz.v1.channelz_pb2', _globals) if _descriptor._USE_C_DESCRIPTORS == False: _globals['DESCRIPTOR']._options = None _globals['DESCRIPTOR']._serialized_options = b'\n\023io.grpc.channelz.v1B\rChannelzProtoP\001Z0google.golang.org/grpc/channelz/grpc_channelz_v1' _globals['_CHANNEL']._serialized_start=181 _globals['_CHANNEL']._serialized_end=435 _globals['_SUBCHANNEL']._serialized_start=438 _globals['_SUBCHANNEL']._serialized_end=698 _globals['_CHANNELCONNECTIVITYSTATE']._serialized_start=701 _globals['_CHANNELCONNECTIVITYSTATE']._serialized_end=888 _globals['_CHANNELCONNECTIVITYSTATE_STATE']._serialized_start=794 _globals['_CHANNELCONNECTIVITYSTATE_STATE']._serialized_end=888 _globals['_CHANNELDATA']._serialized_start=891 _globals['_CHANNELDATA']._serialized_end=1161 _globals['_CHANNELTRACEEVENT']._serialized_start=1164 _globals['_CHANNELTRACEEVENT']._serialized_end=1511 _globals['_CHANNELTRACEEVENT_SEVERITY']._serialized_start=1429 _globals['_CHANNELTRACEEVENT_SEVERITY']._serialized_end=1498 _globals['_CHANNELTRACE']._serialized_start=1514 _globals['_CHANNELTRACE']._serialized_end=1664 _globals['_CHANNELREF']._serialized_start=1666 _globals['_CHANNELREF']._serialized_end=1748 _globals['_SUBCHANNELREF']._serialized_start=1750 _globals['_SUBCHANNELREF']._serialized_end=1838 _globals['_SOCKETREF']._serialized_start=1840 _globals['_SOCKETREF']._serialized_end=1920 _globals['_SERVERREF']._serialized_start=1922 _globals['_SERVERREF']._serialized_end=2002 _globals['_SERVER']._serialized_start=2005 _globals['_SERVER']._serialized_end=2151 _globals['_SERVERDATA']._serialized_start=2154 _globals['_SERVERDATA']._serialized_end=2348 _globals['_SOCKET']._serialized_start=2351 _globals['_SOCKET']._serialized_end=2597 _globals['_SOCKETDATA']._serialized_start=2600 _globals['_SOCKETDATA']._serialized_end=3222 _globals['_ADDRESS']._serialized_start=3225 _globals['_ADDRESS']._serialized_end=3585 _globals['_ADDRESS_TCPIPADDRESS']._serialized_start=3427 _globals['_ADDRESS_TCPIPADDRESS']._serialized_end=3475 _globals['_ADDRESS_UDSADDRESS']._serialized_start=3477 _globals['_ADDRESS_UDSADDRESS']._serialized_end=3507 _globals['_ADDRESS_OTHERADDRESS']._serialized_start=3509 _globals['_ADDRESS_OTHERADDRESS']._serialized_end=3574 _globals['_SECURITY']._serialized_start=3588 _globals['_SECURITY']._serialized_end=3906 _globals['_SECURITY_TLS']._serialized_start=3706 _globals['_SECURITY_TLS']._serialized_end=3829 
_globals['_SECURITY_OTHERSECURITY']._serialized_start=3831
_globals['_SECURITY_OTHERSECURITY']._serialized_end=3897
_globals['_SOCKETOPTION']._serialized_start=3908
_globals['_SOCKETOPTION']._serialized_end=3993
_globals['_SOCKETOPTIONTIMEOUT']._serialized_start=3995
_globals['_SOCKETOPTIONTIMEOUT']._serialized_end=4061
_globals['_SOCKETOPTIONLINGER']._serialized_start=4063
_globals['_SOCKETOPTIONLINGER']._serialized_end=4144
_globals['_SOCKETOPTIONTCPINFO']._serialized_start=4147
_globals['_SOCKETOPTIONTCPINFO']._serialized_end=4833
_globals['_GETTOPCHANNELSREQUEST']._serialized_start=4835
_globals['_GETTOPCHANNELSREQUEST']._serialized_end=4905
_globals['_GETTOPCHANNELSRESPONSE']._serialized_start=4907
_globals['_GETTOPCHANNELSRESPONSE']._serialized_end=4988
_globals['_GETSERVERSREQUEST']._serialized_start=4990
_globals['_GETSERVERSREQUEST']._serialized_end=5055
_globals['_GETSERVERSRESPONSE']._serialized_start=5057
_globals['_GETSERVERSRESPONSE']._serialized_end=5132
_globals['_GETSERVERREQUEST']._serialized_start=5134
_globals['_GETSERVERREQUEST']._serialized_end=5171
_globals['_GETSERVERRESPONSE']._serialized_start=5173
_globals['_GETSERVERRESPONSE']._serialized_end=5234
_globals['_GETSERVERSOCKETSREQUEST']._serialized_start=5236
_globals['_GETSERVERSOCKETSREQUEST']._serialized_end=5326
_globals['_GETSERVERSOCKETSRESPONSE']._serialized_start=5328
_globals['_GETSERVERSOCKETSRESPONSE']._serialized_end=5416
_globals['_GETCHANNELREQUEST']._serialized_start=5418
_globals['_GETCHANNELREQUEST']._serialized_end=5457
_globals['_GETCHANNELRESPONSE']._serialized_start=5459
_globals['_GETCHANNELRESPONSE']._serialized_end=5523
_globals['_GETSUBCHANNELREQUEST']._serialized_start=5525
_globals['_GETSUBCHANNELREQUEST']._serialized_end=5570
_globals['_GETSUBCHANNELRESPONSE']._serialized_start=5572
_globals['_GETSUBCHANNELRESPONSE']._serialized_end=5645
_globals['_GETSOCKETREQUEST']._serialized_start=5647
_globals['_GETSOCKETREQUEST']._serialized_end=5701
_globals['_GETSOCKETRESPONSE']._serialized_start=5703
_globals['_GETSOCKETRESPONSE']._serialized_end=5764
_globals['_CHANNELZ']._serialized_start=5767
_globals['_CHANNELZ']._serialized_end=6433
# @@protoc_insertion_point(module_scope)

grpclib-0.4.8rc2/grpclib/channelz/v1/channelz_pb2.pyi

"""
@generated by mypy-protobuf. Do not edit manually!
isort:skip_file
This file defines an interface for exporting monitoring information
out of gRPC servers.
See the full design at https://github.com/grpc/proposal/blob/master/A14-channelz.md The canonical version of this proto can be found at https://github.com/grpc/grpc-proto/blob/master/grpc/channelz/v1/channelz.proto """ import builtins import collections.abc import google.protobuf.any_pb2 import google.protobuf.descriptor import google.protobuf.duration_pb2 import google.protobuf.internal.containers import google.protobuf.internal.enum_type_wrapper import google.protobuf.message import google.protobuf.timestamp_pb2 import google.protobuf.wrappers_pb2 import sys import typing if sys.version_info >= (3, 10): import typing as typing_extensions else: import typing_extensions DESCRIPTOR: google.protobuf.descriptor.FileDescriptor @typing.final class Channel(google.protobuf.message.Message): """Channel is a logical grouping of channels, subchannels, and sockets.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor REF_FIELD_NUMBER: builtins.int DATA_FIELD_NUMBER: builtins.int CHANNEL_REF_FIELD_NUMBER: builtins.int SUBCHANNEL_REF_FIELD_NUMBER: builtins.int SOCKET_REF_FIELD_NUMBER: builtins.int @property def ref(self) -> global___ChannelRef: """The identifier for this channel. This should be set.""" @property def data(self) -> global___ChannelData: """Data specific to this channel. At most one of 'channel_ref+subchannel_ref' and 'socket' is set. """ @property def channel_ref(self) -> google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___ChannelRef]: """There are no ordering guarantees on the order of channel refs. There may not be cycles in the ref graph. A channel ref may be present in more than one channel or subchannel. """ @property def subchannel_ref(self) -> google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___SubchannelRef]: """At most one of 'channel_ref+subchannel_ref' and 'socket' is set. There are no ordering guarantees on the order of subchannel refs. There may not be cycles in the ref graph. A sub channel ref may be present in more than one channel or subchannel. """ @property def socket_ref(self) -> google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___SocketRef]: """There are no ordering guarantees on the order of sockets.""" def __init__( self, *, ref: global___ChannelRef | None = ..., data: global___ChannelData | None = ..., channel_ref: collections.abc.Iterable[global___ChannelRef] | None = ..., subchannel_ref: collections.abc.Iterable[global___SubchannelRef] | None = ..., socket_ref: collections.abc.Iterable[global___SocketRef] | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["data", b"data", "ref", b"ref"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["channel_ref", b"channel_ref", "data", b"data", "ref", b"ref", "socket_ref", b"socket_ref", "subchannel_ref", b"subchannel_ref"]) -> None: ... global___Channel = Channel @typing.final class Subchannel(google.protobuf.message.Message): """Subchannel is a logical grouping of channels, subchannels, and sockets. A subchannel is load balanced over by it's ancestor """ DESCRIPTOR: google.protobuf.descriptor.Descriptor REF_FIELD_NUMBER: builtins.int DATA_FIELD_NUMBER: builtins.int CHANNEL_REF_FIELD_NUMBER: builtins.int SUBCHANNEL_REF_FIELD_NUMBER: builtins.int SOCKET_REF_FIELD_NUMBER: builtins.int @property def ref(self) -> global___SubchannelRef: """The identifier for this channel.""" @property def data(self) -> global___ChannelData: """Data specific to this channel. 
At most one of 'channel_ref+subchannel_ref' and 'socket' is set. """ @property def channel_ref(self) -> google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___ChannelRef]: """There are no ordering guarantees on the order of channel refs. There may not be cycles in the ref graph. A channel ref may be present in more than one channel or subchannel. """ @property def subchannel_ref(self) -> google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___SubchannelRef]: """At most one of 'channel_ref+subchannel_ref' and 'socket' is set. There are no ordering guarantees on the order of subchannel refs. There may not be cycles in the ref graph. A sub channel ref may be present in more than one channel or subchannel. """ @property def socket_ref(self) -> google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___SocketRef]: """There are no ordering guarantees on the order of sockets.""" def __init__( self, *, ref: global___SubchannelRef | None = ..., data: global___ChannelData | None = ..., channel_ref: collections.abc.Iterable[global___ChannelRef] | None = ..., subchannel_ref: collections.abc.Iterable[global___SubchannelRef] | None = ..., socket_ref: collections.abc.Iterable[global___SocketRef] | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["data", b"data", "ref", b"ref"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["channel_ref", b"channel_ref", "data", b"data", "ref", b"ref", "socket_ref", b"socket_ref", "subchannel_ref", b"subchannel_ref"]) -> None: ... global___Subchannel = Subchannel @typing.final class ChannelConnectivityState(google.protobuf.message.Message): """These come from the specified states in this document: https://github.com/grpc/grpc/blob/master/doc/connectivity-semantics-and-api.md """ DESCRIPTOR: google.protobuf.descriptor.Descriptor class _State: ValueType = typing.NewType("ValueType", builtins.int) V: typing_extensions.TypeAlias = ValueType class _StateEnumTypeWrapper(google.protobuf.internal.enum_type_wrapper._EnumTypeWrapper[ChannelConnectivityState._State.ValueType], builtins.type): DESCRIPTOR: google.protobuf.descriptor.EnumDescriptor UNKNOWN: ChannelConnectivityState._State.ValueType # 0 IDLE: ChannelConnectivityState._State.ValueType # 1 CONNECTING: ChannelConnectivityState._State.ValueType # 2 READY: ChannelConnectivityState._State.ValueType # 3 TRANSIENT_FAILURE: ChannelConnectivityState._State.ValueType # 4 SHUTDOWN: ChannelConnectivityState._State.ValueType # 5 class State(_State, metaclass=_StateEnumTypeWrapper): ... UNKNOWN: ChannelConnectivityState.State.ValueType # 0 IDLE: ChannelConnectivityState.State.ValueType # 1 CONNECTING: ChannelConnectivityState.State.ValueType # 2 READY: ChannelConnectivityState.State.ValueType # 3 TRANSIENT_FAILURE: ChannelConnectivityState.State.ValueType # 4 SHUTDOWN: ChannelConnectivityState.State.ValueType # 5 STATE_FIELD_NUMBER: builtins.int state: global___ChannelConnectivityState.State.ValueType def __init__( self, *, state: global___ChannelConnectivityState.State.ValueType = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["state", b"state"]) -> None: ... 
global___ChannelConnectivityState = ChannelConnectivityState @typing.final class ChannelData(google.protobuf.message.Message): """Channel data is data related to a specific Channel or Subchannel.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor STATE_FIELD_NUMBER: builtins.int TARGET_FIELD_NUMBER: builtins.int TRACE_FIELD_NUMBER: builtins.int CALLS_STARTED_FIELD_NUMBER: builtins.int CALLS_SUCCEEDED_FIELD_NUMBER: builtins.int CALLS_FAILED_FIELD_NUMBER: builtins.int LAST_CALL_STARTED_TIMESTAMP_FIELD_NUMBER: builtins.int target: builtins.str """The target this channel originally tried to connect to. May be absent""" calls_started: builtins.int """The number of calls started on the channel""" calls_succeeded: builtins.int """The number of calls that have completed with an OK status""" calls_failed: builtins.int """The number of calls that have completed with a non-OK status""" @property def state(self) -> global___ChannelConnectivityState: """The connectivity state of the channel or subchannel. Implementations should always set this. """ @property def trace(self) -> global___ChannelTrace: """A trace of recent events on the channel. May be absent.""" @property def last_call_started_timestamp(self) -> google.protobuf.timestamp_pb2.Timestamp: """The last time a call was started on the channel.""" def __init__( self, *, state: global___ChannelConnectivityState | None = ..., target: builtins.str = ..., trace: global___ChannelTrace | None = ..., calls_started: builtins.int = ..., calls_succeeded: builtins.int = ..., calls_failed: builtins.int = ..., last_call_started_timestamp: google.protobuf.timestamp_pb2.Timestamp | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["last_call_started_timestamp", b"last_call_started_timestamp", "state", b"state", "trace", b"trace"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["calls_failed", b"calls_failed", "calls_started", b"calls_started", "calls_succeeded", b"calls_succeeded", "last_call_started_timestamp", b"last_call_started_timestamp", "state", b"state", "target", b"target", "trace", b"trace"]) -> None: ... global___ChannelData = ChannelData @typing.final class ChannelTraceEvent(google.protobuf.message.Message): """A trace event is an interesting thing that happened to a channel or subchannel, such as creation, address resolution, subchannel creation, etc. 
""" DESCRIPTOR: google.protobuf.descriptor.Descriptor class _Severity: ValueType = typing.NewType("ValueType", builtins.int) V: typing_extensions.TypeAlias = ValueType class _SeverityEnumTypeWrapper(google.protobuf.internal.enum_type_wrapper._EnumTypeWrapper[ChannelTraceEvent._Severity.ValueType], builtins.type): DESCRIPTOR: google.protobuf.descriptor.EnumDescriptor CT_UNKNOWN: ChannelTraceEvent._Severity.ValueType # 0 CT_INFO: ChannelTraceEvent._Severity.ValueType # 1 CT_WARNING: ChannelTraceEvent._Severity.ValueType # 2 CT_ERROR: ChannelTraceEvent._Severity.ValueType # 3 class Severity(_Severity, metaclass=_SeverityEnumTypeWrapper): """The supported severity levels of trace events.""" CT_UNKNOWN: ChannelTraceEvent.Severity.ValueType # 0 CT_INFO: ChannelTraceEvent.Severity.ValueType # 1 CT_WARNING: ChannelTraceEvent.Severity.ValueType # 2 CT_ERROR: ChannelTraceEvent.Severity.ValueType # 3 DESCRIPTION_FIELD_NUMBER: builtins.int SEVERITY_FIELD_NUMBER: builtins.int TIMESTAMP_FIELD_NUMBER: builtins.int CHANNEL_REF_FIELD_NUMBER: builtins.int SUBCHANNEL_REF_FIELD_NUMBER: builtins.int description: builtins.str """High level description of the event.""" severity: global___ChannelTraceEvent.Severity.ValueType """the severity of the trace event""" @property def timestamp(self) -> google.protobuf.timestamp_pb2.Timestamp: """When this event occurred.""" @property def channel_ref(self) -> global___ChannelRef: ... @property def subchannel_ref(self) -> global___SubchannelRef: ... def __init__( self, *, description: builtins.str = ..., severity: global___ChannelTraceEvent.Severity.ValueType = ..., timestamp: google.protobuf.timestamp_pb2.Timestamp | None = ..., channel_ref: global___ChannelRef | None = ..., subchannel_ref: global___SubchannelRef | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["channel_ref", b"channel_ref", "child_ref", b"child_ref", "subchannel_ref", b"subchannel_ref", "timestamp", b"timestamp"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["channel_ref", b"channel_ref", "child_ref", b"child_ref", "description", b"description", "severity", b"severity", "subchannel_ref", b"subchannel_ref", "timestamp", b"timestamp"]) -> None: ... def WhichOneof(self, oneof_group: typing.Literal["child_ref", b"child_ref"]) -> typing.Literal["channel_ref", "subchannel_ref"] | None: ... global___ChannelTraceEvent = ChannelTraceEvent @typing.final class ChannelTrace(google.protobuf.message.Message): """ChannelTrace represents the recent events that have occurred on the channel.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor NUM_EVENTS_LOGGED_FIELD_NUMBER: builtins.int CREATION_TIMESTAMP_FIELD_NUMBER: builtins.int EVENTS_FIELD_NUMBER: builtins.int num_events_logged: builtins.int """Number of events ever logged in this tracing object. This can differ from events.size() because events can be overwritten or garbage collected by implementations. """ @property def creation_timestamp(self) -> google.protobuf.timestamp_pb2.Timestamp: """Time that this channel was created.""" @property def events(self) -> google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___ChannelTraceEvent]: """List of events that have occurred on this channel.""" def __init__( self, *, num_events_logged: builtins.int = ..., creation_timestamp: google.protobuf.timestamp_pb2.Timestamp | None = ..., events: collections.abc.Iterable[global___ChannelTraceEvent] | None = ..., ) -> None: ... 
def HasField(self, field_name: typing.Literal["creation_timestamp", b"creation_timestamp"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["creation_timestamp", b"creation_timestamp", "events", b"events", "num_events_logged", b"num_events_logged"]) -> None: ... global___ChannelTrace = ChannelTrace @typing.final class ChannelRef(google.protobuf.message.Message): """ChannelRef is a reference to a Channel.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor CHANNEL_ID_FIELD_NUMBER: builtins.int NAME_FIELD_NUMBER: builtins.int channel_id: builtins.int """The globally unique id for this channel. Must be a positive number.""" name: builtins.str """An optional name associated with the channel.""" def __init__( self, *, channel_id: builtins.int = ..., name: builtins.str = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["channel_id", b"channel_id", "name", b"name"]) -> None: ... global___ChannelRef = ChannelRef @typing.final class SubchannelRef(google.protobuf.message.Message): """SubchannelRef is a reference to a Subchannel.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor SUBCHANNEL_ID_FIELD_NUMBER: builtins.int NAME_FIELD_NUMBER: builtins.int subchannel_id: builtins.int """The globally unique id for this subchannel. Must be a positive number.""" name: builtins.str """An optional name associated with the subchannel.""" def __init__( self, *, subchannel_id: builtins.int = ..., name: builtins.str = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["name", b"name", "subchannel_id", b"subchannel_id"]) -> None: ... global___SubchannelRef = SubchannelRef @typing.final class SocketRef(google.protobuf.message.Message): """SocketRef is a reference to a Socket.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor SOCKET_ID_FIELD_NUMBER: builtins.int NAME_FIELD_NUMBER: builtins.int socket_id: builtins.int """The globally unique id for this socket. Must be a positive number.""" name: builtins.str """An optional name associated with the socket.""" def __init__( self, *, socket_id: builtins.int = ..., name: builtins.str = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["name", b"name", "socket_id", b"socket_id"]) -> None: ... global___SocketRef = SocketRef @typing.final class ServerRef(google.protobuf.message.Message): """ServerRef is a reference to a Server.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor SERVER_ID_FIELD_NUMBER: builtins.int NAME_FIELD_NUMBER: builtins.int server_id: builtins.int """A globally unique identifier for this server. Must be a positive number.""" name: builtins.str """An optional name associated with the server.""" def __init__( self, *, server_id: builtins.int = ..., name: builtins.str = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["name", b"name", "server_id", b"server_id"]) -> None: ... global___ServerRef = ServerRef @typing.final class Server(google.protobuf.message.Message): """Server represents a single server. There may be multiple servers in a single program. """ DESCRIPTOR: google.protobuf.descriptor.Descriptor REF_FIELD_NUMBER: builtins.int DATA_FIELD_NUMBER: builtins.int LISTEN_SOCKET_FIELD_NUMBER: builtins.int @property def ref(self) -> global___ServerRef: """The identifier for a Server. 
This should be set.""" @property def data(self) -> global___ServerData: """The associated data of the Server.""" @property def listen_socket(self) -> google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___SocketRef]: """The sockets that the server is listening on. There are no ordering guarantees. This may be absent. """ def __init__( self, *, ref: global___ServerRef | None = ..., data: global___ServerData | None = ..., listen_socket: collections.abc.Iterable[global___SocketRef] | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["data", b"data", "ref", b"ref"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["data", b"data", "listen_socket", b"listen_socket", "ref", b"ref"]) -> None: ... global___Server = Server @typing.final class ServerData(google.protobuf.message.Message): """ServerData is data for a specific Server.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor TRACE_FIELD_NUMBER: builtins.int CALLS_STARTED_FIELD_NUMBER: builtins.int CALLS_SUCCEEDED_FIELD_NUMBER: builtins.int CALLS_FAILED_FIELD_NUMBER: builtins.int LAST_CALL_STARTED_TIMESTAMP_FIELD_NUMBER: builtins.int calls_started: builtins.int """The number of incoming calls started on the server""" calls_succeeded: builtins.int """The number of incoming calls that have completed with an OK status""" calls_failed: builtins.int """The number of incoming calls that have a completed with a non-OK status""" @property def trace(self) -> global___ChannelTrace: """A trace of recent events on the server. May be absent.""" @property def last_call_started_timestamp(self) -> google.protobuf.timestamp_pb2.Timestamp: """The last time a call was started on the server.""" def __init__( self, *, trace: global___ChannelTrace | None = ..., calls_started: builtins.int = ..., calls_succeeded: builtins.int = ..., calls_failed: builtins.int = ..., last_call_started_timestamp: google.protobuf.timestamp_pb2.Timestamp | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["last_call_started_timestamp", b"last_call_started_timestamp", "trace", b"trace"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["calls_failed", b"calls_failed", "calls_started", b"calls_started", "calls_succeeded", b"calls_succeeded", "last_call_started_timestamp", b"last_call_started_timestamp", "trace", b"trace"]) -> None: ... global___ServerData = ServerData @typing.final class Socket(google.protobuf.message.Message): """Information about an actual connection. Pronounced "sock-ay".""" DESCRIPTOR: google.protobuf.descriptor.Descriptor REF_FIELD_NUMBER: builtins.int DATA_FIELD_NUMBER: builtins.int LOCAL_FIELD_NUMBER: builtins.int REMOTE_FIELD_NUMBER: builtins.int SECURITY_FIELD_NUMBER: builtins.int REMOTE_NAME_FIELD_NUMBER: builtins.int remote_name: builtins.str """Optional, represents the name of the remote endpoint, if different than the original target name. """ @property def ref(self) -> global___SocketRef: """The identifier for the Socket.""" @property def data(self) -> global___SocketData: """Data specific to this Socket.""" @property def local(self) -> global___Address: """The locally bound address.""" @property def remote(self) -> global___Address: """The remote bound address. May be absent.""" @property def security(self) -> global___Security: """Security details for this socket. May be absent if not available, or there is no security on the socket. 
""" def __init__( self, *, ref: global___SocketRef | None = ..., data: global___SocketData | None = ..., local: global___Address | None = ..., remote: global___Address | None = ..., security: global___Security | None = ..., remote_name: builtins.str = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["data", b"data", "local", b"local", "ref", b"ref", "remote", b"remote", "security", b"security"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["data", b"data", "local", b"local", "ref", b"ref", "remote", b"remote", "remote_name", b"remote_name", "security", b"security"]) -> None: ... global___Socket = Socket @typing.final class SocketData(google.protobuf.message.Message): """SocketData is data associated for a specific Socket. The fields present are specific to the implementation, so there may be minor differences in the semantics. (e.g. flow control windows) """ DESCRIPTOR: google.protobuf.descriptor.Descriptor STREAMS_STARTED_FIELD_NUMBER: builtins.int STREAMS_SUCCEEDED_FIELD_NUMBER: builtins.int STREAMS_FAILED_FIELD_NUMBER: builtins.int MESSAGES_SENT_FIELD_NUMBER: builtins.int MESSAGES_RECEIVED_FIELD_NUMBER: builtins.int KEEP_ALIVES_SENT_FIELD_NUMBER: builtins.int LAST_LOCAL_STREAM_CREATED_TIMESTAMP_FIELD_NUMBER: builtins.int LAST_REMOTE_STREAM_CREATED_TIMESTAMP_FIELD_NUMBER: builtins.int LAST_MESSAGE_SENT_TIMESTAMP_FIELD_NUMBER: builtins.int LAST_MESSAGE_RECEIVED_TIMESTAMP_FIELD_NUMBER: builtins.int LOCAL_FLOW_CONTROL_WINDOW_FIELD_NUMBER: builtins.int REMOTE_FLOW_CONTROL_WINDOW_FIELD_NUMBER: builtins.int OPTION_FIELD_NUMBER: builtins.int streams_started: builtins.int """The number of streams that have been started.""" streams_succeeded: builtins.int """The number of streams that have ended successfully: On client side, received frame with eos bit set; On server side, sent frame with eos bit set. """ streams_failed: builtins.int """The number of streams that have ended unsuccessfully: On client side, ended without receiving frame with eos bit set; On server side, ended without sending frame with eos bit set. """ messages_sent: builtins.int """The number of grpc messages successfully sent on this socket.""" messages_received: builtins.int """The number of grpc messages received on this socket.""" keep_alives_sent: builtins.int """The number of keep alives sent. This is typically implemented with HTTP/2 ping messages. """ @property def last_local_stream_created_timestamp(self) -> google.protobuf.timestamp_pb2.Timestamp: """The last time a stream was created by this endpoint. Usually unset for servers. """ @property def last_remote_stream_created_timestamp(self) -> google.protobuf.timestamp_pb2.Timestamp: """The last time a stream was created by the remote endpoint. Usually unset for clients. """ @property def last_message_sent_timestamp(self) -> google.protobuf.timestamp_pb2.Timestamp: """The last time a message was sent by this endpoint.""" @property def last_message_received_timestamp(self) -> google.protobuf.timestamp_pb2.Timestamp: """The last time a message was received by this endpoint.""" @property def local_flow_control_window(self) -> google.protobuf.wrappers_pb2.Int64Value: """The amount of window, granted to the local endpoint by the remote endpoint. This may be slightly out of date due to network latency. This does NOT include stream level or TCP level flow control info. 
""" @property def remote_flow_control_window(self) -> google.protobuf.wrappers_pb2.Int64Value: """The amount of window, granted to the remote endpoint by the local endpoint. This may be slightly out of date due to network latency. This does NOT include stream level or TCP level flow control info. """ @property def option(self) -> google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___SocketOption]: """Socket options set on this socket. May be absent if 'summary' is set on GetSocketRequest. """ def __init__( self, *, streams_started: builtins.int = ..., streams_succeeded: builtins.int = ..., streams_failed: builtins.int = ..., messages_sent: builtins.int = ..., messages_received: builtins.int = ..., keep_alives_sent: builtins.int = ..., last_local_stream_created_timestamp: google.protobuf.timestamp_pb2.Timestamp | None = ..., last_remote_stream_created_timestamp: google.protobuf.timestamp_pb2.Timestamp | None = ..., last_message_sent_timestamp: google.protobuf.timestamp_pb2.Timestamp | None = ..., last_message_received_timestamp: google.protobuf.timestamp_pb2.Timestamp | None = ..., local_flow_control_window: google.protobuf.wrappers_pb2.Int64Value | None = ..., remote_flow_control_window: google.protobuf.wrappers_pb2.Int64Value | None = ..., option: collections.abc.Iterable[global___SocketOption] | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["last_local_stream_created_timestamp", b"last_local_stream_created_timestamp", "last_message_received_timestamp", b"last_message_received_timestamp", "last_message_sent_timestamp", b"last_message_sent_timestamp", "last_remote_stream_created_timestamp", b"last_remote_stream_created_timestamp", "local_flow_control_window", b"local_flow_control_window", "remote_flow_control_window", b"remote_flow_control_window"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["keep_alives_sent", b"keep_alives_sent", "last_local_stream_created_timestamp", b"last_local_stream_created_timestamp", "last_message_received_timestamp", b"last_message_received_timestamp", "last_message_sent_timestamp", b"last_message_sent_timestamp", "last_remote_stream_created_timestamp", b"last_remote_stream_created_timestamp", "local_flow_control_window", b"local_flow_control_window", "messages_received", b"messages_received", "messages_sent", b"messages_sent", "option", b"option", "remote_flow_control_window", b"remote_flow_control_window", "streams_failed", b"streams_failed", "streams_started", b"streams_started", "streams_succeeded", b"streams_succeeded"]) -> None: ... global___SocketData = SocketData @typing.final class Address(google.protobuf.message.Message): """Address represents the address used to create the socket.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor @typing.final class TcpIpAddress(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor IP_ADDRESS_FIELD_NUMBER: builtins.int PORT_FIELD_NUMBER: builtins.int ip_address: builtins.bytes """Either the IPv4 or IPv6 address in bytes. Will be either 4 bytes or 16 bytes in length. """ port: builtins.int """0-64k, or -1 if not appropriate.""" def __init__( self, *, ip_address: builtins.bytes = ..., port: builtins.int = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["ip_address", b"ip_address", "port", b"port"]) -> None: ... 
@typing.final class UdsAddress(google.protobuf.message.Message): """A Unix Domain Socket address.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor FILENAME_FIELD_NUMBER: builtins.int filename: builtins.str def __init__( self, *, filename: builtins.str = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["filename", b"filename"]) -> None: ... @typing.final class OtherAddress(google.protobuf.message.Message): """An address type not included above.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor NAME_FIELD_NUMBER: builtins.int VALUE_FIELD_NUMBER: builtins.int name: builtins.str """The human readable version of the value. This value should be set.""" @property def value(self) -> google.protobuf.any_pb2.Any: """The actual address message.""" def __init__( self, *, name: builtins.str = ..., value: google.protobuf.any_pb2.Any | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["value", b"value"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["name", b"name", "value", b"value"]) -> None: ... TCPIP_ADDRESS_FIELD_NUMBER: builtins.int UDS_ADDRESS_FIELD_NUMBER: builtins.int OTHER_ADDRESS_FIELD_NUMBER: builtins.int @property def tcpip_address(self) -> global___Address.TcpIpAddress: ... @property def uds_address(self) -> global___Address.UdsAddress: ... @property def other_address(self) -> global___Address.OtherAddress: ... def __init__( self, *, tcpip_address: global___Address.TcpIpAddress | None = ..., uds_address: global___Address.UdsAddress | None = ..., other_address: global___Address.OtherAddress | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["address", b"address", "other_address", b"other_address", "tcpip_address", b"tcpip_address", "uds_address", b"uds_address"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["address", b"address", "other_address", b"other_address", "tcpip_address", b"tcpip_address", "uds_address", b"uds_address"]) -> None: ... def WhichOneof(self, oneof_group: typing.Literal["address", b"address"]) -> typing.Literal["tcpip_address", "uds_address", "other_address"] | None: ... global___Address = Address @typing.final class Security(google.protobuf.message.Message): """Security represents details about how secure the socket is.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor @typing.final class Tls(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor STANDARD_NAME_FIELD_NUMBER: builtins.int OTHER_NAME_FIELD_NUMBER: builtins.int LOCAL_CERTIFICATE_FIELD_NUMBER: builtins.int REMOTE_CERTIFICATE_FIELD_NUMBER: builtins.int standard_name: builtins.str """The cipher suite name in the RFC 4346 format: https://tools.ietf.org/html/rfc4346#appendix-C """ other_name: builtins.str """Some other way to describe the cipher suite if the RFC 4346 name is not available. """ local_certificate: builtins.bytes """the certificate used by this endpoint.""" remote_certificate: builtins.bytes """the certificate used by the remote endpoint.""" def __init__( self, *, standard_name: builtins.str = ..., other_name: builtins.str = ..., local_certificate: builtins.bytes = ..., remote_certificate: builtins.bytes = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["cipher_suite", b"cipher_suite", "other_name", b"other_name", "standard_name", b"standard_name"]) -> builtins.bool: ... 
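# Editor's note -- a minimal, hypothetical sketch (not part of the generated
# stub): Security is a oneof ('model'), so consumers dispatch on WhichOneof,
# and Tls itself carries a 'cipher_suite' oneof of standard_name/other_name.
# `describe_security` and its `sec` argument are illustrative names.

def describe_security(sec: 'Security') -> str:
    which = sec.WhichOneof('model')
    if which == 'tls':
        return sec.tls.standard_name or sec.tls.other_name
    if which == 'other':
        return sec.other.name
    return 'insecure'  # oneof not set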
def ClearField(self, field_name: typing.Literal["cipher_suite", b"cipher_suite", "local_certificate", b"local_certificate", "other_name", b"other_name", "remote_certificate", b"remote_certificate", "standard_name", b"standard_name"]) -> None: ... def WhichOneof(self, oneof_group: typing.Literal["cipher_suite", b"cipher_suite"]) -> typing.Literal["standard_name", "other_name"] | None: ... @typing.final class OtherSecurity(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor NAME_FIELD_NUMBER: builtins.int VALUE_FIELD_NUMBER: builtins.int name: builtins.str """The human readable version of the value.""" @property def value(self) -> google.protobuf.any_pb2.Any: """The actual security details message.""" def __init__( self, *, name: builtins.str = ..., value: google.protobuf.any_pb2.Any | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["value", b"value"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["name", b"name", "value", b"value"]) -> None: ... TLS_FIELD_NUMBER: builtins.int OTHER_FIELD_NUMBER: builtins.int @property def tls(self) -> global___Security.Tls: ... @property def other(self) -> global___Security.OtherSecurity: ... def __init__( self, *, tls: global___Security.Tls | None = ..., other: global___Security.OtherSecurity | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["model", b"model", "other", b"other", "tls", b"tls"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["model", b"model", "other", b"other", "tls", b"tls"]) -> None: ... def WhichOneof(self, oneof_group: typing.Literal["model", b"model"]) -> typing.Literal["tls", "other"] | None: ... global___Security = Security @typing.final class SocketOption(google.protobuf.message.Message): """SocketOption represents socket options for a socket. Specifically, these are the options returned by getsockopt(). """ DESCRIPTOR: google.protobuf.descriptor.Descriptor NAME_FIELD_NUMBER: builtins.int VALUE_FIELD_NUMBER: builtins.int ADDITIONAL_FIELD_NUMBER: builtins.int name: builtins.str """The full name of the socket option. Typically this will be the upper case name, such as "SO_REUSEPORT". """ value: builtins.str """The human readable value of this socket option. At least one of value or additional will be set. """ @property def additional(self) -> google.protobuf.any_pb2.Any: """Additional data associated with the socket option. At least one of value or additional will be set. """ def __init__( self, *, name: builtins.str = ..., value: builtins.str = ..., additional: google.protobuf.any_pb2.Any | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["additional", b"additional"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["additional", b"additional", "name", b"name", "value", b"value"]) -> None: ... global___SocketOption = SocketOption @typing.final class SocketOptionTimeout(google.protobuf.message.Message): """For use with SocketOption's additional field. This is primarily used for SO_RCVTIMEO and SO_SNDTIMEO """ DESCRIPTOR: google.protobuf.descriptor.Descriptor DURATION_FIELD_NUMBER: builtins.int @property def duration(self) -> google.protobuf.duration_pb2.Duration: ... def __init__( self, *, duration: google.protobuf.duration_pb2.Duration | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["duration", b"duration"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["duration", b"duration"]) -> None: ... 
global___SocketOptionTimeout = SocketOptionTimeout @typing.final class SocketOptionLinger(google.protobuf.message.Message): """For use with SocketOption's additional field. This is primarily used for SO_LINGER. """ DESCRIPTOR: google.protobuf.descriptor.Descriptor ACTIVE_FIELD_NUMBER: builtins.int DURATION_FIELD_NUMBER: builtins.int active: builtins.bool """active maps to `struct linger.l_onoff`""" @property def duration(self) -> google.protobuf.duration_pb2.Duration: """duration maps to `struct linger.l_linger`""" def __init__( self, *, active: builtins.bool = ..., duration: google.protobuf.duration_pb2.Duration | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["duration", b"duration"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["active", b"active", "duration", b"duration"]) -> None: ... global___SocketOptionLinger = SocketOptionLinger @typing.final class SocketOptionTcpInfo(google.protobuf.message.Message): """For use with SocketOption's additional field. Tcp info for SOL_TCP and TCP_INFO. """ DESCRIPTOR: google.protobuf.descriptor.Descriptor TCPI_STATE_FIELD_NUMBER: builtins.int TCPI_CA_STATE_FIELD_NUMBER: builtins.int TCPI_RETRANSMITS_FIELD_NUMBER: builtins.int TCPI_PROBES_FIELD_NUMBER: builtins.int TCPI_BACKOFF_FIELD_NUMBER: builtins.int TCPI_OPTIONS_FIELD_NUMBER: builtins.int TCPI_SND_WSCALE_FIELD_NUMBER: builtins.int TCPI_RCV_WSCALE_FIELD_NUMBER: builtins.int TCPI_RTO_FIELD_NUMBER: builtins.int TCPI_ATO_FIELD_NUMBER: builtins.int TCPI_SND_MSS_FIELD_NUMBER: builtins.int TCPI_RCV_MSS_FIELD_NUMBER: builtins.int TCPI_UNACKED_FIELD_NUMBER: builtins.int TCPI_SACKED_FIELD_NUMBER: builtins.int TCPI_LOST_FIELD_NUMBER: builtins.int TCPI_RETRANS_FIELD_NUMBER: builtins.int TCPI_FACKETS_FIELD_NUMBER: builtins.int TCPI_LAST_DATA_SENT_FIELD_NUMBER: builtins.int TCPI_LAST_ACK_SENT_FIELD_NUMBER: builtins.int TCPI_LAST_DATA_RECV_FIELD_NUMBER: builtins.int TCPI_LAST_ACK_RECV_FIELD_NUMBER: builtins.int TCPI_PMTU_FIELD_NUMBER: builtins.int TCPI_RCV_SSTHRESH_FIELD_NUMBER: builtins.int TCPI_RTT_FIELD_NUMBER: builtins.int TCPI_RTTVAR_FIELD_NUMBER: builtins.int TCPI_SND_SSTHRESH_FIELD_NUMBER: builtins.int TCPI_SND_CWND_FIELD_NUMBER: builtins.int TCPI_ADVMSS_FIELD_NUMBER: builtins.int TCPI_REORDERING_FIELD_NUMBER: builtins.int tcpi_state: builtins.int tcpi_ca_state: builtins.int tcpi_retransmits: builtins.int tcpi_probes: builtins.int tcpi_backoff: builtins.int tcpi_options: builtins.int tcpi_snd_wscale: builtins.int tcpi_rcv_wscale: builtins.int tcpi_rto: builtins.int tcpi_ato: builtins.int tcpi_snd_mss: builtins.int tcpi_rcv_mss: builtins.int tcpi_unacked: builtins.int tcpi_sacked: builtins.int tcpi_lost: builtins.int tcpi_retrans: builtins.int tcpi_fackets: builtins.int tcpi_last_data_sent: builtins.int tcpi_last_ack_sent: builtins.int tcpi_last_data_recv: builtins.int tcpi_last_ack_recv: builtins.int tcpi_pmtu: builtins.int tcpi_rcv_ssthresh: builtins.int tcpi_rtt: builtins.int tcpi_rttvar: builtins.int tcpi_snd_ssthresh: builtins.int tcpi_snd_cwnd: builtins.int tcpi_advmss: builtins.int tcpi_reordering: builtins.int def __init__( self, *, tcpi_state: builtins.int = ..., tcpi_ca_state: builtins.int = ..., tcpi_retransmits: builtins.int = ..., tcpi_probes: builtins.int = ..., tcpi_backoff: builtins.int = ..., tcpi_options: builtins.int = ..., tcpi_snd_wscale: builtins.int = ..., tcpi_rcv_wscale: builtins.int = ..., tcpi_rto: builtins.int = ..., tcpi_ato: builtins.int = ..., tcpi_snd_mss: builtins.int = ..., tcpi_rcv_mss: builtins.int = ..., tcpi_unacked: 
builtins.int = ..., tcpi_sacked: builtins.int = ..., tcpi_lost: builtins.int = ..., tcpi_retrans: builtins.int = ..., tcpi_fackets: builtins.int = ..., tcpi_last_data_sent: builtins.int = ..., tcpi_last_ack_sent: builtins.int = ..., tcpi_last_data_recv: builtins.int = ..., tcpi_last_ack_recv: builtins.int = ..., tcpi_pmtu: builtins.int = ..., tcpi_rcv_ssthresh: builtins.int = ..., tcpi_rtt: builtins.int = ..., tcpi_rttvar: builtins.int = ..., tcpi_snd_ssthresh: builtins.int = ..., tcpi_snd_cwnd: builtins.int = ..., tcpi_advmss: builtins.int = ..., tcpi_reordering: builtins.int = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["tcpi_advmss", b"tcpi_advmss", "tcpi_ato", b"tcpi_ato", "tcpi_backoff", b"tcpi_backoff", "tcpi_ca_state", b"tcpi_ca_state", "tcpi_fackets", b"tcpi_fackets", "tcpi_last_ack_recv", b"tcpi_last_ack_recv", "tcpi_last_ack_sent", b"tcpi_last_ack_sent", "tcpi_last_data_recv", b"tcpi_last_data_recv", "tcpi_last_data_sent", b"tcpi_last_data_sent", "tcpi_lost", b"tcpi_lost", "tcpi_options", b"tcpi_options", "tcpi_pmtu", b"tcpi_pmtu", "tcpi_probes", b"tcpi_probes", "tcpi_rcv_mss", b"tcpi_rcv_mss", "tcpi_rcv_ssthresh", b"tcpi_rcv_ssthresh", "tcpi_rcv_wscale", b"tcpi_rcv_wscale", "tcpi_reordering", b"tcpi_reordering", "tcpi_retrans", b"tcpi_retrans", "tcpi_retransmits", b"tcpi_retransmits", "tcpi_rto", b"tcpi_rto", "tcpi_rtt", b"tcpi_rtt", "tcpi_rttvar", b"tcpi_rttvar", "tcpi_sacked", b"tcpi_sacked", "tcpi_snd_cwnd", b"tcpi_snd_cwnd", "tcpi_snd_mss", b"tcpi_snd_mss", "tcpi_snd_ssthresh", b"tcpi_snd_ssthresh", "tcpi_snd_wscale", b"tcpi_snd_wscale", "tcpi_state", b"tcpi_state", "tcpi_unacked", b"tcpi_unacked"]) -> None: ... global___SocketOptionTcpInfo = SocketOptionTcpInfo @typing.final class GetTopChannelsRequest(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor START_CHANNEL_ID_FIELD_NUMBER: builtins.int MAX_RESULTS_FIELD_NUMBER: builtins.int start_channel_id: builtins.int """start_channel_id indicates that only channels at or above this id should be included in the results. To request the first page, this should be set to 0. To request subsequent pages, the client generates this value by adding 1 to the highest seen result ID. """ max_results: builtins.int """If non-zero, the server will return a page of results containing at most this many items. If zero, the server will choose a reasonable page size. Must never be negative. """ def __init__( self, *, start_channel_id: builtins.int = ..., max_results: builtins.int = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["max_results", b"max_results", "start_channel_id", b"start_channel_id"]) -> None: ... global___GetTopChannelsRequest = GetTopChannelsRequest @typing.final class GetTopChannelsResponse(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor CHANNEL_FIELD_NUMBER: builtins.int END_FIELD_NUMBER: builtins.int end: builtins.bool """If set, indicates that the list of channels is the final list. Requesting more channels can only return more if they are created after this RPC completes. """ @property def channel(self) -> google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___Channel]: """list of channels that the connection detail service knows about. Sorted in ascending channel_id order. Must contain at least 1 result, otherwise 'end' must be true. """ def __init__( self, *, channel: collections.abc.Iterable[global___Channel] | None = ..., end: builtins.bool = ..., ) -> None: ... 
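# Editor's note -- a minimal, hypothetical sketch (not part of the generated
# stub) of the pagination contract documented above: start at channel id 0,
# resume from the highest seen channel_id + 1, and stop once `end` is set.
# `stub` stands for a generated channelz client stub exposing GetTopChannels.

async def iter_top_channels(stub):
    start = 0
    while True:
        resp = await stub.GetTopChannels(
            GetTopChannelsRequest(start_channel_id=start))
        for channel in resp.channel:
            yield channel
        if resp.end or not resp.channel:
            break
        start = resp.channel[-1].ref.channel_id + 1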
def ClearField(self, field_name: typing.Literal["channel", b"channel", "end", b"end"]) -> None: ... global___GetTopChannelsResponse = GetTopChannelsResponse @typing.final class GetServersRequest(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor START_SERVER_ID_FIELD_NUMBER: builtins.int MAX_RESULTS_FIELD_NUMBER: builtins.int start_server_id: builtins.int """start_server_id indicates that only servers at or above this id should be included in the results. To request the first page, this must be set to 0. To request subsequent pages, the client generates this value by adding 1 to the highest seen result ID. """ max_results: builtins.int """If non-zero, the server will return a page of results containing at most this many items. If zero, the server will choose a reasonable page size. Must never be negative. """ def __init__( self, *, start_server_id: builtins.int = ..., max_results: builtins.int = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["max_results", b"max_results", "start_server_id", b"start_server_id"]) -> None: ... global___GetServersRequest = GetServersRequest @typing.final class GetServersResponse(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor SERVER_FIELD_NUMBER: builtins.int END_FIELD_NUMBER: builtins.int end: builtins.bool """If set, indicates that the list of servers is the final list. Requesting more servers will only return more if they are created after this RPC completes. """ @property def server(self) -> google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___Server]: """list of servers that the connection detail service knows about. Sorted in ascending server_id order. Must contain at least 1 result, otherwise 'end' must be true. """ def __init__( self, *, server: collections.abc.Iterable[global___Server] | None = ..., end: builtins.bool = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["end", b"end", "server", b"server"]) -> None: ... global___GetServersResponse = GetServersResponse @typing.final class GetServerRequest(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor SERVER_ID_FIELD_NUMBER: builtins.int server_id: builtins.int """server_id is the identifier of the specific server to get.""" def __init__( self, *, server_id: builtins.int = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["server_id", b"server_id"]) -> None: ... global___GetServerRequest = GetServerRequest @typing.final class GetServerResponse(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor SERVER_FIELD_NUMBER: builtins.int @property def server(self) -> global___Server: """The Server that corresponds to the requested server_id. This field should be set. """ def __init__( self, *, server: global___Server | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["server", b"server"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["server", b"server"]) -> None: ... global___GetServerResponse = GetServerResponse @typing.final class GetServerSocketsRequest(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor SERVER_ID_FIELD_NUMBER: builtins.int START_SOCKET_ID_FIELD_NUMBER: builtins.int MAX_RESULTS_FIELD_NUMBER: builtins.int server_id: builtins.int start_socket_id: builtins.int """start_socket_id indicates that only sockets at or above this id should be included in the results. 
To request the first page, this must be set to 0. To request subsequent pages, the client generates this value by adding 1 to the highest seen result ID. """ max_results: builtins.int """If non-zero, the server will return a page of results containing at most this many items. If zero, the server will choose a reasonable page size. Must never be negative. """ def __init__( self, *, server_id: builtins.int = ..., start_socket_id: builtins.int = ..., max_results: builtins.int = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["max_results", b"max_results", "server_id", b"server_id", "start_socket_id", b"start_socket_id"]) -> None: ... global___GetServerSocketsRequest = GetServerSocketsRequest @typing.final class GetServerSocketsResponse(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor SOCKET_REF_FIELD_NUMBER: builtins.int END_FIELD_NUMBER: builtins.int end: builtins.bool """If set, indicates that the list of sockets is the final list. Requesting more sockets will only return more if they are created after this RPC completes. """ @property def socket_ref(self) -> google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___SocketRef]: """list of socket refs that the connection detail service knows about. Sorted in ascending socket_id order. Must contain at least 1 result, otherwise 'end' must be true. """ def __init__( self, *, socket_ref: collections.abc.Iterable[global___SocketRef] | None = ..., end: builtins.bool = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["end", b"end", "socket_ref", b"socket_ref"]) -> None: ... global___GetServerSocketsResponse = GetServerSocketsResponse @typing.final class GetChannelRequest(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor CHANNEL_ID_FIELD_NUMBER: builtins.int channel_id: builtins.int """channel_id is the identifier of the specific channel to get.""" def __init__( self, *, channel_id: builtins.int = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["channel_id", b"channel_id"]) -> None: ... global___GetChannelRequest = GetChannelRequest @typing.final class GetChannelResponse(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor CHANNEL_FIELD_NUMBER: builtins.int @property def channel(self) -> global___Channel: """The Channel that corresponds to the requested channel_id. This field should be set. """ def __init__( self, *, channel: global___Channel | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["channel", b"channel"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["channel", b"channel"]) -> None: ... global___GetChannelResponse = GetChannelResponse @typing.final class GetSubchannelRequest(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor SUBCHANNEL_ID_FIELD_NUMBER: builtins.int subchannel_id: builtins.int """subchannel_id is the identifier of the specific subchannel to get.""" def __init__( self, *, subchannel_id: builtins.int = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["subchannel_id", b"subchannel_id"]) -> None: ... global___GetSubchannelRequest = GetSubchannelRequest @typing.final class GetSubchannelResponse(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor SUBCHANNEL_FIELD_NUMBER: builtins.int @property def subchannel(self) -> global___Subchannel: """The Subchannel that corresponds to the requested subchannel_id. 
This field should be set. """ def __init__( self, *, subchannel: global___Subchannel | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["subchannel", b"subchannel"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["subchannel", b"subchannel"]) -> None: ... global___GetSubchannelResponse = GetSubchannelResponse @typing.final class GetSocketRequest(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor SOCKET_ID_FIELD_NUMBER: builtins.int SUMMARY_FIELD_NUMBER: builtins.int socket_id: builtins.int """socket_id is the identifier of the specific socket to get.""" summary: builtins.bool """If true, the response will contain only high level information that is inexpensive to obtain. Fields that may be omitted are documented. """ def __init__( self, *, socket_id: builtins.int = ..., summary: builtins.bool = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["socket_id", b"socket_id", "summary", b"summary"]) -> None: ... global___GetSocketRequest = GetSocketRequest @typing.final class GetSocketResponse(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor SOCKET_FIELD_NUMBER: builtins.int @property def socket(self) -> global___Socket: """The Socket that corresponds to the requested socket_id. This field should be set. """ def __init__( self, *, socket: global___Socket | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["socket", b"socket"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["socket", b"socket"]) -> None: ... global___GetSocketResponse = GetSocketResponse ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/client.py0000644000175100001770000010633000000000000017637 0ustar00runnerdocker00000000000000import sys import enum import http import time import asyncio import warnings import ipaddress from types import TracebackType from typing import Generic, Optional, Union, Type, List, Sequence, Any, cast from typing import Dict, Tuple, TYPE_CHECKING try: import ssl as _ssl except ImportError: _ssl = None # type: ignore from h2.config import H2Configuration from multidict import MultiDict from .utils import Wrapper, DeadlineWrapper from .const import Status, Cardinality from .config import Configuration from .stream import send_message, recv_message, StreamIterator from .stream import _RecvType, _SendType from .events import _DispatchChannelEvents from .protocol import H2Protocol, AbstractHandler, Stream as _Stream, Peer from .metadata import Deadline, USER_AGENT, decode_grpc_message, encode_timeout from .metadata import encode_metadata, decode_metadata, _MetadataLike, _Metadata from .metadata import _STATUS_DETAILS_KEY, decode_bin_value from .exceptions import GRPCError, ProtocolError, StreamTerminatedError from .encoding.base import GRPC_CONTENT_TYPE, CodecBase, StatusDetailsCodecBase from .encoding.proto import ProtoCodec, ProtoStatusDetailsCodec from .encoding.proto import _googleapis_available from ._registry import channels as _channels if TYPE_CHECKING: from ._typing import IReleaseStream # noqa _H2_OK = '200' # https://github.com/grpc/grpc/blob/master/doc/http-grpc-status-mapping.md _H2_TO_GRPC_STATUS_MAP = { # 400 str(http.HTTPStatus.BAD_REQUEST.value): Status.INTERNAL, # 401 str(http.HTTPStatus.UNAUTHORIZED.value): Status.UNAUTHENTICATED, # 403 str(http.HTTPStatus.FORBIDDEN.value): Status.PERMISSION_DENIED, # 404 
str(http.HTTPStatus.NOT_FOUND.value): Status.UNIMPLEMENTED, # 502 str(http.HTTPStatus.BAD_GATEWAY.value): Status.UNAVAILABLE, # 503 str(http.HTTPStatus.SERVICE_UNAVAILABLE.value): Status.UNAVAILABLE, # 504 str(http.HTTPStatus.GATEWAY_TIMEOUT.value): Status.UNAVAILABLE, # 429 str(http.HTTPStatus.TOO_MANY_REQUESTS.value): Status.UNAVAILABLE, } class Handler(AbstractHandler): connection_lost = False def accept(self, stream: Any, headers: Any, release_stream: Any) -> None: raise NotImplementedError('Client connection can not accept requests') def cancel(self, stream: Any) -> None: pass def close(self) -> None: self.connection_lost = True class Stream(StreamIterator[_RecvType], Generic[_SendType, _RecvType]): """ Represents gRPC method call - HTTP/2 request/stream, and everything you need to communicate with server in order to get response. In order to work directly with stream, you should :py:meth:`ServiceMethod.open` request like this: .. code-block:: python3 request = cafe_pb2.LatteOrder( size=cafe_pb2.SMALL, temperature=70, sugar=3, ) async with client.MakeLatte.open() as stream: await stream.send_message(request, end=True) reply: empty_pb2.Empty = await stream.recv_message() """ # stream state _send_request_done = False _send_message_done = False _end_done = False _recv_initial_metadata_done = False _recv_trailing_metadata_done = False _cancel_done = False _trailers_only: Optional[bool] = None _stream: _Stream _release_stream: 'IReleaseStream' _wrapper_ctx = None #: This property contains initial metadata, received with headers from #: the server. It equals to ``None`` initially, and to a multi-dict object #: after :py:meth:`recv_initial_metadata` coroutine succeeds. initial_metadata: Optional[_Metadata] = None #: This property contains trailing metadata, received with trailers from #: the server. It equals to ``None`` initially, and to a multi-dict object #: after :py:meth:`recv_trailing_metadata` coroutine succeeds. trailing_metadata: Optional[_Metadata] = None #: Connection's peer info of type :py:class:`~grpclib.protocol.Peer` peer: Optional[Peer] = None # stats _messages_sent = 0 _messages_received = 0 def __init__( self, channel: 'Channel', method_name: str, metadata: _Metadata, cardinality: Cardinality, send_type: Type[_SendType], recv_type: Type[_RecvType], *, codec: CodecBase, status_details_codec: Optional[StatusDetailsCodecBase], dispatch: _DispatchChannelEvents, deadline: Optional[Deadline] = None, ) -> None: self._channel = channel self._method_name = method_name self._metadata = metadata self._cardinality = cardinality self._send_type = send_type self._recv_type = recv_type self._codec = codec self._status_details_codec = status_details_codec self._dispatch = dispatch self._deadline = deadline async def send_request(self, *, end: bool = False) -> None: """Coroutine to send request headers with metadata to the server. New HTTP/2 stream will be created during this coroutine call. .. note:: This coroutine will be called implicitly during first :py:meth:`send_message` coroutine call, if not called before explicitly. 
:param end: end outgoing stream if there are no messages to send in a streaming request """ if self._send_request_done: raise ProtocolError('Request is already sent') if end and not self._cardinality.client_streaming: raise ProtocolError('Unary request requires a message to be sent ' 'before ending outgoing stream') with self._wrapper: protocol = await self._channel.__connect__() stream = protocol.processor.connection\ .create_stream(wrapper=self._wrapper) headers = [ (':method', 'POST'), (':scheme', self._channel._scheme), (':path', self._method_name), (':authority', self._channel._authority), ] if self._deadline is not None: timeout = self._deadline.time_remaining() headers.append(('grpc-timeout', encode_timeout(timeout))) # FIXME: remove this check after this issue gets resolved: # https://github.com/googleapis/googleapis.github.io/issues/27 if self._codec.__content_subtype__ == 'proto': content_type = GRPC_CONTENT_TYPE else: content_type = (GRPC_CONTENT_TYPE + '+' + self._codec.__content_subtype__) headers.extend(( ('te', 'trailers'), ('content-type', content_type), ('user-agent', USER_AGENT), )) metadata, = await self._dispatch.send_request( self._metadata, method_name=self._method_name, deadline=self._deadline, content_type=content_type, ) headers.extend(encode_metadata(metadata)) release_stream = await stream.send_request( headers, end_stream=end, _processor=protocol.processor, ) self._stream = stream self._release_stream = release_stream self.peer = self._stream.connection.get_peer() self._send_request_done = True if end: self._end_done = True async def send_message( self, message: _SendType, *, end: bool = False, ) -> None: """Coroutine to send message to the server. If client sends UNARY request, then you should call this coroutine only once. If client sends STREAM request, then you can call this coroutine as many times as you need. .. warning:: It is important to finally end stream from the client-side when you finished sending messages. You can do this in two ways: - specify ``end=True`` argument while sending last message - and last DATA frame will include END_STREAM flag; - call :py:meth:`end` coroutine after sending last message - and extra HEADERS frame with END_STREAM flag will be sent. First approach is preferred, because it doesn't require sending additional HTTP/2 frame. """ if not self._send_request_done: await self.send_request() end_stream = end if not self._cardinality.client_streaming: if self._send_message_done: raise ProtocolError('Message was already sent') else: end_stream = True if self._end_done: raise ProtocolError('Stream is ended') with self._wrapper: message, = await self._dispatch.send_message(message) await send_message(self._stream, self._codec, message, self._send_type, end=end_stream) self._send_message_done = True self._messages_sent += 1 self._stream.connection.messages_sent += 1 self._stream.connection.last_message_sent = time.monotonic() if end: self._end_done = True async def end(self) -> None: """Coroutine to end stream from the client-side. It should be used to finally end stream from the client-side when we're finished sending messages to the server and stream wasn't closed with last DATA frame. See :py:meth:`send_message` for more details. HTTP/2 stream will have half-closed (local) state after this coroutine call. 
""" if not self._send_request_done: raise ProtocolError('Request was not sent') if self._end_done: raise ProtocolError('Stream was already ended') if not self._cardinality.client_streaming: if not self._send_message_done: raise ProtocolError('Unary request requires a single message ' 'to be sent') else: # `send_message` must already ended stream self._end_done = True return else: await self._stream.end() self._end_done = True def _raise_for_status(self, headers_map: Dict[str, str]) -> None: status = headers_map[':status'] if status is not None and status != _H2_OK: grpc_status = _H2_TO_GRPC_STATUS_MAP.get(status, Status.UNKNOWN) raise GRPCError(grpc_status, 'Received :status = {!r}'.format(status)) def _raise_for_content_type(self, headers_map: Dict[str, str]) -> None: content_type = headers_map.get('content-type') if content_type is None: raise GRPCError(Status.UNKNOWN, 'Missing content-type header') base_content_type, _, sub_type = content_type.partition('+') sub_type = sub_type or ProtoCodec.__content_subtype__ if ( base_content_type != GRPC_CONTENT_TYPE or sub_type != self._codec.__content_subtype__ ): raise GRPCError(Status.UNKNOWN, 'Invalid content-type: {!r}' .format(content_type)) def _process_grpc_status( self, headers_map: Dict[str, str], ) -> Tuple[Status, Optional[str], Any]: grpc_status = headers_map.get('grpc-status') if grpc_status is None: raise GRPCError(Status.UNKNOWN, 'Missing grpc-status header') try: status = Status(int(grpc_status)) except ValueError: raise GRPCError(Status.UNKNOWN, ('Invalid grpc-status: {!r}' .format(grpc_status))) else: message, details = None, None if status is not Status.OK: message = headers_map.get('grpc-message') if message is not None: message = decode_grpc_message(message) if self._status_details_codec is not None: details_bin = headers_map.get(_STATUS_DETAILS_KEY) if details_bin is not None: details = self._status_details_codec.decode( status, message, decode_bin_value(details_bin.encode('ascii')) ) return status, message, details def _raise_for_grpc_status( self, status: Status, message: Optional[str], details: Any, ) -> None: if status is not Status.OK: raise GRPCError(status, message, details) async def recv_initial_metadata(self) -> None: """Coroutine to wait for headers with initial metadata from the server. .. note:: This coroutine will be called implicitly during first :py:meth:`recv_message` coroutine call, if not called before explicitly. May raise :py:class:`~grpclib.exceptions.GRPCError` if server returned non-:py:attr:`Status.OK ` in trailers-only response. When this coroutine finishes, you can access received initial metadata by using :py:attr:`initial_metadata` attribute. 
""" if not self._send_request_done: raise ProtocolError('Request was not sent yet') if self._recv_initial_metadata_done: raise ProtocolError('Initial metadata was already received') with self._wrapper: headers = await self._stream.recv_headers() self._recv_initial_metadata_done = True headers_map = dict(headers) self._raise_for_status(headers_map) self._raise_for_content_type(headers_map) if 'grpc-status' in headers_map: # trailers-only response self._trailers_only = True im = cast(_Metadata, MultiDict()) im, = await self._dispatch.recv_initial_metadata(im) self.initial_metadata = im status, message, details = self._process_grpc_status( headers_map, ) tm = decode_metadata(headers) tm, = await self._dispatch.recv_trailing_metadata( tm, status=status, status_message=message, status_details=details, ) self.trailing_metadata = tm self._raise_for_grpc_status(status, message, details) else: im = decode_metadata(headers) im, = await self._dispatch.recv_initial_metadata(im) self.initial_metadata = im async def recv_message(self) -> Optional[_RecvType]: """Coroutine to receive incoming message from the server. If server sends UNARY response, then you can call this coroutine only once. If server sends STREAM response, then you should call this coroutine several times, until it returns None when the server has ended the stream. To simplify you code in this case, :py:class:`Stream` implements async iterations protocol, so you can use it like this: .. code-block:: python3 async for message in stream: do_smth_with(message) or even like this: .. code-block:: python3 messages = [msg async for msg in stream] HTTP/2 has flow control mechanism, so client will acknowledge received DATA frames as a message only after user consumes this coroutine. :returns: message """ if not self._recv_initial_metadata_done: await self.recv_initial_metadata() with self._wrapper: message = await recv_message(self._stream, self._codec, self._recv_type) if message is not None: message, = await self._dispatch.recv_message(message) self._messages_received += 1 self._stream.connection.messages_received += 1 self._stream.connection.last_message_received = time.monotonic() return message else: return None async def recv_trailing_metadata(self) -> None: """Coroutine to wait for trailers with trailing metadata from the server. .. note:: This coroutine will be called implicitly at exit from this call (context manager's exit), if not called before explicitly. May raise :py:class:`~grpclib.exceptions.GRPCError` if server returned non-:py:attr:`Status.OK ` in trailers. When this coroutine finishes, you can access received trailing metadata by using :py:attr:`trailing_metadata` attribute. 
""" if (not self._end_done # explicit end and not (not self._cardinality.client_streaming # implicit end and self._send_message_done)): raise ProtocolError('Outgoing stream was not ended') if not self._recv_initial_metadata_done: raise ProtocolError('Initial metadata was not received before ' 'waiting for trailing metadata') if self._recv_trailing_metadata_done: raise ProtocolError('Trailing metadata was already received') if self._trailers_only: self._recv_trailing_metadata_done = True else: with self._wrapper: trailers = await self._stream.recv_trailers() self._recv_trailing_metadata_done = True status, message, details = self._process_grpc_status( dict(trailers), ) tm = decode_metadata(trailers) tm, = await self._dispatch.recv_trailing_metadata( tm, status=status, status_message=message, status_details=details, ) self.trailing_metadata = tm self._raise_for_grpc_status(status, message, details) async def cancel(self) -> None: """Coroutine to cancel this request/stream. Client will send RST_STREAM frame to the server, so it will be explicitly informed that there is nothing to expect from the client regarding this request/stream. """ if not self._send_request_done: raise ProtocolError('Request was not sent yet') if self._cancel_done: raise ProtocolError('Stream was already cancelled') with self._wrapper: await self._stream.reset() # TODO: specify error code self._cancel_done = True async def __aenter__(self) -> 'Stream[_SendType, _RecvType]': if self._deadline is None: self._wrapper = Wrapper() else: self._wrapper = DeadlineWrapper() self._wrapper_ctx = self._wrapper.start(self._deadline) self._wrapper_ctx.__enter__() self._channel._calls_started += 1 self._channel._last_call_started = time.monotonic() return self async def _maybe_finish(self) -> None: if ( not self._cancel_done and not self._stream._transport.is_closing() ): if not self._recv_initial_metadata_done: await self.recv_initial_metadata() if not self._recv_trailing_metadata_done: await self.recv_trailing_metadata() def _maybe_raise(self) -> None: if self._stream.headers is not None: self._raise_for_status(dict(self._stream.headers)) if self._stream.trailers is not None: status, message, details = self._process_grpc_status( dict(self._stream.trailers), ) self._raise_for_grpc_status(status, message, details) elif self._stream.headers is not None: headers_map = dict(self._stream.headers) if 'grpc-status' in headers_map: status, message, details = self._process_grpc_status( headers_map, ) self._raise_for_grpc_status(status, message, details) async def __aexit__( self, exc_type: Optional[Type[BaseException]], exc_val: Optional[BaseException], exc_tb: Optional[TracebackType], ) -> None: if not self._send_request_done: return try: reraise = False if exc_val is None: try: await self._maybe_finish() except Exception: exc_type, exc_val, exc_tb = sys.exc_info() reraise = True if isinstance(exc_val, StreamTerminatedError): self._maybe_raise() if reraise: assert exc_val is not None raise exc_val finally: if self._stream.closable: self._stream.reset_nowait() self._release_stream() if self._wrapper_ctx is not None: self._wrapper_ctx.__exit__(exc_type, exc_val, exc_tb) if exc_val is None: self._channel._calls_succeeded += 1 else: self._channel._calls_failed += 1 class _ChannelState(enum.IntEnum): IDLE = 1 CONNECTING = 2 READY = 3 TRANSIENT_FAILURE = 4 class Channel: """ Represents a connection to the server, which can be used with generated stub classes to perform gRPC calls. .. 
code-block:: python3 channel = Channel() client = cafe_grpc.CoffeeMachineStub(channel) ... request = cafe_pb2.LatteOrder( size=cafe_pb2.SMALL, temperature=70, sugar=3, ) reply: empty_pb2.Empty = await client.MakeLatte(request) ... channel.close() """ _protocol = None # stats _calls_started = 0 _calls_succeeded = 0 _calls_failed = 0 _last_call_started: Optional[float] = None def __init__( self, host: Optional[str] = None, port: Optional[int] = None, *, loop: Optional[asyncio.AbstractEventLoop] = None, path: Optional[str] = None, codec: Optional[CodecBase] = None, status_details_codec: Optional[StatusDetailsCodecBase] = None, ssl: Union[ None, bool, "_ssl.SSLContext", "_ssl.DefaultVerifyPaths" ] = None, config: Optional[Configuration] = None, ): """Initialize connection to the server :param host: server host name. :param port: server port number. :param loop: (deprecated) asyncio-compatible event loop :param path: server socket path. If specified, host and port should be omitted (must be None). :param codec: instance of a codec to encode and decode messages, if omitted ``ProtoCodec`` is used by default :param status_details_codec: instance of a status details codec to decode error details in a trailing metadata, if omitted ``ProtoStatusDetailsCodec`` is used by default :param ssl: ``True`` or :py:class:`~python:ssl.SSLContext` object or ``ssl.DefaultVerifyPaths`` object; if ``True``, default SSL context is used. """ if path is not None and (host is not None or port is not None): raise ValueError("The 'path' parameter can not be used with the " "'host' or 'port' parameters.") else: if host is None: host = '127.0.0.1' if port is None: port = 50051 if _ssl is None: if ssl is not None: raise RuntimeError('SSL is not supported') elif ssl is True: ssl = self._get_default_ssl_context() elif isinstance(ssl, _ssl.DefaultVerifyPaths): ssl = self._get_default_ssl_context(verify_paths=ssl) if codec is None: codec = ProtoCodec() if status_details_codec is None and _googleapis_available(): status_details_codec = ProtoStatusDetailsCodec() if loop: warnings.warn("The loop argument is deprecated and scheduled " "for removal in grpclib 0.5", DeprecationWarning, stacklevel=2) self._host = host self._port = port self._loop = loop or asyncio.get_event_loop() self._path = path self._codec = codec self._status_details_codec = status_details_codec self._ssl = ssl or None self._scheme = 'https' if self._ssl else 'http' self._authority = self._get_authority(self._host, self._port) self._h2_config = H2Configuration( client_side=True, header_encoding='ascii', validate_inbound_headers=False, validate_outbound_headers=False, normalize_inbound_headers=False, normalize_outbound_headers=False, ) self._connect_lock = asyncio.Lock() self._state = _ChannelState.IDLE config = Configuration() if config is None else config self._config = config.__for_client__() self.__dispatch__ = _DispatchChannelEvents() _channels.add(self) def __repr__(self) -> str: return ('Channel({!r}, {!r}, ..., path={!r})' .format(self._host, self._port, self._path)) def _protocol_factory(self) -> H2Protocol: return H2Protocol(Handler(), self._config, self._h2_config) async def _create_connection(self) -> H2Protocol: if self._path is not None: _, protocol = await self._loop.create_unix_connection( self._protocol_factory, self._path, ssl=self._ssl, server_hostname=( self._config.ssl_target_name_override if self._ssl is not None else None ), ) else: _, protocol = await self._loop.create_connection( self._protocol_factory, self._host, self._port, 
ssl=self._ssl, server_hostname=( self._config.ssl_target_name_override if self._ssl is not None else None ), ) return protocol @property def _connected(self) -> bool: return (self._protocol is not None and not self._protocol.handler.connection_lost) async def __connect__(self) -> H2Protocol: if not self._connected: async with self._connect_lock: self._state = _ChannelState.CONNECTING if not self._connected: try: self._protocol = await self._create_connection() except Exception: self._state = _ChannelState.TRANSIENT_FAILURE raise else: self._state = _ChannelState.READY return cast(H2Protocol, self._protocol) # https://python-hyper.org/projects/h2/en/stable/negotiating-http2.html def _get_default_ssl_context( self, *, verify_paths: Optional['_ssl.DefaultVerifyPaths'] = None, ) -> '_ssl.SSLContext': if verify_paths is not None: cafile = verify_paths.cafile capath = verify_paths.capath else: try: import certifi except ImportError: cafile = None else: cafile = certifi.where() capath = None ctx = _ssl.create_default_context( purpose=_ssl.Purpose.SERVER_AUTH, cafile=cafile, capath=capath, ) ctx.minimum_version = _ssl.TLSVersion.TLSv1_2 ctx.set_ciphers('ECDHE+AESGCM:ECDHE+CHACHA20:DHE+AESGCM:DHE+CHACHA20') ctx.set_alpn_protocols(['h2']) return ctx def _get_authority(self, host: str, port: int) -> str: try: ipv6_address = ipaddress.IPv6Address(host) except ipaddress.AddressValueError: pass else: host = f"[{ipv6_address}]" return "{}:{}".format(host, port) def request( self, name: str, cardinality: Cardinality, request_type: Type[_SendType], reply_type: Type[_RecvType], *, timeout: Optional[float] = None, deadline: Optional[Deadline] = None, metadata: Optional[_MetadataLike] = None, ) -> Stream[_SendType, _RecvType]: if timeout is not None and deadline is None: deadline = Deadline.from_timeout(timeout) elif timeout is not None and deadline is not None: deadline = min(Deadline.from_timeout(timeout), deadline) metadata = cast(_Metadata, MultiDict(metadata or ())) return Stream(self, name, metadata, cardinality, request_type, reply_type, codec=self._codec, status_details_codec=self._status_details_codec, dispatch=self.__dispatch__, deadline=deadline) def close(self) -> None: """Closes connection to the server. """ if self._protocol is not None: self._protocol.processor.close() del self._protocol self._state = _ChannelState.IDLE def __del__(self) -> None: if self._protocol is not None: message = 'Unclosed connection: {!r}'.format(self) warnings.warn(message, ResourceWarning) if self._loop.is_closed(): return else: self.close() self._loop.call_exception_handler({'message': message}) async def __aenter__(self) -> 'Channel': return self async def __aexit__( self, exc_type: Optional[Type[BaseException]], exc_val: Optional[BaseException], exc_tb: Optional[TracebackType], ) -> None: self.close() class ServiceMethod(Generic[_SendType, _RecvType]): """ Base class for all gRPC method types """ def __init__( self, channel: Channel, name: str, request_type: Type[_SendType], reply_type: Type[_RecvType], ): self.channel = channel self.name = name self.request_type = request_type self.reply_type = reply_type @property def _cardinality(self) -> Cardinality: raise NotImplementedError def open( self, *, timeout: Optional[float] = None, metadata: Optional[_MetadataLike] = None, ) -> Stream[_SendType, _RecvType]: """Creates and returns :py:class:`Stream` object to perform request to the server. Nothing will happen to the current underlying HTTP/2 connection during this method call. 
It just initializes :py:class:`Stream` object for you. Actual request will be sent only during :py:meth:`Stream.send_request` or :py:meth:`Stream.send_message` coroutine call. :param float timeout: request timeout (seconds) :param metadata: custom request metadata, dict or list of pairs :return: :py:class:`Stream` object """ return self.channel.request(self.name, self._cardinality, self.request_type, self.reply_type, timeout=timeout, metadata=metadata) class UnaryUnaryMethod(ServiceMethod[_SendType, _RecvType]): """ Represents UNARY-UNARY gRPC method type. .. automethod:: __call__ .. automethod:: open """ _cardinality = Cardinality.UNARY_UNARY async def __call__( self, message: _SendType, *, timeout: Optional[float] = None, metadata: Optional[_MetadataLike] = None, ) -> _RecvType: """Coroutine to perform defined call. :param message: message :param float timeout: request timeout (seconds) :param metadata: custom request metadata, dict or list of pairs :return: message """ async with self.open(timeout=timeout, metadata=metadata) as stream: await stream.send_message(message, end=True) reply = await stream.recv_message() assert reply is not None return reply class UnaryStreamMethod(ServiceMethod[_SendType, _RecvType]): """ Represents UNARY-STREAM gRPC method type. .. automethod:: __call__ .. automethod:: open """ _cardinality = Cardinality.UNARY_STREAM async def __call__( self, message: _SendType, *, timeout: Optional[float] = None, metadata: Optional[_MetadataLike] = None, ) -> List[_RecvType]: """Coroutine to perform defined call. :param message: message :param float timeout: request timeout (seconds) :param metadata: custom request metadata, dict or list of pairs :return: sequence of messages """ async with self.open(timeout=timeout, metadata=metadata) as stream: await stream.send_message(message, end=True) return [message async for message in stream] class StreamUnaryMethod(ServiceMethod[_SendType, _RecvType]): """ Represents STREAM-UNARY gRPC method type. .. automethod:: __call__ .. automethod:: open """ _cardinality = Cardinality.STREAM_UNARY async def __call__( self, messages: Sequence[_SendType], *, timeout: Optional[float] = None, metadata: Optional[_MetadataLike] = None, ) -> _RecvType: """Coroutine to perform defined call. :param messages: sequence of messages :param float timeout: request timeout (seconds) :param metadata: custom request metadata, dict or list of pairs :return: message """ async with self.open(timeout=timeout, metadata=metadata) as stream: for message in messages[:-1]: await stream.send_message(message) if messages: await stream.send_message(messages[-1], end=True) else: await stream.send_request(end=True) reply = await stream.recv_message() assert reply is not None return reply class StreamStreamMethod(ServiceMethod[_SendType, _RecvType]): """ Represents STREAM-STREAM gRPC method type. .. automethod:: __call__ .. automethod:: open """ _cardinality = Cardinality.STREAM_STREAM async def __call__( self, messages: Sequence[_SendType], *, timeout: Optional[float] = None, metadata: Optional[_MetadataLike] = None, ) -> List[_RecvType]: """Coroutine to perform defined call. 
:param messages: sequence of messages :param float timeout: request timeout (seconds) :param metadata: custom request metadata, dict or list of pairs :return: sequence of messages """ async with self.open(timeout=timeout, metadata=metadata) as stream: for message in messages[:-1]: await stream.send_message(message) if messages: await stream.send_message(messages[-1], end=True) else: await stream.send_request(end=True) return [message async for message in stream] ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/config.py0000644000175100001770000001123000000000000017620 0ustar00runnerdocker00000000000000from typing import Optional, TypeVar, Callable, Any, Union, cast from dataclasses import dataclass, field, fields, replace, is_dataclass class _DefaultType: def __repr__(self) -> str: return '' _DEFAULT = _DefaultType() _ValidatorType = Callable[[str, Any], None] _ConfigurationType = TypeVar('_ConfigurationType') _WMIN = 2 ** 16 - 1 _4MiB = 4 * 2 ** 20 _WMAX = 2 ** 31 - 1 def _optional(validator: _ValidatorType) -> _ValidatorType: def proc(name: str, value: Any) -> None: if value is not None: validator(name, value) return proc def _chain(*validators: _ValidatorType) -> _ValidatorType: def proc(name: str, value: Any) -> None: for validator in validators: validator(name, value) return proc def _of_type(*types: type) -> _ValidatorType: def proc(name: str, value: Any) -> None: if not isinstance(value, types): types_repr = ' or '.join(str(t) for t in types) raise TypeError(f'"{name}" should be of type {types_repr}') return proc def _positive(name: str, value: Union[float, int]) -> None: if value <= 0: raise ValueError(f'"{name}" should be positive') def _non_negative(name: str, value: Union[float, int]) -> None: if value < 0: raise ValueError(f'"{name}" should not be negative') def _range(min_: int, max_: int) -> _ValidatorType: def proc(name: str, value: Union[float, int]) -> None: if value < min_: raise ValueError(f'"{name}" should be higher or equal to {min_}') if value > max_: raise ValueError(f'"{name}" should be less or equal to {max_}') return proc def _validate(config: 'Configuration') -> None: for f in fields(config): validate_fn = f.metadata.get('validate') if validate_fn is not None: value = getattr(config, f.name) if value is not _DEFAULT: validate_fn(f.name, value) def _with_defaults( cls: _ConfigurationType, metadata_key: str, ) -> _ConfigurationType: assert is_dataclass(cls) defaults = {} for f in fields(cls): if getattr(cls, f.name) is _DEFAULT: if metadata_key in f.metadata: default = f.metadata[metadata_key] else: default = f.metadata['default'] defaults[f.name] = default return replace(cls, **defaults) # type: ignore @dataclass(frozen=True) class Configuration: _keepalive_time: Optional[float] = field( default=cast(None, _DEFAULT), metadata={ 'validate': _optional(_chain(_of_type(int, float), _positive)), 'server-default': 7200.0, 'client-default': None, 'test-default': None, }, ) _keepalive_timeout: float = field( default=20.0, metadata={ 'validate': _chain(_of_type(int, float), _positive), }, ) _keepalive_permit_without_calls: bool = field( default=False, metadata={ 'validate': _optional(_of_type(bool)), }, ) _http2_max_pings_without_data: int = field( default=2, metadata={ 'validate': _optional(_chain(_of_type(int), _non_negative)), }, ) _http2_min_sent_ping_interval_without_data: float = field( default=300, metadata={ 'validate': _optional(_chain(_of_type(int, float), _positive)), }, ) #: 
Sets inbound window size for a connection. HTTP/2 spec allows this value #: to be from 64 KiB to 2 GiB, 4 MiB is used by default http2_connection_window_size: int = field( default=_4MiB, metadata={ 'validate': _chain(_of_type(int), _range(_WMIN, _WMAX)), }, ) #: Sets inbound window size for a stream. HTTP/2 spec allows this value #: to be from 64 KiB to 2 GiB, 4 MiB is used by default http2_stream_window_size: int = field( default=_4MiB, metadata={ 'validate': _chain(_of_type(int), _range(_WMIN, _WMAX)), }, ) #: NOTE: This should be used for testing only. Overrides the hostname that #: the target server’s certificate will be matched against. By default, the #: value of the host argument is used. ssl_target_name_override: Optional[str] = field( default=None, ) def __post_init__(self) -> None: _validate(self) def __for_server__(self) -> 'Configuration': return _with_defaults(self, 'server-default') def __for_client__(self) -> 'Configuration': return _with_defaults(self, 'client-default') def __for_test__(self) -> 'Configuration': return _with_defaults(self, 'test-default') ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/const.py0000644000175100001770000000406200000000000017506 0ustar00runnerdocker00000000000000import enum import collections @enum.unique class Status(enum.Enum): """Predefined gRPC status codes represented as enum See also: https://github.com/grpc/grpc/blob/master/doc/statuscodes.md """ #: The operation completed successfully OK = 0 #: The operation was cancelled (typically by the caller) CANCELLED = 1 #: Generic status to describe error when it can't be described using #: other statuses UNKNOWN = 2 #: Client specified an invalid argument INVALID_ARGUMENT = 3 #: Deadline expired before operation could complete DEADLINE_EXCEEDED = 4 #: Some requested entity was not found NOT_FOUND = 5 #: Some entity that we attempted to create already exists ALREADY_EXISTS = 6 #: The caller does not have permission to execute the specified operation PERMISSION_DENIED = 7 #: Some resource has been exhausted, perhaps a per-user quota, or perhaps #: the entire file system is out of space RESOURCE_EXHAUSTED = 8 #: Operation was rejected because the system is not in a state required #: for the operation's execution FAILED_PRECONDITION = 9 #: The operation was aborted ABORTED = 10 #: Operation was attempted past the valid range OUT_OF_RANGE = 11 #: Operation is not implemented or not supported/enabled in this service UNIMPLEMENTED = 12 #: Internal errors INTERNAL = 13 #: The service is currently unavailable UNAVAILABLE = 14 #: Unrecoverable data loss or corruption DATA_LOSS = 15 #: The request does not have valid authentication credentials for the #: operation UNAUTHENTICATED = 16 _Cardinality = collections.namedtuple( '_Cardinality', 'client_streaming, server_streaming', ) @enum.unique class Cardinality(_Cardinality, enum.Enum): UNARY_UNARY = _Cardinality(False, False) UNARY_STREAM = _Cardinality(False, True) STREAM_UNARY = _Cardinality(True, False) STREAM_STREAM = _Cardinality(True, True) Handler = collections.namedtuple( 'Handler', 'func, cardinality, request_type, reply_type', ) ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1721851946.6562335 grpclib-0.4.8rc2/grpclib/encoding/0000755000175100001770000000000000000000000017572 5ustar00runnerdocker00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 
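# Hedged usage sketch (not part of the distribution) for the const module
# above: Cardinality mixes a namedtuple into an Enum, so the
# client_streaming/server_streaming flags are readable directly on each
# member, and Handler is a plain namedtuple bundling a handler coroutine with
# its cardinality and message classes. The `echo` coroutine is a hypothetical
# placeholder; real code passes protobuf message classes instead of None.
from grpclib.const import Cardinality, Handler, Status

assert Cardinality.UNARY_STREAM.client_streaming is False
assert Cardinality.UNARY_STREAM.server_streaming is True

async def echo(stream):  # hypothetical request handler
    ...

handler = Handler(echo, Cardinality.UNARY_UNARY, None, None)
print(handler.cardinality.name, Status.OK.value)  # UNARY_UNARY 0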
grpclib-0.4.8rc2/grpclib/encoding/__init__.py0000644000175100001770000000000000000000000021671 0ustar00runnerdocker00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/encoding/base.py0000644000175100001770000000135700000000000021064 0ustar00runnerdocker00000000000000import abc from typing import Any, Optional from ..const import Status GRPC_CONTENT_TYPE = 'application/grpc' class CodecBase(abc.ABC): @property @abc.abstractmethod def __content_subtype__(self) -> str: pass @abc.abstractmethod def encode(self, message: Any, message_type: Any) -> bytes: pass @abc.abstractmethod def decode(self, data: bytes, message_type: Any) -> Any: pass class StatusDetailsCodecBase(abc.ABC): @abc.abstractmethod def encode( self, status: Status, message: Optional[str], details: Any, ) -> bytes: pass @abc.abstractmethod def decode( self, status: Status, message: Optional[str], data: bytes, ) -> Any: pass ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/encoding/proto.py0000644000175100001770000000512000000000000021305 0ustar00runnerdocker00000000000000from typing import TYPE_CHECKING, Type, Optional, Sequence, Any from ..const import Status from ..utils import _cached from .base import CodecBase, StatusDetailsCodecBase if TYPE_CHECKING: from google.protobuf.message import Message # noqa from .._typing import IProtoMessage # noqa @_cached def _status_pb2() -> Any: from google.rpc import status_pb2 return status_pb2 @_cached def _sym_db() -> Any: from google.protobuf.symbol_database import Default return Default() @_cached def _googleapis_available() -> bool: try: import google.rpc.status_pb2 # noqa except ImportError: return False else: return True class ProtoCodec(CodecBase): __content_subtype__ = 'proto' def encode( self, message: 'IProtoMessage', message_type: Type['IProtoMessage'], ) -> bytes: if not isinstance(message, message_type): raise TypeError('Message must be of type {!r}, not {!r}' .format(message_type, type(message))) return message.SerializeToString() def decode( self, data: bytes, message_type: Type['IProtoMessage'], ) -> 'IProtoMessage': return message_type.FromString(data) class _Unknown: def __init__(self, name: str) -> None: self._name = name def __repr__(self) -> str: return 'Unknown({!r})'.format(self._name) class ProtoStatusDetailsCodec(StatusDetailsCodecBase): def encode( self, status: Status, message: Optional[str], details: Sequence['Message'], ) -> bytes: status_pb2 = _status_pb2() status_proto = status_pb2.Status(code=status.value, message=message) if details is not None: for detail in details: detail_container = status_proto.details.add() detail_container.Pack(detail) return status_proto.SerializeToString() # type: ignore def decode( self, status: Status, message: Optional[str], data: bytes, ) -> Sequence[Any]: status_pb2 = _status_pb2() sym_db = _sym_db() status_proto = status_pb2.Status.FromString(data) details = [] for detail_container in status_proto.details: try: msg_type = sym_db.GetSymbol(detail_container.TypeName()) except KeyError: details.append(_Unknown(detail_container.TypeName())) continue detail = msg_type() detail_container.Unpack(detail) details.append(detail) return details ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/events.py0000644000175100001770000002441000000000000017663 
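# Hedged round-trip sketch for ProtoCodec (defined above): encode() verifies
# the message type and calls SerializeToString(), decode() parses bytes back
# with FromString(). HealthCheckRequest from this same package is used only
# as a convenient concrete protobuf message.
from grpclib.encoding.proto import ProtoCodec
from grpclib.health.v1.health_pb2 import HealthCheckRequest

codec = ProtoCodec()
payload = codec.encode(HealthCheckRequest(service='x'), HealthCheckRequest)
assert codec.decode(payload, HealthCheckRequest).service == 'x'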
0ustar00runnerdocker00000000000000from typing import TYPE_CHECKING, Type, TypeVar, Tuple, FrozenSet, Dict from typing import Optional, Callable, Any, Collection, List, Coroutine from itertools import chain from collections import defaultdict from .const import Status from .metadata import Deadline, _Metadata if TYPE_CHECKING: from .stream import _SendType, _RecvType from ._typing import IEventsTarget, IServerMethodFunc # noqa from .protocol import Peer class _Event: __slots__ = ('__interrupted__',) __payload__: Collection[str] = () __readonly__: FrozenSet[str] = frozenset() __interrupted__: bool def __init__(self, **kwargs): # type: ignore assert len(kwargs) == len(self.__slots__), self.__slots__ super().__setattr__('__interrupted__', False) for key, value in kwargs.items(): super().__setattr__(key, value) def __setattr__(self, key: str, value: Any) -> None: if key in self.__readonly__: raise AttributeError('Read-only property: {!r}'.format(key)) else: super().__setattr__(key, value) def interrupt(self) -> None: super().__setattr__('__interrupted__', True) _EventType = TypeVar('_EventType', bound=_Event) class _EventMeta(type): def __new__(mcs, name, bases, params): # type: ignore annotations = params.get('__annotations__') or {} payload = params.get('__payload__') or () params['__slots__'] = tuple(name for name in annotations) params['__readonly__'] = frozenset(name for name in annotations if name not in payload) return super().__new__(mcs, name, bases, params) async def _ident(*args, **_): # type: ignore return args def _dispatches(event_type: Type[_Event]) -> Callable[[Any], Any]: def decorator(func: Any) -> Any: func.__dispatches__ = event_type return func return decorator _Callback = Callable[[_Event], Coroutine[Any, Any, None]] class _Dispatch: __dispatch_methods__: Dict[Type[_Event], str] = {} def __init__(self) -> None: self._listeners: Dict[Type[_Event], List[_Callback]] = defaultdict(list) for name in self.__dispatch_methods__.values(): self.__dict__[name] = _ident def add_listener( self, event_type: Type[_Event], callback: _Callback, ) -> None: self.__dict__.pop(self.__dispatch_methods__[event_type], None) self._listeners[event_type].append(callback) async def __dispatch__(self, event: _Event) -> Any: for callback in self._listeners[event.__class__]: await callback(event) if event.__interrupted__: break return tuple(getattr(event, name) for name in event.__payload__) class _DispatchMeta(type): def __new__(mcs, name, bases, params): # type: ignore dispatch_methods = dict(chain.from_iterable( getattr(base, '__dispatch_methods__', {}).items() for base in bases )) for key, value in params.items(): dispatches = getattr(value, '__dispatches__', None) if dispatches is not None: assert (isinstance(dispatches, type) and issubclass(dispatches, _Event)), dispatches assert dispatches not in dispatch_methods, dispatches dispatch_methods[dispatches] = key params['__dispatch_methods__'] = dispatch_methods return super().__new__(mcs, name, bases, params) def listen( target: 'IEventsTarget', event_type: Type[_EventType], callback: Callable[[_EventType], Coroutine[Any, Any, None]], ) -> None: """Registers a listener function for the given target and event type .. 
code-block:: python3 async def callback(event: SomeEvent): print(event.data) listen(target, SomeEvent, callback) """ target.__dispatch__.add_listener(event_type, callback) class SendMessage(_Event, metaclass=_EventMeta): """Dispatches before sending message to the other party :param mutable message: message to send """ __payload__ = ('message',) message: Any class RecvMessage(_Event, metaclass=_EventMeta): """Dispatches after message was received from the other party :param mutable message: received message """ __payload__ = ('message',) message: Any class _DispatchCommonEvents(_Dispatch, metaclass=_DispatchMeta): @_dispatches(SendMessage) async def send_message(self, message: '_SendType') -> Tuple['_SendType']: return await self.__dispatch__(SendMessage( # type: ignore message=message, )) @_dispatches(RecvMessage) async def recv_message(self, message: '_RecvType') -> Tuple['_RecvType']: return await self.__dispatch__(RecvMessage( # type: ignore message=message, )) class RecvRequest(_Event, metaclass=_EventMeta): """Dispatches after request was received from the client :param mutable metadata: invocation metadata :param mutable method_func: coroutine function to process this request, accepts :py:class:`~grpclib.server.Stream` :param read-only method_name: RPC's method name :param read-only deadline: request's :py:class:`~grpclib.metadata.Deadline` :param read-only content_type: request's content type :param read-only user_agent: request's user agent :param read-only peer: request's :py:class:`~grpclib.protocol.Peer` """ __payload__ = ('metadata', 'method_func') metadata: _Metadata method_func: 'IServerMethodFunc' method_name: str deadline: Optional[Deadline] content_type: str user_agent: Optional[str] peer: 'Peer' class SendInitialMetadata(_Event, metaclass=_EventMeta): """Dispatches before sending headers with initial metadata to the client :param mutable metadata: initial metadata """ __payload__ = ('metadata',) metadata: _Metadata class SendTrailingMetadata(_Event, metaclass=_EventMeta): """Dispatches before sending trailers with trailing metadata to the client :param mutable metadata: trailing metadata :param read-only status: status of the RPC call :param read-only status_message: description of the status :param read-only status_details: additional status details """ __payload__ = ('metadata',) metadata: _Metadata status: Status status_message: Optional[str] status_details: Any class _DispatchServerEvents(_DispatchCommonEvents): @_dispatches(RecvRequest) async def recv_request( self, metadata: _Metadata, method_func: 'IServerMethodFunc', *, method_name: str, deadline: Optional[Deadline], content_type: str, user_agent: Optional[str], peer: 'Peer', ) -> Tuple[_Metadata, 'IServerMethodFunc']: return await self.__dispatch__(RecvRequest( # type: ignore metadata=metadata, method_func=method_func, method_name=method_name, deadline=deadline, content_type=content_type, user_agent=user_agent, peer=peer, )) @_dispatches(SendInitialMetadata) async def send_initial_metadata( self, metadata: _Metadata, ) -> Tuple[_Metadata]: return await self.__dispatch__(SendInitialMetadata( # type: ignore metadata=metadata, )) @_dispatches(SendTrailingMetadata) async def send_trailing_metadata( self, metadata: _Metadata, *, status: Status, status_message: Optional[str], status_details: Any, ) -> Tuple[_Metadata]: return await self.__dispatch__(SendTrailingMetadata( # type: ignore metadata=metadata, status=status, status_message=status_message, status_details=status_details, )) class SendRequest(_Event, 
metaclass=_EventMeta): """Dispatches before sending request to the server :param mutable metadata: invocation metadata :param read-only method_name: RPC's method name :param read-only deadline: request's :py:class:`~grpclib.metadata.Deadline` :param read-only content_type: request's content type """ __payload__ = ('metadata',) metadata: _Metadata method_name: str deadline: Optional[Deadline] content_type: str class RecvInitialMetadata(_Event, metaclass=_EventMeta): """Dispatches after headers with initial metadata were received from the server :param mutable metadata: initial metadata """ __payload__ = ('metadata',) metadata: _Metadata class RecvTrailingMetadata(_Event, metaclass=_EventMeta): """Dispatches after trailers with trailing metadata were received from the server :param mutable metadata: trailing metadata :param read-only status: status of the RPC call :param read-only status_message: description of the status :param read-only status_details: additional status details """ __payload__ = ('metadata',) metadata: _Metadata status: Status status_message: Optional[str] status_details: Any class _DispatchChannelEvents(_DispatchCommonEvents): @_dispatches(SendRequest) async def send_request( self, metadata: _Metadata, *, method_name: str, deadline: Optional[Deadline], content_type: str, ) -> Tuple[_Metadata]: return await self.__dispatch__(SendRequest( # type: ignore metadata=metadata, method_name=method_name, deadline=deadline, content_type=content_type, )) @_dispatches(RecvInitialMetadata) async def recv_initial_metadata( self, metadata: _Metadata, ) -> Tuple[_Metadata]: return await self.__dispatch__(RecvInitialMetadata( # type: ignore metadata=metadata, )) @_dispatches(RecvTrailingMetadata) async def recv_trailing_metadata( self, metadata: _Metadata, *, status: Status, status_message: Optional[str], status_details: Any, ) -> Tuple[_Metadata]: return await self.__dispatch__(RecvTrailingMetadata( # type: ignore metadata=metadata, status=status, status_message=status_message, status_details=status_details, )) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/exceptions.py0000644000175100001770000000403100000000000020535 0ustar00runnerdocker00000000000000from typing import Optional, Any from .const import Status class GRPCError(Exception): """Expected error, may be raised during RPC call There can be multiple origins of this error. It can be generated on the server-side and on the client-side. If this error originates from the server, on the wire this error is represented as ``grpc-status`` and ``grpc-message`` trailers. Possible values of the ``grpc-status`` trailer are described in the gRPC protocol definition. In ``grpclib`` these values are represented as :py:class:`~grpclib.const.Status` enum. Here are possible origins of this error: - you may raise this error to cancel current call on the server-side or return non-OK :py:class:`~grpclib.const.Status` using :py:meth:`~grpclib.server.Stream.send_trailing_metadata` method `(e.g. resource not found)` - server may return non-OK ``grpc-status`` in different failure conditions `(e.g. invalid request)` - client raises this error for non-OK ``grpc-status`` from the server - client may raise this error in different failure conditions `(e.g. 
server returned unsupported` ``:content-type`` `header)` """ def __init__( self, status: Status, message: Optional[str] = None, details: Any = None, ) -> None: super().__init__(status, message, details) #: :py:class:`~grpclib.const.Status` of the error self.status = status #: Error message self.message = message #: Error details self.details = details class ProtocolError(Exception): """Unexpected error, raised by ``grpclib`` when your code violates gRPC protocol This error means that you probably should fix your code. """ class StreamTerminatedError(Exception): """Unexpected error, raised when we receive ``RST_STREAM`` frame from the other side This error means that the other side decided to forcefully cancel current call, probably because of a protocol error. """ ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1721851946.6562335 grpclib-0.4.8rc2/grpclib/health/0000755000175100001770000000000000000000000017251 5ustar00runnerdocker00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/health/__init__.py0000644000175100001770000000000000000000000021350 0ustar00runnerdocker00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/health/check.py0000644000175100001770000001425600000000000020710 0ustar00runnerdocker00000000000000import abc import time import asyncio import logging import warnings from typing import Optional, Set, Callable, Awaitable from ..utils import DeadlineWrapper from ..metadata import Deadline log = logging.getLogger(__name__) DEFAULT_CHECK_TTL = 30 DEFAULT_CHECK_TIMEOUT = 10 _Status = Optional[bool] class CheckBase(abc.ABC): @abc.abstractmethod def __status__(self) -> _Status: pass @abc.abstractmethod async def __check__(self) -> _Status: pass @abc.abstractmethod async def __subscribe__(self) -> asyncio.Event: pass @abc.abstractmethod async def __unsubscribe__(self, event: asyncio.Event) -> None: pass class ServiceCheck(CheckBase): """Performs periodic checks Example: .. 
code-block:: python3 async def db_test(): # raised exceptions are the same as returning False, # except that exceptions will be logged await db.execute('SELECT 1;') return True db_check = ServiceCheck(db_test) """ _value = None _poll_task = None _last_check = None def __init__( self, func: Callable[[], Awaitable[_Status]], *, loop: Optional[asyncio.AbstractEventLoop] = None, check_ttl: float = DEFAULT_CHECK_TTL, check_timeout: float = DEFAULT_CHECK_TIMEOUT, ) -> None: """ :param func: callable object which returns awaitable object, where result is one of: ``True`` (healthy), ``False`` (unhealthy), or ``None`` (unknown) :param loop: (deprecated) asyncio-compatible event loop :param check_ttl: how long we can cache result of the previous check :param check_timeout: timeout for this check """ self._func = func self._check_ttl = check_ttl self._check_timeout = check_timeout self._events: Set[asyncio.Event] = set() if loop: warnings.warn("The loop argument is deprecated and scheduled " "for removal in grpclib 0.5", DeprecationWarning, stacklevel=2) self._check_lock = asyncio.Event() self._check_lock.set() self._check_wrapper = DeadlineWrapper() def __status__(self) -> _Status: return self._value async def __check__(self) -> _Status: if ( self._last_check is not None and time.monotonic() - self._last_check < self._check_ttl ): return self._value if not self._check_lock.is_set(): # wait until concurrent check succeed await self._check_lock.wait() return self._value prev_value = self._value self._check_lock.clear() try: deadline = Deadline.from_timeout(self._check_timeout) with self._check_wrapper.start(deadline): value = await self._func() if value is not None and not isinstance(value, bool): raise TypeError('Invalid status type: {!r}'.format(value)) self._value = value except asyncio.CancelledError: raise except Exception: log.exception('Health check failed') self._value = False finally: self._check_lock.set() self._last_check = time.monotonic() if self._value != prev_value: log_level = log.info if self._value else log.warning log_level('Health check %r status changed to %r', self._func, self._value) # notify all watchers that this check was changed for event in self._events: event.set() return self._value async def _poll(self) -> None: while True: status = await self.__check__() if status: await asyncio.sleep(self._check_ttl) else: await asyncio.sleep(self._check_ttl) # TODO: change interval? async def __subscribe__(self) -> asyncio.Event: if self._poll_task is None: loop = asyncio.get_event_loop() self._poll_task = loop.create_task(self._poll()) event = asyncio.Event() self._events.add(event) return event async def __unsubscribe__(self, event: asyncio.Event) -> None: self._events.discard(event) if not self._events: assert self._poll_task is not None task = self._poll_task self._poll_task = None task.cancel() try: await task except asyncio.CancelledError: pass class ServiceStatus(CheckBase): """Contains status of a proactive check Example: .. 
code-block:: python3 redis_status = ServiceStatus() # detected that Redis is available redis_status.set(True) # detected that Redis is unavailable redis_status.set(False) """ def __init__( self, *, loop: Optional[asyncio.AbstractEventLoop] = None, ) -> None: """ :param loop: (deprecated) asyncio-compatible event loop """ if loop: warnings.warn("The loop argument is deprecated and scheduled " "for removal in grpclib 0.5", DeprecationWarning, stacklevel=2) self._value: _Status = None self._events: Set[asyncio.Event] = set() def set(self, value: _Status) -> None: """Sets current status of a check :param value: ``True`` (healthy), ``False`` (unhealthy), or ``None`` (unknown) """ prev_value = self._value self._value = value if self._value != prev_value: # notify all watchers that this check was changed for event in self._events: event.set() def __status__(self) -> _Status: return self._value async def __check__(self) -> _Status: return self._value async def __subscribe__(self) -> asyncio.Event: event = asyncio.Event() self._events.add(event) return event async def __unsubscribe__(self, event: asyncio.Event) -> None: self._events.discard(event) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/health/service.py0000644000175100001770000001106000000000000021261 0ustar00runnerdocker00000000000000import asyncio from typing import TYPE_CHECKING, Set, Collection, Mapping, Dict, Any, Optional from itertools import chain from ..const import Status from ..utils import _service_name from ..server import Stream from .v1.health_pb2 import HealthCheckRequest, HealthCheckResponse from .v1.health_grpc import HealthBase if TYPE_CHECKING: from .check import CheckBase # noqa from .._typing import ICheckable # noqa def _status( checks: Set['CheckBase'], ) -> 'HealthCheckResponse.ServingStatus.ValueType': statuses = {check.__status__() for check in checks} if statuses == {None}: return HealthCheckResponse.UNKNOWN elif statuses == {True}: return HealthCheckResponse.SERVING else: return HealthCheckResponse.NOT_SERVING def _reset_waits( events: Collection[asyncio.Event], waits: Mapping[asyncio.Event, 'asyncio.Task[bool]'], ) -> Dict[asyncio.Event, 'asyncio.Task[bool]']: new_waits = {} for event in events: wait = waits.get(event) if wait is None or wait.done(): event.clear() wait = asyncio.ensure_future(event.wait()) new_waits[event] = wait return new_waits class _Overall: # `_service_name` should return '' (empty string) for this service def __mapping__(self) -> Dict[str, Any]: return {'//': None} #: Represents overall health status of all services OVERALL = _Overall() _ChecksConfig = Mapping['ICheckable', Collection['CheckBase']] class Health(HealthBase): """Health-checking service Example: .. 
code-block:: python3 from grpclib.health.service import Health auth = AuthService() billing = BillingService() health = Health({ auth: [redis_status], billing: [db_check], }) server = Server([auth, billing, health]) """ def __init__(self, checks: Optional[_ChecksConfig] = None) -> None: if checks is None: checks = {OVERALL: []} elif OVERALL not in checks: checks = dict(checks) checks[OVERALL] = list(chain.from_iterable(checks.values())) self._checks = {_service_name(s): set(check_list) for s, check_list in checks.items()} async def Check( self, stream: Stream[HealthCheckRequest, HealthCheckResponse], ) -> None: """Implements synchronous periodic checks""" request = await stream.recv_message() assert request is not None checks = self._checks.get(request.service) if checks is None: await stream.send_trailing_metadata(status=Status.NOT_FOUND) elif len(checks) == 0: await stream.send_message(HealthCheckResponse( status=HealthCheckResponse.SERVING, )) else: for check in checks: await check.__check__() await stream.send_message(HealthCheckResponse( status=_status(checks), )) async def Watch( self, stream: Stream[HealthCheckRequest, HealthCheckResponse], ) -> None: request = await stream.recv_message() assert request is not None checks = self._checks.get(request.service) if checks is None: await stream.send_message(HealthCheckResponse( status=HealthCheckResponse.SERVICE_UNKNOWN, )) while True: await asyncio.sleep(3600) elif len(checks) == 0: await stream.send_message(HealthCheckResponse( status=HealthCheckResponse.SERVING, )) while True: await asyncio.sleep(3600) else: events = [] for check in checks: events.append(await check.__subscribe__()) waits = _reset_waits(events, {}) try: await stream.send_message(HealthCheckResponse( status=_status(checks), )) while True: await asyncio.wait(waits.values(), return_when=asyncio.FIRST_COMPLETED) waits = _reset_waits(events, waits) await stream.send_message(HealthCheckResponse( status=_status(checks), )) finally: for check, event in zip(checks, events): await check.__unsubscribe__(event) for wait in waits.values(): if not wait.done(): wait.cancel() ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1721851946.6562335 grpclib-0.4.8rc2/grpclib/health/v1/0000755000175100001770000000000000000000000017577 5ustar00runnerdocker00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/health/v1/__init__.py0000644000175100001770000000000000000000000021676 0ustar00runnerdocker00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851944.0 grpclib-0.4.8rc2/grpclib/health/v1/health_grpc.py0000644000175100001770000000373000000000000022434 0ustar00runnerdocker00000000000000# Generated by the Protocol Buffers compiler. DO NOT EDIT! 
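# Hedged configuration sketch for the Health service defined above: with no
# argument it serves only the overall status (the '' service name) and
# reports SERVING, because OVERALL maps to an empty check set; passing a
# config gates statuses on checks. The `db_test` probe is a hypothetical
# placeholder, in the spirit of the ServiceCheck example earlier in this
# file.
from grpclib.health.check import ServiceCheck
from grpclib.health.service import Health, OVERALL

async def db_test():  # hypothetical health probe
    return True

db_check = ServiceCheck(db_test)

health = Health()                       # overall status only, always SERVING
health = Health({OVERALL: [db_check]})  # overall status follows db_check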
# source: grpclib/health/v1/health.proto # plugin: grpclib.plugin.main import abc import typing import grpclib.const import grpclib.client if typing.TYPE_CHECKING: import grpclib.server import grpclib.health.v1.health_pb2 class HealthBase(abc.ABC): @abc.abstractmethod async def Check(self, stream: 'grpclib.server.Stream[grpclib.health.v1.health_pb2.HealthCheckRequest, grpclib.health.v1.health_pb2.HealthCheckResponse]') -> None: pass @abc.abstractmethod async def Watch(self, stream: 'grpclib.server.Stream[grpclib.health.v1.health_pb2.HealthCheckRequest, grpclib.health.v1.health_pb2.HealthCheckResponse]') -> None: pass def __mapping__(self) -> typing.Dict[str, grpclib.const.Handler]: return { '/grpc.health.v1.Health/Check': grpclib.const.Handler( self.Check, grpclib.const.Cardinality.UNARY_UNARY, grpclib.health.v1.health_pb2.HealthCheckRequest, grpclib.health.v1.health_pb2.HealthCheckResponse, ), '/grpc.health.v1.Health/Watch': grpclib.const.Handler( self.Watch, grpclib.const.Cardinality.UNARY_STREAM, grpclib.health.v1.health_pb2.HealthCheckRequest, grpclib.health.v1.health_pb2.HealthCheckResponse, ), } class HealthStub: def __init__(self, channel: grpclib.client.Channel) -> None: self.Check = grpclib.client.UnaryUnaryMethod( channel, '/grpc.health.v1.Health/Check', grpclib.health.v1.health_pb2.HealthCheckRequest, grpclib.health.v1.health_pb2.HealthCheckResponse, ) self.Watch = grpclib.client.UnaryStreamMethod( channel, '/grpc.health.v1.Health/Watch', grpclib.health.v1.health_pb2.HealthCheckRequest, grpclib.health.v1.health_pb2.HealthCheckResponse, ) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851944.0 grpclib-0.4.8rc2/grpclib/health/v1/health_pb2.py0000644000175100001770000000431300000000000022162 0ustar00runnerdocker00000000000000# -*- coding: utf-8 -*- # Generated by the protocol buffer compiler. DO NOT EDIT! 
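# Hedged client-side sketch for the generated HealthStub above: Check is a
# UnaryUnaryMethod, so it can be awaited directly with a request message; an
# empty service name asks for the overall server status. The 127.0.0.1:50051
# endpoint is an assumption for illustration.
from grpclib.client import Channel
from grpclib.health.v1.health_grpc import HealthStub
from grpclib.health.v1.health_pb2 import HealthCheckRequest, HealthCheckResponse

async def server_is_healthy() -> bool:
    async with Channel('127.0.0.1', 50051) as channel:
        reply = await HealthStub(channel).Check(HealthCheckRequest(service=''))
        return reply.status == HealthCheckResponse.SERVING
# e.g. asyncio.run(server_is_healthy())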
# source: grpclib/health/v1/health.proto # Protobuf Python Version: 4.25.1 """Generated protocol buffer code.""" from google.protobuf import descriptor as _descriptor from google.protobuf import descriptor_pool as _descriptor_pool from google.protobuf import symbol_database as _symbol_database from google.protobuf.internal import builder as _builder # @@protoc_insertion_point(imports) _sym_db = _symbol_database.Default() DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x1egrpclib/health/v1/health.proto\x12\x0egrpc.health.v1\"%\n\x12HealthCheckRequest\x12\x0f\n\x07service\x18\x01 \x01(\t\"\xa9\x01\n\x13HealthCheckResponse\x12\x41\n\x06status\x18\x01 \x01(\x0e\x32\x31.grpc.health.v1.HealthCheckResponse.ServingStatus\"O\n\rServingStatus\x12\x0b\n\x07UNKNOWN\x10\x00\x12\x0b\n\x07SERVING\x10\x01\x12\x0f\n\x0bNOT_SERVING\x10\x02\x12\x13\n\x0fSERVICE_UNKNOWN\x10\x03\x32\xae\x01\n\x06Health\x12P\n\x05\x43heck\x12\".grpc.health.v1.HealthCheckRequest\x1a#.grpc.health.v1.HealthCheckResponse\x12R\n\x05Watch\x12\".grpc.health.v1.HealthCheckRequest\x1a#.grpc.health.v1.HealthCheckResponse0\x01\x42\x61\n\x11io.grpc.health.v1B\x0bHealthProtoP\x01Z,google.golang.org/grpc/health/grpc_health_v1\xaa\x02\x0eGrpc.Health.V1b\x06proto3') _globals = globals() _builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals) _builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'grpclib.health.v1.health_pb2', _globals) if _descriptor._USE_C_DESCRIPTORS == False: _globals['DESCRIPTOR']._options = None _globals['DESCRIPTOR']._serialized_options = b'\n\021io.grpc.health.v1B\013HealthProtoP\001Z,google.golang.org/grpc/health/grpc_health_v1\252\002\016Grpc.Health.V1' _globals['_HEALTHCHECKREQUEST']._serialized_start=50 _globals['_HEALTHCHECKREQUEST']._serialized_end=87 _globals['_HEALTHCHECKRESPONSE']._serialized_start=90 _globals['_HEALTHCHECKRESPONSE']._serialized_end=259 _globals['_HEALTHCHECKRESPONSE_SERVINGSTATUS']._serialized_start=180 _globals['_HEALTHCHECKRESPONSE_SERVINGSTATUS']._serialized_end=259 _globals['_HEALTH']._serialized_start=262 _globals['_HEALTH']._serialized_end=436 # @@protoc_insertion_point(module_scope) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851944.0 grpclib-0.4.8rc2/grpclib/health/v1/health_pb2.pyi0000644000175100001770000000475700000000000022347 0ustar00runnerdocker00000000000000""" @generated by mypy-protobuf. Do not edit manually! isort:skip_file The canonical version of this proto can be found at https://github.com/grpc/grpc-proto/blob/master/grpc/health/v1/health.proto """ import builtins import google.protobuf.descriptor import google.protobuf.internal.enum_type_wrapper import google.protobuf.message import sys import typing if sys.version_info >= (3, 10): import typing as typing_extensions else: import typing_extensions DESCRIPTOR: google.protobuf.descriptor.FileDescriptor @typing.final class HealthCheckRequest(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor SERVICE_FIELD_NUMBER: builtins.int service: builtins.str def __init__( self, *, service: builtins.str = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["service", b"service"]) -> None: ... 
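# Note: the module-level `global___<Name> = <Name>` aliases below follow
# mypy-protobuf's convention, giving the stub a collision-proof
# fully-qualified name for referencing these messages from nested scopes.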
global___HealthCheckRequest = HealthCheckRequest @typing.final class HealthCheckResponse(google.protobuf.message.Message): DESCRIPTOR: google.protobuf.descriptor.Descriptor class _ServingStatus: ValueType = typing.NewType("ValueType", builtins.int) V: typing_extensions.TypeAlias = ValueType class _ServingStatusEnumTypeWrapper(google.protobuf.internal.enum_type_wrapper._EnumTypeWrapper[HealthCheckResponse._ServingStatus.ValueType], builtins.type): DESCRIPTOR: google.protobuf.descriptor.EnumDescriptor UNKNOWN: HealthCheckResponse._ServingStatus.ValueType # 0 SERVING: HealthCheckResponse._ServingStatus.ValueType # 1 NOT_SERVING: HealthCheckResponse._ServingStatus.ValueType # 2 SERVICE_UNKNOWN: HealthCheckResponse._ServingStatus.ValueType # 3 """Used only by the Watch method.""" class ServingStatus(_ServingStatus, metaclass=_ServingStatusEnumTypeWrapper): ... UNKNOWN: HealthCheckResponse.ServingStatus.ValueType # 0 SERVING: HealthCheckResponse.ServingStatus.ValueType # 1 NOT_SERVING: HealthCheckResponse.ServingStatus.ValueType # 2 SERVICE_UNKNOWN: HealthCheckResponse.ServingStatus.ValueType # 3 """Used only by the Watch method.""" STATUS_FIELD_NUMBER: builtins.int status: global___HealthCheckResponse.ServingStatus.ValueType def __init__( self, *, status: global___HealthCheckResponse.ServingStatus.ValueType = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["status", b"status"]) -> None: ... global___HealthCheckResponse = HealthCheckResponse ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/metadata.py0000644000175100001770000001201200000000000020132 0ustar00runnerdocker00000000000000import re import time import platform from base64 import b64encode, b64decode from typing import Union, Mapping, Tuple, NewType, Optional, cast, Collection from urllib.parse import quote, unquote from multidict import MultiDict from . 
import __version__ USER_AGENT = ( 'grpc-python-grpclib/{lib_ver} ({sys}; {py}/{py_ver})' .format( lib_ver=__version__, sys=platform.system(), py=platform.python_implementation(), py_ver=platform.python_version(), ) .lower() ) _UNITS = { 'H': 60 * 60, 'M': 60, 'S': 1, 'm': 10 ** -3, 'u': 10 ** -6, 'n': 10 ** -9, } _TIMEOUT_RE = re.compile(r'^(\d+)([{}])$'.format(''.join(_UNITS))) _STATUS_DETAILS_KEY = 'grpc-status-details-bin' _Headers = Collection[Tuple[str, str]] def decode_timeout(value: str) -> float: match = _TIMEOUT_RE.match(value) if match is None: raise ValueError('Invalid timeout: {}'.format(value)) timeout, unit = match.groups() return int(timeout) * _UNITS[unit] def encode_timeout(timeout: float) -> str: if timeout > 10: return '{}S'.format(int(timeout)) elif timeout > 0.01: return '{}m'.format(int(timeout * 10 ** 3)) elif timeout > 0.00001: return '{}u'.format(int(timeout * 10 ** 6)) else: return '{}n'.format(int(timeout * 10 ** 9)) class Deadline: """Represents request's deadline - fixed point in time """ def __init__(self, *, _timestamp: float) -> None: self._timestamp = _timestamp def __lt__(self, other: object) -> bool: if not isinstance(other, Deadline): raise TypeError('comparison is not supported between ' 'instances of \'{}\' and \'{}\'' .format(type(self).__name__, type(other).__name__)) return self._timestamp < other._timestamp def __eq__(self, other: object) -> bool: if not isinstance(other, Deadline): return False return self._timestamp == other._timestamp @classmethod def from_headers(cls, headers: _Headers) -> Optional['Deadline']: timeout = min(map(decode_timeout, (v for k, v in headers if k == 'grpc-timeout')), default=None) if timeout is not None: return cls.from_timeout(timeout) else: return None @classmethod def from_timeout(cls, timeout: float) -> 'Deadline': return cls(_timestamp=time.monotonic() + timeout) def time_remaining(self) -> float: """Calculates remaining time for the current request completion This function returns time in seconds as a floating point number, greater or equal to zero. 
""" return max(0, self._timestamp - time.monotonic()) _UNQUOTED = ''.join([chr(i) for i in range(0x20, 0x24 + 1)] + [chr(i) for i in range(0x26, 0x7E + 1)]) def encode_grpc_message(message: str) -> str: return quote(message, safe=_UNQUOTED, encoding='utf-8') def decode_grpc_message(value: str) -> str: return unquote(value, encoding='utf-8', errors='replace') _KEY_RE = re.compile(r'^[0-9a-z_.\-]+$') _VALUE_RE = re.compile(r'^[ !-~]+$') # 0x20-0x7E - space and printable ASCII _SPECIAL = { 'te', 'content-type', 'user-agent', } _Value = Union[str, bytes] _Metadata = NewType('_Metadata', 'MultiDict[_Value]') _MetadataLike = Union[Mapping[str, _Value], Collection[Tuple[str, _Value]]] def decode_bin_value(value: bytes) -> bytes: return b64decode(value + (b'=' * (len(value) % 4))) def decode_metadata(headers: _Headers) -> _Metadata: metadata = cast(_Metadata, MultiDict()) for key, value in headers: if key.startswith((':', 'grpc-')) or key in _SPECIAL: continue elif key.endswith('-bin'): metadata.add(key, decode_bin_value(value.encode('ascii'))) else: metadata.add(key, value) return metadata def encode_bin_value(value: bytes) -> bytes: return b64encode(value).rstrip(b'=') def encode_metadata(metadata: _MetadataLike) -> _Headers: if isinstance(metadata, Mapping): metadata = metadata.items() result = [] for key, value in metadata: if ( key in _SPECIAL or key.startswith('grpc-') or not _KEY_RE.fullmatch(key) ): raise ValueError('Invalid metadata key: {!r}'.format(key)) if key.endswith('-bin'): if not isinstance(value, bytes): raise TypeError('Invalid metadata value type, bytes expected: ' '{!r}'.format(value)) result.append((key, encode_bin_value(value).decode('ascii'))) else: if not isinstance(value, str): raise TypeError('Invalid metadata value type, str expected: ' '{!r}'.format(value)) if not _VALUE_RE.fullmatch(value): raise ValueError('Invalid metadata value: {!r}'.format(value)) result.append((key, value)) return result ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1721851946.6562335 grpclib-0.4.8rc2/grpclib/plugin/0000755000175100001770000000000000000000000017302 5ustar00runnerdocker00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/plugin/__init__.py0000644000175100001770000000000000000000000021401 0ustar00runnerdocker00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/plugin/main.py0000644000175100001770000002165700000000000020613 0ustar00runnerdocker00000000000000import os import sys from typing import List, Any, Collection, Iterator, NamedTuple, cast from typing import Dict, Tuple, Optional, Deque from contextlib import contextmanager from collections import deque from google.protobuf.descriptor_pb2 import FileDescriptorProto, DescriptorProto from google.protobuf.compiler.plugin_pb2 import CodeGeneratorRequest from google.protobuf.compiler.plugin_pb2 import CodeGeneratorResponse from .. import const from .. import client from .. 
import server _CARDINALITY = { (False, False): const.Cardinality.UNARY_UNARY, (True, False): const.Cardinality.STREAM_UNARY, (False, True): const.Cardinality.UNARY_STREAM, (True, True): const.Cardinality.STREAM_STREAM, } class Method(NamedTuple): name: str cardinality: const.Cardinality request_type: str reply_type: str class Service(NamedTuple): name: str methods: List[Method] class Buffer: def __init__(self) -> None: self._lines: List[str] = [] self._indent = 0 def add(self, string: str, *args: Any, **kwargs: Any) -> None: line = ' ' * self._indent * 4 + string.format(*args, **kwargs) self._lines.append(line.rstrip(' ')) @contextmanager def indent(self) -> Iterator[None]: self._indent += 1 try: yield finally: self._indent -= 1 def content(self) -> str: return '\n'.join(self._lines) + '\n' def render( proto_file: str, package: str, imports: Collection[str], services: Collection[Service], ) -> str: buf = Buffer() buf.add('# Generated by the Protocol Buffers compiler. DO NOT EDIT!') buf.add('# source: {}', proto_file) buf.add('# plugin: {}', __name__) if not services: return buf.content() buf.add('import abc') buf.add('import typing') buf.add('') buf.add('import {}', const.__name__) buf.add('import {}', client.__name__) buf.add('if typing.TYPE_CHECKING:') with buf.indent(): buf.add('import {}', server.__name__) buf.add('') for mod in imports: buf.add('import {}', mod) for service in services: if package: service_name = '{}.{}'.format(package, service.name) else: service_name = service.name buf.add('') buf.add('') buf.add('class {}Base(abc.ABC):', service.name) with buf.indent(): for (name, _, request_type, reply_type) in service.methods: buf.add('') buf.add('@abc.abstractmethod') buf.add("async def {}(self, stream: '{}.{}[{}, {}]') -> None:", name, server.__name__, server.Stream.__name__, request_type, reply_type) with buf.indent(): buf.add('pass') buf.add('') buf.add('def __mapping__(self) -> typing.Dict[str, {}.{}]:', const.__name__, const.Handler.__name__) with buf.indent(): buf.add('return {{') with buf.indent(): for method in service.methods: name, cardinality, request_type, reply_type = method full_name = '/{}/{}'.format(service_name, name) buf.add("'{}': {}.{}(", full_name, const.__name__, const.Handler.__name__) with buf.indent(): buf.add('self.{},', name) buf.add('{}.{}.{},', const.__name__, const.Cardinality.__name__, cardinality.name) buf.add('{},', request_type) buf.add('{},', reply_type) buf.add('),') buf.add('}}') buf.add('') buf.add('') buf.add('class {}Stub:', service.name) with buf.indent(): buf.add('') buf.add('def __init__(self, channel: {}.{}) -> None:' .format(client.__name__, client.Channel.__name__)) with buf.indent(): if len(service.methods) == 0: buf.add('pass') for method in service.methods: name, cardinality, request_type, reply_type = method full_name = '/{}/{}'.format(service_name, name) method_cls: type if cardinality is const.Cardinality.UNARY_UNARY: method_cls = client.UnaryUnaryMethod elif cardinality is const.Cardinality.UNARY_STREAM: method_cls = client.UnaryStreamMethod elif cardinality is const.Cardinality.STREAM_UNARY: method_cls = client.StreamUnaryMethod elif cardinality is const.Cardinality.STREAM_STREAM: method_cls = client.StreamStreamMethod else: raise TypeError(cardinality) method_cls = cast(type, method_cls) # FIXME: redundant buf.add('self.{} = {}.{}('.format(name, client.__name__, method_cls.__name__)) with buf.indent(): buf.add('channel,') buf.add('{!r},'.format(full_name)) buf.add('{},', request_type) buf.add('{},', reply_type) buf.add(')') 
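# At this point every service base class and client stub has been rendered;
# the assembled module source is returned below via buf.content().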
return buf.content() def _get_proto(request: CodeGeneratorRequest, name: str) -> FileDescriptorProto: return next(f for f in request.proto_file if f.name == name) def _strip_proto(proto_file_path: str) -> str: for suffix in [".protodevel", ".proto"]: if proto_file_path.endswith(suffix): return proto_file_path[: -len(suffix)] return proto_file_path def _base_module_name(proto_file_path: str) -> str: basename = _strip_proto(proto_file_path) return basename.replace("-", "_").replace("/", ".") def _proto2pb2_module_name(proto_file_path: str) -> str: return _base_module_name(proto_file_path) + "_pb2" def _proto2grpc_module_name(proto_file_path: str) -> str: return _base_module_name(proto_file_path) + "_grpc" def _type_names( proto_file: FileDescriptorProto, message_type: DescriptorProto, parents: Optional[Deque[str]] = None, ) -> Iterator[Tuple[str, str]]: if parents is None: parents = deque() proto_name_parts = [''] if proto_file.package: proto_name_parts.append(proto_file.package) proto_name_parts.extend(parents) proto_name_parts.append(message_type.name) py_name_parts = [_proto2pb2_module_name(proto_file.name)] py_name_parts.extend(parents) py_name_parts.append(message_type.name) yield '.'.join(proto_name_parts), '.'.join(py_name_parts) parents.append(message_type.name) for nested in message_type.nested_type: yield from _type_names(proto_file, nested, parents=parents) parents.pop() def main() -> None: with os.fdopen(sys.stdin.fileno(), 'rb') as inp: request = CodeGeneratorRequest.FromString(inp.read()) types_map: Dict[str, str] = {} for pf in request.proto_file: for mt in pf.message_type: types_map.update(_type_names(pf, mt)) response = CodeGeneratorResponse() # See https://github.com/protocolbuffers/protobuf/blob/v3.12.0/docs/implementing_proto3_presence.md # noqa if hasattr(CodeGeneratorResponse, 'Feature'): response.supported_features = ( CodeGeneratorResponse.FEATURE_PROTO3_OPTIONAL ) for file_to_generate in request.file_to_generate: proto_file = _get_proto(request, file_to_generate) imports = [_proto2pb2_module_name(dep) for dep in list(proto_file.dependency) + [file_to_generate]] services = [] for service in proto_file.service: methods = [] for method in service.method: cardinality = _CARDINALITY[(method.client_streaming, method.server_streaming)] methods.append(Method( name=method.name, cardinality=cardinality, request_type=types_map[method.input_type], reply_type=types_map[method.output_type], )) services.append(Service(name=service.name, methods=methods)) file = response.file.add() module_name = _proto2grpc_module_name(file_to_generate) file.name = module_name.replace(".", "/") + ".py" file.content = render( proto_file=proto_file.name, package=proto_file.package, imports=imports, services=services, ) with os.fdopen(sys.stdout.fileno(), 'wb') as out: out.write(response.SerializeToString()) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/protocol.py0000644000175100001770000006117200000000000020226 0ustar00runnerdocker00000000000000import asyncio import struct import time import socket import logging from io import BytesIO from abc import ABC, abstractmethod from typing import Optional, List, Tuple, Dict, NamedTuple, Callable, Any from typing import cast, TYPE_CHECKING from asyncio import Transport, Protocol, Event, BaseTransport, TimerHandle from asyncio import Queue from functools import partial from collections import deque from h2.errors import ErrorCodes from h2.config import H2Configuration from 
h2.events import Event as H2Event from h2.events import RequestReceived, DataReceived, StreamEnded, WindowUpdated from h2.events import ConnectionTerminated, RemoteSettingsChanged, StreamReset from h2.events import SettingsAcknowledged, ResponseReceived, TrailersReceived from h2.events import PriorityUpdated, PingReceived, PingAckReceived from h2.settings import SettingCodes from h2.connection import H2Connection, ConnectionState from h2.exceptions import ProtocolError, TooManyStreamsError, StreamClosedError from .utils import Wrapper from .config import Configuration from .exceptions import StreamTerminatedError if TYPE_CHECKING: from typing import Deque log = logging.getLogger(__name__) if hasattr(socket, 'TCP_NODELAY'): _sock_type_mask = 0xf if hasattr(socket, 'SOCK_NONBLOCK') else 0xffffffff def _set_nodelay(sock: socket.socket) -> None: if ( sock.family in {socket.AF_INET, socket.AF_INET6} and sock.type & _sock_type_mask == socket.SOCK_STREAM and sock.proto == socket.IPPROTO_TCP ): sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1) else: def _set_nodelay(sock: socket.socket) -> None: pass class UnackedData(NamedTuple): data: bytes data_size: int ack_size: int class AckedData(NamedTuple): data: memoryview data_size: int class Buffer: def __init__(self, ack_callback: Callable[[int], None]) -> None: self._ack_callback = ack_callback self._eof = False self._unacked: 'Queue[UnackedData]' = Queue() self._acked: 'Deque[AckedData]' = deque() self._acked_size = 0 def add(self, data: bytes, ack_size: int) -> None: self._unacked.put_nowait(UnackedData(data, len(data), ack_size)) def eof(self) -> None: self._unacked.put_nowait(UnackedData(b'', 0, 0)) self._eof = True async def read(self, size: int) -> bytes: assert size >= 0, 'Size can not be negative' if size == 0: return b'' if not self._eof or not self._unacked.empty(): while self._acked_size < size: data, data_size, ack_size = await self._unacked.get() if not ack_size: break self._acked.append(AckedData(memoryview(data), data_size)) self._acked_size += data_size self._ack_callback(ack_size) if self._eof and self._acked_size == 0: return b'' if self._acked_size < size: raise AssertionError('Received less data than expected') chunks = [] chunks_size = 0 while chunks_size < size: next_chunk, next_chunk_size = self._acked[0] if chunks_size + next_chunk_size <= size: chunks.append(next_chunk) chunks_size += next_chunk_size self._acked.popleft() else: offset = size - chunks_size chunks.append(next_chunk[:offset]) chunks_size += offset self._acked[0] = AckedData( data=next_chunk[offset:], data_size=next_chunk_size - offset, ) self._acked_size -= size assert chunks_size == size return b''.join(chunks) def unacked_size(self) -> int: return sum(self._unacked.get_nowait().ack_size for _ in range(self._unacked.qsize())) class Peer: """ Represents an information about a connection's peer """ def __init__(self, transport: Transport) -> None: self._transport = transport def addr(self) -> Optional[Tuple[str, int]]: """Returns the remote address to which we are connected""" return self._transport.get_extra_info('peername') # type: ignore def cert(self) -> Optional[Dict[str, Any]]: """Returns the peer certificate Result of the :py:meth:`python:ssl.SSLSocket.getpeercert` """ ssl_object = self._transport.get_extra_info('ssl_object') if ssl_object is not None: return ssl_object.getpeercert() # type: ignore else: return None class Connection: """ Holds connection state (write_ready), and manages H2Connection <-> Transport communication """ # stats 
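# Aggregate per-connection counters and timestamps: updated on the send and
# receive paths below, and consulted by the keepalive logic (_ping /
# _is_need_send_ping) to decide whether another ping may be sent.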
streams_started = 0 streams_succeeded = 0 streams_failed = 0 data_sent = 0 data_received = 0 messages_sent = 0 messages_received = 0 last_stream_created: Optional[float] = None last_data_sent: Optional[float] = None last_data_received: Optional[float] = None last_message_sent: Optional[float] = None last_message_received: Optional[float] = None last_ping_sent: Optional[float] = None ping_count_in_sequence: int = 0 _ping_handle: Optional[TimerHandle] = None _close_by_ping_handler: Optional[TimerHandle] = None def __init__( self, connection: H2Connection, transport: Transport, *, config: Configuration, ) -> None: self._connection = connection self._transport = transport self._config = config self.write_ready = Event() self.write_ready.set() self.stream_close_waiter = Event() def feed(self, data: bytes) -> List[H2Event]: return self._connection.receive_data(data) # type: ignore def ack(self, stream_id: int, size: int) -> None: if size: self._connection.acknowledge_received_data(size, stream_id) self.flush() def pause_writing(self) -> None: self.write_ready.clear() def resume_writing(self) -> None: self.write_ready.set() def create_stream( self, *, stream_id: Optional[int] = None, wrapper: Optional[Wrapper] = None, ) -> 'Stream': return Stream(self, self._connection, self._transport, stream_id=stream_id, wrapper=wrapper) def flush(self) -> None: data = self._connection.data_to_send() if data: self._transport.write(data) def initialize(self) -> None: if self._config._keepalive_time is not None: self._ping_handle = asyncio.get_event_loop().call_later( self._config._keepalive_time, self._ping ) def get_peer(self) -> Peer: return Peer(self._transport) def is_closing(self) -> bool: if hasattr(self, '_transport'): return self._transport.is_closing() else: return True def close(self) -> None: if hasattr(self, '_transport'): self._transport.close() # remove cyclic references to improve memory usage del self._transport if hasattr(self._connection, '_frame_dispatch_table'): del self._connection._frame_dispatch_table if self._ping_handle is not None: self._ping_handle.cancel() if self._close_by_ping_handler is not None: self._close_by_ping_handler.cancel() def _is_need_send_ping(self) -> bool: assert self._config._keepalive_time is not None if not self._config._keepalive_permit_without_calls: if not any(s.open for s in self._connection.streams.values()): return False if self._config._http2_max_pings_without_data != 0 and \ self.ping_count_in_sequence >= \ self._config._http2_max_pings_without_data: return False if self.last_ping_sent is not None and \ time.monotonic() - self.last_ping_sent < \ self._config._http2_min_sent_ping_interval_without_data: return False return True def _ping(self) -> None: assert self._config._keepalive_time is not None if self._is_need_send_ping(): log.debug('send ping') data = struct.pack('!Q', int(time.monotonic() * 10 ** 6)) self._connection.ping(data) self.flush() self.last_ping_sent = time.monotonic() self.ping_count_in_sequence += 1 if self._close_by_ping_handler is None: self._close_by_ping_handler = asyncio.get_event_loop().\ call_later( self._config._keepalive_timeout, self.close ) self._ping_handle = asyncio.get_event_loop().call_later( self._config._keepalive_time, self._ping ) def headers_send_process(self) -> None: self.ping_count_in_sequence = 0 def data_send_process(self) -> None: self.ping_count_in_sequence = 0 self.last_data_sent = time.monotonic() def ping_ack_process(self) -> None: if self._close_by_ping_handler is not None: 
self._close_by_ping_handler.cancel() self._close_by_ping_handler = None _Headers = List[Tuple[str, str]] class Stream: """ API for working with streams, used by clients and request handlers """ id: Optional[int] = None # stats created: Optional[float] = None data_sent = 0 data_received = 0 def __init__( self, connection: Connection, h2_connection: H2Connection, transport: Transport, *, stream_id: Optional[int] = None, wrapper: Optional[Wrapper] = None ) -> None: self.connection = connection self._h2_connection = h2_connection self._transport = transport self.wrapper = wrapper if stream_id is not None: self.init_stream(stream_id, self.connection) self.window_updated = Event() self.headers: Optional['_Headers'] = None self.headers_received = Event() self.trailers: Optional['_Headers'] = None self.trailers_received = Event() def init_stream(self, stream_id: int, connection: Connection) -> None: self.id = stream_id self.buffer = Buffer(partial(connection.ack, self.id)) self.connection.streams_started += 1 self.created = self.connection.last_stream_created = time.monotonic() async def recv_headers(self) -> _Headers: if self.headers is None: await self.headers_received.wait() assert self.headers is not None return self.headers async def recv_data(self, size: int) -> bytes: return await self.buffer.read(size) async def recv_trailers(self) -> _Headers: if self.trailers is None: await self.trailers_received.wait() assert self.trailers is not None return self.trailers async def send_request( self, headers: _Headers, end_stream: bool = False, *, _processor: 'EventsProcessor', ) -> Callable[[], None]: assert self.id is None, self.id while True: # this is the first thing we should check before even trying to # create new stream, because this wait() can be cancelled by timeout # and we wouldn't need to create new stream at all await self.connection.write_ready.wait() # `get_next_available_stream_id()` should be as close to # `connection.send_headers()` as possible, without any async # interruptions in between, see the docs on the # `get_next_available_stream_id()` method stream_id = self._h2_connection.get_next_available_stream_id() try: self._h2_connection.send_headers(stream_id, headers, end_stream=end_stream) except TooManyStreamsError: # we're going to wait until any of currently opened streams will # be closed, and we will be able to open a new one # TODO: maybe implement FIFO for waiters, but this limit # shouldn't be reached in a normal case, so why bother # TODO: maybe we should raise an exception here instead of # waiting, if timeout wasn't set for the current request self.connection.stream_close_waiter.clear() await self.connection.stream_close_waiter.wait() # while we were trying to create a new stream, write buffer # can became full, so we need to repeat checks from checking # if we can write() data continue else: self.init_stream(stream_id, self.connection) release_stream = _processor.register(self) self._transport.write(self._h2_connection.data_to_send()) self.connection.headers_send_process() return release_stream async def send_headers( self, headers: _Headers, end_stream: bool = False, ) -> None: assert self.id is not None await self.connection.write_ready.wait() # Workaround for the H2Connection.send_headers method, which will try # to create a new stream if it was removed earlier from the # H2Connection.streams, and therefore will raise StreamIDTooLowError if self.id not in self._h2_connection.streams: raise StreamClosedError(self.id) self._h2_connection.send_headers(self.id, headers, 
end_stream=end_stream) self._transport.write(self._h2_connection.data_to_send()) self.connection.headers_send_process() async def send_data(self, data: bytes, end_stream: bool = False) -> None: f = BytesIO(data) f_pos, f_last = 0, len(data) while True: await self.connection.write_ready.wait() window = self._h2_connection.local_flow_control_window(self.id) # window can become negative if not window > 0: self.window_updated.clear() await self.window_updated.wait() # during "await" above other streams were able to send data and # decrease current window size, so try from the beginning continue max_frame_size = self._h2_connection.max_outbound_frame_size f_chunk = f.read(min(window, max_frame_size, f_last - f_pos)) f_chunk_len = len(f_chunk) f_pos = f.tell() if f_pos == f_last: self._h2_connection.send_data(self.id, f_chunk, end_stream=end_stream) self._transport.write(self._h2_connection.data_to_send()) self.data_sent += f_chunk_len self.connection.data_sent += f_chunk_len self.connection.data_send_process() break else: self._h2_connection.send_data(self.id, f_chunk) self._transport.write(self._h2_connection.data_to_send()) self.data_sent += f_chunk_len self.connection.data_sent += f_chunk_len self.connection.data_send_process() async def end(self) -> None: await self.connection.write_ready.wait() self._h2_connection.end_stream(self.id) self._transport.write(self._h2_connection.data_to_send()) async def reset(self, error_code: ErrorCodes = ErrorCodes.NO_ERROR) -> None: await self.connection.write_ready.wait() self._h2_connection.reset_stream(self.id, error_code=error_code) self._transport.write(self._h2_connection.data_to_send()) def reset_nowait( self, error_code: ErrorCodes = ErrorCodes.NO_ERROR, ) -> None: self._h2_connection.reset_stream(self.id, error_code=error_code) if self.connection.write_ready.is_set(): self._transport.write(self._h2_connection.data_to_send()) def __ended__(self) -> None: self.buffer.eof() def __terminated__(self, reason: str) -> None: if self.wrapper is not None: self.wrapper.cancel(StreamTerminatedError(reason)) @property def closable(self) -> bool: if self._transport.is_closing(): return False if self._h2_connection.state_machine.state is ConnectionState.CLOSED: return False stream = self._h2_connection.streams.get(self.id) if stream is None: return False return not stream.closed class AbstractHandler(ABC): @abstractmethod def accept( self, stream: Stream, headers: _Headers, release_stream: Callable[[], None], ) -> None: pass @abstractmethod def cancel(self, stream: Stream) -> None: pass @abstractmethod def close(self) -> None: pass _Streams = Dict[int, Stream] class EventsProcessor: """ H2 events processor, synchronous, not doing any IO, as hyper-h2 itself """ def __init__( self, handler: AbstractHandler, connection: Connection, ) -> None: self.handler = handler self.connection = connection self.processors = { RequestReceived: self.process_request_received, ResponseReceived: self.process_response_received, RemoteSettingsChanged: self.process_remote_settings_changed, SettingsAcknowledged: self.process_settings_acknowledged, DataReceived: self.process_data_received, WindowUpdated: self.process_window_updated, TrailersReceived: self.process_trailers_received, StreamEnded: self.process_stream_ended, StreamReset: self.process_stream_reset, PriorityUpdated: self.process_priority_updated, ConnectionTerminated: self.process_connection_terminated, PingReceived: self.process_ping_received, PingAckReceived: self.process_ping_ack_received, } self.streams: _Streams = {} def 
register(self, stream: Stream) -> Callable[[], None]: assert stream.id is not None self.streams[stream.id] = stream def release_stream(*, _streams: _Streams = self.streams) -> None: assert stream.id is not None _stream = _streams.pop(stream.id) self.connection.stream_close_waiter.set() if not self.connection.is_closing(): self.connection.ack(stream.id, _stream.buffer.unacked_size()) return release_stream def close(self, reason: str = 'Connection closed') -> None: self.connection.close() self.handler.close() for stream in self.streams.values(): stream.__terminated__(reason) # remove cyclic references to improve memory usage if hasattr(self, 'processors'): del self.processors def process(self, event: H2Event) -> None: try: proc = self.processors[event.__class__] except KeyError: raise NotImplementedError(event) except AttributeError: pass # connection was closed and self.processors was deleted else: proc(event) def process_request_received(self, event: RequestReceived) -> None: stream = self.connection.create_stream(stream_id=event.stream_id) release_stream = self.register(stream) self.handler.accept(stream, event.headers, release_stream) # TODO: check EOF def process_response_received(self, event: ResponseReceived) -> None: stream = self.streams.get(event.stream_id) if stream is not None: stream.headers = event.headers stream.headers_received.set() def process_remote_settings_changed( self, event: RemoteSettingsChanged, ) -> None: if SettingCodes.INITIAL_WINDOW_SIZE in event.changed_settings: for stream in self.streams.values(): stream.window_updated.set() def process_settings_acknowledged( self, event: SettingsAcknowledged, ) -> None: pass def process_data_received(self, event: DataReceived) -> None: size = len(event.data) stream = self.streams.get(event.stream_id) if stream is not None: stream.buffer.add( event.data, event.flow_controlled_length, ) stream.data_received += size else: self.connection.ack( event.stream_id, event.flow_controlled_length, ) self.connection.data_received += size self.connection.last_data_received = time.monotonic() def process_window_updated(self, event: WindowUpdated) -> None: if event.stream_id == 0: for value in self.streams.values(): value.window_updated.set() else: stream = self.streams.get(event.stream_id) if stream is not None: stream.window_updated.set() def process_trailers_received(self, event: TrailersReceived) -> None: stream = self.streams.get(event.stream_id) if stream is not None: stream.trailers = event.headers stream.trailers_received.set() def process_stream_ended(self, event: StreamEnded) -> None: stream = self.streams.get(event.stream_id) if stream is not None: stream.__ended__() self.connection.streams_succeeded += 1 def process_stream_reset(self, event: StreamReset) -> None: stream = self.streams.get(event.stream_id) if stream is not None: if event.remote_reset: msg = ('Stream reset by remote party, error_code: {}' .format(event.error_code)) else: msg = 'Protocol error' stream.__terminated__(msg) self.handler.cancel(stream) self.connection.streams_failed += 1 def process_priority_updated(self, event: PriorityUpdated) -> None: pass def process_connection_terminated( self, event: ConnectionTerminated, ) -> None: self.close(reason=( 'Received GOAWAY frame, closing connection; error_code: {}' .format(event.error_code) )) def process_ping_received(self, event: PingReceived) -> None: pass def process_ping_ack_received(self, event: PingAckReceived) -> None: self.connection.ping_ack_process() class H2Protocol(Protocol): connection: Connection 
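# Note: ``connection`` and ``processor`` are assigned in ``connection_made``
# below, so they aren't available until the transport is established.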
processor: EventsProcessor def __init__( self, handler: AbstractHandler, config: Configuration, h2_config: H2Configuration, ) -> None: self.handler = handler self.config = config self.h2_config = h2_config def connection_made(self, transport: BaseTransport) -> None: sock = transport.get_extra_info('socket') if sock is not None: _set_nodelay(sock) h2_conn = H2Connection(config=self.h2_config) h2_conn.initiate_connection() initial = h2_conn.local_settings.initial_window_size conn_delta = self.config.http2_connection_window_size - initial stream_delta = self.config.http2_stream_window_size - initial if conn_delta: h2_conn.increment_flow_control_window(conn_delta) if stream_delta: h2_conn.update_settings({ SettingCodes.INITIAL_WINDOW_SIZE: self.config.http2_stream_window_size, }) self.connection = Connection( h2_conn, cast(Transport, transport), config=self.config, ) self.connection.flush() self.connection.initialize() self.processor = EventsProcessor(self.handler, self.connection) def data_received(self, data: bytes) -> None: try: events = self.connection.feed(data) except ProtocolError: log.debug('Protocol error', exc_info=True) self.processor.close('Protocol error') else: self.connection.flush() for event in events: self.processor.process(event) self.connection.flush() def pause_writing(self) -> None: self.connection.pause_writing() def resume_writing(self) -> None: self.connection.resume_writing() def connection_lost(self, exc: Optional[BaseException]) -> None: self.processor.close(reason='Connection lost') ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/py.typed0000644000175100001770000000000000000000000017471 0ustar00runnerdocker00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1721851946.6562335 grpclib-0.4.8rc2/grpclib/reflection/0000755000175100001770000000000000000000000020136 5ustar00runnerdocker00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/reflection/__init__.py0000644000175100001770000000000000000000000022235 0ustar00runnerdocker00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/reflection/_deprecated.py0000644000175100001770000001350000000000000022746 0ustar00runnerdocker00000000000000# This code is heavily based on grpcio-reflection reference implementation: # # Copyright 2016 gRPC authors. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
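# Note: this module implements the v1alpha flavour of the reflection
# protocol and is kept for compatibility with older clients;
# ``ServerReflection.extend`` in ``grpclib.reflection.service`` registers
# both the v1 and the v1alpha services, so this class usually doesn't need
# to be instantiated directly.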
# from typing import Collection from google.protobuf.descriptor import FileDescriptor from google.protobuf.descriptor_pb2 import FileDescriptorProto from google.protobuf.descriptor_pool import Default from ..const import Status from ..server import Stream from .v1alpha.reflection_pb2 import ServerReflectionRequest from .v1alpha.reflection_pb2 import ServerReflectionResponse from .v1alpha.reflection_pb2 import ErrorResponse, ListServiceResponse from .v1alpha.reflection_pb2 import ServiceResponse, ExtensionNumberResponse from .v1alpha.reflection_pb2 import FileDescriptorResponse from .v1alpha.reflection_grpc import ServerReflectionBase class ServerReflection(ServerReflectionBase): """ Implements server reflection protocol. """ def __init__(self, *, _service_names: Collection[str]): self._service_names = _service_names # FIXME: DescriptorPool has incomplete typings self._pool = Default() # type: ignore def _not_found_response(self) -> ServerReflectionResponse: return ServerReflectionResponse( error_response=ErrorResponse( error_code=Status.NOT_FOUND.value, error_message='not found', ), ) def _file_descriptor_response( self, file_descriptor: FileDescriptor, ) -> ServerReflectionResponse: proto = FileDescriptorProto() file_descriptor.CopyToProto(proto) # type: ignore return ServerReflectionResponse( file_descriptor_response=FileDescriptorResponse( file_descriptor_proto=[proto.SerializeToString()], ), ) def _file_by_filename_response( self, file_name: str, ) -> ServerReflectionResponse: try: file = self._pool.FindFileByName(file_name) except KeyError: return self._not_found_response() else: return self._file_descriptor_response(file) def _file_containing_symbol_response( self, symbol: str, ) -> ServerReflectionResponse: try: file = self._pool.FindFileContainingSymbol(symbol) except KeyError: return self._not_found_response() else: return self._file_descriptor_response(file) def _file_containing_extension_response( self, msg_name: str, ext_number: int, ) -> ServerReflectionResponse: try: message = self._pool.FindMessageTypeByName(msg_name) extension = self._pool.FindExtensionByNumber(message, ext_number) file = self._pool.FindFileContainingSymbol(extension.full_name) except KeyError: return self._not_found_response() else: return self._file_descriptor_response(file) def _all_extension_numbers_of_type_response( self, type_name: str, ) -> ServerReflectionResponse: try: message = self._pool.FindMessageTypeByName(type_name) extensions = self._pool.FindAllExtensions(message) except KeyError: return self._not_found_response() else: return ServerReflectionResponse( all_extension_numbers_response=ExtensionNumberResponse( base_type_name=message.full_name, extension_number=[ext.number for ext in extensions], ) ) def _list_services_response(self) -> ServerReflectionResponse: return ServerReflectionResponse( list_services_response=ListServiceResponse( service=[ServiceResponse(name=service_name) for service_name in self._service_names], ) ) async def ServerReflectionInfo( self, stream: Stream[ServerReflectionRequest, ServerReflectionResponse], ) -> None: async for request in stream: if request.HasField('file_by_filename'): response = self._file_by_filename_response( request.file_by_filename, ) elif request.HasField('file_containing_symbol'): response = self._file_containing_symbol_response( request.file_containing_symbol, ) elif request.HasField('file_containing_extension'): response = self._file_containing_extension_response( request.file_containing_extension.containing_type, 
request.file_containing_extension.extension_number, ) elif request.HasField('all_extension_numbers_of_type'): response = self._all_extension_numbers_of_type_response( request.all_extension_numbers_of_type, ) elif request.HasField('list_services'): response = self._list_services_response() else: response = ServerReflectionResponse( error_response=ErrorResponse( error_code=Status.INVALID_ARGUMENT.value, error_message='invalid argument', ) ) await stream.send_message(response) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/reflection/service.py0000644000175100001770000001534600000000000022161 0ustar00runnerdocker00000000000000# This code is heavily based on grpcio-reflection reference implementation: # # Copyright 2016 gRPC authors. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # from typing import TYPE_CHECKING, Collection, List from google.protobuf.descriptor import FileDescriptor from google.protobuf.descriptor_pb2 import FileDescriptorProto from google.protobuf.descriptor_pool import Default from ..const import Status from ..utils import _service_name from ..server import Stream from .v1.reflection_pb2 import ServerReflectionRequest, ServerReflectionResponse from .v1.reflection_pb2 import ErrorResponse, ListServiceResponse from .v1.reflection_pb2 import ServiceResponse, ExtensionNumberResponse from .v1.reflection_pb2 import FileDescriptorResponse from .v1.reflection_grpc import ServerReflectionBase from ._deprecated import ServerReflection as _ServerReflectionV1Alpha if TYPE_CHECKING: from .._typing import IServable # noqa class ServerReflection(ServerReflectionBase): """ Implements server reflection protocol. 
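Example of enabling reflection on a server (a minimal sketch, where
``Greeter`` is a placeholder for your own service implementation):

.. code-block:: python3

    from grpclib.server import Server
    from grpclib.reflection.service import ServerReflection

    services = ServerReflection.extend([Greeter()])
    server = Server(services)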
""" def __init__(self, *, _service_names: Collection[str]): self._service_names = _service_names # FIXME: DescriptorPool has incomplete typings self._pool = Default() # type: ignore def _not_found_response(self) -> ServerReflectionResponse: return ServerReflectionResponse( error_response=ErrorResponse( error_code=Status.NOT_FOUND.value, error_message='not found', ), ) def _file_descriptor_response( self, file_descriptor: FileDescriptor, ) -> ServerReflectionResponse: proto = FileDescriptorProto() file_descriptor.CopyToProto(proto) # type: ignore return ServerReflectionResponse( file_descriptor_response=FileDescriptorResponse( file_descriptor_proto=[proto.SerializeToString()], ), ) def _file_by_filename_response( self, file_name: str, ) -> ServerReflectionResponse: try: file = self._pool.FindFileByName(file_name) except KeyError: return self._not_found_response() else: return self._file_descriptor_response(file) def _file_containing_symbol_response( self, symbol: str, ) -> ServerReflectionResponse: try: file = self._pool.FindFileContainingSymbol(symbol) except KeyError: return self._not_found_response() else: return self._file_descriptor_response(file) def _file_containing_extension_response( self, msg_name: str, ext_number: int, ) -> ServerReflectionResponse: try: message = self._pool.FindMessageTypeByName(msg_name) extension = self._pool.FindExtensionByNumber(message, ext_number) file = self._pool.FindFileContainingSymbol(extension.full_name) except KeyError: return self._not_found_response() else: return self._file_descriptor_response(file) def _all_extension_numbers_of_type_response( self, type_name: str, ) -> ServerReflectionResponse: try: message = self._pool.FindMessageTypeByName(type_name) extensions = self._pool.FindAllExtensions(message) except KeyError: return self._not_found_response() else: return ServerReflectionResponse( all_extension_numbers_response=ExtensionNumberResponse( base_type_name=message.full_name, extension_number=[ext.number for ext in extensions], ) ) def _list_services_response(self) -> ServerReflectionResponse: return ServerReflectionResponse( list_services_response=ListServiceResponse( service=[ServiceResponse(name=service_name) for service_name in self._service_names], ) ) async def ServerReflectionInfo( self, stream: Stream[ServerReflectionRequest, ServerReflectionResponse], ) -> None: async for request in stream: if request.HasField('file_by_filename'): response = self._file_by_filename_response( request.file_by_filename, ) elif request.HasField('file_containing_symbol'): response = self._file_containing_symbol_response( request.file_containing_symbol, ) elif request.HasField('file_containing_extension'): response = self._file_containing_extension_response( request.file_containing_extension.containing_type, request.file_containing_extension.extension_number, ) elif request.HasField('all_extension_numbers_of_type'): response = self._all_extension_numbers_of_type_response( request.all_extension_numbers_of_type, ) elif request.HasField('list_services'): response = self._list_services_response() else: response = ServerReflectionResponse( error_response=ErrorResponse( error_code=Status.INVALID_ARGUMENT.value, error_message='invalid argument', ) ) await stream.send_message(response) @classmethod def extend(cls, services: 'Collection[IServable]') -> 'List[IServable]': """ Extends services list with reflection service: .. 
code-block:: python3 from grpclib.reflection.service import ServerReflection services = [Greeter()] services = ServerReflection.extend(services) server = Server(services) ... Returns new services list with reflection support added. """ service_names = [] for service in services: service_names.append(_service_name(service)) services = list(services) services.append(cls(_service_names=service_names)) services.append(_ServerReflectionV1Alpha(_service_names=service_names)) return services ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1721851946.6562335 grpclib-0.4.8rc2/grpclib/reflection/v1/0000755000175100001770000000000000000000000020464 5ustar00runnerdocker00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/reflection/v1/__init__.py0000644000175100001770000000000000000000000022563 0ustar00runnerdocker00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851944.0 grpclib-0.4.8rc2/grpclib/reflection/v1/reflection_grpc.py0000644000175100001770000000270200000000000024204 0ustar00runnerdocker00000000000000# Generated by the Protocol Buffers compiler. DO NOT EDIT! # source: grpclib/reflection/v1/reflection.proto # plugin: grpclib.plugin.main import abc import typing import grpclib.const import grpclib.client if typing.TYPE_CHECKING: import grpclib.server import grpclib.reflection.v1.reflection_pb2 class ServerReflectionBase(abc.ABC): @abc.abstractmethod async def ServerReflectionInfo(self, stream: 'grpclib.server.Stream[grpclib.reflection.v1.reflection_pb2.ServerReflectionRequest, grpclib.reflection.v1.reflection_pb2.ServerReflectionResponse]') -> None: pass def __mapping__(self) -> typing.Dict[str, grpclib.const.Handler]: return { '/grpc.reflection.v1.ServerReflection/ServerReflectionInfo': grpclib.const.Handler( self.ServerReflectionInfo, grpclib.const.Cardinality.STREAM_STREAM, grpclib.reflection.v1.reflection_pb2.ServerReflectionRequest, grpclib.reflection.v1.reflection_pb2.ServerReflectionResponse, ), } class ServerReflectionStub: def __init__(self, channel: grpclib.client.Channel) -> None: self.ServerReflectionInfo = grpclib.client.StreamStreamMethod( channel, '/grpc.reflection.v1.ServerReflection/ServerReflectionInfo', grpclib.reflection.v1.reflection_pb2.ServerReflectionRequest, grpclib.reflection.v1.reflection_pb2.ServerReflectionResponse, ) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851944.0 grpclib-0.4.8rc2/grpclib/reflection/v1/reflection_pb2.py0000644000175100001770000001001500000000000023730 0ustar00runnerdocker00000000000000# -*- coding: utf-8 -*- # Generated by the protocol buffer compiler. DO NOT EDIT! 
# source: grpclib/reflection/v1/reflection.proto # Protobuf Python Version: 4.25.1 """Generated protocol buffer code.""" from google.protobuf import descriptor as _descriptor from google.protobuf import descriptor_pool as _descriptor_pool from google.protobuf import symbol_database as _symbol_database from google.protobuf.internal import builder as _builder # @@protoc_insertion_point(imports) _sym_db = _symbol_database.Default() DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n&grpclib/reflection/v1/reflection.proto\x12\x12grpc.reflection.v1\"\x85\x02\n\x17ServerReflectionRequest\x12\x0c\n\x04host\x18\x01 \x01(\t\x12\x1a\n\x10\x66ile_by_filename\x18\x03 \x01(\tH\x00\x12 \n\x16\x66ile_containing_symbol\x18\x04 \x01(\tH\x00\x12I\n\x19\x66ile_containing_extension\x18\x05 \x01(\x0b\x32$.grpc.reflection.v1.ExtensionRequestH\x00\x12\'\n\x1d\x61ll_extension_numbers_of_type\x18\x06 \x01(\tH\x00\x12\x17\n\rlist_services\x18\x07 \x01(\tH\x00\x42\x11\n\x0fmessage_request\"E\n\x10\x45xtensionRequest\x12\x17\n\x0f\x63ontaining_type\x18\x01 \x01(\t\x12\x18\n\x10\x65xtension_number\x18\x02 \x01(\x05\"\xb8\x03\n\x18ServerReflectionResponse\x12\x12\n\nvalid_host\x18\x01 \x01(\t\x12\x45\n\x10original_request\x18\x02 \x01(\x0b\x32+.grpc.reflection.v1.ServerReflectionRequest\x12N\n\x18\x66ile_descriptor_response\x18\x04 \x01(\x0b\x32*.grpc.reflection.v1.FileDescriptorResponseH\x00\x12U\n\x1e\x61ll_extension_numbers_response\x18\x05 \x01(\x0b\x32+.grpc.reflection.v1.ExtensionNumberResponseH\x00\x12I\n\x16list_services_response\x18\x06 \x01(\x0b\x32\'.grpc.reflection.v1.ListServiceResponseH\x00\x12;\n\x0e\x65rror_response\x18\x07 \x01(\x0b\x32!.grpc.reflection.v1.ErrorResponseH\x00\x42\x12\n\x10message_response\"7\n\x16\x46ileDescriptorResponse\x12\x1d\n\x15\x66ile_descriptor_proto\x18\x01 \x03(\x0c\"K\n\x17\x45xtensionNumberResponse\x12\x16\n\x0e\x62\x61se_type_name\x18\x01 \x01(\t\x12\x18\n\x10\x65xtension_number\x18\x02 \x03(\x05\"K\n\x13ListServiceResponse\x12\x34\n\x07service\x18\x01 \x03(\x0b\x32#.grpc.reflection.v1.ServiceResponse\"\x1f\n\x0fServiceResponse\x12\x0c\n\x04name\x18\x01 \x01(\t\":\n\rErrorResponse\x12\x12\n\nerror_code\x18\x01 \x01(\x05\x12\x15\n\rerror_message\x18\x02 \x01(\t2\x89\x01\n\x10ServerReflection\x12u\n\x14ServerReflectionInfo\x12+.grpc.reflection.v1.ServerReflectionRequest\x1a,.grpc.reflection.v1.ServerReflectionResponse(\x01\x30\x01\x42\x66\n\x15io.grpc.reflection.v1B\x15ServerReflectionProtoP\x01Z4google.golang.org/grpc/reflection/grpc_reflection_v1b\x06proto3') _globals = globals() _builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals) _builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'grpclib.reflection.v1.reflection_pb2', _globals) if _descriptor._USE_C_DESCRIPTORS == False: _globals['DESCRIPTOR']._options = None _globals['DESCRIPTOR']._serialized_options = b'\n\025io.grpc.reflection.v1B\025ServerReflectionProtoP\001Z4google.golang.org/grpc/reflection/grpc_reflection_v1' _globals['_SERVERREFLECTIONREQUEST']._serialized_start=63 _globals['_SERVERREFLECTIONREQUEST']._serialized_end=324 _globals['_EXTENSIONREQUEST']._serialized_start=326 _globals['_EXTENSIONREQUEST']._serialized_end=395 _globals['_SERVERREFLECTIONRESPONSE']._serialized_start=398 _globals['_SERVERREFLECTIONRESPONSE']._serialized_end=838 _globals['_FILEDESCRIPTORRESPONSE']._serialized_start=840 _globals['_FILEDESCRIPTORRESPONSE']._serialized_end=895 _globals['_EXTENSIONNUMBERRESPONSE']._serialized_start=897 _globals['_EXTENSIONNUMBERRESPONSE']._serialized_end=972 
_globals['_LISTSERVICERESPONSE']._serialized_start=974 _globals['_LISTSERVICERESPONSE']._serialized_end=1049 _globals['_SERVICERESPONSE']._serialized_start=1051 _globals['_SERVICERESPONSE']._serialized_end=1082 _globals['_ERRORRESPONSE']._serialized_start=1084 _globals['_ERRORRESPONSE']._serialized_end=1142 _globals['_SERVERREFLECTION']._serialized_start=1145 _globals['_SERVERREFLECTION']._serialized_end=1282 # @@protoc_insertion_point(module_scope) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851944.0 grpclib-0.4.8rc2/grpclib/reflection/v1/reflection_pb2.pyi0000644000175100001770000002746400000000000024121 0ustar00runnerdocker00000000000000""" @generated by mypy-protobuf. Do not edit manually! isort:skip_file Service exported by server reflection. A more complete description of how server reflection works can be found at https://github.com/grpc/grpc/blob/master/doc/server-reflection.md The canonical version of this proto can be found at https://github.com/grpc/grpc-proto/blob/master/grpc/reflection/v1/reflection.proto """ import builtins import collections.abc import google.protobuf.descriptor import google.protobuf.internal.containers import google.protobuf.message import typing DESCRIPTOR: google.protobuf.descriptor.FileDescriptor @typing.final class ServerReflectionRequest(google.protobuf.message.Message): """The message sent by the client when calling ServerReflectionInfo method.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor HOST_FIELD_NUMBER: builtins.int FILE_BY_FILENAME_FIELD_NUMBER: builtins.int FILE_CONTAINING_SYMBOL_FIELD_NUMBER: builtins.int FILE_CONTAINING_EXTENSION_FIELD_NUMBER: builtins.int ALL_EXTENSION_NUMBERS_OF_TYPE_FIELD_NUMBER: builtins.int LIST_SERVICES_FIELD_NUMBER: builtins.int host: builtins.str file_by_filename: builtins.str """Find a proto file by the file name.""" file_containing_symbol: builtins.str """Find the proto file that declares the given fully-qualified symbol name. This field should be a fully-qualified symbol name (e.g. .[.] or .). """ all_extension_numbers_of_type: builtins.str """Finds the tag numbers used by all known extensions of the given message type, and appends them to ExtensionNumberResponse in an undefined order. Its corresponding method is best-effort: it's not guaranteed that the reflection service will implement this method, and it's not guaranteed that this method will provide all extensions. Returns StatusCode::UNIMPLEMENTED if it's not implemented. This field should be a fully-qualified type name. The format is . """ list_services: builtins.str """List the full names of registered services. The content will not be checked. """ @property def file_containing_extension(self) -> global___ExtensionRequest: """Find the proto file which defines an extension extending the given message type with the given field number. """ def __init__( self, *, host: builtins.str = ..., file_by_filename: builtins.str = ..., file_containing_symbol: builtins.str = ..., file_containing_extension: global___ExtensionRequest | None = ..., all_extension_numbers_of_type: builtins.str = ..., list_services: builtins.str = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["all_extension_numbers_of_type", b"all_extension_numbers_of_type", "file_by_filename", b"file_by_filename", "file_containing_extension", b"file_containing_extension", "file_containing_symbol", b"file_containing_symbol", "list_services", b"list_services", "message_request", b"message_request"]) -> builtins.bool: ... 
def ClearField(self, field_name: typing.Literal["all_extension_numbers_of_type", b"all_extension_numbers_of_type", "file_by_filename", b"file_by_filename", "file_containing_extension", b"file_containing_extension", "file_containing_symbol", b"file_containing_symbol", "host", b"host", "list_services", b"list_services", "message_request", b"message_request"]) -> None: ... def WhichOneof(self, oneof_group: typing.Literal["message_request", b"message_request"]) -> typing.Literal["file_by_filename", "file_containing_symbol", "file_containing_extension", "all_extension_numbers_of_type", "list_services"] | None: ... global___ServerReflectionRequest = ServerReflectionRequest @typing.final class ExtensionRequest(google.protobuf.message.Message): """The type name and extension number sent by the client when requesting file_containing_extension. """ DESCRIPTOR: google.protobuf.descriptor.Descriptor CONTAINING_TYPE_FIELD_NUMBER: builtins.int EXTENSION_NUMBER_FIELD_NUMBER: builtins.int containing_type: builtins.str """Fully-qualified type name. The format should be .""" extension_number: builtins.int def __init__( self, *, containing_type: builtins.str = ..., extension_number: builtins.int = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["containing_type", b"containing_type", "extension_number", b"extension_number"]) -> None: ... global___ExtensionRequest = ExtensionRequest @typing.final class ServerReflectionResponse(google.protobuf.message.Message): """The message sent by the server to answer ServerReflectionInfo method.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor VALID_HOST_FIELD_NUMBER: builtins.int ORIGINAL_REQUEST_FIELD_NUMBER: builtins.int FILE_DESCRIPTOR_RESPONSE_FIELD_NUMBER: builtins.int ALL_EXTENSION_NUMBERS_RESPONSE_FIELD_NUMBER: builtins.int LIST_SERVICES_RESPONSE_FIELD_NUMBER: builtins.int ERROR_RESPONSE_FIELD_NUMBER: builtins.int valid_host: builtins.str @property def original_request(self) -> global___ServerReflectionRequest: ... @property def file_descriptor_response(self) -> global___FileDescriptorResponse: """This message is used to answer file_by_filename, file_containing_symbol, file_containing_extension requests with transitive dependencies. As the repeated label is not allowed in oneof fields, we use a FileDescriptorResponse message to encapsulate the repeated fields. The reflection service is allowed to avoid sending FileDescriptorProtos that were previously sent in response to earlier requests in the stream. """ @property def all_extension_numbers_response(self) -> global___ExtensionNumberResponse: """This message is used to answer all_extension_numbers_of_type requests.""" @property def list_services_response(self) -> global___ListServiceResponse: """This message is used to answer list_services requests.""" @property def error_response(self) -> global___ErrorResponse: """This message is used when an error occurs.""" def __init__( self, *, valid_host: builtins.str = ..., original_request: global___ServerReflectionRequest | None = ..., file_descriptor_response: global___FileDescriptorResponse | None = ..., all_extension_numbers_response: global___ExtensionNumberResponse | None = ..., list_services_response: global___ListServiceResponse | None = ..., error_response: global___ErrorResponse | None = ..., ) -> None: ... 
def HasField(self, field_name: typing.Literal["all_extension_numbers_response", b"all_extension_numbers_response", "error_response", b"error_response", "file_descriptor_response", b"file_descriptor_response", "list_services_response", b"list_services_response", "message_response", b"message_response", "original_request", b"original_request"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["all_extension_numbers_response", b"all_extension_numbers_response", "error_response", b"error_response", "file_descriptor_response", b"file_descriptor_response", "list_services_response", b"list_services_response", "message_response", b"message_response", "original_request", b"original_request", "valid_host", b"valid_host"]) -> None: ... def WhichOneof(self, oneof_group: typing.Literal["message_response", b"message_response"]) -> typing.Literal["file_descriptor_response", "all_extension_numbers_response", "list_services_response", "error_response"] | None: ... global___ServerReflectionResponse = ServerReflectionResponse @typing.final class FileDescriptorResponse(google.protobuf.message.Message): """Serialized FileDescriptorProto messages sent by the server answering a file_by_filename, file_containing_symbol, or file_containing_extension request. """ DESCRIPTOR: google.protobuf.descriptor.Descriptor FILE_DESCRIPTOR_PROTO_FIELD_NUMBER: builtins.int @property def file_descriptor_proto(self) -> google.protobuf.internal.containers.RepeatedScalarFieldContainer[builtins.bytes]: """Serialized FileDescriptorProto messages. We avoid taking a dependency on descriptor.proto, which uses proto2 only features, by making them opaque bytes instead. """ def __init__( self, *, file_descriptor_proto: collections.abc.Iterable[builtins.bytes] | None = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["file_descriptor_proto", b"file_descriptor_proto"]) -> None: ... global___FileDescriptorResponse = FileDescriptorResponse @typing.final class ExtensionNumberResponse(google.protobuf.message.Message): """A list of extension numbers sent by the server answering all_extension_numbers_of_type request. """ DESCRIPTOR: google.protobuf.descriptor.Descriptor BASE_TYPE_NAME_FIELD_NUMBER: builtins.int EXTENSION_NUMBER_FIELD_NUMBER: builtins.int base_type_name: builtins.str """Full name of the base type, including the package name. The format is . """ @property def extension_number(self) -> google.protobuf.internal.containers.RepeatedScalarFieldContainer[builtins.int]: ... def __init__( self, *, base_type_name: builtins.str = ..., extension_number: collections.abc.Iterable[builtins.int] | None = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["base_type_name", b"base_type_name", "extension_number", b"extension_number"]) -> None: ... global___ExtensionNumberResponse = ExtensionNumberResponse @typing.final class ListServiceResponse(google.protobuf.message.Message): """A list of ServiceResponse sent by the server answering list_services request.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor SERVICE_FIELD_NUMBER: builtins.int @property def service(self) -> google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___ServiceResponse]: """The information of each service may be expanded in the future, so we use ServiceResponse message to encapsulate it. """ def __init__( self, *, service: collections.abc.Iterable[global___ServiceResponse] | None = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["service", b"service"]) -> None: ... 
global___ListServiceResponse = ListServiceResponse @typing.final class ServiceResponse(google.protobuf.message.Message): """The information of a single service used by ListServiceResponse to answer list_services request. """ DESCRIPTOR: google.protobuf.descriptor.Descriptor NAME_FIELD_NUMBER: builtins.int name: builtins.str """Full name of a registered service, including its package name. The format is . """ def __init__( self, *, name: builtins.str = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["name", b"name"]) -> None: ... global___ServiceResponse = ServiceResponse @typing.final class ErrorResponse(google.protobuf.message.Message): """The error code and error message sent by the server when an error occurs.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor ERROR_CODE_FIELD_NUMBER: builtins.int ERROR_MESSAGE_FIELD_NUMBER: builtins.int error_code: builtins.int """This field uses the error codes defined in grpc::StatusCode.""" error_message: builtins.str def __init__( self, *, error_code: builtins.int = ..., error_message: builtins.str = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["error_code", b"error_code", "error_message", b"error_message"]) -> None: ... global___ErrorResponse = ErrorResponse ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1721851946.6562335 grpclib-0.4.8rc2/grpclib/reflection/v1alpha/0000755000175100001770000000000000000000000021472 5ustar00runnerdocker00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/reflection/v1alpha/__init__.py0000644000175100001770000000000000000000000023571 0ustar00runnerdocker00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851944.0 grpclib-0.4.8rc2/grpclib/reflection/v1alpha/reflection_grpc.py0000644000175100001770000000276400000000000025222 0ustar00runnerdocker00000000000000# Generated by the Protocol Buffers compiler. DO NOT EDIT! 
# source: grpclib/reflection/v1alpha/reflection.proto # plugin: grpclib.plugin.main import abc import typing import grpclib.const import grpclib.client if typing.TYPE_CHECKING: import grpclib.server import grpclib.reflection.v1alpha.reflection_pb2 class ServerReflectionBase(abc.ABC): @abc.abstractmethod async def ServerReflectionInfo(self, stream: 'grpclib.server.Stream[grpclib.reflection.v1alpha.reflection_pb2.ServerReflectionRequest, grpclib.reflection.v1alpha.reflection_pb2.ServerReflectionResponse]') -> None: pass def __mapping__(self) -> typing.Dict[str, grpclib.const.Handler]: return { '/grpc.reflection.v1alpha.ServerReflection/ServerReflectionInfo': grpclib.const.Handler( self.ServerReflectionInfo, grpclib.const.Cardinality.STREAM_STREAM, grpclib.reflection.v1alpha.reflection_pb2.ServerReflectionRequest, grpclib.reflection.v1alpha.reflection_pb2.ServerReflectionResponse, ), } class ServerReflectionStub: def __init__(self, channel: grpclib.client.Channel) -> None: self.ServerReflectionInfo = grpclib.client.StreamStreamMethod( channel, '/grpc.reflection.v1alpha.ServerReflection/ServerReflectionInfo', grpclib.reflection.v1alpha.reflection_pb2.ServerReflectionRequest, grpclib.reflection.v1alpha.reflection_pb2.ServerReflectionResponse, ) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851944.0 grpclib-0.4.8rc2/grpclib/reflection/v1alpha/reflection_pb2.py0000644000175100001770000001002400000000000024736 0ustar00runnerdocker00000000000000# -*- coding: utf-8 -*- # Generated by the protocol buffer compiler. DO NOT EDIT! # source: grpclib/reflection/v1alpha/reflection.proto # Protobuf Python Version: 4.25.1 """Generated protocol buffer code.""" from google.protobuf import descriptor as _descriptor from google.protobuf import descriptor_pool as _descriptor_pool from google.protobuf import symbol_database as _symbol_database from google.protobuf.internal import builder as _builder # @@protoc_insertion_point(imports) _sym_db = _symbol_database.Default() DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n+grpclib/reflection/v1alpha/reflection.proto\x12\x17grpc.reflection.v1alpha\"\x8a\x02\n\x17ServerReflectionRequest\x12\x0c\n\x04host\x18\x01 \x01(\t\x12\x1a\n\x10\x66ile_by_filename\x18\x03 \x01(\tH\x00\x12 \n\x16\x66ile_containing_symbol\x18\x04 \x01(\tH\x00\x12N\n\x19\x66ile_containing_extension\x18\x05 \x01(\x0b\x32).grpc.reflection.v1alpha.ExtensionRequestH\x00\x12\'\n\x1d\x61ll_extension_numbers_of_type\x18\x06 \x01(\tH\x00\x12\x17\n\rlist_services\x18\x07 \x01(\tH\x00\x42\x11\n\x0fmessage_request\"E\n\x10\x45xtensionRequest\x12\x17\n\x0f\x63ontaining_type\x18\x01 \x01(\t\x12\x18\n\x10\x65xtension_number\x18\x02 \x01(\x05\"\xd1\x03\n\x18ServerReflectionResponse\x12\x12\n\nvalid_host\x18\x01 \x01(\t\x12J\n\x10original_request\x18\x02 \x01(\x0b\x32\x30.grpc.reflection.v1alpha.ServerReflectionRequest\x12S\n\x18\x66ile_descriptor_response\x18\x04 \x01(\x0b\x32/.grpc.reflection.v1alpha.FileDescriptorResponseH\x00\x12Z\n\x1e\x61ll_extension_numbers_response\x18\x05 \x01(\x0b\x32\x30.grpc.reflection.v1alpha.ExtensionNumberResponseH\x00\x12N\n\x16list_services_response\x18\x06 \x01(\x0b\x32,.grpc.reflection.v1alpha.ListServiceResponseH\x00\x12@\n\x0e\x65rror_response\x18\x07 \x01(\x0b\x32&.grpc.reflection.v1alpha.ErrorResponseH\x00\x42\x12\n\x10message_response\"7\n\x16\x46ileDescriptorResponse\x12\x1d\n\x15\x66ile_descriptor_proto\x18\x01 \x03(\x0c\"K\n\x17\x45xtensionNumberResponse\x12\x16\n\x0e\x62\x61se_type_name\x18\x01 
\x01(\t\x12\x18\n\x10\x65xtension_number\x18\x02 \x03(\x05\"P\n\x13ListServiceResponse\x12\x39\n\x07service\x18\x01 \x03(\x0b\x32(.grpc.reflection.v1alpha.ServiceResponse\"\x1f\n\x0fServiceResponse\x12\x0c\n\x04name\x18\x01 \x01(\t\":\n\rErrorResponse\x12\x12\n\nerror_code\x18\x01 \x01(\x05\x12\x15\n\rerror_message\x18\x02 \x01(\t2\x93\x01\n\x10ServerReflection\x12\x7f\n\x14ServerReflectionInfo\x12\x30.grpc.reflection.v1alpha.ServerReflectionRequest\x1a\x31.grpc.reflection.v1alpha.ServerReflectionResponse(\x01\x30\x01\x42\x38\n\x1aio.grpc.reflection.v1alphaB\x15ServerReflectionProtoP\x01\xb8\x01\x01\x62\x06proto3') _globals = globals() _builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals) _builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'grpclib.reflection.v1alpha.reflection_pb2', _globals) if _descriptor._USE_C_DESCRIPTORS == False: _globals['DESCRIPTOR']._options = None _globals['DESCRIPTOR']._serialized_options = b'\n\032io.grpc.reflection.v1alphaB\025ServerReflectionProtoP\001\270\001\001' _globals['_SERVERREFLECTIONREQUEST']._serialized_start=73 _globals['_SERVERREFLECTIONREQUEST']._serialized_end=339 _globals['_EXTENSIONREQUEST']._serialized_start=341 _globals['_EXTENSIONREQUEST']._serialized_end=410 _globals['_SERVERREFLECTIONRESPONSE']._serialized_start=413 _globals['_SERVERREFLECTIONRESPONSE']._serialized_end=878 _globals['_FILEDESCRIPTORRESPONSE']._serialized_start=880 _globals['_FILEDESCRIPTORRESPONSE']._serialized_end=935 _globals['_EXTENSIONNUMBERRESPONSE']._serialized_start=937 _globals['_EXTENSIONNUMBERRESPONSE']._serialized_end=1012 _globals['_LISTSERVICERESPONSE']._serialized_start=1014 _globals['_LISTSERVICERESPONSE']._serialized_end=1094 _globals['_SERVICERESPONSE']._serialized_start=1096 _globals['_SERVICERESPONSE']._serialized_end=1127 _globals['_ERRORRESPONSE']._serialized_start=1129 _globals['_ERRORRESPONSE']._serialized_end=1187 _globals['_SERVERREFLECTION']._serialized_start=1190 _globals['_SERVERREFLECTION']._serialized_end=1337 # @@protoc_insertion_point(module_scope) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851944.0 grpclib-0.4.8rc2/grpclib/reflection/v1alpha/reflection_pb2.pyi0000644000175100001770000002717100000000000025122 0ustar00runnerdocker00000000000000""" @generated by mypy-protobuf. Do not edit manually! isort:skip_file Warning: this entire file is deprecated. Use this instead: https://github.com/grpc/grpc-proto/blob/master/grpc/reflection/v1/reflection.proto """ import builtins import collections.abc import google.protobuf.descriptor import google.protobuf.internal.containers import google.protobuf.message import typing DESCRIPTOR: google.protobuf.descriptor.FileDescriptor @typing.final class ServerReflectionRequest(google.protobuf.message.Message): """The message sent by the client when calling ServerReflectionInfo method.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor HOST_FIELD_NUMBER: builtins.int FILE_BY_FILENAME_FIELD_NUMBER: builtins.int FILE_CONTAINING_SYMBOL_FIELD_NUMBER: builtins.int FILE_CONTAINING_EXTENSION_FIELD_NUMBER: builtins.int ALL_EXTENSION_NUMBERS_OF_TYPE_FIELD_NUMBER: builtins.int LIST_SERVICES_FIELD_NUMBER: builtins.int host: builtins.str file_by_filename: builtins.str """Find a proto file by the file name.""" file_containing_symbol: builtins.str """Find the proto file that declares the given fully-qualified symbol name. This field should be a fully-qualified symbol name (e.g. .[.] or .). 
""" all_extension_numbers_of_type: builtins.str """Finds the tag numbers used by all known extensions of extendee_type, and appends them to ExtensionNumberResponse in an undefined order. Its corresponding method is best-effort: it's not guaranteed that the reflection service will implement this method, and it's not guaranteed that this method will provide all extensions. Returns StatusCode::UNIMPLEMENTED if it's not implemented. This field should be a fully-qualified type name. The format is . """ list_services: builtins.str """List the full names of registered services. The content will not be checked. """ @property def file_containing_extension(self) -> global___ExtensionRequest: """Find the proto file which defines an extension extending the given message type with the given field number. """ def __init__( self, *, host: builtins.str = ..., file_by_filename: builtins.str = ..., file_containing_symbol: builtins.str = ..., file_containing_extension: global___ExtensionRequest | None = ..., all_extension_numbers_of_type: builtins.str = ..., list_services: builtins.str = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["all_extension_numbers_of_type", b"all_extension_numbers_of_type", "file_by_filename", b"file_by_filename", "file_containing_extension", b"file_containing_extension", "file_containing_symbol", b"file_containing_symbol", "list_services", b"list_services", "message_request", b"message_request"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["all_extension_numbers_of_type", b"all_extension_numbers_of_type", "file_by_filename", b"file_by_filename", "file_containing_extension", b"file_containing_extension", "file_containing_symbol", b"file_containing_symbol", "host", b"host", "list_services", b"list_services", "message_request", b"message_request"]) -> None: ... def WhichOneof(self, oneof_group: typing.Literal["message_request", b"message_request"]) -> typing.Literal["file_by_filename", "file_containing_symbol", "file_containing_extension", "all_extension_numbers_of_type", "list_services"] | None: ... global___ServerReflectionRequest = ServerReflectionRequest @typing.final class ExtensionRequest(google.protobuf.message.Message): """The type name and extension number sent by the client when requesting file_containing_extension. """ DESCRIPTOR: google.protobuf.descriptor.Descriptor CONTAINING_TYPE_FIELD_NUMBER: builtins.int EXTENSION_NUMBER_FIELD_NUMBER: builtins.int containing_type: builtins.str """Fully-qualified type name. The format should be .""" extension_number: builtins.int def __init__( self, *, containing_type: builtins.str = ..., extension_number: builtins.int = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["containing_type", b"containing_type", "extension_number", b"extension_number"]) -> None: ... global___ExtensionRequest = ExtensionRequest @typing.final class ServerReflectionResponse(google.protobuf.message.Message): """The message sent by the server to answer ServerReflectionInfo method.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor VALID_HOST_FIELD_NUMBER: builtins.int ORIGINAL_REQUEST_FIELD_NUMBER: builtins.int FILE_DESCRIPTOR_RESPONSE_FIELD_NUMBER: builtins.int ALL_EXTENSION_NUMBERS_RESPONSE_FIELD_NUMBER: builtins.int LIST_SERVICES_RESPONSE_FIELD_NUMBER: builtins.int ERROR_RESPONSE_FIELD_NUMBER: builtins.int valid_host: builtins.str @property def original_request(self) -> global___ServerReflectionRequest: ... 
@property def file_descriptor_response(self) -> global___FileDescriptorResponse: """This message is used to answer file_by_filename, file_containing_symbol, file_containing_extension requests with transitive dependencies. As the repeated label is not allowed in oneof fields, we use a FileDescriptorResponse message to encapsulate the repeated fields. The reflection service is allowed to avoid sending FileDescriptorProtos that were previously sent in response to earlier requests in the stream. """ @property def all_extension_numbers_response(self) -> global___ExtensionNumberResponse: """This message is used to answer all_extension_numbers_of_type requst.""" @property def list_services_response(self) -> global___ListServiceResponse: """This message is used to answer list_services request.""" @property def error_response(self) -> global___ErrorResponse: """This message is used when an error occurs.""" def __init__( self, *, valid_host: builtins.str = ..., original_request: global___ServerReflectionRequest | None = ..., file_descriptor_response: global___FileDescriptorResponse | None = ..., all_extension_numbers_response: global___ExtensionNumberResponse | None = ..., list_services_response: global___ListServiceResponse | None = ..., error_response: global___ErrorResponse | None = ..., ) -> None: ... def HasField(self, field_name: typing.Literal["all_extension_numbers_response", b"all_extension_numbers_response", "error_response", b"error_response", "file_descriptor_response", b"file_descriptor_response", "list_services_response", b"list_services_response", "message_response", b"message_response", "original_request", b"original_request"]) -> builtins.bool: ... def ClearField(self, field_name: typing.Literal["all_extension_numbers_response", b"all_extension_numbers_response", "error_response", b"error_response", "file_descriptor_response", b"file_descriptor_response", "list_services_response", b"list_services_response", "message_response", b"message_response", "original_request", b"original_request", "valid_host", b"valid_host"]) -> None: ... def WhichOneof(self, oneof_group: typing.Literal["message_response", b"message_response"]) -> typing.Literal["file_descriptor_response", "all_extension_numbers_response", "list_services_response", "error_response"] | None: ... global___ServerReflectionResponse = ServerReflectionResponse @typing.final class FileDescriptorResponse(google.protobuf.message.Message): """Serialized FileDescriptorProto messages sent by the server answering a file_by_filename, file_containing_symbol, or file_containing_extension request. """ DESCRIPTOR: google.protobuf.descriptor.Descriptor FILE_DESCRIPTOR_PROTO_FIELD_NUMBER: builtins.int @property def file_descriptor_proto(self) -> google.protobuf.internal.containers.RepeatedScalarFieldContainer[builtins.bytes]: """Serialized FileDescriptorProto messages. We avoid taking a dependency on descriptor.proto, which uses proto2 only features, by making them opaque bytes instead. """ def __init__( self, *, file_descriptor_proto: collections.abc.Iterable[builtins.bytes] | None = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["file_descriptor_proto", b"file_descriptor_proto"]) -> None: ... global___FileDescriptorResponse = FileDescriptorResponse @typing.final class ExtensionNumberResponse(google.protobuf.message.Message): """A list of extension numbers sent by the server answering all_extension_numbers_of_type request. 
""" DESCRIPTOR: google.protobuf.descriptor.Descriptor BASE_TYPE_NAME_FIELD_NUMBER: builtins.int EXTENSION_NUMBER_FIELD_NUMBER: builtins.int base_type_name: builtins.str """Full name of the base type, including the package name. The format is . """ @property def extension_number(self) -> google.protobuf.internal.containers.RepeatedScalarFieldContainer[builtins.int]: ... def __init__( self, *, base_type_name: builtins.str = ..., extension_number: collections.abc.Iterable[builtins.int] | None = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["base_type_name", b"base_type_name", "extension_number", b"extension_number"]) -> None: ... global___ExtensionNumberResponse = ExtensionNumberResponse @typing.final class ListServiceResponse(google.protobuf.message.Message): """A list of ServiceResponse sent by the server answering list_services request.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor SERVICE_FIELD_NUMBER: builtins.int @property def service(self) -> google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___ServiceResponse]: """The information of each service may be expanded in the future, so we use ServiceResponse message to encapsulate it. """ def __init__( self, *, service: collections.abc.Iterable[global___ServiceResponse] | None = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["service", b"service"]) -> None: ... global___ListServiceResponse = ListServiceResponse @typing.final class ServiceResponse(google.protobuf.message.Message): """The information of a single service used by ListServiceResponse to answer list_services request. """ DESCRIPTOR: google.protobuf.descriptor.Descriptor NAME_FIELD_NUMBER: builtins.int name: builtins.str """Full name of a registered service, including its package name. The format is . """ def __init__( self, *, name: builtins.str = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["name", b"name"]) -> None: ... global___ServiceResponse = ServiceResponse @typing.final class ErrorResponse(google.protobuf.message.Message): """The error code and error message sent by the server when an error occurs.""" DESCRIPTOR: google.protobuf.descriptor.Descriptor ERROR_CODE_FIELD_NUMBER: builtins.int ERROR_MESSAGE_FIELD_NUMBER: builtins.int error_code: builtins.int """This field uses the error codes defined in grpc::StatusCode.""" error_message: builtins.str def __init__( self, *, error_code: builtins.int = ..., error_message: builtins.str = ..., ) -> None: ... def ClearField(self, field_name: typing.Literal["error_code", b"error_code", "error_message", b"error_message"]) -> None: ... 
global___ErrorResponse = ErrorResponse ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/server.py0000644000175100001770000006512200000000000017672 0ustar00runnerdocker00000000000000import abc import time import socket import logging import asyncio import warnings from types import TracebackType from typing import TYPE_CHECKING, Optional, Collection, Generic, Type, cast from typing import List, Tuple, Dict, Any, Callable, ContextManager, Set from contextlib import nullcontext import h2.config import h2.exceptions from multidict import MultiDict from .utils import DeadlineWrapper, Wrapper from .const import Status, Cardinality from .config import Configuration from .stream import send_message, recv_message, StreamIterator from .stream import _RecvType, _SendType from .events import _DispatchServerEvents from .metadata import Deadline, encode_grpc_message, _Metadata from .metadata import encode_metadata, decode_metadata, _MetadataLike from .metadata import _STATUS_DETAILS_KEY, encode_bin_value from .protocol import H2Protocol, AbstractHandler from .exceptions import GRPCError, ProtocolError, StreamTerminatedError from .encoding.base import GRPC_CONTENT_TYPE, CodecBase, StatusDetailsCodecBase from .encoding.proto import ProtoCodec, ProtoStatusDetailsCodec from .encoding.proto import _googleapis_available from ._registry import servers as _servers if TYPE_CHECKING: import ssl as _ssl # noqa from . import const # noqa from . import protocol # noqa from ._typing import IServable # noqa log = logging.getLogger(__name__) _Headers = List[Tuple[str, str]] class Stream(StreamIterator[_RecvType], Generic[_RecvType, _SendType]): """ Represents gRPC method call – HTTP/2 request/stream, and everything you need to communicate with client in order to handle this request. As you can see, every method handler accepts single positional argument - stream: .. code-block:: python3 async def MakeLatte(self, stream: grpclib.server.Stream): task: cafe_pb2.LatteOrder = await stream.recv_message() ... await stream.send_message(empty_pb2.Empty()) This is true for every gRPC method type. """ # state _send_initial_metadata_done = False _send_message_done = False _send_trailing_metadata_done = False _cancel_done = False # stats _messages_sent = 0 _messages_received = 0 def __init__( self, stream: 'protocol.Stream', method_name: str, cardinality: Cardinality, recv_type: Type[_RecvType], send_type: Type[_SendType], *, codec: CodecBase, status_details_codec: Optional[StatusDetailsCodecBase], dispatch: _DispatchServerEvents, deadline: Optional[Deadline] = None, user_agent: Optional[str] = None, ): self._stream = stream self._method_name = method_name self._cardinality = cardinality self._recv_type = recv_type self._send_type = send_type self._codec = codec self._status_details_codec = status_details_codec self._dispatch = dispatch #: :py:class:`~grpclib.metadata.Deadline` of the current request self.deadline = deadline #: Invocation metadata, received with headers from the client. #: Represented as a multi-dict object. self.metadata: Optional[_Metadata] = None #: Client's user-agent self.user_agent = user_agent #: Connection's peer info of type :py:class:`~grpclib.protocol.Peer` self.peer = self._stream.connection.get_peer() @property def _content_type(self) -> str: return GRPC_CONTENT_TYPE + '+' + self._codec.__content_subtype__ async def recv_message(self) -> Optional[_RecvType]: """Coroutine to receive incoming message from the client. 
If client sends UNARY request, then you can call this coroutine only once. If client sends STREAM request, then you should call this coroutine several times, until it returns None when the client has ended the stream. To simplify your code in this case, :py:class:`Stream` class implements async iteration protocol, so you can use it like this: .. code-block:: python3 async for message in stream: do_smth_with(message) or even like this: .. code-block:: python3 messages = [msg async for msg in stream] HTTP/2 has flow control mechanism, so server will acknowledge received DATA frames as a message only after user consumes this coroutine. :returns: message """ message = await recv_message(self._stream, self._codec, self._recv_type) if message is not None: message, = await self._dispatch.recv_message(message) self._messages_received += 1 self._stream.connection.messages_received += 1 self._stream.connection.last_message_received = time.monotonic() return message else: return None async def send_initial_metadata( self, *, metadata: Optional[_MetadataLike] = None, ) -> None: """Coroutine to send headers with initial metadata to the client. In gRPC you can send initial metadata as soon as possible, because gRPC doesn't use `:status` pseudo header to indicate success or failure of the current request. gRPC uses trailers for this purpose, and trailers are sent during :py:meth:`send_trailing_metadata` call, which should be called in the end. .. note:: This coroutine will be called implicitly during first :py:meth:`send_message` coroutine call, if not called before explicitly. :param metadata: custom initial metadata, dict or list of pairs """ if self._send_initial_metadata_done: raise ProtocolError('Initial metadata was already sent') headers = [ (':status', '200'), ('content-type', self._content_type), ] metadata = MultiDict(metadata or ()) metadata, = await self._dispatch.send_initial_metadata(metadata) headers.extend(encode_metadata(cast(_Metadata, metadata))) await self._stream.send_headers(headers) self._send_initial_metadata_done = True async def send_message(self, message: _SendType) -> None: """Coroutine to send message to the client. If server sends UNARY response, then you should call this coroutine only once. If server sends STREAM response, then you can call this coroutine as many times as you need. :param message: message object """ if not self._send_initial_metadata_done: await self.send_initial_metadata() if not self._cardinality.server_streaming: if self._send_message_done: raise ProtocolError('Message was already sent') message, = await self._dispatch.send_message(message) await send_message(self._stream, self._codec, message, self._send_type) self._send_message_done = True self._messages_sent += 1 self._stream.connection.messages_sent += 1 self._stream.connection.last_message_sent = time.monotonic() async def send_trailing_metadata( self, *, status: Status = Status.OK, status_message: Optional[str] = None, status_details: Any = None, metadata: Optional[_MetadataLike] = None, ) -> None: """Coroutine to send trailers with trailing metadata to the client. This coroutine allows sending trailers-only responses, in case of some failure conditions during handling current request, i.e. when ``status is not OK``. .. note:: This coroutine will be called implicitly at exit from request handler, with appropriate status code, if not called explicitly during handler execution. 
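For example, a unary handler can reject a request with a trailers-only
error response (a sketch: ``Status`` comes from ``grpclib.const``, and the
``SayHello``/``GreeterBase``/``HelloReply`` names are borrowed from the
README example):

.. code-block:: python3

    from grpclib.const import Status

    class Greeter(GreeterBase):

        async def SayHello(self, stream):
            request = await stream.recv_message()
            if not request.name:
                await stream.send_trailing_metadata(
                    status=Status.INVALID_ARGUMENT,
                    status_message='name must not be empty',
                )
                return
            await stream.send_message(
                HelloReply(message=f'Hello, {request.name}!'),
            )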
:param status: resulting status of this coroutine call :param status_message: description for a status :param metadata: custom trailing metadata, dict or list of pairs """ if self._send_trailing_metadata_done: raise ProtocolError('Trailing metadata was already sent') if ( not self._cardinality.server_streaming and not self._send_message_done and status is Status.OK ): raise ProtocolError('Unary response with OK status requires ' 'a single message to be sent') if self._send_initial_metadata_done: headers: _Headers = [] else: # trailers-only response headers = [ (':status', '200'), ('content-type', self._content_type), ] headers.append(('grpc-status', str(status.value))) if status_message is not None: headers.append(('grpc-message', encode_grpc_message(status_message))) if ( status_details is not None and self._status_details_codec is not None ): status_details_bin = ( encode_bin_value(self._status_details_codec.encode( status, status_message, status_details, )).decode('ascii') ) headers.append((_STATUS_DETAILS_KEY, status_details_bin)) metadata = MultiDict(metadata or ()) metadata, = await self._dispatch.send_trailing_metadata( metadata, status=status, status_message=status_message, status_details=status_details, ) headers.extend(encode_metadata(cast(_Metadata, metadata))) await self._stream.send_headers(headers, end_stream=True) self._send_trailing_metadata_done = True if status != Status.OK and self._stream.closable: self._stream.reset_nowait() async def cancel(self) -> None: """Coroutine to cancel this request/stream. Server will send RST_STREAM frame to the client, so it will be explicitly informed that there is nothing to expect from the server regarding this request/stream. """ if self._cancel_done: raise ProtocolError('Stream was already cancelled') await self._stream.reset() # TODO: specify error code self._cancel_done = True async def __aenter__(self) -> 'Stream[_RecvType, _SendType]': return self async def __aexit__( self, exc_type: Optional[Type[BaseException]], exc_val: Optional[BaseException], exc_tb: Optional[TracebackType], ) -> Optional[bool]: if ( self._send_trailing_metadata_done or self._cancel_done or self._stream._transport.is_closing() ): # to suppress exception propagation return True protocol_error = None if exc_val is not None: # This error should be logged by ``request_handler``, here we # have to convert it into trailers and send to the client using # ``send_trailing_metadata`` method. 
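            # A ``GRPCError`` carries its own status, message and details;
            # any other ``Exception`` is reported as ``UNKNOWN`` with a
            # generic 'Internal Server Error' message so internal details
            # never leak to the client, while other ``BaseException``s
            # (e.g. ``asyncio.CancelledError``) propagate unchanged.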
if isinstance(exc_val, GRPCError): status = exc_val.status status_message = exc_val.message status_details = exc_val.details elif isinstance(exc_val, Exception): status = Status.UNKNOWN status_message = 'Internal Server Error' status_details = None else: # propagate exception return None elif ( # There is a possibility of a ``ProtocolError`` in the # ``send_trailing_metadata`` method, so we are checking for such # errors here not self._cardinality.server_streaming and not self._send_message_done ): status = Status.UNKNOWN status_message = 'Internal Server Error' status_details = None protocol_error = ('Unary response with OK status requires ' 'a single message to be sent: {!r}' .format(self._method_name)) else: status = Status.OK status_message = None status_details = None try: await self.send_trailing_metadata(status=status, status_message=status_message, status_details=status_details) except h2.exceptions.StreamClosedError: pass if protocol_error is not None: raise ProtocolError(protocol_error) # to suppress exception propagation return True async def _abort( h2_stream: 'protocol.Stream', h2_status: int, grpc_status: Optional[Status] = None, grpc_message: Optional[str] = None, ) -> None: headers = [(':status', str(h2_status))] if grpc_status is not None: headers.append(('grpc-status', str(grpc_status.value))) if grpc_message is not None: headers.append(('grpc-message', grpc_message)) await h2_stream.send_headers(headers, end_stream=True) if h2_stream.closable: h2_stream.reset_nowait() async def request_handler( mapping: Dict[str, 'const.Handler'], _stream: 'protocol.Stream', headers: _Headers, codec: CodecBase, status_details_codec: Optional[StatusDetailsCodecBase], dispatch: _DispatchServerEvents, release_stream: Callable[[], Any], ) -> None: try: headers_map = dict(headers) if headers_map[':method'] != 'POST': await _abort(_stream, 405) return content_type = headers_map.get('content-type') if content_type is None: await _abort(_stream, 415, Status.UNKNOWN, 'Missing content-type header') return base_content_type, _, sub_type = content_type.partition('+') sub_type = sub_type or ProtoCodec.__content_subtype__ if ( base_content_type != GRPC_CONTENT_TYPE or sub_type != codec.__content_subtype__ ): await _abort(_stream, 415, Status.UNKNOWN, 'Unacceptable content-type header') return if headers_map.get('te') != 'trailers': await _abort(_stream, 400, Status.UNKNOWN, 'Required "te: trailers" header is missing') return method_name = headers_map[':path'] method = mapping.get(method_name) if method is None: await _abort(_stream, 200, Status.UNIMPLEMENTED, 'Method not found') return try: deadline = Deadline.from_headers(headers) except ValueError: await _abort(_stream, 200, Status.UNKNOWN, 'Invalid grpc-timeout header') return metadata = decode_metadata(headers) user_agent = headers_map.get('user-agent') async with Stream( _stream, method_name, method.cardinality, method.request_type, method.reply_type, codec=codec, status_details_codec=status_details_codec, dispatch=dispatch, deadline=deadline, user_agent=user_agent, ) as stream: deadline_wrapper: 'ContextManager[Any]' if deadline is None: wrapper = _stream.wrapper = Wrapper() deadline_wrapper = nullcontext() else: wrapper = _stream.wrapper = DeadlineWrapper() deadline_wrapper = wrapper.start(deadline) try: with deadline_wrapper, wrapper: stream.metadata, method_func = await dispatch.recv_request( metadata, method.func, method_name=method_name, deadline=deadline, content_type=content_type, user_agent=user_agent, peer=stream.peer, ) await 
method_func(stream) except GRPCError: raise except asyncio.TimeoutError: if wrapper.cancel_failed: log.exception('Failed to handle cancellation') raise GRPCError(Status.DEADLINE_EXCEEDED) elif wrapper.cancelled: log.info('Deadline exceeded') raise GRPCError(Status.DEADLINE_EXCEEDED) else: log.exception('Timeout occurred') raise except StreamTerminatedError as err: if wrapper.cancel_failed: log.exception('Failed to handle cancellation') raise else: assert wrapper.cancelled log.info('Request was cancelled: %s', err) raise except Exception: log.exception('Application error') raise except ProtocolError: log.exception('Application error') except Exception: log.exception('Server error') finally: release_stream() class _GC(abc.ABC): _gc_counter = 0 @property @abc.abstractmethod def __gc_interval__(self) -> int: raise NotImplementedError @abc.abstractmethod def __gc_collect__(self) -> None: pass def __gc_step__(self) -> None: self._gc_counter += 1 if not (self._gc_counter % self.__gc_interval__): self.__gc_collect__() class Handler(_GC, AbstractHandler): __gc_interval__ = 10 closing = False def __init__( self, mapping: Dict[str, 'const.Handler'], codec: CodecBase, status_details_codec: Optional[StatusDetailsCodecBase], dispatch: _DispatchServerEvents, ) -> None: self.mapping = mapping self.codec = codec self.status_details_codec = status_details_codec self.dispatch = dispatch self.loop = asyncio.get_event_loop() self._tasks: Dict['protocol.Stream', 'asyncio.Task[None]'] = {} self._cancelled: Set['asyncio.Task[None]'] = set() def __gc_collect__(self) -> None: self._tasks = {s: t for s, t in self._tasks.items() if not t.done()} self._cancelled = {t for t in self._cancelled if not t.done()} def accept( self, stream: 'protocol.Stream', headers: _Headers, release_stream: Callable[[], Any], ) -> None: self.__gc_step__() self._tasks[stream] = self.loop.create_task(request_handler( self.mapping, stream, headers, self.codec, self.status_details_codec, self.dispatch, release_stream, )) def cancel(self, stream: 'protocol.Stream') -> None: task = self._tasks.pop(stream) task.cancel() self._cancelled.add(task) def close(self) -> None: for task in self._tasks.values(): task.cancel() self._cancelled.update(self._tasks.values()) self.closing = True async def wait_closed(self) -> None: if self._cancelled: await asyncio.wait(self._cancelled) def check_closed(self) -> bool: self.__gc_collect__() return not self._tasks and not self._cancelled class Server(_GC): """ HTTP/2 server, which uses gRPC service handlers to handle requests. Handler is a subclass of the abstract base class, which was generated from .proto file: .. code-block:: python3 class CoffeeMachine(cafe_grpc.CoffeeMachineBase): async def MakeLatte(self, stream): task: cafe_pb2.LatteOrder = await stream.recv_message() ... 
await stream.send_message(empty_pb2.Empty()) server = Server([CoffeeMachine()]) """ __gc_interval__ = 10 def __init__( self, handlers: Collection['IServable'], *, loop: Optional[asyncio.AbstractEventLoop] = None, codec: Optional[CodecBase] = None, status_details_codec: Optional[StatusDetailsCodecBase] = None, config: Optional[Configuration] = None, ) -> None: """ :param handlers: list of handlers :param loop: (deprecated) asyncio-compatible event loop :param codec: instance of a codec to encode and decode messages, if omitted ``ProtoCodec`` is used by default :param status_details_codec: instance of a status details codec to encode error details in a trailing metadata, if omitted ``ProtoStatusDetailsCodec`` is used by default """ if loop: warnings.warn("The loop argument is deprecated and scheduled " "for removal in grpclib 0.5", DeprecationWarning, stacklevel=2) mapping: Dict[str, 'const.Handler'] = {} for handler in handlers: mapping.update(handler.__mapping__()) self._mapping = mapping self._loop = loop or asyncio.get_event_loop() if codec is None: codec = ProtoCodec() if status_details_codec is None and _googleapis_available(): status_details_codec = ProtoStatusDetailsCodec() self._codec = codec self._status_details_codec = status_details_codec self._h2_config = h2.config.H2Configuration( client_side=False, header_encoding='ascii', validate_inbound_headers=False, validate_outbound_headers=False, normalize_inbound_headers=False, normalize_outbound_headers=False, ) config = Configuration() if config is None else config self._config = config.__for_server__() self._server: Optional[asyncio.AbstractServer] = None self._server_closed_fut: Optional[asyncio.Future[None]] = None self._handlers: Set[Handler] = set() self.__dispatch__ = _DispatchServerEvents() _servers.add(self) def __gc_collect__(self) -> None: self._handlers = {h for h in self._handlers if not (h.closing and h.check_closed())} def _protocol_factory(self) -> H2Protocol: self.__gc_step__() handler = Handler( self._mapping, self._codec, self._status_details_codec, self.__dispatch__, ) self._handlers.add(handler) return H2Protocol(handler, self._config, self._h2_config) async def start( self, host: Optional[str] = None, port: Optional[int] = None, *, path: Optional[str] = None, family: 'socket.AddressFamily' = socket.AF_UNSPEC, flags: 'socket.AddressInfo' = socket.AI_PASSIVE, sock: Optional[socket.socket] = None, backlog: int = 100, ssl: Optional['_ssl.SSLContext'] = None, reuse_address: Optional[bool] = None, reuse_port: Optional[bool] = None, ) -> None: """Coroutine to start the server. :param host: can be a string, containing IPv4/v6 address or domain name. If host is None, server will be bound to all available interfaces. :param port: port number. :param path: UNIX domain socket path. If specified, host and port should be omitted (must be None). :param family: can be set to either :py:data:`python:socket.AF_INET` or :py:data:`python:socket.AF_INET6` to force the socket to use IPv4 or IPv6. If not set it will be determined from host. :param flags: is a bitmask for :py:meth:`~python:asyncio.AbstractEventLoop.getaddrinfo`. :param sock: sock can optionally be specified in order to use a preexisting socket object. If specified, host and port should be omitted (must be None). :param backlog: is the maximum number of queued connections passed to listen(). :param ssl: can be set to an :py:class:`~python:ssl.SSLContext` to enable SSL over the accepted connections. 
:param reuse_address: tells the kernel to reuse a local socket in TIME_WAIT state, without waiting for its natural timeout to expire. :param reuse_port: tells the kernel to allow this endpoint to be bound to the same port as other existing endpoints are bound to, so long as they all set this flag when being created. """ if path is not None and (host is not None or port is not None): raise ValueError("The 'path' parameter can not be used with the " "'host' or 'port' parameters.") if self._server is not None: raise RuntimeError('Server is already started') if path is not None: self._server = await self._loop.create_unix_server( self._protocol_factory, path, sock=sock, backlog=backlog, ssl=ssl ) else: # FIXME: Not all union combinations were tried because there are # too many unions self._server = await self._loop.create_server( # type: ignore self._protocol_factory, host, port, # type: ignore family=family, flags=flags, sock=sock, # type: ignore backlog=backlog, ssl=ssl, reuse_address=reuse_address, reuse_port=reuse_port ) self._server_closed_fut = self._loop.create_future() def close(self) -> None: """Stops accepting new connections, cancels all currently running requests. Request handlers are able to handle `CancelledError` and exit properly. """ if self._server is None or self._server_closed_fut is None: raise RuntimeError('Server is not started') self._server.close() if not self._server_closed_fut.done(): self._server_closed_fut.set_result(None) for handler in self._handlers: handler.close() async def wait_closed(self) -> None: """Coroutine to wait until all existing request handlers will exit properly. """ if self._server is None or self._server_closed_fut is None: raise RuntimeError('Server is not started') await self._server_closed_fut await self._server.wait_closed() if self._handlers: await asyncio.wait({ self._loop.create_task(h.wait_closed()) for h in self._handlers }) async def __aenter__(self) -> 'Server': return self async def __aexit__( self, exc_type: Optional[Type[BaseException]], exc_val: Optional[BaseException], exc_tb: Optional[TracebackType], ) -> None: self.close() await self.wait_closed() ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/stream.py0000644000175100001770000000334100000000000017652 0ustar00runnerdocker00000000000000import abc import struct from typing import Type, TypeVar, Optional, AsyncIterator, TYPE_CHECKING, cast if TYPE_CHECKING: from .protocol import Stream from .encoding.base import CodecBase _SendType = TypeVar('_SendType') _RecvType = TypeVar('_RecvType') async def recv_message( stream: 'Stream', codec: 'CodecBase', message_type: Type[_RecvType], ) -> Optional[_RecvType]: meta = await stream.recv_data(5) if not meta: return None compressed_flag = struct.unpack('?', meta[:1])[0] if compressed_flag: raise NotImplementedError('Compression not implemented') message_len = struct.unpack('>I', meta[1:])[0] message_bin = await stream.recv_data(message_len) assert len(message_bin) == message_len, \ '{} != {}'.format(len(message_bin), message_len) message = codec.decode(message_bin, message_type) return cast(_RecvType, message) async def send_message( stream: 'Stream', codec: 'CodecBase', message: _SendType, message_type: Type[_SendType], *, end: bool = False, ) -> None: reply_bin = codec.encode(message, message_type) reply_data = (struct.pack('?', False) + struct.pack('>I', len(reply_bin)) + reply_bin) await stream.send_data(reply_data, end_stream=end) class 
StreamIterator(AsyncIterator[_RecvType], metaclass=abc.ABCMeta): @abc.abstractmethod async def recv_message(self) -> Optional[_RecvType]: pass def __aiter__(self) -> AsyncIterator[_RecvType]: return self async def __anext__(self) -> _RecvType: message = await self.recv_message() if message is None: raise StopAsyncIteration() else: return message ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/testing.py0000644000175100001770000000767000000000000020045 0ustar00runnerdocker00000000000000import asyncio from types import TracebackType from typing import TYPE_CHECKING, Collection, Optional, Type from .client import Channel from .server import Server from .protocol import H2Protocol from .encoding.base import CodecBase, StatusDetailsCodecBase if TYPE_CHECKING: from ._typing import IServable # noqa class _Server(asyncio.AbstractServer): def get_loop(self) -> asyncio.AbstractEventLoop: raise NotImplementedError def is_serving(self) -> bool: raise NotImplementedError async def start_serving(self) -> None: raise NotImplementedError async def serve_forever(self) -> None: raise NotImplementedError def close(self) -> None: pass async def wait_closed(self) -> None: pass class _InMemoryTransport(asyncio.Transport): def __init__( self, protocol: H2Protocol, ) -> None: super().__init__() self._loop = asyncio.get_event_loop() self._protocol = protocol def _write_soon(self, data: bytes) -> None: if not self._protocol.connection.is_closing(): self._protocol.data_received(data) def write(self, data: bytes) -> None: if data: self._loop.call_soon(self._write_soon, data) def is_closing(self) -> bool: return False def close(self) -> None: pass class ChannelFor: """Manages specially initialised :py:class:`~grpclib.client.Channel` with an in-memory transport to a :py:class:`~grpclib.server.Server` Example: .. code-block:: python3 class Greeter(GreeterBase): ... greeter = Greeter() async with ChannelFor([greeter]) as channel: stub = GreeterStub(channel) response = await stub.SayHello(HelloRequest(name='Dr. Strange')) assert response.message == 'Hello, Dr. Strange!' 
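        The same pattern works inside an async test function (a sketch;
        this project's own test suite runs pytest with
        ``asyncio_mode = auto``, see setup.cfg):

        .. code-block:: python3

            async def test_say_hello():
                async with ChannelFor([Greeter()]) as channel:
                    stub = GreeterStub(channel)
                    reply = await stub.SayHello(HelloRequest(name='Dr. Strange'))
                    assert reply.message == 'Hello, Dr. Strange!'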
""" def __init__( self, services: Collection['IServable'], codec: Optional[CodecBase] = None, status_details_codec: Optional[StatusDetailsCodecBase] = None, ) -> None: """ :param services: list of services you want to test :param codec: instance of a codec to encode and decode messages, if omitted ``ProtoCodec`` is used by default :param status_details_codec: instance of a status details codec to encode and decode error details in a trailing metadata, if omitted ``ProtoStatusDetailsCodec`` is used by default """ self._services = services self._codec = codec self._status_details_codec = status_details_codec async def __aenter__(self) -> Channel: """ :return: :py:class:`~grpclib.client.Channel` """ self._server = Server( self._services, codec=self._codec, status_details_codec=self._status_details_codec, ) self._server._server = _Server() self._server._server_closed_fut = self._server._loop.create_future() self._server_protocol = self._server._protocol_factory() self._channel = Channel( codec=self._codec, status_details_codec=self._status_details_codec, ) self._channel._protocol = self._channel._protocol_factory() self._channel._protocol.connection_made( _InMemoryTransport(self._server_protocol) ) self._server_protocol.connection_made( _InMemoryTransport(self._channel._protocol) ) return self._channel async def __aexit__( self, exc_type: Optional[Type[BaseException]], exc_val: Optional[BaseException], exc_tb: Optional[TracebackType], ) -> None: assert self._channel._protocol is not None self._channel._protocol.connection_lost(None) self._channel.close() self._server_protocol.connection_lost(None) self._server.close() await self._server.wait_closed() ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/grpclib/utils.py0000644000175100001770000001441700000000000017525 0ustar00runnerdocker00000000000000import sys import signal import asyncio import warnings from types import TracebackType from typing import TYPE_CHECKING, Optional, Set, Type, ContextManager, List from typing import Iterator, Collection, Callable, Any, cast from functools import wraps from contextlib import contextmanager if sys.version_info > (3, 7): _current_task = asyncio.current_task else: _current_task = asyncio.Task.current_task if TYPE_CHECKING: from .metadata import Deadline # noqa from ._typing import IServable, IClosable # noqa class Wrapper(ContextManager[None]): """Special wrapper for coroutines to wake them up in case of some error. Example: .. 
code-block:: python3 w = Wrapper() async def blocking_call(): with w: await asyncio.sleep(10) # and somewhere else: w.cancel(NoNeedToWaitError('With explanation')) """ _error: Optional[Exception] = None cancelled: Optional[bool] = None cancel_failed: Optional[bool] = None def __init__(self) -> None: self._tasks: Set['asyncio.Task[Any]'] = set() def __enter__(self) -> None: if self._error is not None: raise self._error task = _current_task() if task is None: raise RuntimeError('Called not inside a task') self._tasks.add(task) def __exit__( self, exc_type: Optional[Type[BaseException]], exc_val: Optional[BaseException], exc_tb: Optional[TracebackType], ) -> None: task = _current_task() assert task self._tasks.discard(task) if self._error is not None: self.cancel_failed = exc_type is not asyncio.CancelledError raise self._error def cancel(self, error: Exception) -> None: self._error = error for task in self._tasks: task.cancel() self.cancelled = True class DeadlineWrapper(Wrapper): """Deadline wrapper to specify deadline once for any number of awaiting method calls. Example: .. code-block:: python3 dw = DeadlineWrapper() with dw.start(deadline): await handle_request() # somewhere during request handling: async def blocking_call(): with dw: await asyncio.sleep(10) """ @contextmanager def start(self, deadline: 'Deadline') -> Iterator[None]: timeout = deadline.time_remaining() if not timeout: raise asyncio.TimeoutError('Deadline exceeded') def callback() -> None: self.cancel(asyncio.TimeoutError('Deadline exceeded')) loop = asyncio.get_event_loop() timer = loop.call_later(timeout, callback) try: yield finally: timer.cancel() def _service_name(service: 'IServable') -> str: methods = service.__mapping__() method_name = next(iter(methods), None) assert method_name is not None _, service_name, _ = method_name.split('/') return service_name def _first_stage( sig_num: 'signal.Signals', servers: Collection['IClosable'], ) -> None: fail = False for server in servers: try: server.close() except RuntimeError: # probably server wasn't started yet fail = True if fail: # using second stage in case of error will ensure that non-closed # server wont start later _second_stage(sig_num) def _second_stage(sig_num: 'signal.Signals') -> None: raise SystemExit(128 + sig_num) def _exit_handler( sig_num: int, servers: Collection['IClosable'], flag: List[bool], ) -> None: if flag: _second_stage(cast('signal.Signals', sig_num)) else: _first_stage(cast('signal.Signals', sig_num), servers) flag.append(True) @contextmanager def graceful_exit( servers: Collection['IClosable'], *, loop: Optional[asyncio.AbstractEventLoop] = None, signals: Collection[int] = (signal.SIGINT, signal.SIGTERM), ) -> Iterator[None]: """Utility context-manager to help properly shutdown server in response to the OS signals By default this context-manager handles ``SIGINT`` and ``SIGTERM`` signals. There are two stages: 1. first received signal closes servers 2. subsequent signals raise ``SystemExit`` exception Example: .. code-block:: python3 async def main(...): ... with graceful_exit([server]): await server.start(host, port) print('Serving on {}:{}'.format(host, port)) await server.wait_closed() print('Server closed') First stage calls ``server.close()`` and ``await server.wait_closed()`` should complete successfully without errors. If server wasn't started yet, second stage runs to prevent server start. Second stage raises ``SystemExit`` exception, but you will receive ``asyncio.CancelledError`` in your ``async def main()`` coroutine. 
You can use ``try..finally`` constructs and context-managers to properly handle this error. This context-manager is designed to work in cooperation with :py:func:`python:asyncio.run` function: .. code-block:: python3 if __name__ == '__main__': asyncio.run(main()) :param servers: list of servers :param loop: (deprecated) asyncio-compatible event loop :param signals: set of the OS signals to handle .. note:: Not supported in Windows """ if loop: warnings.warn("The loop argument is deprecated and scheduled " "for removal in grpclib 0.5", DeprecationWarning, stacklevel=2) loop = loop or asyncio.get_event_loop() signals = set(signals) flag: 'List[bool]' = [] for sig_num in signals: loop.add_signal_handler(sig_num, _exit_handler, sig_num, servers, flag) try: yield finally: for sig_num in signals: loop.remove_signal_handler(sig_num) def _cached(func: Callable[[], Any]) -> Callable[[], Any]: @wraps(func) def wrapper() -> Any: try: return func.__result__ # type: ignore except AttributeError: func.__result__ = func() # type: ignore return func.__result__ # type: ignore return wrapper
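# Usage sketch for ``_cached`` above (the ``_load_defaults`` name is
# hypothetical): the zero-argument function body runs once, and the result
# is memoized on the function object itself via ``__result__``.
#
#     @_cached
#     def _load_defaults() -> dict:
#         return {'timeout': 60}
#
#     assert _load_defaults() is _load_defaults()  # same cached object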
././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851946.0 grpclib-0.4.8rc2/grpclib.egg-info/SOURCES.txt0000644000175100001770000000271700000000000021367 0ustar00runnerdocker00000000000000LICENSE.txt README.rst pyproject.toml setup.cfg setup.py grpclib/__init__.py grpclib/_registry.py grpclib/_typing.py grpclib/client.py grpclib/config.py grpclib/const.py grpclib/events.py grpclib/exceptions.py grpclib/metadata.py grpclib/protocol.py grpclib/py.typed grpclib/server.py grpclib/stream.py grpclib/testing.py grpclib/utils.py grpclib.egg-info/PKG-INFO grpclib.egg-info/SOURCES.txt grpclib.egg-info/dependency_links.txt grpclib.egg-info/entry_points.txt grpclib.egg-info/requires.txt grpclib.egg-info/top_level.txt grpclib/channelz/__init__.py grpclib/channelz/service.py grpclib/channelz/v1/__init__.py grpclib/channelz/v1/channelz_grpc.py grpclib/channelz/v1/channelz_pb2.py grpclib/channelz/v1/channelz_pb2.pyi grpclib/encoding/__init__.py grpclib/encoding/base.py grpclib/encoding/proto.py grpclib/health/__init__.py grpclib/health/check.py grpclib/health/service.py grpclib/health/v1/__init__.py grpclib/health/v1/health_grpc.py grpclib/health/v1/health_pb2.py grpclib/health/v1/health_pb2.pyi grpclib/plugin/__init__.py grpclib/plugin/main.py grpclib/reflection/__init__.py grpclib/reflection/_deprecated.py grpclib/reflection/service.py grpclib/reflection/v1/__init__.py grpclib/reflection/v1/reflection_grpc.py grpclib/reflection/v1/reflection_pb2.py grpclib/reflection/v1/reflection_pb2.pyi grpclib/reflection/v1alpha/__init__.py grpclib/reflection/v1alpha/reflection_grpc.py grpclib/reflection/v1alpha/reflection_pb2.py grpclib/reflection/v1alpha/reflection_pb2.pyi././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851946.0 grpclib-0.4.8rc2/grpclib.egg-info/dependency_links.txt0000644000175100001770000000000100000000000023544 0ustar00runnerdocker00000000000000 ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851946.0 grpclib-0.4.8rc2/grpclib.egg-info/entry_points.txt0000644000175100001770000000017200000000000022774 0ustar00runnerdocker00000000000000[console_scripts] protoc-gen-grpclib_python = grpclib.plugin.main:main protoc-gen-python_grpc = grpclib.plugin.main:main ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851946.0 grpclib-0.4.8rc2/grpclib.egg-info/requires.txt0000644000175100001770000000006400000000000022076 0ustar00runnerdocker00000000000000h2<5,>=3.1.0 multidict [protobuf] protobuf>=3.20.0 ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022
mtime=1721851946.0 grpclib-0.4.8rc2/grpclib.egg-info/top_level.txt0000644000175100001770000000001000000000000022217 0ustar00runnerdocker00000000000000grpclib ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/pyproject.toml0000644000175100001770000000014400000000000017275 0ustar00runnerdocker00000000000000[build-system] requires = ["setuptools >= 40.6.0", "wheel"] build-backend = "setuptools.build_meta" ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1721851946.6602335 grpclib-0.4.8rc2/setup.cfg0000644000175100001770000000445300000000000016211 0ustar00runnerdocker00000000000000[metadata] name = grpclib version = attr: grpclib.__version__ license = BSD-3-Clause license_file = LICENSE.txt description = Pure-Python gRPC implementation for asyncio long_description = file: README.rst long_description_content_type = text/x-rst author = Vladimir Magamedov author_email = vladimir@magamedov.com url = https://github.com/vmagamedov/grpclib classifiers = Development Status :: 5 - Production/Stable Intended Audience :: Developers License :: OSI Approved :: BSD License Operating System :: OS Independent Programming Language :: Python Programming Language :: Python :: 3.8 Programming Language :: Python :: 3.9 Programming Language :: Python :: 3.10 Programming Language :: Python :: 3.11 Programming Language :: Python :: 3.12 Programming Language :: Python :: 3 :: Only Topic :: Internet :: WWW/HTTP :: HTTP Servers Topic :: Software Development :: Libraries :: Python Modules [options] packages = find: python_requires = >=3.8 install_requires = h2<5,>=3.1.0 multidict [options.extras_require] protobuf = protobuf>=3.20.0 [options.entry_points] console_scripts = protoc-gen-python_grpc=grpclib.plugin.main:main protoc-gen-grpclib_python=grpclib.plugin.main:main [options.package_data] * = *.pyi grpclib = py.typed [tool:pytest] addopts = -q --tb=native testpaths = tests asyncio_mode = auto filterwarnings = error ignore:.*pkg_resources.*:DeprecationWarning ignore:.*google.*:DeprecationWarning ignore:.*utcfromtimestamp.*:DeprecationWarning ignore::ResourceWarning [coverage:run] source = grpclib omit = grpclib/plugin/main.py *_pb2.py *_grpc.py [coverage:report] skip_covered = true sort = miss [flake8] exclude = .git,.tox,env,*_pb2.py,*_grpc.py max_line_length = 80 [mypy] follow_imports = silent warn_unused_configs = true disallow_subclassing_any = true disallow_any_generics = true disallow_untyped_calls = true disallow_untyped_defs = true disallow_incomplete_defs = true check_untyped_defs = true disallow_untyped_decorators = true no_implicit_optional = true warn_redundant_casts = true warn_unused_ignores = true warn_return_any = true [mypy-helloworld.helloworld_pb2_grpc] ignore_errors = true [mypy-_reference.*] ignore_errors = true [mypy-h2.*] ignore_missing_imports = true [mypy-google.rpc.*] ignore_missing_imports = true [egg_info] tag_build = tag_date = 0 ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1721851930.0 grpclib-0.4.8rc2/setup.py0000644000175100001770000000013000000000000016066 0ustar00runnerdocker00000000000000import setuptools setuptools.setup( name="grpclib", python_requires='>=3.8', )
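# Building distributions from this source tree (a sketch; pyproject.toml
# declares the setuptools build backend, and the third-party ``build``
# frontend is installed separately):
#
#     $ pip3 install build
#     $ python3 -m build    # writes the sdist and wheel into ./dist/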