---- sigstore_protobuf_specs-0.3.5/LICENSE ----

Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. Definitions.

"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.

"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.

"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.

"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.

"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.

"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.

"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.

"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."

"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.

2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.

3. Grant of Patent License.
Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.

4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:

(a) You must give any other recipients of the Work or Derivative Works a copy of this License; and

(b) You must cause any modified files to carry prominent notices stating that You changed the files; and

(c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and

(d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the
Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License.

You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.

5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.

6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.

7. Disclaimer of Warranty.
Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.

8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.

9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.

END OF TERMS AND CONDITIONS

APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "[]" replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives.

Copyright [yyyy] [name of copyright owner]

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

---- sigstore_protobuf_specs-0.3.5/README.md ----

sigstore-protobuf-specs
=======================

These are the Python language bindings for Sigstore's protobuf specs. See the [repository's README](https://github.com/sigstore/protobuf-specs) for more information.
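The generated bindings below are betterproto/pydantic dataclasses that follow protobuf's canonical JSON mapping, in which `bytes` fields are base64-encoded and enum fields serialize by symbolic name. As a standard-library-only sketch of the JSON shape a `dev.sigstore.common.v1.HashOutput` message takes on the wire (the artifact bytes are invented for illustration, and this does not exercise the package itself):

```python
import base64
import hashlib
import json

# Hypothetical artifact to identify.
artifact = b"hello, sigstore"
digest = hashlib.sha256(artifact).digest()

# JSON form of a HashOutput message: the enum field serializes by name,
# the bytes field is base64-encoded, per protobuf's JSON mapping.
hash_output = {
    "algorithm": "SHA2_256",
    "digest": base64.b64encode(digest).decode("ascii"),
}
print(json.dumps(hash_output))
```

The same mapping applies to every message in this package, which is why the bundle media types below all end in `+json`.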
---- sigstore_protobuf_specs-0.3.5/pyproject.toml ----

[build-system]
requires = ["flit_core >=3.2,<4"]
build-backend = "flit_core.buildapi"

[project]
name = "sigstore-protobuf-specs"
version = "0.3.5"
description = "A library for serializing and deserializing Sigstore messages"
readme = "README.md"
license = { file = "LICENSE" }
authors = [
    { name = "Sigstore Authors", email = "sigstore-dev@googlegroups.com" },
]
classifiers = [
    "License :: OSI Approved :: Apache Software License",
    "Programming Language :: Python :: 3 :: Only",
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3.7",
    "Programming Language :: Python :: 3.8",
    "Programming Language :: Python :: 3.9",
    "Programming Language :: Python :: 3.10",
    "Programming Language :: Python :: 3.11",
    "Development Status :: 4 - Beta",
    "Intended Audience :: Developers",
    "Topic :: Security",
    "Topic :: Security :: Cryptography",
]
dependencies = ["betterproto==2.0.0b7", "pydantic >= 2, < 3"]
requires-python = ">=3.8"

[project.urls]
Homepage = "https://pypi.org/project/sigstore-protobuf-specs/"
Issues = "https://github.com/sigstore/protobuf-specs/issues"
Source = "https://github.com/sigstore/protobuf-specs"

[project.optional-dependencies]
dev = ["build"]

---- sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/__init__.py (empty) ----

---- sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/dev/__init__.py (empty) ----
---- sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/dev/sigstore/__init__.py (empty) ----

---- sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/dev/sigstore/bundle/__init__.py (empty) ----

---- sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/dev/sigstore/bundle/v1/__init__.py ----

# Generated by the protocol buffer compiler. DO NOT EDIT!
# sources: sigstore_bundle.proto
# plugin: python-betterproto
# This file has been @generated

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from dataclasses import dataclass
else:
    from pydantic.dataclasses import dataclass

from typing import (
    List,
    Optional,
)

import betterproto
from pydantic import model_validator
from pydantic.dataclasses import rebuild_dataclass

from .....io import intoto as ____io_intoto__
from ...common import v1 as __common_v1__
from ...rekor import v1 as __rekor_v1__


@dataclass(eq=False, repr=False)
class TimestampVerificationData(betterproto.Message):
    """
    Various timestamped counter signatures over the artifact's signature.
    Currently only RFC 3161 signatures are provided. More formats may be
    added in the future.
    """

    rfc3161_timestamps: List["__common_v1__.Rfc3161SignedTimestamp"] = (
        betterproto.message_field(1)
    )
    """
    A list of RFC 3161 signed timestamps provided by the user. This can be
    used when the entry has not been stored on a transparency log, or in
    conjunction for a stronger trust model. Clients MUST verify the hashed
    message in the message imprint against the signature in the bundle.
    """


@dataclass(eq=False, repr=False)
class VerificationMaterial(betterproto.Message):
    """
    VerificationMaterial captures details on the materials used to verify
    signatures.

    This message may be embedded in a DSSE envelope as a signature
    extension. Specifically, the `ext` field of the extension will expect
    this message when the signature extension is for Sigstore. This is
    identified by the `kind` field in the extension, which must be set to
    application/vnd.dev.sigstore.verificationmaterial;version=0.1 for
    Sigstore. When used as a DSSE extension, if the `public_key` field is
    used to indicate the key identifier, it MUST match the `keyid` field of
    the signature the extension is attached to.
    """

    public_key: Optional["__common_v1__.PublicKeyIdentifier"] = (
        betterproto.message_field(1, optional=True, group="content")
    )
    x509_certificate_chain: Optional["__common_v1__.X509CertificateChain"] = (
        betterproto.message_field(2, optional=True, group="content")
    )
    certificate: Optional["__common_v1__.X509Certificate"] = betterproto.message_field(
        5, optional=True, group="content"
    )
    tlog_entries: List["__rekor_v1__.TransparencyLogEntry"] = betterproto.message_field(
        3
    )
    """
    An inclusion proof and an optional signed timestamp from the log.
    Client verification libraries MAY provide an option to support v0.1
    bundles for backwards compatibility, which may contain an inclusion
    promise and not an inclusion proof. In this case, the client MUST
    validate the promise. Verifiers SHOULD NOT allow v0.1 bundles if
    they're used in an ecosystem which never produced them.
    """

    timestamp_verification_data: "TimestampVerificationData" = (
        betterproto.message_field(4)
    )
    """
    Timestamp may also come from
    tlog_entries.inclusion_promise.signed_entry_timestamp.
    """

    @model_validator(mode="after")
    def check_oneof(cls, values):
        return cls._validate_field_groups(values)


@dataclass(eq=False, repr=False)
class Bundle(betterproto.Message):
    media_type: str = betterproto.string_field(1)
    """
    MUST be application/vnd.dev.sigstore.bundle.v0.3+json when encoded as
    JSON. Clients MUST be able to accept media types using the previously
    defined formats:

    * application/vnd.dev.sigstore.bundle+json;version=0.1
    * application/vnd.dev.sigstore.bundle+json;version=0.2
    * application/vnd.dev.sigstore.bundle+json;version=0.3
    """

    verification_material: "VerificationMaterial" = betterproto.message_field(2)
    """
    When a signer is identified by an X.509 certificate, a verifier MUST
    verify that the signature was computed at the time the certificate was
    valid, as described in the Sigstore client spec: "Verification using a
    Bundle". If the verification material contains a public key identifier
    (key hint) and the `content` is a DSSE envelope, the key hints MUST be
    exactly the same in the verification material and in the DSSE envelope.
    """

    message_signature: Optional["__common_v1__.MessageSignature"] = (
        betterproto.message_field(3, optional=True, group="content")
    )
    dsse_envelope: Optional["____io_intoto__.Envelope"] = betterproto.message_field(
        4, optional=True, group="content"
    )
    """
    A DSSE envelope can contain arbitrary payloads. Verifiers must verify
    that the payload type is a supported and expected type. This is part of
    the DSSE protocol, which is defined here:
    DSSE envelopes in a bundle MUST have exactly one signature. This is a
    limitation from the DSSE spec, as it can contain multiple signatures.
    There are two primary reasons:
    1. It simplifies the verification logic and policy
    2. The bundle (currently) can only contain a single instance of the
       required verification materials
    During verification a client MUST reject an envelope if the number of
    signatures is not equal to one.
    """

    @model_validator(mode="after")
    def check_oneof(cls, values):
        return cls._validate_field_groups(values)


rebuild_dataclass(TimestampVerificationData)  # type: ignore
rebuild_dataclass(VerificationMaterial)  # type: ignore
rebuild_dataclass(Bundle)  # type: ignore

---- sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/dev/sigstore/common/__init__.py (empty) ----

---- sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/dev/sigstore/common/v1/__init__.py ----

# Generated by the protocol buffer compiler. DO NOT EDIT!
# sources: sigstore_common.proto
# plugin: python-betterproto
# This file has been @generated

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from dataclasses import dataclass
else:
    from pydantic.dataclasses import dataclass

from datetime import datetime
from typing import (
    List,
    Optional,
)

import betterproto
from pydantic import model_validator
from pydantic.dataclasses import rebuild_dataclass


class HashAlgorithm(betterproto.Enum):
    """
    Only a subset of the secure hash standard algorithms are supported. See
    for more details. UNSPECIFIED SHOULD NOT be used; the primary reason for
    its inclusion is to force any proto JSON serialization to emit the used
    hash algorithm, as the default option is to *omit* the default value of
    an enum (which is the first value, represented by '0').
    """

    UNSPECIFIED = 0
    SHA2_256 = 1
    SHA2_384 = 2
    SHA2_512 = 3
    SHA3_256 = 4
    SHA3_384 = 5

    @classmethod
    def __get_pydantic_core_schema__(cls, _source_type, _handler):
        from pydantic_core import core_schema

        return core_schema.int_schema(ge=0)


class PublicKeyDetails(betterproto.Enum):
    """
    Details of a specific public key, capturing the key encoding method and
    signature algorithm.
    PublicKeyDetails captures the public key/hash algorithm combinations
    recommended in the Sigstore ecosystem. This is modelled as a linear set
    as we want to provide a small number of opinionated options instead of
    allowing every possible permutation. Any changes to this enum MUST be
    reflected in the algorithm registry. See: docs/algorithm-registry.md
    To avoid the possibility of contradicting formats such as PKCS1 with
    ED25519 the valid permutations are listed as a linear set instead of a
    cartesian set (i.e. one combined variable instead of two, one for
    encoding and one for the signature algorithm).
    """

    UNSPECIFIED = 0
    PKCS1_RSA_PKCS1V5 = 1
    """RSA"""
    PKCS1_RSA_PSS = 2
    PKIX_RSA_PKCS1V5 = 3
    PKIX_RSA_PSS = 4
    PKIX_RSA_PKCS1V15_2048_SHA256 = 9
    """RSA public key in PKIX format, PKCS#1v1.5 signature"""
    PKIX_RSA_PKCS1V15_3072_SHA256 = 10
    PKIX_RSA_PKCS1V15_4096_SHA256 = 11
    PKIX_RSA_PSS_2048_SHA256 = 16
    """RSA public key in PKIX format, RSASSA-PSS signature"""
    PKIX_RSA_PSS_3072_SHA256 = 17
    PKIX_RSA_PSS_4096_SHA256 = 18
    PKIX_ECDSA_P256_HMAC_SHA_256 = 6
    """ECDSA"""
    PKIX_ECDSA_P256_SHA_256 = 5
    PKIX_ECDSA_P384_SHA_384 = 12
    PKIX_ECDSA_P521_SHA_512 = 13
    PKIX_ED25519 = 7
    """Ed25519"""
    PKIX_ED25519_PH = 8
    LMS_SHA256 = 14
    """
    LMS and LM-OTS
    These keys and signatures may be used by private Sigstore deployments,
    but are not currently supported by the public good instance.
    USER WARNING: LMS and LM-OTS are both stateful signature schemes. Using
    them correctly requires discretion and careful consideration to ensure
    that individual secret keys are not used more than once. In addition,
    LM-OTS is a single-use scheme, meaning that it MUST NOT be used for
    more than one signature per LM-OTS key. If you cannot maintain these
    invariants, you MUST NOT use these schemes.
    """

    LMOTS_SHA256 = 15

    @classmethod
    def __get_pydantic_core_schema__(cls, _source_type, _handler):
        from pydantic_core import core_schema

        return core_schema.int_schema(ge=0)


class SubjectAlternativeNameType(betterproto.Enum):
    UNSPECIFIED = 0
    EMAIL = 1
    URI = 2
    OTHER_NAME = 3
    """
    OID 1.3.6.1.4.1.57264.1.7
    See
    https://github.com/sigstore/fulcio/blob/main/docs/oid-info.md#1361415726417--othername-san
    for more details.
    """

    @classmethod
    def __get_pydantic_core_schema__(cls, _source_type, _handler):
        from pydantic_core import core_schema

        return core_schema.int_schema(ge=0)


@dataclass(eq=False, repr=False)
class HashOutput(betterproto.Message):
    """
    HashOutput captures a digest of a 'message' (generic octet sequence)
    and the corresponding hash algorithm used.
    """

    algorithm: "HashAlgorithm" = betterproto.enum_field(1)
    digest: bytes = betterproto.bytes_field(2)
    """
    This is the raw octets of the message digest as computed by the hash
    algorithm.
    """


@dataclass(eq=False, repr=False)
class MessageSignature(betterproto.Message):
    """MessageSignature stores the computed signature over a message."""

    message_digest: "HashOutput" = betterproto.message_field(1)
    """
    Message digest can be used to identify the artifact. Clients MUST NOT
    attempt to use this digest to verify the associated signature; it is
    intended solely for identification.
    """

    signature: bytes = betterproto.bytes_field(2)
    """
    The raw bytes as returned from the signature algorithm. The signature
    algorithm (and so the format of the signature bytes) are determined by
    the contents of the 'verification_material', either a key-pair or a
    certificate. If using a certificate, the certificate contains the
    required information on the signature algorithm. When using a key
    pair, the algorithm MUST be part of the public key, which MUST be
    communicated out-of-band.
    """


@dataclass(eq=False, repr=False)
class LogId(betterproto.Message):
    """LogId captures the identity of a transparency log."""

    key_id: bytes = betterproto.bytes_field(1)
    """The unique identity of the log, represented by its public key."""


@dataclass(eq=False, repr=False)
class Rfc3161SignedTimestamp(betterproto.Message):
    """This message holds an RFC 3161 timestamp."""

    signed_timestamp: bytes = betterproto.bytes_field(1)
    """
    Signed timestamp is the DER encoded TimeStampResponse. See
    https://www.rfc-editor.org/rfc/rfc3161.html#section-2.4.2
    """


@dataclass(eq=False, repr=False)
class PublicKey(betterproto.Message):
    raw_bytes: Optional[bytes] = betterproto.bytes_field(1, optional=True)
    """
    DER-encoded public key; the encoding method is specified by the
    key_details attribute.
    """

    key_details: "PublicKeyDetails" = betterproto.enum_field(2)
    """Key encoding and signature algorithm to use for this key."""

    valid_for: Optional["TimeRange"] = betterproto.message_field(3, optional=True)
    """Optional validity period for this key, *inclusive* of the endpoints."""


@dataclass(eq=False, repr=False)
class PublicKeyIdentifier(betterproto.Message):
    """
    PublicKeyIdentifier can be used to identify an (out of band) delivered
    key, to verify a signature.
    """

    hint: str = betterproto.string_field(1)
    """
    Optional unauthenticated hint on which key to use. The format of the
    hint must be agreed upon out of band by the signer and the verifiers,
    and so is not subject to this specification. Example use-case is to
    specify the public key to use, from a trusted key-ring. Implementors
    are RECOMMENDED to derive the value from the public key as described
    in RFC 6962. See:
    """


@dataclass(eq=False, repr=False)
class ObjectIdentifier(betterproto.Message):
    """An ASN.1 OBJECT IDENTIFIER"""

    id: List[int] = betterproto.int32_field(1)


@dataclass(eq=False, repr=False)
class ObjectIdentifierValuePair(betterproto.Message):
    """An OID and the corresponding (byte) value."""

    oid: "ObjectIdentifier" = betterproto.message_field(1)
    value: bytes = betterproto.bytes_field(2)


@dataclass(eq=False, repr=False)
class DistinguishedName(betterproto.Message):
    organization: str = betterproto.string_field(1)
    common_name: str = betterproto.string_field(2)


@dataclass(eq=False, repr=False)
class X509Certificate(betterproto.Message):
    raw_bytes: bytes = betterproto.bytes_field(1)
    """DER-encoded X.509 certificate."""


@dataclass(eq=False, repr=False)
class SubjectAlternativeName(betterproto.Message):
    type: "SubjectAlternativeNameType" = betterproto.enum_field(1)
    regexp: Optional[str] = betterproto.string_field(2, optional=True, group="identity")
    """A regular expression describing the expected value for the SAN."""

    value: Optional[str] = betterproto.string_field(3, optional=True, group="identity")
    """The exact value to match against."""

    @model_validator(mode="after")
    def check_oneof(cls, values):
        return cls._validate_field_groups(values)


@dataclass(eq=False, repr=False)
class X509CertificateChain(betterproto.Message):
    """
    A collection of X.509 certificates. This "chain" can be used in
    multiple contexts, such as providing a root CA certificate within a
    TUF root of trust or multiple untrusted certificates for the purpose
    of chain building.
    """

    certificates: List["X509Certificate"] = betterproto.message_field(1)
    """
    One or more DER-encoded certificates. In some contexts (such as
    `VerificationMaterial.x509_certificate_chain`), this sequence has an
    imposed order. Unless explicitly specified, there is otherwise no
    guaranteed order.
    """


@dataclass(eq=False, repr=False)
class TimeRange(betterproto.Message):
    """
    The time range is closed and includes both the start and end times
    (i.e., [start, end]). End is optional, to be able to capture a period
    that has started but has no known end.
    """

    start: datetime = betterproto.message_field(1)
    end: Optional[datetime] = betterproto.message_field(2, optional=True)


rebuild_dataclass(HashOutput)  # type: ignore
rebuild_dataclass(MessageSignature)  # type: ignore
rebuild_dataclass(PublicKey)  # type: ignore
rebuild_dataclass(ObjectIdentifierValuePair)  # type: ignore
rebuild_dataclass(SubjectAlternativeName)  # type: ignore
rebuild_dataclass(X509CertificateChain)  # type: ignore
rebuild_dataclass(TimeRange)  # type: ignore

---- sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/dev/sigstore/events/__init__.py (empty) ----

---- sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/dev/sigstore/events/v1/__init__.py ----

# Generated by the protocol buffer compiler. DO NOT EDIT!
# sources: events.proto # plugin: python-betterproto # This file has been @generated from typing import TYPE_CHECKING if TYPE_CHECKING: from dataclasses import dataclass else: from pydantic.dataclasses import dataclass from datetime import datetime from typing import ( Dict, List, Optional, ) import betterproto import betterproto.lib.pydantic.google.protobuf as betterproto_lib_pydantic_google_protobuf from pydantic import model_validator from pydantic.dataclasses import rebuild_dataclass @dataclass(eq=False, repr=False) class CloudEvent(betterproto.Message): id: str = betterproto.string_field(1) """Required Attributes""" source: str = betterproto.string_field(2) spec_version: str = betterproto.string_field(3) type: str = betterproto.string_field(4) attributes: Dict[str, "CloudEventCloudEventAttributeValue"] = betterproto.map_field( 5, betterproto.TYPE_STRING, betterproto.TYPE_MESSAGE ) """Optional & Extension Attributes""" binary_data: Optional[bytes] = betterproto.bytes_field( 6, optional=True, group="data" ) text_data: Optional[str] = betterproto.string_field(7, optional=True, group="data") proto_data: Optional["betterproto_lib_pydantic_google_protobuf.Any"] = ( betterproto.message_field(8, optional=True, group="data") ) @model_validator(mode="after") def check_oneof(cls, values): return cls._validate_field_groups(values) @dataclass(eq=False, repr=False) class CloudEventCloudEventAttributeValue(betterproto.Message): ce_boolean: Optional[bool] = betterproto.bool_field(1, optional=True, group="attr") ce_integer: Optional[int] = betterproto.int32_field(2, optional=True, group="attr") ce_string: Optional[str] = betterproto.string_field(3, optional=True, group="attr") ce_bytes: Optional[bytes] = betterproto.bytes_field(4, optional=True, group="attr") ce_uri: Optional[str] = betterproto.string_field(5, optional=True, group="attr") ce_uri_ref: Optional[str] = betterproto.string_field(6, optional=True, group="attr") ce_timestamp: Optional[datetime] = 
betterproto.message_field( 7, optional=True, group="attr" ) @model_validator(mode="after") def check_oneof(cls, values): return cls._validate_field_groups(values) @dataclass(eq=False, repr=False) class CloudEventBatch(betterproto.Message): events: List["CloudEvent"] = betterproto.message_field(1) rebuild_dataclass(CloudEvent) # type: ignore rebuild_dataclass(CloudEventCloudEventAttributeValue) # type: ignore rebuild_dataclass(CloudEventBatch) # type: ignore ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1736545218.5444748 sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/dev/sigstore/rekor/__init__.py0000644000000000000000000000000014740311703026233 0ustar00././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1736545218.5444748 sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/dev/sigstore/rekor/v1/__init__.py0000644000000000000000000001603414740311703026577 0ustar00# Generated by the protocol buffer compiler. DO NOT EDIT! # sources: sigstore_rekor.proto # plugin: python-betterproto # This file has been @generated from typing import TYPE_CHECKING if TYPE_CHECKING: from dataclasses import dataclass else: from pydantic.dataclasses import dataclass from typing import List import betterproto from pydantic.dataclasses import rebuild_dataclass from ...common import v1 as __common_v1__ @dataclass(eq=False, repr=False) class KindVersion(betterproto.Message): """KindVersion contains the entry's kind and api version.""" kind: str = betterproto.string_field(1) """ Kind is the type of entry being stored in the log. See here for a list: https://github.com/sigstore/rekor/tree/main/pkg/types """ version: str = betterproto.string_field(2) """The specific api version of the type.""" @dataclass(eq=False, repr=False) class Checkpoint(betterproto.Message): """ The checkpoint MUST contain an origin string as a unique log identifier, the tree size, and the root hash. 
It MAY also be followed by optional data, and clients MUST NOT assume optional data. The checkpoint MUST also contain a signature over the root hash (tree head). The checkpoint MAY contain additional signatures, but the first SHOULD be the signature from the log. Checkpoint contents are concatenated with newlines into a single string. The checkpoint format is described in https://github.com/transparency-dev/formats/blob/main/log/README.md and https://github.com/C2SP/C2SP/blob/main/tlog-checkpoint.md. An example implementation can be found in https://github.com/sigstore/rekor/blob/main/pkg/util/signed_note.go """ envelope: str = betterproto.string_field(1) @dataclass(eq=False, repr=False) class InclusionProof(betterproto.Message): """ InclusionProof is the proof returned from the transparency log. Can be used for offline or online verification against the log. """ log_index: int = betterproto.int64_field(1) """The index of the entry in the tree it was written to.""" root_hash: bytes = betterproto.bytes_field(2) """ The hash digest stored at the root of the merkle tree at the time the proof was generated. """ tree_size: int = betterproto.int64_field(3) """The size of the merkle tree at the time the proof was generated.""" hashes: List[bytes] = betterproto.bytes_field(4) """ A list of hashes required to compute the inclusion proof, sorted in order from leaf to root. Note that leaf and root hashes are not included. The root hash is available separately in this message, and the leaf hash should be calculated by the client. """ checkpoint: "Checkpoint" = betterproto.message_field(5) """ Signature of the tree head, as of the time this proof was generated. See above info on 'Checkpoint' for more details. """ @dataclass(eq=False, repr=False) class InclusionPromise(betterproto.Message): """ The inclusion promise is calculated by Rekor.
It's calculated as a signature over a canonical JSON serialization of the persisted entry, the log ID, log index and the integration timestamp. See https://github.com/sigstore/rekor/blob/a6e58f72b6b18cc06cefe61808efd562b9726330/pkg/api/entries.go#L54 The format of the signature depends on the transparency log's public key. If the signature algorithm requires a hash function and/or a signature scheme (e.g. RSA) those have to be retrieved out-of-band from the log's operators, together with the public key. This is used to verify the integration timestamp's value and that the log has promised to include the entry. """ signed_entry_timestamp: bytes = betterproto.bytes_field(1) @dataclass(eq=False, repr=False) class TransparencyLogEntry(betterproto.Message): """ TransparencyLogEntry captures all the details required from Rekor to reconstruct an entry, given that the payload is provided via other means. This type can easily be created from the existing response from Rekor. Future iterations could rely on Rekor returning the minimal set of attributes (excluding the payload) that are required for verifying the inclusion promise. The inclusion promise (called SignedEntryTimestamp in the response from Rekor) is similar to a Signed Certificate Timestamp as described here https://www.rfc-editor.org/rfc/rfc6962.html#section-3.2. """ log_index: int = betterproto.int64_field(1) """The global index of the entry, used when querying the log by index.""" log_id: "__common_v1__.LogId" = betterproto.message_field(2) """The unique identifier of the log.""" kind_version: "KindVersion" = betterproto.message_field(3) """ The kind (type) and version of the object associated with this entry. These values are required to construct the entry during verification. """ integrated_time: int = betterproto.int64_field(4) """ The UNIX timestamp from the log when the entry was persisted. The integration time MUST NOT be trusted if inclusion_promise is omitted.
""" inclusion_promise: "InclusionPromise" = betterproto.message_field(5) """ The inclusion promise/signed entry timestamp from the log. Required for v0.1 bundles, and MUST be verified. Optional for >= v0.2 bundles if another suitable source of time is present (such as another source of signed time, or the current system time for long-lived certificates). MUST be verified if no other suitable source of time is present, and SHOULD be verified otherwise. """ inclusion_proof: "InclusionProof" = betterproto.message_field(6) """ The inclusion proof can be used for offline or online verification that the entry was appended to the log, and that the log has not been altered. """ canonicalized_body: bytes = betterproto.bytes_field(7) """ Optional. The canonicalized transparency log entry, used to reconstruct the Signed Entry Timestamp (SET) during verification. The contents of this field are the same as the `body` field in a Rekor response, meaning that it does **not** include the "full" canonicalized form (of log index, ID, etc.) which are exposed as separate fields. The verifier is responsible for combining the `canonicalized_body`, `log_index`, `log_id`, and `integrated_time` into the payload that the SET's signature is generated over. This field is intended to be used in cases where the SET cannot be produced determinisitically (e.g. inconsistent JSON field ordering, differing whitespace, etc). If set, clients MUST verify that the signature referenced in the `canonicalized_body` matches the signature provided in the `Bundle.content`. If not set, clients are responsible for constructing an equivalent payload from other sources to verify the signature. 
""" rebuild_dataclass(InclusionProof) # type: ignore rebuild_dataclass(TransparencyLogEntry) # type: ignore ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1736545218.5444748 sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/dev/sigstore/trustroot/__init__.py0000644000000000000000000000000014740311703027176 0ustar00././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1736545218.5444748 sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/dev/sigstore/trustroot/v1/__init__.py0000644000000000000000000002335214740311703027543 0ustar00# Generated by the protocol buffer compiler. DO NOT EDIT! # sources: sigstore_trustroot.proto # plugin: python-betterproto # This file has been @generated from typing import TYPE_CHECKING if TYPE_CHECKING: from dataclasses import dataclass else: from pydantic.dataclasses import dataclass from typing import List import betterproto from pydantic.dataclasses import rebuild_dataclass from ...common import v1 as __common_v1__ @dataclass(eq=False, repr=False) class TransparencyLogInstance(betterproto.Message): """ TransparencyLogInstance describes the immutable parameters from a transparency log. See https://www.rfc-editor.org/rfc/rfc9162.html#name-log-parameters for more details. The included parameters are the minimal set required to identify a log, and verify an inclusion proof/promise. """ base_url: str = betterproto.string_field(1) """The base URL at which can be used to URLs for the client.""" hash_algorithm: "__common_v1__.HashAlgorithm" = betterproto.enum_field(2) """The hash algorithm used for the Merkle Tree.""" public_key: "__common_v1__.PublicKey" = betterproto.message_field(3) """ The public key used to verify signatures generated by the log. This attribute contains the signature algorithm used by the log. """ log_id: "__common_v1__.LogId" = betterproto.message_field(4) """ The unique identifier for this transparency log. 
Represented as the SHA-256 hash of the log's public key, calculated over the DER encoding of the key represented as SubjectPublicKeyInfo. See https://www.rfc-editor.org/rfc/rfc6962#section-3.2 """ checkpoint_key_id: "__common_v1__.LogId" = betterproto.message_field(5) """ The checkpoint key identifier for the log used in a checkpoint. Optional, not provided for logs that do not generate checkpoints. For logs that do generate checkpoints, if not set, assume log_id equals checkpoint_key_id. Follows the specification described here for ECDSA and Ed25519 signatures: https://github.com/C2SP/C2SP/blob/main/signed-note.md#signatures For RSA signatures, the key ID will match the ECDSA format, the hashed DER-encoded SPKI public key. Publicly witnessed logs MUST NOT use RSA-signed checkpoints, since witnesses do not support RSA signatures. This is provided for convenience. Clients can also calculate the checkpoint key ID given the log's public key. SHOULD be set for logs generating Ed25519 signatures. SHOULD be 4 bytes long, as a truncated hash. """ @dataclass(eq=False, repr=False) class CertificateAuthority(betterproto.Message): """ CertificateAuthority enlists the information required to identify which CA to use and perform signature verification. """ subject: "__common_v1__.DistinguishedName" = betterproto.message_field(1) """ The root certificate MUST be self-signed, and so the subject and issuer are the same. """ uri: str = betterproto.string_field(2) """ The URI identifies the certificate authority. It is RECOMMENDED that the URI is the base URL for the certificate authority, that can be provided to any SDK/client provided by the certificate authority to interact with the certificate authority. """ cert_chain: "__common_v1__.X509CertificateChain" = betterproto.message_field(3) """ The certificate chain for this CA. The last certificate in the chain MUST be the trust anchor. 
The trust anchor MAY be a self-signed root CA certificate or MAY be an intermediate CA certificate. """ valid_for: "__common_v1__.TimeRange" = betterproto.message_field(4) """ The time the *entire* chain was valid. This is at max the longest interval when *all* certificates in the chain were valid, but it MAY be shorter. Clients MUST check timestamps against *both* the `valid_for` time range *and* the entire certificate chain. The TimeRange should be considered valid *inclusive* of the endpoints. """ @dataclass(eq=False, repr=False) class TrustedRoot(betterproto.Message): """ TrustedRoot describes the client's complete set of trusted entities. How the TrustedRoot is populated is not specified, but can be a combination of many sources such as TUF repositories, files on disk etc. The TrustedRoot is not meant to be used for any artifact verification, only to capture the complete/global set of trusted verification materials. When verifying an artifact, based on the artifact and policies, a selection of keys/authorities are expected to be extracted and provided to the verification function. This way the set of keys/authorities can be kept to a minimal set by the policy to gain better control over what signatures are allowed. The embedded transparency logs, CT logs, CAs and TSAs MUST include any previously used instance -- otherwise signatures made in the past cannot be verified. All the listed instances SHOULD be sorted by the 'valid_for' in ascending order, that is, the oldest instance first. Only the last instance is allowed to have its 'end' timestamp unset. All previous instances MUST have a closed interval of validity. The last instance MAY have a closed interval. Clients MUST accept instances that overlap in time; otherwise clients may experience problems during rotations of verification materials.
To be able to manage planned rotations of either transparency logs or certificate authorities, clients MUST accept lists of instances where the last instance has a 'valid_for' that lies in the future. This should not be a problem as clients SHOULD first seek the trust root for a suitable instance before creating a per-artifact trust root (that is, a subset of the complete trust root) that is used for verification. """ media_type: str = betterproto.string_field(1) """ MUST be application/vnd.dev.sigstore.trustedroot.v0.1+json when encoded as JSON. Clients MUST be able to process and parse content with the media type defined in the old format: application/vnd.dev.sigstore.trustedroot+json;version=0.1 """ tlogs: List["TransparencyLogInstance"] = betterproto.message_field(2) """A set of trusted Rekor servers.""" certificate_authorities: List["CertificateAuthority"] = betterproto.message_field(3) """ A set of trusted certificate authorities (e.g. Fulcio), and any intermediate certificates they provide. If a CA issues multiple intermediate certificates, each combination shall be represented as a separate chain; i.e., a single root cert may appear in multiple chains but with different intermediate and/or leaf certificates. The certificates are intended to be used for verifying artifact signatures. """ ctlogs: List["TransparencyLogInstance"] = betterproto.message_field(4) """A set of trusted certificate transparency logs.""" timestamp_authorities: List["CertificateAuthority"] = betterproto.message_field(5) """A set of trusted timestamping authorities.""" @dataclass(eq=False, repr=False) class SigningConfig(betterproto.Message): """ SigningConfig represents the trusted entities/state needed by Sigstore signing. In particular, it primarily contains service URLs that a Sigstore signer may need to connect to for the online aspects of signing.
""" media_type: str = betterproto.string_field(5) """MUST be application/vnd.dev.sigstore.signingconfig.v0.1+json""" ca_url: str = betterproto.string_field(1) """ A URL to a Fulcio-compatible CA, capable of receiving Certificate Signing Requests (CSRs) and responding with issued certificates. This URL **MUST** be the "base" URL for the CA, which clients should construct an appropriate CSR endpoint on top of. For example, if `ca_url` is `https://example.com/ca`, then the client **MAY** construct the CSR endpoint as `https://example.com/ca/api/v2/signingCert`. """ oidc_url: str = betterproto.string_field(2) """ A URL to an OpenID Connect identity provider. This URL **MUST** be the "base" URL for the OIDC IdP, which clients should perform well-known OpenID Connect discovery against. """ tlog_urls: List[str] = betterproto.string_field(3) """ One or more URLs to Rekor-compatible transparency log. Each URL **MUST** be the "base" URL for the transparency log, which clients should construct appropriate API endpoints on top of. """ tsa_urls: List[str] = betterproto.string_field(4) """ One ore more URLs to RFC 3161 Time Stamping Authority (TSA). Each URL **MUST** be the **full** URL for the TSA, meaning that it should be suitable for submitting Time Stamp Requests (TSRs) to via HTTP, per RFC 3161. """ @dataclass(eq=False, repr=False) class ClientTrustConfig(betterproto.Message): """ ClientTrustConfig describes the complete state needed by a client to perform both signing and verification operations against a particular instance of Sigstore. 
""" media_type: str = betterproto.string_field(1) """MUST be application/vnd.dev.sigstore.clienttrustconfig.v0.1+json""" trusted_root: "TrustedRoot" = betterproto.message_field(2) """The root of trust, which MUST be present.""" signing_config: "SigningConfig" = betterproto.message_field(3) """Configuration for signing clients, which MUST be present.""" rebuild_dataclass(TransparencyLogInstance) # type: ignore rebuild_dataclass(CertificateAuthority) # type: ignore rebuild_dataclass(TrustedRoot) # type: ignore rebuild_dataclass(ClientTrustConfig) # type: ignore ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1736545218.5444748 sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/dev/sigstore/verification/__init__.py0000644000000000000000000000000014740311703027573 0ustar00././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1736545218.5444748 sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/dev/sigstore/verification/v1/__init__.py0000644000000000000000000002063514740311703030141 0ustar00# Generated by the protocol buffer compiler. DO NOT EDIT! 
# sources: sigstore_verification.proto # plugin: python-betterproto # This file has been @generated from typing import TYPE_CHECKING if TYPE_CHECKING: from dataclasses import dataclass else: from pydantic.dataclasses import dataclass from typing import ( List, Optional, ) import betterproto from pydantic import model_validator from pydantic.dataclasses import rebuild_dataclass from ...bundle import v1 as __bundle_v1__ from ...common import v1 as __common_v1__ from ...trustroot import v1 as __trustroot_v1__ @dataclass(eq=False, repr=False) class CertificateIdentity(betterproto.Message): """The identity of an X.509 Certificate signer.""" issuer: str = betterproto.string_field(1) """The X.509v3 issuer extension (OID 1.3.6.1.4.1.57264.1.1)""" san: "__common_v1__.SubjectAlternativeName" = betterproto.message_field(2) oids: List["__common_v1__.ObjectIdentifierValuePair"] = betterproto.message_field(3) """ An unordered list of OIDs that must be verified. All OID/values provided in this list MUST exactly match against the values in the certificate for verification to be successful. """ @dataclass(eq=False, repr=False) class CertificateIdentities(betterproto.Message): identities: List["CertificateIdentity"] = betterproto.message_field(1) @dataclass(eq=False, repr=False) class PublicKeyIdentities(betterproto.Message): public_keys: List["__common_v1__.PublicKey"] = betterproto.message_field(1) @dataclass(eq=False, repr=False) class ArtifactVerificationOptions(betterproto.Message): """ A light-weight set of options/policies for identifying trusted signers, used during verification of a single artifact.
""" certificate_identities: Optional["CertificateIdentities"] = ( betterproto.message_field(1, optional=True, group="signers") ) public_keys: Optional["PublicKeyIdentities"] = betterproto.message_field( 2, optional=True, group="signers" ) """ To simplify verification implementation, the logic for bundle verification should be implemented as a higher-order function, where one of argument should be an interface over the set of trusted public keys, like this: `Verify(bytes artifact, bytes signature, string key_id)`. This way the caller is in full control of mapping the identified (or hinted) key in the bundle to one of the trusted keys, as this process is inherently application specific. """ tlog_options: Optional["ArtifactVerificationOptionsTlogOptions"] = ( betterproto.message_field(3, optional=True) ) """ Optional options for artifact transparency log verification. If none is provided, the default verification options are: Threshold: 1 Online verification: false Disable: false """ ctlog_options: Optional["ArtifactVerificationOptionsCtlogOptions"] = ( betterproto.message_field(4, optional=True) ) """ Optional options for certificate transparency log verification. If none is provided, the default verification options are: Threshold: 1 Disable: false """ tsa_options: Optional["ArtifactVerificationOptionsTimestampAuthorityOptions"] = ( betterproto.message_field(5, optional=True) ) """ Optional options for certificate signed timestamp verification. If none is provided, the default verification options are: Threshold: 0 Disable: true """ integrated_ts_options: Optional[ "ArtifactVerificationOptionsTlogIntegratedTimestampOptions" ] = betterproto.message_field(6, optional=True) """ Optional options for integrated timestamp verification. 
If none is provided, the default verification options are: Threshold: 0 Disable: true """ observer_options: Optional[ "ArtifactVerificationOptionsObserverTimestampOptions" ] = betterproto.message_field(7, optional=True) """ Optional options for observed timestamp verification. If none is provided, the default verification options are: Threshold: 1 Disable: false """ @model_validator(mode="after") def check_oneof(cls, values): return cls._validate_field_groups(values) @dataclass(eq=False, repr=False) class ArtifactVerificationOptionsTlogOptions(betterproto.Message): threshold: int = betterproto.int32_field(1) """Number of transparency logs the entry must appear on.""" perform_online_verification: bool = betterproto.bool_field(2) """Perform an online inclusion proof.""" disable: bool = betterproto.bool_field(3) """Disable verification for transparency logs.""" @dataclass(eq=False, repr=False) class ArtifactVerificationOptionsCtlogOptions(betterproto.Message): threshold: int = betterproto.int32_field(1) """ The number of ct transparency logs the certificate must appear on.
""" disable: bool = betterproto.bool_field(3) """Disable ct transparency log verification""" @dataclass(eq=False, repr=False) class ArtifactVerificationOptionsTimestampAuthorityOptions(betterproto.Message): threshold: int = betterproto.int32_field(1) """The number of signed timestamps that are expected.""" disable: bool = betterproto.bool_field(2) """Disable signed timestamp verification.""" @dataclass(eq=False, repr=False) class ArtifactVerificationOptionsTlogIntegratedTimestampOptions(betterproto.Message): threshold: int = betterproto.int32_field(1) """The number of integrated timestamps that are expected.""" disable: bool = betterproto.bool_field(2) """Disable integrated timestamp verification.""" @dataclass(eq=False, repr=False) class ArtifactVerificationOptionsObserverTimestampOptions(betterproto.Message): threshold: int = betterproto.int32_field(1) """ The number of external observers of the timestamp. This is a union of RFC3161 signed timestamps, and integrated timestamps from a transparency log, that could include additional timestamp sources in the future. """ disable: bool = betterproto.bool_field(2) """Disable observer timestamp verification.""" @dataclass(eq=False, repr=False) class Artifact(betterproto.Message): artifact_uri: Optional[str] = betterproto.string_field( 1, optional=True, group="data" ) """Location of the artifact""" artifact: Optional[bytes] = betterproto.bytes_field(2, optional=True, group="data") """The raw bytes of the artifact""" artifact_digest: Optional["__common_v1__.HashOutput"] = betterproto.message_field( 3, optional=True, group="data" ) """ Digest of the artifact. SHOULD NOT be used when verifying an in-toto attestation as the subject digest cannot be reconstructed. This option will not work with Ed25519 signatures, use Ed25519Ph or another algorithm instead. 
""" @model_validator(mode="after") def check_oneof(cls, values): return cls._validate_field_groups(values) @dataclass(eq=False, repr=False) class Input(betterproto.Message): """ Input captures all that is needed to call the bundle verification method, to verify a single artifact referenced by the bundle. """ artifact_trust_root: "__trustroot_v1__.TrustedRoot" = betterproto.message_field(1) """ The verification materials provided during a bundle verification. The running process is usually preloaded with a "global" dev.sisgtore.trustroot.TrustedRoot.v1 instance. Prior to verifying an artifact (i.e a bundle), and/or based on current policy, some selection is expected to happen, to filter out the exact certificate authority to use, which transparency logs are relevant etc. The result should b ecaptured in the `artifact_trust_root`. """ artifact_verification_options: "ArtifactVerificationOptions" = ( betterproto.message_field(2) ) bundle: "__bundle_v1__.Bundle" = betterproto.message_field(3) artifact: Optional["Artifact"] = betterproto.message_field(4, optional=True) """ If the bundle contains a message signature, the artifact must be provided. """ rebuild_dataclass(CertificateIdentity) # type: ignore rebuild_dataclass(CertificateIdentities) # type: ignore rebuild_dataclass(PublicKeyIdentities) # type: ignore rebuild_dataclass(ArtifactVerificationOptions) # type: ignore rebuild_dataclass(Artifact) # type: ignore rebuild_dataclass(Input) # type: ignore ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1736545218.5444748 sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/google/__init__.py0000644000000000000000000000000014740311703023750 0ustar00././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1736545218.5444748 sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/google/api/__init__.py0000644000000000000000000000464314740311703024542 0ustar00# Generated by the protocol buffer compiler. DO NOT EDIT! 
# sources: google/api/field_behavior.proto # plugin: python-betterproto # This file has been @generated from typing import TYPE_CHECKING if TYPE_CHECKING: from dataclasses import dataclass else: from pydantic.dataclasses import dataclass import betterproto from pydantic.dataclasses import rebuild_dataclass class FieldBehavior(betterproto.Enum): """ An indicator of the behavior of a given field (for example, that a field is required in requests, or given as output but ignored as input). This **does not** change the behavior in protocol buffers itself; it only denotes the behavior and may affect how API tooling handles the field. Note: This enum **may** receive new values in the future. """ UNSPECIFIED = 0 """Conventional default for enums. Do not use this.""" OPTIONAL = 1 """ Specifically denotes a field as optional. While all fields in protocol buffers are optional, this may be specified for emphasis if appropriate. """ REQUIRED = 2 """ Denotes a field as required. This indicates that the field **must** be provided as part of the request, and failure to do so will cause an error (usually `INVALID_ARGUMENT`). """ OUTPUT_ONLY = 3 """ Denotes a field as output only. This indicates that the field is provided in responses, but including the field in a request does nothing (the server *must* ignore it and *must not* throw an error as a result of the field's presence). """ INPUT_ONLY = 4 """ Denotes a field as input only. This indicates that the field is provided in requests, and the corresponding field is not included in output. """ IMMUTABLE = 5 """ Denotes a field as immutable. This indicates that the field may be set once in a request to create a resource, but may not be changed thereafter. """ UNORDERED_LIST = 6 """ Denotes that a (repeated) field is an unordered list. This indicates that the service may provide the elements of the list in any arbitrary order, rather than the order the user originally provided. Additionally, the list's order may or may not be stable. 
""" @classmethod def __get_pydantic_core_schema__(cls, _source_type, _handler): from pydantic_core import core_schema return core_schema.int_schema(ge=0) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1736545218.5444748 sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/io/__init__.py0000644000000000000000000000000014740311703023103 0ustar00././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1736545218.5444748 sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/io/intoto/__init__.py0000644000000000000000000000327714740311703024442 0ustar00# Generated by the protocol buffer compiler. DO NOT EDIT! # sources: envelope.proto # plugin: python-betterproto # This file has been @generated from typing import TYPE_CHECKING if TYPE_CHECKING: from dataclasses import dataclass else: from pydantic.dataclasses import dataclass from typing import List import betterproto from pydantic.dataclasses import rebuild_dataclass @dataclass(eq=False, repr=False) class Envelope(betterproto.Message): """An authenticated message of arbitrary type.""" payload: bytes = betterproto.bytes_field(1) """ Message to be signed. (In JSON, this is encoded as base64.) REQUIRED. """ payload_type: str = betterproto.string_field(2) """ String unambiguously identifying how to interpret payload. REQUIRED. """ signatures: List["Signature"] = betterproto.message_field(3) """ Signature over: PAE(type, payload) Where PAE is defined as: PAE(type, payload) = "DSSEv1" + SP + LEN(type) + SP + type + SP + LEN(payload) + SP + payload + = concatenation SP = ASCII space [0x20] "DSSEv1" = ASCII [0x44, 0x53, 0x53, 0x45, 0x76, 0x31] LEN(s) = ASCII decimal encoding of the byte length of s, with no leading zeros REQUIRED (length >= 1). """ @dataclass(eq=False, repr=False) class Signature(betterproto.Message): sig: bytes = betterproto.bytes_field(1) """ Signature itself. (In JSON, this is encoded as base64.) REQUIRED. 
""" keyid: str = betterproto.string_field(2) """ *Unauthenticated* hint identifying which public key was used. OPTIONAL. """ rebuild_dataclass(Envelope) # type: ignore ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1736545218.5444748 sigstore_protobuf_specs-0.3.5/sigstore_protobuf_specs/py.typed0000644000000000000000000000000014740311703022062 0ustar00sigstore_protobuf_specs-0.3.5/PKG-INFO0000644000000000000000000000257000000000000014463 0ustar00Metadata-Version: 2.3 Name: sigstore-protobuf-specs Version: 0.3.5 Summary: A library for serializing and deserializing Sigstore messages Author-email: Sigstore Authors Requires-Python: >=3.8 Description-Content-Type: text/markdown Classifier: License :: OSI Approved :: Apache Software License Classifier: Programming Language :: Python :: 3 :: Only Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3.8 Classifier: Programming Language :: Python :: 3.9 Classifier: Programming Language :: Python :: 3.10 Classifier: Programming Language :: Python :: 3.11 Classifier: Development Status :: 4 - Beta Classifier: Intended Audience :: Developers Classifier: Topic :: Security Classifier: Topic :: Security :: Cryptography Requires-Dist: betterproto==2.0.0b7 Requires-Dist: pydantic >= 2, < 3 Requires-Dist: build ; extra == "dev" Project-URL: Homepage, https://pypi.org/project/sigstore-protobuf-specs/ Project-URL: Issues, https://github.com/sigstore/protobuf-specs/issues Project-URL: Source, https://github.com/sigstore/protobuf-specs Provides-Extra: dev sigstore-protobuf-specs ======================= These are the Python language bindings for Sigstore's protobuf specs. See the [repository's README](https://github.com/sigstore/protobuf-specs) for more information.