in-toto-0.4.0/.coveragerc

[run]
# enable branch coverage in addition to statement coverage
branch = True

omit =
    */tests/*
    */python?.?/*
    */site-packages/*

[report]
exclude_lines =
    pragma: no cover
    if __name__ == .__main__.:

in-toto-0.4.0/.travis.yml

dist: xenial
language: python

# Run each tox environment (see tox.ini) in a separate travis environment that
# has the required Python version. Unit tests are run in all supported Python
# versions, pylint and bandit only in the latest.
matrix:
  include:
    - python: "2.7"
      env: TOXENV=py27
    - python: "3.5"
      env: TOXENV=py35
    - python: "3.6"
      env: TOXENV=py36
    - python: "3.7"
      env: TOXENV=py37
    - python: "3.7"
      env: TOXENV=lint

install:
  - pip install -U tox
  - pip install -U coveralls

script:
  - tox

after_success: coveralls

in-toto-0.4.0/LICENSE

Copyright 2018 New York University

Licensed under the Apache License, Version 2.0 (the "License"); you may not
use this file except in compliance with the License. You may obtain a copy of
the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
License for the specific language governing permissions and limitations under
the License.

in-toto-0.4.0/MANIFEST.in

include LICENSE
include README.md

# Add tests and test data
recursive-include tests *
global-exclude *.py[cod] __pycache__ *.so .coverage

# Add test config files to show how to run tests
include .coveragerc
include tox.ini
include .travis.yml
include requirements*.txt

in-toto-0.4.0/PKG-INFO

Metadata-Version: 2.1
Name: in-toto
Version: 0.4.0
Summary: A framework to define and secure the integrity of software supply chains
Home-page: https://in-toto.io
Author: New York University: Secure Systems Lab
Author-email: in-toto-dev@googlegroups.com
License: Apache-2.0
Project-URL: Source, https://github.com/in-toto/in-toto
Project-URL: Bug Reports, https://github.com/in-toto/in-toto/issues
Description: See README.md below.
Keywords: software supply chain security
Platform: UNKNOWN
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Natural Language :: English
Classifier: Operating System :: POSIX
Classifier: Operating System :: POSIX :: Linux
Classifier: Operating System :: MacOS :: MacOS X
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Topic :: Security
Classifier: Topic :: Software Development
Requires-Python: >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, <4
Description-Content-Type: text/markdown
Provides-Extra: pynacl

in-toto-0.4.0/README.md

# in-toto

[![Build Status](https://travis-ci.org/in-toto/in-toto.svg?branch=develop)](https://travis-ci.org/in-toto/in-toto)
[![Coverage Status](https://coveralls.io/repos/github/in-toto/in-toto/badge.svg?branch=develop)](https://coveralls.io/github/in-toto/in-toto?branch=develop)
[![CII Best Practices](https://bestpractices.coreinfrastructure.org/projects/1523/badge)](https://bestpractices.coreinfrastructure.org/projects/1523)
[![Build status](https://ci.appveyor.com/api/projects/status/taxlhrrlf3co07e1/branch/develop?svg=true)](https://ci.appveyor.com/project/in-toto/in-toto/branch/develop)

in-toto provides a framework to protect the integrity of the software supply
chain. It does so by verifying that each task in the chain is carried out as
planned, by authorized personnel only, and that the product is not tampered
with in transit.

in-toto requires a **project owner** to create a **layout**. A layout lists
the sequence of **steps** of the software supply chain and the
**functionaries** authorized to perform these steps. When a functionary
performs a step, in-toto gathers information about the command that was used
and the related files, and stores it in a **link** metadata file. As a
consequence, link files provide the evidence required to establish a
continuous chain that can be validated against the steps defined in the
layout.

The layout, signed by the project owners, together with the links, signed by
the designated functionaries, are released as part of the final product, and
can be validated manually or via automated tooling, e.g. in a package
manager.

## Getting Started

### Install Dependencies
 - [Python](https://www.python.org) version 2.7 or 3.5+ - crypto libraries require header files
 - [OpenSSL](https://www.openssl.org/) - crypto libraries require header files
 - [git](https://git-scm.com/) - version control system
 - [pip](https://pip.pypa.io) - package installer tool

### Installation

It is strongly recommended to install in-toto in an isolated Python
environment. For easy setup instructions, visit the docs for
[`virtualenv`](https://virtualenv.pypa.io) and the convenient
[`virtualenvwrapper`](https://virtualenvwrapper.readthedocs.io).
```shell
pip install in-toto
```

### Create layout, run supply chain steps and verify final product

#### Layout

The in-toto software supply chain layout consists of the following parts:
 - **expiration date**
 - **readme** (an optional description of the supply chain)
 - **functionary keys** (public keys, used to verify link metadata
   signatures)
 - **signatures** (one or more layout signatures created with the project
   owner key(s))
 - **software supply chain steps** correspond to steps carried out by a
   functionary as part of the software supply chain. The steps defined in the
   layout list the functionaries who are authorized to carry out the step (by
   key id). Steps require a unique name to associate them (upon verification)
   with link metadata that is created when a functionary carries out the step
   using the `in-toto` tools. Additionally, steps must have material and
   product rules, which define the files a step is supposed to operate on.
   Material and product rules are described in the section below.
 - **inspections** define commands to be run during the verification process
   and can also list material and product rules.

Take a look at the
[demo layout creation example](https://github.com/in-toto/in-toto/blob/develop/layout-creation.md)
for further information on how to create an in-toto layout. Or try our
experimental
[layout creation web tool](https://in-toto.engineering.nyu.edu/).

#### Artifact Rules

A software supply chain usually operates on a set of files, such as source
code, executables, packages, or the like. in-toto calls these files
artifacts. A material is an artifact that will be used when a step or
inspection is carried out. Likewise, a product is an artifact that results
from carrying out a step.

The in-toto layout provides a simple rule language to authorize or enforce
the artifacts of a step and to chain them together. This adds the following
guarantees for any given step or inspection:

 - Only artifacts **authorized** by the project owner are created, modified
   or deleted,
 - each defined creation, modification or deletion is **enforced**, and also
 - restricted to the scope of its definition, which **chains** subsequent
   steps and inspections together.

Note that it is up to you to properly secure your supply chain, by
authorizing, enforcing and chaining materials and products using any, and
usually multiple, of the following rules:

- `CREATE <pattern>`
- `DELETE <pattern>`
- `MODIFY <pattern>`
- `ALLOW <pattern>`
- `DISALLOW <pattern>`
- `REQUIRE <filename>`
- `MATCH <pattern> [IN <source-path-prefix>] WITH (MATERIALS|PRODUCTS) [IN <destination-path-prefix>] FROM <step>`

*Rule arguments specified as `<pattern>` allow for Unix shell-style wildcards
as implemented by Python's
[`fnmatch`](https://docs.python.org/2/library/fnmatch.html).*

To learn more about the different rule types, their guarantees and how they
are applied, take a look at the
[Artifact Rules](https://github.com/in-toto/docs/blob/master/in-toto-spec.md#433-artifact-rules)
section of the in-toto specification.

#### Carrying out software supply chain steps

##### in-toto-run

`in-toto-run` is used to execute a step in the software supply chain. This
can be anything relevant to the project, such as tagging a release with
`git`, running a test, or building a binary. The relevant step name and
command are passed as arguments, along with materials, which are files
required for that step's command to execute, and products, which are files
expected as a result of the execution of that command. These, and other
relevant details pertaining to the step, are stored in a link file, which is
signed using the functionary's key.
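For example, a build step might be wrapped as follows. This is a minimal
sketch: the step name, key file, paths and build command are made up, and
flag spellings may differ between in-toto versions (see
`in-toto-run --help`):

```shell
# Record a hypothetical 'build' step: 'src/' is recorded as material,
# 'dist/' as product, and the resulting link file is signed with the
# functionary's private key stored in 'functionary_key'.
in-toto-run --step-name build --key functionary_key \
    --materials src/ --products dist/ -- python setup.py sdist
```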
If materials are not passed to the command, the generated link file simply
does not record them. Similarly, if the execution of a command via
`in-toto-run` does not result in any products, none are recorded in the link
file. Any files that are modified or used in any way during the execution of
the command are not recorded in the link file unless explicitly passed as
artifacts. Conversely, any materials or products passed to the command are
recorded in the link file even if they are not part of the execution of the
command.

See
[this simple usage example from the demo application for more details](https://github.com/in-toto/demo).
For a detailed list of all the command line arguments, run
`in-toto-run --help` or look at the
[code documentation](https://github.com/in-toto/in-toto/blob/develop/in_toto/in_toto_run.py).

##### in-toto-record

`in-toto-record` works similarly to `in-toto-run`, but can be used for
multi-part software supply chain steps, i.e. steps that are not carried out
by a single command. Use `in-toto-record start ...` to create a preliminary
link file that only records the *materials*, then run the commands of that
step or edit files manually, and finally use `in-toto-record stop ...` to
record the *products* and generate the actual link metadata file. (Usage
sketches for `in-toto-record` and `in-toto-verify` follow the *Signatures*
section below.)

For a detailed list of all command line arguments and their usage, run
`in-toto-record start --help` or `in-toto-record stop --help`, or look at the
[code documentation](https://github.com/in-toto/in-toto/blob/develop/in_toto/in_toto_record.py).

#### Release final product

In order to verify the final product with in-toto, the verifier must have
access to the layout, the `*.link` files, and the project owner's public
key(s).

#### Verification

Use `in-toto-verify` on the final product to verify that

 - the layout was signed with the project owner's private key(s),
 - the layout has not expired,
 - each step was performed and signed by an authorized functionary,
 - the functionaries used the commands they were supposed to use,
 - the materials and products of each step were in place as defined by the
   rules, and
 - all inspections defined in the layout pass.

For a detailed list of all command line arguments and their usage, run
`in-toto-verify --help` or look at the
[code documentation](https://github.com/in-toto/in-toto/blob/develop/in_toto/in_toto_verify.py).

#### Signatures

`in-toto-sign` is a metadata signature helper tool to add, replace, and
verify signatures within in-toto Link or Layout metadata, with options to:

 - replace (default) or add signature(s); layout metadata can be signed by
   multiple keys at once, while link metadata can only be signed by one key
   at a time,
 - write signed metadata to a specified path (if no output path is specified,
   layout metadata is written to the path of the input file, while link
   metadata is written to `<name>.<keyid prefix>.link`), and
 - verify signatures.

This tool is intended to sign layouts created by the
[layout web wizard](https://in-toto.engineering.nyu.edu/), but also serves
well to re-sign test and demo data. For example, it can be used if metadata
formats or signing routines change.

For a detailed list of all command line arguments and their usage, run
`in-toto-sign --help` or look at the
[code documentation](https://github.com/in-toto/in-toto/blob/develop/in_toto/in_toto_sign.py).
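As a rough sketch of the two tools described above (step names, file names
and key names are illustrative, and the exact flags may differ between
in-toto versions):

```shell
# Multi-part step: record materials, edit files manually, record products.
in-toto-record start --step-name edit --key functionary_key --materials src/
vim src/foo.py
in-toto-record stop --step-name edit --key functionary_key --products src/

# Verify the final product against the signed layout and the link files in
# the current directory, using the project owner's public key to check the
# layout signature.
in-toto-verify --layout root.layout --layout-key owner_key.pub
```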
#### Settings

Settings can be configured in
[`in_toto.settings`](https://github.com/in-toto/in-toto/blob/develop/in_toto/settings.py),
via prefixed environment variables, or in RCfiles in one of the following
paths: */etc/in_toto/config, /etc/in_totorc, \~/.config/in_toto/config,
\~/.config/in_toto, \~/.in_toto/config, \~/.in_totorc, .in_totorc*.

A setting in an RCfile in the current working directory overrides the same
setting in an RCfile in the user's home directory, which overrides the same
setting in an environment variable, which overrides the same setting in
`in_toto.settings`.

Setting names are restricted to the settings listed below (case sensitive).
Also, setting values that contain colons are parsed as a list.

##### Available Settings

`ARTIFACT_EXCLUDE_PATTERNS` Specifies a list of glob patterns that can be
used to exclude files from being recorded as materials or products. See
[runlib docs for more details](https://github.com/in-toto/in-toto/blob/develop/in_toto/runlib.py#L124-L142).

`ARTIFACT_BASE_PATH` If set, material and product paths passed to
`in-toto-run` are searched relative to the set base path. Also, the base path
is stripped from the paths written to the resulting link metadata file.

##### Examples

```shell
# Bash style environment variable export
export IN_TOTO_ARTIFACT_BASE_PATH='/home/user/project'
export IN_TOTO_ARTIFACT_EXCLUDE_PATTERNS='*.link:.gitignore'
```

```
# E.g. in rcfile ~/.in_totorc
[in-toto settings]
ARTIFACT_BASE_PATH=/home/user/project
ARTIFACT_EXCLUDE_PATTERNS=*.link:.gitignore
```

## in-toto demo

You can try in-toto by running the
[demo application](https://github.com/in-toto/demo). The demo follows three
users, Alice (project owner), Bob (functionary) and Carl (functionary), and
shows how in-toto helps to specify a project layout and to verify that the
layout has been followed correctly.

## Specification

You can read more about how in-toto works by taking a look at the
[specification](https://github.com/in-toto/docs/raw/master/in-toto-spec.pdf).

## Security Issues and Bugs

Security issues can be reported by emailing justincappos@gmail.com.

At a minimum, the report must contain the following:

* Description of the vulnerability.
* Steps to reproduce the issue.

Optionally, reports that are emailed can be encrypted with PGP. You should
use PGP key fingerprint E9C0 59EC 0D32 64FA B35F 94AD 465B F9F6 F8EB 475A.

Please do not use the GitHub issue tracker to submit vulnerability reports.
The issue tracker is intended for bug reports and feature requests.

## Instructions for Contributors

Development of in-toto occurs on the "develop" branch of this repository.
Contributions can be made by submitting GitHub *Pull Requests*. Take a look
at our
[development guidelines](https://github.com/secure-systems-lab/lab-guidelines/blob/master/dev-workflow.md)
for detailed instructions. Submitted code should follow our
[style guidelines](https://github.com/secure-systems-lab/code-style-guidelines)
and must be unit tested.

Contributors must also indicate acceptance of the
[Developer Certificate of Origin (DCO)](https://developercertificate.org/) by
appending a `Signed-off-by: Your Name <your.email@example.com>` to each git
commit message (see
[`git commit --signoff`](https://git-scm.com/docs/git-commit#Documentation/git-commit.txt---signoff)).

## Acknowledgments

This project is managed by Prof.
Justin Cappos and other members of the
[Secure Systems Lab](https://ssl.engineering.nyu.edu/) at NYU and the
[NJIT Cybersecurity Research Center](https://centers.njit.edu/cybersecurity).

This research was supported by the Defense Advanced Research Projects Agency
(DARPA) and the Air Force Research Laboratory (AFRL). Any opinions, findings,
and conclusions or recommendations expressed in this material are those of
the authors and do not necessarily reflect the views of DARPA and AFRL. The
United States Government is authorized to reproduce and distribute reprints
notwithstanding any copyright notice herein.

in-toto-0.4.0/in_toto/__init__.py

"""
Configure base logger for in_toto (see in_toto.log for details).
"""
import in_toto.log

in-toto-0.4.0/in_toto/common_args.py

"""
<Program Name>
  common_args.py

<Author>
  Lukas Puehringer

<Started>
  Mar 09, 2018

<Copyright>
  See LICENSE for licensing information.

<Purpose>
  Provides a collection of constants that can be used as `*args` or
  `**kwargs` to argparse.ArgumentParser.add_argument() for cli tools with
  common command line arguments.

  Example Usage:
  ```
  from in_toto.common_args import EXCLUDE_ARGS, EXCLUDE_KWARGS
  parser = argparse.ArgumentParser()
  parser.add_argument(*EXCLUDE_ARGS, **EXCLUDE_KWARGS)
  ```

"""
EXCLUDE_ARGS = ["--exclude"]
EXCLUDE_KWARGS = {
  "dest": "exclude_patterns",
  "required": False,
  "metavar": "<pattern>",
  "nargs": "+",
  "help": ("Do not record 'materials/products' that match one of <pattern>."
      " Passed exclude patterns override previously set patterns, using"
      " e.g.: environment variables or RCfiles. See"
      " ARTIFACT_EXCLUDE_PATTERNS documentation for additional info.")
}

BASE_PATH_ARGS = ["--base-path"]
BASE_PATH_KWARGS = {
  "dest": "base_path",
  "required": False,
  "metavar": "<path>",
  "help": ("Record 'materials/products' relative to <path>. If not set, the"
      " current working directory is used as the base path.")
}

LSTRIP_PATHS_ARGS = ["--lstrip-paths"]
LSTRIP_PATHS_KWARGS = {
  "dest": "lstrip_paths",
  "required": False,
  "nargs": "+",
  "metavar": "<path prefix(es)>",
  "help": ("Record the path of artifacts in link metadata after left"
      " stripping the specified <path prefix(es)> from the full path. If"
      " there are multiple prefixes specified, only a single prefix can"
      " match the path of any artifact, and that prefix is then left"
      " stripped. All prefixes are checked to ensure none of them are a left"
      " substring of another.")
}
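A minimal sketch of how these constants are meant to be consumed (the parser
below is a stand-in, not one of the actual in-toto command line tools, and
the example arguments are made up):

```python
import argparse

from in_toto.common_args import (EXCLUDE_ARGS, EXCLUDE_KWARGS,
    BASE_PATH_ARGS, BASE_PATH_KWARGS)

# Register the shared '--exclude' and '--base-path' options on a new parser
parser = argparse.ArgumentParser()
parser.add_argument(*EXCLUDE_ARGS, **EXCLUDE_KWARGS)
parser.add_argument(*BASE_PATH_ARGS, **BASE_PATH_KWARGS)

# e.g. invoked as: my-tool --exclude '*.link' --base-path /home/user/project
args = parser.parse_args(
    ["--exclude", "*.link", "--base-path", "/home/user/project"])
print(args.exclude_patterns, args.base_path)
```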
""" class PrefixError(Error): """Indicates that there is an error because of the prefixes passed. """ in-toto-0.4.0/in_toto/formats.py0000644000076500000240000000361413535414523016713 0ustar lukpstaff00000000000000""" formats.py Lukas Puehringer Santiago Torres-Arias November 28, 2017. See LICENSE for licensing information. Format schemas for in-toto metadata, based on securesystemslib.schema. The schemas can be verified using the following methods inherited from securesystemslib.schema: in_toto.gpg.formats..check_match() in_toto.gpg.formats..matches() `check_match` raises a securesystemslib.exceptions.FormatError and `matches` returns False if the verified object does not match the schema (True otherwise). """ import in_toto.gpg.formats as gpg_formats import securesystemslib.schema as ssl_schema import securesystemslib.formats as ssl_formats # pylint: disable=bad-whitespace # Note: Verification keys can have private portions but in case of GPG we # only have a PUBKEY_SCHEMA (because we never export private gpg keys from # the gpg keyring) ANY_VERIFICATION_KEY_SCHEMA = ssl_schema.OneOf([ssl_formats.ANYKEY_SCHEMA, gpg_formats.PUBKEY_SCHEMA]) ANY_VERIFICATION_KEY_DICT_SCHEMA = ssl_schema.DictOf( key_schema = ssl_formats.KEYID_SCHEMA, value_schema = ANY_VERIFICATION_KEY_SCHEMA) ANY_PUBKEY_SCHEMA = ssl_schema.OneOf([ssl_formats.PUBLIC_KEY_SCHEMA, gpg_formats.PUBKEY_SCHEMA]) ANY_PUBKEY_DICT_SCHEMA = ssl_schema.DictOf( key_schema = ssl_formats.KEYID_SCHEMA, value_schema = ANY_PUBKEY_SCHEMA) ANY_SIGNATURE_SCHEMA = ssl_schema.OneOf([ssl_formats.SIGNATURE_SCHEMA, gpg_formats.SIGNATURE_SCHEMA]) ANY_STRING_SCHEMA = ssl_schema.AnyString() LIST_OF_ANY_STRING_SCHEMA = ssl_schema.ListOf(ANY_STRING_SCHEMA) PARAMETER_DICTIONARY_KEY = ssl_schema.RegularExpression(r'[a-zA-Z0-9_-]+') PARAMETER_DICTIONARY_SCHEMA = ssl_schema.DictOf( key_schema = PARAMETER_DICTIONARY_KEY, value_schema = ssl_schema.AnyString()) in-toto-0.4.0/in_toto/gpg/0000755000076500000240000000000013535503044015434 5ustar lukpstaff00000000000000in-toto-0.4.0/in_toto/gpg/__init__.py0000644000076500000240000000136213535414523017552 0ustar lukpstaff00000000000000""" gpg Santiago Torres-Arias Nov 15, 2017 See LICENSE for licensing information. This module was written due to the lack of other python (such as pygpg) modules that can provide an abstraction to the RFC4480 encoded messages from GPG. The closest candidate we could find was the python bindings for gpgme, we oped to use a Popen-based python-only construction given that gpgme is often shipped separately and other popular tools using gpg (e.g., git) don't use these bindings either. This is because users willing to use gpg signing are almost guaranteed to have gpg installed, yet the same assumption can't be made for the gpgme python bindings. """ in-toto-0.4.0/in_toto/gpg/common.py0000644000076500000240000007273713535414523017321 0ustar lukpstaff00000000000000""" common.py Santiago Torres-Arias Nov 15, 2017 See LICENSE for licensing information. Provides algorithm-agnostic gpg public key and signature parsing functions. The functions select the appropriate functions for each algorithm and call them. 
""" import struct import binascii import logging import collections import in_toto.gpg.util from in_toto.gpg.exceptions import (PacketVersionNotSupportedError, SignatureAlgorithmNotSupportedError, KeyNotFoundError, PacketParsingError) from in_toto.gpg.constants import ( PACKET_TYPE_PRIMARY_KEY, PACKET_TYPE_USER_ID, PACKET_TYPE_USER_ATTR, PACKET_TYPE_SUB_KEY, PACKET_TYPE_SIGNATURE, SUPPORTED_PUBKEY_PACKET_VERSIONS, SIGNATURE_TYPE_BINARY, SIGNATURE_TYPE_CERTIFICATES, SIGNATURE_TYPE_SUB_KEY_BINDING, SUPPORTED_SIGNATURE_PACKET_VERSIONS, SUPPORTED_SIGNATURE_ALGORITHMS, SIGNATURE_HANDLERS, FULL_KEYID_SUBPACKET, PARTIAL_KEYID_SUBPACKET, SHA1, SHA256, SHA512, KEY_EXPIRATION_SUBPACKET, PRIMARY_USERID_SUBPACKET, SIG_CREATION_SUBPACKET) from in_toto.gpg.formats import GPG_HASH_ALGORITHM_STRING import securesystemslib.formats # Inherits from in_toto base logger (c.f. in_toto.log) LOG = logging.getLogger(__name__) def parse_pubkey_payload(data): """ Parse the passed public-key packet (payload only) and construct a public key dictionary. data: An RFC4880 public key packet payload as described in section 5.5.2. (version 4) of the RFC. NOTE: The payload can be parsed from a full key packet (header + payload) by using in_toto.gpg.util.parse_packet_header. WARNING: this doesn't support armored pubkey packets, so use with care. pubkey packets are a little bit more complicated than the signature ones ValueError If the passed public key data is empty. in_toto.gpg.exceptions.PacketVersionNotSupportedError If the packet version does not match in_toto.gpg.constants.SUPPORTED_PUBKEY_PACKET_VERSIONS in_toto.gpg.exceptions.SignatureAlgorithmNotSupportedError If the signature algorithm does not match one of in_toto.gpg.constants.SUPPORTED_SIGNATURE_ALGORITHMS None. A public key in the format in_toto.gpg.formats.PUBKEY_SCHEMA """ if not data: raise ValueError("Could not parse empty pubkey payload.") ptr = 0 keyinfo = {} version_number = data[ptr] ptr += 1 if version_number not in SUPPORTED_PUBKEY_PACKET_VERSIONS: raise PacketVersionNotSupportedError( "Pubkey packet version '{}' not supported, must be one of {}".format( version_number, SUPPORTED_PUBKEY_PACKET_VERSIONS)) # NOTE: Uncomment this line to decode the time of creation time_of_creation = struct.unpack(">I", data[ptr:ptr + 4]) ptr += 4 algorithm = data[ptr] ptr += 1 # TODO: Should we only export keys with signing capabilities? # Section 5.5.2 of RFC4880 describes a public-key algorithm octet with one # of the values described in section 9.1 that could be used to determine the # capabilities. However, in case of RSA subkeys this field doesn't seem to # correctly encode the capabilities. It always has the value 1, i.e. # RSA (Encrypt or Sign). # For RSA public keys we would have to parse the subkey's signature created # with the master key, for the signature's key flags subpacket, identified # by the value 27 (see section 5.2.3.1.) containing a list of binary flags # as described in section 5.2.3.21. if algorithm not in SUPPORTED_SIGNATURE_ALGORITHMS: raise SignatureAlgorithmNotSupportedError("Signature algorithm '{}' not " "supported, please verify that your gpg configuration is creating " "either DSA or RSA signatures (see RFC4880 9.1. 
Public-Key " "Algorithms).".format(algorithm)) keyinfo['type'] = SUPPORTED_SIGNATURE_ALGORITHMS[algorithm]['type'] keyinfo['method'] = SUPPORTED_SIGNATURE_ALGORITHMS[algorithm]['method'] handler = SIGNATURE_HANDLERS[keyinfo['type']] keyinfo['keyid'] = in_toto.gpg.util.compute_keyid(data) key_params = handler.get_pubkey_params(data[ptr:]) return { "method": keyinfo['method'], "type": keyinfo['type'], "hashes": [GPG_HASH_ALGORITHM_STRING], "creation_time": time_of_creation[0], "keyid": keyinfo['keyid'], "keyval" : { "private": "", "public": key_params } } def parse_pubkey_bundle(data): """ Parse packets from passed gpg public key data, associating self-signatures with the packets they correspond to, based on the structure of V4 keys defined in RFC4880 12.1 Key Structures. The returned raw key bundle may be used to further enrich the master key, with certified information (e.g. key expiration date) taken from self-signatures, and/or to verify that the parsed subkeys are bound to the primary key via signatures. data: Public key data as written to stdout by GPG_EXPORT_PUBKEY_COMMAND. in_toto.gpg.exceptions.PacketParsingError If data is empty. If data cannot be parsed. None. A raw public key bundle where self-signatures are associated with their corresponding packets. See `key_bundle` for details. """ if not data: raise PacketParsingError("Cannot parse keys from empty gpg data.") # Temporary data structure to hold parsed gpg packets key_bundle = { PACKET_TYPE_PRIMARY_KEY: { "key": {}, "packet": None, "signatures": [] }, PACKET_TYPE_USER_ID: collections.OrderedDict(), PACKET_TYPE_USER_ATTR: collections.OrderedDict(), PACKET_TYPE_SUB_KEY: collections.OrderedDict() } # Iterate over gpg data and parse out packets of different types position = 0 while position < len(data): try: packet_type, header_len, body_len, packet_length = \ in_toto.gpg.util.parse_packet_header(data[position:]) packet = data[position:position+packet_length] payload = packet[header_len:] # The first (and only the first) packet in the bundle must be the master # key. See RFC4880 12.1 Key Structures, V4 version keys # TODO: Do we need additional key structure assertions? e.g. # - there must be least one User ID packet, or # - order and type of signatures, or # - disallow duplicate packets # pylint: disable=no-else-raise if packet_type != PACKET_TYPE_PRIMARY_KEY and \ not key_bundle[PACKET_TYPE_PRIMARY_KEY]["key"]: raise PacketParsingError("First packet must be a primary key ('{}'), " "got '{}'.".format(PACKET_TYPE_PRIMARY_KEY, packet_type)) elif packet_type == PACKET_TYPE_PRIMARY_KEY and \ key_bundle[PACKET_TYPE_PRIMARY_KEY]["key"]: raise PacketParsingError("Unexpected primary key.") # Fully parse master key to fail early, e.g. if key is malformed # or not supported, but also retain original packet for subkey binding # signature verification elif packet_type == PACKET_TYPE_PRIMARY_KEY: key_bundle[PACKET_TYPE_PRIMARY_KEY] = { "key": parse_pubkey_payload(bytearray(payload)), "packet": packet, "signatures": [] } # Other non-signature packets in the key bundle include User IDs and User # Attributes, required to verify primary key certificates, and subkey # packets. For each packet we create a new ordered dictionary entry. We # use a dictionary to aggregate signatures by packet below, # and it must be ordered because each signature packet belongs to the # most recently parsed packet of a type. 
elif packet_type in {PACKET_TYPE_USER_ID, PACKET_TYPE_USER_ATTR, PACKET_TYPE_SUB_KEY}: key_bundle[packet_type][packet] = { "header_len": header_len, "body_len": body_len, "signatures": [] } # The remaining relevant packets are signatures, required to bind subkeys # to the primary key, or to gather additional information about the # primary key, e.g. expiration date. # A signature corresponds to the most recently parsed packet of a type, # where the type is given by the availability of respective packets. # We test availability and assign accordingly as per the order of packet # types defined in RFC4880 12.1 (bottom-up). elif packet_type == PACKET_TYPE_SIGNATURE: for _type in [PACKET_TYPE_SUB_KEY, PACKET_TYPE_USER_ATTR, PACKET_TYPE_USER_ID]: if key_bundle[_type]: # Add to most recently added packet's signatures of matching type key_bundle[_type][next(reversed(key_bundle[_type]))]\ ["signatures"].append(packet) break else: # If no packets are available for any of above types (yet), the # signature belongs to the primary key key_bundle[PACKET_TYPE_PRIMARY_KEY]["signatures"].append(packet) else: LOG.info("Ignoring gpg key packet '{}', we only handle packets of " "types '{}' (see RFC4880 4.3. Packet Tags).".format(packet_type, [PACKET_TYPE_PRIMARY_KEY, PACKET_TYPE_USER_ID, PACKET_TYPE_USER_ATTR, PACKET_TYPE_SUB_KEY, PACKET_TYPE_SIGNATURE])) # Both errors might be raised in parse_packet_header and in this loop except (PacketParsingError, IndexError) as e: raise PacketParsingError("Invalid public key data at position {}: {}." .format(position, e)) # Go to next packet position += packet_length return key_bundle def _assign_certified_key_info(bundle): """ Helper function to verify User ID certificates corresponding to a gpg master key, in order to enrich the master key with additional information (e.g. expiration dates). The enriched master key is returned. NOTE: Currently we only consider User ID certificates. We can do the same for User Attribute certificates by iterating over bundle[PACKET_TYPE_USER_ATTR] instead of bundle[PACKET_TYPE_USER_ID], and replacing the signed_content constant '\xb4' with '\xd1' (see RFC4880 section 5.2.4. paragraph 4). bundle: GPG key bundle as parsed in parse_pubkey_bundle(). None. None. A public key in the format in_toto.gpg.formats.PUBKEY_SCHEMA. """ # Create handler shortcut handler = SIGNATURE_HANDLERS[bundle[PACKET_TYPE_PRIMARY_KEY]["key"]["type"]] is_primary_user = False validity_period = None sig_creation_time = None # Verify User ID signatures to gather information about primary key # (see Notes about certification signatures in RFC 4880 5.2.3.3.) for user_id_packet, packet_data in bundle[PACKET_TYPE_USER_ID].items(): # Construct signed content (see RFC4880 section 5.2.4. paragraph 4) signed_content = (bundle[PACKET_TYPE_PRIMARY_KEY]["packet"] + b"\xb4\x00\x00\x00" + user_id_packet[1:]) for signature_packet in packet_data["signatures"]: try: signature = parse_signature_packet(signature_packet, supported_hash_algorithms={SHA1, SHA256, SHA512}, supported_signature_types=SIGNATURE_TYPE_CERTIFICATES, include_info=True) # gpg_verify_signature requires a "keyid" even if it is short. # (see parse_signature_packet for more information about keyids) signature["keyid"] = signature["keyid"] or signature["short_keyid"] # TODO: Revise exception taxonomy: # It's okay to ignore some exceptions (unsupported algorithms etc.) but # we should blow up if a signature is malformed (missing subpackets). 
except Exception as e: LOG.info(e) continue if not bundle[PACKET_TYPE_PRIMARY_KEY]["key"]["keyid"].endswith( signature["keyid"]): LOG.info("Ignoring User ID certificate issued by '{}'.".format( signature["keyid"])) continue is_valid = handler.gpg_verify_signature(signature, bundle[PACKET_TYPE_PRIMARY_KEY]["key"], signed_content, signature["info"]["hash_algorithm"]) if not is_valid: LOG.info("Ignoring invalid User ID self-certificate issued " "by '{}'.".format(signature["keyid"])) continue # If the signature is valid, we try to extract subpackets relevant to # the primary key, i.e. expiration time. # NOTE: There might be multiple User IDs per primary key and multiple # certificates per User ID. RFC4880 5.2.3.19. and last paragraph of # 5.2.3.3. provides some suggestions about ambiguity, but delegates the # responsibility to the implementer. # Ambiguity resolution scheme: # We take the key expiration time from the most recent certificate, i.e. # the certificate with the highest signature creation time. Additionally, # we prioritize certificates with primary user id flag set True. Note # that, if the ultimately prioritized certificate does not have a key # expiration time subpacket, we don't assign one, even if there were # certificates of lower priority carrying that subpacket. tmp_validity_period = \ signature["info"]["subpackets"].get(KEY_EXPIRATION_SUBPACKET) # No key expiration time, go to next certificate if tmp_validity_period is None: continue # Create shortcut to mandatory pre-parsed creation time subpacket tmp_sig_creation_time = signature["info"]["creation_time"] tmp_is_primary_user = \ signature["info"]["subpackets"].get(PRIMARY_USERID_SUBPACKET) if tmp_is_primary_user is not None: tmp_is_primary_user = bool(tmp_is_primary_user[0]) # If we already have a primary user certified expiration date and this # is none, we don't consider it, and go to next certificate if is_primary_user and not tmp_is_primary_user: continue if not sig_creation_time or sig_creation_time < tmp_sig_creation_time: # This is the most recent certificate that has a validity_period and # doesn't have lower priority in regard to the primary user id flag. We # accept it the keys validty_period, until we get a newer value from # a certificate with higher priority. validity_period = struct.unpack(">I", tmp_validity_period)[0] # We also keep track of the used certificate's primary user id flag and # the signature creation time, for prioritization. is_primary_user = tmp_is_primary_user sig_creation_time = tmp_sig_creation_time if validity_period is not None: bundle[PACKET_TYPE_PRIMARY_KEY]["key"]["validity_period"] = validity_period return bundle[PACKET_TYPE_PRIMARY_KEY]["key"] def _get_verified_subkeys(bundle): """ Helper function to verify the subkey binding signature for all subkeys in the passed bundle in order to enrich subkeys with additional information (e.g. expiration dates). Only valid (i.e. parsable) subkeys that are verifiably bound to the the master key of the bundle are returned. All other subkeys are discarded. bundle: GPG key bundle as parsed in parse_pubkey_bundle(). None. None. A dictionary of public keys in the format in_toto.gpg.formats.PUBKEY_SCHEMA, with keyids as dict keys. 
""" # Create handler shortcut handler = SIGNATURE_HANDLERS[bundle[PACKET_TYPE_PRIMARY_KEY]["key"]["type"]] # Verify subkey binding signatures and only keep verified keys # See notes about subkey binding signature in RFC4880 5.2.3.3 verified_subkeys = {} for subkey_packet, packet_data in bundle[PACKET_TYPE_SUB_KEY].items(): try: # Parse subkey if possible and skip if invalid (e.g. not-supported) subkey = parse_pubkey_payload( bytearray(subkey_packet[-packet_data["body_len"]:])) # TODO: Revise exception taxonomy except Exception as e: LOG.info(e) continue # Construct signed content (see RFC4880 section 5.2.4. paragraph 3) signed_content = (bundle[PACKET_TYPE_PRIMARY_KEY]["packet"] + b"\x99" + subkey_packet[1:]) # Filter sub key binding signature from other signatures, e.g. subkey # binding revocation signatures key_binding_signatures = [] for signature_packet in packet_data["signatures"]: try: signature = parse_signature_packet(signature_packet, supported_hash_algorithms={SHA1, SHA256, SHA512}, supported_signature_types={SIGNATURE_TYPE_SUB_KEY_BINDING}, include_info=True) # gpg_verify_signature requires a "keyid" even if it is short. # (see parse_signature_packet for more information about keyids) signature["keyid"] = signature["keyid"] or signature["short_keyid"] key_binding_signatures.append(signature) # TODO: Revise exception taxonomy except Exception as e: LOG.info(e) continue # NOTE: As per the V4 key structure diagram in RFC4880 section 12.1., a # subkey must be followed by exactly one Primary-Key-Binding-Signature. # Based on inspection of real-world keys and other parts of the RFC (e.g. # the paragraph below the diagram and paragraph 0x18: Subkey Binding # Signature in section 5.2.1.) the mandated signature is actually a # *subkey binding signature*, which in case of a signing subkey, must have # an *embedded primary key binding signature*. if len(key_binding_signatures) != 1: LOG.info("Ignoring subkey '{}' due to wrong amount of key binding " "signatures ({}), must be exactly 1.".format(subkey["keyid"], len(key_binding_signatures))) continue is_valid = handler.gpg_verify_signature(signature, bundle[PACKET_TYPE_PRIMARY_KEY]["key"], signed_content, signature["info"]["hash_algorithm"]) if not is_valid: LOG.info("Ignoring subkey '{}' due to invalid key binding signature." .format(subkey["keyid"])) continue # If the signature is valid, we may also extract relevant information from # its "info" field (e.g. subkey expiration date) and assign to it to the # subkey here validity_period = \ signature["info"]["subpackets"].get(KEY_EXPIRATION_SUBPACKET) if validity_period is not None: subkey["validity_period"] = struct.unpack(">I", validity_period)[0] verified_subkeys[subkey["keyid"]] = subkey return verified_subkeys def get_pubkey_bundle(data, keyid): """ Call function to extract and verify master key and subkeys from the passed gpg key data, where either the master key or one of the subkeys matches the passed keyid. NOTE: - If the keyid matches one of the subkeys, a warning is issued to notify the user about potential privilege escalation - Subkeys with invalid key binding signatures are discarded data: Public key data as written to stdout by in_toto.gpg.constants.GPG_EXPORT_PUBKEY_COMMAND. keyid: The keyid of the master key or one of its subkeys expected to be contained in the passed gpg data. in_toto.gpg.exceptions.PacketParsingError If the key data could not be parsed in_toto.gpg.exceptions.KeyNotFoundError If the passed data is empty. 
If no master key or subkeys could be found that matches the passed keyid. securesystemslib.exceptions.FormatError If the passed keyid does not match securesystemslib.formats.KEYID_SCHEMA None. A public key in the format in_toto.gpg.formats.PUBKEY_SCHEMA with optional subkeys. """ securesystemslib.formats.KEYID_SCHEMA.check_match(keyid) if not data: raise KeyNotFoundError("Could not find gpg key '{}' in empty exported key " "data.".format(keyid)) # Parse out master key and subkeys (enriched and verified via certificates # and binding signatures) raw_key_bundle = parse_pubkey_bundle(data) master_public_key = _assign_certified_key_info(raw_key_bundle) sub_public_keys = _get_verified_subkeys(raw_key_bundle) # Since GPG returns all pubkeys associated with a keyid (master key and # subkeys) we check which key matches the passed keyid. # If the matching key is a subkey, we warn the user because we return # the whole bundle (master plus all subkeys) and not only the subkey. # If no matching key is found we raise a KeyNotFoundError. for idx, public_key in enumerate( [master_public_key] + list(sub_public_keys.values())): if public_key and public_key["keyid"].endswith(keyid.lower()): if idx > 1: LOG.warning("Exporting master key '{}' including subkeys '{}' for" " passed keyid '{}'.".format(master_public_key["keyid"], ", ".join(list(sub_public_keys.keys())), keyid)) break else: raise KeyNotFoundError("Could not find gpg key '{}' in exported key data." .format(keyid)) # Add subkeys dictionary to master pubkey "subkeys" field if subkeys exist if sub_public_keys: master_public_key["subkeys"] = sub_public_keys return master_public_key def parse_signature_packet(data, supported_signature_types=None, supported_hash_algorithms=None, include_info=False): """ Parse the signature information on an RFC4880-encoded binary signature data buffer. NOTE: Older gpg versions (< FULLY_SUPPORTED_MIN_VERSION) might only reveal the partial key id. It is the callers responsibility to determine the full keyid based on the partial keyid, e.g. by exporting the related public and replacing the partial keyid with the full keyid. data: the RFC4880-encoded binary signature data buffer as described in section 5.2 (and 5.2.3.1). supported_signature_types: (optional) a set of supported signature_types, the signature packet may be (see in_toto.gpg.constants for available types). If None is specified the signature packet must be of type SIGNATURE_TYPE_BINARY. supported_hash_algorithms: (optional) a set of supported hash algorithm ids, the signature packet may use. Available ids are SHA1, SHA256, SHA512 (see in_toto.gpg.constants). If None is specified, the signature packet must use SHA256. include_info: (optional) a boolean that indicates whether an opaque dictionary should be added to the returned signature under the key "info". Default is False. ValueError: if the signature packet is not supported or the data is malformed IndexError: if the signature packet is incomplete None. 
A signature dictionary matching in_toto.gpg.formats.SIGNATURE_SCHEMA with the following special characteristics: - The "keyid" field is an empty string if it cannot be determined - The "short_keyid" is not added if it cannot be determined - At least one of non-empty "keyid" or "short_keyid" are part of the signature """ if not supported_signature_types: supported_signature_types = {SIGNATURE_TYPE_BINARY} if not supported_hash_algorithms: supported_hash_algorithms = {SHA256} _, header_len, _, packet_len = in_toto.gpg.util.parse_packet_header( data, PACKET_TYPE_SIGNATURE) data = bytearray(data[header_len:packet_len]) ptr = 0 # we get the version number, which we also expect to be v4, or we bail # FIXME: support v3 type signatures (which I haven't seen in the wild) version_number = data[ptr] ptr += 1 if version_number not in SUPPORTED_SIGNATURE_PACKET_VERSIONS: raise ValueError("Signature version '{}' not supported, must be one of " "{}.".format(version_number, SUPPORTED_SIGNATURE_PACKET_VERSIONS)) # here, we want to make sure the signature type is indeed PKCSV1.5 with RSA signature_type = data[ptr] ptr += 1 if signature_type not in supported_signature_types: raise ValueError("Signature type '{}' not supported, must be one of {} " "(see RFC4880 5.2.1. Signature Types).".format(signature_type, supported_signature_types)) signature_algorithm = data[ptr] ptr += 1 if signature_algorithm not in SUPPORTED_SIGNATURE_ALGORITHMS: raise ValueError("Signature algorithm '{}' not " "supported, please verify that your gpg configuration is creating " "either DSA or RSA signatures (see RFC4880 9.1. Public-Key " "Algorithms).".format(signature_algorithm)) key_type = SUPPORTED_SIGNATURE_ALGORITHMS[signature_algorithm]['type'] handler = SIGNATURE_HANDLERS[key_type] hash_algorithm = data[ptr] ptr += 1 if hash_algorithm not in supported_hash_algorithms: raise ValueError("Hash algorithm '{}' not supported, must be one of {}" " (see RFC4880 9.4. Hash Algorithms).".format(hash_algorithm, supported_hash_algorithms)) # Obtain the hashed octets hashed_octet_count = struct.unpack(">H", data[ptr:ptr+2])[0] ptr += 2 hashed_subpackets = data[ptr:ptr+hashed_octet_count] hashed_subpacket_info = in_toto.gpg.util.parse_subpackets(hashed_subpackets) # Check whether we were actually able to read this much hashed octets if len(hashed_subpackets) != hashed_octet_count: # pragma: no cover raise ValueError("This signature packet seems to be corrupted." "It is missing hashed octets!") ptr += hashed_octet_count other_headers_ptr = ptr unhashed_octet_count = struct.unpack(">H", data[ptr: ptr + 2])[0] ptr += 2 unhashed_subpackets = data[ptr:ptr+unhashed_octet_count] unhashed_subpacket_info = in_toto.gpg.util.parse_subpackets( unhashed_subpackets) ptr += unhashed_octet_count # Use the info dict to return further signature information that may be # needed for intermediate processing, but does not have to be on the eventual # signature datastructure info = { "signature_type": signature_type, "hash_algorithm": hash_algorithm, "creation_time": None, "subpackets": {}, } keyid = "" short_keyid = "" # Parse Issuer (short keyid) and Issuer Fingerprint (full keyid) from hashed # and unhashed signature subpackets. Full keyids are only available in newer # signatures. (see RFC4880 and rfc4880bis-06 5.2.3.1.) # NOTE: A subpacket may be found either in the hashed or unhashed subpacket # sections of a signature. If a subpacket is not hashed, then the information # in it cannot be considered definitive because it is not part of the # signature proper. 
# (see RFC4880 5.2.3.2.) # NOTE: Signatures may contain conflicting information in subpackets. In most # cases, an implementation SHOULD use the last subpacket, but MAY use any # conflict resolution scheme that makes more sense. # (see RFC4880 5.2.4.1.) # Below we only consider the last and favor hashed over unhashed subpackets for idx, subpacket_tuple in \ enumerate(unhashed_subpacket_info + hashed_subpacket_info): is_hashed = (idx >= len(unhashed_subpacket_info)) subpacket_type, subpacket_data = subpacket_tuple # Warn if expiration subpacket is not hashed if subpacket_type == KEY_EXPIRATION_SUBPACKET: if not is_hashed: LOG.warning("Expiration subpacket not hashed, gpg client possibly " "exporting a weakly configured key.") if subpacket_type == FULL_KEYID_SUBPACKET: # pragma: no cover # Exclude from coverage for consistent results across test envs # NOTE: The first byte of the subpacket payload is a version number # (see rfc4880bis-06 5.2.3.28.) keyid = binascii.hexlify(subpacket_data[1:]).decode("ascii") # We also return the short keyid, because the full might not be available if subpacket_type == PARTIAL_KEYID_SUBPACKET: short_keyid = binascii.hexlify(subpacket_data).decode("ascii") if subpacket_type == SIG_CREATION_SUBPACKET: info["creation_time"] = struct.unpack(">I", subpacket_data)[0] info["subpackets"][subpacket_type] = subpacket_data # Fail if there is no keyid at all (this should not happen) if not (keyid or short_keyid): # pragma: no cover raise ValueError("This signature packet seems to be corrupted. It does " "not have an 'Issuer' or 'Issuer Fingerprint' subpacket (see RFC4880 " "and rfc4880bis-06 5.2.3.1. Signature Subpacket Specification).") # Fail if keyid and short keyid are specified but don't match if keyid and not keyid.endswith(short_keyid): # pragma: no cover raise ValueError("This signature packet seems to be corrupted. The key ID " "'{}' of the 'Issuer' subpacket must match the lower 64 bits of the " "fingerprint '{}' of the 'Issuer Fingerprint' subpacket (see RFC4880 " "and rfc4880bis-06 5.2.3.28. Issuer Fingerprint).".format( short_keyid, keyid)) if not info["creation_time"]: # pragma: no cover raise ValueError("This signature packet seems to be corrupted. It does " "not have a 'Signature Creation Time' subpacket (see RFC4880 5.2.3.4 " "Signature Creation Time).") # Uncomment this variable to obtain the left-hash-bits information (used for # early rejection) #left_hash_bits = struct.unpack(">H", data[ptr:ptr+2])[0] ptr += 2 signature = handler.get_signature_params(data[ptr:]) signature_data = { 'keyid': "{}".format(keyid), 'other_headers': binascii.hexlify( data[:other_headers_ptr]).decode('ascii'), 'signature': binascii.hexlify(signature).decode('ascii') } if short_keyid: # pragma: no branch signature_data["short_keyid"] = short_keyid if include_info: signature_data["info"] = info return signature_data in-toto-0.4.0/in_toto/gpg/constants.py0000644000076500000240000000510313535414523020024 0ustar lukpstaff00000000000000""" constants.py Santiago Torres-Arias Nov 15, 2017 See LICENSE for licensing information. aggregates all the constant definitions and lookup structures for signature handling """ import logging import in_toto.gpg.rsa as rsa import in_toto.gpg.dsa as dsa import in_toto.process as process # Inherits from in_toto base logger (c.f. in_toto.log) LOG = logging.getLogger(__name__) # By default, we assume and test that gpg2 exists. Otherwise, we assume gpg # exists. 
GPG_COMMAND = "gpg2" GPG_VERSION_COMMAND = GPG_COMMAND + " --version" FULLY_SUPPORTED_MIN_VERSION = "2.1.0" try: process.run(GPG_VERSION_COMMAND, stdout=process.PIPE, stderr=process.PIPE) except OSError: # pragma: no cover GPG_COMMAND = "gpg" GPG_VERSION_COMMAND = GPG_COMMAND + " --version" GPG_SIGN_COMMAND = GPG_COMMAND + \ " --detach-sign --digest-algo SHA256 {keyarg} {homearg}" GPG_EXPORT_PUBKEY_COMMAND = GPG_COMMAND + " {homearg} --export {keyid}" # See RFC4880 section 4.3. Packet Tags for a list of all packet types The # relevant packets defined below are described in sections 5.2 (signature), # 5.5.1.1 (primary pubkey) and 5.5.1.2 (pub subkey), 5.12 (user id) and 5.13 # (user attribute) PACKET_TYPE_SIGNATURE = 0x02 PACKET_TYPE_PRIMARY_KEY = 0x06 PACKET_TYPE_USER_ID = 0x0D PACKET_TYPE_USER_ATTR = 0x11 PACKET_TYPE_SUB_KEY = 0x0E # See sections 5.2.3 (signature) and 5.5.2 (public key) of RFC4880 SUPPORTED_SIGNATURE_PACKET_VERSIONS = {0x04} SUPPORTED_PUBKEY_PACKET_VERSIONS = {0x04} # See section 5.2.3.1 (signature algorithms) of RFC4880 SUPPORTED_SIGNATURE_ALGORITHMS = { 0x01: { "type":"rsa", "method": "pgp+rsa-pkcsv1.5", "handler": rsa }, 0x11: { "type": "dsa", "method": "pgp+dsa-fips-180-2", "handler": dsa } } SIGNATURE_HANDLERS = { "rsa": rsa, "dsa": dsa } # The constants for hash algorithms are taken from section 9.4 of RFC4880. SHA1 = 0x02 SHA256 = 0x08 SHA512 = 0x0A # See section 5.2.1 of RFC4880 SIGNATURE_TYPE_BINARY = 0x00 SIGNATURE_TYPE_SUB_KEY_BINDING = 0x18 SIGNATURE_TYPE_CERTIFICATES = {0x10, 0x11, 0x12, 0x13} # See section 5.2.3.4 (Signature Creation Time) of RFC4880 SIG_CREATION_SUBPACKET = 0x02 # See section 5.2.3.5. (Issuer) of RFC4880 PARTIAL_KEYID_SUBPACKET = 0x10 # See section 5.2.3.6 (Key Expiration Time) of RFC4880 KEY_EXPIRATION_SUBPACKET = 0x09 # See section 5.2.3.19 (Primary User ID) of RFC4880 PRIMARY_USERID_SUBPACKET = 0x19 # See section 5.2.3.28. (Issuer Fingerprint) of rfc4880bis-06 FULL_KEYID_SUBPACKET = 0x21 in-toto-0.4.0/in_toto/gpg/dsa.py0000644000076500000240000001442413535414523016565 0ustar lukpstaff00000000000000# pylint: disable=invalid-name """ dsa.py Santiago Torres-Arias Nov 15, 2017 See LICENSE for licensing information. DSA-specific handling routines for signature verification and key parsing """ import binascii import cryptography.hazmat.primitives.asymmetric.dsa as dsa import cryptography.hazmat.backends as backends import cryptography.hazmat.primitives.asymmetric.utils as dsautils import cryptography.exceptions import in_toto.gpg.util import in_toto.gpg.exceptions import in_toto.gpg.formats def create_pubkey(pubkey_info): """ Create and return a DSAPublicKey object from the passed pubkey_info using pyca/cryptography. pubkey_info: The DSA pubkey info dictionary as specified by gpg.formats.DSA_PUBKEY_SCHEMA securesystemslib.exceptions.FormatError if pubkey_info does not match gpg.formats.DSA_PUBKEY_SCHEMA A cryptography.hazmat.primitives.asymmetric.dsa.DSAPublicKey based on the passed pubkey_info. """ in_toto.gpg.formats.DSA_PUBKEY_SCHEMA.check_match(pubkey_info) y = int(pubkey_info['keyval']['public']['y'], 16) g = int(pubkey_info['keyval']['public']['g'], 16) p = int(pubkey_info['keyval']['public']['p'], 16) q = int(pubkey_info['keyval']['public']['q'], 16) parameter_numbers = dsa.DSAParameterNumbers(p, q, g) pubkey = dsa.DSAPublicNumbers(y, parameter_numbers).public_key( backends.default_backend()) return pubkey def get_pubkey_params(data): """ Parse the public-key parameters as multi-precision-integers. 
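Each parameter is expected as an RFC4880 multi-precision integer (MPI), i.e. a two-octet big-endian bit count followed by ceil(bits/8) magnitude octets (see RFC4880 3.2). Illustrative example (not taken from a real key): the buffer b'\x00\x09\x01\xff' encodes the 9-bit integer 0x1ff in two magnitude octets.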
data: the RFC4880-encoded public key parameters data buffer as described in the fifth paragraph of section 5.5.2. in_toto.gpg.exceptions.PacketParsingError: if the public key parameters are malformed None. The decoded public key parameters """ ptr = 0 prime_p_length = in_toto.gpg.util.get_mpi_length(data[ptr: ptr + 2]) ptr += 2 prime_p = data[ptr:ptr + prime_p_length] if len(prime_p) != prime_p_length: # pragma: no cover raise in_toto.gpg.exceptions.PacketParsingError( "This MPI has been truncated!") ptr += prime_p_length group_order_q_length = in_toto.gpg.util.get_mpi_length(data[ptr: ptr + 2]) ptr += 2 group_order_q = data[ptr:ptr + group_order_q_length] if len(group_order_q) != group_order_q_length: # pragma: no cover raise in_toto.gpg.exceptions.PacketParsingError( "This MPI has been truncated!") ptr += group_order_q_length generator_length = in_toto.gpg.util.get_mpi_length(data[ptr: ptr + 2]) ptr += 2 generator = data[ptr:ptr + generator_length] if len(generator) != generator_length: # pragma: no cover raise in_toto.gpg.exceptions.PacketParsingError( "This MPI has been truncated!") ptr += generator_length value_y_length = in_toto.gpg.util.get_mpi_length(data[ptr: ptr + 2]) ptr += 2 value_y = data[ptr:ptr + value_y_length] if len(value_y) != value_y_length: # pragma: no cover raise in_toto.gpg.exceptions.PacketParsingError( "This MPI has been truncated!") return { "y": binascii.hexlify(value_y).decode('ascii'), "p": binascii.hexlify(prime_p).decode("ascii"), "g": binascii.hexlify(generator).decode("ascii"), "q": binascii.hexlify(group_order_q).decode("ascii"), } def get_signature_params(data): """ Parse the signature parameters as multi-precision-integers. data: the RFC4880-encoded signature data buffer as described in the fourth paragraph of section 5.2.2. in_toto.gpg.exceptions.PacketParsingError: if the signature parameters are malformed None. The decoded signature buffer """ ptr = 0 r_length = in_toto.gpg.util.get_mpi_length(data[ptr:ptr+2]) ptr += 2 r = data[ptr:ptr + r_length] if len(r) != r_length: # pragma: no cover raise in_toto.gpg.exceptions.PacketParsingError( "r-value truncated in signature") ptr += r_length s_length = in_toto.gpg.util.get_mpi_length(data[ptr: ptr+2]) ptr += 2 s = data[ptr: ptr + s_length] if len(s) != s_length: # pragma: no cover raise in_toto.gpg.exceptions.PacketParsingError( "s-value truncated in signature") s = int(binascii.hexlify(s), 16) r = int(binascii.hexlify(r), 16) signature = dsautils.encode_dss_signature(r, s) return signature def gpg_verify_signature(signature_object, pubkey_info, content, hash_algorithm_id): """ Verify the passed signature against the passed content with the passed DSA public key using pyca/cryptography. signature_object: A signature dictionary as specified by gpg.formats.SIGNATURE_SCHEMA pubkey_info: The DSA public key info dictionary as specified by gpg.formats.DSA_PUBKEY_SCHEMA hash_algorithm_id: one of SHA1, SHA256, SHA512 (see in_toto.gpg.constants) used to verify the signature NOTE: Overrides any hash algorithm specification in "pubkey_info"'s "hashes" or "method" fields.
content: The signed bytes against which the signature is verified securesystemslib.exceptions.FormatError if: signature_object does not match gpg.formats.SIGNATURE_SCHEMA pubkey_info does not match gpg.formats.DSA_PUBKEY_SCHEMA ValueError: if the passed hash_algorithm_id is not supported (see in_toto.gpg.util.get_hashing_class) True if signature verification passes and False otherwise """ in_toto.gpg.formats.SIGNATURE_SCHEMA.check_match(signature_object) in_toto.gpg.formats.DSA_PUBKEY_SCHEMA.check_match(pubkey_info) hasher = in_toto.gpg.util.get_hashing_class(hash_algorithm_id) pubkey_object = create_pubkey(pubkey_info) digest = in_toto.gpg.util.hash_object( binascii.unhexlify(signature_object['other_headers']), hasher(), content) try: pubkey_object.verify( binascii.unhexlify(signature_object['signature']), digest, dsautils.Prehashed(hasher()) ) return True except cryptography.exceptions.InvalidSignature: return False in-toto-0.4.0/in_toto/gpg/exceptions.py0000644000076500000240000000274213535414523020177 0ustar lukpstaff00000000000000""" exceptions.py Santiago Torres-Arias Lukas Puehringer Dec 8, 2017 See LICENSE for licensing information. Define Exceptions used in the gpg package. Following the practice from securesystemslib the names chosen for exception classes should end in 'Error' (except where there is a good reason not to). """ import datetime import dateutil.tz # TODO: Add class docstrings and remove pylint exemption in in-toto/in-toto#126 # pylint: disable=missing-docstring class PacketParsingError(Exception): pass class KeyNotFoundError(Exception): pass class PacketVersionNotSupportedError(Exception): pass class SignatureAlgorithmNotSupportedError(Exception): pass class CommandError(Exception): pass class KeyExpirationError(Exception): def __init__(self, key): super(KeyExpirationError, self).__init__() self.key = key def __str__(self): creation_time = datetime.datetime.fromtimestamp( self.key["creation_time"], dateutil.tz.UTC) expiration_time = datetime.datetime.fromtimestamp( self.key["creation_time"] + self.key["validity_period"], dateutil.tz.UTC) validity_period = expiration_time - creation_time return ("GPG key '{}' created on '{:%Y-%m-%d %H:%M} UTC' with validity " "period '{}' expired on '{:%Y-%m-%d %H:%M} UTC'.".format( self.key["keyid"], creation_time, validity_period, expiration_time)) in-toto-0.4.0/in_toto/gpg/formats.py0000644000076500000240000001203313535414523017463 0ustar lukpstaff00000000000000""" formats.py Lukas Puehringer November 28, 2017. See LICENSE for licensing information. Format schemas for gpg data structures (keys, signatures) based on securesystemslib.schema. The schemas can be verified using the following methods inherited from securesystemslib.schema: in_toto.gpg.formats..check_match() in_toto.gpg.formats..matches() `check_match` raises a securesystemslib.exceptions.FormatError and `matches` returns False if the verified object does not match the schema (True otherwise). 
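For instance (illustrative object), a signature object that lacks required fields makes `matches` return False, whereas `check_match` would raise securesystemslib.exceptions.FormatError instead: >>> SIGNATURE_SCHEMA.matches({"keyid": "deadbeef"}) False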
Example Usage: >>> rsa_pubkey = { 'type': 'rsa', 'hashes': ['pgp+SHA2'], 'keyid': '8465a1e2e0fb2b40adb2478e18fb3f537e0c8a17', 'keyval': { 'public': { 'e': u'010001', 'n': (u'da59409e6ede307a52f6851954a7bd4b9e309bd40a390f8c0de9722b63101 10ef0b095bf1c473e33db97150edae05c63dda70c03902701b15f3c5c3089 47e1b06675b4f1112030f1145be84ae1562e9120c2d429b20d5056337cbc9 7fc8b5db5704a21db635d00b2157ed68a403c793e9958b77e00163f99b018 09e08ee9099b99b117c086501e79eb947f760a0715bead0024c48d81f9000 671c4306a93725965f3ff2dc9806eaf081357f0268cab8ba7582d2e95e512 25a9dc7ed31a9568c45568d7917b05e7c954d561cd084291e77a7bdd69e3a c2f9091de55fe3f4e730147e880e2fc044c5f7c04c75ce33a3c0b52380f4d 60309708c56185f3bce6703b') }, 'private': '' }, 'method': 'pgp+rsa-pkcsv1.5' } >>> RSA_PUBKEY_SCHEMA.matches(rsa_pubkey) True """ import securesystemslib.schema as ssl_schema import securesystemslib.formats as ssl_formats def _create_pubkey_with_subkey_schema(pubkey_schema): """Helper method to extend the passed public key schema with an optional dictionary of sub public keys "subkeys" with the same schema.""" schema = pubkey_schema subkey_schema_tuple = ("subkeys", ssl_schema.Optional( ssl_schema.DictOf( key_schema=ssl_formats.KEYID_SCHEMA, value_schema=pubkey_schema ) ) ) # Any subclass of `securesystemslib.schema.Object` stores the schemas that # define the attributes of the object in its `_required` property, even if # such a schema is of type `Optional`. # TODO: Find a way that does not require to access a protected member schema._required.append(subkey_schema_tuple) # pylint: disable=protected-access return schema # pylint: disable=bad-whitespace GPG_HASH_ALGORITHM_STRING = "pgp+SHA2" PGP_RSA_PUBKEY_METHOD_STRING = "pgp+rsa-pkcsv1.5" PGP_DSA_PUBKEY_METHOD_STRING = "pgp+dsa-fips-180-2" RSA_PUBKEYVAL_SCHEMA = ssl_schema.Object( object_name = "RSA_PUBKEYVAL_SCHEMA", e = ssl_schema.AnyString(), n = ssl_formats.HEX_SCHEMA ) # We have to define RSA_PUBKEY_SCHEMA in two steps, because it is # self-referential. Here we define a shallow _RSA_PUBKEY_SCHEMA, which we use # below to create the self-referential RSA_PUBKEY_SCHEMA. _RSA_PUBKEY_SCHEMA = ssl_schema.Object( object_name = "RSA_PUBKEY_SCHEMA", type = ssl_schema.String("rsa"), method = ssl_schema.String(PGP_RSA_PUBKEY_METHOD_STRING), hashes = ssl_schema.ListOf(ssl_schema.String(GPG_HASH_ALGORITHM_STRING)), creation_time = ssl_schema.Optional(ssl_formats.UNIX_TIMESTAMP_SCHEMA), validity_period = ssl_schema.Optional(ssl_schema.Integer(lo=0)), keyid = ssl_formats.KEYID_SCHEMA, keyval = ssl_schema.Object( public = RSA_PUBKEYVAL_SCHEMA, private = ssl_schema.String("") ) ) RSA_PUBKEY_SCHEMA = _create_pubkey_with_subkey_schema( _RSA_PUBKEY_SCHEMA) DSA_PUBKEYVAL_SCHEMA = ssl_schema.Object( object_name = "DSA_PUBKEYVAL_SCHEMA", y = ssl_formats.HEX_SCHEMA, p = ssl_formats.HEX_SCHEMA, q = ssl_formats.HEX_SCHEMA, g = ssl_formats.HEX_SCHEMA ) # We have to define DSA_PUBKEY_SCHEMA in two steps, because it is # self-referential. Here we define a shallow _DSA_PUBKEY_SCHEMA, which we use # below to create the self-referential DSA_PUBKEY_SCHEMA.
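# For illustration (sketch, not a complete object), an exported DSA key # bundle matching the resulting schema looks like: # {"type": "dsa", "method": "pgp+dsa-fips-180-2", "hashes": [...], # "keyid": "<hex>", "keyval": {"public": {...}, "private": ""}, # "subkeys": {"<hex keyid>": {...pubkey schema fields...}}}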
_DSA_PUBKEY_SCHEMA = ssl_schema.Object( object_name = "DSA_PUBKEY_SCHEMA", type = ssl_schema.String("dsa"), method = ssl_schema.String(PGP_DSA_PUBKEY_METHOD_STRING), hashes = ssl_schema.ListOf(ssl_schema.String(GPG_HASH_ALGORITHM_STRING)), creation_time = ssl_schema.Optional(ssl_formats.UNIX_TIMESTAMP_SCHEMA), validity_period = ssl_schema.Optional(ssl_schema.Integer(lo=0)), keyid = ssl_formats.KEYID_SCHEMA, keyval = ssl_schema.Object( public = DSA_PUBKEYVAL_SCHEMA, private = ssl_schema.String("") ) ) DSA_PUBKEY_SCHEMA = _create_pubkey_with_subkey_schema( _DSA_PUBKEY_SCHEMA) PUBKEY_SCHEMA = ssl_schema.OneOf([RSA_PUBKEY_SCHEMA, DSA_PUBKEY_SCHEMA]) SIGNATURE_SCHEMA = ssl_schema.Object( object_name = "SIGNATURE_SCHEMA", keyid = ssl_formats.KEYID_SCHEMA, short_keyid = ssl_schema.Optional(ssl_formats.KEYID_SCHEMA), other_headers = ssl_formats.HEX_SCHEMA, signature = ssl_formats.HEX_SCHEMA, info = ssl_schema.Optional(ssl_schema.Any()), ) in-toto-0.4.0/in_toto/gpg/functions.py0000644000076500000240000002104213535414523020020 0ustar lukpstaff00000000000000""" functions.py Santiago Torres-Arias Nov 15, 2017 See LICENSE for licensing information. publicly-usable functions for exporting public-keys, signing data and verifying signatures. """ import logging import time import in_toto.gpg.common import in_toto.gpg.exceptions import in_toto.gpg.formats from in_toto.gpg.constants import (GPG_EXPORT_PUBKEY_COMMAND, GPG_SIGN_COMMAND, SIGNATURE_HANDLERS, FULLY_SUPPORTED_MIN_VERSION, SHA256) import in_toto.process import securesystemslib.formats # Inherits from in_toto base logger (c.f. in_toto.log) LOG = logging.getLogger(__name__) def gpg_sign_object(content, keyid=None, homedir=None): """ Calls the gpg2 command line utility to sign the passed content with the key identified by the passed keyid from the gpg keyring at the passed homedir. The executed base command is defined in constants.GPG_SIGN_COMMAND. NOTE: On not fully supported versions of GPG, i.e. versions below in_toto.gpg.constants.FULLY_SUPPORTED_MIN_VERSION the returned signature does not contain the full keyid. As a work around, we export the public key bundle identified by the short keyid to compute the full keyid and add it to the returned signature. content: The content to be signed. (bytes) keyid: (optional) The keyid of the gpg signing keyid. If not passed the default key in the keyring is used. homedir: (optional) Path to the gpg keyring. If not passed the default keyring is used. securesystemslib.exceptions.FormatError: If the keyid was passed and does not match securesystemslib.formats.KEYID_SCHEMA ValueError: If the gpg command failed to create a valid signature. OSError: If the gpg command is not present or non-executable. in_toto.gpg.exceptions.CommandError: If the gpg command returned a non-zero exit code in_toto.gpg.exceptions.KeyNotFoundError: If the used gpg version is not fully supported and no public key can be found for short keyid. None. The created signature in the format: gpg.formats.SIGNATURE_SCHEMA. 
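Example Usage (illustrative sketch, assuming a default signing key exists in the local gpg keyring): >>> content = b"data to sign" >>> sig = gpg_sign_object(content)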
""" keyarg = "" if keyid: securesystemslib.formats.KEYID_SCHEMA.check_match(keyid) keyarg = "--local-user {}".format(keyid) homearg = "" if homedir: homearg = "--homedir {}".format(homedir).replace("\\", "/") command = GPG_SIGN_COMMAND.format(keyarg=keyarg, homearg=homearg) process = in_toto.process.run(command, input=content, check=False, stdout=in_toto.process.PIPE, stderr=in_toto.process.PIPE) # TODO: It's suggested to take a look at `--status-fd` for proper error # reporting, as there is no clear distinction between the return codes # https://lists.gnupg.org/pipermail/gnupg-devel/2005-December/022559.html if process.returncode is not 0: raise in_toto.gpg.exceptions.CommandError("Command '{}' returned non-zero " "exit status '{}', stderr was:\n{}.".format(process.args, process.returncode, process.stderr.decode())) signature_data = process.stdout signature = in_toto.gpg.common.parse_signature_packet(signature_data) # On GPG < 2.1 we cannot derive the full keyid from the signature data. # Instead we try to compute the keyid from the public part of the signing # key or its subkeys, identified by the short keyid. # parse_signature_packet is guaranteed to return at least one of keyid or # short_keyid. # Exclude the following code from coverage for consistent coverage across # test environments. if not signature["keyid"]: # pragma: no cover LOG.warning("The created signature does not include the hashed subpacket" " '33' (full keyid). You probably have a gpg version <{}." " We will export the public keys associated with the short keyid to" " compute the full keyid.".format(FULLY_SUPPORTED_MIN_VERSION)) short_keyid = signature["short_keyid"] # Export public key bundle (master key including with optional subkeys) public_key_bundle = gpg_export_pubkey(short_keyid, homedir) # Test if the short keyid matches the master key ... master_key_full_keyid = public_key_bundle["keyid"] if master_key_full_keyid.endswith(short_keyid.lower()): signature["keyid"] = master_key_full_keyid # ... or one of the subkeys and add the full keyid to the signature dict. else: for sub_key_full_keyid in list( public_key_bundle.get("subkeys", {}).keys()): if sub_key_full_keyid.endswith(short_keyid.lower()): signature["keyid"] = sub_key_full_keyid break # If there is still no full keyid something went wrong if not signature["keyid"]: # pragma: no cover raise ValueError("Full keyid could not be determined for signature '{}'". format(signature)) # It is okay now to remove the optional short keyid to save space signature.pop("short_keyid", None) return signature def gpg_verify_signature(signature_object, pubkey_info, content): """ Verifies the passed signature against the passed content using the passed public key, or one of its subkeys, associated by the signature's keyid. The function selects the appropriate verification algorithm (rsa or dsa) based on the "type" field in the passed public key object. signature_object: A signature object in the format: gpg.formats.SIGNATURE_SCHEMA pubkey_info: A public key object in the format: gpg.formats.PUBKEY_SCHEMA content: The content to be verified. (bytes) in_toto.gpg.exceptions.KeyExpirationError: if the passed public key has expired None. True if signature verification passes, False otherwise. 
""" in_toto.gpg.formats.PUBKEY_SCHEMA.check_match(pubkey_info) in_toto.gpg.formats.SIGNATURE_SCHEMA.check_match(signature_object) handler = SIGNATURE_HANDLERS[pubkey_info['type']] sig_keyid = signature_object["keyid"] verification_key = pubkey_info # If the keyid on the signature matches a subkey of the passed key, # we use that subkey for verification instead of the master key. if sig_keyid in list(pubkey_info.get("subkeys", {}).keys()): verification_key = pubkey_info["subkeys"][sig_keyid] creation_time = verification_key.get("creation_time") validity_period = verification_key.get("validity_period") if creation_time and validity_period and \ creation_time + validity_period < time.time(): raise in_toto.gpg.exceptions.KeyExpirationError(verification_key) return handler.gpg_verify_signature( signature_object, verification_key, content, SHA256) def gpg_export_pubkey(keyid, homedir=None): """ Calls gpg2 command line utility to export the gpg public key bundle identified by the passed keyid from the gpg keyring at the passed homedir in a format suitable for in-toto. Note: The identified key is exported including the corresponding master key and all subkeys. The executed base command is defined in constants.GPG_EXPORT_PUBKEY_COMMAND. keyid: The GPG keyid in format: securesystemslib.formats.KEYID_SCHEMA homedir: (optional) Path to the gpg keyring. If not passed the default keyring is used. ValueError: if the keyid does not match the required format. in_toto.gpg.execeptions.KeyNotFoundError: if no key or subkey was found for that keyid. None. The exported public key object in the format: gpg.formats.PUBKEY_SCHEMA """ if not securesystemslib.formats.KEYID_SCHEMA.matches(keyid): # FIXME: probably needs smarter parsing of what a valid keyid is so as to # not export more than one pubkey packet. raise ValueError("we need to export an individual key." " Please provide a valid keyid! Keyid was '{}'.".format(keyid)) homearg = "" if homedir: homearg = "--homedir {}".format(homedir).replace("\\", "/") # TODO: Consider adopting command error handling from `gpg_sign_object` # above, e.g. in a common 'run gpg command' utility function command = GPG_EXPORT_PUBKEY_COMMAND.format(keyid=keyid, homearg=homearg) process = in_toto.process.run(command, stdout=in_toto.process.PIPE, stderr=in_toto.process.PIPE) key_packet = process.stdout key_bundle = in_toto.gpg.common.get_pubkey_bundle(key_packet, keyid) return key_bundle in-toto-0.4.0/in_toto/gpg/rsa.py0000644000076500000240000001340213535414523016576 0ustar lukpstaff00000000000000# pylint: disable=invalid-name """ rsa.py Santiago Torres-Arias Nov 15, 2017 See LICENSE for licensing information. RSA-specific handling routines for signature verification and key parsing """ import binascii import cryptography.hazmat.primitives.asymmetric.rsa as rsa import cryptography.hazmat.backends as backends import cryptography.hazmat.primitives.asymmetric.padding as padding import cryptography.hazmat.primitives.asymmetric.utils as utils import cryptography.exceptions import in_toto.gpg.util import in_toto.gpg.exceptions import in_toto.gpg.formats def create_pubkey(pubkey_info): """ Create and return an RSAPublicKey object from the passed pubkey_info using pyca/cryptography. pubkey_info: The RSA pubkey info dictionary as specified by gpg.formats.RSA_PUBKEY_SCHEMA securesystemslib.exceptions.FormatError if pubkey_info does not match gpg.formats.RSA_PUBKEY_SCHEMA A cryptography.hazmat.primitives.asymmetric.rsa.RSAPublicKey based on the passed pubkey_info. 
""" in_toto.gpg.formats.RSA_PUBKEY_SCHEMA.check_match(pubkey_info) e = int(pubkey_info['keyval']['public']['e'], 16) n = int(pubkey_info['keyval']['public']['n'], 16) pubkey = rsa.RSAPublicNumbers(e, n).public_key(backends.default_backend()) return pubkey def get_pubkey_params(data): """ Parse the public key parameters as multi-precision-integers. data: the RFC4880-encoded public key parameters data buffer as described in the fifth paragraph of section 5.5.2. in_toto.gpg.exceptions.PacketParsingError: if the public key parameters are malformed None. The decoded signature buffer """ ptr = 0 modulus_length = in_toto.gpg.util.get_mpi_length(data[ptr: ptr + 2]) ptr += 2 modulus = data[ptr:ptr + modulus_length] if len(modulus) != modulus_length: # pragma: no cover raise in_toto.gpg.exceptions.PacketParsingError( "This modulus MPI was truncated!") ptr += modulus_length exponent_e_length = in_toto.gpg.util.get_mpi_length(data[ptr: ptr + 2]) ptr += 2 exponent_e = data[ptr:ptr + exponent_e_length] if len(exponent_e) != exponent_e_length: # pragma: no cover raise in_toto.gpg.exceptions.PacketParsingError( "This e MPI has been truncated!") return { "e": binascii.hexlify(exponent_e).decode('ascii'), "n": binascii.hexlify(modulus).decode("ascii"), } def get_signature_params(data): """ Parse the signature parameters as multi-precision-integers. data: the RFC4880-encoded public key parameters data buffer as described in the third paragraph of section 5.2.2. in_toto.gpg.exceptions.PacketParsingError: if the public key parameters are malformed None. The decoded signature buffer """ ptr = 0 signature_length = in_toto.gpg.util.get_mpi_length(data[ptr:ptr+2]) ptr += 2 signature = data[ptr:ptr + signature_length] if len(signature) != signature_length: # pragma: no cover raise in_toto.gpg.exceptions.PacketParsingError( "This signature was truncated!") return signature def gpg_verify_signature(signature_object, pubkey_info, content, hash_algorithm_id): """ Verify the passed signature against the passed content with the passed RSA public key using pyca/cryptography. signature_object: A signature dictionary as specified by gpg.formats.SIGNATURE_SCHEMA pubkey_info: The RSA public key info dictionary as specified by gpg.formats.RSA_PUBKEY_SCHEMA content: The signed bytes against which the signature is verified hash_algorithm_id: one of SHA1, SHA256, SHA512 (see in_toto.gpg.constants) used to verify the signature NOTE: Overrides any hash algorithm specification in "pubkey_info"'s "hashes" or "method" fields. securesystemslib.exceptions.FormatError if: signature_object does not match gpg.formats.SIGNATURE_SCHEMA pubkey_info does not match gpg.formats.RSA_PUBKEY_SCHEMA ValueError: if the passed hash_algorithm_id is not supported (see in_toto.gpg.util.get_hashing_class) True if signature verification passes and False otherwise """ in_toto.gpg.formats.SIGNATURE_SCHEMA.check_match(signature_object) in_toto.gpg.formats.RSA_PUBKEY_SCHEMA.check_match(pubkey_info) hasher = in_toto.gpg.util.get_hashing_class(hash_algorithm_id) pubkey_object = create_pubkey(pubkey_info) # zero-pad the signature due to a discrepancy between the openssl backend # and the gnupg interpretation of PKCSv1.5. Read more at: # https://github.com/in-toto/in-toto/issues/171#issuecomment-440039256 # we are skipping this if on the tests because well, how would one test this # deterministically. 
pubkey_length = len(pubkey_info['keyval']['public']['n']) signature_length = len(signature_object['signature']) if pubkey_length != signature_length: # pragma: no cover zero_pad = "0"*(pubkey_length - signature_length) signature_object['signature'] = "{}{}".format(zero_pad, signature_object['signature']) digest = in_toto.gpg.util.hash_object( binascii.unhexlify(signature_object['other_headers']), hasher(), content) try: pubkey_object.verify( binascii.unhexlify(signature_object['signature']), digest, padding.PKCS1v15(), utils.Prehashed(hasher()) ) return True except cryptography.exceptions.InvalidSignature: return False in-toto-0.4.0/in_toto/gpg/util.py0000644000076500000240000002410513535414523016770 0ustar lukpstaff00000000000000""" util.py Santiago Torres-Arias Nov 15, 2017 See LICENSE for licensing information. general-purpose utilities for binary data handling and pgp data parsing """ import struct import binascii import re import logging from distutils.version import StrictVersion # pylint: disable=no-name-in-module,import-error import cryptography.hazmat.backends as backends import cryptography.hazmat.primitives.hashes as hashing import in_toto.gpg.exceptions import in_toto.process import in_toto.gpg.constants # Inherits from in_toto base logger (c.f. in_toto.log) LOG = logging.getLogger(__name__) def get_mpi_length(data): """ parses an MPI (Multi-Precision Integer) buffer and returns the appropriate length. This is mostly done to perform bitwise to byte-wise conversion. data: The MPI data None None The length of the MPI contained at the beginning of this data buffer. """ bitlength = int(struct.unpack(">H", data)[0]) # Notice the /8 at the end, this length is the bitlength, not the length of # the data in bytes (as len reports it) return int((bitlength - 1)/8) + 1 def hash_object(headers, algorithm, content): """ Hash data prior to signature verification in conformance of the RFC4880 openPGP standard. headers: the additional OpenPGP headers as populated from gpg_generate_signature algorithm: The hash algorithm object defined by the cryptography.io hashes module content: the signed content None None The RFC4880-compliant hashed buffer """ # As per RFC4880 Section 5.2.4., we need to hash the content, # signature headers and add a very opinionated trailing header hasher = hashing.Hash(algorithm, backend=backends.default_backend()) hasher.update(content) hasher.update(headers) hasher.update(b'\x04\xff') hasher.update(struct.pack(">I", len(headers))) return hasher.finalize() def parse_packet_header(data, expected_type=None): """ Parse out packet type and header and body lengths from an RFC4880 packet. data: An RFC4880 packet as described in section 4.2 of the rfc. expected_type: (optional) Used to error out if the packet does not have the expected type. See in_toto.gpg.constants.PACKET_TYPE_* for available types. in_toto.gpg.exceptions.PacketParsingError If the new format packet length encodes a partial body length If the old format packet length encodes an indeterminate length If header or body length could not be determined If the expected_type was passed and does not match the packet type IndexError If the passed data is incomplete None. A tuple of packet type, header length, body length and packet length. (see RFC4880 4.3. 
for the list of available packet types) """ data = bytearray(data) header_len = None body_len = None # If Bit 6 of 1st octet is set we parse a New Format Packet Length, and # an Old Format Packet Lengths otherwise if data[0] & 0b01000000: # In new format packet lengths the packet type is encoded in Bits 5-0 of # the 1st octet of the packet packet_type = data[0] & 0b00111111 # The rest of the packet header is the body length header, which may # consist of one, two or five octets. To disambiguate the RFC, the first # octet of the body length header is the second octet of the packet. if data[1] < 192: header_len = 2 body_len = data[1] elif data[1] >= 192 and data[1] <= 223: header_len = 3 body_len = (data[1] - 192 << 8) + data[2] + 192 elif data[1] >= 224 and data[1] < 255: raise in_toto.gpg.exceptions.PacketParsingError("New length format " " packets of partial body lengths are not supported") elif data[1] == 255: header_len = 6 body_len = data[2] << 24 | data[3] << 16 | data[4] << 8 | data[5] else: # pragma: no cover # Unreachable: octet must be between 0 and 255 raise in_toto.gpg.exceptions.PacketParsingError("Invalid new length") else: # In old format packet lengths the packet type is encoded in Bits 5-2 of # the 1st octet and the length type in Bits 1-0 packet_type = (data[0] & 0b00111100) >> 2 length_type = data[0] & 0b00000011 # The body length is encoded using one, two, or four octets, starting # with the second octet of the packet if length_type == 0: body_len = data[1] header_len = 2 elif length_type == 1: header_len = 3 body_len = struct.unpack(">H", data[1:header_len])[0] elif length_type == 2: header_len = 5 body_len = struct.unpack(">I", data[1:header_len])[0] elif length_type == 3: raise in_toto.gpg.exceptions.PacketParsingError("Old length format " "packets of indeterminate length are not supported") else: # pragma: no cover (unreachable) # Unreachable: bits 1-0 must be one of 0 to 3 raise in_toto.gpg.exceptions.PacketParsingError("Invalid old length") if header_len is None or body_len is None: # pragma: no cover # Unreachable: One of above must have assigned lengths or raised error raise in_toto.gpg.exceptions.PacketParsingError("Could not determine " "packet length") if expected_type is not None and packet_type != expected_type: raise in_toto.gpg.exceptions.PacketParsingError("Expected packet {}, " "but got {} instead!".format(expected_type, packet_type)) return packet_type, header_len, body_len, header_len + body_len def compute_keyid(pubkey_packet_data): """ compute a keyid from an RFC4880 public-key buffer pubkey_packet_data: the public-key packet buffer None None The RFC4880-compliant hashed buffer """ hasher = hashing.Hash(hashing.SHA1(), backend=backends.default_backend()) hasher.update(b'\x99') hasher.update(struct.pack(">H", len(pubkey_packet_data))) hasher.update(bytes(pubkey_packet_data)) return binascii.hexlify(hasher.finalize()).decode("ascii") def parse_subpacket_header(data): """ Parse out subpacket header as per RFC4880 5.2.3.1. Signature Subpacket Specification. """ # NOTE: Although the RFC does not state it explicitly, the length encoded # in the header must be greater equal 1, as it includes the mandatory # subpacket type octet. # Hence, passed bytearrays like [0] or [255, 0, 0, 0, 0], which encode a # subpacket length 0 are invalid. # The caller has to deal with the resulting IndexError. 
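# Illustrative examples for the three length encodings handled below: a # first octet 0x20 encodes the one-octet length 32; the octets 0xc0 0x00 # encode the minimal two-octet length (0xc0 - 192 << 8) + (0x00 + 192) = 192; # a first octet 0xff announces a four-octet length in the next four octets.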
if data[0] < 192: length_len = 1 length = data[0] elif data[0] >= 192 and data[0] < 255: length_len = 2 length = ((data[0] - 192 << 8) + (data[1] + 192)) elif data[0] == 255: length_len = 5 length = struct.unpack(">I", data[1:length_len])[0] else: # pragma: no cover (unreachable) raise in_toto.gpg.exceptions.PacketParsingError("Invalid subpacket header") return data[length_len], length_len + 1, length - 1, length_len + length def parse_subpackets(data): """ Parse the subpackets fields. data: the unparsed subpacket octets IndexError if the subpacket octets are incomplete or malformed None A list of tuples like: [ (packet_type, data), (packet_type, data), ... ] """ parsed_subpackets = [] position = 0 while position < len(data): subpacket_type, header_len, _, subpacket_len = \ parse_subpacket_header(data[position:]) payload = data[position+header_len:position+subpacket_len] parsed_subpackets.append((subpacket_type, payload)) position += subpacket_len return parsed_subpackets def get_version(): """ Uses `gpg2 --version` to get the version info of the installed gpg2 and extracts and returns the version number. The executed base command is defined in constants.GPG_VERSION_COMMAND. Version number string, e.g. "2.1.22" """ command = in_toto.gpg.constants.GPG_VERSION_COMMAND process = in_toto.process.run(command, stdout=in_toto.process.PIPE, stderr=in_toto.process.PIPE, universal_newlines=True) full_version_info = process.stdout version_string = re.search(r'(\d\.\d\.\d+)', full_version_info).group(1) return version_string def is_version_fully_supported(): """ Compares the version of the installed gpg2 with the minimal fully supported gpg2 version (2.1.0). True if the version returned by `get_version` is greater-equal constants.FULLY_SUPPORTED_MIN_VERSION, False otherwise. """ installed_version = get_version() # Excluded so that coverage does not vary in different test environments return (StrictVersion(installed_version) >= StrictVersion(in_toto.gpg.constants.FULLY_SUPPORTED_MIN_VERSION)) def get_hashing_class(hash_algorithm_id): """ Return a pyca/cryptography hashing class reference for the passed RFC4880 hash algorithm ID. hash_algorithm_id: one of SHA1, SHA256, SHA512 (see in_toto.gpg.constants) ValueError if the passed hash_algorithm_id is not supported. A pyca/cryptography hashing class """ supported_hashing_algorithms = [in_toto.gpg.constants.SHA1, in_toto.gpg.constants.SHA256, in_toto.gpg.constants.SHA512] corresponding_hashing_classes = [hashing.SHA1, hashing.SHA256, hashing.SHA512] # Map supported hash algorithm ids to corresponding hashing classes hashing_class = dict(zip(supported_hashing_algorithms, corresponding_hashing_classes)) try: return hashing_class[hash_algorithm_id] except KeyError: raise ValueError("Hash algorithm '{}' not supported, must be one of '{}' " "(see RFC4880 9.4. Hash Algorithms).".format(hash_algorithm_id, supported_hashing_algorithms)) in-toto-0.4.0/in_toto/in_toto_keygen.py0000644000076500000240000001041313535414523020250 0ustar lukpstaff00000000000000#!/usr/bin/env python """ in_toto_keygen.py Lukas Puehringer Sachit Malik June 28, 2017 See LICENSE for licensing information. A CLI tool for creating key files, and dumping them as '<name>' (private key) and '<name>.pub' (public key). It also takes an integer as input, which specifies the length of the RSA key to be generated. By default it is set to 3072.
General Usage: in-toto-keygen [-p] [-t {ed25519, rsa}] [--bits=] Example Usage: Suppose Bob wants to create the rsa keys of size 2048 bits and dump them with file name "bob_keys" in the current directory. He also wants to encrypt the so created private key with his choice of passphrase. The keys would then be created, the private key would be encrypted and dumped as "./bob_keys" and public key would be dumped as "./bob_keys.pub". Bob will use the following command: in-toto-keygen -p -t rsa -b 2048 bob_keys in-toto-keygen -p -t ed25519 bob_keys 2 if an exception occurred during argument parsing 1 if an exception occurred 0 if no exception occurred """ import sys import argparse import logging import in_toto.util # Command line interfaces should use in_toto base logger (c.f. in_toto.log) LOG = logging.getLogger("in_toto") def parse_args(): """ A function which parses the user supplied arguments. None None Parsed arguments (args object) """ parser = argparse.ArgumentParser( description="in-toto-keygen : Generates the keys, stores them with the " "supplied name (public key as: .pub, private key as: " "), additionally prompts for a password when -p is " "supplied and encrypts the private key with the same, " "before storing") in_toto_args = parser.add_argument_group("in-toto-keygen options") in_toto_args.add_argument("-p", "--prompt", action="store_true", help="Prompts for a password and encrypts the " "private key with the same before storing") in_toto_args.add_argument("-t", "--type", type=str, choices=in_toto.util.SUPPORTED_KEY_TYPES, default=in_toto.util.KEY_TYPE_RSA, help="Type of the key to be generated") in_toto_args.add_argument("name", type=str, help="The filename of the resulting key files", metavar="") in_toto_args.add_argument("-b", "--bits", default=3072, type=int, help="The key size, or key length, of the RSA " "key.", metavar="") args = parser.parse_args() return args def main(): """ First calls parse_args to parse the arguments, and then calls either prompt_generate_and_write_rsa_keypair or generate_and_write_rsa_keypair depending upon the arguments. It then dumps the corresponding key files as: and .pub (Private key and Public key respectively) """ args = parse_args() try: if args.prompt: if args.type == in_toto.util.KEY_TYPE_RSA: in_toto.util.prompt_generate_and_write_rsa_keypair(args.name, args.bits) elif args.type == in_toto.util.KEY_TYPE_ED25519: in_toto.util.prompt_generate_and_write_ed25519_keypair(args.name) else: # pragma: no cover LOG.error( "(in-toto-keygen) Unsupported keytype: {0}".format(str(args.type))) sys.exit(1) sys.exit(0) else: if args.type == in_toto.util.KEY_TYPE_RSA: in_toto.util.generate_and_write_rsa_keypair(args.name) elif args.type == in_toto.util.KEY_TYPE_ED25519: in_toto.util.generate_and_write_ed25519_keypair(args.name) else: # pragma: no cover LOG.error( "(in-toto-keygen) Unsupported keytype: {0}".format(str(args.type))) sys.exit(1) sys.exit(0) except Exception as e: LOG.error("(in-toto-keygen) {0}: {1}".format(type(e).__name__, e)) sys.exit(1) if __name__ == "__main__": main() in-toto-0.4.0/in_toto/in_toto_mock.py0000755000076500000240000000651313535414523017730 0ustar lukpstaff00000000000000#!/usr/bin/env python """ in_toto_mock.py Shikher Verma June 12, 2017 See LICENSE for licensing information. Provides a command line interface for runlib.in_toto_mock. 
2 if an exception occurred during argument parsing 1 if an exception occurred 0 if no exception occurred usage: in-toto-mock [-h] --name -- [args] A stripped down variant of 'in-toto-run' that can be used to create unsigned link metadata for the passed command, recording all files in the current working directory as materials and products. This command should not be used to secure the supply chain but only to try out the 'in-toto-run' command. positional arguments: Command to be executed with options and arguments, separated from 'in-toto-mock' options by double dash '--'. optional arguments: -h, --help show this help message and exit required named arguments: -n , --name Name used to associate the resulting link metadata with the corresponding step defined in an in-toto layout. examples: Generate link metadata 'foo' for the activity of creating file 'bar'. in-toto-mock --name foo -- touch bar """ import sys import argparse import logging import in_toto.runlib # Command line interfaces should use in_toto base logger (c.f. in_toto.log) LOG = logging.getLogger("in_toto") def main(): """Parse arguments and call in_toto_mock. """ parser = argparse.ArgumentParser( formatter_class=argparse.RawDescriptionHelpFormatter, description=""" A stripped down variant of 'in-toto-run' that can be used to create unsigned link metadata for the passed command, recording all files in the current working directory as materials and products. This command should not be used to secure the supply chain but only to try out the 'in-toto-run' command.""") parser.usage = "%(prog)s [-h] --name -- [args]" parser.epilog = """ examples: Generate link metadata 'foo' for the activity of creating file 'bar'. {prog} --name foo -- touch bar """.format(prog=parser.prog) named_args = parser.add_argument_group("required named arguments") # FIXME: Do we limit the allowed characters for the name? named_args.add_argument("-n", "--name", type=str, required=True, metavar="", help=( "Name used to associate the resulting link metadata with the" " corresponding step defined in an in-toto layout.")) # FIXME: This is not yet ideal. # What should we do with tokens like > or ; ? parser.add_argument("link_cmd", nargs="+", metavar="", help=( "Command to be executed with options and arguments, separated from" " 'in-toto-mock' options by double dash '--'.")) args = parser.parse_args() # in-toto-mock should not be used to secure the supply chain but only to try # out in-toto-run with max. user feedback, hence we set a verbose log level LOG.setLevel(logging.INFO) try: in_toto.runlib.in_toto_mock(args.name, args.link_cmd) except Exception as e: LOG.error("(in-toto-mock) {0}: {1}".format(type(e).__name__, e)) sys.exit(1) sys.exit(0) if __name__ == "__main__": main() in-toto-0.4.0/in_toto/in_toto_record.py0000644000076500000240000003010113535414523020240 0ustar lukpstaff00000000000000#!/usr/bin/env python """ in_toto_record.py Lukas Puehringer Nov 28, 2016 See LICENSE for licensing information. Provides a command line interface for runlib.in_toto_record_start and runlib.in_toto_record_stop. 2 if an exception occurred during argument parsing 1 if an exception occurred 0 if no exception occurred usage: in-toto-record [-h] {start,stop} ... Creates a signed link metadata file in two steps, in order to provide evidence for supply chain steps that cannot be carried out by a single command (for which 'in-toto-run' should be used). Returns nonzero value on failure and zero otherwise. 
positional arguments: {start,stop} start Creates a preliminary link file recording the paths and hashes of the passed materials and signs it with the passed functionary's key. The resulting link file is stored as '...link-unfinished'. stop Expects preliminary link file '...link- unfinished' in the CWD, signed by the passed functionary's key. If found, it records and adds the paths and hashes of the passed products to the link metadata file, updates the signature and renames the file to '..link'. optional arguments: -h, --help show this help message and exit -k , --key Path to a PEM formatted private key file used to sign the resulting link metadata. (passing one of '--key' or '--gpg' is required) -t {ed25519,rsa}, --key-type {ed25519,rsa} Specify the key-type of the key specified by the '--key' option. If '--key-type' is not passed, default is "rsa". -g [], --gpg [] GPG keyid used to sign the resulting link metadata. When '--gpg' is passed without keyid, the keyring's default GPG key is used. (passing one of '--key' or ' --gpg' is required) --gpg-home Path to GPG keyring to load GPG key identified by '-- gpg' option. If '--gpg-home' is not passed, the default GPG keyring is used. --exclude [ ...] Do not record 'materials/products' that match one of . Passed exclude patterns override previously set patterns, using e.g.: environment variables or RCfiles. See ARTIFACT_EXCLUDE_PATTERNS documentation for additional info. --base-path Record 'materials/products' relative to . If not set, current working directory is used as base path. --lstrip-paths [ ...] Record the path of artifacts in link metadata after left stripping the specified from the full path. If there are multiple prefixes specified, only a single prefix can match the path of any artifact and that is then left stripped. All prefixes are checked to ensure none of them are a left substring of another. -v, --verbose Verbose execution. -q, --quiet Suppress all output. optional arguments (start subcommand only): -m [ ...], --materials [ ...] Paths to files or directories, whose paths and hashes are stored in the resulting link metadata's material section when running the 'start' subcommand. Symlinks are followed. optional arguments (stop subcommand only): -p [ ...], --products [ ...] Paths to files or directories, whose paths and hashes are stored in the resulting link metadata's product section when running the 'stop' subcommand. Symlinks are followed. required named arguments: -n , --step-name Name used to associate the resulting link metadata with the corresponding step defined in an in-toto layout. examples: Create link metadata file in two commands, signing it with the private key loaded from 'key_file', recording all files in the CWD as materials (on start), and as products (on stop). in-toto-record start -n edit-files -k path/to/key_file -m . in-toto-record stop -n edit-files -k path/to/key_file -p . Create link metadata file signed with the default GPG key from the default GPG keychain and record a file named 'foo' as material and product. in-toto-record start -n edit-foo --gpg -m path/to/foo in-toto-record stop -n edit-foo --gpg -p path/to/foo """ import sys import argparse import logging import in_toto.util import in_toto.user_settings import in_toto.runlib from in_toto.common_args import (EXCLUDE_ARGS, EXCLUDE_KWARGS, BASE_PATH_ARGS, BASE_PATH_KWARGS, LSTRIP_PATHS_ARGS, LSTRIP_PATHS_KWARGS) # Command line interfaces should use in_toto base logger (c.f. 
in_toto.log) LOG = logging.getLogger("in_toto") def main(): """Parse arguments, load key from disk (if passed) and call either runlib.in_toto_record_start or runlib.in_toto_record_stop depending on the specified subcommand. """ parser = argparse.ArgumentParser( formatter_class=argparse.RawDescriptionHelpFormatter, description=""" Creates a signed link metadata file in two steps, in order to provide evidence for supply chain steps that cannot be carried out by a single command (for which 'in-toto-run' should be used). Returns nonzero value on failure and zero otherwise.""") parser.epilog = """ examples: Create link metadata file in two commands, signing it with the private key loaded from 'key_file', recording all files in the CWD as materials (on start), and as products (on stop). {prog} start -n edit-files -k path/to/key_file -m . {prog} stop -n edit-files -k path/to/key_file -p . Create link metadata file signed with the default GPG key from the default GPG keychain and record a file named 'foo' as material and product. {prog} start -n edit-foo --gpg -m path/to/foo {prog} stop -n edit-foo --gpg -p path/to/foo """.format(prog=parser.prog) # The subparsers inherit the arguments from the parent parser parent_parser = argparse.ArgumentParser(add_help=False) subparsers = parser.add_subparsers(dest="command") # Workaround to make subcommands mandatory in Python>=3.3 # https://bugs.python.org/issue9253#msg186387 subparsers.required = True parent_named_args = parent_parser.add_argument_group( "required named arguments") # FIXME: Do we limit the allowed characters for the name? parent_named_args.add_argument("-n", "--step-name", type=str, required=True, metavar="", help=( "Name used to associate the resulting link metadata with the" " corresponding step defined in an in-toto layout.")) # Either a key or a gpg key id have to be specified but not both key_args_group = parent_parser.add_mutually_exclusive_group(required=True) key_args_group.add_argument("-k", "--key", type=str, metavar="", help=( "Path to a PEM formatted private key file used to sign the resulting" " link metadata." " (passing one of '--key' or '--gpg' is required)")) parent_parser.add_argument("-t", "--key-type", dest="key_type", type=str, choices=in_toto.util.SUPPORTED_KEY_TYPES, default=in_toto.util.KEY_TYPE_RSA, help=( "Specify the key-type of the key specified by the '--key' option. If" " '--key-type' is not passed, default is \"rsa\".")) key_args_group.add_argument("-g", "--gpg", nargs="?", const=True, metavar="", help=( "GPG keyid used to sign the resulting link metadata. When '--gpg' is" " passed without keyid, the keyring's default GPG key is used." " (passing one of '--key' or '--gpg' is required)")) parent_parser.add_argument("--gpg-home", dest="gpg_home", type=str, metavar="", help=( "Path to GPG keyring to load GPG key identified by '--gpg' option. 
If" " '--gpg-home' is not passed, the default GPG keyring is used.")) parent_parser.add_argument(*EXCLUDE_ARGS, **EXCLUDE_KWARGS) parent_parser.add_argument(*BASE_PATH_ARGS, **BASE_PATH_KWARGS) parent_parser.add_argument(*LSTRIP_PATHS_ARGS, **LSTRIP_PATHS_KWARGS) verbosity_args = parent_parser.add_mutually_exclusive_group(required=False) verbosity_args.add_argument("-v", "--verbose", dest="verbose", help="Verbose execution.", action="store_true") verbosity_args.add_argument("-q", "--quiet", dest="quiet", help="Suppress all output.", action="store_true") subparser_start = subparsers.add_parser("start", parents=[parent_parser], help=( "Creates a preliminary link file recording the paths and hashes of" " the passed materials and signs it with the passed functionary's" " key. The resulting link file is stored as" " '...link-unfinished'.")) subparser_stop = subparsers.add_parser("stop", parents=[parent_parser], help=( "Expects preliminary link file '...link-unfinished'" " in the CWD, signed by the passed functionary's key. If found, it" " records and adds the paths and hashes of the passed products to the" " link metadata file, updates the signature and renames the file to" " '..link'.")) subparser_start.add_argument("-m", "--materials", type=str, required=False, nargs='+', metavar="", help=( "Paths to files or directories, whose paths and hashes are stored in the" " resulting link metadata's material section when running the 'start'" " subcommand. Symlinks are followed.")) subparser_stop.add_argument("-p", "--products", type=str, required=False, nargs='+', metavar="", help=( "Paths to files or directories, whose paths and hashes are stored in the" " resulting link metadata's product section when running the 'stop'" " subcommand. Symlinks are followed.")) args = parser.parse_args() LOG.setLevelVerboseOrQuiet(args.verbose, args.quiet) # Override defaults in settings.py with environment variables and RCfiles in_toto.user_settings.set_settings() # If `--gpg` was set without argument it has the value `True` and # we will try to sign with the default key gpg_use_default = (args.gpg is True) # Otherwise gpg_keyid stays either None or gets the passed argument assigned gpg_keyid = None if not gpg_use_default and args.gpg: gpg_keyid = args.gpg try: # We load the key here because it might prompt the user for a password in # case the key is encrypted. Something that should not happen in the lib. key = None if args.key: key = in_toto.util.import_private_key_from_file(args.key, args.key_type) if args.command == "start": in_toto.runlib.in_toto_record_start(args.step_name, args.materials, signing_key=key, gpg_keyid=gpg_keyid, gpg_use_default=gpg_use_default, gpg_home=args.gpg_home, exclude_patterns=args.exclude_patterns, base_path=args.base_path, lstrip_paths=args.lstrip_paths) # Mutually exclusiveness is guaranteed by argparser else: # args.command == "stop": in_toto.runlib.in_toto_record_stop(args.step_name, args.products, signing_key=key, gpg_keyid=gpg_keyid, gpg_use_default=gpg_use_default, gpg_home=args.gpg_home, exclude_patterns=args.exclude_patterns, base_path=args.base_path, lstrip_paths=args.lstrip_paths) except Exception as e: LOG.error("(in-toto-record {0}) {1}: {2}" .format(args.command, type(e).__name__, e)) sys.exit(1) sys.exit(0) if __name__ == "__main__": main() in-toto-0.4.0/in_toto/in_toto_run.py0000755000076500000240000002625613535414523017611 0ustar lukpstaff00000000000000#!/usr/bin/env python """ in_toto_run.py Lukas Puehringer June 27, 2016 See LICENSE for licensing information. 
Provides a command line interface for runlib.in_toto_run. 2 if an exception occurred during argument parsing 1 if an exception occurred 0 if no exception occurred usage: in-toto-run [optional arguments] -- [args] Executes the passed command and records paths and hashes of 'materials' (i.e. files before command execution) and 'products' (i.e. files after command execution) and stores them together with other information (executed command, return value, stdout, stderr, ...) to a link metadata file, which is signed with the passed key. Returns nonzero value on failure and zero otherwise. positional arguments: Command to be executed with options and arguments, separated from 'in-toto-run' options by double dash '--'. optional arguments: -h, --help show this help message and exit -m [ ...], --materials [ ...] Paths to files or directories, whose paths and hashes are stored in the resulting link metadata before the command is executed. Symlinks are followed. -p [ ...], --products [ ...] Paths to files or directories, whose paths and hashes are stored in the resulting link metadata after the command is executed. Symlinks are followed. --gpg-home Path to GPG keyring to load GPG key identified by '--gpg' option. If '--gpg-home' is not passed, the default GPG keyring is used. -s, --record-streams If passed 'stdout' and 'stderr' of the executed command are duplicated and stored in the resulting link metadata. -x, --no-command Generate link metadata without executing a command, e.g. for a 'signed off by' step. --exclude [ ...] Do not record 'materials/products' that match one of . Passed exclude patterns override previously set patterns, using e.g.: environment variables or RCfiles. See ARTIFACT_EXCLUDE_PATTERNS documentation for additional info. --base-path Record 'materials/products' relative to . If not set, current working directory is used as base path. --lstrip-paths [ ...] Record the path of artifacts in link metadata after left stripping the specified from the full path. If there are multiple prefixes specified, only a single prefix can match the path of any artifact and that is then left stripped. All prefixes are checked to ensure none of them are a left substring of another. -t {ed25519,rsa}, --key-type {ed25519,rsa} Specify the key-type of the key specified by the '--key' option. If '--key-type' is not passed, default is "rsa". -v, --verbose Verbose execution. -q, --quiet Suppress all output. required named arguments: -n , --step-name Name used to associate the resulting link metadata with the corresponding step defined in an in-toto layout. -k , --key Path to a PEM formatted private key file used to sign the resulting link metadata. (passing one of '--key' or '--gpg' is required) -g [], --gpg [] GPG keyid used to sign the resulting link metadata. When '--gpg' is passed without keyid, the keyring's default GPG key is used. (passing one of '--key' or '--gpg' is required) examples: Tag a git repo, storing files in CWD as products, signing the resulting link file with the private key loaded from 'key_file'. in-toto-run -n tag -p . -k key_file -- git tag v1.0 Create tarball, storing files in 'project' directory as materials and the tarball as product, signing the link file with GPG key '...7E0C8A17'.
in-toto-run -n package -m project -p project.tar.gz \ -g 8465A1E2E0FB2B40ADB2478E18FB3F537E0C8A17 \ -- tar czf project.tar.gz project """ import sys import argparse import logging import in_toto.user_settings from in_toto import (util, runlib) from in_toto.common_args import (EXCLUDE_ARGS, EXCLUDE_KWARGS, BASE_PATH_ARGS, BASE_PATH_KWARGS, LSTRIP_PATHS_ARGS, LSTRIP_PATHS_KWARGS) # Command line interfaces should use in_toto base logger (c.f. in_toto.log) LOG = logging.getLogger("in_toto") def main(): """Parse arguments, load key from disk (prompts for password if key is encrypted) and call in_toto_run. """ parser = argparse.ArgumentParser( formatter_class=argparse.RawDescriptionHelpFormatter, description=""" Executes the passed command and records paths and hashes of 'materials' (i.e. files before command execution) and 'products' (i.e. files after command execution) and stores them together with other information (executed command, return value, stdout, stderr, ...) to a link metadata file, which is signed with the passed key. Returns nonzero value on failure and zero otherwise.""") parser.usage = ("%(prog)s [optional arguments]" " -- [args]") parser.epilog = """ examples: Tag a git repo, storing files in CWD as products, signing the resulting link file with the private key loaded from 'key_file'. {prog} -n tag -p . -k key_file -- git tag v1.0 Create tarball, storing files in 'project' directory as materials and the tarball as product, signing the link file with GPG key '...7E0C8A17'. {prog} -n package -m project -p project.tar.gz \\ -g 8465A1E2E0FB2B40ADB2478E18FB3F537E0C8A17 \\ -- tar czf project.tar.gz project """.format(prog=parser.prog) named_args = parser.add_argument_group("required named arguments") # FIXME: Do we limit the allowed characters for the name? named_args.add_argument("-n", "--step-name", type=str, required=True, metavar="", help=( "Name used to associate the resulting link metadata with the" " corresponding step defined in an in-toto layout.")) parser.add_argument("-m", "--materials", type=str, required=False, nargs='+', metavar="", help=( "Paths to files or directories, whose paths and hashes are stored in the" " resulting link metadata before the command is executed. Symlinks are" " followed.")) parser.add_argument("-p", "--products", type=str, required=False, nargs='+', metavar="", help=( "Paths to files or directories, whose paths and hashes are stored in the" " resulting link metadata after the command is executed. Symlinks are" " followed.")) named_args.add_argument("-k", "--key", type=str, metavar="", help=( "Path to a PEM formatted private key file used to sign the resulting" " link metadata." " (passing one of '--key' or '--gpg' is required)")) parser.add_argument("-t", "--key-type", dest="key_type", type=str, choices=util.SUPPORTED_KEY_TYPES, default=util.KEY_TYPE_RSA, help=( "Specify the key-type of the key specified by the '--key' option. If" " '--key-type' is not passed, default is \"rsa\".")) named_args.add_argument("-g", "--gpg", nargs="?", const=True, metavar="", help=( "GPG keyid used to sign the resulting link metadata. When '--gpg' is" " passed without keyid, the keyring's default GPG key is used." " (passing one of '--key' or '--gpg' is required)")) parser.add_argument("--gpg-home", dest="gpg_home", type=str, metavar="", help=( "Path to GPG keyring to load GPG key identified by '--gpg' option.
If" " '--gpg-home' is not passed, the default GPG keyring is used.")) parser.add_argument("-s", "--record-streams", dest="record_streams", default=False, action="store_true", help=( "If passed 'stdout' and 'stderr' of the executed command are duplicated" " and stored in the resulting link metadata.")) parser.add_argument("-x", "--no-command", dest="no_command", default=False, action="store_true", help=( "Generate link metadata without executing a command, e.g. for a 'signed" " off by' step.")) parser.add_argument(*EXCLUDE_ARGS, **EXCLUDE_KWARGS) parser.add_argument(*BASE_PATH_ARGS, **BASE_PATH_KWARGS) parser.add_argument(*LSTRIP_PATHS_ARGS, **LSTRIP_PATHS_KWARGS) verbosity_args = parser.add_mutually_exclusive_group(required=False) verbosity_args.add_argument("-v", "--verbose", dest="verbose", help="Verbose execution.", action="store_true") verbosity_args.add_argument("-q", "--quiet", dest="quiet", help="Suppress all output.", action="store_true") # FIXME: This is not yet ideal. # What should we do with tokens like > or ; ? parser.add_argument("link_cmd", nargs="*", metavar="", help=( "Command to be executed with options and arguments, separated from" " 'in-toto-run' options by double dash '--'.")) args = parser.parse_args() LOG.setLevelVerboseOrQuiet(args.verbose, args.quiet) # Override defaults in settings.py with environment variables and RCfiles in_toto.user_settings.set_settings() # Regular signing and GPG signing are mutually exclusive if (args.key is None) == (args.gpg is None): parser.print_usage() parser.error("Specify either `--key ` or `--gpg []`") # If `--gpg` was set without argument it has the value `True` and # we will try to sign with the default key gpg_use_default = (args.gpg is True) # Otherwise we interpret it as actual keyid gpg_keyid = None if args.gpg is not True: gpg_keyid = args.gpg # If no_command is specified run in_toto_run without executing a command if args.no_command: args.link_cmd = [] elif not args.link_cmd: # pragma: no branch parser.print_usage() parser.error("No command specified." " Please specify (or use the --no-command option)") try: # We load the key here because it might prompt the user for a password in # case the key is encrypted. Something that should not happen in the lib. key = None if args.key: key = util.import_private_key_from_file(args.key, args.key_type) runlib.in_toto_run( args.step_name, args.materials, args.products, args.link_cmd, args.record_streams, key, gpg_keyid, gpg_use_default, args.gpg_home, args.exclude_patterns, args.base_path, args.lstrip_paths) except Exception as e: LOG.error("(in-toto-run) {0}: {1}".format(type(e).__name__, e)) sys.exit(1) sys.exit(0) if __name__ == "__main__": main() in-toto-0.4.0/in_toto/in_toto_sign.py0000644000076500000240000003502413535414523017733 0ustar lukpstaff00000000000000#!/usr/bin/env python """ in_toto_sign.py Sachit Malik Lukas Puehringer June 13, 2017 See LICENSE for licensing information. Provides command line interface to sign in-toto link or layout metadata or to verify its signatures. usage: in-toto-sign [-h] -f [-k [ ...]] [-t [ ...]] [-g [ [ ...]]] [--gpg-home ] [-o ] [-a] [--verify] [-v | -q] Provides command line interface to sign in-toto link or layout metadata or verify its signatures, with options to: * replace (default) or add signature(s): + layout metadata can be signed by multiple keys at once, + link metadata can only be signed by one key at a time. * write signed metadata to a specified path. 
If no output path is specified, + layout metadata is written to the path of the input file, + link metadata is written to '..link'. * verify signatures Returns nonzero value on failure and zero otherwise. optional arguments: -h, --help show this help message and exit -k [ ...], --key [ ...] Path(s) to PEM formatted key file(s), used to sign the passed link or layout metadata or to verify its signatures. -t [ ...], --key-type [ ...] Specify the key-type of the keys specified by the '--key' option. Number of values should be the same as the number of keys specified by the '--key' option. If '--key-type' is not passed, default key_type of all keys is assumed to be \"rsa\". -g [ [ ...]], --gpg [ [ ...]] GPG keyid used to sign the passed link or layout metadata or to verify its signatures. If passed without keyid, the default GPG key is used. --gpg-home Path to GPG keyring to load GPG key identified by '--gpg' option. If '--gpg-home' is not passed, the default GPG keyring is used. -o , --output Path to store metadata file to be signed. If not passed, layout metadata is written to the path of the input file and link metadata is written to '..link' -a, --append If passed, signatures are added rather than replacing existing signatures. This option is only available for layout metadata. --verify Verify signature(s) of passed link or layout metadata. -v, --verbose Verbose execution. -q, --quiet Suppress all output. required named arguments: -f , --file Path to link or layout file to be signed or verified. examples: Append two signatures to 'unsigned.layout' file and write to 'root.layout'. in-toto-sign -f unsigned.layout -k priv_key1 priv_key2 -o root.layout -a Replace signature in link file. And write to default filename, i.e. 'package..link'. in-toto-sign -f package.2f89b927.link -k priv_key Verify layout signed with 3 keys. in-toto-sign -f root.layout -k pub_key0 pub_key1 pub_key2 --verify Sign layout with default gpg key in default gpg keyring. in-toto-sign -f root.layout --gpg Verify layout with a gpg key identified by keyid '...439F3C2'. in-toto-sign -f root.layout --verify \ --gpg 3BF8135765A07E21BD12BF89A5627F6BF439F3C2 """ import sys import six import argparse import logging from in_toto import exceptions, util from in_toto.models.link import FILENAME_FORMAT from in_toto.models.metadata import Metablock import in_toto.gpg.functions import securesystemslib.formats # Command line interfaces should use in_toto base logger (c.f. in_toto.log) LOG = logging.getLogger("in_toto") def _sign_and_dump_metadata(metadata, args): """ Internal method to sign link or layout metadata and dump it to disk. metadata: Metablock object (contains Link or Layout object) args: see argparser SystemExit(0) if signing is successful SystemExit(2) if any exception occurs """ try: if not args.append: metadata.signatures = [] signature = None # If the cli tool was called with `--gpg [KEYID ...]` `args.gpg` is # a list (not None) and we will try to sign with gpg. # If `--gpg-home` was not set, args.gpg_home is None and the signer tries # to use the default gpg keyring.
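    # Illustrative mapping (hypothetical invocations; `--gpg` is declared
    # with nargs="*" below, so argparse populates `args.gpg` as follows):
    #   in-toto-sign -f root.layout --gpg          ->  args.gpg == []
    #   in-toto-sign -f root.layout --gpg KEYID    ->  args.gpg == ["KEYID"]
    #   in-toto-sign -f root.layout -k priv_key    ->  args.gpg is None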
if args.gpg is not None: # If `--gpg` was passed without argument we sign with the default key # Excluded so that coverage does not vary in different test environments if len(args.gpg) == 0: # pragma: no cover signature = metadata.sign_gpg(gpg_keyid=None, gpg_home=args.gpg_home) # Otherwise we sign with each passed keyid for keyid in args.gpg: securesystemslib.formats.KEYID_SCHEMA.check_match(keyid) signature = metadata.sign_gpg(gpg_keyid=keyid, gpg_home=args.gpg_home) # Alternatively we iterate over passed private key paths `--key KEYPATH # ...` load the corresponding key from disk and sign with it elif args.key is not None: # pragma: no branch if args.key_type is None: args.key_type = [util.KEY_TYPE_RSA] * len(args.key) if len(args.key_type) != len(args.key): raise securesystemslib.exceptions.FormatError( "number of key_types should match with the number" " of keys specified") for idx, key_path in enumerate(args.key): key = util.import_private_key_from_file(key_path, args.key_type[idx]) signature = metadata.sign(key) # If `--output` was specified we store the signed link or layout metadata # to that location no matter what if args.output: out_path = args.output # Otherwise, in case of links, we build the filename using the link/step # name and the keyid of the created signature (there is only one for links) elif metadata.type_ == "link": in_toto.formats.ANY_SIGNATURE_SCHEMA.check_match(signature) keyid = signature["keyid"] out_path = FILENAME_FORMAT.format(step_name=metadata.signed.name, keyid=keyid) # In case of layouts we just override the input file. elif metadata.type_ == "layout": # pragma: no branch out_path = args.file LOG.info("Dumping {0} to '{1}'...".format(metadata.type_, out_path)) metadata.dump(out_path) sys.exit(0) except Exception as e: LOG.error("The following error occurred while signing: " "{}".format(e)) sys.exit(2) def _verify_metadata(metadata, args): """ Internal method to verify link or layout signatures. metadata: Metablock object (contains Link or Layout object) args: see argparser SystemExit(0) if verification passes SystemExit(1) if verification fails SystemExit(2) if any exception occurs """ try: # Load pubkeys from disk .... if args.key is not None: pub_key_dict = util.import_public_keys_from_files_as_dict(args.key, args.key_type) # ... or from gpg keyring elif args.gpg is not None: # pragma: no branch pub_key_dict = util.import_gpg_public_keys_from_keyring_as_dict( args.gpg, args.gpg_home) for keyid, verification_key in six.iteritems(pub_key_dict): metadata.verify_signature(verification_key) LOG.info("Signature verification passed for keyid '{}'" .format(keyid)) sys.exit(0) except exceptions.SignatureVerificationError as e: LOG.error("Signature verification failed: {}".format(e)) sys.exit(1) except Exception as e: LOG.error("The following error occurred while verifying signatures: " "{}".format(e)) sys.exit(2) def _load_metadata(file_path): """ Loads Metablock (link or layout metadata) file from disk file_path: path to link or layout metadata file SystemExit(2) if any exception occurs in-toto Metablock object (contains Link or Layout object) """ try: return Metablock.load(file_path) except Exception as e: LOG.error("The following error occurred while loading the file '{}': " "{}".format(file_path, e)) sys.exit(2) def main(): """Parse arguments, load link or layout metadata file and either sign metadata file or verify its signatures. 
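  The overall flow below, sketched with this module's own helpers
  (illustrative):

    metadata = _load_metadata(args.file)
    if args.verify:
      _verify_metadata(metadata, args)
    else:
      _sign_and_dump_metadata(metadata, args)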
""" parser = argparse.ArgumentParser( formatter_class=argparse.RawDescriptionHelpFormatter, description=""" Provides command line interface to sign in-toto link or layout metadata or verify its signatures, with options to: * replace (default) or add signature(s): + layout metadata can be signed by multiple keys at once, + link metadata can only be signed by one key at a time. * write signed metadata to a specified path. If no output path is specified, + layout metadata is written to the path of the input file, + link metadata is written to '..link'. * verify signatures Returns nonzero value on failure and zero otherwise.""") parser.epilog = """ examples: Append two signatures to 'unsigned.layout' file and write to 'root.layout'. {prog} -f unsigned.layout -k priv_key1 priv_key2 -o root.layout -a Replace signature in link file. And write to default filename, i.e. 'package..link'. {prog} -f package.2f89b927.link -k priv_key Verify layout signed with 3 keys. {prog} -f root.layout -k pub_key0 pub_key1 pub_key2 --verify Sign layout with default gpg key in default gpg keyring. {prog} -f root.layout --gpg Verify layout with a gpg key identified by keyid '...439F3C2'. {prog} -f root.layout --verify \\ --gpg 3BF8135765A07E21BD12BF89A5627F6BF439F3C2 """.format(prog=parser.prog) named_args = parser.add_argument_group("required named arguments") named_args.add_argument("-f", "--file", type=str, required=True, metavar="", help=( "Path to link or layout file to be signed or verified.")) parser.add_argument("-k", "--key", nargs="+", metavar="", help=( "Path(s) to PEM formatted key file(s), used to sign the passed link or" " layout metadata or to verify its signatures.")) parser.add_argument("-t", "--key-type", dest="key_type", type=str, choices=in_toto.util.SUPPORTED_KEY_TYPES, nargs="+", help=( "Specify the key-type of the keys specified by the '--key'" " option. Number of values should be the same as the number of keys" " specified by the '--key' option. If '--key-type' is not passed," " default key_type of all keys is assumed to be \"rsa\".")) parser.add_argument("-g", "--gpg", nargs="*", metavar="", help=( "GPG keyid used to sign the passed link or layout metadata or to verify" " its signatures. If passed without keyid, the default GPG key is" " used.")) parser.add_argument("--gpg-home", dest="gpg_home", type=str, metavar="", help=( "Path to GPG keyring to load GPG key identified by '--gpg' option. If" " '--gpg-home' is not passed, the default GPG keyring is used.")) # Only when signing parser.add_argument("-o", "--output", type=str, metavar="", help=( "Path to store metadata file to be signed. If not passed, layout" " metadata is written to the path of the input file and link metadata is" " written to '..link'")) # Only when signing parser.add_argument("-a", "--append", action="store_true", help=( "If passed, signatures are added rather than replacing existing" " signatures. 
This option is only available for layout metadata.")) parser.add_argument("--verify", action="store_true", help="Verify signature(s) of passed link or layout metadata.") verbosity_args = parser.add_mutually_exclusive_group(required=False) verbosity_args.add_argument("-v", "--verbose", dest="verbose", help="Verbose execution.", action="store_true") verbosity_args.add_argument("-q", "--quiet", dest="quiet", help="Suppress all output.", action="store_true") args = parser.parse_args() LOG.setLevelVerboseOrQuiet(args.verbose, args.quiet) # Additional argparse sanitization # NOTE: This tool is starting to have many inter-dependent argument # restrictions. Maybe we should make it less sophisticated at some point. if args.verify and (args.append or args.output): parser.print_help() parser.error("conflicting arguments: don't specify any of" " 'append' or 'output' when verifying signatures") # Regular signing and GPG signing are mutually exclusive if (args.key is None) == (args.gpg is None): parser.print_help() parser.error("wrong arguments: specify either `--key PATH [PATH ...]`" " or `--gpg [KEYID [KEYID ...]]`") # For gpg verification we must specify a keyid (no default key is loaded) if args.verify and args.gpg is not None and len(args.gpg) < 1: parser.print_help() parser.error("missing arguments: specify at least one keyid for GPG" " signature verification (`--gpg KEYID ...`)") metadata = _load_metadata(args.file) # Specific command line argument restrictions if we deal with links if metadata.type_ == "link": # Above we check that it's either `--key ...` or `--gpg ...` # Here we check that it is not more than one in each case when dealing # with links link_error_message = ("Link metadata is associated with a" " single functionary and is usually namespaced accordingly:" " '..link'.") if ((args.key is not None and len(args.key) > 1) or (args.gpg is not None and len(args.gpg) > 1)): parser.print_help() parser.error("too many arguments: {} Hence signing Link metadata" " with multiple keys is not allowed.".format(link_error_message)) if args.append: parser.print_help() parser.error("wrong arguments: {}. Hence adding signatures to" " existing signatures on Link metadata is not allowed." .format(link_error_message)) if args.verify: _verify_metadata(metadata, args) else: _sign_and_dump_metadata(metadata, args) if __name__ == "__main__": main() in-toto-0.4.0/in_toto/in_toto_verify.py0000755000076500000240000002132113535414523020275 0ustar lukpstaff00000000000000#!/usr/bin/env python """ in_toto_verify.py Lukas Puehringer Oct 3, 2016 See LICENSE for licensing information. Provides a command line interface for verifylib.in_toto_verify. 2 if an exception occurred during argument parsing 1 if an exception occurred (verification failed) 0 if no exception occurred (verification passed) usage: in-toto-verify [optional arguments] Verifies that a software supply chain was carried out according to the passed in-toto supply chain layout. The verification includes the following checks: * the layout was signed with the passed key(s), * the layout has not expired, * signed link metadata files exist for each step of the layout in the CWD, * link files are provided by the required number of authorized functionaries, * the materials and products for each step, as reported by the corresponding link files, adhere to the artifact rules specified by the step. Additionally, inspection commands defined in the layout are executed sequentially, followed by applying the inspection's artifact rules.
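A rough library-level equivalent of this command (illustrative sketch; it uses the same functions this CLI calls, and assumes 'root.layout' and 'key_file.pub' exist):

  from in_toto.models.metadata import Metablock
  from in_toto import util, verifylib

  layout = Metablock.load("root.layout")
  # None key_types means the keys are assumed to be "rsa" (c.f. --key-types)
  layout_key_dict = util.import_public_keys_from_files_as_dict(
      ["key_file.pub"], None)
  verifylib.in_toto_verify(layout, layout_key_dict, ".")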
The command returns a nonzero value if verification fails and zero otherwise. optional arguments: -h, --help show this help message and exit --link-dir Path to directory where link metadata files for steps defined in the root layout should be loaded from. If not passed, links are loaded from the current working directory. --gpg-home Path to GPG keyring to load GPG key identified by '--gpg' option. If '--gpg-home' is not passed, the default GPG keyring is used. -v, --verbose Verbose execution. -q, --quiet Suppress all output. required named arguments: -l , --layout Path to root layout specifying the software supply chain to be verified. -k [ ...], --layout-keys [ ...] Path(s) to PEM formatted public key(s), used to verify the passed root layout's signature(s). Passing at least one key using '--layout-keys' and/or '--gpg' is required. For each passed key the layout must carry a valid signature. -g [ ...], --gpg [ ...] GPG keyid, identifying a public key in the GPG keychain, used to verify the passed root layout's signature(s). Passing at least one key using '--layout-keys' and/or '--gpg' is required. For each passed key the layout must carry a valid signature. examples: Verify supply chain in 'root.layout', signed with private part of 'key_file.pub'. in-toto-verify --layout root.layout --layout-keys key_file.pub Verify supply chain like above but load links corresponding to steps of root.layout from 'link_dir'. in-toto-verify --layout root.layout --layout-keys key_file.pub \ --link-dir link_dir Verify supply chain in 'root.layout', signed with GPG key '...7E0C8A17', whose public part can be found in the GPG keyring at '~/.gnupg'. in-toto-verify --layout root.layout \ --gpg 8465A1E2E0FB2B40ADB2478E18FB3F537E0C8A17 --gpg-home ~/.gnupg """ import sys import argparse import logging import in_toto.util from in_toto import verifylib from in_toto.models.metadata import Metablock # Command line interfaces should use in_toto base logger (c.f. in_toto.log) LOG = logging.getLogger("in_toto") def main(): """Parse arguments and call in_toto_verify. """ parser = argparse.ArgumentParser( formatter_class=argparse.RawDescriptionHelpFormatter, description=""" Verifies that a software supply chain was carried out according to the passed in-toto supply chain layout. The verification includes the following checks: * the layout was signed with the passed key(s), * the layout has not expired, * signed link metadata files exist for each step of the layout in the CWD, * link files are provided by the required number of authorized functionaries, * the materials and products for each step, as reported by the corresponding link files, adhere to the artifact rules specified by the step. Additionally, inspection commands defined in the layout are executed sequentially, followed by applying the inspection's artifact rules. The command returns a nonzero value if verification fails and zero otherwise. """) parser.usage = "%(prog)s [optional arguments]" parser.epilog = """ examples: Verify supply chain in 'root.layout', signed with private part of 'key_file.pub'. {prog} --layout root.layout --layout-keys key_file.pub Verify supply chain like above but load links corresponding to steps of root.layout from 'link_dir'. {prog} --layout root.layout --layout-keys key_file.pub \\ --link-dir link_dir Verify supply chain in 'root.layout', signed with GPG key '...7E0C8A17', whose public part can be found in the GPG keyring at '~/.gnupg'.
{prog} --layout root.layout \\ --gpg 8465A1E2E0FB2B40ADB2478E18FB3F537E0C8A17 --gpg-home ~/.gnupg """.format(prog=parser.prog) named_args = parser.add_argument_group("required named arguments") named_args.add_argument("-l", "--layout", type=str, required=True, metavar="", help=( "Path to root layout specifying the software supply chain to be" " verified.")) named_args.add_argument("-k", "--layout-keys", type=str, metavar="", nargs="+", help=( "Path(s) to PEM formatted public key(s), used to verify the passed root" " layout's signature(s)." " Passing at least one key using '--layout-keys' and/or '--gpg' is" " required. For each passed key the layout must carry a valid" " signature.")) parser.add_argument("-t", "--key-types", dest="key_types", type=str, choices=in_toto.util.SUPPORTED_KEY_TYPES, nargs="+", help=( "Specify the key-type of the keys specified by the '--layout-keys'" " option. If '--key-types' is not passed, default key_type is assumed" " to be \"rsa\".")) named_args.add_argument("-g", "--gpg", nargs="+", metavar="", help=( "GPG keyid, identifying a public key in the GPG keychain, used to verify" " the passed root layout's signature(s)." " Passing at least one key using '--layout-keys' and/or '--gpg' is" " required. For each passed key the layout must carry a valid" " signature.")) parser.add_argument("--link-dir", dest="link_dir", type=str, metavar="", default=".", help=( "Path to directory where link metadata files for steps defined in" " the root layout should be loaded from. If not passed links are" " loaded from the current working directory.")) parser.add_argument("--gpg-home", dest="gpg_home", type=str, metavar="", help=("Path to GPG keyring to load GPG key identified" " by '--gpg' option. If '--gpg-home' is not passed, the default GPG" " keyring is used.")) verbosity_args = parser.add_mutually_exclusive_group(required=False) verbosity_args.add_argument("-v", "--verbose", dest="verbose", help="Verbose execution.", action="store_true") verbosity_args.add_argument("-q", "--quiet", dest="quiet", help="Suppress all output.", action="store_true") args = parser.parse_args() LOG.setLevelVerboseOrQuiet(args.verbose, args.quiet) # For verifying at least one of --layout-keys or --gpg must be specified # Note: Passing both at the same time is possible. if (args.layout_keys is None) and (args.gpg is None): parser.print_help() parser.error("wrong arguments: specify at least one of" " `--layout-keys path [path ...]` or `--gpg id [id ...]`") try: LOG.info("Loading layout...") layout = Metablock.load(args.layout) layout_key_dict = {} if args.layout_keys is not None: LOG.info("Loading layout key(s)...") layout_key_dict.update( in_toto.util.import_public_keys_from_files_as_dict( args.layout_keys, args.key_types)) if args.gpg is not None: LOG.info("Loading layout gpg key(s)...") layout_key_dict.update( in_toto.util.import_gpg_public_keys_from_keyring_as_dict( args.gpg, gpg_home=args.gpg_home)) verifylib.in_toto_verify(layout, layout_key_dict, args.link_dir) except Exception as e: LOG.error("(in-toto-verify) {0}: {1}".format(type(e).__name__, e)) sys.exit(1) sys.exit(0) if __name__ == "__main__": main() in-toto-0.4.0/in_toto/log.py0000644000076500000240000001007513535414523016020 0ustar lukpstaff00000000000000""" log.py Lukas Puehringer Oct 4, 2016 See LICENSE for licensing information. Configures "in_toto" base logger, which can be used for debugging and user feedback in in-toto command line interfaces and libraries. Logging methods and levels are available through Python's logging module. 
If the log level is set to 'logging.DEBUG', log messages include additional information about the log statement. Moreover, calls to the `error` method will also output a stacktrace (if available). In all other log levels only the log message is shown without additional info. The default log level of the base logger is 'logging.WARNING', unless 'in_toto.settings.DEBUG' is 'True', in which case the default log level is 'logging.DEBUG'. The default handler of the base logger is a 'StreamHandler', which writes all log messages permitted by the used log level to 'sys.stderr'. This module should be imported in '__init__.py' to configure the base logger. Subsequently, command line interfaces should fetch the base logger by name and customize the log level according to any passed command line arguments. They can use the convenience method `setLevelVerboseOrQuiet` available on the custom base logger, e.g.: ``` import logging LOG = logging.getLogger("in_toto") # parse args ... LOG.setLevelVerboseOrQuiet(args.verbose, args.quiet) # call library function ... ``` Library modules can then create loggers, passing the module name, which will inherit the base logger's log level and format, e.g.: ``` import logging LOG = logging.getLogger(__name__) LOG.warning("Shown per default.") LOG.info("Only shown if log level was set to <= logging.INFO, e.g. in cli.") LOG.info("In debug mode it looks something like this") # in_toto.runlib:400:INFO:In debug mode it looks something like this ``` """ import sys import logging import in_toto.settings # Different log message formats for different log levels FORMAT_MESSAGE = "%(message)s" FORMAT_DEBUG = "%(name)s:%(lineno)d:%(levelname)s:%(message)s" # Cache default logger class, should be logging.Logger if not changed elsewhere _LOGGER_CLASS = logging.getLoggerClass() # Create logger subclass class InTotoLogger(_LOGGER_CLASS): """logging.Logger subclass, providing a custom error method and a convenience method for log levels. """ QUIET = logging.CRITICAL + 1 def error(self, msg): """Show stacktrace depending on its availability and the logger's log level, i.e. only show stacktrace in DEBUG level. """ show_stacktrace = (self.level == logging.DEBUG and sys.exc_info() != (None, None, None)) return super(InTotoLogger, self).error(msg, exc_info=show_stacktrace) # Allow non snake_case function name for consistency with logging library def setLevelVerboseOrQuiet(self, verbose, quiet): # pylint: disable=invalid-name """Convenience method to set the logger's verbosity level based on the passed booleans verbose and quiet (useful for cli tools). """ if verbose: self.setLevel(logging.INFO) elif quiet: # TODO: Is it enough to use logging.CRITICAL + 1 to suppress all output? # A safer way would be to use a NullHandler or Filters. self.setLevel(self.QUIET) # Temporarily change logger default class to instantiate an in-toto base logger logging.setLoggerClass(InTotoLogger) LOGGER = logging.getLogger("in_toto") logging.setLoggerClass(_LOGGER_CLASS) # In DEBUG mode we log all log types and add additional information, # otherwise we only log warning, error and critical and only the message. if in_toto.settings.DEBUG: # pragma: no cover LEVEL = logging.DEBUG FORMAT_STRING = FORMAT_DEBUG else: LEVEL = logging.WARNING FORMAT_STRING = FORMAT_MESSAGE # Add a StreamHandler with the chosen format to in-toto's base logger, # which will write log messages to `sys.stderr`.
FORMATTER = logging.Formatter(FORMAT_STRING) HANDLER = logging.StreamHandler() HANDLER.setFormatter(FORMATTER) LOGGER.addHandler(HANDLER) LOGGER.setLevel(LEVEL) in-toto-0.4.0/in_toto/models/0000755000076500000240000000000013535503044016142 5ustar lukpstaff00000000000000in-toto-0.4.0/in_toto/models/__init__.py0000644000076500000240000000000013535414523020244 0ustar lukpstaff00000000000000in-toto-0.4.0/in_toto/models/common.py0000644000076500000240000000500113535414523020003 0ustar lukpstaff00000000000000""" common.py Lukas Puehringer Santiago Torres Sep 23, 2016 See LICENSE for licensing information. Provides base classes for various classes in the model. """ import json import attr import inspect import securesystemslib.formats class ValidationMixin(object): """ The validation mixin provides a self-inspecting method, validate, to allow in-toto's objects to check that they are proper. """ def validate(self): """ Inspects the class (or subclass) for validate methods to ensure the all its members are properly formed. This method can be used to ensure the metadata contained in this class is proper before calling dump. None FormatError: If any of the members of this class are not properly populated. None None """ for method in inspect.getmembers(self, predicate=inspect.ismethod): if method[0].startswith("_validate_"): method[1]() @attr.s(repr=False, init=False) class Signable(ValidationMixin): """Objects with base class Signable are to be included in a Metablock class to be signed (hence the name). They provide a `signable_bytes` property used to create deterministic signatures. """ def __repr__(self): """Returns an indented JSON string of the metadata object. """ return json.dumps(attr.asdict(self), indent=1, separators=(",", ": "), sort_keys=True) @property def signable_bytes(self): """Returns canonical JSON utf-8 encoded bytes of Signable object dictionary representation. The bytes returned from this function are used to generate and verify signatures (c.f. `metadata.Metablock`). Changes to this function might break backwards compatibility with existing metadata. """ return securesystemslib.formats.encode_canonical( attr.asdict(self)).encode("UTF-8") @property def signable_dict(self): """Returns the dictionary representation of Signable, which we pass to securesystemslib signing and verifying functions, where it gets converted to canonical JSON utf-8 encoded bytes before signing and verifying. TODO: I'd rather fully control what data is signed here and not in the crypto backend, i.e. pass signable_bytes to the signing/verifying functions. This would require a change to securesystemslib. """ return attr.asdict(self) in-toto-0.4.0/in_toto/models/layout.py0000644000076500000240000006705613535414523020052 0ustar lukpstaff00000000000000#!/usr/bin/env python """ layout.py Lukas Puehringer Santiago Torres Sep 23, 2016 See LICENSE for licensing information. Provides classes related to the definition of a software supply chain. 
Layout: represents the metadata file that defines a software supply chain through steps and inspections Step: represents one step of the software supply chain, performed by one or many functionaries, who are identified by a key also stored to the layout Inspection: represents a hook that is run at verification """ from six import string_types import attr import shlex import json from datetime import datetime from dateutil.relativedelta import relativedelta from dateutil.parser import parse from in_toto.models.common import Signable, ValidationMixin import in_toto.rulelib import in_toto.exceptions import in_toto.formats import securesystemslib.exceptions import securesystemslib.formats import securesystemslib.schema import securesystemslib.interface # Link metadata for sublayouts are expected to be found in a subdirectory # with the following name, relative to the verification directory SUBLAYOUT_LINK_DIR_FORMAT = "{name}.{keyid:.8}" @attr.s(repr=False, init=False) class Layout(Signable): """ A layout lists the sequence of steps of the software supply chain, and the functionaries authorized to perform these steps. The object should be wrapped in a metablock object, to provide functionality for signing and signature verification, and reading from and writing to disk. """ _type = attr.ib() steps = attr.ib() inspect = attr.ib() keys = attr.ib() expires = attr.ib() readme = attr.ib() def __init__(self, **kwargs): """ Instantiate a new layout object with optional initial values. steps: A list of step objects describing the steps required to carry out the software supply chain. inspect: A list of inspection objects describing any additional actions carried out upon verification. keys: A dictionary of public keys whose private keys are used to sign the metadata (link metadata) corresponding to the steps of the supply chain. Each step can authorize one or more of the here listed keys individually. expires: The expiration date of a layout. readme: A human readable description of the software supply chain defined by the layout. securesystemslib.exceptions.FormatError If the instantiated layout has invalid properties, e.g. because any of the assigned keyword arguments are invalid. """ super(Layout, self).__init__() self._type = "layout" self.steps = kwargs.get("steps", []) self.inspect = kwargs.get("inspect", []) self.keys = kwargs.get("keys", {}) self.readme = kwargs.get("readme", "") # Assign a default expiration (one month) if not expiration date is passed # TODO: Is one month a sensible default? In any case, we need a valid # expiration date in order for the layout object to validate. # (see self._validate_expires) self.expires = kwargs.get("expires") if not self.expires: self.set_relative_expiration(months=1) # TODO: Should we validate in the constructor or allow the user to create # an invalid layout and call validate explicitly? self.validate() @property def type_(self): """ Getter for protected _type attribute. NOTE: The trailing underscore used is by convention (pep8) to avoid conflicts with Python's 'type' keyword. The type of the metadata object, i.e. "layout" (see constructor). """ return self._type @staticmethod def read(data): """ Static method to instantiate a layout object from a Python dictionary, e.g. by parsing its JSON representation. The method expects any contained steps and inspections to be Python dictionaries as well, and tries to instantiate the corresponding objects using the step's and inspection's read methods respectively. data: A dictionary containing layout metadata. 
securesystemslib.exceptions.FormatError If any of the layout's properties is invalid. The newly created layout object, optionally containing newly instantiated step and inspection objects. """ steps = [] for step_data in data.get("steps"): steps.append(Step.read(step_data)) data["steps"] = steps inspections = [] for inspect_data in data.get("inspect"): inspections.append(Inspection.read(inspect_data)) data["inspect"] = inspections return Layout(**data) def set_relative_expiration(self, days=0, months=0, years=0): """ Set the layout's expiration date in one or more of "days", "months" or "years" from today. If no argument is passed, it defaults to today. days: Days from today. months: Months from today. years: Years from today. securesystemslib.exceptions.FormatError If any of days, months or years is passed and is not an integer. """ securesystemslib.schema.Integer().check_match(days) securesystemslib.schema.Integer().check_match(months) securesystemslib.schema.Integer().check_match(years) self.expires = (datetime.today() + relativedelta( days=days, months=months, years=years)).strftime("%Y-%m-%dT%H:%M:%SZ") def get_step_name_list(self): """ Return list of step names in the order in which they are listed in the layout. A list of step names. """ step_names = [] for step in self.steps: step_names.append(step.name) return step_names def get_step_by_name(self, step_name): """ Return the first step in the layout's list of steps identified by the passed step name. NOTE: Step names must be unique within a layout, which is enforced by the layout's validate method. However, if validate has not been called, there may be multiple steps with the same name. In that case only the first step with the passed name is returned. step_name: A step name. securesystemslib.exceptions.FormatError If the passed step name is not a string. A step object. """ securesystemslib.schema.AnyString().check_match(step_name) for step in self.steps: # pragma: no branch if step.name == step_name: return step def remove_step_by_name(self, step_name): """ Remove all steps from the layout's list of steps identified by the passed step name. NOTE: Step names must be unique within a layout, which is enforced by the layout's validate method. However, if validate has not been called, there might be multiple steps with the same name. The method removes all steps with the passed name. step_name: A step name. securesystemslib.exceptions.FormatError If the passed step name is not a string. """ securesystemslib.schema.AnyString().check_match(step_name) for step in self.steps: if step.name == step_name: self.steps.remove(step) def get_inspection_name_list(self): """ Return list of inspection names in the order in which they are listed in the layout. A list of the inspection names. """ inspection_names = [] for inspection in self.inspect: inspection_names.append(inspection.name) return inspection_names def get_inspection_by_name(self, inspection_name): """ Return the first inspection in the layout's list of inspections identified by the passed inspection name. NOTE: Inspection names must be unique within a layout, which is enforced by the layout's validate method. However, if validate has not been called, there may be multiple inspections with the same name. In that case only the first inspection with the passed name is returned. inspection_name: An inspection name. securesystemslib.exceptions.FormatError If the passed inspection name is not a string. An inspection object.
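    Example (illustrative; assumes the layout defines an inspection
    named "untar"):

      inspection = layout.get_inspection_by_name("untar")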
""" securesystemslib.schema.AnyString().check_match(inspection_name) for inspection in self.inspect: # pragma: no branch if inspection.name == inspection_name: return inspection def remove_inspection_by_name(self, inspection_name): """ Remove all inspections from the layout's list of inspections identified by the passed inspection name. NOTE: Inspection names must be unique within a layout, which is enforced by they layout's validate method. However, if validate has not been called, there may be multiple inspections with the same name. The method removes all inspections with the passed name. inspection_name: An inspection name. securesystemslib.exceptions.FormatError If the passed inspection name is not a string. """ securesystemslib.schema.AnyString().check_match(inspection_name) for inspection in self.inspect: if inspection.name == inspection_name: self.inspect.remove(inspection) def get_functionary_key_id_list(self): """ Return a list of the functionary keyids from the layout's keys dictionary. A list of functionary keyids. """ return list(self.keys.keys()) def add_functionary_key(self, key): """ Add the passed functionary public key to the layout's dictionary of keys. key: A functionary public key conformant with in_toto.formats.ANY_PUBKEY_SCHEMA. securesystemslib.exceptions.FormatError If the passed key does not match in_toto.formats.ANY_PUBKEY_SCHEMA. The added functionary public key. """ in_toto.formats.ANY_PUBKEY_SCHEMA.check_match(key) keyid = key["keyid"] self.keys[keyid] = key return key def add_functionary_key_from_path(self, key_path): """ Load a functionary public key in RSA PEM format from the passed path and add it to the layout's dictionary of keys. key_path: A path, conformant with securesystemslib.formats.PATH_SCHEMA, to a functionary public key. securesystemslib.exceptions.FormatError If the passed key path does not match securesystemslib.formats.PATH_SCHEMA. securesystemslib.exceptions.Error If the key at the passed path cannot be imported as public key. The added functionary public key. """ securesystemslib.formats.PATH_SCHEMA.check_match(key_path) key = securesystemslib.interface.import_rsa_publickey_from_file(key_path) return self.add_functionary_key(key) def add_functionary_key_from_gpg_keyid(self, gpg_keyid, gpg_home=None): """ Load a functionary public key from the GPG keychain, located at the passed GPG home path, identified by the passed GPG keyid, and add it to the layout's dictionary of keys. gpg_keyid: A GPG keyid. gpg_home: A path to the GPG keychain to load the key from. If not passed the default GPG keychain is used. securesystemslib.exceptions.FormatError If the passed gpg keyid does not match securesystemslib.formats.KEYID_SCHEMA. If the gpg home path is passed and does not match securesystemslib.formats.PATH_SCHEMA. If the key loaded from the GPG keychain does not match in_toto.formats.ANY_PUBKEY_SCHEMA. The added functionary public key. """ securesystemslib.formats.KEYID_SCHEMA.check_match(gpg_keyid) if gpg_home: # pragma: no branch securesystemslib.formats.PATH_SCHEMA.check_match(gpg_home) key = in_toto.gpg.functions.gpg_export_pubkey(gpg_keyid, homedir=gpg_home) return self.add_functionary_key(key) def add_functionary_keys_from_paths(self, key_path_list): """ Load the functionary public keys in RSA PEM format from the passed list of paths and add them to the layout's dictionary of keys. key_path_list: A list of paths, conformant with securesystemslib.formats.PATH_SCHEMA, to functionary public keys. 
securesystemslib.exceptions.FormatError If any of the passed key paths does not match securesystemslib.formats.PATH_SCHEMA. securesystemslib.exceptions.Error If any of the keys at the passed paths cannot be imported as public key. A dictionary of the added functionary public keys with the key's keyids as dictionary keys and the keys as values. """ securesystemslib.formats.PATHS_SCHEMA.check_match(key_path_list) key_dict = {} for key_path in key_path_list: key = self.add_functionary_key_from_path(key_path) key_dict[key["keyid"]] = key return key_dict def add_functionary_keys_from_gpg_keyids(self, gpg_keyid_list, gpg_home=None): """ Load functionary public keys from the GPG keychain, located at the passed GPG home path, identified by the passed GPG keyids, and add it to the layout's dictionary of keys. gpg_keyid_list: A list of GPG keyids. gpg_home: A path to the GPG keychain to load the keys from. If not passed the default GPG keychain is used. securesystemslib.exceptions.FormatError If any of the passed gpg keyids does not match securesystemslib.formats.KEYID_SCHEMA. If gpg home is passed and does not match securesystemslib.formats.PATH_SCHEMA. If any of the keys loaded from the GPG keychain does not match in_toto.formats.ANY_PUBKEY_SCHEMA. A dictionary of the added functionary public keys with the key's keyids as dictionary keys and the keys as values. """ securesystemslib.formats.KEYIDS_SCHEMA.check_match(gpg_keyid_list) key_dict = {} for gpg_keyid in gpg_keyid_list: key = self.add_functionary_key_from_gpg_keyid(gpg_keyid, gpg_home) key_dict[key["keyid"]] = key return key_dict def _validate_type(self): """Private method to check that the type string is set to layout.""" if self._type != "layout": raise securesystemslib.exceptions.FormatError( "Invalid _type value for layout (Should be 'layout')") def _validate_expires(self): """Private method to verify if the expiration field has the right format and can be parsed.""" try: # We do both 'parse' and 'check_match' because the format check does not # detect bogus dates (e.g. Jan 35th) and parse can do more formats. parse(self.expires) securesystemslib.formats.ISO8601_DATETIME_SCHEMA.check_match( self.expires) except Exception as e: raise securesystemslib.exceptions.FormatError( "Malformed date string in layout. Exception: {}".format(e)) def _validate_readme(self): """Private method to check that the readme field is a string.""" if not isinstance(self.readme, string_types): raise securesystemslib.exceptions.FormatError( "Invalid readme '{}', value must be a string." .format(self.readme)) def _validate_keys(self): """Private method to ensure that the keys contained are right.""" in_toto.formats.ANY_PUBKEY_DICT_SCHEMA.check_match(self.keys) def _validate_steps_and_inspections(self): """Private method to verify that the list of steps and inspections are correctly formed.""" names_seen = set() if not isinstance(self.steps, list): raise securesystemslib.exceptions.FormatError( "The steps field should be a list!") for step in self.steps: if not isinstance(step, Step): raise securesystemslib.exceptions.FormatError( "The steps list should only contain steps!") step.validate() if step.name in names_seen: raise securesystemslib.exceptions.FormatError( "There is already a step with name '{}'. 
Step names must be" " unique within a layout.".format(step.name)) names_seen.add(step.name) if not isinstance(self.inspect, list): raise securesystemslib.exceptions.FormatError( "The inspect field should a be a list!") for inspection in self.inspect: if not isinstance(inspection, Inspection): raise securesystemslib.exceptions.FormatError( "The inspect list should only contain inspections!") inspection.validate() if inspection.name in names_seen: raise securesystemslib.exceptions.FormatError( "There is already an inspection with name '{}'. Inspection names" " must be unique within a layout.".format(inspection.name)) names_seen.add(inspection.name) @attr.s(repr=False, init=False) class SupplyChainItem(ValidationMixin): """ Parent class for items of the supply chain, i.e. Steps and Inspections. """ name = attr.ib() expected_materials = attr.ib() expected_products = attr.ib() def __init__(self, **kwargs): """ Instantiate a new SupplyChainItem object with optional initial values. name: A unique name used to identify the related link metadata expected_materials and expected_products: A list of artifact rules used to verify if the materials or products of the item (found in the according link metadata file) link correctly with other items of the supply chain. """ super(SupplyChainItem, self).__init__() self.name = kwargs.get("name") self.expected_materials = kwargs.get("expected_materials", []) self.expected_products = kwargs.get("expected_products", []) def __repr__(self): """Returns an indented JSON string of the metadata object. """ return json.dumps(attr.asdict(self), indent=1, separators=(",", ": "), sort_keys=True) def add_material_rule_from_string(self, rule_string): """ Convenience method to parse the passed rule string into a list and append it to the item's list of expected_materials. rule_string: An artifact rule string, whose list representation is parseable by in_toto.rulelib.unpack_rule securesystemslib.exceptions.FormatError: If the passed rule_string is not a string. If the parsed rule_string cannot be unpacked using rulelib. """ securesystemslib.schema.AnyString().check_match(rule_string) rule_list = shlex.split(rule_string) # Raises format error if the parsed rule_string is not a valid rule in_toto.rulelib.unpack_rule(rule_list) self.expected_materials.append(rule_list) def add_product_rule_from_string(self, rule_string): """ Convenience method to parse the passed rule string into a list and append it to the item's list of expected_products. rule_string: An artifact rule string, whose list representation is parseable by in_toto.rulelib.unpack_rule securesystemslib.exceptions.FormatError: If the passed rule_string is not a string. If the parsed rule_string cannot be unpacked using rulelib. 
""" securesystemslib.schema.AnyString().check_match(rule_string) rule_list = shlex.split(rule_string) # Raises format error if the parsed rule_string is not a valid rule in_toto.rulelib.unpack_rule(rule_list) self.expected_products.append(rule_list) def _validate_expected_materials(self): """Private method to check that material rules are correctly formed.""" if not isinstance(self.expected_materials, list): raise securesystemslib.exceptions.FormatError( "Material rules should be a list!") for rule in self.expected_materials: in_toto.rulelib.unpack_rule(rule) def _validate_expected_products(self): """Private method to check that product rules are correctly formed.""" if not isinstance(self.expected_products, list): raise securesystemslib.exceptions.FormatError( "Product rules should be a list!") for rule in self.expected_products: in_toto.rulelib.unpack_rule(rule) @attr.s(repr=False, init=False) class Step(SupplyChainItem): """ Represents a step of the supply chain performed by a functionary. A step relates to link metadata generated when the step was performed. Materials and products used/produced by the step are constrained by the artifact rules in the step's expected_materials and expected_products attributes. """ _type = attr.ib() pubkeys = attr.ib() expected_command = attr.ib() threshold = attr.ib() def __init__(self, **kwargs): """ Instantiates a new step object with optional initial values. name: see parent class SupplyChainItem expected_materials and expected_products: see parent class SupplyChainItem pubkeys: A list of keyids of the functionaries authorized to perform the step expected_command: The command expected to have performed this step threshold: The least number of functionaries expected to perform this step securesystemslib.exceptions.FormatError If the instantiated step has invalid properties, e.g. because any of the assigned keyword arguments are invalid. """ super(Step, self).__init__(**kwargs) self._type = "step" self.pubkeys = kwargs.get("pubkeys", []) self.expected_command = kwargs.get("expected_command", []) self.threshold = kwargs.get("threshold", 1) self.validate() @staticmethod def read(data): """ Static method to instantiate a step object from a Python dictionary, e.g. by parsing its JSON representation. data: A dictionary containing step metadata. securesystemslib.exceptions.FormatError If any of the step's properties is invalid. The newly created step object. """ return Step(**data) def set_expected_command_from_string(self, command_string): """ Convenience method to parse the passed command_string into a list and assign it to the step's expected_command attribute. command_string: A string containing a command and command arguments. securesystemslib.exceptions.FormatError If the passed command_string is not a string. """ securesystemslib.schema.AnyString().check_match(command_string) self.expected_command = shlex.split(command_string) def _validate_type(self): """Private method to ensure that the type field is set to step.""" if self._type != "step": raise securesystemslib.exceptions.FormatError( "Invalid _type value for step (Should be 'step')") def _validate_threshold(self): """Private method to check that the threshold field is set to an int.""" if not isinstance(self.threshold, int): raise securesystemslib.exceptions.FormatError( "Invalid threshold '{}', value must be an int." 
.format(self.threshold)) def _validate_pubkeys(self): """Private method to check that the pubkeys is a list of keyids.""" if not isinstance(self.pubkeys, list): raise securesystemslib.exceptions.FormatError( "The pubkeys field should be a list!") for keyid in self.pubkeys: securesystemslib.formats.KEYID_SCHEMA.check_match(keyid) def _validate_expected_command(self): """Private method to check that the expected_command is proper.""" if not isinstance(self.expected_command, list): raise securesystemslib.exceptions.FormatError( "The expected command field is malformed!") @attr.s(repr=False, init=False) class Inspection(SupplyChainItem): """ Represents an inspection whose command in the run attribute is executed during final product verification. Materials and products used/produced by the inspection are constrained by the artifact rules in the inspection's expected_materials and expected_products attributes. """ _type = attr.ib() run = attr.ib() def __init__(self, **kwargs): """ Instantiates a new inspection object with optional initial values. name: see parent class SupplyChainItem expected_materials and expected_products: see parent class SupplyChainItem run: The command to be executed during final product verification securesystemslib.exceptions.FormatError If the instantiated inspection has invalid properties, e.g. because any of the assigned keyword arguments are invalid. """ super(Inspection, self).__init__(**kwargs) self._type = "inspection" self.run = kwargs.get("run", []) self.validate() @staticmethod def read(data): """ Static method to instantiate an inspection object from a Python dictionary, e.g. by parsing its JSON representation. data: A dictionary containing inspection metadata. securesystemslib.exceptions.FormatError If any of the inspection's properties is invalid. The newly created inspection object. """ return Inspection(**data) def set_run_from_string(self, command_string): """ Convenience method to parse the passed command_string into a list and assign it to the inspection's run attribute. command_string: A string containing a command and command arguments. securesystemslib.exceptions.FormatError If the passed command_string is not a string. """ securesystemslib.schema.AnyString().check_match(command_string) self.run = shlex.split(command_string) def _validate_type(self): """Private method to ensure that the type field is set to inspection.""" if self._type != "inspection": raise securesystemslib.exceptions.FormatError( "The _type field must be set to 'inspection'!") def _validate_run(self): """Private method to check that the expected command is correct.""" if not isinstance(self.run, list): raise securesystemslib.exceptions.FormatError( "The run field is malformed!") in-toto-0.4.0/in_toto/models/link.py0000644000076500000240000001173113535414523017457 0ustar lukpstaff00000000000000#!/usr/bin/env python """ link.py Lukas Puehringer Santiago Torres Sep 23, 2016 See LICENSE for licensing information. Provides a class for link metadata which is information gathered when a step of the supply chain is performed. 
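Example (an illustrative sketch; signing key loading is elided):

  from in_toto.models.link import Link
  from in_toto.models.metadata import Metablock

  link = Link(name="package", command=["tar", "czf", "foo.tar.gz", "foo"])
  metablock = Metablock(signed=link)
  # metablock.sign(key) and metablock.dump("package.<keyid prefix>.link")
  # would sign and write the metadata (c.f. models/metadata.py).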
""" import attr import securesystemslib.formats from in_toto.models.common import Signable FILENAME_FORMAT = "{step_name}.{keyid:.8}.link" FILENAME_FORMAT_SHORT = "{step_name}.link" UNFINISHED_FILENAME_FORMAT = ".{step_name}.{keyid:.8}.link-unfinished" UNFINISHED_FILENAME_FORMAT_GLOB = ".{step_name}.{pattern}.link-unfinished" @attr.s(repr=False, init=False) class Link(Signable): """ A link is the metadata representation of a supply chain step performed by a functionary. The object should be contained in a generic Metablock object, which provides functionality for signing and signature verification, and reading from and writing to disk. _type: "link" name: a unique name used to identify the related step in the layout materials and products: a dictionary in the format of { : { { : } },... } byproducts: a dictionary in the format of { "stdout": , "stderr": , "return-value": the return value of the executed command } command: the command that was wrapped by in_toto-run return_value: the return value of the executed command environment: environment information, e.g. { "variables": , "filesystem": , "workdir": } Note: None of the values in environment is mandated runlib currently only records the workdir """ _type = attr.ib() name = attr.ib() materials = attr.ib() products = attr.ib() byproducts = attr.ib() command = attr.ib() environment = attr.ib() def __init__(self, **kwargs): super(Link, self).__init__() self._type = "link" self.name = kwargs.get("name") self.materials = kwargs.get("materials", {}) self.products = kwargs.get("products", {}) self.byproducts = kwargs.get("byproducts", {}) self.command = kwargs.get("command", []) self.environment = kwargs.get("environment", {}) self.validate() @property def type_(self): """Getter for protected _type attribute. Trailing underscore used by convention (pep8) to avoid conflict with Python's type keyword. 
""" return self._type @staticmethod def read(data): """Static method to instantiate a new Link from a Python dictionary """ return Link(**data) def _validate_type(self): """Private method to check that `_type` is set to "link".""" if self._type != "link": raise securesystemslib.exceptions.FormatError( "Invalid Link: field `_type` must be set to 'link', got: {}" .format(self._type)) def _validate_materials(self): """Private method to check that `materials` is a `dict` of `HASHDICTs`.""" if not isinstance(self.materials, dict): raise securesystemslib.exceptions.FormatError( "Invalid Link: field `materials` must be of type dict, got: {}" .format(type(self.materials))) for material in list(self.materials.values()): securesystemslib.formats.HASHDICT_SCHEMA.check_match(material) def _validate_products(self): """Private method to check that `products` is a `dict` of `HASHDICTs`.""" if not isinstance(self.products, dict): raise securesystemslib.exceptions.FormatError( "Invalid Link: field `products` must be of type dict, got: {}" .format(type(self.products))) for product in list(self.products.values()): securesystemslib.formats.HASHDICT_SCHEMA.check_match(product) def _validate_byproducts(self): """Private method to check that `byproducts` is a `dict`.""" if not isinstance(self.byproducts, dict): raise securesystemslib.exceptions.FormatError( "Invalid Link: field `byproducts` must be of type dict, got: {}" .format(type(self.byproducts))) def _validate_command(self): """Private method to check that `command` is a `list`.""" if not isinstance(self.command, list): raise securesystemslib.exceptions.FormatError( "Invalid Link: field `command` must be of type list, got: {}" .format(type(self.command))) def _validate_environment(self): """Private method to check that `environment` is a `dict`. """ if not isinstance(self.environment, dict): raise securesystemslib.exceptions.FormatError( "Invalid Link: field `environment` must be of type dict, got: {}" .format(type(self.environment))) in-toto-0.4.0/in_toto/models/metadata.py0000644000076500000240000002156313535414523020306 0ustar lukpstaff00000000000000""" metadata.py Lukas Puehringer Santiago Torres Oct 23, 2017 See LICENSE for licensing information. Provides a container class `Metablock` for signed metadata and functions for signing, signature verification, de-serialization and serialization from and to JSON. """ import attr import json import securesystemslib.keys import securesystemslib.formats import securesystemslib.exceptions import in_toto.formats import in_toto.gpg.functions from in_toto.models.common import ValidationMixin from in_toto.models.link import Link from in_toto.models.layout import Layout from in_toto.exceptions import SignatureVerificationError @attr.s(repr=False, init=False) class Metablock(ValidationMixin): """ This object holds the in-toto metablock data structure. This includes the fields "signed" and "signatures", i.e., what was signed and the signatures. """ signatures = attr.ib() signed = attr.ib() def __init__(self, **kwargs): self.signatures = kwargs.get("signatures", []) self.signed = kwargs.get("signed") self.compact_json = kwargs.get("compact_json", False) self.validate() def __repr__(self): """Returns an indented JSON string of the metadata object. 
""" indent = None if self.compact_json else 1 separators = (',', ':') if self.compact_json else (',', ': ') return json.dumps( { "signatures": self.signatures, "signed": attr.asdict(self.signed) }, indent=indent, separators=separators, sort_keys=True ) def dump(self, filename): """ Write the JSON string representation of the Metablock object to disk. filename: The path to write the file to. Writing metadata file to disk None. """ with open(filename, "wb") as fp: fp.write("{}".format(self).encode("utf-8")) @staticmethod def load(path): """ Loads the JSON string representation of signed metadata from disk and creates a Metablock object. The `signed` attribute of the Metablock object is assigned a Link or Layout object, depending on the `_type` field in the loaded metadata file. path: The path to write the file to. Reading metadata file from disk None. """ with open(path, "r") as fp: data = json.load(fp) signatures = data.get("signatures", []) signed_data = data.get("signed", {}) signed_type = signed_data.get("_type") if signed_type == "link": signed = Link.read(signed_data) elif signed_type == "layout": signed = Layout.read(signed_data) else: raise securesystemslib.exceptions.FormatError("Invalid Metadata format") return Metablock(signatures=signatures, signed=signed) @property def type_(self): """Shortcut to the _type property of the contained Link or Layout object, should be one of "link" or "layout". Trailing underscore used by convention (pep8) to avoid conflict with Python's type keyword. """ return self.signed.type_ def sign(self, key): """ Signs the utf-8 encoded canonical JSON bytes of the Link or Layout object contained in `self.signed` with the passed key and appends the created signature to `self.signatures`. Note: We actually pass the dictionary representation of the data to be signed and `securesystemslib.keys.create_signature` converts it to canonical JSON utf-8 encoded bytes before creating the signature. key: A signing key in the format securesystemslib.formats.KEY_SCHEMA The dictionary representation of the newly created signature. """ securesystemslib.formats.KEY_SCHEMA.check_match(key) signature = securesystemslib.keys.create_signature(key, self.signed.signable_dict) self.signatures.append(signature) return signature def sign_gpg(self, gpg_keyid=None, gpg_home=None): """ Signs the utf-8 encoded canonical JSON bytes of the Link or Layout object contained in `self.signed` using `gpg.functions.gpg_sign_object` and appends the created signature to `self.signatures`. gpg_keyid: (optional) A gpg keyid, if omitted the default signing key is used gpg_home: (optional) The path to the gpg keyring, if omitted the default gpg keyring is used in_toto.gpg.exceptions.CommandError: If the gpg signing command returned a non-zero exit code, e.g. because the key has expired. The dictionary representation of the newly created signature. """ signature = in_toto.gpg.functions.gpg_sign_object( self.signed.signable_bytes, gpg_keyid, gpg_home) self.signatures.append(signature) return signature def verify_signature(self, verification_key): """ Verifies the signature, found in `self.signatures`, corresponding to the passed verification key, or in case of GPG one of its subkeys, identified by the key's keyid, using the passed verification key and the utf-8 encoded canonical JSON bytes of the Link or Layout object, contained in `self.signed`. 

      If the signature matches `in_toto.gpg.formats.SIGNATURE_SCHEMA`,
      `in_toto.gpg.functions.gpg_verify_signature` is used for verification;
      if the signature matches `securesystemslib.formats.SIGNATURE_SCHEMA`,
      `securesystemslib.keys.verify_signature` is used.

      Note: In case of securesystemslib we actually pass the dictionary
      representation of the data to be verified and
      `securesystemslib.keys.verify_signature` converts it to canonical JSON
      utf-8 encoded bytes before verifying the signature.

    <Arguments>
      verification_key:
              Verifying key in the format:
              in_toto.formats.ANY_VERIFICATION_KEY_SCHEMA

    <Exceptions>
      FormatError
              If the passed key is not conformant with
              `in_toto.formats.ANY_VERIFICATION_KEY_SCHEMA`

      SignatureVerificationError
              If the Metablock does not carry a signature signed with the
              private key corresponding to the passed verification key or
              one of its subkeys

              If the signature corresponding to the passed verification key
              or one of its subkeys does not match securesystemslib's or
              in_toto.gpg's signature schema.

              If the signature to be verified is malformed or invalid.

      in_toto.gpg.exceptions.KeyExpirationError:
              if the passed verification key is an expired gpg key

    <Returns>
      None.

    """
    in_toto.formats.ANY_VERIFICATION_KEY_SCHEMA.check_match(verification_key)
    verification_keyid = verification_key["keyid"]

    # Find a signature that corresponds to the keyid of the passed
    # verification key or one of its subkeys
    signature = None
    for signature in self.signatures:
      if signature["keyid"] == verification_keyid:
        break

      elif signature["keyid"] in list(
          verification_key.get("subkeys", {}).keys()):
        break

    else:
      raise SignatureVerificationError("No signature found for key '{}'"
          .format(verification_keyid))

    if in_toto.gpg.formats.SIGNATURE_SCHEMA.matches(signature):
      valid = in_toto.gpg.functions.gpg_verify_signature(signature,
          verification_key, self.signed.signable_bytes)

    elif securesystemslib.formats.SIGNATURE_SCHEMA.matches(signature):
      valid = securesystemslib.keys.verify_signature(
          verification_key, signature, self.signed.signable_dict)

    else:
      valid = False

    if not valid:
      raise SignatureVerificationError("Invalid signature for keyid '{}'"
          .format(verification_keyid))

  def _validate_signed(self):
    """Private method to check if the 'signed' attribute contains a valid
    Layout or Link object. """
    if not (isinstance(self.signed, Layout) or isinstance(self.signed, Link)):
      raise securesystemslib.exceptions.FormatError("The Metablock's 'signed'"
          " property has to be of type 'Link' or 'Layout'.")

    # If the signed object is a Link or Layout object validate it.
    self.signed.validate()

  def _validate_signatures(self):
    """Private method to check that the 'signatures' attribute is a list of
    signatures in the format 'in_toto.formats.ANY_SIGNATURE_SCHEMA'. """
    if not isinstance(self.signatures, list):
      raise securesystemslib.exceptions.FormatError("The Metablock's"
          " 'signatures' property has to be of type 'list'.")

    for signature in self.signatures:
      in_toto.formats.ANY_SIGNATURE_SCHEMA.check_match(signature)
in-toto-0.4.0/in_toto/process.py0000644000076500000240000001624213535414523016717 0ustar lukpstaff00000000000000#!/usr/bin/env python
"""
<Program Name>
  process.py

<Author>
  Trishank Karthik Kuppusamy
  Lukas Puehringer

<Started>
  September 25, 2018

<Copyright>
  See LICENSE for licensing information.

<Purpose>
Provide a common interface for Python's subprocess module to: - require the Py3 subprocess backport `subprocess32` on Python2, - in-toto namespace subprocess constants (DEVNULL, PIPE) and - provide a custom `subprocess.run` wrapper - provide a special `run_duplicate_streams` function """ import os import sys import io import tempfile import logging import time import shlex import six if six.PY2: import subprocess32 as subprocess # pragma: no cover pylint: disable=import-error else: # pragma: no cover import subprocess import in_toto.formats as formats # Constants. from in_toto.settings import SUBPROCESS_TIMEOUT DEVNULL = subprocess.DEVNULL PIPE = subprocess.PIPE # Inherits from in_toto base logger (c.f. in_toto.log) LOG = logging.getLogger(__name__) def run(cmd, check=True, timeout=SUBPROCESS_TIMEOUT, **kwargs): """ Provide wrapper for `subprocess.run` (see https://github.com/python/cpython/blob/3.5/Lib/subprocess.py#L352-L399) where: * `timeout` has a default (see in_toto.settings.SUBPROCESS_TIMEOUT), * `check` is `True` by default, * there is only one positional argument, i.e. `cmd` that can be either a str (will be split with shlex) or a list of str and * instead of raising a ValueError if both `input` and `stdin` are passed, `stdin` is ignored. cmd: The command and its arguments. (list of str, or str) Splits a string specifying a command and its argument into a list of substrings, if necessary. check: (default True) "If check is true, and the process exits with a non-zero exit code, a CalledProcessError exception will be raised. Attributes of that exception hold the arguments, the exit code, and stdout and stderr if they were captured." timeout: (default see settings.SUBPROCESS_TIMEOUT) "The timeout argument is passed to Popen.communicate(). If the timeout expires, the child process will be killed and waited for. The TimeoutExpired exception will be re-raised after the child process has terminated." **kwargs: See subprocess.run and Frequently Used Arguments to Popen constructor for available kwargs. https://docs.python.org/3.5/library/subprocess.html#subprocess.run https://docs.python.org/3.5/library/subprocess.html#frequently-used-arguments securesystemslib.exceptions.FormatError: If the `cmd` is a list and does not match in_toto.formats.LIST_OF_ANY_STRING_SCHEMA. OSError: If the given command is not present or non-executable. subprocess.TimeoutExpired: If the process does not terminate after timeout seconds. Default is `settings.SUBPROCESS_TIMEOUT` The side effects of executing the given command in this environment. A subprocess.CompletedProcess instance. """ # Make list of command passed as string for convenience if isinstance(cmd, six.string_types): cmd = shlex.split(cmd) else: formats.LIST_OF_ANY_STRING_SCHEMA.check_match(cmd) # NOTE: The CPython implementation would raise a ValueError here, we just # don't pass on `stdin` if the user passes `input` and `stdin` # https://github.com/python/cpython/blob/3.5/Lib/subprocess.py#L378-L381 if kwargs.get("input") is not None and "stdin" in kwargs: LOG.debug("stdin and input arguments may not both be used. " "Ignoring passed stdin: " + str(kwargs["stdin"])) del kwargs["stdin"] return subprocess.run(cmd, check=check, timeout=timeout, **kwargs) def run_duplicate_streams(cmd, timeout=SUBPROCESS_TIMEOUT): """ Provide a function that executes a command in a subprocess and returns its exit code and the contents of what it printed to its standard streams upon termination. NOTE: The function might behave unexpectedly with interactive commands. 
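
  Example (assuming a POSIX `echo` on the PATH): the command's output is
  echoed to the parent process's streams and returned as a tuple:

    return_code, stdout_str, stderr_str = run_duplicate_streams("echo hello")
    # -> (0, "hello\n", "")
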
  <Arguments>
    cmd:
            The command and its arguments. (list of str, or str)
            Splits a string specifying a command and its arguments into a
            list of substrings, if necessary.

    timeout: (default see settings.SUBPROCESS_TIMEOUT)
            If the timeout expires, the child process will be killed and
            waited for and then subprocess.TimeoutExpired will be raised.

  <Exceptions>
    securesystemslib.exceptions.FormatError:
            If the `cmd` is a list and does not match
            in_toto.formats.LIST_OF_ANY_STRING_SCHEMA.

    OSError:
            If the given command is not present or non-executable.

    subprocess.TimeoutExpired:
            If the process does not terminate after timeout seconds.
            Default is `settings.SUBPROCESS_TIMEOUT`

  <Side Effects>
    The side effects of executing the given command in this environment.

  <Returns>
    A tuple of the command's exit code, standard output and standard error
    contents.

  """
  if isinstance(cmd, six.string_types):
    cmd = shlex.split(cmd)

  else:
    formats.LIST_OF_ANY_STRING_SCHEMA.check_match(cmd)

  # Use temporary files as targets for child process standard stream redirects
  # They seem to work better (i.e. do not hang) than pipes, when using
  # interactive commands like `vi`.
  stdout_fd, stdout_name = tempfile.mkstemp()
  stderr_fd, stderr_name = tempfile.mkstemp()
  try:
    with io.open(stdout_name, "r") as stdout_reader, \
        os.fdopen(stdout_fd, "w") as stdout_writer, \
        io.open(stderr_name, "r") as stderr_reader, \
        os.fdopen(stderr_fd, "w") as stderr_writer:

      # Start child process, writing its standard streams to temporary files
      proc = subprocess.Popen(cmd, stdout=stdout_writer,
          stderr=stderr_writer, universal_newlines=True)
      proc_start_time = time.time()

      stdout_str = stderr_str = ""
      stdout_part = stderr_part = ""

      # Read as long as the process runs or there is data on one of the streams
      while proc.poll() is None or stdout_part or stderr_part:

        # Raise timeout error in the same manner as `subprocess` would do it
        if (timeout is not None and
            time.time() > proc_start_time + timeout):
          proc.kill()
          proc.wait()
          raise subprocess.TimeoutExpired(cmd, timeout)

        # Read from child process's redirected streams, write to parent
        # process's standard streams and construct return values
        stdout_part = stdout_reader.read()
        stderr_part = stderr_reader.read()
        sys.stdout.write(stdout_part)
        sys.stderr.write(stderr_part)
        sys.stdout.flush()
        sys.stderr.flush()
        stdout_str += stdout_part
        stderr_str += stderr_part

  finally:
    # The work is done or was interrupted, the temp files can be removed
    os.remove(stdout_name)
    os.remove(stderr_name)

  # Return process exit code and captured streams
  return proc.poll(), stdout_str, stderr_str
in-toto-0.4.0/in_toto/rulelib.py0000644000076500000240000002324113535414523016674 0ustar lukpstaff00000000000000"""
<Program Name>
  in_toto/rulelib.py

<Author>
  Santiago Torres-Arias
  Lukas Puehringer

<Started>
  Nov 18, 2016

<Copyright>
  See LICENSE for licensing information.

<Purpose>
  This module provides functions to parse artifact rules and validate their
  syntax.

"""
import in_toto.formats

import securesystemslib.exceptions
import securesystemslib.formats

GENERIC_RULES = {"create", "modify", "delete", "allow", "disallow", "require",}
COMPLEX_RULES = {"match",}
ALL_RULES = GENERIC_RULES | COMPLEX_RULES


def unpack_rule(rule):
  """
  <Purpose>
    Parses the rule and extracts and returns the necessary data to apply the
    rule. Can also be used to verify if a rule complies with any of the
    formats.

  <Arguments>
    rule:
            The list of rule elements, in one of the following formats:

            MATCH <pattern> [IN <source-path-prefix>] WITH
                (MATERIALS|PRODUCTS) [IN <destination-path-prefix>]
                FROM <step>,
            CREATE <pattern>,
            DELETE <pattern>,
            MODIFY <pattern>,
            ALLOW <pattern>,
            DISALLOW <pattern>,
            REQUIRE <file name>

            Note that REQUIRE is somewhat of a weird animal that does not
            use patterns but rather single filenames (for now).
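
  Example (hypothetical patterns and step name):

    unpack_rule(["CREATE", "foo.tar.gz"])
    # -> {"rule_type": "create", "pattern": "foo.tar.gz"}

    unpack_rule(["MATCH", "foo.tar.gz", "WITH", "PRODUCTS", "FROM", "package"])
    # -> {"rule_type": "match", "pattern": "foo.tar.gz",
    #     "source_prefix": "", "dest_prefix": "",
    #     "dest_type": "products", "dest_name": "package"}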
raises FormatError, if the rule does not comply with any of the formats. None. A dictionary of the artifact rule data, if it is a generic rule the dictionary is: { "rule_type": rule[0] ("CREATE"|"MODIFY"|"DELETE"|"ALLOW"|"DISALLOW") "pattern" : rule[1], a path pattern } if it is a match rule, the dictionary is: { "rule_type": rule[0], ("MATCH"), "pattern": rule[1], a path pattern "source_prefix": path or empty string "dest_prefix": path or empty string "dest_type" : destination artifact type, ("MATERIAL"|"PRODUCT") "dest_name": destination step/inspection name } """ in_toto.formats.LIST_OF_ANY_STRING_SCHEMA.check_match(rule) # Create all lower rule copy to case insensitively parse out tokens whose # position we don't know yet # We keep the original rule to retain the non-token elements' case rule_lower = [] for rule_elem in rule: rule_lower.append(rule_elem.lower()) rule_len = len(rule) if rule_len < 2 or rule_lower[0] not in ALL_RULES: raise securesystemslib.exceptions.FormatError("Wrong rule format," " rules must start with one of '{0}' and specify a 'pattern' as" " second element.\n" "Got: \n\t'{1}'".format(", ".join(ALL_RULES), rule)) rule_type = rule_lower[0] pattern = rule[1] # Type is one of "CREATE", "MODIFY", "DELETE", "ALLOW", "DISALLOW" if rule_type in GENERIC_RULES: # pylint: disable=no-else-raise if rule_len != 2: raise securesystemslib.exceptions.FormatError("Wrong rule format," " generic rules must have one of the formats:\n\t" "CREATE \n\t" "MODIFY \n\t" "DELETE \n\t" "ALLOW \n\t" "DISALLOW \n" "REQUIRE \n" "Got:\n\t{}".format(rule)) else: return { "rule_type": rule_type, "pattern": pattern, } # Type is "MATCH" # NOTE: Can't reach `else` branch, if the rule is neither in GENERIC_RULES # nor in COMPLEX_RULES an exception is raised earlier. elif rule_type in COMPLEX_RULES: # pragma: no branch # ... IN WITH (MATERIALS|PRODUCTS) # IN FROM if (rule_len == 10 and rule_lower[2] == "in" and rule_lower[4] == "with" and rule_lower[6] == "in" and rule_lower[8] == "from"): source_prefix = rule[3] dest_type = rule_lower[5] dest_prefix = rule[7] dest_name = rule[9] # ... IN WITH (MATERIALS|PRODUCTS) FROM elif (rule_len == 8 and rule_lower[2] == "in" and rule_lower[4] == "with" and rule_lower[6] == "from"): source_prefix = rule[3] dest_type = rule_lower[5] dest_prefix = "" dest_name = rule[7] # ... WITH (MATERIALS|PRODUCTS) IN FROM elif (rule_len == 8 and rule_lower[2] == "with" and rule_lower[4] == "in" and rule_lower[6] == "from"): source_prefix = "" dest_type = rule_lower[3] dest_prefix = rule[5] dest_name = rule[7] # ... 
WITH (MATERIALS|PRODUCTS) FROM elif (rule_len == 6 and rule_lower[2] == "with" and rule_lower[4] == "from"): source_prefix = "" dest_type = rule_lower[3] dest_prefix = "" dest_name = rule[5] else: raise securesystemslib.exceptions.FormatError("Wrong rule format," " match rules must have the format:\n\t" " MATCH [IN ] WITH" " (MATERIALS|PRODUCTS) [IN ] FROM .\n" "Got: \n\t{}".format(rule)) if dest_type not in {"materials", "products"}: raise securesystemslib.exceptions.FormatError("Wrong rule format," " match rules must have either MATERIALS or PRODUCTS (case" " insensitive) as destination.\n" "Got: \n\t{}".format(rule)) return { "rule_type": rule_type, "pattern": pattern, "source_prefix": source_prefix, "dest_prefix": dest_prefix, "dest_type" : dest_type, "dest_name": dest_name } def pack_rule(rule_type, pattern, source_prefix=None, dest_type=None, dest_prefix=None, dest_name=None): """ Create an artifact rule in the passed arguments and return it as list as it is stored in a step's or inspection's expected_material or expected_product field. rule_type: One of "MATCH", "CREATE", "DELETE", MODIFY, ALLOW, DISALLOW, REQUIRE (case insensitive). pattern: A glob pattern to match artifact paths. source_prefix: (only for "MATCH" rules) A prefix for 'pattern' to match artifacts reported by the link corresponding to the step that contains the rule. dest_type: (only for "MATCH" rules) One of "MATERIALS" or "PRODUCTS" (case insensitive) to specify if materials or products of the link corresponding to the step identified by 'dest_name' should be matched. dest_prefix: (only for "MATCH" rules) A prefix for 'pattern' to match artifacts reported by the link corresponding to the step identified by 'dest_name'. dest_name: (only for "MATCH" rules) The name of the step whose corresponding link is used to match artifacts. securesystemslib.exceptions.FormatError if any of the arguments is malformed. One of the following rule formats in as a list MATCH [IN ] WITH (MATERIALS|PRODUCTS) [IN ] FROM , CREATE , DELETE , MODIFY , ALLOW , DISALLOW , REQUIRE Note that REQUIRE is somewhat of a weird animal that does not use patterns but rather single filenames (for now). """ in_toto.formats.ANY_STRING_SCHEMA.check_match(rule_type) in_toto.formats.ANY_STRING_SCHEMA.check_match(pattern) if rule_type.lower() not in ALL_RULES: raise securesystemslib.exceptions.FormatError("'{0}' is not a valid " "'type'. Rule type must be one of: {1} (case insensitive)." .format(rule_type, ", ".join(ALL_RULES))) if rule_type.upper() == "MATCH": if (not in_toto.formats.ANY_STRING_SCHEMA.matches(dest_type) or not (dest_type.lower() == "materials" or dest_type.lower() == "products")): raise securesystemslib.exceptions.FormatError("'{}' is not a valid" " 'dest_type'. Rules of type 'MATCH' require a destination type of" " either 'MATERIALS' or 'PRODUCTS' (case insensitive)." .format(dest_type)) if not (in_toto.formats.ANY_STRING_SCHEMA.matches(dest_name) and dest_name): raise securesystemslib.exceptions.FormatError("'{}' is not a valid" " 'dest_name'. 
Rules of type 'MATCH' require a step name as a" " destination name.".format(dest_name)) # Construct rule rule = ["MATCH", pattern] if source_prefix: in_toto.formats.ANY_STRING_SCHEMA.check_match(source_prefix) rule += ["IN", source_prefix] rule += ["WITH", dest_type.upper()] if dest_prefix: in_toto.formats.ANY_STRING_SCHEMA.check_match(dest_prefix) rule += ["IN", dest_prefix] rule += ["FROM", dest_name] else: rule = [rule_type.upper(), pattern] return rule def pack_rule_data(rule_data): """Shortcut for 'pack_rule' to pack a rule based on a rule_data dictionary as returned by 'unpack_rule'. """ return pack_rule(**rule_data) def pack_create_rule(pattern): """Shortcut for 'pack_rule' to pack a CREATE rule. """ return pack_rule("CREATE", pattern) def pack_delete_rule(pattern): """Shortcut for 'pack_rule' to pack a DELETE rule. """ return pack_rule("DELETE", pattern) def pack_modify_rule(pattern): """Shortcut for 'pack_rule' to pack a MODIFY rule. """ return pack_rule("MODIFY", pattern) def pack_allow_rule(pattern): """Shortcut for 'pack_rule' to pack an ALLOW rule. """ return pack_rule("ALLOW", pattern) def pack_disallow_rule(pattern): """Shortcut for 'pack_rule' to pack a DISALLOW rule. """ return pack_rule("DISALLOW", pattern) def pack_require_rule(filename): """ Shortcut for 'pack_rule' to pack a REQUIRE rule: note that REQUIRE is somewhat of a weird animal that does not use patterns but rather single filenames (for now). """ return pack_rule("REQUIRE", filename) in-toto-0.4.0/in_toto/runlib.py0000644000076500000240000010242613535414523016534 0ustar lukpstaff00000000000000""" runlib.py Lukas Puehringer June 27, 2016 See LICENSE for licensing information. Provides a wrapper for any command of the software supply chain. The wrapper performs the following tasks which are implemented in this library. - Record state of material (files the command is executed on) - Execute command - Capture stdout/stderr/return value of the executed command - Record state of product (files after the command was executed) - Return Metablock containing a Link object which can be can be signed and stored to disk """ import glob import logging import os import itertools from pathspec import PathSpec import in_toto.settings import in_toto.exceptions import in_toto.process from in_toto.models.link import (UNFINISHED_FILENAME_FORMAT, FILENAME_FORMAT, FILENAME_FORMAT_SHORT, UNFINISHED_FILENAME_FORMAT_GLOB) import securesystemslib.formats import securesystemslib.hash import securesystemslib.exceptions from in_toto.models.metadata import Metablock # Inherits from in_toto base logger (c.f. in_toto.log) LOG = logging.getLogger(__name__) def _hash_artifact(filepath, hash_algorithms=None, normalize_line_endings=False): """Internal helper that takes a filename and hashes the respective file's contents using the passed hash_algorithms and returns a hashdict conformant with securesystemslib.formats.HASHDICT_SCHEMA. 
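
  A minimal sketch of the return value for the default algorithm list
  (hypothetical file and digest):

    _hash_artifact("foo.py")
    # -> {"sha256": "<hex digest of foo.py's contents>"}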
""" if not hash_algorithms: hash_algorithms = ['sha256'] securesystemslib.formats.HASHALGORITHMS_SCHEMA.check_match(hash_algorithms) hash_dict = {} for algorithm in hash_algorithms: digest_object = securesystemslib.hash.digest_filename(filepath, algorithm, normalize_line_endings=normalize_line_endings) hash_dict.update({algorithm: digest_object.hexdigest()}) securesystemslib.formats.HASHDICT_SCHEMA.check_match(hash_dict) return hash_dict def _apply_exclude_patterns(names, exclude_filter): """Exclude matched patterns from passed names.""" included = set(names) # Assume old way for easier testing if hasattr(exclude_filter, '__iter__'): exclude_filter = PathSpec.from_lines('gitwildmatch', exclude_filter) for excluded in exclude_filter.match_files(names): included.discard(excluded) return sorted(included) def _apply_left_strip(artifact_filepath, artifacts_dict, lstrip_paths=None): """ Internal helper function to left strip dictionary keys based on prefixes passed by the user. """ if lstrip_paths: # If a prefix is passed using the argument --lstrip-paths, # that prefix is left stripped from the filepath passed. # Note: if the prefix doesn't include a trailing /, the dictionary key # may include an unexpected /. for prefix in lstrip_paths: if artifact_filepath.startswith(prefix): artifact_filepath = artifact_filepath[len(prefix):] break if artifact_filepath in artifacts_dict: raise in_toto.exceptions.PrefixError("Prefix selection has " "resulted in non unique dictionary key '{}'" .format(artifact_filepath)) return artifact_filepath def record_artifacts_as_dict(artifacts, exclude_patterns=None, base_path=None, follow_symlink_dirs=False, normalize_line_endings=False, lstrip_paths=None): """ Hashes each file in the passed path list. If the path list contains paths to directories the directory tree(s) are traversed. The files a link command is executed on are called materials. The files that result form a link command execution are called products. Paths are normalized for matching and storing by left stripping "./" NOTE on exclude patterns: - Uses PathSpec to compile gitignore-style patterns, making use of the GitWildMatchPattern class (registered as 'gitwildmatch') - Patterns are checked for match against the full path relative to each path passed in the artifacts list - If a directory is excluded, all its files and subdirectories are also excluded - How it differs from .gitignore - No need to escape # - No ignoring of trailing spaces - No general negation with exclamation mark ! - No special treatment of slash / - No special treatment of consecutive asterisks ** - Exclude patterns are likely to become command line arguments or part of a config file. artifacts: A list of file or directory paths used as materials or products for the link command. exclude_patterns: (optional) Artifacts matched by the pattern are excluded from the result. Exclude patterns can be passed as argument or specified via ARTIFACT_EXCLUDE_PATTERNS setting (see `in_toto.settings`) or via envvars or rcfiles (see `in_toto.user_settings`). If passed, patterns specified via settings are overriden. base_path: (optional) Change to base_path and record artifacts relative from there. If not passed, current working directory is used as base_path. NOTE: The base_path part of the recorded artifact is not included in the returned paths. follow_symlink_dirs: (optional) Follow symlinked dirs if the linked dir exists (default is False). The recorded path contains the symlink name, not the resolved name. 
NOTE: This parameter toggles following linked directories only, linked files are always recorded, independently of this parameter. NOTE: Beware of infinite recursions that can occur if a symlink points to a parent directory or itself. normalize_line_endings: (optional) If True, replaces windows and mac line endings with unix line endings before hashing the content of the passed files, for cross-platform support. lstrip_paths: (optional) If a prefix path is passed, the prefix is left stripped from the path of every artifact that contains the prefix. in_toto.exceptions.ValueError, if we cannot change to base path directory in_toto.exceptions.FormatError, if the list of exlcude patterns does not match format securesystemslib.formats.NAMES_SCHEMA Calls functions to generate cryptographic hashes. A dictionary with file paths as keys and the files' hashes as values. """ artifacts_dict = {} if not artifacts: return artifacts_dict if base_path: LOG.info("Overriding setting ARTIFACT_BASE_PATH with passed" " base path.") else: base_path = in_toto.settings.ARTIFACT_BASE_PATH # Temporarily change into base path dir if set if base_path: original_cwd = os.getcwd() try: os.chdir(base_path) except Exception as e: raise ValueError("Could not use '{}' as base path: '{}'".format( base_path, e)) # Normalize passed paths norm_artifacts = [] for path in artifacts: norm_artifacts.append(os.path.normpath(path)) # Passed exclude patterns take precedence over exclude pattern settings if exclude_patterns: LOG.info("Overriding setting ARTIFACT_EXCLUDE_PATTERNS with passed" " exclude patterns.") else: # TODO: Do we want to keep the exclude pattern setting? exclude_patterns = in_toto.settings.ARTIFACT_EXCLUDE_PATTERNS # Apply exclude patterns on the passed artifact paths if available if exclude_patterns: securesystemslib.formats.NAMES_SCHEMA.check_match(exclude_patterns) norm_artifacts = _apply_exclude_patterns(norm_artifacts, exclude_patterns) # Check if any of the prefixes passed for left stripping is a left substring # of another if lstrip_paths: for prefix_one, prefix_two in itertools.combinations(lstrip_paths, 2): if prefix_one.startswith(prefix_two) or \ prefix_two.startswith(prefix_one): raise in_toto.exceptions.PrefixError("'{}' and '{}' " "triggered a left substring error".format(prefix_one, prefix_two)) # Compile the gitignore-style patterns exclude_filter = PathSpec.from_lines('gitwildmatch', exclude_patterns or []) # Iterate over remaining normalized artifact paths for artifact in norm_artifacts: if os.path.isfile(artifact): # FIXME: this is necessary to provide consisency between windows # filepaths and *nix filepaths. A better solution may be in order # though... artifact = artifact.replace('\\', '/') key = _apply_left_strip(artifact, artifacts_dict, lstrip_paths) artifacts_dict[key] = _hash_artifact(artifact, normalize_line_endings=normalize_line_endings) elif os.path.isdir(artifact): for root, dirs, files in os.walk(artifact, followlinks=follow_symlink_dirs): # Create a list of normalized dirpaths dirpaths = [] for dirname in dirs: norm_dirpath = os.path.normpath(os.path.join(root, dirname)) dirpaths.append(norm_dirpath) # Applying exclude patterns on the directory paths returned by walk # allows to exclude a subdirectory 'sub' with a pattern 'sub'. 
        # If we only applied the patterns below on the subdirectory's
        # containing file paths, we'd have to use a wildcard, e.g.: 'sub*'
        if exclude_patterns:
          dirpaths = _apply_exclude_patterns(dirpaths, exclude_filter)

        # Reset and refill dirs with remaining names after exclusion
        # Modify (not reassign) dirnames to only recurse into remaining dirs
        dirs[:] = []
        for dirpath in dirpaths:
          # Dirs only contain the basename and not the full path
          name = os.path.basename(dirpath)
          dirs.append(name)

        # Create a list of normalized filepaths
        filepaths = []
        for filename in files:
          norm_filepath = os.path.normpath(os.path.join(root, filename))

          # `os.walk` could also list dead symlinks, which would
          # result in an error later when trying to read the file
          if os.path.isfile(norm_filepath):
            filepaths.append(norm_filepath)

          else:
            LOG.info("File '{}' appears to be a broken symlink. Skipping..."
                .format(norm_filepath))

        # Apply exclude patterns on the normalized file paths returned by walk
        if exclude_patterns:
          filepaths = _apply_exclude_patterns(filepaths, exclude_filter)

        for filepath in filepaths:
          # FIXME: this is necessary to provide consistency between windows
          # filepaths and *nix filepaths. A better solution may be in order
          # though...
          normalized_filepath = filepath.replace("\\", "/")
          key = _apply_left_strip(
              normalized_filepath, artifacts_dict, lstrip_paths)

          artifacts_dict[key] = _hash_artifact(filepath,
              normalize_line_endings=normalize_line_endings)

    # Path is no file and no directory
    else:
      LOG.info("path: {} does not exist, skipping..".format(artifact))

  # Change back to the original current working dir
  if base_path:
    os.chdir(original_cwd)

  return artifacts_dict


def execute_link(link_cmd_args, record_streams):
  """
  <Purpose>
    Executes the passed command plus arguments in a subprocess and returns
    the return value of the executed command. If specified, the standard
    output and standard error of the command are recorded and also returned
    to the caller.

  <Arguments>
    link_cmd_args:
            A list where the first element is a command and the remaining
            elements are arguments passed to that command.

    record_streams:
            A bool that specifies whether to redirect standard output and
            standard error to a temporary file which is returned to the
            caller (True) or not (False).

  <Exceptions>
    TBA (see https://github.com/in-toto/in-toto/issues/6)

  <Side Effects>
    Executes passed command in a subprocess and redirects stdout and stderr
    if specified.

  <Returns>
    - A dictionary containing standard output and standard error of the
      executed command, called by-products.
      Note: If record_streams is False, the dict values are empty strings.
    - The return value of the executed command.

  """
  if record_streams:
    return_code, stdout_str, stderr_str = \
        in_toto.process.run_duplicate_streams(link_cmd_args)

  else:
    process = in_toto.process.run(link_cmd_args, check=False,
        stdout=in_toto.process.DEVNULL, stderr=in_toto.process.DEVNULL)
    stdout_str = stderr_str = ""
    return_code = process.returncode

  return {
      "stdout": stdout_str,
      "stderr": stderr_str,
      "return-value": return_code
    }


def in_toto_mock(name, link_cmd_args):
  """
  <Purpose>
    in_toto_run with defaults
     - Records materials and products in current directory
     - Does not sign resulting link file
     - Stores resulting link file under "<name>.link"

  <Arguments>
    name:
            A unique name to relate mock link metadata with a step or
            inspection defined in the layout.

    link_cmd_args:
            A list where the first element is a command and the remaining
            elements are arguments passed to that command.

  <Exceptions>
    None.
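
  Example (hypothetical step name; assumes a POSIX `ls` on the PATH):

    in_toto_mock("demo", ["ls", "-la"])
    # records the current directory as materials and products, runs the
    # command, and writes unsigned metadata to "demo.link"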
Writes newly created link metadata file to disk using the filename scheme from link.FILENAME_FORMAT_SHORT Newly created Metablock object containing a Link object """ link_metadata = in_toto_run(name, ["."], ["."], link_cmd_args, record_streams=True) filename = FILENAME_FORMAT_SHORT.format(step_name=name) LOG.info("Storing unsigned link metadata to '{}'...".format(filename)) link_metadata.dump(filename) return link_metadata def _check_match_signing_key(signing_key): """ Helper method to check if the signing_key has securesystemslib's KEY_SCHEMA and the private part is not empty. # FIXME: Add private key format check to formats """ securesystemslib.formats.KEY_SCHEMA.check_match(signing_key) if not signing_key["keyval"].get("private"): raise securesystemslib.exceptions.FormatError( "Signing key needs to be a private key.") def in_toto_run(name, material_list, product_list, link_cmd_args, record_streams=False, signing_key=None, gpg_keyid=None, gpg_use_default=False, gpg_home=None, exclude_patterns=None, base_path=None, compact_json=False, record_environment=False, normalize_line_endings=False, lstrip_paths=None): """ Calls functions in this module to run the command passed as link_cmd_args argument and to store materials, products, by-products and environment information into a link metadata file. The link metadata file is signed either with the passed signing_key, or a gpg key identified by the passed gpg_keyid or with the default gpg key if gpg_use_default is True. Even if multiple key parameters are passed, only one key is used for signing (in above order of precedence). The link file is dumped to `link.FILENAME_FORMAT` using the signing key's keyid. If no key parameter is passed the link is neither signed nor dumped. name: A unique name to relate link metadata with a step or inspection defined in the layout. material_list: List of file or directory paths that should be recorded as materials. product_list: List of file or directory paths that should be recorded as products. link_cmd_args: A list where the first element is a command and the remaining elements are arguments passed to that command. record_streams: (optional) A bool that specifies whether to redirect standard output and and standard error to a temporary file which is returned to the caller (True) or not (False). signing_key: (optional) If not None, link metadata is signed with this key. Format is securesystemslib.formats.KEY_SCHEMA gpg_keyid: (optional) If not None, link metadata is signed with a gpg key identified by the passed keyid. gpg_use_default: (optional) If True, link metadata is signed with default gpg key. gpg_home: (optional) Path to GPG keyring (if not set the default keyring is used). exclude_patterns: (optional) Artifacts matched by the pattern are excluded from the materials and products sections in the resulting link. base_path: (optional) If passed, record artifacts relative to base_path. Default is current working directory. NOTE: The base_path part of the recorded material is not included in the resulting preliminary link's material/product sections. compact_json: (optional) Whether or not to use the most compact json representation. record_environment: (optional) if values such as workdir should be recorded on the environment dictionary (false by default) normalize_line_endings: (optional) If True, replaces windows and mac line endings with unix line endings before hashing materials and products, for cross-platform support. 
lstrip_paths: (optional) If a prefix path is passed, the prefix is left stripped from the path of every artifact that contains the prefix. securesystemslib.FormatError if a signing_key is passed and does not match securesystemslib.formats.KEY_SCHEMA or a gpg_keyid is passed and does not match securesystemslib.formats.KEYID_SCHEMA or exclude_patterns are passed and don't match securesystemslib.formats.NAMES_SCHEMA, or base_path is passed and does not match securesystemslib.formats.PATH_SCHEMA or is not a directory. in_toto.gpg.exceptions.CommandError: If gpg is used for signing and the command exits with a non-zero code. If a key parameter is passed for signing, the newly created link metadata file is written to disk using the filename scheme: `link.FILENAME_FORMAT` Newly created Metablock object containing a Link object """ LOG.info("Running '{}'...".format(name)) # Check key formats to fail early if signing_key: _check_match_signing_key(signing_key) if gpg_keyid: securesystemslib.formats.KEYID_SCHEMA.check_match(gpg_keyid) if exclude_patterns: securesystemslib.formats.NAMES_SCHEMA.check_match(exclude_patterns) if base_path: securesystemslib.formats.PATH_SCHEMA.check_match(base_path) if material_list: LOG.info("Recording materials '{}'...".format(", ".join(material_list))) materials_dict = record_artifacts_as_dict(material_list, exclude_patterns=exclude_patterns, base_path=base_path, follow_symlink_dirs=True, normalize_line_endings=normalize_line_endings, lstrip_paths=lstrip_paths) if link_cmd_args: LOG.info("Running command '{}'...".format(" ".join(link_cmd_args))) byproducts = execute_link(link_cmd_args, record_streams) else: byproducts = {} if product_list: securesystemslib.formats.PATHS_SCHEMA.check_match(product_list) LOG.info("Recording products '{}'...".format(", ".join(product_list))) products_dict = record_artifacts_as_dict(product_list, exclude_patterns=exclude_patterns, base_path=base_path, follow_symlink_dirs=True, normalize_line_endings=normalize_line_endings, lstrip_paths=lstrip_paths) LOG.info("Creating link metadata...") environment = {} if record_environment: environment['workdir'] = os.getcwd().replace('\\', '/') link = in_toto.models.link.Link(name=name, materials=materials_dict, products=products_dict, command=link_cmd_args, byproducts=byproducts, environment=environment) link_metadata = Metablock(signed=link, compact_json=compact_json) signature = None if signing_key: LOG.info("Signing link metadata using passed key...") signature = link_metadata.sign(signing_key) elif gpg_keyid: LOG.info("Signing link metadata using passed GPG keyid...") signature = link_metadata.sign_gpg(gpg_keyid, gpg_home=gpg_home) elif gpg_use_default: LOG.info("Signing link metadata using default GPG key ...") signature = link_metadata.sign_gpg(gpg_keyid=None, gpg_home=gpg_home) # We need the signature's keyid to write the link to keyid infix'ed filename if signature: signing_keyid = signature["keyid"] filename = FILENAME_FORMAT.format(step_name=name, keyid=signing_keyid) LOG.info("Storing link metadata to '{}'...".format(filename)) link_metadata.dump(filename) return link_metadata def in_toto_record_start(step_name, material_list, signing_key=None, gpg_keyid=None, gpg_use_default=False, gpg_home=None, exclude_patterns=None, base_path=None, record_environment=False, normalize_line_endings=False, lstrip_paths=None): """ Starts creating link metadata for a multi-part in-toto step. I.e. 
records passed materials, creates link meta data object from it, signs it with passed signing_key, gpg key identified by the passed gpg_keyid or the default gpg key and stores it to disk under UNFINISHED_FILENAME_FORMAT. One of signing_key, gpg_keyid or gpg_use_default has to be passed. step_name: A unique name to relate link metadata with a step defined in the layout. material_list: List of file or directory paths that should be recorded as materials. signing_key: (optional) If not None, link metadata is signed with this key. Format is securesystemslib.formats.KEY_SCHEMA gpg_keyid: (optional) If not None, link metadata is signed with a gpg key identified by the passed keyid. gpg_use_default: (optional) If True, link metadata is signed with default gpg key. gpg_home: (optional) Path to GPG keyring (if not set the default keyring is used). exclude_patterns: (optional) Artifacts matched by the pattern are excluded from the materials section in the resulting preliminary link. base_path: (optional) If passed, record materials relative to base_path. Default is current working directory. NOTE: The base_path part of the recorded materials is not included in the resulting preliminary link's material section. record_environment: (optional) if values such as workdir should be recorded on the environment dictionary (false by default) normalize_line_endings: (optional) If True, replaces windows and mac line endings with unix line endings before hashing materials, for cross-platform support. lstrip_paths: (optional) If a prefix path is passed, the prefix is left stripped from the path of every artifact that contains the prefix. ValueError if none of signing_key, gpg_keyid or gpg_use_default=True is passed. securesystemslib.FormatError if a signing_key is passed and does not match securesystemslib.formats.KEY_SCHEMA or a gpg_keyid is passed and does not match securesystemslib.formats.KEYID_SCHEMA or exclude_patterns are passed and don't match securesystemslib.formats.NAMES_SCHEMA, or base_path is passed and does not match securesystemslib.formats.PATH_SCHEMA or is not a directory. in_toto.gpg.exceptions.CommandError: If gpg is used for signing and the command exits with a non-zero code. Writes newly created link metadata file to disk using the filename scheme from link.UNFINISHED_FILENAME_FORMAT None. 
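
  Example (hypothetical step name; `alice_key` is assumed to be a private
  key in securesystemslib.formats.KEY_SCHEMA format):

    in_toto_record_start("edit-files", ["src/"], signing_key=alice_key)
    # writes ".edit-files.<first 8 keyid chars>.link-unfinished"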
""" LOG.info("Start recording '{}'...".format(step_name)) # Fail if there is no signing key arg at all if not signing_key and not gpg_keyid and not gpg_use_default: raise ValueError("Pass either a signing key, a gpg keyid or set" " gpg_use_default to True!") # Check key formats to fail early if signing_key: _check_match_signing_key(signing_key) if gpg_keyid: securesystemslib.formats.KEYID_SCHEMA.check_match(gpg_keyid) if exclude_patterns: securesystemslib.formats.NAMES_SCHEMA.check_match(exclude_patterns) if base_path: securesystemslib.formats.PATH_SCHEMA.check_match(base_path) if material_list: LOG.info("Recording materials '{}'...".format(", ".join(material_list))) materials_dict = record_artifacts_as_dict(material_list, exclude_patterns=exclude_patterns, base_path=base_path, follow_symlink_dirs=True, normalize_line_endings=normalize_line_endings, lstrip_paths=lstrip_paths) LOG.info("Creating preliminary link metadata...") environment = {} if record_environment: environment['workdir'] = os.getcwd().replace('\\', '/') link = in_toto.models.link.Link(name=step_name, materials=materials_dict, products={}, command=[], byproducts={}, environment=environment) link_metadata = Metablock(signed=link) if signing_key: LOG.info("Signing link metadata using passed key...") signature = link_metadata.sign(signing_key) elif gpg_keyid: LOG.info("Signing link metadata using passed GPG keyid...") signature = link_metadata.sign_gpg(gpg_keyid, gpg_home=gpg_home) else: # (gpg_use_default) LOG.info("Signing link metadata using default GPG key ...") signature = link_metadata.sign_gpg(gpg_keyid=None, gpg_home=gpg_home) # We need the signature's keyid to write the link to keyid infix'ed filename signing_keyid = signature["keyid"] unfinished_fn = UNFINISHED_FILENAME_FORMAT.format(step_name=step_name, keyid=signing_keyid) LOG.info( "Storing preliminary link metadata to '{}'...".format(unfinished_fn)) link_metadata.dump(unfinished_fn) def in_toto_record_stop(step_name, product_list, signing_key=None, gpg_keyid=None, gpg_use_default=False, gpg_home=None, exclude_patterns=None, base_path=None, normalize_line_endings=False, lstrip_paths=None): """ Finishes creating link metadata for a multi-part in-toto step. Loads unfinished link metadata file from disk, verifies that the file was signed with either the passed signing key, a gpg key identified by the passed gpg_keyid or the default gpg key. Then records products, updates unfinished Link object (products and signature), removes unfinished link file from and stores new link file to disk. One of signing_key, gpg_keyid or gpg_use_default has to be passed and it needs to be the same that was used with preceding in_toto_record_start. step_name: A unique name to relate link metadata with a step defined in the layout. product_list: List of file or directory paths that should be recorded as products. signing_key: (optional) If not None, link metadata is signed with this key. Format is securesystemslib.formats.KEY_SCHEMA gpg_keyid: (optional) If not None, link metadata is signed with a gpg key identified by the passed keyid. gpg_use_default: (optional) If True, link metadata is signed with default gpg key. gpg_home: (optional) Path to GPG keyring (if not set the default keyring is used). exclude_patterns: (optional) Artifacts matched by the pattern are excluded from the products sections in the resulting link. base_path: (optional) If passed, record products relative to base_path. Default is current working directory. 
NOTE: The base_path part of the recorded products is not included in the resulting preliminary link's product section. normalize_line_endings: (optional) If True, replaces windows and mac line endings with unix line endings before hashing products, for cross-platform support. lstrip_paths: (optional) If a prefix path is passed, the prefix is left stripped from the path of every artifact that contains the prefix. ValueError if none of signing_key, gpg_keyid or gpg_use_default=True is passed. securesystemslib.FormatError if a signing_key is passed and does not match securesystemslib.formats.KEY_SCHEMA or a gpg_keyid is passed and does not match securesystemslib.formats.KEYID_SCHEMA, or exclude_patterns are passed and don't match securesystemslib.formats.NAMES_SCHEMA, or base_path is passed and does not match securesystemslib.formats.PATH_SCHEMA or is not a directory. LinkNotFoundError if gpg is used for signing and the corresponding preliminary link file can not be found in the current working directory SignatureVerificationError: If the signature of the preliminary link file is invalid. in_toto.gpg.exceptions.KeyExpirationError: If the key used to verify the signature of the preliminary link file is an expired gpg key. in_toto.gpg.exceptions.CommandError: If gpg is used for signing and the command exits with a non-zero code. Writes newly created link metadata file to disk using the filename scheme from link.FILENAME_FORMAT Removes unfinished link file link.UNFINISHED_FILENAME_FORMAT from disk None. """ LOG.info("Stop recording '{}'...".format(step_name)) # Check that we have something to sign and if the formats are right if not signing_key and not gpg_keyid and not gpg_use_default: raise ValueError("Pass either a signing key, a gpg keyid or set" " gpg_use_default to True") if signing_key: _check_match_signing_key(signing_key) if gpg_keyid: securesystemslib.formats.KEYID_SCHEMA.check_match(gpg_keyid) if exclude_patterns: securesystemslib.formats.NAMES_SCHEMA.check_match(exclude_patterns) if base_path: securesystemslib.formats.PATH_SCHEMA.check_match(base_path) # Load preliminary link file # If we have a signing key we can use the keyid to construct the name if signing_key: unfinished_fn = UNFINISHED_FILENAME_FORMAT.format(step_name=step_name, keyid=signing_key["keyid"]) # FIXME: Currently there is no way to know the default GPG key's keyid and # so we glob for preliminary link files else: unfinished_fn_glob = UNFINISHED_FILENAME_FORMAT_GLOB.format( step_name=step_name, pattern="*") unfinished_fn_list = glob.glob(unfinished_fn_glob) if not len(unfinished_fn_list): raise in_toto.exceptions.LinkNotFoundError("Could not find a preliminary" " link for step '{}' in the current working directory.".format( step_name)) if len(unfinished_fn_list) > 1: raise in_toto.exceptions.LinkNotFoundError("Found more than one" " preliminary links for step '{}' in the current working directory:" " {}. 
We need exactly one to stop recording.".format( step_name, ", ".join(unfinished_fn_list))) unfinished_fn = unfinished_fn_list[0] LOG.info("Loading preliminary link metadata '{}'...".format(unfinished_fn)) link_metadata = Metablock.load(unfinished_fn) # The file must have been signed by the same key # If we have a signing_key we use it for verification as well if signing_key: LOG.info( "Verifying preliminary link signature using passed signing key...") keyid = signing_key["keyid"] verification_key = signing_key elif gpg_keyid: LOG.info("Verifying preliminary link signature using passed gpg key...") gpg_pubkey = in_toto.gpg.functions.gpg_export_pubkey(gpg_keyid, gpg_home) keyid = gpg_pubkey["keyid"] verification_key = gpg_pubkey else: # must be gpg_use_default # FIXME: Currently there is no way to know the default GPG key's keyid # before signing. As a workaround we extract the keyid of the preliminary # Link file's signature and try to export a pubkey from the gpg # keyring. We do this even if a gpg_keyid was specified, because gpg # accepts many different ids (mail, name, parts of an id, ...) but we # need a specific format. LOG.info("Verifying preliminary link signature using default gpg key...") keyid = link_metadata.signatures[0]["keyid"] gpg_pubkey = in_toto.gpg.functions.gpg_export_pubkey(keyid, gpg_home) verification_key = gpg_pubkey link_metadata.verify_signature(verification_key) # Record products if a product path list was passed if product_list: LOG.info("Recording products '{}'...".format(", ".join(product_list))) link_metadata.signed.products = record_artifacts_as_dict( product_list, exclude_patterns=exclude_patterns, base_path=base_path, follow_symlink_dirs=True, normalize_line_endings=normalize_line_endings, lstrip_paths=lstrip_paths) link_metadata.signatures = [] if signing_key: LOG.info("Updating signature with key '{:.8}...'...".format(keyid)) link_metadata.sign(signing_key) else: # gpg_keyid or gpg_use_default # In both cases we use the keyid we got from verifying the preliminary # link signature above. LOG.info("Updating signature with gpg key '{:.8}...'...".format(keyid)) link_metadata.sign_gpg(keyid, gpg_home) fn = FILENAME_FORMAT.format(step_name=step_name, keyid=keyid) LOG.info("Storing link metadata to '{}'...".format(fn)) link_metadata.dump(fn) LOG.info("Removing unfinished link metadata '{}'...".format(unfinished_fn)) os.remove(unfinished_fn) in-toto-0.4.0/in_toto/settings.py0000644000076500000240000000265113535414523017100 0ustar lukpstaff00000000000000""" settings.py Lukas Puehringer June 23, 2016 See LICENSE for licensing information. A central place to define default settings that can be used throughout the package. Defaults can be changed, - here (hardcoded), - programmatically, e.g. ``` import in_toto.settings in_toto.settings.ARTIFACT_BASE_PATH = "/home/user/project" ``` - or, when using in-toto via command line tooling, with environment variables or RCfiles, see the `in_toto.user_settings` module """ # The debug setting is used to set to the in-toto base logger to logging.DEBUG # TODO: This setting is currently not available via environment variables or # rcfiles, partially because at the moment it is read before we parse # envvars/rcfiles. Do we want to make it available to those as well? 
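# A minimal sketch of a programmatic override (assuming `in_toto.settings`
# is importable):
#
#   import in_toto.settings
#   in_toto.settings.DEBUG = True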
DEBUG = False # See docstring of `in-toto.record_artifacts_as_dict` for how this is used ARTIFACT_EXCLUDE_PATTERNS = ["*.link*", ".git", "*.pyc", "*~"] # Used as base path for --materials and --products arguments when running # in-toto-run/in-toto-record # If not set the current working directory is used as base path # FIXME: Do we want different base paths for materials and products? ARTIFACT_BASE_PATH = None # If the process does not terminate after timeout seconds, a # subprocess.TimeoutExpired exception will be raised. SUBPROCESS_TIMEOUT = 3 in-toto-0.4.0/in_toto/user_settings.py0000644000076500000240000001306713535414523020141 0ustar lukpstaff00000000000000""" user_settings.py Lukas Puehringer Oct 25, 2017 See LICENSE for licensing information. Provides methods to parse environment variables (`get_env`) and RCfiles (`get_rc`) and to override default settings (`set_settings`) defined in the `in_toto.settings` module. Check out the respective docstrings to learn about the requirements for environment variables and RCfiles (includes examples). """ import os import six import logging import in_toto.settings try: import configparser except ImportError: # pragma: no cover import ConfigParser as configparser # Inherits from in_toto base logger (c.f. in_toto.log) LOG = logging.getLogger(__name__) USER_PATH = os.path.expanduser("~") # Prefix required by environment variables to be considered as in_toto settings ENV_PREFIX = "IN_TOTO_" # List of considered rcfile paths in the order they get parsed and overridden, # i.e. the same setting in `/etc/in_toto/config` and `.in_totorc` (cwd) uses # the latter RC_PATHS = [ os.path.join("/etc", "in_toto", "config"), os.path.join("/etc", "in_totorc"), os.path.join(USER_PATH, ".config", "in_toto", "config"), os.path.join(USER_PATH, ".config", "in_toto"), os.path.join(USER_PATH, ".in_toto", "config"), os.path.join(USER_PATH, ".in_totorc"), ".in_totorc" ] # List of settings, for which defaults exist in `settings.py` # TODO: Should we use `dir` on the module instead? If we list them here, we # have to manually update if `settings.py` changes. IN_TOTO_SETTINGS = [ "ARTIFACT_EXCLUDE_PATTERNS", "ARTIFACT_BASE_PATH" ] def _colon_split(value): """ If `value` contains colons, return a list split at colons, return value otherwise. """ value_list = value.split(":") if len(value_list) > 1: return value_list return value def get_env(): """ Parse environment for variables with prefix `ENV_PREFIX` and return a dict of key-value pairs. The prefix `ENV_PREFIX` is stripped from the keys in the returned dict. Values that contain colons (:) are split at the postion of the colons and converted into a list. Example: ``` # Exporting variables in e.g. bash export IN_TOTO_ARTIFACT_BASE_PATH='/home/user/project' export IN_TOTO_ARTIFACT_EXCLUDE_PATTERNS='*.link:.gitignore' ``` produces ``` { "ARTIFACT_BASE_PATH": "/home/user/project" "ARTIFACT_EXCLUDE_PATTERNS": ["*.link", ".gitignore"] } ``` None. Calls function to read files from disk. A dictionary containing the parsed key-value pairs. """ env_dict = {} for name, value in six.iteritems(os.environ): if (name.startswith(ENV_PREFIX) and len(name) > len(ENV_PREFIX)): stripped_name = name[len(ENV_PREFIX):] env_dict[stripped_name] = _colon_split(value) return env_dict def get_rc(): """ Reads RCfiles from the paths defined in `RC_PATHS` and returns a dictionary with all parsed key-value pairs. 
The RCfile format is as expected by Python's builtin `ConfigParser` with the addition that values that contain colons (:) are split at the position of the colons and converted into a list. Section titles in RCfiles are ignored when parsing the key-value pairs. However, there has to be at least one section defined. The paths in `RC_PATHS` are ordered in reverse precedence, i.e. each file's settings override a previous file's settings, e.g. a setting defined in `.in_totorc` (in the current working dir) overrides the same setting defined in `~/.in_totorc` (in the user's home dir) and so on ... Example: ``` # E.g. file `.in_totorc` in current working directory [in-toto setting] ARTIFACT_BASE_PATH = /home/user/project ARTIFACT_EXCLUDE_PATTERNS = *.link:.gitignore ``` produces ``` { "ARTIFACT_BASE_PATH": "/home/user/project" "ARTIFACT_EXCLUDE_PATTERNS": ["*.link", ".gitignore"] } ``` <Exceptions> None. <Side Effects> Calls function to read files from disk. <Returns> A dictionary containing the parsed key-value pairs. """ rc_dict = {} config = configparser.ConfigParser() # Reset `optionxform`'s default case conversion to enable case-sensitivity config.optionxform = str config.read(RC_PATHS) for section in config.sections(): for name, value in config.items(section): rc_dict[name] = _colon_split(value) return rc_dict def set_settings(): """ Calls functions that read in-toto related environment variables and RCfiles and overrides variables in `settings.py` with the retrieved values, if they are whitelisted in `IN_TOTO_SETTINGS`. Settings defined in RCfiles take precedence over settings defined in environment variables. <Exceptions> None. <Side Effects> Calls functions that read environment variables and files from disk. <Returns> None. """ user_settings = get_env() user_settings.update(get_rc()) # If the user has specified one of the settings whitelisted in # IN_TOTO_SETTINGS per envvar or rcfile, override the item in `settings.py` for setting in IN_TOTO_SETTINGS: user_setting = user_settings.get(setting) if user_setting: LOG.info("Setting (user): {0}={1}".format( setting, user_setting)) setattr(in_toto.settings, setting, user_setting) else: default_setting = getattr(in_toto.settings, setting) LOG.info("Setting (default): {0}={1}".format( setting, default_setting)) in-toto-0.4.0/in_toto/util.py0000644000076500000240000002031513535414523016212 0ustar lukpstaff00000000000000# TODO: Add module docstring and remove pylint exemption in in-toto/in-toto#80 # pylint: disable=missing-docstring import sys import getpass import in_toto.gpg.functions import in_toto.gpg.formats import securesystemslib.formats import securesystemslib.hash import securesystemslib.interface import securesystemslib.keys import securesystemslib.exceptions from in_toto.exceptions import UnsupportedKeyTypeError from securesystemslib.interface import (import_ed25519_privatekey_from_file, import_ed25519_publickey_from_file) DEFAULT_RSA_KEY_BITS = 3072 KEY_TYPE_RSA = 'rsa' KEY_TYPE_ED25519 = 'ed25519' SUPPORTED_KEY_TYPES = [KEY_TYPE_ED25519, KEY_TYPE_RSA] def generate_and_write_rsa_keypair(filepath, bits=DEFAULT_RSA_KEY_BITS, password=""): """ Calls securesystemslib.interface.generate_and_write_rsa_keypair with a default password of "" so as to prevent the library from prompting for a password. If a prompt is needed, it is handled by the caller (e.g. prompt_generate_and_write_rsa_keypair below) and the password is then forwarded. <Arguments> filepath: Path to where the private key is written. The public key is written to the same path with a ".pub" suffix. bits: (optional) Key size of the generated RSA key. password: (optional) Password to be used to encrypt the private key created.
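Example (illustrative sketch, the file name "mykey" is hypothetical): ``` # writes the encrypted private key to 'mykey' and the public key to 'mykey.pub' generate_and_write_rsa_keypair("mykey", bits=3072, password="pw") ```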
<Exceptions> Only those from securesystemslib.interface.generate_and_write_rsa_keypair <Side Effects> Same as securesystemslib.interface.generate_and_write_rsa_keypair <Returns> None. """ securesystemslib.interface.generate_and_write_rsa_keypair(filepath, bits, password) def generate_and_write_ed25519_keypair(filepath, password=""): """ Calls securesystemslib.interface.generate_and_write_ed25519_keypair with a default password of "" so as to prevent the library from prompting for a password. If a prompt is needed, it is handled by the caller (e.g. prompt_generate_and_write_ed25519_keypair below) and the password is then forwarded. <Arguments> filepath: Path to where the private key is written. The public key is written to the same path with a ".pub" suffix. password: (optional) Password to be used to encrypt the private key created. <Exceptions> Same as securesystemslib.interface.generate_and_write_ed25519_keypair <Side Effects> Same as securesystemslib.interface.generate_and_write_ed25519_keypair <Returns> None. """ securesystemslib.interface.generate_and_write_ed25519_keypair(filepath, password) def import_rsa_key_from_file(filepath, password=None): """ Import the RSA key stored in PEM format at 'filepath'. This can be a public key or a private key. If it is a private key and the password is specified, it will be used to decrypt the private key. <Arguments> filepath: file, an RSA PEM file password: (optional) If a password is specified, the imported private key will be decrypted <Exceptions> securesystemslib.exceptions.FormatError, if the arguments are improperly formatted <Side Effects> 'filepath' is read and its contents extracted <Returns> An RSA key object conformant to 'tuf.formats.RSAKEY_SCHEMA' """ securesystemslib.formats.PATH_SCHEMA.check_match(filepath) with open(filepath, "rb") as fo_pem: rsa_pem = fo_pem.read().decode("utf-8") if securesystemslib.keys.is_pem_private(rsa_pem): rsa_key = securesystemslib.keys.import_rsakey_from_private_pem( rsa_pem, password=password) elif securesystemslib.keys.is_pem_public(rsa_pem): rsa_key = securesystemslib.keys.import_rsakey_from_public_pem(rsa_pem) else: raise securesystemslib.exceptions.FormatError( "The key has to be either a private or" " a public RSA key in PEM format") return rsa_key def import_public_keys_from_files_as_dict(filepaths, key_types=None): """ Takes a list of filepaths to public keys and returns them as a dictionary conformant with securesystemslib.formats.KEYDICT_SCHEMA. <Arguments> filepaths: List of paths to the public keys. key_types: (optional) List of the types of the keys being imported into the dict. If not specified, all keys are assumed to be RSA. <Exceptions> securesystemslib.exceptions.FormatError, if the arguments don't have the same length. UnsupportedKeyTypeError, if the key_type specified is unsupported. <Side Effects> Each file in 'filepaths' is read and its contents extracted <Returns> A key dict object conformant with securesystemslib.formats.KEYDICT_SCHEMA """ # are key_types needed? # we could figure it out using the key format if key_types is None: key_types = [KEY_TYPE_RSA] * len(filepaths) if len(key_types) != len(filepaths): raise securesystemslib.exceptions.FormatError( "number of key_types should match with the number" " of layout keys specified") key_dict = {} for idx, filepath in enumerate(filepaths): if key_types[idx] == KEY_TYPE_ED25519: key = import_ed25519_publickey_from_file(filepath) elif key_types[idx] == KEY_TYPE_RSA: key = import_rsa_key_from_file(filepath) else: # pragma: no cover # This branch is never possible as argparse already checks valid keys # via the choices parameter.
raise UnsupportedKeyTypeError('Unsupported keytype: ' + key_types[idx]) securesystemslib.formats.PUBLIC_KEY_SCHEMA.check_match(key) keyid = key["keyid"] key_dict[keyid] = key return key_dict def import_gpg_public_keys_from_keyring_as_dict(keyids, gpg_home=False): """Creates a dictionary of gpg public keys, retrieving the gpg public keys identified by the list of passed `keyids` from the gpg keyring at `gpg_home`. If `gpg_home` is False the default keyring is used. """ key_dict = {} for gpg_keyid in keyids: pub_key = in_toto.gpg.functions.gpg_export_pubkey(gpg_keyid, homedir=gpg_home) in_toto.gpg.formats.PUBKEY_SCHEMA.check_match(pub_key) keyid = pub_key["keyid"] key_dict[keyid] = pub_key return key_dict def prompt_password(prompt="Enter password: "): """Prompts for password input and returns the password. """ return getpass.getpass(prompt, sys.stderr) def import_private_key_from_file(filepath, key_type): """ Tries to load a key without a password. If a CryptoError occurs, prompts the user for a password and tries to load the key again. <Arguments> filepath: file, a private key file key_type: Type of the private key being imported. <Exceptions> UnsupportedKeyTypeError, if the key_type specified is unsupported. <Side Effects> 'filepath' is read and its contents extracted <Returns> A private key object conformant with securesystemslib.formats.KEY_SCHEMA """ if key_type == KEY_TYPE_ED25519: key = prompt_import_ed25519_privatekey_from_file(filepath) elif key_type == KEY_TYPE_RSA: key = prompt_import_rsa_key_from_file(filepath) else: raise UnsupportedKeyTypeError('Unsupported keytype: ' + key_type) return key def prompt_import_ed25519_privatekey_from_file(filepath): """Tries to load an Ed25519 private key without a password. If a CryptoError occurs, prompts the user for a password and tries to load the key again. """ password = None try: import_ed25519_privatekey_from_file(filepath, password) except securesystemslib.exceptions.CryptoError: password = prompt_password() return import_ed25519_privatekey_from_file(filepath, password) def prompt_import_rsa_key_from_file(filepath): """Tries to load an RSA key without a password. If a CryptoError occurs, prompts the user for a password and tries to load the key again. """ password = None try: import_rsa_key_from_file(filepath, password) except securesystemslib.exceptions.CryptoError: password = prompt_password() return import_rsa_key_from_file(filepath, password) def prompt_generate_and_write_rsa_keypair(filepath, bits): """Prompts for password and calls generate_and_write_rsa_keypair""" password = prompt_password() generate_and_write_rsa_keypair(filepath, bits, password) def prompt_generate_and_write_ed25519_keypair(filepath): """Prompts for password and calls generate_and_write_ed25519_keypair""" password = prompt_password() generate_and_write_ed25519_keypair(filepath, password) in-toto-0.4.0/in_toto/verifylib.py0000644000076500000240000014406613535414523017232 0ustar lukpstaff00000000000000""" <Program Name> verifylib.py <Author> Lukas Puehringer Santiago Torres-Arias <Started> June 28, 2016 <Copyright> See LICENSE for licensing information. <Purpose> Provides a library to verify an in-toto final product containing a software supply chain layout. Take a look at `in_toto_verify`'s docstring for more details about the entire verification workflow.
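Example (minimal sketch, the file names are illustrative): given a signed layout `root.layout` and the project owner's public key `alice.pub`, final product verification could be invoked as ``` from in_toto.models.metadata import Metablock from in_toto.util import import_rsa_key_from_file from in_toto.verifylib import in_toto_verify layout = Metablock.load("root.layout") alice_pub = import_rsa_key_from_file("alice.pub") in_toto_verify(layout, {alice_pub["keyid"]: alice_pub}) ```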
""" import os import datetime import iso8601 import fnmatch import six import logging from dateutil import tz import securesystemslib.exceptions import in_toto.settings import in_toto.util import in_toto.runlib import in_toto.models.layout import in_toto.models.link import in_toto.formats from in_toto.models.metadata import Metablock from in_toto.models.link import (FILENAME_FORMAT, FILENAME_FORMAT_SHORT) from in_toto.models.layout import SUBLAYOUT_LINK_DIR_FORMAT from in_toto.exceptions import (RuleVerificationError, LayoutExpiredError, ThresholdVerificationError, BadReturnValueError, SignatureVerificationError) from in_toto.gpg.exceptions import KeyExpirationError import in_toto.rulelib # Inherits from in_toto base logger (c.f. in_toto.log) LOG = logging.getLogger(__name__) RULE_TRACE = {} def _raise_on_bad_retval(return_value, command=None): """ Internal function that checks return values of shell commands, e.g. from inspections. Raises exception if the passed value is non-int and non-zero. return_value: The return value to be verified command: (optional) The command whose execution returned the value, used for exception message. BadReturnValueError if the return_value is non-int and non-zero None. None. """ msg = "Got non-{what} " + "return value '{}'".format(return_value) if command: msg = "{0} from command '{1}'.".format(msg, command) else: msg = "{0}.".format(msg) if not isinstance(return_value, int): raise BadReturnValueError(msg.format(what="int")) # TODO: in-toto specification suggests special behavior on # return_value == 127, but does not fully define that behavior yet if return_value != 0: raise BadReturnValueError(msg.format(what="zero")) def load_links_for_layout(layout, link_dir_path): """ Try to load all existing metadata files for each Step of the Layout from the current directory. For each step the metadata might consist of multiple (thresholds) Link or Layout (sub-layouts) files. layout: Layout object link_dir_path: A path to directory where links are loaded from Calls function to read files from disk in_toto.exceptions.LinkNotFoundError, if fewer than `threshold` link files can be found for any given step of the supply chain (preliminary threshold check) A dictionary carrying all the found metadata corresponding to the passed layout, e.g.: { : { : , ... }, ... } """ steps_metadata = {} # Iterate over all the steps in the layout for step in layout.steps: links_per_step = {} # We try to load a link for every authorized functionary, but don't fail # if the file does not exist (authorized != required) # FIXME: Should we really pass on IOError, or just skip inexistent links? for authorized_keyid in step.pubkeys: # Iterate over the authorized key and if present over subkeys for keyid in [authorized_keyid] + list(layout.keys.get(authorized_keyid, {}).get("subkeys", {}).keys()): filename = FILENAME_FORMAT.format(step_name=step.name, keyid=keyid) filepath = os.path.join(link_dir_path, filename) try: metadata = Metablock.load(filepath) links_per_step[keyid] = metadata except IOError: pass # This is only a preliminary threshold check, based on (authorized) # filenames, to fail early. A more thorough signature-based threshold # check is indispensable. if len(links_per_step) < step.threshold: raise in_toto.exceptions.LinkNotFoundError("Step '{0}' requires '{1}'" " link metadata file(s), found '{2}'." 
.format(step.name, step.threshold, len(links_per_step))) steps_metadata[step.name] = links_per_step return steps_metadata def run_all_inspections(layout): """ Extracts all inspections from a passed Layout's inspect field and iteratively runs each command defined in the Inspection's `run` field using `runlib.in_toto_run`, which returns a Metablock object containing a Link object. If a link command returns non-zero the verification is aborted. <Arguments> layout: A Layout object which is used to extract the Inspections. <Exceptions> Calls function that raises BadReturnValueError if an inspection returned non-int or non-zero. <Returns> A dictionary of metadata about the executed inspections, e.g.: { <inspection name> : { <Metablock containing a Link object>, ... }, ... } """ inspection_links_dict = {} for inspection in layout.inspect: LOG.info("Executing command for inspection '{}'...".format( inspection.name)) # FIXME: We don't want to use the base path for runlib so we patch this # for now. This will not stay! base_path_backup = in_toto.settings.ARTIFACT_BASE_PATH in_toto.settings.ARTIFACT_BASE_PATH = None # FIXME: What should we record as material/product? # Is the current directory a sensible default? In general? # If so, we should probably make it a default in run_link # We could use artifact rule paths. material_list = product_list = ["."] link = in_toto.runlib.in_toto_run(inspection.name, material_list, product_list, inspection.run) _raise_on_bad_retval( link.signed.byproducts.get("return-value"), inspection.run) inspection_links_dict[inspection.name] = link # Dump the inspection link file for auditing # Keep in mind that this pollutes the verifier's (client's) filesystem. filename = FILENAME_FORMAT_SHORT.format(step_name=inspection.name) link.dump(filename) in_toto.settings.ARTIFACT_BASE_PATH = base_path_backup return inspection_links_dict def verify_layout_expiration(layout): """ Raises an exception if the passed layout has expired, i.e. if its `expires` property is earlier than "now". Time zone aware datetime objects in UTC+00:00 (Zulu Time) are used. <Arguments> layout: The Layout object to be verified. <Exceptions> LayoutExpiredError TBA (see https://github.com/in-toto/in-toto/issues/6) <Side Effects> None. """ expire_datetime = iso8601.parse_date(layout.expires) if expire_datetime < datetime.datetime.now(tz.tzutc()): raise LayoutExpiredError("Layout expired") def substitute_parameters(layout, parameter_dictionary): """ This function is a transitional measure for parameter substitution (or any other solution defined by the in-toto team). As of now, it acts as a very simple replacement layer for Python-like format string parameters. <Arguments> layout: The Layout object to process. parameter_dictionary: A dictionary containing key-value pairs for substitution. <Exceptions> securesystemslib.exceptions.FormatError: if the parameter dictionary is malformed. KeyError: if one of the keys in the parameter dictionary is not present for substitution <Side Effects> The layout object will have any format string tags replaced with the corresponding values defined in the parameter dictionary.
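Example (illustrative): a step whose expected_command is ["{pip}", "install", "."] combined with parameter_dictionary {"pip": "/usr/bin/pip"} yields the expected_command ["/usr/bin/pip", "install", "."], since each stanza is passed through Python's str.format().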
""" in_toto.formats.PARAMETER_DICTIONARY_SCHEMA.check_match(parameter_dictionary) for step in layout.steps: new_material_rules = [] for rule in step.expected_materials: new_rule = [] for stanza in rule: new_rule.append(stanza.format(**parameter_dictionary)) new_material_rules.append(new_rule) new_product_rules = [] for rule in step.expected_products: new_rule = [] for stanza in rule: new_rule.append(stanza.format(**parameter_dictionary)) new_product_rules.append(new_rule) new_expected_command = [] for argv in step.expected_command: new_expected_command.append(argv.format(**parameter_dictionary)) step.expected_command = new_expected_command step.expected_materials = new_material_rules step.expected_products = new_product_rules for inspection in layout.inspect: new_material_rules = [] for rule in inspection.expected_materials: new_rule = [] for stanza in rule: new_rule.append(stanza.format(**parameter_dictionary)) new_material_rules.append(new_rule) new_product_rules = [] for rule in inspection.expected_products: new_rule = [] for stanza in rule: new_rule.append(stanza.format(**parameter_dictionary)) new_product_rules.append(new_rule) new_run = [] for argv in inspection.run: new_run.append(argv.format(**parameter_dictionary)) inspection.run = new_run inspection.expected_materials = new_material_rules inspection.expected_products = new_product_rules def verify_layout_signatures(layout_metablock, keys_dict): """ Iteratively verifies the signatures of a Metablock object containing a Layout object for every verification key in the passed keys dictionary. Requires at least one key to be passed and requires every passed key to find a valid signature. layout_metablock: A Metablock object containing a Layout whose signatures are verified. keys_dict: A dictionary of keys to verify the signatures conformant with securesystemslib.formats.ANY_VERIFICATION_KEY_DICT_SCHEMA. securesystemslib.exceptions.FormatError if the passed key dict does not match ANY_VERIFICATION_KEY_DICT_SCHEMA. SignatureVerificationError if an empty verification key dictionary was passed, or if any of the passed verification keys fails to verify a signature. in_toto.gpg.exceptions.KeyExpirationError: if any of the passed verification keys is an expired gpg key """ in_toto.formats.ANY_VERIFICATION_KEY_DICT_SCHEMA.check_match(keys_dict) # Fail if an empty verification key dictionary was passed if len(keys_dict) < 1: raise SignatureVerificationError("Layout signature verification" " requires at least one key.") # Fail if any of the passed keys can't verify a signature on the Layout for junk, verify_key in six.iteritems(keys_dict): layout_metablock.verify_signature(verify_key) def verify_link_signature_thresholds(layout, chain_link_dict): """ Verify that for each step of the layout there are at least `threshold` links, signed by different authorized functionaries and return the chain link dictionary containing only authorized links whose signatures were successfully verified. NOTE: If the layout's key store (`layout.keys`) lists a (master) key `K`, with a subkey `K'`, then `K'` is authorized implicitly, to sign any link that `K` is authorized to sign. In other words, the trust in a master key extends to the trust in a subkey. The inverse is not true. layout: A Layout object whose Steps are extracted and verified. chain_link_dict: A dictionary containing link metadata per functionary per step, e.g.: { : { : , ... }, ... 
} <Exceptions> ThresholdVerificationError If any of the steps of the passed layout does not have enough (`step.threshold`) links signed by different authorized functionaries. <Returns> A chain_link_dict containing only authorized links whose signatures were successfully verified. """ # Create an inverse keys-subkeys dictionary, with subkey keyids as # dictionary keys and main keys as dictionary values. This will be # required below to assess main-subkey trust delegations. # We assume that a given subkey can only belong to one master key # TODO: Is this a safe assumption? Should we assert for it? main_keys_for_subkeys = {} for main_key in list(layout.keys.values()): for sub_keyid in main_key.get("subkeys", []): main_keys_for_subkeys[sub_keyid] = main_key # Dict for valid and authorized links of all steps of the layout verified_chain_link_dict = {} # For each step of the layout check the signatures of corresponding links. # Consider only links where the signature is valid and keys are authorized, # and discard others. # Only count one of multiple links signed with different subkeys of a main # key towards link threshold. # Only proceed with final product verification if threshold requirements are # fulfilled. for step in layout.steps: # Dict for valid and authorized links of a given step verified_key_link_dict = {} # List of used keyids used_main_keyids = [] # Do per step link threshold verification for link_keyid, link in six.iteritems(chain_link_dict.get(step.name, {})): # Iterate over authorized keyids to find a key or subkey corresponding # to the given link and check if the link's keyid is authorized. # Subkeys of authorized main keys are authorized implicitly. for authorized_keyid in step.pubkeys: authorized_key = layout.keys.get(authorized_keyid) main_key_for_subkey = main_keys_for_subkeys.get(authorized_keyid) # The signing key is authorized if authorized_key and link_keyid == authorized_keyid: verification_key = authorized_key break # The signing key is an authorized subkey elif main_key_for_subkey and link_keyid == authorized_keyid: verification_key = main_key_for_subkey break # The signing key is a subkey of an authorized key elif (authorized_key and link_keyid in authorized_key.get("subkeys", {}).keys()): verification_key = authorized_key break else: LOG.info("Skipping link. Keyid '{0}' is not authorized to sign links" " for step '{1}'".format(link_keyid, step.name)) continue # Verify signature and skip invalidly signed links try: link.verify_signature(verification_key) except SignatureVerificationError: LOG.info("Skipping link. Broken link signature with keyid '{0}'" " for step '{1}'".format(link_keyid, step.name)) continue except KeyExpirationError as e: LOG.info("Skipping link. {}".format(e)) continue # Warn if there are links signed by different subkeys of same main key if verification_key["keyid"] in used_main_keyids: LOG.warning("Found links signed by different subkeys of the same main" " key '{}' for step '{}'. Only one of them is counted towards the" " step threshold.".format(verification_key["keyid"], step.name)) used_main_keyids.append(verification_key["keyid"]) # Keep only links with valid and authorized signature verified_key_link_dict[link_keyid] = link # For each step, verify that we have enough validly signed links from # distinct authorized functionaries. Links signed by different subkeys of # the same main key are counted only once towards the threshold.
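# Worked example (illustrative): with three validly signed links, two of them # signed with different subkeys of the same main key A and one with main key # B, used_main_keyids is [A, A, B], so the count below yields 3 - (3 - 2) = 2, # i.e. the two subkey-signed links only count once towards the threshold.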
valid_authorized_links_cnt = (len(verified_key_link_dict) - (len(used_main_keyids) - len(set(used_main_keyids)))) # TODO: To guarantee that links are signed by different functionaries # we rely on the layout to not carry duplicate verification keys under # different dictionary keys, e.g. {keyid1: KEY1, keyid2: KEY1} # Maybe we should add such a check to the layout validation? Or here? if valid_authorized_links_cnt < step.threshold: raise ThresholdVerificationError("Step '{}' requires at least '{}' links" " validly signed by different authorized functionaries. Only" " found '{}'".format(step.name, step.threshold, valid_authorized_links_cnt)) # Add all good links of this step to the dictionary of links of all steps verified_chain_link_dict[step.name] = verified_key_link_dict # Threshold verification succeeded, return valid and authorized links for # further verification return verified_chain_link_dict def verify_command_alignment(command, expected_command): """ Checks if a run command aligns with an expected command. The commands align if all of their elements are equal. If alignment fails, a warning is printed. Note: Command alignment is a weak guarantee, because a functionary can easily alias commands. <Arguments> command: A command list, e.g. ["vi", "foo.py"] expected_command: A command list, e.g. ["make", "install"] <Exceptions> None. <Side Effects> Logs warning in case commands do not align. """ # In what case command alignment should fail and how that failure should be # propagated has been thoroughly discussed in: # https://github.com/in-toto/in-toto/issues/46 and # https://github.com/in-toto/in-toto/pull/47 # We chose the simplest solution for now, i.e. warn if they do not align. if command != expected_command: LOG.warning("Run command '{0}' differs from expected command '{1}'" .format(command, expected_command)) def verify_all_steps_command_alignment(layout, chain_link_dict): """ Iteratively checks if all expected commands as defined in the Steps of a Layout align with the actual commands as recorded in the Link metadata. <Arguments> layout: A Layout object to extract the expected commands from. chain_link_dict: A dictionary containing link metadata per functionary per step, e.g.: { <step name> : { <functionary keyid> : <Metablock containing a Link object>, ... }, ... } <Exceptions> None. <Returns> None. """ for step in layout.steps: # Find the corresponding link for this step expected_command = step.expected_command key_link_dict = chain_link_dict[step.name] # FIXME: I think we could do this for one link per step only # providing that we verify command alignment AFTER threshold equality for keyid, link in six.iteritems(key_link_dict): LOG.info("Verifying command alignment for '{0}'...".format( in_toto.models.link.FILENAME_FORMAT.format(step_name=step.name, keyid=keyid))) command = link.signed.command verify_command_alignment(command, expected_command) def verify_match_rule(rule_data, artifacts_queue, source_artifacts, links): """ Filters artifacts from artifact queue using rule pattern and optional rule source prefix and consumes them if there is a corresponding destination artifact, filtered using the same rule pattern and an optional rule destination prefix, and source and destination artifacts have matching hashes. NOTE: The destination artifacts are extracted from the links dictionary, using destination name and destination type from the rule data. The source artifacts could also be extracted from the links dictionary, but would require the caller to pass source name and source type, as those are not encoded in the rule. However, we choose to let the caller directly pass the relevant artifacts.
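Example (illustrative, rule values are hypothetical): the rule "MATCH foo.tar.gz IN dist WITH PRODUCTS IN dist FROM package" is unpacked by in_toto.rulelib into {"rule_type": "match", "pattern": "foo.tar.gz", "source_prefix": "dist", "dest_prefix": "dist", "dest_type": "products", "dest_name": "package"}, which are exactly the fields consumed below.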
rule_data: An unpacked "MATCH" rule (see in_toto.rulelib). artifacts_queue: Not yet consumed artifacts (paths only). source_artifacts: All artifacts of the source item (including hashes). links: A dictionary containing link metadata per step or inspection, e.g.: { : , ... } None. None. The set of consumed artifacts (paths only). """ consumed = set() # The rule can only consume artifacts if the destination link exists dest_link = links.get(rule_data["dest_name"]) if not dest_link: return consumed # Extract destination artifacts from destination link dest_artifacts = getattr(dest_link.signed, rule_data["dest_type"]) # Filter part 1 - Filter artifacts using optional source prefix, and subtract # prefix before filtering with rule pattern (see filter part 2) to prevent # globbing in the prefix. if rule_data["source_prefix"]: filtered_source_paths = [] # Add trailing slash to source prefix if it does not exist normalized_source_prefix = os.path.join( rule_data["source_prefix"], "").replace("\\", "/") for artifact_path in artifacts_queue: if artifact_path.startswith(normalized_source_prefix): filtered_source_paths.append( artifact_path[len(normalized_source_prefix):]) else: filtered_source_paths = artifacts_queue # Filter part 2 - glob above filtered artifact paths filtered_source_paths = fnmatch.filter( filtered_source_paths, rule_data["pattern"]) # Iterate over filtered source paths and try to match the corresponding # source artifact hash with the corresponding destination artifact hash for path in filtered_source_paths: # If a source prefix was specified, we subtracted the prefix above before # globbing. We have to re-prepend the prefix in order to retrieve the # corresponding source artifact below. if rule_data["source_prefix"]: full_source_path = os.path.join( rule_data["source_prefix"], path).replace("\\", "/") else: full_source_path = path # If a destination prefix was specified, the destination artifact should # be queried with the full destination path, i.e. the prefix joined with # the globbed path. if rule_data["dest_prefix"]: full_dest_path = os.path.join( rule_data["dest_prefix"], path).replace("\\", "/") else: full_dest_path = path # Extract source artifact hash dict # We know the source artifact is available, it is also in the queue source_artifact = source_artifacts[full_source_path] # Don't consume source artifact w/o corresponding dest artifact (by path) try: dest_artifact = dest_artifacts[full_dest_path] except KeyError: continue # Don't consume source artifact w/o corresponding dest artifact (by hash) if source_artifact != dest_artifact: continue # Source and destination matched, consume artifact consumed.add(full_source_path) return consumed def verify_create_rule(rule_pattern, artifacts_queue, materials, products): """ Filters artifacts from artifacts queue using rule pattern and consumes them if they are not in the materials set but are in the products set, i.e. were created. rule_pattern: A "CREATE" rule pattern (see in_toto.rulelib). artifacts_queue: Not yet consumed artifacts (paths only). materials: All materials of an item (paths only). products: All products of an item (paths only). None. None. The set of consumed artifacts (paths only). 
""" # Filter queued artifacts using the rule pattern filtered_artifacts = fnmatch.filter(artifacts_queue, rule_pattern) # Consume filtered artifacts that are products but not materials consumed = set(filtered_artifacts) & (products - materials) return consumed def verify_delete_rule(rule_pattern, artifacts_queue, materials, products): """ Filters artifacts from artifacts queue using rule pattern and consumes them if they are in the materials set but are not in the products set, i.e. were deleted. rule_pattern: A "DELETE" rule pattern (see in_toto.rulelib). artifacts_queue: Not yet consumed artifacts (paths only). materials: All materials of an item (paths only). products: All products of an item (paths only). None. None. The set of consumed artifacts (paths only). """ # Filter queued artifacts using the rule pattern filtered_artifacts = fnmatch.filter(artifacts_queue, rule_pattern) # Consume filtered artifacts that are materials but not products consumed = set(filtered_artifacts) & (materials - products) return consumed def verify_modify_rule(rule_pattern, artifacts_queue, materials, products): """ Filters artifacts from artifacts queue using rule pattern and consumes them if they are in both the materials dict and in the products doct, but have different hashes, i.e. were modified. rule_pattern: A "MODIFY" rule pattern (see in_toto.rulelib). artifacts_queue: Not yet consumed artifacts (paths only). materials: All materials of an item (including hashes). products: All products of an item (including hashes). None. None. The set of consumed artifacts (paths only). """ # Filter queued artifacts using the rule pattern filtered_artifacts = fnmatch.filter(artifacts_queue, rule_pattern) # Filter filtered artifacts that are materials and products filtered_artifacts = set(filtered_artifacts) & \ set(materials.keys()) & set(products.keys()) # Consume filtered artifacts that have different hashes consumed = set() for path in filtered_artifacts: if materials[path] != products[path]: consumed.add(path) return consumed def verify_allow_rule(rule_pattern, artifacts_queue): """ Consumes artifacts, filtered from the artifacts queue using rule pattern. rule_pattern: An "ALLOW" rule pattern (see in_toto.rulelib). artifacts_queue: Not yet consumed artifacts (paths only). None. None. The set of consumed artifacts (paths only). """ # Filter queued artifacts using the rule pattern filtered_artifacts = fnmatch.filter(artifacts_queue, rule_pattern) # Consume all filtered artifacts return set(filtered_artifacts) def verify_disallow_rule(rule_pattern, artifacts_queue): """ Raises RuleVerificationError if rule pattern applies to any artifacts in the queue. NOTE: Each set of rules should have a terminal DISALLOW rule to make overall verification fail in case preceding rules did not consume all artifacts as intended. rule_pattern: A "DISALLOW" rule pattern (see in_toto.rulelib). artifacts_queue: Not yet consumed artifacts (paths only). RuleVerificationError if the rule pattern filters artifacts in the artifact queue. None. None. """ filtered_artifacts = fnmatch.filter(artifacts_queue, rule_pattern) if len(filtered_artifacts): raise RuleVerificationError("'DISALLOW {}' matched the following " "artifacts: {}\n{}".format(rule_pattern, filtered_artifacts, _get_artifact_rule_traceback())) def verify_require_rule(filename, artifacts_queue): """ Raises RuleVerificationError if the filename provided does not exist in the artifacts_queue filename: A single filename (see issues #193 and #152). 
We will ignore the artifact rule pattern because it's ambiguous and instead treat it as a literal file name. artifacts_queue: Not yet consumed artifacts (paths only). <Exceptions> RuleVerificationError: if the filename is not present in the artifacts queue <Side Effects> None. <Returns> None. """ if filename not in artifacts_queue: raise RuleVerificationError("'REQUIRE {filename}' did not find {filename} " "in: {queue}\n{traceback}".format(filename=filename, queue=artifacts_queue, traceback=_get_artifact_rule_traceback())) def _get_artifact_rule_traceback(): """Build and return a string from the global `RULE_TRACE` which may be used as error message for RuleVerificationError. """ traceback_str = "Full trace for 'expected_{0}' of item '{1}':\n".format( RULE_TRACE["source_type"], RULE_TRACE["source_name"]) # Show all materials and products available in the beginning and # label the one that is used to generate a queue. for source_type in ["materials", "products"]: traceback_str += "Available {}{}:\n{}\n".format( source_type, [" (used for queue)", ""][RULE_TRACE["source_type"] != source_type], RULE_TRACE[source_type]) for trace_entry in RULE_TRACE["trace"]: traceback_str += "Queue after '{0}':\n".format( " ".join(trace_entry["rule"])) traceback_str += "{}\n".format(trace_entry["queue"]) return traceback_str def verify_item_rules(source_name, source_type, rules, links): """ Apply all passed material or product rules (see source_type) of a given step or inspection (see source_name), to enforce and authorize the corresponding artifacts and to guarantee that artifacts are linked together across steps of the supply chain. The mode of operation is similar to that of a firewall: In the beginning all materials or products of the step or inspection are placed into an artifact queue. The rules are then applied sequentially, consuming artifacts in the queue, i.e. removing them from the queue upon successful application. The consumption of artifacts by itself has no effects on the verification. Only through a subsequent "DISALLOW" rule, that finds unconsumed artifacts, is an exception raised. Similarly, the "REQUIRE" rule raises an exception if it does not find the artifact it requires, because it was falsely consumed or was not there from the beginning. <Arguments> source_name: The name of the item (step or inspection) being verified. source_type: One of "materials" or "products" depending on whether the rules are taken from the "expected_materials" or "expected_products" field of the item being verified. rules: The list of rules (material or product rules) for the item being verified. links: A dictionary containing link metadata per step or inspection, e.g.: { <item name> : <Metablock containing a Link object>, ... } <Exceptions> FormatError if source_type is not "materials" or "products", or if a rule in the passed list of rules does not conform with any rule format. RuleVerificationError if a DISALLOW rule matches disallowed artifacts, or if a REQUIRE rule does not find a required artifact. <Side Effects> Clears and populates the global RULE_TRACE data structure. """ if source_type not in ["materials", "products"]: raise securesystemslib.exceptions.FormatError( "Argument 'source_type' of function 'verify_item_rules' has to be " "one of 'materials' or 'products'. Got: '{}'".format(source_type)) # Create shortcuts to item's materials and products (including hashes), # required to verify "modify" and "match" rules.
materials_dict = links[source_name].signed.materials products_dict = links[source_name].signed.products # All other rules only require materials or products paths (without hashes) materials_paths = set(materials_dict.keys()) products_paths = set(products_dict.keys()) # Depending on the source type we create the artifact queue from the item's # materials or products and use it to keep track of (not) consumed artifacts. # The queue also only contains artifact keys (without hashes) artifacts = getattr(links[source_name].signed, source_type) artifacts_queue = set(artifacts.keys()) # Reset and re-populate rule traceback info dict for a rich error message RULE_TRACE.clear() RULE_TRACE["source_name"] = source_name RULE_TRACE["source_type"] = source_type RULE_TRACE["materials"] = list(materials_dict) RULE_TRACE["products"] = list(products_dict) RULE_TRACE["trace"] = [] # Process rules and remove consumed items from queue in each iteration for rule in rules: LOG.info("Verifying '{}'...".format(" ".join(rule))) # Parse the rule and create two shortcuts to contained rule data rule_data = in_toto.rulelib.unpack_rule(rule) _type = rule_data["rule_type"] _pattern = rule_data["pattern"] # Initialize empty consumed set as fallback for rules that do not consume # artifacts. All rules except "disallow" and "require" consume artifacts. consumed = set() if _type == "match": consumed = verify_match_rule( rule_data, artifacts_queue, artifacts, links) elif _type == "create": consumed = verify_create_rule( _pattern, artifacts_queue, materials_paths, products_paths) elif _type == "delete": consumed = verify_delete_rule( _pattern, artifacts_queue, materials_paths, products_paths) elif _type == "modify": consumed = verify_modify_rule( _pattern, artifacts_queue, materials_dict, products_dict) elif _type == "allow": consumed = verify_allow_rule(_pattern, artifacts_queue) # It's up to the "disallow" and "require" rule to raise an error if # artifacts were not consumed as intended elif _type == "disallow": verify_disallow_rule(_pattern, artifacts_queue) elif _type == "require": verify_require_rule(_pattern, artifacts_queue) else: # pragma: no cover (unreachable) raise securesystemslib.exceptions.FormatError( "Invalid rule type '{}'.".format(_type)) artifacts_queue -= consumed # Append rule and copy of queue to global info for a rich error message RULE_TRACE["trace"].append({ "rule": rule, "queue": list(artifacts_queue) }) def verify_all_item_rules(items, links): """ Iteratively verifies artifact rules of passed items (Steps or Inspections). <Arguments> items: A list containing Step or Inspection objects whose material and product rules will be verified. links: A dictionary containing link metadata per step or inspection, e.g.: { <item name> : <Metablock containing a Link object>, ... } <Exceptions> None. <Returns> None. """ for item in items: LOG.info("Verifying material rules for '{}'...".format(item.name)) verify_item_rules(item.name, "materials", item.expected_materials, links) LOG.info("Verifying product rules for '{}'...".format(item.name)) verify_item_rules(item.name, "products", item.expected_products, links) def verify_threshold_constraints(layout, chain_link_dict): """ Verifies that all links corresponding to a given step report the same materials and products. NOTE: This function does not verify if the signatures of each link corresponding to a step are valid or created by a different authorized functionary. This should be done earlier, using the function `verify_link_signature_thresholds`.
<Arguments> layout: The layout whose step thresholds are being verified chain_link_dict: A dictionary containing link metadata per functionary per step, e.g.: { <step name> : { <functionary keyid> : <Metablock containing a Link object>, ... }, ... } <Exceptions> ThresholdVerificationError If there are not enough (threshold) links for a step If the artifacts for all links of a step are not equal <Side Effects> None. """ # We are only interested in links that are related to steps defined in the # Layout, so iterate over layout.steps for step in layout.steps: # Skip steps that don't require multiple functionaries if step.threshold <= 1: LOG.info("Skipping threshold verification for step '{0}' with" " threshold '{1}'...".format(step.name, step.threshold)) continue LOG.info("Verifying threshold for step '{0}' with" " threshold '{1}'...".format(step.name, step.threshold)) # Extract the key_link_dict for this step from the passed chain_link_dict key_link_dict = chain_link_dict[step.name] # Check if we have at least <threshold> links for this step # NOTE: This is already done in `verify_link_signature_thresholds`. # Should we remove the check? if len(key_link_dict) < step.threshold: raise ThresholdVerificationError("Step '{0}' not performed" " by enough functionaries!".format(step.name)) # Take a reference link (e.g. the first in the step_link_dict) reference_keyid = list(key_link_dict.keys())[0] reference_link = key_link_dict[reference_keyid] # Iterate over all links to compare their properties with a reference_link for keyid, link in six.iteritems(key_link_dict): # TODO: Do we only care for artifacts, or do we want to # assert equality of other properties as well? if (reference_link.signed.materials != link.signed.materials or reference_link.signed.products != link.signed.products): raise ThresholdVerificationError("Links '{0}' and '{1}' have different" " artifacts!".format( in_toto.models.link.FILENAME_FORMAT.format( step_name=step.name, keyid=reference_keyid), in_toto.models.link.FILENAME_FORMAT.format( step_name=step.name, keyid=keyid))) def reduce_chain_links(chain_link_dict): """ Iterates through the passed chain_link_dict and builds a dict with step names as keys and Link objects as values. At this point the links of different functionaries have already been checked for identical artifacts. <Arguments> chain_link_dict: A dictionary containing link metadata per functionary per step, e.g.: { <step name> : { <functionary keyid> : <Metablock containing a Link object>, ... }, ... } <Exceptions> None. <Side Effects> None. <Returns> A dictionary containing one Link metadata object per step only if the link artifacts of all link objects are identical for a step. """ reduced_chain_link_dict = {} for step_name, key_link_dict in six.iteritems(chain_link_dict): # Extract the key_link_dict for this step from the passed chain_link_dict # and take one exemplary link (e.g. the first in the step_link_dict) # to form the reduced_chain_link_dict to return reduced_chain_link_dict[step_name] = list(key_link_dict.values())[0] return reduced_chain_link_dict def verify_sublayouts(layout, chain_link_dict, superlayout_link_dir_path): """ Checks if any step has been delegated by the functionary, recurses into the delegation and replaces the layout object in the chain_link_dict by an equivalent link object. <Arguments> layout: The layout specified by the project owner. chain_link_dict: A dictionary containing link metadata per functionary per step, e.g.: { <step name> : { <functionary keyid> : <Metablock containing a Link object>, ... }, ... } superlayout_link_dir_path: A path to the directory from which links of the superlayout are loaded.
Links of the sublayout are expected to be in a subdirectory relative to this path, with a name in the format in_toto.models.layout.SUBLAYOUT_LINK_DIR_FORMAT. <Exceptions> raises an Exception if verification of the delegated step fails. <Side Effects> None. <Returns> The passed dictionary containing link metadata per functionary per step, with layouts replaced with summary links, e.g.: { <step name> : { <functionary keyid> : <Metablock containing a Link object>, ... }, ... } """ for step_name, key_link_dict in six.iteritems(chain_link_dict): for keyid, link in six.iteritems(key_link_dict): if link.type_ == "layout": LOG.info("Verifying sublayout {}...".format(step_name)) layout_key_dict = {} # Retrieve the entire key object for the keyid # corresponding to the link layout_key_dict = {keyid: layout.keys.get(keyid)} # Sublayout links are expected to be in a directory with the following # name relative to the current link directory path, i.e. if there # are multiple levels of sublayout nesting, the links are expected to # be nested accordingly sublayout_link_dir = SUBLAYOUT_LINK_DIR_FORMAT.format( name=step_name, keyid=keyid) sublayout_link_dir_path = os.path.join( superlayout_link_dir_path, sublayout_link_dir) # Make a recursive call to in_toto_verify with the # layout and the extracted key object summary_link = in_toto_verify(link, layout_key_dict, link_dir_path=sublayout_link_dir_path, step_name=step_name) # Replace the layout object in the passed chain_link_dict # with the link file returned by in-toto-verify key_link_dict[keyid] = summary_link return chain_link_dict def get_summary_link(layout, reduced_chain_link_dict, name): """ Merges the materials of the first step (as mentioned in the layout) and the products of the last step and returns a new link. This link reports the materials and products and summarizes the overall software supply chain. NOTE: The assumption is that the steps mentioned in the layout are to be performed sequentially. So, the first step mentioned in the layout denotes what comes into the supply chain and the last step denotes what goes out. <Arguments> layout: The layout specified by the project owner. reduced_chain_link_dict: A dictionary containing link metadata per step, e.g.: { <step name> : <Metablock containing a Link object>, ... } name: The name that the summary link will be associated with. <Exceptions> None. <Side Effects> None. <Returns> A Metablock object containing a Link which summarizes the materials and products of the overall software supply chain. """ # Create empty link object summary_link = in_toto.models.link.Link() # Take first and last link in the order the corresponding # steps appear in the layout, if there are any. if len(layout.steps) > 0: first_step_link = reduced_chain_link_dict[layout.steps[0].name] last_step_link = reduced_chain_link_dict[layout.steps[-1].name] summary_link.materials = first_step_link.signed.materials summary_link.name = name summary_link.products = last_step_link.signed.products summary_link.byproducts = last_step_link.signed.byproducts summary_link.command = last_step_link.signed.command return Metablock(signed=summary_link) def in_toto_verify(layout, layout_key_dict, link_dir_path=".", substitution_parameters=None, step_name=""): """ Does the entire in-toto supply chain verification of a final product by performing the following actions: 1. Verify layout signature(s), requires at least one verification key to be passed, and a valid signature for each passed key. 2. Verify layout expiration 3. Load link metadata for every Step defined in the layout and fail if fewer links than the defined threshold for a step are found.
NOTE: Link files are expected to have the corresponding step and the functionary who carried out the step encoded in their filename. 4. Verify functionary signature for every loaded Link, skipping links with failing signatures or signed by unauthorized functionaries, and fail if fewer than `threshold` links validly signed by different authorized functionaries can be found. The routine returns a dictionary containing only links with valid signatures by authorized functionaries. 5. Verify sublayouts Recurses into layout verification for each link of the superlayout that is a layout itself (i.e. sublayout). Links for the sublayout are expected to be in a subdirectory relative to the superlayout's link_dir_path, with a name in the format: in_toto.models.layout.SUBLAYOUT_LINK_DIR_FORMAT. The successfully verified sublayout is replaced with an unsigned summary link in the chain_link_dict of the superlayout. The summary link is then used just like a regular link to verify command alignments, thresholds and inspections according to the superlayout. 6. Verify alignment of defined (Step) and reported (Link) commands NOTE: Won't raise exception on mismatch 7. Verify threshold constraints, i.e. if all links corresponding to one step have recorded the same artifacts (materials and products). 8. Verify rules defined in each Step's expected_materials and expected_products field NOTE: At this point no Inspection link metadata is available, hence (MATCH) rules cannot reference materials or products of Inspections. Verifying Steps' artifact rules before executing Inspections guarantees that Inspection commands don't run on compromised target files, which would be a surface for attacks. 9. Execute Inspection commands NOTE: Inspections, similar to Steps executed with 'in-toto-run', will record materials before and products after command execution. For now it records everything in the current working directory. 10. Verify rules defined in each Inspection's expected_materials and expected_products field <Arguments> layout: A Metablock object containing the Layout that is being verified. layout_key_dict: Dictionary of project owner public keys, used to verify the layout's signature. link_dir_path: (optional) A path to the directory from which link metadata files corresponding to the steps in the passed layout are loaded. Default is the current working directory. substitution_parameters: (optional) A dictionary containing key-value pairs for substituting in the following metadata fields: - artifact rules in step and inspection definitions in the layout - the run fields in the inspection definitions - the expected command in the step definitions step_name: (optional) The step that the layout corresponds to, typically used during recursive calls of in_toto_verify. This usually happens when resolving sublayouts. The function verify_sublayouts may provide a clearer picture of how it is used. <Exceptions> None.
<Side Effects> Read link metadata files from disk <Returns> A link which summarizes the materials and products of the overall software supply chain (used by super-layout verification if any) """ LOG.info("Verifying layout signatures...") verify_layout_signatures(layout, layout_key_dict) # For the rest of the verification we only care about the layout payload # (Layout) that carries all the information and not about the layout # container (Metablock) that also carries the signatures layout = layout.signed LOG.info("Verifying layout expiration...") verify_layout_expiration(layout) # If there are parameters sent to the translation layer, substitute them if substitution_parameters is not None: LOG.info('Performing parameter substitution...') substitute_parameters(layout, substitution_parameters) LOG.info("Reading link metadata files...") chain_link_dict = load_links_for_layout(layout, link_dir_path) LOG.info("Verifying link metadata signatures...") chain_link_dict = verify_link_signature_thresholds(layout, chain_link_dict) LOG.info("Verifying sublayouts...") chain_link_dict = verify_sublayouts(layout, chain_link_dict, link_dir_path) LOG.info("Verifying alignment of reported commands...") verify_all_steps_command_alignment(layout, chain_link_dict) LOG.info("Verifying threshold constraints...") verify_threshold_constraints(layout, chain_link_dict) reduced_chain_link_dict = reduce_chain_links(chain_link_dict) LOG.info("Verifying Step rules...") verify_all_item_rules(layout.steps, reduced_chain_link_dict) LOG.info("Executing Inspection commands...") inspection_link_dict = run_all_inspections(layout) LOG.info("Verifying Inspection rules...") # Artifact rules for inspections can reference links that correspond to # Steps or Inspections, hence the concatenation of both collections of links combined_links = reduced_chain_link_dict.copy() combined_links.update(inspection_link_dict) verify_all_item_rules(layout.inspect, combined_links) # We made it this far without an exception, which means verification passed LOG.info("The software product passed all verification.") # Return a link file which summarizes the entire software supply chain # This is mostly relevant if the currently verified supply chain is embedded # in another supply chain return get_summary_link(layout, reduced_chain_link_dict, step_name) in-toto-0.4.0/in_toto.egg-info/0000755000076500000240000000000013535503044016351 5ustar lukpstaff00000000000000in-toto-0.4.0/in_toto.egg-info/PKG-INFO0000644000076500000240000004002013535503044017442 0ustar lukpstaff00000000000000Metadata-Version: 2.1 Name: in-toto Version: 0.4.0 Summary: A framework to define and secure the integrity of software supply chains Home-page: https://in-toto.io Author: New York University: Secure Systems Lab Author-email: in-toto-dev@googlegroups.com License: Apache-2.0 Project-URL: Source, https://github.com/in-toto/in-toto Project-URL: Bug Reports, https://github.com/in-toto/in-toto/issues Description: # in-toto [![Build Status](https://travis-ci.org/in-toto/in-toto.svg?branch=develop)](https://travis-ci.org/in-toto/in-toto) [![Coverage Status](https://coveralls.io/repos/github/in-toto/in-toto/badge.svg?branch=develop)](https://coveralls.io/github/in-toto/in-toto?branch=develop) [![CII Best Practices](https://bestpractices.coreinfrastructure.org/projects/1523/badge)](https://bestpractices.coreinfrastructure.org/projects/1523) [![Build status](https://ci.appveyor.com/api/projects/status/taxlhrrlf3co07e1/branch/develop?svg=true)](https://ci.appveyor.com/project/in-toto/in-toto/branch/develop) in-toto
provides a framework to protect the integrity of the software supply chain. It does so by verifying that each task in the chain is carried out as planned, by authorized personnel only, and that the product is not tampered with in transit. in-toto requires a **project owner** to create a **layout**. A layout lists the sequence of **steps** of the software supply chain, and the **functionaries** authorized to perform these steps. When a functionary performs a step in-toto gathers information about the used command and the related files and stores it in a **link** metadata file. As a consequence link files provide the required evidence to establish a continuous chain that can be validated against the steps defined in the layout. The layout, signed by the project owners, together with the links, signed by the designated functionaries, are released as part of the final product, and can be validated manually or via automated tooling in, e.g. a package manager. ## Getting Started ### Install Dependencies - [Python](www.python.org) in version 2.7 - crypto libraries require header files - [OpenSSL](https://www.openssl.org/) - crypto libraries require header files - [git](https://git-scm.com/) - version control system - [pip](https://pip.pypa.io) - package installer tool ### Installation It is strongly recommended to install in-toto in an isolated Python environment. For easy setup instructions visit the docs for [`virtualenv`](https://virtualenv.pypa.io) and the convenient [`vitualenvwrapper`](https://virtualenvwrapper.readthedocs.io). ```shell pip install in-toto ``` ### Create layout, run supply chain steps and verify final product #### Layout The in-toto software supply chain layout consists of the following parts: - **expiration date** - **readme** (an optional description of the supply chain) - **functionary keys** (public keys, used to verify link metadata signatures) - **signatures** (one or more layout signatures created with the project owner key(s)) - **software supply chain steps** correspond to steps carried out by a functionary as part of the software supply chain. The steps defined in the layout list the functionaries who are authorized to carry out the step (by key id). Steps require a unique name to associate them (upon verification) with link metadata that is created when a functionary carries out the step using the `in-toto` tools. Additionally, steps must have material and product rules which define the files a step is supposed to operate on. Material and product rules are described in the section below. - **inspections** define commands to be run during the verification process and can also list material and product rules. Take a look at the [demo layout creation example](https://github.com/in-toto/in-toto/blob/develop/layout-creation.md) for further information on how to create an in-toto layout. Or try our experimental [layout creation web tool](https://in-toto.engineering.nyu.edu/). #### Artifact Rules A software supply chain usually operates on a set of files, such as source code, executables, packages, or the like. in-toto calls these files artifacts. A material is an artifact that will be used when a step or inspection is carried out. Likewise, a product is an artifact that results from carrying out a step. The in-toto layout provides a simple rule language to authorize or enforce the artifacts of a step and to chain them together. 
This adds the following guarantees for any given step or inspection:

- Only artifacts **authorized** by the project owner are created, modified or deleted,
- each defined creation, modification or deletion is **enforced**, and also
- restricted to the scope of its definition, which **chains** subsequent steps and inspections together.

Note that it is up to you to properly secure your supply chain, by authorizing, enforcing and chaining materials and products using any, and usually multiple, of the following rules:

- `CREATE <pattern>`
- `DELETE <pattern>`
- `MODIFY <pattern>`
- `ALLOW <pattern>`
- `DISALLOW <pattern>`
- `REQUIRE <pattern>`
- `MATCH <pattern> [IN <source-path-prefix>] WITH (MATERIALS|PRODUCTS) [IN <destination-path-prefix>] FROM <step>`

*Rule arguments specified as `<pattern>` allow for Unix shell-style wildcards as implemented by Python's [`fnmatch`](https://docs.python.org/2/library/fnmatch.html).*

To learn more about the different rule types, their guarantees and how they are applied, take a look at the [Artifact Rules](https://github.com/in-toto/docs/blob/master/in-toto-spec.md#433-artifact-rules) section of the in-toto specification. The sketch below shows these rules as they appear in a layout.
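To make the rule language concrete, here is a minimal sketch of the rules used by the demo supply chain whose layout template ships with the test data (`tests/demo_files/demo.layout.template`). The rule lists are taken verbatim from that layout; the use of `in_toto.rulelib.unpack_rule` to sanity-check them is an assumption based on this release's module list.

```python
# Artifact rules from the demo layout, written as Python lists.
# unpack_rule (from in_toto.rulelib, assumed from this release's sources)
# parses a rule list and raises an exception if it is malformed.
from in_toto.rulelib import unpack_rule

# "write-code" step: the functionary may only create foo.py.
write_code_expected_products = [
    ["CREATE", "foo.py"],
]

# "package" step: its material foo.py must be identical to the product
# reported by "write-code", which chains the two steps together.
package_expected_materials = [
    ["MATCH", "foo.py", "WITH", "PRODUCTS", "FROM", "write-code"],
]

# "package" step: it creates the tarball and leaves foo.py untouched.
package_expected_products = [
    ["CREATE", "foo.tar.gz"],
    ["ALLOW", "foo.py"],
]

for rule in (write_code_expected_products + package_expected_materials +
    package_expected_products):
  print(unpack_rule(rule))
```

Upon verification, each rule list is applied in order to the artifacts reported by the corresponding link metadata.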
#### Carrying out software supply chain steps

##### in-toto-run

`in-toto-run` is used to execute a step in the software supply chain. This can be anything relevant to the project, such as tagging a release with `git`, running a test, or building a binary. The relevant step name and command are passed as arguments, along with materials, which are files required for that step's command to execute, and products, which are files expected as a result of the execution of that command. These and other relevant details pertaining to the step are stored in a link file, which is signed using the functionary's key.

If materials are not passed to the command, the generated link file simply does not record them. Similarly, if the execution of a command via `in-toto-run` doesn't result in any products, none are recorded in the link file. Any files that are modified or used in any way during the execution of the command are not recorded in the link file unless explicitly passed as artifacts. Conversely, any materials or products passed to the command are recorded in the link file even if they're not part of the execution of the command.

See [this simple usage example from the demo application for more details](https://github.com/in-toto/demo). For a detailed list of all the command line arguments, run `in-toto-run --help` or look at the [code documentation](https://github.com/in-toto/in-toto/blob/develop/in_toto/in_toto_run.py).

##### in-toto-record

`in-toto-record` works similarly to `in-toto-run` but can be used for multi-part software supply chain steps, i.e. steps that are not carried out by a single command. Use `in-toto-record start ...` to create a preliminary link file that only records the *materials*, then run the commands of that step or edit files manually, and finally use `in-toto-record stop ...` to record the *products* and generate the actual link metadata file. For a detailed list of all command line arguments and their usage, run `in-toto-record start --help` or `in-toto-record stop --help`, or look at the [code documentation](https://github.com/in-toto/in-toto/blob/develop/in_toto/in_toto_record.py).

#### Release final product

In order to verify the final product with in-toto, the verifier must have access to the layout, the `*.link` files, and the project owner's public key(s).

#### Verification

Use `in-toto-verify` on the final product to verify that

- the layout was signed with the project owner's private key(s),
- the layout has not expired,
- each step was performed and signed by the authorized functionary,
- the functionaries used the commands they were supposed to use,
- materials and products of each step were in place as defined by the rules, and
- the defined inspections ran successfully.

For a detailed list of all command line arguments and their usage, run `in-toto-verify --help` or look at the [code documentation](https://github.com/in-toto/in-toto/blob/develop/in_toto/in_toto_verify.py). The sketch below shows the same run-then-verify flow through the Python API.
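The following is a minimal, hedged sketch of this workflow through the Python API, mirroring the demo supply chain: a functionary creates link metadata for a step with `in_toto.runlib`, and a client verifies the final product with `in_toto.verifylib` (whose `in_toto_verify` function appears in `verifylib.py` above). The key file names, the `root.layout` path, and the exact helper and parameter names are assumptions based on the demo files in `tests/demo_files`; adjust them to your own supply chain.

```python
# A minimal sketch, assuming the module layout of this release. The
# runlib/verifylib functions back the in-toto-run and in-toto-verify CLIs;
# parameter names and key helpers are assumptions based on the demo files.
from in_toto import runlib, util, verifylib
from in_toto.models.metadata import Metablock

# Functionary Bob carries out the "write-code" step. The returned link
# metadata is signed with his key and dumped to "write-code.<keyid:8>.link",
# matching the naming of the demo link files.
bob_key = util.import_rsa_key_from_file("bob")  # demo private key file
link = runlib.in_toto_run(
    "write-code",         # step name, must match the step in the layout
    [],                   # material paths to record before the command
    ["foo.py"],           # product paths to record after the command
    ["touch", "foo.py"],  # the command to execute
    signing_key=bob_key)
link.dump("write-code.{:.8}.link".format(bob_key["keyid"]))

# The client loads the signed layout and the project owner's public key and
# verifies the supply chain; in_toto_verify raises an exception on failure.
layout = Metablock.load("root.layout")
layout_key_dict = util.import_rsa_public_keys_from_files_as_dict(["alice.pub"])
verifylib.in_toto_verify(layout, layout_key_dict)
```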
#### Signatures

`in-toto-sign` is a metadata signature helper tool to add, replace, and verify signatures within in-toto Link or Layout metadata, with options to:

- replace (default) or add signature(s), with layout metadata able to be signed by multiple keys at once, while link metadata can only be signed by one key at a time
- write signed metadata to a specified path (if no output path is specified, layout metadata is written to the path of the input file, while link metadata is written to `<name>.<keyid prefix>.link`)
- verify signatures

This tool is intended to sign layouts created by the [layout web wizard](https://in-toto.engineering.nyu.edu/), but also serves well to re-sign test and demo data. For example, it can be used if metadata formats or signing routines change.

For a detailed list of all command line arguments and their usage, run `in-toto-sign --help` or look at the [code documentation](https://github.com/in-toto/in-toto/blob/develop/in_toto/in_toto_sign.py).

#### Settings

Settings can be configured in [`in_toto.settings`](https://github.com/in-toto/in-toto/blob/develop/in_toto/settings.py), via prefixed environment variables, or in RCfiles in one of the following paths: */etc/in_toto/config, /etc/in_totorc, \~/.config/in_toto/config, \~/.config/in_toto, \~/.in_toto/config, \~/.in_totorc, .in_totorc*.

A setting in an RCfile in the current working directory overrides the same setting in an RCfile in the user's home directory, which overrides the same setting in an environment variable, which overrides the same setting in `in_toto.settings`.

Setting names are restricted to the settings listed below (case sensitive). Also, setting values that contain colons are parsed as lists.

##### Available Settings

`ARTIFACT_EXCLUDE_PATTERNS` Specifies a list of glob patterns that can be used to exclude files from being recorded as materials or products. See [runlib docs for more details](https://github.com/in-toto/in-toto/blob/develop/in_toto/runlib.py#L124-L142).

`ARTIFACT_BASE_PATH` If set, material and product paths passed to `in-toto-run` are searched relative to the set base path. Also, the base path is stripped from the paths written to the resulting link metadata file.

##### Examples

```shell
# Bash style environment variable export
export IN_TOTO_ARTIFACT_BASE_PATH='/home/user/project'
export IN_TOTO_ARTIFACT_EXCLUDE_PATTERNS='*.link:.gitignore'
```

```
# E.g. in rcfile ~/.in_totorc
[in-toto settings]
ARTIFACT_BASE_PATH=/home/user/project
ARTIFACT_EXCLUDE_PATTERNS=*.link:.gitignore
```

## in-toto demo

You can try in-toto by running the [demo application](https://github.com/in-toto/demo). The demo outlines three users, Alice (project owner), Bob (functionary), and Carl (functionary), and shows how in-toto is used to specify a project layout and to verify that the layout has been followed correctly.

## Specification

You can read more about how in-toto works by taking a look at the [specification](https://github.com/in-toto/docs/raw/master/in-toto-spec.pdf).

## Security Issues and Bugs

Security issues can be reported by emailing justincappos@gmail.com.

At a minimum, the report must contain the following:

* Description of the vulnerability.
* Steps to reproduce the issue.

Optionally, reports that are emailed can be encrypted with PGP. You should use PGP key fingerprint E9C0 59EC 0D32 64FA B35F 94AD 465B F9F6 F8EB 475A.

Please do not use the GitHub issue tracker to submit vulnerability reports. The issue tracker is intended for bug reports and feature requests.

## Instructions for Contributors

Development of in-toto occurs on the "develop" branch of this repository. Contributions can be made by submitting GitHub *Pull Requests*. Take a look at our [development guidelines](https://github.com/secure-systems-lab/lab-guidelines/blob/master/dev-workflow.md) for detailed instructions. Submitted code should follow our [style guidelines](https://github.com/secure-systems-lab/code-style-guidelines) and must be unit tested.

Contributors must also indicate acceptance of the [Developer Certificate of Origin (DCO)](https://developercertificate.org/) by appending a `Signed-off-by: Your Name <your@email.com>` to each git commit message (see [`git commit --signoff`](https://git-scm.com/docs/git-commit#Documentation/git-commit.txt---signoff)).

## Acknowledgments

This project is managed by Prof. Justin Cappos and other members of the [Secure Systems Lab](https://ssl.engineering.nyu.edu/) at NYU and the [NJIT Cybersecurity Research Center](https://centers.njit.edu/cybersecurity). This research was supported by the Defense Advanced Research Projects Agency (DARPA) and the Air Force Research Laboratory (AFRL). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of DARPA and AFRL. The United States Government is authorized to reproduce and distribute reprints notwithstanding any copyright notice herein.
Keywords: software supply chain security Platform: UNKNOWN Classifier: Development Status :: 4 - Beta Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: Apache Software License Classifier: Natural Language :: English Classifier: Operating System :: POSIX Classifier: Operating System :: POSIX :: Linux Classifier: Operating System :: MacOS :: MacOS X Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 2 Classifier: Programming Language :: Python :: 2.7 Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.5 Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: Implementation :: CPython Classifier: Topic :: Security Classifier: Topic :: Software Development Requires-Python: >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, <4 Description-Content-Type: text/markdown Provides-Extra: pynacl in-toto-0.4.0/in_toto.egg-info/SOURCES.txt0000644000076500000240000000552213535503044020241 0ustar lukpstaff00000000000000.coveragerc .travis.yml LICENSE MANIFEST.in README.md requirements-dev.txt requirements-pinned.txt requirements-test.txt requirements-tox.txt requirements.txt setup.py tox.ini in_toto/__init__.py in_toto/common_args.py in_toto/exceptions.py in_toto/formats.py in_toto/in_toto_keygen.py in_toto/in_toto_mock.py in_toto/in_toto_record.py in_toto/in_toto_run.py in_toto/in_toto_sign.py in_toto/in_toto_verify.py in_toto/log.py in_toto/process.py in_toto/rulelib.py in_toto/runlib.py in_toto/settings.py in_toto/user_settings.py in_toto/util.py in_toto/verifylib.py in_toto.egg-info/PKG-INFO in_toto.egg-info/SOURCES.txt in_toto.egg-info/dependency_links.txt in_toto.egg-info/entry_points.txt in_toto.egg-info/requires.txt in_toto.egg-info/top_level.txt in_toto/gpg/__init__.py in_toto/gpg/common.py in_toto/gpg/constants.py in_toto/gpg/dsa.py in_toto/gpg/exceptions.py in_toto/gpg/formats.py in_toto/gpg/functions.py in_toto/gpg/rsa.py in_toto/gpg/util.py in_toto/models/__init__.py in_toto/models/common.py in_toto/models/layout.py in_toto/models/link.py in_toto/models/metadata.py tests/__init__.py tests/common.py tests/runtests.py tests/test_gpg.py tests/test_in_toto_keygen.py tests/test_in_toto_mock.py tests/test_in_toto_record.py tests/test_in_toto_run.py tests/test_in_toto_sign.py tests/test_in_toto_verify.py tests/test_log.py tests/test_param_substitution.py tests/test_process.py tests/test_rulelib.py tests/test_runlib.py tests/test_settings.py tests/test_user_settings.py tests/test_util.py tests/test_verifylib.py tests/demo_files/alice tests/demo_files/alice.pub tests/demo_files/bob tests/demo_files/bob.pub tests/demo_files/carl tests/demo_files/carl.pub tests/demo_files/danny tests/demo_files/danny.pub tests/demo_files/demo.layout.template tests/demo_files/foo.tar.gz tests/demo_files/package.2f89b927.link tests/demo_files/write-code.776a00e2.link tests/demo_files_gpg/demo.layout.template tests/demo_files_gpg/foo.tar.gz tests/demo_files_gpg/package.7b3abb26.link tests/demo_files_gpg/write-code.8288ef56.link tests/gpg_keyrings/dsa/C242A830DAAF1C2BEF604A9EF033A3A3E267B3B1.ssh tests/gpg_keyrings/dsa/pubring.gpg tests/gpg_keyrings/dsa/random_seed tests/gpg_keyrings/dsa/secring.gpg tests/gpg_keyrings/dsa/trustdb.gpg tests/gpg_keyrings/rsa/7B3ABB26B97B655AB9296BD15B0BD02E1C768C43.ssh tests/gpg_keyrings/rsa/8288EF560ED3795F9DF2C0DB56193089B285DA58.ssh 
tests/gpg_keyrings/rsa/8465A1E2E0FB2B40ADB2478E18FB3F537E0C8A17.ssh
tests/gpg_keyrings/rsa/pubring.gpg
tests/gpg_keyrings/rsa/random_seed
tests/gpg_keyrings/rsa/secring.gpg
tests/gpg_keyrings/rsa/trustdb.gpg
tests/models/__init__.py
tests/models/test_common.py
tests/models/test_inspection.py
tests/models/test_layout.py
tests/models/test_link.py
tests/models/test_metadata.py
tests/models/test_step.py
tests/models/test_supply_chain_item.py
tests/rc_test/.in_totorc
tests/scripts/expr
tests/scripts/tar
tests/scripts/touchin-toto-0.4.0/in_toto.egg-info/dependency_links.txt0000644000076500000240000000000113535503044022417 0ustar lukpstaff00000000000000 in-toto-0.4.0/in_toto.egg-info/entry_points.txt0000644000076500000240000000042313535503044021646 0ustar lukpstaff00000000000000[console_scripts]
in-toto-keygen = in_toto.in_toto_keygen:main
in-toto-mock = in_toto.in_toto_mock:main
in-toto-record = in_toto.in_toto_record:main
in-toto-run = in_toto.in_toto_run:main
in-toto-sign = in_toto.in_toto_sign:main
in-toto-verify = in_toto.in_toto_verify:main
in-toto-0.4.0/in_toto.egg-info/requires.txt0000644000076500000240000000021113535503044020743 0ustar lukpstaff00000000000000six
securesystemslib[crypto]>=0.11.3
attrs
python-dateutil
iso8601
pathspec

[:python_version < "3"]
subprocess32

[pynacl]
pynacl>1.2.0
in-toto-0.4.0/in_toto.egg-info/top_level.txt0000644000076500000240000000001013535503044021072 0ustar lukpstaff00000000000000in_toto
in-toto-0.4.0/requirements-dev.txt0000644000076500000240000000042213535414523017245 0ustar lukpstaff00000000000000# Install tox for local testing, but also everything that tox would install
# in a test environment, so that we can run the test suite or individual tests
# directly in the development environment as well.
tox
-r requirements-tox.txt

# Install in-toto in editable mode
-e .in-toto-0.4.0/requirements-pinned.txt0000644000076500000240000000320613535414523017747 0ustar lukpstaff00000000000000# These pinned requirements are the combined results of `pip-compile` over
# "requirements.txt" in each supported Python version.
#
# - Below script may be used (copy-paste to bash removing leading '#') to
#   re-generate this file, e.g. if `requirements.txt` is updated.
#
# - Version updates should be handled by a GitHub-integrated dependency monitor
#   (e.g. Dependabot) that regularly scans PyPI for updates, patches this file,
#   and submits a PR, which triggers CI/CD builds and should catch breaking
#   updates.
#
#   # Gather pip-compile results for each supported Python version
#   for v in 2.7 3.5 3.6 3.7; do
#     mkvirtualenv in-toto-env-${v} -p python${v};
#     pip install pip-tools;
#     pip-compile requirements.txt -n | grep -v "^#" >> requirements-pinned.combined;
#     deactivate;
#     rmvirtualenv in-toto-env-${v};
#   done;
#
#   # Create pinned requirements file retaining this doc header
#   cat requirements-pinned.txt | grep "^#" > requirements-pinned.tmp;
#   mv requirements-pinned.tmp requirements-pinned.txt
#   cat requirements-pinned.combined | sort -u >> requirements-pinned.txt
#   rm requirements-pinned.combined
#
asn1crypto==0.24.0        # via cryptography
attrs==19.1.0
cffi==1.12.3              # via cryptography, pynacl
colorama==0.4.1           # via securesystemslib
cryptography==2.7         # via securesystemslib
enum34==1.1.6             # via cryptography
ipaddress==1.0.22         # via cryptography
iso8601==0.1.12
pathspec==0.5.9
pycparser==2.19           # via cffi
pynacl==1.3.0             # via securesystemslib
python-dateutil==2.8.0
securesystemslib[crypto,pynacl]==0.11.3
six==1.12.0
subprocess32==3.5.4 ; python_version < "3"
in-toto-0.4.0/requirements-test.txt0000644000076500000240000000010313535414523017442 0ustar lukpstaff00000000000000# Runtime requirements for test suite
mock; python_version < '3.3'
in-toto-0.4.0/requirements-tox.txt0000644000076500000240000000031413535414523017301 0ustar lukpstaff00000000000000# in-toto runtime dependencies
-r requirements-pinned.txt

# in-toto test suite runtime dependencies
-r requirements-test.txt

# Test tools for linting and test coverage measurement
pylint
bandit
coveragein-toto-0.4.0/requirements.txt0000644000076500000240000000016613535414523016476 0ustar lukpstaff00000000000000securesystemslib[crypto,pynacl]>=0.11.3
attrs
python-dateutil
iso8601
six
pathspec
subprocess32; python_version < '3'
in-toto-0.4.0/setup.cfg0000644000076500000240000000004613535503044015025 0ustar lukpstaff00000000000000[egg_info]
tag_build = 
tag_date = 0
in-toto-0.4.0/setup.py0000755000076500000240000000665013535502217014731 0ustar lukpstaff00000000000000#!/usr/bin/env python
"""
<Program Name>
  setup.py

<Author>
  Santiago Torres
  Lukas Puehringer

<Started>
  May 23, 2016

<Copyright>
  See LICENSE for licensing information.

<Purpose>
  setup.py script to install in-toto framework and in-toto scripts

  # System Dependencies
    - Python2.7 (www.python.org)
    - OpenSSL (www.openssl.org)

  # Recommended Tools
    - git (git-scm.com)
    - pip (pip.pypa.io)
    - virtualenvs - optional but strongly recommended!
(http://docs.python-guide.org/en/latest/dev/virtualenvs/) # Installation from GitHub ``` pip install git+https://github.com/in-toto/in-toto@develop ``` """ from setuptools import setup, find_packages version = "0.4.0" with open("README.md") as f: long_description = f.read() setup( name="in-toto", version=version, author="New York University: Secure Systems Lab", author_email="in-toto-dev@googlegroups.com", url="https://in-toto.io", description=("A framework to define and secure the integrity of " "software supply chains"), long_description_content_type="text/markdown", long_description=long_description, license="Apache-2.0", keywords="software supply chain security", classifiers = [ 'Development Status :: 4 - Beta', 'Intended Audience :: Developers', 'License :: OSI Approved :: Apache Software License', 'Natural Language :: English', 'Operating System :: POSIX', 'Operating System :: POSIX :: Linux', 'Operating System :: MacOS :: MacOS X', 'Programming Language :: Python', 'Programming Language :: Python :: 2', 'Programming Language :: Python :: 2.7', 'Programming Language :: Python :: 3', 'Programming Language :: Python :: 3.5', 'Programming Language :: Python :: 3.6', 'Programming Language :: Python :: 3.7', 'Programming Language :: Python :: Implementation :: CPython', 'Topic :: Security', 'Topic :: Software Development' ], python_requires=">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, <4", packages=find_packages(exclude=["*.tests", "*.tests.*", "tests.*", "tests", "debian"]), install_requires=["six", "securesystemslib[crypto]>=0.11.3", "attrs", "python-dateutil", "iso8601", "pathspec", "subprocess32; python_version < '3'"], extras_require={ # Install pynacl as optional dependency to use with securesystemslib, as a # workaround for `"ssl-pynacl": ["securesystemslib[pynacl]>=0.11.3"]`, # which currently is not supported in "extra_require" (see pypa/pip#4957). # TODO: Keep track of changes (version, additional requirements) under the # "pynacl" key in securesystemslib's setup.py. # https://github.com/secure-systems-lab/securesystemslib/blob/master/setup.py#L101 "pynacl": ["pynacl>1.2.0"] }, test_suite="tests.runtests", tests_require=["mock"], entry_points={ "console_scripts": ["in-toto-run = in_toto.in_toto_run:main", "in-toto-mock = in_toto.in_toto_mock:main", "in-toto-record = in_toto.in_toto_record:main", "in-toto-verify = in_toto.in_toto_verify:main", "in-toto-sign = in_toto.in_toto_sign:main", "in-toto-keygen = in_toto.in_toto_keygen:main"] }, project_urls={ "Source": "https://github.com/in-toto/in-toto", "Bug Reports": "https://github.com/in-toto/in-toto/issues", }, ) in-toto-0.4.0/tests/0000755000076500000240000000000013535503044014346 5ustar lukpstaff00000000000000in-toto-0.4.0/tests/__init__.py0000644000076500000240000000202313535414523016457 0ustar lukpstaff00000000000000import sys import logging import in_toto class CapturableStreamHandler(logging.StreamHandler): """Override logging.StreamHandler's stream property to always write log output to `sys.stderr` available at the time of logging. """ @property def stream(self): """Always use currently available sys.stderr. """ return sys.stderr @stream.setter def stream(self, value): """Disable setting stream. """ # Python `unittest` is configured to buffer output to `sys.stdout/sys.stderr` # (see `TextTestRunner` in `tests/runtests.py`) and only show it in case a test # fails. 
Python `unittest` buffers output by overriding `sys.stdout/sys.stderr`
# before running tests, hence we need to log to that overridden `sys.stderr`,
# which we do by using a custom StreamHandler.
handler = CapturableStreamHandler()

# We also use a verbose logging level and format
formatter = logging.Formatter(in_toto.log.FORMAT_DEBUG)
handler.setFormatter(formatter)
in_toto.log.LOGGER.handlers = [handler]
in_toto.log.LOGGER.setLevel(logging.DEBUG)
in-toto-0.4.0/tests/common.py0000644000076500000240000000541313535414523016216 0ustar lukpstaff00000000000000#!/usr/bin/env python
"""
<Program Name>
  common.py

<Author>
  Lukas Puehringer

<Started>
  Feb 6, 2018

<Copyright>
  See LICENSE for licensing information.

<Purpose>
  Common code for in-toto unittests, import like so:
  `import tests.common`

  Tests importing this module should be run from the project root, e.g.:
  `python tests/test_in_toto_run.py`
  or using the aggregator script (preferred way):
  `python tests/runtests.py`.
"""
import os
import sys
import inspect
import unittest

if sys.version_info >= (3, 3):
  from unittest.mock import patch # pylint: disable=no-name-in-module,import-error
else:
  from mock import patch # pylint: disable=import-error


def run_with_portable_scripts(decorated):
  print("patching...")
  scripts_path = os.path.join(
      os.path.dirname(os.path.realpath(__file__)), "scripts")
  print("scripts are located in {}".format(scripts_path))

  @patch.dict(os.environ, {"PATH": "{};{}".format(scripts_path, os.environ['PATH'])})
  class Patched(decorated):
    pass

  return Patched


class CliTestCase(unittest.TestCase):
  """TestCase subclass providing a test helper that patches sys.argv with
  passed arguments and asserts a SystemExit with a return code equal to the
  passed status argument.

  Subclasses of CliTestCase require a class variable that stores the main
  function of the cli tool to test as staticmethod, e.g.:

  ```
  import tests.common

  from in_toto.in_toto_run import main as in_toto_run_main

  class TestInTotoRunTool(tests.common.CliTestCase):
    cli_main_func = staticmethod(in_toto_run_main)
    ...
  ```
  """
  cli_main_func = None

  def __init__(self, *args, **kwargs):
    """Constructor that checks for the presence of a callable cli_main_func
    class variable, and stores the filename of the module containing that
    function, to be used as first argument when patching sys.argv in
    self.assert_cli_sys_exit.
    """
    if not callable(self.cli_main_func):
      raise Exception("Subclasses of `CliTestCase` need to assign the main"
          " function of the cli tool to test using `staticmethod()`: {}"
          .format(self.__class__.__name__))

    file_path = inspect.getmodule(self.cli_main_func).__file__
    self.file_name = os.path.basename(file_path)

    super(CliTestCase, self).__init__(*args, **kwargs)


  def assert_cli_sys_exit(self, cli_args, status):
    """Test helper to mock command line call and assert return value.
    The passed args do not need to contain the command line tool's name.
This is assessed from `self.cli_main_func` """ with patch.object(sys, "argv", [self.file_name] + cli_args), self.assertRaises(SystemExit) as raise_ctx: self.cli_main_func() # pylint: disable=not-callable self.assertEqual(raise_ctx.exception.code, status) in-toto-0.4.0/tests/demo_files/0000755000076500000240000000000013535503044016454 5ustar lukpstaff00000000000000in-toto-0.4.0/tests/demo_files/alice0000644000076500000240000000462713535414523017470 0ustar lukpstaff00000000000000-----BEGIN RSA PRIVATE KEY----- MIIG4wIBAAKCAYEAxPX3kFs/z645x4UOC3KFY3V80YQtKrp6YS3qU+Jlvx/XzK53 lb4sCDRU9jqBBx3We45TmFUibroMd8tQXCUSe8gYCBUBqBmmz0dEHJYbW0tYF7Io apMIxhRYn76YqNdl1JoRTcmzIaOJ7QrHxQrSGpivvTm6kQ9WLeApG1GLYJ3C3Wl4 bnsI1bKSv55Zi45/JawHzTzYUAIXX9qCd3IoHzDucz9IAj9Ookw0va/q9FjoPGrR B80IReVxLVnbo6pYJfu/O37jvEobHFa8ckHdYxUIg8wvkIOy1O3M74lBDm6CVI0Z O25xPlDB/4nHAE1PbA3aF3lw8JGuxLDsetxmfzgAleVt4vXLQiCrZaLf+0cM97Jc T7wdHcbIvRLsij9LNP+2tWZgeZ/hIAOEdaDqcYANPDIAxfTvbe9I0sXrCtrLer1S S7GqUmdFCdkdun8erXdNF0ls9Rp4cbYhjdf3yMxdI/24LUOOQ71cHW3ITIDImm6I 8KmrXFM2NewTARKfAgMBAAECggGAdbdQM+3lkHlfvRiP0VWr1UrFw+8Mk6oKNISd tW7tQrKEZqerf0q+xFSKpvNGZHt30ja5TaUsRCNcCkjwiXH6vxJTEpmDePWD1gSQ 98jbJtA8IUVwlGm2Z7SHV0oxsU+zY8KFLwmqzyMP7yVvShvygMTa2+xhzgrthdOg ndw5wg/oBC7iNJ3CJP5qaK36dMdAxMIxk3+XBRKK59YP/dWzlxjGmwiqP/WYSLXl G63Fbi6o9lsc/V2UYToFT4aSGBpZMkfcNPX2Iz94YbtSEkPORTFmbtHUCuP2GfRM 45MTHErHdzqzUgO+KbtDozKsjyzwFciFBSlIhYA4yIwmomzJpq/6vfQvXRMfL1lm iil3OByT3BGw8w7k7WSRaiZR4ns9J3ALCga7yqpqZ4kvS2JNNgho9ETGIoCzv9us 5HMMm9aP9Fa+w1XTNgQJ/lA5zVpfyXiZAVQ8iz9PvxXrMPLR0+DLcUWvSFeDqfz3 +J7xDLCTbWrA4nwMJywf83+p5P8JAoHBAOHC2Hscyg8ku66IiLYhXbpXlAV1U837 rgeculuMtQfW5pqByfzg8O4DyxA/2JppYD4w2EG95FYzRmPg+ibTB/cgXhyBoWSS M2gdyrOrQ4sWAKz0HIv3GxwSN5NhO6PiAIwN59MtduHsximmTAyFF1NHcDuMvIdn uP7Jwbh1c5VG8Fb7ul9rTN41agdFh+Xa6gTUJxqGGz5N8GeFV99xp738NyBSMhFE /wxWeQwE+FvnLNryaCZC42viJ2jpT6N2pQKBwQDfV5Vr4YjIg4ycBZ0dLXSjy2yp hql7Yf5fEYoNjw4a4Uf9gWNN9QiK2hbb5tFVmZaMm8tf0zyqT6yZpC6LkYCm3X1T QrTBCapo/0pQ9pAYTLvYaL3IYoUvN5+v7X5BYYJ1QHTkXhbUpPl5LIenOIn8rOrn k3rmuZATRTVTh2fxrzP823LF5kJllEBqvF4nsm04bAXCiLu4pJq3ascqdIrixwC6 Lk0cUo6oV7RitxE8FY8HhxEnzgXHg0QEgjVTZPMCgcAkUKFd/F2MXg5Knu/OzEM1 bE0FK8BVS/zMgKuBenrMTgc+J06EfPKEdtu9O2fuPrEaj+TZfmAydYEHI/NZN2z6 lZxN3ZRGhzX5s4EdsZjl0J4/M+07nn4f39ZMwMFFNV99J+d4ksGiyeF+ZZ+qC+aa oM0u5w6UgVzCr1WYBFyZUJXsiAWMv8fXnqP1k3uuv64RJMc9fwD23rajEFH4QWII L3/2lQI0wPJ925MRGeORdPhEJ+YU8YF/oxtPxufmlXkCgcAH+XSYWYEsx6WpnHmz pP/ZKVZD50793M3cTyACw+zZANo1Lv2AtxMLAiZ2y5MF32oEsztbvIsZ+aZMBhSz Xwqc6qOi6WrSyamP/i2FHoielX7Ph03fbcUbnnzRJ0Wux/CEhzylOsbN6OYPcYuW aOpkXzgz9Iwa2N1QEtSImvkXJA5TJPLAJiyQu+5g4UDrYe+MaC78dy1ctmPf0Kwz 091xo3FfNHAEZt45HIiQTcELyClHN4dhSHXkXcd78bo9tAkCgcEAszBLH2PkY+pi If4sm/9OWDJUpI2QepOKKjeHNoh395PwiCP6utiRrbFt3gCopShPXgzaG5mV/sIU PkmulS7XT4564MwfwKMJGhJze61jolgxTG+MohlR9qGnzAY+nfH4UeSms34+vCoE B26nhxhs3iQJEelQx3Hvf7RXfQUAN7DzweFQ+jyKA0cyBgyZV7NsP9tgzrP4McTe uUlYv3f9+jZmaFLLOMM0Ggj0jPqsHY4DAPeiaDB42KkmdRYfUc2y -----END RSA PRIVATE KEY----- in-toto-0.4.0/tests/demo_files/alice.pub0000644000076500000240000000116113535414523020243 0ustar lukpstaff00000000000000-----BEGIN PUBLIC KEY----- MIIBojANBgkqhkiG9w0BAQEFAAOCAY8AMIIBigKCAYEAxPX3kFs/z645x4UOC3KF Y3V80YQtKrp6YS3qU+Jlvx/XzK53lb4sCDRU9jqBBx3We45TmFUibroMd8tQXCUS e8gYCBUBqBmmz0dEHJYbW0tYF7IoapMIxhRYn76YqNdl1JoRTcmzIaOJ7QrHxQrS GpivvTm6kQ9WLeApG1GLYJ3C3Wl4bnsI1bKSv55Zi45/JawHzTzYUAIXX9qCd3Io HzDucz9IAj9Ookw0va/q9FjoPGrRB80IReVxLVnbo6pYJfu/O37jvEobHFa8ckHd YxUIg8wvkIOy1O3M74lBDm6CVI0ZO25xPlDB/4nHAE1PbA3aF3lw8JGuxLDsetxm fzgAleVt4vXLQiCrZaLf+0cM97JcT7wdHcbIvRLsij9LNP+2tWZgeZ/hIAOEdaDq 
cYANPDIAxfTvbe9I0sXrCtrLer1SS7GqUmdFCdkdun8erXdNF0ls9Rp4cbYhjdf3 yMxdI/24LUOOQ71cHW3ITIDImm6I8KmrXFM2NewTARKfAgMBAAE= -----END PUBLIC KEY----- in-toto-0.4.0/tests/demo_files/bob0000644000076500000240000000463313535414523017152 0ustar lukpstaff00000000000000-----BEGIN RSA PRIVATE KEY----- MIIG5QIBAAKCAYEA0Zfzonp3/FScaIP+KKuzB+OZNFpjbVGWjm3leqnFqHYLqrLc Cw5KhlXpycJqoSvZBpO+PFCksUx8U/ryklHGVoDiB84pRkvZtBoVaA4b4IHDIhz1 K5NqkJgieya4fwReTxmCW0a9gH7AnDicHBCXlzMxqEdt6OKMV5g4yjKaxf8lW72O 1gSI46GSIToo+Z7UUgs3ofaM5UFIcczgCpUa5kEKocB6cSZ9U8PKRLSs0xO0ROjr cOTsfxMs8eV4bsRCWY5mAq1WM9EHDSV9WO8gqrRmanC4enNqa8jU4O3zhgJVegP9 A01r9AwNt6AqgPSikwhXN/P4v1FMYV+R6N3bS1lsVWRAnwBq5RFz5zVvcY88JEkH brcBqP/A4909NXae1VMXmnoJb4EzGAkyUySBa+fHXAVJgzwyv3I48d/OIjH8NWcV mM/DQL7FtcJk3tp0YUjY5wNpcbQTnLzURtlUsd+MtGuvdlDxUUvtUYCIVKRdS8Uz YnTPjI2xzeoSHZ2ZAgMBAAECggGBAIUj0GlTAKsymFSwHCB7yXNmeejOzkAgRtJd Lxv3go7bxdd+XNdNEbw6ERPQQ2n0b52E9qBb3fKeko/KZpLaDXLf1jV9Ga0E+9sz gouiAsVfyLP/zyIKN/R4H9c5JpPRE5ONscgHrNNWMUZLk6ckRxeONqoeDcyVNO9j zBdtg/HofTPEu1pqcQagmTIwPt0qrtcbNxDUjHYJPVrE+UDfrMG9aWKM4XrFJ3Gx euigGPTQnH/1sbH6Sd0DMlbLHPDIC/N0BaJXgWId1+KkkxGEYh671yB2ZN0MN4JO q2RxSOynFY0x7yu6my8MCMbiByxnk00+scCY54r8Hs+9yECb4qlqsPYr+XbsIbG2 RJGRLWLMrD/EhTyDx5fCfM/ZFHFoDy5BWO+vjUehow+PEQBsWSRWNjXgfkzPjVMa SCovCOoPY8Ghwrg4p/QG9Lf+Y0egLn6EniWPrgKnMPW3tzCvC/5sjC0y1WWciW3o RJB9nu6GShk942w9jscr4gM634vkHQKBwQDtfaWZ6ndy1Z1quD1oDiBAaeyPGpzM P/A2u2x95h/2TVo4PN+Zs1ehlNwIKoMWBwXJil4fLWmDW2081PrT/XnnInNkw4iV HWIZUcmOCQzLm0PPuxPHywbP1LDu8/IGnoYjbK8yiOBZAy/klI1A+oDHf6cmf7GY Jm4+KrDsFdaroudYOAz/twuf8KXzznZAFZMSApjq6c2ZHVju48rhWLYxMOG85XBF 4suZ3yCi337Szj1zhfIE3lvqmZbUMJDj+fMCgcEA4e201xi6U/K2H0dRFaLtUi4O EhR+VJTKzWrwne+07RVes3yrbowYs9mjg9HIKn9rYPxBG+hfmkrOeRJXYpPmfx8q kk39K1XOe787tjguD1Uj8b5PPr88t0XhxHr5XsS12xpeMR/e2lJ715pycOjnJji/ nz3ne1RHuew2j3nSnCjoDd7TmpHvyQDjRhZr8S28k2hWXfK8WuvTOj5e+SzvkMWG 8J5kLi+qFscADGXlpmFQtz57iwAI0MBjkR0yscFDAoHAEiP82EruoNjsU1CLcD1T /VeZ+DxiKb/gi225lcxUOK4j7BPKSKVIVlFWlVEZ/j6/FGv7UIpZeu0q5PCn0DWW cC9TfSjqb+l0qtZyfOT4Ez1i6qUxl5tMg+eNNFNx80t8l4wfvc5yxJnXuLAYMhRw bcy0ad5rJGIbHaiJJx9r7GRfI3/0jjvfKXJqWrs0kSSUvVVxdNAzIjT5rBW+U4RB NnSzaYhlERGH19MRXR+RQmz6iK58lB6gCsV8neyvxJo9AoHBAJBY34HOOr4IBHRX jGbWgepPoo3KqixAJJK6EKHX1TDkxmzG6oDm4aGHHAHMtqbwYhrFEJRUE0DxKpoQ LeS9ujbeIsT3LxnQ6OwHco8ptcP2EdESVm8woAo4i9aM+2ahJ8+lOSkJw8iZiqZl 91hMdeLlvwhu9MbHQkx3ryRcIUPEnv69r1TCiQFTn+HX0X92SVWlBAliXRV6Nqqv zt5E54sHqP9zM26O5Y1H96/0KpXy9y8crLJSg09cnEDK9ui7IQKBwQC5wFhP/J3z U3W+PVV4j3IvSTSZPTqg7jyHn95BZaUwT3/L6VbOssN2EavPC4rAtKwIK7gvvhLG HKp0RPXK+52ANggjopvKuR+wyHVLXdNCTroCOHTJ9vQX9GlzcLb0kjAnt7BSCUCi zvVDSCFiZR3KWsf8oMIlpf4ZcxVVqd84/ALMRQVH4ep0sBIfwrDVjRL6Cl47rLaa MnWqzjNeDVfGt45q0RSCjNWrZI6I2SvWeMLmnMhXVeUViB8XEfNH6a0= -----END RSA PRIVATE KEY----- in-toto-0.4.0/tests/demo_files/bob.pub0000644000076500000240000000116113535414523017730 0ustar lukpstaff00000000000000-----BEGIN PUBLIC KEY----- MIIBojANBgkqhkiG9w0BAQEFAAOCAY8AMIIBigKCAYEA0Zfzonp3/FScaIP+KKuz B+OZNFpjbVGWjm3leqnFqHYLqrLcCw5KhlXpycJqoSvZBpO+PFCksUx8U/ryklHG VoDiB84pRkvZtBoVaA4b4IHDIhz1K5NqkJgieya4fwReTxmCW0a9gH7AnDicHBCX lzMxqEdt6OKMV5g4yjKaxf8lW72O1gSI46GSIToo+Z7UUgs3ofaM5UFIcczgCpUa 5kEKocB6cSZ9U8PKRLSs0xO0ROjrcOTsfxMs8eV4bsRCWY5mAq1WM9EHDSV9WO8g qrRmanC4enNqa8jU4O3zhgJVegP9A01r9AwNt6AqgPSikwhXN/P4v1FMYV+R6N3b S1lsVWRAnwBq5RFz5zVvcY88JEkHbrcBqP/A4909NXae1VMXmnoJb4EzGAkyUySB a+fHXAVJgzwyv3I48d/OIjH8NWcVmM/DQL7FtcJk3tp0YUjY5wNpcbQTnLzURtlU sd+MtGuvdlDxUUvtUYCIVKRdS8UzYnTPjI2xzeoSHZ2ZAgMBAAE= -----END PUBLIC KEY----- in-toto-0.4.0/tests/demo_files/carl0000644000076500000240000000462713535414523017334 0ustar 
lukpstaff00000000000000-----BEGIN RSA PRIVATE KEY----- MIIG4wIBAAKCAYEAzgLBsMFSgwBiWTBmVsyW5KbJwLFSodAzdUhU2Bq6SdRz/W6U OBGdojZXibxupjRtAaEQW/eXDe+1CbKg6ENZGt2D9HGFCQZgQS8ONgNDQGiNxgAp MA0T21AaUhru0vEofzdN1DfEF4CAGv5AkcgKsalhTyONervFIjFEdXGelFZ7dVMV 3Pp5WkZPG0jFQWjnmDZhUrtSxEtqbVghc3kKAUj9Ll/3jyi2wS92Z1j5ueN8X62h WX2xBqQ6nViOMzdujkoiYCRSwuMLRqzW2CbTL8hF1+S5KWKFzxl5sCVfpPe7V5Hk gEHjwCILXTbCn2fCMKlaSbJ/MG2lW7qSY2RowVXWkp1wDrsJ6Ii9f2dErv9vJeOV ZeO9DsooQ5EuzLCfQLEU5mn7ul7bU7rFsb8JxYOeudkNBatnNCgVMAkmDPiNA7E3 3bmL5ARRwU0iZicsqLQR32pmwdap8PjofxqQk7Gtvz/iYzaLrZv33cFWWTsEOqK1 gKqigSqgW9T26wO9AgMBAAECggGARgxR59QURk7/Iex/H8x5Ek5UE503x6WPmbV6 g/CynyAKQIY1B8uVme54XXEg/oqc9gzaDytkxWxYVm/SnqcnySbBE4D2B1ePqxpg 7qS1XVUvv7+WQaxwsAXQrVJTGo69gmQ/poU/u+4JoSQKuIcYe8hoOgyQssbeBP5T lMuJbE7rs70Ilhbk5Hk7tL9NsywEc5EqDoN7ZRtvmLZ6yVTCviUfUYF4H8lsOsv0 qrdqUnazd6UGU/OVGfkFpTLTRz71T/7ls25pXvMGyYPacS+EwencSNpnMqWqyREQ jETGSiPL4PdYD8ZxbWs5McTFkTeEcxQiHTvooIYto7sVA+QZSzX7t/jQ+DnfiodP LXrgGOG28vcCSnOXSbwmqHsVzkiLHWmVaE/+dgGDiUQfFlCGYEzSDE7DxlvYrM/8 deXdOFH9tqOmz/Fj7XQTZFZTxg4D3EOAFl//RV5shxugNy+AGMR97mpwhsbX25rY j84h2sox/awhG9mOfLJ12iUNu56BAoHBAPbrvbgWD75GcUzfVmqGIMkJRa9QyyvU esl3UXvdOQJC7UjnzMRtnDgoJXVrY2WUcXk9NUgrJHRnMOwdG87wMHbtl64+XMuF ETvwePcDosiQfnL6VxguCBxurrk+ASyVbY5E1hhZs+X7lT4vHf6dqZIQwQ8Sp1hZ ERKAgW3C9+gOIgJ15rLb1zgBx/CJPKnmNF+rA0I+Gl2k6nyHKWYscOp3XsfSPch3 UL8990C2D4Tw3oVx+eBxYDkdG8IbNmP0pQKBwQDVle4SqeSSDFZfscr/LpM6sZpv XAFTBB+3Doo0fxSWvBaPqUy6DePFORktQGuEq1FUewT2J8CbC5OdmuGdT1pB3k8u ww54tmkdX/TjcWZlDpSOilZpZbrb/CpPtzxoBHpHcB5uydmRNLTX1ejsh3lIC2Z/ ++479jaZxCUVjxIWlnCajwZ/69xLyS4i0PJSZ7tHzVCr9/xfcRggBzynEFIFigEu kRWWS/9lE6ZVMJIiaGu9k0S/ZQZJsjo+J8NebzkCgcBZHcoSN3wlGz1nFjAVMCWD CSKqXImHXx+VuMei4bvikg8bwfVIa5r3NZ4XW9O65LFzpWCmlFOK43dnsDXKISwM sEGPNJi/J16J+Idf92L36haJHsryQiLRSC0tVDCOS7wHndZ7YVypQ3ygvdagf3yk 6AEVlJDrNPIRcGnGGJjqmrcxliXWJbvuTc5AhPdtBlWetZTugoV3iL7MhOevJZj3 Nm/xxHJh+JYe/5lONKczPs/A79rUY3bsBhJvouyldLUCgcEAupnfwoyNto9y0u45 RRLC7I+INmxyd73tm3fnhQ/VxA+VwnOOJirGaWOGPIPAq8slJiLIZeq7GnpVwGc1 X8OZPlzkXx1pQktTsKdKA3/qjrXbUmFIN5L6WwGHUdfrvZDT6B/rZq2RGIysxrkL Y8LEUg4rwfsv21EzyMmWAKzbh8JtpIr6ib5d3BRq614Tp60a8RNOvL/OzO+4vRq1 gdv/XPmCKNX2vzWsbvlo45qowcjrAFhuwqyXfsFooy8IQDP5AoHAZQJYElEUULpr tNPQKlJy6vwZh95kaJyhjmBtyNyP64e93QwagN8AHcwoRXxERBrzy8eJ7dUfrduc xw0AgX8VecsFo2GOfgVVTuC4Ychbaj5NfRtj2suJMUshvYu85A95HHV+q3x+V2C8 MTf49Gyf3tjrIlqgyAT7/smUBY+mGdfvhvQ6yEyXiBKMibUeDt+X1nvcIII2kBl7 xdOOfJKqqkPLEDq3f2t7zIcoPs2hXfHmyRVpb+ySAuAtxFGqj0/O -----END RSA PRIVATE KEY----- in-toto-0.4.0/tests/demo_files/carl.pub0000644000076500000240000000116113535414523020107 0ustar lukpstaff00000000000000-----BEGIN PUBLIC KEY----- MIIBojANBgkqhkiG9w0BAQEFAAOCAY8AMIIBigKCAYEAzgLBsMFSgwBiWTBmVsyW 5KbJwLFSodAzdUhU2Bq6SdRz/W6UOBGdojZXibxupjRtAaEQW/eXDe+1CbKg6ENZ Gt2D9HGFCQZgQS8ONgNDQGiNxgApMA0T21AaUhru0vEofzdN1DfEF4CAGv5AkcgK salhTyONervFIjFEdXGelFZ7dVMV3Pp5WkZPG0jFQWjnmDZhUrtSxEtqbVghc3kK AUj9Ll/3jyi2wS92Z1j5ueN8X62hWX2xBqQ6nViOMzdujkoiYCRSwuMLRqzW2CbT L8hF1+S5KWKFzxl5sCVfpPe7V5HkgEHjwCILXTbCn2fCMKlaSbJ/MG2lW7qSY2Ro wVXWkp1wDrsJ6Ii9f2dErv9vJeOVZeO9DsooQ5EuzLCfQLEU5mn7ul7bU7rFsb8J xYOeudkNBatnNCgVMAkmDPiNA7E33bmL5ARRwU0iZicsqLQR32pmwdap8PjofxqQ k7Gtvz/iYzaLrZv33cFWWTsEOqK1gKqigSqgW9T26wO9AgMBAAE= -----END PUBLIC KEY----- in-toto-0.4.0/tests/demo_files/danny0000644000076500000240000000052013535414523017510 0ustar lukpstaff00000000000000{"keyid": "36a5529f72cf3b1cc096a28887350969212f9d6c46529658ab186fc0cb781699", "keyval": {"public": "32b2b597295d7a9ef4c5cfd2d588f6b02dc983a58cf5677b1d7d20af82c7e1a2", "private": 
"b471d9ec4c83f8ed8047f3e1ed6014fbd283e002a3b6946faff4895629ab64aa"}, "keytype": "ed25519", "scheme": "ed25519", "keyid_hash_algorithms": ["sha256", "sha512"]}in-toto-0.4.0/tests/demo_files/danny.pub0000644000076500000240000000026413535414523020302 0ustar lukpstaff00000000000000{"keyval": {"public": "32b2b597295d7a9ef4c5cfd2d588f6b02dc983a58cf5677b1d7d20af82c7e1a2"}, "keytype": "ed25519", "scheme": "ed25519", "keyid_hash_algorithms": ["sha256", "sha512"]}in-toto-0.4.0/tests/demo_files/demo.layout.template0000644000076500000240000001051113535414523022452 0ustar lukpstaff00000000000000{"signed": { "_type": "layout", "inspect": [ { "_type": "inspection", "expected_materials": [ [ "MATCH", "foo.tar.gz", "WITH", "PRODUCTS", "FROM", "package" ] ], "expected_products": [ [ "MATCH", "foo.py", "WITH", "PRODUCTS", "FROM", "write-code" ] ], "name": "untar", "run": [ "python", "./scripts/tar", "xfz", "foo.tar.gz" ] } ], "keys": { "2f89b9272acfc8f4a0a0f094d789fdb0ba798b0fe41f2f5f417c12f0085ff498": { "keyid": "2f89b9272acfc8f4a0a0f094d789fdb0ba798b0fe41f2f5f417c12f0085ff498", "keyid_hash_algorithms": [ "sha256", "sha512" ], "keytype": "rsa", "keyval": { "private": "", "public": "-----BEGIN PUBLIC KEY-----\nMIIBojANBgkqhkiG9w0BAQEFAAOCAY8AMIIBigKCAYEAzgLBsMFSgwBiWTBmVsyW\n5KbJwLFSodAzdUhU2Bq6SdRz/W6UOBGdojZXibxupjRtAaEQW/eXDe+1CbKg6ENZ\nGt2D9HGFCQZgQS8ONgNDQGiNxgApMA0T21AaUhru0vEofzdN1DfEF4CAGv5AkcgK\nsalhTyONervFIjFEdXGelFZ7dVMV3Pp5WkZPG0jFQWjnmDZhUrtSxEtqbVghc3kK\nAUj9Ll/3jyi2wS92Z1j5ueN8X62hWX2xBqQ6nViOMzdujkoiYCRSwuMLRqzW2CbT\nL8hF1+S5KWKFzxl5sCVfpPe7V5HkgEHjwCILXTbCn2fCMKlaSbJ/MG2lW7qSY2Ro\nwVXWkp1wDrsJ6Ii9f2dErv9vJeOVZeO9DsooQ5EuzLCfQLEU5mn7ul7bU7rFsb8J\nxYOeudkNBatnNCgVMAkmDPiNA7E33bmL5ARRwU0iZicsqLQR32pmwdap8PjofxqQ\nk7Gtvz/iYzaLrZv33cFWWTsEOqK1gKqigSqgW9T26wO9AgMBAAE=\n-----END PUBLIC KEY-----" }, "scheme": "rsassa-pss-sha256" }, "776a00e29f3559e0141b3b096f696abc6cfb0c657ab40f441132b345b08453f5": { "keyid": "776a00e29f3559e0141b3b096f696abc6cfb0c657ab40f441132b345b08453f5", "keyid_hash_algorithms": [ "sha256", "sha512" ], "keytype": "rsa", "keyval": { "private": "", "public": "-----BEGIN PUBLIC KEY-----\nMIIBojANBgkqhkiG9w0BAQEFAAOCAY8AMIIBigKCAYEA0Zfzonp3/FScaIP+KKuz\nB+OZNFpjbVGWjm3leqnFqHYLqrLcCw5KhlXpycJqoSvZBpO+PFCksUx8U/ryklHG\nVoDiB84pRkvZtBoVaA4b4IHDIhz1K5NqkJgieya4fwReTxmCW0a9gH7AnDicHBCX\nlzMxqEdt6OKMV5g4yjKaxf8lW72O1gSI46GSIToo+Z7UUgs3ofaM5UFIcczgCpUa\n5kEKocB6cSZ9U8PKRLSs0xO0ROjrcOTsfxMs8eV4bsRCWY5mAq1WM9EHDSV9WO8g\nqrRmanC4enNqa8jU4O3zhgJVegP9A01r9AwNt6AqgPSikwhXN/P4v1FMYV+R6N3b\nS1lsVWRAnwBq5RFz5zVvcY88JEkHbrcBqP/A4909NXae1VMXmnoJb4EzGAkyUySB\na+fHXAVJgzwyv3I48d/OIjH8NWcVmM/DQL7FtcJk3tp0YUjY5wNpcbQTnLzURtlU\nsd+MtGuvdlDxUUvtUYCIVKRdS8UzYnTPjI2xzeoSHZ2ZAgMBAAE=\n-----END PUBLIC KEY-----" }, "scheme": "rsassa-pss-sha256" } }, "readme": "", "steps": [ { "_type": "step", "expected_command": [], "expected_materials": [], "expected_products": [ [ "CREATE", "foo.py" ] ], "name": "write-code", "pubkeys": [ "776a00e29f3559e0141b3b096f696abc6cfb0c657ab40f441132b345b08453f5" ], "threshold": 1 }, { "_type": "step", "expected_command": [ "tar", "zcvf", "foo.tar.gz", "foo.py" ], "expected_materials": [ [ "MATCH", "foo.py", "WITH", "PRODUCTS", "FROM", "write-code" ] ], "expected_products": [ [ "CREATE", "foo.tar.gz" ], [ "ALLOW", "foo.py" ] ], "name": "package", "pubkeys": [ "2f89b9272acfc8f4a0a0f094d789fdb0ba798b0fe41f2f5f417c12f0085ff498" ], "threshold": 1 } ] }, "signatures": []} in-toto-0.4.0/tests/demo_files/foo.tar.gz0000644000076500000240000000022413535414523020367 0ustar 
lukpstaff00000000000000‹½tXíÏ= Â@†áÔžbÀZ™ýÉæ ^Ã" ââÝÞÞÕË&ˆð>ÍÇ0S|s>NÏnSª¼—w¡ÿ¤Ú67VÅ8u½k/5œ è¶µšùQÎ÷Z%Í×i鮞Ÿ°oŸÈ7ÿÄ^NcJY.·CÉ%ï~ݰΠˆžG–(in-toto-0.4.0/tests/demo_files/package.2f89b927.link0000644000076500000240000000254013535414523022025 0ustar lukpstaff00000000000000{ "signatures": [ { "keyid": "2f89b9272acfc8f4a0a0f094d789fdb0ba798b0fe41f2f5f417c12f0085ff498", "sig": "66365d379d66a2e76d39a1f048847826393127572ba43bead96419499b02561a08e1cb06cf91f2addd87c30a01f776a8ccc599574bc9a2bd519558351f56cffa61ac4f994d0d491204ff54707937e15f9abfa97c5bda1ec1ae2a2afea63f808613f4fb343b85a5a455b668b95fa3a11cb9b34219d4d6af2dd4e80a9af01023954a8813b510a6ff6041c3af52056d021fabbc975211b0d8ee7a429a6c22efde583d8ac0719fd657b398a3e02cc711897acbe8cadf32d54f47012aa44621728ede42c3bc95c662f9c1211df4e18da8e0f6b2de358700cea5db1e76fc61ef5a90bcebcc883eed2272e5ca1c8cbb09b868613b839266cd3ae346ce88439bdb5bb4c69dcb7398f4373f2b051adb3d44d11ef1b70c7189aa5c0e6906bf7be1228dc553390024c9c796316067fda7d63cf60bfac86ef2e13bbd8e4c3575683673f7cdf4639c3a5dc225fc0c040dbd9962a6ff51913b240544939ce2d32a5e84792c0acfa94ee07e88e474bf4937558d107c6ecdef5b5b3a7f3a44a657662bbc1046df3a" } ], "signed": { "_type": "link", "byproducts": { "return-value": 0, "stderr": "a foo.py\n", "stdout": "" }, "command": [ "tar", "zcvf", "foo.tar.gz", "foo.py" ], "environment": {}, "materials": { "foo.py": { "sha256": "74dc3727c6e89308b39e4dfedf787e37841198b1fa165a27c013544a60502549" } }, "name": "package", "products": { "foo.tar.gz": { "sha256": "52947cb78b91ad01fe81cd6aef42d1f6817e92b9e6936c1e5aabb7c98514f355" } } } }in-toto-0.4.0/tests/demo_files/write-code.776a00e2.link0000644000076500000240000000217613535414523022460 0ustar lukpstaff00000000000000{ "signatures": [ { "keyid": "776a00e29f3559e0141b3b096f696abc6cfb0c657ab40f441132b345b08453f5", "sig": "ba2899895d769db034b898e27104a3f8d6cfca4f482c55fc68283cf34feede07966f9d6a32ea283215d373d8b3f2d5b7c7b02546f2cb55751eb347069cd4225da7d5829491aef18405380e21a74b04859c4db41e33133b2aae323c84e95d081a544d58940396f94d49fb718637025a3b7d0b9d4fb3d0ae2a604cb4f45c4cb001e20c411a9832851916b3a2e21d2e29a4d33300a848cd968b588bce7483f6f751ee95c2eb6c18c4e2978be6bdd9eb63b9436e434b38c7f37a7c9f444744a250470262e9ea408e65b96608a6b7748ead175d5b3894a20f258046c769e94e50da4145052dc9bb736196edcb86d79857437d4994a7ec9f022fae3c16645605ae03e4f3408f29818dcaec0ff3e513a5fab6906877ea5e4c92bfa0967913ce70ddeb598e73cb8799ce80d1bf2b88e53d1264f0a36d3549be86b2dff39592a59d4982176d04f5f2d7170a13f5f37ee471493be1eb9a30e002e77e9046b29aa1c1363ca46e4a2ae2f96dd780d0132469815e3fe1ab6db11b570e9411d0968482ac02d2cc" } ], "signed": { "_type": "link", "byproducts": {}, "command": [], "environment": {}, "materials": {}, "name": "write-code", "products": { "foo.py": { "sha256": "74dc3727c6e89308b39e4dfedf787e37841198b1fa165a27c013544a60502549" } } } }in-toto-0.4.0/tests/demo_files_gpg/0000755000076500000240000000000013535503044017311 5ustar lukpstaff00000000000000in-toto-0.4.0/tests/demo_files_gpg/demo.layout.template0000644000076500000240000000775313535414523023325 0ustar lukpstaff00000000000000{"signed": { "_type": "layout", "inspect": [ { "_type": "inspection", "expected_materials": [ [ "MATCH", "foo.tar.gz", "WITH", "PRODUCTS", "FROM", "package" ] ], "expected_products": [ [ "MATCH", "foo.py", "WITH", "PRODUCTS", "FROM", "write-code" ] ], "name": "untar", "run": [ "tar", "xfz", "foo.tar.gz" ] } ], "keys": { "7b3abb26b97b655ab9296bd15b0bd02e1c768c43": { "hashes": [ "pgp+SHA2" ], "keyid": "7b3abb26b97b655ab9296bd15b0bd02e1c768c43", "keyval": { "private": "", "public": { "e": 
"010001", "n": "e9ad391502ae32bd4fcc41a0f9970f8901ed6ad1c5c128c02add22721cdc22318b64bec9f9467b6949b19fc2e98ce41906125ad45d0b138f1ad6c5da7bde38092d9e3e697ce8b8373b150b57342dd921d634b873f258f5c15559b52921fa4bb7f482ec43a1c85c3385bd520cedbdc16b2524a64aecf32ac5690e6dd4ee0210a975e1b6c5af164ea69ca64533422432070511068730594793885567bb8f7cffacf6eb5ffdc640e898e599579b21b15e497f5c052112c5fdf7974e7056cd1564fe84f207cb946d1efc521e5031299e6275936e6f9464a735bd4edc8e0cde3fe5b1bf6d3bc1ed12993b865d8fcb9d9a2b2ef2df30cb7f0ab4c0dea819ea017ff195" } }, "method": "pgp+rsa-pkcsv1.5", "type": "rsa" }, "8288ef560ed3795f9df2c0db56193089b285da58": { "hashes": [ "pgp+SHA2" ], "keyid": "8288ef560ed3795f9df2c0db56193089b285da58", "keyval": { "private": "", "public": { "e": "010001", "n": "c945b94bbc504a61769bb992e2b91a247f8d32c9188cb4918396dadf31ad189928eb21e52ba117e2d4f22737b1ec1b03e36d34386335e00696005515cc3b0c948743f6a6a9988bbdef9047804c0e0eeae43e16b1e8f1a1c5589567b8802c9f33c02e38133a6027958da5ae07c050722cc7261c49fed09a9ce65ff503f973a61ba23ccdbe4204ed7efac13df77274c7be3750b7234087e9d475d7bd93b20d004f5cab5864bdf94c417e7d3d9a094890aa81a8a7caacf4ac7afe3431aa4e01fd01da430c6a6a5ee08fe6e8528f5257b7576f0439467fb0d32a61501809a0ecf4974451866f97d72c617cb12897bd16d4719eb48aefdb5f4fab09bd33b03ebbbcdd" } }, "method": "pgp+rsa-pkcsv1.5", "type": "rsa" } }, "readme": "", "steps": [ { "_type": "step", "expected_command": [], "expected_materials": [], "expected_products": [ [ "CREATE", "foo.py" ] ], "name": "write-code", "pubkeys": [ "8288ef560ed3795f9df2c0db56193089b285da58" ], "threshold": 1 }, { "_type": "step", "expected_command": [ "tar", "zcvf", "foo.tar.gz", "foo.py" ], "expected_materials": [ [ "MATCH", "foo.py", "WITH", "PRODUCTS", "FROM", "write-code" ] ], "expected_products": [ [ "CREATE", "foo.tar.gz" ], [ "ALLOW", "foo.py" ] ], "name": "package", "pubkeys": [ "7b3abb26b97b655ab9296bd15b0bd02e1c768c43" ], "threshold": 1 } ] }, "signatures": []} in-toto-0.4.0/tests/demo_files_gpg/foo.tar.gz0000644000076500000240000000022413535414523021224 0ustar lukpstaff00000000000000‹½tXíÏ= Â@†áÔžbÀZ™ýÉæ ^Ã" ââÝÞÞÕË&ˆð>ÍÇ0S|s>NÏnSª¼—w¡ÿ¤Ú67VÅ8u½k/5œ è¶µšùQÎ÷Z%Í×i鮞Ÿ°oŸÈ7ÿÄ^NcJY.·CÉ%ï~ݰΠˆžG–(in-toto-0.4.0/tests/demo_files_gpg/package.7b3abb26.link0000644000076500000240000000225413535414523023000 0ustar lukpstaff00000000000000{ "signatures": [ { "keyid": "7b3abb26b97b655ab9296bd15b0bd02e1c768c43", "other_headers": "04000108001d1621047b3abb26b97b655ab9296bd15b0bd02e1c768c4305025a25a328", "signature": "e29c54033e60a883c326cc8f1ecaecc3c61cbddfd67c91e223235ad19075e360609de01121140210c0be7d296726df82f6b61611aafce46e94b4ae5a596e938b0ac6cc141397965b8b3f7fc695bd8b8361186678e1dbc765f0b4e690f22b7a5f3e72b7221088fb519fd619b3f227055643cd09094a3991b4765d4a1b0c2d988d360c633d5e508f8ad6fe87a07f5f0047f44ab35e53cb8b55c6f06c43b5aa512f946a9d8ff711bc7ea4aac6d0306ce94bb8cb9fe03b0493c6afbf2d7255e1937491d4f43d09ee603682823a530c16e2cc0f16227f9ba5d6f6926631fc170a30a59590cee2a0695b692aba0252c7b5d983bf3ebf822f992a877a1382478a557b11" } ], "signed": { "_type": "link", "byproducts": { "return-value": 0, "stderr": "a foo.py\n", "stdout": "" }, "command": [ "tar", "zcvf", "foo.tar.gz", "foo.py" ], "environment": {}, "materials": { "foo.py": { "sha256": "74dc3727c6e89308b39e4dfedf787e37841198b1fa165a27c013544a60502549" } }, "name": "package", "products": { "foo.tar.gz": { "sha256": "52947cb78b91ad01fe81cd6aef42d1f6817e92b9e6936c1e5aabb7c98514f355" } } } }in-toto-0.4.0/tests/demo_files_gpg/write-code.8288ef56.link0000644000076500000240000000171213535414523023334 0ustar 
lukpstaff00000000000000{ "signatures": [ { "keyid": "8288ef560ed3795f9df2c0db56193089b285da58", "other_headers": "04000108001d1621048288ef560ed3795f9df2c0db56193089b285da5805025a25a335", "signature": "bc8db9988d722ccc2e34165be012488effec5f73be29d2db4f860154d90b58247ed8a589d13fc11b6301939e926353ecc0484ac14d55183e7c89ca90b315e9399103dcf7a41748f3ee4c55d114ac49fc32bc0f0a512e5324431eff2325555b24d4ae0426aba9cd0c5509ca14c1db2205da1d9ff7a50b1a5219f6fade5d4cd547552344191103d9511971b2e2cca7aa028fe5b3218c6ee3fa5beac94e67602fdd548e554b4bc42fedda42aa5e634b6805e4982ee0ae3c8b91515c61590e2fdce1e02b977e52913fd49e9c3e6c41f1a9eed80669fbb4a60d54cc70a77ada294e0bbe4c28c10312bb00754be626696abbf1b10c7d6ac952b591637f45a20ec26e8f" } ], "signed": { "_type": "link", "byproducts": {}, "command": [], "environment": {}, "materials": {}, "name": "write-code", "products": { "foo.py": { "sha256": "74dc3727c6e89308b39e4dfedf787e37841198b1fa165a27c013544a60502549" } } } }in-toto-0.4.0/tests/gpg_keyrings/0000755000076500000240000000000013535503044017036 5ustar lukpstaff00000000000000in-toto-0.4.0/tests/gpg_keyrings/dsa/0000755000076500000240000000000013535503044017605 5ustar lukpstaff00000000000000in-toto-0.4.0/tests/gpg_keyrings/dsa/C242A830DAAF1C2BEF604A9EF033A3A3E267B3B1.ssh0000644000076500000240000000216013535414523025044 0ustar lukpstaff00000000000000ssh-dss AAAAB3NzaC1kc3MAAAEBAPyjJ2zXjCDjxzriOYZ0BGA59dkPQePt6byZ+UAA0UVpNSJnH7pIHSLgqbMeaV0ZjaXmL0/7TbXcZAdtDy19A86VP8eEam1OF6EL8dzRcWf3r/dhtZ+iGA5/zSylJ8A8UMeGZbVTm/K0Vki20j8x83mZ5qe04Idt2tfseDuO7H4fsUcz50trCxBcvcWn3o4JRlfyFGzkOjF3WBywIqTizmZ4ozZKVuAgkFWabf2B2Ryjt8av1Pz8Zv2IM50hcGJGL1HFyR1uzPr7MgZb5o5rkeyDfFmlG66+yhxw/TiRybu2f32SD5FT/E0soD+Iontw3xaEcJ+ZrRhwcYmwFUQbK/sAAAAhAIR3nurgI416mgMKY5vwGg+e9Rel2VBZnBmk5U+78jIZAAABAH9yUq4YJLryvl/I9DGhl4aDo41KIswrzcAczR9e7kepZKpXY5phjPsbEHB7TQn/EaRI6DunASNXPy1JpZn1MTp0Rj5bs8o9YXKgDwKwEGXOMSUB4Xl/e1fmBpR8RL2Dn96NQyafH7dK9s7fTbf6vwsjV+0J1WOBrHae9aivG0RQ4MiLZO4cq5+t6zG3vmIHt+FwCKM6dhODH3ChI9WSedy8Ijj0buqoCXeVt4BfG4N+87joBxZOGG+un6P/UQITCWv1QEDqxUWmpbR8kQ5s9+MG4fRnI/FLAs2eCw/ypWw7JgSGlDGrMmPWG/UGi+42yIDHvyx0bcrl0Ney//JE70MAAAEALdULIpJEFERYH5oLfY1/iLVz/EUfXnIHwyRpQjLCLhcbUI9oQq6bq8Vv5OWGoiCGGItIJ7erqMe/9KSsmqgMg1Qgsa+6SrTxscDviUQ3kDqfTFbr7wN4BKmZJcmhU7ihbBVi8pd1WuqiD6AqsyqlNm4FK2uqmpNDVtT1/CGHhQGN0Sssjm1gXSr7NssGqcztnqH1+CeY1jXeJk7w61lZDEpLL98jaaNvlWFIBMeqWWa6lZdAS6LSxogZWREt5S3kttTx4silndqtsIpZrIM0EY8VqgFZPoUQJJBeptiEw6VFr2/dA8jStU2h015xDvdaK0d1u3jFCyjR4vtIQW3JQQ== openpgp:0xE267B3B1 in-toto-0.4.0/tests/gpg_keyrings/dsa/pubring.gpg0000644000076500000240000000172013535414523021755 0ustar lukpstaff00000000000000™.Z ¦ü£'l׌ ãÇ:â9†t`9õÙAãíé¼™ù@ÑEi5"gºH"ੳi]¥æ/OûMµÜdm-}Ε?Ç„jmN¡ ñÜÑqg÷¯÷aµŸ¢Í,¥'À[³Ê=ar °eÎ1%áy{Wæ”|D½ƒŸÞC&Ÿ·JöÎßM·ú¿ #Wí Õc¬vžõ¨¯DPàÈ‹d­ë1·¾b·áp£:vƒp¡#Õ’yܼ"8ônê¨ w•·€_ƒ~ó¸èNo®Ÿ£ÿQ kõ@@êÅE¦¥´|‘l÷ãáôg#ñKÍž ò¥l;&†”1«2cÖõ‹î6Ȁǿ,tmÊåÐײÿòDïCþ-Õ "’DDXš }ˆµsüE^rÃ$iB2Â.PhB®›«Åoä冢 †‹H'·«¨Ç¿ô¤¬š¨ ƒT ±¯ºJ´ñ±Àï‰D7:ŸLVëïx©™%É¡S¸¡lbò—uZꢠ*³*¥6n+kªš“CVÔõü!‡…Ñ+,Žm`]*û6Ë©Ìíž¡õø'˜Ö5Þ&NðëYY JK/ß#i£o•aHǪYfº•—@K¢ÒƈY-å-ä¶ÔñâÈ¥Ú­°ŠY¬ƒ4ªY>…$^¦Ø„Ã¥E¯oÝÈÒµM¡Ó^q÷Z+Gu»xÅ (ÑâûHAmÉA´Joan Doe ˆz"Z ¦  € ð3££âg³±aÛþ>|¹}bß±@Òe!Yè~ÂéB‚‡.GúË.ÿ“Uþ?Vyùÿ, mø&€UÕq›€­x!hs×àXc0·!°in-toto-0.4.0/tests/gpg_keyrings/dsa/random_seed0000644000076500000240000000113013535414523022006 0ustar lukpstaff00000000000000Òñ¡D'Ež°X‹Ýëí èÃvÄ—­õíö›½}¤ÒÞ-§«…겈­´xm6­¦0 º,b«Ñ ÖÔ#`Ĉ$°Ê<ë8õ1¦2©tI<žî+¼K¬ù•ƒnAÝgWÑmr 𙿣*›S®P‹J„màÎÿ7[㎛ÜÞu%ìyK[!Ö1;ðSh²æWà»;¥O¹·Óël€Å°}bè 91²o‚5‰z‚ŒˉVýWK(üyœÙAÛ¦õ΢¸Le1±ø…°Vö[§ W(º#§‚·ôäìG’ŒE]¹ÏÓ:¢HEB&¸çŒUþ¹¸­ÇžnµÖ z”⹉1$nsý· E°„•wœi÷ÎMÊR„‡ ޳‰dØdà¶ôÈ ÕõÀ^ 
yïó|UF®Y”:Ú*´|›]~B_áœÔRª8`‹]„èM1ÑÅjzAçàʼnøPó:PË«€¿²ÝênWEf%Z–—ÝS èaX`DýÜÇMS,#›ÑÕhm‡ts™J‰¸Oû™þd;´³²Ìó*Z¯Ì$ÅÅ%¹òm:7;*A Zþ•9c'cˆêëKyš´÷STú–¿g3b¥,¢ÔIèêq­šI‰>ÓáÙtPß±),ëq¡®ìŽ7‰|ñÍE'™>Âg&»oV+·á~fŒðê3Å— ¸Þ,]Ï„yRÜróù¿àAH ýwÊ`”²Ô{êÍÓ?½|vWØ•ÈÉãRmX¦øW™®g£Ù¦¡þ³{S›,ÁÕþin-toto-0.4.0/tests/gpg_keyrings/dsa/secring.gpg0000644000076500000240000000176513535414523021752 0ustar lukpstaff00000000000000•SZ ¦ü£'l׌ ãÇ:â9†t`9õÙAãíé¼™ù@ÑEi5"gºH"ੳi]¥æ/OûMµÜdm-}Ε?Ç„jmN¡ ñÜÑqg÷¯÷aµŸ¢Í,¥'À[³Ê=ar °eÎ1%áy{Wæ”|D½ƒŸÞC&Ÿ·JöÎßM·ú¿ #Wí Õc¬vžõ¨¯DPàÈ‹d­ë1·¾b·áp£:vƒp¡#Õ’yܼ"8ônê¨ w•·€_ƒ~ó¸èNo®Ÿ£ÿQ kõ@@êÅE¦¥´|‘l÷ãáôg#ñKÍž ò¥l;&†”1«2cÖõ‹î6Ȁǿ,tmÊåÐײÿòDïCþ-Õ "’DDXš }ˆµsüE^rÃ$iB2Â.PhB®›«Åoä冢 †‹H'·«¨Ç¿ô¤¬š¨ ƒT ±¯ºJ´ñ±Àï‰D7:ŸLVëïx©™%É¡S¸¡lbò—uZꢠ*³*¥6n+kªš“CVÔõü!‡…Ñ+,Žm`]*û6Ë©Ìíž¡õø'˜Ö5Þ&NðëYY JK/ß#i£o•aHǪYfº•—@K¢ÒƈY-å-ä¶ÔñâÈ¥Ú­°ŠY¬ƒ4ªY>…$^¦Ø„Ã¥E¯oÝÈÒµM¡Ó^q÷Z+Gu»xÅ (ÑâûHAmÉAý1“Ká;sñòXÑž€žŽ¬ò_ýº¾^`gÖå×î´Joan Doe ˆz"Z ¦  € ð3££âg³±aÛþ>|¹}bß±@Òe!Yè~ÂéB‚‡.GúË.ÿ“Uþ?Vyùÿ, mø&€UÕq›€­x!hs×àXc0·!°in-toto-0.4.0/tests/gpg_keyrings/dsa/trustdb.gpg0000644000076500000240000000240013535414523021772 0ustar lukpstaff00000000000000gpgZ §  ÂB¨0Ú¯+ï`Jžð3££âg³± “03,ßy™i âœ|]"µ2›¸in-toto-0.4.0/tests/gpg_keyrings/rsa/0000755000076500000240000000000013535503044017623 5ustar lukpstaff00000000000000in-toto-0.4.0/tests/gpg_keyrings/rsa/7B3ABB26B97B655AB9296BD15B0BD02E1C768C43.ssh0000644000076500000240000000062013535414523025052 0ustar lukpstaff00000000000000ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDprTkVAq4yvU/MQaD5lw+JAe1q0cXBKMAq3SJyHNwiMYtkvsn5RntpSbGfwumM5BkGElrUXQsTjxrWxdp73jgJLZ4+aXzouDc7FQtXNC3ZIdY0uHPyWPXBVVm1KSH6S7f0guxDochcM4W9UgztvcFrJSSmSuzzKsVpDm3U7gIQqXXhtsWvFk6mnKZFM0IkMgcFEQaHMFlHk4hVZ7uPfP+s9utf/cZA6JjlmVebIbFeSX9cBSESxf33l05wVs0VZP6E8gfLlG0e/FIeUDEpnmJ1k25vlGSnNb1O3I4M3j/lsb9tO8HtEpk7hl2Py52aKy7y3zDLfwq0wN6oGeoBf/GV openpgp:0x1C768C43 in-toto-0.4.0/tests/gpg_keyrings/rsa/8288EF560ED3795F9DF2C0DB56193089B285DA58.ssh0000644000076500000240000000062013535414523025043 0ustar lukpstaff00000000000000ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDJRblLvFBKYXabuZLiuRokf40yyRiMtJGDltrfMa0YmSjrIeUroRfi1PInN7HsGwPjbTQ4YzXgBpYAVRXMOwyUh0P2pqmYi73vkEeATA4O6uQ+FrHo8aHFWJVnuIAsnzPALjgTOmAnlY2lrgfAUHIsxyYcSf7QmpzmX/UD+XOmG6I8zb5CBO1++sE993J0x743ULcjQIfp1HXXvZOyDQBPXKtYZL35TEF+fT2aCUiQqoGop8qs9Kx6/jQxqk4B/QHaQwxqal7gj+boUo9SV7dXbwQ5Rn+w0yphUBgJoOz0l0RRhm+X1yxhfLEol70W1HGetIrv219Pqwm9M7A+u7zd openpgp:0xB285DA58 in-toto-0.4.0/tests/gpg_keyrings/rsa/8465A1E2E0FB2B40ADB2478E18FB3F537E0C8A17.ssh0000644000076500000240000000062013535414523025052 0ustar lukpstaff00000000000000ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDaWUCebt4welL2hRlUp71LnjCb1Ao5D4wN6XIrYxARDvCwlb8cRz4z25cVDtrgXGPdpwwDkCcBsV88XDCJR+GwZnW08REgMPEUW+hK4VYukSDC1CmyDVBWM3y8l/yLXbVwSiHbY10AshV+1opAPHk+mVi3fgAWP5mwGAngjukJm5mxF8CGUB5565R/dgoHFb6tACTEjYH5AAZxxDBqk3JZZfP/LcmAbq8IE1fwJoyri6dYLS6V5RIlqdx+0xqVaMRVaNeRewXnyVTVYc0IQpHnenvdaeOsL5CR3lX+P05zAUfogOL8BExffATHXOM6PAtSOA9NYDCXCMVhhfO85nA7 openpgp:0x7E0C8A17 in-toto-0.4.0/tests/gpg_keyrings/rsa/pubring.gpg0000644000076500000240000004251413535414523022001 0ustar lukpstaff00000000000000™ Z ÚY@žnÞ0zRö…T§½Kž0›Ô 9Œ ér+cð°•¿G>3Û—Úà\cݧ '±_<\0‰Gá°fu´ñ 0ñ[èJáV.‘ ÂÔ)² PV3|¼—ü‹]µpJ!Ûc]²~ÖŠ@™X·~?™° àŽé ›™±À†Pyë”v ¾­$ÄùqÄ0j“rYeóÿ-É€n¯Wð&Œ«‹§X-.•å%©Ü~Ó•hÄUhב{çÉTÕaÍB‘çz{Ýiã¬/‘ÞUþ?NsGè€âüL_|Ç\ã:< R8M`0—Åa…ó¼æp;´Joan Doe ‰8"Z   € û?S~ Š ÿniZ§"ëŒKGÖÓŠ¡Ûß„jÇ«]5޳ŠÑé¹g7Ïñ+­diï/] ñÚçÂl©A»Ê:h'÷„…pÂõ«¸ —Ò«'˜Š,5“ ?áwÕ²*ÙÊh×qLÌùÊëDÆþ£X öíÒˆ¦`U÷Äœ=‰a"–ãÔí¥Ä¼ÅüEˆ$Sÿ—K`±XG¹›â}%ôŸÒ¡á@h…ðŸt›Ëɘ8È?Òº0ä“z²WÁô9¦§IqHδ e±~_q ñW¡g}¸Ç Ã»~w£èÿz>v+¨¼wQTï'VÆ?¯qåûÁ~3@³vÙïä-:T zvåfŒÏ?HGRWÔ›¬ùÿ°¹ Z ¤(kþ²¾éóm™Sï/ƒ&_+€ 
[binary GPG keyring data omitted: OpenPGP key, subkey, user ID and signature packets for the test identities "Don Joan", "Dodo Jojo", "Arthur Two Keys", "Test Self-sig" and "Test Expiration I/II/III", including a "Testing reason" revocation notice and a JPEG photo user attribute]
in-toto-0.4.0/tests/gpg_keyrings/rsa/random_seed0000644000076500000240000000113013535414523022024 0ustar lukpstaff00000000000000[binary data omitted]
in-toto-0.4.0/tests/gpg_keyrings/rsa/trustdb.gpg0000644000076500000240000000365013535414523022020 0ustar lukpstaff00000000000000[binary GPG trust database omitted]
in-toto-0.4.0/tests/models/0000755000076500000240000000000013535503044015631 5ustar lukpstaff00000000000000in-toto-0.4.0/tests/models/__init__.py0000644000076500000240000000000013535414523017733 0ustar lukpstaff00000000000000in-toto-0.4.0/tests/models/test_common.py0000755000076500000240000000113013535414523020533 0ustar lukpstaff00000000000000#!/usr/bin/env python
"""
  test_common.py

  Santiago Torres-Arias <santiago@nyu.edu>

  Sept 26, 2017

  See LICENSE for licensing information.

  Test the Signable and ValidationMixin class functions.

"""

import unittest
import json

from in_toto.models.common import Signable


class TestSignable(unittest.TestCase):
  """Verifies Signable class. """

  def test_load_repr_string_as_json(self):
    """Test load string returned by `Signable.repr` as JSON """
    json.loads(repr(Signable()))


if __name__ == "__main__":
  unittest.main()
in-toto-0.4.0/tests/models/test_inspection.py0000755000076500000240000000326713535414523021427 0ustar lukpstaff00000000000000#!/usr/bin/env python
"""
  test_inspection.py

  Santiago Torres-Arias <santiago@nyu.edu>

  Nov 18, 2016

  See LICENSE for licensing information.

  Test inspection class functions.

"""

import unittest

from in_toto.models.layout import Inspection
import securesystemslib.exceptions


class TestInspectionValidator(unittest.TestCase):
  """Test the validators of the Inspection class. """

  def setUp(self):
    """Populate a base layout that we can use."""
    self.inspection = Inspection(name="some-inspection")

  def test_wrong_type(self):
    """Test the type field within Validate()."""
    self.inspection._type = "wrong"
    with self.assertRaises(securesystemslib.exceptions.FormatError):
      self.inspection._validate_type()
    with self.assertRaises(securesystemslib.exceptions.FormatError):
      self.inspection.validate()

    self.inspection._type = "inspection"
    self.inspection._validate_type()

  def test_wrong_run(self):
    """Test that the run validators catch malformed values."""
    self.inspection.run = -1
    with self.assertRaises(securesystemslib.exceptions.FormatError):
      self.inspection._validate_run()
    with self.assertRaises(securesystemslib.exceptions.FormatError):
      self.inspection.validate()

    self.inspection.run = ["somecommand"]
    self.inspection._validate_run()
    self.inspection.validate()

  def test_set_run_from_string(self):
    """Test shlex parse command string to list.
""" inspection = Inspection() inspection.set_run_from_string("echo 'foo bar'") self.assertListEqual(inspection.run, ["echo", "foo bar"]) if __name__ == "__main__": unittest.main() in-toto-0.4.0/tests/models/test_layout.py0000755000076500000240000003255313535414523020575 0ustar lukpstaff00000000000000#!/usr/bin/env python """ test_layout.py Santiago Torres-Arias Nov 18, 2016 See LICENSE for licensing information. Test layout class functions. """ import os import unittest import tempfile import shutil from in_toto.models.layout import Layout, Step, Inspection from in_toto.models.metadata import Metablock import in_toto.models.link import in_toto.exceptions import in_toto.verifylib import securesystemslib.exceptions class TestLayoutMethods(unittest.TestCase): """Test Layout methods. """ @classmethod def setUpClass(self): """Create temporary test directory and copy gpg keychain, and rsa keys from demo files. """ self.gpg_keyid1 = "7b3abb26b97b655ab9296bd15b0bd02e1c768c43" self.gpg_keyid2 = "8288ef560ed3795f9df2c0db56193089b285da58" self.test_dir = os.path.realpath(tempfile.mkdtemp()) # Create directory to run the tests without having everything blow up self.working_dir = os.getcwd() # Copy GPG keyring to temp test dir tests_dir = os.path.join(os.path.dirname(os.path.realpath(__file__)), "..") self.gnupg_home = os.path.join(self.test_dir, "rsa") shutil.copytree( os.path.join(tests_dir, "gpg_keyrings", "rsa"), self.gnupg_home) # Copy keys to temp test dir key_names = ["bob", "bob.pub", "carl.pub"] for name in key_names: shutil.copy( os.path.join(tests_dir, "demo_files", name), os.path.join(self.test_dir, name)) self.key_path = os.path.join(self.test_dir, "bob") self.pubkey_path1 = os.path.join(self.test_dir, "bob.pub") self.pubkey_path2 = os.path.join(self.test_dir, "carl.pub") os.chdir(self.test_dir) @classmethod def tearDownClass(self): """Change back to initial working dir and remove temp test directory. """ os.chdir(self.working_dir) shutil.rmtree(self.test_dir) def test_set_relative_expiration(self): """Test adding expiration date relative from today. """ layout = Layout() with self.assertRaises(securesystemslib.exceptions.FormatError): layout.set_relative_expiration(days=None, months=0, years=0) with self.assertRaises(securesystemslib.exceptions.FormatError): layout.set_relative_expiration(days=0, months="", years=0) with self.assertRaises(securesystemslib.exceptions.FormatError): layout.set_relative_expiration(days=0, months=0, years=[]) layout.set_relative_expiration(days=1) layout._validate_expires() layout.set_relative_expiration(months=2) layout._validate_expires() layout.set_relative_expiration(years=3) layout._validate_expires() layout.set_relative_expiration(days=3, months=2, years=1) layout._validate_expires() # It's possible to add an expiration date in the past layout.set_relative_expiration(days=-3, months=-2, years=-1) layout._validate_expires() def test_get_step_name_list(self): """Test getting list of step names. """ names = ["a", "b", "c"] layout = Layout(steps=[Step(name=name) for name in names]) self.assertListEqual(layout.get_step_name_list(), names) def test_get_step_by_name(self): """Test getting step by name. """ names = ["a", "b", "c"] layout = Layout(steps=[Step(name=name) for name in names]) self.assertEqual(layout.get_step_by_name("b").name, "b") with self.assertRaises(securesystemslib.exceptions.FormatError): layout.get_step_by_name(None) def test_remove_step_by_name(self): """Test removing step by name. 
""" names = ["a", "b", "c"] layout = Layout(steps=[Step(name=name) for name in names]) self.assertEqual(len(layout.steps), 3) self.assertTrue("b" in layout.get_step_name_list()) # Items are only removed if they exist for _ in range(2): layout.remove_step_by_name("b") self.assertEqual(len(layout.steps), 2) self.assertTrue("b" not in layout.get_step_name_list()) with self.assertRaises(securesystemslib.exceptions.FormatError): layout.get_step_by_name([]) def test_get_inspection_name_list(self): """Test getting list of inspection names. """ names = ["a", "b", "c"] layout = Layout(inspect=[Inspection(name=name) for name in names]) self.assertListEqual(layout.get_inspection_name_list(), names) def test_get_inspection_by_name(self): """Test getting inspection by name. """ names = ["a", "b", "c"] layout = Layout(inspect=[Inspection(name=name) for name in names]) self.assertEqual(layout.get_inspection_by_name("b").name, "b") with self.assertRaises(securesystemslib.exceptions.FormatError): layout.get_inspection_by_name(1) def test_remove_inspection_by_name(self): """Test removing inspection by name. """ names = ["a", "b", "c"] layout = Layout(inspect=[Inspection(name=name) for name in names]) self.assertEqual(len(layout.inspect), 3) self.assertTrue("b" in layout.get_inspection_name_list()) # Items are only removed if they exist for _ in range(2): layout.remove_inspection_by_name("b") self.assertEqual(len(layout.inspect), 2) self.assertTrue("b" not in layout.get_inspection_name_list()) with self.assertRaises(securesystemslib.exceptions.FormatError): layout.remove_inspection_by_name(False) def test_functionary_keys(self): """Test adding and listing functionary keys (securesystemslib and gpg). """ layout = Layout() self.assertEqual(len(layout.get_functionary_key_id_list()), 0) layout.add_functionary_keys_from_paths([self.pubkey_path1, self.pubkey_path2]) layout.add_functionary_keys_from_gpg_keyids([self.gpg_keyid1, self.gpg_keyid2], gpg_home=self.gnupg_home) layout._validate_keys() self.assertEqual(len(layout.get_functionary_key_id_list()), 4) # Must be a valid key object with self.assertRaises(securesystemslib.exceptions.FormatError): layout.add_functionary_key("abcd") # Must be pubkey and not private key with self.assertRaises(securesystemslib.exceptions.Error): layout.add_functionary_key_from_path(self.key_path) # Must be a valid path with self.assertRaises(securesystemslib.exceptions.FormatError): layout.add_functionary_key_from_path(123) # Must be a valid keyid with self.assertRaises(securesystemslib.exceptions.FormatError): layout.add_functionary_key_from_gpg_keyid("abcdefg") # Must be a list of paths with self.assertRaises(securesystemslib.exceptions.FormatError): layout.add_functionary_keys_from_paths("abcd") with self.assertRaises(securesystemslib.exceptions.FormatError): layout.add_functionary_keys_from_paths([1]) # Must be a list of keyids with self.assertRaises(securesystemslib.exceptions.FormatError): layout.add_functionary_keys_from_gpg_keyids(None) with self.assertRaises(securesystemslib.exceptions.FormatError): layout.add_functionary_keys_from_gpg_keyids(["abcdefg"]) class TestLayoutValidator(unittest.TestCase): """Test in_toto.models.layout.Layout validators. 
""" def setUp(self): """Populate a base layout that we can use.""" self.layout = Layout() self.layout.expires = '2016-11-18T16:44:55Z' def test_wrong_type(self): """Test that the type field is validated properly.""" with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout._type = "wrong" self.layout._validate_type() self.layout.validate() self.layout._type = "layout" self.layout._validate_type() def test_validate_readme_field(self): """Tests the readme field data type validator. """ self.layout.readme = 1 with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout._validate_readme() self.layout.readme = "This is a test supply chain" self.layout._validate_readme() def test_wrong_expires(self): """Test the expires field is properly populated.""" self.layout.expires = '' with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout._validate_expires() with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout.validate() self.layout.expires = '-1' with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout._validate_expires() with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout.validate() # notice the wrong month self.layout.expires = '2016-13-18T16:44:55Z' with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout._validate_expires() with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout.validate() self.layout.expires = '2016-11-18T16:44:55Z' self.layout._validate_expires() self.layout.validate() def test_wrong_key_dictionary(self): """Test that the keys dictionary is properly populated.""" rsa_key_one = securesystemslib.keys.generate_rsa_key() rsa_key_two = securesystemslib.keys.generate_rsa_key() # FIXME: attr.ib reutilizes the default dictionary, so future constructor # are not empty... 
self.layout.keys = {"kek": rsa_key_one} with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout._validate_keys() with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout.validate() self.layout.keys = {} self.layout.keys[rsa_key_two['keyid']] = "kek" with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout._validate_keys() with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout.validate() self.layout.keys = {} del rsa_key_one["keyval"]["private"] del rsa_key_two["keyval"]["private"] self.layout.keys[rsa_key_one['keyid']] = rsa_key_one self.layout.keys[rsa_key_two['keyid']] = rsa_key_two self.layout._validate_keys() self.layout.validate() def test_wrong_steps_list(self): """Check that the validate method checks the steps' correctness.""" self.layout.steps = "not-a-step" with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout.validate() self.layout.steps = ["not-a-step"] with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout.validate() test_step = Step(name="this-is-a-step") with self.assertRaises(securesystemslib.exceptions.FormatError): test_step.expected_materials = ['this is a malformed step'] self.layout.steps = [test_step] self.layout.validate() test_step = Step(name="this-is-a-step") test_step.expected_materials = [["CREATE", "foo"]] test_step.threshold = 1 self.layout.steps = [test_step] self.layout.validate() def test_wrong_inspect_list(self): """Check that the validate method checks the inspections' correctness.""" self.layout.inspect = "not-an-inspection" with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout.validate() self.layout.inspect = ["not-an-inspection"] with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout.validate() test_inspection = Inspection(name="this-is-a-step") test_inspection.expected_materials = ['this is a malformed artifact rule'] self.layout.inspect = [test_inspection] with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout.validate() test_inspection = Inspection(name="this-is-a-step") test_inspection.expected_materials = [["CREATE", "foo"]] self.layout.inspect = [test_inspection] self.layout.validate() def test_repeated_step_names(self): """Check that only unique names exist in the steps and inspect lists""" self.layout.steps = [Step(name="name"), Step(name="name")] with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout.validate() self.layout.steps = [Step(name="name")] self.layout.inspect = [Inspection(name="name")] with self.assertRaises(securesystemslib.exceptions.FormatError): self.layout.validate() self.layout.step = [Step(name="name"), Step(name="othername")] self.layout.inspect = [Inspection(name="thirdname")] self.layout.validate() def test_import_step_metadata_wrong_type(self): functionary_key = securesystemslib.keys.generate_rsa_key() name = "name" # Create and dump a link file with a wrong type link_name = in_toto.models.link.FILENAME_FORMAT.format( step_name=name, keyid=functionary_key["keyid"]) link_path = os.path.abspath(link_name) link = in_toto.models.link.Link(name=name) metadata = Metablock(signed=link) metadata.signed._type = "wrong-type" metadata.dump(link_path) # Add the single step to the test layout and try to read the failing link self.layout.steps.append(Step( name=name, pubkeys=[functionary_key["keyid"]])) with self.assertRaises(securesystemslib.exceptions.FormatError): in_toto.verifylib.load_links_for_layout(self.layout, 
".") # Clean up os.remove(link_path) def test_wrong_pubkeys(self): """Check validate pubkeys fails with wrong keys.""" # Pubkeys must be lists ... tmp_step = Step() tmp_step.pubkeys = "abcd" with self.assertRaises(securesystemslib.exceptions.FormatError): tmp_step.validate() # ... of keyids (hex schema) tmp_step.pubkeys = ["abcdefg"] with self.assertRaises(securesystemslib.exceptions.FormatError): tmp_step.validate() if __name__ == "__main__": unittest.main() in-toto-0.4.0/tests/models/test_link.py0000644000076500000240000000543613535414523020212 0ustar lukpstaff00000000000000#!/usr/bin/env python """ test_link.py Lukas Puehringer Sep 28, 2017 See LICENSE for licensing information. Test link class functions. """ import unittest from in_toto.models.link import Link from securesystemslib.exceptions import FormatError class TestLinkValidator(unittest.TestCase): """Test link format validators """ def test_validate_type(self): """Test `_type` field. Must be "link" """ test_link = Link() # Good type test_link._type = "link" test_link.validate() # Bad type test_link._type = "bad link" with self.assertRaises(FormatError): test_link.validate() def test_validate_materials(self): """Test `materials` field. Must be a `dict` of HASH_DICTs """ test_link = Link() # Good materials sha = "d65165279105ca6773180500688df4bdc69a2c7b771752f0a46ef120b7fd8ec3" test_link.materials = {"foo": {"sha256": sha}} test_link.validate() # Bad materials 1 test_link.materials = "not a dict" with self.assertRaises(FormatError): test_link.validate() # Bad materials 1 test_link.materials = {"not": "a material dict"} with self.assertRaises(FormatError): test_link.validate() def test_validate_products(self): """Test `products` field. Must be a `dict` of HASH_DICTs """ test_link = Link() # Good products sha = "cfdaaf1ab2e4661952a9dec5e8fa3c360c1b06b1a073e8493a7c46d2af8c504b" test_link.products = {"bar": {"sha256": sha}} test_link.validate() # Bad products 1 test_link = Link() test_link.products = "not a dict" with self.assertRaises(FormatError): test_link.validate() # Bad products 2 test_link.products = {"not": "a product dict"} with self.assertRaises(FormatError): test_link.validate() def test_validate_byproducts(self): """Test `byproducts` field. Must be a `dict` """ test_link = Link() # Good byproducts test_link.byproducts = {} test_link.validate() # Bad byproducts test_link.byproducts = "not a dict" with self.assertRaises(FormatError): test_link.validate() def test_validate_command(self): """Test `command` field. Must be either a `list` """ test_link = Link() # Good command test_link.command = ["echo", "'good command'"] test_link.validate() # Bad command test_link.command = "echo 'bad command'" with self.assertRaises(FormatError): test_link.validate() def test_validate_environment(self): """Test `environment` field. Must be a `dict` """ test_link = Link() # good env per default test_link.validate() # Bad env test_link.environment = "not a dict" with self.assertRaises(FormatError): test_link.validate() in-toto-0.4.0/tests/models/test_metadata.py0000644000076500000240000000443213535414523021030 0ustar lukpstaff00000000000000#!/usr/bin/env python """ test_metadata.py Lukas Puehringer Jan 24, 2018 See LICENSE for licensing information. Test in_toto.models.metadata.Metablock class methods. 
""" import os import unittest from in_toto.models.metadata import Metablock from in_toto.models.layout import Layout from in_toto.models.link import Link from securesystemslib.exceptions import FormatError class TestMetablockValidator(unittest.TestCase): """Test in_toto.models.metadata.Metablock validators. """ def test_validate_signed(self): """Test validate Metablock's 'signed' property. """ # Valid Layout Metablock metablock = Metablock(signed=Layout()) metablock._validate_signed() # Valid Link Metablock Metablock(signed=Link()) metablock._validate_signed() # Fail instantiation with empty or invalid signed property # Metablock is validated on instantiation with self.assertRaises(FormatError): Metablock() with self.assertRaises(FormatError): Metablock(signed="not-a-layout-or-link") # Fail with invalid signed property metablock = Metablock(signed=Layout()) metablock.signed._type = "bogus type" with self.assertRaises(FormatError): metablock._validate_signed() def test_validate_signatures(self): """Test validate Metablock's 'signatures' property. """ # An empty signature list is okay metablock = Metablock(signed=Layout()) metablock._validate_signatures() # Fail with signatures property not a list metablock.signatures = "not-a-signatures-list" with self.assertRaises(FormatError): metablock._validate_signatures() # Fail with invalid signature metablock.signatures = [] metablock.signatures.append("not-a-signature") with self.assertRaises(FormatError): metablock._validate_signatures() # Load signed demo link demo_link_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), "..", "demo_files", "write-code.776a00e2.link") metablock = Metablock.load(demo_link_path) # Verify that there is a signature and that it is valid self.assertTrue(len(metablock.signatures) > 0) metablock._validate_signatures() if __name__ == "__main__": unittest.main() in-toto-0.4.0/tests/models/test_step.py0000755000076500000240000000543213535414523020227 0ustar lukpstaff00000000000000#!/usr/bin/env python """ test_step.py Santiago Torres-Arias Nov 18, 2016 See LICENSE for licensing information. Test step class functions. """ import unittest from in_toto.models.layout import Step import securesystemslib.keys import securesystemslib.exceptions class TestStepValidator(unittest.TestCase): """Test verifylib.verify_delete_rule(rule, artifact_queue) """ def setUp(self): """Populate a base layout that we can use.""" self.step = Step(name="this-step") def test_wrong_type(self): """Test the type field within Validate().""" self.step._type = "wrong" with self.assertRaises(securesystemslib.exceptions.FormatError): self.step._validate_type() with self.assertRaises(securesystemslib.exceptions.FormatError): self.step.validate() self.step._type = "step" self.step._validate_type() def test_wrong_threshold(self): """Test that the threshold value is correctly checked.""" # no, python is not *this* smart self.step.threshold = "Ten" with self.assertRaises(securesystemslib.exceptions.FormatError): self.step._validate_threshold() with self.assertRaises(securesystemslib.exceptions.FormatError): self.step.validate() self.step.threshold = 10 self.step._validate_threshold() self.step.validate() def test_wrong_pubkeys(self): # FIXME: generating keys for each test are expensive processes, maybe we # should have an asset/fixture folder/loader? 
    rsa_key_one = securesystemslib.keys.generate_rsa_key()
    rsa_key_two = securesystemslib.keys.generate_rsa_key()

    self.step.pubkeys = ['bad-keyid']
    with self.assertRaises(securesystemslib.exceptions.FormatError):
      self.step._validate_pubkeys()
    with self.assertRaises(securesystemslib.exceptions.FormatError):
      self.step.validate()

    self.step.pubkeys = [rsa_key_one['keyid'], rsa_key_two['keyid']]
    self.step._validate_pubkeys()
    self.step.validate()

  def test_wrong_expected_command(self):
    """Test that the expected command validator catches malformed ones."""
    self.step.expected_command = -1
    with self.assertRaises(securesystemslib.exceptions.FormatError):
      self.step._validate_expected_command()
    with self.assertRaises(securesystemslib.exceptions.FormatError):
      self.step.validate()

    self.step.expected_command = ["somecommand"]
    self.step._validate_expected_command()
    self.step.validate()

  def test_set_expected_command_from_string(self):
    """Test shlex parse command string to list. """
    step = Step()
    step.set_expected_command_from_string("echo 'foo bar'")
    self.assertListEqual(step.expected_command, ["echo", "foo bar"])


if __name__ == "__main__":
  unittest.main()
in-toto-0.4.0/tests/models/test_supply_chain_item.py0000644000076500000240000000522613535414523022766 0ustar lukpstaff00000000000000#!/usr/bin/env python
"""
  test_supply_chain_item.py

  Lukas Puehringer <lukas.puehringer@nyu.edu>

  Jan 25, 2018

  See LICENSE for licensing information.

  Test SupplyChainItem class methods.
  SupplyChainItem is a super class for Steps and Inspections.

"""

import json
import unittest

from in_toto.models.layout import SupplyChainItem
import securesystemslib.exceptions


class TestSupplyChainItem(unittest.TestCase):
  """Test models.SupplyChainItem. """

  def test_wrong_expected_materials(self):
    """Test that the material rule validators catch malformed ones."""
    item = SupplyChainItem()

    with self.assertRaises(securesystemslib.exceptions.FormatError):
      item.expected_materials = [["NONFOO"]]
      item._validate_expected_materials()
    with self.assertRaises(securesystemslib.exceptions.FormatError):
      item.validate()

    with self.assertRaises(securesystemslib.exceptions.FormatError):
      item.expected_materials = "PFF"
      item._validate_expected_materials()
    with self.assertRaises(securesystemslib.exceptions.FormatError):
      item.validate()

    # for more thorough tests, check the test_rulelib.py module
    item.expected_materials = [["CREATE", "foo"]]
    item._validate_expected_materials()
    item.validate()

  def test_wrong_expected_products(self):
    """Test that the product rule validators catch malformed values."""
    item = SupplyChainItem()

    item.expected_products = [["NONFOO"]]
    with self.assertRaises(securesystemslib.exceptions.FormatError):
      item._validate_expected_products()
    with self.assertRaises(securesystemslib.exceptions.FormatError):
      item.validate()

    item.expected_products = "PFF"
    with self.assertRaises(securesystemslib.exceptions.FormatError):
      item._validate_expected_products()
    with self.assertRaises(securesystemslib.exceptions.FormatError):
      item.validate()

    # for more thorough tests, check the test_rulelib.py module
    item.expected_products = [["CREATE", "foo"]]
    item._validate_expected_products()
    item.validate()

  def test_repr(self):
    """Test repr returns a JSON parseable string. """
    json.loads(repr(SupplyChainItem()))

  def test_add_rule_from_string(self):
    """Test that the add_*_rule_from_string methods set the rule properties
    correctly.
""" item = SupplyChainItem() item.add_material_rule_from_string("CREATE foo") self.assertListEqual(item.expected_materials[-1], ["CREATE", "foo"]) item.add_product_rule_from_string("ALLOW bar") self.assertListEqual(item.expected_products[-1], ["ALLOW", "bar"]) item.validate() if __name__ == "__main__": unittest.main() in-toto-0.4.0/tests/rc_test/0000755000076500000240000000000013535503044016011 5ustar lukpstaff00000000000000in-toto-0.4.0/tests/rc_test/.in_totorc0000644000076500000240000000014713535414523020017 0ustar lukpstaff00000000000000[settings] artifact_base_path=r/c/file ARTIFACT_EXCLUDE_PATTERNS=r:c:file new_rc_setting=new rc settingin-toto-0.4.0/tests/runtests.py0000644000076500000240000000204113535414523016607 0ustar lukpstaff00000000000000#!/usr/bin/env python """ runtests.py Santiago Torres Lukas Puehringer May 23, 2016 See LICENSE for licensing information. Script to search, load and run in-toto tests using the Python `unittest` framework. """ from unittest import defaultTestLoader, TextTestRunner import sys import os import subprocess def check_usable_gpg(): """Set `TEST_SKIP_GPG` environment variable if neither gpg2 nor gpg is available. """ os.environ["TEST_SKIP_GPG"] = "1" for gpg in ["gpg2", "gpg"]: try: subprocess.check_call([gpg, "--version"]) except OSError: pass else: # If one of the two exists, we can unset the skip envvar and ... os.environ.pop("TEST_SKIP_GPG", None) # ... abort the availability check.: break check_usable_gpg() suite = defaultTestLoader.discover(start_dir=".") result = TextTestRunner(verbosity=2, buffer=True).run(suite) sys.exit(0 if result.wasSuccessful() else 1) in-toto-0.4.0/tests/scripts/0000755000076500000240000000000013535503044016035 5ustar lukpstaff00000000000000in-toto-0.4.0/tests/scripts/expr0000644000076500000240000000142513535414523016743 0ustar lukpstaff00000000000000#!/usr/bin/env python """ this is a mock touch command written to provide portability to non-unix platforms that don't have the touch command. It uses touchfile to replace the very baseline functionality used within the in-toto test suite. You will probably find a couple of tests decorated as follows: @mock.patch.dict(os.environ, $PATH_TO_THIS FOLDER) def test_that_requires_to_execute_touch(...) ... This will automatically make it so that in-toto popens this touch executable instead of the system's touch. This script is *not* meant to do thorough checking of its inputs, so use it with care. Santiago Torres-Arias 1527623020 """ import sys sys.exit(1) in-toto-0.4.0/tests/scripts/tar0000644000076500000240000000150713535414523016554 0ustar lukpstaff00000000000000#!/usr/bin/env python """ this is a mock tar command written to provide portability to non-unix platforms that don't have the tar command. It uses tarfile to replace the very baseline functionality used within the in-toto test suite. You will probably find a couple of tests decorated as follows: @mock.patch.dict(os.environ, $PATH_TO_THIS FOLDER) def test_that_requires_to_execute_tar(...) ... This will automatically make it so that in-toto popens this tar executable instead of the system's tar. This script is *not* meant to do thorough checking of its inputs, so use it with care. 
  Santiago Torres-Arias <santiago@nyu.edu>

  1527623020
"""
import tarfile
import sys

# Extract the archive passed as second command line argument (e.g. the
# "archive.tar" in "tar xf archive.tar") into the current working directory,
# auto-detecting its compression ("r:*").
tarfile.open(sys.argv[2], "r:*").extractall()
sys.exit(0)
in-toto-0.4.0/tests/scripts/touch0000644000076500000240000000146613535414523017104 0ustar lukpstaff00000000000000#!/usr/bin/env python
"""
  this is a mock touch command written to provide portability to non-unix
  platforms that don't have the touch command. It uses plain Python file
  handling to replace the very baseline functionality used within the in-toto
  test suite.

  You will probably find a couple of tests decorated as follows:

  @mock.patch.dict(os.environ, $PATH_TO_THIS FOLDER)
  def test_that_requires_to_execute_touch(...)
    ...

  This will automatically make it so that in-toto popens this touch
  executable instead of the system's touch.

  This script is *not* meant to do thorough checking of its inputs, so use it
  with care.

  Santiago Torres-Arias <santiago@nyu.edu>

  1527623020
"""
import sys

# Create the file passed as first command line argument, if it does not exist
# yet (append mode never truncates an existing file).
open(sys.argv[1], "a+").close()
sys.exit(0)
in-toto-0.4.0/tests/test_gpg.py0000755000076500000240000006771413535414523016551 0ustar lukpstaff00000000000000#!/usr/bin/env python
"""
  test_gpg.py

  Santiago Torres-Arias <santiago@nyu.edu>
  Lukas Puehringer <lukas.puehringer@nyu.edu>

  Nov 15, 2017

  See LICENSE for licensing information.

  Test gpg/pgp-related functions.

"""

import os
import sys
import shutil
import tempfile
import unittest

if sys.version_info >= (3, 3):
  from unittest.mock import patch # pylint: disable=no-name-in-module,import-error
else:
  from mock import patch # pylint: disable=import-error

from six import string_types
from copy import deepcopy
from collections import OrderedDict

import cryptography.hazmat.primitives.serialization as serialization
import cryptography.hazmat.backends as backends
import cryptography.hazmat.primitives.hashes as hashing

from in_toto import process
from in_toto.gpg.functions import (gpg_sign_object, gpg_export_pubkey,
    gpg_verify_signature)
from in_toto.gpg.util import (get_version, is_version_fully_supported,
    get_hashing_class, parse_packet_header, parse_subpacket_header)
from in_toto.gpg.rsa import create_pubkey as rsa_create_pubkey
from in_toto.gpg.dsa import create_pubkey as dsa_create_pubkey
from in_toto.gpg.common import (parse_pubkey_payload, parse_pubkey_bundle,
    get_pubkey_bundle, _assign_certified_key_info, _get_verified_subkeys,
    parse_signature_packet)
from in_toto.gpg.constants import (SHA1, SHA256, SHA512,
    GPG_EXPORT_PUBKEY_COMMAND, PACKET_TYPE_PRIMARY_KEY, PACKET_TYPE_USER_ID,
    PACKET_TYPE_USER_ATTR, PACKET_TYPE_SUB_KEY)
from in_toto.gpg.exceptions import (PacketParsingError,
    PacketVersionNotSupportedError, SignatureAlgorithmNotSupportedError,
    KeyNotFoundError, CommandError, KeyExpirationError)
from in_toto.gpg.formats import PUBKEY_SCHEMA


@unittest.skipIf(os.getenv("TEST_SKIP_GPG"), "gpg not found")
class TestUtil(unittest.TestCase):
  """Test util functions. """

  def test_version_utils_return_types(self):
    """Run dummy tests for coverage. """
    self.assertTrue(isinstance(get_version(), string_types))
    self.assertTrue(isinstance(is_version_fully_supported(), bool))

  def test_get_hashing_class(self):
    # Assert return expected hashing class
    expected_hashing_class = [hashing.SHA1, hashing.SHA256, hashing.SHA512]
    for idx, hashing_id in enumerate([SHA1, SHA256, SHA512]):
      result = get_hashing_class(hashing_id)
      self.assertEqual(result, expected_hashing_class[idx])

    # Assert raises ValueError with non-supported hashing id
    with self.assertRaises(ValueError):
      get_hashing_class("bogus_hashing_id")

  def test_parse_packet_header(self):
    """Test parse_packet_header with manually crafted data.
""" data_list = [ ## New format packet length with mock packet type 100001 # one-octet length, header len: 2, body len: 0 to 191 [0b01100001, 0], [0b01100001, 191], # two-octet length, header len: 3, body len: 192 to 8383 [0b01100001, 192, 0], [0b01100001, 223, 255], # five-octet length, header len: 6, body len: 0 to 4,294,967,295 [0b01100001, 255, 0, 0, 0, 0], [0b01100001, 255, 255, 255, 255, 255], ## Old format packet lengths with mock packet type 1001 # one-octet length, header len: 2, body len: 0 to 255 [0b00100100, 0], [0b00100100, 255], # two-octet length, header len: 3, body len: 0 to 65,535 [0b00100101, 0, 0], [0b00100101, 255, 255], # four-octet length, header len: 5, body len: 0 to 4,294,967,295 [0b00100110, 0, 0, 0, 0, 0], [0b00100110, 255, 255, 255, 255, 255], ] # packet_type | header_len | body_len | packet_len expected = [ (33, 2, 0, 2), (33, 2, 191, 193), (33, 3, 192, 195), (33, 3, 8383, 8386), (33, 6, 0, 6), (33, 6, 4294967295, 4294967301), (9, 2, 0, 2), (9, 2, 255, 257), (9, 3, 0, 3), (9, 3, 65535, 65538), (9, 5, 0, 5), (9, 5, 4294967295, 4294967300), ] for idx, data in enumerate(data_list): result = parse_packet_header(bytearray(data)) self.assertEqual(result, expected[idx]) # New Format Packet Lengths with Partial Body Lengths range for second_octet in [224, 254]: with self.assertRaises(PacketParsingError): parse_packet_header(bytearray([0b01100001, second_octet])) # Old Format Packet Lengths with indeterminate length (length type 3) with self.assertRaises(PacketParsingError): parse_packet_header(bytearray([0b00100111])) # Get expected type parse_packet_header(bytearray([0b01100001, 0]), expected_type=33) # Raise with unexpected type with self.assertRaises(PacketParsingError): parse_packet_header(bytearray([0b01100001, 0]), expected_type=34) def test_parse_subpacket_header(self): """Test parse_subpacket_header with manually crafted data. """ # All items until last item encode the length of the subpacket, # the last item encodes the mock subpacket type. data_list = [ # length of length 1, subpacket length 0 to 191 [0, 33], # NOTE: Nonsense 0-length [191, 33], # # length of length 2, subpacket length 192 to 16,319 [192, 0, 33], [254, 255, 33], # # length of length 5, subpacket length 0 to 4,294,967,295 [255, 0, 0, 0, 0, 33], # NOTE: Nonsense 0-length [255, 255, 255, 255, 255, 33], ] # packet_type | header_len | body_len | packet_len expected = [ (33, 2, -1, 1), # NOTE: Nonsense negative payload (33, 2, 190, 192), (33, 3, 191, 194), (33, 3, 16318, 16321), (33, 6, -1, 5), # NOTE: Nonsense negative payload (33, 6, 4294967294, 4294967300) ] for idx, data in enumerate(data_list): result = parse_subpacket_header(bytearray(data)) self.assertEqual(result, expected[idx]) @unittest.skipIf(os.getenv("TEST_SKIP_GPG"), "gpg not found") class TestCommon(unittest.TestCase): """Test common functions of the in_toto.gpg module. """ @classmethod def setUpClass(self): gpg_keyring_path = os.path.join( os.path.dirname(os.path.realpath(__file__)), "gpg_keyrings", "rsa") homearg = "--homedir {}".format(gpg_keyring_path).replace("\\", "/") # Load test raw public key bundle from rsa keyring, used to construct # erroneous gpg data in tests below. 
keyid = "F557D0FF451DEF45372591429EA70BD13D883381" cmd = GPG_EXPORT_PUBKEY_COMMAND.format(keyid=keyid, homearg=homearg) proc = process.run(cmd, stdout=process.PIPE, stderr=process.PIPE) self.raw_key_data = proc.stdout self.raw_key_bundle = parse_pubkey_bundle(self.raw_key_data) # Export pubkey bundle with expired key for key expiration tests keyid = "E8AC80C924116DABB51D4B987CB07D6D2C199C7C" cmd = GPG_EXPORT_PUBKEY_COMMAND.format(keyid=keyid, homearg=homearg) proc = process.run(cmd, stdout=process.PIPE, stderr=process.PIPE) self.raw_expired_key_bundle = parse_pubkey_bundle(proc.stdout) def test_parse_pubkey_payload_errors(self): """ Test parse_pubkey_payload errors with manually crafted data. """ # passed data | expected error | expected error message test_data = [ (None, ValueError, "empty pubkey"), (bytearray([0x03]), PacketVersionNotSupportedError, "packet version '3' not supported"), (bytearray([0x04, 0, 0, 0, 0, 255]), SignatureAlgorithmNotSupportedError, "Signature algorithm '255' not supported") ] for data, error, error_str in test_data: with self.assertRaises(error) as ctx: parse_pubkey_payload(data) self.assertTrue(error_str in str(ctx.exception)) def test_parse_pubkey_bundle_errors(self): """Test parse_pubkey_bundle errors with manually crafted data partially based on real gpg key data (see self.raw_key_bundle). """ # Extract sample (legitimate) user ID packet and pass as first packet to # raise first packet must be primary key error user_id_packet = list(self.raw_key_bundle[PACKET_TYPE_USER_ID].keys())[0] # Extract sample (legitimate) primary key packet and pass as first two # packets to raise unexpected second primary key error primary_key_packet = self.raw_key_bundle[PACKET_TYPE_PRIMARY_KEY]["packet"] # Create incomplete packet to re-raise header parsing IndexError as # PacketParsingError incomplete_packet = bytearray([0b01111111]) # passed data | expected error message test_data = [ (None, "empty gpg data"), (user_id_packet, "must be a primary key"), (primary_key_packet + primary_key_packet, "Unexpected primary key"), (incomplete_packet, "index out of range") ] for data, error_str in test_data: with self.assertRaises(PacketParsingError) as ctx: parse_pubkey_bundle(data) self.assertTrue(error_str in str(ctx.exception)) # Create empty packet of unsupported type 66 (bit 0-5) and length 0 and # pass as second packet to provoke skipping of unsupported packet unsupported_packet = bytearray([0b01111111, 0]) with patch("in_toto.gpg.common.LOG") as mock_log: parse_pubkey_bundle(primary_key_packet + unsupported_packet) self.assertTrue("Ignoring gpg key packet '63'" in mock_log.info.call_args[0][0]) def test_parse_pubkey_bundle(self): """Assert presence of packets expected returned from `parse_pubkey_bundle` for specific test key). 
    See
    ```
    gpg --homedir tests/gpg_keyrings/rsa/ --export 9EA70BD13D883381 | \
        gpg --list-packets
    ```

    """
    # Expect parsed primary key matching PUBKEY_SCHEMA
    self.assertTrue(PUBKEY_SCHEMA.matches(
        self.raw_key_bundle[PACKET_TYPE_PRIMARY_KEY]["key"]))

    # Parse corresponding raw packet for comparison
    _, header_len, _, _ = parse_packet_header(
        self.raw_key_bundle[PACKET_TYPE_PRIMARY_KEY]["packet"])

    # pylint: disable=unsubscriptable-object
    parsed_raw_packet = parse_pubkey_payload(bytearray(
        self.raw_key_bundle[PACKET_TYPE_PRIMARY_KEY]["packet"][header_len:]))

    # And compare
    self.assertDictEqual(
        self.raw_key_bundle[PACKET_TYPE_PRIMARY_KEY]["key"], parsed_raw_packet)

    # Expect one primary key signature (revocation signature)
    self.assertEqual(
        len(self.raw_key_bundle[PACKET_TYPE_PRIMARY_KEY]["signatures"]), 1)

    # Expect one User ID packet, one User Attribute packet and one Subkey,
    # each with correct data
    for _type in [PACKET_TYPE_USER_ID, PACKET_TYPE_USER_ATTR,
        PACKET_TYPE_SUB_KEY]:
      # Of each type there is only one packet
      self.assertTrue(len(self.raw_key_bundle[_type]) == 1)
      # The raw packet is stored as key in the per-packet type collection
      raw_packet = next(iter(self.raw_key_bundle[_type]))
      # Its values are the raw packet's header and body length
      self.assertEqual(len(raw_packet),
          self.raw_key_bundle[_type][raw_packet]["header_len"] +
          self.raw_key_bundle[_type][raw_packet]["body_len"])
      # and one self-signature
      self.assertEqual(
          len(self.raw_key_bundle[_type][raw_packet]["signatures"]), 1)

  def test_assign_certified_key_info_errors(self):
    """Test _assign_certified_key_info errors with manually crafted data
    based on real gpg key data (see self.raw_key_bundle). """
    # Replace legitimate user certificate with a bogus packet
    wrong_cert_bundle = deepcopy(self.raw_key_bundle)
    packet, packet_data = wrong_cert_bundle[PACKET_TYPE_USER_ID].popitem()
    packet_data["signatures"] = [bytearray([0b01111111, 0])]
    wrong_cert_bundle[PACKET_TYPE_USER_ID][packet] = packet_data

    # Replace primary key id with a non-associated keyid
    wrong_keyid_bundle = deepcopy(self.raw_key_bundle)
    wrong_keyid_bundle[PACKET_TYPE_PRIMARY_KEY]["key"]["keyid"] = \
        "8465A1E2E0FB2B40ADB2478E18FB3F537E0C8A17"

    # Remove a byte in user id packet to make signature verification fail
    invalid_cert_bundle = deepcopy(self.raw_key_bundle)
    packet, packet_data = invalid_cert_bundle[PACKET_TYPE_USER_ID].popitem()
    packet = packet[:-1]
    invalid_cert_bundle[PACKET_TYPE_USER_ID][packet] = packet_data

    test_data = [
      # Skip and log parse_signature_packet error
      (wrong_cert_bundle, "Expected packet 2, but got 63 instead"),
      # Skip and log signature packet that doesn't match primary key id
      (wrong_keyid_bundle, "Ignoring User ID certificate issued by"),
      # Skip and log invalid signature
      (invalid_cert_bundle, "Ignoring invalid User ID self-certificate")
    ]
    for bundle, expected_msg in test_data:
      with patch("in_toto.gpg.common.LOG") as mock_log:
        _assign_certified_key_info(bundle)
        msg = str(mock_log.info.call_args[0][0])
        self.assertTrue(expected_msg in msg,
            "'{}' not in '{}'".format(expected_msg, msg))

  def test_assign_certified_key_info_expiration(self):
    """Test assignment of key expiration date in
    gpg.common._assign_certified_key_info using real gpg data (with
    ambiguity resolution / prioritization).

    # FIXME: The tests below lack proper assertions for which User ID
    self-certificate is considered for the expiration date.
    Reasons are:
    - gpg does not let you (easily) modify individual expiration dates of
      User IDs (changing one changes all), hence we cannot assert the chosen
      packet by the particular date
    - _assign_certified_key_info first verifies all self-certificates and
      then only considers successfully verified ones, hence we cannot modify
      the certificate data before passing it to _assign_certified_key_info

    IMO the best solution is a better separation of concerns, e.g. separate
    self-certificate verification and packet prioritization.

    """
    # Test ambiguity resolution scheme with 3 User IDs
    # :user ID packet: "Test Expiration I "
    # :user ID packet: "Test Expiration II "
    # :user ID packet: "Test Expiration III "
    # User ID packets are ordered by their creation time in ascending order.
    # "Test Expiration II" has the primary user ID flag set and therefore has
    # the highest priority.
    key = _assign_certified_key_info(self.raw_expired_key_bundle)
    self.assertTrue(key["validity_period"] == 87901) # ~ 1 day

    # Test ambiguity resolution scheme with 2 User IDs
    # :user ID packet: "Test Expiration III "
    # :user ID packet: "Test Expiration I "
    # User ID packets are ordered by their creation time in descending order.
    # Neither packet has the primary user ID flag set.
    # "Test Expiration III" has the highest priority.
    raw_key_bundle = deepcopy(self.raw_expired_key_bundle)
    user_id_items = list(reversed(raw_key_bundle[PACKET_TYPE_USER_ID].items()))
    del user_id_items[1]
    raw_key_bundle[PACKET_TYPE_USER_ID] = OrderedDict(user_id_items)
    key = _assign_certified_key_info(raw_key_bundle)
    self.assertTrue(key["validity_period"] == 87901) # ~ 1 day

  def test_get_verified_subkeys_errors(self):
    """Test _get_verified_subkeys errors with manually crafted data based on
    real gpg key data (see self.raw_key_bundle).
""" # Tamper with subkey (change version number) to trigger key parsing error bad_subkey_bundle = deepcopy(self.raw_key_bundle) packet, packet_data = bad_subkey_bundle[PACKET_TYPE_SUB_KEY].popitem() packet = bytes(packet[:packet_data["header_len"]] + bytearray([0x03]) + packet[packet_data["header_len"]+1:]) bad_subkey_bundle[PACKET_TYPE_SUB_KEY][packet] = packet_data # Add bogus sig to trigger sig parsing error wrong_sig_bundle = deepcopy(self.raw_key_bundle) packet, packet_data = wrong_sig_bundle[PACKET_TYPE_SUB_KEY].popitem() # NOTE: We can't only pass the bogus sig, because that would also trigger # the not enough sigs error (see not_enough_sigs_bundle) and mock only # lets us assert for the most recent log statement packet_data["signatures"].append(bytearray([0b01111111, 0])) wrong_sig_bundle[PACKET_TYPE_SUB_KEY][packet] = packet_data # Remove sigs to trigger not enough sigs error not_enough_sigs_bundle = deepcopy(self.raw_key_bundle) packet, packet_data = not_enough_sigs_bundle[PACKET_TYPE_SUB_KEY].popitem() packet_data["signatures"] = [] not_enough_sigs_bundle[PACKET_TYPE_SUB_KEY][packet] = packet_data # Duplicate sig to trigger wrong amount signatures too_many_sigs_bundle = deepcopy(self.raw_key_bundle) packet, packet_data = too_many_sigs_bundle[PACKET_TYPE_SUB_KEY].popitem() packet_data["signatures"] = packet_data["signatures"] * 2 too_many_sigs_bundle[PACKET_TYPE_SUB_KEY][packet] = packet_data # Tamper with primary key to trigger signature verification error invalid_sig_bundle = deepcopy(self.raw_key_bundle) invalid_sig_bundle[PACKET_TYPE_PRIMARY_KEY]["packet"] = \ invalid_sig_bundle[PACKET_TYPE_PRIMARY_KEY]["packet"][:-1] test_data = [ (bad_subkey_bundle, "Pubkey packet version '3' not supported"), (wrong_sig_bundle, "Expected packet 2, but got 63 instead"), (not_enough_sigs_bundle, "wrong amount of key binding signatures (0)"), (too_many_sigs_bundle, "wrong amount of key binding signatures (2)"), (invalid_sig_bundle, "invalid key binding signature"), ] for bundle, expected_msg in test_data: with patch("in_toto.gpg.common.LOG") as mock_log: _get_verified_subkeys(bundle) msg = str(mock_log.info.call_args[0][0]) self.assertTrue(expected_msg in msg, "'{}' not in '{}'".format(expected_msg, msg)) def test_get_verified_subkeys(self): """Test correct assignment of subkey expiration date in gpg.common._get_verified_subkeys using real gpg data. """ subkeys = _get_verified_subkeys(self.raw_expired_key_bundle) # Test subkey with validity period 175451, i.e. ~ 2 days self.assertTrue(subkeys["0ce427fa3f0f50bc83a4a760ed95e1581691db4d"].get( "validity_period") == 175451) # Test subkey without validity period, i.e. it does not expire self.assertTrue(subkeys["70cfabf1e2f1dc60ac5c7bca10cd20d3d5bcb6ef"].get( "validity_period") is None) def test_get_pubkey_bundle_errors(self): """Pass wrong keyid with valid gpg data to trigger KeyNotFoundError. """ not_associated_keyid = "8465A1E2E0FB2B40ADB2478E18FB3F537E0C8A17" with self.assertRaises(KeyNotFoundError): get_pubkey_bundle(self.raw_key_data, not_associated_keyid) def test_parse_signature_packet_errors(self): """Test parse_signature_packet errors with manually crafted data. 
""" # passed data | expected error message test_data = [ (bytearray([0b01000010, 1, 255]), "Signature version '255' not supported"), (bytearray([0b01000010, 2, 4, 255]), "Signature type '255' not supported"), (bytearray([0b01000010, 3, 4, 0, 255]), "Signature algorithm '255' not supported"), (bytearray([0b01000010, 4, 4, 0, 1, 255]), "Hash algorithm '255' not supported"), ] for data, expected_error_str in test_data: with self.assertRaises(ValueError) as ctx: parse_signature_packet(data) self.assertTrue(expected_error_str in str(ctx.exception), "'{}' not in '{}'".format(expected_error_str, str(ctx.exception))) @unittest.skipIf(os.getenv("TEST_SKIP_GPG"), "gpg not found") class TestGPGRSA(unittest.TestCase): """Test signature creation, verification and key export from the gpg module""" default_keyid = "8465A1E2E0FB2B40ADB2478E18FB3F537E0C8A17" signing_subkey_keyid = "C5A0ABE6EC19D0D65F85E2C39BE9DF5131D924E9" encryption_subkey_keyid = "6A112FD3390B2E53AFC2E57F8FC8E12099AECEEA" unsupported_subkey_keyid = "611A9B648E16F54E8A7FAD5DA51E8CDF3B06524F" expired_key_keyid = "E8AC80C924116DABB51D4B987CB07D6D2C199C7C" @classmethod def setUpClass(self): # Create directory to run the tests without having everything blow up self.working_dir = os.getcwd() # Find demo files gpg_keyring_path = os.path.join( os.path.dirname(os.path.realpath(__file__)), "gpg_keyrings", "rsa") self.test_dir = os.path.realpath(tempfile.mkdtemp()) self.gnupg_home = os.path.join(self.test_dir, "rsa") shutil.copytree(gpg_keyring_path, self.gnupg_home) os.chdir(self.test_dir) @classmethod def tearDownClass(self): """Change back to initial working dir and remove temp test directory. """ os.chdir(self.working_dir) shutil.rmtree(self.test_dir) def test_gpg_export_pubkey(self): """ export a public key and make sure the parameters are the right ones: since there's very little we can do to check rsa key parameters are right we pre-exported the public key to an ssh key, which we can load with cryptography for the sake of comparison """ # export our gpg key, using our functions key_data = gpg_export_pubkey(self.default_keyid, homedir=self.gnupg_home) our_exported_key = rsa_create_pubkey(key_data) # load the equivalent ssh key, and make sure that we get the same RSA key # parameters ssh_key_basename = "{}.ssh".format(self.default_keyid) ssh_key_path = os.path.join(self.gnupg_home, ssh_key_basename) with open(ssh_key_path, "rb") as fp: keydata = fp.read() ssh_key = serialization.load_ssh_public_key(keydata, backends.default_backend()) self.assertEqual(ssh_key.public_numbers().n, our_exported_key.public_numbers().n) self.assertEqual(ssh_key.public_numbers().e, our_exported_key.public_numbers().e) subkey_keyids = list(key_data["subkeys"].keys()) # We export the whole master key bundle which must contain the subkeys self.assertTrue(self.signing_subkey_keyid.lower() in subkey_keyids) # Currently we do not exclude encryption subkeys self.assertTrue(self.encryption_subkey_keyid.lower() in subkey_keyids) # However we do exclude subkeys, whose algorithm we do not support self.assertFalse(self.unsupported_subkey_keyid.lower() in subkey_keyids) # When passing the subkey keyid we also export the whole keybundle key_data2 = gpg_export_pubkey(self.signing_subkey_keyid, homedir=self.gnupg_home) self.assertDictEqual(key_data, key_data2) def test_gpg_sign_and_verify_object_with_default_key(self): """Create a signature using the default key on the keyring """ test_data = b'test_data' wrong_data = b'something malicious' signature = gpg_sign_object(test_data, 
homedir=self.gnupg_home) key_data = gpg_export_pubkey(self.default_keyid, homedir=self.gnupg_home) self.assertTrue(gpg_verify_signature(signature, key_data, test_data)) self.assertFalse(gpg_verify_signature(signature, key_data, wrong_data)) def test_gpg_sign_and_verify_object(self): """Create a signature using a specific key on the keyring """ test_data = b'test_data' wrong_data = b'something malicious' signature = gpg_sign_object(test_data, keyid=self.default_keyid, homedir=self.gnupg_home) key_data = gpg_export_pubkey(self.default_keyid, homedir=self.gnupg_home) self.assertTrue(gpg_verify_signature(signature, key_data, test_data)) self.assertFalse(gpg_verify_signature(signature, key_data, wrong_data)) def test_gpg_sign_and_verify_object_default_keyring(self): """Sign/verify using keyring from envvar. """ test_data = b'test_data' gnupg_home_backup = os.environ.get("GNUPGHOME") os.environ["GNUPGHOME"] = self.gnupg_home signature = gpg_sign_object(test_data, keyid=self.default_keyid) key_data = gpg_export_pubkey(self.default_keyid) self.assertTrue(gpg_verify_signature(signature, key_data, test_data)) # Reset GNUPGHOME if gnupg_home_backup: os.environ["GNUPGHOME"] = gnupg_home_backup else: del os.environ["GNUPGHOME"] def test_gpg_sign_object_with_expired_key(self): """Test signing with expired key raises gpg CommandError. """ with self.assertRaises(CommandError) as ctx: gpg_sign_object(b"livestock", keyid=self.expired_key_keyid, homedir=self.gnupg_home) expected = "returned non-zero exit status '2'" self.assertTrue(expected in str(ctx.exception), "{} not in {}".format( expected, ctx.exception)) def test_gpg_verify_signature_with_expired_key(self): """Test sig verification with expired key raises KeyExpirationError. """ signature = { "keyid": self.expired_key_keyid, "other_headers": "deadbeef", "signature": "deadbeef", } content = b"livestock" key = gpg_export_pubkey(self.expired_key_keyid, homedir=self.gnupg_home) with self.assertRaises(KeyExpirationError) as ctx: gpg_verify_signature(signature, key, content) expected = ("GPG key 'e8ac80c924116dabb51d4b987cb07d6d2c199c7c' " "created on '2019-03-25 12:46 UTC' with validity period '1 day, " "0:25:01' expired on '2019-03-26 13:11 UTC'.") self.assertTrue(expected == str(ctx.exception), "\nexpected: {}" "\ngot: {}".format(expected, ctx.exception)) @unittest.skipIf(os.getenv("TEST_SKIP_GPG"), "gpg not found") class TestGPGDSA(unittest.TestCase): """ Test signature creation, verification and key export from the gpg module """ default_keyid = "C242A830DAAF1C2BEF604A9EF033A3A3E267B3B1" @classmethod def setUpClass(self): # Create directory to run the tests without having everything blow up self.working_dir = os.getcwd() self.test_dir = os.path.realpath(tempfile.mkdtemp()) self.gnupg_home = os.path.join(self.test_dir, "dsa") # Find keyrings keyrings = os.path.join( os.path.dirname(os.path.realpath(__file__)), "gpg_keyrings", "dsa") shutil.copytree(keyrings, self.gnupg_home) os.chdir(self.test_dir) @classmethod def tearDownClass(self): """Change back to initial working dir and remove temp test directory. 
""" os.chdir(self.working_dir) shutil.rmtree(self.test_dir) def test_gpg_export_pubkey(self): """ export a public key and make sure the parameters are the right ones: since there's very little we can do to check rsa key parameters are right we pre-exported the public key to an ssh key, which we can load with cryptography for the sake of comparison """ # export our gpg key, using our functions key_data = gpg_export_pubkey(self.default_keyid, homedir=self.gnupg_home) our_exported_key = dsa_create_pubkey(key_data) # load the equivalent ssh key, and make sure that we get the same RSA key # parameters ssh_key_basename = "{}.ssh".format(self.default_keyid) ssh_key_path = os.path.join(self.gnupg_home, ssh_key_basename) with open(ssh_key_path, "rb") as fp: keydata = fp.read() ssh_key = serialization.load_ssh_public_key(keydata, backends.default_backend()) self.assertEqual(ssh_key.public_numbers().y, our_exported_key.public_numbers().y) self.assertEqual(ssh_key.public_numbers().parameter_numbers.g, our_exported_key.public_numbers().parameter_numbers.g) self.assertEqual(ssh_key.public_numbers().parameter_numbers.q, our_exported_key.public_numbers().parameter_numbers.q) self.assertEqual(ssh_key.public_numbers().parameter_numbers.p, our_exported_key.public_numbers().parameter_numbers.p) def test_gpg_sign_and_verify_object_with_default_key(self): """Create a signature using the default key on the keyring """ test_data = b'test_data' wrong_data = b'something malicious' signature = gpg_sign_object(test_data, homedir=self.gnupg_home) key_data = gpg_export_pubkey(self.default_keyid, homedir=self.gnupg_home) self.assertTrue(gpg_verify_signature(signature, key_data, test_data)) self.assertFalse(gpg_verify_signature(signature, key_data, wrong_data)) def test_gpg_sign_and_verify_object(self): """Create a signature using a specific key on the keyring """ test_data = b'test_data' wrong_data = b'something malicious' signature = gpg_sign_object(test_data, keyid=self.default_keyid, homedir=self.gnupg_home) key_data = gpg_export_pubkey(self.default_keyid, homedir=self.gnupg_home) self.assertTrue(gpg_verify_signature(signature, key_data, test_data)) self.assertFalse(gpg_verify_signature(signature, key_data, wrong_data)) if __name__ == "__main__": unittest.main() in-toto-0.4.0/tests/test_in_toto_keygen.py0000644000076500000240000001264013535414523021002 0ustar lukpstaff00000000000000""" test_in_toto_keygen.py Sachit Malik Wed Jun 28, 2017 See LICENSE for licensing information. Test in_toto_keygen command line tool. """ import os import sys import unittest import shutil import tempfile if sys.version_info >= (3, 3): from unittest.mock import patch # pylint: disable=no-name-in-module,import-error else: from mock import patch # pylint: disable=import-error from in_toto.in_toto_keygen import main as in_toto_keygen_main from in_toto.util import (generate_and_write_rsa_keypair, prompt_generate_and_write_rsa_keypair, prompt_password, generate_and_write_ed25519_keypair, prompt_generate_and_write_ed25519_keypair, import_rsa_key_from_file) import securesystemslib WORKING_DIR = os.getcwd() class TestInTotoKeyGenTool(unittest.TestCase): """Test in_toto_keygen's main() - requires sys.argv patching; error logs/exits on Exception. 
""" @classmethod def setUpClass(self): # Create directory where the verification will take place self.working_dir = os.getcwd() self.test_dir = os.path.realpath(tempfile.mkdtemp()) os.chdir(self.test_dir) @classmethod def tearDownClass(self): """Change back to initial working dir and remove temp test directory. """ os.chdir(self.working_dir) shutil.rmtree(self.test_dir) def test_main_required_args(self): """Test in-toto-keygen CLI tool with required arguments. """ args = ["in_toto_keygen.py"] with patch.object(sys, 'argv', args + ["bob"]), \ self.assertRaises(SystemExit): in_toto_keygen_main() def test_main_optional_args(self): """Test CLI command keygen with optional arguments. """ args = ["in_toto_keygen.py"] password = "123456" with patch.object(sys, 'argv', args + ["-p", "bob"]), \ patch("getpass.getpass", return_value=password), self.assertRaises( SystemExit): in_toto_keygen_main() with patch.object(sys, 'argv', args + ["-p", "-t", "rsa", "bob"]), \ patch("getpass.getpass", return_value=password), self.assertRaises( SystemExit): in_toto_keygen_main() with patch.object(sys, 'argv', args + ["-t", "ed25519", "bob"]), \ self.assertRaises(SystemExit): in_toto_keygen_main() with patch.object(sys, 'argv', args + ["-p", "-t", "ed25519", "bob"]), \ patch("getpass.getpass", return_value=password), self.assertRaises( SystemExit): in_toto_keygen_main() with patch.object(sys, 'argv', args + ["-p", "-b", "3072", "bob"]), \ patch("getpass.getpass", return_value=password), self.assertRaises( SystemExit): in_toto_keygen_main() def test_main_wrong_args(self): """Test CLI command with missing arguments. """ wrong_args_list = [ ["in_toto_keygen.py"], ["in_toto_keygen.py", "-r"], ["in_toto_keygen.py", "-p", "-b", "1024", "bob"]] password = "123456" for wrong_args in wrong_args_list: with patch.object(sys, 'argv', wrong_args), patch("getpass.getpass", return_value=password), self.assertRaises(SystemExit): in_toto_keygen_main() def test_in_toto_keygen_generate_and_write_rsa_keypair(self): """in_toto_keygen_generate_and_write_rsa_keypair run through. """ bits = 3072 generate_and_write_rsa_keypair("bob", bits) def test_in_toto_keygen_prompt_generate_and_write_rsa_keypair(self): """in_toto_keygen_prompt_generate_and_write_rsa_keypair run through. """ name = "bob" password = "123456" bits = 3072 with patch("getpass.getpass", return_value=password): prompt_generate_and_write_rsa_keypair(name, bits) def test_in_toto_keygen_generate_and_write_ed25519_keypair(self): """in_toto_keygen_generate_and_write_ed25519_keypair run through. """ generate_and_write_ed25519_keypair("bob") def test_in_toto_keygen_prompt_generate_and_write_ed25519_keypair(self): """in_toto_keygen_prompt_generate_and_write_ed25519_keypair run through. """ name = "bob" password = "123456" with patch("getpass.getpass", return_value=password): prompt_generate_and_write_ed25519_keypair(name) def test_prompt_password(self): """Call password prompt. 
""" password = "123456" with patch("getpass.getpass", return_value=password): self.assertEqual(prompt_password(), password) def test_create_and_import_encrypted_rsa(self): """Create ecrypted RSA key and import private and public key separately.""" name = "key_encrypted" password = "123456" bits = 3072 generate_and_write_rsa_keypair(name, bits, password) private_key = import_rsa_key_from_file(name, password) public_key = import_rsa_key_from_file(name + ".pub") securesystemslib.formats.KEY_SCHEMA.check_match(private_key) self.assertTrue(private_key["keyval"].get("private")) self.assertTrue( securesystemslib.formats.PUBLIC_KEY_SCHEMA.matches(public_key)) def test_create_and_import_encrypted_rsa_nondefault_length(self): name = "key_encrypted_2" password = "123456" bits = 2048 generate_and_write_rsa_keypair(name, bits, password) private_key = import_rsa_key_from_file(name, password) public_key = import_rsa_key_from_file(name + ".pub") securesystemslib.formats.KEY_SCHEMA.check_match(private_key) self.assertTrue(private_key["keyval"].get("private")) self.assertTrue( securesystemslib.formats.PUBLIC_KEY_SCHEMA.matches(public_key)) if __name__ == '__main__': unittest.main() in-toto-0.4.0/tests/test_in_toto_mock.py0000644000076500000240000000513413535414523020451 0ustar lukpstaff00000000000000#!/usr/bin/env python """ test_in_toto_mock.py Shikher Verma June 12, 2017 See LICENSE for licensing information. Test in_toto_mock command line tool. """ import os import unittest import shutil import tempfile import logging from in_toto.in_toto_mock import main as in_toto_mock_main import tests.common # Required to cache and restore default log level logger = logging.getLogger("in_toto") class TestInTotoMockTool(tests.common.CliTestCase): """Test in_toto_mock's main() - requires sys.argv patching; and in_toto_mock- calls runlib and error logs/exits on Exception. """ cli_main_func = staticmethod(in_toto_mock_main) @classmethod def setUpClass(self): """Create and change into temporary directory, dummy artifact and base arguments. """ # Below tests override the base logger ('in_toto') log level to # `logging.INFO`. We cache the original log level before running the tests # to restore it afterwards. self._base_log_level = logger.level self.working_dir = os.getcwd() self.test_dir = tempfile.mkdtemp() os.chdir(self.test_dir) self.test_step = "test_step" self.test_link = self.test_step + ".link" self.test_artifact = "test_artifact" open(self.test_artifact, "w").close() @classmethod def tearDownClass(self): """Change back to initial working dir and remove temp test directory. """ os.chdir(self.working_dir) shutil.rmtree(self.test_dir) # Restore log level to what it was before running in-toto-mock logger.setLevel(self._base_log_level) def tearDown(self): try: os.remove(self.test_link) except OSError: pass def test_main_required_args(self): """Test CLI command with required arguments. """ args = ["--name", self.test_step, "--", "python", "--version"] self.assert_cli_sys_exit(args, 0) self.assertTrue(os.path.exists(self.test_link)) def test_main_wrong_args(self): """Test CLI command with missing arguments. """ wrong_args_list = [ [], ["--name", "test-step"], ["--", "echo", "blub"]] for wrong_args in wrong_args_list: self.assert_cli_sys_exit(wrong_args, 2) self.assertFalse(os.path.exists(self.test_link)) def test_main_bad_cmd(self): """Test CLI command with non-existing command. """ # TODO: Is it safe to assume this command does not exist, or should we # assert for it? 
args = ["-n", "bad-command", "--", "ggadsfljasdhlasdfljvzxc"] self.assert_cli_sys_exit(args, 1) if __name__ == "__main__": unittest.main() in-toto-0.4.0/tests/test_in_toto_record.py0000644000076500000240000001751113535414523021000 0ustar lukpstaff00000000000000#!/usr/bin/env python """ test_in_toto_record.py Lukas Puehringer Nov 28, 2016 See LICENSE for licensing information. Test in_toto_record command line tool. """ import os import sys import unittest import shutil import tempfile if sys.version_info >= (3, 3): import unittest.mock as mock # pylint: disable=no-name-in-module,import-error else: import mock # pylint: disable=import-error import in_toto.util from in_toto.models.link import UNFINISHED_FILENAME_FORMAT from in_toto.in_toto_record import main as in_toto_record_main import tests.common WORKING_DIR = os.getcwd() class TestInTotoRecordTool(tests.common.CliTestCase): """Test in_toto_record's main() - requires sys.argv patching; and in_toto_record_start/in_toto_record_stop - calls runlib and error logs/exits on Exception. """ cli_main_func = staticmethod(in_toto_record_main) @classmethod def setUpClass(self): """Create and change into temporary directory, generate key pair, dummy artifact and base arguments. """ self.test_dir = tempfile.mkdtemp() # Find gpg keyring gpg_keyring_path = os.path.join( os.path.dirname(os.path.realpath(__file__)), "gpg_keyrings", "rsa") os.chdir(self.test_dir) # Copy gpg keyring to temp dir self.gnupg_home = os.path.join(self.test_dir, "rsa") shutil.copytree(gpg_keyring_path, self.gnupg_home) self.gpg_keyid = "7b3abb26b97b655ab9296bd15b0bd02e1c768c43" self.rsa_key_path = "test_key_rsa" in_toto.util.generate_and_write_rsa_keypair(self.rsa_key_path) self.ed25519_key_path = "test_key_ed25519" in_toto.util.generate_and_write_ed25519_keypair(self.ed25519_key_path) self.test_artifact1 = "test_artifact1" self.test_artifact2 = "test_artifact2" open(self.test_artifact1, "w").close() open(self.test_artifact2, "w").close() @classmethod def tearDownClass(self): """Change back to initial working dir and remove temp test directory. """ os.chdir(WORKING_DIR) shutil.rmtree(self.test_dir) def test_start_stop(self): """Test CLI command record start/stop with various arguments. """ # Give wrong password whenever prompted. 
with mock.patch('in_toto.util.prompt_password', return_value='x'): # Start/stop recording using rsa key args = ["--step-name", "test1", "--key", self.rsa_key_path] self.assert_cli_sys_exit(["start"] + args, 0) self.assert_cli_sys_exit(["stop"] + args, 0) # Start/stop with recording one artifact using rsa key args = ["--step-name", "test2", "--key", self.rsa_key_path] self.assert_cli_sys_exit(["start"] + args + ["--materials", self.test_artifact1], 0) self.assert_cli_sys_exit(["stop"] + args + ["--products", self.test_artifact1], 0) # Start/stop with excluding one artifact using rsa key args = ["--step-name", "test2.5", "--key", self.rsa_key_path] self.assert_cli_sys_exit(["start"] + args + ["--materials", self.test_artifact1, "--exclude", "test*"], 0) self.assert_cli_sys_exit(["stop"] + args + ["--products", self.test_artifact1, "--exclude", "test*"], 0) # Start/stop with base-path using rsa key args = ["--step-name", "test2.6", "--key", self.rsa_key_path, "--base-path", self.test_dir] self.assert_cli_sys_exit(["start"] + args, 0) self.assert_cli_sys_exit(["stop"] + args, 0) # Start/stop with recording multiple artifacts using rsa key args = ["--step-name", "test3", "--key", self.rsa_key_path] self.assert_cli_sys_exit(["start"] + args + ["--materials", self.test_artifact1, self.test_artifact2], 0) self.assert_cli_sys_exit(["stop"] + args + ["--products", self.test_artifact2, self.test_artifact2], 0) # Start/stop recording using ed25519 key args = ["--step-name", "test4", "--key", self.ed25519_key_path, "--key-type", "ed25519"] self.assert_cli_sys_exit(["start"] + args, 0) self.assert_cli_sys_exit(["stop"] + args, 0) # Start/stop with recording one artifact using ed25519 key args = ["--step-name", "test5", "--key", self.ed25519_key_path, "--key-type", "ed25519"] self.assert_cli_sys_exit(["start"] + args + ["--materials", self.test_artifact1], 0) self.assert_cli_sys_exit(["stop"] + args + ["--products", self.test_artifact1], 0) # Start/stop with excluding one artifact using ed25519 key args = ["--step-name", "test5.5", "--key", self.ed25519_key_path, "--key-type", "ed25519"] self.assert_cli_sys_exit(["start"] + args + ["--materials", self.test_artifact1, "--exclude", "test*"], 0) self.assert_cli_sys_exit(["stop"] + args + ["--products", self.test_artifact1, "--exclude", "test*"], 0) # Start/stop with base-path using ed25519 key args = ["--step-name", "test5.6", "--key", self.ed25519_key_path, "--key-type", "ed25519", "--base-path", self.test_dir] self.assert_cli_sys_exit(["start"] + args, 0) self.assert_cli_sys_exit(["stop"] + args, 0) # Start/stop with recording multiple artifacts using ed25519 key args = ["--step-name", "test6", "--key", self.ed25519_key_path, "--key-type", "ed25519"] self.assert_cli_sys_exit(["start"] + args + ["--materials", self.test_artifact1, self.test_artifact2], 0) self.assert_cli_sys_exit(["stop"] + args + ["--products", self.test_artifact2, self.test_artifact2], 0) # Start/stop sign with specified gpg keyid args = ["--step-name", "test7", "--gpg", self.gpg_keyid, "--gpg-home", self.gnupg_home] self.assert_cli_sys_exit(["start"] + args, 0) self.assert_cli_sys_exit(["stop"] + args, 0) # Start/stop sign with default gpg keyid args = ["--step-name", "test8", "--gpg", "--gpg-home", self.gnupg_home] self.assert_cli_sys_exit(["start"] + args, 0) self.assert_cli_sys_exit(["stop"] + args, 0) def test_glob_no_unfinished_files(self): """Test record stop with missing unfinished files when globbing (gpg). 
""" args = ["--step-name", "test-no-glob", "--gpg", self.gpg_keyid, "--gpg-home", self.gnupg_home] self.assert_cli_sys_exit(["stop"] + args, 1) def test_glob_to_many_unfinished_files(self): """Test record stop with to many unfinished files when globbing (gpg). """ name = "test-to-many-glob" fn1 = UNFINISHED_FILENAME_FORMAT.format(step_name=name, keyid="a12345678") fn2 = UNFINISHED_FILENAME_FORMAT.format(step_name=name, keyid="b12345678") open(fn1, "w").close() open(fn2, "w").close() args = ["--step-name", name, "--gpg", self.gpg_keyid, "--gpg-home", self.gnupg_home] self.assert_cli_sys_exit(["stop"] + args, 1) def test_wrong_key(self): """Test CLI command record with wrong key exits 1 """ args = ["--step-name", "wrong-key", "--key", "non-existing-key"] self.assert_cli_sys_exit(["start"] + args, 1) self.assert_cli_sys_exit(["stop"] + args, 1) def test_no_key(self): """Test if no key is specified, argparse error exists with 2""" args = ["--step-name", "no-key"] self.assert_cli_sys_exit(["start"] + args, 2) self.assert_cli_sys_exit(["stop"] + args, 2) def test_missing_unfinished_link(self): """Error exit with missing unfinished link file. """ args = ["--step-name", "no-link", "--key", self.rsa_key_path] # Give wrong password whenever prompted. with mock.patch('in_toto.util.prompt_password', return_value='x'): self.assert_cli_sys_exit(["stop"] + args, 1) args = ["--step-name", "no-link", "--key", self.ed25519_key_path, "--key-type", "ed25519"] # Give wrong password whenever prompted. with mock.patch('in_toto.util.prompt_password', return_value='x'): self.assert_cli_sys_exit(["stop"] + args, 1) if __name__ == '__main__': unittest.main() in-toto-0.4.0/tests/test_in_toto_run.py0000755000076500000240000002101713535414523020325 0ustar lukpstaff00000000000000#!/usr/bin/env python """ test_in_toto_run.py Lukas Puehringer Nov 28, 2016 See LICENSE for licensing information. Test in_toto_run command line tool. """ import os import sys import unittest import shutil import glob import tempfile # Use external backport 'mock' on versions under 3.3 if sys.version_info >= (3, 3): import unittest.mock as mock # pylint: disable=no-name-in-module,import-error else: import mock # pylint: disable=import-error from in_toto.util import (generate_and_write_rsa_keypair, generate_and_write_ed25519_keypair, import_private_key_from_file, KEY_TYPE_RSA, KEY_TYPE_ED25519) from in_toto.models.metadata import Metablock from in_toto.in_toto_run import main as in_toto_run_main from in_toto.models.link import FILENAME_FORMAT import tests.common class TestInTotoRunTool(tests.common.CliTestCase): """Test in_toto_run's main() - requires sys.argv patching; and in_toto_run- calls runlib and error logs/exits on Exception. """ cli_main_func = staticmethod(in_toto_run_main) @classmethod def setUpClass(self): """Create and change into temporary directory, generate key pair, dummy artifact and base arguments. 
""" self.working_dir = os.getcwd() self.test_dir = tempfile.mkdtemp() # Copy gpg keyring self.default_gpg_keyid = "8465a1e2e0fb2b40adb2478e18fb3f537e0c8a17" self.default_gpg_subkeyid = "c5a0abe6ec19d0d65f85e2c39be9df5131d924e9" self.non_default_gpg_keyid = "8288ef560ed3795f9df2c0db56193089b285da58" gpg_keyring_path = os.path.join( os.path.dirname(os.path.realpath(__file__)), "gpg_keyrings", "rsa") self.gnupg_home = os.path.join(self.test_dir, "rsa") shutil.copytree(gpg_keyring_path, self.gnupg_home) os.chdir(self.test_dir) self.rsa_key_path = "test_key_rsa" generate_and_write_rsa_keypair(self.rsa_key_path) self.rsa_key = import_private_key_from_file(self.rsa_key_path, KEY_TYPE_RSA) self.ed25519_key_path = "test_key_ed25519" generate_and_write_ed25519_keypair(self.ed25519_key_path) self.ed25519_key = import_private_key_from_file(self.ed25519_key_path, KEY_TYPE_ED25519) self.test_step = "test_step" self.test_link_rsa = FILENAME_FORMAT.format(step_name=self.test_step, keyid=self.rsa_key["keyid"]) self.test_link_ed25519 = FILENAME_FORMAT.format(step_name=self.test_step, keyid=self.ed25519_key["keyid"]) self.test_artifact = "test_artifact" open(self.test_artifact, "w").close() @classmethod def tearDownClass(self): """Change back to initial working dir and remove temp test directory. """ os.chdir(self.working_dir) shutil.rmtree(self.test_dir) def tearDown(self): for link in glob.glob("*.link"): os.remove(link) def test_main_required_args(self): """Test CLI command with required arguments. """ args = ["--step-name", self.test_step, "--key", self.rsa_key_path, "--", "python", "--version"] # Give wrong password whenever prompted. with mock.patch('in_toto.util.prompt_password', return_value='x'): self.assert_cli_sys_exit(args, 0) self.assertTrue(os.path.exists(self.test_link_rsa)) def test_main_optional_args(self): """Test CLI command with optional arguments. """ named_args = ["--step-name", self.test_step, "--key", self.rsa_key_path, "--materials", self.test_artifact, "--products", self.test_artifact, "--record-streams"] positional_args = ["--", "python", "--version"] # Give wrong password whenever prompted. 
    with mock.patch('in_toto.util.prompt_password', return_value='x'):
      # Test and assert recorded artifacts
      args1 = named_args + positional_args
      self.assert_cli_sys_exit(args1, 0)
      link_metadata = Metablock.load(self.test_link_rsa)
      self.assertTrue(self.test_artifact in
          list(link_metadata.signed.materials.keys()))
      self.assertTrue(self.test_artifact in
          list(link_metadata.signed.products.keys()))

      # Test and assert excluded artifacts
      args2 = named_args + ["--exclude", "*test*"] + positional_args
      self.assert_cli_sys_exit(args2, 0)
      link_metadata = Metablock.load(self.test_link_rsa)
      self.assertFalse(link_metadata.signed.materials)
      self.assertFalse(link_metadata.signed.products)

      # Test with base path
      args3 = named_args + ["--base-path", self.test_dir] + positional_args
      self.assert_cli_sys_exit(args3, 0)
      link_metadata = Metablock.load(self.test_link_rsa)
      self.assertListEqual(list(link_metadata.signed.materials.keys()),
          [self.test_artifact])
      self.assertListEqual(list(link_metadata.signed.products.keys()),
          [self.test_artifact])

      # Test with bogus base path
      args4 = named_args + ["--base-path", "bogus/path"] + positional_args
      self.assert_cli_sys_exit(args4, 1)

      # Test with lstrip path
      args5 = named_args + ["--lstrip-paths", self.test_dir] + positional_args
      self.assert_cli_sys_exit(args5, 0)
      link_metadata = Metablock.load(self.test_link_rsa)
      self.assertListEqual(list(link_metadata.signed.materials.keys()),
          [self.test_artifact])
      self.assertListEqual(list(link_metadata.signed.products.keys()),
          [self.test_artifact])

  def test_main_with_unencrypted_ed25519_key(self):
    """Test CLI command with unencrypted ed25519 key. """
    args = ["-n", self.test_step, "--key", self.ed25519_key_path,
        "--key-type", "ed25519", "--", "ls"]
    self.assert_cli_sys_exit(args, 0)
    self.assertTrue(os.path.exists(self.test_link_ed25519))

  def test_main_with_encrypted_ed25519_key(self):
    """Test CLI command with encrypted ed25519 key. """
    key_path = "test_key_ed25519_enc"
    password = "123456"
    generate_and_write_ed25519_keypair(key_path, password)
    args = ["-n", self.test_step, "--key", key_path,
        "--key-type", "ed25519", "--", "ls"]

    with mock.patch('in_toto.util.prompt_password', return_value=password):
      key = import_private_key_from_file(key_path, KEY_TYPE_ED25519)
      linkpath = FILENAME_FORMAT.format(step_name=self.test_step,
          keyid=key["keyid"])

      self.assert_cli_sys_exit(args, 0)
      self.assertTrue(os.path.exists(linkpath))

  def test_main_with_specified_gpg_key(self):
    """Test CLI command with specified gpg key. """
    args = ["-n", self.test_step, "--gpg", self.non_default_gpg_keyid,
        "--gpg-home", self.gnupg_home, "--", "python", "--version"]

    self.assert_cli_sys_exit(args, 0)
    link_filename = FILENAME_FORMAT.format(step_name=self.test_step,
        keyid=self.non_default_gpg_keyid)

    self.assertTrue(os.path.exists(link_filename))

  def test_main_with_default_gpg_key(self):
    """Test CLI command with default gpg key. """
    args = ["-n", self.test_step, "--gpg",
        "--gpg-home", self.gnupg_home, "--", "python", "--version"]

    self.assert_cli_sys_exit(args, 0)

    link_filename = FILENAME_FORMAT.format(step_name=self.test_step,
        keyid=self.default_gpg_subkeyid)

    self.assertTrue(os.path.exists(link_filename))

  def test_main_no_command_arg(self):
    """Test CLI command with --no-command argument. """
    args = ["in_toto_run.py", "--step-name", self.test_step, "--key",
        self.rsa_key_path, "--no-command"]

    # Give wrong password whenever prompted.
with mock.patch('in_toto.util.prompt_password', return_value='x'): self.assert_cli_sys_exit(args, 0) self.assertTrue(os.path.exists(self.test_link_rsa)) def test_main_wrong_args(self): """Test CLI command with missing arguments. """ wrong_args_list = [ [], ["--step-name", "some"], ["--key", self.rsa_key_path], ["--", "echo", "blub"], ["--step-name", "test-step", "--key", self.rsa_key_path], ["--step-name", "--", "echo", "blub"], ["--key", self.rsa_key_path, "--", "echo", "blub"], ["--step-name", "test-step", "--key", self.rsa_key_path, "--"], ["--step-name", "test-step", "--key", self.rsa_key_path, "--gpg", "--", "echo", "blub"] ] for wrong_args in wrong_args_list: self.assert_cli_sys_exit(wrong_args, 2) self.assertFalse(os.path.exists(self.test_link_rsa)) def test_main_wrong_key_exits(self): """Test CLI command with wrong key argument, exits and logs error """ args = ["--step-name", self.test_step, "--key", "non-existing-key", "--", "echo", "test"] self.assert_cli_sys_exit(args, 1) self.assertFalse(os.path.exists(self.test_link_rsa)) if __name__ == "__main__": unittest.main() in-toto-0.4.0/tests/test_in_toto_sign.py0000644000076500000240000002054513535414523020463 0ustar lukpstaff00000000000000 """ test_in_toto_sign.py Sachit Malik Lukas Puehringer Wed Jun 21, 2017 See LICENSE for licensing information. Test in_toto_sign command line tool. """ import os import json import shutil import tempfile import unittest from in_toto.in_toto_sign import main as in_toto_sign_main import tests.common WORKING_DIR = os.getcwd() class TestInTotoSignTool(tests.common.CliTestCase): """Test in_toto_sign's main() - requires sys.argv patching; error logs/exits on Exception. """ cli_main_func = staticmethod(in_toto_sign_main) @classmethod def setUpClass(self): # Backup original cwd self.working_dir = os.getcwd() # Find demo files demo_files = os.path.join( os.path.dirname(os.path.realpath(__file__)), "demo_files") # Find gpg keyring gpg_keyring_path = os.path.join( os.path.dirname(os.path.realpath(__file__)), "gpg_keyrings", "rsa") # Create and change into temporary directory self.test_dir = os.path.realpath(tempfile.mkdtemp()) os.chdir(self.test_dir) # Copy demo files to temp dir for file_path in os.listdir(demo_files): shutil.copy(os.path.join(demo_files, file_path), self.test_dir) # Copy gpg keyring to temp dir self.gnupg_home = os.path.join(self.test_dir, "rsa") shutil.copytree(gpg_keyring_path, self.gnupg_home) self.default_gpg_keyid = "8465a1e2e0fb2b40adb2478e18fb3f537e0c8a17" self.gpg_keyid1 = "7b3abb26b97b655ab9296bd15b0bd02e1c768c43" self.gpg_keyid2 = "8288ef560ed3795f9df2c0db56193089b285da58" self.layout_path = "demo.layout.template" self.link_path = "package.2f89b927.link" self.alice_path = "alice" self.alice_pub_path = "alice.pub" self.bob_path = "bob" self.bob_pub_path = "bob.pub" self.carl_path = "carl" self.carl_pub_path = "carl.pub" self.danny_path = "danny" self.danny_pub_path = "danny.pub" @classmethod def tearDownClass(self): """Change back to initial working dir and remove temp dir. """ os.chdir(self.working_dir) shutil.rmtree(self.test_dir) def test_sign_and_verify(self): """Test signing and verifying Layout and Link metadata with different combinations of arguments. 
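    The argument lists below correspond to in-toto-sign CLI invocations;
    e.g. the first one is equivalent to running:

      in-toto-sign -f demo.layout.template -k alice bob -o tmp.layout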
""" # Sign Layout with multiple keys and write to "tmp.layout" self.assert_cli_sys_exit([ "-f", self.layout_path, "-k", self.alice_path, self.bob_path, "-o", "tmp.layout", ], 0) # Verify "tmp.layout" (requires all keys) self.assert_cli_sys_exit([ "-f", "tmp.layout", "-k", self.alice_pub_path, self.bob_pub_path, "--verify", ], 0) # Sign Layout "tmp.layout", appending new signature, write to "tmp.layout" self.assert_cli_sys_exit([ "-f", "tmp.layout", "-k", self.carl_path, "-a" ], 0) # Verify "tmp.layout" (has three signatures now) self.assert_cli_sys_exit([ "-f", "tmp.layout", "-k", self.alice_pub_path, self.bob_pub_path, self.carl_pub_path, "--verify" ], 0) # Sign Layout "tmp.layout" with ed25519 key, appending new signature, # write to "tmp.layout" self.assert_cli_sys_exit([ "-f", "tmp.layout", "-k", self.danny_path, "-t", "ed25519", "-a" ], 0) # Verify "tmp.layout" (has four signatures now) self.assert_cli_sys_exit([ "-f", "tmp.layout", "-k", self.alice_pub_path, self.bob_pub_path, self.carl_pub_path, self.danny_pub_path, "-t", "rsa", "rsa", "rsa", "ed25519", "--verify" ], 0) # Sign Link, replacing old signature # and write to same file as input self.assert_cli_sys_exit([ "-f", self.link_path, "-k", self.bob_path, "-o", self.link_path, ], 0) # Verify Link self.assert_cli_sys_exit([ "-f", self.link_path, "-k", self.bob_pub_path, "--verify" ], 0) # Replace signature to Link and store to new file using passed # key's (alice) id as infix self.assert_cli_sys_exit([ "-f", self.link_path, "-k", self.alice_path ], 0) # Verify Link with alice's keyid as infix self.assert_cli_sys_exit([ "-f", "package.556caebd.link", "-k", self.alice_pub_path, "--verify" ], 0) # Sign Layout with default gpg key self.assert_cli_sys_exit([ "-f", self.layout_path, "-g", "-o", "tmp_gpg.layout", "--gpg-home", self.gnupg_home ], 0) # Verify Layout signed with default gpg key self.assert_cli_sys_exit([ "-f", "tmp_gpg.layout", "-g", self.default_gpg_keyid, "--gpg-home", self.gnupg_home, "--verify" ], 0) # Sign Layout with two gpg keys self.assert_cli_sys_exit([ "-f", self.layout_path, "-g", self.gpg_keyid1, self.gpg_keyid2, "-o", "tmp_gpg.layout", "--gpg-home", self.gnupg_home ], 0) self.assert_cli_sys_exit([ "-f", "tmp_gpg.layout", "-g", self.gpg_keyid1, self.gpg_keyid2, "--gpg-home", self.gnupg_home, "--verify" ], 0) def test_fail_signing(self): """Fail signing with an invalid key. """ self.assert_cli_sys_exit([ "-f", self.layout_path, "-k", self.carl_path, self.link_path, ], 2) def test_fail_verification(self): """Fail signature verification. """ # Fail with wrong key (not used for signing) self.assert_cli_sys_exit([ "-f", self.layout_path, "-k", self.carl_pub_path, "--verify" ], 1) # Fail with wrong key (not a valid pub key) self.assert_cli_sys_exit([ "-f", self.layout_path, "-k", self.carl_path, "--verify" ], 2) # Fail with wrong gpg keyid (not used for signing) self.assert_cli_sys_exit([ "-f", self.layout_path, "-g", self.default_gpg_keyid, "--gpg-home", self.gnupg_home, "--verify" ], 1) # Fail with wrong gpg keyid (not a valid keyid) self.assert_cli_sys_exit([ "-f", self.layout_path, "-g", "bogus-gpg-keyid", "--gpg-home", self.gnupg_home, "--verify" ], 2) def test_bad_args(self): """Fail with wrong combination of arguments. 
""" # Conflicting "verify" and signing options (--verify -o) self.assert_cli_sys_exit([ "-f", self.layout_path, "-k", "key-not-used", "--verify", "-o", "file-not-written" ], 2) # Conflicting "verify" and signing options (--verify -oa) self.assert_cli_sys_exit([ "-f", self.layout_path, "-k", "key-not-used", "--verify", "-a", ], 2) # Wrong "append" option for Link metadata self.assert_cli_sys_exit([ "-f", self.link_path, "-k", "key-not-used", "-a" ], 2) # Wrong multiple keys for Link metadata self.assert_cli_sys_exit([ "-f", self.link_path, "-k", self.alice_path, self.bob_path, ], 2) # Wrong number of multiple key types self.assert_cli_sys_exit([ "-f", self.layout_path, "-k", self.alice_path, self.bob_path, "-t", "rsa", "-o", "tmp.layout", ], 2) # Wrong multiple gpg keys for Link metadata self.assert_cli_sys_exit([ "-f", self.link_path, "-g", self.gpg_keyid1, self.gpg_keyid2, ], 2) # Only one of gpg or regular key can be passed self.assert_cli_sys_exit([ "-f", self.layout_path, "-k", self.alice_path, "-g" ], 2) # At least one of gpg or regular key must be passed self.assert_cli_sys_exit([ "-f", self.layout_path, ], 2) # For verification if gpg option is passed there must be a key id argument self.assert_cli_sys_exit([ "-f", self.layout_path, "--verify", "-g" ], 2) def test_bad_metadata(self): """Fail with wrong metadata. """ # Not valid JSON self.assert_cli_sys_exit([ "-f", self.alice_pub_path, "-k", "key-not-used", ], 2) # Valid JSON but not valid Link or Layout with open("tmp.json", "wb") as f: f.write(json.dumps({}).encode("utf-8")) self.assert_cli_sys_exit([ "-f", "tmp.json", "-k", "key-not-used", ], 2) if __name__ == "__main__": unittest.main() in-toto-0.4.0/tests/test_in_toto_verify.py0000755000076500000240000002127613535414523021034 0ustar lukpstaff00000000000000#!/usr/bin/env python """ test_in_toto_verify.py Lukas Puehringer Jan 9, 2017 See LICENSE for licensing information. Test in_toto_verify command line tool. """ import os import unittest import shutil import tempfile from in_toto.models.metadata import Metablock from in_toto.in_toto_verify import main as in_toto_verify_main from in_toto.util import import_rsa_key_from_file from securesystemslib.interface import import_ed25519_privatekey_from_file import tests.common class TestInTotoVerifyTool(tests.common.CliTestCase): """ Tests - in_toto_verify's main() - requires sys.argv patching; - in_toto_verify - calls verifylib.in_toto_verify and error logs/exits in case of a raised Exception. Uses in-toto demo supply chain link metadata files and basic layout for verification: Copies the basic layout for different test scenarios: - signed layout - multiple signed layout (using two project owner keys) """ cli_main_func = staticmethod(in_toto_verify_main) @classmethod def setUpClass(self): """Creates and changes into temporary directory. Copies demo files to temp dir... - owner/functionary key pairs - *.link metadata files - layout template (not signed, no expiration date) - final product ...and dumps various layouts for different test scenarios """ # Backup original cwd self.working_dir = os.getcwd() # Find demo files demo_files = os.path.join( os.path.dirname(os.path.realpath(__file__)), "demo_files") # find where the scripts directory is located. 
scripts_directory = os.path.join( os.path.dirname(os.path.realpath(__file__)), "scripts") # Create and change into temporary directory self.test_dir = os.path.realpath(tempfile.mkdtemp()) os.chdir(self.test_dir) # Copy demo files to temp dir for fn in os.listdir(demo_files): shutil.copy(os.path.join(demo_files, fn), self.test_dir) shutil.copytree(scripts_directory, 'scripts') # Load layout template layout_template = Metablock.load("demo.layout.template") # Store layout paths to be used in tests self.layout_single_signed_path = "single-signed.layout" self.layout_double_signed_path = "double-signed.layout" # Import layout signing keys alice = import_rsa_key_from_file("alice") bob = import_rsa_key_from_file("bob") self.alice_path = "alice.pub" self.bob_path = "bob.pub" # dump a single signed layout layout_template.sign(alice) layout_template.dump(self.layout_single_signed_path) # dump a double signed layout layout_template.sign(bob) layout_template.dump(self.layout_double_signed_path) @classmethod def tearDownClass(self): """Change back to initial working dir and remove temp dir. """ os.chdir(self.working_dir) shutil.rmtree(self.test_dir) def test_main_required_args(self): """Test in-toto-verify CLI tool with required arguments. """ args = ["--layout", self.layout_single_signed_path, "--layout-keys", self.alice_path] self.assert_cli_sys_exit(args, 0) def test_main_wrong_args(self): """Test in-toto-verify CLI tool with wrong arguments. """ wrong_args_list = [ [], ["--layout", self.layout_single_signed_path], ["--key", self.alice_path]] for wrong_args in wrong_args_list: self.assert_cli_sys_exit(wrong_args, 2) def test_main_multiple_keys(self): """Test in-toto-verify CLI tool with multiple keys. """ args = ["--layout", self.layout_double_signed_path, "--layout-keys", self.alice_path, self.bob_path] self.assert_cli_sys_exit(args, 0) def test_main_failing_bad_layout_path(self): """Test in-toto-verify CLI tool with bad layout path. """ args = ["-l", "not-a-path-to-a-layout", "-k", self.alice_path] self.assert_cli_sys_exit(args, 1) def test_main_link_dir(self): """Test in-toto-verify CLI tool with explicit link dir. """ # Use current working directory explicitly to load links args = ["--layout", self.layout_single_signed_path, "--layout-keys", self.alice_path, "--link-dir", "."] self.assert_cli_sys_exit(args, 0) # Fail with an explicit link directory, where no links are found args = ["--layout", self.layout_single_signed_path, "--layout-keys", self.alice_path, "--link-dir", "bad-link-dir"] self.assert_cli_sys_exit(args, 1) class TestInTotoVerifyToolMixedKeys(tests.common.CliTestCase): """ Tests in-toto-verify like TestInTotoVerifyTool but with both rsa and ed25519 project owner and functionary keys. """ cli_main_func = staticmethod(in_toto_verify_main) @classmethod def setUpClass(self): """Creates and changes into temporary directory. Copies demo files to temp dir... 
- owner/functionary key pairs - *.link metadata files - layout template (not signed, no expiration date) - final product ...and dumps layout for test scenario """ # Backup original cwd self.working_dir = os.getcwd() # Find demo files demo_files = os.path.join( os.path.dirname(os.path.realpath(__file__)), "demo_files") scripts_directory = os.path.join( os.path.dirname(os.path.realpath(__file__)), "scripts") # Create and change into temporary directory self.test_dir = os.path.realpath(tempfile.mkdtemp()) os.chdir(self.test_dir) # Copy demo files to temp dir for fn in os.listdir(demo_files): shutil.copy(os.path.join(demo_files, fn), self.test_dir) shutil.copytree(scripts_directory, 'scripts') # Load layout template layout_template = Metablock.load("demo.layout.template") # Store layout paths to be used in tests self.layout_double_signed_path = "double-signed.layout" # Import layout signing keys alice = import_rsa_key_from_file("alice") danny = import_ed25519_privatekey_from_file("danny") self.alice_path = "alice.pub" self.danny_path = "danny.pub" # dump a double signed layout layout_template.sign(alice) layout_template.sign(danny) layout_template.dump(self.layout_double_signed_path) @classmethod def tearDownClass(self): """Change back to initial working dir and remove temp dir. """ os.chdir(self.working_dir) shutil.rmtree(self.test_dir) def test_main_multiple_keys(self): """Test in-toto-verify CLI tool with multiple keys. """ args = ["--layout", self.layout_double_signed_path, "--layout-keys", self.alice_path, self.danny_path, "--key-types", "rsa", "ed25519"] self.assert_cli_sys_exit(args, 0) @unittest.skipIf(os.getenv("TEST_SKIP_GPG"), "gpg not found") class TestInTotoVerifyToolGPG(tests.common.CliTestCase): """ Tests in-toto-verify like TestInTotoVerifyTool but with gpg project owner and functionary keys. """ cli_main_func = staticmethod(in_toto_verify_main) @classmethod def setUpClass(self): """Copy test gpg rsa keyring, gpg demo metadata files and demo final product to tmp test dir. """ self.working_dir = os.getcwd() self.test_dir = os.path.realpath(tempfile.mkdtemp()) # Copy gpg keyring gpg_keyring_path = os.path.join( os.path.dirname(os.path.realpath(__file__)), "gpg_keyrings", "rsa") self.gnupg_home = os.path.join(self.test_dir, "rsa") shutil.copytree(gpg_keyring_path, self.gnupg_home) self.owner_gpg_keyid = "8465a1e2e0fb2b40adb2478e18fb3f537e0c8a17" # Copy gpg demo metadata files demo_files = os.path.join( os.path.dirname(os.path.realpath(__file__)), "demo_files_gpg") # find where the scripts directory is located. scripts_directory = os.path.join( os.path.dirname(os.path.realpath(__file__)), "scripts") for fn in os.listdir(demo_files): shutil.copy(os.path.join(demo_files, fn), self.test_dir) # Change into test dir os.chdir(self.test_dir) shutil.copytree(scripts_directory, 'scripts') # Sign layout template with gpg key layout_template = Metablock.load("demo.layout.template") self.layout_path = "gpg_signed.layout" layout_template.sign_gpg(self.owner_gpg_keyid, self.gnupg_home) layout_template.dump(self.layout_path) @classmethod def tearDownClass(self): """Change back to initial working dir and remove temp dir. """ os.chdir(self.working_dir) shutil.rmtree(self.test_dir) def test_gpg_signed_layout_with_gpg_functionary_keys(self): """ Successfully test demo supply chain where the layout lists gpg keys as functionary keys and is signed with a gpg key. 
""" args = ["--layout", self.layout_path, "--gpg", self.owner_gpg_keyid, "--gpg-home", self.gnupg_home] self.assert_cli_sys_exit(args, 0) if __name__ == "__main__": unittest.main() in-toto-0.4.0/tests/test_log.py0000644000076500000240000000161413535414523016545 0ustar lukpstaff00000000000000""" test_log.py Lukas Puehringer Jan 30, 2018 See LICENSE for licensing information. Test in_toto/log.py """ import logging import unittest import in_toto.log class TestInTotoLogger(unittest.TestCase): def test_set_level_verbose_or_quiet(self): """Test set level convenience method. """ logger = in_toto.log.InTotoLogger("test-in-toto-logger") # Default level if verbose and quiet are false logger.setLevelVerboseOrQuiet(False, False) self.assertEqual(logger.level, logging.NOTSET) # INFO if verbose is true logger.setLevelVerboseOrQuiet(True, False) self.assertEqual(logger.level, logging.INFO) # CRITICAL if quiet is true logger.setLevelVerboseOrQuiet(False, True) self.assertEqual(logger.level, logger.QUIET) if __name__ == "__main__": unittest.main() in-toto-0.4.0/tests/test_param_substitution.py0000644000076500000240000001330213535414523021715 0ustar lukpstaff00000000000000#!/usr/bin/env python """ test_param_substitution.py Santiago Torres-Arias May 15, 2018 See LICENSE for licensing information. Test the parameter substitution functions within verifylib. These tests were placed in a separate module to ease refactoring in case the substitution layer is to be removed. """ import os import shutil import tempfile import unittest import in_toto.settings from in_toto.models.metadata import Metablock from in_toto.models.layout import Layout from in_toto.verifylib import in_toto_verify, substitute_parameters from in_toto.util import (import_rsa_key_from_file, import_public_keys_from_files_as_dict) import in_toto.exceptions class Test_SubstituteArtifacts(unittest.TestCase): """Test parameter substitution on artifact rules. 
""" def setUp(self): self.layout = Layout.read({ "_type": "layout", "inspect": [{ "name": "do-the-thing", "expected_materials": [ ["MATCH", "{SOURCE_THING}", "WITH", "MATERIALS", "FROM", "{SOURCE_STEP}"] ], "expected_products": [ ["CREATE", "{NEW_THING}"] ] }], "steps": [{ "name": "artifacts", "expected_command": [], "expected_materials": [ ["MATCH", "{SOURCE_THING}", "WITH", "MATERIALS", "FROM", "{SOURCE_STEP}"] ], "expected_products": [ ["CREATE", "{NEW_THING}"] ] }] }) def test_substitute(self): """Do a simple substitution on the expected_command field""" substitute_parameters(self.layout, {"SOURCE_THING": "vim", "SOURCE_STEP": "source_step", "NEW_THING": "new_thing"}) self.assertEqual(self.layout.steps[0].expected_materials[0][1], "vim") self.assertEqual(self.layout.steps[0].expected_materials[0][5], "source_step") self.assertEqual(self.layout.steps[0].expected_products[0][1], "new_thing") self.assertEqual(self.layout.inspect[0].expected_materials[0][1], "vim") self.assertEqual(self.layout.inspect[0].expected_materials[0][5], "source_step") self.assertEqual(self.layout.inspect[0].expected_products[0][1], "new_thing") def test_substitute_no_var(self): """Raise an error if the parameter is not filled-in""" with self.assertRaises(KeyError): substitute_parameters(self.layout, {}) class Test_SubstituteRunField(unittest.TestCase): """Test substitution on the run field of the layout""" def setUp(self): """ Create layout with dummy inspection """ # Create layout with one inspection self.layout = Layout.read({ "_type": "layout", "steps": [], "inspect": [{ "name": "run-command", "run": ["{COMMAND}"], }] }) def test_substitute(self): """Check that the substitution is performed on the run field.""" substitute_parameters(self.layout, {"COMMAND": "touch"}) self.assertEqual(self.layout.inspect[0].run[0], "touch") def test_inspection_fail_with_non_zero_retval(self): """Check that the substitution raises TypeError if the key is missing""" with self.assertRaises(KeyError): substitute_parameters(self.layout, {}) class Test_SubstituteExpectedCommand(unittest.TestCase): """Test verifylib.verify_command_alignment(command, expected_command)""" def setUp(self): # Create layout with one inspection self.layout = Layout.read({ "_type": "layout", "inspect": [], "steps": [{ "name": "run-command", "expected_command": ["{EDITOR}"], }]}) def test_substitute(self): """Do a simple substitution on the expected_command field""" substitute_parameters(self.layout, {"EDITOR": "vim"}) self.assertEqual(self.layout.steps[0].expected_command[0], "vim") def test_substitute_no_var(self): """Raise an error if the parameter is not filled-in""" with self.assertRaises(KeyError): substitute_parameters(self.layout, {"NOEDITOR": "vim"}) class Test_SubstituteOnVerify(unittest.TestCase): """Test verifylib.verify_command_alignment(command, expected_command)""" @classmethod def setUpClass(self): # Create layout with one inspection self.layout = Layout.read({ "_type": "layout", "inspect": [], "steps": [{ "name": "run-command", "expected_command": ["{EDITOR}"], }] }) # Backup original cwd self.working_dir = os.getcwd() # Find demo files demo_files = os.path.join( os.path.dirname(os.path.realpath(__file__)), "demo_files") # Create and change into temporary directory self.test_dir = os.path.realpath(tempfile.mkdtemp()) os.chdir(self.test_dir) # Copy demo files to temp dir for fn in os.listdir(demo_files): shutil.copy(os.path.join(demo_files, fn), self.test_dir) # load alice's key self.alice = import_rsa_key_from_file("alice") 
self.alice_pub_dict = import_public_keys_from_files_as_dict( ["alice.pub"]) @classmethod def tearDownClass(self): os.chdir(self.working_dir) shutil.rmtree(self.test_dir) def test_substitute(self): """Do a simple substitution on the expected_command field""" signed_layout = Metablock(signed=self.layout) signed_layout.sign(self.alice) # we will catch a LinkNotFound error because we don't have (and don't need) # the metadata. with self.assertRaises(in_toto.exceptions.LinkNotFoundError): in_toto_verify(signed_layout, self.alice_pub_dict, substitution_parameters={"EDITOR":"vim"}) self.assertEqual(self.layout.steps[0].expected_command[0], "vim") if __name__ == "__main__": unittest.main() in-toto-0.4.0/tests/test_process.py0000644000076500000240000000644513535414523017451 0ustar lukpstaff00000000000000#!/usr/bin/env python """ test_process.py Lukas Puehringer Oct 4, 2018 See LICENSE for licensing information. Test subprocess interface. """ import os import tempfile import unittest import shlex import io import sys import in_toto.process class Test_Process(unittest.TestCase): """Test subprocess interface. """ def test_run_input_vs_stdin(self): """Test that stdin kwarg is only used if input kwarg is not supplied. """ # Create a temporary file, passed as `stdin` argument fd, path = tempfile.mkstemp(text=True) os.write(fd, b"use stdin kwarg") os.close(fd) stdin_file = open(path) cmd = \ "python -c \"import sys; assert(sys.stdin.read() == '{}')\"" # input is used in favor of stdin in_toto.process.run(cmd.format("use input kwarg"), input=b"use input kwarg", stdin=stdin_file) # stdin is only used if input is not supplied in_toto.process.run(cmd.format("use stdin kwarg"), stdin=stdin_file) # Clean up stdin_file.close() os.remove(path) def test_run_duplicate_streams(self): """Test output as streams and as returned. """ # Command that prints 'foo' to stdout and 'bar' to stderr. cmd = ("python -c \"" "import sys;" "sys.stdout.write('foo');" "sys.stderr.write('bar');\"") # Create and open fake targets for standard streams stdout_fd, stdout_fn = tempfile.mkstemp() stderr_fd, stderr_fn = tempfile.mkstemp() with io.open(stdout_fn, "r") as fake_stdout_reader, \ os.fdopen(stdout_fd, "w") as fake_stdout_writer, \ io.open(stderr_fn, "r") as fake_stderr_reader, \ os.fdopen(stderr_fd, "w") as fake_stderr_writer: # Backup original standard streams and redirect to fake targets real_stdout = sys.stdout real_stderr = sys.stderr sys.stdout = fake_stdout_writer sys.stderr = fake_stderr_writer # Run command ret_code, ret_stdout, ret_stderr = \ in_toto.process.run_duplicate_streams(cmd) # Rewind fake standard streams fake_stdout_reader.seek(0) fake_stderr_reader.seek(0) # Assert that what was printed and what was returned is correct self.assertTrue(ret_stdout == fake_stdout_reader.read() == "foo") self.assertTrue(ret_stderr == fake_stderr_reader.read() == "bar") # Also assert the default return value self.assertEqual(ret_code, 0) # Reset original streams sys.stdout = real_stdout sys.stderr = real_stderr # Remove fake standard streams os.remove(stdout_fn) os.remove(stderr_fn) def test_run_duplicate_streams_arg_return_code(self): """Test command arg as string and list and return code. """ cmd_str = ("python -c \"" "import sys;" "sys.exit(100)\"") cmd_list = shlex.split(cmd_str) for cmd in [cmd_str, cmd_list]: return_code, _, _ = in_toto.process.run_duplicate_streams(cmd) self.assertEqual(return_code, 100) def test_run_duplicate_streams_timeout(self): """Test raise TimeoutExpired. 
""" with self.assertRaises(in_toto.process.subprocess.TimeoutExpired): in_toto.process.run_duplicate_streams("python --version", timeout=-1) if __name__ == "__main__": unittest.main() in-toto-0.4.0/tests/test_rulelib.py0000644000076500000240000002501713535414523017425 0ustar lukpstaff00000000000000#!/usr/bin/env python """ test_rulelib.py Santiago Torres-Arias Lukas Puehringer Nov 19, 2016 See LICENSE for licensing information. Test artifact rule packing and unpacking. """ import unittest from in_toto.rulelib import (unpack_rule, pack_rule, pack_rule_data, pack_create_rule, pack_delete_rule, pack_modify_rule, pack_allow_rule, pack_disallow_rule, pack_require_rule) import securesystemslib.exceptions class TestArtifactRuleUnpack(unittest.TestCase): """Test artifact rule unpacker/syntax checker. """ def test_unpack_rule_not_list(self): """Test rule syntax error, not a list. """ rule = "CREATE stuff" with self.assertRaises(securesystemslib.exceptions.FormatError): unpack_rule(rule) def test_unpack_rule_not_enough_keywords(self): """Test rule syntax error, too little arguments. """ rule = ["DELETE"] with self.assertRaises(securesystemslib.exceptions.FormatError): unpack_rule(rule) def test_unpack_rule_unknown_rule_type(self): """Test generic rule syntax error, too many arguments. """ rule = ["SUBVERT", "foo"] with self.assertRaises(securesystemslib.exceptions.FormatError): unpack_rule(rule) def test_unpack_rule_pattern_not_string(self): """Test rule syntax error, pattern not a string. """ with self.assertRaises(securesystemslib.exceptions.FormatError): rule = ["CREATE", {"abc"}] unpack_rule(rule) def test_unpack_generic_rule_too_long(self): """Test generic rule syntax error, too many arguments. """ rule = ["CREATE", "foo", "pleaze!!"] with self.assertRaises(securesystemslib.exceptions.FormatError): unpack_rule(rule) def test_unpack_and_pack_generic_rule(self): """Test generic rule proper packing and unpacking. 
""" rule = ["CREATE", "foo"] rule_data = unpack_rule(rule) self.assertEqual(len(list(rule_data.keys())), 2) self.assertEqual(rule_data["rule_type"], "create") self.assertEqual(rule_data["pattern"], "foo") self.assertEqual(rule, pack_rule_data(rule_data)) self.assertEqual(rule, pack_create_rule("foo")) rule = ["DELETE", "foo"] rule_data = unpack_rule(rule) self.assertEqual(len(list(rule_data.keys())), 2) self.assertEqual(rule_data["rule_type"], "delete") self.assertEqual(rule_data["pattern"], "foo") self.assertEqual(rule, pack_rule_data(rule_data)) self.assertEqual(rule, pack_delete_rule("foo")) rule = ["MODIFY", "foo"] rule_data = unpack_rule(rule) self.assertEqual(len(list(rule_data.keys())), 2) self.assertEqual(rule_data["rule_type"], "modify") self.assertEqual(rule_data["pattern"], "foo") self.assertEqual(rule, pack_rule_data(rule_data)) self.assertEqual(rule, pack_modify_rule("foo")) rule = ["ALLOW", "foo"] rule_data = unpack_rule(rule) self.assertEqual(len(list(rule_data.keys())), 2) self.assertEqual(rule_data["rule_type"], "allow") self.assertEqual(rule_data["pattern"], "foo") self.assertEqual(rule, pack_rule_data(rule_data)) self.assertEqual(rule, pack_allow_rule("foo")) rule = ["DISALLOW", "foo"] rule_data = unpack_rule(rule) self.assertEqual(len(list(rule_data.keys())), 2) self.assertEqual(rule_data["rule_type"], "disallow") self.assertEqual(rule_data["pattern"], "foo") self.assertEqual(rule, pack_rule_data(rule_data)) self.assertEqual(rule, pack_disallow_rule("foo")) rule = ["REQUIRE", "foo"] rule_data = unpack_rule(rule) self.assertEqual(len(list(rule_data.keys())), 2) self.assertEqual(rule_data["rule_type"], "require") self.assertEqual(rule_data["pattern"], "foo") self.assertEqual(rule, pack_rule_data(rule_data)) self.assertEqual(rule, pack_require_rule("foo")) def test_unpack_and_pack_match_rule(self): """Check match rule proper packing and unpacking. 
""" rule = ["MATCH", "foo", "IN", "source-path", "WITH", "PRODUCTS", "IN", "dest-path", "FROM", "step-name"] rule_data = unpack_rule(rule) self.assertEqual(len(list(rule_data.keys())), 6) self.assertEqual(rule_data["rule_type"], "match") self.assertEqual(rule_data["pattern"], "foo") self.assertEqual(rule_data["source_prefix"], "source-path") self.assertEqual(rule_data["dest_prefix"], "dest-path") self.assertEqual(rule_data["dest_type"], "products") self.assertEqual(rule_data["dest_name"], "step-name") self.assertEqual(rule, pack_rule_data(rule_data)) self.assertEqual(rule, pack_rule("MATCH", "foo", source_prefix="source-path", dest_type="PRODUCTS", dest_prefix="dest-path", dest_name="step-name")) rule = ["MATCH", "foo", "IN", "source-path", "WITH", "MATERIALS", "FROM", "step-name"] rule_data = unpack_rule(rule) self.assertEqual(len(list(rule_data.keys())), 6) self.assertEqual(rule_data["rule_type"], "match") self.assertEqual(rule_data["pattern"], "foo") self.assertEqual(rule_data["source_prefix"], "source-path") self.assertEqual(rule_data["dest_prefix"], "") self.assertEqual(rule_data["dest_type"], "materials") self.assertEqual(rule_data["dest_name"], "step-name") self.assertEqual(rule, pack_rule_data(rule_data)) self.assertEqual(rule, pack_rule("MATCH", "foo", source_prefix="source-path", dest_type="MATERIALS", dest_name="step-name")) rule = ["MATCH", "foo", "WITH", "PRODUCTS", "IN", "dest-path", "FROM", "step-name"] rule_data = unpack_rule(rule) self.assertEqual(len(list(rule_data.keys())), 6) self.assertEqual(rule_data["rule_type"], "match") self.assertEqual(rule_data["pattern"], "foo") self.assertEqual(rule_data["source_prefix"], "") self.assertEqual(rule_data["dest_prefix"], "dest-path") self.assertEqual(rule_data["dest_type"], "products") self.assertEqual(rule_data["dest_name"], "step-name") self.assertEqual(rule, pack_rule_data(rule_data)) self.assertEqual(rule, pack_rule("MATCH", "foo", dest_type="PRODUCTS", dest_prefix="dest-path", dest_name="step-name")) rule = ["MATCH", "foo", "WITH", "PRODUCTS", "FROM", "step-name"] rule_data = unpack_rule(rule) self.assertEqual(len(list(rule_data.keys())), 6) self.assertEqual(rule_data["rule_type"], "match") self.assertEqual(rule_data["pattern"], "foo") self.assertEqual(rule_data["source_prefix"], "") self.assertEqual(rule_data["dest_prefix"], "") self.assertEqual(rule_data["dest_type"], "products") self.assertEqual(rule_data["dest_name"], "step-name") self.assertEqual(rule, pack_rule_data(rule_data)) self.assertEqual(rule, pack_rule("MATCH", "foo", dest_type="PRODUCTS", dest_name="step-name")) def test_pack_rule_wrong_types(self): """Test argument validation for pack_rule. """ # pattern must be a string with self.assertRaises(securesystemslib.exceptions.FormatError): pack_rule("match", None) # rule_type must be a string... with self.assertRaises(securesystemslib.exceptions.FormatError): pack_rule(1, "foo") # ... and one of the allowed rule types with self.assertRaises(securesystemslib.exceptions.FormatError): pack_rule("not-a-rule-type", "foo") # For match rules a dest_type must be passed ... with self.assertRaises(securesystemslib.exceptions.FormatError): pack_rule("match", "foo", dest_name="step-name") # ... and be one of materials or products with self.assertRaises(securesystemslib.exceptions.FormatError): pack_rule("match", "foo", dest_type="not-a-dest-type", dest_name="step-name") # For match rules dest_name must be a string ... 
with self.assertRaises(securesystemslib.exceptions.FormatError): pack_rule("match", "foo", dest_type="materials", dest_name=1) # ... and non-empty with self.assertRaises(securesystemslib.exceptions.FormatError): pack_rule("match", "foo", dest_type="materials", dest_name="") # For match rules, if a source_prefix is passed it must be a string with self.assertRaises(securesystemslib.exceptions.FormatError): pack_rule("match", "foo", source_prefix=1, dest_type="products", dest_prefix="dest-path", dest_name="step-name") # For match rules, if a dest_prefix is passed it must be a string with self.assertRaises(securesystemslib.exceptions.FormatError): pack_rule("match", "foo", dest_type="products", dest_prefix=["not-a-string"], dest_name="step-name") def test_unpack_match_rule_wrong_length(self): """Check match rule syntax error, too few or many arguments. """ rule = ["MATCH", "foo", "WITH", "PRODUCTS", "FROM"] with self.assertRaises(securesystemslib.exceptions.FormatError): unpack_rule(rule) rule = ["MATCH", "foo", "WITH", "PRODUCTS", "FROM", "step-name", "really?"] with self.assertRaises(securesystemslib.exceptions.FormatError): unpack_rule(rule) rule = ["MATCH", "foo", "IN", "source-path", "WITH", "PRODUCTS", "IN", "dest-path", "FROM", "step-name", "YES, we can!"] with self.assertRaises(securesystemslib.exceptions.FormatError): unpack_rule(rule) def test_unpack_match_rule_wrong_types(self): """Check match rule syntax error, wrong data type in variable arguments. """ # pattern must be string rule = ["MATCH", ["abc"], "IN", "source-path", "WITH", "PRODUCTS", "IN", "dest-path", "FROM", "step-name"] with self.assertRaises(securesystemslib.exceptions.FormatError): unpack_rule(rule) # source path must be a string rule = ["MATCH", "foo", "IN", {"abc": "def"}, "WITH", "PRODUCTS", "IN", "dest-path", "FROM", "step-name"] with self.assertRaises(securesystemslib.exceptions.FormatError): unpack_rule(rule) # dest-path must be a string rule = ["MATCH", "foo", "IN", "source-path", "WITH", "PRODUCTS", "IN", 123, "FROM", "step-name"] with self.assertRaises(securesystemslib.exceptions.FormatError): unpack_rule(rule) # step-name must be a string rule = ["MATCH", "foo", "IN", "source-path", "WITH", "PRODUCTS", "IN", "dest-path", "FROM", ("456",)] with self.assertRaises(securesystemslib.exceptions.FormatError): unpack_rule(rule) def test_unpack_match_rule_wrong_destination_type(self): """Check match rule syntax error, wrong destination type. """ rule = ["MATCH", "foo", "IN", "source-path", "WITH", "PONIES", "IN", "dest-path", "FROM", "step-name"] with self.assertRaises(securesystemslib.exceptions.FormatError): unpack_rule(rule) if __name__ == "__main__": unittest.main() in-toto-0.4.0/tests/test_runlib.py0000755000076500000240000006601313535414523017266 0ustar lukpstaff00000000000000#!/usr/bin/env python #coding=utf-8 """ test_runlib.py Lukas Puehringer Dec 01, 2016 See LICENSE for licensing information. Test runlib functions. 
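  For orientation, the core function under test is used roughly as:
  in_toto_run(step_name, material_list, product_list, command_args), which
  runs the command, records the listed artifacts as materials and products,
  and returns link metadata that may optionally be signed and dumped.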
""" import os import unittest import shutil import tempfile import in_toto.settings import in_toto.exceptions from in_toto.models.metadata import Metablock from in_toto.exceptions import SignatureVerificationError from in_toto.runlib import (in_toto_run, in_toto_record_start, in_toto_record_stop, record_artifacts_as_dict, _apply_exclude_patterns, _hash_artifact) from in_toto.util import (generate_and_write_rsa_keypair, prompt_import_rsa_key_from_file) from in_toto.models.link import UNFINISHED_FILENAME_FORMAT, FILENAME_FORMAT import securesystemslib.formats import securesystemslib.exceptions class Test_ApplyExcludePatterns(unittest.TestCase): """Test _apply_exclude_patterns(names, exclude_patterns) """ def test_apply_exclude_explict(self): names = ["foo", "bar", "baz"] patterns = ["foo", "bar"] expected = ["baz"] result = _apply_exclude_patterns(names, patterns) self.assertListEqual(sorted(result), sorted(expected)) def test_apply_exclude_all(self): names = ["foo", "bar", "baz"] patterns = ["*"] expected = [] result = _apply_exclude_patterns(names, patterns) self.assertListEqual(sorted(result), sorted(expected)) def test_apply_exclude_multiple_star(self): names = ["foo", "bar", "baz"] patterns = ["*a*"] expected = ["foo"] result = _apply_exclude_patterns(names, patterns) self.assertListEqual(result, expected) def test_apply_exclude_question_mark(self): names = ["foo", "bazfoo", "barfoo"] patterns = ["ba?foo"] expected = ["foo"] result = _apply_exclude_patterns(names, patterns) self.assertListEqual(result, expected) def test_apply_exclude_seq(self): names = ["baxfoo", "bazfoo", "barfoo"] patterns = ["ba[xz]foo"] expected = ["barfoo"] result = _apply_exclude_patterns(names, patterns) self.assertListEqual(result, expected) def test_apply_exclude_neg_seq(self): names = ["baxfoo", "bazfoo", "barfoo"] patterns = ["ba[!r]foo"] expected = ["barfoo"] result = _apply_exclude_patterns(names, patterns) self.assertListEqual(result, expected) class TestRecordArtifactsAsDict(unittest.TestCase): """Test record_artifacts_as_dict(artifacts). """ @classmethod def setUpClass(self): """Create and change into temp test directory with dummy artifacts. |-- bar |-- foo `-- subdir |-- foosub1 |-- foosub2 `-- subsubdir `-- foosubsub """ self.working_dir = os.getcwd() # Backup and clear user set exclude patterns and base path self.artifact_exclude_orig = in_toto.settings.ARTIFACT_EXCLUDE_PATTERNS self.artifact_base_path_orig = in_toto.settings.ARTIFACT_BASE_PATH in_toto.settings.ARTIFACT_EXCLUDE_PATTERNS = [] in_toto.settings.ARTIFACT_BASE_PATH = None # mkdtemp uses $TMPDIR, which might contain a symlink # but we want the absolute location instead self.test_dir = os.path.realpath(tempfile.mkdtemp()) os.chdir(self.test_dir) # Create files on 3 levels os.mkdir("subdir") os.mkdir("subdir/subsubdir") self.full_file_path_list = ["foo", "bar", "subdir/foosub1", "subdir/foosub2", "subdir/subsubdir/foosubsub"] for path in self.full_file_path_list: with open(path, "w") as fp: fp.write(path) @classmethod def tearDownClass(self): """Change back to working dir, remove temp directory, restore settings. """ os.chdir(self.working_dir) shutil.rmtree(self.test_dir) in_toto.settings.ARTIFACT_EXCLUDE_PATTERNS = self.artifact_exclude_orig in_toto.settings.ARTIFACT_BASE_PATH = self.artifact_base_path_orig def tearDown(self): """Clear the ARTIFACT_EXLCUDES after every test. 
""" in_toto.settings.ARTIFACT_EXCLUDE_PATTERNS = [] in_toto.settings.ARTIFACT_BASE_PATH = None def test_bad_base_path_setting(self): """Raise exception with bogus base path settings. """ for base_path in ["path/does/not/exist", 12345, True]: in_toto.settings.ARTIFACT_BASE_PATH = base_path with self.assertRaises(ValueError): record_artifacts_as_dict(["."]) in_toto.settings.ARTIFACT_BASE_PATH = None with self.assertRaises(ValueError): record_artifacts_as_dict(["."], base_path=base_path) def test_base_path_is_child_dir(self): """Test path of recorded artifacts and cd back with child as base.""" base_path = "subdir" expected_artifacts = sorted(["foosub1", "foosub2", "subsubdir/foosubsub"]) in_toto.settings.ARTIFACT_BASE_PATH = base_path artifacts_dict = record_artifacts_as_dict(["."]) self.assertListEqual(sorted(list(artifacts_dict.keys())), expected_artifacts) in_toto.settings.ARTIFACT_BASE_PATH = None artifacts_dict = record_artifacts_as_dict(["."], base_path=base_path) self.assertListEqual(sorted(list(artifacts_dict.keys())), expected_artifacts) def test_base_path_is_parent_dir(self): """Test path of recorded artifacts and cd back with parent as base. """ base_path = ".." expected_artifacts = sorted(["foosub1", "foosub2", "subsubdir/foosubsub"]) os.chdir("subdir/subsubdir") in_toto.settings.ARTIFACT_BASE_PATH = base_path artifacts_dict = record_artifacts_as_dict(["."]) self.assertListEqual(sorted(list(artifacts_dict.keys())), expected_artifacts) in_toto.settings.ARTIFACT_BASE_PATH = None artifacts_dict = record_artifacts_as_dict(["."], base_path=base_path) self.assertListEqual(sorted(list(artifacts_dict.keys())), expected_artifacts) os.chdir(self.test_dir) def test_lstrip_paths_valid_prefix_directory(self): lstrip_paths = ["subdir/subsubdir/"] expected_artifacts = sorted(["bar", "foo", "subdir/foosub1", "subdir/foosub2", "foosubsub"]) artifacts_dict = record_artifacts_as_dict(["."], lstrip_paths=lstrip_paths) self.assertListEqual(sorted(list(artifacts_dict.keys())), expected_artifacts) def test_lstrip_paths_substring_prefix_directory(self): lstrip_paths = ["subdir/subsubdir/", "subdir/"] with self.assertRaises(in_toto.exceptions.PrefixError): record_artifacts_as_dict(["."], lstrip_paths=lstrip_paths) def test_lstrip_paths_non_unique_key(self): os.mkdir("subdir_new") path = "subdir_new/foosub1" shutil.copy("subdir/foosub1", path) lstrip_paths = ["subdir/", "subdir_new/"] with self.assertRaises(in_toto.exceptions.PrefixError): record_artifacts_as_dict(["."], lstrip_paths=lstrip_paths) os.remove(path) os.rmdir("subdir_new") def test_lstrip_paths_invalid_prefix_directory(self): lstrip_paths = ["not/a/directory/"] expected_artifacts = sorted(["bar", "foo", "subdir/foosub1", "subdir/foosub2", "subdir/subsubdir/foosubsub"]) artifacts_dict = record_artifacts_as_dict(["."], lstrip_paths=lstrip_paths) self.assertListEqual(sorted(list(artifacts_dict.keys())), expected_artifacts) def test_lstrip_paths_valid_prefix_file(self): lstrip_paths = ["subdir/subsubdir/"] expected_artifacts = sorted(["foosubsub"]) artifacts_dict = record_artifacts_as_dict(["./subdir/subsubdir/foosubsub"], lstrip_paths=lstrip_paths) self.assertListEqual(sorted(list(artifacts_dict.keys())), expected_artifacts) def test_lstrip_paths_non_unique_key_file(self): os.mkdir("subdir/subsubdir_new") path = "subdir/subsubdir_new/foosubsub" shutil.copy("subdir/subsubdir/foosubsub", path) lstrip_paths = ["subdir/subsubdir/", "subdir/subsubdir_new/"] with self.assertRaises(in_toto.exceptions.PrefixError): 
record_artifacts_as_dict(["subdir/subsubdir/foosubsub", "subdir/subsubdir_new/foosubsub"], lstrip_paths=lstrip_paths) os.remove(path) os.rmdir("subdir/subsubdir_new") def test_lstrip_paths_valid_unicode_prefix_file(self): # Try to create a file with unicode character try: os.mkdir("ಠ") path = "ಠ/foo" shutil.copy("foo", path) # Attempt to left strip the path now that the file has been created lstrip_paths = ["ಠ/"] expected_artifacts = sorted(["foo"]) artifacts_dict = record_artifacts_as_dict(["./ಠ/"], lstrip_paths=lstrip_paths) self.assertListEqual(sorted(list(artifacts_dict.keys())), expected_artifacts) os.remove(path) os.rmdir("ಠ") except OSError: # OS doesn't support unicode explicit files pass def test_empty_artifacts_list_record_nothing(self): """Empty list passed. Return empty dict. """ self.assertDictEqual(record_artifacts_as_dict([]), {}) def test_not_existing_artifacts_in_list_record_nothing(self): """List with not existing artifact passed. Return empty dict. """ self.assertDictEqual(record_artifacts_as_dict(["baz"]), {}) def test_record_dot_check_files_hash_dict_schema(self): """Traverse dir and subdirs. Record three files. """ artifacts_dict = record_artifacts_as_dict(["."]) for val in list(artifacts_dict.values()): securesystemslib.formats.HASHDICT_SCHEMA.check_match(val) self.assertListEqual(sorted(list(artifacts_dict.keys())), sorted(self.full_file_path_list)) @unittest.skipIf("symlink" not in os.__dict__, "symlink is not supported in this platform") def test_record_symlinked_files(self): """Symlinked files are always recorded. """ # Symlinked **files** are always recorded ... link_pairs = [ ("foo", "foo_link"), ("subdir/foosub1", "subdir/foosub2_link"), ("subdir/subsubdir/foosubsub", "subdir/subsubdir/foosubsub_link") ] # Create links for pair in link_pairs: # We only use the basename of the file (source) as it is on the same # level as the link (target) os.symlink(os.path.basename(pair[0]), pair[1]) # Record files and linked files # follow_symlink_dirs does not make a difference as it only concerns linked dirs for follow_symlink_dirs in [True, False]: artifacts_dict = record_artifacts_as_dict(["."], follow_symlink_dirs=follow_symlink_dirs) # Test that everything was recorded ... self.assertListEqual(sorted(list(artifacts_dict.keys())), sorted(self.full_file_path_list + [pair[1] for pair in link_pairs])) # ... and the hashes of each link/file pair match for pair in link_pairs: self.assertDictEqual(artifacts_dict[pair[0]], artifacts_dict[pair[1]]) for pair in link_pairs: os.unlink(pair[1]) @unittest.skipIf("symlink" not in os.__dict__, "symlink is not supported in this platform") def test_record_without_dead_symlinks(self): """Dead symlinks are never recorded. """ # Dead symlinks are never recorded ... links = [ "foo_link", "subdir/foosub2_link", "subdir/subsubdir/foosubsub_link" ] # Create dead links for link in links: os.symlink("does/not/exist", link) # Record files without dead links # follow_symlink_dirs does not make a difference as it only concerns linked dirs for follow_symlink_dirs in [True, False]: artifacts_dict = record_artifacts_as_dict(["."], follow_symlink_dirs=follow_symlink_dirs) # Test only the files were recorded ... self.assertListEqual(sorted(list(artifacts_dict.keys())), sorted(self.full_file_path_list)) for link in links: os.unlink(link) @unittest.skipIf("symlink" not in os.__dict__, "symlink is not supported in this platform") def test_record_follow_symlinked_directories(self): """Record files in symlinked dirs if follow_symlink_dirs is True. 
""" # Link to subdir os.symlink("subdir", "subdir_link") link_pairs = [ ("subdir/foosub1", "subdir_link/foosub1"), ("subdir/foosub2", "subdir_link/foosub2"), ("subdir/subsubdir/foosubsub", "subdir_link/subsubdir/foosubsub"), ] # Record with follow_symlink_dirs TRUE artifacts_dict = record_artifacts_as_dict(["."], follow_symlink_dirs=True) # Test that all files were recorded including files in linked subdir ... self.assertListEqual(sorted(list(artifacts_dict.keys())), sorted(self.full_file_path_list + [pair[1] for pair in link_pairs])) # ... and the hashes of each link/file pair match for pair in link_pairs: self.assertDictEqual(artifacts_dict[pair[0]], artifacts_dict[pair[1]]) # Record with follow_symlink_dirs FALSE (default) artifacts_dict = record_artifacts_as_dict(["."]) self.assertListEqual(sorted(list(artifacts_dict.keys())), sorted(self.full_file_path_list)) os.unlink("subdir_link") def test_record_files_and_subdirs(self): """Explicitly record files and subdirs. """ artifacts_dict = record_artifacts_as_dict(["foo", "subdir"]) for val in list(artifacts_dict.values()): securesystemslib.formats.HASHDICT_SCHEMA.check_match(val) self.assertListEqual(sorted(list(artifacts_dict.keys())), sorted(["foo", "subdir/foosub1", "subdir/foosub2", "subdir/subsubdir/foosubsub"])) def test_exclude_patterns(self): """Test excluding artifacts using passed pattern or setting. """ excludes_and_results = [ # Exclude files containing 'foo' everywhere (["*foo*"], ["bar"]), # Exclude subdirectory and all its contents (["subdir"], ["bar", "foo"]), # Exclude files 'subdir/foosub1' and 'subdir/foosub2' (["*foosub?"], ["bar", "foo", "subdir/subsubdir/foosubsub"]), # Exclude subsubdirectory and its contents (["*subsubdir"], ["foo", "bar", "subdir/foosub1", "subdir/foosub2"]) ] for exclude_patterns, expected_results in excludes_and_results: # Exclude via setting in_toto.settings.ARTIFACT_EXCLUDE_PATTERNS = exclude_patterns artifacts1 = record_artifacts_as_dict(["."]) # Exclude via argument in_toto.settings.ARTIFACT_EXCLUDE_PATTERNS = None artifacts2 = record_artifacts_as_dict(["."], exclude_patterns=exclude_patterns) self.assertTrue(sorted(list(artifacts1)) == sorted(list(artifacts2)) == sorted(expected_results)) def test_bad_artifact_exclude_patterns_setting(self): """Raise exception with bogus artifact exclude patterns settings. """ for setting in ["not a list of settings", 12345, True]: in_toto.settings.ARTIFACT_EXCLUDE_PATTERNS = setting with self.assertRaises(securesystemslib.exceptions.FormatError): record_artifacts_as_dict(["."]) def test_hash_artifact_passing_algorithm(self): """Test _hash_artifact passing hash algorithm. """ os.chdir(self.test_dir) self.assertTrue("sha256" in list(_hash_artifact("foo", ["sha256"]).keys())) class TestInTotoRun(unittest.TestCase): """" Tests runlib.in_toto_run() with different arguments Calls in_toto_run library funtion inside of a temporary directory that contains a test artifact and a test keypair If the function does not fail it will dump a test step link metadata file to the temp dir which is removed after every test. """ @classmethod def setUpClass(self): """Create and change into temporary directory, generate key pair and dummy material, read key pair. 
""" self.working_dir = os.getcwd() self.test_dir = tempfile.mkdtemp() os.chdir(self.test_dir) self.step_name = "test_step" self.key_path = "test_key" generate_and_write_rsa_keypair(self.key_path) self.key = prompt_import_rsa_key_from_file(self.key_path) self.key_pub = prompt_import_rsa_key_from_file(self.key_path + ".pub") self.test_artifact = "test_artifact" open(self.test_artifact, "w").close() @classmethod def tearDownClass(self): """Change back to initial working dir and remove temp test directory. """ os.chdir(self.working_dir) shutil.rmtree(self.test_dir) def tearDown(self): """Remove link file if it was created. """ try: os.remove(FILENAME_FORMAT.format(step_name=self.step_name, keyid=self.key["keyid"])) except OSError: pass def test_in_toto_run_verify_signature(self): """Successfully run, verify signed metadata. """ link = in_toto_run(self.step_name, None, None, ["python", "--version"], True, self.key) link.verify_signature(self.key) def test_in_toto_run_no_signature(self): """Successfully run, verify empty signature field. """ link = in_toto_run(self.step_name, None, None, ["python", "--version"]) self.assertFalse(len(link.signatures)) def test_in_toto_run_with_byproduct(self): """Successfully run, verify recorded byproduct. """ link = in_toto_run(self.step_name, None, None, ["python", "--version"], record_streams=True) # this or clause may seem weird, but given that python 2 prints its version # to stderr while python3 prints it to stdout we check on both (or add a # more verbose if clause) stderr_contents = link.signed.byproducts.get("stderr") stdout_contents = link.signed.byproducts.get("stdout") self.assertTrue("Python" in stderr_contents or "Python" in stdout_contents, msg="\nSTDERR:\n{}\nSTDOUT:\n{}".format(stderr_contents, stdout_contents)) def test_in_toto_run_without_byproduct(self): """Successfully run, verify byproduct is not recorded. """ link = in_toto_run(self.step_name, None, None, ["python", "--version"], record_streams=False) self.assertFalse(len(link.signed.byproducts.get("stdout"))) def test_in_toto_run_compare_dumped_with_returned_link(self): """Successfully run, compare dumped link is equal to returned link. """ link = in_toto_run(self.step_name, [self.test_artifact], [self.test_artifact], ["python", "--version"], True, self.key) link_dump = Metablock.load( FILENAME_FORMAT.format(step_name=self.step_name, keyid=self.key["keyid"])) self.assertEqual(repr(link), repr(link_dump)) def test_in_toto_run_verify_recorded_artifacts(self): """Successfully run, verify properly recorded artifacts. """ link = in_toto_run(self.step_name, [self.test_artifact], [self.test_artifact], ["python", "--version"]) self.assertEqual(list(link.signed.materials.keys()), list(link.signed.products.keys()), [self.test_artifact]) def test_in_toto_run_verify_workdir(self): """Successfully run, verify cwd. """ link = in_toto_run(self.step_name, [], [], ["python", "--version"], record_environment=True) self.assertEqual(link.signed.environment["workdir"], os.getcwd().replace("\\", "/")) def test_normalize_line_endings(self): """Test cross-platform line ending normalization. 
""" paths = [] try: # Create three artifacts with same content but different line endings for line_ending in [b"\n", b"\r", b"\r\n"]: fd, path = tempfile.mkstemp() paths.append(path) os.write(fd, b"hello" + line_ending + b"toto") os.close(fd) # Call in_toto_run and record artifacts as materials and products # with line ending normalization on link = in_toto_run(self.step_name, paths, paths, ["python", "--version"], normalize_line_endings=True).signed # Check that all three hashes in materials and products are equal for artifact_dict in [link.materials, link.products]: hash_dicts = list(artifact_dict.values()) self.assertTrue(hash_dicts[1:] == hash_dicts[:-1]) # Clean up finally: for path in paths: os.remove(path) def test_in_toto_bad_signing_key_format(self): """Fail run, passed key is not properly formatted. """ with self.assertRaises(securesystemslib.exceptions.FormatError): in_toto_run(self.step_name, None, None, ["python", "--version"], True, "this-is-not-a-key") def test_in_toto_wrong_key(self): """Fail run, passed key is a public key. """ with self.assertRaises(securesystemslib.exceptions.FormatError): in_toto_run(self.step_name, None, None, ["python", "--version"], True, self.key_pub) class TestInTotoRecordStart(unittest.TestCase): """"Test in_toto_record_start(step_name, key, material_list). """ @classmethod def setUpClass(self): """Create and change into temporary directory, generate key pair and dummy material, read key pair. """ self.working_dir = os.getcwd() self.test_dir = tempfile.mkdtemp() os.chdir(self.test_dir) self.key_path = "test_key" generate_and_write_rsa_keypair(self.key_path) self.key = prompt_import_rsa_key_from_file(self.key_path) self.step_name = "test_step" self.link_name_unfinished = UNFINISHED_FILENAME_FORMAT.format(step_name=self.step_name, keyid=self.key["keyid"]) self.test_material = "test_material" open(self.test_material, "w").close() @classmethod def tearDownClass(self): """Change back to initial working dir and remove temp test directory. """ os.chdir(self.working_dir) shutil.rmtree(self.test_dir) def test_UNFINISHED_FILENAME_FORMAT(self): """Test if the unfinished filname format. """ self.assertTrue(self.link_name_unfinished == ".{}.{:.8}.link-unfinished".format(self.step_name, self.key["keyid"])) def test_create_unfinished_metadata_with_expected_material(self): """Test record start creates metadata with expected material. """ in_toto_record_start( self.step_name, [self.test_material], self.key) link = Metablock.load(self.link_name_unfinished) self.assertEqual(list(link.signed.materials.keys()), [self.test_material]) os.remove(self.link_name_unfinished) def test_create_unfininished_metadata_verify_signature(self): """Test record start creates metadata with expected signature. """ in_toto_record_start( self.step_name, [self.test_material], self.key) link = Metablock.load(self.link_name_unfinished) link.verify_signature(self.key) os.remove(self.link_name_unfinished) def test_no_key_arguments(self): """Test record start without passing one required key argument. """ with self.assertRaises(ValueError): in_toto_record_start( self.step_name, [], signing_key=None, gpg_keyid=None, gpg_use_default=False) class TestInTotoRecordStop(unittest.TestCase): """"Test in_toto_record_stop(step_name, key, product_list). """ @classmethod def setUpClass(self): """Create and change into temporary directory, generate two key pairs and dummy product. 
""" self.working_dir = os.getcwd() self.test_dir = tempfile.mkdtemp() os.chdir(self.test_dir) self.key_path = "test-key" self.key_path2 = "test-key2" generate_and_write_rsa_keypair(self.key_path) generate_and_write_rsa_keypair(self.key_path2) self.key = prompt_import_rsa_key_from_file(self.key_path) self.key2 = prompt_import_rsa_key_from_file(self.key_path2) self.step_name = "test-step" self.link_name = "{}.{:.8}.link".format(self.step_name, self.key["keyid"]) self.link_name_unfinished = UNFINISHED_FILENAME_FORMAT.format( step_name=self.step_name, keyid=self.key["keyid"]) self.test_product = "test_product" open(self.test_product, "w").close() @classmethod def tearDownClass(self): """Change back to initial working dir and remove temp test directory. """ os.chdir(self.working_dir) shutil.rmtree(self.test_dir) def test_create_metadata_with_expected_product(self): """Test record stop records expected product. """ in_toto_record_start(self.step_name, [], self.key) in_toto_record_stop(self.step_name, [self.test_product], self.key) link = Metablock.load(self.link_name) self.assertEqual(list(link.signed.products.keys()), [self.test_product]) os.remove(self.link_name) def test_create_metadata_with_expected_cwd(self): """Test record start/stop run, verify cwd. """ in_toto_record_start(self.step_name, [], self.key, record_environment=True) in_toto_record_stop(self.step_name, [self.test_product], self.key) link = Metablock.load(self.link_name) self.assertEqual(link.signed.environment["workdir"], os.getcwd().replace('\\', '/')) os.remove(self.link_name) def test_create_metadata_verify_signature(self): """Test record start creates metadata with expected signature. """ in_toto_record_start(self.step_name, [], self.key) in_toto_record_stop(self.step_name, [], self.key) link = Metablock.load(self.link_name) link.verify_signature(self.key) os.remove(self.link_name) def test_replace_unfinished_metadata(self): """Test record stop removes unfinished file and creates link file. """ in_toto_record_start(self.step_name, [], self.key) in_toto_record_stop(self.step_name, [], self.key) with self.assertRaises(IOError): open(self.link_name_unfinished, "r") self.assertTrue(os.path.isfile(self.link_name)) os.remove(self.link_name) def test_missing_unfinished_file(self): """Test record stop exits on missing unfinished file, no link recorded. """ with self.assertRaises(IOError): in_toto_record_stop(self.step_name, [], self.key) with self.assertRaises(IOError): open(self.link_name, "r") def test_wrong_signature_in_unfinished_metadata(self): """Test record stop exits on wrong signature, no link recorded. """ in_toto_record_start(self.step_name, [], self.key) link_name = UNFINISHED_FILENAME_FORMAT.format( step_name=self.step_name, keyid=self.key["keyid"]) changed_link_name = UNFINISHED_FILENAME_FORMAT.format( step_name=self.step_name, keyid=self.key2["keyid"]) os.rename(link_name, changed_link_name) with self.assertRaises(SignatureVerificationError): in_toto_record_stop(self.step_name, [], self.key2) with self.assertRaises(IOError): open(self.link_name, "r") os.rename(changed_link_name, link_name) os.remove(self.link_name_unfinished) def test_no_key_arguments(self): """Test record stop without passing one required key argument. """ with self.assertRaises(ValueError): in_toto_record_stop( self.step_name, [], signing_key=None, gpg_keyid=None, gpg_use_default=False) def test_normalize_line_endings(self): """Test cross-platform line ending normalization. 
""" paths = [] try: # Create three artifacts with same content but different line endings for line_ending in [b"\n", b"\r", b"\r\n"]: fd, path = tempfile.mkstemp() paths.append(path) os.write(fd, b"hello" + line_ending + b"toto") os.close(fd) # Call in_toto_record start and stop and record artifacts as # materials and products with line ending normalization on in_toto_record_start(self.step_name, paths, self.key, normalize_line_endings=True) in_toto_record_stop(self.step_name, paths, self.key, normalize_line_endings=True) link = Metablock.load(self.link_name).signed # Check that all three hashes in materials and products are equal for artifact_dict in [link.materials, link.products]: hash_dicts = list(artifact_dict.values()) self.assertTrue(hash_dicts[1:] == hash_dicts[:-1]) # Clean up finally: for path in paths: os.remove(path) if __name__ == "__main__": unittest.main() in-toto-0.4.0/tests/test_settings.py0000644000076500000240000000075213535414523017626 0ustar lukpstaff00000000000000""" test_settings.py Lukas Puehringer Jan 30, 2018 See LICENSE for licensing information. Test in_toto/settings.py """ import unittest import in_toto.settings class TestSettings(unittest.TestCase): def test_debug_not_true(self): """in_toto.settings.DEBUG should not be commited with True. """ self.assertFalse(in_toto.settings.DEBUG) if __name__ == "__main__": unittest.main() in-toto-0.4.0/tests/test_user_settings.py0000644000076500000240000000710213535414523020660 0ustar lukpstaff00000000000000""" test_user_settings.py Lukas Puehringer Oct 26, 2017 See LICENSE for licensing information. Test in_toto/user_settings.py """ import six import os import unittest import in_toto.settings import in_toto.user_settings class TestUserSettings(unittest.TestCase): @classmethod def setUpClass(self): self.working_dir = os.getcwd() # Backup settings to restore them in `tearDownClass` self.settings_backup = {} for key in dir(in_toto.settings): self.settings_backup[key] = getattr(in_toto.settings, key) # We use `rc_test` as test dir because it has an `.in_totorc`, which # is loaded (from CWD) in `user_settings.set_settings` related tests self.test_dir = os.path.join(os.path.dirname(__file__), "rc_test") os.chdir(self.test_dir) os.environ["IN_TOTO_ARTIFACT_EXCLUDE_PATTERNS"] = "e:n:v" os.environ["IN_TOTO_ARTIFACT_BASE_PATH"] = "e/n/v" os.environ["IN_TOTO_NOT_WHITELISTED"] = "parsed" os.environ["NOT_PARSED"] = "ignored" @classmethod def tearDownClass(self): os.chdir(self.working_dir) # Other unittests might depend on defaults: # Restore monkey patched settings ... for key, val in six.iteritems(self.settings_backup): setattr(in_toto.settings, key, val) # ... and delete test environment variables del os.environ["IN_TOTO_ARTIFACT_EXCLUDE_PATTERNS"] del os.environ["IN_TOTO_ARTIFACT_BASE_PATH"] del os.environ["IN_TOTO_NOT_WHITELISTED"] del os.environ["NOT_PARSED"] def test_get_rc(self): """ Test rcfile parsing in CWD. """ rc_dict = in_toto.user_settings.get_rc() # Parsed (and split) and used by `set_settings` to monkeypatch settings self.assertListEqual(rc_dict["ARTIFACT_EXCLUDE_PATTERNS"], ["r", "c", "file"]) # Parsed but ignored in `set_settings` (not in case sensitive whitelist) self.assertEqual(rc_dict["artifact_base_path"], "r/c/file") self.assertEqual(rc_dict["new_rc_setting"], "new rc setting") def test_get_env(self): """ Test environment variables parsing, prefix and colon splitting. 
""" env_dict = in_toto.user_settings.get_env() # Parsed and used by `set_settings` to monkeypatch settings self.assertEqual(env_dict["ARTIFACT_BASE_PATH"], "e/n/v") # Parsed (and split) but overriden by rcfile setting in `set_settings` self.assertListEqual(env_dict["ARTIFACT_EXCLUDE_PATTERNS"], ["e", "n", "v"]) # Parsed but ignored in `set_settings` (not in case sensitive whitelist) self.assertEqual(env_dict["NOT_WHITELISTED"], "parsed") # Not parsed because of missing prefix self.assertFalse("NOT_PARSED" in env_dict) def test_set_settings(self): """ Test precedence of rc over env and whitelisting. """ in_toto.user_settings.set_settings() # From envvar IN_TOTO_ARTIFACT_BASE_PATH self.assertEqual(in_toto.settings.ARTIFACT_BASE_PATH, "e/n/v") # From RCfile setting (has precedence over envvar setting) self.assertListEqual(in_toto.settings.ARTIFACT_EXCLUDE_PATTERNS, ["r", "c", "file"]) # Not whitelisted rcfile settings are ignored by `set_settings` self.assertTrue("new_rc_setting" in in_toto.user_settings.get_rc()) self.assertRaises(AttributeError, getattr, in_toto.settings, "NEW_RC_SETTING") # Not whitelisted envvars are ignored by `set_settings` self.assertTrue("NOT_WHITELISTED" in in_toto.user_settings.get_env()) self.assertRaises(AttributeError, getattr, in_toto.settings, "NOT_WHITELISTED") if __name__ == "__main__": unittest.main() in-toto-0.4.0/tests/test_util.py0000755000076500000240000002346513535414523016754 0ustar lukpstaff00000000000000#!/usr/bin/env python """ test_verifylib.py Lukas Puehringer Jan 17, 2017 See LICENSE for licensing information. Test util functions. """ import os import sys import shutil import tempfile import unittest if sys.version_info >= (3, 3): from unittest.mock import patch # pylint: disable=no-name-in-module,import-error else: from mock import patch # pylint: disable=import-error import in_toto.formats from in_toto.util import ( KEY_TYPE_ED25519, generate_and_write_rsa_keypair, generate_and_write_ed25519_keypair, import_rsa_key_from_file, import_public_keys_from_files_as_dict, prompt_password, prompt_generate_and_write_rsa_keypair, import_private_key_from_file, prompt_import_rsa_key_from_file, import_gpg_public_keys_from_keyring_as_dict) from in_toto.exceptions import UnsupportedKeyTypeError import securesystemslib.formats import securesystemslib.exceptions from securesystemslib.interface import (import_ed25519_privatekey_from_file, import_ed25519_publickey_from_file) class TestUtil(unittest.TestCase): """Test various util functions. Mostly related to RSA key creation or loading.""" @classmethod def setUpClass(self): # Create directory where the verification will take place self.working_dir = os.getcwd() self.test_dir = os.path.realpath(tempfile.mkdtemp()) # Copy gpg keyring gpg_keyring_path = os.path.join( os.path.dirname(os.path.realpath(__file__)), "gpg_keyrings", "rsa") self.gnupg_home = os.path.join(self.test_dir, "rsa") shutil.copytree(gpg_keyring_path, self.gnupg_home) os.chdir(self.test_dir) @classmethod def tearDownClass(self): """Change back to initial working dir and remove temp test directory. """ os.chdir(self.working_dir) shutil.rmtree(self.test_dir) def test_unrecognized_key_type(self): """Trigger UnsupportedKeyTypeError. """ with self.assertRaises(UnsupportedKeyTypeError): import_private_key_from_file("ignored_key_path", "wrong_key_type") def test_create_and_import_rsa(self): """Create RS key and import private and public key separately. 
""" name = "key1" generate_and_write_rsa_keypair(name) private_key = import_rsa_key_from_file(name) public_key = import_rsa_key_from_file(name + ".pub") securesystemslib.formats.KEY_SCHEMA.check_match(private_key) self.assertTrue(private_key["keyval"].get("private")) self.assertTrue( securesystemslib.formats.PUBLIC_KEY_SCHEMA.matches(public_key)) def test_create_and_import_encrypted_rsa(self): """Create ecrypted RSA key and import private and public key separately. """ name = "key2" password = "123456" bits = 3072 generate_and_write_rsa_keypair(name, bits, password) private_key = import_rsa_key_from_file(name, password) public_key = import_rsa_key_from_file(name + ".pub") securesystemslib.formats.KEY_SCHEMA.check_match(private_key) self.assertTrue(private_key["keyval"].get("private")) self.assertTrue( securesystemslib.formats.PUBLIC_KEY_SCHEMA.matches(public_key)) def test_create_and_import_encrypted_rsa_no_password(self): """Try import encrypted RSA key without or wrong pw, raises exception. """ name = "key3" password = "123456" bits = 3072 generate_and_write_rsa_keypair(name, bits, password) with self.assertRaises(securesystemslib.exceptions.CryptoError): import_rsa_key_from_file(name) with self.assertRaises(securesystemslib.exceptions.CryptoError): import_rsa_key_from_file(name, "wrong-password") def test_import_non_existing_rsa(self): """Try import non-existing RSA key, raises exception. """ with self.assertRaises(IOError): import_rsa_key_from_file("key-does-not-exist") def test_import_rsa_wrong_format(self): """Try import wrongly formatted RSA key, raises exception. """ not_an_rsa = "not_an_rsa" with open(not_an_rsa, "w") as f: f.write(not_an_rsa) with self.assertRaises(securesystemslib.exceptions.FormatError): import_rsa_key_from_file(not_an_rsa) def test_import_rsa_public_keys_from_files_as_dict(self): """Create and import multiple rsa public keys and return KEYDICT. """ name1 = "key4" name2 = "key5" generate_and_write_rsa_keypair(name1) generate_and_write_rsa_keypair(name2) # Succefully import public keys as keydictionary key_dict = import_public_keys_from_files_as_dict([name1 + ".pub", name2 + ".pub"]) securesystemslib.formats.KEYDICT_SCHEMA.check_match(key_dict) # Import wrongly formatted key raises an exception not_an_rsa = "not_an_rsa" with open(not_an_rsa, "w") as f: f.write(not_an_rsa) with self.assertRaises(securesystemslib.exceptions.FormatError): import_public_keys_from_files_as_dict([name1 + ".pub", not_an_rsa]) # Import private key raises an exception with self.assertRaises(securesystemslib.exceptions.FormatError): import_public_keys_from_files_as_dict([name1, name2]) def test_create_and_import_ed25519(self): """Create ed25519 key and import private and public key separately. """ name = "key6" generate_and_write_ed25519_keypair(name) private_key = import_ed25519_privatekey_from_file(name) public_key = import_ed25519_publickey_from_file(name + ".pub") securesystemslib.formats.KEY_SCHEMA.check_match(private_key) self.assertTrue(private_key["keyval"].get("private")) self.assertTrue( securesystemslib.formats.PUBLIC_KEY_SCHEMA.matches(public_key)) def test_create_and_import_encrypted_ed25519(self): """Create encrypted ed25519 key and import private and public key separately. 
""" name = "key7" password = "123456" generate_and_write_ed25519_keypair(name, password) private_key = import_ed25519_privatekey_from_file(name, password) public_key = import_ed25519_publickey_from_file(name + ".pub") securesystemslib.formats.KEY_SCHEMA.check_match(private_key) self.assertTrue(private_key["keyval"].get("private")) self.assertTrue( securesystemslib.formats.PUBLIC_KEY_SCHEMA.matches(public_key)) def test_create_and_import_encrypted_ed25519_no_password(self): """Try import encrypted ed25519 key without or wrong pw, raises exception. """ name = "key8" password = "123456" generate_and_write_ed25519_keypair(name, password) with self.assertRaises(securesystemslib.exceptions.CryptoError): import_ed25519_privatekey_from_file(name) with self.assertRaises(securesystemslib.exceptions.CryptoError): import_ed25519_privatekey_from_file(name, "wrong-password") def test_import_ed25519_public_keys_from_files_as_dict(self): """Create and import multiple Ed25519 public keys and return KEYDICT. """ name1 = "key4" name2 = "key5" generate_and_write_ed25519_keypair(name1, password=name1) generate_and_write_ed25519_keypair(name2, password=name2) # Succesfully import public keys as keydictionary key_dict = import_public_keys_from_files_as_dict([name1 + ".pub", name2 + ".pub"], [KEY_TYPE_ED25519] * 2) securesystemslib.formats.KEYDICT_SCHEMA.check_match(key_dict) # Import with wrong number of key types raises an exception with self.assertRaises(securesystemslib.exceptions.Error): import_public_keys_from_files_as_dict([name1 + ".pub", name2 + ".pub"], [KEY_TYPE_ED25519]) # Import wrongly formatted key raises an exception not_an_ed25519 = "not_an_ed25519" with open(not_an_ed25519, "w") as f: f.write(not_an_ed25519) with self.assertRaises(securesystemslib.exceptions.Error): import_public_keys_from_files_as_dict([name1 + ".pub", not_an_ed25519], [KEY_TYPE_ED25519] * 2) # Import private key raises an exception with self.assertRaises(securesystemslib.exceptions.Error): import_public_keys_from_files_as_dict([name1, name2], [KEY_TYPE_ED25519] * 2) def test_import_gpg_public_keys_from_keyring_as_dict(self): """Import gpg public keys from keyring and return KEYDICT. """ keyids = [ "8465a1e2e0fb2b40adb2478e18fb3f537e0c8a17", "7b3abb26b97b655ab9296bd15b0bd02e1c768c43", "8288ef560ed3795f9df2c0db56193089b285da58" ] # Succefully import public keys from keychain as keydictionary key_dict = import_gpg_public_keys_from_keyring_as_dict(keyids, gpg_home=self.gnupg_home) in_toto.formats.ANY_PUBKEY_DICT_SCHEMA.check_match(key_dict) self.assertListEqual(sorted(keyids), sorted(key_dict.keys())) # Try to import key with invalid keyid with self.assertRaises(ValueError): key_dict = import_gpg_public_keys_from_keyring_as_dict(["bogus-key"], gpg_home=self.gnupg_home) # Try to import key that does not exist with self.assertRaises(in_toto.gpg.exceptions.KeyNotFoundError): key_dict = import_gpg_public_keys_from_keyring_as_dict(["aaaa"], gpg_home=self.gnupg_home) def test_prompt_password(self): """Call password prompt. """ password = "123456" with patch("getpass.getpass", return_value=password): self.assertEqual(prompt_password(), password) def test_prompt_create_and_import_encrypted_rsa(self): """Create and import password encrypted RSA using prompt input. 
""" key = "key6" password = "123456" bits = 3072 with patch("getpass.getpass", return_value=password): prompt_generate_and_write_rsa_keypair(key, bits) rsa_key = prompt_import_rsa_key_from_file(key) securesystemslib.formats.KEY_SCHEMA.check_match(rsa_key) self.assertTrue(rsa_key["keyval"].get("private")) with patch("getpass.getpass", return_value="wrong-password"), self.assertRaises( securesystemslib.exceptions.CryptoError): prompt_import_rsa_key_from_file(key) if __name__ == "__main__": unittest.main() in-toto-0.4.0/tests/test_verifylib.py0000644000076500000240000016154413535414523017770 0ustar lukpstaff00000000000000#!/usr/bin/env python """ test_verifylib.py Lukas Puehringer Nov 07, 2016 See LICENSE for licensing information. Test verifylib functions. """ import os import sys import shutil import copy import tempfile import unittest import glob import shlex if sys.version_info >= (3, 3): from unittest.mock import patch # pylint: disable=no-name-in-module,import-error else: from mock import patch # pylint: disable=import-error from datetime import datetime from dateutil.relativedelta import relativedelta import in_toto.settings from in_toto.models.metadata import Metablock from in_toto.models.link import Link, FILENAME_FORMAT from in_toto.models.layout import (Step, Inspection, Layout, SUBLAYOUT_LINK_DIR_FORMAT) from in_toto.verifylib import (verify_delete_rule, verify_create_rule, verify_modify_rule, verify_allow_rule, verify_disallow_rule, verify_require_rule, verify_match_rule, verify_item_rules, verify_all_item_rules, verify_command_alignment, run_all_inspections, in_toto_verify, verify_sublayouts, get_summary_link, _raise_on_bad_retval, load_links_for_layout, verify_link_signature_thresholds, verify_threshold_constraints) from in_toto.exceptions import (RuleVerificationError, SignatureVerificationError, LayoutExpiredError, BadReturnValueError, ThresholdVerificationError) from in_toto.util import import_rsa_key_from_file, import_public_keys_from_files_as_dict from in_toto.rulelib import unpack_rule import in_toto.gpg.functions import securesystemslib.exceptions import in_toto.exceptions class Test_RaiseOnBadRetval(unittest.TestCase): """Tests internal function that raises an exception if the passed "return_value" is not and integer and not zero. """ def test_zero_return_value(self): """Don't raise exception on zero return value. """ _raise_on_bad_retval(0) _raise_on_bad_retval(0, "command") def test_non_int_return_value(self): """Raise exception on non-int return value. """ with self.assertRaises(BadReturnValueError): _raise_on_bad_retval("bad retval") with self.assertRaises(BadReturnValueError): _raise_on_bad_retval("bad retval", "bad command") def test_non_zero_return_value(self): """Raise exception on non-zero return value. """ with self.assertRaises(BadReturnValueError): _raise_on_bad_retval(1) with self.assertRaises(BadReturnValueError): _raise_on_bad_retval(-1, "bad command") class TestRunAllInspections(unittest.TestCase): """Test verifylib.run_all_inspections(layout)""" @classmethod def setUpClass(self): """ Create layout with dummy inpsection. Create and change into temp test directory with dummy artifact.""" # find where the scripts directory is located. 


class TestRunAllInspections(unittest.TestCase):
  """Test verifylib.run_all_inspections(layout). """

  @classmethod
  def setUpClass(self):
    """Create layout with dummy inspection. Create and change into temp test
    directory with dummy artifact. """

    # Find where the scripts directory is located
    scripts_directory = os.path.join(
        os.path.dirname(os.path.realpath(__file__)), "scripts")

    # Create layout with one inspection
    self.layout = Layout.read({
        "_type": "layout",
        "steps": [],
        "inspect": [{
          "name": "touch-bar",
          "run": ["python", os.path.join(scripts_directory, "touch"), "bar"],
        }]
      })

    # Create directory where the verification will take place
    self.working_dir = os.getcwd()
    self.test_dir = os.path.realpath(tempfile.mkdtemp())
    os.chdir(self.test_dir)
    with open("foo", "w") as f:
      f.write("foo")

  @classmethod
  def tearDownClass(self):
    """Change back to initial working dir and remove temp test directory. """
    os.chdir(self.working_dir)
    shutil.rmtree(self.test_dir)

  def test_inspection_artifacts_with_base_path_ignored(self):
    """Create new dummy test dir and set it as base path; artifacts there
    must be ignored. """
    ignore_dir = os.path.realpath(tempfile.mkdtemp())
    ignore_foo = os.path.join(ignore_dir, "ignore_foo")
    with open(ignore_foo, "w") as f:
      f.write("ignore foo")
    in_toto.settings.ARTIFACT_BASE_PATH = ignore_dir

    run_all_inspections(self.layout)
    link = Metablock.load("touch-bar.link")
    self.assertListEqual(list(link.signed.materials.keys()), ["foo"])
    self.assertListEqual(sorted(list(link.signed.products.keys())),
        sorted(["foo", "bar"]))

    in_toto.settings.ARTIFACT_BASE_PATH = None
    shutil.rmtree(ignore_dir)

  def test_inspection_fail_with_non_zero_retval(self):
    """Test fail run inspections with non-zero return value. """
    layout = Layout.read({
        "_type": "layout",
        "steps": [],
        "inspect": [{
          "name": "non-zero-inspection",
          "run": ["python", "./scripts/expr", "1", "/", "0"],
        }]
      })
    with self.assertRaises(BadReturnValueError):
      run_all_inspections(layout)


class TestVerifyCommandAlignment(unittest.TestCase):
  """Test verifylib.verify_command_alignment(command, expected_command). """

  def setUp(self):
    self.command = ["vi", "file1", "file2"]

  def test_commands_align(self):
    """Cmd and expected cmd are equal, passes. """
    expected_command = ["vi", "file1", "file2"]
    verify_command_alignment(self.command, expected_command)

  def test_commands_do_not_fully_align_log_warning(self):
    """Cmd and expected cmd differ slightly. """
    expected_command = ["/usr/bin/vi", "file1", "file2"]
    with patch("in_toto.verifylib.LOG") as mock_logging:
      verify_command_alignment(self.command, expected_command)
      mock_logging.warning.assert_called_with("Run command '{0}'"
          " differs from expected command '{1}'"
          .format(self.command, expected_command))

  def test_commands_do_not_align_at_all_log_warning(self):
    """Cmd and expected cmd differ completely. """
    expected_command = ["make install"]
    with patch("in_toto.verifylib.LOG") as mock_logging:
      verify_command_alignment(self.command, expected_command)
      mock_logging.warning.assert_called_with("Run command '{0}'"
          " differs from expected command '{1}'"
          .format(self.command, expected_command))


class TestVerifyRule(unittest.TestCase):
  """Table-driven tests for the individual rule verification functions. """

  def test_verify_delete_rule(self):
    """Test verifylib.verify_delete_rule.
""" test_data_keys = [ "rule pattern", "artifact queue", "materials", "products", "expected"] test_cases = [ # Consume deleted artifact ["foo", {"foo"}, {"foo"}, set(), {"foo"}], # Consume multiple deleted artifacts with wildcard ["*", {"foo", "bar"}, {"foo", "bar"}, set(), {"foo", "bar"}], # Don't consume created artifact (in products only) ["foo", {"foo"}, set(), {"foo"}, set()], # Don't consume artifact that's not in materials or products # NOTE: In real life this shouldn't be in the queue either ["foo", {"foo"}, set(), set(), set()], # Don't consume deleted but not queued artifact ["foo", set(), {"foo"}, set(), set()], # Don't consume deleted but not matched artifact ["bar", {"foo"}, {"foo"}, set(), set()] ] for i, test_data in enumerate(test_cases): pattern, queue, materials, products, expected = test_data result = verify_delete_rule(pattern, queue, materials, products) self.assertSetEqual(result, expected, "test {}: {}".format(i, dict(zip(test_data_keys, test_data)))) def test_verify_create_rule(self): """Test verifylib.verify_create_rule. """ test_data_keys = [ "rule pattern", "artifact queue", "materials", "products", "expected"] test_cases = [ # Consume created artifact ["foo", {"foo"}, set(), {"foo"}, {"foo"}], # Consume multiple created artifacts with wildcard ["*", {"foo", "bar"}, set(), {"foo", "bar"}, {"foo", "bar"}], # Don't consume deleted artifact (in materials only) ["foo", {"foo"}, {"foo"}, set(), set()], # Don't consume artifact that's not in materials or products # NOTE: In real life this shouldn't be in the queue either ["foo", {"foo"}, set(), set(), set()], # Don't consume created but not queued artifact ["foo", set(), set(), {"foo"}, set()], # Don't consume created but not matched artifact ["bar", {"foo"}, set(), {"foo"}, set()] ] for i, test_data in enumerate(test_cases): pattern, queue, materials, products, expected = test_data result = verify_create_rule(pattern, queue, materials, products) self.assertSetEqual(result, expected, "test {}: {}".format(i, dict(zip(test_data_keys, test_data)))) def test_verify_modify_rule(self): """Test verifylib.verify_modify_rule. 
""" sha_a = "d65165279105ca6773180500688df4bdc69a2c7b771752f0a46ef120b7fd8ec3" sha_b = "155c693a6b7481f48626ebfc545f05236df679f0099225d6d0bc472e6dd21155" test_data_keys = [ "rule pattern", "artifact queue", "materials", "products", "expected"] test_cases = [ # Consume modified artifact ["foo", {"foo"}, {"foo": {"sha256": sha_a}}, {"foo": {"sha256": sha_b}}, {"foo"}], # Consume multiple modified artifacts with wildcard ["*", {"foo", "bar"}, {"foo": {"sha256": sha_a}, "bar": {"sha256": sha_a}}, {"foo": {"sha256": sha_b}, "bar": {"sha256": sha_b}}, {"foo", "bar"}], # Don't consume unmodified artifact ["foo", {"foo"}, {"foo": {"sha256": sha_a}}, {"foo": {"sha256": sha_a}}, set()], # Don't consume artifact that's not in materials or products # NOTE: In real life this shouldn't be in the queue either ["foo", {"foo"}, {}, {}, set()], # Don't consume modified but not queued artifact ["foo", set(), {"foo": {"sha256": sha_a}}, {"foo": {"sha256": sha_b}}, set()], # Don't consume modified but not matched artifact ["bar", {"foo"}, {"foo": {"sha256": sha_a}}, {"foo": {"sha256": sha_b}}, set()], ] for i, test_data in enumerate(test_cases): pattern, queue, materials, products, expected = test_data result = verify_modify_rule(pattern, queue, materials, products) self.assertSetEqual(result, expected, "test {}: {}".format(i, dict(zip(test_data_keys, test_data)))) def test_verify_allow_rule(self): """Test verifylib.verify_allow_rule. """ test_data_keys = ["rule pattern", "artifact queue", "expected"] test_cases = [ # Consume allowed artifact ["foo", {"foo"}, {"foo"}], # Consume multiple allowed artifacts with wildcard ["*", {"foo", "bar"}, {"foo", "bar"}], # Consume multiple allowed artifacts with wildcard 2 ["foo*", {"foo", "foobar", "bar"}, {"foo", "foobar"}], # Don't consume unmatched artifacts ["bar", {"foo"}, set()], # Don't consume artifacts if nothing is in the queue ["foo", set(), set()], ] for i, test_data in enumerate(test_cases): pattern, queue, expected = test_data result = verify_allow_rule(pattern, queue) self.assertSetEqual(result, expected, "test {}: {}".format(i, dict(zip(test_data_keys, test_data)))) def test_verify_disallow_rule(self): """Test verifylib.verify_disallow_rule. """ test_data_keys = ["rule pattern", "artifact queue"] test_cases = [ # Foo disallowed, raise ["foo", {"foo"}, True], # All disallowed, raise ["*", {"foo", "bar"}, True], # Foo disallowed, but only bar there, don't raise ["foo", {"bar"}, False], # All disallowed, but no artifacts, don't raise ["*", {}, False] ] for i, test_data in enumerate(test_cases): pattern, queue, should_raise = test_data msg = "test {}: {}".format(i, dict(zip(test_data_keys, test_data))) exception = None try: verify_disallow_rule(pattern, queue) except RuleVerificationError as e: exception = e if should_raise and not exception: self.fail("Expected 'RuleVerificationError'\n{}".format(msg)) if exception and not should_raise: self.fail("Unexpected {}\n{}".format(exception, msg)) def test_verify_require_rule(self): """Test verifylib.verify_require_rule. 
""" test_data_keys = ["rule pattern", "artifact queue"] test_cases = [ # Foo required, pass ["foo", {"foo"}, False], # Foo is required, but only bar there, blow up ["foo", {"bar"}, True], # A pattern is passed, which should be interpreted *literally* ["*", {"*"}, False], ["*", {"foo"}, True] # ] for i, test_data in enumerate(test_cases): pattern, queue, should_raise = test_data msg = "test {}: {}".format(i, dict(zip(test_data_keys, test_data))) exception = None try: verify_require_rule(pattern, queue) except RuleVerificationError as e: exception = e if should_raise and not exception: self.fail("Expected 'RuleVerificationError'\n{}".format(msg)) if exception and not should_raise: self.fail("Unexpected {}\n{}".format(exception, msg)) class TestVerifyMatchRule(unittest.TestCase): """Test verifylib.verify_match_rule(rule, artifact_queue, artifacts, links) """ def setUp(self): """Setup artifact queues, artifacts dictionary and Link dictionary. """ # Dummy artifact hashes self.sha256_foo = \ "d65165279105ca6773180500688df4bdc69a2c7b771752f0a46ef120b7fd8ec3" self.sha256_foobar = \ "155c693a6b7481f48626ebfc545f05236df679f0099225d6d0bc472e6dd21155" self.sha256_bar = \ "cfdaaf1ab2e4661952a9dec5e8fa3c360c1b06b1a073e8493a7c46d2af8c504b" self.sha256_barfoo = \ "2036784917e49b7685c7c17e03ddcae4a063979aa296ee5090b5bb8f8aeafc5d" # Link dictionary containing dummy artifacts related to Steps the rule is # matched with (match destination). self.materials = { "foo": {"sha256": self.sha256_foo}, "foobar": {"sha256": self.sha256_foobar}, "sub/foo": {"sha256": self.sha256_foo}, "sub/foobar": {"sha256": self.sha256_foobar} } self.products = { "bar": {"sha256": self.sha256_bar}, "barfoo": {"sha256": self.sha256_barfoo}, "sub/bar": {"sha256": self.sha256_bar}, "sub/barfoo": {"sha256": self.sha256_barfoo}, } self.links = { "dest-item": Metablock(signed=Link( name="dest-item", materials=self.materials, products=self.products)), } def test_verify_match_rule(self): test_data_keys = [ "rule string", "artifacts queue", "source artifacts", "expected"] test_cases = [ [ # Consume foo matching with dest material foo "MATCH foo WITH MATERIALS FROM dest-item", set(self.materials.keys()), self.materials, {"foo"} ], [ # Consume foo matching with dest product foo "MATCH bar WITH PRODUCTS FROM dest-item", set(self.products.keys()), self.products, {"bar"} ], [ # Consume sub/foo matching with dest material foo "MATCH foo IN sub WITH MATERIALS FROM dest-item", set(self.materials.keys()), self.materials, {"sub/foo"} ], [ # Consume sub/foo matching with dest material foo (ignore trailing /) "MATCH foo IN sub/ WITH MATERIALS FROM dest-item", set(self.materials.keys()), self.materials, {"sub/foo"} ], [ # Consume sub/bar matching with dest product bar "MATCH bar IN sub WITH PRODUCTS FROM dest-item", set(self.products.keys()), self.products, {"sub/bar"} ], [ # Consume foo matching with dest material sub/foo "MATCH foo WITH MATERIALS IN sub FROM dest-item", set(self.materials.keys()), self.materials, {"foo"} ], [ # Consume bar matching with dest product sub/bar "MATCH bar WITH PRODUCTS IN sub FROM dest-item", set(self.products.keys()), self.products, {"bar"} ], [ # Consume bar matching with dest product sub/bar (ignore trailing /) "MATCH bar WITH PRODUCTS IN sub/ FROM dest-item", set(self.products.keys()), self.products, {"bar"} ], [ # Consume foo* matching with dest material foo* "MATCH foo* WITH MATERIALS FROM dest-item", set(self.materials.keys()), self.materials, {"foo", "foobar"} ], [ # Consume sub/foo* matching with dest material foo* 
"MATCH foo* IN sub WITH MATERIALS FROM dest-item", set(self.materials.keys()), self.materials, {"sub/foo", "sub/foobar"} ], [ # Consume bar* matching with dest product bar* "MATCH bar* WITH PRODUCTS FROM dest-item", set(self.products.keys()), self.products, {"bar", "barfoo"} ], [ # Consume bar* matching with dest product sub/bar* "MATCH bar* WITH PRODUCTS IN sub FROM dest-item", set(self.products.keys()), self.products, {"bar", "barfoo"} ], [ # Don't consume (empty queue) "MATCH foo WITH MATERIALS FROM dest-item", set(), self.materials, set() ], [ # Don't consume (no destination artifact) "MATCH foo WITH PRODUCTS FROM dest-item", set(self.materials.keys()), self.materials, set() ], [ # Don't consume (non-matching hashes) "MATCH foo WITH MATERIALS FROM dest-item", {"foo"}, {"foo": {"sha256": "deadbeef"}}, set() ], [ # Don't consume (missing link) "MATCH foo WITH MATERIALS FROM dest-item-missing-link", set(self.materials.keys()), self.materials, set() ] ] for i, test_data in enumerate(test_cases): rule_string, queue, source_artifacts, expected = test_data # Generate rule data from rule string rule_data = unpack_rule(shlex.split(rule_string)) result = verify_match_rule( rule_data, queue, source_artifacts, self.links) self.assertSetEqual(result, expected, "'result': {}\n test {}: {}, 'links':{}".format(result, i, dict(zip(test_data_keys, test_data)), self.links)) class TestVerifyItemRules(unittest.TestCase): """Test verifylib.verify_item_rules(source_name, source_type, rules, links)""" def setUp(self): self.item_name = "item" self.sha256_1 = \ "d65165279105ca6773180500688df4bdc69a2c7b771752f0a46ef120b7fd8ec3" self.sha256_2 = \ "cfdaaf1ab2e4661952a9dec5e8fa3c360c1b06b1a073e8493a7c46d2af8c504b" self.links = { "item": Metablock(signed=Link(name="item", materials={ "foo": {"sha256": self.sha256_1}, "foobar": {"sha256": self.sha256_1}, "bar": {"sha256": self.sha256_1}, "foobarbaz": {"sha256": self.sha256_1} }, products={ "baz" : {"sha256": self.sha256_1}, "foo": {"sha256": self.sha256_1}, "bar": {"sha256": self.sha256_2}, "foobarbaz": {"sha256": self.sha256_1} } )) } def test_pass_rules_with_each_rule_type(self): """Pass with list of rules of each rule type. """ rules = [ ["DELETE", "foobar"], ["REQUIRE", "foobarbaz"], ["CREATE", "baz"], ["MODIFY", "bar"], ["MATCH", "foo", "WITH", "MATERIALS", "FROM", "item"], # match with self ["ALLOW", "foobarbaz"], ["DISALLOW", "*"], ] for source_type in ["materials", "products"]: verify_item_rules(self.item_name, source_type, rules, self.links) def test_fail_disallow_not_consumed_artifacts(self): """Fail with not consumed artifacts and terminal DISALLOW. """ rules = [ ["DISALLOW", "*"], ] with self.assertRaises(RuleVerificationError): verify_item_rules(self.item_name, "materials", rules, self.links) def test_fail_wrong_source_type(self): """Fail with wrong source_type.""" with self.assertRaises(securesystemslib.exceptions.FormatError): verify_item_rules(self.item_name, "artifacts", [], self.links) def test_pass_not_consumed_artifacts(self): """Pass with not consumed artifacts and implicit terminal ALLOW * """ verify_item_rules(self.item_name, "materials", [], self.links) class TestVerifyAllItemRules(unittest.TestCase): """Test verifylib.verify_all_item_rules(items, links). 
""" def setUp(self): """Create a dummy supply chain with two steps one inspection and the according link metadata: write-code (Step) -> package (step) -> untar (Inspection) 'write-code' creates an artifact foo 'package' creates foo.tar.gz and deletes foo 'untar' untars foo.tar.gz which results in foo.tar.gz and foo """ self.sha256_foo = \ "d65165279105ca6773180500688df4bdc69a2c7b771752f0a46ef120b7fd8ec3" self.sha256_foo_tar = \ "93c3c35a039a6a3d53e81c5dbee4ebb684de57b7c8be11b8739fd35804a0e918" self.steps = [ Step(name="write-code", expected_products=[ ["CREATE", "foo"] ], ), Step(name="package", expected_materials=[ ["MATCH", "foo", "WITH", "PRODUCTS", "FROM", "write-code"] ], expected_products=[ ["CREATE", "foo.tar.gz"], ["DELETE", "foo"] ], ) ] self.inspections = [ Inspection(name="untar", expected_materials=[ ["REQUIRE", "foo.tar.gz"], ["MATCH", "foo.tar.gz", "WITH", "PRODUCTS", "FROM", "package"] ], expected_products=[ ["MATCH", "foo", "IN", "dir", "WITH", "PRODUCTS", "FROM", "write-code"] ] ) ] self.links = { "write-code" : Metablock(signed=Link(name="write-code", products={ "foo": { "sha256": self.sha256_foo } } )), "package" : Metablock(signed=Link(name="package", materials={ "foo": { "sha256": self.sha256_foo } }, products={ "foo.tar.gz": { "sha256": self.sha256_foo_tar } } )), "untar" : Metablock(signed=Link(name="untar", materials={ "foo.tar.gz": { "sha256": self.sha256_foo_tar } }, products={ "dir/foo": { "sha256": self.sha256_foo }, } )) } def test_pass_verify_all_step_rules(self): """Pass rule verification for dummy supply chain Steps. """ verify_all_item_rules(self.steps, self.links) def test_pass_verify_all_inspection_rules(self): """Pass rule verification for dummy supply chain Inspections. """ verify_all_item_rules(self.inspections, self.links) class TestInTotoVerify(unittest.TestCase): """ Tests verifylib.in_toto_verify(layout_path, layout_key_paths). Uses in-toto demo supply chain link metadata files and basic layout for verification. Copies the basic layout for different test scenarios: - single-signed layout - double-signed layout - expired layout - layout with failing link rule - layout with failing step rule """ @classmethod def setUpClass(self): """Creates and changes into temporary directory. Copies demo files to temp dir... - owner/functionary key pairs - *.link metadata files - layout template (not signed, no expiration date) - final product ...and dumps various layouts for different test scenarios """ # Backup original cwd self.working_dir = os.getcwd() # Find demo files demo_files = os.path.join( os.path.dirname(os.path.realpath(__file__)), "demo_files") # find where the scripts directory is located. 
scripts_directory = os.path.join( os.path.dirname(os.path.realpath(__file__)), "scripts") # Create and change into temporary directory self.test_dir = os.path.realpath(tempfile.mkdtemp()) os.chdir(self.test_dir) # Copy demo files to temp dir for fn in os.listdir(demo_files): shutil.copy(os.path.join(demo_files, fn), self.test_dir) # copy scripts over shutil.copytree(scripts_directory, "scripts") # Load layout template layout_template = Metablock.load("demo.layout.template") # Store various layout paths to be used in tests self.layout_single_signed_path = "single-signed.layout" self.layout_double_signed_path = "double-signed.layout" self.layout_bad_sig = "bad-sig.layout" self.layout_expired_path = "expired.layout" self.layout_failing_step_rule_path = "failing-step-rule.layout" self.layout_failing_inspection_rule_path = "failing-inspection-rule.layout" self.layout_failing_inspection_retval = "failing-inspection-retval.layout" self.layout_no_steps_no_inspections = "no_steps_no_inspections.layout" # Import layout signing keys alice = import_rsa_key_from_file("alice") bob = import_rsa_key_from_file("bob") self.alice_path = "alice.pub" self.bob_path = "bob.pub" # dump single signed layout layout = copy.deepcopy(layout_template) layout.sign(alice) layout.dump(self.layout_single_signed_path) # dump double signed layout layout = copy.deepcopy(layout_template) layout.sign(alice) layout.sign(bob) layout.dump(self.layout_double_signed_path) # dump layout with bad signature layout = copy.deepcopy(layout_template) layout.sign(alice) layout.signed.readme = "this breaks the signature" layout.dump(self.layout_bad_sig) # dump expired layout layout = copy.deepcopy(layout_template) layout.signed.expires = (datetime.today() + relativedelta(months=-1)).strftime("%Y-%m-%dT%H:%M:%SZ") layout.sign(alice) layout.dump(self.layout_expired_path) # dump layout with failing step rule layout = copy.deepcopy(layout_template) layout.signed.steps[0].expected_products.insert(0, ["DISALLOW", "*"]) layout.signed.steps[0].expected_products.insert(0, ["MODIFY", "*"]) layout.sign(alice) layout.dump(self.layout_failing_step_rule_path) # dump layout with failing inspection rule layout = copy.deepcopy(layout_template) layout.signed.inspect[0].expected_materials.insert(0, ["MODIFY", "*"]) layout.signed.inspect[0].expected_materials.append( ["DISALLOW", "*"]) layout.sign(alice) layout.dump(self.layout_failing_inspection_rule_path) # dump layout with failing inspection retval layout = copy.deepcopy(layout_template) layout.signed.inspect[0].run = ["python", "./scripts/expr", "1", "/", "0"] layout.sign(alice) layout.dump(self.layout_failing_inspection_retval) # dump empty layout layout = Metablock(signed=Layout()) layout.sign(alice) layout.dump(self.layout_no_steps_no_inspections) self.alice = alice @classmethod def tearDownClass(self): """Change back to initial working dir and remove temp dir. """ os.chdir(self.working_dir) shutil.rmtree(self.test_dir) def test_verify_passing(self): """Test pass verification of single-signed layout. """ layout = Metablock.load(self.layout_single_signed_path) layout_key_dict = import_public_keys_from_files_as_dict([self.alice_path]) in_toto_verify(layout, layout_key_dict) def test_verify_passing_double_signed_layout(self): """Test pass verification of double-signed layout. 
""" layout = Metablock.load(self.layout_double_signed_path) layout_key_dict = import_public_keys_from_files_as_dict([self.alice_path, self.bob_path]) in_toto_verify(layout, layout_key_dict) def test_verify_passing_empty_layout(self): """Test pass verification of layout without steps or inspections. """ layout = Metablock.load(self.layout_no_steps_no_inspections) layout_key_dict = import_public_keys_from_files_as_dict( [self.alice_path]) in_toto_verify(layout, layout_key_dict) def test_verify_failing_wrong_key(self): """Test fail verification with wrong layout key. """ layout = Metablock.load(self.layout_single_signed_path) layout_key_dict = import_public_keys_from_files_as_dict([self.bob_path]) with self.assertRaises(SignatureVerificationError): in_toto_verify(layout, layout_key_dict) def test_verify_failing_bad_signature(self): """Test fail verification with bad layout signature. """ layout = Metablock.load(self.layout_bad_sig) layout_key_dict = import_public_keys_from_files_as_dict([self.alice_path]) with self.assertRaises(SignatureVerificationError): in_toto_verify(layout, layout_key_dict) def test_verify_failing_layout_expired(self): """Test fail verification with expired layout. """ layout = Metablock.load(self.layout_expired_path) layout_key_dict = import_public_keys_from_files_as_dict([self.alice_path]) with self.assertRaises(LayoutExpiredError): in_toto_verify(layout, layout_key_dict) def test_verify_failing_link_metadata_files(self): """Test fail verification with link metadata files not found. """ os.rename("package.2f89b927.link", "package.link.bak") layout = Metablock.load(self.layout_single_signed_path) layout_key_dict = import_public_keys_from_files_as_dict([self.alice_path]) with self.assertRaises(in_toto.exceptions.LinkNotFoundError): in_toto_verify(layout, layout_key_dict) os.rename("package.link.bak", "package.2f89b927.link") def test_verify_failing_inspection_exits_non_zero(self): """Test fail verification with inspection returning non-zero. """ layout = Metablock.load(self.layout_failing_inspection_retval) layout_key_dict = import_public_keys_from_files_as_dict([self.alice_path]) with self.assertRaises(BadReturnValueError): in_toto_verify(layout, layout_key_dict) def test_verify_failing_step_rules(self): """Test fail verification with failing step artifact rule. """ layout = Metablock.load(self.layout_failing_step_rule_path) layout_key_dict = import_public_keys_from_files_as_dict([self.alice_path]) with self.assertRaises(RuleVerificationError): in_toto_verify(layout, layout_key_dict) def test_verify_failing_inspection_rules(self): """Test fail verification with failing inspection artifact rule. """ layout = Metablock.load(self.layout_failing_inspection_rule_path) layout_key_dict = import_public_keys_from_files_as_dict([self.alice_path]) with self.assertRaises(RuleVerificationError): in_toto_verify(layout, layout_key_dict) def test_verify_layout_signatures_fail_with_no_keys(self): """Layout signature verification fails when no keys are passed. """ layout_metablock = Metablock(signed=Layout()) with self.assertRaises(SignatureVerificationError): in_toto_verify(layout_metablock, {}) def test_verify_layout_signatures_fail_with_malformed_signature(self): """Layout signature verification fails with malformed signatures. 
""" layout_metablock = Metablock(signed=Layout()) signature = layout_metablock.sign(self.alice) pubkey = self.alice pubkey["keyval"]["private"] = "" del signature["sig"] layout_metablock.signed.signatures = [signature] with self.assertRaises(SignatureVerificationError): in_toto_verify(layout_metablock, {self.alice["keyid"]: pubkey}) class TestInTotoVerifyThresholds(unittest.TestCase): """Test verifylib functions related to signature thresholds. - verifylib.verify_link_signature_thresholds - verifylib.verify_threshold_constraints """ @classmethod def setUpClass(self): """Load test keys from demo files. """ demo_files = os.path.join( os.path.dirname(os.path.realpath(__file__)), "demo_files") self.alice = import_rsa_key_from_file( os.path.join(demo_files, "alice")) self.alice_pubkey = import_rsa_key_from_file( os.path.join(demo_files, "alice.pub")) self.alice_keyid = self.alice["keyid"] self.bob = import_rsa_key_from_file( os.path.join(demo_files, "bob")) self.bob_pubkey = import_rsa_key_from_file( os.path.join(demo_files, "bob.pub")) self.bob_keyid = self.bob["keyid"] self.name = "test" self.foo_hash = \ "d65165279105ca6773180500688df4bdc69a2c7b771752f0a46ef120b7fd8ec3" def test_thresholds_skip_unauthorized_links(self): """Ignore links with unauthorized signatures. """ # Layout with one step, one authorized functionary and threshold 1 layout = Layout( keys={ self.bob_keyid: self.bob_pubkey }, steps=[ Step( name=self.name, pubkeys=[self.bob_keyid]) ] ) # Signed links (one authorized the other one not) link_bob = Metablock(signed=Link(name=self.name)) link_bob.sign(self.bob) link_alice = Metablock(signed=Link(name=self.name)) link_alice.sign(self.alice) # The dictionary of links per step passed to the verify function chain_link_dict = { self.name: { self.bob_keyid: link_bob, self.alice_keyid: link_alice } } # The dictionary of links expected to be returned, not containing the # unauthorized link, but enough (threshold) authorized links expected_chain_link_dict = { self.name: { self.bob_keyid: link_bob } } # Verify signatures/thresholds returned_chain_link_dict = verify_link_signature_thresholds( layout, chain_link_dict) # Test that the returned dict is as expected self.assertDictEqual(returned_chain_link_dict, expected_chain_link_dict) def test_thresholds_skip_links_with_failing_signature(self): """Ignore links with failing signatures. """ # Layout with one step, two authorized functionaries and threshold 1 layout = Layout( keys={ self.bob_keyid: self.bob_pubkey, self.alice_keyid: self.alice_pubkey, }, steps=[ Step( name=self.name, pubkeys=[self.bob_keyid, self.alice_keyid], threshold=1) ] ) # Authorized links (one signed one not) link_bob = Metablock(signed=Link(name=self.name)) link_bob.sign(self.bob) link_alice = Metablock(signed=Link(name=self.name)) # The dictionary of links per step passed to the verify function chain_link_dict = { self.name: { self.bob_keyid: link_bob, self.alice_keyid: link_alice } } # The dictionary of links expected to be returned, not containing the # unauthorized link, but enough (threshold) authorized links expected_chain_link_dict = { self.name: { self.bob_keyid: link_bob } } # Verify signatures/thresholds returned_chain_link_dict = verify_link_signature_thresholds( layout, chain_link_dict) # Test that the returned dict is as expected self.assertDictEqual(returned_chain_link_dict, expected_chain_link_dict) def test_thresholds_fail_with_not_enough_valid_links(self): """ Fail with not enough authorized links. 
""" # Layout with one step, two authorized functionaries and threshold 2 layout = Layout( keys={ self.bob_keyid: self.bob_pubkey, self.alice_keyid: self.alice_pubkey, }, steps=[ Step( name=self.name, pubkeys=[self.bob_keyid, self.alice_keyid], threshold=2) ] ) # Only one authorized and validly signed link link_bob = Metablock(signed=Link(name=self.name)) link_bob.sign(self.bob) # The dictionary of links per step passed to the verify function chain_link_dict = { self.name: { self.bob_keyid: link_bob } } # Fail signature threshold verification with not enough links with self.assertRaises(ThresholdVerificationError): verify_link_signature_thresholds(layout, chain_link_dict) def test_threshold_constraints_fail_with_not_enough_links(self): """ Fail with not enough links. """ # Layout with one step and threshold 2 layout = Layout(steps=[Step(name=self.name, threshold=2)]) # Authorized (unsigned) link # This function does not care for signatures it just verifies if the # different links have recorded the same artifacts. Signature verification # happens earlier in the final product verification (see tests above) link_bob = Metablock(signed=Link(name=self.name)) chain_link_dict = { self.name: { self.bob_keyid: link_bob, } } with self.assertRaises(ThresholdVerificationError): verify_threshold_constraints(layout, chain_link_dict) def test_threshold_constraints_fail_with_unequal_links(self): """ Test that the links for a step recorded the same artifacts. """ # Layout with one step and threshold 2 layout = Layout(steps=[Step(name=self.name, threshold=2)]) link_bob = Metablock( signed=Link( name=self.name, materials={ "foo": {"sha256": self.foo_hash} } ) ) # Cf. signing comment in test_thresholds_constraints_with_not_enough_links link_alice = Metablock(signed=Link(name=self.name)) chain_link_dict = { self.name: { self.bob_keyid: link_bob, self.alice_keyid: link_alice, } } with self.assertRaises(ThresholdVerificationError): verify_threshold_constraints(layout, chain_link_dict) def test_threshold_constraints_pas_with_equal_links(self): """ Pass threshold constraint verification with equal links. """ # Layout with one step and threshold 2 layout = Layout(steps=[Step(name=self.name, threshold=2)]) # Two authorized links with equal artifact recordings (materials) # Cf. signing comment in test_thresholds_constraints_with_not_enough_links link_bob = Metablock( signed=Link( name=self.name, materials={ "foo": {"sha256": self.foo_hash} } ) ) link_alice = Metablock( signed=Link( name=self.name, materials={ "foo": {"sha256": self.foo_hash} } ) ) chain_link_dict = { self.name: { self.bob_keyid: link_bob, self.alice_keyid: link_alice, } } verify_threshold_constraints(layout, chain_link_dict) @unittest.skipIf(os.getenv("TEST_SKIP_GPG"), "gpg not found") class TestInTotoVerifyThresholdsGpgSubkeys(unittest.TestCase): """ Test the following 8 scenarios for combinations of link authorization, where a link is either signed by a master or subkey (SIG), and the corresponding step authorizes either the master or subkey (AUTH), and the corresponding top level key in the layout key store is either a master key (bundle, i.e. with subkeys) or a subkey (KEY). M ... Masterkey S ... 


@unittest.skipIf(os.getenv("TEST_SKIP_GPG"), "gpg not found")
class TestInTotoVerifyThresholdsGpgSubkeys(unittest.TestCase):
  """Test the following 8 scenarios for combinations of link authorization,
  where a link is either signed by a master or subkey (SIG), the
  corresponding step authorizes either the master or subkey (AUTH), and the
  corresponding top level key in the layout key store is either a master key
  (bundle, i.e. with subkeys) or a subkey (KEY).

  M ... Master key
  S ... Subkey

  SIG AUTH KEY(bundle)| OK  | Comment
  ---------------------------------------------------------------
  M   M    M          | Yes | Normal scenario (*)
  M   M    S          | No  | Cannot find key in key store + cannot sign (*)
  M   S    M          | No  | Unallowed trust delegation + cannot sign (*)
  M   S    S          | No  | Unallowed trust delegation + cannot sign (*)
  S   M    M          | Yes | Allowed trust delegation
  S   M    S          | No  | Cannot associate keys
  S   S    M          | Yes | Can find key in key store
  S   S    S          | Yes | Generalizes to normal scenario (*)

  (*) NOTE: Master keys with a subkey with signing capability always use that
  subkey, even if the master keyid is specified and has signing capability.

  Plus additional gpg subkey related threshold tests.
  """

  @classmethod
  def setUpClass(self):
    # Create directory to run the tests without having everything blow up
    self.working_dir = os.getcwd()

    # Find gpg keyring
    gpg_keyring_path = os.path.join(
        os.path.dirname(os.path.realpath(__file__)), "gpg_keyrings", "rsa")

    self.test_dir = os.path.realpath(tempfile.mkdtemp())
    self.gnupg_home = os.path.join(self.test_dir, "rsa")
    shutil.copytree(gpg_keyring_path, self.gnupg_home)

    self.master = "8465a1e2e0fb2b40adb2478e18fb3f537e0c8a17"
    self.sub = "c5a0abe6ec19d0d65f85e2c39be9df5131d924e9"

    master_key = in_toto.gpg.functions.gpg_export_pubkey(
        self.master, self.gnupg_home)
    sub_key = master_key["subkeys"][self.sub]

    # We need a gpg key without subkeys to test the normal scenario (M M M),
    # because keys with signing subkeys always use that subkey for signing.
    self.master2 = "7B3ABB26B97B655AB9296BD15B0BD02E1C768C43"
    master_key2 = in_toto.gpg.functions.gpg_export_pubkey(
        self.master2, self.gnupg_home)

    self.pub_key_dict = {
      self.master: master_key,
      self.sub: sub_key,
      self.master2: master_key2
    }

    self.step_name = "name"

    os.chdir(self.test_dir)

  @classmethod
  def tearDownClass(self):
    """Change back to initial working dir and remove temp test directory. """
    os.chdir(self.working_dir)
    shutil.rmtree(self.test_dir)

  def _verify_link_signature_tresholds(self, sig_id, auth_id, key_id):
    metablock = Metablock(signed=Link(name=self.step_name))
    metablock.sign_gpg(sig_id, self.gnupg_home)  # SIG

    chain_link_dict = {
      self.step_name: {
        sig_id: metablock  # SIG
      }
    }

    layout = Layout(
        steps=[
          Step(
            name=self.step_name,
            pubkeys=[auth_id]  # AUTH
          )
        ],
        keys={
          key_id: self.pub_key_dict[key_id]  # KEY
        }
      )

    return layout, chain_link_dict

  def test_verify_link_signature_thresholds__M_M_M(self):
    """Normal scenario. """
    layout, chain_link_dict = self._verify_link_signature_tresholds(
        self.master2, self.master2, self.master2)
    verify_link_signature_thresholds(layout, chain_link_dict)

  def test_verify_link_signature_thresholds__M_M_S__M_S_M__M_S_S(self):
    """Cannot sign with master key if subkey is present. """
    # The scenarios MMS, MSM, MSS are impossible because we cannot sign
    # with a master key if there is a subkey with signing capability --
    # GPG will always use that subkey.
    # Even if gpg would use the master key, these scenarios are not allowed,
    # see the table in the class docstring.
    signature = in_toto.gpg.functions.gpg_sign_object(
        b"data", self.master, self.gnupg_home)
    self.assertTrue(signature["keyid"] == self.sub)

  def test_verify_link_signature_thresholds__S_M_M(self):
    """Allowed trust delegation. """
    layout, chain_link_dict = self._verify_link_signature_tresholds(
        self.sub, self.master, self.master)
    verify_link_signature_thresholds(layout, chain_link_dict)

  def test_verify_link_signature_thresholds__S_M_S(self):
    """Cannot associate keys.
""" layout, chain_link_dict = self._verify_link_signature_tresholds( self.sub, self.master, self.sub) with self.assertRaises(ThresholdVerificationError): verify_link_signature_thresholds(layout, chain_link_dict) def test_verify_link_signature_thresholds__S_S_M(self): """No trust delegation and can find key in key store. """ layout, chain_link_dict = self._verify_link_signature_tresholds( self.sub, self.sub, self.master) verify_link_signature_thresholds(layout, chain_link_dict) def test_verify_link_signature_thresholds__S_S_S(self): """Generalizes to normal scenario. """ layout, chain_link_dict = self._verify_link_signature_tresholds( self.sub, self.sub, self.sub) verify_link_signature_thresholds(layout, chain_link_dict) def test_verify_subkey_thresholds(self): """Subkeys of same main key count only once towards threshold. """ masterkey = "40e692c3ae03f6b88dff95d0d2c9fe930766998d" subkey1 = "35830aa342b9fea0178876b02b25647ff0ef59fe" subkey2 = "732d722578f71a9ec967a64bfead922c91eb7351" link1 = Metablock(signed=Link(name=self.step_name)) link1.sign_gpg(subkey1, self.gnupg_home) link2 = Metablock(signed=Link(name=self.step_name)) link2.sign_gpg(subkey2, self.gnupg_home) chain_link_dict = { self.step_name : { subkey1: link1, subkey2: link2 } } layout = Layout( steps=[ Step(name=self.step_name, pubkeys=[masterkey], threshold=2) ], keys={ masterkey: in_toto.gpg.functions.gpg_export_pubkey( masterkey, self.gnupg_home) } ) with self.assertRaises(ThresholdVerificationError): verify_link_signature_thresholds(layout, chain_link_dict) def test_verify_thresholds_skip_expired_key(self): """Verify that a link signed with an expired key is skipped. NOTE: This test would be a better fit for `TestInTotoVerifyThresholds`, but we make use of `TestInTotoVerifyThresholdsGpgSubkeys`'s gpg setup here. """ expired_key_id = "e8ac80c924116dabb51d4b987cb07d6d2c199c7c" expired_key = in_toto.gpg.functions.gpg_export_pubkey(expired_key_id, self.gnupg_home) # Chain link dict containing a single link for a single step # The link's signature is (supposedly) signed by an expired key and # hence does not count towards the link threshold as defined in the layout. chain_link_dict = { self.step_name : { expired_key_id: Metablock( signed=Link(name=self.step_name), signatures=[{ "keyid": expired_key_id, "other_headers": "deadbeef", "signature": "deadbeef", }]) } } layout = Layout( steps=[Step(name=self.step_name, pubkeys=[expired_key_id], threshold=1)], keys={expired_key_id: expired_key} ) with self.assertRaises(ThresholdVerificationError), \ patch("in_toto.verifylib.LOG") as mock_log: verify_link_signature_thresholds(layout, chain_link_dict) msg = mock_log.info.call_args[0][0] self.assertTrue("Skipping link" in msg and "expired" in msg, "Unexpected log message: {}".format(msg)) class TestVerifySublayouts(unittest.TestCase): """Tests verifylib.verify_sublayouts(layout, reduced_chain_link_dict). Call with one-step super layout that has a sublayout (demo layout). """ @classmethod def setUpClass(self): """Creates and changes into temporary directory and prepares two layouts. The superlayout, which has one step and its sublayout, which is the usual demo layout (write code, package, inspect tar). """ # Backup original cwd self.working_dir = os.getcwd() # Find demo files demo_files = os.path.join( os.path.dirname(os.path.realpath(__file__)), "demo_files") # find where the scripts directory is located. 
scripts_directory = os.path.join( os.path.dirname(os.path.realpath(__file__)), "scripts") # Create and change into temporary directory self.test_dir = os.path.realpath(tempfile.mkdtemp()) os.chdir(self.test_dir) # Copy demo files to temp dir for fn in os.listdir(demo_files): shutil.copy(os.path.join(demo_files, fn), self.test_dir) # copy portable scripts over shutil.copytree(scripts_directory, 'scripts') # Import sub layout signing (private) and verifying (public) keys alice = import_rsa_key_from_file("alice") alice_pub = import_rsa_key_from_file("alice.pub") # From the perspective of the superlayout, the sublayout is treated as # a link corresponding to a step, hence needs a name. sub_layout_name = "sub_layout" # Sublayout links are expected in a directory relative to the superlayout's # link directory sub_layout_link_dir = SUBLAYOUT_LINK_DIR_FORMAT.format( name=sub_layout_name, keyid=alice["keyid"]) for sublayout_link_name in glob.glob("*.link"): dest_path = os.path.join(sub_layout_link_dir, sublayout_link_name) os.renames(sublayout_link_name, dest_path) # Copy, sign and dump sub layout as link from template layout_template = Metablock.load("demo.layout.template") sub_layout = copy.deepcopy(layout_template) sub_layout_path = FILENAME_FORMAT.format(step_name=sub_layout_name, keyid=alice_pub["keyid"]) sub_layout.sign(alice) sub_layout.dump(sub_layout_path) # Create super layout that has only one step, the sublayout self.super_layout = Layout() self.super_layout.keys[alice_pub["keyid"]] = alice_pub sub_layout_step = Step( name=sub_layout_name, pubkeys=[alice_pub["keyid"]] ) self.super_layout.steps.append(sub_layout_step) # Load the super layout links (i.e. the sublayout) self.super_layout_links = load_links_for_layout(self.super_layout, ".") @classmethod def tearDownClass(self): """Change back to initial working dir and remove temp dir. """ os.chdir(self.working_dir) shutil.rmtree(self.test_dir) def test_verify_demo_as_sublayout(self): """Test super layout's passing sublayout verification. """ verify_sublayouts( self.super_layout, self.super_layout_links, ".") class TestInTotoVerifyMultiLevelSublayouts(unittest.TestCase): """Test verifylib.in_toto_verify with multiple levels of sublayouts. """ def test_verify_multi_level_sublayout(self): # Backup original cwd working_dir = os.getcwd() # Find demo files demo_files = os.path.join( os.path.dirname(os.path.realpath(__file__)), "demo_files") # Create and change into temporary directory test_dir = os.path.realpath(tempfile.mkdtemp()) os.chdir(test_dir) # We don't need to copy the demo files, we just load the keys keys = {} for key_name in ["alice", "bob", "carl"]: keys[key_name + "_priv"] = import_rsa_key_from_file( os.path.join(demo_files, key_name)) keys[key_name + "_pub"] = import_rsa_key_from_file( os.path.join(demo_files, key_name + ".pub")) # Create layout hierarchy # Root layout # The root layout is the layout that will be passed to `in_toto_verify` # It only has one step which is a sublayout, into which verification # recurses. Only the root layout and root layout verification key will be # passed to verification. 
    root_layout_pub_key_dict = {
      keys["alice_pub"]["keyid"]: keys["alice_pub"]
    }
    root_layout_step_name = "delegated-to-bob"

    root_layout = Metablock(signed=Layout(
        keys={
          keys["bob_pub"]["keyid"]: keys["bob_pub"]
        },
        steps=[
          Step(
            name=root_layout_step_name,
            pubkeys=[
              keys["bob_pub"]["keyid"]
            ]
          )
        ]
      )
    )
    root_layout.sign(keys["alice_priv"])

    # Sublayout (first level)
    # The first level sublayout will be treated as a link from the
    # superlayout's perspective and loaded from the current working directory.
    # The link for the only step of this sublayout will be placed in a
    # namespaced subdir; that link itself is a sublayout (subsublayout).
    bobs_layout_name = FILENAME_FORMAT.format(
        step_name=root_layout_step_name,
        keyid=keys["bob_pub"]["keyid"])

    bobs_layout_link_dir = SUBLAYOUT_LINK_DIR_FORMAT.format(
        name=root_layout_step_name,
        keyid=keys["bob_pub"]["keyid"])
    os.mkdir(bobs_layout_link_dir)

    bobs_layout_step_name = "delegated-to-carl"
    bobs_layout = Metablock(signed=Layout(
        keys={
          keys["carl_pub"]["keyid"]: keys["carl_pub"]
        },
        steps=[
          Step(
            name=bobs_layout_step_name,
            pubkeys=[keys["carl_pub"]["keyid"]]
          )
        ]
      )
    )
    bobs_layout.sign(keys["bob_priv"])
    bobs_layout.dump(bobs_layout_name)

    # Subsublayout (second level)
    # The subsublayout will be placed in the namespaced link dir of its
    # superlayout (sublayout from the root layout's perspective), for which
    # it serves as link.
    carls_layout_name = FILENAME_FORMAT.format(
        step_name=bobs_layout_step_name,
        keyid=keys["carl_pub"]["keyid"])
    carls_layout_path = os.path.join(bobs_layout_link_dir, carls_layout_name)
    carls_layout = Metablock(signed=Layout())
    carls_layout.sign(keys["carl_priv"])
    carls_layout.dump(carls_layout_path)

    in_toto_verify(root_layout, root_layout_pub_key_dict)

    os.chdir(working_dir)
    shutil.rmtree(test_dir)


class TestSublayoutVerificationMatchRule(unittest.TestCase):
  """Tests a sublayout and checks that a MATCH rule succeeds after the
  sublayout is resolved into a summary link. """

  def test_verify_sublayout_match_rule(self):
    # Backup original cwd
    working_dir = os.getcwd()

    # Find demo files
    demo_files = os.path.join(
        os.path.dirname(os.path.realpath(__file__)), "demo_files")
    script_files = os.path.join(
        os.path.dirname(os.path.realpath(__file__)), "scripts")

    # Create and change into temporary directory
    test_dir = os.path.realpath(tempfile.mkdtemp())
    os.chdir(test_dir)

    # We don't need to copy the demo files, we just load the keys
    keys = {}
    for key_name in ["alice", "bob"]:
      keys[key_name + "_priv"] = import_rsa_key_from_file(
          os.path.join(demo_files, key_name))
      keys[key_name + "_pub"] = import_rsa_key_from_file(
          os.path.join(demo_files, key_name + ".pub"))

    # Create layout hierarchy

    # Root layout
    # The root layout is the layout that will be passed to `in_toto_verify`.
    # It only has one step, which is a sublayout, into which verification
    # recurses. Only the root layout and root layout verification key will be
    # passed to verification.
    root_layout_pub_key_dict = {
      keys["alice_pub"]["keyid"]: keys["alice_pub"]
    }
    root_layout_step_name = "delegated-to-bob"

    root_layout = Metablock(signed=Layout(
        keys={
          keys["bob_pub"]["keyid"]: keys["bob_pub"]
        },
        steps=[
          Step(
            name=root_layout_step_name,
            pubkeys=[
              keys["bob_pub"]["keyid"]
            ],
            expected_products=[
              ["MATCH", "foo.tar.gz", "WITH", "PRODUCTS", "FROM",
                  root_layout_step_name],
              ["DISALLOW", "*"]
            ]
          )
        ]
      )
    )
    root_layout.sign(keys["alice_priv"])
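
    # NOTE: The MATCH rule above can refer to the sublayout step
    # ("delegated-to-bob") as its own destination because verification first
    # resolves the sublayout into a summary link (see get_summary_link),
    # whose products are those of the sublayout's last step.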

    # Sublayout (first level)
    # The sublayout will be treated as a link from the superlayout's
    # perspective and loaded from the current working directory. The links
    # for the steps of this sublayout will be placed in a namespaced subdir.
    bobs_layout_name = FILENAME_FORMAT.format(
        step_name=root_layout_step_name,
        keyid=keys["bob_pub"]["keyid"])

    bobs_layout_link_dir = SUBLAYOUT_LINK_DIR_FORMAT.format(
        name=root_layout_step_name,
        keyid=keys["bob_pub"]["keyid"])
    os.mkdir(bobs_layout_link_dir)

    bobs_layout = Metablock.load(
        os.path.join(demo_files, "demo.layout.template"))
    bobs_layout.sign(keys["bob_priv"])
    bobs_layout.dump(bobs_layout_name)

    shutil.copy2(os.path.join(demo_files, "write-code.776a00e2.link"),
        bobs_layout_link_dir)
    shutil.copy2(os.path.join(demo_files, "package.2f89b927.link"),
        bobs_layout_link_dir)
    shutil.copy2(os.path.join(demo_files, "foo.tar.gz"), ".")
    shutil.copytree(script_files, os.path.join(".", "scripts"))

    in_toto_verify(root_layout, root_layout_pub_key_dict)

    os.chdir(working_dir)
    shutil.rmtree(test_dir)


class TestGetSummaryLink(unittest.TestCase):
  """Tests verifylib.get_summary_link(layout, reduced_chain_link_dict).

  Passes the two-step demo layout and the corresponding link files and
  verifies the returned summary link. """

  @classmethod
  def setUpClass(self):
    """Creates and changes into a temporary directory and copies the demo
    files (layout template and link metadata files). """
    # Backup original cwd
    self.working_dir = os.getcwd()

    # Find demo files
    demo_files = os.path.join(
        os.path.dirname(os.path.realpath(__file__)), "demo_files")

    # Create and change into temporary directory
    self.test_dir = os.path.realpath(tempfile.mkdtemp())
    os.chdir(self.test_dir)

    # Copy demo files to temp dir
    for fn in os.listdir(demo_files):
      shutil.copy(os.path.join(demo_files, fn), self.test_dir)

    self.demo_layout = Metablock.load("demo.layout.template")
    self.code_link = Metablock.load("write-code.776a00e2.link")
    self.package_link = Metablock.load("package.2f89b927.link")
    self.demo_links = {
      "write-code": self.code_link,
      "package": self.package_link
    }

  @classmethod
  def tearDownClass(self):
    """Change back to initial working dir and remove temp dir. """
    os.chdir(self.working_dir)
    shutil.rmtree(self.test_dir)

  def test_get_summary_link_from_demo_layout(self):
    """Create summary link from demo link files and compare properties. """
    sum_link = get_summary_link(self.demo_layout.signed, self.demo_links, "")
    self.assertEqual(sum_link.signed._type, self.code_link.signed._type)
    self.assertEqual(sum_link.signed.name, "")
    self.assertEqual(sum_link.signed.materials,
        self.code_link.signed.materials)
    self.assertEqual(sum_link.signed.products,
        self.package_link.signed.products)
    self.assertEqual(sum_link.signed.command,
        self.package_link.signed.command)
    self.assertEqual(sum_link.signed.byproducts,
        self.package_link.signed.byproducts)
    self.assertEqual(sum_link.signed.byproducts.get("return-value"),
        self.package_link.signed.byproducts.get("return-value"))

if __name__ == "__main__":
  unittest.main()
in-toto-0.4.0/tox.ini0000644000076500000240000000377413535414523014525 0ustar lukpstaff00000000000000# Tox (http://tox.testrun.org/) is a tool for running tests
# in multiple virtualenvs. This configuration file will run the
# test suite on all supported python versions. To use it, "pip install tox"
# and then run "tox" from this directory.
# To run an individual test environment run e.g.
tox -e py27 [tox] envlist = lint,py{27,35,36,37} skipsdist=True [testenv] deps = -rrequirements-tox.txt commands = # Run unittests with coverage coverage run tests/runtests.py coverage report -m --fail-under 99 [testenv:lint] # NOTE: As opposed to above pyXY environments, which run in pythonX.Y, this # environment uses the `python` available on the path. commands = # Run pylint, using the Secure System Lab's pylintrc as base config # https://github.com/secure-systems-lab/sample-documents/blob/master/pylintrc pylint in_toto # Run pylint on tests with required exemptions to allow protected member # access (W0212) and test methods that don't use self (R0201). Also be less # strict about naming (C0103, C0202), docs (C0111) and line length (C0301). pylint tests --disable W0212,R0201,C0103,C0202,C0111,C0301 # Run bandit, a security linter from OpenStack Security # Exclude files that need special treatment and are tested below bandit -r in_toto -x in_toto/util.py,in_toto/process.py,in_toto/gpg/util.py # Ignore false positive "hardcoded password" warning # NOTE: Should become obsolete with in-toto/in-toto#80 # https://bandit.readthedocs.io/en/latest/plugins/b107_hardcoded_password_funcdef.html bandit in_toto/util.py --skip B107 # Allow blacklisted but required subprocess # https://bandit.readthedocs.io/en/latest/blacklists/blacklist_imports.html#b404-import-subprocess bandit in_toto/process.py --skip B404 # Allow blacklisted SHA1 required for gpg keyids # TODO: Will become obsolete with in-toto/in-toto#275 # https://bandit.readthedocs.io/en/latest/blacklists/blacklist_calls.html#b303-md5 bandit in_toto/gpg/util.py --skip B303
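
# NOTE: Illustrative usage (not part of the packaged configuration): plain
# "tox" runs the full envlist above, while "tox -e lint" runs only the
# pylint and bandit checks defined in [testenv:lint].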